This same phenomenon occurs when developing telecommunications and computing products and services with accessibility in mind. Experts in the telecom access engineering field call it the "Electronic Curb-cut Effect."
Television (TV) manufacturers in the U.S. will tell you that their caption decoders for the deaf wound up benefiting tens of millions more consumers than originally intended. As the electronic curb-cut effect has shown in the past, televisions with decoders are simply better than those without. For example, captioning can enable TV viewers to:
search for and retrieve video content, by word, through the use of multimedia databases (a minimal sketch follows this list);
listen to programs in silence while someone is sleeping;
listen to programs in noisy environments like sports bars;
watch their favorite program while talking on the telephone, without appearing rude to the person being spoken to;
read more effectively, and at an earlier age, by seeing the words being spoken at the same time they hear them (e.g., Sesame Street);
learn to read/speak a second language by displaying foreign words at the same time they are being spoken; and
understand foreign programming through the use of native-language captions.
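The first benefit above hinges on caption text being stored as searchable data alongside timestamps. Here is a minimal sketch of the idea in Python, using an in-memory SQLite table; the table layout and sample caption lines are hypothetical, purely for illustration.

```python
import sqlite3

# Minimal sketch: caption lines stored with timestamps make video searchable by word.
# Table name, columns, and sample data are hypothetical illustrations.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE captions (program TEXT, seconds REAL, line TEXT)")
conn.executemany(
    "INSERT INTO captions VALUES (?, ?, ?)",
    [
        ("The French Chef", 12.0, "Today we are making a simple omelette."),
        ("The French Chef", 95.5, "Whisk the eggs until they are light."),
        ("Sesame Street", 40.0, "Today's show is brought to you by the letter C."),
    ],
)

def find_moments(word):
    """Return (program, timestamp, caption) rows whose caption mentions the word."""
    return conn.execute(
        "SELECT program, seconds, line FROM captions WHERE line LIKE ?",
        (f"%{word}%",),
    ).fetchall()

for program, seconds, line in find_moments("eggs"):
    print(f"{program} @ {seconds:>6.1f}s: {line}")
```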
What follows is a listing of IT innovations, originally developed by, or in support of, people with disabilities that wound up benefiting everyone. Do any of them strike a familiar note?
1808:
Pellegrino Turri builds the first typewriter for his blind friend, Countess Carolina Fantoni da Fivizzano, to help her write legibly.
See: http://xavier.xu.edu:8000/~polt/tw-history.html
1872:
Alexander Graham Bell takes up permanent residence in the United States at 35 Newton Street, Boston, where he conducts training classes for teachers of the deaf.
See: http://www.webbconsult.com/1800.html
1873:
Herman Hollerith, a young student whom experts now recognize as having had a cognitive processing disability, begins making a habit of jumping from his second-story schoolroom window to avoid having to take his spelling lessons.
See: http://www.webpixie.com/secret/Our-past.htm
1876:
A patent for the telephone (No. 174,465) is issued to Alexander Graham Bell. The telephone was one of the many devices Bell developed in support of his work with the deaf.
See: http://www.webbconsult.com/1800.html
1886:
Herman Hollerith conceives the idea of using punched cards to store and transport information, a technology that remained in use into the late 1970s. The cards were read electrically: as a card passed between brass rods, wherever a hole had been punched the rods made contact and an electric current could flow. The machine was built to tabulate the 1890 census, a great improvement, since hand tabulation had been projected to take more than a decade. This tabulating machine was a direct ancestor of the modern computer.
See: http://www-stall.rz.fht-esslingen.de/studentisches/Computer_Geschichte/grp2/holler.html
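As a purely illustrative sketch (not Hollerith's actual card format), the principle described above can be modeled in a few lines of Python: a punched hole closes a circuit, and the pattern of closed circuits in a column encodes a value.

```python
# Minimal sketch of the punched-card principle described above: a hole lets the
# brass rod make contact, closing a circuit; the pattern of closed circuits is the data.
# The 4-bit column encoding here is purely illustrative, not Hollerith's actual format.

def read_card(columns):
    """Each column is a tuple of booleans (True = hole punched = circuit closed)."""
    values = []
    for column in columns:
        value = 0
        for bit in column:          # treat each column as a small binary number
            value = (value << 1) | int(bit)
        values.append(value)
    return values

card = [
    (False, True, False, True),     # 0101 -> 5
    (True, False, False, True),     # 1001 -> 9
    (False, False, True, False),    # 0010 -> 2
]
print(read_card(card))              # [5, 9, 2]
```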
1896:
Hollerith founds the Tabulating Machine Company.
See: http://www-stall.rz.fht-esslingen.de/studentisches/Computer_Geschichte/grp2/holler.html
1916:
Harvey Fletcher joins the Research Division of Bell Labs to work with Irving Crandall on hearing and speech. Fletcher went on to build the Western Electric Model 2A hearing aid and a binaural headset in the 1920s, and in 1929 published the widely read book Speech and Hearing, which analyzed the characteristics of sound.
See: http://ac.acusd.edu/History/recording/bell-labs.html
1917:
E.C. Wente [Bell Labs] developed the condenser microphone to translate sound waves into electrical waves that could be transmitted by the vacuum tube amplifier.
See: http://ac.acusd.edu/History/recording/bell-labs.html
1918:
Henry Egerton patents the first balanced-armature loudspeaker driver, based on Thomas Watson's 1882 balanced-armature telephone patent; the design was later used in the Bell Labs No. 540AW loudspeaker developed by N. H. Ricker on October 6, 1922.
See: http://ac.acusd.edu/History/recording/bell-labs.html
1921:
The amplifier, microphone, and loudspeaker innovations were combined to create the first public address systems. The largest public demonstration of such a system took place on Armistice Day for the national broadcast of the burial of the Unknown Soldier at Arlington Cemetery, heard over 80 loudspeakers linked by telephone lines in New York, San Francisco, and Arlington. By the next year, standardized public address systems were being introduced.
See: http://ac.acusd.edu/History/recording/bell-labs.html
1922:
When he turned 70, Bell stated that "recognition for my work with the deaf has always been more pleasing than the recognition of my work with the telephone." But it was the telephone that had transformed America. As a final tribute to Bell, upon his death in 1922 at age 75, the nation's telephones all stopped ringing for one full minute.
1924:
Twenty-eight years after Hollerith founded the Tabulating Machine Company [see 1896], the firm becomes known as International Business Machines (IBM), a name now practically synonymous with computing.
See: http://www-stall.rz.fht-esslingen.de/studentisches/Computer_Geschichte/grp2/holler.html
1929:
Harvey Fletcher [see 1916] publishes the widely read book Speech and Hearing, which analyzed the characteristics of sound. Fletcher also led much of Bell Labs' research on binaural "stereophonic" (stereo) sound recording.
See: http://ac.acusd.edu/History/recording/bell-labs.html
1934:
The Readophone, an invention that reproduced literature and music on long-playing discs, is invented and demonstrated to Dr. Herbert Putnam, Librarian of Congress, and to Dr. H. H. B. Meyer, director of the Library of Congress' Books for the Blind project. The "Readophone Talking Book" disc held two hours and twenty minutes of recording time, the equivalent of twenty-eight thousand words. Did you ever play a 33-1/3 RPM record?
See: http://www.wcblind.org/fyi/trivia.html
1935:
The American Foundation for the Blind publishes the first issue of the Talking Book Bulletin. Listened to a book-on-tape lately?
See: http://www.wcblind.org/fyi/trivia.html
1936:
Since its earliest days, Bell Labs had been concerned with the properties and analysis of human speech, work originally undertaken to help people who were deaf learn to speak intelligibly. Because of this work it was perhaps inevitable that a Bell Labs scientist would invent an artificial talking machine, and in 1936 H.W. Dudley did. It was the world's first electronic speech synthesizer, and it required an operator with a keyboard and foot pedals to supply "prosody" - the pitch, timing, and intensity of speech. Dudley called his device the "voice coder," though it quickly became known simply as the "Voder." It was a hit at the New York and San Francisco World's Fairs of 1939.
See: http://www.research.att.com/history/36speech.html
1948:
The National Bureau of Standards develops specifications for a low-cost, reliable talking-book machine for the blind. Tape recorder, anyone?
See: http://www.wcblind.org/fyi/trivia.html
1948:
In support of the quest to develop more reliable, powerful, flexible, smaller, cheaper, cooler-running, and less power-hungry hearing aids, Bell Labs scientists John Bardeen, William B. Shockley, and Walter H. Brattain invented the transistor. This famous invention earned the three the 1956 Nobel Prize in physics. Sony was not convinced that hearing aids were the best use for the transistor; it acquired a license for the technology for $25,000 and invented the transistor radio. Needless to say, this marvelous invention became the primary technology responsible for fueling a revolution in the telecommunications industry that continues today.
See: http://www.teleport.com/~richards/japanno/part05.html
See: http://www-users.cs.umn.edu/~dyue/wiihist/japsayno/japsayno.7.html
1952:
For Bell, whose invention of the telephone created the telecommunications revolution, the original goal of easing the isolation of the deaf remained elusive. His insights into separating the speech signal into different frequency components and rendering those components as visible traces were not successfully implemented until Potter, Kopp, and Green designed the sound spectrograph and Dreyfus-Graf developed the steno-sonograph in the late 1940s. These devices generated interest in the possibility of automatically recognizing speech [speech recognition] because they made the invariant features of speech visible for all to see.
See: http://mitpress.mit.edu/e-books/Hal/chap7.java/seven8.html
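For readers curious what "rendering frequency components as visible traces" looks like in practice, here is a minimal sketch of a spectrogram computed with NumPy. It illustrates the general short-time Fourier transform technique, not the workings of the original spectrograph; the frame length, hop size, and toy signal are arbitrary choices made for the example.

```python
import numpy as np

# Minimal sketch of a spectrogram: slice a signal into short frames, window each
# frame, and take the magnitude of its Fourier transform. Each column of the
# resulting matrix is the "visible trace" of the frequencies present at that moment.
def spectrogram(signal, frame_len=256, hop=128):
    window = np.hanning(frame_len)
    frames = [
        signal[start:start + frame_len] * window
        for start in range(0, len(signal) - frame_len + 1, hop)
    ]
    # rfft keeps only the non-negative frequencies of a real-valued signal
    return np.abs(np.fft.rfft(np.array(frames), axis=1)).T

# A toy "utterance": a 300 Hz tone followed by an 800 Hz tone, sampled at 8 kHz.
rate = 8000
t = np.arange(rate) / rate
signal = np.concatenate([np.sin(2 * np.pi * 300 * t), np.sin(2 * np.pi * 800 * t)])

spec = spectrogram(signal)
print(spec.shape)  # (frequency bins, time frames)
```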
1952:
As an offshoot of Bell's work in the deaf community, the first speech recognizer was developed in 1952 by Davis, Biddulph, and Balashek of Bell Labs. With training, it was reported, the machine achieved 97 percent accuracy on the spoken forms of the ten digits.
See: http://mitpress.mit.edu/e-books/Hal/chap7.java/seven8.html
1960:
Pilgrim Imaging begins open captioning for the deaf for the Captioned Films for the Deaf program, under the Department of Health, Education & Welfare.
See: http://www.robson.org/gary/writing/jcr-fcc.html
1964:
This year was the turning point, when Deaf orthodontist Dr. James C. Marsters of Pasadena, California shipped a teletype machine to Deaf scientist Robert Weitbrecht in Redwood City, California and asked for a way to attach it to the telephone system so that phone communication could take place. Who would have guessed that in 1998 over 100 million people, in all parts of the world, would be communicating with each other over the Internet using basically the same technology? Instead of calling our devices Telecommunications Devices for the Deaf (TDDs) or TTYs, we call them Internet chat rooms!
See: http://www.deafexpo.org/tty_museum-history.htm
1972:
The first nationally broadcast open-captioned program was WGBH's The French Chef, with Julia Child, which aired on PBS on August 5, 1972.
See: http://www.robson.org/gary/writing/jcr-fcc.html
1972:
Vinton Cerf develops the host-level protocols for the ARPANET, the first large-scale packet network. Cerf, hard of hearing since birth, married a woman who was deaf, and the two communicated by text messaging. According to Cerf, "I have spent, as you can imagine, a fair chunk of my time trying to persuade people with hearing impairments to make use of electronic mail because I found it so powerful myself." Had it not been for this experience, Cerf might not have used text messaging to the extent that he did, and might not have integrated e-mail into the functionality of ARPANET, the precursor to the Internet.
See: http://www.charweb.org/webinfo/cerf.html
1975:
CCD (charge-coupled device) flatbed scanners, ubiquitous today, did not exist back in the early 1970s when Ray Kurzweil and his team at Kurzweil Computer Products created the Kurzweil Reading Machine and the first omni-font OCR (optical character recognition) technology. The Kurzweil team created its own scanner using the first CCD integrated chip, a 500-sensor linear array from Fairchild. They did this work in support of the blind.
See: http://www.kurzweiltech.com/techfirsts/techfirsts.htm
1976:
Radio Reading Services begins at Minnesota State Services for the Blind.
See: http://www.wcblind.org/fyi/trivia.html
1980:
Voice indexing is used for the first time in the talking book Access to National Parks: A Guide for Handicapped Visitors, produced by the Library of Congress. This technology enables the listener of an audiotape to jump to a book section by navigating a spoken index!
See: http://www.wcblind.org/fyi/trivia.html
1984:
Ray Kurzweil develops the first electronic music keyboard capable of reproducing the sound of acoustic instruments. The inspiration came, in part, from a conversation he had with Stevie Wonder, who had been a user of the Kurzweil Reading Machine for the blind!
See: http://www.kurzweiltech.com/techfirsts/techfirsts.htm
1988:
Retail point-of-sale (POS) devices, mostly in fast food restaurants, begin to use picture-based keyboards. This technology was originally developed in the mid-1960s to enable people who were unable to speak to use a keyboard, computer, and speech synthesizer to speak. Today, picture-based keyboards are enabling retail establishments to employ individuals who, for one reason or another, were unemployable 10 years ago.
1990:
The Americans with Disabilities Act mandates that all telephones required to be accessible be equipped with a volume control and/or a shelf and outlets, including a phone jack and a power plug, to accommodate Telecommunications Devices for the Deaf. Cranking up the volume on an "accessible" phone makes it usable for everyone in a noisy environment. Have you ever used, or seen someone use, an "accessible" public telephone to connect up their laptop and retrieve their e-mail messages? Another benefit of the ADA is the lowering of pay telephones so that wheelchair users can access them. Because of this mandate, children are now able to access these same phones. They can even reach and read the phone books! Wouldn't it be great if all public telephones were accessible?
See: http://www.trace.wisc.edu/docs/adaag_only/adaag.htm#4.1.3(17)(c)
1994:
The National Federation of the Blind establishes a dial-up synthetic-speech talking newspaper, making a daily newspaper available to blind people by 6:30 a.m. on the day of issue for the first time. Anyone interested in listening to your favorite newspaper?
See: http://www.wcblind.org/fyi/trivia.html
Mid-1990s:
Many new products come on the scene:
For people who are paralyzed there were voice-activated phones, lamps and switches. For people who are blind there were talking caller IDs, pagers, alarm clocks, calculators, watches and variable-speed/pitch tape recorders. For people with mobility disabilities there were phones with keypads that have large buttons. For people who are hard-of-hearing there were phones with volume control.
1996:
Productivity Works develops pwWebSpeak, a browser that translates the information content of Web pages into speech. This great new technology can provide web access to anyone in eyes-busy environments [like driving a car, though I don't recommend this particular use!].
See: http://www.prodworks.com/
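As a rough illustration of the idea, not of how pwWebSpeak itself worked, the sketch below fetches a page, strips the markup with Python's standard HTMLParser, and hands the remaining text to a speech engine. The third-party pyttsx3 package is an assumption made for the example; any text-to-speech engine would do.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

import pyttsx3  # third-party text-to-speech package; an assumption, not what pwWebSpeak used

class TextExtractor(HTMLParser):
    """Collect the readable text of a page, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

def speak_page(url):
    """Fetch a page, extract its text content, and read it aloud."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = TextExtractor()
    parser.feed(html)
    engine = pyttsx3.init()
    engine.say(" ".join(parser.chunks))
    engine.runAndWait()

if __name__ == "__main__":
    speak_page("http://example.com/")
```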
1997:
NCR Corporation develops the world's first Audio ATM, designed to provide access to banking for blind and partially sighted people. According to Rick Makos, Vice President of NCR Canada's Financial Solutions Group: "Technology is the great equalizer. The Audio ATM has the potential to allow more than 50 million people around the world who are visually-impaired, as well as 1.4 billion people who can neither read nor write, [to] have comfortable access to self services - when and where they need it."
See: http://www3.ncr.com/press_release/pr111297b.html
See: http://www3.ncr.com/press_release/pr082698.html
1998:
Nokia releases the LPS-1 Loopset, giving hearing aid users newfound mobile freedom. Based on induction technology, the Loopset allows hearing aid users to talk on digital mobile phones. It has a built-in microphone for hands-free operation, and is compatible with Nokia 5100 series and Nokia 6100 series mobile phones, which have an automatic answer function that works with the Loopset. By the way, people who are not hard of hearing or deaf can use this loopset for hands-free operation of their cellular telephone. One extra hand on the wheel means added safety for both the driver and those around them!
See: http://www.shopnokia.com/ (Click on "Buy Accessories.")
1998:
Productivity Works launches another voice-based browsing product, this one built around the telephone. The firm's pwTelephone is geared not only to the visually impaired but also to people without access to Internet-ready PCs. The software may also prove useful to firms that want to provide information, such as schedules or price lists, both by phone and over the Web, from a single source.
See: http://www.prodworks.com/
1999:
The World Wide Web Consortium (W3C) releases its Web Content Accessibility Guidelines specification. Using this specification, web content developers can create pages that not only meet their sales, marketing, and information objectives, but also:
Can be accessed by a standard telephone [no computer]. Anyone could use a pay phone to access, navigate, and retrieve information from web pages. [See Productivity Works, 1998.]
Are less costly to translate into foreign languages. Developing for access by people with cognitive disabilities stresses simplifying language and eliminating extraneous words from web pages. This can benefit all web users.
Can be accessed by lower-power PCs and from within narrower-bandwidth information infrastructures. These specifications demonstrate how to develop graphical web pages that still present their full message with the browser's graphics display turned off (a minimal example of such a check follows this entry). This programming technique enables a company to free up bandwidth at critical times without impacting its Intranet sites.
See: http://www.w3.org/TR/1999/WAI-WEBCONTENT-19990505
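As one concrete illustration of the "graphics display turned off" point, the minimal sketch below flags images that lack a text equivalent (an alt attribute), which is what lets a page deliver its full message without graphics. It is a toy checker written for this article, not an official W3C tool, and the sample page is hypothetical.

```python
from html.parser import HTMLParser

# Minimal sketch of one WCAG 1.0 idea: every image needs a text equivalent (an alt
# attribute) so the page still delivers its message with graphics turned off.
class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "<unknown source>"))

page = """
<p>Quarterly results are up.</p>
<img src="chart.gif">
<img src="logo.gif" alt="ACME Corporation logo">
"""

checker = AltTextChecker()
checker.feed(page)
print("Images missing alt text:", checker.missing)  # ['chart.gif']
```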
About the Association of Access Engineering Specialists (AAES)
AAES, the Association of Access Engineering Specialists, is a professional organization concerned with improving access to telecommunications and computing products and services for millions of people with disabilities.
The primary purpose of AAES is the development of the field of access engineering. AAES actively seeks to refine a technical consensus among all parties affected by telecommunications and other electronic and information technology access. To this end AAES will initiate, foster, and promote dialog between the disability community and industry involving accessibility issues.
AAES http://www.narte.org/aaes.html was formed in 1997 as a specialist group under The National Association of Radio and Telecommunications Engineers (NARTE) http://www.narte.org/ and in partnership with The Rehabilitation Engineering and Assistive Technology Society of North America (RESNA) http://www.resna.org/.
AAES emerged from the collaborative work of the Telecommunications Access Advisory Committee (TAAC) http://www.access-board.gov/pubs/taacrpt.htm which was formed to assist the U.S. Access Board http://www.access-board.gov/ in fulfilling its mandate to issue accessibility guidelines under Section 255 of the Telecommunications Act of 1996.
© 1999 by Steven I. Jacobs
Steve Jacobs is a Senior Technology Consultant with the NCR Corporation in Dayton, Ohio; his area of technical expertise is information technology access. He has served on the U.S. Electronic and Information Technology Access Advisory Committee and the U.S. Telecommunications Access Advisory Committee, and was part of the team that proposed the establishment of a Web Accessibility Initiative International Program Office. He is President of IDEAL at NCR, a not-for-profit organization whose mission is to support NCR employees with disabilities and the development of information technologies usable by people with disabilities. Steve may be reached via e-mail at steve.jacobs@daytonoh.ncr.com.
For more background or sources on this topic, contact Deputy Director William Stothers at 619-232-2727 X104 (cellphone 619-886-2727) or by email at wstothers@accessiblesociety.org.