
The Hidden History of Digital Accessibility: How Blind and Deaf Innovators Shaped Modern Technology (Pre-1990)

Michael Bervell
December 2, 2025

When most people think about digital accessibility, they picture modern screen readers, WCAG guidelines, and ADA compliance. But the foundations of accessible technology stretch back nearly 200 years—and the story involves wartime research programs, fathers inventing for their blind daughters, and deaf scientists hacking telephone networks.

Just as the origins of computer programming trace back to women like Ada Lovelace and the ENIAC programmers, and early computers relied on paper punch cards, digital accessibility has roots that are often overlooked. Here's the true history of how accessible technology evolved before the internet age.


The Original Accessibility Innovation: Louis Braille (1824)

The story begins with a 15-year-old French boy who would change literacy forever.

Louis Braille lost his sight at age three after an accident with an awl in his father's harness-making shop. While studying at France's Royal Institute for Blind Youth, he encountered a 12-dot military code created by French army officer Charles Barbier for nighttime battlefield communication.

Braille saw potential where others saw complexity. He developed a simpler six-dot system that could represent letters, numbers, and punctuation. In 1824, at just fifteen years old, he presented his raised-dot system to his school's headmaster.

What makes Braille's system remarkable from a technical standpoint? As noted by the Braille entry on Wikipedia, the second revision published in 1837 was the first binary form of writing developed in the modern era—predating digital computers by over a century.
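
To make the binary point concrete, here's a minimal sketch (modern code, not historical notation) that treats each braille cell as a 6-bit value. The dot patterns for a through j follow the standard braille alphabet; the bit packing itself is just one illustrative convention.

```python
# Illustrative sketch: braille as a 6-bit binary code.
# Each cell has six dot positions (1-3 down the left column,
# 4-6 down the right); a letter is simply a set of raised dots,
# which maps naturally onto a 6-bit integer.

# Dot patterns for the first ten letters (the braille "decade" a-j).
LETTER_DOTS = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    "f": {1, 2, 4}, "g": {1, 2, 4, 5}, "h": {1, 2, 5},
    "i": {2, 4}, "j": {2, 4, 5},
}

def to_bits(dots):
    """Pack a set of raised dots into a 6-bit integer (dot 1 = least significant bit)."""
    return sum(1 << (d - 1) for d in dots)

for letter, dots in sorted(LETTER_DOTS.items()):
    print(letter, format(to_bits(dots), "06b"))
```

Seen this way, every braille cell is one of 64 possible 6-bit codes, which is exactly why the system counts as a binary encoding a century before electronic computers.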

The system wasn't immediately accepted. It took until 1854—two years after Braille's death—for his own school to officially adopt it. A universal Braille code for English wasn't standardized until 1932.


The Optophone: The World's First "Digital" Reading Machine (1913)

Over a century ago, a physicist in Birmingham, England created a device that sounds remarkably modern.

In 1913, Dr. Edmund Fournier d'Albe invented the optophone, a device that used selenium photosensors to detect black print and convert it into audible tones that blind users could interpret as letters. It was one of the earliest known applications of sonification—turning visual data into sound.

The origin story reveals how user feedback shaped early accessibility technology. According to IEEE Spectrum's historical account, Fournier d'Albe first demonstrated an "exploring optophone" in 1912 that helped blind people detect light sources. News spread rapidly through blind communities—until a blind solicitor named Washington Ranger sent a blunt critique: "The blind problem is not to find lights or windows, but how to earn your living."

Chastened by this feedback, Fournier d'Albe pivoted to building a reading machine that could translate text into sound. He realized that each printed letter has a unique ratio of white to black, which could be converted into distinctive tones.
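
The principle can be sketched in a few lines of modern code. In this toy model (the tone frequencies and the 5x4 letter bitmap are illustrative inventions, not the device's actual notes), sensor rows scan a letter column by column, and every row that sees black ink sounds its own tone, so each letter produces a characteristic sequence of chords.

```python
# Illustrative sketch of the optophone's sonification principle.
ROW_TONES = [262, 330, 392, 523, 659]  # assumed Hz values, one tone per sensor row

# A crude 5x4 bitmap of the letter "L": 1 = black ink.
LETTER_L = [
    [1, 0, 0, 0],
    [1, 0, 0, 0],
    [1, 0, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
]

def sonify(bitmap):
    """Return, per scanned column, the tones sounded by inked rows."""
    columns = zip(*bitmap)  # scan left to right
    return [[ROW_TONES[r] for r, ink in enumerate(col) if ink]
            for col in columns]

print(sonify(LETTER_L))
```

An "L" scanned this way yields a full chord (the tall left stroke) followed by three repetitions of the lowest-row tone (the base), and a trained listener learns to recognize that signature, much as Mary Jameson did.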

One user, Mary Jameson, began using the optophone in 1918 and eventually achieved 60 words per minute by 1972—demonstrating that with proper training, the technology could provide genuine reading access.


World War II: How Blinded Veterans Accelerated Accessibility Research (1943-1950s)

The Second World War created thousands of veterans with vision loss—and mobilized unprecedented federal resources to help them. This period established the government-funded research model that would drive accessibility innovation for decades.

Vannevar Bush and the Committee on Sensory Devices

In 1943, Vannevar Bush established a federal government program within the wartime Office of Scientific Research and Development to develop sensory aids for the blind. From 1943 to 1947, RCA worked under OSRD to develop working prototypes of reading machines, while Haskins Laboratories initiated research on speech that would later enable computer-driven speech synthesizers.

The Haskins Laboratories chronology describes how the U.S. Office of Scientific Research and Development asked them to evaluate and develop technologies for blinded veterans. Experimental psychologist Alvin Liberman joined to develop a "sound alphabet"—essentially an auditory Braille.

The researchers discovered a fundamental limitation: the human ear cannot resolve sequences of discrete sounds quickly enough. No acoustic code they devised could convey text at more than one-tenth the typical rate of speech.

This "failure" led to groundbreaking discoveries about how humans process language—research that would eventually make modern text-to-speech possible.

The Pattern Playback: Birth of Speech Synthesis (1950s)

In the 1950s, Haskins Laboratories President Franklin Cooper invented the Pattern Playback, one of the earliest speech synthesis devices. This technology laid the groundwork for modern text-to-speech systems, from screen readers to voice assistants.


The 1960s: Parallel Breakthroughs Transform Communication

The 1960s saw two revolutionary accessibility technologies emerge almost simultaneously—one for blind users, one for deaf users—both driven by personal necessity.

The Optacon: A Father's Gift to His Blind Daughter (1962-1971)

Few accessibility innovations have such a touching origin story as the Optacon.

John Linvill, a professor of Electrical Engineering at Stanford University, developed the device because his daughter Candy had been blind since age three. Using the Optacon, Candy eventually graduated from Stanford and earned a PhD.

The eureka moment came during a 1962 sabbatical. According to Stanford's obituary for Linvill, he visited an IBM laboratory in Germany and observed a high-speed printer using small pins to print letters. He thought: "If you could feel the hammers with your fingertip, you could surely recognize the image."

Linvill's solution was the Optacon (optical-to-tactile converter)—a portable device with a small hand-held camera that generated tactile images on a fingertip-sized display. He received a patent in 1966 and co-founded Telesensory Systems Inc. in 1970 to manufacture the device.

Here's a remarkable footnote about how accessibility tech drove broader innovation. As Jim Bliss recalled in an oral history: "The funding of the Optacon established the Integrated Circuits Lab at Stanford, which put the electrical engineering department on the map and actually put them ahead of MIT in integrated circuit research."

The Optacon project didn't just help blind readers—it helped launch Silicon Valley's semiconductor industry.

TTY: Deaf Scientists Hack the Telephone Network (1964)

While Linvill worked on tactile reading, a deaf physicist was solving a different problem: how could deaf people use the telephone?

Robert Weitbrecht (1920-1983) was born deaf but earned a B.S. in Astronomy from UC Berkeley in 1942. He worked as a physicist at the Radiation Laboratory (now Lawrence Berkeley National Laboratory) and later at the U.S. Naval Air Missile Test Center.

Weitbrecht was an amateur radio enthusiast who had used radiotelegraph to communicate since high school. In 1964, he and orthodontist James Marsters figured out how to adapt surplus teletypewriter equipment for telephone communication.

According to RIT's archival collection, in May 1964, Marsters in Pasadena and Weitbrecht in Redwood City made history with the first long-distance TTY phone call between two deaf persons on a regular telephone line. Weitbrecht developed an acoustic coupler—essentially a modem—that converted typed text into audio tones that could travel over phone lines.
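
The coupler's core trick, frequency-shift keying, can be sketched in a few lines. The 1400 Hz mark and 1800 Hz space tones below match commonly cited figures for the Weitbrecht modem, but treat all parameters here as illustrative; the bit pattern is arbitrary, not a real Baudot character.

```python
import math

# Illustrative FSK sketch: each bit of a character becomes a burst of
# one of two audio tones that a plain telephone line can carry.
MARK_HZ, SPACE_HZ = 1400, 1800   # tones commonly cited for the Weitbrecht modem
BAUD = 45.45                     # Baudot teletype signalling rate
SAMPLE_RATE = 8000               # assumed sample rate for this sketch

def fsk_modulate(bits):
    """Turn a bit sequence into audio samples: 1 -> mark tone, 0 -> space tone."""
    samples_per_bit = int(SAMPLE_RATE / BAUD)
    out = []
    for bit in bits:
        freq = MARK_HZ if bit else SPACE_HZ
        for n in range(samples_per_bit):
            out.append(math.sin(2 * math.pi * freq * n / SAMPLE_RATE))
    return out

# An arbitrary 5-bit pattern (illustrative, not a real Baudot code point).
waveform = fsk_modulate([1, 0, 0, 1, 1])
print(len(waveform))
```

Because the output is nothing but audio in the voice band, the existing telephone network needed no modification at all; that is precisely why Weitbrecht's approach succeeded where building out a dedicated teletype infrastructure had failed.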

As the Wisconsin Historical Society notes, the teletype had existed since 1910, and its text-based nature made its application to deaf communication obvious. But establishing a teletype infrastructure for every household proved impossible—until Weitbrecht's acoustic coupler allowed existing telephone lines to carry the signal.

The TTY network that Weitbrecht enabled would serve deaf communities for decades. According to the National Association of the Deaf, smaller and more compact TTY versions were manufactured throughout the late 1970s and 1980s, eventually leading to state TTY equipment distribution programs.


The Legal Foundation: Disability Becomes a Civil Rights Issue (1968-1973)

The late 1960s and early 1970s saw a fundamental shift in how society viewed disability—from a medical problem to a civil rights issue. Two landmark laws established the legal framework for all future accessibility requirements.

The Architectural Barriers Act of 1968

The Architectural Barriers Act was signed into law by President Lyndon B. Johnson on August 12, 1968. It requires that facilities designed, built, altered, or leased with federal funds be accessible to people with disabilities.

The law's champion was Hugh Gallagher, a wheelchair user who served as an aide to Senator E.L. Bartlett of Alaska. According to the U.S. Access Board's history, Gallagher had experienced firsthand the barriers to access at government buildings, including landmark museums along the National Mall. His vision was simple: "I wanted accessibility to be one of the items on the checklist of designers and builders."

The Access Board's anniversary retrospective notes that Gallagher called the ABA "the first law asserting the civil and constitutional rights of disabled people ever passed anywhere."

Section 504 of the Rehabilitation Act (1973)

Section 504 of the Rehabilitation Act of 1973 was one of the first U.S. federal civil rights laws offering protection for people with disabilities—and it set precedents for the Americans with Disabilities Act seventeen years later.

The law's creation was almost accidental. According to Wikipedia's detailed history, Section 504 brought the language of the Civil Rights Act of 1964 to disability policy. Working behind the scenes on what most believed was a budget-related bill, a staffer added thirty-five words prohibiting discrimination based on disability. This marked a departure from prevailing views that considered disability purely a medical condition.

These laws established the principle that would eventually apply to digital accessibility: equal access is a civil right, not a charitable accommodation.


The Kurzweil Reading Machine: Text-to-Speech Goes Commercial (1975-1976)

The modern era of accessibility technology arguably begins with Ray Kurzweil.

According to the National Inventors Hall of Fame, Kurzweil invented the Kurzweil Reading Machine—the first device to transform print into computer-spoken words. When the machine launched in 1976, it was regarded as the most significant advancement for the blind since Braille's introduction in 1829.

Kurzweil's innovation combined multiple technologies he developed: the first omnifont optical character recognition (OCR) in 1974, which could recognize letters in any typeface, and the first charge-coupled device (CCD) flatbed scanner.

The idea came from a chance encounter. As Kurzweil recounted to the American Foundation for the Blind, a blind man seated next to him on an airplane explained that he needed access to printed materials. Blind scientists from the National Federation of the Blind tested prototypes and provided feedback. The machine was completed in late 1975 and announced on January 13, 1976.

That same day, legendary CBS newscaster Walter Cronkite used the machine to read his signature sign-off on national television. One of the first individual purchasers was musician Stevie Wonder, beginning a long relationship between Kurzweil and the blind community.


The Screen Reader Revolution (1980s)

As personal computers entered homes and offices, a new challenge emerged: how could blind users access graphical interfaces?

IBM Screen Reader: The First Commercial Solution (1984-1986)

According to Knowbility, IBM researcher Jim Thatcher created the first screen reader in 1986. The IBM Screen Reader worked with the text-based DOS operating system and was initially only available within IBM.

The backstory involves personal connection. As detailed in AccessWorld's history of IBM accessibility, Thatcher worked with Dr. Jesse Wright, a blind research mathematician. In the early 1980s, "dumb" terminals connected to mainframe computers were totally inaccessible to blind users. Thatcher and Wright began developing an audio access system for the IBM Personal Computer in 1984.

Wikipedia's screen reader history explains that Wright and Thatcher, both mathematicians at IBM, adapted earlier talking terminal technology into PC-SAID (Personal Computer Synthetic Audio Interface Driver), which was later renamed IBM Screen Reader—the proprietary term that became generic for the entire category of assistive technology.

JAWS: Setting the Stage for Modern Accessibility (Late 1980s)

The screen reader that would eventually dominate the market emerged from a personal need.

According to a historical account, Ted Henter was a motorcycle racing champion who lost his sight in a car accident. When he consulted Florida's Division of Blind Services, a counselor told him computer programming was becoming a popular career for blind people. In 1987, Henter and Rex Skipper founded Henter-Joyce and released the first version of JAWS (Job Access With Speech) for DOS.

The JAWS Wikipedia entry notes that what set JAWS apart from other screen readers was its use of macros that allowed users to customize the interface for different applications. Henter and Skipper released version 2.0 in mid-1990, positioning JAWS for the Windows era that would follow.


Key Lessons from Accessibility History

1. Personal Motivation Drove Innovation

The most impactful accessibility technologies came from people with direct stakes in the outcome:

  • Louis Braille was blind himself
  • John Linvill invented the Optacon for his blind daughter Candy
  • Robert Weitbrecht invented TTY because he was deaf
  • Jim Thatcher collaborated with his blind colleague Jesse Wright
  • Ted Henter created JAWS after losing his own sight

2. War and Crisis Accelerated Research

World War II created political will and federal funding for sensory aids research that might otherwise never have materialized. The blinded veterans of WWII became powerful advocates for accessibility investment.

3. Accessibility Tech Often Pioneered Broader Technologies

The Optacon project established Stanford's integrated circuits laboratory. Speech synthesis research at Haskins Labs laid groundwork for modern text-to-speech. TTY modems anticipated the internet age. Accessibility innovation has consistently been at the cutting edge.

4. The Civil Rights Framework Emerged in the 1960s-70s

The Architectural Barriers Act (1968) and Section 504 (1973) shifted thinking from "medical problem" to "civil rights issue." This conceptual framework underlies all modern digital accessibility laws, including the ADA and WCAG guidelines.


Conclusion: Why This History Matters Today

Understanding the origins of digital accessibility reveals an important truth: accessible technology has never been an afterthought. From Braille's binary dot system in 1824 to the first screen readers in the 1980s, people with disabilities and their allies have consistently been at the forefront of technological innovation.

Today's web accessibility standards, screen readers, and compliance requirements stand on the shoulders of inventors like Louis Braille, Edmund Fournier d'Albe, John Linvill, Robert Weitbrecht, Ray Kurzweil, and Jim Thatcher. Their work reminds us that accessibility isn't just about compliance—it's about ensuring that technology serves everyone.

The next time you encounter an accessibility feature, remember: you're experiencing the result of nearly 200 years of innovation, advocacy, and determination.


Last updated: November 2025
