The 2025 TestParty Guide to WCAG 1.2.4 – Captions (Live) (Level AA)

TestParty
January 5, 2025

Why did the live stream fail its accessibility audit? Because it was all talk and no captions!

WCAG 1.2.4 requires real-time captions for all live audio content in synchronized media—think webinars, live streams, virtual events, and broadcasts. If someone can't hear your live content, captions ensure they can still follow along. This Level AA criterion is essential for ADA compliance and creates inclusive experiences for deaf and hard-of-hearing users, people in sound-sensitive environments, and non-native speakers.

Table of Contents

  • What WCAG 1.2.4 Requires
  • Why This Matters
  • Quick Implementation Guide
  • Common Mistakes to Avoid
  • How to Test for WCAG 1.2.4
  • How TestParty Helps
  • FAQs

What WCAG 1.2.4 Requires

WCAG 1.2.4 mandates that captions are provided for all live audio content in synchronized media. This means any live video with audio—whether it's a webinar, town hall, product launch, or live-streamed conference—must include real-time captions that accurately convey spoken dialogue and important sound effects.

Key points:

  • "Live" means real-time: This applies to content being broadcast or streamed as it happens, not pre-recorded videos (those fall under WCAG 1.2.2).
  • "Synchronized media" means video + audio: If you're streaming audio-only content (like a podcast), this criterion doesn't apply—but WCAG 1.2.9 (Audio-only, Live) at Level AAA does.
  • Captions must be accurate and synchronized: They should appear in real-time (or near-real-time) and match the spoken content closely enough to be useful.

What's covered:

  • Webinars and virtual meetings
  • Live-streamed events and conferences
  • Broadcast-style content (town halls, product demos)
  • Live social media streams with audio
  • Any other live video content with spoken dialogue

What's not covered:

  • Pre-recorded videos (see WCAG 1.2.2)
  • Audio-only live streams (covered by a different criterion)
  • Two-way video calls where captions would be impractical (though best practice is to offer them when possible)

Why This Matters

Live captions aren't just a compliance checkbox—they're a lifeline for millions of users.

For users: Deaf and hard-of-hearing individuals rely on captions to access live content. Without them, they're excluded from webinars, virtual events, and real-time announcements. Captions also help people in noisy environments, non-native speakers, and anyone who processes information better through reading.

For compliance: WCAG 1.2.4 is a Level AA requirement, which means it's part of the baseline for ADA Title II and Title III compliance, Section 508 for federal agencies, EN 301 549 in the EU, and the European Accessibility Act. Organizations that host live events without captions face legal risk—especially in education, government, healthcare, and finance.

For business: Live captions expand your audience. They make virtual events more inclusive, improve comprehension and retention, and signal that your organization values accessibility. Plus, captions create a searchable text record of your live content, which can be repurposed for SEO and content marketing.

Quick Implementation Guide

Providing live captions requires planning, the right tools, and sometimes human support. Here's how to get started:

1. Choose a captioning method:

  • Automatic Speech Recognition (ASR): Tools like Zoom's live transcription, Google Meet captions, Microsoft Teams live captions, or third-party services like Otter.ai and Rev.com provide real-time captions generated by AI. These are fast and affordable but require monitoring for accuracy.
  • Human captioners (CART): Communication Access Realtime Translation (CART) services provide professional stenographers who caption live events with high accuracy. This is the gold standard for critical events (legal proceedings, medical webinars, etc.) but costs more.
  • Hybrid approach: Use ASR with a human editor monitoring and correcting errors in real-time.

2. Test your captioning setup before going live:

  • Run a dry-run with your captioning tool to check accuracy, timing, and display.
  • Ensure captions are visible and readable (good contrast, appropriate font size).
  • Confirm captions appear in the correct language and don't obscure important visual content.

3. Integrate captions into your streaming platform:

  • Most webinar platforms (Zoom, WebEx, Microsoft Teams, Google Meet) have built-in captioning or support third-party integrations.
  • For custom streams (YouTube Live, Twitch, Facebook Live), use services that support real-time caption overlays or CEA-608/708 closed captioning standards.
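As a concrete illustration of step 3, the markup below sketches one way a custom page can expose captions on an embedded live stream. The stream URL, track file, and element IDs are placeholders, and in practice live captions are often delivered in-band (CEA-608/708 inside the video stream itself) rather than as a separate track file, so treat this as a starting point rather than a universal recipe:

```html
<!-- Illustrative sketch: a live stream player that exposes a captions track.
     URLs and IDs are placeholders, not a real service. -->
<video id="live-player" controls autoplay playsinline>
  <!-- Live HLS stream; in-band CEA-608/708 captions travel inside this stream.
       Note: native HLS support in <video> varies by browser and may need a player library. -->
  <source src="https://example.com/live/stream.m3u8" type="application/x-mpegURL" />
  <!-- Out-of-band WebVTT captions, if your pipeline publishes live VTT segments -->
  <track kind="captions" src="/captions/live.vtt" srclang="en"
         label="English captions" default />
</video>
```

Marking the track with `kind="captions"` (rather than `subtitles`) and `default` signals to browsers and assistive technology that these cues convey dialogue plus important sounds and should be on from the start.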

4. Provide a way to toggle captions on/off:

  • Users should be able to enable or disable captions based on their preference.
  • Ensure captions are on by default or prominently advertised at the start of the event.
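If you build a custom player, the toggle control itself must be accessible. A minimal sketch, assuming a `<video id="live-player">` with one captions `<track>` on the page (the IDs are illustrative):

```html
<!-- Illustrative caption toggle: a native <button> is keyboard-focusable and
     operable by default, and aria-pressed exposes the on/off state to screen readers. -->
<button type="button" id="cc-toggle" aria-pressed="true" aria-label="Toggle captions">
  CC
</button>
<script>
  const btn = document.getElementById('cc-toggle');
  btn.addEventListener('click', () => {
    const track = document.querySelector('#live-player track').track;
    const on = btn.getAttribute('aria-pressed') === 'true';
    track.mode = on ? 'hidden' : 'showing';        // toggle caption rendering
    btn.setAttribute('aria-pressed', String(!on)); // keep state exposed to AT
  });
</script>
```

Using a real `<button>` rather than a styled `<div>` is the key design choice: it gives you keyboard focus and Enter/Space activation for free.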

5. Announce captioning availability:

  • Let attendees know captions are available at the start of your event.
  • Include captioning information in event descriptions and registration pages.

Example: Zoom webinar with live captions

<!-- Event page promoting live captioning -->
<section aria-labelledby="event-accessibility">
  <h2 id="event-accessibility">Accessibility Features</h2>
  <p>
    This webinar will include live captions powered by Zoom's 
    automatic transcription. Attendees can enable captions 
    by clicking the "CC" button in the Zoom toolbar.
  </p>
  <p>
    For CART (human captioning) requests, please contact 
    <a href="mailto:accessibility@example.com">accessibility@example.com</a> 
    at least 5 business days in advance.
  </p>
</section>

Common Mistakes to Avoid

Even with the best intentions, live captioning can go wrong. Here are the most common pitfalls:

1. Relying solely on unchecked ASR captions: Automatic captions are a great starting point, but they struggle with accents, technical jargon, proper nouns, and background noise. Always monitor ASR output during live events and have a plan to correct errors.

2. Forgetting to enable captions: It sounds obvious, but many organizations forget to turn on captioning during live events. Add "enable captions" to your pre-event checklist.

3. Captions that are out of sync: If captions lag too far behind the audio (more than a few seconds), they lose value. Test your setup to minimize latency.

4. Poor caption visibility: Captions that are too small, low-contrast, or obscured by other content are useless. Ensure captions are readable and don't cover critical visual information like speaker slides or sign language interpreters.

5. No backup plan: If your captioning tool fails mid-event, you need a fallback. Have a secondary captioning service or human captioner on standby for high-stakes events.

How to Test for WCAG 1.2.4

Testing live captions requires a mix of planning, observation, and user feedback.

Manual testing checklist:

  • Verify captions are present: Join the live event as a participant and confirm captions are available and enabled.
  • Check caption accuracy: Monitor captions during the event. Are they accurate enough to convey meaning? Do they capture important sound effects or speaker changes?
  • Test caption timing: Are captions synchronized with the audio? Is there noticeable lag?
  • Evaluate readability: Are captions easy to read? Check font size, contrast, and positioning.
  • Confirm user control: Can users toggle captions on/off? Are instructions clear?

What automated tools can't do:

Automated accessibility scanners can't evaluate live captions because doing so requires real-time human observation. You need human testers to assess accuracy, timing, and usability during actual live events.


What automated tools can do:

  • Verify that your streaming platform supports captioning (e.g., checking for caption controls in the player UI).
  • Test the accessibility of caption controls (keyboard access, ARIA labels, focus indicators).
  • Scan event registration and promotional pages to ensure captioning is mentioned.
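As a concrete example of the kind of issue an automated tool can catch, compare these two caption controls (class names and the `toggleCaptions` handler are hypothetical):

```html
<!-- Inaccessible: a clickable <div> is not keyboard-focusable and has no
     accessible name, so keyboard and screen reader users cannot operate it.
     Scanners flag this pattern. -->
<div class="cc-button" onclick="toggleCaptions()">CC</div>

<!-- Accessible: a native <button> is focusable and operable with Enter/Space,
     and aria-label gives it a clear accessible name. -->
<button type="button" class="cc-button" aria-label="Toggle captions"
        onclick="toggleCaptions()">CC</button>
```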

How TestParty Helps

While live captioning happens in real-time and requires human oversight, TestParty helps organizations build accessible live event workflows and ensure their digital properties support captioning best practices.

What TestParty detects:

TestParty scans your event pages, webinar platforms, and video player implementations to identify accessibility issues that impact live captioning:

  • Missing or inaccessible caption controls (e.g., "CC" buttons that aren't keyboard-accessible or lack proper ARIA labels)
  • Video players that don't support captions or lack documentation about captioning features
  • Event registration pages that don't mention captioning availability
  • Embedded live streams that lack caption tracks or fallback options
  • Poor contrast or readability issues in caption display areas

How TestParty suggests fixes:

For issues related to live captioning infrastructure, TestParty provides:

  • Code-level fixes to make caption controls keyboard-accessible and properly labeled (e.g., adding aria-label="Toggle captions" to caption buttons)
  • Guidance on integrating captioning APIs or third-party services into your streaming platform
  • Template updates for event pages to include captioning information and accessibility statements
  • Recommendations for caption display styling (contrast, font size, positioning) that meet WCAG standards
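For players that render WebVTT captions natively, the CSS `::cue` pseudo-element controls how cues are displayed. A minimal readability sketch; the specific values are a reasonable starting point, not a WCAG-mandated standard, and browser support for individual `::cue` properties varies:

```html
<style>
  /* Style WebVTT caption cues rendered by the browser's native <track> support:
     a dark backdrop for contrast and a legible font size. */
  video::cue {
    background-color: rgba(0, 0, 0, 0.85); /* dark backdrop behind caption text */
    color: #ffffff;                        /* white text on dark background */
    font-size: 1.25rem;                    /* large enough to read at a glance */
    line-height: 1.4;
  }
</style>
```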

Developer workflow integration:

TestParty integrates into your CI/CD pipeline and code review process to catch captioning-related issues before they reach production:

  • Pre-merge checks: TestParty flags video players or event components that lack caption support during pull requests, with line-level annotations and suggested fixes.
  • Regression prevention: If a code change breaks caption controls or removes captioning documentation, TestParty alerts your team before deployment.
  • Real-time feedback: Developers receive guidance on accessible video player implementation as they build, reducing the need for post-launch remediation.

Ongoing monitoring:

TestParty continuously monitors your live event infrastructure to ensure captioning remains accessible:

  • Dashboards track the accessibility of caption controls across your video players and streaming platforms.
  • Alerts notify your team if captioning features are removed or degraded during site updates.
  • Audit-ready reports document your captioning capabilities for legal and procurement reviews.

Because live captioning requires real-time human judgment (monitoring ASR accuracy, coordinating CART services, etc.), TestParty focuses on ensuring your technical infrastructure supports captions and that your event pages clearly communicate captioning availability. This shift-left approach helps you build accessible live event workflows from the start, reducing compliance risk and improving the user experience.


FAQs About WCAG 1.2.4

What is WCAG 1.2.4 in plain language?

WCAG 1.2.4 requires live captions for any video content that's being streamed or broadcast in real-time. If you're hosting a webinar, live stream, or virtual event with audio, you need to provide captions so deaf and hard-of-hearing users can follow along.

Is WCAG 1.2.4 required for ADA compliance?

Yes. WCAG 1.2.4 is a Level AA criterion, which is the baseline for ADA compliance in most contexts. Organizations subject to ADA Title II or Title III (government entities, places of public accommodation) must provide captions for their live video content to avoid legal risk.

Can I use automatic captions to meet WCAG 1.2.4?

Automatic captions (ASR) can meet WCAG 1.2.4 if they're accurate enough to convey meaning. However, ASR often struggles with accents, jargon, and background noise, so you should monitor and correct errors in real-time. For high-stakes events, consider professional CART services.

Do I need captions for pre-recorded videos?

Yes, but under WCAG 1.2.2 (Captions, Prerecorded) rather than 1.2.4. Pre-recorded content should have captions added during post-production, which allows for higher accuracy and better synchronization than live captioning.

What's the difference between captions and subtitles?

Captions include both spoken dialogue and important sound effects (e.g., "[applause]" or "[door slams]"), making them essential for deaf and hard-of-hearing users. Subtitles typically only include dialogue and are designed for viewers who can hear the audio but may not understand the language.


Some TestParty features described in this article are currently under development. Visit TestParty.ai to learn more about our current capabilities and roadmap, or book a demo at TestParty.ai/book-a-demo to see TestParty in action.

Disclaimer: Some of this article was generated with Large Language Models (LLMs) and Artificial Intelligence (AI). There may be some errors and we advise you to consult with human professionals for detailed questions.
