Google's GenTabs Just Changed the Game: Why Digital Accessibility Is Now Your AI Visibility Strategy

Michael Bervell
December 17, 2025

How the accessibility tree became the interface for AI — and what e-commerce brands need to understand right now


Google just launched something called GenTabs, and honestly, it's one of the most significant shifts I've seen in how AI interacts with the web.

But here's what's fascinating: almost nobody is talking about the real implication.

GenTabs doesn't just browse websites. It parses them. It extracts meaning from their underlying structure. And that process? It's nearly identical to how assistive technologies like screen readers have worked for decades.

If you're running an e-commerce store — whether you're on Shopify, BigCommerce, or a custom platform — this matters more than you might think. Because the same accessibility issues that make your site difficult for people with disabilities are now making your site invisible to AI.

Let me walk you through exactly what's happening under the hood, why it matters, and what you can actually do about it.


First, What Is GenTabs and Why Should You Care?

GenTabs is the first feature inside Google's new experimental browser called Disco (short for "discovery"). Built with Gemini 3, Google's most advanced AI model, GenTabs does something genuinely novel.

Here's how Google describes it:

"GenTabs helps you navigate the web by proactively understanding your complex tasks (through your open tabs and chat history) and creating interactive web applications to help you complete the tasks."


In plain English: you're researching a trip to Japan across multiple browser tabs. GenTabs reads all those tabs, understands what you're trying to do, and builds you a custom trip planning app on the fly. No coding required. Just natural language.

Planning meals for the week? It'll build you a meal planner from the recipe sites you're browsing. Studying for an exam? It might generate flashcards or a visualization tool.

The Verge's coverage calls it "Googling meets vibe coding," and that's pretty accurate.

But here's the part that caught my attention, buried in Google's announcement:

"Because every generative element ties back to the web, it always links to the original sources."


GenTabs doesn't just summarize content. It cites it. It pulls from it. It builds with it.

Which means your website either becomes source material for these AI-generated apps... or it doesn't.

And what determines whether GenTabs can actually use your content?

The same thing that determines whether a screen reader can use it.


The DOM, the Accessibility Tree, and Why They Matter for AI

Okay, let's get a little technical here — but I promise this is the key to understanding everything.

When you visit a website, your browser doesn't just display a page. It builds something called the Document Object Model, or DOM. The DOM is essentially a tree structure that represents every element on the page — every heading, paragraph, image, link, button, and form field — along with their relationships to each other.

Think of it like the skeleton of your webpage. The visual design (colors, fonts, layout) is the skin. The DOM is the bones.

Now here's where it gets interesting.

Browsers also build something called the accessibility tree. This is a simplified, semantic version of the DOM that strips away visual styling and focuses purely on meaning and structure. It's what assistive technologies — like screen readers, voice control software, and switch devices — use to help people with disabilities navigate the web.

As MDN Web Docs explains:

"Browsers convert markup into an internal representation called the DOM tree... Browsers then create an accessibility tree based on the DOM tree, which is used by platform-specific Accessibility APIs to provide a representation that can be understood by assistive technologies."


The accessibility tree answers questions like:

  • What is this element? (A button? A heading? A link?)
  • What is it called? (The accessible name)
  • What state is it in? (Expanded? Checked? Disabled?)
  • How does it relate to other elements?

If your HTML is semantic and well-structured, the accessibility tree is rich and meaningful. If your HTML is a mess of <div> tags styled to look like buttons and headings, the accessibility tree is sparse and confusing.
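
Here's a minimal sketch of that difference (the addToCart handler and class names are hypothetical). All three render as a clickable "Add to cart" control, but they expose very different things to the accessibility tree:

```html
<!-- Semantic: exposed as role "button" with accessible name "Add to cart",
     focusable and activatable by mouse, keyboard, and assistive tech. -->
<button type="button" onclick="addToCart()">Add to cart</button>

<!-- Non-semantic: exposed as a generic container holding text.
     No role, no keyboard focus, no Enter/Space activation. -->
<div class="btn" onclick="addToCart()">Add to cart</div>

<!-- Retrofitted: this is the minimum needed to approximate a real
     button, and it is easy to get wrong or leave incomplete. -->
<div class="btn" role="button" tabindex="0" onclick="addToCart()"
     onkeydown="if (event.key === 'Enter' || event.key === ' ') addToCart()">
  Add to cart
</div>
```

The real button element gets all of that behavior for free; the retrofit has to recreate it by hand, and anything it misses is invisible to both assistive technology and DOM-based agents.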

Here's the connection to AI:

AI web agents — including the technology powering GenTabs — parse the DOM in remarkably similar ways.

A 2022 research paper from Google titled "Understanding HTML with Large Language Models" found that LLMs pretrained on natural language transfer remarkably well to HTML understanding tasks. In fact, fine-tuned LLMs were 12% more accurate at semantic classification when pages used proper semantic HTML structure.

And a recent academic study on AI web agents put it even more directly:

"DOM-based agents, much like screen reader users, rely heavily on the semantic structure of web content to perceive and act. Where humans use vision to interpret visual layout and cues, these agents depend on HTML and ARIA attributes to infer purpose and interactivity."


Let that sink in for a moment.

The same structural cues that help a blind person navigate your website are what help AI understand and use your content.


The Technical Overlaps: Where Accessibility and AI Meet

This isn't just a philosophical connection. There are specific, technical areas where accessibility best practices directly impact AI readability.

Let me walk through the big ones.

1. Reading Order: Code Order vs. Visual Order

Here's something most website owners don't realize: AI agents read your page in DOM order, not visual order.

If you've used CSS Flexbox with flex-direction: row-reverse or CSS Grid to visually reposition elements, the visual layout might look perfectly logical to a sighted user. But under the hood, the code order — which is what screen readers and AI agents follow — could be completely jumbled.

WebAIM explains the problem:

"Since the DOM order is what determines the screen reader reading order (and the tab order, in the case of active elements), the elements will be announced in the opposite order from the visual reading order."


This is a WCAG 2.2 Level A requirement — Success Criterion 1.3.2: Meaningful Sequence. It states that "when the sequence in which content is presented affects its meaning, a correct reading sequence can be programmatically determined."

For AI agents like GenTabs, this matters enormously. If your product descriptions, pricing information, or key selling points are out of sequence in the DOM, the AI will extract and synthesize that information in a confusing order.

Real example: Imagine an e-commerce product page where the price appears visually next to the product title, but in the code, the price <div> is actually positioned at the bottom of the page and CSS-floated to appear at the top. A screen reader (and an AI agent) would read the title, then all the product details, then finally the price — completely out of context.
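
A contrived sketch of that page (the product, copy, and prices are invented) shows the mismatch:

```html
<!-- Visually, the price sits beside the title at the top of the page.
     In DOM order, which screen readers and DOM-based agents follow,
     it comes dead last: title, description, details, then "$249". -->
<h1>Acme Wireless Headphones</h1>
<p>Industry-leading noise cancellation and a 30-hour battery.</p>
<ul>
  <li>Bluetooth 5.3</li>
  <li>Multipoint pairing</li>
</ul>
<div class="price" style="float: right; margin-top: -12rem;">$249</div>
```

Moving the price element up to sit directly after the h1 in the source fixes the reading order without changing the visual design.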

2. Semantic HTML: Meaning Over Appearance

Semantic HTML means using HTML elements for their intended purpose. A <button> for clickable actions. A <nav> for navigation. <h1> through <h6> for headings in proper hierarchy. <table> with <th> for tabular data.

Why does this matter?

Because semantic elements carry implicit meaning that both assistive technologies and AI can understand without additional explanation.

When you use a <button> element, browsers automatically know it's interactive, focusable, and activatable with both click and keyboard. When you use a <div> styled to look like a button, you have to manually add all that functionality — and you often miss something.

The W3C's guidance on WCAG 1.3.1 (Info and Relationships) is clear:

"Information, structure, and relationships conveyed through presentation can be programmatically determined or are available in text."


That phrase — programmatically determined — is the key. It means software (whether a screen reader or an AI model) can extract the meaning without relying on visual cues.

Research on LLM web agents confirms this. Tools that convert DOM to formats optimized for LLMs specifically preserve "semantic structure of web content" because it's what enables accurate AI interpretation.

Real example: If your Shopify store uses a custom-built product comparison table, but it's constructed with <div> elements styled as a grid rather than actual <table>, <th>, and <td> elements, an AI agent will see a jumbled stream of text. It won't understand that "256GB" belongs to "Storage" and "$999" belongs to "Price." The relationships are lost.
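
Here's roughly what that looks like in markup (product names and values are invented). The first version throws away the row and column relationships; the second preserves them:

```html
<!-- Div grid: an agent sees "Model Storage Price Model X 256GB $999"
     as a flat stream of text with no associations between cells. -->
<div class="compare-grid">
  <div>Model</div><div>Storage</div><div>Price</div>
  <div>Model X</div><div>256GB</div><div>$999</div>
</div>

<!-- Real table: each data cell is programmatically tied to its
     column and row headers. -->
<table>
  <caption>Model comparison</caption>
  <thead>
    <tr><th scope="col">Model</th><th scope="col">Storage</th><th scope="col">Price</th></tr>
  </thead>
  <tbody>
    <tr><th scope="row">Model X</th><td>256GB</td><td>$999</td></tr>
  </tbody>
</table>
```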

3. Descriptive Link Text: Context for Citations

GenTabs builds apps that "link to the original sources." That means it needs to understand what your links actually point to.

Links with text like "Click here," "Read more," or "Learn more" are accessibility failures because they provide no context when read in isolation. Screen reader users often navigate by pulling up a list of all links on a page — and a list of fifteen "Read more" links tells them nothing.

Section508.gov's accessibility guidance puts it plainly:

"Ensure link text is meaningful and descriptive, indicating the purpose or destination of the link. Avoid using vague terms like 'click here' or 'read more.'"


For AI, the same principle applies. When GenTabs is deciding which sources to cite or pull from, descriptive link text helps it understand the relevance and context of each link.

Real example: "Download our WCAG 2.2 compliance checklist" is infinitely more useful to both screen readers and AI than "Click here to download."
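
In markup (the URL is hypothetical), the fix is nothing more than the words inside the anchor:

```html
<!-- Vague: meaningless when read in isolation or pulled into
     a list of links -->
<a href="/resources/wcag-checklist.pdf">Click here</a> to get the checklist.

<!-- Descriptive: carries its purpose wherever it is quoted,
     listed, or cited -->
<a href="/resources/wcag-checklist.pdf">Download our WCAG 2.2 compliance checklist</a>
```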

4. Heading Hierarchy: The Content Outline

Headings (<h1> through <h6>) create a hierarchical outline of your content. They're not just visual styling — they're structural markers that indicate the organization of information.

Screen reader users rely heavily on headings to navigate. They can jump from heading to heading, quickly scanning the structure of a page without having to read every word.

AI agents use headings similarly — to "chunk" content into logical sections for processing and synthesis.

WCAG Success Criterion 2.4.6 requires that "headings and labels describe topic or purpose."

And research on generative AI and accessibility notes:

"Logical headings (H1–H4) and semantic regions (e.g., <main>, <nav>, <footer>) facilitate screen reader navigation and enable retrieval systems to identify self-contained sections."


Real example: If your product category page has fifty products but no heading structure — just styled text that looks like headings — an AI agent will struggle to understand where one product ends and another begins. It sees a wall of text, not organized listings.
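
A category page with real structure might be outlined like this (product names are placeholders):

```html
<main>
  <h1>Wireless Headphones</h1>      <!-- one h1: the page topic -->
  <section>
    <h2>Acme Pro ANC</h2>           <!-- each product starts a new h2 -->
    <h3>Key features</h3>
    <h3>Pricing and availability</h3>
  </section>
  <section>
    <h2>Acme Lite</h2>
    <!-- ... -->
  </section>
</main>
```

An AI agent, like a screen reader user, can now treat each h2 as the start of a self-contained product listing instead of guessing where one ends and the next begins.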

5. Alt Text: Describing the Visual

Alt text provides text alternatives for images. For screen reader users, it's how they understand images they can't see. For AI, it's additional context that becomes part of the semantic meaning of your page.

When GenTabs synthesizes content from your website, images without alt text are gaps in understanding. The AI knows an image exists, but has no idea what it contains or why it matters.

WebAIM's alt text guide explains that alt text should be:

  • Accurate and equivalent
  • Succinct
  • Not redundant
  • Not using phrases like "image of..." or "graphic of..."

For e-commerce, this is particularly important. Product images with empty or generic alt text ("product-image-1.jpg") fail both accessibility and AI readability tests.

Real example: An AI building a meal planning app from your recipe site will be far more effective if your images have alt text like "Finished chicken parmesan with melted mozzarella and fresh basil garnish" rather than "IMG_4523.jpg."
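
In markup terms (filenames invented), the difference looks like this:

```html
<!-- A gap in understanding: the agent knows an image exists and
     nothing else. (Empty alt is correct only for purely decorative
     images, never for product or content photos.) -->
<img src="IMG_4523.jpg" alt="">

<!-- Real context that becomes part of the page's semantic meaning -->
<img src="chicken-parmesan.jpg"
     alt="Finished chicken parmesan with melted mozzarella and fresh basil garnish">
```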


The Data: Accessibility Drives Discoverability

This isn't just theoretical. There's hard data showing that accessible websites perform better in search — and by extension, in AI-powered discovery.

A 2025 study by Semrush analyzed 10,000 websites and found:

  • Websites with higher accessibility scores saw a 23% average increase in organic traffic
  • Accessible sites ranked for 27% more organic keywords
  • Accessibility correlated with 19% higher Authority Scores

The researchers concluded:

"Accessibility is no longer optional. The evidence is clear: making your website more inclusive can also make it more discoverable."


Meanwhile, the 2025 WebAIM Million report — an annual accessibility audit of the top one million websites — found that 94.8% of homepages had detectable WCAG failures, with an average of 51 accessibility errors per page.

The six most common failures? Low contrast text, missing alt text, empty links, missing form labels, empty buttons, and missing document language. All fundamental issues. All easily fixable.

And all directly impacting how AI systems can parse and use that content.


The Bigger Picture: AIO and the Future of Discovery

We're entering what some are calling the era of AIO — AI Optimization.

Just as SEO (Search Engine Optimization) emerged when Google became the dominant way people found information, AIO is emerging as AI chatbots, AI search (like Google's AI Overviews), and AI agents (like GenTabs) become primary interfaces for discovery.

BrightEdge's analysis of structured data in AI search puts it well:

"Structured data is becoming part of the semantic layer that underpins AI. As generative models demand verifiable facts, clear schema provides the grounding they need... Schema turns your site into a machine-readable knowledge graph, and future AI tools will rely on that graph to answer questions accurately."


Google's own AI Overviews — the AI-generated summaries that appear at the top of many search results — pull from websites that are clearly structured and semantically marked up.

Google's official guidance on performing well in AI search features emphasizes:

"Structured data is useful for sharing information about your content in a machine-readable way that our systems consider and makes pages eligible for certain search features and rich results."


The common thread across all of this? Machine readability. The ability for software — whether a screen reader, a search crawler, or an AI agent — to extract meaning from your content.

And the standard that's been defining machine readability for over two decades? WCAG — the Web Content Accessibility Guidelines.


What This Means for E-Commerce Brands

If you're running an online store, here's the bottom line:

Digital accessibility isn't just about compliance or doing the right thing (though it's both of those). It's now directly tied to your visibility in AI-powered discovery.

Every accessibility issue on your site is a potential barrier to AI understanding:

  • Missing alt text on product images = AI can't describe or recommend your products accurately
  • Non-semantic HTML = AI struggles to understand your page structure and content relationships
  • Poor heading hierarchy = AI can't chunk your content into meaningful sections
  • Generic link text = AI can't contextualize your internal and external references
  • Broken reading order = AI extracts and synthesizes information in the wrong sequence

At TestParty, we've fixed over 270,000 accessibility issues for brands in the past year alone. And what we're seeing is that the same fixes that make sites usable for people with disabilities are the fixes that make sites understandable for AI.

This is the insight I keep coming back to:

The accessibility tree IS the AI interface.

The same semantic structure that helps a blind user navigate your Shopify store is what helps GenTabs build apps from your content.

The same descriptive link text that helps a screen reader user understand your navigation is what helps AI cite your sources correctly.

The same logical heading hierarchy that helps someone using voice control jump to the right section is what helps AI chunk your content for synthesis.

Accessibility and AI readability aren't parallel tracks. They're the same track.


What You Can Do Right Now

If you want your e-commerce site to be ready for GenTabs and the broader shift toward AI-powered discovery, start here:

  1. Audit your semantic HTML. Are you using actual <button>, <nav>, <main>, <header>, and <footer> elements? Or are you faking it with styled <div>s?
  2. Check your heading hierarchy. Do you have one <h1> per page? Do your <h2>s and <h3>s follow a logical structure without skipping levels?
  3. Review your alt text. Do all product images have descriptive, accurate alt text? Not "product image" — actual descriptions.
  4. Fix your link text. Search your site for "click here" and "read more." Replace with descriptive alternatives.
  5. Verify reading order. Disable CSS and check that your page still makes sense in source order; that's the order screen readers and DOM-based AI agents read.
  6. Add structured data. Implement schema.org markup (Product, FAQ, HowTo, etc.) to give AI explicit context about your content.
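
For step 6, a minimal Product snippet in schema.org's JSON-LD format looks like this (the product, image URL, and price are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Acme Wireless Headphones",
  "image": "https://example.com/images/acme-headphones.jpg",
  "description": "Over-ear wireless headphones with active noise cancellation and a 30-hour battery.",
  "offers": {
    "@type": "Offer",
    "price": "249.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Placed in the page's head or body, this gives crawlers and AI systems explicit, unambiguous facts to ground on, independent of how the visible page is styled.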

Or, if you want to skip the manual audit, tools like TestParty's accessibility scanner can identify these issues automatically and fix them at the source code level — not with overlay widgets that don't actually solve the underlying problems.


The Bottom Line

Google's GenTabs is just the beginning. As AI becomes a primary interface for how people discover, research, and interact with the web, the websites that win will be the ones AI can actually understand.

And the blueprint for AI-understandable websites? It's been around since 1999. It's called WCAG.

The brands that figure this out now — that treat digital accessibility as a visibility strategy, not just a compliance checkbox — will show up everywhere AI looks.

The rest will wonder why they're invisible.


Want to see how your site scores for both accessibility and AI readability? Get a free accessibility audit from TestParty and find out what's actually blocking your visibility.
