A visit to Google’s Accessibility Discovery Centre: what I saw, what I learned
In September, I had the opportunity to visit Google’s Accessibility Discovery Centre (ADC) in London. Tucked inside what used to be a server room on the seventh floor of Google’s King’s Cross office, the space has been transformed into something far more meaningful: a living, learning environment dedicated to accessibility and inclusive design.

The ADC isn’t a typical tech showcase. Yes, there are plenty of devices, tools and features on display, from eye gaze systems and gaming controllers to Android and Chrome accessibility settings, but what makes the space stand out is how deeply it considers not just products or users, but people.
That human-centred, inclusive thinking came through in ways both big and small. For example, the team initially provided straws for those who might need them, in itself a detail that is often overlooked. But they quickly learned that paper straws aren’t always ideal; they soften and disintegrate too quickly. Their solution? Pasta straws. Still environmentally friendly, but more durable. A great example of what happens when accessibility and practicality meet.

Google’s mission statement is “to organize the world’s information and make it universally accessible and useful.” That word “universally” isn’t decorative. It’s a guiding principle. At the ADC, it was clear the team takes that principle seriously. The space feels open and inviting, but also adaptable to accommodate the needs of whoever is in the room.
The ADC is divided into zones that reflect different access needs and perspectives:
- Vision – tools like TalkBack, Guided Frame for taking selfies, and magnifiers
- Hearing – Live Caption and transcription services across platforms
- Dexterity – adaptive input devices, switch access, and keyboard remapping
- Cognitive and learning – tools like text-to-speech, simplified layouts, distraction reduction
- Neurodiversity – environmental controls, communication supports, and sensory awareness features
Each area offers a hands-on, interactive experience, thoughtfully designed not to impress, but to invite curiosity, challenge assumptions, and start better conversations. The ADC is a place where those with little knowledge or experience of disability and accessibility can begin their journey, while those with lived experience can share and learn, too.

While it’s a great introduction to accessibility, it’s also a space for practitioners, technologists, and advocates to reflect, refine their thinking, and explore emerging ideas in inclusive design. The centre is used by both external guests and Google staff.
The visit began with a general discussion about accessibility and inclusion, and the importance of recognising that everyone has unique needs. One example involved passing around a braille edition of a Harry Potter book, a small exercise that invites empathy and insight.

Each zone offered something memorable. At the ADC Arcade, visitors could try out switch access and eye gaze systems by playing games. A great resource worth exploring is Everyone Can, a UK charity specialising in accessible gaming technology. They offer assessments, gaming sessions, and custom controller design to support disabled and neurodivergent people.

In the Cognitive and Learning zone, the discussion covered access methods, and the displays also showcased alternative ways to make music and to communicate using Augmentative and Alternative Communication (AAC). I was impressed that AAC was included, as it is often overlooked in mainstream accessibility conversations.

Within the Neurodiversity zone, we explored various environmental adaptations, from leaf-shaped IKEA canopies used to create private workspaces to simple dyslexia-friendly tools. One item that stood out didn’t involve technology at all: a small wearable slider badge called a Social Battery Badge.

It lets the wearer indicate whether they’re open to interaction or would prefer personal space. A quiet, respectful way to let others know how to support you, no explanations needed.

In the Hearing zone, we also got a glimpse of what future AI could mean for signed communication. One highlight was SignGemma, Google’s most advanced model for sign language understanding. Built as part of the open-source Gemma family, it uses multimodal learning to interpret and translate sign languages, starting by translating American Sign Language (ASL) into English.

What makes it particularly exciting is that it isn’t limited to a fixed dataset or static gestures. Its architecture is designed to be extensible, meaning it can be adapted and trained for other sign languages over time. When it becomes publicly available, it will allow developers, researchers, and the Deaf and Hard of Hearing community to build on the model, fine-tune it, and explore new applications from live interpretation to education, captioning, and beyond.
Imagine being able to watch any film or video and have a virtual sign language interpreter appear in real time, powered not by pre-recorded footage, but by a model that understands and translates as it goes.
Reflections
I thoroughly enjoyed my visit to Google’s ADC. More than any single device or feature, what struck me was the attitude: accessibility wasn’t framed as a checkbox or a polished finished product, but as a shared responsibility and an ongoing commitment.
Christopher Patnoe, Google’s EMEA Lead for Accessibility and Disability Inclusion, was once quoted as saying: “When people have equitable access to information and opportunity, everyone wins – but we know people’s needs are constantly changing, throughout their lives or even their day.”

Being in the ADC was a powerful reminder that accessibility is less about tech specs and more about mindset. That awareness, that accessibility isn’t static or limited to a specific context, was present in every corner. It doesn’t always have to mean technology either, and when it does, it’s rarely the shiny parts that matter most. It’s the thoughtful design choices that make a real difference: the option to navigate without using your hands, a camera that guides someone who can’t see the screen, a badge that lets you quietly signal “not today,” captions that follow you across devices, or straws that don’t collapse before the drink is finished.
Some of these choices might appear small at first, but their effect can be profound. They reflect something deeper: an understanding that inclusion starts with respect and empathy, and is sustained through iteration. Progress over perfection.
Throughout the Karten Network, many of us work at the leading edge of these realities, supporting people whose communication, movement, or sensory needs require creativity, compassion, and flexibility. The ADC didn’t offer all the answers, but it reaffirmed many of the questions we ask daily:
- Who’s being excluded?
- How can we support and enable people?
- What can we change so people don’t have to ask?
It also reminded me how important awareness is. So many accessibility features on phones, tablets, browsers, and other devices are already built in. But if people don’t know they’re there, they may as well not exist.
For example, tools like Live Transcribe on Android devices turn spoken words into real-time text, useful not only for people who are Deaf or hard of hearing, but also in noisy environments or for temporary communication barriers.
Microsoft’s Immersive Reader helps reduce visual distractions, reads text aloud, and supports focus and comprehension, particularly valuable for neurodivergent users or anyone with literacy challenges.
And Back Tap on iOS lets users trigger custom shortcuts, like opening the magnifier, launching an AAC app, or turning on VoiceOver, just by tapping the back of the phone.
Features like these can support communication, focus, and independence, but they’re often hidden in settings menus or never switched on. That’s why conversations like these matter. And it’s why I’m always happy to help uncover what’s already possible.
If you’d like to explore Google’s Accessibility Discovery Centre for yourself, there’s a short ADC video tour hosted by Darren Ryden, with examples ranging from LEGO to literacy tools.
As always, I would love to hear how you’re using mobile technology, AI, or assistive tools in your setting. If there’s a topic you’d like covered in the next newsletter, or if you need technical help or advice, please don’t hesitate to contact me.
Martin Pistorius
Karten Network Technology Advisor