Intelligent virtual assistant accessibility

Not long ago, the idea of interacting with a computer by speaking was the stuff of science fiction. Now we say “Hey Google…”, “Siri…”, and “Alexa…” without giving it a second thought.
While intelligent virtual assistants (IVAs) are still maturing, they already offer an interface to many who would otherwise find traditional computer interfaces difficult to use. However, for some, accessing virtual assistants remains challenging. Thankfully, built-in accessibility features may make this easier. As of iOS 11, for example, you can type to Siri rather than speak.

Google Home accessibility features are largely dependent on the device. On mobile devices, the app relies on Android’s accessibility features. On Google Nest smart speakers and displays, accessibility features are controlled through the Google Home app. To access these features, ensure your mobile device is connected to the same Wi-Fi network as your smart speaker or display, open the Google Home app, tap your speaker or smart display, then tap “Device settings”, followed by “Accessibility”. Currently the options are limited, mainly to additional audio feedback and cues. For smart displays, in addition to auditory options including closed captioning, it is possible to adjust the colours and the amount of contrast, as well as magnify the screen.
Amazon’s Alexa has a large number of accessibility features, some of which, as with Google Home, are device specific. These features can be accessed either through the Alexa app or directly on the device. They include audio instructions for setting up Amazon Alexa devices; customisable sound cues; adjustable text size and contrast; screen reader support for the Alexa app; keyboard navigation in the app and on some Alexa devices; screen magnification; and an adjustable rate at which Alexa speaks.

The “wake word” can be changed, although this is currently limited to four options – “Alexa,” “Amazon,” “Echo,” and “Computer.”
On supported devices (e.g. the Amazon Echo Show) you can interact with Alexa without speaking, including typing during video calls made on the device. The Real Time Text (RTT) feature adds a live, real-time chat feed during calls and “Drop Ins”. When RTT is enabled, a keyboard pops up on the screen (external Bluetooth keyboards are also supported), enabling you to type text that appears in real time on both parties’ screens.
Ongoing efforts promise to expand access to virtual assistants for people with disabilities. Google recently announced a partnership with Tobii Dynavox to integrate Google’s virtual assistant into Tobii Dynavox augmentative and alternative communication devices.
The Karten Network is excited to be a partner in the European Union-funded Nuvoic Project, led by specialist app developer Voiceitt, to further develop the Voiceitt app. The app is designed to translate impaired or unclear (‘dysarthric’) speech into intelligible speech, as well as to control other voice-driven technologies such as virtual assistants (see the Nuvoic Project article for more information).
While privacy and data protection concerns exist, intelligent virtual assistants are here to stay and have the potential to make all our lives, particularly those of people with disabilities, a little easier.
As always, I am interested to hear about how you are using mobile and other smart technology. I am also available to support and help where I can.
Martin Pistorius
Karten Network Mobile Technology Advisor
Article meta data
- Featured in the Karten Autumn 2020 Newsletter
- This article is listed in the following subject areas: Technology, Update from Technology Advisor
