Karten Centres have been invited to be involved in an exciting project assessing the inclusive design and functionality of Westfield’s Connected Autonomous Vehicles. Four of our Centres are currently involved in the project. Due to the lockdown, involvement is limited to remote participation and virtual reality, but once restrictions are eased Westfield will be dropping vehicles off at Karten Centres and participants will have the opportunity to experience the vehicles first-hand. This is a fabulous opportunity for people with disabilities to input into the design process. Take a look at Westfield’s Connected Autonomous Vehicles website and the YouTube clip.
Our Summer 2020 newsletter featured Derwen College Industry Champion, Neil Bevan, and the work his company had been doing with Derwen College in the development of their award-winning ‘Support Work’ mobile apps.
The lessons learned, the success of the project and the collaboration with the college, which includes the ‘Working in a Printshop’ app used in Derwen’s Karten Printshop, prompted Neil to separate out the app development side of his design and marketing business to form a new company – Starfish Labs Ltd.
Starfish Labs has recently launched the first of a suite of apps to support people with special educational needs and disabilities to understand the changes in society and regulations due to the ongoing Coronavirus pandemic.
The app, ‘COVID-19: Staying Safe’, has been part-funded by the Welsh Government’s rapid-response Covid-19 RD&I scheme. It is unique in allowing carers, parents or teachers to customise the content using their own photographs, video and spoken word, with simple steps to make the wearing of masks, hand washing and lockdown restrictions familiar and relevant to the user’s own environment and to the changing regional and local lockdown rules. Generic content is available in English and Welsh. You can find out more about how the apps can be customised, and find a link to download, at: https://starfishlabs.co.uk/products/
A further two apps, ‘COVID-19: Social Space’, covering social distancing and the concept of support bubbles etc., and ‘COVID-19: Happy&Healthy’, covering general health and wellbeing, are due for launch during February.
The first app is available now for iPhone and iPad from the App Store, and will shortly be available for Android on Google Play. The additional apps will be available on both platforms as soon as they are published. The cost of the app is £3.99 and proceeds will be re-invested in further projects to support vulnerable people.
The apps were created in collaboration with Derwen College, working with their tutors on content, and feature Derwen clients (many of whom are former students) demonstrating the correct use of face masks, how to wash your hands, the use of hand sanitiser and other aspects of Covid-safe support through videos and photographs, along with cartoon images and Makaton symbols.
Teaming up with two additional directors, Neil and Kirsten Bevan established the new tech startup at Aberystwyth University Innovation & Enterprise Centre, to take advantage of the R&D collaboration opportunities with the University. The company is specifically focussed on developing apps for the SEND sector, supporting people with training apps that improve their lives, and on building relationships that are financially beneficial or provide enhanced learning and enrichment opportunities for the people and organisations with whom it works.
The company has already created employment for two graduates and is about to employ a further developer, having recently won a significant export project in the SEND sector in the UAE.
Starfish Labs Director, Neil Bevan, says, “As the Covid-19 pandemic is still very much with us, the ongoing changes in lockdown rules and guidance in different parts of the UK are confusing for many people, and especially for those with learning difficulties.”
“So much of the existing guidance doesn’t really mean very much to someone who has autism or who doesn’t recognise that generic images of hand washing or face coverings have any meaning in their own life. We recognised that a suite of apps could help to simplify the guidance – breaking instructions down into understandable short sequences – and that the ability for people to use their own photos or videos of their own masks, their own washbasin, and their own local environment would make the rules much more relevant.”
Neil continues, “We are grateful to Welsh Government for supporting Starfish Labs in funding the development of the apps, and we’re proud to be developing them in Welsh, as well as English, to support vulnerable people in Wales who may have Welsh as their first language. We’re also really pleased to be working with Derwen College again on this project.”
Starfish Labs is also looking to develop versions of the apps in other languages, such as Urdu and Punjabi.
The Karten Network has continued working on the Nuvoic Project, our collaboration with specialist app developer Voiceitt, which aims to improve access to speech recognition technology for people who have speech difficulties. Voiceitt’s app supports people who want to communicate using their own voice but have difficulty being understood by unfamiliar people, or who want to use their voice for smart home control but can’t access mainstream technologies. The Karten Network is leading on user involvement and testing in the UK.
Updates
Since our last newsletter, we’ve been working with several of our partner organisations to recruit individuals to take part in the project, and we’re delighted that we now have participants from Beaumont College in Lancaster, Cedar Foundation in Northern Ireland, Enable Ireland, Hill House care home in Sandbach (Leonard Cheshire), Homefield College in Leicestershire, National Star in Cheltenham, The Grange Centre and Young Epilepsy in Surrey, as well as some individuals who’ve contacted us directly. We’d like to say a huge thank you to all of our participants and partner organisations for your contribution so far, especially in such challenging circumstances!
In December, Voiceitt announced their collaboration with Amazon to make Alexa accessible for users with impaired speech, and in January were awarded a ‘Best of Innovation’ award in the Accessibility category by the Consumer Technology Association. Congratulations Voiceitt!
We’ve also recently published our new Nuvoic project web pages; check these out for information and updates about the project.
Get involved!
We’re still recruiting participants to test the Voiceitt app, especially the new Smart Home mode which gives integrated control of an Amazon Echo smart speaker. We’re also looking to recruit people with impaired speech who are willing to donate voice recordings to help develop Voiceitt’s technology.
We’re keen to work with new partner organisations who support people with impaired speech, and we have funding available to reimburse organisations for time spent supporting the project. We can also work directly with individuals, family members and carers to support participation, and we offer vouchers, as well as free use of Voiceitt during participation and for six months afterwards, to thank participants for their contribution. Please see our project web pages for details of what’s involved.
Get in touch!
We would love to hear from you if you, your organisation or someone you know may be interested in taking part, or if you’d like more information. Please email our project co-ordinator: liz@karten-network.org.uk, or you can find more information and get in touch via our project web pages.
Every Thursday at 4.15pm the EdTech Demonstrator Programme will be delivering short webinars (between 20 and 40 minutes) focusing on the best ways to integrate accessible technology into your classrooms – enabling you to improve the outcomes for all of your learners.
Accessible technology means technology that allows full access to digital content, whatever your needs. As a result, these webinars are for all educators, as we all know that mainstream classes include a range of students with needs, diagnosed or not. An accessible classroom makes learning easier for everyone.
What’s more, on the last Thursday of the month we will open up the end of the webinar as a “surgery” where you can get help with student issues around accessible tech. The first theme is access, where we cover the main tools learners can use to break through accessibility barriers.
Introduction
Meet the team of specialists, find out how best to access the programme and get some tips on getting September started right. Watch on YouTube.
Text-to-Speech
An introduction to one of the most useful tools to help learners access written text. We will cover options on different platforms and those extra features that can make all the difference. Watch on YouTube below.
Early Switch Skills
Taking a look at alternative access to computers/communication aids using switches. It will provide a step-by-step guide from the assessment process to early switch skills such as awareness of cause and effect. It will explore a range of switch-accessible software, websites, and activities. Watch on YouTube.
Dictation
Using your voice to write can be transformative for learners and is a useful life skill for everyone! We will cover the main tools and give options for the classroom, mobile devices, and exams. Watch on YouTube below.
Vision
Changing visual options can make resources accessible. We will go through the main software and features that can help your learners see the work in the best possible way. Watch on YouTube below.
Physical Equipment
How do we work out what equipment learners need? What is out there and where do we buy it from? We can help you with these questions and more. Stay on at the end for our “Drop-in surgery”. Register for the drop-in surgery.
More webinars to follow.
Video Calls. Thursday 8 Oct 2020 at 4.15pm
Never before have we spent so long in video calls! But do we know how to make these accessible and do we know which platform to choose to suit our students’ needs? We will look at how to make your calls on Zoom, Teams and Hangouts more accessible with captions, transcripts, accessible resources, shared folders and more. Register for the Video Calls webinar on Zoom.
Remote Therapy. Thursday 15 Oct 2020 at 4.15pm
There are some excellent resources and tips on delivering remote therapy; we’ll review these and help you to feel comfortable delivering therapy in this manner. Designed with Speech and Language Therapists in mind, but with guidance that should help all therapists. Register for the Remote Therapy webinar on Zoom.
Blended / Flipped Learning. Thursday 22 Oct 2020 at 4.15pm
Blended learning is an approach to education that combines online educational materials and opportunities for interaction online with traditional place-based classroom methods. Flipped learning has been described as “school work at home and home work at school.” They are both effective when done correctly and can improve the differentiation, accessibility and engagement of learners. Register for the Blended / Flipped Learning webinar on Zoom.
Visual Learning. Thursday 5 Nov 2020 at 4.15pm
Mind-mapping, social stories, essay planning, symbols and timetables. We will discuss the merits of each, provide you with solutions (PC and mobile) and give you some new tools to deliver your sessions more inclusively. Register for the Visual Learning webinar on Zoom.
It wasn’t that long ago that the idea of talking to and interacting with a computer by speaking was the stuff of science fiction. Now we “Hey Google…”, “Siri…”, and “Alexa…” without giving it a second thought.
While intelligent virtual assistants (IVAs) are still maturing, they already offer an interface to many who would otherwise find traditional computer interfaces difficult to use. However, for some, accessing virtual assistants is still challenging. Thankfully, built-in accessibility features may make this easier. As of iOS 11 you are able to type rather than speak to Siri.
Google Home accessibility features are largely dependent on the device. On mobile devices, the app relies on Android’s accessibility features. On Google Nest smart speakers and displays, accessibility features are controlled through the Google Home app. To access these features, ensure your mobile device is connected to the same Wi-Fi network as your smart speaker or display. Open the Google Home app, tap your speaker or Smart Display, then tap “Device settings”, then “Accessibility”. Currently the options are limited; they mainly include additional audio feedback and cues. For smart displays, in addition to auditory options including closed captioning, it is possible to adjust the colours and the amount of contrast, as well as magnify the screen.
Amazon’s Alexa has a large number of accessibility features, some of which, as with Google Home, are device-specific. These features can be accessed either through the Alexa app or directly through the device. They include audio instructions for configuration of Amazon Alexa devices; customisable sound cues; adjustable text size and contrast; screen reader support for the Alexa app; support for keyboard navigation in the app and on some Alexa devices; and screen magnification. The rate at which Alexa speaks can also be adjusted.
The “wake word” can be changed, although this is currently limited to four options – “Alexa,” “Amazon,” “Echo,” and “Computer.”
On supported devices (e.g. the Amazon Echo Show) you can interact with Alexa without speaking. This includes using a keyboard during video calls made using the supported device. The Real Time Text (RTT) feature adds a live, real-time chat feed during calls and “Drop Ins”. When RTT is enabled, a keyboard pops up on the screen (external Bluetooth keyboards are also supported), enabling you to type text which appears in real time on both parties’ screens.
Ongoing efforts promise to expand access to virtual assistants for people with disabilities. Google recently announced a partnership with Tobii Dynavox to integrate Google’s virtual assistant into Tobii Dynavox augmentative and alternative communication devices.
The Karten Network is excited to be a partner in the European Union-funded Nuvoic Project, led by specialist app developer Voiceitt to further develop the Voiceitt app. The app is designed to translate impaired or unclear (‘dysarthric’) speech into intelligible speech, as well as to control other voice-driven technologies such as virtual assistants (see the Nuvoic project article for more information).
While privacy and data protection concerns exist, intelligent virtual assistants are here to stay and have the potential to make all our lives, particularly those of people with disabilities, a little easier.
As always, I am interested to hear about how you are using mobile and other smart technology. I am also available to support and help where I can.
Martin Pistorius, Karten Network Mobile Technology Advisor
by Derwen College ‘industry champion’ and project manager Neil Bevan, owner of design and app development business Hunter Bevan Ltd
Hunter Bevan Ltd is working in partnership with Derwen College on a three-year project to support young people with special educational needs and disabilities (SEND) to gain supported work placements as part of their learning journey. For some students this may ultimately lead to paid employment, and the added independence and sense of achievement that brings. Our involvement in the project is in the development of a suite of iOS- and Android-compatible apps to support training through an accessible, understandable platform.
As Project Manager, it is interesting to interface between the needs of the educational sector and the technical delivery of the desired outcomes from the apps. The challenge is how we deliver those outcomes through technology and mobile devices for training in a way that is understandable, easy to use and accessible to students, using simple design and specific content such as Makaton symbols, so that the apps benefit staff and students as a teaching and learning aid.
In ‘mainstream’ commercial training apps we might take a totally different approach to that of working in a specialist college setting. Some of the things we are learning through working with the college are shaping the future development of apps in this sector – such as the example of fingerprint recognition patterns being different in students with Down syndrome to those of the wider community, meaning that using that method of recognising an individual may be difficult using current technology. These subtle differences are informing future accessibility to the apps we are developing in relation to security and safeguarding issues. We are also exploring the use of different learning methods – using video and spoken word, Makaton and written words, and a series of pictures and written words, to learn which approach different types of students prefer for their own personal learning. One thing that is definite is that the use of technology is being welcomed by the students and they enjoy using the apps in their learning. This is demonstrated by students requesting to continue to use the apps on their personal devices after leaving the college!
We have worked with the college to systemise the approach to determining the desired learning outcomes, producing easy-to-understand flowcharts of how the apps will work so that the specification can be agreed at an early stage. User interface designs are then created and presented to college staff so that we can discuss accessibility and user-friendliness appropriate to the students’ needs, learning from their specialist knowledge, before coding begins. We then share beta test versions of the apps with the college during the build process for evaluation, and continue to improve the ‘user journey’ through staff and student feedback. This ensures ongoing quality control and agility in being able to modify functionality and usability as we proceed to build the apps. The software behind the apps is the Laravel Nova admin panel, Cordova and React Native. We are partnering with Amazon Web Services (AWS) to deliver server-side functionality where required, and the college is in the process of accessing grant funding from Amazon to support development on AWS. These state-of-the-art development tools will enable further development of the apps in the future.
Current apps include:
Housekeeping training developed in conjunction with Premier Inn – working with Derwen College to convert Premier Inn’s training manual into a pictorial, Makaton and simple sentence version on paper – and then into a fully functioning app with the choice of using video and spoken word, Makaton and written words, and a series of pictures and written words.
‘Working in a Café’ – a step-by-step guide to common tasks when working in a café, using pictures, Makaton and words, which can be customised to an individual café’s own processes and ways of undertaking different tasks (e.g. multiple ways of making a cup of coffee using different equipment).
Digital CV builder, which enables a student to select descriptive words about themselves, upload pictures and personal data, and incorporate videos and documents as evidence of their skills and work experience. As well as choosing from options, students can further personalise their CV using text or voice recognition. The app also allows schools and colleges attended, qualifications, work experience, references and a mock interview to be uploaded. An online or PDF CV is then generated, which can be accessed by a potential employer who has been granted a secure access code.
Early-stage development of a ‘Getting ready for work’ calendar-based app to support being ‘work-ready’ and the use of public transport to travel to work.
As the apps are currently for internal use within the college and a limited user-base outside the college, they are distributed to users as .apk files for Android and via TestFlight for iOS. Hunter Bevan Ltd are approved Apple Developers for iOS.
We are delighted to be working on this project as it presents us with design and development challenges in building effective tools to benefit people with learning difficulties and disabilities, tools which are really making a difference to the students’ lives and employment opportunities. The project has forged stronger links between Hunter Bevan and Derwen College, and I have taken on a voluntary role as an Industry Champion for the Retail Pathway with the College – promoting their work, and supporting the staff and students with ‘real life’ business experience. The project has also proven to be an effective platform to introduce and present the work of the college to the wider business community.
The Karten Network is very pleased to begin work this Summer on the Nuvoic project, which aims to improve access to voice recognition technologies for people who have dysarthric or unclear speech. Our project partner Voiceitt produces a specialist voice recognition app, designed to be used by people who have dysarthric speech and who are unable to use mainstream voice-controlled systems.
We want to recruit participants who would be willing to provide voice samples to extend Voiceitt’s database of English-accented dysarthric speech, which will help improve the performance of their recognition systems. Voiceitt are also working to develop new functions and improve existing ones in their apps. Potential uses include voice output, to help users to communicate more easily with people who are unfamiliar with their speech, and control of other voice-driven technologies such as Amazon Alexa and other smart-home and environmental control systems. There are also plans to develop a voice-controlled online shopping app.
We are looking for participants aged 16 or over to join this exciting project, to test these apps and provide feedback to Voiceitt on how they could be improved. The apps give written instructions to the user, so some literacy is needed. If you would like to get involved with this project or would like to find out more, please contact Liz Howarth who is the project co-ordinator for the Karten Network: liz@karten-network.org.uk.
We would be very grateful for any help from colleagues in our Karten Centres to help us to publicise this project and make contact with any potential participants. The Voiceitt apps currently run on Apple devices but Android versions are planned for the future. Participants using their own iPhone or iPad will benefit most easily but some funding is available to provide testing kit where needed.
Apple’s annual Worldwide Developers Conference (WWDC) took place last month. Unlike the previous 30 conferences, it was held virtually and free for anyone to attend.
The WWDC keynote, streamed directly from Apple Park, included the introduction of iOS and iPadOS 14, watchOS 7, tvOS 14, and more. The full keynote is available on YouTube.
iOS and iPadOS 14
Version 14 of iOS and iPadOS, expected to be released in September, brings the first major change to the user interface since iOS was first released: the introduction of widgets, a concept that Apple has used successfully on the Apple Watch. These widgets, available in different shapes and sizes, can be placed on the home screen, creating a more data-rich home screen. A variety of widgets can be added through the Widget Gallery.
Widget Stacks
To optimise the use of space you can create Widget Stacks, allowing you to stack up to 10 widgets on top of each other and swap between them with a swipe.
A Smart Stack can also be added. This folder uses on-device intelligence to automatically display the best widget option based on time, location, and activity. For example, if you stack up the weather widget, the calendar widget, and the maps widget, you might see the weather when you wake up, the calendar as events come up, and maps when you are out.
Siri is also ever present, providing a Siri Suggested widget based on your device usage. If you read the news every morning on the bus or train, you may see the BBC or Apple News app. If you order coffee each day around lunchtime, the Costa or Starbucks app will appear around that time.
App Library
To complement widgets, a new App Library feature has been added. Similar to the Apple Watch’s app list view, App Library automatically organises every app you have installed into category folders.
Swipe right to get past the Home Screen pages and onto the App Library view. This seems a great way to quickly access apps that normally reside beyond the first or second page.
App Library includes an intelligent “Suggestions” folder where four recommended apps are shown based on factors like usage, time of day, location, and activity.
It is now possible to hide individual pages or apps, limiting apps to the App Library. This will allow for much tidier screens.
Other screen-space-saving changes include new compact incoming call and Siri interfaces. Siri is not only more compact but has received a significant update, making it “smarter”.
Improved Accessibility
As with every new iteration of iOS, accessibility has been improved and expanded. Voice Control, introduced last year, gets a new British English voice and expanded capabilities, support for Braille has been enhanced and expanded, and more. Some of the new features worth mentioning are:
Sound recognition
While this is not a new concept, it is the first time sound recognition has been used specifically for accessibility purposes. Amazon, Google, and others have used AI-based sound recognition for personal safety applications; for example, Google Pixel phones are able to listen for a car crash and Amazon’s Alexa can listen for the sound of broken glass. Once enabled in the accessibility section of iOS 14, the phone will listen for 14 different sounds, including a knock at the door, a doorbell, sirens, a smoke detector alarm, a dog barking, a crying baby, and more. If one of these sounds is heard, you are alerted.
Headphone enhancements
This new feature allows people to adjust the frequency response and boost softer sounds to make audio easier to hear. While it requires one of the compatible headphone sets (e.g. AirPods, AirPods Pro, EarPods, Powerbeats, Powerbeats Pro and Beats Solo Pro), it is a great enabler for anyone who has trouble hearing.
Back Tap
Back Tap is a simple new feature that lets you assign an action to a tap on the rear of your iPhone, and it even works when the phone is in a case. Currently, Back Tap only supports two gestures, a double and a triple tap. However, there is a large number of actions that can be assigned to the taps.
FaceTime Sign Language
The update to FaceTime now includes artificial intelligence that will recognise if someone is using sign language during a group FaceTime call. The system will then make the person on the call who is signing appear more prominent.
VoiceOver
VoiceOver has been upgraded. Most notably, VoiceOver now utilises on-device machine learning and Apple’s Neural Engine to recognise and audibly describe what’s happening on screen. VoiceOver can now identify key display elements, especially on websites and in apps that don’t have their own accessibility functionality. This can include text within images as well as interface controls, which can all now be read out.
Apple Magnifier
Apple Magnifier has been upgraded too. It now magnifies more of the area you are pointing at, as well as capturing multi-shot freeze frames. You can now filter or brighten images for better clarity, and capture multiple images at once – making it simpler to review multipage documents or longer content.
Other new features and changes in iOS
With a reported more than 250 changes and enhancements over iOS 13, here are some of the changes and new features:
Picture-in-Picture (PiP). While not a new concept, Apple has added its “Apple touch” to it. If you swipe away while watching a full-screen video, the window will now float on your home screen, allowing you to move and resize it. The video can also be minimised into a small button on the side of the screen. All this means that you can continue to play a video while doing something else.
App Clips – similar to Android’s Instant Apps, App Clips allow you to use an app without needing to download the full version. App Clips will also support Apple Pay. This feature could be useful for those times when you need quick access or only need an app temporarily.
Apple Maps receives a significant update, which includes new cycling-specific navigation, among other things. This can alert you to increases in elevation, as well as notify you if there are stairs along your route. This could be very useful for wheelchair users too.
Unfortunately, for now, this is US- and China-focused and will first be available for New York City, Los Angeles, San Francisco, Shanghai, and Beijing, although more cities will be added once iOS 14 is released.
Third-Party Default Apps – for the first time since iOS was released, Apple allows you to set third-party browser and e-mail apps as the default. This means Google Chrome users and those who prefer other e-mail apps can use them more easily.
Privacy Protections – a significant amount of work has gone into improving and expanding the privacy protections. It is now even clearer what information apps collect, and more actions require user permission. It is now possible to use location-dependent features without providing your specific location data.
Changes have been made to the Apple App Store to make it clearer what the privacy impact of each app is before you download it. Apple now requires developers to self-report their privacy practices, including any data they collect and use to track people.
Developers must now also obtain express permission from the user to access or use any tracking data.
The Apple Clipboard now provides a notification, so you know what app is accessing text copied to it.
Apps that need to discover and access devices on your local network now need to gain your permission to do so first. Fine-grained control over your photos has also been added: any app that requests access to the Photo Library no longer needs access to all your photos. You can choose to block access, select specific photos that the app can view, or allow full access.
If an app uses either the camera or microphone for recording purposes an indicator light will now appear next to the mobile signal bar. This will happen whether an app is being used or running in the background. This will ensure that apps are not secretly recording without your knowledge.
To prevent operators from tracking your device, Wi-Fi now includes the option to “Use Private Address”, and Bluetooth devices can be renamed.
Translate App – Siri’s translation capabilities, added in iOS 13, have been expanded into a dedicated translation app. The underlying translation engine has also been integrated into the new version of Safari, making it possible for websites to be translated too.
Memoji will be expanded, including more age options and accessories, such as face masks – a sign of the extraordinary times we are living in.
iPadOS 14
Most of the changes and new features of iOS 14 will also be included in iPadOS 14. Apple also introduced a new “Scribble” feature for Apple Pencil, which can automatically convert handwriting into text. Built-in intelligence makes Scribble context-aware; this means it is able, for example, to recognise a phone number or address and offer you an appropriate app to use the data.
watchOS 7
Apple has finally added sleep tracking to the Apple Watch. While more in-depth data will be gathered if you use an Apple Watch, the companion app for iPhone doesn’t require you to own one.
With the coronavirus pandemic, hand hygiene has become even more important. watchOS 7 now includes a Handwashing app. This new app not only detects the motion of you washing your hands but also uses the watch’s microphone to listen for the sound of splashing water to confirm that you’re actually washing your hands. The app then displays a countdown to ensure you wash your hands for an adequate amount of time.
Google I/O
Google took the decision to cancel their annual developer conference entirely. Google did however announce some accessibility improvements.
An Accessible Places feature has been added to Google Maps. Available on both Android and iOS, Accessible Places is designed to display wheelchair accessibility information about a location or business. You can enable Accessible Places by opening the latest version of the Google Maps app, navigating to “Settings”, choosing “Accessibility” and turning on “Accessible Places”.
Once turned on, it’ll show a wheelchair icon for places with an accessible entrance. More detailed information is also available, including Blue Badge parking, accessible seating, and toilets. Accessible Places will be released in the UK, US, Japan and Australia, with more countries being added later.
Action Blocks, mentioned in the Autumn 2019 newsletter, has now been released. The app enables you to create customisable home screen buttons, stringing together tasks or actions that can be triggered by a single tap.
Live Transcribe, Google’s real-time speech-to-text transcription for conversations, has been updated. You can add custom words or names for the system to recognise and spell, and you can search through past conversations. To enable the latter, you need to turn on “Saving Transcriptions”, which will then save transcriptions to the device for three days.
It is now also possible to set Live Transcribe to listen for your name. Your phone will then vibrate whenever someone nearby says your name.
The following article was produced in collaboration with our Karten Centre and is provided courtesy of Jewish Care Interact. For more information please visit: https://www.jewishcareinteract.org
While you may be eager to embrace technology, it’s important to protect yourself in the digital world.
Top tips
Here are the top 10 best practices for you to follow:
Passwords. Use hard-to-guess, unique passwords. Secure your accounts with your phone number.
Logins. Store your login information by using a passphrase or password manager.
Social media. Be a savvy social media user by selecting higher privacy settings and thinking twice before sharing personal information.
Devices. Protect your devices by setting a PIN or password and making sure your devices aren’t left unattended.
Banking. Keep your online banking information private.
Emails. Delete emails requesting personal information or urgent money transfers.
Locations. Only log in to your accounts on computers you trust. Use your own devices when you can.
Privacy. When using shared computers, browse privately and log out of your accounts.
Virus protection. Restart your browser or computer if you’re told it has a virus, and don’t click on any virus alert messages.
Ad blocking. Use ad blocking tools for safer Internet browsing.
Each strategy is explained below.
Passwords: make them strong
Having a strong password is probably the most important thing you can do to reduce your risk online. These basic dos and don’ts can go a long way.
Password dos:
Do log out of your accounts when you’re finished using them—ALWAYS.
Do consider using a password manager or app and two-factor authentication.
Do use long passwords with symbols, since they are more secure.
Do have a different password for each account.
If you do write your passwords down (although this is not advisable), keep them in a safe space that’s far away from your computer/device.
Password don’ts:
Don’t use obvious passwords, like password, 123456, qwerty, letmein, dragon, shadow, abc123, master, sinatra, etc.
Don’t use passwords that someone who knows you can easily guess: birthdays, home towns, pets, relatives, etc.
Don’t share your passwords with anyone, and don’t let anyone see you type them in.
Don’t carry your devices and passwords in the same bag.
Don’t log in to your accounts on computers you aren’t sure are secure.
When deciding on your password, keep in mind that it shouldn’t be so difficult to remember that you need to write it down or tell someone about it. Age UK says a strong password should not be too short and should include a combination of letters, numbers and punctuation marks. The ideal password would be some obscure nonsense word that only has meaning to you.
If memorising a password is too difficult, you may want to try using a passphrase. A different sequence of words (like “Fido is a good dog”) for each account can be written down and stored somewhere safe. Passphrases are especially helpful if you have the option for a longer password.
Another helpful option is a password manager. This tool stores encrypted and protected versions of all of your passwords in one place. Ideally, the password you use for your password manager will be the only one you need to remember!
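To illustrate how little effort a strong passphrase takes to generate, here is a small sketch in Python using the standard library’s secrets module; the short word list is only an example, and a real list should be much longer.

```python
# A small illustrative sketch: build a random passphrase from a word list
# using Python's standard-library "secrets" module, which is intended for
# security-sensitive randomness. The word list here is only an example;
# a real one should contain thousands of words.
import secrets

WORDS = ["fido", "garden", "purple", "kettle", "sparrow", "ribbon",
         "anchor", "maple", "cobalt", "lantern", "pebble", "willow"]

passphrase = " ".join(secrets.choice(WORDS) for _ in range(4))
print(passphrase)  # e.g. "maple sparrow kettle cobalt"
```

Most password managers will generate passwords and passphrases like this for you, so scripting it yourself is strictly optional.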
Social media: be selective
A breach of privacy can sometimes cause more damage than a financial loss. You might feel like you have nothing to hide, but at the same time, you might not want all your affairs to be public knowledge.
To protect yourself from identity theft, here are some good habits for Facebook, Twitter and Instagram:
Don’t share things on social media that you don’t want associated with you. Your posts might live forever on the Internet.
Be careful about photos you share, particularly those that relationship scammers might use to trick you into revealing private information or sending money.
Adjust your privacy settings to restrict who can view your posts.
Don’t share everything online. Information such as your birthday, address history, likes and dislikes can easily be used to impersonate you.
If you do store photos, videos and important documents online through social media, you may want to plan what will eventually happen to this information when you pass away. The Digital Legacy Association suggests that you download a copy of all of your photos and videos from social media accounts and share them with a person you trust. You may also want to assign administrative access of your social media accounts to someone trustworthy. You can download a template Social Media Will from the Digital Legacy Association.
Devices: protect them with PINs
Preventing others from hacking into your electronic devices is an important part of staying safe online. You can safeguard your devices (and your information) by taking the following steps:
Turn on the screen lock from your security settings.
Don’t use a pattern lock (PINs and passwords are safer).
Don’t leave your device unattended in public spaces.
Don’t write your access codes on the case of the device or keep them written down anywhere near the device.
Use anti-malware software if possible.
Banking: take care with financial information
The most common online banking scams typically happen when criminals trick you into providing information that opens the door to your account. Money transfers through job adverts, prepayment requests, false charitable donations, medication scams and other fraudulent actions can be avoided if you investigate before you send money or provide credit or debit card details for payment. And remember this: your bank will never email you or send you messages through the Internet.
Online banking and shopping can be used safely if you:
Discontinue any online transaction if your browser warns you that a website is not legitimate.
Be wary whenever someone requests money from you online; ask for advice from someone you trust if the request seems even the slightest bit questionable.
When checking out from an online store, be sure you are purchasing the things you really want (and not something that has ended up in your cart unintentionally).
Look up reviews, especially if you’re planning to buy from a business you are unfamiliar with (a good source is uk.trustpilot.com for British businesses).
Never give others access to your bank account, PIN or banking apps. Protecting your identity is an essential part of staying safe online. (For more advice on this subject, visit the Protect your identity page from NI Direct.)
Your bank’s fraud helpline is a good place to start if you have any questions about online banking.
Emails: be cautious about requests for money
Citizens Advice offers these helpful tips on spotting the signs of an email scam. They warn you to be careful if:
Something comes out of the blue or from someone you don’t know.
Something sounds out of the ordinary, like you’ve won the lottery, or you’ve been invited to invest in an ‘amazing’ scheme but asked to keep it a secret.
You receive an email message urging you to phone an expensive number (these start with 070, 084, 087, 090, 091 or 098) or make a quick purchasing decision (a trustworthy company will be happy to wait).
Since scammers may mimic familiar email addresses by changing a letter or two, always check to make sure the source is accurate. The email could look very official – it might claim to be from HM Revenue and Customs or come in the form of an invoice from someone you do know – but if it’s unexpected, it’s probably a scam. If in doubt, give the sender a call, but be sure to use the phone number you have in your records (not the phone number included with the questionable email).
If you do receive an email that’s suspicious or includes a request for your financial information, just delete it. Don’t bother to respond. Even if you just request to be deleted from the email list, this signals to the scammer that your email address is legitimate. This can ultimately lead to a continued flood of unwanted emails in your inbox.
Locations: stick to computers you trust
While it may be tempting to log in to your online accounts from an unfamiliar device or location, it’s very important to avoid devices that are set up in places you don’t trust. A computer in a public location like a library or store could be saving and storing your personal data without your knowledge.
Privacy: keep your details top secret
If you do decide to log on to a public device, try to follow these three tips:
Use a private browsing mode (like Incognito) to keep your information safe.
Make sure you don’t save login information on a shared computer, and
Fully log off when you are finished.
Virus protection: use software to stay safe
Antivirus tools and firewalls that come with your machine can protect you from computer viruses and other unwanted cyber intruders (like spyware, malware, worms and more). But even if you have virus protection on your machine, make it a habit not to open attachments or click on links in emails that come from suspicious sources.
Keeping your operating system, firewalls and antivirus tools up to date is good practice and should be part of your regular online routine.
Ad blockers: avoid pop up ads the easy way
By installing ad blocker software on your machine, you can avoid clicking on messages designed to trick you into sharing information or making your machine vulnerable. Even if a message looks legitimate or seems urgent, it’s more than likely some sort of scam.
If you’re an advanced computer user, you may want to look into VPNs (or virtual private networks) for even more protection. These tools can block annoying ads, but they can also block scripts that track your online behaviour, prevent distracting banners and even speed up your web browsing.
Other ways to protect yourself online
Roughly half of all fraud crimes that happen each year take place online. Very often, these crimes go unreported.
Being aware of the most common scams will help you avoid them. Here are a few examples of scams that happen frequently:
Money transfer scams. These may be disguised as a transaction where you could be asked to provide information, such as your bank details, so that transfers can be made through a UK bank account and you will be paid generously for your trouble. This technique is used by fraudsters to launder money and could get you into serious trouble.
Medication scams. You can be encouraged to buy some sort of wonder medication online that turns out to be fake or sometimes isn’t delivered at all.
Relationship scams. This happens when someone finds your details online, pretends to be interested in you and then tries to manipulate you into sending them money.
Stranded traveller schemes. Scammers might pose as a friend or family member or pose as an authority figure and will then tell you that your friend or relative is in hospital or prison abroad. Using this information, the scammer will try to convince you to transfer money as soon as possible.
For more information on common scams, visit the following pages:
If you do suspect a scam, it’s a good idea to report it to an authority. That way you will help fight online crime and prevent others from being targeted by the same scammers. The following websites will give you more information on how to report a scam:
Amidst the Covid-19 pandemic many organisations are making their products available either for free or at a reduced rate. There is an ever-growing collection of resources being developed to support people who are staying at home. For an extensive list of these and other resources please see the Karten Network website: https://karten-network.org.uk/home-learning-support/
For a bit of fun, invite a horse into your house, a lion into your living room and, while you are at it, have a tiger round for tea!
You will require a mobile device that supports Augmented Reality (AR). This will need to be either an Android device running Android 7.0 or later, or an Apple device running iOS 11.0 or later.
Open your web browser and do a Google search for an animal, e.g. “tiger”. If an AR animal is available, it’ll show up in a small box with some information and an invitation to “Meet a life-sized tiger up close”. In that box will be an option to “View in 3D”; tap that and the website (Wikipedia) will place an animated 3D model on your screen. Tap “View in your space”. You may be asked to allow access to your camera; if so, tap “Allow”.
Point your device at the floor and the view will switch to an AR mode. You will be asked to move your phone around – this step may take a couple of minutes. Typically, you also need a room with fairly good light and a flat-ish surface. Then, almost by magic, the animal will pop up in your space. You can now move your device around to view the animal.
The current list of available animals is:
Alligator
Angler fish
Brown bear
Cat
Cheetah
Dog
Duck
Eagle
Emperor penguin
Goat
Hedgehog
Horse
Lion
Macaw
Octopus
Pug
Giant panda
Rottweiler
Shark
Shetland pony
Snake
Tiger
Turtle
Wolf
As always, I am interested to hear about how you are using mobile and other smart technology. I am also available to support and help where I can, even more so during these exceptional times.
Martin Pistorius, Karten Network Mobile Technology Advisor
The power of the Karten Network is its ability to share its wealth of knowledge and expertise. A few years ago, Matt Harrison, then at Portland College, now at Beacon Centre, shared his use of QR codes during the Karten Network events. More recently TechAbility and National Star’s Neil Beck showed me a project he had done using both QR codes and RFID tags. This inspired me to write this tutorial on how to use QR codes and RFID tags.
How to use QR codes and RFID tags
QR Codes
A QR (Quick Response) code is a two-dimensional barcode that enables you to quickly access the data associated with it. Most of the time this is a URL (a website address).
As the name suggests, QR codes are ideal for quickly and easily accessing a network-linked resource, like a video, photos or other information. An example could be to allow a learner to access a video on how to perform a task. The learner would then simply need to point their device at the QR code and link through to the resource. Or QR codes could be placed on the packaging of a product or print job, making it easier to tell customers more about your Karten Centre or enable them to place another order. What you use QR codes for, and how, is really only limited by your imagination.
There are a number of online services, most of them free, that allow you to generate QR codes. I would suggest simply doing a Google search for “qr code generator”. Fill in your information, e.g. the URL, and generate the code. You will typically get an image file containing the QR code to download. This can then be printed and placed wherever needed.
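If you would rather generate codes yourself than rely on an online service, this can also be scripted. Below is a minimal sketch using the third-party Python qrcode library (an assumption on my part – any QR library or online generator will do the same job):

```python
# A minimal sketch: generate a QR code image from a URL.
# Assumes the third-party "qrcode" library and Pillow are installed
# (pip install "qrcode[pil]").
import qrcode

url = "https://karten-network.org.uk/"  # the data the code will point to
img = qrcode.make(url)                  # build the QR code as an image
img.save("karten-qr.png")               # save it, ready for printing
```

The resulting PNG can then be dropped into a poster, label or product packaging just like a code from an online generator.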
To read (scan) a QR code used to require an app; however, any iOS device running iOS 11 or later has a QR reader built into the native camera app. Android 9 and later also have the feature included with the camera app, courtesy of Google Lens. To scan a code, open your camera app and point it at the code. Typically, a window will pop up asking if you want to open the link. Tapping OK/Allow will take you to wherever the QR code is set to go.
If your device doesn’t support reading QR codes through the camera, then you will need to download and install an app such as QR Reader for iPhone and QR Droid for Android.
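On a laptop or desktop, QR codes saved as images can also be decoded programmatically. Here is a small illustrative sketch using the third-party pyzbar and Pillow libraries (again an assumption on my part – nothing in the camera-app approach above requires this):

```python
# A small sketch: decode a QR code from a saved image file.
# Assumes the third-party pyzbar and Pillow libraries are installed
# (pip install pyzbar pillow) and that "karten-qr.png" exists.
from PIL import Image
from pyzbar.pyzbar import decode

for result in decode(Image.open("karten-qr.png")):
    # result.data holds the encoded bytes, e.g. a URL
    print(result.type, result.data.decode("utf-8"))
```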
RFID tags
RFID (Radio-Frequency Identification) tags are tiny radio devices that can store data. A lot of us use them all the time, as they are what make contactless payments and hotel key cards possible.
There are, generally speaking, two kinds of RFID tags – passive and active. The main difference is that passive tags get their power from the reader, while active tags have their own power source.
You may also have heard of or seen NFC on your device. NFC (Near-Field Communication) is technically a subset of RFID and is based on the RFID protocols. The main difference from RFID is that an NFC device can also emulate a tag. It is also possible to use NFC in a peer-to-peer mode, to transfer information between two NFC devices.
While I have provided this basic overview of the technology, in reality you don’t really need to worry about the underlying details to use RFID/NFC tags. For the sake of simplicity, I will use the term “NFC tags” for the rest of this tutorial.
NFC tags usually come either embedded in a plastic card or fob, or as a sticker. They are cheap: for the purposes of this tutorial, I purchased 10 blank cards for £3.57 and 10 blank stickers for £2.93, including postage, on eBay.
Similar to QR codes, blank NFC tags allow you to store data on them – in most cases more than you could using a QR code. However, the major advantage of NFC tags is that you don’t need to open an app; you simply need to be close enough for your device to read the tag.
Dedicated NFC tag writers and readers are available; however, for the purposes of this tutorial I have only used a mobile phone and an app.
You will need:
An Android or iOS device (please see the “Devices” section for details on supported iPhone models)
An NFC app – for this tutorial I used the NFC Tools app, available for both Android and iOS. Other good options are the NFC TagInfo and NFC TagWriter apps by NXP, also available for both Android and iOS. (See the links section for more details.)
Blank NFC tags. I recommend tags with the NTAG213 chip. These can be purchased online; I used eBay, but they are available elsewhere. If you prefer a reputable UK supplier, try Seritag.
Devices
While both Android and iOS can read and write NFC tags, Apple have, until recently, been far more restrictive. In fact, creating (writing) NFC tags is easier to do on an Android phone.
Android
Most Android phones can read NFC tags without the need for an app. You may, however, need to enable NFC on the device. To do so, go to Settings, or swipe down from the top of the screen and tap the NFC icon to toggle it on.
You will, however, need an app to write to NFC tags.
iOS
Apple devices have NFC enabled natively within iOS, so you don’t need to turn it on. However, until the iPhone 7, Apple didn’t allow the use of NFC for anything other than payments.
To read an NFC tag you will need an iPhone 7, 8 or X running iOS 11 or later, as well as an NFC reader app. The app doesn’t need to be open but must be installed on the phone.
The newer iPhone XR, XS, XS Max, 11, 11 Pro and 11 Pro Max models can all read NFC tags natively without requiring an additional app.
iPhones will also not read blank tags; the tag must be encoded with NDEF data, otherwise it is ignored.
To write/encode an NFC tag you will need an iPhone 7 or newer running iOS 13 or later.
Writing/Encoding NFC Tags
There is no real difference between Android and iOS when it comes to encoding NFC tags using the NFC Tools app, although the app’s interface is slightly different between the two platforms.
While either an Android or iOS device can be used, I would recommend using an Android device to create the NFC tags.
Select the type of record you want to add. In most cases this will be a URL.
Enter the data, e.g. the URL
Tap “OK”
You can add additional records by repeating the steps above.
Once you are happy with the records, tap “Write”.
You will now be prompted to touch the NFC tag to the device.
Your NFC tag should now be ready to be used.
Please note that the records you have entered remain “loaded” in the app, allowing you to encode multiple tags with the same data. To write new data, first remove the existing records by:
Tap the NFC Tools app
Tap “write”
Tap “More options”
Tap “Clear record list”
Tap “Yes” when asked if you are sure
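If you have a USB NFC reader attached to a computer, the same kind of NDEF record can also be written in code rather than through the app. The following is only a rough sketch, assuming the third-party Python nfcpy and ndeflib libraries and a reader that nfcpy supports; the phone app remains the simpler route for most people.

```python
# A rough sketch: write a URL (an NDEF URI record) to a blank NTAG213 tag
# using a USB NFC reader. Assumes the third-party nfcpy and ndeflib
# libraries (pip install nfcpy ndeflib) and a reader supported by nfcpy.
import nfc
import ndef

def write_url(tag):
    # tag.ndef is available once the tag is NDEF-formatted
    if tag.ndef and tag.ndef.is_writeable:
        tag.ndef.records = [ndef.UriRecord("https://karten-network.org.uk/")]
        print("Tag written")
    else:
        print("This tag cannot be written to")
    return True  # wait until the tag is removed before returning

with nfc.ContactlessFrontend("usb") as clf:
    # Waits for a tag to be presented to the reader, then calls write_url
    clf.connect(rdwr={"on-connect": write_url})
```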
Reading NFC tags.
As mentioned above, Android devices read NFC tags without the need for any additional apps, provided NFC is turned on. Simply either tap your device to the tag or bring the tag to the device.
Only an iPhone 7 or newer running iOS 11 or later will read NFC tags. If you have an iPhone 7, 8 or X you will need to install an NFC reader app such as the TagInfo or NFC Launch apps by NXP (see the links section). The TagInfo app has more features, whereas NFC Launch is a lightweight app designed for reading a URL from an NFC tag.
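For completeness, the same USB-reader setup sketched above can also read a tag back, which is a handy way to check what has been written. Again, this is an illustrative sketch using nfcpy, not part of the phone-based tutorial.

```python
# An illustrative sketch: print the NDEF records stored on a tag,
# using the same nfcpy/USB-reader setup as the writing example above.
import nfc

def dump_records(tag):
    if tag.ndef:
        for record in tag.ndef.records:
            print(record)  # e.g. a URI record containing the stored URL
    else:
        print("No NDEF data found on this tag")
    return True

with nfc.ContactlessFrontend("usb") as clf:
    clf.connect(rdwr={"on-connect": dump_records})
```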
Using NFC tags
Now that you have the tags, where and how you use them is limited only by your imagination.
If you have an iPhone running iOS 13, using Apple’s Shortcuts app you can do all sorts of fun things, e.g. turn off smart lights, take a photo, play music etc. From my testing, this also appears to be an exception where iOS will read a blank NFC tag.
To do this:
Tap the Shortcuts app
At the bottom of the app, tap “Automation”
Tap the + in the top right corner
Tap “Create Personal Automation”
Scroll down to “NFC”
Tap NFC
Tap “Scan” and scan the NFC tag
You will be prompted to “Name This Tag” – Enter a name, and tap “OK”
Tap “Add Action”. From here you have a vast number of options such as playing music for example:
Continuing on from the instructions above, tap “Apps”, then “Music”, then “Play Music”
This will add the action with a “Music” placeholder; tapping it takes you into the music library, where you can select exactly what gets played.
Finally, you have the option to be prompted to confirm before running the action, or to simply run the action.
Now whenever you tap on that tag, the automation will be performed.
While both QR codes and RFID/NFC tags are reasonably safe, please exercise a degree of caution and common sense when scanning unknown codes and tags.
As always, I am interested to hear about how you are using mobile and other smart technology. I am also available to support and help where I can, even more so during these exceptional times.
Martin Pistorius, Karten Network Mobile Technology Advisor
Technology for people with disabilities can be truly liberating and empowering. It enhances, enriches and potentially transforms lives. From my personal perspective as a person with a disability, I am heavily reliant on technology to function in my everyday life. However, one key requirement for technology used by people with disabilities is that it is accessible.
By accessible, in this context, I mean digitally accessible. Digital accessibility is the practice of ensuring that websites, mobile apps and other digital resources, e.g. eBooks, can be accessed and used by people with impairments, either directly or through the use of assistive technology.
Legislation within the USA, EU and UK requires developers to ensure that websites and apps are accessible. However, in reality, compliance with this is mixed – this is why we find that some apps don’t support switch access.
In the UK, the Equality Act (2010) (and the Disability Discrimination Act 1995 in Northern Ireland) requires organisations not to discriminate against people with disabilities and to provide reasonable adjustments where needed. On the 23rd of September 2018, new regulations on the accessibility of websites and mobile applications of public sector bodies were introduced.
23 September 2019 – Public sector websites published on or after 23 September 2018 must be compliant.
23 September 2020 – Websites published before 23 September 2018 must be compliant.
23 June 2021 – Apps must be compliant.
In simple terms, to comply with the Public Sector Bodies (Websites and Mobile Applications) Accessibility Regulations two main requirements must be met:
Meet accessibility standards: either the international accessibility standard, WCAG 2.1 AA, or the European equivalent, EN 301 549
Publish an accessibility statement explaining how accessible the website or app is and how any remaining issues are being addressed
These guidelines are extensive, and while much effort has gone into improving their readability, the documentation still tends to be quite technical and can be tedious to read. I will therefore attempt to provide some high-level guidelines. Current web accessibility (which also applies to apps and other digital resources) is structured around four principles: Perceivable, Operable, Understandable, and Robust (POUR).
POUR
Note, I use “website” in this article, but it is applicable to apps too.
Perceivable:
Website users must be able to process information presented on/through the website. In broad terms, this means that a website and the content contained within it must be presented in a way that people of all abilities are able to process it. For example, text support for any audio content for people with a hearing impairment; audio for people with a visual impairment – this does not necessarily mean creating audio for all text but that screen readers and other assistive technologies can access the content of the website.
Simply put: Is there anything on the website that someone who has a visual impairment (including colour blindness), or who is deaf, would not be able to perceive?
Operable:
Website users must be able to operate the website with a variety of tools. Many people with a disability either have difficulty or cannot operate a mouse at all. It is therefore imperative that the website supports keyboard-based interaction.
To support users with cognitive impairments to operate a website, animations and media should be controllable. Any time limits for completing an action should be generous or configurable. All people, not just those with disabilities, make mistakes so users should be supported by providing appropriate instructions, cancellation options, and warnings.
Simply put: Can all functions of the website be performed with a keyboard? Can users control interactive elements of the website? Does the website make completing tasks easy?
Understandable:
If the website users can perceive and operate the website, can they understand it? Support users by using clear, concise language and offering functionality that is easy to comprehend. If a user takes an action, the connection between the action and the result should be obvious. Navigation should be consistent throughout the website. Forms should follow a logical flow, be clearly labelled and provide adequate guidance.
Simply put: Is all of the text on the website clearly written? Are all of the interactions easy to understand?
Robust:
Website users use their own preferred technologies. Within reasonable limits, a website should work well across platforms, browsers, and devices i.e. websites should not dictate the technology users can use. Ensuring that a website conforms to standards and conventions is one of the best ways to meet the principle of robustness. Clean well written code is generally more robust and accessible across platforms.
Simply put: Does the website only support specific browsers or operating systems, or devices? Is the website developed in accordance with standards and best practices?
Specific technical guidelines
Within these principles there are specific technical guidelines on how to create accessible websites. The general ones are:
Navigation and website structure
Make use of well-structured markup, i.e. headings should be marked with the appropriate heading tags and in a logical order (H1 > H2 > H3, etc.). Ensure all parts of the website can be accessed without a mouse, that the reading and navigation order is logical and intuitive, and that there are multiple ways of finding information. Provide a means for users to skip repetitive elements on the page, e.g. a “Skip to Main Content” or “Skip Navigation” link at the top of the page which jumps to the main content.
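As a rough illustration of how this guidance can be checked automatically, the sketch below (run from a browser’s developer console) flags headings that skip a level and warns if the first link on the page is not an in-page skip link. The selectors and the skip-link heuristic are assumptions for illustration, not a formal audit.

```typescript
// Minimal sketch: report skipped heading levels and a missing skip link.
function checkHeadingStructure(): void {
  const headings = Array.from(
    document.querySelectorAll<HTMLHeadingElement>("h1, h2, h3, h4, h5, h6")
  );
  let previousLevel = 0;
  for (const heading of headings) {
    const level = Number(heading.tagName.substring(1)); // "H2" -> 2
    if (previousLevel > 0 && level > previousLevel + 1) {
      console.warn(
        `Heading level skipped: <${heading.tagName.toLowerCase()}> follows h${previousLevel}`
      );
    }
    previousLevel = level;
  }

  // Heuristic: the first link on the page is usually expected to be a
  // skip link pointing at an in-page anchor such as #main or #content.
  const firstLink = document.querySelector<HTMLAnchorElement>("a[href]");
  if (!firstLink || !firstLink.getAttribute("href")?.startsWith("#")) {
    console.warn("No skip link found as the first link on the page.");
  }
}

checkHeadingStructure();
```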
Colours
Ensure there is high contrast between the text presented and the background colour. Ensure that colour is not the only means used to convey information or prompt an action, e.g. do not say “click the red button to continue”.
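The “high contrast” advice is backed by a concrete formula in WCAG 2.1: text and background colours are converted to relative luminance and their ratio must be at least 4.5:1 for normal-size text (3:1 for large text) to meet level AA. The sketch below implements that calculation for 8-bit sRGB colours.

```typescript
// WCAG 2.1 relative luminance and contrast ratio for 8-bit sRGB colours.
function relativeLuminance(r: number, g: number, b: number): number {
  const linearise = (channel: number): number => {
    const c = channel / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * linearise(r) + 0.7152 * linearise(g) + 0.0722 * linearise(b);
}

function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const l1 = relativeLuminance(...fg);
  const l2 = relativeLuminance(...bg);
  const [lighter, darker] = l1 > l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Example: black text on a white background gives 21:1, comfortably above
// the 4.5:1 AA threshold for normal-size text.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // "21.0"
```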
Images
Ensure all images have alternative (alt) text unless they are purely decorative. Ideally decorative images should be called from the style sheet, not embedded in the page; where a decorative image, such as a bullet point or border, does appear in the page, its alt text should be empty (null). Alt text should be appropriately descriptive – think of describing what the image is about.
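A quick way to spot images that break this rule is to look for img elements with no alt attribute at all (an empty alt="" is fine for decorative images, but a missing attribute is not). The small sketch below is one way to do that from the browser console.

```typescript
// Minimal sketch: list images with no alt attribute at all.
function findImagesMissingAlt(): HTMLImageElement[] {
  return Array.from(document.querySelectorAll<HTMLImageElement>("img:not([alt])"));
}

findImagesMissingAlt().forEach((img) =>
  console.warn("Image with no alt attribute:", img.src)
);
```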
Audio and video
Ensure you have subtitles, captions or, at the very least, written transcripts available with video and audio content. Ideally use open captions (captions that are available all the time). If there is audio that plays automatically on a website, ensure it can be paused or stopped by the user.
Text
Ensure the text can be made larger without affecting the content or function of the page or website. Avoid using images of text; real text can be resized and read by assistive technology, whereas text embedded in an image cannot.
Tables
The use of tables for layout should be avoided; tables should only be used for tabular data. Tables should be marked up with table header (th) tags to help screen reader users make sense of the content.
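The sketch below is a simple, illustrative check for data tables that declare no header cells at all. It will not catch every problem (layout tables or missing scope attributes, for instance), but it is a useful first pass.

```typescript
// Minimal sketch: list tables that contain no <th> header cells.
function findTablesWithoutHeaders(): HTMLTableElement[] {
  return Array.from(document.querySelectorAll<HTMLTableElement>("table")).filter(
    (table) => table.querySelector("th") === null
  );
}

findTablesWithoutHeaders().forEach((table) =>
  console.warn("Table has no <th> header cells:", table)
);
```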
Links
Ensure that links describe where the link is going, what the link is or the purpose of the link. Links should make sense when read out of context.
Forms
Ensure there are labels immediately next to the fields you want people to type in or click on. Check that fields prompting for input (e.g. name, email, comments) have a label next to them which explains what data is to be entered.
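The sketch below lists form controls that are neither wrapped in a label nor linked to one through a matching for/id pair. The selectors are generic assumptions, and testing with a screen reader remains the real test.

```typescript
// Minimal sketch: list form controls with no associated <label>.
function findUnlabelledControls(): HTMLElement[] {
  const controls = Array.from(
    document.querySelectorAll<HTMLElement>('input:not([type="hidden"]), select, textarea')
  );
  return controls.filter((control) => {
    const wrappedInLabel = control.closest("label") !== null;
    const linkedToLabel =
      control.id !== "" &&
      document.querySelector(`label[for="${control.id}"]`) !== null;
    return !wrappedInLabel && !linkedToLabel;
  });
}

findUnlabelledControls().forEach((control) =>
  console.warn("Form control without an associated label:", control)
);
```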
Page time limits and flashing
Ensure pages with a time limit allow the time limit to be adjusted or turned off. Moving, blinking or scrolling content can be used to highlight information so long as it lasts less than five seconds or can be paused. However, do not put anything in your pages that flashes more than three times in any one second.
Resources and further reading:
The primary resource for web accessibility best practice is the W3C’s Web Accessibility Initiative (WAI), which sets the standards. The standard most relevant to the new regulations is the Web Content Accessibility Guidelines (WCAG) 2.1. You may have seen that some websites state that they have an “A”, “AA” or “AAA” rating; this is based on their level of compliance with the WCAG.
Having a conversation with family, friends and colleagues is something
most of us take for granted, but one that can be denied to people with motor
neurone disease (MND) – a neurological disease which attacks the nerves that
control movement, leaving people unable to move, talk and eventually breathe. As the disease progresses, over 80% of people with MND will have
communication difficulties as their vocal muscles cease to work.
With thanks to the Ian Karten Charitable Trust, we were able to expand our communication aids service for people with MND. Part of this service is to support people with MND who wish to voice or message bank, and the Karten-funded equipment is helping us to do this.
Specialist communication aids make a huge difference to daily life for people living with MND by providing a voice through the conversion of text to speech using an in-built synthesised voice. However, many people wish to create a personalised synthetic version of their own voice (more commonly referred to as voice banking).
Voice banking has existed for over ten years, but over the last six years technology and services have improved to such an extent that more and more people are able to realise the benefits.
“I’ve now got a digital model of my voice which sounds really like me, it is quite impressive. So if my voice goes completely then I will be able to use my model voice on a little gadget. Luckily things have moved on since Stephen Hawking first got his synthetic voice.”
Michael, who is living with MND.
Although technology is improving, the process of voice banking remains essentially the same as it always has. The user records a set of phrases, using a laptop or computer and a headset microphone, which are then used to form the basis of the synthetic version of their voice. The number of phrases needed depends on the service used but can be anywhere between 215 and 3,500.
With the technology
improving, the time taken to make the recordings has reduced from an average of
90 days in 2018, to 6 days in 2019 which has made a big difference to people
with MND undertaking the process.
The quality of the voice produced is also rising. Until recently, services would use the recordings to capture enough phonemes to piece the voice together, whereas newer services build a model of the voice from the recordings, allowing for clearer and faster voice creation.
Message banking is another
option. Message banking allows you to
add emotion to your recordings, but as it is simply recording messages it means
you are limited to what you record. However, it can be useful for recording particular
catchphrases, place names, or a distinctive laugh.
One example of how message banking can help is a lady who had voice banked, but whose dog did not recognise the synthetic voice due to the lack of emotion – however, when she used her message-banked phrase, her dog instantly recognised the opportunity to go for a walk!
Through our communication aids service, we loan laptops and headset microphones to enable people with MND to voice and message bank. We also provide financial support towards the cost of creating a voice. To help promote message and voice banking to people with MND, we have created a short video, which you can view below.
What is voice and message banking?
“The equipment provision via the Karten Trust has been very important as it has ensured very quick access to the correct items needed, minimising delays, which really matter when someone’s voice is already changing.”
Louise Rickenbach, Regional Care Development Adviser, MND Association.
A single portable and lightweight device, Connect & Learn is the newest way for low vision learners to access the curriculum. Unlike many specialist VI devices, Connect & Learn centres around a popular mainstream Windows 10 tablet – Microsoft’s Surface Pro. SuperNova Magnifier comes included as standard, along with a large print wireless keyboard, a folding stand and a backpack.
Place the tablet on the stand and Connect & Learn acts as a digital magnifier. Place a document under the built-in camera and students can magnify the document, zoom in using their fingers on the touchscreen, add a colour scheme, or rotate or lock the image. They can even save the image to add to their work or to explore again at home.
Choose the Connect & Learn
package with SuperNova’s speech included and your student can tap the screen
and hear the words on the document read aloud. And the scan and read feature is
impressively accurate. Send the text to a Word document, move fingers to the keyboard
and Connect & Learn becomes a Windows 10 laptop complete with SuperNova’s
full set of magnification and colour enhancing tools.
Connect wirelessly to the interactive whiteboard and low vision learners can ‘see’ and magnify the whiteboard. Again, they can adjust colours, zoom in and take photos – Connect & Learn doesn’t need an extra, expensive camera that points at the board, so your partially sighted student isn’t forced to sit at the front of the class. [We also know IT skills aren’t always readily available in school, so we’ve recently released a free whiteboard wizard download to help you connect to the whiteboard.]
As with all of Dolphin’s
products, Connect & Learn gives direct access to browse and download from
the RNIB Bookshare Education collection of more than 320,000 textbooks.
[Contact the RNIB to get your school’s free login.]
There are also a couple of other
Connect & Learn features that teachers tell us are worthy of note. The
Surface Pro charger connects magnetically, so it’s super easy for partially
sighted students to plug in the power. And the tablet weighs less than 2kg – so
much lighter to move between classes than your typical VI solution. It’s also
worth saying that because Connect & Learn is built on Windows 10 and uses MS
Office products, students are developing their essential IT skills for life
outside of school.
Connect & Learn is best summarised by a young Year 8 student with a visual impairment I recently met at a secondary school in the Midlands: “With Connect & Learn I don’t feel different!”
SeeAbility’s specialist teams are working to reduce isolation and promote education opportunities, ensuring that people with learning disabilities, autism and sight loss are able to access the latest technology to get connected and grow their independence.
Introducing voice activated home assistants, like Amazon’s Echo Dot, opens up whole worlds of possibilities for the people we support to live with greater independence. Using the Echo Dot is one of the easiest ways for someone with sight loss to verbally access the internet and gain immediate auditory response. It helps people do everything from turning on their favourite music to searching the internet for information and sharing ideas with others. It’s hugely empowering and gives individuals greater control in their lives.
Group activity and skills sessions at the Millennium Centre in Surrey have taken on a new energy and are far more interactive since the introduction of the Echo Dot. This new gadget has transformed everything from our music workshops to keep fit sessions.
Learning to make voice commands has taken time and a lot of support from our Vision Rehabilitation and Speech and Language Therapy teams. At first the people we support had to learn how to ask short, factual questions that could be deciphered by the Echo Dot. Some people struggled with getting their words out in time, but everyone enjoys having their questions answered, so they persevered.
David and Anne use an Echo Dot to choose music in their Book Club session
Sessions now offer a more inclusive experience since the Echo Dot has been embedded. In fact, our volunteers and specialist teams can engage more with the people they are supporting now that the voice activated technology is on hand to offer up answers to tricky questions. Workshops have more spontaneous interaction and people are developing confidence to lead conversations.
Anne says: “With Alexa it means you can hear lots of different types of music in the same session. We all get to request what we want on it. It’s made our music slot so much more fun.”
David, who has a love of folk and bluegrass music, uses Alexa to play obscure tracks as part of his music session. He says: “Alexa helps me find rare tracks and I enjoy hearing and singing along with my favourite songs and sharing them with my friends. It brings us closer.”
As we are now well and truly into the season of winter and approaching Christmas, I decided to discuss the Hive app and hub. The app is fully accessible for both Apple and Android devices and is relatively easy to navigate and use.
I purchased the Hive Hub back in August of this year and although it is pricey at £300, I have found the benefits to be enormous in terms of giving me more control of, and access to, my heating, lighting, security and everyday appliances.
As I
have a busy daily schedule, I wanted to make the process of operating my
heating that little bit easier to manage. Until I purchased Hive, I had no way
of setting timers or controlling the thermostat when trying to warm my home up
effectively. This is now a thing of the past thanks to the Hive app and hub.
From controlling heating and hot water, to switching lights on and off and even turning on everyday appliances such as the kettle using Smart Plugs, I have total control of my entire home using the Hive app on my iPhone at any time, anywhere. This has helped make my life much easier and, in theory, should help me keep my utility bills affordable.
So how does Hive work?
Firstly, you must have a smart meter and a reliable internet connection through your Wi-Fi router before even considering purchasing the Hive Hub. It is also important to note that Hive is exclusive to British Gas customers, so if you are with an alternative energy supplier you will need to contact them directly to find out if they offer a similar option.
You then need to have the hub installed by a qualified engineer. Once this has been done, you can manage and set up your smart thermostat, plugs and lights straight from the Hive app on your smartphone. The Hive thermostat allows you to monitor the temperature in your house, set heating timers and even set a target temperature so that if the temperature falls below the set degree, the heating will turn on automatically.
The
Hive Smart Plugs can make energy use more efficient by enabling electrical
appliances in the home to be turned on and off or by setting schedules. So for
example, if I am walking my guide dog and returning home, I can turn the kettle
on using my iPhone and it will be ready to pour as soon as I am through the
door. The use of smart plugs also gives the added benefit of allowing me to
switch off appliances I might have accidentally left on even though I am not
actually in the house. They basically help take the stress out of my working day because I can monitor everything on the go. The Hive smart light bulbs work in essentially the same way as the plugs, meaning you can schedule timers so the lights switch on when you are out of the house, which is really good from a security point of view. The additional advantage of being able to connect all of your Hive appliances and heating to a smart speaker, such as a Google Home or Amazon Echo, means that you can control everything just by using your voice.
In summary, finally going down the route of making my home
“smart” gives me total control of almost every appliance, heating and security,
all through the use of an app. It makes my life far easier and definitely gives
me peace of mind for the future, both in terms of monitoring my utility bills
and also for security reasons. In my personal opinion, for those of you who are
smart phone users and are reasonably tech savvy, this kind of technology could
be a game changer for you in the future.
Over the last few years it is fair to say that some extremely exciting, innovative and life-changing apps have been developed which increase the independence of people with all kinds of visual impairment. Once again, I am delighted to bring another of these to your attention. This time it’s an indoor navigation app called Clew.
Clew is a free iPhone app that records a user’s path and then guides you back to your starting point. Clew was created to help visually impaired people return to a location, such as a seat in a room. Designed to work indoors, Clew uses the camera on your iPhone to record a video of landmarks along your route. It will then save certain points, such as stairs and turns, and guide you back to your initial starting point.
How to use Clew
Hold your phone upright in front of you with the camera pointing straight ahead. Press the “Record Path” button, then walk the route you want the app to remember. It is worth stating that, at this stage, it is recommended that you ask for a sighted guide if possible when recording a route. Press “Stop Recording” at the end of the route. When ready to return, press the “Start Navigation” button and wait for
Clew to convert the information. Clew will then provide verbal, haptic and
visual feedback as you reverse the route. Visually, the screen shows an image of
the next part of the route with a red
pin indicating where to go. If the user veers off the path, the red pin
disappears off the edge of the screen. When walking along the desired path, a
clicking sound is heard; the clicking sound stops to indicate when the traveller
has veered. A whistle tone indicates a turn; the app also verbally announces
which way to turn. The route is available until the app is closed from the app
switcher. According to the website, Clew works best with short indoor routes
and it is not advisable to use the app when outdoors because of varying
lighting conditions and possible glare from sunlight which could interfere with
the video recording.
First impressions
I have been hoping for an app like Clew for years to assist with travelling independently in an indoor environment. I have mainly tested the app in familiar areas such as our centre and, so far, I am delighted to say that the
experience has been fantastic. I recently used it to create a route from my
upstairs office to the downstairs kitchen as sometimes I still get confused
with this route as I don’t do it very often. The app then guided me almost
flawlessly back to my office from the kitchen and only stopped giving me
directions when I was back at my chair. It even told me when I was coming up to
stairs and where to turn when I had reached the first landing and needed to go
up another flight. Sometimes the app does get confused if you have to make a
lot of left and right turns in quick succession, but for the most part it is
extremely reliable and has really increased my confidence when using it. I
think the technology used within the app is also extremely interesting as
everything is achieved by video. There is no GPS or internet connection
required to use the app which means you can use it in any indoor environment
you choose. Please note that this app is still in its early stages, so there will be a degree of trial and error when using it. The developers are planning to update the app regularly, though, and have lots of great ideas for improving its capabilities. Also, as always with these apps, please remember that Clew is not designed to be a replacement for a cane or guide dog. It is designed to work alongside your mobility aid, and you will still need to rely on your orientation and mobility skills when using it.
At the beginning of March, Microsoft released a new and extremely
innovative app designed to assist people with a visual impairment to navigate
and understand their surrounding environment. The app is called Microsoft
Soundscape and is free to download and use.
Note: At this stage, the app is only available for iPhone users.
Overview
Soundscape uses 3D audio sound to give you a full audio map of what’s
around you when you are out and about. It is designed to help you navigate
independently and encourage you to be more confident in exploring streets and
getting to destinations. The app announces streets and points of interest such
as shops, cafes etc. in 3D so that you actually hear the exact direction in
which the place is located. To use Soundscape accurately, you need to use a
pair of either bone-conduction headphones or Apple AirPods. As soon as you put the headphones on and start walking, the app will begin telling you
what is in your surrounding area.
Example:
As you walk along a street, you may suddenly hear directly in your right ear, “McDonalds, twenty five metres.” This means that there is a McDonalds restaurant not too far away from you, immediately to your right. Then if you wish, you can actually tell the app to direct you to McDonalds by setting a beacon. This will then give you constant audio information and feedback to let you know if you are heading in the direction of McDonalds or if you are going off course.
Operating the app
Soundscape offers three modes – ‘My Location’ tells you the direction
you are facing and the streets and intersections which are closest to you,
‘Around Me’ gives you places of interest which are near you in all four
cardinal directions, and ‘Ahead of Me’ provides the names of five places of interest
which are nearest to you and directly ahead of you. It is also worth mentioning
that the app is fully accessible and that VoiceOver does not have to be enabled
in order to use the app. So basically, anyone with a visual impairment can use
this app.
First impressions
I was actually a tester for this app when it was under development and
have been impressed with it from the very start. I find the app extremely easy
to use and another great feature of the app I haven’t mentioned yet is that it
is compatible with Apple Watch. This means that I don’t have to keep stopping
in the street and taking my iPhone out of my pocket when I want to use the app.
I can just tap the relevant button from my Apple Watch which is constantly on
my left wrist.
I think my favourite feature of the app is definitely the beacon
feature. I used the beacon feature once when it was snowing very heavily and I
wanted to get to my local Co-op. I became disorientated, but once I had told
the app to direct me to the Co-op, it kept me on track the whole way and always
kept me heading in the right direction. In fact, the app only stopped giving me
information when I was outside the door of the store. I felt very reassured after this experience, and I now use Soundscape almost every day when I am walking with my guide dog.
Important additional note! The app is designed to be used in tandem with your usual mobility aid, such as a cane or guide dog. Do not use this app on its own when you are travelling outside. Also, to use this app a constant internet connection is required, so a 3G or 4G mobile data plan is essential.
Seeing AI is a Microsoft research project that uses Artificial
Intelligence features to deliver an intelligent app, designed to assist
visually impaired people with performing everyday tasks such as reading text,
recognising people’s faces, identifying products, and identifying your
surroundings. This app is fully accessible with VoiceOver and magnification
features on iPhones.
How it works:
The app has nine different ‘channels’, each of which performs a specific task. Below is a brief description of each channel.
Short text:
Simply point the camera at text and it will be read aloud. Very useful for reading signs, text on noticeboards or even on food tins etc.
Document:
Hold the phone camera over a document such as a magazine, newspaper or letter and the app will automatically take a picture of the document and read it aloud to you. It will also try to format the document for you, so you have an idea of how the page is laid out.
Product:
Hold the camera over the barcode of a product and the app will scan it and then tell you the name of the product. You can also find additional information about the product such as cooking instructions or ingredients. Barcodes are sometimes difficult to locate on certain products, but a really cool feature of the app is that audible beeps are given to help with the locating and scanning of barcodes.
Person:
This channel allows you to take a picture of a person’s face and the app will then attempt to work out the age of the person and tell you what they look like. Note that you can also train the app to recognise people’s faces as well, so if you point the camera towards them, the app will actually tell you who they are.
Currency preview:
Point the camera at different types of currency such as Euro, Dollar and Pound to hear their values.
Scene preview:
Simply point the camera in front of you and take a picture. The app will attempt to describe everything in the view of the camera. Note that the Scene channel is still being developed, so is not always accurate in describing certain things. Hopefully it will improve over the next few months though.
Colour preview:
Just point the camera of your phone at any object and its colour will be announced. Note: lighting conditions are a factor.
Handwriting preview:
This experimental channel allows you to take a picture of handwriting and the app will attempt to recognise it and read it to you. Note: the text has to be the right way up for this channel to work.
Light detector:
The camera on the phone will detect the amount of light around you. This works using audible tones: the higher the pitch, the more light there is.
First impressions
I have used this app for around two months and am really enjoying most
of its features. The barcode and text reading facilities along with the facial
recognition features are extremely accurate most of the time. The colour,
handwriting and light detecting features also have huge potential, as does the
scene preview mode. To be honest though, I am a little unsure about the currency
feature, as now that plastic notes are being printed, all £10 notes have
Braille in the top left corner. On the whole though, I really feel that this
app has made a big difference in terms of increasing my independence on a daily
basis. Now I don’t need to depend on friends or family to read my mail or find
products in my freezer for me. The fact that all these features are available
in one easy to use app is fantastic!
Price:
Seeing AI is free to download and use.
Note: This app is currently only available for Apple products and requires a constant internet connection.
I have been working with Sarah Jones and the team at Create Education regarding 3D scanning and printing developments. Create Education have a 3D Printing loan scheme that may be of interest to the network.
For more information and how to apply please visit the Loan agreement page on the Create Education website: https://www.createeducation.com/loan-scheme/
The website also contains lots of information and resources to help people who are looking to develop their 3D printing and design skills.
Should you wish to get involved please let me know and I will connect you with Sarah who will help you to find out more.
Dawn Green, Karten Network and Development Co-ordinator
Over the summer we hosted a series of training sessions
called the Sight and Sound Summer School. We realise that for many people,
accessing training can be difficult and costly, so we made it as easy as
possible by delivering the sessions online, using the very popular Zoom
meeting platform.
Over the course of five days, we covered a range of solutions, with sessions on JAWS and keyboard shortcuts, ZoomText, RUBY magnifiers and Braille displays, finishing with a surgery-style session where attendees had the chance to ask us any questions about the Sight and Sound product range.
With August being the prime time for summer holidays, we
made sure that we recorded each session in both audio and video format for
people to catch up on. You can find the details for these sessions below:
Is there something else you’d like us to cover, or did you have a question that wasn’t quite answered in these sessions? We’d love to hear from you! You can email your suggestions or questions to carla.barker@sightandsound.co.uk