I’ve been a speech and language therapist for more than 10 years. Throughout my career, I’ve specialized in working with people living with speech and motor impairments, particularly those who are non-verbal and require assistance to communicate. It’s more than a job for me; it’s a passion. Every day, I strive to help people find easier and more accessible ways to express their everyday needs, opinions, feelings and identity.
Assistive technology helps this community express themselves. For example, eye gaze technology helps people type messages on a communication device and share them using eye movement alone. As mobile devices become more ubiquitous and powerful, with technologies like machine learning built right into them, I’ve thought about the ways phones can work alongside assistive technologies. Together, these tools can open up new possibilities—especially for people around the world who might now have access to this technology for the very first time.
Turns out a small group of people at Google has been experimenting with a similar idea: how can we use the latest technology to help people communicate? Earlier this year I started working with them on a project called Look to Speak, an app that lets people use their eyes to choose pre-written phrases for their phone to speak out loud. Today, Look to Speak is available to everyone and is compatible with Android 9.0 and above, including Android One.
With the app, people simply have to look left, right or up to quickly select what they want to say from a list of phrases. Perhaps my favorite feature is the ability to personalize the words and phrases—it lets people share their authentic voice. The eye gaze sensitivity settings can be adjusted, and all of the data is private and never leaves the phone. To help people put this app to use, we created a tutorial and a guide with top tips, like how to position the phone and use the simplified eye gaze interaction.
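One way to picture the left/right/up interaction is as repeatedly splitting the phrase list in half until a single phrase remains. Here is a minimal Python sketch of that idea; the function name, the gaze event strings and the binary-split strategy are all illustrative assumptions, not the app's actual implementation:

```python
def select_phrase(phrases, gazes):
    """Narrow a phrase list with successive gaze events.

    Hypothetical sketch: "left"/"right" keep one half of the
    remaining candidates, "up" cancels the selection.
    Returns the chosen phrase, or None if cancelled.
    """
    candidates = list(phrases)
    for gaze in gazes:
        if gaze == "up":               # cancel and start over
            return None
        mid = (len(candidates) + 1) // 2
        if gaze == "left":             # keep the left half of the list
            candidates = candidates[:mid]
        elif gaze == "right":          # keep the right half
            candidates = candidates[mid:]
        if len(candidates) == 1:       # one phrase left: speak it
            return candidates[0]
    return None

phrases = ["Hello", "Thank you", "I'm hungry", "Yes"]
# Two gazes are enough to single out one of four phrases.
print(select_phrase(phrases, ["left", "right"]))
```

With a split like this, a list of n phrases needs only about log2(n) glances, which is part of what makes the simplified eye gaze interaction quick.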