This Made-for-iPad accessory can transform lives


People with conditions such as ALS, motor neurone disease, cerebral palsy, or spinal cord injury can control their iPads using only their eyes.


In a boost to accessible technology, people with conditions such as amyotrophic lateral sclerosis (ALS), motor neurone disease (MND), cerebral palsy, or spinal cord injury can now control their iPads using only their eyes, thanks to a newly introduced device.

TD Pilot brings users a voice and control

TD Pilot makes it possible for a user to control an iPad, use apps, and even generate natural-sounding speech using only their eyes. It relies on the eye-tracking device support Apple introduced in iPadOS 15 and is medically certified for use by people with conditions such as ALS and cerebral palsy.

The product is an authorized Made-for-iPad accessory developed by Tobii subsidiary Tobii Dynavox in collaboration with Apple. Tobii is a global leader in eye-tracking technology, with solutions in use across thousands of enterprises and research institutes worldwide.

What is this and what does it do?

The system combines the iPad, the custom cover, and Tobii Dynavox apps running on the device. These apps include TD Talk, which generates natural-sounding speech, and TD Snap, described as a symbol-supported solution to facilitate communication.

The rugged, water- and dust-proof cover is also of note. It augments the iPad with powerful speakers, a battery pack, and wheelchair mounting, and it carries a small rear-mounted display that mirrors what the TD Pilot user is saying, helping face-to-face communication feel more natural.

Eye-tracking tech can be unreliable in bright light, but Tobii Dynavox says its system, which is available today, can track the eyes even in bright sunlight.

Apple builds foundational technology

The product's introduction hasn't come easily. Not only have Apple and Tobii Dynavox worked together on it for some time, but the iPadOS support it relies on has been years in the making.

Apple has always been ahead of the game in accessibility. But some features, such as eye tracking or gesture detection, have taken time to build, though it is already possible to control devices using its innovative VoiceOver technologies.

The principle of "build it and they will come" applies here, too, of course. Now that Apple's mobile products support such a wide array of accessibility features, it seems inevitable we'll see more developers deliver solutions of this kind to Apple's market.

Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, said in a TD Pilot statement:

“We are excited that people who need this technology will have the opportunity to control iPad using just their eyes…. We build foundational technology, including support for eye tracking, into our operating systems to make them accessible, and we’re thrilled that Tobii Dynavox’s TD Pilot is leveraging that to enable people with disabilities to pursue their passions.”

Herrlinger will speak alongside Apple's AI/ML accessibility research lead, Jeffrey Bigham, at the Sight Tech Global Conference on Dec. 1, in a session called "Designing for Everyone: Accessibility and Machine Learning at Apple." Herrlinger also spoke at the same event in 2020, when she focused on VoiceOver Recognition and machine learning.

Tools that make a difference

Solutions such as these can make a real difference to people. Team Gleason is one of the biggest ALS non-profits. Its Chief Impact Officer, Blair Casey, called the TD Pilot for iPad launch “a significant turning point for accessibility.”

In part, that's because this powerful technology is now available on an iPad for the first time; until now, it has only been offered on Windows. “So many people that need eye tracking technology are forced to abandon their native technology,” Casey said.

If there is a snag, it’s the cost.

The full system can cost many thousands of dollars, and while it is available for iPad, it's not yet offered for the Mac. It seems inevitable that the price will put it out of reach of many people who would benefit from it, even though the results can be profound. To be fair, this isn't precisely a plug-and-play solution; a great deal of work must also go into training and assessing a person's condition.

Competing solutions that use the same technology are also beginning to emerge, so hopefully costs will fall over time, enabling even more people to make use of it.
