Next Thursday is Global Accessibility Awareness Day, a moment to reflect on all the ways that tech companies like Apple (with the iPhone and iPad), their competitors, governments and humanity as a whole can make life easier for people with disabilities. Apple wants to contribute with new updates that bring AI to its hardware and let people speak in their own voice. It leaves OMT editor Dennis Mons speechless, in the best way.
A sneak peek: OMT spoke with Eva Eikhout about the accessibility of technology such as the iPhone and iPad. She explained in great detail how these features can improve the lives of many people with disabilities; that interview will follow soon. In the meantime, Apple is about to release an update that will allow people to speak again. Intriguing.
Apple’s iPhone may become your best buddy on Accessibility Day
Do not underestimate the fact that around eighty percent of people with ALS (Amyotrophic Lateral Sclerosis) also have difficulty speaking. Not being able to communicate is unimaginably agonizing, a “hell hole”. But an iPhone or iPad can help with that.
So Apple is announcing new accessibility features that allow people with that disease, for example, to speak to loved ones through their iPhone, iPad or Mac in their own voice. It also combines LiDAR tech so one can use the Magnifier app to share messages more easily. And it reads typed messages aloud in the user’s voice.
Apple’s technology was developed in partnership with Team Gleason, a nonprofit foundation for ALS awareness. Personal Voice is without a doubt the most exciting feature, because instead of one of the pre-made Siri voices (Australian, English), the iPhone uses a synthesized version of your own voice to say whatever you type.
Does it capture the nuance of, say, the bite of sarcasm? Nah, but somehow I think it’s just a matter of time before Apple’s AI learns how you react to things.
What is ALS?
ALS is a progressive neurological disorder that affects motor nerve cells. It leads to muscle weakness, paralysis and eventually breathing problems. Patients gradually lose their ability to speak, move and breathe. Although there is currently no cure, there are treatments and support to alleviate symptoms and improve quality of life.
If I can do it with friends of mine who can barely speak (I can tell when they’re being *ssholes), then AI on the iPhone should soon be able to do it too. I personally can’t wait to hear them again.
Teaching a dog a trick
Like me (if slightly more slowly), AI on an iPhone can figure out what someone wants to say. To train Apple’s system, you position yourself about 15 to 30 centimeters from the iPhone’s microphone and repeat a series of randomly selected phrases.
That is enough to train the machine learning (ML) model on the iPhone and enable the phone to repeat whatever you type in your synthetically generated voice. I immediately feel like hacking a Furby and putting it in the bedroom when my girlfriend is thinking about going to sleep.
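For the developers among us: once a Personal Voice has been trained, apps can (with the user’s permission) speak typed text through it. A minimal sketch of what that looks like with Apple’s AVFoundation speech API, based on the iOS 17 additions (the authorization call and the Personal Voice trait check); the example phrase is my own invention:

```swift
import AVFoundation

// Keep a reference alive; a synthesizer that goes out of scope stops speaking.
let synthesizer = AVSpeechSynthesizer()

// Ask the user for permission to use their Personal Voice (iOS 17+).
AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    guard status == .authorized else { return }

    // Look for a voice flagged as a Personal Voice among the installed voices.
    let personalVoice = AVSpeechSynthesisVoice.speechVoices()
        .first { $0.voiceTraits.contains(.isPersonalVoice) }

    // Speak typed text in the user's own synthesized voice,
    // falling back to a stock voice if none has been trained.
    let utterance = AVSpeechUtterance(string: "Good night, I'm turning in.")
    utterance.voice = personalVoice ?? AVSpeechSynthesisVoice(language: "en-US")
    synthesizer.speak(utterance)
}
```

This only runs on a device where a Personal Voice has actually been trained and shared with apps via the accessibility settings; on anything else the fallback stock voice does the talking.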
Although many companies are falling over each other when it comes to AI, I personally feel that Apple is handling this more cleverly than the rest. All the clever search engines notwithstanding, it’s more important to literally feel heard on your iPhone, iPad or even Android, especially on Accessibility Day.