At WWDC 2023, Apple showed off a fantastic iOS 17 feature: Live Voicemail, which transcribes incoming voicemails into plain text as they are being left. That lets you decide on the spot whether to pick up, call back later, or simply let the call go. Great for your iPhone 14 or the upcoming iPhone 15, but a look at the Google Pixel shows it could be even better.
This new Live Voicemail feature can save you a lot of time, but it also immediately raises a question: why can the Phone app do this and not the Voice Memos app? The odd thing is that Google's Pixel phones have been doing exactly this, and doing it excellently, for years. We want it on the Apple iPhone 15, too.
Apple’s iOS 17 on iPhone 15 lags behind the Pixel
It’s a bit like we’re back to square one. Remember the days when you had to jailbreak an iPhone and install Cydia to get features Android already had? The same seems to be true of transcribing voice to text.
The Google Pixel has had a Recorder app capable of converting audio to text since 2019, and it has made life (and work) easier for a lot of people. That Apple is introducing a feature like Live Voicemail with iOS 17 is nice, but why not roll the capability out more widely?
That doesn’t mean iPhone users have to go without. There are plenty of (paid) apps that offer the functionality, but going that route feels cumbersome, especially when competitors offer the feature for free, and Apple itself now provides it at no charge in its own Phone app.
Instant text on the Google Pixel. (Image: Google)
Easily share and save everything
What’s striking is how well the Pixel pulls this off. Record a conversation or (in our case) an interview with the Recorder app, and you can paste the transcribed text straight into a Google Docs document, or share it with others right away. That would be a welcome addition on an iPhone 15.
While the Google app is certainly not flawless, it does make your life a lot easier. So it’s somewhat frustrating that Apple doesn’t simply bring the transcription feature in iOS 17 to the Voice Memos app as well. Perhaps the company is waiting for a sufficiently capable version of its AI?