The imminent arrival of iOS 17 has already given us a first look at Apple's official announcements. These focus on accessibility and will improve many of the features we already have today. We will also see new functions that will change the way we interact with the iPhone. So if you want to know what they are about, in this post we will introduce them to you.
What you see, as important as what you hear
One of the best accessibility features on the iPhone is Magnifier. This built-in app lets us use the camera to zoom in on objects and text that, due to vision problems, we might not be able to see with the naked eye. Now it will expand its capabilities thanks to Point and Speak, a new option that will arrive with iOS 17.
When we open the Magnifier app and point the camera at something, if that object has text on it, we can have the iPhone tell us what we are pointing at by reading that text aloud. This is possible thanks to the combination of the following elements:
- The iPhone's own camera
- The LiDAR sensor that some models incorporate
- The VoiceOver functionality
- The iPhone's artificial intelligence, which makes the device learn as we use it
So this is not a function that, once it arrives, simply "stays there." Rather, as we use it, it gets to know our usage habits and can recognize objects, and therefore distinguish text, with increasing accuracy. As the company itself explains, this new tool is designed for people who are blind or have severe vision problems.
The iPhone's new accessibility features cover a lot
One of the greatest strengths of the Apple ecosystem is the integration between devices. Here the Mac comes into play, since it will be compatible with a type of accessory that was already supported on the phones. We are talking about hearing devices: if we already had devices compatible with the Made for iPhone standard, now we can also pair them with the computer.
But the iPhone not only helps us hear better; it also speaks for us and can read what we ask of it. This is possible thanks to the Voice Control and VoiceOver functions. With iOS 17, these have been enhanced, expanding what they can do. We can now ask the iPhone, iPad or Mac, through our voice, to carry out the different actions we want, thanks to the new Voice Control guide.
Continuing in the field of auditory accessibility, Apple has also improved VoiceOver. In this case, the improvement lies in how natural Siri's voice sounds when we vary the reading speed. When we decrease or increase the speed at which we want the iPhone to read something to us, we can now understand it much better, thanks to improved diction and pronunciation.