WWDC is just around the corner, and with it will arrive not only new versions of iOS but also new features. Some of them have already been previewed, and they will completely change how we interact with the iPhone. This time, the changes are designed for people with disabilities. And beyond the new features themselves, some are powered by artificial intelligence in remarkable ways, making it possible to do things like recreate our own voice.
The iPhone now speaks like us
Artificial intelligence is coming to the iPhone in a way we have never seen before. This technology, applied to accessibility, makes it possible to digitize our voice from recordings we make ourselves. There is no need to download third-party programs or use peripherals: you can "create your own synthetic voice on your iPhone, in 15 minutes," as Apple itself explains.
The two features are called Live Speech and Personal Voice. Live Speech lets you type short phrases and have them spoken aloud during phone and FaceTime calls. They can be played in our own voice, and the feature is intended for people with speech difficulties (or who have lost the ability to speak altogether).
Personal Voice lets us record fragments of our own voice and use them instead of the predefined voices typical of text-to-speech apps such as Loquendo. People at risk of losing their speech will be able to record a set of short phrases, which the iPhone then uses as input to build a synthetic voice.
What you hear is as important as what you see
Accessibility features are a very important part of the iPhone experience, yet many of them remain hidden and are only discovered by users themselves. Likewise, features planned for upcoming system versions usually stay out of sight until they are officially presented.
In this case, it is the company itself that has revealed new features focused on vision problems, which involve a complete redesign of the interface and the merging of applications to unify similar functions in one place.
Assistive Access is here to stay
The interface redesign comes via Assistive Access, a simplification and unification of elements so that people with different types of disabilities can use the iPhone without difficulty. It offers simplified menus and large buttons with even larger text, initially in the following native applications:
- The Home Screen
- The Camera app
- FaceTime and the Phone app, merged into one
- Apple Music
- The Photos app
Thus, from a single place, and thanks to the redesign of these apps, people with vision or mobility impairments will be able to access their basic functions. Taking a photo, calling someone, or sending a message will now be a much easier task.