Apple, like many other companies, uses its customers as "testers" to improve its products and services. Beyond the money they pay, there are many ways users can provide value, in this case at a technological level, by helping improve the company's infrastructure. In this post we look at Apple's interest in AI and at how it uses us, the users, to improve it without us even realizing.
Siri, the assistant that has fallen behind
Siri is one of the pillars of software interaction across Apple's ecosystem. And beyond being a voice assistant (which, of course, relies on AI), Apple has found a very original way for us to train the algorithm without noticing. Can you guess where this is going?
The Apple Music Voice plan is one of the answers: hundreds of thousands of voice requests to play music, browse artists and albums, and tune into internet radio stations, all executed from devices such as the HomePod or the Apple TV (among others). At a reduced price of €4.99 per month, we get access to millions of songs just by asking out loud.
Invoking the famous "Hey Siri" command to play songs on devices ranging from the Mac to CarPlay, by way of the HomePod or the iPhone, is not mere training for the assistant; it's a boot camp. A music plan that works only by voice, at a knockdown price to make it all the more attractive, aims to get as many people as possible saying "Hey Siri", so that Siri ends up learning our pronunciation, becoming more efficient at searches, giving more appropriate answers, and so on.
Shazam has an astonishing algorithm
Shazam is another way Apple trains its systems through its users. Since the company bought Shazam in 2018, features such as "Auto Shazam" have been added, which lets us press the Shazam button once and keep detecting songs from our surroundings. Shazam has even been built directly into Control Center, so music can be identified without opening the app at all.
Not to mention its direct integration with Apple Music, which recommends (marked with a little star) songs we haven't heard yet; on many occasions (because no algorithm is perfect) we end up liking them and adding them to our lists. The same thing happens in the playlists of new songs "created for you". Apple Music learns from our tastes and habits, helped along by the "Like" and "Recommend less of this style" buttons, to make its recommendations ever more efficient and accurate.
In fact, an Apple patent has been published in which Shazam would be able to recognize songs thanks to a movement of our body: for example, when a song we like comes on somewhere and we turn toward where the sound is coming from. It would work with motion sensors, and the music would be detected automatically, without touching the Shazam button or opening the application. All this thanks to improvements in sensors and the evolution of AI, which grows more capable by the day. That leaves us with a question worth reflecting on: could the motion sensors in AirPods lead the way here? It's impossible to know right now, but it would be a fairly logical move.
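To make the idea behind the patent more concrete, here is a minimal sketch of how a motion-triggered recognition flow might look. Everything in it is an assumption for illustration: the threshold, the function names, and the use of simple yaw (head rotation) samples stand in for whatever sensor fusion Apple's actual implementation would use.

```python
# Hypothetical sketch: start song recognition when motion sensors
# detect a sharp turn toward a sound source. Threshold and sample
# format are invented for illustration, not taken from the patent.

def detect_head_turn(yaw_samples, threshold_deg=45.0):
    """Return True if yaw jumps by more than threshold_deg between
    two consecutive samples (a quick turn toward the music)."""
    for prev, curr in zip(yaw_samples, yaw_samples[1:]):
        if abs(curr - prev) > threshold_deg:
            return True
    return False

def maybe_recognize(yaw_samples):
    # In a real system this would start audio capture and a
    # fingerprint lookup; here we only report whether the
    # trigger condition fired.
    return "start_recognition" if detect_head_turn(yaw_samples) else "idle"

print(maybe_recognize([0, 2, 3, 60, 61]))  # sharp turn: start_recognition
print(maybe_recognize([0, 1, 2, 3, 4]))    # no turn: idle
```

The design point is that the expensive step (listening and matching audio) only runs after a cheap sensor check, which is what would let such a feature stay hands-free without draining the battery.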