Each passing day brings us closer to WWDC24, one of Apple's most important events of the last decade. Not only because of the major news expected around the company's new operating systems, but because it seems to be the event at which Apple will reveal its plans for artificial intelligence (AI) in the near future. The latest rumors published by Mark Gurman continue to suggest that Apple will integrate a large language model that runs on-device to power generative AI locally. In this way, user security and privacy would be guaranteed, two fundamental pillars for Apple.

A local LLM to enhance the AI of iOS 18

January marked the start of the strongest rumors about Apple's plans for iOS 18 and the integration of its artificial intelligence. We have learned a great deal since then, including Apple's possible collaborations with Google or OpenAI to bring more mature AI tools and platforms to its devices. However, the main rumor, repeated constantly, is that Apple is likely to take its first step into iOS 18 AI with a local LLM.

Related article: The first AI features in iOS 18 will run on the device

A large language model (LLM) is a type of AI capable of performing a range of tasks, depending on the parameters with which its neural network was trained and the objective for which it was created. Examples include OpenAI's ChatGPT, one of the most widely used AI chatbots of the moment, and Google's Gemini, which offers similar capabilities. According to the latest rumors leaked by Mark Gurman, Apple is considering integrating its own LLM on the device itself, running locally, something we had also been hearing for months.

Once again this week, Gurman's Sunday newsletter supports the idea of local AI, now framed from Apple's point of view: the company could easily argue that an on-device LLM defends privacy and protects the user. Running AI locally does have limitations, however, such as the lack of information needed for certain tasks, since the model is not connected to the network or to a continuously updated database. But having this LLM on the device would allow a complete scan of its contents (emails, SMS, etc.) and would let iOS 18 understand the user better in order to anticipate responses and other needs, something we might not be able to achieve as safely if iOS 18 were connected to an LLM in the cloud.