Well-known insider Mark Gurman reports that the new AI features in the iPhone on iOS 18 will be powered entirely by Apple’s own large language model (LLM). The model will run directly on users’ devices rather than on servers, 9to5Mac writes.

Large language models that run directly on devices can offer fast query processing, but they are clearly outmatched by models running on huge server farms, which have tens of billions of parameters and constantly updated data.

However, Apple engineers are likely to take advantage of the full vertical integration of their platforms, with software tailored to the Apple silicon inside the devices.

While local LLMs may lack ChatGPT’s breadth of knowledge for answering questions about all sorts of random trivia, they can be tuned to be highly effective at the things a particular user actually cares about.

Another big plus is privacy. Because the data never leaves the device, an on-device model can process all of a user’s emails and text messages without raising the concerns a cloud-based AI would.

On-device models can also handle generative AI tasks, such as creating documents or images from prompts, with decent results.

Apple is likely to present its work at its WWDC conference in June.