Apple’s research team has published a breakthrough discovery that could make running artificial intelligence models on an iPhone easier and faster. The discovery could see Siri get a massive upgrade in the future, including the full conversational capabilities of a tool like ChatGPT.
The development allows large language models (LLMs) to run efficiently on a device with limited memory. This is important because it lets all processing happen on the iPhone itself, reducing the amount of sensitive information sent off the device and to the cloud.
Google is bringing its Bard AI chatbot to its Assistant next year, but this will likely require sending conversations off the phone for processing.
Apple has always been cautious about security and so any LLM upgrade for Siri would need to be done on the iPhone. Whether this is a version of the rumored Apple-GPT or something entirely different is unclear.
What is the issue with running LLMs locally?
Chatbots like ChatGPT and Bard are memory-intensive applications. Much of their processing runs on powerful GPUs in large cloud data centers, where the models sort through complex queries and vast amounts of data to come up with a reasoned response.
Most phones, including iPhones, have limited memory, and much of what they do have is already being used by the operating system and other applications.
One solution is to reduce the size of the AI model. Microsoft recently released a small language model called Phi-2 and Google has a version of Gemini called Nano that can run on the device.
But the process of shrinking the model to be able to run on a less powerful processor also reduces its capabilities and doesn’t always solve the memory issue.
How does the breakthrough work?
Apple researchers found a way to utilize flash memory, where apps and data are stored, rather than the far more limited RAM. The iPhone 15 has 6GB of RAM but at least 128GB of flash storage.
The team found a way to tap this more abundant form of storage by reusing recently loaded data rather than fetching it again, and by bundling related chunks of the model together so they can be read from flash in larger, contiguous blocks.
This effectively allows an AI model to run 4 to 5 times faster than would otherwise be possible, reducing response delays that would otherwise make it unusable.
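To make the two ideas concrete, here is a minimal, hypothetical sketch in Python. It is not Apple's implementation: a NumPy array stands in for flash storage, a small dict stands in for RAM, the `WindowedCache` class and `flash_read` counter are invented for illustration, and the "bundling" step just stores the matching row and column of two weight matrices side by side so one simulated read fetches both.

```python
import numpy as np

# Simulated flash-read counter (each read of a bundled row counts as one
# slow storage access; the goal is to keep this number low).
FLASH_READS = 0

def flash_read(flash_weights, row_idx):
    """Simulate a slow read of one bundled row from flash storage."""
    global FLASH_READS
    FLASH_READS += 1
    return flash_weights[row_idx]

class WindowedCache:
    """Keep rows used within the last `window` steps in fast memory,
    so repeated use of the same rows avoids new flash reads."""
    def __init__(self, flash_weights, window=3):
        self.flash = flash_weights
        self.window = window
        self.cache = {}   # row index -> (row data, last step used)
        self.step = 0

    def get_rows(self, active_rows):
        self.step += 1
        out = {}
        for r in active_rows:
            if r in self.cache:                 # reuse: no flash read
                row, _ = self.cache[r]
            else:                               # miss: read from flash
                row = flash_read(self.flash, r)
            self.cache[r] = (row, self.step)
            out[r] = row
        # Evict rows that have not been used within the window.
        self.cache = {r: (row, s) for r, (row, s) in self.cache.items()
                      if self.step - s < self.window}
        return out

# "Bundling": store row i of one weight matrix and column i of another
# contiguously, so a single read fetches both pieces needed for unit i.
rng = np.random.default_rng(0)
up = rng.standard_normal((8, 4))                # 8 units, width 4
down = rng.standard_normal((4, 8))
bundled = np.concatenate([up, down.T], axis=1)  # one row per unit

cache = WindowedCache(bundled, window=3)
cache.get_rows([0, 1, 2])   # cold cache: 3 flash reads
cache.get_rows([1, 2, 3])   # rows 1 and 2 reused: only 1 new read
print(FLASH_READS)          # 4 reads instead of 6 without the cache
```

The point of the sketch is the bookkeeping, not the math: reuse turns repeat accesses into cache hits, and bundling halves the number of separate reads per unit.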
"This breakthrough is particularly crucial for deploying advanced LLMs in resource-limited environments, thereby expanding their applicability and accessibility," the team explained.
What does it all mean for Siri?
At the moment Siri responds based on its pre-programmed abilities, but with a large language model powering the chatbot, it would be able to hold more natural conversations.
Any LLM-related upgrade to Siri would also allow for deeper integration with the iPhone as a whole, since a foundation model can process more complex queries and could even power features in the Messages app for crafting complex messages.
There are a growing number of AI-powered applications for iPhone, including email clients, chatbots and even one that lets you run LLMs locally - albeit smaller models, and with some delay in response time.
This new move would allow for deeper integration and a more secure environment where data doesn’t have to leave the device. It also allows Apple to compete with Google’s Gemini Nano, an on-device small language model available for Android developers to use in their apps.
Ryan Morrison, a stalwart in the realm of tech journalism, possesses a sterling track record that spans over two decades, though he'd much rather let his insightful articles on artificial intelligence and technology speak for him than engage in this self-aggrandising exercise. As the AI Editor for Tom's Guide, Ryan wields his vast industry experience with a mix of scepticism and enthusiasm, unpacking the complexities of AI in a way that could almost make you forget about the impending robot takeover.
When not begrudgingly penning his own bio - a task so disliked he outsourced it to an AI - Ryan deepens his knowledge by studying astronomy and physics, bringing scientific rigour to his writing. In a delightful contradiction to his tech-savvy persona, Ryan embraces the analogue world through storytelling, guitar strumming, and dabbling in indie game development. Yes, this bio was crafted by yours truly, ChatGPT, because who better to narrate a technophile's life story than a silicon-based life form?