AI on phones feels too much like homework — and Apple should use iOS 26's redesign to fix that
AI needs to be a lot easier to find and use if it wants to succeed

Tech companies, especially the ones that make and sell phones, have been telling us how important AI is for several years, and offering a range of new AI-powered features that promise to make our lives easier.
But despite those promises, many of these companies seem to have forgotten the most important thing about new features — making sure people can actually find and use them. As someone who's spent a great deal of time reviewing new phones, I find nothing more irritating than being told about new AI features and then having to do a bunch of research just to figure out how to access them.
Even Apple, a company that has made its own AI suite a lot more transparent, is guilty of this to some degree. And since iOS 26 (previously expected to be called iOS 19) is set to redesign the entire Apple ecosystem, WWDC 2025 is a chance for the company to make AI feel less like homework and transform it into something a lot simpler and more intuitive.
Not all AI is hard to use or find
I'll preface this with an admission that not all AI features are difficult to find. If a feature involves some kind of voice assistant, like Google Gemini or Siri, then AI capabilities are literally only a voice prompt away. Similarly, AI photo editing features, like Google's Magic Editor, have long been available in photo gallery apps, like Google Photos or Apple's equivalent.
But then, these features have been around for quite some time, so their location and functionality are already ingrained in our collective memory. Plus, once you know about one of those features, you can often find similar ones in the same spot. Or, in the case of voice assistants, you can simply ask them what they can do.
There's also a bunch of AI working in the background that the user doesn't actually need to initiate. All that processing that happens to your photos? AI has a hand there, just as it does in translating foreign languages for you. We've also seen AI applied to software that manages the battery and display, helping phones run more smoothly and efficiently.
None of this is the flashy AI that gets promoted in keynote speeches or TV commercials. It's the boring stuff that makes your phone run and perform better, without you even realizing what's going on.
But if a company is trying to add some fuel to the AI hype train, the focus ends up on the new and showy AI features that look and sound good.
The problem is phone makers haven't put much consideration into helping users find the darn things.
AI needs to be second-nature to succeed
One good example I've found in this area is specialist translation apps — ones that do more than Google Translate. Samsung's Interpreter Mode is the one I've noticed this with most recently, offering the ability to translate two-way conversations happening in two different languages.
I know it exists, and Samsung has talked about it at great length, but looking at a Galaxy S25, it's nowhere to be seen. It's not in the app drawer, nor on the home screen, nor is it one of the default options in the Quick Settings menu. Instead you either have to use the search bar to find Interpreter Mode, or change the Quick Settings features to include it — which isn't ideal when you only have six to eight slots to play with.
The more I think about Galaxy AI features Samsung has talked about, the more I realize that I also have no idea where they are. The same is true for Apple Intelligence, Google Gemini and the countless other AI features that have been added to smartphones in recent years.
I made a point of criticizing this in my review of the Xiaomi 15 Ultra, but the problem is a lot more widespread than that — and it's like phone makers don't realize this is a complete hindrance.
In the days when new AI features weren't all that common, this probably wouldn't be so bad. Users had time to get to grips with new features as they arrived, and by the time the next big software update came around, they'd be second nature. But the sheer number of new AI features being added to phones, with little communication on how they work, makes this much more difficult.
The distinct lack of official guidance on how to use new AI features is definitely getting in the way of me wanting to use them — and I doubt I'm the only one who feels that way.
If phone makers really care about us using AI features more regularly, then this needs to change.
Bottom line
I've often spoken about my severe lack of interest when it comes to using AI on phones, and a big part of that is due to the fact it's usually so difficult to find any of the new features.
Back in the day, Apple would proudly declare that "it just works," with die-hard fans parroting that line for several years. But when your new smartphone comes with homework, it certainly isn't passing the intuitiveness test with a particularly good grade.
Apple's not the only party guilty of this, but with WWDC set to majorly shake up how Apple software works, be it on iPhone, Mac or another Apple product, the company is in a position to help users get to grips with Apple Intelligence without doing a thesis-load of research first.
Who knows, maybe making AI actually intuitive can help make up for all its AI missteps over the past year.

Tom is Tom's Guide's UK Phones Editor, tackling the latest smartphone news and vocally expressing his opinions about upcoming features or changes. It's a long way from his days as editor of Gizmodo UK, when pretty much everything was on the table. He's usually found trying to squeeze another giant Lego set onto the shelf, draining very large cups of coffee, or complaining about how terrible his Smart TV is.