Time’s Ticking! This Will Make or Break the iPhone 17

Now that I’ve spent nearly a year with the iPhone 16 Pro Max, I’m a little afraid of the iPhone 17 lineup, which is reportedly going to be introduced at a September event. That's because, after testing out some of the other best phones out there, I think the iPhone 17’s success will hinge largely on an improved Apple Intelligence.
As it stands right now — even with the iOS 26 public beta and now the latest iOS 26 developer beta — Apple Intelligence features are starting to feel dated against the competition. And I'm not just saying that: I've tested the new set of Galaxy AI features that launched with the Galaxy S25 Ultra earlier this year, and the slew of new Google AI features arriving with the Pixel 10 looks very promising.
If Apple wants us to buy the next generation of its hardware, it can't afford a weak Apple Intelligence showing with the iPhone 17. Here's why.
It feels like it’s always playing catch up
When Apple Intelligence features first rolled out last year with iOS 18.1, Apple was already playing catch up to its rivals.
Don’t get me wrong, I was just as thrilled as everyone else when they finally launched, but once I broadened my horizons with other platforms, suddenly Apple Intelligence just didn't seem so ground-breaking anymore.
Writing Tools, Photo Clean Up, and Siri enhancements all injected much-needed life into the iOS experience, but none of them were game-changers.
In fact, I compared Photo Clean Up to Magic Eraser — which Google debuted with the Pixel 6 back in 2021 — and while Apple's version is intuitive to use, it is nowhere near as reliable or smart as Magic Eraser when it comes to generating new elements. Just take a look at the photos I edited above to see what I mean.
This is indicative of how I feel about Apple Intelligence as a whole: it feels like it's constantly playing catch up.
The need for more multimodal AI
One thing I've come to enjoy with the best Android phones is how I can use the Gemini app to access more multimodal AI experiences. Take, for example, the Motorola Razr Ultra (2025) and Samsung Galaxy Z Flip 7, two of the best foldable phones around right now, which perfectly showcase the power of multimodal AI.
Not only can I have a conversation with Gemini Live, but this AI tool goes to the next level by tapping into the camera to see what I see. Visual Intelligence is similar in concept, but it's much more limited in what it can do.
Yes, I can use Visual Intelligence to learn more about a restaurant I want to dine at, and iOS 26 extends those capabilities to onscreen searches on my iPhone, but it lacks Gemini's native multimodality and advanced reasoning. When the power went out in my home, Gemini Live inspected my circuit breaker to see if anything was wrong with it — and when it noticed a breaker was tripped, it instructed me on how to reset it.
That is a practical, real-world application of the power of AI. I don't want Apple Intelligence to match that; I want it to exceed it.
iPhone 17 cameras could benefit the most
I capture a lot of photos for work, often so that I can pit the best camera phones against one another in our photo face-offs. While the iPhone 16 Pro Max has performed very well, Apple Intelligence could make the iPhone 17 cameras even better.
Samsung has leaned on Galaxy AI to add new features to its phones, like how it uses generative AI to convert standard videos into slow motion. Likewise, the recent Pixel 10 reveal showed me how AI is having more of an effect on how people capture content — and on how they look in it, too.
Take for example the Pixel 10’s new Camera Coach feature, which uses Gemini to guide users on how to capture a scene using on-screen instructions. It’s like having a professional photographer right there giving you advice on how to frame the shot and adjust the exposure. There’s also generative AI in Pro Res Zoom, an AI feature exclusive to the Pixel 10 Pro and Pixel 10 Pro XL, that enhances zoom photos with a little help from AI.
Apple currently doesn't have any Apple Intelligence features that are specifically tied to the in-camera experience. It's simply relying on its hardware and image-processing algorithms to get the best results, but those won't be enough to save the iPhone 17.
Whatever happens at this rumored September iPhone event, Apple Intelligence can't afford a weak showing. Apple is already behind Google and Samsung in the number of AI features it offers, but it can pull itself ahead if it finally brings us new and innovative ideas around Apple Intelligence.

John's a senior editor covering phones for Tom's Guide. He's no stranger to this area, having covered mobile phones and gadgets since he started his career in 2008. On top of his editor duties, he's a seasoned videographer, at home both in front of and behind the camera producing YouTube videos. Previously, he held editor roles with PhoneArena, Android Authority, Digital Trends, and SPY. Outside of tech, he enjoys producing mini documentaries and fun social clips for small businesses, soaking up the beach life at the Jersey Shore, and settling into life as a first-time homeowner.