Apple announces Visual Intelligence — its take on Google Lens
Vision AI on iPhone 16
Apple is bringing a new AI-powered feature to the iPhone 16 that will let users turn the camera into a glorified visual search engine. It is similar to Google Lens on Android, but it is powered by Apple Intelligence and integrates with any app or service running on the phone.
Visual Intelligence is essentially Vision AI, where a language model can analyze and understand images. Claude, Gemini and ChatGPT are all able to do this well too.
With its deep integration into the iPhone 16, including access to the new Camera Control button, Apple's approach is likely to be much more user-friendly. One example given during the Glowtime event was using it to add an event from a poster to your calendar.
What is Visual Intelligence?

Apple Visual Intelligence was one of the standout announcements for me during Glowtime. Vision AI is likely to be the most user-friendly AI feature as it lets the AI see the world around us.
Some Vision AI features have been available on the iPhone for some time, including copying text from an image or identifying an animal in a photo, but Visual Intelligence brings those features into the real world via the live camera.
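Apple hasn't published a developer API for Visual Intelligence itself, but the text-copying capability mentioned above is already available to developers through the Vision framework. Here is a minimal Swift sketch of that existing on-device building block (the function name and threshold choices are mine, not Apple's):

```swift
import UIKit
import Vision

// A rough sketch of the existing on-device capability: pulling text out of
// an image with Apple's Vision framework. Visual Intelligence has no public
// API; this only illustrates the building block Apple already ships.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { return completion([]) }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Take the single best candidate string from each detected text region.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate // slower but better for posters and signs

    try? VNImageRequestHandler(cgImage: cgImage).perform([request])
}
```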
Using a combination of on-device models and cloud-based ones (the latter running through Apple's Private Cloud Compute), the AI can analyze what the camera is seeing in near real time and provide feedback.
What it does with the image depends on what the user wants. For example, it could add an event to the calendar if it spots one in the image, or it could simply tell you a dog's breed. Alternatively, if you see a product you want to buy, you could have Apple Intelligence send the search to Google.
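To make the poster-to-calendar example concrete: once a title and date have been extracted from the image, saving the event is a standard EventKit call. This is a hedged sketch of that final step under my own assumptions about the helper's name and parameters, not Apple's actual Visual Intelligence code:

```swift
import EventKit

// Sketch of the calendar half of the "event from a poster" example. The
// title and dates are assumed to have been extracted from the image already.
func addEvent(title: String, start: Date, end: Date) {
    let store = EKEventStore()
    // iOS 17+ calendar permission prompt; nothing is saved if the user declines.
    store.requestFullAccessToEvents { granted, _ in
        guard granted else { return }

        let event = EKEvent(eventStore: store)
        event.title = title
        event.startDate = start
        event.endDate = end
        event.calendar = store.defaultCalendarForNewEvents

        try? store.save(event, span: .thisEvent)
    }
}
```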
How secure is Visual Intelligence?
Apple says it doesn't store any images captured as part of an Apple Intelligence search, and it deletes images sent to the cloud for deeper analysis.
Much of the data gathered by Apple Intelligence features, including Visual Intelligence, is processed on device, particularly on the iPhone 16 with its powerful new A18 chip. Where data does go to the cloud, Apple says it goes to great lengths to protect it.
This is largely driven by its Private Cloud Compute, a new cloud system built on Apple Silicon and a custom version of the iPhone operating system. As well as ensuring nothing is accessible to anyone beyond the user, the architecture is also open to audit by third parties.
If a user opts in to send data to a third party, such as Google for search or OpenAI's ChatGPT for deeper analysis, that data won't have the same protections. But Apple says this will always be optional, with nothing sent without express permission.
What are the use cases for Apple Visual Intelligence?
Apple Visual Intelligence gives the AI a view of the world outside your phone. It can be used to take a photo of a bag of groceries and have the AI generate a recipe, or of an empty fridge and have it generate a shopping list.
Beyond food, it could be used for live translation of signs, to flag potentially risky ingredients for someone with allergies, or to identify a location from a simple photo.
If you take a photo of a dog, you can go into Photos and Apple Intelligence will tell you the breed. Now you won't have to take the photo at all, as simply holding the camera up to the dog will give you that information. This will also work with spiders or any other animal.
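The animal identification described here can be approximated today with Vision's built-in image classifier. A minimal sketch follows, with the caveat that Apple's general-purpose labels are coarser than the breed-level results Photos can show, and the confidence cutoff is an arbitrary choice of mine:

```swift
import UIKit
import Vision

// Sketch of on-device image classification: ask Vision's built-in classifier
// what is in the frame and keep the confident labels ("dog", "spider", etc.).
func classify(_ image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { return completion([]) }

    let request = VNClassifyImageRequest { request, _ in
        let observations = request.results as? [VNClassificationObservation] ?? []
        completion(observations.filter { $0.confidence > 0.3 }.map(\.identifier))
    }

    try? VNImageRequestHandler(cgImage: cgImage).perform([request])
}
```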
There are as many use cases as there are things to look at. It could be used to get the history of a building, find a review of a book or even get a link to buy a bike you spot. It is an impressive and logical feature to build into the iPhone 16.
More from Tom's Guide
- iOS 18 release date rumors — here’s when it might launch
- Samsung Galaxy S25 Ultra leak reveals major upgrade to fight iPhone 16 Pro Max
- Android 16 could make an iPhone-inspired change to your notifications
