Ray-Ban Meta smart glasses just got a ton of upgrades, including new AI features and video calling

Ray-Ban Meta Smart Glasses
(Image credit: Future)

The Ray-Ban Meta Smart Glasses just gained some significant upgrades. In a push from Meta to make its photo-taking smart glasses even smarter, camera-enabled AI features are rolling out in beta to all users. New video calling functionality and a fresh frame shape are other Ray-Ban Meta Smart Glasses updates Mark Zuckerberg highlighted in a video shared to Instagram.

By tapping into what you're seeing through the cameras on the glasses, Meta AI can provide insights on what's in front of you. The assistant can identify objects, offer landmark information, and even translate languages encountered while traveling. As I explored a few months ago, Meta AI can look at your wardrobe and pick out matching outfits, too.

At a time of heightened competition among AI hardware products, Meta's glasses aim to deliver assistance through vision. But is eyesight by proxy enough to avoid the negative reception that muddied the Humane AI Pin's launch? The Humane AI Pin also has a camera for similar object identification and scene descriptions, but there's something frictionless about a camera that sits at eye level. Just say, "Hey Meta, look and...," and a response is returned privately through the glasses' speakers.

Having experimented with the multimodal AI integration through limited beta access in recent months, I've found that it mostly succeeds in identification. For example, Meta AI could name some New York City landmarks just by taking a picture through the glasses. But it's not right every time, and the glasses are prone to the same kind of occasional connectivity headaches that reviewers reported for the Humane AI Pin.


Ray-Ban Meta Smart Glasses: $299 @ Amazon
These smart glasses are a camera, Bluetooth headphones, and stylishly designed sunglasses all in one. Read our Ray-Ban Meta Smart Glasses review for more info.

That said, the Ray-Ban Meta Smart Glasses have a lot more going for them than just AI, and that's exactly what makes them more attractive to an average techie like me who isn't ready to tote around a gadget solely dedicated to AI. My favorite feature is how the glasses take quality photos and videos in a snap, as I learned when I compared them to my iPhone 15 Pro Max's cameras on an international trip. I've been able to score some impressive clips while bike riding and visiting a bird sanctuary, to name a few times hands-free capture has proven useful.

Beyond capturing content for keeps, the glasses can connect to Instagram for the in-app livestream feature. And new with this update, apps such as WhatsApp and Messenger let you switch to your Ray-Bans' camera on video calls to show what you're seeing from your perspective, sort of like livestreaming for whoever you're chatting with. Meta suggests this will come in handy for everything from bringing friends along to a concert over a call to showing your mom ingredients at the grocery store so you come home with the right items.

Good looks are a major perk of the Ray-Ban Meta Smart Glasses. They mostly look like an average pair of designer glasses. As part of this update, Meta and its design partner Ray-Ban have also revealed new designs for customers to choose from in the Ray-Ban Remix customization platform. The cat-eye Skyler frames join the existing Headliner and Wayfarer styles, introducing fresh lens and frame color options as well. The Headliner frames now offer a low bridge option for a more comfortable fit on certain nose and face shapes.

Can other AI hardware learn from Ray-Ban Meta Smart Glasses?

Meta reminds users in its blog post that the multimodal AI features are still in beta, so there will be times when Meta AI "doesn't get things quite right." Whereas devices like the Humane AI Pin don't offer much else if the AI functionality isn't working as advertised, the Ray-Ban Meta Smart Glasses have other use cases that make them worth getting.

That's not to say every gadget needs to be a dozen things in one, but if it's going to do one thing, it had better do it well. It seems like Meta is giving its AI room to improve by starting with an all-user beta, while continuing to refine the areas where the glasses already do well.


Kate Kozuch

Kate Kozuch is the managing editor of social and video at Tom’s Guide. She covers smartwatches, TVs and audio devices, too. Kate appears on Fox News to talk tech trends and runs the Tom's Guide TikTok account, which you should be following. When she’s not filming tech videos, you can find her taking up a new sport, mastering the NYT Crossword or channeling her inner celebrity chef.