I stopped weighing my food — and used Ray-Ban Meta glasses to track calories instead

Amanda Caswell wearing Ray-Ban Meta Display glasses
(Image credit: Future)

For three decades, my morning ritual has been a constant: I lace up my running shoes, head out into the dawn and eventually engage in the tedious math of post-run fueling. If you’ve ever tried to maintain a marathon training diet (or are simply trying to stay healthy), you know the "data entry burnout" is real. Scanning barcodes and weighing chicken breasts is the quickest way to suck the joy out of any meal.

But with Meta’s April 2026 update, that frustration just vanished for me. By integrating the Muse Spark multimodal model, my Ray-Ban Meta Display glasses have once again proven useful — even beyond translating the Super Bowl halftime show or shopping at Target.

Here’s what it’s actually like to let your eyewear audit your plate.

Ray-Ban Meta Display: $799 at Meta

These smart glasses look like normal Ray-Bans but quietly pack in cameras, speakers, and AI. You can translate conversations, identify what you’re looking at, make calls, and capture moments hands-free.

Less counting, more enjoyment

meta breakdown of food

(Image credit: Future)

Standing in my kitchen with a pre-run snack usually involves a "stop and scan" moment. Now, I simply put on my Ray-Ban Meta Display glasses and ask:


"Hey Meta, tell me approximately the calories in what I’m eating,"

Through the display, I saw the glasses instantly outline the banana and a handful of almonds using Muse Spark’s multimodal segmentation. A small overlay appeared in my line of sight: 105 calories for the banana, 160 for the almonds.

The glasses estimated the calories far more accurately than a standard flat photo ever could. I now do this throughout the day with whatever I eat, whether it's a salad, apple or even a cheeseburger.

Beyond the label

Starbucks Nitro Cold Brew

(Image credit: Starbucks)

The real test happened at the local coffee shop. As I looked at my cup, the glasses identified the "Starbucks" logo and the size of the container. Because I had previously synced my preferences in the Meta View app, it knew I opted for oat milk.

It estimated the sugar content, and I was able to add it to my food log before I even took the first sip. This, to me, is one of the coolest things about AI: the information quietly exists in the periphery of your life, and the model can spot patterns and surface details based on its training data.

Can AI really 'see' a recipe?

breakdown of calories in food

(Image credit: Future)

Of course, this experiment wouldn't be a true test without an accuracy challenge. It's not hard to identify a whole apple with fairly accurate data. However, it's much harder to identify a home-cooked chili. To stress-test the model, I used a "context hack." While sautéing onions, I looked at the pan and said, "Meta, I’m adding two tablespoons of olive oil and a pound of lean ground turkey."

Later, when I sat down to eat, the glasses didn’t just see "chili." They used recall to remember the ingredients from thirty minutes earlier. This solves the biggest hurdle in AI nutrition for me: the hidden fats and sugars that exist in a dish but that a camera alone can’t see. Although the numbers are estimates, that's better than not knowing at all (or trying to do the math manually — not my strong suit).

Social stealth with the Neural Band

Meta Ray-Ban Display neural band

(Image credit: Future)

I have to say that the most futuristic moment happened at a recent party. While I don't usually wear the Meta Display glasses around people (talking to your glasses while your friends are passing the salad is a social non-starter), the Meta Neural Band changed the game.

When a notification appeared in my view asking to confirm a "Large Garden Salad," I used a subtle "pinch" gesture with my hand under the table. The band’s sensors picked up the motor intent and confirmed the image. It was the first time calorie tracking felt truly invisible.

Bottom line

Using Ray-Ban Meta glasses to help me eat healthier is not something I had on my bingo card for 2026. It's wild to be witnessing firsthand a fundamental shift in how AI can be used for personal wellness.

With the arrival of Muse Spark and the Ray-Ban Meta Display, it's clear we are moving toward a state of ambient wellness, where your nutritional load is tracked with the same "set it and forget it" ease as your heart rate or daily steps.

Although the information given is approximate, that's enough for me. For the runner, this means the end of "data entry burnout" after a grueling 20-miler. For the parent, it’s one less mental plate to spin as we attempt to reduce the copious amounts of sugar our kids consume. And for the tech enthusiast, it’s the first true proof-of-concept for the Personal Superintelligence Meta has been promising. I'm excited to keep experimenting with what Meta can do.

Amanda Caswell
AI Editor

Amanda Caswell is one of today’s leading voices in AI and technology. A celebrated contributor to various news outlets, her sharp insights and relatable storytelling have earned her a loyal readership. Amanda’s work has been recognized with prestigious honors, including outstanding contribution to media.

Known for her ability to bring clarity to even the most complex topics, Amanda seamlessly blends innovation and creativity, inspiring readers to embrace the power of AI and emerging technologies. As a certified prompt engineer, she continues to push the boundaries of how humans and AI can work together.

Beyond her journalism career, Amanda is a long-distance runner and mom of three. She lives in New Jersey.
