I wore Ray-Ban Meta Display smart glasses to watch the Super Bowl halftime show — and understood Bad Bunny in real time

Bad Bunny/Amanda Caswell wearing Ray-Ban Meta Display smart glasses
(Image credit: Getty Images/Future)

I’ve never been a passionate sports fan, but I love live music. In other words, I watch the Super Bowl for the halftime show. The energy, the talent and the excitement on such a big stage are fun to see regardless of the artist, in my opinion.

When I found out Bad Bunny was performing this year, I decided to run a small experiment. Instead of reaching for my phone when Bad Bunny started rapping in Spanish, I kept my hands in my lap — and my eyes on the stage. That’s because I was wearing Ray-Ban Meta Display smart glasses.

Ray-Ban Meta Display: $799 at Meta

These smart glasses look like normal Ray-Bans but quietly pack in cameras, speakers, and AI. You can translate conversations, identify what you’re looking at, make calls, and capture moments hands-free.

What I wanted to test

Meta Ray-Ban Display apps

(Image credit: Future)

I wasn’t expecting to demo tech while watching the Super Bowl. But there I was, testing whether these glasses could help me in a very normal, human way:

  • Could they help me understand lyrics in real time?
  • Could they provide context without forcing me onto my phone?
  • Could they keep me in the moment instead of dragging me out of it?

What Ray-Ban Meta Display glasses actually are

Meta Ray-Ban Display on charging case

(Image credit: Future)

Ray-Ban Meta Display glasses are Meta’s most advanced consumer AR-style smart glasses to date, and the first from a major brand to include an in-lens display.

Unlike regular “smart glasses,” which are mostly cameras and speakers, these can actually show visual information directly in your field of view via a tiny screen visible only to you.

Here’s the quick, clean rundown:

  • In-lens display: A full-color 600 × 600 pixel overlay that appears in your right lens
  • Meta AI: You can ask for translations, context, directions or messages hands-free
  • Neural Band: A wristband that lets you control the glasses with subtle muscle gestures
  • Built-in camera: You can take photos or videos and see a viewfinder in your glasses
  • Phone connectivity: Works with iOS and Android over Bluetooth and Wi-Fi
  • Battery: About 6 hours of use, plus a charging case
  • Price: Around $799 (more for Rx lenses)

Importantly, the display does not block your vision. It feels more like a subtle heads-up overlay that appears only when you need it — similar to a floating notification in your peripheral view. That said, you should not drive a vehicle while wearing them.

The halftime moment: when Bad Bunny came on

Bad Bunny puts on a hell of a show

(Image credit: Getty Images)

When Bad Bunny hit the stage, I did what I normally do: I watched, listened and tried to keep up.

Then I quietly switched over to translations with a subtle thumb-and-forefinger gesture on the Neural Band and said: “Hey Meta, what is he saying right now?”

Within seconds, text appeared in my lens. It wasn’t full karaoke subtitles streaming line by line all the time. Sometimes it only caught a few words or left some out (I know a little bit of Spanish and Bad Bunny’s lyrics well enough to notice).

But overall, I got concise, real-time translations of key phrases and themes — enough to understand the meaning without staring at words instead of the performance. I could still see the dancers. I could still feel the energy of the crowd. I could still watch the choreography and lighting.

For the first time, I didn’t feel “behind” in a multilingual performance. It was so cool.

What worked surprisingly well

Meta AI

(Image credit: Future)
  • I didn't need my phone. This was the biggest win. Normally, I’d be pulled into a mini black hole of scrolling. Instead, I stayed focused on the show.
  • The translations were fast enough to feel live. There was a slight delay, but not enough to break the experience. It felt like having a very quick translator sitting next to me.
  • It deepened my appreciation of the performance. Understanding the lyrics gave me a better emotional connection to what Bad Bunny was doing on stage.

What didn’t work perfectly

No tech is magic — and these glasses are no exception. If I wasn’t facing the TV directly, or the broadcast cut away from Bad Bunny to the dancers, the lines of translation would drop out.
And when that happened, it took a minute to get translations flowing again. All told, I probably missed about a quarter of the lyrics in real time.

Gesture control is cool, but it’s not completely intuitive right away, so I mostly relied on voice instead. I’ve been playing around with the wristband and hand motions, and it’s still taking me a little while to get used to everything.

The takeaway

For the first time, AI didn’t pull me out of the experience — it pulled me deeper into it. I didn’t leave halftime thinking, “Wow, the glasses are amazing.” I left thinking, “Wow — I actually understood Bad Bunny.” It made the halftime show even more impressive and interesting to me.




More from Tom's Guide

Amanda Caswell
AI Editor

Amanda Caswell is an award-winning journalist, bestselling YA author, and one of today’s leading voices in AI and technology. A celebrated contributor to various news outlets, her sharp insights and relatable storytelling have earned her a loyal readership. Amanda’s work has been recognized with prestigious honors, including outstanding contribution to media.

Known for her ability to bring clarity to even the most complex topics, Amanda seamlessly blends innovation and creativity, inspiring readers to embrace the power of AI and emerging technologies. As a certified prompt engineer, she continues to push the boundaries of how humans and AI can work together.

Beyond her journalism career, Amanda is a long-distance runner and mom of three. She lives in New Jersey.
