I asked Amazon if it's making new Alexa+ glasses to rival Meta Ray-Ban Display — and I got a very interesting answer
A candid conversation with Amazon's VP of Alexa+

At Amazon’s Devices & Services event yesterday (September 30), the company introduced the first generation of products designed to take full advantage of Alexa+’s capabilities. From new Echo smart speakers and displays to new Fire TV devices and Ring cameras, the increased processing power in these gadgets allow for deeper integration with Amazon’s AI-powered voice assistant.
To learn more about what these devices could do, as well as where Amazon is heading with other Alexa+ enabled devices — such as the next generation of smart glasses and how ordering a pizza is about to become a lot easier — we sat down with Daniel Rausch, Amazon's vice president of Alexa+ and Echo.
The following interview has been edited for clarity and brevity.
Alexa+ was released as early access back in March. What are the biggest things you’ve learned since then?
Customers find it more capable, more conversational. As you heard Panos [Panay, head of Amazon's Devices & Services business] say, they're engaging in conversations with Alexa+ twice as much. We see customers engaging with many more things, just breadth-wise, across the experience.
I think things like recipes were a bit of a niche feature, but you saw Panos say, well, they're up 5x. With Alexa+, you get more of a cooking assistant than just something giving you a recipe. I can make substitutions for ingredients: I don't have cake flour, what could I use? Well, use X or Y. Or, actually, someone's allergic to gluten. Okay, let's substitute this much almond flour on the fly.
And we're also learning some interesting things about having Alexa teach customers what she can do, and how important Alexa's expertise in her own capabilities is. Customers right away get into these conversations about things like, what can you do? What are all the new things you know? And she can get into a conversation about that.
A lot of Amazon's focus with Alexa has been on the home, with devices like Echo. How are you looking to take it outside of the house? You've had the Echo Frames for a while, but what is the next iteration?
We already see customers loving the Alexa+ experience on Echo Buds and Echo Frames, and customers are engaging more deeply with Alexa on both of those products. I'll tell you, the team's busy at work. Stay tuned on those products. But we certainly know that customers want to engage with Alexa+ both inside and outside the home, and they're eager to do so.
Okay, so can you just tell us what you think in general about the smart glasses market at this point? Because you were one of the first right out there.
You have to get it right. That's what I would say about the wearables market. You have to remember so many things about a customer. Frames are, you know, they're a piece of jewelry in a way, right? I've been wearing glasses since middle school. They're part of my identity. If I see myself in the mirror and I don't have glasses on, it's just weird, right? They're part of my face.
Earbuds, frankly, are the same: they have to be super tuned, using our database of internal ear anatomy, to make sure anything we design fits a full range of ears. It's incredibly complicated to build something for the human body. These products have to show up and sort of find their way into a completely fluid experience.
We talk about ambient AI for the home. We think about a personal ambient experience as well. You truly need it to fade into the background. You do not need technology, you know, calling out to you all the time. It takes our attention away, right?
[Pointing to a phone whose screen just turned on] Just that screen lighting up, I saw all three of us look over at it, right? So it has to be super intentionally designed. And I think as we build personal AI products, we have to keep all those things in mind too. We are seeing for sure that the voice experience as we have it is designed for the frames and the buds in particular.
But to your point, if there's a display inside the glasses, as we've seen with what Meta is doing, you could make the case that it takes you out of the environment, and you are less present with the people who are right in front of you. For the reasons you mentioned today about technology receding into the background, do you feel like whatever your next step is needs to be audio-only, or can a display work if you do it the right way?
What I believe customers will ultimately adopt is something that can keep them in the moment, just as you heard us talk about, right? We believe our smart displays, for example, for the home, for sure keep you in the moment. There's ambient information available. There are ways to do that, but that's going to be the test with consumers.
Going to the devices that were announced — the Echo Dot Max and the Echo Studio — I know that you can use Alexa+ on a lot of the existing devices, but given the improved processor capabilities on the newer devices, how is my Alexa+ experience going to be different?
I'll give you some quick head-to-head examples. Presence detection within a room is three times faster at identifying me as a person when I walk into a room at a much more oblique angle, and with OmniSense operating locally, it's two times faster than the previous-generation Echo Show.
So in terms of how the experience compares, it recognizes me faster, more successfully, knows that someone's in the room. So if you're looking for the most proactive, most personalized experiences that we can offer, they come with our latest hardware.
That doesn't mean the Alexa+ experience is bad on our old hardware. It's great. I use it across all seven years of hardware in my own home, but these new devices are, for sure, the best place to experience Alexa+.
Alexa+ is audio-only on some devices, but then is also available on devices with screens, so how do you tailor that experience to those different mediums?
You'll see that you actually get different-length responses. If I asked my Alexa+ app on my phone right now, in voice mode, how tall the Empire State Building is, I would get a more succinct answer than if I typed it in, even in the same spot. So we tailor the experience and the amount of output. That's just one example of how we tailor the experience around the specific device you're communicating on.
I know that you had a partnership, or still have a partnership with Anthropic. How much of the Alexa+ juice is coming at this point from Amazon versus partners? Or is it still a blend?
At its core, there are some highly capable large language models. Our own Nova family of models still gets the majority of traffic that comes through Alexa+, but we love working with our partners at Anthropic. I actually couldn't even tell you, in any given conversation that you have with Alexa, what model is being used.
There are over 70 models in the architecture that are at play, and they're doing all kinds of things, from assembling an answer to some complex question, doing inference on something really complicated or even jumbled that I said to the device, right? That's a really hard inferencing workload, we would say, all the way up to just tuning the nature of the conversation with you.
That's a separate model that figures out whether I should ask a follow-up question. Not just, did this customer ask to turn on a light? Or, you know, who's gonna win the Super Bowl this year? Figuring out the nature of the conversation, that's a different kind of workload, as we would call it; it's a different model.
Speaking of partners in a different sense, agentic AI relies a lot on third-party services. I remember when I tried Alexa+ a few months ago and asked it to find me a good pizza place in the neighborhood, it defaulted to ones that were only on Uber Eats or DoorDash versus the actual best place in my neighborhood.
I'd invite you to check again and let us know how it goes. One of the things we've done is tuned much broader local information to back those experiences, finding the right balance, the right moment to engage, say, a partner service.
Are you asking a general question about pizzerias, or are you asking for a place that could get you a table right now? Those are the kinds of things the model is learning, and that we've been learning through early access. So I'd be curious to know how it goes when you ask again. That's definitely an area of improvement; my hope is that you will have seen the experience get better.
So let's say, a company like a local pizzeria is not on DoorDash or Uber Eats or anything like that. When do you think we'll be at the point where Alexa+ will be able to call that restaurant for you, order a large pepperoni and then send you a message when it's ready?
I won't give you an answer on the ultimate version of what you just asked for, but I don't think it's as far away as we might think. I will tell you that in the coming weeks, you will be able to make that call to the pizzeria yourself, meaning [Alexa+ will ask] do you want me to call them? Or you can tap it in the interface, which is pretty cool: talking to her about the best slice, then calling to make sure they're open, or to order a pie for pickup. I can tell you I've been using it, and it's been pretty great.

Michael A. Prospero is the U.S. Editor-in-Chief for Tom’s Guide. He oversees all evergreen content, as well as the Homes, Smart Home, and Fitness/Wearables categories for the site. In his spare time, he also tests out the latest drones, electric scooters, and smart home gadgets, such as video doorbells. Before his tenure at Tom's Guide, he was the Reviews Editor for Laptop Magazine, a reporter at Fast Company and the Times of Trenton, and, many eons back, an intern at George magazine. He received his undergraduate degree from Boston College, where he worked on the campus newspaper The Heights, and then attended the Columbia University School of Journalism. When he’s not testing out the latest running watch or electric scooter, or skiing or training for a marathon, he’s probably using the latest sous vide machine, smoker, or pizza oven, to the delight — or chagrin — of his family.