What is the best form factor for AI? Our editors couldn’t agree

Rabbit R1 vs. Siri on an iPhone weather forecast
(Image credit: Future)

Artificial intelligence hardware is still a new niche in the technology sector, but one that is growing rapidly with the arrival of the Humane AI Pin, the Rabbit R1 and, of course, the success of Meta’s Ray-Ban smart glasses. But what is the best form factor to engage with AI?

The fact that every new AI hardware device seems to have a completely different form factor suggests that the market has about as much idea as the people using the technology. There is also the question of what exactly constitutes an AI hardware device. This is more controversial than you’d think, and comes down to a device’s other use cases and functions.

Arguably, smart speakers powered by Amazon’s Alexa, Apple’s Siri or Google’s Assistant are all forms of AI hardware, as their main purpose is to house the AI but they’re not branded as such. The same could be said for the home helper robots from LG and Samsung unveiled at CES.

To get to the bottom of what is and isn’t AI hardware, and more specifically what the ideal form factor for engaging with AI actually is, I turned to the Tom’s Guide team to see if there was a consensus. Note… there wasn’t.

It’s a sci-fi trope, but I can’t help but want it

Jeff Parsons

I think the kickstand is one of the most underrated features in modern consumer technology. The ability for a gadget to stand unaided is always a huge plus for me because, inevitably, I don’t want to hold something if I can help it. But my dream AI form factor goes one step beyond a simple kickstand; I want a hologram.

My dream AI form factor would be a holographic display that not only forms a focus point for the intelligence itself, but could also serve to generate imagery (think maps, schematics or architectural renderings) when required. A company called Holoconnects was actually demoing a holographic AI at this year’s CES, but that requires something akin to a phone box: an 86-inch transparent display. That’s hardly portable and, since we’re going all-in here, I’d borrow a trope from dozens of sci-fi films for my dream form factor.

Imagine a small device, like a hockey puck, that could project a small holographic form that would be your AI. You could set it down on a surface and have a frame of reference for interaction as well as a wholly-new solution for displaying graphics and data. Something like Cortana from the Halo games. Will this ever happen? Almost certainly at some point — I just hope I’m around to see it.

I want unscripted AI video game conversations. Does that count?

Dave Meikleham

This might be cheating, but my ideal form factor for AI (and this is awkward) doesn’t physically have a form factor. Actually, maybe it does. What I really want from AI is a generational leap forward from what gamers have been served up for the past 20-odd years. Metal Gear Solid 2 launched way back in 2001 with the most switched-on enemies I’ve ever seen. More than two decades later, those massively aware guards have yet to be topped.

To answer this question properly, though, my form factor of choice would have to revolve around my gaming rig, which tips the scales at a back-breaking 65 pounds (yes, I’m sad enough that I actually weighed it). PC gaming has always been ahead of consoles when it comes to introducing tech breakthroughs, and I hope Nvidia’s Avatar Cloud Engine (ACE) is going to be the next big one.

In a nutshell, Nvidia ACE uses microservices like Audio2Face and Riva ASR to power AI-driven speech and animation for NPCs in games. In more basic terms, it promises to let players have realistic, in-depth conversations with in-game characters that differ every time.

Grand Theft Auto 4 was actually ahead of the curve with this idea back in 2008, when Rockstar went out of its way to program multiple conversations for missions in case you inevitably failed one. Obviously Nvidia’s Cyberpunk 2077-inspired “Kairos” demo takes things to another level, showing a glimpse of what truly lifelike, off-the-cuff conversations could look like in a video game.

As someone who has always been way too obsessed with in-game physics and visual defects like clipping that often prove to be immersion-breakers for me, I love the idea of an ACE-powered title allowing me to have completely bespoke chats with characters. Considering so much current video game dialogue is completely tin-eared, I don’t think AI is going to do a worse job than your average writer who faces the daunting task of trying to make the next Call of Duty script sound semi-interesting.

Oh, and if artificial intelligence could help shrink down the form factor of my PC in the process, that would be swell.

AI is all in the ears

Ryan Epps

The large majority of AI has been relegated to software, with newer hardware concepts only just now being realized in products like the Humane AI Pin or Rabbit R1, both of which seem interesting on the surface but might not live up to the high expectations many have for this type of technology.

Nothing ear (a)

(Image credit: Nothing)

The real question is what can — or, more accurately, what should — AI be capable of doing for you in particular. For me, I’d probably use it most as a way to bypass the need for scouring the internet or fast-tracking creative ideas where necessary. And that’s where AI earbuds or even headphones can truly make a difference.

Although this form factor lacks the visual qualities apparent in other examples given by my colleagues, I don’t think we necessarily need visuals when talking about AI. It might help in some areas, but for what I need it for (and maybe for what most everyday consumers need it for), audio is more than adequate. Plus, it has the added benefit of being essentially hands-free.

Nothing Ear Wireless Earbuds with ChatGPT Integration $149 @ Amazon
Nothing's impressive noise-cancelling earbuds come in white or black, last five hours with active noise cancelling turned on and, if you have a Nothing Phone, include integration with OpenAI's ChatGPT.

This concept of mine stems from my recent testing of the Nothing Ear and its ChatGPT integration. It’s entirely cyberpunk, dystopian even. Talking to an AI while walking down the street, about anything from casual everyday matters to real-world happenings like the MLB games airing that day or what the TikTok ban signifies, is both incredible and terrifying all at once.

Yet, I’m certainly not alone in seeing the potential here, especially given how easy it is to use — so much so that, in theory, I’d never have to pull out my phone for anything.

The perfect AI form factor already exists

Tony Polanco

The Humane AI Pin and Rabbit R1 have been in the news recently since they’re two dedicated AI devices. Heck, they’re the reason this article exists! While I think the idea of a pocket-sized machine (or smaller) built to handle AI tasks is interesting, I don’t see the point of carrying yet another gadget in my pocket. At the risk of sounding cheeky, I think the perfect AI form factor already exists. It’s called a smartphone.

iPhone 15 Pro in four colors on blue background

(Image credit: Apple)

The Rabbit R1 lets you order food, hail an Uber, translate conversations and more. You can ask the Humane AI Pin questions and it can project answers onto a surface (like the palm of your hand). Aside from that neat projection trick, your smartphone can already do all of these things. For instance, if you use Google, you can ask it questions and receive AI-generated responses. And there are already easy-to-use apps like DoorDash and Spotify you can fire up without the need for AI.

iPhone 15: up to $1,000 off @ Best Buy
Best Buy is taking up to $1,000 off the iPhone 15 when you trade in your old phone and activate your new phone on select carriers. For instance, activate on Verizon or AT&T to get up to $1,000 off, or activate on T-Mobile for up to $700 off.

I’m admittedly skeptical about AI. Or rather, I’ve yet to see the lauded technology do anything demonstrably useful. For instance, there hasn’t been a single AI laptop I’ve reviewed this year that’s markedly better than a normal laptop. That’s not to say creating images and music with AI isn’t amusing. Our own Ryan Morrison’s creations always amuse me. However, aside from using programs like ChatGPT to see how they depict characters in a series of manuscripts I’m working on, I don’t have much use for AI, let alone a need for a dedicated AI device.

AI software for smartphones will no doubt improve over the coming months and years. Perhaps at that time, I’ll get more use out of the technology. If that happens, it’ll just prove my point that we don’t need an AI device. Not when you already have the perfect AI vehicle right in your pocket.

The perfect AI form factor is right above our noses

Ryan Morrison

I’ve worn glasses all my life. Picture this nerdy little 2-year-old with blonde hair and glasses strapped to his head and you get an idea of how much of my identity is tied up in my eyewear. It is probably this 40-year symbiosis with frames (plastic, then metal, then plastic again) sitting on my nose and ears that led me to the conclusion that glasses are the dream form factor for artificial intelligence assistants.

Ray-Ban Meta Smart Glasses

(Image credit: Future)

After all, if you wear glasses they are already right there on top of your nose. They are an ever-present companion and can be adapted to be sunglasses, carry prescription lenses or even hold just a plain sheet of clear glass. In the future they could also include a film to offer a heads-up display.

They can have cameras in the frames, as seen with Meta’s Ray-Ban smart glasses, bone-conduction audio through the arms, and just enough space for onboard processing spread throughout the frame to work offline or connect to your phone.

Ray-Ban Meta Smart Glasses: $299 @ Amazon
These smart glasses are a camera, Bluetooth headphones and stylishly designed sunglasses all in one. Read our Ray-Ban Meta Smart Glasses review for more info.

Unlike earbuds, which simply place an AI assistant in your ears, it’s the addition of a camera that gives AI glasses their greatest utility. AI vision technology will help expand on Meta chief AI scientist Yann LeCun’s vision of our entire digital lives going through an AI filter by bringing in the real world and everything we see or do. Then again, maybe that isn’t such a great idea.
