BARCELONA — Sit inside a new BMW today, and you have a few different modes of interaction at your disposal. There’s a touch screen front-and-center, the German automaker’s trademark iDrive controller near the shift lever and some basic voice and gesture commands.
That’s more than what you get from most cars, but it’s still not ideal. The automotive industry is undergoing a rebirth of sorts — less of an identity crisis, more of a transformation into the realm of smart, interconnected devices. And BMW has imagined a new interface to match, one that thankfully doesn’t require any more knobs or screens than you’re used to today.
It’s called Natural Interaction, and BMW introduced it at Mobile World Congress this week ahead of the system’s eventual launch in the carmaker’s semi-autonomous Vision iNEXT vehicle, which is planned for 2021.
A car that understands you
There are two basic principles to Natural Interaction. The first is an expansion of the focus on gesture commands. The iNEXT will read your gaze to determine what you’re looking at, and if you point to something, the car will be able to tell you more about it.
The other principle is contextual understanding. For all of this to work properly, the system must be able to thread together multiple styles of communication, from gestures to gaze to speech. This is perhaps the biggest challenge the whole Natural Interaction initiative faces, though BMW’s user interaction vice president, Dr. Fathi El-Dwaik, believes his team has found the right approach.
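To illustrate the idea of threading channels together, here is a minimal sketch of multimodal intent resolution. Everything in it — the field names, the labels, the fusion rule — is an assumption for illustration, not BMW’s actual system: the point is only that an ambiguous word like “that” can be resolved against whatever a gesture or gaze singles out.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MultimodalInput:
    """One time-aligned snapshot of the driver's communication channels.
    All field names here are illustrative, not BMW's real API."""
    speech: Optional[str] = None        # transcribed utterance, if any
    gaze_target: Optional[str] = None   # object the driver is looking at
    point_target: Optional[str] = None  # object the driver is pointing at

def resolve_intent(inp: MultimodalInput) -> str:
    """Thread the channels together: a deliberate pointing gesture
    outranks gaze, and either one can resolve an ambiguous 'that'."""
    referent = inp.point_target or inp.gaze_target
    if inp.speech and "that" in inp.speech and referent:
        return f"query:{referent}"   # speech + gesture/gaze -> a grounded query
    if inp.speech:
        return f"command:{inp.speech}"  # speech alone is a plain command
    if inp.point_target:
        return f"query:{inp.point_target}"  # pointing alone still asks a question
    return "idle"

print(resolve_intent(MultimodalInput(speech="what is that",
                                     gaze_target="movie theater")))
print(resolve_intent(MultimodalInput(point_target="parking garage")))
```

A real system would have to do this fusion over noisy, continuous sensor streams with probabilistic confidence scores rather than clean strings, which is exactly why El-Dwaik calls it the hard part.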
“When I’m speaking to you right now, you’re not only hearing my voice — you’re looking at my gestures and my face expressions,” Dr. El-Dwaik told Tom’s Guide. “And that is what we think the car must look and hear to understand naturally what is the intention of the driver.”
Natural Interaction doubles down on “Shy Tech,” the BMW philosophy the company discussed at length at CES in January — features that are helpful, but stay out of sight when not in use. Ideally, it’d be a way for cars to become as intelligent as your smartphone or AI voice assistant of choice, without being too intrusive — guidance always at the ready, but only if you want it. And your BMW would be built to accommodate your preferred methods of interaction, rather than forcing you to thumb through a series of touchscreen menus all jockeying for your attention and taking your eyes off the road.
“For us this will be a very good step forward toward the car getting more natural, more like your companion — understanding you and what you’re saying even if you haven’t said something exactly a certain way,” El-Dwaik said.
To provide an idea of how Natural Interaction will work, BMW brought a mixed reality experience to MWC, similar to a demo the company conducted at CES that explored potential uses for its Intelligent Personal Assistant. In this new demo, I virtually took the wheel of a Vision iNEXT and let the car do most of the driving while I did some sightseeing from the driver’s seat.
One of the main objectives of Natural Interaction is to leverage the car’s myriad sensors and systems to educate passengers on the world around them. My assistant encouraged me to point at various buildings while asking for more information; when I did, overlays appeared in front of me within the car’s windshield, sometimes with contextual actions. For example, when I pointed at a distant movie theater, I had the option to buy tickets to a showing of Mission: Impossible – Fallout. I could also point at one of the car’s windows to open or close it.
Some BMW models on the road today feature embedded cameras watching the driver from the instrument cluster and from the interior lights overhead. These cameras can detect specified motions, like twirling your finger in a circle to raise the stereo’s volume or making a stabbing motion with two fingers to mute or pause playback.
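The recognition side is the hard computer-vision problem; once a gesture has been classified, dispatching it is essentially a lookup table. The sketch below assumes the classifier emits a label string — the two gestures mirror the article, but the label names, the counterclockwise-to-lower pairing, and the `Stereo` handlers are all invented for illustration.

```python
class Stereo:
    """A toy stand-in for the car's media system."""
    def __init__(self):
        self.volume = 5
        self.playing = True

    def change_volume(self, delta):
        # clamp to a 0-10 range
        self.volume = max(0, min(10, self.volume + delta))

    def toggle_playback(self):
        self.playing = not self.playing

# Hypothetical dispatch table from classifier labels to actions.
GESTURE_ACTIONS = {
    "finger_twirl_clockwise": lambda s: s.change_volume(+1),
    "finger_twirl_counterclockwise": lambda s: s.change_volume(-1),
    "two_finger_stab": lambda s: s.toggle_playback(),
}

def handle_gesture(stereo, gesture_label):
    """Run the action for a recognized gesture; ignore unknown labels."""
    action = GESTURE_ACTIONS.get(gesture_label)
    if action:
        action(stereo)

stereo = Stereo()
handle_gesture(stereo, "finger_twirl_clockwise")
print(stereo.volume)   # 6
handle_gesture(stereo, "two_finger_stab")
print(stereo.playing)  # False
```

Keeping the mapping in a table rather than hard-coding it is what would let BMW expand the gesture vocabulary — or let owners remap it — without touching the recognition pipeline.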
In the future, however, BMW wants to expand the system’s range of interaction to all passengers and well beyond the limits of the car. Theoretically, you’d be able to point to a parking garage and ask how much it charges, and the car would fetch the details.
“We’re combining all the sensor information with A.I. algorithms and deep learning,” El-Dwaik said. “We’re going to learn a lot before launching the system and we’ll keep learning after launching the system.”
The long road ahead
As with any technology that relies so heavily on decoding human behavior, it’s going to take constant development and likely a series of iterations before BMW achieves its goals. And while I do firmly believe the carmaker will get there one day, that day seems awfully far away. Its arrival depends on a lot of moving parts — from coordinating an abundance of sensors to monitor your every move to building software that can deduce intent based on a wealth of information.
And that’s to say nothing of 5G networks required to move all this data to cloud computing centers and back (BMW says local processing ideally would be available as a fallback, but of course commands will always register faster when the car is connected) and the dream of true autonomy, which could arrive anywhere between 2021 and 2025 — depending on which automaker you ask, that is.
For what it’s worth, BMW says the production iNEXT will launch with Level 3 autonomy in 2021, which means it will be able to handle all driving duties in specified conditions, albeit with human oversight. Eventually, however, the hope is for the iNEXT to reach Level 4, which would allow drivers to fully take their attention away from the road when the car deems it has everything under control. Should the vehicle get in over its head, it will simply pull over and revert to human operation.
Long story short, a lot of things have to go right for Natural Interaction to work the way BMW envisions. And as much as I appreciate the direction the carmaker’s taking, I couldn’t help but leave the mixed reality demo with a bit of a numb feeling — the same one I get anytime carmakers romanticize an autonomous future that isn’t here yet and won’t be for a while, no matter how badly we all want it to be.
Credits: Tom's Guide