The biggest fear of engineers and designers working on safer, autonomous vehicles has been that a serious accident in the early days of development could stymie the technology. This week they learned that fear had become a reality when a Tesla Model S owner tragically died in an accident while his car was in Autopilot mode.
The National Highway Traffic Safety Administration revealed the information in announcing it was investigating the accident. According to a police report, back on May 7th in Florida, the driver of a Model S struck the side of a tractor-trailer when the truck turned left in front of him. The car submarined under the middle of the trailer, killing the driver. At the time, the car was in Tesla's semi-autonomous Autopilot mode.
The news came just ahead of BMW's announcement with IBM and Mobileye that the automaker plans to introduce a fully autonomous BMW iNext car by 2021.
Tesla's Autopilot system in the Model S uses a combination of adaptive cruise control, lane keeping, and automatic braking to guide a car down the road on its own. Drivers are supposed to keep their hands on the wheel, but as demonstrated in countless YouTube videos, they often flout this warning and let the car do all the driving. Other semi-autonomous systems from the likes of Volvo, Mercedes-Benz, and BMW only allow you to take your hands off the wheel for a few seconds.
In an emailed statement earlier this week before the accident was revealed, a Tesla spokesperson went to great lengths to reiterate to me that owners are reminded to keep their hands on the wheel at all times and to remain vigilant. A dashboard warning message also has to be confirmed by the driver before they can engage the system, which Tesla describes as a “public beta.” The spokesperson underscored the fact that Autopilot “does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility.”
Many engineers and regulators have worried for years that a misstep or serious accident attributed to a self-driving or semi-autonomous car could hold back technological advances for years to come. Consumers, from teenagers to baby boomers, are already skeptical about letting the car take control, according to several studies.
At a presentation I moderated two and a half years ago, more than a half dozen engineers from major automakers, as well as the head of NHTSA at the time, David Strickland, expressed concern that pushing the technology too far, too fast could end up hurting potential safety advances associated with self-driving cars.
Privately, many engineers and researchers from competing companies have pointed to Tesla as putting technology on the road before it was fully ready. There will doubtless be more criticism from competitors now that there's been a fatality.
Was the accident avoidable? That's difficult to tell before NHTSA's investigation is complete. However, it was not an unusual circumstance by any measure. Trucks the size of a barn turn left in front of oncoming traffic all the time. Tesla pointed out in an online statement that the white side of the truck would have been difficult to see against the bright sky.
However, that's a very troubling weakness in the system given the high resolution available in some cameras and the ability of sensors to detect objects hundreds of meters down the road regardless of their color (many can even detect much smaller objects in the dark). Technically, the sensors and software should have easily “seen” the truck turning in front of the car, or even beginning to make the turn, and applied the brakes long before the collision.
Other semi-autonomous systems I've used have braked in similar circumstances. On the other hand, the truck driver apparently failed to see the Tesla or, at the very least, misjudged the turn. So it's still not clear why the accident occurred or whether there were other extenuating circumstances.
There are already several semi-autonomous systems of varying sophistication available from other automakers, from BMW to Mercedes-Benz. Volvo has had automatic braking and pedestrian collision prevention systems on the market for roughly six years. But despite that longer experience, the company has taken an extremely cautious approach to introducing the technology. It has added more sensors and positioned them higher, and the software has been set to err on the side of caution, braking or turning off at the slightest sign of a problem.
Indeed, in my own test drives, I've found that although the semi-autonomous systems can save you from crashing into a car ahead, they can also be flummoxed in certain conditions. (See my experience with the semi-autonomous features of the Volvo S90.) A particular corner on a local road I use, for example, will trigger every emergency braking system I've ever tested, even though there's nothing on the road ahead. Sharp changes in lighting conditions can trick other vehicles into suddenly braking.
There will doubtless be questions about Tesla's response and the fact that the publicly traded company failed to reveal the fatal accident until nearly two months later, when NHTSA indicated it was investigating the incident. Should Tesla owners have been warned about the accident, reminding them to remain vigilant and pointing out this particular vulnerability in the Autopilot system?
More important, if you're driving a car with a similar system—a BMW 7 Series, a Mercedes-Benz S-Class, or any car with lane keeping and automatic braking—should you be alarmed? No, but the tragic accident should be a cautionary tale to drivers that they must pay attention at all times. The semi-autonomous systems are still “driver assistance” systems, not fully autonomous systems. I believe they do add an extra measure of safety when used properly, and I will continue to use them when I'm driving.
Like airbags, electronic stability control, and the soon-to-be-mandatory rear-view cameras, these semi-autonomous safety systems have the potential to save thousands of lives on U.S. roads every year. Hopefully, the unfortunate Tesla accident won't stymie development and instead will help teach designers how to make these systems better and prevent such tragedies in the future.