Self-Driving Car Accidents Will Make Us All Safer


Nobody said a world with computer-controlled cars would be perfect. This week that fact was made abundantly clear as Google and Delphi confirmed that their experimental autonomous vehicles being tested in California had been involved in several minor accidents.

Following a public records request to the California Department of Motor Vehicles, the Associated Press reported that the fleet of more than two dozen Google cars had been involved in three accidents and that Delphi's vehicle had been involved in one collision. In a subsequent blog post, Google noted that its vehicles have been involved in a total of 11 accidents over the life of the program. These disclosures have raised some eyebrows.

Back in 2012, Google boasted that after 300,000 miles, none of its autonomous vehicles had been involved in an accident. Now the company has logged considerably more miles: 1.7 million in total, about 1 million of them in autonomous mode. According to a blog post by Chris Urmson, director of Google's self-driving car program, the company's cars were hit from behind seven times, sideswiped "a couple of times," and hit by a car that rolled through a stop sign.

None of the accidents were the fault of the autonomous cars, according to the companies. "Safety is our highest priority," said a Google representative in response to an e-mail query. "Since the start of our program six years ago, we've driven nearly a million miles autonomously, on both freeways and city streets, and the self-driving car hasn't caused a single accident."

MORE: I Rode in the First Street-Legal Autonomous Truck

Humans behind the wheel of other cars were at fault in each case. Delphi's vehicle, for example, wasn't even in autonomous mode last year when it was hit while sitting patiently at a light waiting to turn (Delphi provided a redacted accident report to prove its point). That's not surprising given how poorly most people drive: more than 30,000 people are killed on U.S. roads each year, and 94 percent of those accidents are due to driver error.

There is no question that technology can make driving safer. Passive systems like airbags reduce injuries, and active systems like electronic stability control can prevent accidents outright. Autonomous and driver-assist programs promise even further improvements. Existing safety systems that can follow the flow of traffic and keep a car properly in its lane can alleviate driver fatigue -- and even prevent common rear-end collisions like those experienced by Google's self-driving cars.

However, having put thousands of miles on the road using driver-assist and crash-prevention systems myself, I know that there is a learning curve ahead for drivers. While some of the underlying technology behind such systems is similar, their implementations vary considerably. Some cars brake as soon as any car ahead dips into their lane; others only brake when a car ahead reaches the center of the lane. And if you forget to shut off some autonomous-driving features, you can suddenly find yourself accelerating dangerously around a sharp curve or down an exit ramp.
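
To make that variation concrete, here is a minimal, purely illustrative sketch of how two tunings of the same cut-in rule could diverge. Nothing here reflects any automaker's actual code; the class, the function and the threshold values are assumptions invented for the example.

```python
from dataclasses import dataclass

@dataclass
class CutInEvent:
    lateral_overlap: float  # fraction of our lane the cutting-in car occupies (0.0 to 1.0)
    gap_m: float            # longitudinal gap to that car, in meters


def should_brake(event: CutInEvent, overlap_threshold: float) -> bool:
    """Decide whether to begin braking for a car dipping into our lane.

    An eager tuning uses a low overlap_threshold (brake as soon as the other
    car clips the lane edge); a relaxed tuning waits until the car approaches
    the lane center.
    """
    return event.lateral_overlap >= overlap_threshold and event.gap_m < 40.0


# The same cut-in triggers braking under the eager tuning but not the relaxed one.
cut_in = CutInEvent(lateral_overlap=0.2, gap_m=25.0)
print(should_brake(cut_in, overlap_threshold=0.1))  # True: brakes on any lane intrusion
print(should_brake(cut_in, overlap_threshold=0.5))  # False: waits for the lane center
```

The point is only that a single tuning choice changes when the brakes engage -- exactly the kind of difference a driver switching between cars has to learn.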

MORE: Android Auto Test Drive: Meet Pioneer's In-Dash Unit

Determining how to handle these different situations is part of the learning curve for programmers and engineers as well. Should a car brake severely if another car crosses the center line, or head for the shoulder, with the potential risk of a rollover? How can you teach an autonomous car to detect erratic drivers ahead? How should the car react when there's a fallen tree lying across the roadway?

As part of the testing program in California, companies have to disclose even minor scrapes to the DMV -- although the specifics can remain confidential. However, as the reaction to this news shows, if companies want to promote self-driving or driver-assist technologies, they are probably going to have to be even more forthcoming about accidents and the possible limitations of the technology.

Several auto engineers and executives have told me over the last couple of years that their biggest fear is not the reliability of the technology, but the one case in which an autonomous vehicle is involved in a serious accident. The resulting negative publicity could set the industry back years, meaning that in the meantime more people would die on our highways for want of these safety systems. Such an accident seems ineluctable -- autonomous cars will be contending with cars driven by humans for many years. But public alarm over these systems can be avoided if companies get out in front of the news with more information about minor accidents now.


John R. Quain is the contributing automotive editor for Tom's Guide. Follow him @jqontech. Follow us @tomsguide, on Facebook and on Google+.