
Self-Driving Car Accidents Will Make Us All Safer

(Image credit: Google)

Nobody said a world with computer-controlled cars would be perfect. This week that fact was made abundantly clear as Google and Delphi confirmed that their experimental autonomous vehicles being tested in California had been involved in several minor accidents.

Following a public records request to the California Department of Motor Vehicles, the Associated Press reported that Google's fleet of more than two dozen cars had been involved in three accidents, and that Delphi's vehicle had been involved in one collision. In a subsequent blog post, Google pointed out that its vehicles have been involved in a total of 11 accidents. These revelations have raised some eyebrows.

Back in 2012, Google boasted that after 300,000 miles, none of its autonomous vehicles had been involved in an accident. Now the company has logged considerably more miles — 1.7 million in total (about 1 million of them driven autonomously). According to a blog post by Chris Urmson, director of Google's self-driving car program, the company's cars were hit from behind seven times, sideswiped "a couple of times," and hit once by a car that rolled through a stop sign.

None of the accidents were the fault of the autonomous cars, according to the companies. "Safety is our highest priority," said a Google representative in response to an e-mail query. "Since the start of our program six years ago, we've driven nearly a million miles autonomously, on both freeways and city streets, and the self-driving car hasn't caused a single accident."

MORE: I Rode in the First Street-Legal Autonomous Truck

Humans behind the wheel of other cars were at fault in each case. Delphi's vehicle, for example, wasn't even in autonomous mode last year when it was hit while sitting at a light, waiting to turn (Delphi provided a redacted accident report to prove its point). That's not surprising, given how poorly most people drive. More than 30,000 people are killed on U.S. roads each year, and 94 percent of those accidents are due to driver error.

There is no question that technology can make driving safer. Not only can passive systems like airbags prevent injuries, but active systems like electronic stability control can actually prevent accidents. Autonomous and driver-assist programs promise even further improvements. Existing safety systems that can follow the flow of traffic and keep a car properly in its lane can alleviate driver fatigue -- and even prevent common rear-end collisions like those experienced by Google's self-driving cars.

However, having put thousands of miles on the road using driver-assist and crash-prevention systems myself, I know that there is a learning curve ahead for drivers. While some of the underlying technology behind such systems is similar, their implementations vary considerably. Some cars brake as soon as any car ahead dips into their lane; others brake only when a car ahead reaches the center of the lane. And if you forget to shut off some autonomous-driving features, you can suddenly find yourself accelerating dangerously around a sharp curve or down an exit ramp.

MORE: Android Auto Test Drive: Meet Pioneer's In-Dash Unit

Determining how to handle these different situations is part of the learning curve for programmers and engineers as well. Should a car brake severely if another car crosses the center line, or head for the shoulder, with the potential risk of a rollover? How can you teach an autonomous car to detect erratic drivers ahead? How should the car react when there's a fallen tree lying across the roadway?

As part of the testing program in California, companies have to disclose even minor scrapes to the DMV -- although the specifics can remain confidential. However, as the reaction to this news shows, if companies want to promote self-driving or driver-assist technologies, they will probably have to be even more forthcoming about accidents and the technology's possible limitations.

Several auto engineers and executives have told me over the past couple of years that their biggest fear is not the reliability of the technology, but that one case in which an autonomous vehicle is involved in a serious accident. The resulting negative publicity could set the industry back years, meaning that, in the meantime, more people would die on our highways for want of these safety systems. Such an accident seems inevitable -- autonomous cars will be contending with human-driven cars for many years. But public alarm over these systems can be avoided if companies get out in front of the news with more information about minor accidents now.


John R. Quain is the contributing automotive editor for Tom's Guide. Follow him @jqontech.

  • surphninja
    Eventually, manual drivers will be the most dangerous threat on the road, and the day is coming when driving manually (at least on major highways) will be illegal.

    It'll be really cool to see an entire highway of linked autonomous cars coordinating traffic perfectly.
    Reply
  • Vlad Rose
    Autonomous vehicles = no more drunk driving. What will counties do to get their funds from people? I wonder if it'd even be illegal to drink and be in the car still; open intox... lol

    But seriously, it would be nice to have a car that does all the driving for you so that you can nap, do work or just relax. Anyone who's been on long-distance road trips knows exactly what I mean.
    Reply
  • merikafyeah
    The headline is almost misleading. It makes you think the self-driving car "caused" some accidents, but...

    "None of the accidents were the fault of the autonomous cars..." "...the self-driving car hasn't caused a single accident."

    In every case the self-driving car was on the receiving end of the accident, meaning it's the regular meat bags in regular cars doing all the harm, as usual.
    Reply
  • iam2thecrowe
    Just what happens when the system is hacked and hijacked... then what?
    Reply
  • Dkminors
    Programmers need to learn from these accidents. Some are certainly unavoidable, but I would suggest that some could have been avoided by a defensive-minded human (or a more refined AI). Anyone who has ever moved into a chicken lane to avoid being rear-ended, or otherwise dodged imminent impacts from other drivers, knows what I mean. If I had the same accident rate per mile as these autonomous vehicles, I would be in an accident about every 6 or 7 years, versus none in more than 30 years of driving. How does an automated vehicle actively identify and avoid bad drivers before they can do their damage?
    Reply
  • belardo
    If Google's self-driving cars are anything like the latest Chrome update, then it's going to be garbage.

    For those who noticed that the new "bookmark manager" works like crap, there is a somewhat hidden fix (since Google wasn't smart enough to put a toggle in the UI or settings). Type this into Chrome's address bar:
    chrome://flags/#enhanced-bookmarks-experiment

    The first option is "Enable Enhanced Bookmarks" (Mac, Windows, Linux, Chrome OS, Android), which provides an off switch for the enhanced bookmarks experiment.

    Switch it to DISABLED

    Scroll to the bottom to save and restart Chrome.

    Hint, Google: The setting said "experimental" - it should never have left your labs. It's about as crappy a screw-over as when you attached the cloud bookmark manager to the desktop version without telling anyone or explaining what happens if you do it wrong (i.e., I had never signed in with a Google account, and when I did, all my bookmarks were wiped out - thank you, asshats).

    Seriously, Google - DO NOT hire the Windows 8 team to work on your products.
    Reply
  • shiitaki
    Humans are terrible at driving largely because they don't think they are terrible at driving. The very people who think they are great are delusional.

    So the potential is there for autonomous cars to be superior, simply because a computer can process multiple instructions in the time it takes light to travel a foot. Humans at 60 mph can't process information any faster than they can at about 30, even in ideal conditions and a clear state of mind.

    30,000 people die in car crashes? And yet we do something about gun violence?
    Reply
  • bluestar2k11
    I'm going to pass on my car driving for me...
    I prefer to be in control rather than rely on a machine to keep me safe or get me somewhere. There are too many things that could go wrong for my tastes, plus the general lack of control.

    If we ever get a real-life AI that loves humans, I might reconsider, but an inanimate processing list is not suitable to drive my car.
    Reply
  • pocketdrummer
    I like systems that act to prevent an accident or attempt to mitigate the damage, but I am completely against the idea that ALL cars should drive themselves everywhere. You can't tell me that my insurance won't go up if I decide to drive a car that does NOT have that system (when they become ubiquitous). You also can't tell me the government won't try to make laws that punish those who drive themselves or even outlaw it altogether.

    I like what companies like Subaru do, where the car notifies you if you're drifting out of your lane or applies the brakes if it detects you're about to hit something. Those are things we need. But if we make self-driving cars mandatory, you can kiss motorsports goodbye. Nobody will want to buy a self-driving Ferrari.
    Reply
  • Jamie and the torch
    At some point, an autonomous car is going to have to make a moral decision. A child runs into the road, and the car can't possibly stop in time. Does it:
    A) Hit the child
    B) Plough you into the oncoming traffic
    C) Go the other way into the people waiting at the bus stop.

    It will have to decide who dies...
    Reply