Tesla Autopilot crash is a reminder that your car still can’t drive itself

(Image credit: Tesla)

Another Tesla has been involved in a high-profile accident, with a reportedly Autopilot-controlled Tesla Model 3 colliding with an emergency vehicle last week — specifically a Florida Highway Patrol cruiser near Orlando.

While nobody was seriously hurt, these kinds of stories highlight the inherent flaws in current autonomous car software, be it Autopilot or something else. And that’s exactly why you’re told to have an attentive driver at the wheel at all times.

According to CNN, the incident happened on Interstate 4 just before 5 a.m. ET. The Orange County trooper had stopped to aid a broken-down vehicle, only for a Tesla Model 3 to hit the side of the patrol car and then crash into the broken-down Mercedes — narrowly missing the trooper in question. The patrol car did have its emergency lights flashing at the time.

The drivers of both the Tesla and the Mercedes were left with minor injuries, but nothing more serious. The Tesla’s driver also confirmed to the trooper on the scene that the car was in Autopilot mode at the time of the crash.

Florida police said the crash would be reported to Tesla and the National Highway Traffic Safety Administration (NHTSA) — the latter of which is currently investigating Tesla Autopilot. 

The NHTSA claims that Teslas have collided with emergency vehicles, including police cars and ambulances, at least 11 times between January 2018 and July 2021. The incidents happened in nine different states, and most of them apparently took place at night.

What’s more, the NHTSA said that the scenes had utilized emergency vehicle lights, flares, illuminated arrow boards and road cones prior to each accident.

In this instance, it’s not clear whether the driver was misusing Autopilot or not. However, it’s another warning of why drivers shouldn’t be too trusting of Autopilot, or any other semi-autonomous driver assistance tech. It may seem like the car is capable of driving itself, but it’s not effective enough to completely replace the driver.

Autopilot is not a self-driving car system, no matter what it sounds like

Tesla itself has said that Autopilot could “do the wrong thing at the worst time,” which is when the driver is needed to take control. If the driver isn’t paying attention, or worse, has actually gotten out of the driver’s seat, then the car is essentially left to its own devices when serious situations occur.

Semi-autonomous driver assistance tech is a massive help, especially on longer journeys, but it’s not an alternative to actually driving — even if terms like ‘Autopilot’ and ‘Full Self Driving’ make it sound like the car can do everything for you.

Tesla CEO Elon Musk has consistently defended the name Autopilot, claiming it’s based on the autopilot used in planes, which was built to assist an attentive pilot. But that hasn’t stopped the automaker from landing in hot water.

German courts have ruled that the name Autopilot, alongside marketing that suggested Teslas could drive themselves, is misleading. Likewise, the NHTSA has been asked to have the FTC investigate Tesla’s use of the name Autopilot as a form of false advertising, though it isn’t clear what the FTC’s response was.

Tesla also needs to do more to stop people from getting out of the driver’s seat while Autopilot is engaged. Currently, the system uses sensors in the steering wheel to check if the driver's hands are present, and will disengage if the seatbelt is unbuckled.

However, testing has shown these safety measures are terrifyingly easy to get around. Weights on the steering wheel can mimic the presence of hands, and drivers could, in theory, sit on top of a buckled seat belt, giving them the freedom to leave the driver’s seat.

This is not a problem exclusive to Tesla, with other tests showing that autonomous driving safety measures in rival systems are just as easy to cheat. And the biggest problem these systems all share is the lack of weight sensors in the driver’s seat to check whether someone is actually there or not.

Clearly, something needs to be done across the board to stop this from happening. Keeping someone in the driver’s seat isn’t going to stop them from getting distracted or taking their eyes off the road, but it’s a good start. In the meantime, just remember that your ‘autonomous’ car isn’t. We still have a long way — and at the very least several years — to go before your own car will be driving you around without needing any supervision.

Tom Pritchard
UK Phones Editor

Tom is Tom's Guide's UK Phones Editor, tackling the latest smartphone news and vocally expressing his opinions about upcoming features or changes. It's a long way from his days as editor of Gizmodo UK, when pretty much everything was on the table. He’s usually found trying to squeeze another giant Lego set onto the shelf, draining very large cups of coffee, or complaining about how terrible his Smart TV is.