How to Crash a Self-Driving Car

LAS VEGAS — It's easy to fool the navigation sensors and cameras of highly automated vehicles, such as Google's self-driving car, Teslas or even regular luxury vehicles, into thinking there are no obstacles in the road ahead, three Chinese researchers said at the Defcon 24 hacker conference here Sunday (Aug. 7).

Credit: Hadrian/Shutterstock


Jianhao Liu, Chen Yan and Wenyuan Xu showed how they used cheap ultrasonic transmitters and more elaborate radar jammers to make real obstacles "disappear." They also used pocket-sized laser pointers and LED flashlights to blind cameras, sometimes causing permanent damage. Car makers should, but don't, anticipate intentional attacks on these sensors, the trio said.

"The reliability of the sensors will affect the reliability of the autonomous driving vehicle," Liu said. "We shouldn't trust semi-autonomous cars just yet."


Cars that provide full or partial automation of driving and navigation rely on cameras to see the road ahead and behind, on ultrasonic sensors to detect short-range obstacles when backing up or inching forward, and on radar to detect obstacles up to several hundred feet away.

Tesla's Autopilot mode uses all three technologies when enabled, but many other new vehicles rely on one or more for parking assistance, for spotting vehicles in the driver's blind spot and for maintaining a safe distance from other vehicles while on cruise control.

Yet all these technologies can be blinded or fooled. Liu and Yan played video clips showing how an ultrasonic jammer they built from off-the-shelf parts caused obstacles to vanish from the parking-assistance screens of a Tesla S and an Audi SUV. Yan made himself "disappear" by wrapping himself in acoustic-dampening foam tile while walking in front of a car.

In each instance, the car bumped into the cloaked obstacle, which was usually a person. The transmitter could also "spoof" the signal, making obstacles appear closer than they actually were.

Cars use ultrasonic sensors at short distances and very low speeds, so serious damage or personal injury resulting from jammed or spoofed signals would be unlikely. More serious are the risks from radar jamming and spoofing, which the trio achieved by analyzing and matching the signals emitted from the radar system of a Tesla S. The researchers could make a leading vehicle disappear from the road ahead, or change the leading vehicle's apparent relative distance.
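The distance logic these attacks exploit is simple time-of-flight math: the sensor times how long a signal takes to bounce back. A hypothetical sketch (not code from the researchers) shows why a fake early echo makes an obstacle look closer, while a jammed echo makes it vanish:

```python
# Toy model of an ultrasonic parking sensor's distance estimate.
# Constants and delays here are illustrative assumptions.
SPEED_OF_SOUND = 343.0  # meters per second in air at ~20 C

def distance_from_echo(round_trip_s: float) -> float:
    """Convert a round-trip echo delay into a one-way distance in meters."""
    return SPEED_OF_SOUND * round_trip_s / 2

# A real obstacle 2 m away returns an echo after ~11.7 ms.
true_delay = 2 * 2.0 / SPEED_OF_SOUND
real = distance_from_echo(true_delay)        # ~2.0 m

# A spoofing transmitter replies BEFORE the true echo arrives,
# so the sensor reports a phantom obstacle much closer than 2 m.
spoofed = distance_from_echo(0.003)          # 3 ms fake echo -> ~0.51 m

# A jammer simply drowns out the echo: no valid delay is measured,
# and the obstacle "disappears" from the parking-assistance display.
```

Radar works the same way at longer range with radio waves, which is why matching a Tesla's emitted radar signal let the researchers shift a leading vehicle's apparent distance or erase it entirely.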

Car cameras can be temporarily blinded by laser pointers or LED flashlights, but the blinding objects must be within a few feet of the target vehicle. In a white paper, Liu, Xu and Yan noted that another car-sensor technology, light detection and ranging, or LiDAR, can accidentally blind other cars' cameras.

Both laser pointers and LiDAR can permanently damage camera sensors. However, LiDAR is expensive to deploy and few commercially available vehicles use it.

Yan summed up the research's takeaway this way: "Attacking physical sensors on cars is feasible, but the sky is not falling."

However, he added, "manufacturers should design sensors with security in mind, and think about intentional attacks."