Researchers blind autonomous cars by tricking LIDAR

As I was on the motorway, I saw a man who wasn't there. Then things went pear-shaped

If you've ever been dazzled by some idiot driving towards you at night with their high beams on, you'd probably welcome a self-driving car – except that one of its key “eyes”, LIDAR, can also be blinded, or tricked into reacting to objects that aren't there.

LIDAR (Light Detection and Ranging) is an important self-driving vehicle technology: it measures distances to objects by firing pulsed laser light at them and timing the reflections.
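The underlying time-of-flight calculation is simple enough to sketch. This is an illustrative toy, not the Velodyne VLP-16's actual firmware logic:

```python
C = 299_792_458.0  # speed of light, m/s

def distance_from_round_trip(round_trip_s: float) -> float:
    """Distance to a reflector, given the pulse's round-trip time."""
    # The pulse travels out and back, so halve the round-trip path.
    return C * round_trip_s / 2.0

# A reflection arriving ~333 ns after firing puts the object ~50 m away.
print(round(distance_from_round_trip(333e-9), 1))
```

It's exactly this trust in pulse timing that both attacks below exploit.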

Hocheol Shin, Dohyun Kim, Yujin Kwon, and Yongdae Kim of the Korea Advanced Institute of Science and Technology have demonstrated two kinds of attacks against LIDAR: a spoofing attack, and a saturation attack. Their work is published at the International Association for Cryptologic Research's pre-print archive here.

While their work was in a lab, they write that the potential damage from an attack is serious.

“As per the data from UK Department for Transport, 55m is the braking distance for a car driving at 60mph. Because the braking distance is the distance required solely for braking, even autonomous vehicles have no room for checking the authenticity of the observed dots, but need to immediately activate emergency braking or evasive manoeuvres. Such sudden actions are sufficient to endanger the surrounding vehicles.”
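The arithmetic behind that quote is worth spelling out: at 60 mph the car covers the entire 55 m braking distance in roughly two seconds, an upper bound on any time available to second-guess the sensor. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the 60 mph / 55 m figures quoted above.
MPH_TO_MS = 1609.344 / 3600          # metres per second per mph

speed = 60 * MPH_TO_MS               # ~26.8 m/s
braking_distance = 55.0              # metres, UK DfT figure cited in the paper

# Even ignoring deceleration, the car covers the whole braking
# distance in about two seconds.
print(round(braking_distance / speed, 2))  # ~2.05 seconds
```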

The subject for their proof-of-concept attacks was the Velodyne VLP-16 sensor.

The saturation attack is very straightforward: “By illuminating the LIDAR with a strong light of the same wavelength as that the LIDAR uses, we can actually erase the existing objects in the sensed output of the LIDAR.”

The spoofing attack was more complex: the four researchers not only fed the LIDAR an optical illusion, they made the phantom object appear closer than the device creating it.

To do this, the attackers exploited two characteristics of LIDAR, one of them intrinsic to the technology, the other specific to the implementation.

Simplified illustration of LIDAR operation. Image: IACR paper

Rather than capturing whole objects (as a camera does), LIDAR captures a point cloud sufficient to infer that an object is in its view (the car's computers then decide what action to take, if any). To spoof an object, the attackers only need to make the sensors respond to points of light that look like the point cloud of an object.

Exploiting refraction to trick the LIDAR. Image: IACR paper

If the sensor only responded in a single direction (say, straight ahead), spoofing wouldn't be much of an attack, since you'd have to put your attack device directly in the path of the vehicle.

That's where the implementation comes in: the researchers noticed that the Velodyne LIDAR (and many similar devices) protect their sensors with curved glass. A laser generating a point cloud at an angle can exploit refraction to change the “apparent” direction and distance the point cloud lies in.
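The direction shift the curved cover introduces is ordinary Snell's-law refraction. A toy calculation, using an assumed refractive index for generic glass rather than the VLP-16's actual cover material:

```python
import math

def refracted_angle(incident_deg: float, n1: float = 1.0, n2: float = 1.5) -> float:
    """Snell's law: n1 * sin(theta1) = n2 * sin(theta2).

    n2 = 1.5 is a typical value for glass, assumed here for illustration.
    """
    theta1 = math.radians(incident_deg)
    return math.degrees(math.asin(n1 * math.sin(theta1) / n2))

# A beam hitting the cover at 45 degrees bends to roughly 28 degrees
# inside the glass, so the sensor "sees" it from a shifted direction.
print(round(refracted_angle(45.0), 1))
```

An attacker off to one side can thus land fake dots in a direction they don't physically occupy.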

“Fake dots in directions other than the direction of the attacker can be a severe threat to the victim, because the detected points have different significances according to their directions on roads”, they write.

The researchers demonstrated a second spoofing attack: they captured the laser pulse emitted by a LIDAR, added a bit of delay, and sent back a corresponding pulse using their own laser.
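The relayed pulse works because the sensor converts any extra delay straight into extra apparent distance. A minimal sketch, assuming a sensor that naively trusts the returned pulse (the function name and scenario are illustrative, not from the paper):

```python
C = 299_792_458.0  # speed of light, m/s

def spoofed_range(attacker_range_m: float, extra_delay_s: float) -> float:
    """Apparent range reported when an attacker relays the LIDAR's own
    pulse back with added delay (sketch; assumes a trusting receiver)."""
    # Each extra nanosecond of delay adds ~15 cm of apparent distance.
    return attacker_range_m + C * extra_delay_s / 2.0

# An attacker 10 m away adding 100 ns of delay conjures a point ~25 m out.
print(round(spoofed_range(10.0, 100e-9), 1))
```

Fire slightly *ahead* of the sensor's predicted next pulse instead, and the phantom point lands closer than the attacker, which is how the researchers placed objects in front of their own hardware.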

The paper also points out the difficulty of defending systems against these attacks: adding technology to authenticate the perceived dots, for example, could slow things down too much in an autonomous vehicle. ®
