Self-driving cars need to see what’s around them to avoid obstacles and drive safely. These autonomous vehicles typically rely on a spinning, laser-based sensor called LiDAR, which works much like radar but with light, acting as the eyes of the car. It provides constant information about the distance to surrounding objects so the car can decide what actions are safe to take.
However, a collaboration of researchers from the University of Florida, the University of Michigan, and the University of Electro-Communications in Japan has now shown that these eyes can be tricked with a fairly basic laser setup. They demonstrated that carefully timed lasers shined at an approaching LiDAR system can create a blind spot in front of the vehicle large enough to completely hide moving pedestrians and other obstacles.
The deleted data leads the car to believe the road ahead is clear, endangering whatever lies in the attack’s blind spot. This is the first time a LiDAR sensor has been tricked into deleting data about obstacles. The team also proposes upgrades that could eliminate this weakness and protect people from malicious attacks.
Short for Light Detection and Ranging, a LiDAR system works by emitting pulses of laser light and timing the reflections to calculate the distance to obstacles in its path. The attack injects fake reflections to confuse the sensor.
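The ranging principle itself is simple: the sensor measures a pulse's round-trip time and converts it to distance at the speed of light. A minimal illustrative sketch (not any vendor's actual firmware) looks like this:

```python
# Illustrative sketch of lidar time-of-flight ranging.
C = 299_792_458.0  # speed of light in m/s

def range_from_time_of_flight(round_trip_s: float) -> float:
    """Distance to a reflecting surface, given the pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is half
    the round-trip path length.
    """
    return C * round_trip_s / 2.0

# A reflection arriving ~667 nanoseconds after emission corresponds
# to an obstacle roughly 100 meters away.
print(range_from_time_of_flight(667e-9))
```

This is also why injected light is so effective: a spoofed pulse that arrives at the right moment is indistinguishable, in timing terms, from a genuine echo.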
“We mimic the lidar reflections with our laser to make the sensor discount other reflections that are coming in from genuine obstacles,” said Sara Rampazzi, a UF professor of computer and information science and engineering who led the study. “The lidar is still receiving genuine data from the obstacle, but the data are automatically discarded because our fake reflections are the only ones perceived by the sensor.”
Using this technique, the researchers were able to delete data for static obstacles as well as moving pedestrians. They also demonstrated with real-world experiments that the attack could follow a slow-moving vehicle using basic camera tracking equipment. In simulations of autonomous vehicle decision-making, this deletion of data caused a car to continue accelerating toward a pedestrian it could no longer see instead of stopping as it should.
Updates to the LiDAR sensors or the software that interprets the raw data could address this vulnerability. Manufacturers could teach the software to look for the tell-tale signatures of the spoofed reflections added by the laser attack.
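One hypothetical software-side check (an assumption for illustration, not the defense proposed in the paper) is to watch for the sudden, physically implausible disappearance of returns in the region ahead of the vehicle, since injected reflections displace genuine ones in bulk rather than gradually:

```python
# Hypothetical anomaly check: flag frames where the number of lidar
# returns in a monitored forward region drops implausibly fast between
# consecutive scans. Function name and threshold are illustrative.

def sudden_dropout(prev_count: int, curr_count: int,
                   max_drop_ratio: float = 0.5) -> bool:
    """Return True if returns in the region fell by more than max_drop_ratio."""
    if prev_count == 0:
        return False  # nothing was there to disappear
    return (prev_count - curr_count) / prev_count > max_drop_ratio

# 400 returns shrinking to 60 in one frame is suspicious; a modest
# decrease (e.g., an obstacle leaving the field of view) is not.
print(sudden_dropout(prev_count=400, curr_count=60))   # flags the drop
print(sudden_dropout(prev_count=400, curr_count=350))  # does not
```

A real detector would need to account for legitimate causes of dropout, such as rain, occlusion, or low-reflectivity surfaces, before raising an alarm.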
“Revealing this liability allows us to build a more reliable system,” said Yulong Cao, a Michigan doctoral student and primary author of the study. “In our paper, we demonstrate that previous defense strategies aren’t enough, and we propose modifications that should address this weakness.”