A First Physical-World Trajectory Prediction Attack via LiDAR-induced Deceptions in Autonomous Driving

Authors: 

Yang Lou, City University of Hong Kong; Yi Zhu, State University of New York at Buffalo; Qun Song, Delft University of Technology; Rui Tan, Nanyang Technological University; Chunming Qiao, State University of New York at Buffalo; Wei-Bin Lee, Information Security Center, Hon Hai Research Institute, and Feng Chia University; Jianping Wang, City University of Hong Kong

Abstract: 

Trajectory prediction forecasts the future movements of nearby agents based on their historical trajectories. Accurate trajectory prediction (prediction for short) is crucial for autonomous vehicles (AVs). Existing attacks compromise the prediction model of a victim AV by directly manipulating the historical trajectory of an attacker AV, an approach with limited real-world applicability. This paper, for the first time, explores an indirect attack approach that induces prediction errors via attacks against the perception module of a victim AV. Although prior work has shown that physically realizable attacks against LiDAR-based perception are possible by placing a few objects at strategic locations, finding, within the vast search space, an object location that launches effective attacks against prediction under varying victim AV velocities remains an open challenge.

Through analysis, we observe that a prediction model is prone to attacks that focus on a single point in the scene. Consequently, we propose a novel two-stage attack framework to realize such a single-point attack. The first stage, a prediction-side attack, efficiently identifies state perturbations for the prediction model that are both effective and velocity-insensitive, guided by the distribution of detection results under object-based attacks against perception. The second stage, location matching, matches feasible physical object locations to the state perturbations found in the first stage. Our evaluation on a public autonomous driving dataset shows that our attack causes a collision rate of up to 63% and a variety of hazardous responses by the victim AV. We also demonstrate the effectiveness of our attack on a real testbed car. To the best of our knowledge, this study is the first security analysis spanning from LiDAR-based perception to prediction in autonomous driving, leading to a realistic attack on prediction. Finally, we discuss potential defenses against the proposed attack.
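The abstract describes the two-stage framework only at a high level. Below is a minimal sketch of the search structure it suggests, written under stated assumptions: the interfaces (`predict`, `detection_shift`), the sampling strategy, and all names and parameters are hypothetical stand-ins for exposition, not the authors' implementation.

# Illustrative sketch only: `predict`, `detection_shift`, and all parameters
# are hypothetical stand-ins, not the authors' code.
import numpy as np

def stage1_find_perturbation(predict, history, velocities, feasible_std,
                             n_samples=500, seed=0):
    # Stage 1 (prediction-side attack): sample single-point perturbations
    # from a distribution approximating the detection shifts that object-based
    # LiDAR attacks can induce, and keep the one whose worst-case prediction
    # error over candidate victim velocities is largest (velocity-insensitive).
    rng = np.random.default_rng(seed)
    T = history.shape[0]
    best = (None, None, -np.inf)                  # (index, delta, error)
    for _ in range(n_samples):
        idx = rng.integers(T)                     # perturb a single state
        delta = rng.normal(0.0, feasible_std, 2)  # feasible (x, y) shift
        perturbed = history.copy()
        perturbed[idx] += delta
        err = min(np.linalg.norm(predict(perturbed, v) - predict(history, v))
                  for v in velocities)
        if err > best[2]:
            best = (idx, delta, err)
    return best

def stage2_match_location(candidate_locations, detection_shift, target_delta):
    # Stage 2 (location matching): choose the physical object location whose
    # induced detection shift best approximates the target perturbation.
    shifts = np.array([detection_shift(loc) for loc in candidate_locations])
    i = int(np.argmin(np.linalg.norm(shifts - target_delta, axis=1)))
    return candidate_locations[i]

# Toy demo with a constant-velocity "prediction model" stand-in.
def toy_predict(history, v, horizon=12, dt=0.1):
    heading = history[-1] - history[-2]
    heading /= np.linalg.norm(heading) + 1e-9
    return history[-1] + np.outer(np.arange(1, horizon + 1) * v * dt, heading)

hist = np.cumsum(np.full((10, 2), 0.5), axis=0)   # straight-line history
idx, delta, err = stage1_find_perturbation(
    toy_predict, hist, velocities=[5.0, 10.0, 15.0], feasible_std=0.4)

Maximizing the minimum error over candidate velocities mirrors the abstract's velocity-insensitivity requirement, and stage 2 collapses the vast physical search space to a nearest-neighbor match against the perturbation found in stage 1.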

BibTeX
@inproceedings{299758,
author = {Yang Lou and Yi Zhu and Qun Song and Rui Tan and Chunming Qiao and Wei-Bin Lee and Jianping Wang},
title = {A First {Physical-World} Trajectory Prediction Attack via {LiDAR-induced} Deceptions in Autonomous Driving},
booktitle = {33rd USENIX Security Symposium (USENIX Security 24)},
year = {2024},
isbn = {978-1-939133-44-1},
address = {Philadelphia, PA},
pages = {6291--6308},
url = {https://www.usenix.org/conference/usenixsecurity24/presentation/lou},
publisher = {USENIX Association},
month = aug
}