
Enduring, Efficient and Robust Trajectory Prediction Attack in Autonomous Driving via Optimization-Driven Multi-Frame Perturbation Framework


Updates

  • (02/27/2025) Paper accepted to CVPR 2025!

Abstract

Trajectory prediction plays a crucial role in autonomous driving systems, and exploring its vulnerability has garnered widespread attention. However, existing trajectory prediction attack methods often rely on single-point attacks to make efficient perturbations. This limits their applications in real-world scenarios due to the transient nature of single-point attacks, their susceptibility to filtration, and the uncertainty regarding the deployment environment. To address these challenges, this paper proposes a novel LiDAR-induced attack framework to impose multi-frame attacks by optimization-driven adversarial location search, achieving endurance, efficiency, and robustness. This framework strategically places objects near the adversarial vehicle to implement an attack and introduces three key innovations. First, successive state perturbations are generated using a multi-frame single-point attack strategy, effectively misleading trajectory predictions over extended time horizons. Second, we efficiently optimize adversarial objects' locations through three specialized loss functions to achieve desired perturbations. Lastly, we improve robustness by treating the adversarial object as a point without size constraints during the location search phase and reduce dependence on both the specific attack point and the adversarial object's properties. Extensive experiments confirm the superior performance and robustness of our framework.

Attack Steps

Setup

  1. Clone this repository.
  2. `cd` into the PIXOR_nuscs directory and create a conda environment for the PIXOR detector:
    # create env
    cd PIXOR_nuscs
    conda create -n pixor_nuscs python=3.7
    
    # install dependencies
    # Follow https://github.com/philip-huang/PIXOR to install the dependencies
    
    # compile
    cd srcs/preprocess_nuscs
    make
  3. `cd` into the Trajectron-plus-plus directory and create a conda environment for the Trajectron++ predictor:
    # create env
    cd Trajectron-plus-plus
    conda create -n trajectron++ python=3.7
    
    # install dependencies
    # Follow https://github.com/StanfordASL/Trajectron-plus-plus to install the dependencies

Data preparation

Download the Full dataset (v1.0) from the official nuScenes website.
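After extracting the downloaded archives, the data directory should follow the standard nuScenes devkit layout. The sketch below is illustrative only: the `NUSC_DATADIR` path is a placeholder you should point at your own dataset location, and the sanity check simply confirms the top-level folders the devkit expects are present.

```shell
#!/usr/bin/env bash
# Illustrative layout check for an extracted nuScenes v1.0 dataset.
# NUSC_DATADIR is a placeholder -- set it to your actual data path.
NUSC_DATADIR="${NUSC_DATADIR:-./data/nuscenes}"

# Create the expected top-level folders (extraction normally does this).
for d in maps samples sweeps v1.0-trainval; do
  mkdir -p "$NUSC_DATADIR/$d"
done

# Verify the layout before running preprocessing.
for d in maps samples sweeps v1.0-trainval; do
  [ -d "$NUSC_DATADIR/$d" ] || { echo "missing $d" >&2; exit 1; }
done
echo "nuScenes layout looks OK"
```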

Training of PIXOR detector and Trajectron++ predictor

You can follow the official guides to train your own models, then replace the models in OMP-ATTACK/PIXOR_nuscs/srcs/experiments/nusce_kitti and OMP-ATTACK/Trajectron-plus-plus/experiments/nuScenes/models/int_ee with your own.
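Placing the trained checkpoints can be sketched as below. The checkpoint filenames (`pixor_best.pth`, `model_registrar-20.pt`) are placeholders, not names the repository prescribes; use whatever files your own training runs produced.

```shell
#!/usr/bin/env bash
# Sketch: copy trained checkpoints into the paths the attack scripts read.
# The two checkpoint filenames below are hypothetical placeholders.
REPO=./OMP-ATTACK
mkdir -p "$REPO/PIXOR_nuscs/srcs/experiments/nusce_kitti"
mkdir -p "$REPO/Trajectron-plus-plus/experiments/nuScenes/models/int_ee"

# Simulate trained checkpoints for illustration only.
touch pixor_best.pth model_registrar-20.pt

cp pixor_best.pth "$REPO/PIXOR_nuscs/srcs/experiments/nusce_kitti/"
cp model_registrar-20.pt "$REPO/Trajectron-plus-plus/experiments/nuScenes/models/int_ee/"
echo "checkpoints placed"
```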

Attack

  • Initialize the paths and files in ./omp-attack.sh, including `root_dir`, `target_scene_instance_file`, `code_dir`, and `nusc_datadir`.
  • Follow the steps in ./omp-attack.sh to run the attack and evaluation.
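A minimal sketch of the variable block to set at the top of ./omp-attack.sh. The variable names come from the instructions above; every path value is a placeholder you should replace with your own checkout and dataset locations.

```shell
#!/usr/bin/env bash
# Hypothetical example values for the variables in ./omp-attack.sh.
# All paths are placeholders -- adjust to your environment.
root_dir="$HOME/OMP-ATTACK"                              # repository root
code_dir="$root_dir"                                     # attack code location
nusc_datadir="$HOME/data/nuscenes"                       # nuScenes v1.0 data
target_scene_instance_file="$root_dir/target_scenes.txt" # scenes/instances to attack

echo "root_dir=$root_dir"
```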

Citation

If you find this paper or the code useful for your research, please consider citing:

@inproceedings{yu2025enduring,
  title={Enduring, Efficient and Robust Trajectory Prediction Attack in Autonomous Driving via Optimization-Driven Multi-Frame Perturbation Framework},
  author={Yu, Yi and Han, Weizhen and Wu, Libing and Liu, Bingyi and Wang, Enshu and Zhang, Zhuangzhuang},
  booktitle={Proceedings of the Computer Vision and Pattern Recognition Conference},
  pages={17229--17238},
  year={2025}
}

Acknowledgments

Our code builds on Lou et al., "A First Physical-World Trajectory Prediction Attack via LiDAR-induced Deceptions in Autonomous Driving," 33rd USENIX Security Symposium (USENIX Security 24). We thank the authors for their contributions and helpful support.

About

Enduring, Efficient and Robust Trajectory Prediction Attack in Autonomous Driving via Optimization-Driven Multi-Frame Perturbation Framework (CVPR 2025 Highlight)
