Shaocong Wang, Fengkui Cao, Xieyuanli Chen, Ting Wang and Lianqing Liu
- 1 March 2025: Code update.
- 7 January 2025: Accepted by IEEE RA-L!
BEV-LSLAM requires an input point cloud of type `sensor_msgs::PointCloud2`.
- Ubuntu 18.04 or 20.04
- ROS Melodic or Noetic (roscpp, std_msgs, sensor_msgs, geometry_msgs, pcl_ros)
- cv_bridge
- OpenCV
- C++14
- OpenMP
- Point Cloud Library
- Eigen >=3.3.4
- Ceres >=1.14
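Since Eigen and Ceres have minimum version requirements, it is worth checking the installed versions before building. A minimal sketch, assuming the version string has already been queried (e.g. via `pkg-config --modversion eigen3`); `EIGEN_VER=3.3.7` below is only an illustrative value:

```shell
# Compare a detected Eigen version against the 3.3.4 minimum using
# version-aware sorting (sort -V). Substitute your actual version string.
EIGEN_VER=3.3.7   # illustrative value; query your system instead
MIN=3.3.4
# If MIN sorts first, the installed version satisfies the requirement.
if [ "$(printf '%s\n' "$MIN" "$EIGEN_VER" | sort -V | head -n1)" = "$MIN" ]; then
  echo "Eigen $EIGEN_VER OK (>= $MIN)"
else
  echo "Eigen $EIGEN_VER too old (< $MIN)"
fi
# prints: Eigen 3.3.7 OK (>= 3.3.4)
```

The same pattern applies to the Ceres >= 1.14 check.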
Create a catkin workspace, clone the repository (which provides the imagecloud_msg and orb_lio packages) into the src folder, and compile via the catkin_tools package (or catkin_make if preferred):
mkdir ws && cd ws && mkdir src && catkin init && cd src
git clone https://github.com/ROBOT-WSC/BEV-LSLAM.git
catkin_make

For your convenience, the KITTI, UrbanLoco, and Groundrobot datasets can be tested on BEV-LSLAM. (Different datasets have different intensity ranges; please check line 222 of scantoscan.cpp before your experiment.) For Groundrobot, we provide example test data here. To run, first launch BEV-LSLAM via:
roslaunch orb_lio orb_lo.launch

In a separate terminal session, play back the downloaded bag:
rosbag play <bag_name> --clock
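Because `--clock` makes rosbag publish simulated time on `/clock`, the nodes must be told to use it. A minimal wrapper launch file, sketched under the assumption that `orb_lo.launch` lives in the `launch/` folder of the `orb_lio` package (the wrapper's filename is illustrative):

```xml
<!-- bev_lslam_sim.launch: illustrative wrapper; the include path assumes
     the standard orb_lio package layout from the clone step above. -->
<launch>
  <!-- rosbag play --clock publishes /clock; tell all nodes to use it -->
  <param name="/use_sim_time" value="true"/>
  <include file="$(find orb_lio)/launch/orb_lo.launch"/>
</launch>
```

Alternatively, set the parameter by hand with `rosparam set /use_sim_time true` before launching.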
If you find BEV-LSLAM useful in your research or applications, please consider giving us a star 🌟 and citing it with the following BibTeX entry.
@ARTICLE{10845798,
author={Cao, Fengkui and Wang, Shaocong and Chen, Xieyuanli and Wang, Ting and Liu, Lianqing},
journal={IEEE Robotics and Automation Letters},
title={BEV-LSLAM: A Novel and Compact BEV LiDAR SLAM for Outdoor Environment},
year={2025},
volume={10},
number={3},
pages={2462-2469},
keywords={Laser radar;Feature extraction;Simultaneous localization and mapping;Point cloud compression;Visualization;Tracking loops;Robots;Optimization;Pose estimation;Pipelines;SLAM;localization;mapping},
doi={10.1109/LRA.2025.3531727}}

We thank the authors of the FastGICP, ORB-SLAM, and A-LOAM open-source packages.