A ROS2 Jazzy package for SLAM using the IMU of the Waveshare Sense HAT B and a YDLidar.
- Single Board Computer: Raspberry Pi or OrangePi 5
- Waveshare Sense HAT B
- YDLidar Tmini Pro (can be switched to another YDLidar device)
- Dupont Connector
On the Orange Pi 5 with Armbian OS, physical pins 3 (SDA) and 5 (SCL) of the 26-pin header are connected to the I2C-5 controller of the RK3588 SoC and are configured in mux mode 3.
- Add a user overlay entry in /boot/armbianEnv.txt: `user_overlays=rk3588-i2c5-m3`
- Run `sudo armbian-add-overlay ./hw/rk3588/rk3588-i2c5-m3.dts`
- Reboot and check whether `sudo i2cdetect -y 5` lists any device entries.
- Alternatively, scan all I2C buses by running:
  `ls /dev/i2c* | while read line; do id="$(echo $line | cut -d '-' -f 2)"; echo -e "\n## Detecting i2c ID: $id"; sudo i2cdetect -y $id; done`
See other dts overlay files in https://github.com/orangepi-xunlong/linux-orangepi/tree/orange-pi-5.10-rk3588/arch/arm64/boot/dts/rockchip/overlay
On a Raspberry Pi with Ubuntu 24.04 Server, the sensor HAT is connected to I2C-1. Follow these steps:
- Run `sudo raspi-config`, choose "3. Interface Options" -> "I5 I2C" and select "yes".
- Reboot and check that `sudo i2cdetect -y 1` returns device addresses 0x29, 0x48, 0x5c, 0x68, and 0x70.
In either case, `sudo i2cdetect -y 5` on the Orange Pi 5 or `sudo i2cdetect -y 1` on the Raspberry Pi 4 should return:
0 1 2 3 4 5 6 7 8 9 a b c d e f
00: -- -- -- -- -- -- -- --
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
20: -- -- -- -- -- -- -- -- -- 29 -- -- -- -- -- --
30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
40: -- -- -- -- -- -- -- -- 48 -- -- -- -- -- -- --
50: -- -- -- -- -- -- -- -- -- -- -- -- 5c -- -- --
60: -- -- -- -- -- -- -- -- 68 -- -- -- -- -- -- --
70: 70 -- -- -- -- -- -- --
Where
- 0x29 is the TCS34725 color recognition sensor
- 0x48 is the ADS1015 analog-to-digital converter
- 0x5C is the LPS22HB air pressure sensor
- 0x68 is the ICM-20948 9-axis IMU
- 0x70 is the SHTC3 temperature and humidity sensor
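If you prefer a scripted check, the following minimal Python sketch (assuming the `smbus2` package is installed via pip) probes the five expected addresses; pass the bus number as an argument (1 on the Raspberry Pi, 5 on the Orange Pi 5).

```python
# probe_sense_hat.py -- minimal sketch, assumes `pip install smbus2`
import sys
from smbus2 import SMBus

# Expected Sense HAT B devices (address -> sensor)
EXPECTED = {
    0x29: "TCS34725 color sensor",
    0x48: "ADS1015 ADC",
    0x5C: "LPS22HB pressure sensor",
    0x68: "ICM-20948 IMU",
    0x70: "SHTC3 temperature/humidity sensor",
}

def main(bus_id: int) -> None:
    with SMBus(bus_id) as bus:
        for addr, name in EXPECTED.items():
            try:
                bus.read_byte(addr)  # raises OSError if nothing ACKs at this address
                print(f"0x{addr:02x} OK       {name}")
            except OSError:
                print(f"0x{addr:02x} MISSING  {name}")

if __name__ == "__main__":
    # bus 1 on Raspberry Pi, bus 5 on Orange Pi 5
    main(int(sys.argv[1]) if len(sys.argv) > 1 else 1)
```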
Add your user to the i2c and dialout groups:
sudo usermod -aG i2c $USER
sudo usermod -aG dialout $USER
- Install ROS2 Jazzy on your Ubuntu 24.04 system, see https://docs.ros.org/en/jazzy/Installation/Ubuntu-Install-Debs.html.
- Create a new ROS2 workspace: `mkdir -p ~/ros2_ws_slam/src && cd ~/ros2_ws_slam`
- Clone and install the YDLidar SDK outside the ROS2 workspace: `git clone https://github.com/YDLIDAR/YDLidar-SDK.git ~/YDLidar-SDK`
- Clone the YDLidar driver for ROS2 from the humble branch, inside the ROS2 workspace: `cd ~/ros2_ws_slam/src && git clone https://github.com/YDLIDAR/ydlidar_ros2_driver.git -b humble`
- Install the robot_localization package: `sudo apt install ros-${ROS_DISTRO}-robot-localization`
- Clone this repository inside the src folder of the ROS2 workspace.
- Build the package from the workspace root folder: `colcon build --symlink-install`
- If the build is successful, source the local setup: `source ./install/setup.bash`
- Run the test script to validate the setup: `./scripts/test_slam_setup.sh`
- Run the launch script: `ros2 launch portable_slam launch_opi5.py` for the Orange Pi 5 or `ros2 launch portable_slam launch_rpi4.py` for the Raspberry Pi 4B.
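For orientation, here is a rough sketch of how such a launch file composes the pipeline. Node and executable names are illustrative assumptions; the actual wiring, parameters, and remappings live in launch_opi5.py / launch_rpi4.py in this repository.

```python
# Simplified sketch of the launch composition -- not the actual launch file.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        # YDLidar Tmini Pro driver publishing laser scans
        Node(package='ydlidar_ros2_driver', executable='ydlidar_ros2_driver_node'),
        # Raw IMU data from the Sense HAT B (node name as described below)
        Node(package='portable_slam', executable='sense_hat_node'),
        # Orientation filtering with magnetometer support
        Node(package='imu_filter_madgwick', executable='imu_filter_madgwick_node',
             parameters=[{'use_mag': True, 'gain': 0.1}]),
        # Scan-matching odometry from the lidar
        Node(package='rf2o_laser_odometry', executable='rf2o_laser_odometry_node'),
        # EKF fusing rf2o odometry and filtered IMU
        Node(package='robot_localization', executable='ekf_node'),
        # Lidar-based mapping
        Node(package='slam_toolbox', executable='async_slam_toolbox_node'),
    ])
```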
This project supports a hybrid development workflow that allows editing code locally in your IDE while testing builds on the target Raspberry Pi hardware. This eliminates the need for constant remote access during development.
- SSH key-based authentication set up between your local machine and Raspberry Pi
- Rsync installed on your local machine
- Raspberry Pi accessible via hostname/IP
- Docker installed on host (for visualization, optional)
To visualize SLAM mapping without installing ROS2 on your host, use Docker:
- Build the ROS2 Jazzy desktop container:
  `git clone https://github.com/ywiyogo/ubuntu-gui-container.git`
  `cd ubuntu-gui-container`
  `./build_podman.sh -d ros2_jazzy_desktop_dev.dockerfile -n ros2-jazzy-dev`
- Run the container with network access:
  `./run_podman_for_gui.sh ros2-jazzy-dev`
  If your host uses ufw, allow UDP multicast by running `sudo ufw allow in proto udp from 192.168.8.0/24 to any`.
- Inside the container, connect to your target:
  `export ROS_DOMAIN_ID=8  # same ID as on your target`
  `export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp`
  `rviz2 -d /root/portable_slam.rviz`
- On your target device, launch SLAM with the same ROS_DOMAIN_ID:
  `export ROS_DOMAIN_ID=8`
  `export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp`
  `ros2 launch portable_slam launch_rpi4.py`
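Once both sides are running, it can help to verify that the container actually discovers the nodes on the target before debugging RViz. The sketch below only assumes that rclpy is available inside the container and that the environment variables above are exported.

```python
# discovery_check.py -- run inside the container after exporting ROS_DOMAIN_ID.
# Minimal sketch: lists the nodes and topics visible over DDS.
import os
import rclpy
from rclpy.node import Node

def main():
    print(f"ROS_DOMAIN_ID={os.environ.get('ROS_DOMAIN_ID')}  "
          f"RMW_IMPLEMENTATION={os.environ.get('RMW_IMPLEMENTATION')}")
    rclpy.init()
    node = Node('discovery_check')
    # Give DDS discovery a moment before querying the graph.
    rclpy.spin_once(node, timeout_sec=2.0)
    print("Nodes:", node.get_node_names())
    print("Topics:", [name for name, _ in node.get_topic_names_and_types()])
    node.destroy_node()
    rclpy.shutdown()

if __name__ == "__main__":
    main()
```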
- Edit code locally in your IDE (e.g., VSCode)
- Set the target environment variable:
  `export TARGET="user@host"  # e.g., "pi@raspberrypi.local" or "user@192.168.1.100"`
  Or run it in one line:
  `TARGET=pi@raspberrypi.local ./scripts/sync_and_build.sh`
- Run `./scripts/sync_and_build.sh` (a rough Python sketch of these steps follows this list) to:
  - Rsync your local changes to the Pi (excluding .git to avoid conflicts)
  - Automatically build the package on the Pi with `colcon build --symlink-install`
- Test the functionality on the Pi
- Iterate: make more local changes, sync, test, repeat
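For reference, the sync-and-build step boils down to an rsync followed by a remote colcon build. The following is a rough, hypothetical Python equivalent; the remote workspace path is an assumption, and the authoritative logic is in scripts/sync_and_build.sh.

```python
# sync_sketch.py -- rough Python equivalent of the rsync + remote build step.
# Paths and package selection are assumptions; see scripts/sync_and_build.sh.
import os
import subprocess
import sys

TARGET = os.environ.get("TARGET", "pi@raspberrypi.local")  # user@host of the Pi
REMOTE_WS = "~/ros2_ws_slam"                               # adjust to your workspace path

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

def main():
    # 1. Rsync the local package to the Pi, excluding .git to avoid conflicts.
    run(["rsync", "-av", "--delete", "--exclude", ".git",
         "./", f"{TARGET}:{REMOTE_WS}/src/portable_slam/"])
    # 2. Build on the Pi with symlink install so edits are picked up quickly.
    run(["ssh", TARGET,
         f"cd {REMOTE_WS} && source /opt/ros/jazzy/setup.bash && "
         "colcon build --symlink-install --packages-select portable_slam"])

if __name__ == "__main__":
    try:
        main()
    except subprocess.CalledProcessError as exc:
        sys.exit(exc.returncode)
```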
When changes are stable and tested:
- Commit locally: `git add . && git commit -m "your message"`
- Push to the remote: `git push`
- Sync the Pi to the official version: SSH into the Pi and run `cd ~/ros2_slam_ws/src/portable_slam && git pull`
- Rapid iteration: Test changes immediately without committing
- Safe commits: Only push thoroughly tested code
- No remote editing: Full IDE features locally
- Git integration: Maintains version control workflow
The portable SLAM system is configured for handheld walking operation without wheel encoders, GPS, or visual-inertial odometry (VIO). Instead, it relies on:
- Laser Odometry: rf2o_laser_odometry provides scan-matching odometry from LiDAR data
- IMU-based Orientation: 9-axis IMU with magnetometer for heading stabilization
- Sensor Fusion: EKF combines laser odometry + IMU for robust pose estimation
Key Configuration:
- EKF configuration for `robot_localization` (see the parameter sketch after this list):
  - EKF: 20Hz processing rate
  - IMU: 20Hz to match the EKF
  - rf2o laser odometry: 10Hz scan matching
  - Optimized for walking dynamics:
    - Increased position process noise (0.12-0.15) to account for vertical bobbing and lateral sway
    - Increased velocity process noise (0.04-0.05) for variable walking speeds
    - Moderate trust in IMU orientation (stddev: 0.1), accounting for walking vibrations
    - Gravity compensation enabled
- `slam_toolbox` configuration, optimized for handheld walking SLAM:
  - Minimum laser range: 0.5m (captures nearby obstacles during walking)
  - Scan processing at 10Hz (minimum_time_interval: 0.1)
  - Transform timeout: 0.8s (accommodates SBC processing delays)
  - Motion model max angle: 1.2 rad (~69°) for natural head rotations
  - Scan deskewing enabled for motion compensation
- IMU Integration:
  - Madgwick filter with reduced beta (0.1) for stable orientation during walking
  - Magnetometer enabled for absolute heading reference
  - Quick transform updates (transform_publish_period: 0.02)
  - Balanced angle/distance penalties
- QoS Settings:
  - Best-effort reliability (matching the YDLidar)
  - History: keep the last 10 messages
  - Configured through the parameters file for better compatibility
- Our integration strategies are:
* LiDAR as primary source for mapping (using scan matching)
* rf2o laser odometry for short-term position tracking
* IMU for orientation and motion detection
* EKF for sensor fusion combining rf2o + IMU
* Proper message handling between components
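To make the tuning above concrete, here is a hedged sketch of how these values could be passed to the respective nodes. The parameter names are standard robot_localization, imu_filter_madgwick, and slam_toolbox parameters; the topic names, fused-state selections, and the full process-noise matrix are assumptions, so treat the config files in this repository as authoritative.

```python
# Parameter sketch reflecting the tuning described above. Values come from this
# README; topic and frame selections are illustrative assumptions.
from launch import LaunchDescription
from launch_ros.actions import Node

ekf_params = {
    'frequency': 20.0,                # EKF: 20 Hz processing rate
    'two_d_mode': True,               # planar handheld mapping
    'odom0': '/odom_rf2o',            # rf2o laser odometry (assumed topic)
    'odom0_config': [True, True, False,    # x, y
                     False, False, True,   # yaw
                     False, False, False,
                     False, False, False,
                     False, False, False],
    'imu0': '/imu/data',              # Madgwick-filtered IMU (assumed topic)
    'imu0_config': [False, False, False,
                    True, True, True,      # roll, pitch, yaw
                    False, False, False,
                    True, True, True,      # angular velocities
                    False, False, False],
    'imu0_remove_gravitational_acceleration': True,  # gravity compensation enabled
    # process_noise_covariance (15x15) is tuned for walking:
    # ~0.12-0.15 on position, ~0.04-0.05 on velocity (see the repository's EKF config).
}

madgwick_params = {
    'use_mag': True,   # magnetometer for absolute heading
    'gain': 0.1,       # reduced beta for stable orientation while walking
}

slam_toolbox_params = {
    'min_laser_range': 0.5,            # capture nearby obstacles
    'minimum_time_interval': 0.1,      # process scans at ~10 Hz
    'transform_timeout': 0.8,          # tolerate SBC processing delays
    'transform_publish_period': 0.02,  # quick map->odom updates
}

def generate_launch_description():
    return LaunchDescription([
        Node(package='robot_localization', executable='ekf_node',
             parameters=[ekf_params]),
        Node(package='imu_filter_madgwick', executable='imu_filter_madgwick_node',
             parameters=[madgwick_params]),
        Node(package='slam_toolbox', executable='async_slam_toolbox_node',
             parameters=[slam_toolbox_params]),
    ])
```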
Raw IMU data → Calibration → Madgwick filter (with magnetometer) → Filtered IMU with orientation → EKF (fusing IMU + rf2o odometry) → Odometry estimate → SLAM toolbox for mapping
- Processing rates: Lidar 10Hz → SLAM 10Hz (optimized processing); IMU 20Hz → Madgwick 50Hz → EKF 20Hz (see the rate-check sketch after the node list below)
- Hardware Integration: Well-structured with proper I2C configuration, transform publishing, and calibration workflow
- sense_hat_node: Publishes raw IMU data at 20Hz with configurable QoS and covariance matrices, includes calibration service
- imu_filter_madgwick: Processes raw IMU data into orientation estimates with magnetometer integration for heading stability
- robot_localization EKF: Fuses IMU and rf2o odometry data for robust pose estimation at 20Hz
- rf2o_laser_odometry: Provides additional odometry from lidar scan matching for enhanced sensor fusion
- slam_toolbox: Performs lidar-based mapping with motion compensation using EKF odometry and optimized processing rates
- ydlidar_ros2_driver: Provides laser scan data from YDLidar Tmini Pro
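To confirm the rates listed above on the device, you can measure a topic's publish frequency with a best-effort subscription, matching the YDLidar's QoS. The topic name below is an assumption; `ros2 topic hz /scan` provides the same information.

```python
# rate_check.py -- measure the publish rate of a topic with best-effort QoS.
# Topic name and message type are assumptions; adapt for the IMU topic etc.
import time
import rclpy
from rclpy.node import Node
from rclpy.qos import QoSProfile, ReliabilityPolicy, HistoryPolicy
from sensor_msgs.msg import LaserScan

class RateCheck(Node):
    def __init__(self):
        super().__init__('rate_check')
        qos = QoSProfile(depth=10,
                         reliability=ReliabilityPolicy.BEST_EFFORT,
                         history=HistoryPolicy.KEEP_LAST)
        self.count = 0
        self.start = time.monotonic()
        self.create_subscription(LaserScan, '/scan', self.callback, qos)

    def callback(self, _msg):
        self.count += 1
        elapsed = time.monotonic() - self.start
        if elapsed >= 5.0:  # report every 5 seconds
            self.get_logger().info(f"/scan rate: {self.count / elapsed:.1f} Hz")
            self.count, self.start = 0, time.monotonic()

def main():
    rclpy.init()
    rclpy.spin(RateCheck())

if __name__ == "__main__":
    main()
```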
map
└── odom
└── base_link
├── imu_link
└── laser_frame
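While the system is running, the transform chain can be checked programmatically; `ros2 run tf2_ros tf2_echo map base_link` works as well. Below is a minimal rclpy/tf2 sketch that looks up map → base_link using the frame names from the tree above.

```python
# tf_check.py -- verify that the map -> base_link transform chain is available.
# Frame names follow the TF tree above; run while the SLAM launch file is active.
import rclpy
from rclpy.node import Node
from tf2_ros import Buffer, TransformListener, TransformException

class TfCheck(Node):
    def __init__(self):
        super().__init__('tf_check')
        self.buffer = Buffer()
        self.listener = TransformListener(self.buffer, self)
        self.timer = self.create_timer(1.0, self.check)

    def check(self):
        try:
            # Zero time means "latest available transform".
            t = self.buffer.lookup_transform('map', 'base_link', rclpy.time.Time())
            p = t.transform.translation
            self.get_logger().info(f"map->base_link: x={p.x:.2f} y={p.y:.2f}")
        except TransformException as exc:
            self.get_logger().warn(f"transform not available yet: {exc}")

def main():
    rclpy.init()
    rclpy.spin(TfCheck())

if __name__ == "__main__":
    main()
```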
- IMU provides orientation and motion detection through Madgwick filtering with magnetometer stabilization
- rf2o provides odometry from lidar scan matching for short-term accuracy
- EKF fuses IMU and rf2o odometry for robust long-term pose estimation
- SLAM toolbox uses fused odometry for scan deskewing and pose extrapolation
- Lidar scan matching provides the primary mapping capability with enhanced stability
Hardware Optimization:
- Processing frequencies optimized for Raspberry Pi 4B and Orange Pi 5
- Conservative noise models based on ICM20948 datasheet specifications
- Automatic IMU calibration workflow in launch sequence
- Enhanced sensor fusion with rf2o odometry for better drift reduction
- Motion compensation ensures scan quality during movement with magnetometer heading stabilization
Walking Motion Optimization:
- Process noise tuned for human walking dynamics (0.5-1.5 m/s speed range)
- Madgwick filter beta reduced to 0.1 to prevent orientation errors during accelerations
- Motion model supports fast rotations up to 69° (1.2 rad) for natural head movements
- Minimum laser range 0.5m to capture nearby obstacles within arm's reach
- Transform timeout increased to 0.8s to handle SBC processing during intensive operations
Expected Performance:
- Suitable for: Slow walking (< 1.0 m/s), indoor environments, short sessions (< 10 min)
- Position drift: ~1-3% of distance traveled (with rf2o + IMU fusion)
- Heading drift: < 5° per minute (with magnetometer)
- Limitations: Fast walking/running, long corridors, outdoor magnetic interference
