Peiyu Chen†, Fuling Lin†, Weipeng Guan, Yi Luo, Peng Lu*
Adaptive Robotic Controls Lab (ArcLab), The University of Hong Kong.
SuperEIO is a novel event-based visual-inertial odometry framework that leverages self-supervised learning networks to enhance the accuracy and robustness of ego-motion estimation. Our event-only feature detection employs a convolutional neural network operating on continuous event streams. Moreover, our system adopts a graph neural network to achieve event descriptor matching for loop closure. The proposed system utilizes TensorRT to accelerate the inference speed of the deep networks, which ensures low-latency processing and robust real-time operation on resource-limited platforms. Furthermore, we evaluate our method extensively on multiple challenging public datasets, particularly in high-speed motion and high-dynamic-range scenarios, demonstrating its superior accuracy and robustness compared to other state-of-the-art event-based methods.
We test SuperEIO on Ubuntu 20.04. Before building SuperEIO, install the following dependencies:
- Ceres 1.14.0
- OpenCV 4.2
- Eigen 3
- TensorRT 8.4.1.5
- CUDA 11.6
- ROS noetic
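As a quick sanity check before building, a script along these lines reports which dependencies are visible. The paths below are typical Ubuntu 20.04 install locations and are only a guess; adjust them to wherever your toolchain actually lives:

```shell
#!/usr/bin/env bash
# Report which SuperEIO dependencies are visible on this machine.
# Paths are typical Ubuntu 20.04 locations and may differ on yours.

check() {
    # $1: a command name or a filesystem path to look for
    if command -v "$1" >/dev/null 2>&1 || [ -e "$1" ]; then
        echo "found:   $1"
    else
        echo "missing: $1"
    fi
}

check nvcc                                     # CUDA toolkit
check roscore                                  # ROS Noetic
check /usr/include/eigen3/Eigen/Core           # Eigen 3
check /usr/local/include/ceres/ceres.h         # Ceres (source install)
check /usr/include/x86_64-linux-gnu/NvInfer.h  # TensorRT headers
pkg-config --modversion opencv4 2>/dev/null || echo "missing: opencv4"
```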
Other event camera drivers are stored in the `dependencies` folder.
```shell
mkdir -p ~/catkin_ws_supereio/src
cd ~/catkin_ws_supereio
catkin config --init --mkdirs --extend /opt/ros/noetic --merge-devel --cmake-args -DCMAKE_BUILD_TYPE=Release
cd ~/catkin_ws_supereio/src
git clone git@github.com:your-repo/SuperEIO.git --recursive
```
Modify your `.bashrc` (e.g., via `gedit ~/.bashrc`) by adding the following lines:

```shell
source ~/catkin_ws_supereio/devel/setup.bash
alias supereiobuild='cd ~/catkin_ws_supereio && catkin build -j8 supereio_ba event_detector loop_closure'
alias supereiorun='cd ~/catkin_ws_supereio/src/SuperEIO/script && sh run.sh'
```
After that, run `source ~/.bashrc` followed by `supereiobuild` in your terminal.
You can test SuperEIO on the `hku_agg_translation` sequence. After downloading the bag file, just run the example:

```shell
roslaunch supereio_ba hku_stereo.launch
rosbag play YOUR_DOWNLOADED.bag
```
Alternatively, revise the script file and run it in your terminal:

```shell
supereiorun
```
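The repository's actual `run.sh` may differ, but a minimal sketch of such a script could look like the following; the package and launch file names come from this README, while the bag path is a placeholder you would replace:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of script/run.sh; the shipped script may differ.
set -u

LAUNCH_PKG="supereio_ba"                              # ROS package from this README
LAUNCH_FILE="${LAUNCH_FILE:-hku_stereo.launch}"       # swap in your dataset's launch file
BAG_FILE="${1:-$HOME/bags/hku_agg_translation.bag}"   # assumed bag download location

run_supereio() {
    if ! command -v roslaunch >/dev/null 2>&1; then
        echo "ROS not on PATH; source ~/catkin_ws_supereio/devel/setup.bash first"
        return 1
    fi
    roslaunch "$LAUNCH_PKG" "$LAUNCH_FILE" &   # start the estimator in the background
    local launch_pid=$!
    sleep 3                                    # let the nodes initialize before replay
    rosbag play "$BAG_FILE"                    # feed events and IMU from the bag
    kill "$launch_pid"
}

run_supereio || true
```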
To run the system on your own dataset, create a corresponding configuration folder and YAML file in the `config` directory. Then configure your camera intrinsics, event/IMU topics, and the extrinsic transformation between the event camera and IMU in the YAML file. For the extrinsic calibration, we recommend following the link (DVS-IMU Calibration and Synchronization) to calibrate your sensors.
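For reference, such a configuration might look like the sketch below. Every field name and value here is illustrative rather than taken from the repository, so copy one of the shipped YAML files and adapt it instead of using these entries verbatim:

```yaml
# Illustrative fields only; match names to the YAML files shipped in config/.
event_topic: "/dvs/events"
imu_topic: "/dvs/imu"
resolution: [346, 260]                     # sensor width, height
intrinsics: [263.0, 263.0, 173.0, 130.0]   # fx, fy, cx, cy
distortion: [-0.38, 0.15, 0.0, 0.0]        # k1, k2, p1, p2
# 4x4 homogeneous transform from the event camera frame to the IMU frame
T_imu_cam: [1.0, 0.0, 0.0, 0.0,
            0.0, 1.0, 0.0, 0.0,
            0.0, 0.0, 1.0, 0.0,
            0.0, 0.0, 0.0, 1.0]
```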
Following that, execute the commands below to run SuperEIO on your dataset:

```shell
roslaunch supereio_ba YOUR_DATASET.launch
rosbag play YOUR_BAG.bag
```
We present the qualitative performance of our event detector and descriptor matcher on multiple public datasets.
Visual comparison of other event feature detectors, SuperPoint, and ours on multiple datasets with corresponding images. From top to bottom: DAVIS240C, Mono HKU, Stereo HKU, and VECtor.
Examples of our event descriptor matches in loop closure under the `boxes_translation` and `hku_agg_translation` sequences.
We present a video demo of our SuperEIO system, showcasing its visualization performance in both the `hdr_boxes` and `aggressive_flight` scenarios.
SuperEIO is available on arXiv.
```bibtex
@article{SuperEIO,
  title={SuperEIO: Self-Supervised Event Feature Learning for Event Inertial Odometry},
  author={Chen, Peiyu and Lin, Fuling and Guan, Weipeng and Luo, Yi and Lu, Peng},
  journal={IEEE Transactions on Industrial Electronics},
  year={2026}
}
```
If SuperEIO has helped your research or work, a star or a citation of our work would be the best affirmation for us. 😊
This work was supported by the General Research Fund under Grant 17204222, and in part by the Seed Fund for Collaborative Research and the General Funding Scheme of the HKU-TCL Joint Research Center for Artificial Intelligence. We gratefully acknowledge sair-lab/AirSLAM for providing the SuperPoint TensorRT acceleration template, which significantly enhanced the compute efficiency of our system.
The source code is released under the GPLv3 license. We are still working on improving the code reliability. If you are interested in our project for commercial purposes, please contact Dr. Peng LU for further communication.


