Wanting Xu* Si’ao Zhang* Li Cui* Xin Peng Laurent Kneip
*equal contribution
Mobile Perception Lab, ShanghaiTech
International Conference on 3D Vision (3DV) 2024, Davos, CH
Paper | arXiv | Video | BibTeX
The open-source implementation of "Event-based Visual Odometry on Non-holonomic Ground Vehicles".
We consider visual odometry with a forward-facing event camera mounted on an Ackermann steering vehicle, whose motion can be locally approximated by a circular arc about an Instantaneous Centre of Rotation (ICR). We assume constant rotational velocity over the time interval for which events are considered.
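As a concrete illustration of this motion model, the sketch below integrates a constant yaw rate and a constant forward speed over a short event window to obtain the relative planar pose along the circular arc. It is not part of the library; the function and parameter names are our own.

```cpp
#include <cmath>
#include <iostream>

// Minimal sketch of the constant-rate circular-arc motion model for an
// Ackermann steering vehicle: the platform rotates about the ICR with a
// constant yaw rate omega [rad/s] and moves at a constant speed v [m/s].
// Names and conventions are illustrative, not taken from the repository.
struct PlanarPose {
  double x, y, theta;  // position [m] and heading [rad] w.r.t. the start frame
};

PlanarPose arcMotion(double omega, double v, double t) {
  const double theta = omega * t;       // accumulated heading change
  if (std::abs(omega) < 1e-9) {         // straight-line limit (ICR at infinity)
    return {v * t, 0.0, 0.0};
  }
  const double r = v / omega;           // signed radius of the arc about the ICR
  return {r * std::sin(theta),          // displacement along the arc
          r * (1.0 - std::cos(theta)),
          theta};
}

int main() {
  // e.g. 0.2 rad/s yaw rate, 1.5 m/s forward speed, over a 50 ms event window
  const PlanarPose p = arcMotion(0.2, 1.5, 0.05);
  std::cout << p.x << " " << p.y << " " << p.theta << "\n";
  return 0;
}
```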
Despite the promise of superior performance under challenging conditions, event-based motion estimation remains a hard problem owing to the difficulty of extracting and tracking stable features from event streams. In order to robustify the estimation, it is generally believed that fusion with other sensors is a requirement. In this work, we demonstrate reliable, purely event-based visual odometry on planar ground vehicles by employing the constrained non-holonomic motion model of Ackermann steering platforms. We extend single feature n-linearities for regular frame-based cameras to the case of quasi time-continuous event-tracks, and achieve a polynomial form via variable degree Taylor expansions. Robust averaging over multiple event tracks is simply achieved via histogram voting. As demonstrated on both simulated and real data, our algorithm achieves accurate and robust estimates of the vehicle’s instantaneous rotational velocity, and thus results that are comparable to the delta rotations obtained by frame-based sensors under normal conditions. We furthermore significantly outperform the more traditional alternatives in challenging illumination scenarios.
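The robust averaging step mentioned above can be illustrated with a minimal, self-contained sketch of histogram voting (our own simplification, not the repository's code): each event track contributes one rotational-velocity hypothesis, and the mode of the histogram is returned as the robust estimate.

```cpp
#include <algorithm>
#include <iostream>
#include <vector>

// Histogram voting sketch: collect per-track rotational-velocity hypotheses,
// bin them over a fixed range, and return the centre of the most-voted bin.
// Range and bin count are illustrative parameters.
double histogramVote(const std::vector<double>& omegaPerTrack,
                     double omegaMin, double omegaMax, int numBins) {
  std::vector<int> bins(numBins, 0);
  const double width = (omegaMax - omegaMin) / numBins;
  for (double w : omegaPerTrack) {
    if (w < omegaMin || w >= omegaMax) continue;   // ignore out-of-range votes
    ++bins[static_cast<int>((w - omegaMin) / width)];
  }
  const int best = static_cast<int>(
      std::max_element(bins.begin(), bins.end()) - bins.begin());
  return omegaMin + (best + 0.5) * width;          // bin centre as the estimate
}

int main() {
  // five consistent tracks around 0.2 rad/s and one gross outlier
  const std::vector<double> votes = {0.21, 0.20, 0.19, 0.22, 0.80, 0.21};
  std::cout << histogramVote(votes, -1.0, 1.0, 200) << "\n";  // ~0.215
  return 0;
}
```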
Dependencies:
- dlib
- Eigen3
```bash
git clone git@github.com:gowanting/NHEVO.git ./nhevo
cd ./nhevo
cmake -B build && cmake --build build
# optional
cmake --install build
```
We provide a minimal ROS example under `example`; you can easily modify `example/src/main.cpp` to remove the ROS dependency. Parameters are defined in `config/config.yaml`.
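For reference, the sketch below shows one way the ROS input could be replaced by a plain event-file reader. The Event struct, the assumed file format (one "t x y p" record per line), and the hand-off to the estimator are hypothetical; adapt them to the interfaces actually used in `example/src/main.cpp`.

```cpp
#include <fstream>
#include <sstream>
#include <string>
#include <vector>

// Hypothetical replacement for the ROS event subscription: read events from a
// plain text file instead of a rosbag. All names here are placeholders.
struct Event {
  double t;       // timestamp [s]
  int x, y;       // pixel coordinates
  bool polarity;  // event polarity
};

std::vector<Event> loadEvents(const std::string& path) {
  std::vector<Event> events;
  std::ifstream in(path);
  std::string line;
  while (std::getline(in, line)) {  // one "t x y p" record per line (assumed)
    std::istringstream ss(line);
    Event e;
    int p = 0;
    if (ss >> e.t >> e.x >> e.y >> p) {
      e.polarity = (p != 0);
      events.push_back(e);
    }
  }
  return events;
}

int main() {
  const std::vector<Event> events = loadEvents("events.txt");
  // Hand `events` to the estimator in place of the ROS callback, e.g. in
  // fixed-duration batches over which constant rotational velocity is assumed.
  (void)events;
  return 0;
}
```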
You need to attach your own implementations of a feature detector (point or vertical line) and a tracker; see `tracker.hh` and `detector.hh` for details.
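The following is only a hypothetical illustration of the plug-in pattern; the actual interface names and signatures are those defined in `detector.hh` and `tracker.hh` and may differ.

```cpp
#include <vector>

// Hypothetical plug-in sketch: derive your own detector and tracker from the
// abstract interfaces in detector.hh and tracker.hh. Everything below is a
// placeholder, including the Feature type.
struct Feature {
  double x, y;  // pixel coordinates
  double t;     // timestamp [s]
};

class MyPointDetector /* : public Detector (see detector.hh) */ {
 public:
  std::vector<Feature> detect(/* the current event slice */) {
    std::vector<Feature> corners;
    // ... run your event-based point (or vertical-line) detector here ...
    return corners;
  }
};

class MyTracker /* : public Tracker (see tracker.hh) */ {
 public:
  void track(const std::vector<Feature>& detections) {
    // ... associate `detections` over time into quasi-continuous event tracks ...
    (void)detections;
  }
};
```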
```bash
cd ./example
cmake -B build && cmake --build build
roscore &
rosbag play your_dataset.bag &
./build/nhevo
```
If you find this work useful, please consider citing:
```bibtex
@INPROCEEDINGS{xu24,
  title     = {Event-based Visual Odometry on Non-holonomic Ground Vehicles},
  author    = {Xu,~W. and Zhang,~S. and Cui,~L. and Peng,~X. and Kneip,~L.},
  year      = {2024},
  booktitle = {Proceedings of the International Conference on 3D Vision (3DV)}
}
```
This work is inspired by the following work:
Huang, Kun, Yifu Wang, and Laurent Kneip. "Motion estimation of non-holonomic ground vehicles from a single feature correspondence measured over n views." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2019. (link).
The authors would also like to thank the National Natural Science Foundation of China (62250610225) and the Natural Science Foundation of Shanghai (22dz1201900, 22ZR1441300) for their funding support. We also want to acknowledge the generous support of, and continued fruitful exchange with, our project collaborators at Midea Robozone.