This short set of instructions assumes that you're using the Android Dataset Recorder app.
- First, record a dataset with the app; it will be used to estimate the camera intrinsics and the camera-IMU extrinsics. This video gives a nice overview of the entire process.
- After recording the data and transferring it to a PC, execute the following command to create a rosbag from the data (the expected folder layout is sketched below):

```
rosrun kalibr kalibr_bagcreater --folder path/to/data --output-bag path/to/bag
```
The bag can later be used either for calibration or as input to a VIO algorithm.
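For reference, kalibr_bagcreater expects the data folder to follow the layout described in the Kalibr wiki; a minimal sketch (the timestamp file name is just a placeholder) looks like this:

```
path/to/data/
├── cam0/
│   └── 1385030208726607500.png   # one image per frame, named <timestamp in ns>.png
└── imu0.csv                      # timestamp [ns], omega_xyz [rad/s], alpha_xyz [m/s^2]
```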
- With the bag at hand, it's possible to run kalibr to estimate the camera intrinsics. At least for me, kalibr was unable to calibrate a single camera, so I resorted to a trick and simulated having two cameras by simply assigning the same image topic to the second camera.
The aprilgrid.yaml file describes the calibration grid I used, which was displayed on the screen of my PC. I followed the instructions from the Kalibr wiki, and in the end the aprilgrid configuration file aprilgrid.yaml had the following content:
```
target_type: 'aprilgrid'
tagCols: 6          # number of tags per row
tagRows: 6          # number of tags per column
tagSize: 0.026      # tag edge length [m]
tagSpacing: 0.2885  # gap between tags as a fraction of tagSize
```
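Note that, per the Kalibr target format, tagSpacing is a ratio rather than an absolute distance, so the gap between neighbouring tags here is about 0.2885 × 0.026 m ≈ 7.5 mm. Since the grid was shown on a screen rather than printed, tagSize has to be measured on the screen itself; the tagSpacing ratio, on the other hand, is unaffected by scaling.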
In addition, I needed to lower the --approx-sync parameter of the calibration command after a first attempt failed with a hint to decrease it.
The calibration command I used was:

```
$ rosrun kalibr kalibr_calibrate_cameras --bag /path/to/bag --topics /cam0/image_raw /cam0/image_raw --models pinhole-radtan pinhole-radtan --target aprilgrid.yaml --approx-sync 0.0001
```
As mentioned before, the second camera is faked by passing the /cam0/image_raw topic twice, because the calibration failed with only one camera.
The calibration process resulted in a file camchain-test.yaml, which contained the intrinsics of both cameras (identical, as only one camera was used in reality) as well as the calculated baseline between them, which was almost zero (on the order of 1e-7 m).
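For reference, a Kalibr camchain for the pinhole-radtan model looks roughly like the sketch below; the field values are placeholders, not my actual results:

```
cam0:
  camera_model: pinhole
  intrinsics: [fu, fv, pu, pv]        # focal lengths and principal point [px]
  distortion_model: radtan
  distortion_coeffs: [k1, k2, r1, r2] # radial (k) and tangential (r) terms
  resolution: [640, 480]
  rostopic: /cam0/image_raw
cam1:
  # same fields as cam0, plus the transform from the first camera
  T_cn_cnm1:
  - [1.0, 0.0, 0.0, 0.0]
  - [0.0, 1.0, 0.0, 0.0]
  - [0.0, 0.0, 1.0, 0.0]
  - [0.0, 0.0, 0.0, 1.0]
```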
- Then I was finally able to start the IMU-camera calibration (again simulating the second camera with the input from the first one). I also needed to provide noise parameters for the IMU in the imu.yaml file, which I had to guess. I used values similar to those already in use for VINS-Mobile (apart from the IMU frequency, which was known to be 50 Hz from the dataset recorder):
```
accelerometer_noise_density: 0.006   # [m/s^2/sqrt(Hz)]
accelerometer_random_walk: 0.0002    # [m/s^3/sqrt(Hz)]
gyroscope_noise_density: 0.0004      # [rad/s/sqrt(Hz)]
gyroscope_random_walk: 4.0e-06       # [rad/s^2/sqrt(Hz)]
update_rate: 50.0                    # [Hz]
rostopic: "/imu"
```
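These are continuous-time densities. If you want a sanity check against the raw sensor stream, the usual IMU noise model relates them to the per-sample standard deviation via sigma_d = sigma × sqrt(update_rate): with the values above that gives roughly 0.0004 × √50 ≈ 0.0028 rad/s for the gyroscope and 0.006 × √50 ≈ 0.042 m/s² for the accelerometer. If a stationary phone shows much more noise than that, these guesses should probably be inflated.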
The camera-IMU calibration command had the following form:

```
$ rosrun kalibr kalibr_calibrate_imu_camera --target aprilgrid.yaml --cam path/to/cam/calibration/result --imu imu.yaml --bag /path/to/bag
```
This calibration process took a while, but I ended up with a precise estimate of the transformation between the IMU and the camera.
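The result is written to a camchain-imucam-<bagname>.yaml file. If I read the Kalibr output format correctly, the key additions over the camera-only camchain look roughly like this (the values are placeholders):

```
cam0:
  # 4x4 transform taking points from the IMU frame into the camera frame
  T_cam_imu:
  - [1.0, 0.0, 0.0, 0.0]
  - [0.0, 1.0, 0.0, 0.0]
  - [0.0, 0.0, 1.0, 0.0]
  - [0.0, 0.0, 0.0, 1.0]
  # estimated time offset between camera and IMU clocks [s]
  timeshift_cam_imu: 0.002
```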
The rest of this file comes from the original Kalibr repo and hasn't been modified.
Kalibr is a toolbox that solves the following calibration problems:
- Multiple camera calibration: intrinsic and extrinsic calibration of camera systems with non-globally shared overlapping fields of view
- Visual-inertial calibration (camera-IMU): spatial and temporal calibration of an IMU w.r.t. a camera system
- Rolling shutter camera calibration: full intrinsic calibration (projection, distortion and shutter parameters) of rolling shutter cameras
Please find more information on the wiki pages of this repository.
For questions or comments, please open an issue on Github.
We've upgraded and fixed kalibr at ORI for Ubuntu 20.04. Please use our fork:

```
git clone https://github.com/ori-drs/kalibr.git --branch noetic-devel
```
- Use rosdep to install almost all required dependencies:

```
rosdep install --from-paths ./ -iry
```

- Then install the two missing runtime dependencies:

```
sudo apt install python3-wxgtk4.0 python3-igraph
```
- Unit tests are currently failing on 20.04 and are therefore deactivated on the build server.
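With the dependencies installed, building inside a catkin workspace typically looks like the sketch below; this is a generic catkin invocation (as used in the Kalibr installation wiki), not a command specific to this fork:

```
# from the root of a catkin workspace whose src/ contains the kalibr sources
catkin build -DCMAKE_BUILD_TYPE=Release -j4
source devel/setup.bash
```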
A video tutorial for the IMU-camera calibration can be found here:
(Credits: @indigomega)
- Paul Furgale (email)
- Hannes Sommer (email)
- Jérôme Maye (email)
- Jörn Rehder (email)
- Thomas Schneider (email)
- Luc Oth
The calibration approaches used in Kalibr are based on the following papers. Please cite the appropriate papers when using this toolbox or parts of it in an academic publication.
- Joern Rehder, Janosch Nikolic, Thomas Schneider, Timo Hinzmann, Roland Siegwart (2016). Extending kalibr: Calibrating the extrinsics of multiple IMUs and of individual axes. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), pp. 4304-4311, Stockholm, Sweden.
- Paul Furgale, Joern Rehder, Roland Siegwart (2013). Unified Temporal and Spatial Calibration for Multi-Sensor Systems. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Tokyo, Japan.
- Paul Furgale, T. D. Barfoot, G. Sibley (2012). Continuous-Time Batch Estimation Using Temporal Basis Functions. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), pp. 2088–2095, St. Paul, MN.
- J. Maye, P. Furgale, R. Siegwart (2013). Self-supervised Calibration for Robotic Systems. In Proceedings of the IEEE Intelligent Vehicles Symposium (IVS).
- L. Oth, P. Furgale, L. Kneip, R. Siegwart (2013). Rolling Shutter Camera Calibration. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
This work is supported in part by the European Union's Seventh Framework Programme (FP7/2007-2013) under grants #269916 (V-Charge), and #610603 (EUROPA2).
Copyright (c) 2014, Paul Furgale, Jérôme Maye and Jörn Rehder, Autonomous Systems Lab, ETH Zurich, Switzerland
Copyright (c) 2014, Thomas Schneider, Skybotix AG, Switzerland
All rights reserved.
Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
- Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
- Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
- All advertising materials mentioning features or use of this software must display the following acknowledgement: This product includes software developed by the Autonomous Systems Lab and Skybotix AG.
- Neither the name of the Autonomous Systems Lab and Skybotix AG nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE AUTONOMOUS SYSTEMS LAB AND SKYBOTIX AG ''AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTONOMOUS SYSTEMS LAB OR SKYBOTIX AG BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.