LVI-SAM-Simple

New Features

  • lviorf is now available; it supports more lidars and is more robust;

  • This repo is a modified version of LVI-SAM that makes it easier to adapt to your own sensors;

  • The original version of lio_sam hard-codes many sensor extrinsic parameters; this version moves them into the yaml files, making them easier to configure;

  • The original version of lio_sam does not account for the translation between the lidar and the camera; this version adds it;

  • The following parameters need to be set to match your device's extrinsics (see the example fragment after this list):

    • cameraToROSStandard*: ROS standard frame (x: front, y: left, z: up) to camera frame (typically x: right, y: down, z: front);
    • extrinsic*: IMU frame to camera frame;
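
A minimal sketch of what these entries might look like in the yaml files under config/. The Rot/Trans key suffixes and all of the values below are assumptions for illustration only; check the config files for the exact key names and fill in your own calibration:

    # ROS standard body frame (x: front, y: left, z: up) -> camera frame.
    # This rotation is only the conventional body-to-optical example, not a real calibration.
    cameraToROSStandardRot:   [ 0.0, -1.0,  0.0,
                                0.0,  0.0, -1.0,
                                1.0,  0.0,  0.0 ]
    cameraToROSStandardTrans: [ 0.0,  0.0,  0.0 ]   # metres

    # IMU frame -> camera frame (identity / zero placeholders).
    extrinsicRot:   [ 1.0, 0.0, 0.0,
                      0.0, 1.0, 0.0,
                      0.0, 0.0, 1.0 ]
    extrinsicTrans: [ 0.0, 0.0, 0.0 ]               # metres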

------------------- Update Date: 2022-12-22 -------------------

  • Fixed extrinsic bug;
  • Support UrbanNav datasets;
  • Support KITTI datasets;

For more details, please refer to my blog post: "LVI-SAM:配置环境、安装测试、适配自己采集数据集" (LVI-SAM: setting up the environment, installing and testing, and adapting your own collected dataset).

Run the package

  1. Compile the package:
     mkdir -p ~/lvi-sam-simple/src
     cd ~/lvi-sam-simple/src
     git clone https://github.com/YJZLuckyBoy/LVI-SAM-Simple.git
     cd ..
     catkin_make -j4
  2. Run the launch file (source the workspace first so roslaunch can find the package):
     source ~/lvi-sam-simple/devel/setup.bash
     roslaunch liorf run_lvi_sam.launch
  3. Play existing bag files:
     rosbag play handheld.bag

Demo

  1. lvi-sam
  2. UrbanNav
  3. kitti
  4. my data

LVI-SAM

This repository contains code for a lidar-visual-inertial odometry and mapping system, which combines the advantages of LIO-SAM and Vins-Mono at a system level.



Dependency

  • ROS (Tested with kinetic and melodic)
  • gtsam (Georgia Tech Smoothing and Mapping library)
    sudo add-apt-repository ppa:borglab/gtsam-release-4.0
    sudo apt update
    sudo apt install libgtsam-dev libgtsam-unstable-dev
    
  • Ceres (C++ library for modeling and solving large, complicated optimization problems)
    sudo apt-get install -y libgoogle-glog-dev
    sudo apt-get install -y libatlas-base-dev
    wget -O ~/Downloads/ceres.zip https://github.com/ceres-solver/ceres-solver/archive/1.14.0.zip
    cd ~/Downloads/ && unzip ceres.zip -d ~/Downloads/
    cd ~/Downloads/ceres-solver-1.14.0
    mkdir ceres-bin && cd ceres-bin
    cmake ..
    make -j4
    sudo make install
    

Getting started with Docker

With Docker you can resolve all of the dependencies at once.
For more information, see docker_start.md.


Compile

You can use the following commands to download and compile the package.

cd ~/catkin_ws/src
git clone https://github.com/TixiaoShan/LVI-SAM.git
cd ..
catkin_make

Datasets


The datasets used in the paper can be downloaded from Google Drive. The data-gathering sensor suite includes: Velodyne VLP-16 lidar, FLIR BFS-U3-04S2M-CS camera, MicroStrain 3DM-GX5-25 IMU, and Reach RS+ GPS.

https://drive.google.com/drive/folders/1q2NZnsgNmezFemoxhHnrDnp1JV_bqrgV?usp=sharing
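
If you prefer to fetch the dataset from a terminal, one option is the third-party gdown tool (not part of this repository), pointed at the folder link above:

  pip install gdown
  gdown --folder "https://drive.google.com/drive/folders/1q2NZnsgNmezFemoxhHnrDnp1JV_bqrgV?usp=sharing"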

Note that the images in the provided bag files are in compressed format, so a decompression command is added as the last line of launch/module_sam.launch. If your own bag records raw image data, comment that line out.
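
For reference, the same decompression can also be run by hand with image_transport's republish node; a minimal sketch, with placeholder topic names that may differ from the ones this repo uses:

  rosrun image_transport republish compressed in:=/camera/image_raw raw out:=/camera/image_raw_decompressed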



Run the package

  1. Configure parameters:
     Configure sensor parameters in the .yaml files in the config folder.
  2. Run the launch file:
     roslaunch lvi_sam run.launch
  3. Play existing bag files:
     rosbag play handheld.bag

Related Packages

  • LIO-SAM
  • Vins-Mono

TODO

  • Update graph optimization using all three factors in imuPreintegration.cpp, simplify mapOptimization.cpp, increase system stability

Paper

Thank you for citing our paper if you use any of this code or datasets.

@inproceedings{lvisam2021shan,
  title={LVI-SAM: Tightly-coupled Lidar-Visual-Inertial Odometry via Smoothing and Mapping},
  author={Shan, Tixiao and Englot, Brendan and Ratti, Carlo and Rus, Daniela},
  booktitle={IEEE International Conference on Robotics and Automation (ICRA)},
  pages={5692-5698},
  year={2021},
  organization={IEEE}
}

Acknowledgement

  • The visual-inertial odometry module is adapted from Vins-Mono.
  • The lidar-inertial odometry module is adapted from LIO-SAM.

