# 🤖 HE-Nav

**A High-Performance and Efficient Navigation System for Aerial-Ground Robots in Cluttered Environments**


arXiv | Project Page

## 🤗 AGR-Family Works

- **OMEGA** (RA-L 2024.12): The first AGR-tailored dynamic navigation system.
- **HE-Nav** (RA-L 2024.09): The first AGR-tailored ESDF-free navigation system.
- **AGRNav** (ICRA 2024.01): The first AGR-tailored occlusion-aware navigation system.

## 🎉 Chinese Media Reports/Interpretations

## 📢 News

- [2024/07]: Experiment logs of HE-Nav and its key components (i.e., LBSCNet and AG-Planner):

  | Task | Experiment Log |
  | --- | --- |
  | LBSCNet training log | link |
  | HE-Nav navigation in square room | link |
  | HE-Nav navigation in corridor | link |
  | AGRNav navigation in square room | link |
  | AGRNav navigation in corridor | link |
  | TABV navigation in square room | link |
  | TABV navigation in corridor | link |
- [2024/04]: The 3D model of the simulation environment can be downloaded from OneDrive.
- [2024/04]: 🔥 We released the code of HE-Nav in the simulation environment. The pre-trained model can be downloaded from OneDrive.

## 📜 Introduction

HE-Nav is a novel, efficient navigation system specialized for Aerial-Ground Robots (AGRs) in highly obstructed settings, optimizing both perception and path planning. It leverages a lightweight semantic scene completion network (LBSCNet) and an energy-efficient path planner (AG-Planner) to deliver high-performance, real-time navigation with impressive energy savings and planning success rates.
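The perception-then-planning idea above can be sketched in a few lines. This is a conceptual toy only: the real LBSCNet is a learned scene-completion network and AG-Planner is an optimization-based planner, so the function names, the 2D grid, and the naive occlusion fill below are all illustrative assumptions, not HE-Nav's actual API.

```python
from collections import deque

import numpy as np

def complete_scene(observed: np.ndarray) -> np.ndarray:
    """Stand-in for LBSCNet: naively mark every cell behind a sensor hit as occupied."""
    completed = observed.copy()
    for row in completed:
        hits = np.flatnonzero(row)
        if hits.size:                    # cells past the first hit are occluded: fill them
            row[hits[0]:] = 1
    return completed

def plan_path(grid: np.ndarray, start, goal):
    """Stand-in for AG-Planner: 4-connected BFS directly on the occupancy grid,
    i.e. no ESDF is ever built."""
    h, w = grid.shape
    prev, queue = {start: None}, deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:                  # reconstruct the path by walking predecessors
            path = []
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < h and 0 <= nc < w and grid[nr, nc] == 0 and nxt not in prev:
                prev[nxt] = cur
                queue.append(nxt)
    return None                          # goal unreachable in the completed map

observed = np.zeros((4, 4), dtype=int)
observed[1, 1] = 1                               # a single observed obstacle hit
completed = complete_scene(observed)             # "perception": fill occluded cells
path = plan_path(completed, (0, 0), (3, 3))      # "planning" on the completed map
print(path)
```

The point of the sketch is the data flow: the planner consumes the *completed* map, so it can route around occluded regions without ever computing a distance field.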


If HE-Nav helps your research, please cite:

```bibtex
@article{wang2024henav,
  title={HE-Nav: A High-Performance and Efficient Navigation System for Aerial-Ground Robots in Cluttered Environments},
  author={Wang, Junming and Sun, Zekai and Guan, Xiuxian and Shen, Tianxiang and Huang, Dong and Zhang, Zongyuan and Duan, Tianyang and Liu, Fangming and Cui, Heming},
  journal={IEEE Robotics and Automation Letters},
  year={2024},
  volume={9},
  number={11},
  pages={10383-10390},
  publisher={IEEE}
}
```

Please kindly star ⭐️ this project if it helps you. We put great effort into developing and maintaining it 😁.

## 🔧 Hardware List

| Hardware | Link |
| --- | --- |
| AMOV Lab P600 UAV | link |
| AMOV Lab Allapark1-Jetson Xavier NX | link |
| Wheeltec R550 ROS Car | link |
| Intel RealSense D435i | link |
| Intel RealSense T265 | link |
| TFmini Plus | link |

❗ Since visual positioning is prone to drift along the Z-axis, we added a TFmini Plus rangefinder for height measurement. Additionally, GNSS-RTK positioning is recommended for better localization accuracy.
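The correction above can be illustrated with a minimal fusion sketch. This is not HE-Nav code: the function name `fuse_height` and the gain value are assumptions chosen for the example, showing only the idea of letting an absolute range reading dominate a drift-prone visual Z estimate.

```python
# Illustrative sketch: fuse a drift-prone visual Z estimate with an absolute
# TFmini-style range reading. Name and gain are assumptions for this example.
def fuse_height(z_visual: float, z_range: float, alpha: float = 0.98) -> float:
    """Weight the absolute rangefinder reading heavily; keep a small visual
    term for smoothness if the range beam briefly loses the floor."""
    return alpha * z_range + (1.0 - alpha) * z_visual

# If vision has drifted to 1.5 m while the rangefinder reads 1.0 m, the fused
# estimate snaps back close to the rangefinder value.
print(fuse_height(1.5, 1.0))
```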

🤑 Our customized Aerial-Ground Robot costs about RMB 70,000.

## 🛠️ Installation

The code was tested with `python=3.6.9`, `pytorch=1.10.0+cu111`, and `torchvision=0.11.2+cu111`.

Please follow the instructions here to install both PyTorch and TorchVision dependencies. Installing both PyTorch and TorchVision with CUDA support is strongly recommended.
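For reference, the tested versions above would typically be installed from PyTorch's historical CUDA 11.1 wheel index. The exact command below is our assumption; defer to the official PyTorch installation instructions linked above.

```shell
# Assumed install command for the tested versions (pip + CUDA 11.1 wheels);
# verify against the official PyTorch installation instructions.
pip install torch==1.10.0+cu111 torchvision==0.11.2+cu111 \
    -f https://download.pytorch.org/whl/torch_stable.html
```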

1. Clone the repository locally:

   ```bash
   git clone https://github.com/jmwang0117/HE-Nav.git
   ```

2. We recommend using Docker to run the project, which reduces the burden of configuring the environment. You can find the Dockerfile in our project; then execute:

   ```bash
   docker build . -t skywalker_robot -f Dockerfile
   ```

3. After the build completes, use our one-click startup script in the same directory:

   ```bash
   bash create_container.sh
   ```

   Pay attention to switching to the correct Docker image.

4. Next, enter the container and clone our project with git:

   ```bash
   docker exec -it robot bash
   ```

5. Install the dependencies required to compile this project with `catkin_make`:

   ```bash
   apt update && sudo apt-get install libarmadillo-dev ros-melodic-nlopt
   ```

6. Run the following commands:

   ```bash
   pip install pyyaml
   pip install rospkg
   pip install imageio
   catkin_make
   source devel/setup.bash
   sh src/run_sim.sh
   ```

You have now launched the project successfully. If you only want to use the path planner, you can delete the perception network in the ROS package. Enjoy!

## 💽 Dataset

- SemanticKITTI

## 🏆 Acknowledgement

Many thanks to these excellent open source projects: