If you find this package useful for your research, please consider citing our paper:
René Zurbrügg, Hermann Blum, Cesar Cadena, Roland Siegwart and Lukas Schmid. "Embodied Active Domain Adaptation for Semantic Segmentation via Informative Path Planning" in IEEE Robotics and Automation Letters (RA-L), 2022. [ArXiv | Video]
```bibtex
@article{Zurbrgg2022EmbodiedAD,
  title   = {Embodied Active Domain Adaptation for Semantic Segmentation via Informative Path Planning},
  author  = {R. {Zurbr{\"u}gg} and H. {Blum} and C. {Cadena} and R. {Siegwart} and L. {Schmid}},
  journal = {ArXiv},
  year    = {2022},
  volume  = {abs/2203.00549}
}
```
The following installation instructions are for Linux. It is expected that Unreal Engine is already installed!
## Prerequisites
- If you have not already done so, install ROS (Desktop-Full is recommended).
- If you have not already done so, create a catkin workspace with catkin tools:

```shell
sudo apt-get install python3-catkin-tools
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws
catkin init
catkin config --extend /opt/ros/noetic  # exchange noetic for your ROS distro if necessary
catkin config --cmake-args -DCMAKE_BUILD_TYPE=Release
catkin config --merge-devel
```
## Installation
- Move to your catkin workspace:

```shell
cd ~/catkin_ws/src
```
- Install system dependencies:

```shell
sudo apt-get install python3-wstool python3-catkin-tools
```
- Download the repo using an SSH key or via HTTPS:

```shell
git clone https://github.com/ethz-asl/active_learning_for_segmentation
```
- Download dependencies:

  **New workspace:**

  ```shell
  wstool init . ./embodied_active_learning/dependencies.rosinstall
  wstool update
  ```

  **Existing workspace:**

  ```shell
  wstool merge -t . ./embodied_active_learning/dependencies.rosinstall
  wstool update
  ```
- Download additional dependencies:

```shell
wstool merge -t . ./mav_active_3d_path_planning/mav_active_3d_planning_https.rosinstall
wstool update
wstool merge -t . ./panoptic_mapping/panoptic_mapping.rosinstall
wstool update
```
- Install AirSim:

```shell
cd ../AirSim
./setup.sh
```
- Install pip dependencies:

```shell
cd ../active_learning_for_segmentation/embodied_active_learning
pip install -e .
cd ../embodied_active_learning_core
pip install -e .
cd ../volumetric_labeling
pip install -e .
cd ../../light-weight-refinenet
pip install -e .
cd ../densetorch
pip install -e .
```
- Source and compile:

```shell
source ../../devel/setup.bash
catkin build embodied_active_learning
```
## Run an Experiment

To run an experiment, first download the Flat Environment.
Second, download the Data Package and extract it at a location of your choice.
The Data Package contains the following files:
- `replayset`: a subset of the ScanNet dataset, used as replay buffer.
- `testsets`: selected images with ground-truth annotations, used for evaluation.
- `scannet_50_classes_40_clusters.pth`: checkpoint of the pretrained network and uncertainty estimator.
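After extracting, a quick shell check can confirm the expected entries are present. This is only a sketch: the `DATA_DIR` location is a placeholder for wherever you extracted the package, not something the repository defines.

```shell
# Sketch: verify the extracted Data Package contents.
# DATA_DIR is a placeholder for your extraction location (assumption).
DATA_DIR="${DATA_DIR:-$HOME/embodied_al_data}"
for entry in replayset testsets scannet_50_classes_40_clusters.pth; do
  if [ -e "$DATA_DIR/$entry" ]; then
    echo "found: $entry"
  else
    echo "missing: $entry"
  fi
done
```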
- Modify the files `embodied_active_learning/cfg/experiments/self_supervised/RunScannetDensity.yaml` as well as `embodied_active_learning/cfg/experiments/mapper/single_tsdf.yaml`. Update all paths (`/home/rene/...`) with your local information.
- Start Unreal Engine.
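One way to do the path substitution in bulk is with `sed`. Everything in this sketch is illustrative, not part of the repository: the sample config line, the `/tmp` file, and the `MY_HOME` target are stand-ins so you can see the substitution pattern.

```shell
# Sketch: replace the hard-coded /home/rene prefix with your own home
# directory. The sample yaml line below is an illustration only.
MY_HOME="$HOME"
cfg="/tmp/RunScannetDensity.yaml"
echo 'checkpoint: /home/rene/data/scannet_50_classes_40_clusters.pth' > "$cfg"
sed -i "s|/home/rene|$MY_HOME|g" "$cfg"
cat "$cfg"
```

Run the same `sed` expression over both real config files once you have verified it on a copy.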
- Execute

```shell
roslaunch embodied_active_learning panoptic_run.launch
```

to start an experiment.
This will create the following files during the experiment:

- `~/out/<RunName>/checkpoints/best_iteration_X.pth`: checkpoint of the model after each training cycle.
- `~/out/<RunName>/online_training/step_XXX`: folder containing all training samples (images with their respective pseudo labels) that were used for training.
- `~/out/<RunName>/poses.csv`: CSV file containing all requested robot poses (X, Y, Z, quaternion).
- `~/out/<RunName>/scroes.csv`: CSV file containing all mIoU scores on the training and test set.
- `~/out/<RunName>/params.yaml`: YAML file containing all parameters that were set for this experiment.
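For a quick look at the pose log, standard shell tools suffice. This sketch assumes a `X,Y,Z,qx,qy,qz,qw` column order, and the sample file it writes is purely illustrative; point `poses` at your actual `~/out/<RunName>/poses.csv` instead.

```shell
# Sketch: count logged poses and print the last recorded position.
# The sample CSV and its column order (X,Y,Z,qx,qy,qz,qw) are assumptions.
poses="/tmp/poses.csv"
printf '0.0,0.0,1.2,0.0,0.0,0.0,1.0\n1.5,0.3,1.2,0.0,0.0,0.0,1.0\n' > "$poses"
echo "poses logged: $(wc -l < "$poses")"
awk -F, 'END { printf "last position: %s %s %s\n", $1, $2, $3 }' "$poses"
```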
Additional information can be found in the Embodied Active Learning package.
## Packages

**`embodied_active_learning`**: the main package, built on top of all the others; conducts embodied experiments with either AirSim or Gibson.

**`embodied_active_learning_core`**: contains the main functionality needed for the embodied active learning package:

- Models
- Uncertainties
- Replay / training buffer

**`volumetric_labeling`**: contains code for volumetric labeling and image selection:

- Pseudo-label selection
- Code to find subsets of images or voxels to annotate

A package that connects the Habitat simulator with the ROS interface is also included.