Active Learning For Segmentation

Paper and Video

If you find this package useful for your research, please consider citing our paper:

  • René Zurbrügg, Hermann Blum, Cesar Cadena, Roland Siegwart and Lukas Schmid. "Embodied Active Domain Adaptation for Semantic Segmentation via Informative Path Planning" in IEEE Robotics and Automation Letters (RA-L), 2022. [ArXiv | Video]

    @article{Zurbrgg2022EmbodiedAD,
      title={Embodied Active Domain Adaptation for Semantic Segmentation via Informative Path Planning},
      author={R. {Zurbr{\"u}gg} and H. {Blum} and C. {Cadena} and R. {Siegwart} and L. {Schmid}},
      journal={ArXiv},
      year={2022},
      volume={abs/2203.00549}
    }

Reproducing the Experiments

Installation

Installation instructions for Linux. It is expected that Unreal Engine is already installed!

Prerequisites

  1. If you have not done so already, install ROS (Desktop-Full is recommended).

  2. If you have not done so already, create a catkin workspace with catkin tools:

sudo apt-get install python3-catkin-tools
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws
catkin init
catkin config --extend /opt/ros/noetic  # exchange noetic for your ROS distro if necessary
catkin config --cmake-args -DCMAKE_BUILD_TYPE=Release
catkin config --merge-devel
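
To verify the workspace picked up this configuration, catkin tools can print the active profile (a quick optional check; the expectation below is a sketch based on the commands above):

# Optional: print the active profile; the extend path and CMake build type
# should match the values configured above.
catkin config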

Installation

  1. Move to your catkin workspace:
cd ~/catkin_ws/src
  2. Install system dependencies:
sudo apt-get install python3-wstool python3-catkin-tools
  3. Download the repo using an SSH key or via HTTPS:
git clone https://github.com/ethz-asl/active_learning_for_segmentation
  4. Download dependencies:

New Workspace

wstool init . ./embodied_active_learning/dependencies.rosinstall
wstool update

Existing Workspace

wstool merge -t . ./embodied_active_learning/dependencies.rosinstall
wstool update
  5. Download additional dependencies:
wstool merge -t . ./mav_active_3d_path_planning/mav_active_3d_planning_https.rosinstall
wstool update
wstool merge -t . ./panoptic_mapping/panoptic_mapping.rosinstall
wstool update
  6. Install AirSim:
cd ../AirSim
./setup.sh
  7. Install pip dependencies:
    cd ../active_learning_for_segmentation/embodied_active_learning
    pip install -e . 
    cd ../embodied_active_learning_core
    pip install -e . 
    cd ../volumetric_labeling
    pip install -e .
    cd ../../light-weight-refinenet
    pip install -e .
    cd ../densetorch
    pip install -e .
  8. Source and compile:
    source ../../devel/setup.bash
    catkin build embodied_active_learning
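
After the build finishes, a quick sanity check can confirm the setup (a minimal sketch; the Python module name is an assumption based on the repo's folder layout):

# Confirm one of the editable pip installs is importable (module name assumed
# to match the package folder name).
python3 -c "import embodied_active_learning_core"
# Confirm the catkin package is built and on the ROS package path.
source ~/catkin_ws/devel/setup.bash
rospack find embodied_active_learning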

Example

To run an experiment, first download the Flat Environment here. Second, download the Data Package and extract it to a location of your choice (a short extraction sketch follows the list below).
The Data Package contains the following files:

  • replayset: Subset of the ScanNet dataset, used as the replay buffer.
  • testsets: Selected images with ground-truth annotations, used for evaluation.
  • scannet_50_classes_40_clusters.pth: Checkpoint of the pretrained network and uncertainty estimator.
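
Extraction could look like the following sketch (the archive name and target directory are placeholders; use the actual file you downloaded):

# Extract the Data Package; archive name and destination are assumptions.
mkdir -p ~/eal_data
unzip ~/Downloads/data_package.zip -d ~/eal_data
ls ~/eal_data   # expect: replayset/  testsets/  scannet_50_classes_40_clusters.pth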

Launching the Experiment

  1. Modify the files embodied_active_learning/cfg/experiments/self_supervised/RunScannetDensity.yaml as well as embodied_active_learning/cfg/experiments/mapper/single_tsdf.yaml, updating all paths (/home/rene/...) to your local setup (a path-update sketch follows this list).
  2. Start Unreal Engine
  3. Execute roslaunch embodied_active_learning panoptic_run.launch to start an experiment.
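
For step 1, the placeholder paths in both configs can be rewritten in one go (a sketch, assuming the repo was cloned into ~/catkin_ws/src as above and that /home/rene is the only placeholder prefix):

cd ~/catkin_ws/src/active_learning_for_segmentation/embodied_active_learning/cfg/experiments
# Replace the /home/rene placeholder prefix with your own home directory.
sed -i "s|/home/rene|$HOME|g" self_supervised/RunScannetDensity.yaml mapper/single_tsdf.yaml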

Outputs

This will create the following files during the experiment:

  1. ~/out/<RunName>/checkpoints/best_iteration_X.pth: Checkpoint of the model after each training cycle.
  2. ~/out/<RunName>/online_training/step_XXX: Folder containing all training samples (images with their respective pseudo-labels) that were used for training.
  3. ~/out/<RunName>/poses.csv: CSV file containing all requested robot poses (X,Y,Z,Quaternion).
  4. ~/out/<RunName>/scores.csv: CSV file containing all mIoU scores on the training and test sets.
  5. ~/out/<RunName>/params.yaml: YAML file containing all parameters that were set for this experiment.
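
To monitor a run, the CSV outputs can be inspected directly from a shell (a sketch; substitute <RunName> with whatever name your experiment was given):

RUN=~/out/<RunName>                  # substitute your run name
column -s, -t < "$RUN"/scores.csv    # pretty-print the logged mIoU scores
tail -n 1 "$RUN"/poses.csv           # latest requested pose (X,Y,Z,Quaternion)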

Additional information can be found in the Embodied Active Learning package.

Others

Repository Content

Embodied Active Learning Package

Main package, built on top of all the other packages, for conducting embodied experiments with either AirSim or Gibson.

See here

Embodied Active Learning Core

Contains the main functionality needed by the Embodied Active Learning package:

  • Models
  • Uncertainties
  • Replay / training buffers

See here

Volumetric Labeling

Contains code for volumetric labeling and image selection.

  • Pseudo-label selection
  • Code to find a subset of images or voxels to annotate

See here

Habitat ROS

A package that connects the Habitat simulator with the ROS interface is located here.
