Commit e77bda9: clone from ghe
ziqi.pang committed Mar 8, 2021 (1 parent: 680b915)
79 changed files with 5,037 additions and 1 deletion.

README.md: 162 additions, 1 deletion
# LiDAR_SOT
# Tracking and Estimation

## 1 Introduction

This project concerns the tracking and state estimation of objects in point cloud sequences. The inputs to the algorithm are the starting location of an object (in the form of a 3D bounding box) and a point cloud sequence. Our tracker then provides a bounding box on each subsequent point cloud frame, as visualized in the figure below (the tracking results are visualized at an interval of 20 frames).

![Visualization of tracking (at an interval of 20 frames)](./imgs/tracking_visualization.png)

This `README` file describes the most basic usages of this code base. For more details, please refer to:

* [Data Preprocessing](./docs/data_preprocessing.md): It describes the process of converting the raw data in the Waymo Open Dataset into more handy forms.
* [Benchmark](./docs/benchmark.md): It explains the process of selecting tracklets and constructing our benchmark. Note that the benchmark information is already in `./benchmark/` and you may use it directly; the code for benchmark construction is provided for verification purposes.
* [Design](./docs/design.md): This documentation explains our design choices for the implementation. Reading it is useful for understanding our tracker implementation and for modifying it for your own purposes.
* [Model Configs](./docs/configs.md): We use `config.yaml` to specify the behaviour of the tracker. Please refer to this documentation for a detailed explanation.
* [Toolkit](./docs/toolkit.md): Along with this project, we also provide several code snippets for visualizing the tracking results. This file discusses the toolkits we have created.

## 2 SOT API and Inference

### 2.1 Tracking API

The main API, `tracker_api`, is in `main.py`. In the default case, it takes the model configuration, the beginning bounding box, and a data loader as input, and outputs the tracking result as specified below. Some additional guidelines on this API are:

* `data_loader` is an iterator reading the data. On each iteration, it returns a dictionary, with the keys `pc` (point cloud) and `ego` (the transformation matrix to the world coordinate) as compulsory. An example of `data_loader` is in [example_loader](./data_loader/example_loader.py).
* When you want to compare the tracking results with the ground truth during tracking, provide the input argument `gts` (a list of `sot_3d.data_protos.BBox`) and import the function `compare_to_gt`.
* We also provide a handy visualization tool. Import `Visualizer2D` and `frame_result_visualization` from `sot_3d.visualization` for a frame-level BEV visualization.

```Python
import sot_3d
from sot_3d.data_protos import BBox
from sot_3d.visualization import Visualizer2D


def tracker_api(configs, id, start_bbox, start_frame, data_loader, track_len, gts=None, visualize=False):
"""
Args:
configs: model configuration read from config.yaml
id (str): each tracklet has an id
start_bbox ([x, y, z, yaw, l, w, h]): the beginning location of this id
data_loader (an iterator): iterator returning data of each incoming frame
track_len: number of frames in the tracklet
Return:
{
frame_number0: {'bbox0': previous frame result, 'bbox1': current frame result, 'motion': estimated motion}
frame_number1: ...
...
frame_numberN: ...
}
"""
```
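As a reference, a hypothetical minimal `data_loader` satisfying the iterator contract above might look like the sketch below. It uses synthetic point clouds and identity ego poses, and returns only the compulsory `pc` and `ego` keys; see [example_loader](./data_loader/example_loader.py) for the real one.

```python
import numpy as np


class MinimalLoader:
    """A hypothetical minimal data_loader: an iterator that, on each step,
    returns a dict with the compulsory keys `pc` (point cloud) and `ego`
    (transformation matrix to the world coordinate)."""

    def __init__(self, point_clouds, ego_poses):
        self._frames = list(zip(point_clouds, ego_poses))
        self._index = 0

    def __iter__(self):
        return self

    def __next__(self):
        if self._index >= len(self._frames):
            raise StopIteration
        pc, ego = self._frames[self._index]
        self._index += 1
        return {'pc': pc, 'ego': ego}


# Synthetic data: two frames, each an (N, 3) point cloud and a 4x4 ego pose.
pcs = [np.random.rand(100, 3) for _ in range(2)]
egos = [np.eye(4) for _ in range(2)]
frames = [frame for frame in MinimalLoader(pcs, egos)]
```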

### 2.2 Evaluation API

The APIs for evaluation are in `evaluation/evaluation.py`. `tracklet_acc` and `tracklet_rob` compute the accuracy and robustness given the IoUs in a tracklet, and `metrics_from_bboxes` handles the case where the inputs are raw bounding boxes. Note that the bounding boxes are in the format of `sot_3d.data_protos.BBox`.

```Python
def tracklet_acc(ious):
    """ compute the accuracy of a tracklet
    """
    ...

def tracklet_rob(ious, thresholds):
    """ compute the robustness of a tracklet
    """
    ...

def metrics_from_bboxes(pred_bboxes, gts):
    """ compute the accuracy and robustness of a tracklet
        Args:
            pred_bboxes (list of BBox)
            gts (list of BBox)
        Return:
            accuracy, robustness, length of tracklet
    """
    ...
```
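For intuition only, here is a hedged sketch of what such metrics could compute. The mean-IoU accuracy and the threshold-averaged robustness below are assumptions for illustration, not the repository's authoritative definitions; the exact formulas live in `evaluation/evaluation.py`.

```python
import numpy as np


def tracklet_acc_sketch(ious):
    # Assumed definition: accuracy as the mean IoU over the tracklet.
    return float(np.mean(ious))


def tracklet_rob_sketch(ious, thresholds):
    # Assumed definition: for each threshold, the fraction of frames whose
    # IoU exceeds it; robustness as the average of these success rates.
    ious = np.asarray(ious)
    rates = [(ious > t).mean() for t in thresholds]
    return float(np.mean(rates))


ious = [0.9, 0.8, 0.4, 0.0]
acc = tracklet_acc_sketch(ious)                   # mean IoU: 0.525
rob = tracklet_rob_sketch(ious, [0.3, 0.5, 0.7])  # mean of 0.75, 0.5, 0.5
```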

## 3 Building Up the Benchmark

Our `LiDAR-SOT` benchmark selects 1172 tracklets from the validation set of Waymo Open Dataset. These tracklets satisfy the requirements of mobility, length, and meaningful initialization.

The information on the selected tracklets is in `./benchmark/`. Each `json` file stores the ids, segment names, and frame intervals of the selected tracklets. To replicate the construction of this benchmark, please refer to [this documentation](./docs/benchmark.md).
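Each entry can be consumed with plain `json` parsing. For example (the sample entry below copies the first tracklet from the cyclist `bench_list.json`):

```python
import json

# One entry copied from ./benchmark/cyclist/bench_list.json; each record
# carries an object id, a category type, the Waymo segment name, and the
# frame interval of the tracklet.
sample = json.loads(
    '[{"id": "MIrCL1FTtMdyvgXwmAsPDQ", "type": 4,'
    ' "segment_name": "segment-15724298772299989727_5386_410_5406_410_with_camera_labels",'
    ' "frame_range": [53, 196]}]'
)

for tracklet in sample:
    start, end = tracklet['frame_range']
    length = end - start + 1  # number of frames covered by the tracklet
```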

## 4 Steps for Inference/Evaluation on the Benchmark

### 4.1 Data Preparation

Please follow the guidelines in [Data Preprocessing](./docs/data_preprocessing.md). Suppose your root directory is `DATA_ROOT`.

### 4.2 Running on the benchmark

The command for running inference is as follows. Note that there are also some other arguments; please refer to `main.py` for more details.

```bash
python main.py \
--name NAME \ # The NAME for your experiment.
--bench_list your_tracklet_list \ # The path for your benchmark tracklets. By default at ./benchmark/bench_list.json.
--data_folder DATA_ROOT \ # The location to store your datasets.
--result_folder result_folder \ # Where you store the results of each tracklet.
--process process_number \ # Use multiple processes to split the dataset and accelerate inference.
```

After this, you may access the result for tracklet `ID` at:

```
-- result_folder
-- NAME
-- summary
-- ID.json
{
frame_index0: {'bbox0': ..., 'bbox1': ..., 'motion': ...,
'gt_bbox0': ..., 'gt_bbox1': ..., 'gt_motion': ...,
'iou2d': ..., 'iou3d': ...}
frame_index1: ...
frame_indexN: ...
}
```
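A short sketch of consuming such an `ID.json` follows; all numbers below are placeholder values in the documented format (`bbox` as `[x, y, z, yaw, l, w, h]`, `motion` values invented for illustration), not real tracking output.

```python
import json
import os
import tempfile

# A hypothetical summary shaped like ID.json above, with two frames.
summary = {
    "0": {"bbox0": [0.0, 0, 0, 0, 4, 2, 1.5], "bbox1": [0.5, 0, 0, 0, 4, 2, 1.5],
          "motion": [0.5, 0, 0, 0], "iou2d": 0.9, "iou3d": 0.85},
    "1": {"bbox0": [0.5, 0, 0, 0, 4, 2, 1.5], "bbox1": [1.0, 0, 0, 0, 4, 2, 1.5],
          "motion": [0.5, 0, 0, 0], "iou2d": 0.8, "iou3d": 0.75},
}

# Round-trip through a file, as you would with result_folder/NAME/summary/ID.json.
path = os.path.join(tempfile.mkdtemp(), 'ID.json')
with open(path, 'w') as f:
    json.dump(summary, f)

with open(path) as f:
    result = json.load(f)

# Average the per-frame 3D IoU over the tracklet.
mean_iou3d = sum(frame['iou3d'] for frame in result.values()) / len(result)
```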

### 4.3 Evaluation

For computing the accuracy and robustness of tracklets, use the following code:

```bash
python evaluation/evaluation.py \
--name NAME \ # the name of the experiment
--result_folder result_folder \ # result folder
--data_folder DATA_ROOT \ # root directory storing the dataset
--bench_list_folder benchmark_list_folder \ # directory for benchmark tracklet information, by default the ./benchmark/
--iou # use this flag if the IoU has already been computed during inference
```

For the evaluation of shapes, use the following code:

```bash
python evaluation/evaluation.py \
--name NAME \ # the name of the experiment
--result_folder result_folder \ # result folder
--data_folder DATA_ROOT \ # root directory storing the dataset
--bench_list_folder benchmark_list_folder \ # directory for benchmark tracklet information, by default the ./benchmark/
--process process_number # Use multiple processes to split the dataset and accelerate evaluation.
```

## 5 Environment

For inference on the dataset with our tracker, the following libraries are required:
```
numpy, scikit-learn, numba, scipy
```

If evaluation against the ground truth is involved, please install the `shapely` library for IoU computation.
```
shapely (for iou computation)
```

Data preprocessing on the Waymo Open Dataset requires:
```
waymo_open_dataset
```

Our visualization toolkit requires:
```
matplotlib, open3d, pangolin
```
benchmark/cyclist/bench_list.json: 1 addition
[{"id": "MIrCL1FTtMdyvgXwmAsPDQ", "type": 4, "segment_name": "segment-15724298772299989727_5386_410_5406_410_with_camera_labels", "frame_range": [53, 196]}, {"id": "pdStnRmAxYhitmrMgpV7uA", "type": 4, "segment_name": "segment-14300007604205869133_1160_000_1180_000_with_camera_labels", "frame_range": [13, 198]}, {"id": "QL7OyPI_U-bq2umJr6WBww", "type": 4, "segment_name": "segment-17791493328130181905_1480_000_1500_000_with_camera_labels", "frame_range": [0, 158]}, {"id": "pvrq19AylPM9NTRY1h6oYg", "type": 4, "segment_name": "segment-10289507859301986274_4200_000_4220_000_with_camera_labels", "frame_range": [42, 195]}, {"id": "mEGuJyvssVOJAd1a7zYE-w", "type": 4, "segment_name": "segment-2736377008667623133_2676_410_2696_410_with_camera_labels", "frame_range": [0, 196]}, {"id": "NiUeKnn9M_0yFc_CITiwEw", "type": 4, "segment_name": "segment-3577352947946244999_3980_000_4000_000_with_camera_labels", "frame_range": [0, 198]}, {"id": "6ZiyBCvMUBpIEu_f3CWJqw", "type": 4, "segment_name": "segment-13356997604177841771_3360_000_3380_000_with_camera_labels", "frame_range": [0, 197]}, {"id": "Jvv4xtf6JlFkAdSfj4XuEw", "type": 4, "segment_name": "segment-6707256092020422936_2352_392_2372_392_with_camera_labels", "frame_range": [0, 197]}, {"id": "u5WiV1L2mBbHsFKggf2zMQ", "type": 4, "segment_name": "segment-10289507859301986274_4200_000_4220_000_with_camera_labels", "frame_range": [0, 141]}, {"id": "zt9K06VDb_KPKnHBdk7pXA", "type": 4, "segment_name": "segment-89454214745557131_3160_000_3180_000_with_camera_labels", "frame_range": [0, 104]}, {"id": "HUp9Se0iK_lH2LXr0erXOQ", "type": 4, "segment_name": "segment-8956556778987472864_3404_790_3424_790_with_camera_labels", "frame_range": [0, 106]}, {"id": "c5jYpKlhqtLJgQKT1tg_Xg", "type": 4, "segment_name": "segment-4612525129938501780_340_000_360_000_with_camera_labels", "frame_range": [11, 198]}, {"id": "AjPJMmO5GC99OAa4Mi3oJQ", "type": 4, "segment_name": "segment-8506432817378693815_4860_000_4880_000_with_camera_labels", "frame_range": 
[0, 197]}, {"id": "-LzWji5byBLxqIzMY9AtLA", "type": 4, "segment_name": "segment-30779396576054160_1880_000_1900_000_with_camera_labels", "frame_range": [0, 110]}, {"id": "gsUECA1e-dE7fFijlAbYyQ", "type": 4, "segment_name": "segment-17694030326265859208_2340_000_2360_000_with_camera_labels", "frame_range": [0, 158]}, {"id": "2x0ztbwqkK8Spm2yC2qXmA", "type": 4, "segment_name": "segment-89454214745557131_3160_000_3180_000_with_camera_labels", "frame_range": [0, 198]}, {"id": "PxvhIll-_bc7C7gWJKmWTA", "type": 4, "segment_name": "segment-12496433400137459534_120_000_140_000_with_camera_labels", "frame_range": [0, 196]}, {"id": "byYK0xnM4Eypj6oykj2SSA", "type": 4, "segment_name": "segment-15611747084548773814_3740_000_3760_000_with_camera_labels", "frame_range": [0, 117]}, {"id": "NdU2sOgtnZBiyHyDcitfaQ", "type": 4, "segment_name": "segment-8506432817378693815_4860_000_4880_000_with_camera_labels", "frame_range": [85, 197]}, {"id": "BhLBPQjlfCE8-m_y6ykkdA", "type": 4, "segment_name": "segment-6324079979569135086_2372_300_2392_300_with_camera_labels", "frame_range": [7, 196]}, {"id": "_d0ZqY-3HmJMx3qWznO0qg", "type": 4, "segment_name": "segment-8133434654699693993_1162_020_1182_020_with_camera_labels", "frame_range": [24, 197]}, {"id": "7K5-Qf9RbsdjC5fajmeiNg", "type": 4, "segment_name": "segment-5183174891274719570_3464_030_3484_030_with_camera_labels", "frame_range": [63, 198]}, {"id": "_BBR4MmKONZwfvzxUjV-kw", "type": 4, "segment_name": "segment-9231652062943496183_1740_000_1760_000_with_camera_labels", "frame_range": [18, 197]}, {"id": "vBQ8nCgyRvZA6OPM3cXRDw", "type": 4, "segment_name": "segment-8888517708810165484_1549_770_1569_770_with_camera_labels", "frame_range": [0, 127]}, {"id": "QiQGqDOG4_Nd-nuJIt6UDQ", "type": 4, "segment_name": "segment-17791493328130181905_1480_000_1500_000_with_camera_labels", "frame_range": [79, 195]}, {"id": "OvEYS-FSHvmmnsnfZFMdbQ", "type": 4, "segment_name": "segment-13356997604177841771_3360_000_3380_000_with_camera_labels", 
"frame_range": [0, 197]}, {"id": "rnv3rMVHXX13fv3HzRmDhw", "type": 4, "segment_name": "segment-8506432817378693815_4860_000_4880_000_with_camera_labels", "frame_range": [73, 197]}, {"id": "fDF1c7X8Hd-Awm1YrEyx2g", "type": 4, "segment_name": "segment-6324079979569135086_2372_300_2392_300_with_camera_labels", "frame_range": [4, 196]}, {"id": "Vk_q_y5fogqZxCTRUiPj_w", "type": 4, "segment_name": "segment-9114112687541091312_1100_000_1120_000_with_camera_labels", "frame_range": [52, 196]}, {"id": "MbA0aCWd_ZyT_ozH9xldnQ", "type": 4, "segment_name": "segment-11406166561185637285_1753_750_1773_750_with_camera_labels", "frame_range": [49, 197]}, {"id": "6ujiX_MoSoycVBJaAXP7mA", "type": 4, "segment_name": "segment-6324079979569135086_2372_300_2392_300_with_camera_labels", "frame_range": [45, 186]}, {"id": "2gfzH2GUJ2GfjNW3EYHDiA", "type": 4, "segment_name": "segment-17791493328130181905_1480_000_1500_000_with_camera_labels", "frame_range": [42, 195]}, {"id": "iXXWDtTk8ZulKstE2Mt4jg", "type": 4, "segment_name": "segment-14300007604205869133_1160_000_1180_000_with_camera_labels", "frame_range": [58, 196]}, {"id": "im06oZbIWP0Rf3znT47lDw", "type": 4, "segment_name": "segment-2834723872140855871_1615_000_1635_000_with_camera_labels", "frame_range": [63, 197]}, {"id": "lbiZElbmCnWoUW4C2edelQ", "type": 4, "segment_name": "segment-13356997604177841771_3360_000_3380_000_with_camera_labels", "frame_range": [0, 197]}, {"id": "Nl3ieJVX0UFQTi1sph9u8Q", "type": 4, "segment_name": "segment-8079607115087394458_1240_000_1260_000_with_camera_labels", "frame_range": [0, 197]}, {"id": "Y4rNSKB-LCmf3Ywzzahpkw", "type": 4, "segment_name": "segment-1071392229495085036_1844_790_1864_790_with_camera_labels", "frame_range": [55, 190]}, {"id": "7pijD2U4jl0Uq20N9pBkag", "type": 4, "segment_name": "segment-4409585400955983988_3500_470_3520_470_with_camera_labels", "frame_range": [78, 198]}, {"id": "Bp7Pa_itJVEbE2WvMaxndQ", "type": 4, "segment_name": 
"segment-2834723872140855871_1615_000_1635_000_with_camera_labels", "frame_range": [63, 197]}, {"id": "8batKB8o3gSWyeXujHG4Qw", "type": 4, "segment_name": "segment-9243656068381062947_1297_428_1317_428_with_camera_labels", "frame_range": [1, 186]}, {"id": "6pjSOWmCzbD3Dii9t5IdJQ", "type": 4, "segment_name": "segment-8679184381783013073_7740_000_7760_000_with_camera_labels", "frame_range": [98, 197]}, {"id": "Y_kUwxXnBdc6vKe9v6iXwg", "type": 4, "segment_name": "segment-2834723872140855871_1615_000_1635_000_with_camera_labels", "frame_range": [48, 197]}, {"id": "GfXU1CLUdtmSoMg_jplr1g", "type": 4, "segment_name": "segment-14931160836268555821_5778_870_5798_870_with_camera_labels", "frame_range": [62, 180]}, {"id": "X20PunQVdDPVTbuKW6pTTA", "type": 4, "segment_name": "segment-4490196167747784364_616_569_636_569_with_camera_labels", "frame_range": [74, 181]}, {"id": "gYdbBKgDKJY5j99n3cWL2A", "type": 4, "segment_name": "segment-2834723872140855871_1615_000_1635_000_with_camera_labels", "frame_range": [55, 197]}, {"id": "s-kPllw7x2DVEDu_xv39xA", "type": 4, "segment_name": "segment-1071392229495085036_1844_790_1864_790_with_camera_labels", "frame_range": [80, 190]}, {"id": "tArE8J9Amj4ncpW8B1GrcA", "type": 4, "segment_name": "segment-6324079979569135086_2372_300_2392_300_with_camera_labels", "frame_range": [1, 180]}, {"id": "j4lvawngtdRmH7n17FF-0g", "type": 4, "segment_name": "segment-6324079979569135086_2372_300_2392_300_with_camera_labels", "frame_range": [3, 196]}, {"id": "d5NMapF1BeOQBFgUbf0N5g", "type": 4, "segment_name": "segment-13356997604177841771_3360_000_3380_000_with_camera_labels", "frame_range": [33, 166]}, {"id": "nmuO4ySw8pg7SIFN7WEMRQ", "type": 4, "segment_name": "segment-5574146396199253121_6759_360_6779_360_with_camera_labels", "frame_range": [68, 195]}, {"id": "pScXRPJElkC4i8Srlidbbw", "type": 4, "segment_name": "segment-2736377008667623133_2676_410_2696_410_with_camera_labels", "frame_range": [80, 189]}, {"id": "cclKYJwtUDfmvwBwWf3J1g", "type": 4, 
"segment_name": "segment-8506432817378693815_4860_000_4880_000_with_camera_labels", "frame_range": [64, 197]}, {"id": "yJvJohO6E1bj1Nl4d-D9lA", "type": 4, "segment_name": "segment-9231652062943496183_1740_000_1760_000_with_camera_labels", "frame_range": [20, 197]}]
benchmark/cyclist/easy.json: 1 addition
[{"id": "MIrCL1FTtMdyvgXwmAsPDQ", "type": 4, "segment_name": "segment-15724298772299989727_5386_410_5406_410_with_camera_labels", "frame_range": [53, 196]}, {"id": "pdStnRmAxYhitmrMgpV7uA", "type": 4, "segment_name": "segment-14300007604205869133_1160_000_1180_000_with_camera_labels", "frame_range": [13, 198]}, {"id": "QL7OyPI_U-bq2umJr6WBww", "type": 4, "segment_name": "segment-17791493328130181905_1480_000_1500_000_with_camera_labels", "frame_range": [0, 158]}, {"id": "pvrq19AylPM9NTRY1h6oYg", "type": 4, "segment_name": "segment-10289507859301986274_4200_000_4220_000_with_camera_labels", "frame_range": [42, 195]}, {"id": "mEGuJyvssVOJAd1a7zYE-w", "type": 4, "segment_name": "segment-2736377008667623133_2676_410_2696_410_with_camera_labels", "frame_range": [0, 196]}, {"id": "NiUeKnn9M_0yFc_CITiwEw", "type": 4, "segment_name": "segment-3577352947946244999_3980_000_4000_000_with_camera_labels", "frame_range": [0, 198]}, {"id": "6ZiyBCvMUBpIEu_f3CWJqw", "type": 4, "segment_name": "segment-13356997604177841771_3360_000_3380_000_with_camera_labels", "frame_range": [0, 197]}, {"id": "Jvv4xtf6JlFkAdSfj4XuEw", "type": 4, "segment_name": "segment-6707256092020422936_2352_392_2372_392_with_camera_labels", "frame_range": [0, 197]}, {"id": "u5WiV1L2mBbHsFKggf2zMQ", "type": 4, "segment_name": "segment-10289507859301986274_4200_000_4220_000_with_camera_labels", "frame_range": [0, 141]}, {"id": "zt9K06VDb_KPKnHBdk7pXA", "type": 4, "segment_name": "segment-89454214745557131_3160_000_3180_000_with_camera_labels", "frame_range": [0, 104]}, {"id": "HUp9Se0iK_lH2LXr0erXOQ", "type": 4, "segment_name": "segment-8956556778987472864_3404_790_3424_790_with_camera_labels", "frame_range": [0, 106]}, {"id": "c5jYpKlhqtLJgQKT1tg_Xg", "type": 4, "segment_name": "segment-4612525129938501780_340_000_360_000_with_camera_labels", "frame_range": [11, 198]}, {"id": "AjPJMmO5GC99OAa4Mi3oJQ", "type": 4, "segment_name": "segment-8506432817378693815_4860_000_4880_000_with_camera_labels", "frame_range": 
[0, 197]}, {"id": "-LzWji5byBLxqIzMY9AtLA", "type": 4, "segment_name": "segment-30779396576054160_1880_000_1900_000_with_camera_labels", "frame_range": [0, 110]}, {"id": "gsUECA1e-dE7fFijlAbYyQ", "type": 4, "segment_name": "segment-17694030326265859208_2340_000_2360_000_with_camera_labels", "frame_range": [0, 158]}, {"id": "2x0ztbwqkK8Spm2yC2qXmA", "type": 4, "segment_name": "segment-89454214745557131_3160_000_3180_000_with_camera_labels", "frame_range": [0, 198]}, {"id": "PxvhIll-_bc7C7gWJKmWTA", "type": 4, "segment_name": "segment-12496433400137459534_120_000_140_000_with_camera_labels", "frame_range": [0, 196]}, {"id": "byYK0xnM4Eypj6oykj2SSA", "type": 4, "segment_name": "segment-15611747084548773814_3740_000_3760_000_with_camera_labels", "frame_range": [0, 117]}]