
Articulation Annotator

Angel Chang edited this page Sep 23, 2024 · 1 revision

The articulation annotator allows users to manually annotate a part-segmented 3D mesh with articulation parameters.

The manual articulation annotator was used as a baseline system for Motion Annotation Programs and to annotate articulations for OPD-Real and MultiScan.

For how to set up the database and the server API, see Simple Articulation Annotation Server.

If you use the Articulation Annotator, please consider citing the following work:

@inproceedings{xu2020map,
    author    = "Xianghao Xu, David Charatan, Sonia Raychaudhuri,
                 Hanxiao Jiang, Mae Heitmann, Vladimir Kim,
                 Siddhartha Chaudhuri, Manolis Savva, Angel X. Chang,
                 and Daniel Ritchie",
    title     = "Motion Annotation Programs: A Scalable Approach to
                 Annotating Kinematic Articulations in Large 3D Shape
                 Collections",
    booktitle = "3DV",
    year      = "2020"
}

Preprocessing

Before the articulations can be specified, the mesh needs to have its parts specified. In addition, you can also provide part connectivity information.

Part annotation

Please use either the Part-Annotator or Semantic-Segmentation-Labeler to first annotate parts for your 3D mesh.

See Part Segmentation Format for required segmentation format.

Motion annotation

Annotations are saved to a database. They can be viewed using http://localhost:8010/articulation-annotations/list
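Since the annotation list is served over HTTP, it can also be fetched from the command line (a sketch, assuming the annotation server is running locally on port 8010 as described in the setup instructions):

```shell
# Fetch the list of saved articulation annotations from the local annotation server
# (assumes the server from the setup instructions is running on port 8010)
curl http://localhost:8010/articulation-annotations/list
```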

For how to use the motion annotator, please see https://3dlg-hcvc.github.io/multiscan/read-the-docs/annotation/articulation.html.

Using articulation annotations

Exporting articulation annotations from the database

See ssc/articulations/export-articulations.js
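A hypothetical invocation is sketched below, assuming the script follows the same `NODE_BASE_URL` convention as the other `ssc` scripts on this page; the `--id` flag here is an assumption for illustration, not a documented option, so check the script itself for its actual arguments:

```shell
# Hypothetical example: export annotations for one model from the local server.
# NODE_BASE_URL follows the convention used by the other ssc scripts on this page;
# the --id flag is an assumption and may differ in the actual script.
NODE_BASE_URL=http://localhost:8010 ${STK}/ssc/articulations/export-articulations.js --id <fullModelId>
```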

Exporting articulated assets as GLTF and URDF

Export to GLTF

Use ssc/articulations/export-parts-gltf.js to export a single GLTF that has the articulation parameters embedded within it. Example usage (for a single object, inferring missing geometry for drawers):

NODE_BASE_URL=http://localhost:8010  ${STK}/ssc/articulations/export-parts-gltf.js --id <fullModelId> --output <name> --ensure_connectivity false --as_scene false --infer_geometry true

The exported GLB file reorganizes the parts into a tree structure. Parts attached to a moveable part are grouped together with that moveable part.

| field | description |
| --- | --- |
| isStatic | Whether this node corresponds to a static part |
| isArticulatedNode | Whether this node corresponds to a node that can be articulated |
| partId | The id of the part |
| articulatablePartId | The id of the part that articulates this part (e.g. if this part is a handle of a drawer, then the articulatablePartId will be the partId of the drawer) |
| articulation | Information about how the part moves |

Export to URDF

Use ssc/articulations/articulated-gltf-to-urdf.js to convert the exported articulated GLTF model (see above) to URDF.

Example usage:

${STK}/ssc/articulations/articulated-gltf-to-urdf.js --input <filename.glb>

Different mesh formats can be specified using --output_mesh_format. Supported formats are obj and glb.
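For example, to write the part meshes as obj instead:

```shell
# Convert an articulated GLB to URDF, exporting the part meshes as obj
${STK}/ssc/articulations/articulated-gltf-to-urdf.js --input <filename.glb> --output_mesh_format obj
```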

Viewing articulations in model-viewer

After exporting the articulated GLB file, use the model-viewer to load and view the articulations.

Rendering articulated objects

Use ssc/articulations/render-articulations.js (after exporting articulated GLB)

This script will render pngs of each articulated part as it moves. It then uses ffmpeg to convert the rendered pngs to webp and mp4, and ImageMagick's convert to produce a gif.
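The script runs these conversions for you; for reference, the manual equivalents look roughly like the sketch below (the frame naming pattern frame_%04d.png and the frame rate are assumptions, as the actual filenames produced by the script may differ):

```shell
# Sketch of the conversions the script performs (frame naming and rate are assumptions):
ffmpeg -framerate 30 -i frame_%04d.png -c:v libwebp -loop 0 articulation.webp   # animated webp
ffmpeg -framerate 30 -i frame_%04d.png -pix_fmt yuv420p articulation.mp4       # mp4 video
convert -delay 4 -loop 0 frame_*.png articulation.gif                          # gif via ImageMagick
```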

Example to render articulation of an articulated GLB:

NODE_BASE_URL=http://localhost:8010/ ${STK}/ssc/articulations/render-articulations.js --input <filename.glb> --inputType path --video_format webp --clear_pngs