
GSoC 2020 Grasping and Pose Estimation

Repository created for GSoC work on grasping and pose estimation.

Project Title: DNNs for precise manipulation of household objects.

Usage

  • Clone the repository recursively:

    git clone --recurse-submodules https://github.com/robocomp/grasping.git
  • Follow READMEs in each sub-folder for installation and usage instructions.

Folder Structure

  • components: contains all RoboComp interfaces and components.

  • data-collector: contains the code for custom data collection using CoppeliaSim and PyRep.

  • rgb-based-pose-estimation: contains the code for the Segmentation-driven 6D Object Pose Estimation neural network.

  • rgbd-based-pose-estimation: contains the code for the PVN3D neural network as a git submodule.

System Overview

Figure(1): Complete schema for the grasping and pose estimation workflow with DSR.

As shown in the figure, the component workflow is as follows:

  • The viriatoPyrep component streams the RGBD signal from the CoppeliaSim simulator using the PyRep API and publishes it to the shared graph through the viriatoDSR component.

  • The graspDSR component reads the RGBD signal from the shared graph and passes it to the objectPoseEstimation component.

  • The objectPoseEstimation component then performs pose estimation using a DNN and returns the estimated poses.

  • The graspDSR component injects the estimated poses into the shared graph and progressively plans dummy targets for the arm to reach the target object.

  • The viriatoDSR component then reads the dummy target poses from the shared graph and passes them to the viriatoPyrep component.

  • Finally, the viriatoPyrep component uses the poses generated by graspDSR to progressively plan a successful grasp of the object. This loop is sketched below.

For more information on DSR system integration and usage, refer to DSR-INTEGRATION.md.
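To make this loop concrete, here is a minimal sketch of a single graspDSR iteration. It is illustrative only: the helper names (read_rgbd, getObjectPose, set_dummy_target, and friends) are hypothetical stand-ins for the actual DSR graph accessors and RoboComp interfaces.

    # Minimal sketch of one graspDSR iteration. All helper names are
    # hypothetical stand-ins for the real DSR graph and Ice interfaces.
    import numpy as np

    def grasp_dsr_step(graph, pose_estimator, arm_tip_name, step_size=0.05):
        # 1. Read the latest RGBD frame published to the shared graph.
        rgb, depth = graph.read_rgbd("shoulder_camera")      # hypothetical accessor

        # 2. Ask objectPoseEstimation for 6D poses (translation + quaternion).
        poses = pose_estimator.getObjectPose(rgb, depth)     # hypothetical RPC

        # 3. Inject the estimated poses into the shared graph.
        for pose in poses:
            graph.update_object_node(pose.name, pose.trans, pose.quat)

        # 4. Plan the next dummy target: a small step from the arm tip
        #    towards the first estimated object.
        tip = np.asarray(graph.get_position(arm_tip_name))
        target = np.asarray(poses[0].trans)
        offset = target - tip
        dist = np.linalg.norm(offset)
        if dist < 1e-6:
            return  # arm tip already at the target
        dummy = tip + offset * (min(step_size, dist) / dist)  # clamp step length
        graph.set_dummy_target(dummy)                         # picked up by viriatoDSR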

System Demos

System Overview

Our system uses the PyRep API to call embedded Lua scripts in the arm for fast and precise grasping. The provided poses are estimated using DNNs from RGB(D) data collected by the shoulder camera.
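For instance, grabbing a frame from the shoulder camera with PyRep looks roughly like the sketch below; the scene file and sensor name are assumptions, not the project's actual identifiers.

    # Sketch: capture an RGBD frame from the shoulder camera via PyRep.
    # 'scene.ttt' and 'shoulder_camera' are assumed names, not the real ones.
    from pyrep import PyRep
    from pyrep.objects.vision_sensor import VisionSensor

    pr = PyRep()
    pr.launch('scene.ttt', headless=True)
    pr.start()

    cam = VisionSensor('shoulder_camera')
    rgb = cam.capture_rgb()      # float32 array in [0, 1], shape (H, W, 3)
    depth = cam.capture_depth()  # float32 array, shape (H, W)

    pr.stop()
    pr.shutdown()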

First Demo (Path Planning)

This demo verifies the arm's path planning using DNN-estimated poses.

Figure(2): Video of the first grasping demo.

Figure(3): Visualization of the DNN-estimated pose in the first demo.
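A PyRep path-planning call of the kind used in this demo looks roughly as follows. The Panda arm class and the numeric target are stand-ins; the actual demo plans for the robot defined in the project's scene, using the DNN-estimated pose as the target.

    # Sketch: plan and execute a collision-free path to a DNN-estimated pose.
    # Panda and the target values are stand-ins for the project's actual robot.
    import math
    from pyrep import PyRep
    from pyrep.robots.arms.panda import Panda

    pr = PyRep()
    pr.launch('scene.ttt', headless=True)
    pr.start()

    arm = Panda()
    target_position = [0.45, 0.10, 0.80]   # estimated object position (metres)

    # get_path() searches configuration space for a collision-free path.
    path = arm.get_path(position=target_position,
                        euler=[0.0, math.radians(180), 0.0])

    # Step the path and the simulation until the motion completes.
    done = False
    while not done:
        done = path.step()
        pr.step()

    pr.stop()
    pr.shutdown()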

Second Demo (Grasping and Manipulation)

This demo shows the arm's ability to grasp and manipulate a specific object among multiple objects in the scene, using DNN-estimated poses.

Figure(4): Video of the second grasping demo.

Figure(5): Visualization of the DNN-estimated pose in the second demo.
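Selecting one object among several estimates and closing the gripper on it could be sketched as below; the pose mapping, the object name, and the PandaGripper stand-in are all assumptions, not the project's actual code.

    # Sketch: pick the target object's pose out of several DNN estimates,
    # then close the gripper on it. PandaGripper stands in for the robot's
    # gripper; `estimated_poses` is a hypothetical {label: position} mapping.
    from pyrep.objects.shape import Shape
    from pyrep.robots.end_effectors.panda_gripper import PandaGripper

    def grasp_target(pr, estimated_poses, target_name):
        position = estimated_poses[target_name]
        # ... move the arm to `position` as in the path-planning sketch ...

        gripper = PandaGripper()
        # Actuate towards fully closed (0.0) until the motion finishes.
        while not gripper.actuate(0.0, velocity=0.04):
            pr.step()
        # Attach the grasped object so it moves with the gripper.
        gripper.grasp(Shape(target_name))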
