PyTorch implementation of the paper "When Transformer Meets Robotic Grasping: Exploits Context for Efficient Grasp Detection".
This code was developed with Python 3.6 on Ubuntu 16.04. Python requirements can be installed with:
pip install -r requirements.txt
Currently, the Cornell Grasping Dataset, the Jacquard Dataset, and GraspNet 1-Billion are supported.
- Download and extract the Cornell Grasping Dataset.
- Download and extract the Jacquard Dataset.
- GraspNet 1-Billion:
  - The dataset can be downloaded here.
  - Install graspnetAPI following here:

    pip install graspnetAPI
  - We use the settings described here.
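For orientation, the Cornell dataset annotates grasps as oriented rectangles: each `pcd*cpos.txt` (positive) or `pcd*cneg.txt` (negative) file lists four corner points per rectangle, one `x y` pair per line. The parser below is a standalone sketch of that format, not code from this repository; the function name is illustrative:

```python
import numpy as np

def parse_cornell_grasps(text):
    """Parse Cornell-style grasp annotations.

    Each rectangle is 4 consecutive lines of "x y" corner
    coordinates; returns a list of (4, 2) float arrays.
    Rectangles containing NaN coordinates (present in some
    Cornell files) are dropped.
    """
    points = [line.split() for line in text.strip().splitlines()]
    coords = np.array(points, dtype=np.float64)
    rects = coords.reshape(-1, 4, 2)  # group every 4 corners into one rectangle
    return [r for r in rects if not np.isnan(r).any()]

sample = """\
10.0 20.0
30.0 20.0
30.0 40.0
10.0 40.0
"""
rects = parse_cornell_grasps(sample)
print(len(rects), rects[0].shape)  # → 1 (4, 2)
```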
Training is done by the main.py script.
Some basic examples:
# Train on Cornell Dataset
python main.py --dataset cornell
# k-fold cross-validation training
python main_k_fold.py --dataset cornell
# Train on GraspNet 1-Billion
python main_grasp_1b.py
Trained models are saved in output/models by default, with the validation score appended to the filename.
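Because each checkpoint carries its validation score in its filename, the best model can be selected by parsing that suffix. The naming scheme assumed below (a trailing float such as `epoch_30_iou_0.97`) is hypothetical; adjust the regex to the actual filenames in output/models:

```python
import re
from pathlib import Path

def best_checkpoint(model_dir):
    """Return the checkpoint path whose filename ends with the
    highest score.

    Assumes (hypothetically) filenames ending in a float score,
    e.g. epoch_30_iou_0.97 -- adapt the regex to the real scheme.
    """
    score_pattern = re.compile(r"(\d+\.\d+)$")
    best_score, best_path = -1.0, None
    for path in Path(model_dir).iterdir():
        match = score_pattern.search(path.name)
        if match and float(match.group(1)) > best_score:
            best_score, best_path = float(match.group(1)), path
    return best_path
```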
Some basic examples:
# Visualise grasp rectangles
python visualise_grasp_rectangle.py --network <path to trained model>
# Visualise heatmaps
python visulaize_heatmaps.py --network <path to trained model>
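For reference, grasp-rectangle methods in this line of work parameterize a grasp by a center (x, y), a rotation angle, and a gripper opening width/height. The corners drawn by the visualisation scripts can be recovered as sketched below; this is a standalone illustration, independent of the repository's own utilities:

```python
import numpy as np

def grasp_corners(cx, cy, angle, width, height):
    """Corners of an oriented grasp rectangle.

    (cx, cy): center in pixels; angle: rotation in radians;
    width: gripper opening; height: finger width.
    Returns a (4, 2) array of corner coordinates.
    """
    c, s = np.cos(angle), np.sin(angle)
    rotation = np.array([[c, -s], [s, c]])
    # Axis-aligned corners around the origin, then rotate and translate.
    half = np.array([[-width / 2, -height / 2],
                     [ width / 2, -height / 2],
                     [ width / 2,  height / 2],
                     [-width / 2,  height / 2]])
    return half @ rotation.T + np.array([cx, cy])

print(grasp_corners(50, 50, 0.0, 20, 10))
```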
For our ROS implementation of the grasping system, see https://github.com/USTC-ICR/SimGrasp/tree/main/SimGrasp.
The original implementation for running experiments on a Kinova Mico arm can be found in the repository https://github.com/dougsm/ggcnn_kinova_grasping.
Code heavily inspired by and modified from https://github.com/dougsm/ggcnn.
If you find this work helpful, please cite:
@ARTICLE{9810182,
author={Wang, Shaochen and Zhou, Zhangli and Kan, Zhen},
journal={IEEE Robotics and Automation Letters},
title={When Transformer Meets Robotic Grasping: Exploits Context for Efficient Grasp Detection},
year={2022},
volume={},
number={},
pages={1-8},
doi={10.1109/LRA.2022.3187261}}