TPIG

Contents

  • Install
  • Simulation
  • Trajectory-Planning
  • Intelligent-Grasping

Install

System Requirements

  • Ubuntu 22.04.3 LTS
  • CUDA 11.7
  • PyTorch 1.13.0
  • Python 3.10

Library Installation

This project uses cuRobo v0.6.2 and Flexiv RDK v0.9.

Using in Isaac Sim

  • System requirements
  • Installation for intelligent grasping

Simulation

In this project, the simulation scene is constructed in Isaac Sim, with components that match those in the real working environment. The paths for grasping and placing tubes are generated by the cuRobo trajectory-planning algorithm. For using cuRobo in Isaac Sim and adapting it to a new robot, see the instructions below.
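
As a rough illustration of how such a path can be planned with cuRobo (outside Isaac Sim), the sketch below assumes cuRobo v0.6.2's standard MotionGen API; the robot YAML name, world file, and goal pose are placeholders rather than the exact values used in the project scripts.

```python
# Minimal cuRobo motion-generation sketch (assumes cuRobo v0.6.2 on a CUDA-capable GPU).
# "flexiv_rizon4.yml", "collision_table.yml", and the goal pose are placeholders.
from curobo.types.math import Pose
from curobo.types.robot import JointState
from curobo.wrap.reacher.motion_gen import MotionGen, MotionGenConfig, MotionGenPlanConfig

motion_gen_config = MotionGenConfig.load_from_robot_config(
    "flexiv_rizon4.yml",      # robot configuration (placed in cuRobo's robot config folder)
    "collision_table.yml",    # world / collision description
    interpolation_dt=0.01,
)
motion_gen = MotionGen(motion_gen_config)
motion_gen.warmup()

# Start from the robot's retract configuration and plan to a Cartesian goal pose.
start_state = JointState.from_position(motion_gen.get_retract_config().view(1, -1))
goal_pose = Pose.from_list([0.5, 0.0, 0.3, 1.0, 0.0, 0.0, 0.0])  # x, y, z, qw, qx, qy, qz

result = motion_gen.plan_single(start_state, goal_pose, MotionGenPlanConfig(max_attempts=4))
if result.success.item():
    trajectory = result.get_interpolated_plan()  # joint-space trajectory to execute
```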

Example code

Explanation:
  • single_complete.py: a Flexiv robot picks up a cube and places it at another location.
  • novisual.py: the robot, fitted with extended fingertips, picks tubes from one box and places them in another; withplot.py additionally records the joint changes during one of the motions.

To run a script, change into the folder and run, for example:

omni_python novisual.py

The config folder contains the configuration of the Flexiv Rizon 4 robot and the YAML files for the robot and the simulation environment. If you use these YAML files with cuRobo, place them in the folders cuRobo expects.
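
To locate those folders, the snippet below shows one way to resolve cuRobo's config directories, assuming the helper functions in curobo.util_file; the file names are placeholders for the YAML files shipped under config.

```python
# Resolve cuRobo's config folders and load a robot YAML from them.
# File names below are placeholders for the YAML files provided under config/.
from curobo.util_file import (
    get_robot_configs_path,
    get_world_configs_path,
    join_path,
    load_yaml,
)

print(get_robot_configs_path())  # copy the robot YAML files into this folder
print(get_world_configs_path())  # copy the world/scene YAML files into this folder

robot_cfg = load_yaml(join_path(get_robot_configs_path(), "flexiv_rizon4.yml"))["robot_cfg"]
world_cfg = load_yaml(join_path(get_world_configs_path(), "collision_table.yml"))
```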

Trajectory-Planning

The robotic arm is remotely controlled through the Flexiv RDK library. Several examples are provided to try the different control modes.
❕ Change robot_ip and local_ip to your own addresses, and change the RDK package path to your own installation path.
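
A minimal connection sketch, assuming the Flexiv RDK v0.9 Python bindings as used in the official flexiv_rdk examples; robot_ip and local_ip below are placeholders for your own addresses.

```python
# Connect to the robot and switch to primitive-execution mode (Flexiv RDK v0.9 style).
# Replace robot_ip and local_ip with your own addresses.
import time
import flexivrdk

robot_ip = "192.168.2.100"   # placeholder
local_ip = "192.168.2.35"    # placeholder

robot = flexivrdk.Robot(robot_ip, local_ip)
if robot.isFault():          # clear any existing fault before enabling
    robot.clearFault()
    time.sleep(2)

robot.enable()
while not robot.isOperational():
    time.sleep(1)

robot.setMode(flexivrdk.Mode.NRT_PRIMITIVE_EXECUTION)
```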

In the normal tube-picking mode, the robotic arm is controlled using three coordinates, with motion generated by the MoveL primitive. simpleassedit.py picks and places a single tube, while multiassedit.py picks and places a row of tubes. multitestedit.py and multitestendedit.py check the accuracy of the picking and placing positions, respectively.
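
As a rough sketch of how a MoveL-based pick motion can be issued through the RDK (the target poses and velocities below are placeholders, not the project's calibrated positions):

```python
# Execute a Cartesian pick sequence with the MoveL primitive. Assumes the `robot`
# object from the connection sketch above, already in NRT_PRIMITIVE_EXECUTION mode.
import time

def wait_for_target(robot):
    # Poll the primitive state strings until "reachedTarget" is reported as 1.
    while True:
        for state in robot.getPrimitiveStates():
            if "reachedTarget" in state and state.strip().endswith("1"):
                return
        time.sleep(0.5)

# Move above the tube, descend to grasp height, then lift
# (x y z in meters, rx ry rz in degrees; values are placeholders).
robot.executePrimitive("MoveL(target=0.55 -0.20 0.25 180 0 180 WORLD WORLD_ORIGIN, maxVel=0.2)")
wait_for_target(robot)
robot.executePrimitive("MoveL(target=0.55 -0.20 0.12 180 0 180 WORLD WORLD_ORIGIN, maxVel=0.1)")
wait_for_target(robot)
robot.executePrimitive("MoveL(target=0.55 -0.20 0.25 180 0 180 WORLD WORLD_ORIGIN, maxVel=0.2)")
wait_for_target(robot)
```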

In the anomaly-handling mode, the robotic arm is controlled in six degrees of freedom: part of the trajectory is generated by cuRobo, and motion is executed through the joint-position mode and the MoveL primitive. motion_incline_two.py picks a leaning tube, while motion_lie.py picks a lying one. In addition, motion_incline_two.py writes the end-effector trajectory points to a CSV file.
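
A minimal sketch of that CSV logging, assuming the RDK's RobotStates interface with tcpPose given as [x, y, z, qw, qx, qy, qz]; the file name and sampling rate are illustrative, and the actual logging lives in motion_incline_two.py.

```python
# Log end-effector (TCP) poses to a CSV file while the robot moves.
# Assumes the connected `robot` object from above (Flexiv RDK v0.9).
import csv
import time
import flexivrdk

robot_states = flexivrdk.RobotStates()

with open("ee_trajectory.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["t", "x", "y", "z", "qw", "qx", "qy", "qz"])
    t0 = time.time()
    for _ in range(200):                       # ~20 s of sampling at 10 Hz
        robot.getRobotStates(robot_states)     # refresh the state snapshot
        writer.writerow([round(time.time() - t0, 3)] + list(robot_states.tcpPose))
        time.sleep(0.1)
```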

Intelligent-Grasping

GPT-4o is used to recognize images of grasping scenarios. The images are captured by a ZED 2 camera, which is not recommended.
For API usage, see the GPT cookbook.
The paper GPT4Vision-Robot-Manipulation-Prompts can serve as a reference for prompts.

  1. Install the OpenAI Python package and the ZED SDK.
  2. Run comtest.py to recognize the image and apply the corresponding robot actions (robot_actions.py). There are six types of grasping scenarios in total; a minimal recognition sketch is given below.
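
The sketch below assumes the openai Python package (v1 client API) and an image already saved by the camera; the prompt wording, image path, and function name are illustrative, and the real logic lives in comtest.py and robot_actions.py.

```python
# Send a captured image to GPT-4o and return the recognized grasping scenario.
# Prompt text and image path are illustrative placeholders.
import base64
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def classify_scene(image_path: str) -> str:
    with open(image_path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode("utf-8")
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Which of the six grasping scenarios does this image show? "
                         "Answer with the scenario name only."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content

scenario = classify_scene("capture_left.jpg")
print(scenario)  # map this result to an action defined in robot_actions.py
```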
