An elegant PyTorch deep reinforcement learning library.
Google DeepMind's software stack for physics-based simulation and Reinforcement Learning environments, using MuJoCo.
PyTorch implementations of Advantage Actor-Critic (A2C), Proximal Policy Optimization (PPO), ACKTR (a scalable trust-region method for deep reinforcement learning using Kronecker-factored approximation), and Generative Adversarial Imitation Learning (GAIL).
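As a point of reference for the PPO algorithm listed above, its clipped surrogate objective can be sketched in a few lines of PyTorch. This is a generic, minimal sketch of the standard formulation, not code from that repository; all tensor and function names are illustrative.

```python
import torch

def ppo_clip_loss(new_log_probs, old_log_probs, advantages, clip_eps=0.2):
    """Standard PPO clipped surrogate loss (to be minimized)."""
    # Probability ratio between the current policy and the behavior policy.
    ratio = torch.exp(new_log_probs - old_log_probs)
    unclipped = ratio * advantages
    clipped = torch.clamp(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
    # Pessimistic bound: element-wise minimum, negated for gradient descent.
    return -torch.min(unclipped, clipped).mean()
```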
OpenDILab Decision AI Engine: a comprehensive reinforcement learning framework.
Reinforcement Learning Coach by Intel AI Lab enables easy experimentation with state-of-the-art reinforcement learning algorithms.
A collection of robotics environments geared towards benchmarking multi-task and meta-reinforcement learning.
MyoSuite is a collection of environments/tasks to be solved by musculoskeletal models simulated with the MuJoCo physics engine and wrapped in the OpenAI gym API.
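Because MyoSuite exposes its tasks through the classic OpenAI Gym API, interacting with an environment follows the usual reset/step loop. Below is a minimal sketch, assuming that importing myosuite registers the environments; the environment ID is illustrative and should be checked against the MyoSuite documentation.

```python
import gym
import myosuite  # noqa: F401 -- assumed to register MyoSuite environments on import

# Environment ID is a placeholder; see the MyoSuite docs for actual task names.
env = gym.make("myoElbowPose1D6MRandom-v0")
obs = env.reset()
for _ in range(100):
    action = env.action_space.sample()          # random actions as a stand-in policy
    obs, reward, done, info = env.step(action)  # classic Gym 4-tuple step
    if done:
        obs = env.reset()
env.close()
```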
Python library for Reinforcement Learning.
XuanCe: A Comprehensive and Unified Deep Reinforcement Learning Library
Unified Reinforcement Learning Framework
A collection of robotics simulation environments for reinforcement learning
[CoRL '23] Dexterous piano playing with deep reinforcement learning.
A unified framework for robot learning
A MuJoCo/Gym environment for robot control using Reinforcement Learning; the agent's task is pixel-wise prediction of grasp success probabilities.
This repository contains model-free deep reinforcement learning algorithms implemented in PyTorch.
PyTorch implementation of Trust Region Policy Optimization
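For reference, the standard TRPO update (Schulman et al., 2015) maximizes a surrogate objective subject to a trust-region constraint on the KL divergence between the old and new policies; this is the general formulation, not a statement about this particular implementation:

$$
\max_\theta \; \mathbb{E}_{s,a \sim \pi_{\theta_\text{old}}}\!\left[\frac{\pi_\theta(a \mid s)}{\pi_{\theta_\text{old}}(a \mid s)}\, \hat{A}(s,a)\right]
\quad \text{s.t.} \quad
\mathbb{E}_{s}\!\left[D_\mathrm{KL}\!\big(\pi_{\theta_\text{old}}(\cdot \mid s)\,\|\,\pi_\theta(\cdot \mid s)\big)\right] \le \delta
$$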
Python inverse kinematics based on MuJoCo
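To illustrate what Jacobian-based inverse kinematics with the official mujoco Python bindings can look like, here is a generic damped least-squares sketch under stated assumptions; it is not this repository's API, and the model path and site name are placeholders.

```python
import numpy as np
import mujoco

# Model path and site name are illustrative placeholders.
model = mujoco.MjModel.from_xml_path("robot.xml")
data = mujoco.MjData(model)
site_id = model.site("end_effector").id
target = np.array([0.3, 0.0, 0.5])  # desired site position in world coordinates

for _ in range(200):
    mujoco.mj_forward(model, data)
    err = target - data.site_xpos[site_id]
    jacp = np.zeros((3, model.nv))
    jacr = np.zeros((3, model.nv))
    mujoco.mj_jacSite(model, data, jacp, jacr, site_id)
    # Damped pseudo-inverse to stay well-behaved near singularities.
    dq = jacp.T @ np.linalg.solve(jacp @ jacp.T + 1e-4 * np.eye(3), err)
    data.qpos[:model.nv] += 0.5 * dq  # assumes all joints are 1-DoF (hinge/slide), i.e. nq == nv
```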
Reinforcement learning algorithms for MuJoCo tasks
Deep RL algorithm implementations, easy to read and understand, in PyTorch and TensorFlow 2 (DQN, REINFORCE, VPG, A2C, TRPO, PPO, DDPG, TD3, SAC).