RL-Traffic-Dynamics

License: MIT

Overview

This is the public code repository for the ICRA 2023 (London) paper, Learning to Influence Vehicles' Routing in Mixed-Autonomy Networks by Dynamically Controlling the Headway of Autonomous Cars, by Xiaoyu Ma and Prof. Negar Mehr from the University of Illinois at Urbana-Champaign.

In this work, we propose that in mixed-autonomy networks, i.e., networks where roads are shared between human-driven and autonomous cars, the headway of autonomous cars can be directly controlled to influence vehicles' routing and reduce congestion. We argue that the headway of autonomous cars, and consequently the capacity of link segments, is not just a fixed design choice; rather, it can be leveraged as an infrastructure control strategy to dynamically regulate capacities.

This code repository is developed as a custom environment on top of OpenAI Gym. By defining the classic Braess traffic network as a Gym env, we employ Stable Baselines3 to apply reinforcement learning. The current implementation provides a one-to-one control scheme for each link in the Braess network. By learning a policy that generates a varying headway for each link, the resulting vehicle distribution across the network is very different from the uncontrolled case (constant headway), as shown below, where red links are more congested and green links are less congested.

[Figure: vehicle distribution in the Braess network under Constant Headway (without control) vs. Varying Headway (with control)]
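For reference, the snippet below is a minimal sketch of how a training run with Stable Baselines3 on such a custom Gym environment can look. The environment id TrafficBraess-v0 and the choice of PPO are placeholders for illustration, not necessarily the exact configuration used in runner.py.

```python
import gym
from stable_baselines3 import PPO

# Hypothetical env id -- substitute the id actually registered by gym_traffic.
env = gym.make("TrafficBraess-v0")

# Train a policy that outputs a headway action for each link of the network.
model = PPO("MlpPolicy", env, verbose=1)
model.learn(total_timesteps=100_000)

# Roll out the learned headway-control policy for one episode.
obs = env.reset()
done = False
while not done:
    action, _ = model.predict(obs, deterministic=True)
    obs, reward, done, info = env.step(action)
```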

The repository includes the files necessary to run the RL_headway_dynamics project on the classic Braess network. Both a local Linux implementation and a Jupyter notebook implementation are included for demonstration purposes.

Local Implementation

The local copy of the code is located in the gym_traffic folder. To use the implementation locally, you need to set up a Linux system and install Python 3.0+.

After that, run `pip install torch` to install PyTorch. Similarly, use pip to install the other required packages listed below before moving on to registering our Gym env.

Required packages

torch, gym, stable_baselines3, wandb

Register the Env

Please register the env file as a self-defined env in Gym before running runner.py. The directory gym-traffic/gym_traffic/envs contains the defined traffic model.

Refer to Registering Envs for how to register the custom environment.
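As a rough sketch under assumed names, registration typically amounts to a register call like the one below (usually placed in the envs package's __init__.py). The id TrafficBraess-v0 and the entry point TrafficEnv are placeholders; use the module and class names actually defined in gym-traffic/gym_traffic/envs.

```python
from gym.envs.registration import register

register(
    id="TrafficBraess-v0",                      # hypothetical env id
    entry_point="gym_traffic.envs:TrafficEnv",  # hypothetical entry-point class
)
```

Once registered, the environment can be created with gym.make("TrafficBraess-v0") from runner.py or a notebook.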

Jupyter Notebook Implementation

If you only plan to use the Jupyter notebook implementation, simply download the notebook. Without registering the env, you can run the notebook in any Linux environment or hosted notebook service (Google Colab, for instance).
