TF-Agents: A reliable, scalable, and easy-to-use TensorFlow library for Contextual Bandits and Reinforcement Learning.
Materials for the Practical Sessions of the Reinforcement Learning Summer School 2019: Bandits, RL & Deep RL (PyTorch).
A lightweight contextual bandit & reinforcement learning library designed to be used in production Python services.
Another A/B test library
Code associated with the NeurIPS19 paper "Weighted Linear Bandits in Non-Stationary Environments"
A lightweight contextual bandit library for TypeScript/JavaScript.
Thompson Sampling for bandits using a UCB policy (a minimal sketch of both policies appears after this list).
A benchmark for testing decision-making algorithms for contextual bandits. The library implements a variety of algorithms (many based on approximate Bayesian neural networks and Thompson sampling) and a number of real and synthetic data problems exhibiting a diverse set of properties.
Python library of bandits and RL agents in different real-world environments
Python implementations of common RL algorithms using OpenAI Gym environments.
Code for our PRICAI 2022 paper: "Online Learning in Iterated Prisoner's Dilemma to Mimic Human Behavior".
🐯 Replication of "Auction-based combinatorial multi-armed bandit mechanisms with strategic arms".
Collaborative project for documenting ML/DS learnings.
Deep reinforcement learning agents in PyTorch, organized as a modular framework.
Code for our ICDMW 2018 paper: "Contextual Bandit with Adaptive Feature Extraction".
Simple implementations of bandit algorithms in Python.
Code for our AJCAI 2020 paper: "Online Semi-Supervised Learning in Contextual Bandits with Episodic Reward".
A simulation of multi-armed bandit problems, based on the paper at https://arxiv.org/abs/2308.14350.
A Python library for (finite) partial monitoring algorithms.
Play Rock, Paper, Scissors (Kaggle competition) with Reinforcement Learning: bandits, tabular Q-learning and PPO with LSTM.
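Several of the entries above implement Thompson sampling and UCB. For orientation, here is a minimal, self-contained sketch of both policies on a K-armed Bernoulli bandit; the arm probabilities, horizon, and function names are illustrative assumptions, not taken from any listed library:

```python
import numpy as np

rng = np.random.default_rng(0)

def thompson_sampling(true_probs, horizon=10_000):
    """Beta-Bernoulli Thompson sampling: sample a plausible mean per arm
    from its posterior, then play the arm with the highest sample."""
    k = len(true_probs)
    successes = np.ones(k)  # Beta(1, 1) uniform priors
    failures = np.ones(k)
    total_reward = 0
    for _ in range(horizon):
        theta = rng.beta(successes, failures)   # posterior samples per arm
        arm = int(np.argmax(theta))
        reward = rng.random() < true_probs[arm]  # Bernoulli reward
        successes[arm] += reward
        failures[arm] += 1 - reward
        total_reward += reward
    return total_reward

def ucb1(true_probs, horizon=10_000):
    """UCB1: play the arm with the highest optimistic confidence bound."""
    k = len(true_probs)
    counts = np.zeros(k)
    sums = np.zeros(k)
    total_reward = 0
    for t in range(1, horizon + 1):
        if t <= k:                  # play each arm once to initialize
            arm = t - 1
        else:
            bonus = np.sqrt(2 * np.log(t) / counts)
            arm = int(np.argmax(sums / counts + bonus))
        reward = rng.random() < true_probs[arm]
        counts[arm] += 1
        sums[arm] += reward
        total_reward += reward
    return total_reward

probs = [0.3, 0.5, 0.7]  # toy arm probabilities, illustrative only
print("Thompson:", thompson_sampling(probs))
print("UCB1:    ", ucb1(probs))
```

Both policies converge on the 0.7 arm; Thompson sampling explores via posterior randomness, UCB1 via an optimism bonus that shrinks as an arm's pull count grows.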
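Many of the listed libraries are contextual: the policy conditions its arm choice on a feature vector. As a rough sketch of the common baseline, here is disjoint LinUCB (one ridge-regression model per arm) on a synthetic linear environment; the environment, dimensions, and parameter names are illustrative assumptions, not any listed repo's API:

```python
import numpy as np

rng = np.random.default_rng(1)

def linucb(contexts, reward_fn, n_arms, dim, alpha=1.0):
    """Disjoint LinUCB: per-arm ridge regression; play the arm maximizing
    predicted reward plus an exploration bonus from the confidence ellipsoid."""
    A = [np.eye(dim) for _ in range(n_arms)]    # per-arm Gram matrices
    b = [np.zeros(dim) for _ in range(n_arms)]  # per-arm reward vectors
    total = 0.0
    for x in contexts:
        scores = []
        for a in range(n_arms):
            A_inv = np.linalg.inv(A[a])
            theta = A_inv @ b[a]                 # ridge estimate for arm a
            bonus = alpha * np.sqrt(x @ A_inv @ x)
            scores.append(theta @ x + bonus)
        arm = int(np.argmax(scores))
        r = reward_fn(x, arm)
        A[arm] += np.outer(x, x)                 # rank-one posterior update
        b[arm] += r * x
        total += r
    return total

# Toy environment (assumed for illustration): arm a's expected reward
# is a fixed linear function of the context, plus Gaussian noise.
dim, n_arms, T = 5, 3, 2_000
w = rng.normal(size=(n_arms, dim))
contexts = rng.normal(size=(T, dim))
reward = lambda x, a: x @ w[a] + rng.normal(scale=0.1)
print("cumulative reward:", linucb(contexts, reward, n_arms, dim))
```

The `alpha` parameter trades exploration against exploitation; production libraries like those above typically add incremental matrix updates and feature preprocessing on top of this core loop.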