# PyTorch_OLoptim

A PyTorch implementation of various online / stochastic optimization algorithms.

This is a course project for the Online Learning course at BU. Interested readers are referred to *A Modern Introduction to Online Learning* by Prof. Orabona.

If you find this repository helpful, feel free to use it in your research.

## Descriptions

FTRL: Follow the Regularized Leader

- intro: a classic algorithm in online learning
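
For intuition, here is a minimal sketch of FTRL with a fixed L2 regularizer `psi(x) = ||x||^2 / (2 * lr)`, for which the leader over the linearized losses has the closed form `x_{t+1} = -lr * (g_1 + ... + g_t)`. The class name and default stepsize are illustrative, not this repo's API:

```python
import torch
from torch.optim import Optimizer

class FTRLSketch(Optimizer):
    """Minimal FTRL sketch with a fixed L2 regularizer psi(x) = ||x||^2 / (2 * lr)."""

    def __init__(self, params, lr=0.1):
        super().__init__(params, dict(lr=lr))

    @torch.no_grad()
    def step(self):
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is None:
                    continue
                state = self.state[p]
                if "grad_sum" not in state:
                    state["grad_sum"] = torch.zeros_like(p)
                state["grad_sum"].add_(p.grad)             # accumulate g_1 + ... + g_t
                p.copy_(-group["lr"] * state["grad_sum"])  # minimizer of linearized losses + L2
```

Note that the update rewrites the parameters from the cumulative gradient rather than taking an incremental step; with this particular regularizer, FTRL coincides with dual averaging.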

FTML: [ICML 2017] Follow the Moving Leader in Deep Learning

SGDOL: [NeurIPS 2019] Surrogate Losses for Online Learning of Stepsizes in Stochastic Non-Convex Optimization

STORM: [NeurIPS 2019] Momentum-Based Variance Reduction in Non-Convex SGD
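
STORM's core update is the recursion `d_t = g(x_t, xi_t) + (1 - a) * (d_{t-1} - g(x_{t-1}, xi_t))`, where both gradients use the same fresh sample. A toy sketch of that recursion on a quadratic objective (the constants `lr` and `a` and the objective are illustrative, not this repo's optimizer):

```python
import torch

def stoch_grad(x, xi):
    return x - xi                       # gradient of the toy quadratic f(x; xi) = ||x - xi||^2 / 2

torch.manual_seed(0)
x = torch.ones(3) * 5.0                 # start away from the optimum near 1
x_prev = x.clone()
d = torch.zeros(3)
lr, a = 0.1, 0.5

for t in range(200):
    xi = 1.0 + 0.1 * torch.randn(3)     # noisy sample around the optimum
    g_new = stoch_grad(x, xi)           # gradient at the current iterate
    g_old = stoch_grad(x_prev, xi)      # gradient at the previous iterate, same sample
    d = g_new if t == 0 else g_new + (1 - a) * (d - g_old)  # variance-reduced momentum
    x_prev = x.clone()
    x = x - lr * d
```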

EXP3: Exponential-weight algorithm for Exploration and Exploitation
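
A minimal EXP3 sketch for a K-armed bandit with rewards in [0, 1]: sample an arm from an exponential-weights distribution mixed with uniform exploration, then update the chosen arm with an importance-weighted reward estimate. The exploration rate `gamma` and the Bernoulli environment are illustrative:

```python
import torch

torch.manual_seed(0)
K, T, gamma = 5, 1000, 0.1
weights = torch.ones(K)
arm_means = torch.rand(K)                       # hypothetical reward probabilities

for t in range(T):
    probs = (1 - gamma) * weights / weights.sum() + gamma / K
    arm = torch.multinomial(probs, 1).item()    # sample an arm from the mixed distribution
    reward = torch.bernoulli(arm_means[arm]).item()
    est = reward / probs[arm]                   # importance-weighted reward estimate
    weights[arm] = weights[arm] * torch.exp(gamma * est / K)
    weights = weights / weights.max()           # rescale for numerical stability (probs unchanged)
```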

UCB: Upper Confidence Bound algorithm
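
A minimal UCB1 sketch: pull each arm once, then always pull the arm maximizing `mean_i + sqrt(2 * ln(t) / n_i)`. The Gaussian reward model and horizon are illustrative:

```python
import math
import torch

torch.manual_seed(0)
K, T = 5, 1000
arm_means = torch.rand(K)                        # hypothetical environment
counts = torch.zeros(K)
sums = torch.zeros(K)

for t in range(1, T + 1):
    if t <= K:
        arm = t - 1                              # pull each arm once to initialize
    else:
        means = sums / counts
        bonus = torch.sqrt(2 * math.log(t) / counts)
        arm = int(torch.argmax(means + bonus))   # optimism in the face of uncertainty
    reward = (arm_means[arm] + 0.1 * torch.randn(())).item()
    counts[arm] += 1
    sums[arm] += reward
```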

SGDPF

- intro: a toy example that uses gradient descent to automatically tune the learning rate; the name stands for 'SGD + parameter-free' (a sketch of the idea follows)
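
One standard way to realize this idea is hypergradient descent on the stepsize, `lr_{t+1} = lr_t + beta * <g_t, g_{t-1}>`. The sketch below applies that rule to a toy quadratic and may well differ from this repo's SGDPF implementation; the initial `lr`, `beta`, and the objective are illustrative:

```python
import torch

torch.manual_seed(0)
x = torch.randn(10, requires_grad=True)
lr, beta = 0.01, 1e-4
prev_grad = torch.zeros(10)

for t in range(100):
    loss = 0.5 * (x ** 2).sum()                      # toy objective
    loss.backward()
    with torch.no_grad():
        g = x.grad.clone()
        lr += beta * torch.dot(g, prev_grad).item()  # gradient step on the stepsize itself
        x -= lr * g                                  # plain SGD step with the tuned stepsize
        prev_grad = g
    x.grad.zero_()
```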