PyTorch_OLoptim

A PyTorch implementation of various online and stochastic optimization algorithms for deep learning

This is a course project for the Online Learning course at BU. Interested readers are referred to A Modern Introduction to Online Learning by Prof. Orabona.

If you find this repository helpful, feel free to use it in your research.

Descriptions

FTRL: Follow the Regularized Leader

  • intro: a classic algorithm in online learning; at each round it plays the point minimizing the regularized sum of past (linearized) losses
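As an illustration (not the repository's PyTorch optimizer): with linearized losses and a fixed quadratic regularizer ψ(x) = ‖x‖² / (2η), the regularized leader has a closed form, which this minimal NumPy sketch computes. The function name and constants are mine, not the repo's.

```python
import numpy as np

def ftrl_quadratic(grads, eta=0.1):
    """FTRL with the fixed quadratic regularizer ||x||^2 / (2 * eta).

    With linearized losses the regularized leader has the closed form
        x_{t+1} = -eta * sum_{s <= t} g_s,
    i.e. gradient descent on the running gradient sum, started at the origin.
    """
    g_sum = np.zeros_like(grads[0], dtype=float)
    iterates = []
    for g in grads:
        g_sum += g                       # accumulate past gradients
        iterates.append(-eta * g_sum)    # play the regularized leader
    return iterates
```

For example, feeding the gradients `[1.0]` then `[-1.0]` yields the iterates `-0.1` and `0.0`: the second gradient cancels the first in the running sum.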

FTML: [ICML 2017] Follow the Moving Leader in Deep Learning

SGDOL: [NeurIPS 2019] Surrogate Losses for Online Learning of Stepsizes in Stochastic Non-Convex Optimization

STORM: [NeurIPS 2019] Momentum-Based Variance Reduction in Non-Convex SGD
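STORM's core recursion can be sketched in a few lines. This toy version (mine, not the repository's) assumes a fixed stepsize and momentum weight, whereas the paper adapts both from observed gradient norms; the key point is that the same sample xi_t is used to evaluate the gradient at both the current and the previous iterate.

```python
import numpy as np

def storm(grad, x0, steps=100, lr=0.1, a=0.5, seed=0):
    """Sketch of STORM's momentum-based variance reduction:

        d_t = grad(x_t; xi_t) + (1 - a) * (d_{t-1} - grad(x_{t-1}; xi_t))

    Reusing the sample xi_t at both points is what reduces the variance.
    Fixed lr and a are a simplification of the paper's adaptive choices.
    """
    rng = np.random.default_rng(seed)
    x_prev = x0
    xi = rng.standard_normal()
    d = grad(x_prev, xi)                # d_1 is a plain stochastic gradient
    x = x_prev - lr * d
    for _ in range(steps - 1):
        xi = rng.standard_normal()      # fresh sample, reused at both points
        d = grad(x, xi) + (1.0 - a) * (d - grad(x_prev, xi))
        x_prev, x = x, x - lr * d
    return x

# Noisy gradient of f(x) = x^2 / 2: g(x; xi) = x + 0.1 * xi
x_final = storm(lambda x, xi: x + 0.1 * xi, x0=3.0, steps=200)
```

On this toy quadratic the iterate settles close to the minimizer at 0 despite the gradient noise.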

EXP3: Exponential-weight algorithm for Exploration and Exploitation
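A minimal EXP3 sketch in the classic Auer et al. style (exponential weights plus a gamma-uniform exploration floor, rewards assumed in [0, 1]); this is an illustration, not the repository's implementation:

```python
import numpy as np

def exp3(reward, n_arms, horizon, eta=0.1, gamma=0.1, seed=0):
    """EXP3: exponential weights with importance-weighted reward estimates."""
    rng = np.random.default_rng(seed)
    log_w = np.zeros(n_arms)              # log-weights, for numerical stability
    pulls = np.zeros(n_arms, dtype=int)
    for _ in range(horizon):
        q = np.exp(log_w - log_w.max())
        q /= q.sum()
        p = (1.0 - gamma) * q + gamma / n_arms   # forced exploration floor
        arm = rng.choice(n_arms, p=p)
        # importance-weighted gain: only the pulled arm's weight is updated
        log_w[arm] += eta * reward(arm) / p[arm]
        pulls[arm] += 1
    return pulls

# Two arms with fixed rewards 0.2 and 0.8: play concentrates on the better arm
pulls = exp3(lambda arm: (0.2, 0.8)[arm], n_arms=2, horizon=2000)
```

The exploration floor gamma / n_arms keeps every sampling probability bounded away from zero, which bounds the importance weights.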

UCB: Upper Confidence Bound algorithm
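For comparison, a self-contained UCB1 sketch (again illustrative, not the repo's code): each arm is pulled once, then the algorithm plays the arm maximizing the empirical mean plus a confidence bonus.

```python
import numpy as np

def ucb1(reward, n_arms, horizon, seed=0):
    """UCB1: play argmax of  empirical mean + sqrt(2 * ln(t) / pulls)."""
    rng = np.random.default_rng(seed)
    counts = np.zeros(n_arms)
    sums = np.zeros(n_arms)
    for t in range(horizon):
        if t < n_arms:
            arm = t                                  # initialize: pull each arm once
        else:
            bonus = np.sqrt(2.0 * np.log(t) / counts)
            arm = int(np.argmax(sums / counts + bonus))
        counts[arm] += 1
        sums[arm] += reward(arm, rng)
    return counts

# Two Bernoulli arms with means 0.2 and 0.8
counts = ucb1(lambda arm, rng: float(rng.random() < (0.2, 0.8)[arm]),
              n_arms=2, horizon=2000)
```

The suboptimal arm's bonus shrinks only logarithmically, so it is still sampled occasionally, but the vast majority of pulls go to the better arm.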

SGDPF

  • intro: a toy example that uses gradient descent to automatically tune the learning rate; the name stands for 'SGD + parameter-free'
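The repository's exact SGDPF rule is not shown here; the sketch below follows the general hypergradient idea of tuning the stepsize itself by gradient descent, using ∂f(x_t)/∂lr = -⟨g_t, g_{t-1}⟩ (so descent on lr adds hyper_lr * g_t * g_{t-1}). All names and constants are illustrative.

```python
def sgd_pf(grad, x0, lr0=0.01, hyper_lr=1e-4, steps=100):
    """Toy 'SGD + parameter-free' sketch: the learning rate is itself
    updated by gradient descent on the loss, via the hypergradient
    d f(x_t) / d lr = -<g_t, g_{t-1}>  (scalar case shown).
    """
    x, lr = x0, lr0
    g_prev = grad(x)
    x -= lr * g_prev
    for _ in range(steps - 1):
        g = grad(x)
        lr += hyper_lr * g * g_prev   # grow lr while gradients stay aligned
        x -= lr * g
        g_prev = g
    return x, lr

# f(x) = x^2 / 2, so grad(x) = x: lr grows and x heads toward the minimum
x_final, lr_final = sgd_pf(lambda x: x, x0=5.0)
```

While successive gradients point the same way the stepsize keeps increasing; once the iterate overshoots, the inner product turns negative and the stepsize shrinks again.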
