Releases: Tony-Y/pytorch_warmup
PyTorch Warmup 0.2.0: Enhanced Documentation
Users of version 0.1.1 can safely upgrade via pip because this release contains no breaking changes.
- The documentation and README.md have been significantly improved.
- The EMNIST example now supports the MPS device (see the device-selection sketch after this list).
- A CIFAR10 example has been added, with performance comparisons.
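For reference, MPS support in practice amounts to the usual device-selection pattern sketched below; this is a generic sketch, not the example's exact code, and the fallback order is an assumption:

```python
import torch

# Prefer Apple's Metal Performance Shaders (MPS) backend when available,
# then CUDA, then fall back to the CPU.
if torch.backends.mps.is_available():
    device = torch.device("mps")
elif torch.cuda.is_available():
    device = torch.device("cuda")
else:
    device = torch.device("cpu")

x = torch.randn(8, 10, device=device)  # tensors and models then target this device
```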
Python 3.7 and 3.8 have reached end of life but are still supported in this release; the next release will drop support for them.
What's Changed
- Update EMNIST URL (#18)
- Fix CI on macOS (#21)
- Use os.makedirs (#19)
- Fix archive filename (#22)
- Update CI adding Python 3.9, 3.10, 3.11, and 3.12 (#23)
- Add assertRaises to tests (#24)
- Add recent torchvision versions (#27)
- Update docs (#28)
- Use "Warning" in documentation (#29)
- Update examples (#30)
- Add CIFAR10 example link (#31)
- Add ResNet Performance Comparison for SGD (#32)
- Update README's and the documentation (#33)
- Fix braces in math expressions (#34)
Full Changelog: v0.1.1...v0.2.0
PyPI Package Update
- This release adds a license file to the PyPI package. (#9)
- In addition, the GitHub Actions workflows have been updated.
There are no other changes in this release.
Warmup for PyTorch v1.4.0 or Above
- The `with` statement is used to encapsulate the undampened learning rate (a usage sketch follows this list).
- Learning rate scheduler "chaining" works together with this version of pytorch_warmup.
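For illustration, here is a minimal training-loop sketch of the new usage. The model, loss, and scheduler choices are placeholders; `UntunedLinearWarmup` is one of the package's warmup classes, which derives its warmup period from the optimizer's `betas`:

```python
import torch
import pytorch_warmup as warmup

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

# Two chained schedulers: with PyTorch >= 1.4, each step() composes
# with the other's effect on the learning rate.
scheduler1 = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.99)
scheduler2 = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[50, 80])

# Warmup scheduler from this package.
warmup_scheduler = warmup.UntunedLinearWarmup(optimizer)

for step in range(100):
    x = torch.randn(32, 10)
    loss = model(x).pow(2).mean()  # dummy loss for illustration
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # The with block exposes the undampened learning rate to the
    # chained schedulers, then re-applies the warmup factor on exit.
    with warmup_scheduler.dampening():
        scheduler1.step()
        scheduler2.step()
```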
Why this change is needed
With the previous version, we had to work around a "chaining" problem using awkward code:
```python
optimizer.step()
# Passing an explicit epoch forces the scheduler's closed-form update
# (computed from base_lr), so it does not compound the dampened rate.
lr_scheduler.step(lr_scheduler.last_epoch + 1)
warmup_scheduler.dampen()
```
Worse, this code causes PyTorch to emit a user warning:
```
UserWarning: The epoch parameter in `scheduler.step()` was not necessary and is being deprecated where possible. Please use `scheduler.step()` to step the scheduler. During the deprecation, if epoch is different from None, the closed form is used instead of the new chainable form, where available. Please open an issue if you are unable to replicate your use case: https://github.com/pytorch/pytorch/issues/new/choose.
warnings.warn(EPOCH_DEPRECATION_WARNING, UserWarning)
```
With this version, we simply write:
```python
optimizer.step()
with warmup_scheduler.dampening():
    lr_scheduler.step()
```
If you use no LR scheduler, simply put `pass` inside the `with` statement:
```python
optimizer.step()
with warmup_scheduler.dampening():
    pass
```
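Conceptually, `dampening()` restores the undampened learning rates on entry, lets any scheduler steps inside the block run on the true schedule, and re-applies the warmup factor on exit. The sketch below is a simplification for illustration, not the package's actual implementation; `warmup_factor` is a hypothetical stand-in for the package's internal warmup computation:

```python
from contextlib import contextmanager

@contextmanager
def dampening(self):
    # Restore the undampened learning rates saved at the previous exit,
    # so any lr_scheduler.step() inside the block sees the true schedule.
    for group, lr in zip(self.optimizer.param_groups, self.lrs):
        group['lr'] = lr
    yield  # the caller's lr_scheduler.step() (or pass) runs here
    # Save the possibly updated undampened rates, advance the warmup
    # step counter, and apply the warmup factor for the next update.
    self.lrs = [group['lr'] for group in self.optimizer.param_groups]
    self.last_step += 1
    omega = self.warmup_factor(self.last_step)  # hypothetical helper
    for group, lr in zip(self.optimizer.param_groups, self.lrs):
        group['lr'] = lr * omega
```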