
v0.14.1

@taylormordan released this 28 Sep 17:35
  • flexible compatibility with recent PyTorch versions
  • option to resume the optimizer state from a checkpoint (see the sketch after this list)
  • AdamW optimizer and a linear learning-rate scheduler (see the sketch after this list)
  • fast scaling as the default for data augmentation
  • improved configuration and display for eval
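
A minimal PyTorch sketch of what resuming the optimizer state from a checkpoint looks like; the model, file name, and checkpoint keys below are illustrative placeholders, not this project's actual checkpoint format.

```python
import torch

model = torch.nn.Linear(128, 17)  # placeholder model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Save model and optimizer state together in one checkpoint file.
torch.save({
    'epoch': 10,
    'model_state_dict': model.state_dict(),
    'optimizer_state_dict': optimizer.state_dict(),
}, 'checkpoint.pt')

# Later: restore both, so training resumes with the optimizer's momentum
# and second-moment statistics intact instead of starting from scratch.
checkpoint = torch.load('checkpoint.pt')
model.load_state_dict(checkpoint['model_state_dict'])
optimizer.load_state_dict(checkpoint['optimizer_state_dict'])
start_epoch = checkpoint['epoch'] + 1
```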
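A minimal PyTorch sketch of pairing AdamW with a linear learning-rate schedule; the model, hyperparameter values, and iteration count are placeholders, not the defaults shipped in this release.

```python
import torch

model = torch.nn.Linear(128, 17)  # placeholder model

# AdamW: Adam with decoupled weight decay.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4, weight_decay=1e-2)

# Linear schedule: scale the learning rate from 1.0x down to 0.0x
# over total_iters scheduler steps.
scheduler = torch.optim.lr_scheduler.LinearLR(
    optimizer, start_factor=1.0, end_factor=0.0, total_iters=100
)

for epoch in range(100):
    # ... forward pass, loss.backward() over batches would go here ...
    optimizer.step()
    scheduler.step()  # advance the linear decay once per epoch
```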