This repository has been archived by the owner on Nov 3, 2023. It is now read-only.

Bump pytorch-lightning from 1.1.8 to 1.2.2 #12

Closed
dependabot[bot] wants to merge 1 commit from the dependabot/pip/pytorch-lightning-1.2.2 branch

Conversation

dependabot[bot] (Contributor) commented on behalf of GitHub on Mar 8, 2021

Bumps pytorch-lightning from 1.1.8 to 1.2.2.

Release notes

Sourced from pytorch-lightning's releases.

Standard weekly patch release

[1.2.2] - 2021-03-02

Added

  • Added checkpoint parameter to callback's on_save_checkpoint hook (#6072)
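
A minimal sketch of how a callback might use the new parameter (the three-argument hook signature is inferred from this release note for #6072; the callback name is hypothetical):

```python
import pytorch_lightning as pl

class CheckpointAnnotator(pl.Callback):
    # Hypothetical callback: as of 1.2.2 the hook receives the checkpoint
    # dict as an extra argument (per #6072), so extra state can be
    # recorded alongside the model weights before the file is written.
    def on_save_checkpoint(self, trainer, pl_module, checkpoint):
        checkpoint["annotated_by"] = type(self).__name__
```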

Changed

  • Changed the order of backward, step, zero_grad to zero_grad, backward, step (#6147); see the sketch after this list
  • Changed default for DeepSpeed CPU Offload to False, due to prohibitively slow speeds at smaller scale (#6262)
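
In plain PyTorch terms, the new call order from #6147 corresponds to the loop below. This is an illustrative sketch; Lightning issues these calls internally during automatic optimization.

```python
import torch
import torch.nn.functional as F

model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
batches = [(torch.randn(8, 4), torch.randn(8, 1)) for _ in range(3)]

for x, y in batches:
    optimizer.zero_grad()           # gradients are cleared first (new order)
    loss = F.mse_loss(model(x), y)
    loss.backward()                 # then the backward pass
    optimizer.step()                # then the optimizer step
```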

Fixed

  • Fixed epoch-level schedulers not being called when val_check_interval < 1.0 (#6075); see the sketch after this list
  • Fixed multiple early stopping callbacks (#6197)
  • Fixed incorrect usage of detach(), cpu(), to() (#6216)
  • Fixed LBFGS optimizer support which didn't converge in automatic optimization (#6147)
  • Prevent WandbLogger from dropping values (#5931)
  • Fixed error thrown when using a valid distributed mode in multi-node training (#6297)
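
The scheduler fix (#6075) concerns configurations like the following sketch, where an epoch-interval scheduler is combined with mid-epoch validation; the module and names are hypothetical:

```python
import torch
import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 1)

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters())
        scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1)
        # An epoch-level scheduler: before the fix it could be skipped when
        # validation ran mid-epoch, e.g. with Trainer(val_check_interval=0.5).
        return [optimizer], [{"scheduler": scheduler, "interval": "epoch"}]
```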

Contributors

@akihironitta, @borisdayma, @carmocca, @dvolgyes, @SeanNaren, @SkafteNicki

If we forgot someone due to not matching commit email with GitHub account, let us know :]

Standard weekly patch release

[1.2.1] - 2021-02-23

Fixed

  • Fixed incorrect yield logic for the amp autocast context manager (#6080); see the sketch after this list
  • Fixed priority of plugin/accelerator when setting distributed mode (#6089)
  • Fixed error message for AMP + CPU incompatibility (#6107)
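
The autocast fix (#6080) concerns the standard context-manager pattern, where the yield must be guarded so cleanup still runs if the wrapped block raises. A generic sketch, not Lightning's actual code:

```python
from contextlib import contextmanager

@contextmanager
def managed_state():
    print("state enabled")   # stand-in for entering autocast state
    try:
        yield                # the body of the `with` block runs here
    finally:
        # Without try/finally, an exception in the body skips cleanup.
        print("state disabled")

with managed_state():
    print("inside the block")
```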

Contributors

@awaelchli, @SeanNaren, @carmocca

If we forgot someone due to not matching commit email with GitHub account, let us know :]

Pruning & Quantization & SWA

[1.2.0] - 2021-02-18

Added

  • Added DataType, AverageMethod and MDMCAverageMethod enum in metrics (#5657)
  • Added support for summarized model total params size in megabytes (#5590)
  • Added support for multiple train loaders (#1959)
  • Added top_k parameter to the Accuracy metric, generalizing it to top-k accuracy for (multi-dimensional) multi-class inputs (#4838); see the sketch below

... (truncated)
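
As an example of the top_k parameter mentioned above, top-2 accuracy could be computed roughly as follows (a sketch assuming the 1.2-era pytorch_lightning.metrics API):

```python
import torch
from pytorch_lightning.metrics import Accuracy

# Class probabilities for 4 samples over 3 classes, plus integer targets.
preds = torch.tensor([[0.1, 0.6, 0.3],
                      [0.5, 0.3, 0.2],
                      [0.2, 0.2, 0.6],
                      [0.3, 0.4, 0.3]])
target = torch.tensor([0, 0, 2, 1])

top2_acc = Accuracy(top_k=2)
# A prediction counts as correct if the target is among the top 2 classes.
print(top2_acc(preds, target))
```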

Changelog

Sourced from pytorch-lightning's changelog.

[1.2.2] - 2021-03-02

Added

  • Added checkpoint parameter to callback's on_save_checkpoint hook (#6072)

Changed

  • Changed the order of backward, step, zero_grad to zero_grad, backward, step (#6147)
  • Changed default for DeepSpeed CPU Offload to False, due to prohibitively slow speeds at smaller scale (#6262)
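
Since offload is now opt-in, re-enabling it would look roughly like this (a sketch: the cpu_offload argument name is assumed from the release note for #6262):

```python
from pytorch_lightning import Trainer
from pytorch_lightning.plugins import DeepSpeedPlugin

# CPU offload must now be requested explicitly (default changed to False).
trainer = Trainer(
    gpus=1,
    precision=16,
    plugins=DeepSpeedPlugin(cpu_offload=True),
)
```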

Fixed

  • Fixed epoch-level schedulers not being called when val_check_interval < 1.0 (#6075)
  • Fixed multiple early stopping callbacks (#6197); see the sketch after this list
  • Fixed incorrect usage of detach(), cpu(), to() (#6216)
  • Fixed LBFGS optimizer support which didn't converge in automatic optimization (#6147)
  • Prevent WandbLogger from dropping values (#5931)
  • Fixed error thrown when using a valid distributed mode in multi-node training (#6297)
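
The early-stopping fix (#6197) targets runs with more than one EarlyStopping callback, for example (monitor names are illustrative):

```python
from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import EarlyStopping

# Two independent stopping criteria; the run stops when either triggers.
trainer = Trainer(
    callbacks=[
        EarlyStopping(monitor="val_loss", mode="min"),
        EarlyStopping(monitor="val_acc", mode="max"),
    ]
)
```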

[1.2.1] - 2021-02-23

Fixed

  • Fixed incorrect yield logic for the amp autocast context manager (#6080)
  • Fixed priority of plugin/accelerator when setting distributed mode (#6089)
  • Fixed error message for AMP + CPU incompatibility (#6107)

[1.2.0] - 2021-02-18

Added

  • Added DataType, AverageMethod and MDMCAverageMethod enum in metrics (#5657)
  • Added support for summarized model total params size in megabytes (#5590)
  • Added support for multiple train loaders (#1959)
  • Added top_k parameter to the Accuracy metric, generalizing it to top-k accuracy for (multi-dimensional) multi-class inputs (#4838)
  • Added subset_accuracy parameter to the Accuracy metric, enabling subset accuracy for multi-label or multi-dimensional multi-class inputs (#4838)
  • Added HammingDistance metric to compute the hamming distance (loss) (#4838)
  • Added max_fpr parameter to auroc metric for computing partial auroc metric (#3790)
  • Added StatScores metric to compute the number of true positives, false positives, true negatives and false negatives (#4839)
  • Added R2Score metric (#5241)
  • Added LambdaCallback (#5347)
  • Added BackboneLambdaFinetuningCallback (#5377)
  • Accelerator all_gather supports collection (#5221)
  • Added image_gradients functional metric to compute the image gradients of a given input image. (#5056)
  • Added MetricCollection (#4318); see the sketch after this list
  • Added .clone() method to metrics (#4318)
  • Added IoU class interface (#4704)
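
MetricCollection groups several metrics behind a single update/compute call, and .clone() gives an independent copy (e.g. one per train/val/test stage). A minimal sketch assuming the 1.2-era metrics namespace:

```python
import torch
from pytorch_lightning.metrics import Accuracy, MetricCollection, Precision

metrics = MetricCollection([Accuracy(), Precision(num_classes=3, average="macro")])

preds = torch.tensor([0, 2, 1, 2])
target = torch.tensor([0, 1, 1, 2])
metrics.update(preds, target)
print(metrics.compute())        # dict keyed by metric class name
val_metrics = metrics.clone()   # independent copy for another stage
```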

... (truncated)

Commits
  • b3b8f95 hotfix for PT1.6 and torchtext (#6323)
  • 9f3ef1b update lightning version to v1.2.2
  • c5e9d67 [fix] Ensure we check deepspeed/sharded in multinode DDP (#6297)
  • fc95f00 Disable CPU Offload as default for DeepSpeed (#6262)
  • ad61624 Fix for incorrect usage of detach(), cpu(), to() (#6216)
  • 09b287a Remove opt from manual_backward in docs (#6267)
  • 3c498ce Call optimizer.zero_grad() before backward inside closure in AutoOpt (#6147)
  • 5abfd2c fix(wandb): prevent WandbLogger from dropping values (#5931)
  • 9329f58 Add checkpoint parameter to on_save_checkpoint (#6072)
  • 4b71a83 Fix for multiple callbacks (#6197)
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually

dependabot bot added the dependencies label (Pull requests that update a dependency file) on Mar 8, 2021
dependabot bot requested a review from amogkam on Mar 8, 2021 at 07:02
dependabot[bot] (Contributor, Author) commented on behalf of GitHub on Mar 15, 2021

Superseded by #14.

dependabot bot closed this on Mar 15, 2021
dependabot bot deleted the dependabot/pip/pytorch-lightning-1.2.2 branch on Mar 15, 2021 at 07:01