Merge pull request #1762 from NightMachinary/doc1
Doc update (saving.md): removed outdated info; Typo fix.
ToucheSir authored Dec 12, 2021
commit 878b39c (2 parents: e8a67b4 + 3207000)
Showing 2 changed files with 6 additions and 3 deletions.
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -8,7 +8,7 @@ One of the best ways to contribute is by looking at issues labeled ["help wanted

## Good First Issues

- While there are not many right now, we do have a section for ["good for issues"](https://github.com/FluxML/Flux.jl/labels/good%20first%20issue). As mentioned above, if any of these seem interesting but there is no clear next step in your mind, please feel free to ask for a suggested step. Often times in open source, issues labeled as "good first issue" actually take some back and forth between maintainers and contributors before the issues is ready to be tackled by a new contributor.
+ While there are not many right now, we do have a section for ["good first issues"](https://github.com/FluxML/Flux.jl/labels/good%20first%20issue). As mentioned above, if any of these seem interesting but there is no clear next step in your mind, please feel free to ask for a suggested step. Often times in open source, issues labeled as "good first issue" actually take some back and forth between maintainers and contributors before the issues is ready to be tackled by a new contributor.

## Model Zoo

7 changes: 5 additions & 2 deletions docs/src/saving.md
@@ -118,10 +118,13 @@ revert to an older copy of the model if it starts to overfit.
@save "model-$(now()).bson" model loss = testloss()
```

- You can even store optimiser state alongside the model, to resume training
- exactly where you left off.
+ Note that to resume a model's training, you might need to restore other stateful parts of your training loop. Possible examples are stateful optimizers (which usually utilize an `IdDict` to store their state), and the randomness used to partition the original data into the training and validation sets.
+
+ You can store the optimiser state alongside the model, to resume training
+ exactly where you left off. BSON is smart enough to [cache values](https://github.com/JuliaIO/BSON.jl/blob/v0.3.4/src/write.jl#L71) and insert links when saving, but only if it knows everything to be saved up front. Thus models and optimizers must be saved together to have the latter work after restoring.

```julia
opt = ADAM()
@save "model-$(now()).bson" model opt
```
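The counterpart to saving is restoring with BSON's `@load` macro, which rebinds the saved names in the current scope. A minimal sketch of the round trip described above; the `checkpoint.bson` filename and the two-layer `model` are illustrative, not part of the commit:

```julia
using Flux, BSON
using BSON: @save, @load

# A small illustrative model and its optimiser state
model = Chain(Dense(10, 5, relu), Dense(5, 2))
opt = ADAM()

# Save both together so BSON can link shared values correctly
@save "checkpoint.bson" model opt

# Later (e.g. in a fresh session): restore both and resume training
@load "checkpoint.bson" model opt
# Flux.train!(loss, Flux.params(model), data, opt)
```

Because `ADAM`'s internal `IdDict` keys on the parameter arrays themselves, loading `model` and `opt` from the same file is what keeps those references pointing at the restored parameters.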
