Add Optimisers.jl as a doc dependency
Add `cpu` and `gpu` to the manual and `Optimisers.jl` as a dependency
Saransh-cpp committed Jul 28, 2022
1 parent d660eda commit 46932c7
Showing 5 changed files with 22 additions and 3 deletions.
1 change: 1 addition & 0 deletions docs/Project.toml
@@ -4,6 +4,7 @@ Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
Functors = "d9f16b24-f501-4c13-a1f2-28368ffc5196"
MLUtils = "f1d291b0-491e-4a28-83b9-f70985020b54"
NNlib = "872c559c-99b0-510c-b3b7-b6c96a88d5cd"
+Optimisers = "3bd65402-5787-11e9-1adc-39752487f4e2"

[compat]
Documenter = "0.26"
4 changes: 2 additions & 2 deletions docs/make.jl
@@ -1,10 +1,10 @@
-using Documenter, Flux, NNlib, Functors, MLUtils, BSON
+using Documenter, Flux, NNlib, Functors, MLUtils, BSON, Optimisers


DocMeta.setdocmeta!(Flux, :DocTestSetup, :(using Flux); recursive = true)

makedocs(
-modules = [Flux, NNlib, Functors, MLUtils, BSON],
+modules = [Flux, NNlib, Functors, MLUtils, BSON, Optimisers],
doctest = false,
sitename = "Flux",
pages = [
5 changes: 5 additions & 0 deletions docs/src/gpu.md
@@ -86,6 +86,11 @@ julia> x |> cpu
0.7766742
```

```@docs
cpu
gpu
```

## Common GPU Workflows

Some of the common workflows involving the use of GPUs are presented below.
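
As a minimal sketch of the `cpu`/`gpu` round-trip documented above (the model and data here are illustrative; `gpu` is a no-op when no GPU backend is available):

```julia
using Flux

m = Dense(10 => 5)     # a small model, initialised on the CPU
x = rand(Float32, 10)

gm = m |> gpu          # move the model's parameters to the GPU
gx = x |> gpu          # move the data alongside it
y  = gm(gx) |> cpu     # compute on the GPU, fetch the result back
```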
14 changes: 14 additions & 0 deletions docs/src/training/optimisers.md
@@ -1,3 +1,7 @@
```@meta
CurrentModule = Flux
```

# Optimisers

Consider a [simple linear regression](../models/basics.md). We create some dummy data, calculate a loss, and backpropagate to calculate gradients for the parameters `W` and `b`.
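
The walkthrough described above can be sketched as follows (a minimal illustration; the sizes and dummy data are made up for this example):

```julia
using Flux

W = rand(2, 5)
b = rand(2)

predict(x) = W * x .+ b
loss(x, y) = sum((predict(x) .- y) .^ 2)

x, y = rand(5), rand(2)   # dummy data

# Backpropagate to get gradients for the parameters W and b
gs = gradient(() -> loss(x, y), Flux.params(W, b))
gs[W]   # gradient of the loss with respect to W
```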
@@ -189,3 +193,13 @@ opt = Optimiser(ClipValue(1e-3), Adam(1e-3))
ClipValue
ClipNorm
```

# Optimisers.jl

Flux re-exports some utility functions from [`Optimisers.jl`](https://github.com/FluxML/Optimisers.jl),
and makes the complete `Optimisers` package available under the `Flux.Optimisers` namespace.

```@docs
Optimisers.destructure
Optimisers.trainable
```
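
A minimal sketch of the re-exported `Optimisers.destructure` (the model here is illustrative):

```julia
using Flux, Optimisers

m = Dense(2 => 1)

# destructure returns a flat vector of all trainable parameters, plus a
# function that rebuilds a model of the same shape from such a vector
flat, re = Optimisers.destructure(m)

m2 = re(flat)   # a model equivalent to m
```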
1 change: 0 additions & 1 deletion docs/src/utilities.md
@@ -107,7 +107,6 @@ Flux.outputsize

```@docs
Flux.modules
-Flux.destructure
Flux.nfan
```
