
change to mention SignDecay
mcabbott committed Mar 2, 2024
1 parent 9d32b74 commit 6c9cb17
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion docs/src/training/training.md
```diff
@@ -322,7 +322,8 @@
 The first, [`WeightDecay`](@ref Flux.WeightDecay) adds `0.42` times original parameter to the gradient,
 matching the gradient of the penalty above (with the same, unrealistically large, constant).
 After that, in either case, [`Adam`](@ref Flux.Adam) computes the final update.
 
-The same trick works for *L₁ regularisation* (also called Lasso), where the penalty is `pen_l1(x::AbstractArray) = sum(abs, x)` instead. This is implemented by `WeightDecay(0.42, 1)`.
+The same trick works for *L₁ regularisation* (also called Lasso), where the penalty is
+`pen_l1(x::AbstractArray) = sum(abs, x)` instead. This is implemented by `SignDecay(0.42)`.
 
 The same `OptimiserChain` mechanism can be used for other purposes, such as gradient clipping with [`ClipGrad`](@ref Flux.Optimise.ClipValue) or [`ClipNorm`](@ref Flux.Optimise.ClipNorm).
```
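For context, a minimal sketch of how the rule named in the new text might be used. The `Dense(2 => 3)` model and the `Flux.setup` call are illustrative assumptions, not part of this diff; `SignDecay` is provided by Optimisers.jl and re-exported by recent Flux versions:

```julia
using Flux  # SignDecay comes from Optimisers.jl, re-exported by Flux

model = Dense(2 => 3)  # any model; purely illustrative

# SignDecay(0.42) adds 0.42 .* sign.(x) to the gradient, which is exactly
# the gradient of the L1 penalty 0.42 * sum(abs, x); Adam then computes
# the final update from the modified gradient.
opt_state = Flux.setup(OptimiserChain(SignDecay(0.42), Adam()), model)
```

This mirrors `OptimiserChain(WeightDecay(0.42), Adam())` for the L₂ case described in the surrounding context lines.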
