
Get rid of documentation warnings and 404 pages #1987

Merged · 4 commits · Jul 28, 2022

Conversation

Saransh-cpp
Member

Changes/Additions

  • fixed internal cross-references that currently redirect to 404 pages
  • added missing docstrings in the manual (to fix the cross-references)

Related PRs - FluxML/Functors.jl#42, JuliaML/MLUtils.jl#95

Leftover work

  1. Warning: no docs found for Flux.destructure in @docs block in src\utilities.md
Flux.modules
Flux.destructure
Flux.nfan

destructure is now imported from Optimisers.jl. Should I add Optimisers.jl as a doc dependency? Adding a new dependency with functions and structs named the same as Flux's would also mean qualifying names with the package in the docs. For example, changing this -

```@docs
Flux.Optimise.update!
Descent
Momentum
Nesterov
RMSProp
ADAM
RADAM
AdaMax
ADAGrad
ADADelta
AMSGrad
NADAM
ADAMW
OADAM
AdaBelief
```

to this -

```@docs
Flux.Optimise.update!
Flux.Descent
Flux.Momentum
Flux.Nesterov
Flux.RMSProp
Flux.ADAM
Flux.RADAM
Flux.AdaMax
Flux.ADAGrad
Flux.ADADelta
Flux.AMSGrad
Flux.NADAM
Flux.ADAMW
Flux.OADAM
Flux.AdaBelief
```

and so on. Could you please let me know if Flux plans to import all these optimisers from Optimisers.jl? If so, this would be a good idea.

  2. Warning: no doc found for reference '[Flux.trainable](@ref)' in src\training\training.md.
    Again, this would require adding Optimisers.jl as a doc dependency.
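
Both leftover warnings point at the same fix: making Optimisers.jl visible to Documenter. A minimal sketch of what `docs/make.jl` might gain (the surrounding options shown here are placeholders, not Flux's actual configuration):

```julia
# docs/make.jl (sketch): listing Optimisers in `modules` lets Documenter
# resolve the `destructure` and `trainable` docstrings and their `@ref` links.
using Documenter, Flux, Optimisers

makedocs(
    sitename = "Flux",
    modules = [Flux, Optimisers],  # Optimisers.jl added as a docs dependency
    # ...existing pages/format options...
)
```

Optimisers.jl would also need an entry in `docs/Project.toml` for the docs build environment.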

Missing docstrings

The logs also show a bunch of docstrings that do not appear in the manual. Most of them are part of the internal API, but I have marked the ones I think are user-facing. Should I add them to the manual?

  • Flux.NilNumber.Nil
  • Flux.Losses.xlogy :: Tuple{Any, Any}
  • Flux.Losses.add_blanks :: Tuple{Any, Any}
  • Flux.rng_from_array :: Tuple{AbstractArray}
  • Flux.NilNumber.nil
  • Flux.GRUv3 :: Tuple
  • Flux.cpu :: Tuple{Any}
  • Flux.zeros32 :: Tuple
  • Flux.randn32 :: Tuple{Vararg{Integer}}
  • Flux.Losses.ctc_loss :: Tuple{AbstractArray, Any}
  • Flux.OneHotVector :: Tuple{Any, Any}
  • Flux.Losses.logaddexp :: Tuple{Any, Any}
  • Flux.hasaffine :: Tuple{Union{BatchNorm, GroupNorm, InstanceNorm, LayerNorm}}
  • Flux.gpu :: Tuple{Any}
  • Flux.OneHotMatrix :: Tuple{Any, Any}
  • Flux.Losses.xlogx :: Tuple{Any}
  • Flux.create_bias :: Tuple{AbstractArray, Bool, Vararg{Integer}}
  • Flux.OneHotArray
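
If the user-facing ones are wanted, each could be appended to an existing `@docs` block in the relevant manual page; for example (the exact page is a guess):

````
```@docs
Flux.cpu
Flux.gpu
```
````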

PR Checklist

  • Tests are added
  • Entry in NEWS.md
  • Documentation, if applicable

@mcabbott
Member

mcabbott commented Jun 2, 2022

destructure is now imported from Optimisers. Should I add Optimisers.jl as a doc dependency? Adding a new dependency that has functions and structs named the same as those of Flux

I would vote not to add Optimisers.Adam etc. to the docs here. But destructure should appear, as should trainable. Maybe Optimisers.jl should join Functors, MLUtils, NNlib in the sidebar somewhere, or maybe it should live in one section with Functors?

[Screenshot attached, 2022-06-02 at 11:58:51]

@Saransh-cpp
Member Author

Saransh-cpp commented Jun 3, 2022

Yes, having a separate page for Optimisers.jl makes sense. I could then add a meta block to let Documenter know which package is being documented (to avoid confusion between same-named optimisers) -

```@meta
CurrentModule = Flux
```

Should I create a new page, or should I append Optimisers.jl to the page created for Functors.jl?

Or maybe we should add a new section under the "Optimisers" page for this?

@Saransh-cpp
Member Author

  • Added Optimisers.jl under the "Optimisers" page for now
  • Added gpu and cpu to the manual

@Saransh-cpp
Member Author

cc: @DhairyaLGandhi

Member

@mcabbott mcabbott left a comment


This looks like a good step forwards; it fixes things which are broken.

* fix cross-references

* add missing docstrings
Add `cpu` and `gpu` to the manual and `Optimisers.jl` as a dependency
@mcabbott mcabbott merged commit 26fe1d8 into FluxML:master Jul 28, 2022
@Saransh-cpp Saransh-cpp deleted the fix-docs-warnings branch July 29, 2022 05:08