GPU docs #2510
Conversation
Once the build has completed, you can preview any updated documentation at this URL: https://fluxml.ai/Flux.jl/previews/PR2510/ in ~20 minutes. In particular, this page: https://fluxml.ai/Flux.jl/previews/PR2510/guide/gpu/
Codecov Report: All modified and coverable lines are covered by tests ✅

```
@@ Coverage Diff @@
##           master   #2510   +/- ##
=======================================
  Coverage   33.20%   33.20%
=======================================
  Files          31       31
  Lines        1843     1843
=======================================
  Hits          612      612
  Misses       1231     1231
=======================================
```

☔ View full report in Codecov by Sentry.
Removing the milestone as this shouldn't be blocking.
The point is, in part, to think through whatever interface we're adopting by trying to explain it clearly. If it's a mess then 0.15 is when to fix it. That's why it was on the milestone.
Can we get this done so that it doesn't delay the 0.15 release then? I would like to tag in a few days.
Sure. What do you think this lacks?
!!! compat "Flux ≤ 0.13"
    Old versions of Flux automatically loaded CUDA.jl to provide GPU support. Starting from Flux v0.14, it has to be loaded separately. Julia's [package extensions](https://pkgdocs.julialang.org/v1/creating-packages/#Conditional-loading-of-code-in-packages-(Extensions)) allow Flux to automatically load some GPU-specific code when needed.
## Manually selecting devices
I thought there was a whole `Flux.gpu_backend!` and Preferences.jl story we had to tell?? |
`gpu_backend!` affects the return value of `gpu_device` like this:

- If no GPU is available, it returns a `CPUDevice` object.
- If a LocalPreferences file is present, then the backend specified in the file is used. If the trigger package corresponding to the device is not loaded, then a warning is displayed.
- If no LocalPreferences file is present, then the first working GPU with a loaded trigger package is used.

This is already described in the docstring of `gpu_device`. I think we shouldn't mention `gpu_backend!` at all in this guide, because it is useless in practice.
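To make the selection rules concrete, here is a minimal sketch of what they look like from the user's side (assuming Flux v0.14+, where `gpu_device` is exported by Flux; the exact device types come from the device-handling layer Flux uses, so the type names below are illustrative):

```julia
using Flux

# With no GPU trigger package loaded (and no LocalPreferences file),
# gpu_device has no working backend to pick, so it falls back to the CPU:
dev = gpu_device()
# dev is a CPU device object; applying it is a no-op move:
x = dev(rand(Float32, 3))   # stays an ordinary Array

# After loading a trigger package, the same call picks that backend:
# using CUDA
# dev = gpu_device()        # now a CUDA device
# x = dev(rand(Float32, 3)) # moved to the GPU (a CuArray)
```

The point being made above is that none of this needs `gpu_backend!`: loading the trigger package is what actually determines the result.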
Maybe we should put a TLDR at the top just saying something like:

```julia
using CUDA  # or AMDGPU or Metal

device = gpu_device()
model = model |> device

for epoch in 1:num_epochs
    for (x, y) in dataloader
        x, y = device((x, y))
        # ... compute gradients and update model ...
    end
end
```
I will finish this.
This is what we're trying not to do. Not "here's the magic boilerplate you should copy", but instead the guide is "here's how things actually work".
This re-writes the start of the GPU documentation page. It aims to use simpler examples, and to stress that `model |> cu` just works, before talking about more exotic non-CUDA packages and the automatic `model |> gpu`.

Rendered MD: https://github.com/FluxML/Flux.jl/blob/gpu_docs/docs/src/guide/gpu.md
Current docs: http://fluxml.ai/Flux.jl/stable/guide/gpu/