v0.12.10 => v0.13.4 breaks Dropout on CUDA #2018

Closed
marmarelis opened this issue Jul 11, 2022 · 2 comments

Comments

@marmarelis

Hi all, it seems that an update to the Dropout layer at some point after Flux v0.12.10 introduced an incompatibility with CUDA on Julia 1.6.0. CUDA.versioninfo() reports:

CUDA toolkit 11.7, artifact installation
NVIDIA driver 470.57.2, for CUDA 11.4
CUDA driver 11.4

A minimal example:

using Flux  # provides Chain, Dense, Dropout, gpu, train!, ADAM

model = Chain(Dense(4, 4), Dropout(0.1)) |> gpu
Flux.train!(Flux.params(model), [randn(4) |> gpu], ADAM(1e-1)) do x; model(x) |> sum; end

crashes with

julia: /buildworker/worker/package_linux64/build/src/jitlayers.cpp:958: void jl_merge_module(llvm::Module*, std::unique_ptr<llvm::Module>): Assertion `dG->isDeclaration() || (dG->getInitializer() == sG->getInitializer() &&
 dG->isConstant() && sG->isConstant())' failed.

The trace also includes the following line:

#_dropout_mask#317 at [...]/layers/normalise.jl:45
@ToucheSir
Member

That line just calls rand!, so this ought to be reported to CUDA.jl. If I had to guess, toolkit 11.7 is too new and has some outstanding bugs when used with CUDA.jl.
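
Since the dropout mask boils down to a rand! call on a CuArray, a quick way to confirm that triage is to exercise the same call through CUDA.jl alone, with Flux out of the loop. The following is a sketch along those lines, not code from the thread; the element type and shape are arbitrary.

using CUDA, Random

y = CUDA.zeros(Float32, 4)  # arbitrary buffer, mirrors the example's size
Random.rand!(y)             # the same rand! path the dropout mask takes on the GPU

If this snippet reproduces the jitlayers assertion failure under toolkit 11.7, the bug can be reported against CUDA.jl directly.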

@marmarelis
Author

I see, thanks. I'll close this issue then.
