Why does the allocated memory keep increasing in a for loop? #49761
Comments
Welcome! This is the issue tracker for the Julia language, where bugs are reported. Please ask usage questions on https://discourse.julialang.org/.
IMO, this appears to be a GC bug.
How can you tell? This is not a runnable example.
FWIW, I have had a similar issue since moving to 1.9, using only standard libraries. Adding …
If you have a reproducible example, please do share it! Without a particular case to investigate, nothing can really be done.
I have another example where memory just keeps stacking up (CNN training using Flux). In Julia 1.8, all is fine. A manual GC.gc() call cleans it up.

```julia
using Flux
import IterTools: ncycle
struct CNNModel
cnnlayer1
cnnlayer2
poollayer
dense
end
getparams(model::CNNModel) = Flux.params(model.cnnlayer1, model.cnnlayer2, model.dense)
struct Data
input
label
end
function evalmodel(input_CNN, model::CNNModel)
temp = model.cnnlayer1(input_CNN)
temp = model.cnnlayer2(temp)
temp = model.poollayer(temp)
temp = dropdims(temp; dims = (2, 4))
temp = model.dense(permutedims(temp, (2, 1)))
return vec(temp)
end
loss(data::Data, model) = Flux.logitcrossentropy( evalmodel(data.input, model), data.label )
loss(data::Vector{Data}, model) = Flux.mean(loss(d, model) for d in data)
# synthetic data: 50 input samples of shape (500, 2, 1, 1) and their labels (row sums of each sample)
X = [randn(500, 2, 1, 1) for i = 1:50]
Y = [vec( sum(X[i]; dims = 2)) for i = 1:50]
datavec = [ Data(X[i], Y[i]) for i in eachindex(X) ]
data_loader = Flux.Data.DataLoader(datavec, batchsize=20, shuffle=true);
cnnlayer1 = fmap(f64, Conv((128, 2), 1=>16, tanh; stride = 1, pad = (64, 1)))
cnnlayer2 = fmap(f64, Conv((16, 2), 16=>8, tanh; stride = 1, pad = (7, 0)))
cpooling = MeanPool((1, 2))
cdense = fmap(f64, Dense( 8=>1 ; bias = false))
model = CNNModel(cnnlayer1, cnnlayer2, cpooling, cdense)
# counter = [batches seen in the current pass, passes completed]; prints one line per full pass
function callback!(counter, Nbatches)
counter[1]+=1
if counter[1] > Nbatches
counter[2] += 1
counter[1] = 1
println("Episode $(counter[2]) done")
end
end
countervec = [0, 0]
Flux.train!(data->loss(data, model), getparams(model), ncycle(data_loader, 100), ADAM(); cb = ()->callback!(countervec, length(data_loader)))
# to clean up
# Main.GC.gc()
```
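In case it helps others reproduce the measurements, here is a small, hedged sketch of how the growth can be watched around the training call above. `Sys.maxrss` (peak resident set size of the process) and `Base.gc_live_bytes` (bytes the GC currently considers live) are existing Julia functions; the `report_mem` helper and the MiB rounding are just illustrative and not part of the original reproducer.

```julia
# Illustrative helper (not part of the original reproducer) for watching
# memory around the training loop. Sys.maxrss() is the process's peak
# resident set size; Base.gc_live_bytes() is what the GC considers live.
report_mem(tag) = println(tag, ": maxrss = ", Sys.maxrss() ÷ 2^20, " MiB",
                          ", gc_live = ", Base.gc_live_bytes() ÷ 2^20, " MiB")

report_mem("before training")
# ... run the Flux.train! call from the example above ...
report_mem("after training")
GC.gc()                      # manual collection, as mentioned in the thread
report_mem("after GC.gc()")
```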
Forgot to mention: the reduced example was tested on Windows 11.
This issue remains in 1.9.1, by the way, at least for the example I provided (tested under Linux as well).
Maybe similar to #49545.
I'm sorry for not being able to provide the code, but I am having a similar issue with my application, although it works fine with Julia 1.8.
This PR implements GC heuristics based on the number of pages allocated instead of live objects, as was done before. The heuristic for the new heap target is based on https://dl.acm.org/doi/10.1145/3563323 (in summary, it argues that the heap target should have square-root behaviour). From my testing this fixes #49545 and #49761.
Same fix, cherry picked from commit 32aa29f.
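For readers unfamiliar with what "square-root behaviour" means here, the following is a rough, purely illustrative sketch of such a heuristic in Julia. It is not the code from the PR (the real logic lives in the C runtime); the function name, argument names, and the tuning constant `k` are all made up for the example.

```julia
# Purely illustrative: a heap-target rule whose headroom grows with the
# square root of the heap, so total memory flattens out instead of
# growing in proportion to the live set. Not the actual gc.c code;
# `k` is an arbitrary tuning constant chosen for this example.
function next_heap_target(heap_bytes::Integer; k::Real = 8.0)
    headroom = k * sqrt(float(heap_bytes)) * 1024  # square-root-shaped growth term
    return heap_bytes + round(Int, headroom)
end

# With k = 8, a 64 MiB heap gets about 64 MiB of headroom, but a 1 GiB heap
# only about 256 MiB, so the relative overhead shrinks as the heap grows.
next_heap_target(64 * 2^20), next_heap_target(2^30)
```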
Happy to report that, at least for the Flux example I provided, the issue seems resolved with Julia v1.10.0-beta1.
The memory allocated by Julia keeps increasing in a for loop. How can this be solved, other than by calling `gc()` manually? The figure shows the memory allocated when running this Julia code.
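For anyone landing here before upgrading, here is a minimal sketch of the manual workaround discussed in this thread, i.e. forcing a collection periodically inside the allocating loop. The loop body and the every-100-iterations interval are placeholders, not a recommendation.

```julia
# Sketch of the manual workaround: call the garbage collector every so
# often inside the allocating loop. The interval is arbitrary here.
for i in 1:10_000
    # ... allocating work goes here ...
    if i % 100 == 0
        GC.gc()          # full collection; GC.gc(false) does a cheaper incremental pass
    end
end
```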