Memory leak #83
@bodono, thanks for finding this. But I'm not entirely sure how to go about looking for the problem in a garbage-collected language...
Module-level variables could do it.
I need to run several thousand small linear programs in an iterative algorithm. The memory quickly exploded (with various solvers including Clp, ECOS, GLPK, etc.). Currently I run for a few hundred iterations, save the results to a file, exit Julia, then start a new session, load the saved results, and continue running.
@pluskid, any reason you haven't tried JuMP?
@mlubin Thanks! No particular reason. I just arrived at juliaopt.org, found JuMP.jl and Convex.jl. I picked the second one on the first impression that Convex.jl is more dedicated to convex problems, and the stereotype that dedicated packages typically run faster. I will try JuMP later. :)
The global variables are indeed the problem. I submitted a pull request (#123) with a temporary workaround. With this merged you have to call `Convex.clearmemory()`.
David, thanks for the PR! And for confirming that the global variables are at fault. Any ideas on the best permanent solution? The global cache of variables is used to avoid extra work when variables and expressions are reused, even on different problems. The problem is that even when all other pointers to the variable or expression have evaporated, the pointer in this global cache remains, preventing garbage collection. Is there any way to check whether the only pointer to the Julia object (variable or expression) is coming from this dictionary, at which point we can delete it from the dictionary as well to pave the way for garbage collection?
Try using a WeakKeyDict: https://github.com/JuliaLang/julia/blob/464904e17256e0e3fdb9f50d0d401abf3d8547d1/base/dict.jl#L823. I've confirmed with the Julia overlords that it's kosher even though it's undocumented. |
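The weak-reference idea can be sketched as follows. This is Python rather than Julia, only because the mechanism is language-agnostic, and `Expr` is a hypothetical stand-in for a cached Convex.jl variable or expression: an ordinary dict used as a global cache pins its values forever, while a weak mapping lets an entry disappear once nothing else references the value.

```python
import gc
import weakref

class Expr:
    """Hypothetical stand-in for a cached Convex variable/expression."""

# An ordinary dict, like the global cache described above, pins its values forever.
strong_cache = {}
strong_cache["x1"] = Expr()

# A weak-value mapping drops an entry once nothing else references the value.
weak_cache = weakref.WeakValueDictionary()
weak_cache["x2"] = Expr()  # no other strong reference to this object

gc.collect()

print(len(strong_cache))  # 1: still pinned by the cache itself
print(len(weak_cache))    # 0: the entry vanished with its value
```

The design point is the same one made above: the cache itself should not count as a reference that keeps objects alive.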
Cool, trying it now. David, I'm surprised you needed to include
Trying:
I see an error. Anyone know why `UInt64`s (which we use as Variable and Constraint ids) can't be finalized? Is there another kind of key we could use that could be finalized, or can I define a finalizer? Full error:
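The restriction hit here mirrors one found in other garbage-collected languages: weak references and finalizers attach to mutable heap-allocated objects, not to plain immutable values like a `UInt64`. A Python sketch of the same distinction (`Box` is a hypothetical class, used only for illustration):

```python
import weakref

class Box:
    """A mutable heap object: weak references (and finalizers) work on it."""

b = Box()
ref = weakref.ref(b)      # fine: Box instances support weak references

int_refused = False
try:
    weakref.ref(42)       # plain integers are values, not referenceable objects
except TypeError:
    int_refused = True

print(ref() is b)         # True
print(int_refused)        # True
```

This suggests why a weak dictionary keyed on integer ids fails: the keys themselves can never be weakly held.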
Oh, I think the
Is the problem resolved? The memory usage keeps exploding for the following piece of script:

```julia
# Array to store Result
end
```
Can you include the values of

By the way, there's another issue talking about memory usage here (with likely a different underlying cause): #254; maybe that's related? I ask because I think
I'm seeing a similar problem of steadily growing memory usage when repeatedly optimizing in a loop, even when I include

I'll see if I can post a minimal example to try, but the full program I'm running that encountered this grew over the course of 8 to 9 hours. An interesting part is that the solver also slows down as the memory grows, even before crashing; not sure if that can give a hint as to where it's happening. If anyone can suggest how I might debug this, I'm open to it (I tried profiling once, but it said the profile stack kept filling up).
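One generic way to debug this kind of steady growth is to diff heap snapshots taken between iterations, so the allocation site with the largest positive difference points at the retained memory. A minimal sketch in Python using the standard-library `tracemalloc` (the growing `retained` list is a simulated leak standing in for whatever the real program retains):

```python
import tracemalloc

tracemalloc.start()

retained = []  # simulated leak: a reference that is never dropped
before = tracemalloc.take_snapshot()
for _ in range(1000):
    retained.append(bytearray(1024))
after = tracemalloc.take_snapshot()

# The allocation site with the largest growth between snapshots is the suspect.
stats = after.compare_to(before, "lineno")
print(stats[0].size_diff > 500_000)  # True: the bytearray line dominates
```

In Julia the analogous move, used later in this thread, is to repeatedly measure `Base.summarysize` of suspect globals between solves.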
Thanks for the report. Convex has two global caches, which are dictionaries. Here is what `clearmemory()` does:

```julia
function clearmemory()
    global id_to_variables
    empty!(id_to_variables)
    global conic_constr_to_constr
    empty!(conic_constr_to_constr)
    GC.gc()
end
```

I think the problem is that dictionaries don't actually free all their memory when they are emptied; their values are just cleared. You can watch this with `Base.summarysize(Convex.id_to_variables)` and `Base.summarysize(Convex.conic_constr_to_constr)`. I'm not sure how to actually clear the memory of the dictionaries. What do you think?

Edit: sorry @mouryarahul, I think this must be what you were seeing too.
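The measurement pattern here (size the container, empty it, size it again) is easy to check in any language. A minimal sketch in Python; note this shows CPython's behavior, where `dict.clear()` does release the backing table, and whether Julia's `empty!` behaves the same way is exactly the open question above:

```python
import sys

cache = {}
for k in range(100_000):
    cache[k] = k  # populate the cache

populated = sys.getsizeof(cache)  # size of the hash table itself
cache.clear()
emptied = sys.getsizeof(cache)

print(populated > emptied)  # True in CPython: clear() releases the table
```

`sys.getsizeof` measures only the container, not the values; `Base.summarysize` in the Julia checks above recursively includes the cached values as well.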
I guess here's an example where, if you watch htop/top for the mem% column of julia, you see it slowly creep up even with the clearing/`GC.gc()`.
This was also happening on Julia 1.3 and Convex v0.12.3.
If I modify your code to

```julia
using Convex
using ECOS

function testmem()
    x = Variable(10)
    A = rand(100, 10)
    z = randn(1000, 1000, 100)
    for i = 1:1000, j = 1:1000
        b = z[i, j, :]
        problem = minimize(norm(A*x - b, 1))
        solve!(problem, ECOSSolver(verbose=0))
        if (i*1000 + j) % 1000 == 0
            @info "Before clearing" i j Base.summarysize(Convex.id_to_variables)/1e6 Base.summarysize(Convex.conic_constr_to_constr)/1e6
            Convex.clearmemory()
            GC.gc()
            x = Variable(10)
            @info "After clearing" i j Base.summarysize(Convex.id_to_variables)/1e6 Base.summarysize(Convex.conic_constr_to_constr)/1e6
        end
    end
end

testmem()
```

then it seems like clearing is quite effective in terms of reducing the memory use of those two globals: even though some memory is still used afterwards, the total memory used by those two globals stays under 60 MB (growing until about that point, then resetting to close to zero). So I think you're actually seeing a different problem. I think it may actually be a problem with ECOS; if I switch to SCS, then the total memory used by the script doesn't seem to be growing. Can you try SCS? If you want to try with the old behavior, do
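The clear-every-N-iterations pattern in the loop above is generic. A minimal Python sketch, where `solve_once` is a hypothetical solver call that leaks an entry into a global cache on every invocation:

```python
import gc

cache = {}  # stand-in for a library's global cache

def solve_once(i):
    """Hypothetical solve that leaks one entry into the global cache."""
    cache[i] = [0.0] * 1000
    return 0.0

peak = 0
for i in range(10_000):
    solve_once(i)
    if (i + 1) % 1000 == 0:        # every 1000 iterations...
        peak = max(peak, len(cache))
        cache.clear()              # ...drop the cached state
        gc.collect()               # ...and force a collection

print(peak)  # 1000: the cache never grows past one clearing interval
```

This bounds the cache at one clearing interval's worth of entries, which matches the sawtooth (grow, then reset near zero) observed in the Julia run above.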
Closed by #322 |
Both ECOS and SCS exhibit memory leaking and crashing when running the script here: jump-dev/SCS.jl#24. Creating a problem by hand and calling SCS.jl in a for-loop does not exhibit the same memory leakage (although it might be in a code path I'm not testing), so my best guess is that it's in Convex.jl somewhere (I know Julia is garbage collected, but memory leaks can still occur).