I'm using the tulip-python 5.4.0 package to generate thousands of graphs and compute node metrics for each of them.
However, the process is too memory heavy (about 13 GB of RAM for ~10k graphs with <128 nodes each).
Here's the snippet of code I'm looping over:
```python
# Gen graph
g = tlp.importGraph(graphGen, genParams)

# Compute metrics
eccentricity = applyTlpAlgorithm_Double(g, "Eccentricity", property_name="res_eccentricity", params=algoParams)
closeness_centrality = applyTlpAlgorithm_Double(g, "Eccentricity", property_name="res_closenness_centrality", params=algoParams)
k_cores = applyTlpAlgorithm_Double(g, "K-Cores", property_name="res_kcores")

# Save metrics
for n in g.nodes():
    nodes_Features.append([
        eccentricity.getNodeValue(n),
        closeness_centrality.getNodeValue(n),
        k_cores.getNodeValue(n)
    ])

# Try to free some memory, still uses >13 GB RAM for 10k graphs
del g
gc.collect()
```
Is there a clean way to destroy a graph object in tulip-python so that its memory is freed? I.e., a Python equivalent of, or call to, the C++ graph deletion?
Thanks
Hello Loann,
Quick answer. First, in Python, do not use g.nodes() but g.getNodes(). I am not sure that g.nodes() is as clean in Python as it is in C++.
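A minimal sketch of the difference, assuming `g` is the graph from the snippet above (`do_something_with` is a hypothetical placeholder for the per-node work):

```python
# g.nodes() builds a full Python list of all nodes up front;
# g.getNodes() returns an iterator that yields nodes one at a time.
for n in g.getNodes():
    do_something_with(n)  # hypothetical per-node work
```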
Also, limit the scope of g to a function or a block to help the garbage collector; it seems that you do not need g any more after your for loop.
In Python, as in Java, you do not have to explicitly delete an object. Just keep its scope to a minimum (good practice in C++ too). If g is in the global scope of your script, the garbage collector cannot reclaim it because you could still use it (Python is not compiled, so a future use cannot be ruled out). If g only lives in the scope of a function, it will be reclaimed more easily when the function returns (as long as nothing keeps a reference to g, such as a = g followed by return a; Python variables are only references).
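For instance, here is a minimal sketch of both suggestions combined, reusing the names from your snippet (`graphGen`, `genParams`, `algoParams`, `nodes_Features` and the `applyTlpAlgorithm_Double` helper are assumed to be defined as in the question):

```python
import gc
from tulip import tlp

def gen_features(graphGen, genParams, algoParams):
    # g only exists inside this function, so no outer reference keeps it alive.
    g = tlp.importGraph(graphGen, genParams)
    eccentricity = applyTlpAlgorithm_Double(g, "Eccentricity",
                                            property_name="res_eccentricity",
                                            params=algoParams)
    k_cores = applyTlpAlgorithm_Double(g, "K-Cores", property_name="res_kcores")

    # Copy the metric values into plain Python lists so that what is returned
    # holds no reference to the graph or to its properties.
    features = []
    for n in g.getNodes():
        features.append([eccentricity.getNodeValue(n), k_cores.getNodeValue(n)])
    return features

nodes_Features = []
for _ in range(10000):
    nodes_Features.extend(gen_features(graphGen, genParams, algoParams))
gc.collect()  # explicit collection; whether the underlying C++ graph is actually freed is the open question here
```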
g is created in gen_features, is never referenced elsewhere, is never assigned to another variable, and is released before the end of the method (g = None). Still, the garbage collector does not seem to free the memory.