
Memory limit of 3.5 GiB on 965GiB RAM device #180

Closed
chourroutm opened this issue Oct 15, 2024 · 2 comments

@chourroutm

Hi,
Following up on seung-lab/cloud-volume#635, I've switched to a Linux workstation with 1 TiB of RAM, but I still can't compute downsampled versions of a dataset: task creation warns that the 3.5 GiB memory limit is too low. Here is my script:

from pathlib import Path

from taskqueue import LocalTaskQueue
import igneous.task_creation as tc

# Directory holding the precomputed volume
output_dir = Path("output/mask.precomputed/")
output_dir.mkdir(parents=True, exist_ok=True)

# Igneous expects a cloudpath, so convert the directory to a file:// URI
output_dir = output_dir.absolute().as_uri() + "/"
print(output_dir)

tq = LocalTaskQueue(parallel=True)
tasks = tc.create_downsampling_tasks(
    output_dir,
    mip=0,
    num_mips=3,
    factor=(2, 2, 2),
    fill_missing=True,
    delete_black_uploads=True,
)
tq.insert(tasks)
tq.execute()
print("Done!")
This is the output:

file:///home/me/convert_with_cloudvolume/output/mask.precomputed/
WARNING: Memory limit (3500000000 bytes) too low to compute 3 mips at a time. 2 mips possible.
Volume Bounds:  Bbox([0, 0, 0],[1169, 1169, 1345], dtype=np.int32, unit='vx')
Selected ROI:   Bbox([0, 0, 0],[1169, 1169, 1345], dtype=np.int32, unit='vx')

This is the output of free -h:

               total        used        free      shared  buff/cache   available
Mem:           1.0Ti        26Gi       965Gi       1.0Gi        15Gi       974Gi
Swap:          476Gi          0B       476Gi

Have I made a mistake on the line tq = LocalTaskQueue(parallel=True)?

@william-silversmith
Contributor

Hi! This will depend on your chunk size and data type. I suspect something is set a bit oddly, as usually 3.5 GB is enough for even very large data sets.

However, you can make individual tasks larger by setting memory_target=int(300e9) (300 GB, for example).
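
As a minimal sketch, assuming memory_target is passed straight to create_downsampling_tasks as suggested above (the other arguments are simply the ones from the original snippet, and 300e9 is just an illustrative value for a 1 TiB machine):

# Raise the per-task memory budget so all 3 mips can be computed in one pass.
# memory_target is in bytes; int(300e9) is roughly 300 GB (illustrative value).
tasks = tc.create_downsampling_tasks(
    output_dir,
    mip=0,
    num_mips=3,
    factor=(2, 2, 2),
    fill_missing=True,
    delete_black_uploads=True,
    memory_target=int(300e9),
)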

@chourroutm
Author

Thanks, that solved the issue!
