Hi,

Following on from seung-lab/cloud-volume#635, I've now switched to a Linux workstation with 1 TiB of RAM, but I still can't compute downscaled versions of a dataset; the run fails with this "> 3.5GiB" error:

This is the output of `free -h`:

Have I perhaps made a mistake on the line `tq = LocalTaskQueue(parallel=True)`?
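Judging from that line, the failing run presumably follows the standard igneous downsampling recipe, roughly as sketched below; the layer path and the `create_downsampling_tasks` arguments here are assumptions, not details from this report.

```python
from taskqueue import LocalTaskQueue
import igneous.task_creation as tc

# Queue that executes tasks locally; parallel=True uses all available cores.
tq = LocalTaskQueue(parallel=True)

# Build one downsampling task per region of the source layer.
# "file:///data/my_volume" is a hypothetical precomputed layer path.
tasks = tc.create_downsampling_tasks("file:///data/my_volume", mip=0)

tq.insert(tasks)   # enqueue the generated tasks
tq.execute()       # run them to completion
```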
Hi! This will depend on your chunk size and data type. I suspect something is set a bit oddly, as 3.5 GB is usually enough even for very large datasets.

However, you can make individual tasks larger by setting `memory_target=int(300e9)` (i.e., 300 GB).
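Assuming the tasks are generated with igneous's `create_downsampling_tasks`, that setting would be passed there, roughly like this (the layer path is again hypothetical):

```python
import igneous.task_creation as tc

# memory_target caps the estimated memory use of a single task; the
# ~3.5 GB figure in the error above appears to be its default. Raising
# it lets igneous create larger tasks covering more data at once.
tasks = tc.create_downsampling_tasks(
    "file:///data/my_volume",      # hypothetical precomputed layer path
    mip=0,                         # downsample starting from mip level 0
    memory_target=int(300e9),      # allow ~300 GB per task, as suggested
)
```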