Traceback (most recent call last):
File "//./pytorchexample.py", line 112, in <module>
train(model, DEVICE, train_loader, optimizer, epoch)
File "//./pytorchexample.py", line 49, in train
for batch_idx, (data, target) in enumerate(train_loader):
File "/usr/local/lib/python3.9/dist-packages/torch/utils/data/dataloader.py", line 368, in __iter__
return self._get_iterator()
File "/usr/local/lib/python3.9/dist-packages/torch/utils/data/dataloader.py", line 314, in _get_iterator
return _MultiProcessingDataLoaderIter(self)
File "/usr/local/lib/python3.9/dist-packages/torch/utils/data/dataloader.py", line 900, in __init__
self._worker_result_queue = multiprocessing_context.Queue() # type: ignore[var-annotated]
File "/usr/lib/python3.9/multiprocessing/context.py", line 103, in Queue
return Queue(maxsize, ctx=self.get_context())
File "/usr/lib/python3.9/multiprocessing/queues.py", line 43, in __init__
self._rlock = ctx.Lock()
File "/usr/lib/python3.9/multiprocessing/context.py", line 68, in Lock
return Lock(ctx=self.get_context())
File "/usr/lib/python3.9/multiprocessing/synchronize.py", line 162, in __init__
SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx)
File "/usr/lib/python3.9/multiprocessing/synchronize.py", line 57, in __init__
sl = self._semlock = _multiprocessing.SemLock(
FileNotFoundError: [Errno 2] No such file or directory
Unfortunately, this is a known issue -- Gramine doesn't support Python's multiprocessing package. This is because Gramine currently doesn't support Sys-V semaphores, which the multiprocessing package requires.
UPDATE (24 March 2023): Python's multiprocessing package actually uses POSIX semaphores (and thus shared memory), not Sys-V semaphores. See:
That's unfortunate, because implementing POSIX semaphores in Gramine/SGX would require allowing untrusted shared memory (/dev/shm), which will probably never happen...
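For completeness, one way to avoid the failure is to keep data loading in the main process -- this is a minimal sketch, not from the thread, and the toy dataset below is a stand-in. With num_workers=0 the DataLoader uses its single-process iterator, so no multiprocessing queue or semaphore is ever created; prefetch_factor must then be omitted, because PyTorch only accepts it together with num_workers > 0.

import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-in for the MNIST dataset used by pytorchexample.py.
train_dataset = TensorDataset(torch.randn(256, 1, 28, 28),
                              torch.randint(0, 10, (256,)))

# num_workers=0 selects the single-process iterator: no worker processes,
# no multiprocessing.Queue, no SemLock -- nothing for Gramine to reject.
# prefetch_factor is deliberately omitted; PyTorch raises a ValueError if
# it is passed while num_workers == 0.
train_loader = DataLoader(train_dataset, batch_size=64, shuffle=True,
                          num_workers=0)

for batch_idx, (data, target) in enumerate(train_loader):
    pass  # training step goes here

The trade-off is that batches are prepared serially with the training loop, but it sidesteps the missing semaphore support entirely.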
I successfully ran the PyTorch example. But when I use the DataLoader prefetch_factor option in the example, along the lines of the sketch below, I get the error shown in the traceback at the top:
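A hypothetical reconstruction of that change (the batch size, worker count, and prefetch_factor values here are assumptions, not the reporter's exact snippet):

# Hypothetical reconstruction; parameter values are assumed. train_dataset
# is the dataset from pytorchexample.py (or the stand-in defined in the
# sketch above).
train_loader = torch.utils.data.DataLoader(
    train_dataset,
    batch_size=64,
    shuffle=True,
    num_workers=2,       # prefetch_factor requires num_workers > 0 ...
    prefetch_factor=4,   # ... which forces the multiprocessing iterator
)

# Iterating this loader constructs _MultiProcessingDataLoaderIter, whose
# worker result queue needs a multiprocessing SemLock -- the call that
# raises FileNotFoundError under Gramine, as in the traceback above.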