In references/detection/utils.py, line 212 reads: memory=torch.cuda.max_memory_allocated() / MB. This works fine in a GPU environment, but when I train a model with pytorch-cpu, an exception occurs: AssertionError: Torch not compiled with CUDA enabled. Tracing the error leads back to that same line in references/detection/utils.py: calling torch.cuda.max_memory_allocated() on a CPU-only build raises this error.
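A minimal sketch of a possible workaround: guard the CUDA-only call behind torch.cuda.is_available() so the memory logging degrades gracefully on CPU-only builds. The helper name max_gpu_memory_mb and the fallback value of 0.0 are assumptions for illustration, not the repository's actual fix; the try/except around the import is only so the sketch runs even where torch is absent.

```python
# Guard CUDA-only calls so logging also works on CPU-only installs.
# Assumption: torch may be missing or compiled without CUDA support.
try:
    import torch
    _cuda_available = torch.cuda.is_available()
except ImportError:
    torch = None
    _cuda_available = False

MB = 1024.0 * 1024.0  # bytes per megabyte, as in utils.py

def max_gpu_memory_mb():
    """Return peak CUDA memory in MB, or 0.0 on CPU-only builds."""
    if _cuda_available:
        # Safe: CUDA is compiled in and a device is visible.
        return torch.cuda.max_memory_allocated() / MB
    # CPU-only: report zero instead of raising AssertionError.
    return 0.0
```

In the logger, line 212 would then become memory=max_gpu_memory_mb() (a hypothetical replacement), so the same code path runs under both GPU and CPU environments.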