[BUG] can't work with cpu #1022

Closed
LXYTSOS opened this issue Jun 14, 2019 · 2 comments

Comments

@LXYTSOS
Contributor

LXYTSOS commented Jun 14, 2019

In references/detection/utils.py line 212: memory=torch.cuda.max_memory_allocated() / MB)). Training works fine in a GPU environment, but when I train a model with pytorch-cpu an exception occurs: AssertionError: Torch not compiled with CUDA enabled. Tracing the error leads back to that same line in references/detection/utils.py: calling torch.cuda.max_memory_allocated() on a CPU-only build of PyTorch raises this error.
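For context, a minimal sketch of the kind of guard that avoids the assertion on CPU-only builds. The print call here is a simplified stand-in for the actual metric logging in utils.py, and this is not necessarily the exact change made in the fix:

```python
import torch

MB = 1024.0 * 1024.0  # bytes per megabyte, matching the constant used in utils.py

# Only query CUDA memory statistics when CUDA is actually available,
# so a CPU-only build of PyTorch never hits the assertion.
if torch.cuda.is_available():
    print("max mem: {:.0f} MB".format(torch.cuda.max_memory_allocated() / MB))
else:
    print("max mem: n/a (CUDA not available)")
```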

@pmeier
Collaborator

pmeier commented Jun 15, 2019

Fixed by @LXYTSOS in #1023.

@fmassa
Member

fmassa commented Jun 15, 2019

Thanks for the fix, @LXYTSOS!
