
Use inference_mode instead of no_grad #2804

Merged · 1 commit into pytorch:master · Jan 18, 2024
Conversation

bryant1410 (Contributor)

Use inference_mode instead of no_grad for BaseHandler. It should bring a (small?) performance boost.

Though, I have 2 questions:

  • What's the min supported PyTorch version? Because inference_mode is supported since PT 1.9.
  • Should other handlers (including example ones) be changed as well?
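The swap described above can be sketched as follows. This is a minimal, hypothetical handler-style function, not TorchServe's actual `BaseHandler` code; it only illustrates that `torch.inference_mode()` is a drop-in replacement for `torch.no_grad()` in pure-inference paths, with the extra benefit of skipping version-counter and view tracking:

```python
import torch

def run_inference(model, batch):
    # inference_mode() disables autograd like no_grad(), and additionally
    # skips view/version-counter bookkeeping, so it can be slightly faster.
    with torch.inference_mode():
        return model(batch)

model = torch.nn.Linear(4, 2)
out = run_inference(model, torch.randn(3, 4))

# Tensors created under inference_mode are flagged as inference tensors
# and carry no grad history.
assert torch.is_inference(out)
assert not out.requires_grad
```

One caveat of the switch: inference tensors cannot later be used in autograd-recording code, whereas `no_grad()` outputs can, so the swap is only safe on paths that never feed back into training.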

@msaroufim (Member)

A bit conflicted about this one since inference mode has a bad interaction with torch.compile

@bryant1410 (Contributor, Author) commented Nov 28, 2023

> A bit conflicted about this one since inference mode has a bad interaction with torch.compile

Yeah, I've seen that. I think it's only for 2.0 but not 2.1: pytorch/pytorch#103132. It works fine for me for 2.1 (but not with 2.0).

When compile is being used, we could enable inference_mode only for PT > 2.0 and fall back to no_grad otherwise. What do you think?
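The fallback proposed above could look something like this. The helper below is a hypothetical illustration (not TorchServe code): it picks `no_grad` when `torch.compile` is in play on PyTorch <= 2.0 (where the interaction was reported broken) or when the runtime predates `inference_mode` (added in PT 1.9, per the question above), and `inference_mode` otherwise. Version parsing is deliberately simplistic:

```python
def parse_version(version_string):
    # "2.1.0+cu118" -> (2, 1); ignores local/build suffixes for brevity
    main = version_string.split("+")[0]
    parts = main.split(".")
    return int(parts[0]), int(parts[1])

def grad_context_name(torch_version, using_compile):
    major_minor = parse_version(torch_version)
    if major_minor < (1, 9):
        # inference_mode does not exist before PT 1.9
        return "no_grad"
    if using_compile and major_minor <= (2, 0):
        # reported bad interaction between inference_mode and torch.compile
        # on PT 2.0 (pytorch/pytorch#103132); fixed in 2.1
        return "no_grad"
    return "inference_mode"
```

In a handler one would then pick the actual context manager, e.g. `ctx = torch.inference_mode() if grad_context_name(torch.__version__, compiled) == "inference_mode" else torch.no_grad()`.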

@lxning lxning added this pull request to the merge queue Jan 18, 2024
Merged via the queue into pytorch:master with commit 2026cb1 Jan 18, 2024
13 checks passed
@bryant1410 bryant1410 deleted the patch-2 branch January 18, 2024 11:13
3 participants