Enhance decorator _use_grad_for_differentiable (pytorch#103567)
Aim: enhance the decorator _use_grad_for_differentiable so that functions (and methods) decorated by it keep their docstrings and signatures unchanged.

Fixes pytorch#103566

Pull Request resolved: pytorch#103567
Approved by: https://github.com/janeyx99
wenh06 authored and pytorchmergebot committed Jun 16, 2023
1 parent 5875a2f commit 67babf7
Showing 1 changed file with 1 addition and 0 deletions.
1 change: 1 addition & 0 deletions torch/optim/optimizer.py
@@ -38,6 +38,7 @@ def _use_grad(self, *args, **kwargs):
         finally:
             torch.set_grad_enabled(prev_grad)
         return ret
+    functools.update_wrapper(_use_grad, func)
     return _use_grad
 
 def _get_value(x):
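For illustration, here is a minimal, self-contained sketch of the pattern the added line enables. It is not the exact PyTorch source: the names _use_grad_sketch, MyOptimizer, and step are invented for this example, and the wrapper body is simplified. functools.update_wrapper copies the wrapped function's metadata (__doc__, __name__, __module__, etc.) onto the wrapper and records the original under __wrapped__, so help() and inspect.signature report the decorated method's own docstring and signature instead of the generic wrapper's.

import functools
import inspect

import torch


def _use_grad_sketch(func):
    # Simplified stand-in for _use_grad_for_differentiable in
    # torch/optim/optimizer.py; not the exact upstream implementation.
    def _use_grad(self, *args, **kwargs):
        prev_grad = torch.is_grad_enabled()
        try:
            torch.set_grad_enabled(self.defaults["differentiable"])
            ret = func(self, *args, **kwargs)
        finally:
            torch.set_grad_enabled(prev_grad)
        return ret
    # The line this commit adds: copy func's metadata onto the wrapper.
    functools.update_wrapper(_use_grad, func)
    return _use_grad


class MyOptimizer:  # invented example class, not a real torch.optim optimizer
    def __init__(self):
        self.defaults = {"differentiable": False}

    @_use_grad_sketch
    def step(self):
        """Perform a single (no-op) optimization step."""


print(MyOptimizer.step.__name__)            # "step" rather than "_use_grad"
print(MyOptimizer.step.__doc__)             # original docstring is preserved
print(inspect.signature(MyOptimizer.step))  # "(self)", recovered via __wrapped__

Without the added functools.update_wrapper call, the three prints above would instead show _use_grad, None (the wrapper has no docstring), and (self, *args, **kwargs).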
