Make XLAShardedTensor use default __torch_function__ #6625
Conversation
Thanks for the quick PR!
Me neither. But from a quick look at what you do, it shouldn't change anything here...
Hmm, Dynamo started to complain after adding this.
Oh, I see the same issue happening on Int16Tensor in the other PR. Let me look into that.
Well, this seems to fix it, going by this basic experiment, haha: pytorch/pytorch#120799
Will investigate the Dynamo failure as well, but this should make this PR green!
@@ -108,6 +108,9 @@ def __new__(cls, elem: torch.Tensor, *args, **kwargs):
         r.global_tensor = elem.detach() if r.requires_grad else elem
         return r
 
+    def __torch_function__(cls, func, types, args=(), kwargs=None):
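The hunk above truncates the body of the new method. Going by the PR title, it delegates to `torch.Tensor`'s default `__torch_function__`; a minimal sketch of that pattern follows (the elided class internals and the exact body are assumptions based on the title, and the `@classmethod` decorator anticipates the review suggestion below):

```python
import torch

class XLAShardedTensor(torch.Tensor):
    # ... __new__, global_tensor, and the rest of the class elided ...

    @classmethod
    def __torch_function__(cls, func, types, args=(), kwargs=None):
        # Fall through to torch.Tensor's default implementation, which
        # runs `func` with subclass dispatch disabled and re-wraps the
        # result in the subclass type.
        return super().__torch_function__(func, types, args, kwargs)
```

Delegating this way keeps default subclass behavior (results stay `XLAShardedTensor`) without any per-op handling.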
Suggested change:
-    def __torch_function__(cls, func, types, args=(), kwargs=None):
+    @classmethod
+    def __torch_function__(cls, func, types, args=(), kwargs=None):
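On the decorator itself: PyTorch expects `__torch_function__` to be defined as a classmethod, and newer releases warn when it is not, which is presumably what the suggestion addresses. A quick self-contained way to see dispatch route through such an override (`LoggingTensor` is a hypothetical example, not from this PR):

```python
import torch

class LoggingTensor(torch.Tensor):
    @classmethod
    def __torch_function__(cls, func, types, args=(), kwargs=None):
        # With @classmethod, `cls` is the subclass here; without the
        # decorator, the override machinery binds the method to the
        # instance instead, which recent PyTorch versions warn about.
        print(f"dispatching {getattr(func, '__name__', func)}")
        return super().__torch_function__(func, types, args, kwargs)

t = torch.ones(2).as_subclass(LoggingTensor)
out = t + 1          # routes through __torch_function__ ("dispatching add")
print(type(out))     # <class '__main__.LoggingTensor'>
```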
OK, let me update and give it another try.
@albanD ok great, seems like this will fix the CI failure. Let me remove the torch pin and merge it to unblock you.
@yeounoh can you take a look at this PR and merge it to unblock Alban?
LGTM, thanks @albanD
Companion PR for pytorch/pytorch#120632 (comment), though I have to admit I don't fully understand why the XLA tests would fail without this...