Supporting the Core ATen Opset #5764
Comments
Would working on lowering some of these core ATen ops into PyTorch/XLA be a good first issue for a new contributor? If so, I can work on some, perhaps starting with…
Absolutely! You can take a look at https://github.com/pytorch/xla/blob/master/OP_LOWERING_GUIDE.md and https://github.com/pytorch/xla/blob/master/CODEGEN_MIGRATION_GUIDE.md for background on how to do a lowering. You can also check recently merged lowering PRs for reference.
Awesome, I'll take a look, thanks!
cc @qihqi
Hi, I created a unit test covering the ops that fail conversion to StableHLO: https://github.com/pytorch/xla/pull/5796/files. To add support for an op, the process would be roughly: find the failing op in that test, implement its lowering in PyTorch/XLA (see the guides linked above), and re-run the test to confirm the conversion succeeds.
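A minimal sketch of the kind of per-op conversion check that unit test performs, assuming the `torch_xla.stablehlo` export API (the wrapped op, `aten.angle`, is a hypothetical example; names may vary across torch_xla versions):

```python
# Sketch (not from the thread): check whether a single ATen op
# survives conversion to StableHLO.
import torch
import torch_xla.stablehlo as stablehlo


class AngleModule(torch.nn.Module):
    """Wraps a single core ATen op (aten.angle, as an example) for export."""

    def forward(self, x):
        return torch.angle(x)


args = (torch.randn(4, 4),)
# torch.export traces the module into an ExportedProgram of ATen ops.
exported = torch.export.export(AngleModule(), args)

try:
    # Conversion fails if the op has no PyTorch/XLA lowering.
    shlo = stablehlo.exported_program_to_stablehlo(exported)
    print(shlo.get_stablehlo_text())
except Exception as e:
    print(f"StableHLO conversion failed: {e}")
```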
Marking this as complete; see https://github.com/pytorch/xla/projects?type=classic&query=Core+Aten+Opset.
🚀 Feature
Supporting the PyTorch core ATen opset in its entirety requires lowering each core ATen op in PyTorch/XLA. As of today, most of these ops are already lowered; this issue tracks our support for the remaining ones.
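As a practical aside, not stated in the issue: ops that lack an XLA lowering fall back to CPU at runtime and appear as `aten::` counters in the torch_xla metrics report, which is one way to spot them. A minimal sketch, assuming the `torch_xla.debug.metrics` and `xla_model` APIs:

```python
# Sketch (assumption, not from the issue): detect CPU fallbacks for
# unlowered ops via the torch_xla metrics counters.
import torch
import torch_xla.core.xla_model as xm
import torch_xla.debug.metrics as met

device = xm.xla_device()
x = torch.randn(4, 4, device=device)
y = x.tanh()    # a lowered op compiles to XLA and leaves no aten:: counter
xm.mark_step()  # force execution of the pending graph

fallbacks = [name for name in met.counter_names() if name.startswith("aten::")]
print("ops that fell back to CPU:", fallbacks or "none")
```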
Unlowered core ATen ops