Eager mode #635
Conversation
Can you add a test config for the e2e test framework? https://github.com/llvm/torch-mlir/tree/main/python/torch_mlir_e2e_test/torchscript/configs It should be possible to pass all the tests if you implement the fallback correctly.
drive-by comment. will need a few passes here on the review.
mostly nits.
from torch_mlir_e2e_test.linalg_on_tensors_backends import refbackend

class TorchMLIRTensor(torch.Tensor):
Is there any upstream documentation describing the extension point you are using here (`_make_wrapper_subclass`, `__torch_dispatch__`, etc.)? It would be good to link it in if it exists.
I spoke with Brian Hirsch and he said the best documentation for how to use a wrapper subclass is https://github.com/albanD/subclass_zoo. I can add it as a comment.
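As a torch-free illustration of the pattern this wrapper subclass relies on (the real PR uses `torch.Tensor._make_wrapper_subclass` and `__torch_dispatch__`; the names `EagerWrapper`, `compile_and_run`, and `FALLBACK_OPS` below are hypothetical stand-ins), a minimal sketch of intercept-compile-fallback:

```python
FALLBACK_OPS = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}

def compile_and_run(op_name, a, b):
    # Stand-in for the torch-mlir compile-and-execute path: pretend
    # only 'add' is supported by the compiler backend.
    if op_name != "add":
        raise NotImplementedError(op_name)
    return a + b

class EagerWrapper:
    """Hypothetical analogue of TorchMLIRTensor: intercepts each op,
    tries the 'compiled' path, and falls back to plain Python."""

    def __init__(self, elem):
        self.elem = elem  # the wrapped value (a torch.Tensor in the real PR)

    def _dispatch(self, op_name, other):
        # Analogue of __torch_dispatch__: compile-and-run, else fall back.
        try:
            return EagerWrapper(compile_and_run(op_name, self.elem, other))
        except NotImplementedError:
            return EagerWrapper(FALLBACK_OPS[op_name](self.elem, other))

    def __add__(self, other):
        return self._dispatch("add", other.elem)

    def __mul__(self, other):
        return self._dispatch("mul", other.elem)

x, y = EagerWrapper(2.0), EagerWrapper(3.0)
print((x + y).elem)  # add goes through the 'compiled' path -> 5.0
print((x * y).elem)  # mul raises in compile_and_run, falls back -> 6.0
```

In the real subclass, `__torch_dispatch__` receives every ATen op centrally instead of per-operator methods like `__add__`; the sketch only shows the try-compile-else-fallback control flow.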
This PR implements an eager mode backend for PyTorch through the torch-mlir framework. This is accomplished by overriding the `__torch_dispatch__` class method on the wrapper subclass `TorchMLIRTensor(torch.Tensor)`. Effectively, this mode works by compiling op by op as the NN is eagerly executed by PyTorch. Entailed in that compilation is building a representation of the op that can be `torch.jit.script`ed, importing it using `ModuleBuilder`, and then executing it (e.g., with `RefBackendLinalgOnTensorsBackend`). This mode includes a fallback to conventional PyTorch if anything in the torch-mlir compilation process fails (e.g., an unsupported op). Currently, all e2e tests pass except for two that involve an upstream PyTorch bug (pytorch/pytorch#74400).

High priority next steps:

1. A compile cache in order to speed up reruns of the same NN.
2. Integration with IREE (though not in this repo).
3. Integration with `torch.distributed`.
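The compile cache mentioned in the next steps could be keyed on the op signature (op name plus input shapes/dtypes), so that reruns of the same NN skip recompilation. A rough torch-free sketch, under assumed names (`get_compiled`, `run_op`, `NAIVE_OPS` are all hypothetical; the actual cache design is not part of this PR):

```python
from functools import lru_cache

NAIVE_OPS = {"add": lambda a, b: [x + y for x, y in zip(a, b)]}

# Hypothetical cache keyed by op name and input shapes/dtypes: recompiling
# the same op signature on every eager invocation is the main rerun cost.
@lru_cache(maxsize=None)
def get_compiled(op_name, shapes, dtypes):
    print(f"compiling {op_name} for shapes {shapes}")  # once per signature
    return NAIVE_OPS[op_name]  # stand-in for the torch-mlir compiled artifact

def run_op(op_name, *args):
    # Shapes are reduced to list lengths here; real tensors would use
    # their full shape and dtype.
    shapes = tuple(len(a) for a in args)
    dtypes = tuple(type(a[0]).__name__ for a in args)
    return get_compiled(op_name, shapes, dtypes)(*args)

print(run_op("add", [1, 2], [3, 4]))  # first call compiles, then runs
print(run_op("add", [5, 6], [7, 8]))  # same signature: cache hit, no recompile
```

`lru_cache` is only a convenient stand-in; a production cache would likely also persist compiled artifacts across processes.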