
Add dynamo expand test. #6659

Merged · 3 commits merged into master on Apr 12, 2024
Conversation

@ysiraichi (Collaborator) commented on Mar 1, 2024:

This PR adds a test for #5837. The fix was introduced in the PyTorch main repository (pytorch/pytorch#121007), but we need a PyTorch/XLA test to actually exercise the (previously) failing case.

cc @miladm @JackCaoG

@@ -568,6 +568,24 @@ def test_all_cpu_tensor(self):
     self.assertIn('MarkStep', met.counter_names())


+class DynamoOperationsTests(test_utils.XlaTestCase):
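The diff above only shows the new test class declaration, not its body. As a hedged illustration of what a dynamo expand test of this kind might check (this is not the actual test from the PR; `expand_fn` is a hypothetical name, and `backend="eager"` is used here so the sketch runs without the XLA or inductor toolchains the real test presumably uses):

```python
import torch

def expand_fn(x):
    # Tensor.expand returns a broadcasted view (stride 0 on the new dim),
    # the operation that previously failed under dynamo per #5837.
    return x.expand(4, 3) * 2

x = torch.ones(1, 3)
eager_out = expand_fn(x)
# backend="eager" exercises dynamo's tracing while executing ops eagerly;
# the compiled result should match plain eager execution.
dynamo_out = torch.compile(expand_fn, backend="eager")(x)
assert torch.equal(eager_out, dynamo_out)
```

The real test would run the compiled function on an XLA device and compare against CPU eager output, but the pass/fail criterion is the same: the expanded tensor must match.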
A collaborator commented on this diff:

Can we add a flag to skip this test while we address the corresponding bug, which you will soon open, on "expand failing on dynamo"?

@alanwaketan (Collaborator) replied:

You can add a .torch_pin file to build with your patch PR, following the instructions here: https://github.com/pytorch/xla/tree/master/torch_patches

That way we don't need to add any guards; we just need to land the two patches together.
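A minimal sketch of the .torch_pin workflow being suggested, assuming the convention from the torch_patches instructions that the file holds a PyTorch PR reference which CI then builds against (the PR number here is the upstream fix mentioned in this thread):

```shell
# Pin CI to the upstream PyTorch fix (pytorch/pytorch#121007) so both
# patches can be tested together before landing.
echo '#121007' > .torch_pin
cat .torch_pin
```

As noted later in the thread, the .torch_pin file must be removed before the PR is actually merged, since it is only a temporary CI pin.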

@alanwaketan (Collaborator) commented:

Great, it looks like all tests pass! Please ping me once your upstream PR is approved. Also, remember that we need to remove the .torch_pin before landing this PR.

@alanwaketan (Collaborator) left a review:

LGTM.

@ysiraichi ysiraichi merged commit 58a412c into master Apr 12, 2024
21 checks passed
3 participants