
[relay] preserve the order of input_info of pytorch #14462

Merged · 4 commits · Apr 6, 2023

Conversation

sweetcocoa
Contributor

This is for resolving #14461

The direct cause of this problem is that the order of the input nodes found by _analysis.free_vars(ret) is not guaranteed to match the order of the names in input_info. From what I understand, _analysis.free_vars(ret) uses DFS to find input nodes starting from the output nodes. Since DFS cannot guarantee the order in which nodes are discovered, the straightforward fix is to reorder the input nodes after the DFS, as this PR does.
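The reordering idea described above can be sketched as follows. This is an illustrative sketch, not the exact code from the PR; the helper name `reorder_free_vars` and the `FakeVar` stand-in for `relay.Var` are hypothetical, and only the `name_hint` attribute is assumed to match Relay's API.

```python
def reorder_free_vars(free_vars, input_names):
    """Put the vars named in input_names first, in that order; any
    remaining free vars (e.g. converted model weights) keep their
    original DFS discovery order."""
    by_name = {v.name_hint: v for v in free_vars}
    ordered = [by_name[name] for name in input_names if name in by_name]
    rest = [v for v in free_vars if v.name_hint not in set(input_names)]
    return ordered + rest


# Stand-in for relay.Var, for demonstration only:
class FakeVar:
    def __init__(self, name_hint):
        self.name_hint = name_hint


# free_vars may discover user inputs after parameters, in DFS order:
dfs_order = [FakeVar("bias"), FakeVar("input0"), FakeVar("weight")]
fixed = reorder_free_vars(dfs_order, ["input0"])
print([v.name_hint for v in fixed])  # -> ['input0', 'bias', 'weight']
```

The key point is that user-declared inputs are pinned to the order given in input_info, while the remaining free vars are left in whatever order the analysis produced them.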

@tvm-bot
Collaborator

tvm-bot commented Apr 3, 2023

Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment.

Generated by tvm-bot

@sweetcocoa sweetcocoa changed the title preserve the order of input_info [relay] preserve the order of input_info of pytorch Apr 3, 2023
Member

@masahi masahi left a comment


LGTM, please fix the lint errors.

@sweetcocoa
Contributor Author

I fixed the lint errors. Please re-review.

@masahi
Member

masahi commented Apr 4, 2023

Some tests are broken, please fix them.

@sweetcocoa
Contributor Author

sweetcocoa commented Apr 5, 2023

@masahi I fixed the test failure. However, I can't interpret the current CI results: cross-isa-minimal/pr-head reports "The build of this commit was aborted." Is this caused by a change in my code?

I checked https://ci.tlcpack.ai/blue/organizations/jenkins/tvm-minimal-cross-isa/detail/PR-14462/6/pipeline/95, but what I see there is not an error message, only an interrupt message.

@masahi
Member

masahi commented Apr 5, 2023

@tvm-bot rerun

@sweetcocoa
Contributor Author

sweetcocoa commented Apr 6, 2023

@masahi Could you rerun it, please? (Or can I do it myself?)

@masahi
Member

masahi commented Apr 6, 2023

@tvm-bot rerun

@sweetcocoa
Contributor Author

@masahi Now all checks have passed.

@masahi masahi merged commit 1113de2 into apache:main Apr 6, 2023
3 participants