[relay] preserve the order of input_info of pytorch #14462
Conversation
Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment. Generated by tvm-bot
LGTM, please fix lint
I fixed lint. Please re-review.
Some tests are broken, please fix them.
@masahi I fixed a test failure. However, I cannot interpret the current test results. I checked https://ci.tlcpack.ai/blue/organizations/jenkins/tvm-minimal-cross-isa/detail/PR-14462/6/pipeline/95, but I cannot tell what failed from what I see there.
@tvm-bot rerun
@masahi Could you run it again please? (Or can I do it myself?)
@tvm-bot rerun
@masahi Now all checks have passed. |
This is for resolving #14461.
The direct cause of this problem is that the order of the input nodes found by `_analysis.free_vars(ret)` is not guaranteed to match the order of the names in `input_info`. From what I understand, `_analysis.free_vars(ret)` uses DFS to find input nodes starting from the output nodes. Since DFS cannot guarantee the order in which nodes are discovered, the natural fix is to reorder the input nodes after the DFS, as done in this PR.
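The reordering described above can be sketched in plain Python. This is an illustrative stand-in, not the actual PR code: `FreeVar` is a hypothetical substitute for a Relay `Var`, and names are assumed to be unique. Free vars that are not named in `input_info` (e.g. model parameters) keep their relative DFS order and are appended at the end.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class FreeVar:
    """Hypothetical stand-in for a Relay Var, identified by its name hint."""
    name_hint: str


def reorder_free_vars(free_vars, input_info):
    """Reorder DFS-discovered free vars to follow the input_info order.

    free_vars:  list of FreeVar in whatever order the DFS found them
    input_info: list of (name, shape) pairs, in the user-specified order
    """
    by_name = {v.name_hint: v for v in free_vars}
    # Inputs named in input_info come first, in input_info order.
    ordered = [by_name[name] for name, _ in input_info if name in by_name]
    # Remaining vars (e.g. weights) keep their original relative order.
    named = {name for name, _ in input_info}
    rest = [v for v in free_vars if v.name_hint not in named]
    return ordered + rest


# DFS happened to yield a weight first and the inputs reversed.
dfs_order = [FreeVar("weight"), FreeVar("input1"), FreeVar("input0")]
info = [("input0", (1, 3)), ("input1", (1, 3))]
print([v.name_hint for v in reorder_free_vars(dfs_order, info)])
# ['input0', 'input1', 'weight']
```

The key design point is that the fix is applied as a post-pass over the result of the analysis, so the traversal itself does not need to change.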