[Hexagon] Propagate QNN Concat Quantization Params to Inputs #15258
Conversation
[Hexagon] Propagate qnn.concat quantization params to inputs, eliminating redundant requantization when possible, and make it concat
Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment.
Generated by tvm-bot
66eecf8 to faf5854
LGTM apart from a minor comment.
@tvm-bot rerun
[Hexagon] Propagate QNN Concat Quantization Params to Inputs (#15258)
* [Hexagon] Propagate qnn.concat quantization params to inputs, eliminating redundant requantization when possible, and make it concat
* Fix pylint issue
* Add relay IR snippet before and after transformation
* Better test file description comment
This makes it possible to use optimized implementations of concat; the PR also avoids redundant requantization when an input to the concat is already a requantize op.
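For illustration, here is a minimal sketch of the kind of graph this change targets, written against TVM's Python Relay API (`relay.qnn.op.concatenate` and `relay.concatenate`). The shapes and quantization parameters are made up, and the "after" expression only shows the intended end state conceptually; it is not the literal output of the pass added in this PR.

```python
import tvm
from tvm import relay

# Hypothetical shapes and quantization parameters, for illustration only.
x = relay.var("x", shape=(1, 8), dtype="int8")
y = relay.var("y", shape=(1, 8), dtype="int8")

# Before: each input carries its own scale/zero point that differ from the
# output's, so qnn.concatenate must requantize its inputs internally.
before = relay.qnn.op.concatenate(
    [x, y],
    input_scales=[relay.const(0.25, "float32"), relay.const(0.5, "float32")],
    input_zero_points=[relay.const(0, "int32"), relay.const(3, "int32")],
    output_scale=relay.const(0.5, "float32"),
    output_zero_point=relay.const(0, "int32"),
    axis=1,
)

# After (conceptually): the output quantization params have been propagated
# back to the producers of x and y, so every input already matches the output
# scale/zero point and the op can be lowered to a plain concatenate.
after = relay.concatenate([x, y], axis=1)

print(tvm.IRModule.from_expr(relay.Function([x, y], before)))
print(tvm.IRModule.from_expr(relay.Function([x, y], after)))
```

In this sketch the gain comes from removing the per-input requantize work inside the concat and letting a plain `concatenate` kernel run instead, which is what enables the optimized Hexagon implementations mentioned above.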