Better transformer HF example #2002
Conversation
Codecov Report
@@           Coverage Diff           @@
##           master    #2002   +/-   ##
=======================================
  Coverage   53.80%   53.80%
=======================================
  Files          70       70
  Lines        3169     3169
  Branches       56       56
=======================================
  Hits         1705     1705
  Misses       1464     1464
Looks good, left some minor nits. Also please run the lint and spellchecking jobs before checking this in
Co-authored-by: Mark Saroufim <marksaroufim@fb.com>
examples/Huggingface_Transformers/Transformer_handler_generalized.py
* [Model parallel inference](examples/Huggingface_Transformers#model-parallelism)
* [Better Transformer for HuggingFace Transformers](examples/Huggingface_Transformers#Speed-up-inference-with-Better-Transformer)
we can remove this
Assuming all CI is green we're good to go
Description
This PR adds the Better Transformer optimization to the current HF example. The optimization does not change the model itself; it only maps the HF layer names to the names expected by the nn.Transformer module, where faster kernels are available.
Also worth noting: the optimized model cannot be saved for now. The transformation is numerically stable, meaning there is no accuracy degradation after the change.
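For reference, below is a minimal sketch of how this kind of transformation is typically applied to a HF model, using the BetterTransformer API from the Hugging Face optimum package. The model name and the use of optimum here are illustrative assumptions for the sketch, not necessarily what the handler in this PR does.

```python
# Sketch only: assumes the optimum package's BetterTransformer API and PyTorch >= 1.13.
# The model name below is a hypothetical example, not taken from this PR.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from optimum.bettertransformer import BetterTransformer

model_name = "bert-base-uncased"  # illustrative checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# Map the HF encoder layers onto the nn.Transformer fastpath kernels.
# The weights are untouched, so outputs stay numerically equivalent.
model = BetterTransformer.transform(model)

# Per the PR description, the transformed model cannot be saved for now,
# so the transform is applied at load time rather than persisted.
inputs = tokenizer("TorchServe makes model deployment easy.", return_tensors="pt")
with torch.inference_mode():
    logits = model(**inputs).logits
print(logits)
```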
Fixes #(issue)
Type of change
Please delete options that are not relevant.
Feature/Issue validation/testing
This requires PyTorch 1.13; I will upload the regression test logs.
Test A
Logs for Test A
Test B
Logs for Test B