This repository has been archived by the owner on Oct 9, 2023. It is now read-only.

Add support for Torch ORT to Transformer based Tasks #667

Merged · 22 commits merged into master on Aug 17, 2021

Conversation

@SeanNaren (Contributor) commented Aug 16, 2021

What does this PR do?

Adds Torch ORT support to Transformer based Flash Tasks!
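The diff adds a callback (flash/text/ort_callback.py) that routes a Transformer task's training through ONNX Runtime. The sketch below shows the general shape of that pattern only: the class and flag names (`ORTCallback`, `enable_ort`) and the minimal `Trainer`/`TextTask` scaffolding are assumptions for illustration, not the actual Flash or torch-ort API, and `torch_ort.ORTModule` is replaced with a plain stand-in so the example runs without the real dependencies.

```python
class ORTModule:
    """Stand-in for torch_ort.ORTModule, which wraps an nn.Module so its
    forward/backward passes execute through ONNX Runtime (assumed API)."""
    def __init__(self, module):
        self.wrapped = module


class ORTCallback:
    """Illustrative callback (name assumed): in its setup hook it swaps the
    task's `model` attribute for an ORT-wrapped copy before training starts."""
    def setup(self, trainer, task, stage):
        if not hasattr(task, "model"):
            raise TypeError("Task must expose a `model` attribute to use ORT")
        task.model = ORTModule(task.model)


class TextTask:
    """Toy task: an `enable_ort`-style flag (name assumed) conditionally
    registers the ORT callback."""
    def __init__(self, model, enable_ort=False):
        self.model = model
        self.callbacks = [ORTCallback()] if enable_ort else []


class Trainer:
    """Toy trainer that just runs each callback's setup hook."""
    def fit(self, task):
        for cb in task.callbacks:
            cb.setup(self, task, stage="fit")


task = TextTask(model=object(), enable_ort=True)
Trainer().fit(task)
print(type(task.model).__name__)  # → ORTModule
```

The design point worth noting is that ORT support lands as a callback rather than inside each task, so every Transformer-based task (classification, summarization, translation, question answering) picks it up through the same hook instead of duplicating the wrapping logic.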

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos and docs improvements)
  • Did you read the contributor guidelines, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests? [not needed for typos/docs]
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

  • Is this pull request ready for review? (if not, please submit in draft mode)

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@codecov bot commented Aug 17, 2021

Codecov Report

Merging #667 (f1723e7) into master (4e89a37) will decrease coverage by 0.71%.
The diff coverage is 76.31%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #667      +/-   ##
==========================================
- Coverage   90.01%   89.30%   -0.71%     
==========================================
  Files         185      186       +1     
  Lines        9664     9696      +32     
==========================================
- Hits         8699     8659      -40     
- Misses        965     1037      +72     
Flag Coverage Δ
unittests 89.30% <76.31%> (-0.71%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
flash/text/seq2seq/question_answering/model.py 76.47% <ø> (ø)
flash/text/seq2seq/summarization/model.py 80.00% <ø> (ø)
flash/text/seq2seq/translation/model.py 76.19% <ø> (ø)
flash/text/ort_callback.py 53.33% <53.33%> (ø)
flash/text/seq2seq/core/model.py 76.62% <87.50%> (+1.26%) ⬆️
flash/text/classification/model.py 92.30% <92.85%> (-0.88%) ⬇️
flash/core/utilities/imports.py 89.18% <100.00%> (+0.09%) ⬆️
flash/graph/classification/model.py 37.70% <0.00%> (-62.30%) ⬇️
flash/graph/data.py 45.00% <0.00%> (-50.00%) ⬇️
flash/graph/classification/cli.py 42.85% <0.00%> (-50.00%) ⬇️
... and 2 more

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 4e89a37...f1723e7. Read the comment docs.

@ethanwharris (Collaborator) left a comment:

Awesome, looks really neat 😃 Do we need to add torch ORT to the text requirements if it will be enabled by default? Also, I guess the docs section needs to be added to the summarization task too?

Review threads:
  • docs/source/reference/text_classification.rst (outdated)
  • docs/source/reference/translation.rst (outdated)
  • docs/source/reference/translation.rst (outdated)
  • docs/source/reference/translation.rst (outdated)
  • flash/text/ort_callback.py
@SeanNaren (Contributor, Author) replied:

> Awesome, looks really neat 😃 Do we need to add torch ORT to the text requirements if it will be enabled by default? Also, I guess the docs section needs to be added to the summarization task too?

I just realised that the only way to test is with an NVIDIA GPU or AMD GPU. Do we test Flash at all on GPU? cc @ethanwharris

@ethanwharris (Collaborator) left a comment:

Awesome, LGTM 😃

Review threads:
  • flash/text/seq2seq/question_answering/model.py (outdated)
  • flash/text/seq2seq/summarization/model.py (outdated)
  • flash/text/seq2seq/translation/model.py (outdated)
@ethanwharris ethanwharris enabled auto-merge (squash) August 17, 2021 16:14
@ethanwharris ethanwharris merged commit 741a838 into master Aug 17, 2021
@ethanwharris ethanwharris deleted the torch_ort branch August 17, 2021 16:19

3 participants