
Accelerator Refactor: Precision Plugins #5718

Merged
merged 14 commits into from
Jan 31, 2021

Conversation

justusschock (Member)

What does this PR do?

Adds precision plugins from #5616
To be merged after #5715

Co-Authored with @awaelchli
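For context on the design being merged here, below is a minimal stdlib-only sketch of the pattern this PR introduces: the accelerator owns a swappable precision plugin and delegates precision-specific steps (such as backward) to it. All class and method names in the sketch are illustrative, not the actual Lightning API.

```python
# Sketch of the precision-plugin delegation pattern (illustrative names,
# not the real Lightning API). The accelerator never branches on
# precision itself; it forwards the work to whichever plugin it holds.


class PrecisionPlugin:
    """Base plugin: plain 32-bit precision, no special handling."""

    precision = 32

    def backward(self, loss):
        # The real code would call loss.backward(); here we return a
        # string so the delegation is visible without torch installed.
        return f"backward at {self.precision}-bit"


class HalfPrecisionPlugin(PrecisionPlugin):
    """Same hooks, different precision; the accelerator is unchanged."""

    precision = 16


class Accelerator:
    """Hardware wrapper that owns a precision plugin and delegates to it."""

    def __init__(self, precision_plugin):
        self.precision_plugin = precision_plugin

    def backward(self, loss):
        return self.precision_plugin.backward(loss)


acc = Accelerator(HalfPrecisionPlugin())
print(acc.backward(loss=0.5))  # backward at 16-bit
```

Swapping precision then means swapping the plugin object, not editing the accelerator, which is the point of splitting the plugins out of #5616.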

pep8speaks commented Jan 30, 2021

Hello @justusschock! Thanks for updating this PR.

Line 26:121: E501 line too long (136 > 120 characters)
Line 40:121: E501 line too long (139 > 120 characters)

Comment last updated at 2021-01-31 17:46:16 UTC

pytorch_lightning/accelerators/accelerator.py (two outdated review threads, resolved)
Comment on lines 25 to 38
- CPU
- GPU
Contributor


@SeanNaren if we want to do ZeRO-v3 and offload more to CPU memory, is that handled entirely within the training type plugin? Is there any dependency on the accelerator in which the plugin is contained?

Contributor


Yeah, that's handled explicitly within the training type plugin, or more precisely, within the sharded DDP/OSS class itself. From what I recall, an offload_device argument will be exposed, giving the user control (in this case, our accelerator control).
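A rough sketch of the point above, under the assumption of a hypothetical offload_device argument (the class and attribute names below are illustrative, not the real sharded DDP/OSS API): the offload target is configured entirely inside the training type plugin, so the accelerator that contains it has no dependency on the offload logic.

```python
# Illustrative sketch only: CPU offload configured entirely inside the
# training-type plugin via an exposed offload_device argument. The
# accelerator just holds the plugin and never inspects the setting.


class ShardedTrainingTypePlugin:
    """Stand-in for a sharded DDP/OSS-style training type plugin."""

    def __init__(self, offload_device="cpu"):
        # Where sharded optimizer state lives ("cpu" means offloaded).
        self.offload_device = offload_device

    def describe(self):
        return f"optimizer state offloaded to {self.offload_device}"


plugin = ShardedTrainingTypePlugin(offload_device="cpu")
print(plugin.describe())  # optimizer state offloaded to cpu
```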

pytorch_lightning/plugins/precision/precision_plugin.py (outdated review thread, resolved)
from pytorch_lightning.plugins.precision.precision_plugin import PrecisionPlugin


class TPUHalfPrecisionPlugin(PrecisionPlugin):
Contributor


ooo I didn't see this, this is nice :)
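For readers unfamiliar with the plugin quoted above: on TPU, bfloat16 is typically enabled through the XLA_USE_BF16 environment variable, which torch/xla reads when allocating float tensors. A minimal stdlib-only sketch of that idea follows; the class name and the connect() signature are illustrative, not the exact Lightning implementation.

```python
import os


class TPUHalfPrecisionPluginSketch:
    """Illustrative sketch: 16-bit on TPU via the XLA bf16 env switch."""

    precision = 16

    def connect(self, model):
        # torch/xla honours XLA_USE_BF16 by storing float tensors as
        # bfloat16 on the TPU; no model surgery is needed.
        os.environ["XLA_USE_BF16"] = "1"
        return model


plugin = TPUHalfPrecisionPluginSketch()
plugin.connect(model=None)
print(os.environ["XLA_USE_BF16"])  # 1
```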

justusschock and others added 11 commits January 31, 2021 17:23
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
@Borda Borda (Member) left a comment
let's fix imports...

pytorch_lightning/plugins/precision/mixed.py (review thread, resolved)
pytorch_lightning/plugins/base_plugin.py (outdated review thread, resolved)
justusschock and others added 3 commits January 31, 2021 18:17
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
codecov bot commented Jan 31, 2021

Codecov Report

Merging #5718 (60445ad) into release/1.2-dev (3bacac7) will decrease coverage by 0%.
The diff coverage is n/a.

@@               Coverage Diff                @@
##           release/1.2-dev   #5718    +/-   ##
================================================
- Coverage               89%     89%    -0%     
================================================
  Files                  173     173            
  Lines                12495   12339   -156     
================================================
- Hits                 11175   11017   -158     
- Misses                1320    1322     +2     

@justusschock justusschock merged commit 069ae27 into release/1.2-dev Jan 31, 2021
@justusschock justusschock deleted the ref/precision_plugins branch January 31, 2021 18:12
@Borda Borda added the ready PRs ready to be merged label Feb 1, 2021
Labels
ready PRs ready to be merged refactor
6 participants