Add torchrun launcher plugin #2119
Conversation
Hi @jbaczek! Thank you for your pull request and welcome to our community.

Action Required: In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process: In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged accordingly. If you have received this in error or have any questions, please contact us at cla@fb.com. Thanks!
Thanks @jbaczek.

We're considering having a contrib/ dir to serve as a staging area for various new launchers / sweepers. Thank you for your contribution!
Should I lint it? Or will you do this on your side?

Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Meta Open Source project. Thanks!
hi @jbaczek - thanks very much for your contribution! We've created the contrib/ dir. Please let us know if you have any questions on this. Thanks again.
I'll update it as soon as I have some time for it. Probably mid-May.
@jieru-hu I've moved the plugin to the contrib/ dir.
thanks @jbaczek and sorry for the late response. I will take a look at this.

looks great, thanks for your contribution @jbaczek!
Motivation
PyTorch distributed applications are a common use case for Hydra (according to many GitHub issues). Currently, the implementation of `torchrun` interferes with Hydra in a way that causes Hydra to initialize multiple times, leading to many race conditions. This PR contains a launcher plugin that fixes the issue by forking the process after Hydra is initialized.
Have you read the Contributing Guidelines on pull requests?
Yes
This PR is submitted on behalf of NVIDIA.
Test Plan
- Use a machine with multiple GPUs.
- Use a PyTorch Docker container, or install PyTorch in any other way.
- Install the plugin with `pip`.

How to run torch distributed without the plugin:

How to run with the plugin:
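The actual commands were not captured above; as a sketch (the script name and the launcher's config name are assumptions — check the plugin's README for the exact name), the two invocations might look like:

```shell
# Without the plugin: torchrun spawns the script directly, so every
# worker process re-runs Hydra initialization (the source of the races).
torchrun --nproc_per_node=8 my_app.py

# With the plugin: Hydra initializes once, then its launcher forks the
# per-GPU workers. The launcher name "torchrun" here is an assumption.
python my_app.py -m hydra/launcher=torchrun
```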
Related Issues and PRs
Resolves #2038