This repository has been archived by the owner on Nov 21, 2022. It is now read-only.

Create custom distributed plugin to allow model.parallelize #23

Closed
SeanNaren opened this issue Jan 8, 2021 · 2 comments
Labels: enhancement (New feature or request) · help wanted (Extra attention is needed) · wontfix (This will not be worked on)
Comments

@SeanNaren (Contributor)

🚀 Feature

Currently, some models can be parallelized using the latest Hugging Face changes: huggingface/transformers#8696

We should create a plugin in this repo so that users can use this with the PyTorch Lightning Trainer.
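
For context, here is a minimal sketch of the Hugging Face API in question. The `device_map` below is an illustrative split of gpt2-xl's 48 transformer blocks across two GPUs, not a recommended configuration:

```python
from transformers import GPT2LMHeadModel

model = GPT2LMHeadModel.from_pretrained("gpt2-xl")

# Illustrative map assuming two GPUs: device index -> list of block indices.
# Calling model.parallelize() with no argument instead lets HF balance the
# blocks across all visible GPUs automatically.
device_map = {
    0: list(range(0, 24)),   # blocks 0-23 on cuda:0
    1: list(range(24, 48)),  # blocks 24-47 on cuda:1
}
model.parallelize(device_map)
```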

@SeanNaren added the enhancement and help wanted labels on Jan 8, 2021
@SeanNaren modified the milestone: 1.0 on Jan 10, 2021
@SeanNaren (Contributor, Author)

Update here: I think we're converging on a hook that gives access to a distributed-enabled environment. Previously, setup served as this hook, since distributed communication had already been initialized by the time it ran; after the accelerator refactor, however, things were standardized, and the setup hook is now called before distributed communication is initialized.

Discussion here: Lightning-AI/pytorch-lightning#6318
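
To illustrate what this could look like from the user side, here is a sketch against a hypothetical hook; `on_distributed_ready` is an invented name for the hook under discussion, not an existing Lightning API:

```python
import pytorch_lightning as pl
from transformers import GPT2LMHeadModel


class MPTransformer(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.model = GPT2LMHeadModel.from_pretrained("gpt2")

    # Hypothetical hook: unlike `setup` (which, after the accelerator
    # refactor, runs before distributed init), this would run only once
    # torch.distributed is initialized, so it is safe to place layers here.
    def on_distributed_ready(self):
        # Split the model's transformer blocks across this node's GPUs.
        self.model.parallelize()
```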

@edenlightning added this to the v1.4 milestone on Apr 27, 2021
@stale
stale bot commented on Jun 26, 2021

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the wontfix label on Jun 26, 2021
stale bot closed this as completed on Jul 5, 2021