
Customize multiple dataloaders behaviour in train vs val/test #6233

Closed
edenlightning opened this issue Feb 26, 2021 · 3 comments
Labels
design (Includes a design discussion) · feature (Is an improvement or enhancement) · help wanted (Open to be worked on) · won't fix (This will not be worked on)

Comments

@edenlightning
Contributor

Users should be able to choose whether multiple dataloaders run simultaneously (passed together as a list/dict) or are called sequentially.
This could be done with a Trainer flag that accepts multiple options.

Background:
Currently, the training step receives one batch from every dataloader at the same time, as a list or dict, whereas the validation step is called multiple times, once per dataloader, with a dataloader_idx argument. It would be nicer if multiple dataloaders were handled consistently across training, validation and test.
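For reference, a minimal sketch of the current asymmetry (the module, datasets and layer sizes here are placeholders, not part of the proposal):

```python
import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset


class MultiLoaderModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(3, 1)

    def train_dataloader(self):
        # Multiple training dataloaders are returned together as a dict (or list)...
        return {
            "a": DataLoader(TensorDataset(torch.randn(64, 3), torch.randn(64, 1)), batch_size=8),
            "b": DataLoader(TensorDataset(torch.randn(64, 3), torch.randn(64, 1)), batch_size=8),
        }

    def training_step(self, batch, batch_idx):
        # ...so each call receives one batch from every loader at the same time.
        x_a, y_a = batch["a"]
        x_b, y_b = batch["b"]
        return (torch.nn.functional.mse_loss(self.layer(x_a), y_a)
                + torch.nn.functional.mse_loss(self.layer(x_b), y_b))

    def val_dataloader(self):
        # Multiple validation dataloaders are returned as a list...
        return [
            DataLoader(TensorDataset(torch.randn(64, 3), torch.randn(64, 1)), batch_size=8),
            DataLoader(TensorDataset(torch.randn(64, 3), torch.randn(64, 1)), batch_size=8),
        ]

    def validation_step(self, batch, batch_idx, dataloader_idx):
        # ...and validation_step is called sequentially, once per loader, with
        # dataloader_idx identifying which loader the batch came from.
        x, y = batch
        self.log(f"val_loss/{dataloader_idx}", torch.nn.functional.mse_loss(self.layer(x), y))

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)
```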

@edenlightning added the feature, help wanted and design labels on Feb 26, 2021
@mees
Contributor

mees commented Mar 1, 2021

@williamFalcon's suggestion was to have something like Trainer(multi_dataset_style="sequential | simultaneous") to enable consistent multiple-dataloader behaviour.
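From the user's side that would look roughly like this (nothing below exists yet; the flag name and values are only the suggestion above):

```python
from pytorch_lightning import Trainer

# Hypothetical flag, not an existing Trainer argument:
# "simultaneous": training_step gets one batch from every dataloader at once
#                 (today's training behaviour).
# "sequential":   training_step is called once per dataloader with a dataloader_idx,
#                 mirroring how validation/test already behave.
trainer = Trainer(multi_dataset_style="sequential")
```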

@tchaton
Contributor

tchaton commented Mar 1, 2021

Hey @mees @edenlightning,

Any consensus on the feature to be implemented?

Note: sequential training would require quite a large internal refactor to enable this feature.

Best,
T.C

@stale

stale bot commented Mar 31, 2021

This issue has been automatically marked as stale because it hasn't had any recent activity. This issue will be closed in 7 days if no further activity occurs. Thank you for your contributions, Pytorch Lightning Team!

stale bot added the won't fix label on Mar 31, 2021
stale bot closed this as completed on Apr 10, 2021