🚀 Feature
Providing a way to do validation every epoch with non-finite (__len__ not implemented) dataloaders.
Motivation
Doing validation every epoch is a natural choice, and with a finite dataloader you can do it easily by setting val_check_interval=1.0. However, with a non-finite dataloader you cannot set val_check_interval to a float. There's no simple way to work around this yet.
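For concreteness, roughly what this looks like today (a sketch, assuming a standard Trainer setup):

```python
import pytorch_lightning as pl

# Finite dataloader (len() works): validation runs once at the end of each epoch.
trainer = pl.Trainer(val_check_interval=1.0)

# Non-finite dataloader (no __len__): a float val_check_interval is rejected,
# so the only option is an integer "validate every N training batches", e.g.:
trainer = pl.Trainer(val_check_interval=1000)
```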
Pitch
There are several ways to make this happen.
One solution off the top of my head is to let val_check_interval be None. If it is None, just run validation according to check_val_every_n_epoch at the end of every epoch, as sketched below.
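A hypothetical sketch of what the proposed API could look like (the None handling does not exist at the time of writing):

```python
import pytorch_lightning as pl

# Hypothetical: val_check_interval=None would mean "no mid-epoch validation";
# validation would instead run every check_val_every_n_epoch epochs, even when
# the train dataloader has no __len__.
trainer = pl.Trainer(
    val_check_interval=None,     # proposed sentinel, not supported today
    check_val_every_n_epoch=1,   # validate at the end of every epoch
)
```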
Alternatives
Alternatively, you could let the user set val_check_interval to 1.0 even with non-finite dataloaders. Any value below 1.0 would be invalid; only 1.0 would be valid. If it is 1.0, run validation according to check_val_every_n_epoch at the end of every epoch.
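Under this alternative, the user-facing call would be identical to the finite case (again hypothetical; today this combination is rejected for unsized dataloaders):

```python
import pytorch_lightning as pl

# Hypothetical: with an unsized train dataloader, 1.0 would be the only accepted
# float, meaning "validate once the train iterator is exhausted".
trainer = pl.Trainer(val_check_interval=1.0, check_val_every_n_epoch=1)
```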
Additional context
Not all dataloaders without __len__ implemented are infinite dataloaders. Some simply cannot determine their length in advance. With these dataloaders the concept of an 'epoch' is still valid. PyTorch Lightning needs to serve this kind of dataloader better.
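For example, a finite streaming dataset (illustrative; LineStream and train.txt are made up here):

```python
from torch.utils.data import IterableDataset, DataLoader

class LineStream(IterableDataset):
    """Streams samples from a file whose line count is unknown in advance.

    The stream is finite -- iteration stops at EOF, so an 'epoch' is well
    defined -- but __len__ cannot be implemented without a full pass over
    the file.
    """

    def __init__(self, path):
        self.path = path

    def __iter__(self):
        with open(self.path) as f:
            for line in f:
                yield line.rstrip("\n")

loader = DataLoader(LineStream("train.txt"), batch_size=32)
# len(loader) raises TypeError, yet each full iteration is exactly one epoch.
```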
Thanks for the issue! Totally agree that we should support this. It might be a bit painful with how validation works at the moment. Currently we validate when the batch index reaches a certain value. But with Iterable stuff we'd need to do it once the iteration is over (and we can't know ahead of time when that will be). I guess the way to do it is to add a separate clause which deals specifically with the case when val_check_interval=1.0, although it's not the most elegant solution :/
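In simplified terms (this is a sketch of the idea, not the actual Lightning loop code), the separate clause could look like:

```python
def should_check_val(batch_idx, is_last_batch, val_check_batch, dataloader_has_len):
    """Decide whether to run validation after the current training batch."""
    if dataloader_has_len:
        # Finite dataloader: validate when the batch index hits the
        # precomputed target derived from val_check_interval.
        return (batch_idx + 1) % val_check_batch == 0
    # Unsized dataloader with val_check_interval=1.0: the target index cannot
    # be known ahead of time, so validate only once iteration is exhausted.
    return is_last_batch
```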