[loops] Reset reference to dataloader iterator on run end (#9386)
ananthsub authored Sep 10, 2021
1 parent 58de08d commit c963bf6
Showing 3 changed files with 6 additions and 0 deletions.
3 changes: 3 additions & 0 deletions CHANGELOG.md
@@ -334,6 +334,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Fixed `replace_sampler` missing the batch size under specific conditions ([#9367](https://github.com/PyTorchLightning/pytorch-lightning/pull/9367))
 
 
+- Fixed freeing data iterators in loop `on_run_end` ([#9386](https://github.com/PyTorchLightning/pytorch-lightning/pull/9386))
+
+
 ## [1.4.5] - 2021-08-31
 
 - Fixed reduction using `self.log(sync_dict=True, reduce_fx={mean,max})` ([#9142](https://github.com/PyTorchLightning/pytorch-lightning/pull/9142))
1 change: 1 addition & 0 deletions pytorch_lightning/loops/epoch/evaluation_epoch_loop.py
@@ -135,6 +135,7 @@ def on_run_end(self) -> EPOCH_OUTPUT:
         outputs = self.outputs
         # free memory
         self.outputs = []
+        self.dataloader_iter = None
         return outputs
 
     def evaluation_step(self, batch: Any, batch_idx: int, dataloader_idx: int) -> Optional[STEP_OUTPUT]:
2 changes: 2 additions & 0 deletions pytorch_lightning/loops/epoch/training_epoch_loop.py
@@ -232,6 +232,8 @@ def on_run_end(self) -> None:
         if self._num_training_batches_reached(self.is_last_batch):
             self.update_lr_schedulers("epoch", update_plateau_schedulers=True)
 
+        self.dataloader_iter = None
+
     def teardown(self) -> None:
         self._results.cpu()
         self.batch_loop.teardown()
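The pattern in both diffs can be sketched in isolation. The class below is a hypothetical stand-in, not the actual Lightning loop: it shows why dropping the reference in `on_run_end` matters — a loop object that outlives the run would otherwise keep the exhausted iterator (and anything it buffers, such as dataloader state) reachable and unable to be garbage-collected.

```python
class MiniEpochLoop:
    """Hypothetical minimal sketch of the commit's pattern: hold an
    iterator over the dataloader while running, drop the reference
    when the run ends so the iterator can be freed."""

    def __init__(self):
        self.dataloader_iter = None
        self.outputs = []

    def run(self, dataloader):
        self.dataloader_iter = iter(dataloader)
        for batch in self.dataloader_iter:
            self.outputs.append(batch * 2)  # stand-in for a step
        return self.on_run_end()

    def on_run_end(self):
        outputs = self.outputs
        # free memory: without these lines, the long-lived loop object
        # would keep the exhausted iterator and outputs alive between runs
        self.outputs = []
        self.dataloader_iter = None
        return outputs


loop = MiniEpochLoop()
print(loop.run([1, 2, 3]))    # [2, 4, 6]
print(loop.dataloader_iter)   # None — iterator is now collectable
```

With a real `torch.utils.data.DataLoader`, the iterator can hold prefetched batches and worker-process handles, so releasing it promptly at run end matters more than this toy list suggests.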
