
Commit

[bugfix] Resolve memory leak for evaluation (#6326)
* resolve bug

* resolve flake8

* revert name
tchaton authored and lexierule committed Mar 9, 2021
1 parent 7a2fad4 commit 4c98322
Showing 3 changed files with 8 additions and 0 deletions.
3 changes: 3 additions & 0 deletions CHANGELOG.md
@@ -39,6 +39,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Check `LightningOptimizer` doesn't delete optimizer hooks ([#6305](https://github.com/PyTorchLightning/pytorch-lightning/pull/6305))


- Resolve memory leak for evaluation ([#6326](https://github.com/PyTorchLightning/pytorch-lightning/pull/6326))


## [1.2.2] - 2021-03-02

### Added
4 changes: 4 additions & 0 deletions pytorch_lightning/trainer/evaluation_loop.py
@@ -203,6 +203,10 @@ def __run_eval_epoch_end(self, num_dataloaders):

# with a single dataloader don't pass an array
outputs = self.outputs

# free memory
self.outputs = []

eval_results = outputs
if num_dataloaders == 1:
eval_results = outputs[0]
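The hunk above copies the accumulated per-step outputs to a local variable and then resets `self.outputs` to an empty list, so the loop no longer holds references to the step outputs (and any tensors they contain) after epoch end, letting them be garbage-collected. A minimal sketch of the pattern, using a hypothetical `EvalLoopSketch` class rather than the actual Trainer internals:

```python
class EvalLoopSketch:
    """Hypothetical stand-in for the evaluation loop; not the real Trainer code."""

    def __init__(self):
        # per-dataloader lists of step outputs, accumulated during the epoch
        self.outputs = []

    def run_eval_epoch_end(self, num_dataloaders):
        # keep a local reference for processing
        outputs = self.outputs

        # free memory: drop the loop's reference so the accumulated
        # outputs are not retained for the rest of the run
        self.outputs = []

        eval_results = outputs
        if num_dataloaders == 1:
            # with a single dataloader don't pass an array
            eval_results = outputs[0]
        return eval_results


loop = EvalLoopSketch()
loop.outputs = [[{"loss": 0.1}, {"loss": 0.2}]]  # one dataloader's step outputs
results = loop.run_eval_epoch_end(num_dataloaders=1)
print(len(loop.outputs), len(results))  # the loop's buffer is empty afterwards
```

The key design point is that the local `outputs` name keeps the data alive only for the duration of the epoch-end call, while the long-lived attribute is cleared immediately, which is what the test below asserts.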
1 change: 1 addition & 0 deletions tests/trainer/logging_/test_eval_loop_logging_1_0.py
@@ -126,6 +126,7 @@ def validation_step_end(self, acc):
def validation_epoch_end(self, outputs):
self.log('g', torch.tensor(2, device=self.device), on_epoch=True)
self.validation_epoch_end_called = True
assert len(self.trainer.evaluation_loop.outputs) == 0

def backward(self, loss, optimizer, optimizer_idx):
return LightningModule.backward(self, loss, optimizer, optimizer_idx)
