Commit e570d2e

Doc fixes (#1362)
* Doc fixes from #1357 (awaelchli's comments) + changelog.

* Fix indentation.

* Add blank line to fix doc build?
jbschiratti authored Apr 3, 2020
1 parent bf990a3 commit e570d2e
Showing 3 changed files with 16 additions and 12 deletions.
2 changes: 1 addition & 1 deletion CHANGELOG.md
@@ -21,7 +21,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Added `summary` method to Profilers. ([#1259](https://github.com/PyTorchLightning/pytorch-lightning/pull/1259))
- Added informative errors if user defined dataloader has zero length ([#1280](https://github.com/PyTorchLightning/pytorch-lightning/pull/1280))
- Added testing for python 3.8 ([#915](https://github.com/PyTorchLightning/pytorch-lightning/pull/915))

- Added a `training_epoch_end` method which is the mirror of `validation_epoch_end`. ([#1357](https://github.com/PyTorchLightning/pytorch-lightning/pull/1357))
### Changed

- Changed `progress_bar_refresh_rate` trainer flag to disable progress bar when set to 0. ([#1108](https://github.com/PyTorchLightning/pytorch-lightning/pull/1108))
24 changes: 14 additions & 10 deletions pytorch_lightning/core/lightning.py
@@ -245,16 +245,18 @@ def training_epoch_end(
for train_batch in train_data:
    out = training_step(train_batch)
    train_outs.append(out)
-training_epoch_end(val_outs)
+training_epoch_end(train_outs)
Args:
outputs: List of outputs you defined in training_step, or if there are multiple
-    dataloaders, a list containing a list of outputs for each dataloader
+        dataloaders, a list containing a list of outputs for each dataloader
Return:
-    Dict or OrderedDict (dict): Dict has the following optional keys:
-        progress_bar -> Dict for progress bar display. Must have only tensors
-        log -> Dict of metrics to add to logger. Must have only tensors (no images, etc)
+    Dict or OrderedDict
+    May contain the following optional keys:
+        - log (metrics to be added to the logger ; only tensors)
+        - any metric used in a callback (e.g. early stopping).
.. note:: If this method is not overridden, this won't be called.
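For context, a minimal sketch of how a user might override the hook documented in this hunk, consistent with the updated Return description. The module name, the metric name, and the assumption that each training_step returns a dict containing a 'loss' tensor are illustrative only, not part of the commit:

import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):  # hypothetical module, for illustration

    def training_epoch_end(self, outputs):
        # `outputs` is the list of whatever each training_step returned
        # (assumed here to be dicts containing a 'loss' tensor).
        avg_loss = torch.stack([out['loss'] for out in outputs]).mean()
        # Per the updated docstring, tensors under 'log' go to the logger;
        # other returned metrics can be consumed by callbacks such as
        # early stopping.
        return {'log': {'train_loss_epoch': avg_loss}}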
@@ -282,7 +284,7 @@ def training_epoch_end(self, outputs):
With multiple dataloaders, `outputs` will be a list of lists. The outer list contains
one entry per dataloader, while the inner list contains the individual outputs of
-each validation step for that dataloader.
+each training step for that dataloader.
.. code-block:: python
@@ -539,12 +541,14 @@ def validation_epoch_end(
Args:
outputs: List of outputs you defined in validation_step, or if there are multiple
-    dataloaders, a list containing a list of outputs for each dataloader
+        dataloaders, a list containing a list of outputs for each dataloader
Return:
-    Dict or OrderedDict (dict): Dict has the following optional keys:
-        progress_bar -> Dict for progress bar display. Must have only tensors
-        log -> Dict of metrics to add to logger. Must have only tensors (no images, etc)
+    Dict or OrderedDict
+    May have the following optional keys:
+        - progress_bar (dict for progress bar display ; only tensors)
+        - log (dict of metrics to add to logger ; only tensors).
.. note:: If you didn't define a validation_step, this won't be called.
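Similarly, a minimal sketch of overriding validation_epoch_end as documented above. The assumption that each validation_step returns a dict with a 'val_loss' tensor is illustrative only:

import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):  # hypothetical module, for illustration

    def validation_epoch_end(self, outputs):
        # `outputs` collects the per-batch dicts returned by validation_step
        # (assumed here to contain a 'val_loss' tensor).
        avg_loss = torch.stack([out['val_loss'] for out in outputs]).mean()
        # 'progress_bar' and 'log' are the optional keys named in the docstring.
        return {'progress_bar': {'val_loss': avg_loss},
                'log': {'val_loss': avg_loss}}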
2 changes: 1 addition & 1 deletion pytorch_lightning/trainer/training_loop.py
@@ -826,4 +826,4 @@ def _recursive_detach(in_dict):
            out_dict.update({k: v.detach()})
        else:
            out_dict.update({k: v})
    return out_dict
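The hunk above shows only the tail of `_recursive_detach`. As a rough, assumption-based sketch (not the file's verbatim body), such a helper typically walks a dict and detaches any tensor-like values it finds:

def recursive_detach(in_dict):  # illustrative name; the real helper is private
    """Return a copy of `in_dict` with all tensor values detached."""
    out_dict = {}
    for k, v in in_dict.items():
        if isinstance(v, dict):
            # recurse into nested dicts
            out_dict.update({k: recursive_detach(v)})
        elif callable(getattr(v, 'detach', None)):
            # anything exposing .detach() (e.g. a torch.Tensor) is detached
            out_dict.update({k: v.detach()})
        else:
            out_dict.update({k: v})
    return out_dict

Calling such a helper on a dict of step outputs (e.g. {'loss': loss_tensor}) yields tensors that no longer track gradients, so stored epoch outputs do not keep the computation graph alive.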
