
Document exceptions in callbacks #5541

Merged: 31 commits, Feb 15, 2021
Commits
aacefdb
Add Raises: section to docstring
akihironitta Jan 16, 2021
f2c22ed
Merge branch 'release/1.2-dev' into docs/add-raises-section
akihironitta Jan 19, 2021
1420760
Add Raises section to the docs
akihironitta Jan 20, 2021
b059846
Merge branch 'release/1.2-dev' into docs/add-raises-section
akihironitta Jan 28, 2021
f65fbe5
Merge branch 'release/1.2-dev' into docs/add-raises-section
akihironitta Feb 1, 2021
04aaa2d
Merge branch 'release/1.2-dev' of github.com:PyTorchLightning/pytorch…
akihironitta Feb 10, 2021
84ae9f1
Add raises section to the docs
akihironitta Feb 10, 2021
9a60ec2
Merge branch 'master' into docs/add-raises-section
akihironitta Feb 12, 2021
8cbb72f
Apply suggestions from code review
akihironitta Feb 12, 2021
9f14b62
fix
akihironitta Feb 12, 2021
94c8acc
Remove unnecessary instance check
akihironitta Feb 12, 2021
0c4c3ea
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 12, 2021
23c170d
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 12, 2021
33957a7
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 12, 2021
d96b2f1
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 12, 2021
18b2314
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 13, 2021
cf1b330
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 13, 2021
7e4f96d
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 13, 2021
8f6a156
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 13, 2021
670efbf
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 13, 2021
902074a
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 13, 2021
a2a89b8
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 13, 2021
4870471
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 13, 2021
a5e7d0b
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 13, 2021
5dccf5a
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 13, 2021
480c4d3
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 13, 2021
0d4a0bf
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 13, 2021
3f4f364
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 13, 2021
9783f92
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 14, 2021
852a502
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 14, 2021
ba11500
Merge branch 'master' into docs/add-raises-section
mergify[bot] Feb 15, 2021
6 changes: 6 additions & 0 deletions pytorch_lightning/callbacks/early_stopping.py
@@ -54,6 +54,12 @@ class EarlyStopping(Callback):
strict: whether to crash the training if `monitor` is
not found in the validation metrics. Default: ``True``.

Raises:
MisconfigurationException:
If ``mode`` is none of ``"min"``, ``"max"``, or ``"auto"``.
RuntimeError:
If the metric ``monitor`` is not available.

Example::

>>> from pytorch_lightning import Trainer
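The documented ``MisconfigurationException`` for an invalid ``mode`` can be reproduced with a minimal sketch of the check. The ``validate_mode`` helper and the local ``MisconfigurationException`` class below are illustrative stand-ins, not Lightning's actual code:

```python
class MisconfigurationException(ValueError):
    """Local stand-in for pytorch_lightning.utilities.exceptions.MisconfigurationException."""

def validate_mode(mode: str) -> str:
    # Mirrors the documented contract: only "min", "max", and "auto" are accepted.
    allowed = ("min", "max", "auto")
    if mode not in allowed:
        raise MisconfigurationException(
            f"`mode` can be any of {allowed}, got mode={mode!r}"
        )
    return mode
```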
8 changes: 6 additions & 2 deletions pytorch_lightning/callbacks/finetuning.py
@@ -318,8 +318,12 @@ def __init__(
self.verbose = verbose

def on_fit_start(self, trainer, pl_module):
if hasattr(pl_module, "backbone") and \
(isinstance(pl_module.backbone, Module) or isinstance(pl_module.backbone, Sequential)):
"""
Raises:
MisconfigurationException:
If the LightningModule has no ``nn.Module`` ``backbone`` attribute.
"""
if hasattr(pl_module, "backbone") and isinstance(pl_module.backbone, Module):
return
raise MisconfigurationException("The LightningModule should have a nn.Module `backbone` attribute")

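The simplified ``on_fit_start`` check can be sketched without torch installed, using a stand-in ``Module`` base class (both ``Module`` and ``MisconfigurationException`` here are hypothetical substitutes for the real ``torch.nn.Module`` and Lightning exception):

```python
class Module:
    """Stand-in for torch.nn.Module."""

class MisconfigurationException(ValueError):
    """Stand-in for pytorch_lightning's exception."""

def check_backbone(pl_module) -> None:
    # The diff collapses the old two-part test to a single isinstance check:
    # Sequential subclasses Module, so testing for Module alone covers both.
    if hasattr(pl_module, "backbone") and isinstance(pl_module.backbone, Module):
        return
    raise MisconfigurationException(
        "The LightningModule should have a nn.Module `backbone` attribute"
    )
```

The removal of the separate ``isinstance(pl_module.backbone, Sequential)`` branch is behavior-preserving because ``torch.nn.Sequential`` is itself a subclass of ``torch.nn.Module``.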
4 changes: 4 additions & 0 deletions pytorch_lightning/callbacks/gpu_stats_monitor.py
@@ -48,6 +48,10 @@ class GPUStatsMonitor(Callback):
temperature: Set to ``True`` to monitor the memory and GPU temperature in degrees Celsius.
Default: ``False``.

Raises:
MisconfigurationException:
If NVIDIA driver is not installed, not running on GPUs, or ``Trainer`` has no logger.

Example::

>>> from pytorch_lightning import Trainer
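One of the documented preconditions, NVIDIA driver availability, amounts to checking that ``nvidia-smi`` is on ``PATH``. This ``check_nvidia_smi`` helper and the local exception class are illustrative sketches, not the callback's actual code:

```python
import shutil

class MisconfigurationException(ValueError):
    """Stand-in for pytorch_lightning's exception."""

def check_nvidia_smi() -> str:
    # The NVIDIA driver ships nvidia-smi; if it cannot be found,
    # GPU stats cannot be queried and the callback is misconfigured.
    path = shutil.which("nvidia-smi")
    if path is None:
        raise MisconfigurationException(
            "Cannot use GPUStatsMonitor callback because NVIDIA driver is not installed."
        )
    return path
```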
pytorch_lightning/callbacks/gradient_accumulation_scheduler.py
@@ -32,6 +32,13 @@ class GradientAccumulationScheduler(Callback):
Args:
scheduling: scheduling in format {epoch: accumulation_factor}

Raises:
TypeError:
If ``scheduling`` is an empty ``dict``, or
if not all keys and values of ``scheduling`` are integers.
IndexError:
If ``minimal_epoch`` is less than 0.

Example::

>>> from pytorch_lightning import Trainer
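The ``TypeError``/``IndexError`` contract above can be sketched as a standalone validator. The ``validate_scheduling`` helper and its messages are illustrative, not the scheduler's actual implementation:

```python
def validate_scheduling(scheduling: dict) -> int:
    # Mirrors the documented Raises: contract of GradientAccumulationScheduler.
    if not scheduling:
        raise TypeError("Empty dict cannot be interpreted correctly")
    for epoch, factor in scheduling.items():
        if not isinstance(epoch, int) or not isinstance(factor, int):
            raise TypeError("All epochs and accumulation factors must be integers")
    minimal_epoch = min(scheduling)
    if minimal_epoch < 0:
        raise IndexError(f"Epochs must be non-negative, got minimal_epoch={minimal_epoch}")
    return minimal_epoch
```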
8 changes: 8 additions & 0 deletions pytorch_lightning/callbacks/lr_monitor.py
@@ -38,6 +38,10 @@ class LearningRateMonitor(Callback):
log_momentum: option to also log the momentum values of the optimizer, if the optimizer
has the ``momentum`` or ``betas`` attribute. Defaults to ``False``.

Raises:
MisconfigurationException:
If ``logging_interval`` is none of ``"step"``, ``"epoch"``, or ``None``.

Example::

>>> from pytorch_lightning import Trainer
@@ -77,6 +81,10 @@ def on_train_start(self, trainer, *args, **kwargs):
Called before training; determines unique names for all lr
schedulers in the case of multiple schedulers of the same type or
multiple parameter groups.

Raises:
MisconfigurationException:
If ``Trainer`` has no ``logger``.
"""
if not trainer.logger:
raise MisconfigurationException(
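The logger guard shown in the ``on_train_start`` hunk can be exercised in isolation. ``FakeTrainer`` and the local exception class are hypothetical stand-ins exposing only the attribute the check reads:

```python
class MisconfigurationException(ValueError):
    """Stand-in for pytorch_lightning's exception."""

class FakeTrainer:
    """Minimal stand-in for Trainer, holding only a `logger` attribute."""
    def __init__(self, logger=None):
        self.logger = logger

def check_logger(trainer) -> None:
    # Mirrors the guard in on_train_start shown in the diff.
    if not trainer.logger:
        raise MisconfigurationException(
            "Cannot use LearningRateMonitor callback with Trainer that has no logger."
        )
```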
8 changes: 8 additions & 0 deletions pytorch_lightning/callbacks/model_checkpoint.py
@@ -115,6 +115,14 @@ class ModelCheckpoint(Callback):
For example, you can change the default last checkpoint name by doing
``checkpoint_callback.CHECKPOINT_NAME_LAST = "{epoch}-last"``

Raises:
MisconfigurationException:
If ``save_top_k`` is neither ``None`` nor an integer greater than or equal to ``-1``,
if ``monitor`` is ``None`` and ``save_top_k`` is none of ``None``, ``-1``, or ``0``, or
if ``mode`` is none of ``"min"``, ``"max"``, or ``"auto"``.
ValueError:
If ``trainer.save_checkpoint`` is ``None``.

Example::

>>> from pytorch_lightning import Trainer
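The three ``MisconfigurationException`` conditions documented for ``ModelCheckpoint`` can be sketched as a standalone validator. ``validate_checkpoint_args`` and its messages are illustrative stand-ins, not the callback's actual code:

```python
class MisconfigurationException(ValueError):
    """Stand-in for pytorch_lightning's exception."""

def validate_checkpoint_args(save_top_k, monitor, mode) -> None:
    # save_top_k must be None or an int >= -1.
    if save_top_k is not None and save_top_k < -1:
        raise MisconfigurationException(
            f"Invalid value for save_top_k={save_top_k}. Must be >= -1"
        )
    # Without a monitored metric, only the "save nothing / save last / save all"
    # settings (0, -1, None) make sense.
    if monitor is None and save_top_k not in (None, -1, 0):
        raise MisconfigurationException(
            f"ModelCheckpoint(save_top_k={save_top_k}, monitor=None) is not a valid configuration."
        )
    if mode not in ("min", "max", "auto"):
        raise MisconfigurationException(
            f"`mode` can be min, max, or auto, got mode={mode!r}"
        )
```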
13 changes: 13 additions & 0 deletions pytorch_lightning/callbacks/pruning.py
@@ -135,6 +135,14 @@ def __init__(

verbose: Verbosity level. 0 to disable, 1 to log overall sparsity, 2 to log per-layer sparsity

Raises:
MisconfigurationException:
If ``parameter_names`` is neither ``"weight"`` nor ``"bias"``,
if the provided ``pruning_fn`` is not supported,
if ``pruning_dim`` is not provided when ``"unstructured"``,
if ``pruning_norm`` is not provided when ``"ln_structured"``,
if ``pruning_fn`` is neither ``str`` nor :class:`torch.nn.utils.prune.BasePruningMethod`, or
if ``amount`` is none of ``int``, ``float``, or ``Callable``.
"""

self._use_global_unstructured = use_global_unstructured
@@ -382,6 +390,11 @@ def sanitize_parameters_to_prune(
"""
This function is responsible for sanitizing ``parameters_to_prune`` and ``parameter_names``.
If ``parameters_to_prune`` is ``None``, it will be generated with all parameters of the model.

Raises:
MisconfigurationException:
If ``parameters_to_prune`` doesn't exist in the model, or
if ``parameters_to_prune`` is neither a list of tuples nor ``None``.
"""
parameters = parameter_names or ModelPruning.PARAMETER_NAMES

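The ``sanitize_parameters_to_prune`` contract documented above can be sketched in isolation. ``check_parameters_to_prune`` and the local exception class are hypothetical stand-ins, not ``ModelPruning``'s actual implementation:

```python
class MisconfigurationException(ValueError):
    """Stand-in for pytorch_lightning's exception."""

def check_parameters_to_prune(parameters_to_prune):
    # Documented contract: either None (meaning "generate from all parameters
    # of the model") or a list of (module, parameter_name) tuples.
    if parameters_to_prune is None:
        return []
    if isinstance(parameters_to_prune, (list, tuple)) and all(
        isinstance(item, tuple) and len(item) == 2 for item in parameters_to_prune
    ):
        return list(parameters_to_prune)
    raise MisconfigurationException(
        "The provided `parameters_to_prune` should either be a list of tuples or None"
    )
```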