docs
rohitgr7 authored and Borda committed Mar 4, 2021
1 parent 40e1d47 commit 8c7c85a
Showing 1 changed file with 2 additions and 6 deletions.
8 changes: 2 additions & 6 deletions pytorch_lightning/core/hooks.py
@@ -581,16 +581,12 @@ def transfer_batch_to_device(self, batch: Any, device: Optional[torch.device] =
         For anything else, you need to define how the data is moved to the target device (CPU, GPU, TPU, ...).

         Note:
-            This hook only runs on single GPU training and DDP (no data-parallel).
             This hook should only transfer the data and not modify it, nor should it move the data to
             any other device than the one passed in as argument (unless you know what you are doing).
-            Data-Parallel support will come in near future.

         Note:
-            If you need multi-GPU support for your custom batch objects, you need to define your custom
-            :class:`~torch.nn.parallel.DistributedDataParallel` or
-            :class:`~pytorch_lightning.overrides.data_parallel.LightningDistributedDataParallel` and
-            override :meth:`~pytorch_lightning.core.lightning.LightningModule.configure_ddp`.
+            This hook only runs on single GPU training and DDP (no data-parallel).
+            Data-Parallel support will come in near future.

         Args:
             batch: A batch of data that needs to be transferred to a new device.
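For context, the hook documented in this diff is typically overridden in a LightningModule to move a custom batch object to the target device. A minimal sketch under that assumption (the CustomBatch container, its samples/targets fields, and MyModel are hypothetical and not part of this commit):

    from dataclasses import dataclass

    import torch
    from pytorch_lightning import LightningModule


    @dataclass
    class CustomBatch:
        # Hypothetical user-defined batch container holding tensors.
        samples: torch.Tensor
        targets: torch.Tensor


    class MyModel(LightningModule):
        def transfer_batch_to_device(self, batch, device):
            if isinstance(batch, CustomBatch):
                # Move only the tensors to the given device; do not otherwise
                # modify the batch (per the note in the docstring above).
                batch.samples = batch.samples.to(device)
                batch.targets = batch.targets.to(device)
                return batch
            # Fall back to the default handling for tensors and standard collections.
            return super().transfer_batch_to_device(batch, device)

As the updated note states, this hook runs only for single-GPU training and DDP, not data-parallel.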
