
Commit

Remove opt from manual_backward in docs (Lightning-AI#6267)
akihironitta authored and kaushikb11 committed Mar 2, 2021
1 parent 0138f0a commit 548241b
Showing 2 changed files with 6 additions and 6 deletions.
8 changes: 4 additions & 4 deletions docs/source/common/lightning_module.rst
@@ -946,7 +946,7 @@ When set to ``False``, Lightning does not automate the optimization process. Thi
         opt = self.optimizers(use_pl_optimizer=True)

         loss = ...
-        self.manual_backward(loss, opt)
+        self.manual_backward(loss)
         opt.step()
         opt.zero_grad()
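
For readers skimming the diff, here is a minimal, self-contained sketch of how the updated single-optimizer pattern fits into a full LightningModule. The module name, layer sizes, loss, and learning rate are illustrative placeholders rather than part of this commit, and it assumes a Lightning version in which ``automatic_optimization`` can be set in ``__init__``.

    import torch
    from torch import nn
    import pytorch_lightning as pl


    class ManualOptExample(pl.LightningModule):
        def __init__(self):
            super().__init__()
            # turn off automatic optimization; we call backward/step ourselves
            self.automatic_optimization = False
            self.layer = nn.Linear(32, 1)

        def training_step(self, batch, batch_idx):
            opt = self.optimizers(use_pl_optimizer=True)
            x, y = batch
            loss = nn.functional.mse_loss(self.layer(x), y)
            # call style after this commit: no optimizer argument
            self.manual_backward(loss)
            opt.step()
            opt.zero_grad()

        def configure_optimizers(self):
            return torch.optim.SGD(self.parameters(), lr=0.1)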
@@ -961,16 +961,16 @@ In the multi-optimizer case, ignore the ``optimizer_idx`` argument and use the o
     def training_step(self, batch, batch_idx, optimizer_idx):
         # access your optimizers with use_pl_optimizer=False. Default is True
-        (opt_a, opt_b) = self.optimizers(use_pl_optimizer=True)
+        opt_a, opt_b = self.optimizers(use_pl_optimizer=True)

         gen_loss = ...
         opt_a.zero_grad()
-        self.manual_backward(gen_loss, opt_a)
+        self.manual_backward(gen_loss)
         opt_a.step()

         disc_loss = ...
         opt_b.zero_grad()
-        self.manual_backward(disc_loss, opt_b)
+        self.manual_backward(disc_loss)
         opt_b.step()

 --------------
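The multi-optimizer snippet above shows only the ``training_step``; a rough sketch of the surrounding pieces it assumes (automatic optimization turned off and two optimizers returned from ``configure_optimizers``) could look like the following. The class name, submodules, and learning rates are hypothetical.

    import torch
    from torch import nn
    import pytorch_lightning as pl


    class TwoOptimizerExample(pl.LightningModule):
        def __init__(self):
            super().__init__()
            # manual optimization: the training_step in the diff above
            # drives both optimizers itself
            self.automatic_optimization = False
            self.generator = nn.Linear(16, 16)
            self.discriminator = nn.Linear(16, 1)

        def configure_optimizers(self):
            # the returned order determines the order of self.optimizers()
            opt_a = torch.optim.Adam(self.generator.parameters(), lr=1e-3)
            opt_b = torch.optim.Adam(self.discriminator.parameters(), lr=1e-3)
            return opt_a, opt_b
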
4 changes: 2 additions & 2 deletions pytorch_lightning/core/lightning.py
@@ -1202,10 +1202,10 @@ def manual_backward(self, loss: Tensor, optimizer: Optional[Optimizer] = None, *
         Example::

             def training_step(...):
-                (opt_a, opt_b) = self.optimizers()
+                opt_a, opt_b = self.optimizers()
                 loss = ...
                 # automatically applies scaling, etc...
-                self.manual_backward(loss, opt_a)
+                self.manual_backward(loss)
                 opt_a.step()
         """
         if optimizer is not None:
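
The excerpt ends at the ``if optimizer is not None:`` guard; the body that follows is not shown in this commit. Purely as a hypothetical illustration of what such a guard typically does (this is not Lightning's actual source), it might warn that the deprecated optimizer argument is ignored:

    import warnings

    # hypothetical method-body sketch, not the real implementation
    def manual_backward(self, loss, optimizer=None, *args, **kwargs):
        if optimizer is not None:
            # the optimizer argument no longer has any effect
            warnings.warn(
                "`manual_backward(loss, optimizer)` is deprecated; "
                "call `manual_backward(loss)` instead.",
                DeprecationWarning,
            )
        ...  # backward pass with precision/scaling handling goes here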
