Commit 6d35584: updated docs
williamFalcon committed Mar 6, 2020
1 parent 538389a commit 6d35584
Showing 2 changed files with 32 additions and 4 deletions.
15 changes: 15 additions & 0 deletions docs/source/hyperparameters.rst
@@ -55,6 +55,21 @@ Now we can parametrize the LightningModule.
.. note:: Bonus! If ``hparams`` is an argument to your module's ``__init__``, Lightning will save it into the
   checkpoint and restore your model using exactly those hparams.
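
The note above hinges on the module taking ``hparams`` in its ``__init__`` and storing it on ``self``. A plain-Python stand-in for that pattern (the ``LitModel`` name is illustrative, and the Lightning base class is omitted so the snippet runs standalone; real code would subclass ``pl.LightningModule``):

```python
from argparse import Namespace

class LitModel:  # real code would subclass pl.LightningModule
    def __init__(self, hparams):
        # keeping hparams on self is what lets Lightning
        # persist them in the checkpoint and restore them later
        self.hparams = hparams

model = LitModel(Namespace(layer_1_dim=128, batch_size=64))
print(model.hparams.layer_1_dim)  # prints: 128
```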

We can also add all the flags available in the ``Trainer`` to the ``ArgumentParser``.

.. code-block:: python

    # add all the available Trainer options to the ArgParser
    parser = pl.Trainer.add_argparse_args(parser)
    args = parser.parse_args()

Now you can start your program with:

.. code-block:: bash

    # now you can use any trainer flag
    $ python main.py --num_nodes 2 --gpus 8

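
Putting the pieces together, the parsed namespace carries both your model flags and the Trainer flags, and you read any of them off ``args``. A minimal runnable sketch using only the standard library (the ``--layer_1_dim`` flag is illustrative, and the ``add_argparse_args`` call is shown commented out so the snippet runs without Lightning installed):

```python
from argparse import ArgumentParser

parser = ArgumentParser()
# model-specific flag (illustrative name)
parser.add_argument('--layer_1_dim', type=int, default=128)

# in a real script you would also register the Trainer flags here:
# parser = pl.Trainer.add_argparse_args(parser)

# parse an explicit argv list so the example is self-contained;
# a real script would call parser.parse_args() on sys.argv
args = parser.parse_args(['--layer_1_dim', '256'])
print(args.layer_1_dim)  # prints: 256
```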
Trainer args
^^^^^^^^^^^^

21 changes: 17 additions & 4 deletions docs/source/introduction_guide.rst
@@ -578,7 +578,7 @@ Notice the epoch is MUCH faster!
Hyperparameters
---------------
Normally, we don't hard-code values into a model. We usually use the command line to
modify the network.

@@ -591,9 +591,6 @@

.. code-block:: python

    parser.add_argument('--layer_2_dim', type=int, default=256)
    parser.add_argument('--batch_size', type=int, default=64)

Now we can parametrize the LightningModule.
@@ -626,6 +623,22 @@
.. note:: Bonus! If ``hparams`` is an argument to your module's ``__init__``, Lightning will save it into the
   checkpoint and restore your model using exactly those hparams.

We can also add all the flags available in the ``Trainer`` to the ``ArgumentParser``.

.. code-block:: python

    # add all the available Trainer options to the ArgParser
    parser = pl.Trainer.add_argparse_args(parser)
    args = parser.parse_args()

Now you can start your program with:

.. code-block:: bash

    # now you can use any trainer flag
    $ python main.py --num_nodes 2 --gpus 8

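
The flags registered by ``add_argparse_args`` mirror the ``Trainer`` constructor arguments, so the parsed namespace can configure the ``Trainer`` directly. A self-contained sketch using only the standard library (the two ``add_argument`` calls stand in for what ``add_argparse_args`` would register, and the commented ``Trainer`` call shows where the parsed values would go):

```python
from argparse import ArgumentParser

parser = ArgumentParser()
# stand-ins for two of the flags Trainer.add_argparse_args registers
parser.add_argument('--num_nodes', type=int, default=1)
parser.add_argument('--gpus', type=int, default=0)

# simulate: python main.py --num_nodes 2 --gpus 8
args = parser.parse_args(['--num_nodes', '2', '--gpus', '8'])

# the namespace then configures the Trainer, e.g.:
# trainer = pl.Trainer(num_nodes=args.num_nodes, gpus=args.gpus)
print(args.num_nodes, args.gpus)  # prints: 2 8
```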
For a full guide on using hyperparameters, `check out the hyperparameters docs <hyperparameters.rst>`_.

---------
