
Commit 77661b8
Remove config dir from tutorial
himkt committed Nov 14, 2020
1 parent b2b0adc commit 77661b8
Showing 3 changed files with 15 additions and 17 deletions.
12 changes: 5 additions & 7 deletions docs/source/tutorial/advanced_optuna_configuration.rst
@@ -2,10 +2,7 @@ Advanced configuration for Optuna
 =================================
 
 You can choose a pruner/sample implemented in Optuna.
-To specify a pruner/sampler, create a JSON config file
-
-The example of `optuna.json <./config/optuna.json>`_ looks like:
-
+To specify a pruner/sampler, create a JSON config file.
 
 - ``optuna.json``
 
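For context, ``optuna.json`` names an Optuna pruner and sampler plus their constructor arguments. A minimal sketch, assuming the usual allennlp-optuna layout of ``type`` plus ``attributes`` keys; the concrete choices (``HyperbandPruner``, ``TPESampler``) are illustrative and not part of this commit:

.. code-block:: json

    {
      "pruner": {
        "type": "HyperbandPruner",
        "attributes": {
          "min_resource": 1,
          "reduction_factor": 5
        }
      },
      "sampler": {
        "type": "TPESampler",
        "attributes": {
          "n_startup_trials": 5
        }
      }
    }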
@@ -90,6 +87,7 @@ Next, we have to add `optuna_pruner` to `epoch_callbacks`.
   },
   trainer: {
     cuda_device: cuda_device,
+    // NOTE add `optuna_pruner` here!
     epoch_callbacks: [
       {
         type: 'optuna_pruner',
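The hunk above cuts off mid-block. Assembled, the trainer section would look roughly like this sketch; keys other than ``epoch_callbacks`` (``num_epochs``, ``optimizer``) are placeholders standing in for whatever the full tutorial config defines:

.. code-block:: jsonnet

    trainer: {
      cuda_device: cuda_device,
      // NOTE add `optuna_pruner` here!
      epoch_callbacks: [
        {
          type: 'optuna_pruner',
        },
      ],
      // Placeholder keys below; the real config defines its own.
      num_epochs: epochs,
      optimizer: {
        type: 'adam',
        lr: lr,
      },
    },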
@@ -110,8 +108,8 @@ Finally, you can run optimization with pruning:
 .. code-block:: bash
 
     poetry run allennlp tune \
-        config/imdb_optuna_with_pruning.jsonnet \
-        config/hparams.json \
-        --optuna-param-path config/optuna.json \
+        imdb_optuna_with_pruning.jsonnet \
+        hparams.json \
+        --optuna-param-path optuna.json \
         --serialization-dir result/hpo \
         --study-name test-with-pruning
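Once the study finishes, the winning trial can be read back out. allennlp-optuna ships a ``best-params`` subcommand for this; assuming it accepts the same ``--study-name`` option as ``tune`` (not shown in this commit, so treat the exact flags as an assumption), usage would be:

.. code-block:: bash

    # Assumed invocation; check `allennlp best-params --help` for the
    # flags your installed version actually supports.
    poetry run allennlp best-params \
        --study-name test-with-pruning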
14 changes: 7 additions & 7 deletions docs/source/tutorial/hyperparameter_optimization_at_scale.rst
@@ -9,9 +9,9 @@ You can easily run distributed optimization by adding an option
 .. code-block:: bash
 
     poetry run allennlp tune \
-        config/imdb_optuna.jsonnet \
-        config/hparams.json \
-        --optuna-param-path config/optuna.json \
+        imdb_optuna.jsonnet \
+        hparams.json \
+        --optuna-param-path optuna.json \
         --serialization-dir result \
         --study-name test \
         --skip-if-exists
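Since ``--skip-if-exists`` makes study creation idempotent, distributing the work is just a matter of launching the identical command from several shells or machines; every worker attaches to the shared study instead of failing on the duplicate name. A sketch:

.. code-block:: bash

    # Launch two workers in the background; both join the study "test"
    # and pull trials from the shared storage.
    for i in 1 2; do
        poetry run allennlp tune \
            imdb_optuna.jsonnet \
            hparams.json \
            --optuna-param-path optuna.json \
            --serialization-dir result \
            --study-name test \
            --skip-if-exists &
    done
    wait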
@@ -26,10 +26,10 @@ the command should be like following:
 .. code-block:: bash
 
     poetry run allennlp tune \
-        config/imdb_optuna.jsonnet \
-        config/hparams.json \
-        --optuna-param-path config/optuna.json \
-        --serialization-dir result \
+        imdb_optuna.jsonnet \
+        hparams.json \
+        --optuna-param-path optuna.json \
+        --serialization-dir result/distributed \
         --study-name test \
         --storage mysql://<user_name>:<passwd>@<db_host>/<db_name> \
         --skip-if-exists
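The MySQL database has to exist before the first worker connects. A one-line sketch for creating it, with the same placeholder credentials as the URL above:

.. code-block:: bash

    # Placeholders (<user_name>, <db_host>, <db_name>) match the
    # --storage URL; substitute your real values.
    mysql -u <user_name> -p -h <db_host> \
        -e "CREATE DATABASE IF NOT EXISTS <db_name>;"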
6 changes: 3 additions & 3 deletions docs/source/tutorial/run_allennlp_optuna.rst
@@ -10,8 +10,8 @@ You can optimize hyperparameters by:
 .. code-block:: bash
 
     poetry run allennlp tune \
-        config/imdb_optuna.jsonnet \
-        config/hparams.json \
+        imdb_optuna.jsonnet \
+        hparams.json \
         --serialization-dir result \
         --study-name test
 
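For reference, ``hparams.json`` declares the search space: each entry maps a hyperparameter name to the Optuna suggest call used to draw it. A minimal sketch, assuming the allennlp-optuna list-of-``type``/``attributes`` layout; the parameter names here (``embedding_dim``, ``dropout``) are made up for illustration:

.. code-block:: json

    [
      {
        "type": "int",
        "attributes": {
          "name": "embedding_dim",
          "low": 64,
          "high": 128
        }
      },
      {
        "type": "float",
        "attributes": {
          "name": "dropout",
          "low": 0.0,
          "high": 0.5
        }
      }
    ]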
@@ -31,6 +31,6 @@ Retrain a model with optimized hyperparameters
 .. code-block:: bash
 
     poetry run allennlp retrain \
-        config/imdb_optuna.jsonnet \
+        imdb_optuna.jsonnet \
         --serialization-dir retrain_result \
         --study-name test
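After retraining, the archive in ``retrain_result`` is a standard AllenNLP model, so it can be checked with the stock ``evaluate`` command; the test-file path below is a placeholder:

.. code-block:: bash

    # Placeholder path; point this at your actual evaluation data.
    poetry run allennlp evaluate \
        retrain_result/model.tar.gz \
        /path/to/test_data.jsonl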
