diff --git a/docs/source/tutorial/advanced_optuna_configuration.rst b/docs/source/tutorial/advanced_optuna_configuration.rst
index 8cd1997..7dd3f91 100644
--- a/docs/source/tutorial/advanced_optuna_configuration.rst
+++ b/docs/source/tutorial/advanced_optuna_configuration.rst
@@ -2,10 +2,7 @@ Advanced configuration for Optuna
 =================================
 
 You can choose a pruner/sample implemented in Optuna.
-To specify a pruner/sampler, create a JSON config file
-
-The example of `optuna.json <./config/optuna.json>`_ looks like:
-
+To specify a pruner/sampler, create a JSON config file.
 
 - ``optuna.json``
 
@@ -90,6 +87,7 @@ Next, we have to add `optuna_pruner` to `epoch_callbacks`.
     },
     trainer: {
       cuda_device: cuda_device,
+      // NOTE add `optuna_pruner` here!
       epoch_callbacks: [
         {
           type: 'optuna_pruner',
@@ -110,8 +108,8 @@ Finally, you can run optimization with pruning:
 .. code-block:: bash
 
   poetry run allennlp tune \
-    config/imdb_optuna_with_pruning.jsonnet \
-    config/hparams.json \
-    --optuna-param-path config/optuna.json \
+    imdb_optuna_with_pruning.jsonnet \
+    hparams.json \
+    --optuna-param-path optuna.json \
     --serialization-dir result/hpo \
     --study-name test-with-pruning
diff --git a/docs/source/tutorial/hyperparameter_optimization_at_scale.rst b/docs/source/tutorial/hyperparameter_optimization_at_scale.rst
index 34d91d2..14cfe06 100644
--- a/docs/source/tutorial/hyperparameter_optimization_at_scale.rst
+++ b/docs/source/tutorial/hyperparameter_optimization_at_scale.rst
@@ -9,9 +9,9 @@ You can easily run distributed optimization by adding an option
 .. code-block:: bash
 
   poetry run allennlp tune \
-    config/imdb_optuna.jsonnet \
-    config/hparams.json \
-    --optuna-param-path config/optuna.json \
+    imdb_optuna.jsonnet \
+    hparams.json \
+    --optuna-param-path optuna.json \
     --serialization-dir result \
     --study-name test \
     --skip-if-exists
@@ -26,9 +26,9 @@ the command should be like following:
 .. code-block:: bash
 
   poetry run allennlp tune \
-    config/imdb_optuna.jsonnet \
-    config/hparams.json \
-    --optuna-param-path config/optuna.json \
+    imdb_optuna.jsonnet \
+    hparams.json \
+    --optuna-param-path optuna.json \
     --serialization-dir result \
     --study-name test \
     --storage mysql://<user_name>:<passwd>@<db_host>/<db_name> \
diff --git a/docs/source/tutorial/run_allennlp_optuna.rst b/docs/source/tutorial/run_allennlp_optuna.rst
index cf5b73e..ff77342 100644
--- a/docs/source/tutorial/run_allennlp_optuna.rst
+++ b/docs/source/tutorial/run_allennlp_optuna.rst
@@ -10,8 +10,8 @@ You can optimize hyperparameters by:
 .. code-block:: bash
 
   poetry run allennlp tune \
-    config/imdb_optuna.jsonnet \
-    config/hparams.json \
+    imdb_optuna.jsonnet \
+    hparams.json \
     --serialization-dir result \
     --study-name test
 
@@ -31,6 +31,6 @@ Retrain a model with optimized hyperparameters
 .. code-block:: bash
 
   poetry run allennlp retrain \
-    config/imdb_optuna.jsonnet \
+    imdb_optuna.jsonnet \
     --serialization-dir retrain_result \
     --study-name test
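
The hunks above mention the ``optuna.json`` pruner/sampler config but end before its body. For orientation, here is a minimal sketch of what such a file could look like, assuming a layout where each entry names an Optuna class under ``type`` and passes constructor arguments under ``attributes``; the ``HyperbandPruner``/``TPESampler`` choices and the attribute values are illustrative, not taken from the diff.

.. code-block:: json

  {
    "pruner": {
      "type": "HyperbandPruner",
      "attributes": {
        "min_resource": 1,
        "reduction_factor": 5
      }
    },
    "sampler": {
      "type": "TPESampler",
      "attributes": {
        "n_startup_trials": 5
      }
    }
  }

Any other pruner/sampler shipped with your Optuna version could be substituted here in the same way.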