Commit: Update docs

himkt committed Nov 14, 2020
1 parent 6ea1420 commit b2b0adc
Showing 3 changed files with 90 additions and 2 deletions.
89 changes: 89 additions & 0 deletions docs/source/tutorial/advanced_optuna_configuration.rst
@@ -26,3 +26,92 @@ The example of `optuna.json <./config/optuna.json>`_ looks like:
    }
  }
}
Next, we have to add ``optuna_pruner`` to the trainer's ``epoch_callbacks``.

- ``imdb_optuna_with_pruning.jsonnet``

.. code-block:: text

  local batch_size = 64;
  local cuda_device = 0;
  local num_epochs = 15;
  local seed = 42;
  local embedding_dim = std.parseInt(std.extVar('embedding_dim'));
  local dropout = std.parseJson(std.extVar('dropout'));
  local lr = std.parseJson(std.extVar('lr'));
  local max_filter_size = std.parseInt(std.extVar('max_filter_size'));
  local num_filters = std.parseInt(std.extVar('num_filters'));
  local output_dim = std.parseInt(std.extVar('output_dim'));
  local ngram_filter_sizes = std.range(2, max_filter_size);

  {
    numpy_seed: seed,
    pytorch_seed: seed,
    random_seed: seed,
    dataset_reader: {
      lazy: false,
      type: 'text_classification_json',
      tokenizer: {
        type: 'spacy',
      },
      token_indexers: {
        tokens: {
          type: 'single_id',
          lowercase_tokens: true,
        },
      },
    },
    train_data_path: 'https://s3-us-west-2.amazonaws.com/allennlp/datasets/imdb/train.jsonl',
    validation_data_path: 'https://s3-us-west-2.amazonaws.com/allennlp/datasets/imdb/dev.jsonl',
    model: {
      type: 'basic_classifier',
      text_field_embedder: {
        token_embedders: {
          tokens: {
            embedding_dim: embedding_dim,
          },
        },
      },
      seq2vec_encoder: {
        type: 'cnn',
        embedding_dim: embedding_dim,
        ngram_filter_sizes: ngram_filter_sizes,
        num_filters: num_filters,
        output_dim: output_dim,
      },
      dropout: dropout,
    },
    data_loader: {
      shuffle: true,
      batch_size: batch_size,
    },
    trainer: {
      cuda_device: cuda_device,
      epoch_callbacks: [
        {
          type: 'optuna_pruner',
        }
      ],
      num_epochs: num_epochs,
      optimizer: {
        lr: lr,
        type: 'sgd',
      },
      validation_metric: '+accuracy',
    },
  }
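
Note that the ``optuna_pruner`` callback only takes effect when a pruner is configured on the Optuna side, i.e. in the ``optuna.json`` passed via ``--optuna-param-path``. A minimal sketch, assuming Optuna's ``HyperbandPruner`` (the attribute values here are illustrative, not prescriptive):

.. code-block:: json

  {
    "pruner": {
      "type": "HyperbandPruner",
      "attributes": {
        "min_resource": 1,
        "reduction_factor": 5
      }
    }
  }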
Finally, you can run optimization with pruning:

.. code-block:: bash

  poetry run allennlp tune \
    config/imdb_optuna_with_pruning.jsonnet \
    config/hparams.json \
    --optuna-param-path config/optuna.json \
    --serialization-dir result/hpo \
    --study-name test-with-pruning
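
Once the study completes, the tuned values can be read back from the study storage. Assuming the ``best-params`` subcommand that ``allennlp-optuna`` registers, a sketch:

.. code-block:: bash

  # Hypothetical follow-up: print the best hyperparameters found by the study.
  # Adjust the study name (and any storage options) to match your run.
  poetry run allennlp best-params \
    --study-name test-with-pruning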
2 changes: 1 addition & 1 deletion docs/source/tutorial/index.rst
@@ -7,6 +7,6 @@ Tutorial

   allennlp_configuration
   optuna_search_space
-  advanced_optuna_configuration
   run_allennlp_optuna
+  advanced_optuna_configuration
   hyperparameter_optimization_at_scale
1 change: 0 additions & 1 deletion docs/source/tutorial/run_allennlp_optuna.rst
@@ -12,7 +12,6 @@ You can optimize hyperparameters by:
   poetry run allennlp tune \
       config/imdb_optuna.jsonnet \
       config/hparams.json \
-      --optuna-param-path config/optuna.json \
       --serialization-dir result \
       --study-name test
