
DNGO tuner #3479

Merged
merged 24 commits into from May 28, 2021

Conversation

@98may (Contributor) commented Mar 26, 2021

DNGO works as a NAS tuner.
Scalable Bayesian Optimization Using Deep Neural Networks (DNGO): https://arxiv.org/pdf/1502.05700.pdf
Uses the DNGO model from https://github.com/automl/pybnn

How to test:
1. Make sure codeDir in config.yml is correct.
2. Run nnictl create -c examples/tuners/dngo_tuner/config.yml.

Search space:
- Supports both numerical and string types (a string choice is converted to its index, so that DNGO can consume it as a numeric value; see the sketch below).
- Supports choice/randint/uniform/quniform/loguniform/qloguniform (by reusing nni.parameter_expressions).
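
A minimal sketch of that string-to-index conversion (assumed behavior for illustration; encode/decode are hypothetical helper names, not this PR's actual code):

```python
# Hypothetical helpers showing how a string "choice" can be fed to DNGO:
# the tuner optimizes over numeric indices and maps the winning index
# back to the original string when emitting parameters for a trial.
choices = ['apple', 'banana', 'orange']

def encode(value):
    # string -> numeric feature the DNGO surrogate can consume
    return choices.index(value)

def decode(index):
    # numeric suggestion -> concrete string parameter for the trial
    return choices[int(round(index))]

assert decode(encode('banana')) == 'banana'
```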

examples/trials/mnist-pytorch/mnist.py (outdated review thread; resolved)
raise ValueError('Unknown key %s and value %s' % (key, val))
return chosen_arch

class DngoTuner(Tuner):
Contributor: @QuanluZhang Is DNGO an example of a customized tuner, or a built-in tuner?

Contributor Author: DNGO is meant to be a built-in tuner. Let me check how to make it a built-in tuner.

return p0


class BayesianLinearRegression(BaseModel):
Contributor: Why can't we use Bayesian regression from a third-party library (e.g., sklearn), so that we don't have to maintain one on our own?
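
For reference, a minimal sketch of the sklearn alternative suggested here, using sklearn.linear_model.BayesianRidge (whether it can replace the copied implementation, e.g. on top of the DNGO network's learned basis functions, is exactly the open question in this thread):

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

# Toy stand-in for features from the DNGO network's last hidden layer.
X = np.random.rand(20, 5)
y = np.random.rand(20)

model = BayesianRidge()
model.fit(X, y)
# Predictive mean and standard deviation: the two quantities a
# Bayesian-optimization acquisition function needs.
mean, std = model.predict(X, return_std=True)
```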

Contributor: Can we make pybnn our dependency? https://github.com/automl/pybnn

Contributor Author: We do not use the whole of https://github.com/automl/pybnn; only the DNGO-related part is included. So I am not sure about "Can we make pybnn our dependency?"

Contributor: I mean, I can pip install pybnn and import pybnn, rather than copying the whole of pybnn into our repo.

@ultmaster (Contributor): Please add unit tests for the DNGO tuner.

@kvartet mentioned this pull request Apr 26, 2021
@ultmaster self-assigned this Apr 28, 2021
@ultmaster added the HPO label Apr 28, 2021
Commit: revert an irrelevant change
@98may (Contributor Author) left a comment: I have added the unit test. I will try it myself before updating this PR.
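
A hypothetical shape for such a test, exercising the standard nni Tuner interface (update_search_space / generate_parameters / receive_trial_result); the actual test added in this PR may differ:

```python
import random

def test_dngo_tuner():
    # DngoTuner's import path is illustrative; the PR decides the final location.
    tuner = DngoTuner(optimize_mode='maximize')
    search_space = {'lr': {'_type': 'choice', '_value': [0.001, 0.01, 0.1]}}
    tuner.update_search_space(search_space)
    for trial_id in range(5):
        # Every suggestion must come from the declared search space.
        params = tuner.generate_parameters(trial_id)
        assert params['lr'] in search_space['lr']['_value']
        # Feed back a dummy metric so the surrogate model gets trained.
        tuner.receive_trial_result(trial_id, params, random.random())
```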

examples/trials/mnist-pytorch/mnist.py (outdated review thread; resolved)
@98may (Contributor Author) commented May 14, 2021: Finished the general update, but I want to wait to test the DNGO tuner with the help of Di Wu's benchmark, comparing its performance with other tuners, so that we can make sure the DNGO tuner works well.

@98may marked this pull request as draft May 14, 2021 02:28
@98may marked this pull request as ready for review May 21, 2021 09:20
@98may requested a review from ultmaster May 21, 2021 09:20
@@ -76,3 +76,7 @@ tuners:
classArgsValidator: nni.algorithms.hpo.regularized_evolution_tuner.EvolutionClassArgsValidator
className: nni.algorithms.hpo.regularized_evolution_tuner.RegularizedEvolutionTuner
source: nni
builtinName: DNGOuner
Contributor: Missing -. Are you sure this works?

from nni.tuner import Tuner
import nni.parameter_expressions as parameter_expressions
from torch.distributions import Normal
from pybnn import DNGO
Contributor: We should add pybnn as an optional dependency required by DNGO. See how PPO and BOHB do that.

Contributor Author: OK, will change it this weekend.
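
A common pattern for such an optional dependency (a sketch; the exact error message, and whether nni defines an extras name for this, are assumptions):

```python
try:
    from pybnn import DNGO
except ImportError:
    # Surface a clear installation hint instead of a bare ImportError
    # when the optional dependency is missing.
    raise ImportError(
        'DNGO tuner requires pybnn; please run "pip install pybnn" first.')
```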

@@ -0,0 +1,117 @@
import random
Contributor: I think an __init__.py is needed to make this importable.

Contributor Author: Fixed it.

"batch_size": {"_type":"choice", "_value": [16, 32, 64, 128]},
"hidden_size":{"_type":"choice","_value":[128, 256, 512, 1024]},
"lr":{"_type":"choice","_value":[0.0001, 0.001, 0.01, 0.1]},
"momentum":{"_type":"uniform","_value":[0, 1]}
Contributor: Does the DNGO tuner work on string choices, for example ["apple", "banana", "orange"]?

Contributor Author: Yes, technically it supports any string choices, but it's quite meaningless.

Contributor: But I don't think it's handled in your code. Have you tested?

@ultmaster (Contributor): Please fix the lint.

@@ -76,3 +76,7 @@ tuners:
classArgsValidator: nni.algorithms.hpo.regularized_evolution_tuner.EvolutionClassArgsValidator
className: nni.algorithms.hpo.regularized_evolution_tuner.RegularizedEvolutionTuner
source: nni
- builtinName: DNGOuner
Contributor: Wrong spelling.

"batch_size": {"_type":"choice", "_value": [16, 32, 64, 128]},
"hidden_size":{"_type":"choice","_value":[128, 256, 512, 1024]},
"lr":{"_type":"choice","_value":[0.0001, 0.001, 0.01, 0.1]},
"momentum":{"_type":"uniform","_value":[0, 1]}
Copy link
Contributor

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

But I don't think it's handled in your code.

Have you tested?

@@ -0,0 +1,18 @@
authorName: Ayan Mao
Contributor: Remove your name.

className: DngoTuner
# Any parameter need to pass to your tuner class __init__ constructor
# can be specified in this optional classArgs field, for example
trial:
Contributor: I think this config should be put into the examples folder, and it should be written in the V2 format.

Contributor: Agree.
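
For context, the requested V2 format could look roughly like this with nni's Python experiment API (a sketch: the field names follow the API as documented around this time, and the tuner name assumes the registration this PR adds):

```python
from nni.experiment import Experiment

# Search space mirrors the example discussed in this PR.
search_space = {
    'batch_size': {'_type': 'choice', '_value': [16, 32, 64, 128]},
    'hidden_size': {'_type': 'choice', '_value': [128, 256, 512, 1024]},
    'lr': {'_type': 'choice', '_value': [0.0001, 0.001, 0.01, 0.1]},
    'momentum': {'_type': 'uniform', '_value': [0, 1]},
}

experiment = Experiment('local')
experiment.config.trial_command = 'python3 mnist.py'
experiment.config.trial_code_directory = '.'
experiment.config.search_space = search_space
experiment.config.tuner.name = 'DNGOTuner'   # name as registered by this PR
experiment.config.tuner.class_args = {'optimize_mode': 'maximize'}
experiment.config.max_trial_number = 20
experiment.config.trial_concurrency = 1
experiment.run(8080)
```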

@@ -0,0 +1,6 @@
{
Contributor: This file should also be put into the examples folder.

@98may (Contributor Author) left a comment: I see that you have removed the example config.yml and the "string" choice from the search space, and added the UT and docs. Big thanks :)



* **optimize_mode** (*'maximize' or 'minimize'*\ ) - If 'maximize', the tuner will target to maximize metrics. If 'minimize', the tuner will target to minimize metrics.
* **sample_size** (*int, default = 1000*) - Number of samples to select in each iteration. The best one will be picked from the sample.
Contributor: -> The best one will be picked from the sample as the next trial.

Contributor: Updated.
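
A simplified sketch of the sample-then-select step that sample_size controls (assumed logic; propose_next and sample_candidates are illustrative names, and a real implementation may score candidates with an acquisition function such as expected improvement rather than the raw predictive mean):

```python
import numpy as np

def propose_next(model, sample_candidates, sample_size=1000, maximize=True):
    # Draw sample_size random configurations, score each with the
    # surrogate model, and return the best one as the next trial.
    candidates = [sample_candidates() for _ in range(sample_size)]
    mean, variance = model.predict(np.array(candidates))  # pybnn-style predict
    scores = mean if maximize else -mean
    return candidates[int(np.argmax(scores))]
```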



def _random_config(search_space, random_state):
chosen_arch = {}
Contributor: This tuner is not specific to NAS, so it is better to change "arch" to "config".

Contributor: Missed that one, sorry.
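
A sketch of what the renamed helper could look like, built on the nni.parameter_expressions functions the PR description says it reuses (the exact set of branches is assumed from the fragments quoted above):

```python
import nni.parameter_expressions as parameter_expressions

def _random_config(search_space, random_state):
    # Draw one random configuration, dispatching on each parameter's _type.
    chosen_config = {}
    for key, val in search_space.items():
        if val['_type'] == 'choice':
            chosen_config[key] = parameter_expressions.choice(val['_value'], random_state)
        elif val['_type'] == 'randint':
            chosen_config[key] = parameter_expressions.randint(
                val['_value'][0], val['_value'][1], random_state)
        elif val['_type'] == 'uniform':
            chosen_config[key] = parameter_expressions.uniform(
                val['_value'][0], val['_value'][1], random_state)
        elif val['_type'] == 'quniform':
            chosen_config[key] = parameter_expressions.quniform(
                val['_value'][0], val['_value'][1], val['_value'][2], random_state)
        elif val['_type'] == 'loguniform':
            chosen_config[key] = parameter_expressions.loguniform(
                val['_value'][0], val['_value'][1], random_state)
        elif val['_type'] == 'qloguniform':
            chosen_config[key] = parameter_expressions.qloguniform(
                val['_value'][0], val['_value'][1], val['_value'][2], random_state)
        else:
            raise ValueError('Unknown key %s and value %s' % (key, val))
    return chosen_config
```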

@ultmaster merged commit 580c597 into microsoft:master May 28, 2021