delete "break" for fast trail
raise ValueError('Unknown key %s and value %s' % (key, val))
return chosen_arch

class DngoTuner(Tuner):
@QuanluZhang Is DNGO an example of customized tuner or a built-in tuner?
DNGO is meant to be a built-in tuner. Let me check how to make it a built-in tuner.
return p0

class BayesianLinearRegression(BaseModel):
Why can't we use bayesian regression from third-party library (e.g., sklearn) so that we don't have to maintain one on our own?
Can we make pybnn our dependency? https://github.com/automl/pybnn
We do not use the whole https://github.com/automl/pybnn; only the DNGO-related part is included. So I am not sure about "Can we make pybnn our dependency?"
I mean, I can pip install pybnn and import pybnn, rather than copy the whole pybnn into our repo.
Please add unit tests for the DNGO tuner.
revert an irrelevant change
I have added the unit test. I will try it myself before updating this PR.
Finished the general update, but I want to wait to test the DNGO tuner with the help of Di Wu's benchmark, comparing its performance with other tuners, so that we can make sure the DNGO tuner works well.
warning has not been ignored, optional for to-do
@@ -76,3 +76,7 @@ tuners:
classArgsValidator: nni.algorithms.hpo.regularized_evolution_tuner.EvolutionClassArgsValidator
className: nni.algorithms.hpo.regularized_evolution_tuner.RegularizedEvolutionTuner
source: nni
builtinName: DNGOuner
Missing "-". Are you sure this works?
from nni.tuner import Tuner
import nni.parameter_expressions as parameter_expressions
from torch.distributions import Normal
from pybnn import DNGO
We should add pybnn as an optional dependency required by DNGO. See how PPO and BOHB do that.
ok, will change it this weekend
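For reference, one common way to gate an optional dependency like this is a small import guard that fails with an actionable install hint. This is an illustrative sketch in the spirit of the PPO/BOHB extras mentioned above; `require_optional` and the hint text are hypothetical names, not NNI's actual helper.

```python
import importlib

def require_optional(module_name, install_hint):
    """Import an optional dependency, or fail with an actionable message."""
    try:
        return importlib.import_module(module_name)
    except ImportError as err:
        raise ImportError(
            "%s is required for this tuner. Install it with: pip install %s"
            % (module_name, install_hint)
        ) from err

# e.g. at the top of dngo_tuner.py (illustrative only):
# pybnn = require_optional("pybnn", "pybnn")
# DNGO = pybnn.DNGO
```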
@@ -0,0 +1,117 @@
import random
I think an __init__.py is needed to make this importable.
fixed it.
"batch_size": {"_type": "choice", "_value": [16, 32, 64, 128]},
"hidden_size": {"_type": "choice", "_value": [128, 256, 512, 1024]},
"lr": {"_type": "choice", "_value": [0.0001, 0.001, 0.01, 0.1]},
"momentum": {"_type": "uniform", "_value": [0, 1]}
Does the DNGO tuner work on string choices, for example ["apple", "banana", "orange"]?
Yes, technically it supports any string choices, but it's quite meaningless.
But I don't think it's handled in your code.
Have you tested?
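For context, handling string choices in a numeric surrogate model such as DNGO usually means encoding each choice by its index in the search space and decoding back when emitting the configuration. A minimal sketch with illustrative names (not the PR's actual code):

```python
# Illustrative sketch: a numeric surrogate model cannot consume strings
# directly, so map each choice value to its index and back.
def encode_config(config, search_space):
    encoded = {}
    for key, spec in search_space.items():
        if spec["_type"] == "choice":
            encoded[key] = spec["_value"].index(config[key])  # value -> index
        else:
            encoded[key] = config[key]  # numeric parameters pass through
    return encoded

def decode_config(encoded, search_space):
    decoded = {}
    for key, spec in search_space.items():
        if spec["_type"] == "choice":
            decoded[key] = spec["_value"][encoded[key]]  # index -> value
        else:
            decoded[key] = encoded[key]
    return decoded
```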
Please fix the lint.
@@ -76,3 +76,7 @@ tuners:
classArgsValidator: nni.algorithms.hpo.regularized_evolution_tuner.EvolutionClassArgsValidator
className: nni.algorithms.hpo.regularized_evolution_tuner.RegularizedEvolutionTuner
source: nni
- builtinName: DNGOuner
Wrong spelling.
@@ -0,0 +1,18 @@
authorName: Ayan Mao
Remove your name.
className: DngoTuner
# Any parameter need to pass to your tuner class __init__ constructor
# can be specified in this optional classArgs field, for example
trial:
I think this config should be put into the examples folder, and should be written in V2 format.
agree
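For reference, a V2-format experiment config for this tuner might look roughly like the following; the tuner name, paths, and classArgs values are assumptions for illustration, not taken from the PR.

```yaml
# Hypothetical V2-format experiment config for the DNGO tuner.
# All values below are illustrative assumptions.
searchSpaceFile: search_space.json
trialCommand: python3 trial.py
trialCodeDirectory: .
trialConcurrency: 1
maxTrialNumber: 20
tuner:
  name: DNGOTuner          # assumed builtin name; check the registered name
  classArgs:
    optimize_mode: maximize
    sample_size: 1000
trainingService:
  platform: local
```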
@@ -0,0 +1,6 @@
{
This file should also be put into examples.
I see that you have removed example config.yml and the "string" choice for search space, and added UT and docs. Big thanks:)
docs/en_US/Tuner/BuiltinTuner.rst (Outdated)
* **optimize_mode** (*'maximize' or 'minimize'*) - If 'maximize', the tuner will target to maximize metrics. If 'minimize', the tuner will target to minimize metrics.
* **sample_size** (*int, default = 1000*) - Number of samples to select in each iteration. The best one will be picked from the sample.
-> The best one will be picked from the sample as the next trial.
updated
nni/algorithms/hpo/dngo_tuner.py (Outdated)

def _random_config(search_space, random_state):
    chosen_arch = {}
This tuner is not specific to NAS, so better to change "arch" to "config".
missed that one. sorry.
DNGO works as a NAS tuner
Scalable Bayesian Optimization Using Deep Neural Networks (DNGO): https://arxiv.org/pdf/1502.05700.pdf
Uses the DNGO model from https://github.com/automl/pybnn
How to test:
1. Make sure codeDir in config.yml is correct.
2. Run nnictl create -c examples/tuners/dngo_tuner/config.yml
Search space:
Supports both numerical and string types (a string choice is converted to its index so that DNGO can consume it).
Supports choice/randint/uniform/quniform/loguniform/qloguniform (by reusing nni.parameter_expressions).
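The random sampling support listed above can be sketched as follows. This mimics nni.parameter_expressions semantics with the standard library for illustration (the real helpers also clip quantized values to the search bounds); `random_config` is a hypothetical name, not the tuner's actual code.

```python
import math
import random

def random_config(search_space, rng=None):
    """Draw one random configuration from an NNI-style search space (sketch)."""
    rng = rng or random.Random()
    config = {}
    for key, spec in search_space.items():
        t, v = spec["_type"], spec["_value"]
        if t == "choice":
            config[key] = rng.choice(v)              # may be a string choice
        elif t == "randint":
            config[key] = rng.randrange(v[0], v[1])  # integer in [low, high)
        elif t == "uniform":
            config[key] = rng.uniform(v[0], v[1])
        elif t == "quniform":
            low, high, q = v
            config[key] = round(rng.uniform(low, high) / q) * q
        elif t == "loguniform":
            config[key] = math.exp(rng.uniform(math.log(v[0]), math.log(v[1])))
        elif t == "qloguniform":
            low, high, q = v
            sample = math.exp(rng.uniform(math.log(low), math.log(high)))
            config[key] = round(sample / q) * q
        else:
            raise ValueError('Unknown key %s and value %s' % (key, v))
    return config
```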