Improve docstring of base tuner and assessor #1669
Conversation
A new trial will run with this configuration.

This is the abstract base class for all tuners.
Tuning algorithms should derive this class and override :meth:`update_search_space`, :meth:`receive_trial_result`,
should inherit / be derived from this class
@ultmaster merged here, please create another PR for your comments.
to tell whether this trial can be early stopped or not.

This is the abstract base class for all assessors.
Early stopping algorithms should derive this class and override :meth:`assess_trial` method,
Same here.
it hints NNI framework that the trial is likely to result in a poor final accuracy,
and therefore should be killed to save resource.

If an accessor want's to get notified when a trial ends, it can also override :meth:`trial_end`.
wants to be notified
The NNI framework has little guarantee on ``trial_history``.
This method is not guaranteed to be invoked for each time ``trial_history`` get updated.
It is also possible that a trial's history keeps updateing after receiving a bad result.
updating
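The contract quoted above (override `assess_trial`, optionally `trial_end`) can be sketched standalone. The class below is a hypothetical illustration, not NNI's real base class: a real assessor derives from `nni.assessor.Assessor` and returns `AssessResult.Good` / `AssessResult.Bad`, while here string stand-ins keep the sketch self-contained.

```python
# Hypothetical sketch of the assess_trial / trial_end contract (median-stop
# rule). Names and the GOOD/BAD stand-ins are illustrative, not NNI's API.
GOOD, BAD = "Good", "Bad"

class MedianStopAssessor:
    """Early-stop a trial whose best result trails the median of finished trials."""

    def __init__(self):
        self._running = {}         # trial_job_id -> best intermediate result so far
        self._completed_best = []  # best result of each successfully finished trial

    def assess_trial(self, trial_job_id, trial_history):
        # trial_history: intermediate results reported so far (higher is better here).
        self._running[trial_job_id] = max(trial_history)
        if len(self._completed_best) < 3:
            return GOOD            # too few finished trials to judge
        ordered = sorted(self._completed_best)
        median = ordered[len(ordered) // 2]
        return GOOD if self._running[trial_job_id] >= median else BAD

    def trial_end(self, trial_job_id, success):
        # Called once when a trial ends; success=False means failed or early-stopped.
        best = self._running.pop(trial_job_id, None)
        if success and best is not None:
            self._completed_best.append(best)
```

Note the sketch only keeps the best value per trial, which is one way to live with the weak guarantees on ``trial_history`` discussed above: it does not matter how often, or whether, the history keeps growing after a bad result.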
trial_job_id: str
    Unique identifier of the trial.
success: bool
    True if the trial successfully completed; False if failed or terminated.
Maybe I don't know about Python. Shouldn't there be a `pass` here?
Pylint with default configuration will complain if there is a `pass`.
I tried to find some PEP or other official doc to clarify whether there should be a `pass` or not, but failed.
If Pylint didn't complain, I'm fine with it.
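For reference, a docstring by itself is a valid function body in Python, so no `pass` is needed. A minimal sketch (class and method names hypothetical) illustrating the point under discussion:

```python
class Assessor:
    """Sketch: a base class whose abstract method body is only a docstring."""

    def trial_end(self, trial_job_id, success):
        """Invoked when a trial ends.

        The docstring is itself an expression statement, so it already
        serves as the method body; an extra ``pass`` would be redundant,
        which is why default Pylint flags it.
        """
```

The class is fully usable as written; the no-op method simply returns None.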
_logger = logging.getLogger(__name__)


class Tuner(Recoverable):
    """
    Tuner is an AutoML algorithm, which generates a new configuration for the next try.
generates configurations to run with.
This sentence is copied from overview.
Do you think it should be updated?
Not sure. "For the next try" looks really Chi-english to me.
as well as :meth:`generate_parameters` or :meth:`generate_multiple_parameters`.

After initializing, NNI will first call :meth:`update_search_space` to tell tuner the feasible region,
and then call :meth:`generate_parameters` one or more times to request for hyper-parameter configurations.
Better mention `generate_parameter` here.
Do you mean `generate_parameters`? It's already there.
I mean `generate_multiple_parameters`. I think users should have a general idea about which method (multiple or not multiple) to override at the very beginning. You could move the explanation in `gen_mul_param` to here. Putting it in tutorials is better, but... never mind.
Oops. Seems the PR has been merged. Please ignore my comments. (No big problem here)
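To make the multiple-or-not question concrete, here is a hypothetical standalone sketch of the call sequence the docstring describes: NNI calls `update_search_space` once, then `generate_parameters` per trial, and `generate_multiple_parameters` is the batched form. Real tuners derive from `nni.tuner.Tuner`; the class name and search-space values below are illustrative assumptions only.

```python
import random

class RandomTuner:
    """Illustrative tuner: picks each hyper-parameter uniformly at random."""

    def __init__(self, seed=0):
        self._rng = random.Random(seed)
        self._space = {}
        self._results = {}  # parameter_id -> final metric

    def update_search_space(self, search_space):
        # Called first, with the feasible region in NNI's JSON style, e.g.
        # {"lr": {"_type": "choice", "_value": [0.1, 0.01, 0.001]}}
        self._space = search_space

    def generate_parameters(self, parameter_id, **kwargs):
        # Called once per trial; returns one concrete configuration.
        return {name: self._rng.choice(spec["_value"])
                for name, spec in self._space.items()}

    def generate_multiple_parameters(self, parameter_id_list, **kwargs):
        # Batched form: by default just one generate_parameters call per id,
        # which is why most tuners only need to override the singular method.
        return [self.generate_parameters(pid) for pid in parameter_id_list]

    def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
        # Called when a trial finishes with its final metric. A random tuner
        # ignores the feedback; a smarter tuner would learn from it.
        self._results[parameter_id] = value
```

The batched default above also shows why the docstring should point readers at one method or the other up front: overriding `generate_multiple_parameters` only matters for tuners that must produce configurations jointly rather than one at a time.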
Rewrite tuner and assessor docstring.
Currently `load_checkpoint`, `save_checkpoint`, and `import_data` are marked as internal APIs, because even as an internal developer I can't figure out how to use them myself.

There are some links to rst/md docs in the docstrings. I don't know how to add external links, so I simply wrote the file paths. Also, it seems there is a less verbose way to add cross-references; tell me if you know one.
This PR also hides `Tuner.accept_customized_trials` from the public interface. This API is likely to change if we refactor the advisor interface in the next release. (We can safely assume nobody is using it, because customized trials are not supported yet.)