This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

Improve docstring of base tuner and assessor #1669

Merged
6 commits merged into microsoft:master on Nov 1, 2019

Conversation

liuzhe-lz (Contributor) commented Oct 30, 2019

Rewrite tuner and assessor docstrings.

Currently load_checkpoint, save_checkpoint, and import_data are marked as internal APIs, because even as an internal developer I can't figure out how to use them myself.

There are some links to rst/md docs in the docstrings. I don't know how to add external links, so I simply wrote the file paths.

Also, it seems there should be a less verbose way to add cross-references; tell me if you know one.

This PR also hides Tuner.accept_customized_trials from the public interface. This API is likely to change if we refactor the advisor interface in the next release. (We can safely assume nobody is using it, because customized trials are not supported yet.)

@liuzhe-lz liuzhe-lz marked this pull request as ready for review November 1, 2019 09:07
@QuanluZhang QuanluZhang merged commit 0521a0c into microsoft:master Nov 1, 2019
A new trial will run with this configuration.

This is the abstract base class for all tuners.
Tuning algorithms should derive this class and override :meth:`update_search_space`, :meth:`receive_trial_result`,
Contributor

should inherit / be derived from this class

Contributor

@ultmaster merged here, please create another PR for your comments
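For readers following the hunk under review, here is a minimal sketch of what the docstring describes: a tuning algorithm deriving the base class and overriding the named methods. The class is a self-contained stand-in for a subclass of NNI's `Tuner` (the real base class is not imported here), and the random-sampling logic and `choice`-style search-space format are illustrative assumptions.

```python
import random

class RandomTuner:
    """Illustrative stand-in for a class derived from nni.Tuner."""

    def update_search_space(self, search_space):
        # NNI calls this first to tell the tuner the feasible region.
        self.search_space = search_space

    def generate_parameters(self, parameter_id, **kwargs):
        # Called one or more times to request a hyper-parameter configuration;
        # the {'_type': 'choice', '_value': [...]} shape is an assumption here.
        return {name: random.choice(spec['_value'])
                for name, spec in self.search_space.items()}

    def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
        # Called when a trial reports its final metric.
        print(f'Trial {parameter_id} finished with {value}')
```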

to tell whether this trial can be early stopped or not.

This is the abstract base class for all assessors.
Early stopping algorithms should derive this class and override :meth:`assess_trial` method,
Contributor

Same here.

it hints NNI framework that the trial is likely to result in a poor final accuracy,
and therefore should be killed to save resource.

If an accessor want's to get notified when a trial ends, it can also override :meth:`trial_end`.
Contributor

wants to be notified


The NNI framework has little guarantee on ``trial_history``.
This method is not guaranteed to be invoked for each time ``trial_history`` get updated.
It is also possible that a trial's history keeps updateing after receiving a bad result.
Contributor

updating
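To make the assessor hunks above concrete, here is a hedged sketch of an early-stopping rule in the shape the docstring describes. The `'Good'`/`'Bad'` strings stand in for NNI's `AssessResult` enum, and the fixed-threshold rule is purely illustrative; note the caveat from the hunk above that `trial_history` may be partial and may keep updating even after a bad result is returned.

```python
class ThresholdAssessor:
    """Illustrative stand-in for a class derived from nni.Assessor."""

    GOOD, BAD = 'Good', 'Bad'  # stand-ins for AssessResult members

    def __init__(self, threshold=0.5, min_steps=3):
        self.threshold = threshold
        self.min_steps = min_steps

    def assess_trial(self, trial_job_id, trial_history):
        # Too little history to judge: let the trial continue.
        if len(trial_history) < self.min_steps:
            return self.GOOD
        # Latest intermediate metric below threshold: hint NNI to kill it.
        return self.BAD if trial_history[-1] < self.threshold else self.GOOD

    def trial_end(self, trial_job_id, success):
        # Optional hook, as the docstring notes: called when a trial ends.
        print(f'trial {trial_job_id} ended, success={success}')
```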

trial_job_id: str
Unique identifier of the trial.
success: bool
True if the trial successfully completed; False if failed or terminated.
Contributor

Maybe I don't know about Python. Shouldn't there be a pass here?

Contributor Author

Pylint with the default configuration will complain if there is a pass.
I tried to find a PEP or other official doc clarifying whether there should be a pass or not, but failed.

Contributor

If Pylint didn't complain, I'm fine with it.
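For context on the exchange above: in Python, a function body consisting of only a docstring is already syntactically complete, so no `pass` is needed, and Pylint's default `unnecessary-pass` check flags a redundant one. A quick demonstration (class and method names are hypothetical):

```python
class Base:
    def trial_end(self, trial_job_id, success):
        """A docstring alone is a valid function body; no pass required."""

    def no_doc(self):
        pass  # with no docstring, an explicit pass (or ...) is required
```

Calling the docstring-only method works normally and simply returns None.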

_logger = logging.getLogger(__name__)


class Tuner(Recoverable):
"""
Tuner is an AutoML algorithm, which generates a new configuration for the next try.
Contributor

generates configurations to run with.

Contributor Author

This sentence is copied from overview.
Do you think it should be updated?

Contributor

Not sure. "For the next try" really reads like Chinglish to me.

as well as :meth:`generate_parameters` or :meth:`generate_multiple_parameters`.

After initializing, NNI will first call :meth:`update_search_space` to tell tuner the feasible region,
and then call :meth:`generate_parameters` one or more times to request for hyper-parameter configurations.
Contributor

Better mention generate_parameter here.

Contributor Author

Do you mean generate_parameters? It's already there.

Contributor

I mean generate_multiple_parameters. I think users should have a general idea about which method (multiple or not multiple) to override at the very beginning. You could move the explanation from gen_mul_param to here. Putting it in the tutorials would be better, but... never mind.
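A hedged sketch of the relationship the reviewer alludes to: a base class can provide `generate_multiple_parameters` as a loop over `generate_parameters`, so most tuners only need to override the singular method. The exact NNI semantics (e.g. how exhaustion of the search space is handled) are simplified away here, and `TunerBase`/`ConstantTuner` are hypothetical names.

```python
class TunerBase:
    """Simplified stand-in for the base Tuner's default behavior."""

    def generate_parameters(self, parameter_id, **kwargs):
        raise NotImplementedError

    def generate_multiple_parameters(self, parameter_id_list, **kwargs):
        # Default: delegate to generate_parameters once per requested id,
        # so subclasses usually override only the singular method.
        return [self.generate_parameters(pid, **kwargs)
                for pid in parameter_id_list]

class ConstantTuner(TunerBase):
    def generate_parameters(self, parameter_id, **kwargs):
        return {'lr': 0.01}
```

Tuners with genuinely batched logic can still override `generate_multiple_parameters` directly.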

@ultmaster (Contributor)

Oops. Seems the PR has been merged. Please ignore my comments. (No big problem here.)
