- You model structured data with LightGBM (classification or regression tasks).
- You want to run or scale hyperparameter optimization in your project.
- You are looking for a quick way to try hyperparameter optimization.
- Run hyperparameter tuning with LightGBM on structured data.
- Configure a scheduler to make tuning more efficient.
- A few fundamentals of Ray and Tune.
- A quick start on running hyperparameter optimization with Tune.
- A closer look at schedulers, which control how tuning progresses.
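A scheduler decides which trials keep training and which stop early. To build intuition before meeting Tune's `ASHAScheduler`, here is a pure-Python sketch of successive halving, the idea behind it; the toy trials and loss function are invented for illustration and need no Ray installation.

```python
import random

def successive_halving(trials, rungs=3, keep_fraction=0.5):
    """Grant all surviving trials more budget each rung, then stop the worst.

    `trials` maps a trial name to its (toy, fixed) learning rate.
    Lower loss is better.
    """
    survivors = dict(trials)
    budget = 1  # toy "training iterations" granted at the first rung
    for _ in range(rungs):
        # Toy loss: shrinks with more budget and with a larger learning rate.
        losses = {name: 1.0 / (budget * lr) for name, lr in survivors.items()}
        # Keep only the best fraction of trials, then double the budget.
        ranked = sorted(losses, key=losses.get)
        keep = max(1, int(len(ranked) * keep_fraction))
        survivors = {name: survivors[name] for name in ranked[:keep]}
        budget *= 2
    return survivors

random.seed(0)
trials = {f"trial_{i}": random.uniform(0.01, 0.3) for i in range(8)}
best = successive_halving(trials)
print(best)
```

With 8 trials and a keep fraction of 0.5, only 1 trial survives three rungs, so most of the compute budget goes to promising configurations instead of being split evenly.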
- First, make sure that you have an environment ready. Please follow the instructions on the environment setup page (a 5-minute read).
- Once your environment is ready, open and run ray_tune_micro_tutorial.ipynb.
- Check the user guides for a more in-depth introduction to Tune.
- Learn about Distributed LightGBM on Ray, which enables multi-node and multi-GPU training.
- Take a closer look at the Tune docs to learn about other search algorithms and schedulers.
- Go to the Ray tutorials page to explore more tutorials.
- Feel free to reach out on the Ray-distributed Slack: join the #tutorials channel, say hello, and ask questions.