Katib is a Kubernetes-native project for automated machine learning (AutoML). Katib supports Hyperparameter Tuning, Early Stopping and Neural Architecture Search.
Katib is agnostic to machine learning (ML) frameworks. It can tune hyperparameters of applications written in any language of the users' choice and natively supports many ML frameworks, such as TensorFlow, Apache MXNet, PyTorch, XGBoost, and others.
Katib can run training jobs using any Kubernetes Custom Resource, with out-of-the-box support for Kubeflow Training Operator, Argo Workflows, Tekton Pipelines, and many more.
Katib stands for *secretary* in Arabic.
Katib supports several search algorithms. Follow the Kubeflow documentation to learn more about each algorithm, and check this guide to implement your custom algorithm.
| Hyperparameter Tuning        | Neural Architecture Search | Early Stopping |
| ---------------------------- | -------------------------- | -------------- |
| Random Search                | ENAS                       | Median Stop    |
| Grid Search                  | DARTS                      |                |
| Bayesian Optimization        |                            |                |
| TPE                          |                            |                |
| Multivariate TPE             |                            |                |
| CMA-ES                       |                            |                |
| Sobol's Quasirandom Sequence |                            |                |
| HyperBand                    |                            |                |
| Population Based Training    |                            |                |
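As an example of selecting one of these algorithms, the sketch below uses the Python SDK (installation is covered later in this README). The `algorithm_name` argument and the lowercase identifier `bayesianoptimization` follow Katib's documented naming convention; the tiny objective function is purely illustrative:

```python
import kubeflow.katib as katib

# Illustrative objective: Katib reads metrics printed as "<name>=<value>".
def objective(parameters):
    loss = (float(parameters["lr"]) - 0.05) ** 2
    print(f"loss={loss}")

katib_client = katib.KatibClient(namespace="kubeflow")

# The search algorithm from the table is selected by name.
katib_client.tune(
    name="bayesian-tuning",
    objective=objective,
    parameters={"lr": katib.search.double(min=0.01, max=0.1)},
    objective_metric_name="loss",
    objective_type="minimize",
    algorithm_name="bayesianoptimization",
    max_trial_count=10,
)
```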
To run the algorithms above, Katib supports the following frameworks:
Please check the official Kubeflow documentation for prerequisites to install Katib.
Please follow the Kubeflow Katib guide for detailed instructions on how to install Katib.
Run the following command to install the latest stable release of the Katib control plane:

```bash
kubectl apply -k "github.com/kubeflow/katib.git/manifests/v1beta1/installs/katib-standalone?ref=v0.17.0"
```
Run the following command to install the latest changes of the Katib control plane:

```bash
kubectl apply -k "github.com/kubeflow/katib.git/manifests/v1beta1/installs/katib-standalone?ref=master"
```
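To verify the installation, one option is to list the control-plane pods. This is a minimal sketch using the official `kubernetes` Python client, assuming the default `kubeflow` namespace that these manifests install into:

```python
from kubernetes import client, config

# Load credentials from the current kubeconfig context.
config.load_kube_config()

# Assumption: the standalone manifests deploy Katib into the "kubeflow"
# namespace, and the control-plane pod names are prefixed with "katib-".
for pod in client.CoreV1Api().list_namespaced_pod(namespace="kubeflow").items:
    if pod.metadata.name.startswith("katib-"):
        print(pod.metadata.name, pod.status.phase)
```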
For Katib Experiments, check the complete examples list.
Katib implements a Python SDK to simplify the creation of hyperparameter tuning jobs for data scientists.
Run the following command to install the latest stable release of the Katib SDK:

```bash
pip install -U kubeflow-katib
```
Please refer to the getting started guide to quickly create your first hyperparameter tuning Experiment using the Python SDK.
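As a condensed sketch along the lines of the getting started guide: the `katib.search` helpers, `KatibClient.tune`, `wait_for_experiment_condition`, and `get_optimal_hyperparameters` follow the SDK's documented API, and the `<metric-name>=<metric-value>` print format is what Katib's default metrics collector parses:

```python
import kubeflow.katib as katib

# Objective function that Katib runs in each Trial; metrics are reported
# by printing them in the "<metric-name>=<metric-value>" format.
def objective(parameters):
    result = 4 * int(parameters["a"]) - float(parameters["b"]) ** 2
    print(f"result={result}")

# Hyperparameter search space for the two parameters used above.
parameters = {
    "a": katib.search.int(min=10, max=20),
    "b": katib.search.double(min=0.1, max=0.2),
}

# Create the Experiment (the search algorithm defaults to random search).
katib_client = katib.KatibClient(namespace="kubeflow")
name = "tune-experiment"
katib_client.tune(
    name=name,
    objective=objective,
    parameters=parameters,
    objective_metric_name="result",
    max_trial_count=12,
    resources_per_trial={"cpu": "1"},
)

# Block until the Experiment finishes, then read the best hyperparameters.
katib_client.wait_for_experiment_condition(name=name)
print(katib_client.get_optimal_hyperparameters(name))
```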
The following links provide information on how to get involved in the community:

- Attend the bi-weekly AutoML and Training Working Group community meeting.
- Join our `#kubeflow-katib` Slack channel.
- Check out who is using Katib and presentations about the Katib project.
Please refer to the CONTRIBUTING guide.
If you use Katib in a scientific publication, we would appreciate citations to the following paper:
A Scalable and Cloud-Native Hyperparameter Tuning System, George et al., arXiv:2006.02085, 2020.
BibTeX entry:

```bibtex
@misc{george2020katib,
      title={A Scalable and Cloud-Native Hyperparameter Tuning System},
      author={Johnu George and Ce Gao and Richard Liu and Hou Gang Liu and Yuan Tang and Ramdoot Pydipaty and Amit Kumar Saha},
      year={2020},
      eprint={2006.02085},
      archivePrefix={arXiv},
      primaryClass={cs.DC}
}
```