Katib is a Kubernetes-native project for automated machine learning (AutoML). Katib supports Hyperparameter Tuning, Early Stopping and Neural Architecture Search.
Katib is agnostic to machine learning (ML) frameworks. It can tune hyperparameters of applications written in any language of the users' choice and natively supports many ML frameworks, such as TensorFlow, Apache MXNet, PyTorch, XGBoost, and others.
Katib can perform training jobs using any Kubernetes Custom Resource, with out-of-the-box support for Kubeflow Training Operator, Argo Workflows, Tekton Pipelines, and many more.
Katib stands for secretary in Arabic.
Katib supports several search algorithms. Follow the Kubeflow documentation to learn more about each algorithm, and check the Suggestion service guide to implement your own custom algorithm; a short SDK sketch for selecting an algorithm follows the table below.
| Hyperparameter Tuning        | Neural Architecture Search | Early Stopping |
| ---------------------------- | -------------------------- | -------------- |
| Random Search                | ENAS                       | Median Stop    |
| Grid Search                  | DARTS                      |                |
| Bayesian Optimization        |                            |                |
| TPE                          |                            |                |
| Multivariate TPE             |                            |                |
| CMA-ES                       |                            |                |
| Sobol's Quasirandom Sequence |                            |                |
| HyperBand                    |                            |                |
| Population Based Training    |                            |                |
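The search algorithm is chosen per Experiment. As a minimal sketch, assuming the Python SDK's `tune()` API (shown in the Getting Started example below) accepts an `algorithm_name` argument using Katib's algorithm identifiers, selecting Bayesian Optimization instead of the default Random Search could look like this:

```python
import kubeflow.katib as katib

def objective(parameters):
    # Katib parses metrics in the format <metric-name>=<metric-value>.
    print(f"result={int(parameters['a']) ** 2}")

# Sketch only: `algorithm_name` is an assumption; check the SDK reference
# for the exact signature. "random" is the default search algorithm.
katib.KatibClient().tune(
    name="bo-experiment",
    objective=objective,
    parameters={"a": katib.search.int(min=1, max=10)},
    objective_metric_name="result",
    algorithm_name="bayesianoptimization",
    max_trial_count=12,
)
```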
To run the above algorithms, Katib supports the following frameworks:
For the various ways to install Katib, check the Kubeflow guide. Follow the steps below to install Katib standalone.
These are the minimal requirements to install Katib:

- Kubernetes >= 1.27
- `kubectl` >= 1.27
For the latest Katib version, run this command:

```
kubectl apply -k "github.com/kubeflow/katib.git/manifests/v1beta1/installs/katib-standalone?ref=master"
```
For a specific Katib release (for example `v0.14.0`), run this command:

```
kubectl apply -k "github.com/kubeflow/katib.git/manifests/v1beta1/installs/katib-standalone?ref=v0.14.0"
```
Make sure that all Katib components are running:
```
$ kubectl get pods -n kubeflow

NAME                                READY   STATUS    RESTARTS   AGE
katib-controller-566595bdd8-hbxgf   1/1     Running   0          36s
katib-db-manager-57cd769cdb-4g99m   1/1     Running   0          36s
katib-mysql-7894994f88-5d4s5        1/1     Running   0          36s
katib-ui-5767cfccdc-pwg2x           1/1     Running   0          36s
```
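Optionally, you can open the Katib UI from your machine. Assuming the default `katib-ui` Service in the `kubeflow` namespace, a command such as `kubectl port-forward svc/katib-ui -n kubeflow 8080:80` should make the UI reachable at `http://localhost:8080/katib/` (the Service port may differ in your install).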
For more Katib Experiment examples, check the complete examples list.
You can run your first HyperParameter Tuning Experiment using the Katib Python SDK.
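The SDK is published on PyPI as `kubeflow-katib`, so it can be installed with `pip install kubeflow-katib`.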
In the following example we are going to maximize a simple objective function:
```python
import kubeflow.katib as katib

# Step 1. Create an objective function.
def objective(parameters):
    # Import required packages.
    import time
    time.sleep(5)
    # Calculate objective function.
    result = 4 * int(parameters["a"]) - float(parameters["b"]) ** 2
    # Katib parses metrics in this format: <metric-name>=<metric-value>.
    print(f"result={result}")

# Step 2. Create HyperParameter search space.
parameters = {
    "a": katib.search.int(min=10, max=20),
    "b": katib.search.double(min=0.1, max=0.2)
}

# Step 3. Create Katib Experiment.
katib_client = katib.KatibClient()
name = "tune-experiment"
katib_client.tune(
    name=name,
    objective=objective,
    parameters=parameters,
    objective_metric_name="result",
    max_trial_count=12
)

# Step 4. Get the best HyperParameters.
print(katib_client.get_optimal_hyperparameters(name))
```
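Note that `tune()` only creates the Experiment, so the optimal hyperparameters become available once the Trials have finished. A minimal sketch for waiting on completion, assuming the SDK exposes `wait_for_experiment_condition()` and `is_experiment_succeeded()` helpers (check the SDK reference for exact names and signatures):

```python
# Sketch only: method names are assumptions; consult the Katib SDK reference.
# Block until the Experiment reaches its "Succeeded" condition (or times out).
katib_client.wait_for_experiment_condition(name=name)

if katib_client.is_experiment_succeeded(name=name):
    print(katib_client.get_optimal_hyperparameters(name))
```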
- Learn about Katib Concepts in this guide.
- Learn about Katib Interfaces in this guide.
- Learn about Katib Components in this guide.
- Know more about Katib in the presentations and demos list.
We are always growing our community and invite new users and AutoML enthusiasts to contribute to the Katib project. The following links provide information about getting involved in the community:
- Subscribe to the AutoML calendar to attend the Working Group's bi-weekly community meetings.
- If you use Katib, please update the adopters list.
Please feel free to test the system! The developer guide is a good starting point for developers.
- Kubeflow Katib: Scalable, Portable and Cloud Native System for AutoML (by Andrey Velichkevich)
If you use Katib in a scientific publication, we would appreciate citations to the following paper:
A Scalable and Cloud-Native Hyperparameter Tuning System, George et al., arXiv:2006.02085, 2020.
Bibtex entry:
```
@misc{george2020katib,
    title={A Scalable and Cloud-Native Hyperparameter Tuning System},
    author={Johnu George and Ce Gao and Richard Liu and Hou Gang Liu and Yuan Tang and Ramdoot Pydipaty and Amit Kumar Saha},
    year={2020},
    eprint={2006.02085},
    archivePrefix={arXiv},
    primaryClass={cs.DC}
}
```