katib

Automated Machine Learning on Kubernetes

License: Apache-2.0

Katib is a Kubernetes-native project for automated machine learning (AutoML). Katib supports Hyperparameter Tuning, Early Stopping and Neural Architecture Search.

Katib is agnostic to machine learning (ML) frameworks: it can tune hyperparameters of applications written in any language of the users' choice and natively supports many ML frameworks, such as TensorFlow, Apache MXNet, PyTorch, XGBoost, and others.

Katib can run training jobs using any Kubernetes Custom Resource, with out-of-the-box support for the Kubeflow Training Operator, Argo Workflows, Tekton Pipelines, and many more.

Katib stands for secretary in Arabic.

Search Algorithms

Katib supports several search algorithms. Follow the Kubeflow documentation to learn more about each algorithm, and check this guide to implement your own algorithm.
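
For example, the search algorithm for an Experiment can be selected through the Python SDK described later in this README (a minimal sketch; the namespace, search space, and metric name are illustrative, and it assumes the tune() API's algorithm_name and objective_type arguments):

import kubeflow.katib as katib

# Objective runs inside each Trial; Katib parses metrics printed as "name=value".
def objective(parameters):
    x = float(parameters["x"])
    print(f"loss={(x - 0.3) ** 2}")

client = katib.KatibClient(namespace="kubeflow")  # namespace is illustrative
client.tune(
    name="bo-example",
    objective=objective,
    parameters={"x": katib.search.double(min=0.0, max=1.0)},
    objective_metric_name="loss",
    objective_type="minimize",               # minimize the printed loss
    algorithm_name="bayesianoptimization",   # select the search algorithm here
    max_trial_count=20,
)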

To run these algorithms, Katib supports the following frameworks:

Prerequisites

Please check the official Kubeflow documentation for prerequisites to install Katib.

Installation

Please follow the Kubeflow Katib guide for detailed instructions on how to install Katib.

Installing the Control Plane

Run the following command to install the latest stable release of Katib control plane:

kubectl apply -k "github.com/kubeflow/katib.git/manifests/v1beta1/installs/katib-standalone?ref=v0.17.0"

Run the following command to install the latest changes of Katib control plane:

kubectl apply -k "github.com/kubeflow/katib.git/manifests/v1beta1/installs/katib-standalone?ref=master"

For examples of Katib Experiments, check the complete examples list.

Installing the Python SDK

Katib implements a Python SDK to simplify the creation of hyperparameter tuning jobs for data scientists.

Run the following command to install the latest stable release of Katib SDK:

pip install -U kubeflow-katib
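
To verify the installation, you can import the client and list Experiments in a namespace (a minimal sketch; it assumes a cluster reachable through your local kubeconfig and that Katib runs in the kubeflow namespace):

from kubeflow.katib import KatibClient

# Uses the local kubeconfig by default; the namespace is illustrative.
client = KatibClient(namespace="kubeflow")
for experiment in client.list_experiments():
    print(experiment.metadata.name)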

Getting Started

Please refer to the getting started guide to quickly create your first hyperparameter tuning Experiment using the Python SDK.
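
The flow roughly looks like this (a sketch following the getting started guide's pattern; the objective function, parameter names, and trial count are illustrative):

import kubeflow.katib as katib

# 1. Objective function executed inside each Trial. Katib collects metrics
#    printed in the "<name>=<value>" format.
def objective(parameters):
    result = 4 * int(parameters["a"]) - float(parameters["b"]) ** 2
    print(f"result={result}")

# 2. Hyperparameter search space.
parameters = {
    "a": katib.search.int(min=10, max=20),
    "b": katib.search.double(min=0.1, max=0.2),
}

# 3. Create the Experiment and run up to 12 Trials.
client = katib.KatibClient(namespace="kubeflow")
name = "tune-experiment"
client.tune(
    name=name,
    objective=objective,
    parameters=parameters,
    objective_metric_name="result",
    max_trial_count=12,
)

# 4. Wait for completion and read back the best hyperparameters.
client.wait_for_experiment_condition(name=name)
print(client.get_optimal_hyperparameters(name))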

Community

The following links provide information on how to get involved in the community:

Contributing

Please refer to the CONTRIBUTING guide.

Citation

If you use Katib in a scientific publication, we would appreciate citations to the following paper:

A Scalable and Cloud-Native Hyperparameter Tuning System, George et al., arXiv:2006.02085, 2020.

BibTeX entry:

@misc{george2020katib,
    title={A Scalable and Cloud-Native Hyperparameter Tuning System},
    author={Johnu George and Ce Gao and Richard Liu and Hou Gang Liu and Yuan Tang and Ramdoot Pydipaty and Amit Kumar Saha},
    year={2020},
    eprint={2006.02085},
    archivePrefix={arXiv},
    primaryClass={cs.DC}
}