Simple, pure-Python, gradient-free Bayesian optimiser for black-box functions. The package offers two ways of working: an ask/tell interface that proposes new points to evaluate (`propose_points`), and a one-call function minimiser (`minimize_function`):
```python
# ------ Ask the optimizer for new points
new_points = propose_points(tested_points,
                            tested_values,
                            hyperparameters_config,
                            num_points=num_points)
```
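Here `tested_points` and `tested_values` hold the previously evaluated grid points and their scores, and `new_points` is the next batch to probe. The following is a rough sketch of how the ask/tell loop could be driven; the import path for `propose_points`, the array shapes, and the batch size are assumptions rather than part of the snippet above:

```python
import numpy as np

# The import path for propose_points is an assumption based on the example below
from kitkopt.bayesian_optimizer import propose_points
from kitkopt.hyper_parameter import HyperParameter

def funct(x):
    return np.sum(np.square(x))

# ------ Define hyperparameters with bounds and stepsize
hyperparameters_config = [
    HyperParameter(-5, 5, 1),
    HyperParameter(-5, 5, 1)
]

# ------ Previously tested points and their values (shapes are an assumption)
tested_points = np.array([[1.0, 3.0],
                          [-2.0, 2.0]])
tested_values = np.array([[funct(p)] for p in tested_points])
num_points = 4  # batch size per ask/tell round (arbitrary choice)

# Ask/tell loop: request a batch, evaluate it, feed the results back
for _ in range(10):
    new_points = propose_points(tested_points, tested_values,
                                hyperparameters_config, num_points=num_points)
    new_values = np.array([[funct(p)] for p in new_points])
    tested_points = np.vstack([tested_points, new_points])
    tested_values = np.vstack([tested_values, new_values])

best_index = np.argmin(tested_values)
print("Best point:", tested_points[best_index], "value:", tested_values[best_index, 0])
```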
Alternatively, the whole minimisation can be run in a single call:

```python
# ------ Find the minimum and the value at the minimum
best_point, best_value = minimize_function(function2minimize,
                                           hyperparameters_config,
                                           extra_function_args=(),
                                           tolerance=1e-2,
                                           max_iterations=100)
```
Two optimizers are currently provided: one based on sampling from a uniform distribution (`random_optimizer`), and a Bayesian optimiser built on a Gaussian Process regressor with Thompson Sampling and Upper Confidence Bound acquisition functions (`bayesian_optimizer`). See the examples directory for more use cases.
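As a rough illustration of what these acquisition functions do (this is conceptual, not kitkopt's internal code; scikit-learn is used here purely for demonstration and is not a dependency of this package): given a Gaussian Process posterior mean and standard deviation for every untested grid point, an Upper Confidence Bound rule for minimisation picks the point with the smallest `mean - kappa * std`, while Thompson Sampling draws one random function from the posterior and takes its minimiser.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def ucb_next_point(tested_points, tested_values, candidate_points, kappa=2.0):
    """Confidence-bound rule for minimisation: smallest posterior mean - kappa * std."""
    gp = GaussianProcessRegressor(normalize_y=True).fit(tested_points, tested_values)
    mean, std = gp.predict(candidate_points, return_std=True)
    return candidate_points[np.argmin(mean - kappa * std)]

def thompson_next_point(tested_points, tested_values, candidate_points, seed=0):
    """Thompson Sampling: draw one function from the GP posterior, take its minimiser."""
    gp = GaussianProcessRegressor(normalize_y=True).fit(tested_points, tested_values)
    sample = gp.sample_y(candidate_points, n_samples=1, random_state=seed).ravel()
    return candidate_points[np.argmin(sample)]
```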
The simplest application, minimizing a function over the hyperparameter grid, looks like this (see examples/bayesian-minimize):
```python
import numpy as np
from kitkopt.bayesian_optimizer import minimize_function
from kitkopt.hyper_parameter import HyperParameter

# ------ Function to be minimized
def funct(x):
    return np.sum(np.square(x))

# ------ Define hyperparameters with bounds and stepsize
hyperparameters_config = [
    HyperParameter(-5, 5, 1),
    HyperParameter(-5, 5, 1)
]

# ------ Find the minimum and the value at the minimum
best_point, best_value = minimize_function(funct, hyperparameters_config,
                                           extra_function_args=(),
                                           tolerance=1e-2,
                                           max_iterations=100)
```
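For this quadratic test function the grid from -5 to 5 with step 1 contains the true minimum, so the reported result should lie at (or very near) the origin with a value close to zero:

```python
print("Best point:", best_point)   # expected: close to [0, 0] for this test function
print("Best value:", best_value)   # expected: close to 0.0
```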
The package requires:

- Python 3.6 or above
- NumPy - Linear algebra for Python
- SciPy - Scientific Python library
- Numba - JIT compilation for Python
Licensed under the MIT license.