thefittest is an open-source library for the efficient application of classical evolutionary algorithms and their effective modifications in optimization and machine learning. The project aims to combine performance, accessibility, and ease of use, making advanced evolutionary methods available to everyone.
- Performance: the library is built with modern coding practices and delivers high performance through integration with NumPy, SciPy, Numba, and scikit-learn.
- Versatility: thefittest offers a wide range of classical evolutionary algorithms and effective modifications, making it a strong choice for a variety of optimization and machine learning tasks.
- Integration with scikit-learn: machine learning methods from thefittest work smoothly with scikit-learn tools, enabling comprehensive and versatile solutions for evolutionary optimization and model training tasks.
To install the thefittest library, use the following command:

```bash
pip install thefittest
```
thefittest requires NumPy, SciPy, Numba, and scikit-learn.
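To verify the installation, you can import the package. The snippet below is a minimal check; it assumes the package exposes a `__version__` attribute, which is common for Python packages but not guaranteed here:

```python
import thefittest

# Assumption: the package exposes __version__; if not, a successful
# import alone confirms the installation.
print(thefittest.__version__)
```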
The following example demonstrates how to use the thefittest library with the SHADE optimizer to minimize a custom objective function. This quick-start example showcases the main components needed to set up and run an optimization.
```python
from thefittest.optimizers import SHADE

# Define the objective function to minimize.
# The function is vectorized: x has shape (pop_size, num_variables).
def custom_problem(x):
    return (5 - x[:, 0]) ** 2 + (12 - x[:, 1]) ** 2

# Initialize the SHADE optimizer with custom parameters
optimizer = SHADE(
    fitness_function=custom_problem,
    iters=25,
    pop_size=10,
    left_border=-100,
    right_border=100,
    num_variables=2,
    show_progress_each=10,
    minimization=True,
)

# Run the optimization
optimizer.fit()

# Retrieve and print the best solution found
fittest = optimizer.get_fittest()
print("The fittest individual:", fittest["phenotype"])
print("with fitness", fittest["fitness"])
```
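For this quadratic objective the global minimum is at the point (5, 12), where the function value is 0, so the reported phenotype should converge toward these coordinates over the run.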
The next example demonstrates how to train a machine learning model on the Iris dataset using the thefittest library's MLPEAClassifier with the SHAGA evolutionary optimizer.
```python
from thefittest.optimizers import SHAGA
from thefittest.benchmarks import IrisDataset
from thefittest.classifiers import MLPEAClassifier

from sklearn.model_selection import train_test_split
from sklearn.preprocessing import minmax_scale
from sklearn.metrics import confusion_matrix, f1_score

# Load the Iris dataset
data = IrisDataset()
X = data.get_X()
y = data.get_y()

# Scale features to the [0, 1] range
X_scaled = minmax_scale(X)

# Split the data into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X_scaled, y, test_size=0.1)

# Initialize the MLPEAClassifier with SHAGA as the weights optimizer
model = MLPEAClassifier(
    n_iter=500,
    pop_size=500,
    hidden_layers=[5, 5],
    weights_optimizer=SHAGA,
    weights_optimizer_args={"show_progress_each": 10},
)

# Train the model
model.fit(X_train, y_train)

# Make predictions on the test set
predict = model.predict(X_test)

# Evaluate the model
print("confusion_matrix:\n", confusion_matrix(y_test, predict))
print("f1_score:\n", f1_score(y_test, predict, average="macro"))
```
The following algorithms and models are implemented:
- Genetic algorithm (Holland, J. H. (1992). Genetic algorithms. Scientific American, 267(1), 66-72):
  - Self-configuring genetic algorithm (Semenkin, E.S., Semenkina, M.E. Self-configuring Genetic Algorithm with Modified Uniform Crossover Operator. LNCS, 7331, 2012, pp. 414-421);
  - SHAGA (Stanovov, Vladimir & Akhmedova, Shakhnaz & Semenkin, Eugene. (2019). Genetic Algorithm with Success History based Parameter Adaptation. 180-187);
  - PDPGA (Niehaus, J., Banzhaf, W. (2001). Adaption of Operator Probabilities in Genetic Programming. In: Miller, J., Tomassini, M., Lanzi, P.L., Ryan, C., Tettamanzi, A.G.B., Langdon, W.B. (eds) Genetic Programming. EuroGP 2001. Lecture Notes in Computer Science, vol 2038. Springer, Berlin, Heidelberg).
- Differential evolution (Storn, Rainer & Price, Kenneth. (1995). Differential Evolution: A Simple and Efficient Adaptive Scheme for Global Optimization Over Continuous Spaces. Journal of Global Optimization. 23):
  - jDE (Brest, Janez & Greiner, Sašo & Bošković, Borko & Mernik, Marjan & Žumer, Viljem. (2006). Self-Adapting Control Parameters in Differential Evolution: A Comparative Study on Numerical Benchmark Problems. IEEE Transactions on Evolutionary Computation, 10(6), 646-657);
  - SHADE (Tanabe, Ryoji & Fukunaga, Alex. (2013). Success-history based parameter adaptation for Differential Evolution. 2013 IEEE Congress on Evolutionary Computation, CEC 2013. 71-78).
- Genetic programming (Koza, John R. (1993). Genetic programming: on the programming of computers by means of natural selection. Complex Adaptive Systems):
  - Self-configuring genetic programming (Semenkin, Eugene & Semenkina, Maria. (2012). Self-configuring genetic programming algorithm with modified uniform crossover. 1-6);
  - PDPGP (Niehaus, J., Banzhaf, W. (2001). Adaption of Operator Probabilities in Genetic Programming. In: Miller, J., Tomassini, M., Lanzi, P.L., Ryan, C., Tettamanzi, A.G.B., Langdon, W.B. (eds) Genetic Programming. EuroGP 2001. Lecture Notes in Computer Science, vol 2038. Springer, Berlin, Heidelberg).
- Genetic programming of neural networks (GPNN) (Lipinsky L., Semenkin E., Bulletin of the Siberian State Aerospace University, 3(10), 22-26 (2006). In Russian);
- Multilayer perceptron trained by evolutionary algorithms (Cotta, Carlos & Alba, Enrique & Sagarna, R. & Larranaga, Pedro. (2002). Adjusting Weights in Artificial Neural Networks using Evolutionary Algorithms).
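The optimizers above are meant to be used through a common interface. As a rough illustration, the sketch below swaps SHADE for jDE in the quick-start problem; it assumes jDE is exported from thefittest.optimizers under that name and shares the constructor signature shown in the SHADE example, so treat it as a sketch rather than canonical usage:

```python
from thefittest.optimizers import jDE  # assumed export name, mirroring SHADE

def custom_problem(x):
    # Same vectorized quadratic objective as in the quick-start example
    return (5 - x[:, 0]) ** 2 + (12 - x[:, 1]) ** 2

# Assumed to accept the same constructor arguments as SHADE above
optimizer = jDE(
    fitness_function=custom_problem,
    iters=25,
    pop_size=10,
    left_border=-100,
    right_border=100,
    num_variables=2,
    minimization=True,
)
optimizer.fit()
print(optimizer.get_fittest()["phenotype"])
```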
The following benchmark problems and datasets are included:
- CEC2005 (Suganthan, Ponnuthurai & Hansen, Nikolaus & Liang, Jing & Deb, Kalyan & Chen, Ying-ping & Auger, Anne & Tiwari, Santosh. (2005). Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization. Natural Computing. 341-357);
- Symbolicregression17: test regression problem 17 from the paper (Semenkin, Eugene & Semenkina, Maria. (2012). Self-configuring genetic programming algorithm with modified uniform crossover. 1-6);
- Iris (Fisher, R. A. (1988). Iris. UCI Machine Learning Repository);
- Wine (Aeberhard, Stefan and Forina, M. (1991). Wine. UCI Machine Learning Repository);
- Breast Cancer Wisconsin (Diagnostic) (Wolberg, William, Mangasarian, Olvi, Street, Nick, and Street, W. (1995). Breast Cancer Wisconsin (Diagnostic). UCI Machine Learning Repository);
- Optical Recognition of Handwritten Digits (Alpaydin, E. and Kaynak, C. (1998). Optical Recognition of Handwritten Digits. UCI Machine Learning Repository).
Notebooks on how to use thefittest:
- Solving Binary and Real-Valued Optimization Problems with Genetic Algorithms;
- Solving Real-Valued Optimization Problems with Differential Evolution;
- Solving Symbolic Regression Problems Using Genetic Programming Algorithms;
- Training Neural Networks Using Evolutionary Algorithms for Regression and Classification Problems;
- Optimizing Neural Network Structure Using Genetic Programming;
If some notebooks are too big to display, you can use NBviewer.
Publications and awards involving thefittest:
- Sherstnev, Pavel. Thefittest: evolutionary machine learning in Python. ITM Web of Conferences 59, January 2024. Licensed under CC BY 4.0. DOI: https://doi.org/10.1051/itmconf/20245902020;
- 1st place in the Artificial Intelligence track of the Samsung Innovation Campus (IT Academy), October 2024;
- Best PhD Student Paper at the 12th International Workshop on Mathematical Models and their Applications (IWMMA'2023) for the paper "Thefittest: Evolutionary Machine Learning in Python" by Pavel Sherstnev;
- Tutorial Presenter at the 13th International Workshop on Mathematical Models and their Applications (IWMMA'2024) with the tutorial titled "Thefittest Library: Evolutionary Algorithms and Automation of Machine Learning Models Design in Python".