AI Starter Guide: this book explains one topic at a time, like a big glossary, easy wiki, quick encyclopedia, or summary notes.
- Download the free e-book file as EPUB or PDF.
- If you wish, pay what you want.
- Edited by Joel Parker Henderson.
- For questions and suggestions, email me.
- Artificial General Intelligence (AGI)
- Artificial Super Intelligence (ASI)
- Natural language processing (NLP)
- Explainable Artificial Intelligence (XAI)
- Symbolic artificial intelligence
- Generative artificial intelligence
- Expert system
- Case-based reasoning (CBR)
- Central Processing Unit (CPU)
- Graphics Processing Unit (GPU)
- Tensor Processing Unit (TPU)
- Vision Processing Unit (VPU)
- AI processor
- Field Programmable Gate Array (FPGA)
- Supervised learning
- Unsupervised learning
- Reinforcement learning
- Deep learning
- Backpropagation
- Forward propagation
- Gradient descent
- Zero-shot learning
- Hidden Markov Model (HMM)
- Markov Decision Process (MDP)
- Expectation-Maximization (EM) algorithm
- Decision tree
- Eager learning algorithms
- Lazy learning algorithms
- Instance-based learning algorithms → lazy learning algorithms
- Memory-based learning algorithms → lazy learning algorithms
- Support Vector Machine (SVM)
- Linear Regression: Used for predicting continuous numerical values.
- Logistic Regression: Used for binary classification problems.
- Decision Trees: Tree-based models for both classification and regression tasks.
- Random Forest: An ensemble method combining multiple decision trees.
- Self-Organizing Maps (SOM)
- Kohonen maps → Self-Organizing Maps (SOM)
- K-means Clustering: Partition data points into k clusters based on their proximity to cluster centroids.
- Hierarchical clustering
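To make a couple of the entries above concrete, here is a minimal sketch of linear regression fit by gradient descent in plain Python. The data, learning rate, and iteration count are illustrative, not prescriptive:

```python
# Linear regression fit by batch gradient descent.
# Model: y_hat = w * x + b; loss: mean squared error (MSE).

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]   # generated by y = 2x + 1

w, b = 0.0, 0.0
lr = 0.05                        # learning rate (step size)
n = len(xs)

for _ in range(2000):
    # Gradients of MSE with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * grad_w
    b -= lr * grad_b
```

After training, `w` and `b` approach the generating values 2 and 1. The same loop structure underlies gradient descent in deep learning, just with many more parameters.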
Statistical Methods:
- Modified Z-Score
- Z-Score
- Percentile: This method identifies anomalies based on percentiles or quantiles of the data distribution.
Density-Based Methods:
- Local Outlier Factor (LOF)
- Isolation Forest
- Density-Based Spatial Clustering of Applications with Noise (DBSCAN)
Proximity-Based Methods:
- k-Nearest Neighbors (KNN)
- Local Correlation Integral (LOCI)
Machine Learning-Based Methods:
- Autoencoders
- One-Class Support Vector Machines (One-Class SVM)
Ensemble Methods:
- Majority voting
- Isolation Forest Ensemble
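Several of the statistical anomaly-detection methods above reduce to simple score-and-threshold logic. A minimal z-score detector in plain Python (the data and threshold are illustrative):

```python
# Flag points whose z-score magnitude exceeds a threshold.
data = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 25.0]  # 25.0 is the planted outlier

mean = sum(data) / len(data)
var = sum((x - mean) ** 2 for x in data) / len(data)
std = var ** 0.5

threshold = 2.0
outliers = [x for x in data if abs((x - mean) / std) > threshold]
```

The Modified Z-Score variant swaps the mean and standard deviation for the median and median absolute deviation, which makes the score itself robust to the outliers it is hunting.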
- Gaussian Mixture Models (GMM): Model a combination of distributions, allowing data generation and density estimation.
- Autoencoders: Neural networks used for unsupervised feature learning.
- Variational Autoencoders (VAE): Learn to generate new data samples by mapping them to a latent space.
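The density-estimation idea behind Gaussian Mixture Models can be sketched in one dimension: the mixture density is just a weighted sum of component Gaussians. The component parameters below are illustrative (in practice they are fit with the EM algorithm):

```python
import math

def gaussian_pdf(x, mu, sigma):
    # Density of a 1-D normal distribution at x.
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def mixture_density(x, components):
    # components: list of (weight, mean, std); weights should sum to 1.
    return sum(w * gaussian_pdf(x, mu, s) for w, mu, s in components)

# Two equally weighted components centered at -2 and +2.
gmm = [(0.5, -2.0, 1.0), (0.5, 2.0, 1.0)]
```

Evaluating `mixture_density` over a grid reproduces the familiar two-bump density; sampling a component by weight and then drawing from its Gaussian gives data generation.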
- Q-Learning
- Deep Q Networks (DQN)
- Double Q-learning
- Dueling DQNs
- Rainbow DQN
- Policy gradient methods
- Deep Deterministic Policy Gradients (DDPG)
- Actor-critic
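Tabular Q-learning, the ancestor of the deep variants listed above, fits in a few lines. This sketch uses a toy deterministic chain environment invented for illustration (states 0..3, reward 1 for reaching state 3):

```python
import random

# Tiny deterministic chain: states 0..3, actions 0 (left) / 1 (right).
# Reaching state 3 yields reward 1 and ends the episode.
N_STATES, GOAL = 4, 3
alpha, gamma, eps = 0.5, 0.9, 0.2   # learning rate, discount, exploration rate
Q = [[0.0, 0.0] for _ in range(N_STATES)]

def step(s, a):
    s2 = max(0, s - 1) if a == 0 else min(GOAL, s + 1)
    return s2, (1.0 if s2 == GOAL else 0.0), s2 == GOAL

random.seed(0)
for _ in range(500):
    s = 0
    while True:
        # Epsilon-greedy action selection.
        a = random.randrange(2) if random.random() < eps else Q[s].index(max(Q[s]))
        s2, r, done = step(s, a)
        # Q-learning update: bootstrap from the greedy value of the next state.
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) * (not done) - Q[s][a])
        s = s2
        if done:
            break
```

After training, the greedy policy (argmax over each row of `Q`) moves right in every state. Deep Q Networks replace the table with a neural network and add tricks such as replay buffers and target networks.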
These algorithms leverage both labeled and unlabeled data for learning. They aim to improve model performance by incorporating additional information from unlabeled data. Examples include:
- Self-training: uses a model to generate pseudo-labeled data from unlabeled examples for further training.
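The pseudo-labeling idea described above can be sketched with a toy 1-D threshold classifier. Everything here (the classifier, data, and helper name `fit_threshold`) is illustrative; real self-training typically keeps only high-confidence pseudo-labels:

```python
# Self-training sketch: fit a 1-D threshold classifier on labeled data,
# pseudo-label the unlabeled points, then refit on the union.

def fit_threshold(points):
    # points: list of (x, label) with labels 0/1.
    # Threshold = midpoint between the two class means.
    xs0 = [x for x, y in points if y == 0]
    xs1 = [x for x, y in points if y == 1]
    return (sum(xs0) / len(xs0) + sum(xs1) / len(xs1)) / 2

labeled = [(0.0, 0), (1.0, 0), (9.0, 1), (10.0, 1)]
unlabeled = [0.5, 1.5, 8.5, 9.5]

t = fit_threshold(labeled)                            # initial model
pseudo = [(x, 1 if x > t else 0) for x in unlabeled]  # pseudo-labels
t2 = fit_threshold(labeled + pseudo)                  # retrain on the union
```

The retrained model now reflects the extra unlabeled points; when the pseudo-labels are mostly correct, this sharpens the decision boundary.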
- Bagging (a.k.a. Bootstrap Aggregating)
- Random forest
- Boosting
- Gradient Boosting Machines (GBM)
- Extreme Gradient Boosting (XGBoost)
- LightGBM
- Stacking (a.k.a. Stacked Generalization)
- Voting Classifiers (a.k.a. Voting Ensembles)
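The simplest ensemble combiner in the list above, majority voting, can be sketched directly (model predictions are made up for illustration):

```python
from collections import Counter

def majority_vote(predictions):
    # predictions: one label list per model, all the same length.
    # For each example, return the most common label across models.
    return [Counter(col).most_common(1)[0][0] for col in zip(*predictions)]

preds = [
    ["cat", "dog", "dog"],   # model A
    ["cat", "cat", "dog"],   # model B
    ["dog", "dog", "dog"],   # model C
]
combined = majority_vote(preds)
```

Bagging, random forests, and voting classifiers all funnel their members' outputs through some variant of this aggregation step (soft voting averages predicted probabilities instead).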
- Clustering
- Dimensionality reduction
- Anomaly detection
- Density estimation
- Outlier detection → anomaly detection
- Parzen window
- Convolutional Neural Network (CNN)
- Generative Adversarial Network (GAN)
- Recurrent Neural Network (RNN)
- Deep Neural Network (DNN)
- Transformer architecture
- Hyperbolic Tangent (tanh) activation function
- Rectified Linear Unit (ReLU) activation function
- Leaky Rectified Linear Unit (Leaky ReLU) activation function
- Parametric Rectified Linear Unit (PReLU) activation function
- Scaled Exponential Linear Unit (SELU) activation function
- Sigmoid activation function
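The activation functions above are all small scalar formulas; a minimal sketch in plain Python (the 0.01 leak factor is a common default, not a requirement):

```python
import math

def sigmoid(x):
    # Squashes any real input into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Squashes any real input into (-1, 1).
    return math.tanh(x)

def relu(x):
    # Passes positive inputs through; zeros out negatives.
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but lets a small gradient through for negative inputs.
    return x if x > 0 else alpha * x
```

In a framework these are applied elementwise to a layer's outputs; the leaky and parametric variants exist to avoid "dead" units whose gradient is zero everywhere.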
- Mean Squared Error (MSE)
- Mean Absolute Error (MAE)
- Cost function → Loss function
- Objective function → Loss function
- L1 loss → Mean Absolute Error (MAE)
- L1 norm → Mean Absolute Error (MAE)
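The two loss functions above differ only in how they penalize errors (squared vs. absolute). A minimal sketch with illustrative data:

```python
def mse(y_true, y_pred):
    # Mean Squared Error: penalizes large errors quadratically.
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def mae(y_true, y_pred):
    # Mean Absolute Error: penalizes all errors linearly.
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

y_true = [1.0, 2.0, 3.0]
y_pred = [1.0, 2.5, 2.0]
```

Because MSE squares each residual, a single large error dominates the loss, which is why MAE is often preferred when outliers are expected.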
- Gaussian kernel
- linear kernel
- polynomial kernel
- radial basis function (RBF) kernel
- sigmoid kernel
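Kernel functions like those above are just similarity scores between two vectors. A minimal sketch of three of them (the `degree`, `c`, and `gamma` defaults are illustrative):

```python
import math

def linear_kernel(x, y):
    # Plain dot product.
    return sum(a * b for a, b in zip(x, y))

def polynomial_kernel(x, y, degree=2, c=1.0):
    # (x . y + c) ** degree
    return (linear_kernel(x, y) + c) ** degree

def rbf_kernel(x, y, gamma=0.5):
    # Gaussian / radial basis function kernel: near 1 for close points,
    # decaying toward 0 as the squared distance grows.
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)
```

An SVM with one of these kernels compares every test point to its support vectors through the kernel, which is what lets a linear algorithm learn nonlinear boundaries.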
- Accuracy
- Precision
- True Positive Rate (≡ Sensitivity)
- True Negative Rate (≡ Specificity)
- F1-Score
- Receiver Operating Characteristic (ROC)
- Area Under the Curve (AUC)
- R-squared (R2)
- Silhouette Score
- Davies-Bouldin Index
- Adjusted Rand Index (ARI)
- Overfitting
- Underfitting
- Sensitivity → True Positive Rate
- Specificity → True Negative Rate
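Most of the classification metrics above derive from the four confusion-matrix counts. A minimal sketch for binary labels (the example labels are made up for illustration):

```python
def confusion_counts(y_true, y_pred):
    # Returns (true positives, true negatives, false positives, false negatives).
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 0, 0, 1, 1, 0]
tp, tn, fp, fn = confusion_counts(y_true, y_pred)

accuracy = (tp + tn) / len(y_true)
precision = tp / (tp + fp)
recall = tp / (tp + fn)             # true positive rate / sensitivity
f1 = 2 * precision * recall / (precision + recall)
```

ROC and AUC build on the same counts: sweep the classifier's decision threshold, plot the true positive rate against the false positive rate, and measure the area under that curve.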
- AI content generator
- AI image generation
- AI form fill
- AI UI/UX
- AI internationalization/localization
- AI plagiarism checker
- AI sales
- AI marketing
- AI accounting
- AI human resources
- AI resource leveling
- AI customer service
- AI for business strategy
- AI for change management
- AI for partner management
- AI for product development
- AI for project management
- AI for software programming
- AI + adtech (advertising tech)
- AI + agtech (agricultural tech)
- AI + biotech (biological tech)
- AI + cleantech (clean energy tech)
- AI + edtech (educational tech)
- AI + fintech (financial tech)
- AI + govtech (governmental tech)
- AI + legtech (legal tech)
- AI + martech (marketing tech)
- AI + medtech (medical tech)
- AI + realtech (real estate tech)
- AI + regtech (regulatory tech)