A NumPy implementation of the Bayesian inference approach of Deep Neural Networks as Gaussian Processes.
We focus on infinitely wide neural networks with the ReLU nonlinearity, which allows for an analytic computation of the layer kernels (a sketch of this recursion is given below).
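For reference, the layer kernels of an infinitely wide ReLU network obey a simple analytic recursion (the arc-cosine kernel). The following minimal NumPy sketch is illustrative only: it is not the code of this repository, and all function names are assumptions.

```python
import numpy as np

def relu_kernel_recursion(K, sigma_w_2, sigma_b_2):
    """One analytic layer update of the NNGP kernel for the ReLU nonlinearity
    (arc-cosine kernel). K is the kernel matrix of the previous layer."""
    diag = np.sqrt(np.diag(K))                 # sqrt(K(x, x)) for every input
    norms = np.outer(diag, diag)               # sqrt(K(x, x) K(x', x'))
    cos_theta = np.clip(K / norms, -1.0, 1.0)  # correlation, clipped for safety
    theta = np.arccos(cos_theta)
    # Analytic expectation E[relu(u) relu(v)] for (u, v) ~ N(0, K)
    J = (np.sin(theta) + (np.pi - theta) * cos_theta) * norms / (2 * np.pi)
    return sigma_b_2 + sigma_w_2 * J

def nngp_kernel(X, L, sigma_w_2, sigma_b_2):
    """Kernel of an L-hidden-layer infinitely wide ReLU network."""
    # Base case: kernel of the first (affine) layer
    K = sigma_b_2 + sigma_w_2 * (X @ X.T) / X.shape[1]
    for _ in range(L):
        K = relu_kernel_recursion(K, sigma_w_2, sigma_b_2)
    return K
```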
Requirements:
- Python 3
- numpy
Clone the repository

```bash
git clone https://github.com/MB-29/NN-gaussian-process.git
```

and move to the root directory

```bash
cd NN-gaussian-process
```
```python
from nngp import NNGP

# ... define training_data, training_targets, test_data and the hyperparameters

regression = NNGP(
    training_data,              # Data
    training_targets,
    test_data,
    L,                          # Neural network depth
    sigma_eps_2=sigma_eps**2,   # Observation noise variance
    sigma_w_2=sigma_w_2,        # Weight hyperparameter
    sigma_b_2=sigma_b_2         # Bias hyperparameter
)
regression.train()
predictions, covariance = regression.predict()
```
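The prediction step is exact Bayesian inference: Gaussian process regression with the NNGP kernel and observation noise variance sigma_eps_2. Below is a minimal NumPy sketch of the standard GP posterior formulas that such a prediction computes; it is illustrative only, not the repository's implementation, and all names are assumptions.

```python
import numpy as np

def gp_posterior(K_train, K_cross, K_test, targets, sigma_eps_2):
    """Standard GP regression posterior over the test outputs.
    K_train: kernel between training points, shape (n, n)
    K_cross: kernel between test and training points, shape (m, n)
    K_test:  kernel between test points, shape (m, m)
    """
    # Cholesky factorisation of the noisy training kernel for a stable solve
    L_chol = np.linalg.cholesky(K_train + sigma_eps_2 * np.eye(len(K_train)))
    alpha = np.linalg.solve(L_chol.T, np.linalg.solve(L_chol, targets))
    mean = K_cross @ alpha                   # posterior mean of the test outputs
    V = np.linalg.solve(L_chol, K_cross.T)
    cov = K_test - V.T @ V                   # posterior covariance
    return mean, cov
```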
- A classification script for MNIST is provided in the file `classify_MNIST.py`. It relies on the additional requirement `python-mnist`, available on pip (see the data-loading sketch after this list).
- A 1D regression script is provided in the file `1D_regression.py`. We obtained the following results.
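As a hypothetical illustration of the extra `python-mnist` dependency, MNIST could be loaded and cast to NumPy arrays along these lines; the path, preprocessing and target encoding are assumptions and not necessarily what `classify_MNIST.py` does.

```python
import numpy as np
from mnist import MNIST  # provided by the python-mnist package

# Directory containing the raw MNIST files (path is an assumption)
loader = MNIST('./mnist_data')
train_images, train_labels = loader.load_training()
test_images, test_labels = loader.load_testing()

# Cast to NumPy arrays and rescale pixels to [0, 1]
training_data = np.asarray(train_images, dtype=float) / 255.0
test_data = np.asarray(test_images, dtype=float) / 255.0

# One-hot regression targets, a common choice for NNGP classification
training_targets = np.eye(10)[np.asarray(train_labels)]
```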