This repository is a set of notebooks that demonstrate basic neural network concepts and how they are applied in code and on data.
This shows the concept of optimization with gradient descent.
Main Concepts:
- Computation of the parameter gradients
- Computation of the gradient steps
- Optimizing the parameters using the gradients
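The three steps above can be sketched outside the notebook with plain NumPy. This is a minimal illustration (not taken from the notebooks) that fits a line assumed to be y = 3x + 1, computing the parameter gradients of the mean squared error and taking gradient steps:

```python
import numpy as np

# Toy data generated from y = 3x + 1 (assumed for illustration).
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 1.0

w, b = 0.0, 0.0  # parameters to optimize
lr = 0.1         # learning rate (gradient step size)

for _ in range(200):
    err = (w * x + b) - y
    # Computation of the parameter gradients (of mean squared error)
    grad_w = 2.0 * np.mean(err * x)
    grad_b = 2.0 * np.mean(err)
    # Gradient step: move each parameter against its gradient
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # w ≈ 3.0, b ≈ 1.0
```

The learning rate and iteration count here are arbitrary choices for the toy problem; the notebook covers how these hyperparameters affect convergence in practice.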
This shows the usage of Keras and how activation functions make a stack of linear layers more than just linear.
Main Concepts:
- How Keras makes model creation simpler
- How activation functions allow learning non-linearity
- Dense network introduction
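Why activation functions matter can be shown in a few lines of NumPy (a sketch, not notebook code): two stacked linear layers without an activation collapse into a single linear layer, while inserting a ReLU between them prevents the collapse and lets the network represent non-linear functions:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))

# Two "dense" layers with no activation: still one linear map overall.
W1, W2 = rng.normal(size=(3, 4)), rng.normal(size=(4, 2))
stacked = (x @ W1) @ W2
collapsed = x @ (W1 @ W2)               # a single equivalent linear layer
print(np.allclose(stacked, collapsed))  # True

# A ReLU between the layers breaks the collapse: the composition is
# no longer expressible as one linear map.
relu = lambda z: np.maximum(z, 0.0)
nonlinear = relu(x @ W1) @ W2
print(np.allclose(nonlinear, collapsed))  # False
```

In Keras this is the difference between `Dense(4)` and `Dense(4, activation="relu")` when stacking layers.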
This shows some preprocessing steps on real world data and practical things to set up before training a network.
Main Concepts:
- Basic imputation
- Training, Validation, and Test splits
- Numerical encoding
- Dense networks
- Training checkpoints
- Early stopping
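The first three preprocessing steps can be sketched on a hypothetical toy table (invented here for illustration; the notebook works on a real dataset):

```python
import numpy as np

# Hypothetical toy table: a numeric column with missing values
# and a categorical column.
age = np.array([22.0, np.nan, 35.0, 28.0, np.nan, 41.0])
city = ["NY", "SF", "NY", "LA", "SF", "LA"]

# Basic imputation: replace missing values with the column mean.
age_filled = np.where(np.isnan(age), np.nanmean(age), age)

# Numerical encoding: map each category to an integer id.
categories = sorted(set(city))
city_ids = np.array([categories.index(c) for c in city])

# Training / validation / test split (here an arbitrary 4 / 1 / 1).
idx = np.arange(len(age))
train, val, test = idx[:4], idx[4:5], idx[5:]
```

Checkpoints and early stopping are handled in Keras through callbacks (`ModelCheckpoint`, `EarlyStopping`) passed to `model.fit`, as shown in the notebook.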
This shows how to train a convolutional network.
Main Concepts:
- Loading images from numeric formats
- Convolutional networks
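The core operation of a convolutional layer can be written out by hand in NumPy (a sketch of the idea, not the notebook's Keras code): a small kernel slides over the image and produces a weighted sum at each position, so the same weights detect the same pattern everywhere:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid cross-correlation: the core op of a convolutional layer."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical-edge kernel applied to an image that is dark on the
# left half and bright on the right half.
image = np.zeros((4, 4))
image[:, 2:] = 1.0
kernel = np.array([[-1.0, 1.0]])
print(conv2d(image, kernel))  # responds only at the dark-to-bright edge
```

A Keras `Conv2D` layer performs this same operation with many kernels at once, with the kernel weights learned by gradient descent.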
This shows how to leverage pretrained models on use cases with small amounts of data.
Main Concepts:
- Loading images from image formats
- Loading pretrained models
- Adding layers on top of pretrained models
- Setting which layers of a pretrained model are trainable
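The idea of freezing a pretrained model and training only a new head can be illustrated in NumPy (the weights below are random stand-ins for a real pretrained network, purely for illustration): gradient updates are applied only to the new layer, so the frozen layer's weights never change:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained feature extractor: a frozen ReLU layer.
W_pre = rng.normal(size=(3, 4))
features = lambda x: np.maximum(x @ W_pre, 0.0)

# A small dataset, as in a transfer-learning use case.
x = rng.normal(size=(20, 3))
y = rng.normal(size=20)

# New trainable "head" stacked on top of the frozen extractor.
w_head = np.zeros(4)
W_frozen_copy = W_pre.copy()

def mse():
    return np.mean((features(x) @ w_head - y) ** 2)

loss_before = mse()
for _ in range(200):
    err = features(x) @ w_head - y
    grad = 2.0 * features(x).T @ err / len(x)  # gradient for the head only
    w_head -= 0.01 * grad                      # W_pre is never updated
loss_after = mse()

print(loss_after < loss_before)              # the head learned something
print(np.array_equal(W_pre, W_frozen_copy))  # the frozen layer is untouched
```

In Keras the same effect comes from setting `layer.trainable = False` on the pretrained layers before compiling the model.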
This notebook shows how to build models that consume sequences of data with recurrent networks.
Main Concepts:
- Producing fixed-size outputs from sequences
- Masking on variable length sequences
- Recurrent networks
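A minimal NumPy sketch of these three ideas together (the cell and its weights are invented here; the notebook uses Keras recurrent layers): a simple tanh cell folds a sequence of any length into one fixed-size hidden state, and a mask makes padded steps leave that state untouched:

```python
import numpy as np

rng = np.random.default_rng(0)
W_x = rng.normal(size=(2, 4)) * 0.1  # input-to-hidden weights (toy values)
W_h = rng.normal(size=(4, 4)) * 0.1  # hidden-to-hidden weights (toy values)

def rnn_fixed_output(seq, mask):
    """Run a simple tanh RNN cell over a sequence.

    Masked (padded) steps leave the hidden state unchanged. The final
    state is one fixed-size vector regardless of sequence length.
    """
    h = np.zeros(4)
    for x_t, m in zip(seq, mask):
        h_new = np.tanh(x_t @ W_x + h @ W_h)
        h = np.where(m, h_new, h)  # skip the update on padding steps
    return h

# A two-step sequence padded with zeros to length four.
seq = np.array([[1.0, 0.5], [0.2, -1.0], [0.0, 0.0], [0.0, 0.0]])
mask = np.array([True, True, False, False])
h_padded = rnn_fixed_output(seq, mask)
h_short = rnn_fixed_output(seq[:2], mask[:2])
print(np.allclose(h_padded, h_short))  # True: padding had no effect
```

Without the mask, the zero padding would still alter the state through the `h @ W_h` term, which is why variable-length batches need masking; Keras provides this via the `Masking` layer or `mask_zero=True` on embeddings.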
The `data` folder contains the example datasets used to run the code in the `notebook` directory. Most of them are smaller versions of the full datasets, for the sake of reasonable run times.