Presents implementations of `EvoNormB0` and `EvoNormS0` layers, as proposed in *Evolving Normalization-Activation Layers* by Liu et al. The authors reported results with these layers on MobileNetV2, ResNets, MnasNet, and EfficientNets. Here, however, I try them on a Mini Inception architecture (as shown in this blog post) with the CIFAR10 dataset.
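For quick reference, the two layers compute the following (as defined in the paper): here $\sigma_b^2(x)$ is the batch variance, $s_{w,h}^2(x)$ the per-sample (instance) variance over the spatial dimensions, $s_{w,h,c/g}^2(x)$ the group variance over the spatial dimensions and channel groups of size $c/g$, $\sigma(\cdot)$ the sigmoid, and $v_1$, $\gamma$, $\beta$ learned per-channel parameters.

```latex
% EvoNorm-B0 (uses batch statistics):
y = \frac{x}{\max\!\left(\sqrt{\sigma_b^2(x) + \epsilon},\; v_1 x + \sqrt{s_{w,h}^2(x) + \epsilon}\right)} \cdot \gamma + \beta

% EvoNorm-S0 (batch-independent; sigma is the sigmoid):
y = \frac{x \cdot \sigma(v_1 x)}{\sqrt{s_{w,h,c/g}^2(x) + \epsilon}} \cdot \gamma + \beta
```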
Thanks to Hanxiao Liu for helping me correct the implementation.
TensorFlow version: 2.2.0-rc3 (the version available on Colab when I tested the code).
- `Mini_Inception_BN_ReLU.ipynb`: Shows a bunch of experiments with the Mini Inception architecture and the BN-ReLU combination.
- `Mini_Inception_EvoNorm.ipynb`: Shows implementations of the `EvoNormB0` and `EvoNormS0` layers and experiments with the Mini Inception architecture.
- `Mini_Inception_EvoNorm_Sweep.ipynb`: Runs a hyperparameter search on the `groups` hyperparameter of the `EvoNormS0` layers, along with a few other hyperparameters (see the sweep sketch after this list).
- `layer_utils`: Ships the `EvoNormB0` and `EvoNormS0` layers as stand-alone `tf.keras` classes (a minimal sketch follows this list).
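To give a flavor of the stand-alone layers, below is a minimal sketch of what an `EvoNormS0` layer can look like in `tf.keras` for NHWC inputs. It is illustrative only (class structure, names, and defaults here are my assumptions); the actual implementations live in `layer_utils`.

```python
import tensorflow as tf


class EvoNormS0(tf.keras.layers.Layer):
    """Illustrative EvoNorm-S0: x * sigmoid(v1 * x) / group_std(x) * gamma + beta."""

    def __init__(self, groups=8, epsilon=1e-5, **kwargs):
        super().__init__(**kwargs)
        self.groups = groups
        self.epsilon = epsilon

    def build(self, input_shape):
        channels = int(input_shape[-1])
        param_shape = (1, 1, 1, channels)  # one parameter per channel, NHWC
        self.gamma = self.add_weight(name="gamma", shape=param_shape, initializer="ones")
        self.beta = self.add_weight(name="beta", shape=param_shape, initializer="zeros")
        self.v1 = self.add_weight(name="v1", shape=param_shape, initializer="ones")

    def _group_std(self, x):
        # Per-sample standard deviation over (H, W) and channel groups of size C // groups.
        shape = tf.shape(x)
        n, h, w = shape[0], shape[1], shape[2]
        c = x.shape[-1]  # static channel count, fixed in build()
        grouped = tf.reshape(x, [n, h, w, self.groups, c // self.groups])
        _, var = tf.nn.moments(grouped, axes=[1, 2, 4], keepdims=True)
        std = tf.sqrt(var + self.epsilon)
        std = tf.broadcast_to(std, tf.shape(grouped))
        return tf.reshape(std, tf.shape(x))

    def call(self, x):
        return x * tf.sigmoid(self.v1 * x) / self._group_std(x) * self.gamma + self.beta
```

In a model, the layer replaces the usual BatchNorm + ReLU pair after a convolution, e.g. `x = EvoNormS0(groups=8)(tf.keras.layers.Conv2D(32, 3, padding="same")(x))`.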
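And a hedged sketch of what the `groups` sweep could look like as a plain loop (the notebook may well use a dedicated sweep tool and a different value grid; `build_mini_inception`, `train_ds`, and `val_ds` are hypothetical stand-ins):

```python
# Hypothetical grid over `groups`; value range and training setup are illustrative.
for groups in [4, 8, 16, 32]:
    model = build_mini_inception(evonorm_groups=groups)  # hypothetical model builder
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    history = model.fit(train_ds, validation_data=val_ds, epochs=10, verbose=0)
    print(f"groups={groups}: best val acc {max(history.history['val_accuracy']):.4f}")
```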
Follow the experimental summary here.
- 3 ways to create a Keras model with TensorFlow 2.0 (Sequential, Functional, and Model Subclassing) by PyImageSearch
- Evolving Normalization-Activation Layers video guide by Henry AI Labs
- EvoNorms_PyTorch