Implements EvoNorms B0 and S0 as proposed in Evolving Normalization-Activation Layers.
Presents implementations of `EvoNormB0` and `EvoNormS0` layers as proposed in *Evolving Normalization-Activation Layers* by Liu et al. The authors reported results with these layers on MobileNetV2, ResNets, MnasNet, and EfficientNets. Here, I instead tried a Mini Inception architecture (as shown in this blog post) on the CIFAR10 dataset.
The code was tested with TensorFlow 2.2.0-rc3 (the version available when I was testing on Colab).
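For reference, a minimal NumPy sketch of what the two layers compute, following the formulas in the paper (batch variant B0, sample variant S0). This assumes NHWC tensors and is only an illustration; the repo's actual `tf.keras` implementations live in `layer_utils`:

```python
import numpy as np

def evonorm_b0(x, gamma, beta, v1, eps=1e-5):
    """EvoNorm-B0: x / max(sqrt(batch_var + eps), v1*x + sqrt(instance_var + eps)) * gamma + beta."""
    # x: (N, H, W, C); gamma, beta, v1 broadcast over (1, 1, 1, C)
    batch_var = x.var(axis=(0, 1, 2), keepdims=True)   # per-channel, over batch + spatial dims
    instance_var = x.var(axis=(1, 2), keepdims=True)   # per-sample, per-channel
    den = np.maximum(np.sqrt(batch_var + eps),
                     v1 * x + np.sqrt(instance_var + eps))
    return x / den * gamma + beta

def evonorm_s0(x, gamma, beta, v1, groups=8, eps=1e-5):
    """EvoNorm-S0: x * sigmoid(v1*x) / group_std(x) * gamma + beta."""
    n, h, w, c = x.shape
    g = x.reshape(n, h, w, groups, c // groups)
    # Std over spatial dims and channels within each group
    group_std = np.sqrt(g.var(axis=(1, 2, 4), keepdims=True) + eps)
    group_std = np.broadcast_to(group_std, g.shape).reshape(n, h, w, c)
    num = x / (1.0 + np.exp(-v1 * x))                  # x * sigmoid(v1 * x)
    return num / group_std * gamma + beta
```

Note that S0 needs no batch statistics at all, which is why it behaves identically at training and inference time, while B0 would additionally track a running batch variance for inference in a real implementation.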
- `Mini_Inception_BN_ReLU.ipynb`: Shows a bunch of experiments with the Mini Inception architecture and the BN-ReLU combination.
- `Mini_Inception_EvoNorm.ipynb`: Shows implementations of the `EvoNormB0` and `EvoNormS0` layers and experiments with the Mini Inception architecture.
- `Mini_Inception_EvoNorm_Sweep.ipynb`: Runs a hyperparameter search on the `groups` hyperparameter of the `EvoNormS0` layer along with a few other hyperparameters.
- `layer_utils`: Ships the `EvoNormB0` and `EvoNormS0` layers as stand-alone `tf.keras` classes.

Follow the experimental summary here.