A TensorFlow 2 implementation of some basic CNNs.
- MobileNet_V1
- MobileNet_V2
- MobileNet_V3
- EfficientNet
- ResNeXt
- InceptionV4, InceptionResNetV1, InceptionResNetV2
- SE_ResNet_50, SE_ResNet_101, SE_ResNet_152, SE_ResNeXt_50, SE_ResNeXt_101
- SqueezeNet
- DenseNet
- ShuffleNetV2
- ResNet
- RegNet
For AlexNet and VGG, see: https://github.com/calmisential/TensorFlow2.0_Image_Classification
For InceptionV3, see: https://github.com/calmisential/TensorFlow2.0_InceptionV3
For ResNet, see: https://github.com/calmisential/TensorFlow2.0_ResNet
- Requirements:
- Python >= 3.9
- TensorFlow >= 2.7.0
- tensorflow-addons >= 0.15.0
- To train the network on your own dataset, put the dataset under the folder original dataset, so that the directory looks like this (a quick layout check is sketched after the tree):
```
|——original dataset
   |——class_name_0
   |——class_name_1
   |——class_name_2
   |——class_name_3
```
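The snippet below is a hypothetical sanity check (not part of the repository) that lists the class folders found under original dataset, so you can confirm the layout before splitting:

```python
# Hypothetical layout check: list the class sub-folders under "original dataset".
from pathlib import Path

dataset_root = Path("original dataset")  # folder name used in the tree above
class_names = sorted(p.name for p in dataset_root.iterdir() if p.is_dir())
print(f"Found {len(class_names)} classes: {class_names}")
```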
- Run the script split_dataset.py to split the raw dataset into training, validation, and test sets. The dataset directory will then look like this (a minimal split sketch follows the tree):
```
|——dataset
   |——train
        |——class_name_1
        |——class_name_2
        ......
        |——class_name_n
   |——valid
        |——class_name_1
        |——class_name_2
        ......
        |——class_name_n
   |——test
        |——class_name_1
        |——class_name_2
        ......
        |——class_name_n
```
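For reference, here is a minimal sketch of such a split. It is not the repository's split_dataset.py; the 80/10/10 ratio and the copy-based approach are assumptions for illustration only.

```python
# Minimal dataset-split sketch (assumed 80/10/10 ratio); the repository's
# split_dataset.py may differ in ratios and file handling.
import random
import shutil
from pathlib import Path

def split_dataset(src="original dataset", dst="dataset", ratios=(0.8, 0.1, 0.1), seed=0):
    random.seed(seed)
    for class_dir in Path(src).iterdir():
        if not class_dir.is_dir():
            continue
        files = sorted(class_dir.iterdir())
        random.shuffle(files)
        n_train = int(len(files) * ratios[0])
        n_valid = int(len(files) * ratios[1])
        splits = {
            "train": files[:n_train],
            "valid": files[n_train:n_train + n_valid],
            "test": files[n_train + n_valid:],
        }
        for split_name, split_files in splits.items():
            out_dir = Path(dst) / split_name / class_dir.name
            out_dir.mkdir(parents=True, exist_ok=True)
            for f in split_files:
                shutil.copy2(f, out_dir / f.name)

split_dataset()
```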
- Run to_tfrecord.py to generate TFRecord files (a minimal writing sketch follows).
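The sketch below shows the general idea of serializing an image folder into a TFRecord file with tf.train.Example. The feature names ("image", "label") and the output path are assumptions; the repository's to_tfrecord.py may use a different schema.

```python
# Minimal TFRecord-writing sketch; feature names and output path are assumptions,
# the repository's to_tfrecord.py may store different features.
import tensorflow as tf
from pathlib import Path

def write_tfrecord(split_dir="dataset/train", output_path="train.tfrecord"):
    class_names = sorted(p.name for p in Path(split_dir).iterdir() if p.is_dir())
    with tf.io.TFRecordWriter(output_path) as writer:
        for label, class_name in enumerate(class_names):
            for image_path in (Path(split_dir) / class_name).iterdir():
                image_bytes = tf.io.read_file(str(image_path)).numpy()
                example = tf.train.Example(features=tf.train.Features(feature={
                    "image": tf.train.Feature(bytes_list=tf.train.BytesList(value=[image_bytes])),
                    "label": tf.train.Feature(int64_list=tf.train.Int64List(value=[label])),
                }))
                writer.write(example.SerializeToString())

write_tfrecord()
```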
- Change the corresponding parameters in config.py.
- Run show_model_list.py to get the index of the model.
- Run `python train.py --idx [index]` to start training.
If you want to train an EfficientNet, change IMAGE_HEIGHT and IMAGE_WIDTH in config.py before training (a short config sketch follows this list):
- b0 = (224, 224)
- b1 = (240, 240)
- b2 = (260, 260)
- b3 = (300, 300)
- b4 = (380, 380)
- b5 = (456, 456)
- b6 = (528, 528)
- b7 = (600, 600)
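For example, for EfficientNet-B4 the image-size parameters would be set as below. Only IMAGE_HEIGHT and IMAGE_WIDTH are named in the steps above; treat this as a sketch of those two lines, not the full contents of config.py.

```python
# Sketch of the image-size settings in config.py for EfficientNet-B4;
# all other parameters of the real file are omitted here.
IMAGE_HEIGHT = 380
IMAGE_WIDTH = 380
```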
- Run `python evaluate.py --idx [index]` to evaluate the model's performance on the test dataset.
| Type | Neural Network | Input Image Size (height * width) |
|---|---|---|
| MobileNet | MobileNet_V1 | (224 * 224) |
| | MobileNet_V2 | (224 * 224) |
| | MobileNet_V3 | (224 * 224) |
| EfficientNet | EfficientNet(B0~B7) | / |
| ResNeXt | ResNeXt50 | (224 * 224) |
| | ResNeXt101 | (224 * 224) |
| SEResNeXt | SEResNeXt50 | (224 * 224) |
| | SEResNeXt101 | (224 * 224) |
| Inception | InceptionV4 | (299 * 299) |
| | Inception_ResNet_V1 | (299 * 299) |
| | Inception_ResNet_V2 | (299 * 299) |
| SE_ResNet | SE_ResNet_50 | (224 * 224) |
| | SE_ResNet_101 | (224 * 224) |
| | SE_ResNet_152 | (224 * 224) |
| SqueezeNet | SqueezeNet | (224 * 224) |
| DenseNet | DenseNet_121 | (224 * 224) |
| | DenseNet_169 | (224 * 224) |
| | DenseNet_201 | (224 * 224) |
| | DenseNet_269 | (224 * 224) |
| ShuffleNetV2 | ShuffleNetV2 | (224 * 224) |
| ResNet | ResNet_18 | (224 * 224) |
| | ResNet_34 | (224 * 224) |
| | ResNet_50 | (224 * 224) |
| | ResNet_101 | (224 * 224) |
| | ResNet_152 | (224 * 224) |
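As an illustration of how these sizes are used (this snippet is not taken from the repository), a single image can be loaded and resized to the listed input size before inference:

```python
# Illustrative pre-processing: load one image and resize it to the input size
# listed in the table, e.g. 224 x 224 for ResNet_50. The repo's pipeline may differ.
import tensorflow as tf

def load_image(path, height=224, width=224):
    raw = tf.io.read_file(path)
    img = tf.io.decode_image(raw, channels=3, expand_animations=False)
    img = tf.image.resize(img, [height, width])   # match the network's expected input size
    img = tf.cast(img, tf.float32) / 255.0        # simple [0, 1] scaling
    return tf.expand_dims(img, axis=0)            # add a batch dimension

batch = load_image("example.jpg")                 # hypothetical file name
```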
- MobileNet_V1: Efficient Convolutional Neural Networks for Mobile Vision Applications
- MobileNet_V2: Inverted Residuals and Linear Bottlenecks
- MobileNet_V3: Searching for MobileNetV3
- EfficientNet: EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks
- The official code of EfficientNet: https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet
- ResNeXt: Aggregated Residual Transformations for Deep Neural Networks
- Inception_V4/Inception_ResNet_V1/Inception_ResNet_V2: Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning
- The official implementation of Inception_V4: https://github.com/tensorflow/models/blob/master/research/slim/nets/inception_v4.py
- The official implementation of Inception_ResNet_V2: https://github.com/tensorflow/models/blob/master/research/slim/nets/inception_resnet_v2.py
- SENet: Squeeze-and-Excitation Networks
- SqueezeNet: SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size
- DenseNet: Densely Connected Convolutional Networks
- https://zhuanlan.zhihu.com/p/37189203
- ShuffleNetV2: ShuffleNet V2: Practical Guidelines for Efficient CNN Architecture Design
- https://zhuanlan.zhihu.com/p/48261931
- ResNet: Deep Residual Learning for Image Recognition
- RegNet: Designing Network Design Spaces