Keras Image Classifiers

Model-building scripts that replicate the architectures of various state-of-the-art papers. All of these models are built in Keras or TensorFlow.


Octave Convolutions

Keras implementation of the Octave Convolution blocks from the paper Drop an Octave: Reducing Spatial Redundancy in Convolutional Neural Networks with Octave Convolution.

Provides code blocks for initializing the OctConv architecture, building intermediate OctConv blocks, and closing the OctConv architecture, as well as Octave-ResNet models.
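
A minimal sketch of an intermediate OctConv block, assuming the tf.keras API: the low-frequency path runs at half the spatial resolution of the high-frequency path, and each output mixes information from both paths. The helper name, the filter split via alpha, and the even spatial dimensions are illustrative assumptions:

    from tensorflow.keras import layers

    def octave_conv_block(x_high, x_low, filters, alpha=0.5, kernel_size=(3, 3)):
        # Split the output filters between the two frequency paths.
        low_filters = int(filters * alpha)
        high_filters = filters - low_filters

        # High-frequency outputs: high->high, plus low->high (upsampled 2x).
        high_to_high = layers.Conv2D(high_filters, kernel_size, padding='same')(x_high)
        low_to_high = layers.Conv2D(high_filters, kernel_size, padding='same')(x_low)
        low_to_high = layers.UpSampling2D()(low_to_high)

        # Low-frequency outputs: low->low, plus high->low (pooled 2x first).
        low_to_low = layers.Conv2D(low_filters, kernel_size, padding='same')(x_low)
        high_to_low = layers.AveragePooling2D()(x_high)
        high_to_low = layers.Conv2D(low_filters, kernel_size, padding='same')(high_to_low)

        y_high = layers.add([high_to_high, low_to_high])
        y_low = layers.add([low_to_low, high_to_low])
        return y_high, y_low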


Non-Local Neural Networks

Keras implementation of Non-local blocks from the paper Non-local Neural Networks.

  • Support for "Gaussian", "Embedded Gaussian" and "Dot" instantiations of the Non-Local block.
  • Support for a variable shielded-computation mode, which reduces the pairwise computation by a factor of N**2, where N defaults to 2; see the sketch after this list.
  • Support for the "Concatenation" instantiation will be added when the authors release their code.
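
A minimal sketch of the "Embedded Gaussian" instantiation for 2D inputs, assuming the tf.keras functional API and static input shapes; the compression argument mimics the shielded-computation mode by sub-sampling phi and g. The helper name is illustrative:

    from tensorflow.keras import layers

    def non_local_block(x, compression=2):
        h, w = int(x.shape[1]), int(x.shape[2])
        channels = int(x.shape[-1])
        inter = channels // 2

        # 1x1 projections for the embedded-Gaussian pairwise function.
        theta = layers.Conv2D(inter, 1)(x)
        phi = layers.Conv2D(inter, 1)(x)
        g = layers.Conv2D(inter, 1)(x)

        # Shielded computation: sub-sampling phi and g shrinks the attention
        # matrix, and the pairwise computation, by a factor of compression**2.
        if compression > 1:
            phi = layers.MaxPooling2D(compression)(phi)
            g = layers.MaxPooling2D(compression)(g)

        theta = layers.Reshape((h * w, inter))(theta)
        phi = layers.Reshape((-1, inter))(phi)
        g = layers.Reshape((-1, inter))(g)

        attn = layers.Dot(axes=2)([theta, phi])   # (h*w, positions in phi)
        attn = layers.Softmax()(attn)

        y = layers.Dot(axes=(2, 1))([attn, g])    # (h*w, inter)
        y = layers.Reshape((h, w, inter))(y)
        y = layers.Conv2D(channels, 1)(y)         # restore the channel count
        return layers.add([x, y])                 # residual connection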

Squeeze & Excitation Networks

Implementation of Squeeze and Excitation Networks, as an independent block that can be added to any Keras layer (a sketch follows the lists below), or as pre-built models such as:

  • SE-ResNet. Custom ResNets can be built using the SEResNet model builder, whereas prebuilt ResNet models such as SEResNet50, SEResNet101 and SEResNet154 can also be built directly.
  • SE-InceptionV3
  • SE-Inception-ResNet-v2
  • SE-ResNeXt

Additional models (not from the paper, and not verified to improve performance):

  • SE-MobileNets
  • SE-DenseNet - Custom SE-DenseNets can be built using the SEDenseNet model builder, whereas the prebuilt models SEDenseNetImageNet121, SEDenseNetImageNet169, SEDenseNetImageNet161, SEDenseNetImageNet201 and SEDenseNetImageNet264 build DenseNets in the ImageNet configuration directly. To use SEDenseNet in CIFAR mode, use the SEDenseNet model builder.
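
A minimal sketch of the squeeze-and-excitation block itself, assuming the tf.keras functional API; the reduction ratio of 16 is the paper's default, and the helper name is illustrative:

    from tensorflow.keras import layers

    def squeeze_excite_block(x, ratio=16):
        channels = int(x.shape[-1])

        # Squeeze: global spatial average -> one descriptor per channel.
        s = layers.GlobalAveragePooling2D()(x)

        # Excite: bottleneck MLP produces per-channel gates in (0, 1).
        s = layers.Dense(channels // ratio, activation='relu')(s)
        s = layers.Dense(channels, activation='sigmoid')(s)

        # Recalibrate: scale each feature map by its gate.
        s = layers.Reshape((1, 1, channels))(s)
        return layers.multiply([x, s])

The returned tensor can be used in place of x at any point in a model, which is how the prebuilt SE variants above are derived from their base architectures.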

NASNet

An implementation of "NASNet" models from the paper Learning Transferable Architectures for Scalable Image Recognition in Keras 2.0+.

Based on the models described in the TFSlim implementation and some modules from the TensorNets implementation.

Weights have been ported over from the official NASNet TensorFlow repository.


MobileNets V1 and V2

Keras implementation of the paper MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications.

Also contains a Keras implementation of the paper MobileNetV2: Inverted Residuals and Linear Bottlenecks.

Weights for all variants of MobileNet V1 and MobileNet V2 are available.
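
The core of MobileNetV2 is the inverted-residual block named in the paper's title. A minimal sketch, assuming the tf.keras API; the expansion factor of 6 follows the paper, while the helper name and other details are illustrative:

    from tensorflow.keras import layers

    def inverted_residual(x, filters, stride=1, expansion=6):
        in_channels = int(x.shape[-1])

        # 1x1 expansion to a wider representation.
        y = layers.Conv2D(in_channels * expansion, 1, use_bias=False)(x)
        y = layers.BatchNormalization()(y)
        y = layers.ReLU(max_value=6.0)(y)

        # 3x3 depthwise convolution on the expanded features.
        y = layers.DepthwiseConv2D(3, strides=stride, padding='same',
                                   use_bias=False)(y)
        y = layers.BatchNormalization()(y)
        y = layers.ReLU(max_value=6.0)(y)

        # 1x1 linear bottleneck projection (no activation).
        y = layers.Conv2D(filters, 1, use_bias=False)(y)
        y = layers.BatchNormalization()(y)

        # Residual connection only when input and output shapes match.
        if stride == 1 and in_channels == filters:
            y = layers.add([x, y])
        return y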


SparseNets

Keras implementation of Sparse Networks from the paper Sparsely Connected Convolutional Networks.

Code derived from the official repository - https://github.com/Lyken17/SparseNet

No weights are available, as the authors have not released them.


Dual Path Networks

Dual Path Networks are highly efficient networks which combine the strengths of ResNeXt (Aggregated Residual Transformations for Deep Neural Networks) and DenseNet (Densely Connected Convolutional Networks).

Because Keras and TensorFlow do not yet support grouped convolutions, this is an inefficient implementation, and no weights are available.
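
A minimal sketch of the dual-path idea, assuming the tf.keras API: one shared bottleneck transform whose output is split into a part summed into a residual path (ResNeXt-style) and a part concatenated onto a dense path (DenseNet-style). The helper name, the filter counts, and the assumption that res_path already carries res_filters channels are illustrative; the paper's grouped 3x3 convolution is replaced by a plain one here:

    from tensorflow.keras import layers

    def dual_path_block(res_path, dense_path, res_filters=256, dense_increment=16):
        # Both paths feed one shared bottleneck transform.
        x = layers.concatenate([res_path, dense_path])
        y = layers.Conv2D(res_filters // 4, 1, activation='relu')(x)
        y = layers.Conv2D(res_filters // 4, 3, padding='same', activation='relu')(y)
        y = layers.Conv2D(res_filters + dense_increment, 1)(y)

        # Split the output: one slice per path.
        res_out = layers.Lambda(lambda t: t[..., :res_filters])(y)
        dense_out = layers.Lambda(lambda t: t[..., res_filters:])(y)

        # Residual path sums; dense path concatenates.
        res_path = layers.add([res_path, res_out])
        dense_path = layers.concatenate([dense_path, dense_out])
        return res_path, dense_path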


ResNeXt

Implementation of ResNeXt models from the paper Aggregated Residual Transformations for Deep Neural Networks in Keras 2.0+.

Contains code for building the general ResNeXt model (optimized for datasets similar to CIFAR) and ResNeXtImageNet (optimized for the ImageNet dataset).

Because Keras and TensorFlow do not yet support grouped convolutions, this is an inefficient implementation, and no weights are available; the sketch below shows the workaround and why it is slow.
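
Grouped convolutions can be emulated in plain Keras by slicing the input into cardinality channel groups, convolving each group separately, and concatenating the results. A minimal sketch, assuming the tf.keras API, with an illustrative helper name:

    from tensorflow.keras import layers

    def grouped_conv2d(x, filters, cardinality=32, kernel_size=(3, 3)):
        in_channels = int(x.shape[-1])
        group_in = in_channels // cardinality
        group_out = filters // cardinality

        groups = []
        for c in range(cardinality):
            # Slice out one channel group and convolve it on its own.
            group = layers.Lambda(
                lambda t, i=c: t[..., i * group_in:(i + 1) * group_in])(x)
            groups.append(
                layers.Conv2D(group_out, kernel_size, padding='same')(group))

        # `cardinality` separate small convolutions instead of one fused
        # grouped convolution -- this loop is the source of the inefficiency.
        return layers.concatenate(groups)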


Inception v4 / Inception ResNet v2

Implementations of the Inception-v4, Inception-ResNet-v1 and Inception-ResNet-v2 architectures in Keras using the Functional API. The paper on these architectures is available at Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning.

Weights are provided for the Inception-v4 and Inception-ResNet-v2 models.


DenseNets

DenseNet implementation of the paper Densely Connected Convolutional Networks in Keras.

Now supports the more efficient DenseNet-BC (DenseNet-Bottleneck-Compressed) networks. The DenseNet-BC-190-40 model obtains state-of-the-art performance on CIFAR-10 and CIFAR-100.
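
A minimal sketch of one DenseNet-BC composite layer, assuming the tf.keras API: a 1x1 bottleneck producing 4k feature maps followed by a 3x3 convolution producing k new maps, where k is the growth rate (40 in the DenseNet-BC-190-40 model). The helper name is illustrative:

    from tensorflow.keras import layers

    def dense_bc_layer(x, growth_rate=40):
        # 1x1 bottleneck: limit the 3x3 conv's input to 4k feature maps.
        y = layers.BatchNormalization()(x)
        y = layers.Activation('relu')(y)
        y = layers.Conv2D(4 * growth_rate, 1, use_bias=False)(y)

        # 3x3 conv producing k new feature maps.
        y = layers.BatchNormalization()(y)
        y = layers.Activation('relu')(y)
        y = layers.Conv2D(growth_rate, 3, padding='same', use_bias=False)(y)

        # Dense connectivity: each layer sees all preceding feature maps.
        return layers.concatenate([x, y])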

Weights are provided for the DenseNet models.


Wide Residual Networks

Implementation of Wide Residual Networks from the paper Wide Residual Networks in Keras.

No weights are available due to limited compute.


Residual-of-Residual Networks

This is an implementation of the paper Residual Networks of Residual Networks: Multilevel Residual Networks.