
Keras Octave Convolutions (OctConv)


Keras implementation of the Octave Convolution blocks from the paper Drop an Octave: Reducing Spatial Redundancy in Convolutional Neural Networks with Octave Convolution.

Usage

Octave Convolutions are a semi-drop-in replacement for regular convolution layers.

They are used in three major steps:

Initialization of the Dual Path Flow

Use the initial_octconv block from octave_conv.py to initialize the Octave Convolution blocks. This function accepts a single input tensor and returns two output tensors: the high-frequency pathway and the low-frequency pathway, in that order.

ip = Input(...)

x_high, x_low = initial_octconv(ip, ...)
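
For concreteness, a minimal sketch of this step under assumed arguments is shown below. The filters and alpha keyword names are illustrative, not a confirmed signature (alpha being the ratio of channels routed to the low-frequency path in the paper); check octave_conv.py for the actual parameters.

from keras.layers import Input
from octave_conv import initial_octconv

# Assumed input shape for illustration (e.g. CIFAR-10 images)
ip = Input(shape=(32, 32, 3))

# 'filters' and 'alpha' are hypothetical argument names;
# alpha controls the fraction of channels assigned to the low-frequency path
x_high, x_low = initial_octconv(ip, filters=32, alpha=0.25)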

Add any number of Octave Convolution Blocks

Once the two frequency pathways have been obtained, stack any number of octconv_block blocks from octave_conv.py to deepen the network.

NOTE:

Each of these blocks accepts two input tensors and emits two output tensors.

x_high, x_low = octconv_block(x_high, x_low, ...)
x_high, x_low = octconv_block(x_high, x_low, ...)
x_high, x_low = octconv_block(x_high, x_low, ...)
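
To make the dual-path bookkeeping concrete, the comment-annotated sketch below spells out the shapes implied by the paper: the low-frequency tensor lives at half the spatial resolution of the high-frequency tensor, and an alpha of 0.25 routes a quarter of the output channels to it. The filters and alpha keyword names remain illustrative assumptions.

# Assuming x_high has shape (batch, 32, 32, 48) and x_low (batch, 16, 16, 16),
# a block with filters=64 and alpha=0.25 would emit:
#   x_high -> (batch, 32, 32, 48)   # (1 - alpha) * filters channels, full resolution
#   x_low  -> (batch, 16, 16, 16)   # alpha * filters channels, half resolution
x_high, x_low = octconv_block(x_high, x_low, filters=64, alpha=0.25)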

Merging the streams back together

Once you are finished adding octconv_blocks, merge the two frequency pathways using final_octconv from octave_conv.py.

This block accepts two input tensors and emits a single output tensor.

x = final_octconv(x_high, x_low, ...)

...
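
Putting the three steps together, a minimal end-to-end sketch might look like the following. The input shape, filter counts, and the filters/alpha keyword names are assumptions for illustration; only the initial_octconv, octconv_block, and final_octconv functions come from octave_conv.py as described above.

from keras.layers import Input, GlobalAveragePooling2D, Dense
from keras.models import Model
from octave_conv import initial_octconv, octconv_block, final_octconv

ip = Input(shape=(32, 32, 3))

# Step 1: split the input into high- and low-frequency pathways
x_high, x_low = initial_octconv(ip, filters=32, alpha=0.25)

# Step 2: stack any number of dual-path blocks
x_high, x_low = octconv_block(x_high, x_low, filters=64, alpha=0.25)
x_high, x_low = octconv_block(x_high, x_low, filters=64, alpha=0.25)

# Step 3: merge the two pathways back into a single tensor
x = final_octconv(x_high, x_low, filters=128)

# Regular Keras layers from here on
x = GlobalAveragePooling2D()(x)
out = Dense(10, activation='softmax')(x)

model = Model(ip, out)
model.summary()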

Acknowledgements

This code is heavily based on the MXNet implementation by terrychenism at https://github.com/terrychenism/OctaveConv.

Requirements


  • Keras 2.2.4+
  • TensorFlow 1.13+ (2.0 support depends on when Keras will support it) / Theano (not tested) / CNTK (not tested)