
amalF / Kervolution

Licence: other
Kervolution implementation using TF2.0

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Kervolution

tf-faster-rcnn
TensorFlow 2 Faster R-CNN implementation from scratch, supporting batch processing with MobileNetV2 and VGG16 backbones
Stars: ✭ 88 (+340%)
Mutual labels:  tf2, keras-tensorflow
cnn-visualization-keras-tf2
Filter visualization, Feature map visualization, Guided Backprop, GradCAM, Guided-GradCAM, Deep Dream
Stars: ✭ 21 (+5%)
Mutual labels:  tf2, keras-tensorflow
Codebraid
Live code in Pandoc Markdown
Stars: ✭ 204 (+920%)
Mutual labels:  reproducible-research
100DaysOfMLCode
I am taking up the #100DaysOfMLCode Challenge 😎
Stars: ✭ 12 (-40%)
Mutual labels:  keras-tensorflow
COVID-19-DETECTION
Detect Covid-19 with Chest X-Ray Data
Stars: ✭ 43 (+115%)
Mutual labels:  keras-tensorflow
Fglab
Future Gadget Laboratory
Stars: ✭ 218 (+990%)
Mutual labels:  reproducible-research
groundhog
Reproducible R Scripts Via Date Controlled Installing & Loading of CRAN & Git Packages
Stars: ✭ 58 (+190%)
Mutual labels:  reproducible-research
Avalanche
Avalanche: an End-to-End Library for Continual Learning.
Stars: ✭ 151 (+655%)
Mutual labels:  reproducible-research
benchmark VAE
Unifying Variational Autoencoder (VAE) implementations in Pytorch (NeurIPS 2022)
Stars: ✭ 1,211 (+5955%)
Mutual labels:  reproducible-research
TF2DeepFloorplan
TF2 Deep FloorPlan Recognition using a Multi-task Network with Room-boundary-Guided Attention. Enable tensorboard, quantization, flask, tflite, docker, github actions and google colab.
Stars: ✭ 98 (+390%)
Mutual labels:  keras-tensorflow
Rmarkdown tutorial
Reproducible Research with Rmarkdown: data management, analysis, and reporting all-in-one
Stars: ✭ 18 (-10%)
Mutual labels:  reproducible-research
fertile
creating optimal conditions for reproducibility
Stars: ✭ 52 (+160%)
Mutual labels:  reproducible-research
Reprozip
ReproZip is a tool that simplifies the process of creating reproducible experiments from command-line executions, a frequently-used common denominator in computational science.
Stars: ✭ 231 (+1055%)
Mutual labels:  reproducible-research
FACIL
Framework for Analysis of Class-Incremental Learning with 12 state-of-the-art methods and 3 baselines.
Stars: ✭ 411 (+1955%)
Mutual labels:  reproducible-research
Ten Rules Jupyter
Ten Simple Rules for Writing and Sharing Computational Analyses in Jupyter Notebooks
Stars: ✭ 204 (+920%)
Mutual labels:  reproducible-research
rcompendium
📦 Create a package or compendium structure
Stars: ✭ 26 (+30%)
Mutual labels:  reproducible-research
Neurodocker
Generate custom Docker and Singularity images, and minimize existing containers
Stars: ✭ 198 (+890%)
Mutual labels:  reproducible-research
Containerit
Package an R workspace and all dependencies as a Docker container
Stars: ✭ 248 (+1140%)
Mutual labels:  reproducible-research
targets-tutorial
Short course on the targets R package
Stars: ✭ 87 (+335%)
Mutual labels:  reproducible-research
papeR
A toolbox for writing Sweave or other LaTeX-based papers and reports and to prettify the output of various estimated models.
Stars: ✭ 26 (+30%)
Mutual labels:  reproducible-research

Kervolutional Neural Networks

A TensorFlow implementation of Kervolutional Neural Networks (KNN).

Introduction

The paper introduces kervolution (kernel convolution), an alternative to the standard convolution operator in CNNs. The key idea is to use non-linear kernels to extract more complex features without adding any additional parameters.

According to the paper, using kernels as the source of non-linearity is more effective than relying on activation functions and max pooling operations (see figure below).
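The repository defines its own layer classes; as a minimal sketch (the class and argument names below are illustrative, not the repository's), note that the polynomial kernel k(x, w) = (&lt;x, w&gt; + c_p)^d_p depends on the input patch only through the inner product &lt;x, w&gt;, so it can be implemented as a standard convolution followed by an element-wise power:

```python
import tensorflow as tf


class PolynomialKervolution2D(tf.keras.layers.Layer):
    """Sketch of a polynomial kervolution layer (names are illustrative).

    k(x, w) = (<x, w> + c_p) ** d_p is a function of the inner product,
    so it reduces to a regular convolution followed by an element-wise
    power, adding only the scalar c_p as an extra trainable weight.
    """

    def __init__(self, filters, kernel_size, degree=2, cp_init=0.5, **kwargs):
        super().__init__(**kwargs)
        self.filters = filters
        self.kernel_size = kernel_size
        self.degree = degree
        self.cp_init = cp_init

    def build(self, input_shape):
        in_channels = int(input_shape[-1])
        self.w = self.add_weight(
            name="kernel",
            shape=(self.kernel_size, self.kernel_size, in_channels, self.filters),
            initializer="glorot_uniform",
            trainable=True)
        # Learnable balance term c_p (see the note on its initialization below).
        self.cp = self.add_weight(
            name="cp",
            shape=(),
            initializer=tf.constant_initializer(self.cp_init),
            trainable=True)

    def call(self, inputs):
        # Standard convolution computes the patch-wise inner products <x, w>.
        linear = tf.nn.conv2d(inputs, self.w, strides=1, padding="SAME")
        # Element-wise polynomial kernel on top of the inner products.
        return tf.pow(linear + self.cp, float(self.degree))
```

With d_p = 1 and c_p = 0 this reduces to an ordinary convolution; higher degrees mix in higher-order feature interactions at no parameter cost beyond the scalar c_p.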

Implementation

This code was tested with TensorFlow 2.0 and Python 3.6.

pip install -r requirements.txt

To launch training with LeNet-5 on the MNIST dataset, as described in Section 4 of the paper:

python train_evaluate.py --lr 0.003 --batch_size 50 --epochs 20 --model_name lenetknn --kernel polynomial

The figures below show the test accuracy over the first epoch.


The initialization of the learnable parameter c_p of the polynomial kernel is important for fast convergence. The curve in the figure below uses 0.5 as the initial value.
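Assuming the hypothetical sketch layer above, that initialization corresponds to:

```python
# Hypothetical usage of the sketch layer above: start the learnable c_p at 0.5.
kerv = PolynomialKervolution2D(filters=6, kernel_size=5, degree=2, cp_init=0.5)
```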



To test the impact of the non-linearity on performance, the activations are removed and max pooling is replaced with average pooling (a sketch of this ablated model follows below). These experiments use a lower learning rate (0.0001).
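As a rough illustration of this ablation (reusing the hypothetical PolynomialKervolution2D sketch above, not the repository's actual classes), a LeNet-5-style model whose only non-linearity comes from the polynomial kernels could look like:

```python
import tensorflow as tf
from tensorflow.keras import layers


def lenet5_kervolution_ablation(num_classes=10):
    # No ReLU/sigmoid activations, and average pooling instead of max pooling,
    # so the polynomial kernels are the only source of non-linearity.
    return tf.keras.Sequential([
        tf.keras.Input(shape=(28, 28, 1)),
        PolynomialKervolution2D(filters=6, kernel_size=5, degree=2, cp_init=0.5),
        layers.AveragePooling2D(pool_size=2),
        PolynomialKervolution2D(filters=16, kernel_size=5, degree=2, cp_init=0.5),
        layers.AveragePooling2D(pool_size=2),
        layers.Flatten(),
        layers.Dense(120),
        layers.Dense(84),
        layers.Dense(num_classes),
    ])


model = lenet5_kervolution_ablation()
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),  # the lower rate noted above
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"])
```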



Licence

MIT
