
stanford-futuredata / Equivariant Transformers

License: MIT
Equivariant Transformer (ET) layers are image-to-image mappings that incorporate prior knowledge of invariances with respect to continuous transformation groups (ICML 2019). Paper: https://arxiv.org/abs/1901.11399

Projects that are alternatives of or similar to Equivariant Transformers

Yann
This toolbox is supporting material for the book on CNNs (http://www.convolution.network).
Stars: ✭ 41 (-39.71%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Very Deep Convolutional Networks For Natural Language Processing In Tensorflow
An implementation of the paper "Very Deep Convolutional Networks for Natural Language Processing" (https://arxiv.org/abs/1606.01781) in TensorFlow.
Stars: ✭ 54 (-20.59%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Computervision Recipes
Best Practices, code samples, and documentation for Computer Vision.
Stars: ✭ 8,214 (+11979.41%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Deep learning projects
Stars: ✭ 28 (-58.82%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Pneumonia Detection From Chest X Ray Images With Deep Learning
Detecting Pneumonia in Chest X-ray Images using Convolutional Neural Network and Pretrained Models
Stars: ✭ 64 (-5.88%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Keras Faster Rcnn
Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks
Stars: ✭ 28 (-58.82%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Accurate Binary Convolution Network
Binary Convolution Network for faster real-time processing in ASICs
Stars: ✭ 49 (-27.94%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Twitter sentiment analysis word2vec convnet
Twitter Sentiment Analysis with Gensim Word2Vec and Keras Convolutional Network
Stars: ✭ 24 (-64.71%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Cnn graph
Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering
Stars: ✭ 1,110 (+1532.35%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Keras model compression
Model compression in Keras based on Geoffrey Hinton's logit regression method, applied to MNIST: 16x compression at accuracy above 0.95. An implementation of "Distilling the Knowledge in a Neural Network" by Geoffrey Hinton et al.
Stars: ✭ 59 (-13.24%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Brain Tumor Segmentation Keras
Keras implementation of the multi-channel cascaded architecture introduced in the paper "Brain Tumor Segmentation with Deep Neural Networks"
Stars: ✭ 20 (-70.59%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Gtsrb
Convolutional Neural Network for German Traffic Sign Recognition Benchmark
Stars: ✭ 65 (-4.41%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Traffic Sign Classifier
Udacity Self-Driving Car Engineer Nanodegree. Project: Build a Traffic Sign Recognition Classifier
Stars: ✭ 12 (-82.35%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Teacher Student Training
This repository stores the files used for my summer internship's work on "teacher-student learning", an experimental method for training deep neural networks using a trained teacher model.
Stars: ✭ 34 (-50%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Dl Workshop Series
Material used for Deep Learning related workshops for Machine Learning Tokyo (MLT)
Stars: ✭ 857 (+1160.29%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Svhn Cnn
Classifying the Google Street View House Numbers (SVHN) dataset with a CNN.
Stars: ✭ 44 (-35.29%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
All Classifiers 2019
A collection of computer vision projects for Acute Lymphoblastic Leukemia classification/early detection.
Stars: ✭ 22 (-67.65%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Plaquebox Paper
Repo for Tang et al, bioRxiv 454793 (2018)
Stars: ✭ 23 (-66.18%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Convisualize nb
Visualisations for Convolutional Neural Networks in Pytorch
Stars: ✭ 57 (-16.18%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks
Audio classification
CNN 1D vs 2D audio classification
Stars: ✭ 65 (-4.41%)
Mutual labels:  jupyter-notebook, convolutional-neural-networks

Equivariant Transformer Networks

Equivariant Transformer (ET) layers are image-to-image mappings that incorporate prior knowledge of invariances with respect to continuous transformation groups. ET layers can be used to normalize the appearance of images prior to classification (or other operations) by a convolutional neural network.

More details can be found in our ICML 2019 paper: https://arxiv.org/abs/1901.11399.
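
Concretely, an ET layer can be thought of as an image-to-image module that predicts the pose of its input and resamples the image to undo that transformation before a standard CNN sees it. The sketch below is purely illustrative (the CanonicalizeThenClassify wrapper and et_layer argument are hypothetical names, not this repository's API):

import torch.nn as nn

class CanonicalizeThenClassify(nn.Module):
    """Illustrative composition of a pose-normalizing image-to-image
    layer (standing in for an ET layer) with a downstream CNN."""
    def __init__(self, et_layer, cnn):
        super().__init__()
        self.et_layer = et_layer  # maps images to pose-normalized images
        self.cnn = cnn            # ordinary convolutional classifier

    def forward(self, x):
        x = self.et_layer(x)  # approximately undo the nuisance transformation
        return self.cnn(x)    # classify the canonicalized image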

[Figure: predicted transformations]


Requirements


Datasets

[Figure: examples of MNIST digits distorted by projective transformations]

To download and preprocess datasets, run:

python datasets.py projective_mnist --data_dir=<PATH>

for the Projective MNIST dataset, and

python datasets.py svhn --data_dir=<PATH>

for the SVHN dataset.

This will download the requested dataset to the directory indicated by PATH and will write three files: train.pt, valid.pt and test.pt. These files will be used by the experiment scripts.
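
To inspect the preprocessed data outside the experiment scripts, the .pt files can presumably be loaded with torch.load (an assumption based on the file extension; their exact contents are not documented here):

import torch

# Assumption: train.pt is a standard torch-serialized object
# (e.g., tensors or a dataset); adjust unpacking to what you find.
train_data = torch.load('<PATH>/train.pt')
print(type(train_data))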


Pretrained Models

Pretrained models can be found in the pretrained directory.

These can be loaded by simply setting the load_path argument for the corresponding Model subclass:

from experiment_mnist import MNISTModel
from experiment_svhn import SVHNModel

mnist_model = MNISTModel(load_path='pretrained/etn-projmnist-8x.pt')
svhn_model = SVHNModel(load_path='pretrained/etn-resnet34-svhn.pt')

Usage

There are two scripts for running the experiments described in the paper: experiment_mnist.py and experiment_svhn.py. These scripts come with preset hyperparameters for each task that can be overridden by setting the corresponding flags.

To train a model on the Projective MNIST dataset, run:

python experiment_mnist.py train --train_path <PATH>/train.pt --valid_path <PATH>/valid.pt [--save_path <SAVE_PATH>]

The save_path flag lets us specify a path to save the model that achieves the best validation accuracy during training.

To change the set of transformers used by the model, we can use the tfs flag to specify a list of class names from the etn.transformers module. For example:

python experiment_mnist.py train ... --tfs "[ShearX, HyperbolicRotation]"

To train a model without any transformers, we can simply set tfs to the empty list []:

python experiment_mnist.py train ... --tfs []

We can also set the coordinate transformation that's applied immediately prior to classification by the CNN:

python experiment_mnist.py train ... --coords logpolar

Therefore, to train a bare-bones model without any transformer layers or coordinate transformations, we can run:

python experiment_mnist.py train ... --tfs [] --coords identity

Feel free to play around with different combinations of transformers and coordinate systems!
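
These flag overrides should also be expressible programmatically, since the Model subclasses can be constructed directly (see the notebook usage below); the keyword arguments here are an assumption that mirrors the CLI flags:

from experiment_mnist import MNISTModel

# Assumption: constructor kwargs mirror the CLI flags described above.
model = MNISTModel(tfs=[], coords='identity')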

To train a non-equivariant model, we can set the equivariant flag to False:

python experiment_mnist.py train ... --equivariant False

To change the device used for training, we can set the device flag (this is set to cuda:0 by default):

python experiment_mnist.py train ... --device cuda:1

To evaluate a saved model on the test set, run:

python experiment_mnist.py --load_path <SAVE_PATH> test --test_path <PATH>/test.pt

The experiments on the SVHN dataset can be run in the same manner by calling experiment_svhn.py instead of experiment_mnist.py.

These scripts can also be called from a Jupyter Notebook by importing the MNISTModel and SVHNModel classes. In a notebook, we can visualize training progress using the show_plot parameter. This produces a live plot of the loss and validation error as training progresses.

from experiment_mnist import MNISTModel
model = MNISTModel(...)
model.train(..., show_plot=True, ...)
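
Evaluation from a notebook would presumably mirror the CLI's test subcommand (an assumption; the exact method name and signature may differ):

from experiment_mnist import MNISTModel

# Assumption: the test subcommand maps to a method of the same name.
model = MNISTModel(load_path='<SAVE_PATH>')
model.test(test_path='<PATH>/test.pt')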

For more training options, see the __init__ and train functions for the base experiments.Model class and its subclasses in experiment_mnist and experiment_svhn.


Citation

If you've found this repository useful in your own work, please consider citing our paper:

@inproceedings{tai2019equivariant,
  title={{Equivariant Transformer Networks}},
  author={Tai, Kai Sheng and Bailis, Peter and Valiant, Gregory},
  booktitle={International Conference on Machine Learning},
  year={2019}
}