
tgsmith61591 / Smrt

License: BSD-3-Clause
Handle class imbalance intelligently by using variational auto-encoders to generate synthetic observations of your minority class.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Smrt

Tensorflow Mnist Cvae
Tensorflow implementation of conditional variational auto-encoder for MNIST
Stars: ✭ 139 (+36.27%)
Mutual labels:  autoencoder, vae, variational-autoencoder
Neurec
Next RecSys Library
Stars: ✭ 731 (+616.67%)
Mutual labels:  neural-networks, autoencoder, variational-autoencoder
Tensorflow Mnist Vae
Tensorflow implementation of variational auto-encoder for MNIST
Stars: ✭ 422 (+313.73%)
Mutual labels:  autoencoder, vae, variational-autoencoder
Deep Learning With Python
Example projects I completed to understand Deep Learning techniques with Tensorflow. Please note that I no longer maintain this repository.
Stars: ✭ 134 (+31.37%)
Mutual labels:  neural-networks, vae, variational-autoencoder
Generative Models
Annotated, understandable, and visually interpretable PyTorch implementations of: VAE, BIRVAE, NSGAN, MMGAN, WGAN, WGANGP, LSGAN, DRAGAN, BEGAN, RaGAN, InfoGAN, fGAN, FisherGAN
Stars: ✭ 438 (+329.41%)
Mutual labels:  autoencoder, vae
Awesome Vaes
A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+309.8%)
Mutual labels:  vae, variational-autoencoder
Pyod
A Python Toolbox for Scalable Outlier Detection (Anomaly Detection)
Stars: ✭ 5,083 (+4883.33%)
Mutual labels:  neural-networks, autoencoder
Advanced Deep Learning With Keras
Advanced Deep Learning with Keras, published by Packt
Stars: ✭ 917 (+799.02%)
Mutual labels:  autoencoder, vae
Tensorflow Generative Model Collections
Collection of generative models in Tensorflow
Stars: ✭ 3,785 (+3610.78%)
Mutual labels:  vae, variational-autoencoder
Tensorflow Tutorial
TensorFlow and Deep Learning Tutorials
Stars: ✭ 748 (+633.33%)
Mutual labels:  neural-networks, autoencoder
Variational Autoencoder
PyTorch implementation of "Auto-Encoding Variational Bayes"
Stars: ✭ 25 (-75.49%)
Mutual labels:  vae, variational-autoencoder
Disentangling Vae
Experiments for understanding disentanglement in VAE latent representations
Stars: ✭ 398 (+290.2%)
Mutual labels:  vae, variational-autoencoder
Repo 2017
Python codes in Machine Learning, NLP, Deep Learning and Reinforcement Learning with Keras and Theano
Stars: ✭ 1,123 (+1000.98%)
Mutual labels:  autoencoder, variational-autoencoder
Pytorch Rl
This repository contains model-free deep reinforcement learning algorithms implemented in Pytorch
Stars: ✭ 394 (+286.27%)
Mutual labels:  vae, variational-autoencoder
Vae protein function
Protein function prediction using a variational autoencoder
Stars: ✭ 57 (-44.12%)
Mutual labels:  vae, variational-autoencoder
Codeslam
Implementation of CodeSLAM — Learning a Compact, Optimisable Representation for Dense Visual SLAM paper (https://arxiv.org/pdf/1804.00874.pdf)
Stars: ✭ 64 (-37.25%)
Mutual labels:  autoencoder, variational-autoencoder
Variational Autoencoder
Variational autoencoder implemented in tensorflow and pytorch (including inverse autoregressive flow)
Stars: ✭ 807 (+691.18%)
Mutual labels:  vae, variational-autoencoder
Vae For Image Generation
Implemented Variational Autoencoder generative model in Keras for image generation and its latent space visualization on MNIST and CIFAR10 datasets
Stars: ✭ 87 (-14.71%)
Mutual labels:  vae, variational-autoencoder
classifying-vae-lstm
music generation with a classifying variational autoencoder (VAE) and LSTM
Stars: ✭ 27 (-73.53%)
Mutual labels:  vae, variational-autoencoder
S Vae Pytorch
Pytorch implementation of Hyperspherical Variational Auto-Encoders
Stars: ✭ 255 (+150%)
Mutual labels:  vae, variational-autoencoder


Synthetic Minority Reconstruction Technique (SMRT)

Handle your class imbalance more intelligently by using SMOTE's younger, more sophisticated cousin

Installation

Installation is easy. After cloning the project onto your machine and installing the required dependencies, simply use the setup.py file:

$ git clone https://github.com/tgsmith61591/smrt.git
$ cd smrt
$ python setup.py install

About

SMRT (Synthetic Minority Reconstruction Technique) is the new SMOTE (Synthetic Minority Oversampling TEchnique). Using variational auto-encoders, SMRT learns the latent factors that best reconstruct the observations in each minority class, and then generates synthetic observations until the minority class is represented at a user-defined ratio relative to the majority class size.
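As a rough illustration of the "user-defined ratio" above, the arithmetic for deciding how many synthetic observations to generate might look like the following. This is a minimal sketch, not the SMRT API; the function name and signature are assumptions for illustration:

```python
def n_synthetic_needed(n_minority, n_majority, target_ratio):
    """Return how many synthetic minority samples are needed so that
    minority / majority is approximately target_ratio (0 < target_ratio <= 1).
    """
    target_minority_size = int(target_ratio * n_majority)
    # Never a negative count: if the class is already represented
    # at or above the target ratio, generate nothing.
    return max(0, target_minority_size - n_minority)

# e.g., 100 minority vs. 10,000 majority, balanced to a 0.5 ratio:
print(n_synthetic_needed(100, 10_000, 0.5))  # 4900
```

The generated observations are then decoded from the VAE's learned latent distribution rather than interpolated between existing points, which is the key difference from SMOTE.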

SMRT avoids one of SMOTE's greatest risks: when SMOTE selects a random minority observation and interpolates between it and one of its k-nearest neighbors, it may pick a "border point," an observation very close to the decision boundary. The synthetically-generated observations can then lie too close to the decision boundary for reliable classification, degrading an estimator's performance. SMRT avoids this risk implicitly, since the VariationalAutoencoder learns a distribution that generalizes to the lowest-error (i.e., most archetypal) observations.
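To make the border-point risk concrete, here is a sketch of SMOTE's core interpolation step (a simplification, not the SMRT or imbalanced-learn implementation): each synthetic point lies on the segment between a minority observation and one of its k-nearest minority neighbors, so if that observation sits near the decision boundary, the synthetic point can too.

```python
import numpy as np

def smote_interpolate(x, neighbor, rng):
    """Generate one synthetic sample on the segment between a minority
    observation x and one of its k-nearest minority neighbors."""
    gap = rng.uniform(0.0, 1.0)  # random position along the segment
    return x + gap * (neighbor - x)

rng = np.random.default_rng(0)
x = np.array([1.0, 1.0])         # a minority observation
neighbor = np.array([3.0, 5.0])  # one of its k-nearest minority neighbors
synthetic = smote_interpolate(x, neighbor, rng)
# `synthetic` is guaranteed to lie on the segment joining x and neighbor,
# wherever that segment happens to fall relative to the decision boundary.
```

A VAE-based approach instead samples from a learned latent distribution, so synthetic points are not tied to line segments between existing observations.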

See the paper for a more in-depth treatment.

Example

The SMRT example is an IPython notebook with reproducible code and data that compares an imbalanced variant of the MNIST dataset after being balanced with both SMOTE and SMRT. The following are several of the resulting images produced by SMOTE and SMRT, respectively. Even visually, it is evident that SMRT synthesizes data that better resembles the input data.

Original:

The MNIST dataset was amended to contain only zeros and ones in an unbalanced (~1:100, respectively) ratio. The top row shows the original MNIST images, the second row the SMRT-generated images, and the bottom row the SMOTE-generated images:
Original

Notes

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].