
mfbalin / Concrete Autoencoders

Projects that are alternatives of or similar to Concrete Autoencoders

100 Days Of Ml Code
Chinese version of 100-Days-Of-ML-Code
Stars: ✭ 16,797 (+24601.47%)
Mutual labels:  jupyter-notebook, unsupervised-learning, supervised-learning
Mlxtend
A library of extension and helper modules for Python's data analysis and machine learning libraries.
Stars: ✭ 3,729 (+5383.82%)
Mutual labels:  unsupervised-learning, supervised-learning
Machine Learning Algorithms From Scratch
Implementing machine learning algorithms from scratch.
Stars: ✭ 297 (+336.76%)
Mutual labels:  unsupervised-learning, supervised-learning
Stockpriceprediction
Stock Price Prediction using Machine Learning Techniques
Stars: ✭ 700 (+929.41%)
Mutual labels:  jupyter-notebook, supervised-learning
machine-learning-course
Machine Learning Course @ Santa Clara University
Stars: ✭ 17 (-75%)
Mutual labels:  supervised-learning, unsupervised-learning
L2c
Learning to Cluster. A deep clustering strategy.
Stars: ✭ 262 (+285.29%)
Mutual labels:  unsupervised-learning, supervised-learning
Hidt
Official repository for the paper "High-Resolution Daytime Translation Without Domain Labels" (CVPR2020, Oral)
Stars: ✭ 513 (+654.41%)
Mutual labels:  jupyter-notebook, unsupervised-learning
sutton-barto-rl-exercises
📖 Learning reinforcement learning by implementing the algorithms from Reinforcement Learning: An Introduction
Stars: ✭ 77 (+13.24%)
Mutual labels:  supervised-learning, unsupervised-learning
Udacity Deep Learning Nanodegree
This is a collection of projects made during my Deep Learning Nanodegree by Udacity
Stars: ✭ 15 (-77.94%)
Mutual labels:  jupyter-notebook, supervised-learning
Discogan Pytorch
PyTorch implementation of "Learning to Discover Cross-Domain Relations with Generative Adversarial Networks"
Stars: ✭ 961 (+1313.24%)
Mutual labels:  jupyter-notebook, unsupervised-learning
Gdynet
Unsupervised learning of atomic scale dynamics from molecular dynamics.
Stars: ✭ 37 (-45.59%)
Mutual labels:  jupyter-notebook, unsupervised-learning
machine-learning
Programming Assignments and Lectures for Andrew Ng's "Machine Learning" Coursera course
Stars: ✭ 83 (+22.06%)
Mutual labels:  supervised-learning, unsupervised-learning
machine learning from scratch matlab python
Vectorized Machine Learning in Python 🐍 From Scratch
Stars: ✭ 28 (-58.82%)
Mutual labels:  supervised-learning, unsupervised-learning
Php Ml
PHP-ML - Machine Learning library for PHP
Stars: ✭ 7,900 (+11517.65%)
Mutual labels:  unsupervised-learning, supervised-learning
ml-ai
ML-AI Community | Open Source | Built in Bharat for the World | Data science problem statements and solutions
Stars: ✭ 32 (-52.94%)
Mutual labels:  supervised-learning, unsupervised-learning
Pytorch Cortexnet
PyTorch implementation of the CortexNet predictive model
Stars: ✭ 349 (+413.24%)
Mutual labels:  jupyter-notebook, unsupervised-learning
Keras deep clustering
How to do Unsupervised Clustering with Keras
Stars: ✭ 202 (+197.06%)
Mutual labels:  jupyter-notebook, unsupervised-learning
Pytorch Byol
PyTorch implementation of Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning
Stars: ✭ 213 (+213.24%)
Mutual labels:  jupyter-notebook, unsupervised-learning
Simclr
PyTorch implementation of SimCLR: A Simple Framework for Contrastive Learning of Visual Representations
Stars: ✭ 750 (+1002.94%)
Mutual labels:  jupyter-notebook, unsupervised-learning
Susi
SuSi: Python package for unsupervised, supervised and semi-supervised self-organizing maps (SOM)
Stars: ✭ 42 (-38.24%)
Mutual labels:  unsupervised-learning, supervised-learning

Concrete Autoencoders

The concrete autoencoder is an end-to-end differentiable method for global feature selection: it efficiently identifies a subset of the most informative features and simultaneously learns a neural network to reconstruct the input data from that subset. The method is a modification of the standard autoencoder and can be applied in both unsupervised and supervised settings.
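At the heart of the method is a "concrete select" layer that learns, for each of the K outputs, a relaxed categorical distribution over the input features; annealing its temperature toward zero turns the soft mixtures into hard feature choices. As a rough illustration only (the package's actual implementation, class names, and annealing schedule may differ), such a layer can be sketched in tf.keras as follows:

import tensorflow as tf

class ConcreteSelect(tf.keras.layers.Layer):
    # Selects K input features via a temperature-annealed Concrete
    # (Gumbel-softmax) relaxation of a categorical distribution.
    def __init__(self, k, temperature = 10.0, **kwargs):
        super().__init__(**kwargs)
        self.k = k
        self.temperature = tf.Variable(temperature, trainable = False)

    def build(self, input_shape):
        # One row of logits per selected feature: logits[i] parameterizes
        # a distribution over all d input features.
        self.logits = self.add_weight(name = 'logits',
                                      shape = (self.k, int(input_shape[-1])),
                                      initializer = 'glorot_normal')

    def call(self, x):
        # Sample Gumbel noise and relax the argmax with a softmax; as the
        # temperature anneals toward zero, each row of `selection`
        # approaches a one-hot vector, i.e. a hard feature choice.
        uniform = tf.random.uniform(tf.shape(self.logits), minval = 1e-7, maxval = 1.0)
        gumbel = -tf.math.log(-tf.math.log(uniform))
        selection = tf.nn.softmax((self.logits + gumbel) / self.temperature)
        # (batch, d) x (d, K) -> (batch, K); at test time, taking the
        # argmax of each logits row yields the selected feature indices.
        return tf.matmul(x, selection, transpose_b = True)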

For more details, see the accompanying paper, "Concrete Autoencoders for Differentiable Feature Selection and Reconstruction" (ICML 2019), and please use the citation below.

@article{abid2019concrete,
  title={Concrete Autoencoders for Differentiable Feature Selection and Reconstruction},
  author={Abid, Abubakar and Balin, Muhammed Fatih and Zou, James},
  journal={arXiv preprint arXiv:1901.09346},
  year={2019}
}

Installation

To install, use pip install concrete-autoencoder

Usage

Here's an example of using Concrete Autoencoders to select the 20 most important features (pixels) across the entire MNIST dataset:

from concrete_autoencoder import ConcreteAutoencoderFeatureSelector
from keras.datasets import mnist
from keras.utils import to_categorical
from keras.layers import Dense, Dropout, LeakyReLU
import numpy as np

# Load MNIST and flatten each 28x28 image into a 784-dimensional vector
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = np.reshape(x_train, (len(x_train), -1))
x_test = np.reshape(x_test, (len(x_test), -1))
# One-hot encode the labels (unused in this unsupervised example)
y_train = to_categorical(y_train)
y_test = to_categorical(y_test)
print(x_train.shape, y_train.shape)
print(x_test.shape, y_test.shape)

# Decoder: reconstructs all 784 pixels from the K = 20 selected ones
def decoder(x):
    x = Dense(320)(x)
    x = LeakyReLU(0.2)(x)
    x = Dropout(0.1)(x)
    x = Dense(320)(x)
    x = LeakyReLU(0.2)(x)
    x = Dropout(0.1)(x)
    x = Dense(784)(x)
    return x

selector = ConcreteAutoencoderFeatureSelector(K = 20, output_function = decoder, num_epochs = 800)

# The target equals the input (reconstruction); the test split appears to serve as validation data
selector.fit(x_train, x_train, x_test, x_test)

Then, to get the indices of the selected pixels, run:

selector.get_support(indices = True)
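
The returned array can be used to slice the data down to the selected pixels; selector.transform(x_train) should yield the same columns. The indices and shapes below assume the MNIST example above:

indices = selector.get_support(indices = True)  # 20 pixel positions in [0, 784)
x_train_selected = x_train[:, indices]          # shape: (60000, 20)
# selector.transform(x_train) should produce the same reduced matrix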

You can run this code in a Colab notebook: https://colab.research.google.com/drive/11NMLrmToq4bo6WQ_4WX5G4uIjBHyrzXd

Documentation:

class ConcreteAutoencoderFeatureSelector:

The constructor takes a number of parameters to initialize the class. They are:

K: the number of features one wants to select

output_function: the decoder function

num_epochs: the initial number of epochs to train the concrete autoencoder for (doubled on each retry; see tryout_limit)

batch_size: the batch size during training

learning_rate: the learning rate of the Adam optimizer used during training

start_temp: the starting temperature of the concrete select layer

min_temp: the ending temperature of the concrete select layer

tryout_limit: the number of times to double num_epochs and retry in case training does not converge

fit(X, Y = None): trains the concrete autoencoder

X: the data for which you want to do feature selection

Y: labels; if labels are given, supervised feature selection is performed, otherwise unsupervised feature selection (a supervised sketch follows this list)

transform(X): filters X's features after fit has been called

X: the data to be filtered

fit_transform(X): calls fit and then transform

X: the data to do feature selection on and filter

get_support(indices = False): if indices is True, returns the indices of the selected features; otherwise, returns a boolean mask

indices: boolean flag to determine whether to return a list of indices or a boolean mask

get_params(): returns the underlying Keras model for the concrete autoencoder
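
As noted above, passing labels to fit switches the selector to supervised mode. Below is a minimal sketch of supervised feature selection on MNIST, reusing x_train and y_train from the usage example; it assumes the output_function must then map the K selected features to the label space, and the classifier head is illustrative rather than taken from the package:

from keras.layers import Dense, LeakyReLU

# Hypothetical classifier head: 10 outputs, one per MNIST digit class
def classifier(x):
    x = Dense(320)(x)
    x = LeakyReLU(0.2)(x)
    x = Dense(10, activation = 'softmax')(x)
    return x

supervised_selector = ConcreteAutoencoderFeatureSelector(K = 20, output_function = classifier, num_epochs = 800)
supervised_selector.fit(x_train, y_train)  # labels given, so supervised feature selection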
