
jbornschein / bihm

Licence: other
Bidirectional Helmholtz Machines

Programming Languages

python

Projects that are alternatives of or similar to bihm

Real Time Ml Project
A curated list of applied machine learning and data science notebooks and libraries across different industries.
Stars: ✭ 143 (+257.5%)
Mutual labels:  theano, machine-learning-algorithms
Draw
Reimplementation of DRAW
Stars: ✭ 346 (+765%)
Mutual labels:  theano, machine-learning-algorithms
reweighted-ws
Implementation of the reweighted wake-sleep machine learning algorithm
Stars: ✭ 39 (-2.5%)
Mutual labels:  theano, machine-learning-algorithms
Mariana
The Cutest Deep Learning Framework which is also a wonderful Declarative Language
Stars: ✭ 151 (+277.5%)
Mutual labels:  theano, machine-learning-algorithms
xgboost-smote-detect-fraud
Can we predict accurately on skewed data? What sampling techniques can be used? Which models/techniques work in this scenario? Find the answers in this code pattern!
Stars: ✭ 59 (+47.5%)
Mutual labels:  machine-learning-algorithms
pycobra
Python library implementing ensemble methods for regression, classification and visualisation tools, including Voronoi tessellations.
Stars: ✭ 111 (+177.5%)
Mutual labels:  machine-learning-algorithms
symbolic-pymc
Tools for the symbolic manipulation of PyMC models, Theano, and TensorFlow graphs.
Stars: ✭ 58 (+45%)
Mutual labels:  theano
rankpruning
🧹 Formerly for binary classification with noisy labels. Replaced by cleanlab.
Stars: ✭ 81 (+102.5%)
Mutual labels:  machine-learning-algorithms
sia-cog
Various cognitive APIs for machine learning, vision, and language intent analysis. Covers traditional as well as deep learning model design and training.
Stars: ✭ 34 (-15%)
Mutual labels:  machine-learning-algorithms
mlreef
The collaboration workspace for Machine Learning
Stars: ✭ 1,409 (+3422.5%)
Mutual labels:  machine-learning-algorithms
Final-year-project-deep-learning-models
Deep learning for freehand sketch object recognition
Stars: ✭ 22 (-45%)
Mutual labels:  theano
VNMT
Code for "Variational Neural Machine Translation" (EMNLP2016)
Stars: ✭ 54 (+35%)
Mutual labels:  theano
Multi-Type-TD-TSR
Extracting Tables from Document Images using a Multi-stage Pipeline for Table Detection and Table Structure Recognition
Stars: ✭ 174 (+335%)
Mutual labels:  machine-learning-algorithms
MLDemos
Machine Learning Demonstrations: A graphical interface to draw data, apply a diverse array of machine learning tools to it, and directly see the results in a visual and understandable manner.
Stars: ✭ 46 (+15%)
Mutual labels:  machine-learning-algorithms
mnist-neural-network-deeplearnjs
🍃 Using a Neural Network to recognize MNIST digits in JavaScript.
Stars: ✭ 26 (-35%)
Mutual labels:  machine-learning-algorithms
rnn benchmarks
RNN benchmarks of PyTorch, TensorFlow and Theano
Stars: ✭ 85 (+112.5%)
Mutual labels:  theano
cheapml
Machine Learning algorithms coded from scratch
Stars: ✭ 17 (-57.5%)
Mutual labels:  machine-learning-algorithms
Sales-Prediction
In-depth analysis and forecasting of product sales based on items, stores, transactions and other dependent variables like holidays and oil prices.
Stars: ✭ 56 (+40%)
Mutual labels:  machine-learning-algorithms
awesome-computer-vision-models
A list of popular deep learning models related to classification, segmentation and detection problems
Stars: ✭ 419 (+947.5%)
Mutual labels:  machine-learning-algorithms
greycat
GreyCat - Data Analytics, Temporal data, What-if, Live machine learning
Stars: ✭ 104 (+160%)
Mutual labels:  machine-learning-algorithms

Bidirectional Helmholtz Machines

This repository contains the source code and additional results for the experiments described in the paper Bidirectional Helmholtz Machines:

http://arxiv.org/abs/1506.03877

Overview

concept

The basic idea is to build a deep generative model for unsupervised learning by combining a top-down directed model P and a bottom-up directed model Q into a single joint model P*. We show that P* can be trained such that P and Q are useful approximate inference distributions, both when we want to sample from the model and when we want to perform inference.
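As a toy illustration of this construction, the sketch below defines p*(x,h) ∝ sqrt(p(x,h) q(h|x)) on a tiny discrete latent space (all tables are invented for illustration; in the real model P and Q are deep directed networks) and checks that an importance-sampling estimate of the unnormalized marginal p*(x), with proposals drawn from Q, agrees with exhaustive enumeration:

```python
import numpy as np

rng = np.random.default_rng(0)

H = 3                      # toy model: 3 binary latent units
S = 2 ** H                 # number of latent configurations

# Hypothetical toy tables for one fixed observation x:
#   p_joint[i] ~ p(x, h_i)   (top-down model; unnormalized is fine here)
#   q_cond[i]  ~ q(h_i | x)  (bottom-up model; a proper distribution over h)
p_joint = rng.random(S) * 0.1
q_cond = rng.random(S)
q_cond /= q_cond.sum()

# Exact unnormalized BiHM marginal by enumerating all latent states:
#   p*(x) ∝ ( sum_h sqrt( p(x,h) q(h|x) ) )**2
exact = np.sum(np.sqrt(p_joint * q_cond)) ** 2

# The same quantity via importance sampling with h ~ q(h|x):
#   p*(x) ∝ ( (1/K) sum_k sqrt( p(x,h_k) / q(h_k|x) ) )**2
K = 200_000
idx = rng.choice(S, size=K, p=q_cond)
mc = np.mean(np.sqrt(p_joint[idx] / q_cond[idx])) ** 2

print(exact, mc)           # the two estimates should agree closely
```

Squaring the average of sqrt(p/q) ratios is what distinguishes this estimator from ordinary importance sampling of p(x); it is also why proposals from Q remain useful even when P and Q disagree.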

We generally observe that BiHMs prefer deep architectures with many layers of latent variables. For example, our best model for the binarized MNIST dataset has 12 layers with 300, 200, 100, 75, 50, 35, 30, 25, 20, 15, 10, and 10 binary latent units. This model reaches a test set LL of 84.8 nats.

Samples from the model

bmnist-samples bmnist-samples

The left image shows 100 random samples from the top-down model P; the right image shows the result of starting from these samples and running 250 Gibbs MCMC steps to approximately sample from P*, which yields crisper, higher-quality digits. (We visualize the per-pixel Bernoulli probability instead of sampling from it.)
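The idea behind such a chain can be demonstrated on a deliberately tiny discrete example (the tables below are made up for illustration, not the actual BiHM operator): define p*(x,h) ∝ sqrt(p(x,h) q(x,h)) over small finite spaces and run a Gibbs sampler that alternately resamples h given x and x given h. The empirical distribution over x then converges to the exact marginal of p*:

```python
import numpy as np

rng = np.random.default_rng(1)

NX, NH = 4, 5                       # toy discrete spaces for x and h

# Hypothetical toy joints (unnormalized); in the real model these are
# deep directed networks P and Q.
p = rng.random((NX, NH))
q = rng.random((NX, NH))

target = np.sqrt(p * q)             # p*(x,h) up to the normalizer Z
target_x = target.sum(axis=1)
target_x /= target_x.sum()          # exact marginal p*(x)

# Gibbs chain: alternately resample h | x and x | h under p*.
x = 0
counts = np.zeros(NX)
burn, steps = 500, 100_000
for t in range(burn + steps):
    h = rng.choice(NH, p=target[x] / target[x].sum())
    x = rng.choice(NX, p=target[:, h] / target[:, h].sum())
    if t >= burn:
        counts[x] += 1

empirical = counts / counts.sum()
print(np.round(target_x, 3))
print(np.round(empirical, 3))       # should be close to the exact marginal
```

In the MNIST experiment, the conditionals are given by the learned networks rather than lookup tables, and the chain is initialized at a sample from P rather than an arbitrary state, but the alternating structure is the same.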

Inpainting

bmnist-inpainting bmnist-inpainting

The left image shows 10 different digits that have been partially occluded. For each digit, we sample 10 different starting configurations from Q and then run a Markov chain that produces approximate samples from P* consistent with the observed parts of the digits.
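The clamping logic can be sketched with a simple stand-in model (a single binary latent with i.i.d. Bernoulli pixels, invented for illustration; not the actual BiHM): observed pixels stay fixed throughout, while the latent and the occluded pixels are resampled in a Gibbs loop, so the chain fills in the missing region consistently with the visible half.

```python
import numpy as np

rng = np.random.default_rng(2)

D = 10                                  # toy "image": D binary pixels
# Hypothetical two-mode generative model: one binary latent h with
# p(h=1) = 0.5, and p(pixel=1 | h) = 0.8 if h == 1 else 0.2, pixels iid.
p_on = np.array([0.2, 0.8])

v = np.ones(D, dtype=int)               # a fully "on" digit...
observed = np.arange(D // 2)            # ...but only half the pixels are visible
hidden = np.arange(D // 2, D)
v[hidden] = 0                           # occluded pixels start blank

h = 0
samples = []
for t in range(5000):
    # Resample h | v: Bernoulli posterior from clamped + current pixels.
    like = p_on ** v.sum() * (1 - p_on) ** (D - v.sum())
    h = int(rng.random() < like[1] / like.sum())
    # Resample only the occluded pixels | h; observed pixels stay clamped.
    v[hidden] = rng.random(len(hidden)) < p_on[h]
    if t >= 500:
        samples.append(v[hidden].mean())

print(np.mean(samples))   # inpainted pixels end up mostly "on"
```

Because the visible half is all ones, the chain quickly concentrates on the h = 1 mode and inpaints the hidden half accordingly; in the real model, the Q network provides the starting configurations and the conditionals come from the learned P and Q.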

Dependencies

This code depends on Fuel, Theano, Blocks, and various other libraries from the scientific Python ecosystem.
