2015xli / DBN

Licence: other
Simple code tutorial for deep belief network (DBN)

Programming Languages

Jupyter Notebook
11667 projects

Projects that are alternatives of or similar to DBN

Reducing-the-Dimensionality-of-Data-with-Neural-Networks
Implementation of G. E. Hinton and R. R. Salakhutdinov's Reducing the Dimensionality of Data with Neural Networks (Tensorflow)
Stars: ✭ 34 (+0%)
Mutual labels:  restricted-boltzmann-machine
Spiking-Restricted-Boltzmann-Machine
RBM implemented with spiking neurons in Python. Contrastive Divergence used to train the network.
Stars: ✭ 23 (-32.35%)
Mutual labels:  restricted-boltzmann-machine
Handwritten-Names-Recognition
The goal of this project is to solve the task of name transcription from handwriting images implementing a NN approach.
Stars: ✭ 54 (+58.82%)
Mutual labels:  restricted-boltzmann-machine
NNet
algorithm for study: multi-layer-perceptron, cluster-graph, cnn, rnn, restricted boltzmann machine, bayesian network
Stars: ✭ 24 (-29.41%)
Mutual labels:  restricted-boltzmann-machine
Generative Models
Collection of generative models, e.g. GAN, VAE in Pytorch and Tensorflow.
Stars: ✭ 6,701 (+19608.82%)
Mutual labels:  restricted-boltzmann-machine
xRBM
Implementation of Restricted Boltzmann Machine (RBM) and its variants in Tensorflow
Stars: ✭ 51 (+50%)
Mutual labels:  restricted-boltzmann-machine
Deep-Learning-Models
Deep Learning Models implemented in python.
Stars: ✭ 17 (-50%)
Mutual labels:  restricted-boltzmann-machine

Simple tutorial code for Deep Belief Network (DBN)

The Python code implements a DBN with an example of MNIST digit-image reconstruction.
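For orientation, here is a minimal, self-contained sketch of the idea (not the repository's actual code): each layer is a Bernoulli RBM trained with one-step contrastive divergence, layers are stacked greedily, and an image is reconstructed by propagating activations up through all layers and back down. The layer sizes, learning rate, and toy data below are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli-Bernoulli RBM trained with 1-step contrastive divergence (CD-1)."""
    def __init__(self, n_visible, n_hidden):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible bias
        self.c = np.zeros(n_hidden)    # hidden bias

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.c)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b)

    def cd1_step(self, v0, lr=0.1):
        # Positive phase
        h0 = self.hidden_probs(v0)
        # Negative phase: one Gibbs step
        h0_sample = (rng.random(h0.shape) < h0).astype(float)
        v1 = self.visible_probs(h0_sample)
        h1 = self.hidden_probs(v1)
        # Parameter updates
        n = v0.shape[0]
        self.W += lr * (v0.T @ h0 - v1.T @ h1) / n
        self.b += lr * (v0 - v1).mean(axis=0)
        self.c += lr * (h0 - h1).mean(axis=0)

# Toy stand-in for binarized MNIST images (784 pixels per image).
data = (rng.random((200, 784)) < 0.2).astype(float)

# Greedy layer-wise pretraining of a 784-500-200 DBN.
layer_sizes = [784, 500, 200]
rbms, inp = [], data
for n_vis, n_hid in zip(layer_sizes[:-1], layer_sizes[1:]):
    rbm = RBM(n_vis, n_hid)
    for epoch in range(5):
        rbm.cd1_step(inp)
    rbms.append(rbm)
    inp = rbm.hidden_probs(inp)   # feed hidden activations to the next layer

# Reconstruction: propagate up through all layers, then back down.
up = data
for rbm in rbms:
    up = rbm.hidden_probs(up)
down = up
for rbm in reversed(rbms):
    down = rbm.visible_probs(down)
print("mean reconstruction error:", np.mean((data - down) ** 2))
```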

It also includes a classifier based on the DBN, i.e., the visible units of the top-layer RBM include not only the input but also the labels, so that the top-layer RBM learns the joint distribution p(v, label, h). The input v is still fed in from the bottom of the network, and classification amounts to finding the distribution p(label|v). With this simple implementation, the classifier reaches about 92% accuracy without tuning after being trained on MNIST for 100 epochs.
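One common way such a joint top-layer RBM can answer p(label|v) is by comparing the free energy of each candidate label clamped next to the features. The sketch below illustrates that idea under assumed parameter names and shapes; it is not the repository's API, and the parameters here are untrained placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

n_feat, n_labels, n_hidden = 200, 10, 64

# Hypothetical pretrained parameters of the top-layer RBM whose visible
# units are the concatenation [features, one-hot label].
W = 0.01 * rng.standard_normal((n_feat + n_labels, n_hidden))
b = np.zeros(n_feat + n_labels)   # visible bias
c = np.zeros(n_hidden)            # hidden bias

def free_energy(x):
    """Free energy F(x) of a visible vector x for a Bernoulli RBM."""
    return -(x @ b) - np.sum(np.logaddexp(0.0, x @ W + c), axis=-1)

def classify(features):
    """Return p(label | v) by scoring every candidate label with -F([v, label])."""
    scores = []
    for k in range(n_labels):
        label = np.zeros(n_labels)
        label[k] = 1.0
        scores.append(-free_energy(np.concatenate([features, label])))
    scores = np.array(scores)
    scores -= scores.max()                 # for numerical stability
    return np.exp(scores) / np.exp(scores).sum()

# Example: features produced by the lower DBN layers for one image.
features = (rng.random(n_feat) < 0.3).astype(float)
probs = classify(features)
print("predicted label:", probs.argmax())
```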

The classifier code comes with a digit generator that produces digit images from labels. It is the reverse process of the classifier, i.e., finding the distribution p(v|label). The label is provided to the top-layer RBM as part of its visible units, and the image is produced at the bottom of the network. The generated images are not pretty but are roughly legible, as shown below.

Digits
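A rough sketch of the generation direction, under the same assumed structure as the sketches above (not the repository's code): the one-hot label is clamped in the top RBM's visible layer, the feature part is resampled by Gibbs steps, and the resulting features are propagated down through the lower RBMs to the pixel layer. All parameters below are untrained placeholders.

```python
import numpy as np
from types import SimpleNamespace

rng = np.random.default_rng(2)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def make_rbm(n_vis, n_hid):
    # Hypothetical (untrained) RBM parameters; in the real code these
    # would come from the pretrained network.
    return SimpleNamespace(W=0.01 * rng.standard_normal((n_vis, n_hid)),
                           b=np.zeros(n_vis), c=np.zeros(n_hid))

n_pixels, n_feat, n_labels = 784, 200, 10
lower_rbms = [make_rbm(n_pixels, 500), make_rbm(500, n_feat)]
top_rbm = make_rbm(n_feat + n_labels, 64)

def generate(label_index, gibbs_steps=200):
    """Generate a digit image for the given label: clamp the one-hot label in
    the top RBM's visible layer, Gibbs-sample the feature part, then propagate
    the features down through the lower RBMs to the pixel layer."""
    label = np.zeros(n_labels)
    label[label_index] = 1.0
    feat = (rng.random(n_feat) < 0.5).astype(float)     # random feature init
    for _ in range(gibbs_steps):
        v = np.concatenate([feat, label])
        h_prob = sigmoid(v @ top_rbm.W + top_rbm.c)
        h = (rng.random(h_prob.shape) < h_prob).astype(float)
        v_prob = sigmoid(h @ top_rbm.W.T + top_rbm.b)
        feat = v_prob[:n_feat]                           # label stays clamped
    down = feat
    for rbm in reversed(lower_rbms):
        down = sigmoid(down @ rbm.W.T + rbm.b)
    return down                                          # pixel values in [0, 1]

image = generate(label_index=3)
print(image.shape)   # (784,) -> reshape to 28x28 to view
```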
