
wpmed92 / Backpropaganda

License: MIT
A simple JavaScript neural network framework.

Programming Languages

javascript
184084 projects - #8 most used programming language

Projects that are alternatives of or similar to Backpropaganda

Super Slowmo
An attempt at a PyTorch implementation of "Super SloMo: High Quality Estimation of Multiple Intermediate Frames for Video Interpolation"
Stars: ✭ 73 (-15.12%)
Mutual labels:  neural-networks
Transfer Learning Conv Ai
🦄 State-of-the-Art Conversational AI with Transfer Learning
Stars: ✭ 1,217 (+1315.12%)
Mutual labels:  neural-networks
Dltk
Deep Learning Toolkit for Medical Image Analysis
Stars: ✭ 1,249 (+1352.33%)
Mutual labels:  neural-networks
Mlalgorithms
Minimal and clean example implementations of machine learning algorithms
Stars: ✭ 8,733 (+10054.65%)
Mutual labels:  neural-networks
Abigsurvey
A collection of 500+ survey papers on Natural Language Processing (NLP) and Machine Learning (ML)
Stars: ✭ 1,203 (+1298.84%)
Mutual labels:  neural-networks
Tensorflow seq2seq chatbot
Stars: ✭ 81 (-5.81%)
Mutual labels:  neural-networks
Strike With A Pose
A simple GUI tool for generating adversarial poses of objects.
Stars: ✭ 70 (-18.6%)
Mutual labels:  neural-networks
Pumas.jl
Pharmaceutical Modeling and Simulation for Nonlinear Mixed Effects (NLME), Quantitative Systems Pharmacology (QSP), and Physiologically-Based Pharmacokinetic (PBPK) models, mixed with machine learning
Stars: ✭ 84 (-2.33%)
Mutual labels:  neural-networks
Wav2letter
Speech recognition model based on a FAIR research paper, built using PyTorch.
Stars: ✭ 78 (-9.3%)
Mutual labels:  neural-networks
Neural Networks
A brief introduction to Python for neural networks
Stars: ✭ 82 (-4.65%)
Mutual labels:  neural-networks
Mit Deep Learning
Tutorials, assignments, and competitions for MIT Deep Learning related courses.
Stars: ✭ 8,912 (+10262.79%)
Mutual labels:  neural-networks
Math And Ml Notes
Books, papers, and links to the latest research in ML/AI
Stars: ✭ 76 (-11.63%)
Mutual labels:  neural-networks
Attend infer repeat
A TensorFlow implementation of Attend, Infer, Repeat
Stars: ✭ 82 (-4.65%)
Mutual labels:  neural-networks
Kaggle Rsna
Deep Learning for Automatic Pneumonia Detection, RSNA challenge
Stars: ✭ 74 (-13.95%)
Mutual labels:  neural-networks
Sonnet
TensorFlow-based neural network library
Stars: ✭ 9,129 (+10515.12%)
Mutual labels:  neural-networks
Componentarrays.jl
Arrays with arbitrarily nested named components.
Stars: ✭ 72 (-16.28%)
Mutual labels:  neural-networks
Deepsmiles
DeepSMILES - A variant of SMILES for use in machine-learning
Stars: ✭ 80 (-6.98%)
Mutual labels:  neural-networks
Knet.jl
Koç University deep learning framework.
Stars: ✭ 1,260 (+1365.12%)
Mutual labels:  neural-networks
Variational Capsule Routing
Official Pytorch code for (AAAI 2020) paper "Capsule Routing via Variational Bayes", https://arxiv.org/pdf/1905.11455.pdf
Stars: ✭ 84 (-2.33%)
Mutual labels:  neural-networks
Discogan Tensorflow
An implementation of DiscoGAN in TensorFlow
Stars: ✭ 82 (-4.65%)
Mutual labels:  neural-networks


backpropaganda

backpropaganda is a simple JavaScript neural network framework. You can build multi-layer feed-forward neural networks with it.

Table of Contents

Installation
Solving the XOR problem
Recognizing handwritten digits
Weight visualization
Contribution

Installation

Clone this repo:

git clone https://github.com/wpmed92/backpropaganda.git

Then use the library and experiment.

Solving the XOR problem

To run the example, type the following in the terminal:

node src/multi-layer-xor-train.js

Code walkthrough:

Require the libraries.

var Layer = require('./lib/layer');
var Network = require('./lib/network');

Create a network instance.

let network = new Network();

Add layers to the network. XOR is not linearly separable, so the network needs a hidden layer in addition to its input and output layers.

network.addLayer(new Layer(2)); // input layer
network.addLayer(new Layer(2)); // hidden layer
network.addLayer(new Layer(1)); // output layer

Define the training dataset: the XOR truth table, where the output is 1 exactly when the two inputs differ.

var set = [
    {
        input: [1, 0],
        output: [1]
    },
    {
        input: [0, 1],
        output: [1]
    },
    {
        input: [1, 1],
        output: [0]
    },
    {
        input: [0, 0],
        output: [0]
    },
];

Initialize the network weights and biases.

network.setup();

Train the network. Note that miniBatchSize equals TRAINING_SIZE here, so each iteration performs one full-batch gradient step over all four examples.

var TRAINING_SIZE = 4;
var trainIterations = 10000;
var learningRate = 10;
var miniBatchSize = 4;

network.train(set, TRAINING_SIZE, trainIterations, learningRate, miniBatchSize);

Evaluate how effective your training was.

for (let i = 0; i < TRAINING_SIZE; i++) {
    network.loadInput(set[i].input);
    let test = network.activate(); 
    console.log("Expected: " + set[i].output + ", activation: " + test);
}

Save the network if you want. network.save() will create a "nets" folder (gitignored by default) in the repo, and save your networks there in JSON format.

network.save("xor");

Recognizing handwritten digits

For this example I used a JavaScript version of the MNIST dataset (the mnist npm package). The network is composed of 3 layers, with dimensions of 784, 30 and 10 respectively. Feel free to play around with other architectures, and see if they perform better or worse than this. To run the example, type the following in the terminal:

node src/multi-layer-mnist-train.js

Code walkthrough:

Require the libraries.

var Layer = require('./lib/layer');
var Network = require('./lib/network');
var util = require('./lib/util');
var mnist = require('mnist');

Create a network instance.

let network = new Network();

Add layers to the network.

network.addLayer(new Layer(784)); // input layer
network.addLayer(new Layer(30)); // hidden layer
network.addLayer(new Layer(10)); // output layer

Initialize the network weights and biases.

network.setup();

Train the network. Try increasing or decreasing trainIterations, learningRate, and miniBatchSize, and see how they change the training process. All the training logic is inside network.train; a conceptual sketch of what such a training loop does follows the code below.

var TRAINING_SIZE = 8000;
var TEST_SIZE = 300;
var trainIterations = 10;
var learningRate = 5;
var miniBatchSize = 10;
var set = mnist.set(TRAINING_SIZE, TEST_SIZE);
network.train(set.training, TRAINING_SIZE, trainIterations, learningRate, miniBatchSize);
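
For intuition, here is a self-contained sketch of what a mini-batch gradient-descent loop generally does. It is illustrative only: the function names and the weights-as-flat-array representation are assumptions, not backpropaganda's actual internals.

// Illustrative mini-batch SGD, independent of backpropaganda's internals.
// `gradient(weights, example)` is an assumed callback returning dLoss/dWeights
// for a single training example.
function sgdTrain(weights, set, iterations, learningRate, miniBatchSize, gradient) {
    for (let iter = 0; iter < iterations; iter++) {
        for (let start = 0; start < set.length; start += miniBatchSize) {
            let batch = set.slice(start, start + miniBatchSize);
            // Average the per-example gradients over the mini-batch.
            let grad = new Array(weights.length).fill(0);
            for (let example of batch) {
                let g = gradient(weights, example);
                for (let j = 0; j < weights.length; j++) {
                    grad[j] += g[j] / batch.length;
                }
            }
            // Step every weight against the averaged gradient.
            for (let j = 0; j < weights.length; j++) {
                weights[j] -= learningRate * grad[j];
            }
        }
    }
    return weights;
}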

Evaluate how effective your training was on both the training dataset and a test dataset the network hasn't previously seen.

let testCorrect = 0;
let trainingCorrect = 0;

//Training
for (let i = 0; i < TRAINING_SIZE; i++) {
    network.loadInput(set.training[i].input);
    let test = network.activate();

    if (util.argMax(test) == util.argMax(set.training[i].output)) {
        trainingCorrect++;
    }
}

//Test
for (let i = 0; i < TEST_SIZE; i++) {
    network.loadInput(set.test[i].input);
    let test = network.activate();

    if (util.argMax(test) == util.argMax(set.test[i].output)) {
        testCorrect++;
    }
}
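
Both loops rely on util.argMax to turn the 10-dimensional output activation into a predicted digit, matching it against the one-hot label. A typical argMax looks like this (a sketch; the library's own version lives in its util module and may differ):

// Index of the largest element in an array (sketch, not necessarily the library's code).
function argMax(arr) {
    let best = 0;
    for (let i = 1; i < arr.length; i++) {
        if (arr[i] > arr[best]) {
            best = i;
        }
    }
    return best;
}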

Optional: save the network.

network.save("mnist-net-test");

Print out some stats about the training and evaluation.

console.log("---Net report---")
console.log("-dataset: MNIST");
console.log("-trainIterations: " + trainIterations);
console.log("-learningRate: " + learningRate);
console.log("-training data size: " + TRAINING_SIZE);
console.log("-test data size: " + TEST_SIZE);
console.log("------------------");
console.log("-Training accuracy: " + trainingCorrect + "/" + TRAINING_SIZE + ", " + trainingCorrect/TRAINING_SIZE*100 + "%");
console.log("-Test accuracy: " + testCorrect + "/" + TEST_SIZE + ", " + testCorrect/TEST_SIZE*100 + "%");

Weight visualization

There is an experimental weight visualizer, which saves a given weight matrix as an image.

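// Note: WeightVisualizer and VISUALIZER_MODE come from the library's visualizer
// module; require them like the other lib modules (exact path may differ).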
let weightVisualizer = new WeightVisualizer(weights, VISUALIZER_MODE.RAINBOW);
weightVisualizer.generateImage(id);
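
The core idea behind such a visualizer is to normalize each weight into a displayable intensity range. A minimal grayscale sketch of that idea (illustrative only; the RAINBOW mode above maps values to colors instead):

// Sketch: map a 2D weight matrix to 0-255 grayscale intensities.
function weightsToPixels(weights) {
    let flat = weights.flat();
    let min = Math.min(...flat);
    let max = Math.max(...flat);
    // Guard against a constant matrix to avoid dividing by zero.
    let range = (max - min) || 1;
    return weights.map(row =>
        row.map(w => Math.round((w - min) / range * 255))
    );
}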

Here is an example GIF created from weight images saved during training. See how the trained digit (a "1") gradually takes shape.

Weight evolution

Contribution

Feel free to add fixes, improvements or new features to the codebase. Here are some features I'd like to add:

  • More activation functions. Currently only sigmoid is supported (see the sketch after this list).
  • More cost functions. Currently Mean Squared Error is supported.
  • Draw things in the CLI: learning curves, stats.
  • Use backpropaganda as a CLI tool
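
For reference, here is the sigmoid activation the library currently uses together with its derivative, which is the piece backpropagation needs; any new activation function would have to supply the same pair. This is a sketch of the math, not necessarily the library's exact code.

// Sigmoid squashes any real input into (0, 1).
function sigmoid(x) {
    return 1 / (1 + Math.exp(-x));
}

// Its derivative can be expressed via the activation itself:
// s'(x) = s(x) * (1 - s(x)); backpropagation multiplies error
// terms by this during the backward pass.
function sigmoidPrime(x) {
    let s = sigmoid(x);
    return s * (1 - s);
}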