
patrikeh / Go Deep

License: MIT
Artificial Neural Network

Programming Languages

go: 31211 projects - #10 most used programming language
golang: 3204 projects

Projects that are alternatives to or similar to Go Deep

Minimalistic-Multiple-Layer-Neural-Network-from-Scratch-in-Python
Minimalistic Multiple Layer Neural Network from Scratch in Python.
Stars: ✭ 24 (-92.08%)
Mutual labels:  regression, classification, backpropagation
Python-Machine-Learning-Fundamentals
D-Lab's 6-hour introduction to machine learning in Python. Learn how to perform classification, regression, clustering, and model selection using scikit-learn and TPOT.
Stars: ✭ 46 (-84.82%)
Mutual labels:  regression, classification
Predictive-Maintenance-of-Aircraft-Engine
In this project I aim to apply various predictive maintenance techniques to accurately predict the impending failure of an aircraft turbofan engine.
Stars: ✭ 48 (-84.16%)
Mutual labels:  regression, classification
Pycaret
An open-source, low-code machine learning library in Python
Stars: ✭ 4,594 (+1416.17%)
Mutual labels:  regression, classification
onelearn
Online machine learning methods
Stars: ✭ 14 (-95.38%)
Mutual labels:  regression, classification
InstantDL
InstantDL: An easy and convenient deep learning pipeline for image segmentation and classification
Stars: ✭ 33 (-89.11%)
Mutual labels:  regression, classification
projection-pursuit
An implementation of multivariate projection pursuit regression and univariate classification
Stars: ✭ 24 (-92.08%)
Mutual labels:  regression, classification
R-Machine-Learning
D-Lab's 6-hour introduction to machine learning in R. Learn the fundamentals of machine learning, regression, and classification using tidymodels in R.
Stars: ✭ 27 (-91.09%)
Mutual labels:  regression, classification
wymlp
Tiny, fast, portable real-time deep neural network for regression and classification within 50 LOC.
Stars: ✭ 36 (-88.12%)
Mutual labels:  regression, classification
data-science-notes
Open-source project hosted at https://makeuseofdata.com to crowdsource a robust collection of notes related to data science (math, visualization, modeling, etc)
Stars: ✭ 52 (-82.84%)
Mutual labels:  regression, classification
Fuku Ml
Simple machine learning library / a simple and easy-to-use machine learning toolkit
Stars: ✭ 280 (-7.59%)
Mutual labels:  classification, regression
machine learning from scratch matlab python
Vectorized Machine Learning in Python 🐍 From Scratch
Stars: ✭ 28 (-90.76%)
Mutual labels:  regression, classification
ugtm
ugtm: a Python package for Generative Topographic Mapping
Stars: ✭ 34 (-88.78%)
Mutual labels:  regression, classification
Machine-Learning-Specialization
Project work and assignments for the Machine Learning specialization course on Coursera by the University of Washington.
Stars: ✭ 27 (-91.09%)
Mutual labels:  regression, classification
Synthetic-data-gen
Various methods for generating synthetic data for data science and ML
Stars: ✭ 57 (-81.19%)
Mutual labels:  regression, classification
stg
Python/R library for feature selection in neural nets. ("Feature selection using Stochastic Gates", ICML 2020)
Stars: ✭ 47 (-84.49%)
Mutual labels:  regression, classification
Deepfashion
Apparel detection using deep learning
Stars: ✭ 223 (-26.4%)
Mutual labels:  classification, regression
Orange3
🍊 📊 💡 Orange: Interactive data analysis
Stars: ✭ 3,152 (+940.26%)
Mutual labels:  classification, regression
Python-Machine-Learning
Python Machine Learning Algorithms
Stars: ✭ 80 (-73.6%)
Mutual labels:  regression, classification
pywedge
Makes interactive chart widgets, cleans raw data, runs baseline models, and provides interactive hyperparameter tuning & tracking.
Stars: ✭ 49 (-83.83%)
Mutual labels:  regression, classification

go-deep


A feed-forward neural network implementation with backpropagation. Currently supports:

  • Activation functions: sigmoid, hyperbolic tangent (tanh), ReLU, linear
  • Solvers: SGD, SGD with momentum/Nesterov, Adam
  • Output modes: regression, multi-class, multi-label, binary
  • Parallel batch training
  • Bias nodes

Networks are modeled as a set of neurons connected through synapses. No GPU computation is performed, so don't use this for large-scale applications.

Todo:

  • Dropout
  • Batch normalization

Install

go get -u github.com/patrikeh/go-deep

Usage

Import the go-deep package:

import (
	"fmt"
	deep "github.com/patrikeh/go-deep"
	"github.com/patrikeh/go-deep/training"
)

Define some data...

var data = training.Examples{
	{[]float64{2.7810836, 2.550537003}, []float64{0}},
	{[]float64{1.465489372, 2.362125076}, []float64{0}},
	{[]float64{3.396561688, 4.400293529}, []float64{0}},
	{[]float64{1.38807019, 1.850220317}, []float64{0}},
	{[]float64{7.627531214, 2.759262235}, []float64{1}},
	{[]float64{5.332441248, 2.088626775}, []float64{1}},
	{[]float64{6.922596716, 1.77106367}, []float64{1}},
	{[]float64{8.675418651, -0.242068655}, []float64{1}},
}
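Each entry pairs an input vector with its expected response; here the response is a single 0/1 label, matching the binary mode configured below.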

Create a network with two hidden layers of size 2 and 2 respectively:

n := deep.NewNeural(&deep.Config{
	/* Input dimensionality */
	Inputs: 2,
	/* Two hidden layers consisting of two neurons each, and a single output */
	Layout: []int{2, 2, 1},
	/* Activation functions: Sigmoid, Tanh, ReLU, Linear */
	Activation: deep.ActivationSigmoid,
	/* Determines output layer activation & loss function: 
	ModeRegression: linear outputs with MSE loss
	ModeMultiClass: softmax output with Cross Entropy loss
	ModeMultiLabel: sigmoid output with Cross Entropy loss
	ModeBinary: sigmoid output with binary CE loss */
	Mode: deep.ModeBinary,
	/* Weight initializers: {deep.NewNormal(μ, σ), deep.NewUniform(μ, σ)} */
	Weight: deep.NewNormal(1.0, 0.0),
	/* Apply bias */
	Bias: true,
})
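Note that Activation configures the hidden layers, while Mode (per the comment above) fixes the output layer's activation and the loss function.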

Train:

// params: learning rate, momentum, alpha decay, nesterov
optimizer := training.NewSGD(0.05, 0.1, 1e-6, true)
// params: optimizer, verbosity (print stats at every 50th iteration)
trainer := training.NewTrainer(optimizer, 50)

training, heldout := data.Split(0.5)
trainer.Train(n, training, heldout, 1000) // training, validation, iterations

resulting in:

Epochs        Elapsed       Error         
---           ---           ---           
5             12.938µs      0.36438       
10            125.691µs     0.02261       
15            177.194µs     0.00404       
...     
1000          10.703839ms   0.00000       

Finally, make some predictions:

fmt.Println(data[0].Input, "=>", n.Predict(data[0].Input))
fmt.Println(data[5].Input, "=>", n.Predict(data[5].Input))
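In ModeBinary the single output is a sigmoid value in (0, 1) and can be read as the probability of class 1. A minimal sketch for turning it into a hard label (the 0.5 cutoff is a conventional choice, not something the library prescribes):

// Threshold the sigmoid output at 0.5 to obtain a hard 0/1 class label.
out := n.Predict(data[5].Input)
label := 0
if out[0] > 0.5 {
	label = 1
}
fmt.Println(data[5].Input, "=> class", label)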

Alternatively, batch training can be performed in parallel:

// params: learning rate, beta1, beta2, epsilon
optimizer := training.NewAdam(0.001, 0.9, 0.999, 1e-8)
// params: optimizer, verbosity (print stats every n-th iteration), batch size, number of workers
trainer := training.NewBatchTrainer(optimizer, 1, 200, 4)

training, heldout := data.Split(0.75)
trainer.Train(n, training, heldout, 1000) // training, validation, iterations
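With only eight examples in the toy dataset, a batch size of 200 means each iteration effectively processes the whole training split as a single batch; the batch size and worker count only start to pay off on larger datasets.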

Examples

See training/trainer_test.go for a variety of toy examples of regression, multi-class classification, binary classification, etc.
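For instance, a multi-class setup differs from the binary example above mainly in its one-hot targets, the size of the output layer, and the Mode. A minimal sketch (the data and layer sizes are made up for illustration; ActivationTanh follows the naming of the activation constants in the config comments above):

// Three-class toy data with one-hot encoded targets (illustrative values only).
var multi = training.Examples{
	{[]float64{1.0, 0.1}, []float64{1, 0, 0}},
	{[]float64{0.1, 1.0}, []float64{0, 1, 0}},
	{[]float64{1.0, 1.0}, []float64{0, 0, 1}},
}

m := deep.NewNeural(&deep.Config{
	Inputs:     2,
	Layout:     []int{4, 3}, // one output neuron per class
	Activation: deep.ActivationTanh,
	Mode:       deep.ModeMultiClass, // softmax output with cross-entropy loss
	Weight:     deep.NewNormal(1.0, 0.0), // same initializer as the example above
	Bias:       true,
})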

See examples/ for more realistic examples:

Dataset       Topology      Epochs        Accuracy
---           ---           ---           ---
wines         [5 5]         10000         ~98%
mnist         [50]          25            ~97%