
DN2A (JavaScript)

Digital Neural Networks Architecture


About

DN2A is a set of highly decoupled JavaScript modules for Neural Networks and Artificial Intelligence development.

Each module is based on injection by configuration.

You can use a single module on its own, several modules together, or the complete set.

DN2A's main goal is to let you painlessly design, train and use both single Neural Networks and very powerful Neural Network Chains with which to implement your Artificial Intelligence solutions.

DN2A's side goals are to simplify integration, to speed up training/querying, to allow clustering, and to represent the architecture and data of each Neural Network as a (re)combinable string strain usable within genetic optimization techniques.


Features

  • Modularized components: supports development and a clear separation of concerns, with great benefits for those who want to use mixed solutions.
  • Configurable precision: helps avoid the noise deriving from operation errors and the system's default precision limits, greatly improving learning speed and performance stability.
  • Configuration checker: lets you write fewer configuration details and keeps compatibility with older versions as the project evolves.
  • StepByStep training: trains neural networks with a single iteration over the passed information.
  • StopAtGoal training: trains neural networks with a finite number of iterations over the passed information, until a specific parametric condition is reached.
  • Continuous training: trains neural networks with an unbounded number of iterations over the passed information.
  • TODO (Brain) Data normalization: simplifies the interaction with your real domain.
  • TODO (Cerebrum) Networks composition: helps to create very effective architectures of multiple neural networks able to obtain advanced behaviours, as in deep learning.
  • TODO (Cerebrum) Computation parallelization: improves the scalability of your whole system.
  • TODO (Brain) Sessions intercommunication: improves the scalability of your whole system.
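The three training modes above map to the learningMode configuration option of a network. Only "continuous" appears in the tutorials below; the "stepbystep" and "stopatgoal" identifiers here are assumptions and may differ in the actual library:

```javascript
// Sketch of how each training mode might be selected through the
// learningMode option; only "continuous" is confirmed by the tutorials
// in this README, the other two identifiers are assumed.
var trainingModeConfigurations = {
    stepByStep: { learningMode: "stepbystep" },                      // one iteration per call
    stopAtGoal: { learningMode: "stopatgoal", maximumError: 0.005 }, // iterate until the condition is met
    continuous: { learningMode: "continuous", maximumEpoch: 20000 }  // iterate up to the epoch limit
};
```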

Modules

Neuron

Module that facilitates the representation of the data structure around Neurons and holds the related common functionality.

Synapse

Module that facilitates the representation of the data structure around Synapses and holds the related common functionality.

Network

Module, available in different variations, that uses Neurons and Synapses to implement configurable and autonomous Neural Networks.

Available Network Types

  1. alpha: a standard feed-forward neural network with error back-propagation, controlled by layer dimensions, learning mode, learning rate, momentum rate, maximum allowed error and maximum number of epochs.
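As a quick reference, the parameters listed above correspond to the constructor options shown in the custom parametrization tutorial later in this README. The values here simply mirror that tutorial; they are not necessarily the library defaults:

```javascript
// Configuration options controlling an alpha network, mirroring the
// custom parametrization tutorial in this README.
var alphaConfiguration = {
    layerDimensions: [2, 4, 4, 1], // neurons per layer: input, two hidden, output
    learningMode: "continuous",    // training mode
    learningRate: 0.3,             // step size of the weight updates
    momentumRate: 0.7,             // contribution of the previous update
    maximumError: 0.005,           // maximum allowed error
    maximumEpoch: 20000            // maximum number of epochs
};
```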

Cerebrum

Module for the management of multiple Neural Networks in terms of configuration/coordination, training/querying chaining and parallel computing.

Brain

Module for the management of data normalization, integration/intercommunication with other external software and monitoring of the whole session.


Tutorials

Using in Node

To install the library through NPM:

npm install dn2a

Using in the Browser

To install the library through Bower:

bower install dn2a

Training and Querying a Single Network with default parametrization (ES5)

// Importation
var DN2A = require("dn2a");

// Instantiation
var neuralNetwork = new DN2A.NetworkAlpha();

// Training
var trainingPatterns = [
    {
        input: [0, 0],
        output: [0]
    },
    {
        input: [0, 1],
        output: [1]
    },
    {
        input: [1, 0],
        output: [1]
    },
    {
        input: [1, 1],
        output: [0]
    }
];
neuralNetwork.train(trainingPatterns);

// Querying
var inputPatterns = [
    [0, 0],
    [0, 1],
    [1, 0],
    [1, 1]
];
neuralNetwork.query(inputPatterns, function(queryingStatus) {
    inputPatterns.forEach(function(inputPattern, inputPatternIndex) {
        console.log("[" + inputPattern.join(", ") + "] => [" + queryingStatus.outputPatterns[inputPatternIndex].join(", ") + "]");
    });
});
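The training patterns used above encode the XOR truth table, a classic test case that requires at least one hidden layer to learn. A quick self-contained check that the patterns really are XOR:

```javascript
// Verify that the tutorial's training patterns match the XOR function.
var trainingPatterns = [
    { input: [0, 0], output: [0] },
    { input: [0, 1], output: [1] },
    { input: [1, 0], output: [1] },
    { input: [1, 1], output: [0] }
];
var encodesXor = trainingPatterns.every(function(pattern) {
    return pattern.output[0] === (pattern.input[0] ^ pattern.input[1]);
});
console.log(encodesXor); // → true
```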

Training and Querying a Single Network with custom parametrization (ES5)

// Importation
var DN2A = require("dn2a");

// Instantiation
// The object expected by the constructor can specify properties that describe the neural network.
// The list of valid properties, together with their accepted ranges and default values, is reported in this README file.
// The object can be omitted entirely, in which case default values are used for all properties.
var neuralNetwork = new DN2A.NetworkAlpha({
    layerDimensions: [2, 4, 4, 1],
    learningMode: "continuous",
    learningRate: 0.3,
    momentumRate: 0.7,
    maximumError: 0.005,
    maximumEpoch: 20000,
    dataRepository: {},
    neuron: {
        generator: DN2A.Neuron
    },
    synapse: {
        generator: DN2A.Synapse
    },
    numbersPrecision: 32
});

// Training
var trainingPatterns = [
    {
        input: [0, 0],
        output: [0]
    },
    {
        input: [0, 1],
        output: [1]
    },
    {
        input: [1, 0],
        output: [1]
    },
    {
        input: [1, 1],
        output: [0]
    }
];
neuralNetwork.train(trainingPatterns);

// Querying
var inputPatterns = [
    [0, 0],
    [0, 1],
    [1, 0],
    [1, 1]
];
neuralNetwork.query(inputPatterns, function(queryingStatus) {
    inputPatterns.forEach(function(inputPattern, inputPatternIndex) {
        console.log("[" + inputPattern.join(", ") + "] => [" + queryingStatus.outputPatterns[inputPatternIndex].join(", ") + "]");
    });
});

Training and Querying a Single Network with evolution feedback (ES5)

// Importation
var DN2A = require("dn2a");

// Instantiation
var neuralNetwork = new DN2A.NetworkAlpha();

// Training
// The object passed to the callback function contains information about the training process.
// The list of valid properties, together with their accepted ranges and default values, is reported in this README file.
var trainingPatterns = [
    {
        input: [0, 0],
        output: [0]
    },
    {
        input: [0, 1],
        output: [1]
    },
    {
        input: [1, 0],
        output: [1]
    },
    {
        input: [1, 1],
        output: [0]
    }
];
neuralNetwork.train(trainingPatterns, function(trainingStatus) {
    console.log("Epoch: " + trainingStatus.elapsedEpochCounter);
});

// Querying
var inputPatterns = [
    [0, 0],
    [0, 1],
    [1, 0],
    [1, 1]
];
neuralNetwork.query(inputPatterns, function(queryingStatus) {
    inputPatterns.forEach(function(inputPattern, inputPatternIndex) {
        console.log("[" + inputPattern.join(", ") + "] => [" + queryingStatus.outputPatterns[inputPatternIndex].join(", ") + "]");
    });
});
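Since both train and query report completion through callbacks, they can be adapted to Promises with a small wrapper. The queryAsPromise helper below is a hypothetical sketch, not part of DN2A, and the stub network merely stands in for a trained NetworkAlpha instance so the example runs on its own:

```javascript
// Hypothetical helper adapting the callback-based query API to Promises.
function queryAsPromise(network, inputPatterns) {
    return new Promise(function(resolve) {
        network.query(inputPatterns, function(queryingStatus) {
            resolve(queryingStatus.outputPatterns);
        });
    });
}

// A stub standing in for a trained network, so the sketch is runnable:
var stubNetwork = {
    query: function(inputPatterns, callback) {
        callback({
            outputPatterns: inputPatterns.map(function() {
                return [0.5];
            })
        });
    }
};

queryAsPromise(stubNetwork, [[0, 0], [1, 1]]).then(function(outputPatterns) {
    console.log(outputPatterns.length); // → 2
});
```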

Training and Querying a Single Network through the Cerebrum (ES5)

// Importation
var DN2A = require("dn2a");

// Instantiation
var cerebrum = new DN2A.Cerebrum({
    minds: [
        {
            name: "firstNeuralNetwork",
            network: {
                generator: DN2A.NetworkAlpha,
                configuration: {
                    layerDimensions: [2, 4, 1],
                    learningMode: "continuous",
                    learningRate: 0.3,
                    momentumRate: 0.7,
                    maximumError: 0.005,
                    maximumEpoch: 1000,
                    dataRepository: {},
                    neuron: {
                        generator: DN2A.Neuron
                    },
                    synapse: {
                        generator: DN2A.Synapse
                    },
                    numbersPrecision: 32
                }
            },
            inputsFrom: [
                "cerebrum"
            ]
        }
    ],
    outputsFrom: [
        "firstNeuralNetwork"
    ]
});

// Training
// The name expected as the third parameter by the trainMind method specifies which mind to train.
var trainingPatterns = [
    {
        input: [0, 0],
        output: [0]
    },
    {
        input: [0, 1],
        output: [1]
    },
    {
        input: [1, 0],
        output: [1]
    },
    {
        input: [1, 1],
        output: [0]
    }
];
cerebrum.trainMind(trainingPatterns, function(trainingStatus) {
    console.log("Epoch: " + trainingStatus.elapsedEpochCounter);
}, "firstNeuralNetwork");

// Querying
// The name expected as the third parameter by the queryMind method specifies which mind to query.
var inputPatterns = [
    [0, 0],
    [0, 1],
    [1, 0],
    [1, 1]
];
cerebrum.queryMind(inputPatterns, function(queryingStatus) {
    inputPatterns.forEach(function(inputPattern, inputPatternIndex) {
        console.log("[" + inputPattern.join(", ") + "] => [" + queryingStatus.outputPatterns[inputPatternIndex].join(", ") + "]");
    });
}, "firstNeuralNetwork");

Training and Querying an entire Networks Chain through the Cerebrum (ES5)

TODO

Training and Querying a Single Network through the Brain (ES5)

TODO

Training and Querying an entire Networks Chain through the Brain (ES5)

TODO

Creator

Antonio De Luca


License

MIT
