
⚠️ Notice

Tensor Studio - a more practical continuation of the ideas presented in Moniel.


Moniel: Notation for Computational Graphs

Human-friendly declarative dataflow notation for computational graphs. See video.

Demo


Pre-built packages

macOS

Moniel.dmg (77MB)


Setup for other platforms

$ git clone https://github.com/mlajtos/moniel.git
$ cd moniel
$ npm install
$ npm start

Quick Introduction

Moniel is one of many attempts at creating a notation for deep learning models that leverages graph thinking. Instead of defining a computation as a list of formulae, we define the model as a declarative dataflow graph. It is not a programming language, just a convenient notation. (Which will be executable. Wanna help?)
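For a taste of what this means, the formula y = ReLU(W ⋅ x) becomes a short dataflow chain. (This is an illustrative sketch; the lists, identifiers, and node types it uses are all explained below.)

// y = ReLU(W ⋅ x) expressed as a dataflow graph
[x:Tensor, W:Tensor] -> DotProduct -> ReLU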

Note: Proper syntax highlighting is not available here on GitHub. Use the application for the best experience.

Let's start with nothing, i.e. comments:

// This is a line comment.

/*
	This is a block
	comment.
*/

A node is created by stating its type:

Sigmoid

You don't have to write the full name of a type. Use an acronym that suits you! These are all equivalent:

LocalResponseNormalization // canonical, but too long
LocRespNorm // weird, but why not?
LRN // cryptic for beginners, enough for others

Nodes are connected to other nodes with an arrow:

Sigmoid -> MaxPooling

Chains can be of any length:

LRN -> Sigm -> BatchNorm -> ReLU -> Tanh -> MP -> Conv -> BN -> ELU

Also, there can be multiple chains:

ReLU -> BN
LRN -> Conv -> MP
Sigm -> Tanh

Nodes can have identifiers:

conv:Convolution

Identifiers let you refer to nodes that are used more than once:

// inefficient declaration of matrix-matrix multiplication
matrix1:Tensor
matrix2:Tensor
mm:MatrixMultiplication

matrix1 -> mm
matrix2 -> mm

However, this can be rewritten without identifiers using a list:

[Tensor,Tensor] -> MatMul

Lists let you easily declare multiple connections at once:

// Maximum of 3 random numbers
[Random,Random,Random] -> Maximum

List-to-list connections are sometimes really handy:

// Range of 3 random numbers
[Rand,Rand,Rand] -> [Max,Min] -> Sub -> Abs

Nodes can take named attributes that modify their behavior:

Fill(shape = 10x10x10, value = 1.0)

Attribute names can also be shortened:

Ones(s=10x10x10)

Defining large graphs without proper structuring is unmanageable. Metanodes can help:

layer:{
    RandomNormal(shape=784x1000) -> weights:Variable
    weights -> dp:DotProduct -> act:ReLU
}

Tensor -> layer/dp // feed input into the DotProduct of the "layer" metanode
layer/act -> Softmax // feed output of the "layer" metanode into another node

Metanodes are more powerful when they define a proper Input-Output boundary:

layer1:{
    RandomNormal(shape=784x1000) -> weights:Variable
    [in:Input,weights] -> DotProduct -> ReLU -> out:Output
}

layer2:{
    RandomNormal(shape=1000x10) -> weights:Variable
    [in:Input,weights] -> DotProduct -> ReLU -> out:Output
}

// connect metanodes directly
layer1 -> layer2

Alternatively, you can use inline metanodes:

In -> layer:{[In,Tensor] -> Conv -> Out} -> Out

Or you can leave the metanode unnamed:

In -> {[In,Tensor] -> Conv -> Out} -> Out

If several metanodes share an identical structure, we can define a reusable metanode and use it like a normal node:

+ReusableLayer(shape = 1x1){
    RandN(shape = shape) -> w:Var
    [in:In,w] -> DP -> RLU -> out:Out
}

RL(s = 784x1000) -> RL(s = 1000x10)
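
Putting it all together, a hypothetical end-to-end sketch (the input tensor and final Softmax are illustrative additions, not from the examples above) feeds data through two reusable layers into a classifier head:

// Illustrative sketch reusing the metanode defined above
input:Tensor
input -> RL(s = 784x1000) -> RL(s = 1000x10) -> Softmax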

Similar projects and Inspiration

  • Piotr Migdał: Simple Diagrams of Convoluted Neural Networks – a great overview of approaches to visualizing ML architectures
  • Lobe (video) – "Build, train, and ship custom deep learning models using a simple visual interface."
  • Serrano – "A graph computation framework with Accelerate and Metal support."
  • Subgraphs – "Subgraphs is a visual IDE for developing computational graphs."
  • 💀Machine – "Machine is a machine learning IDE."
  • PyTorch – "Tensors and Dynamic neural networks in Python with strong GPU acceleration."
  • Sonnet – "Sonnet is a library built on top of TensorFlow for building complex neural networks."
  • TensorGraph – "TensorGraph is a framework for building any imaginable models based on TensorFlow"
  • nngraph – "graphical computation for nn library in Torch"
  • DNNGraph – "a deep neural network model generation DSL in Haskell"
  • NNVM – "Intermediate Computational Graph Representation for Deep Learning Systems"
  • DeepRosetta – "A universal deep learning model converter"
  • TensorBuilder – "a functional fluent immutable API based on the Builder Pattern"
  • Keras – "minimalist, highly modular neural networks library"
  • PrettyTensor – "a high level builder API"
  • TF-Slim – "a lightweight library for defining, training and evaluating models"
  • TFLearn – "modular and transparent deep learning library"
  • Caffe – "deep learning framework made with expression, speed, and modularity in mind"