
cloudkj / Layer

License: MIT
Neural network inference the Unix way


Projects that are alternatives of or similar to Layer

Livianet
This repository contains the code of LiviaNET, a 3D fully convolutional neural network that was employed in our work: "3D fully convolutional networks for subcortical segmentation in MRI: A large-scale study"
Stars: ✭ 143 (-73.47%)
Mutual labels:  convolutional-neural-networks, neural-networks
Deep Learning With Python
Deep learning codes and projects using Python
Stars: ✭ 195 (-63.82%)
Mutual labels:  convolutional-neural-networks, neural-networks
Iresnet
Improved Residual Networks (https://arxiv.org/pdf/2004.04989.pdf)
Stars: ✭ 163 (-69.76%)
Mutual labels:  convolutional-neural-networks, neural-networks
Jsnet
Javascript/WebAssembly deep learning library for MLPs and convolutional neural networks
Stars: ✭ 126 (-76.62%)
Mutual labels:  convolutional-neural-networks, neural-networks
Komputation
Komputation is a neural network framework for the Java Virtual Machine written in Kotlin and CUDA C.
Stars: ✭ 295 (-45.27%)
Mutual labels:  convolutional-neural-networks, neural-networks
Chainer Cifar10
Various CNN models for CIFAR10 with Chainer
Stars: ✭ 134 (-75.14%)
Mutual labels:  convolutional-neural-networks, neural-networks
Coursera Deep Learning Specialization
Notes, programming assignments and quizzes from all courses within the Coursera Deep Learning specialization offered by deeplearning.ai: (i) Neural Networks and Deep Learning; (ii) Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization; (iii) Structuring Machine Learning Projects; (iv) Convolutional Neural Networks; (v) Sequence Models
Stars: ✭ 188 (-65.12%)
Mutual labels:  convolutional-neural-networks, neural-networks
Graph 2d cnn
Code and data for the paper 'Classifying Graphs as Images with Convolutional Neural Networks' (new title: 'Graph Classification with 2D Convolutional Neural Networks')
Stars: ✭ 67 (-87.57%)
Mutual labels:  convolutional-neural-networks, neural-networks
Deeplearning.ai Assignments
Stars: ✭ 268 (-50.28%)
Mutual labels:  convolutional-neural-networks, neural-networks
Netket
Machine learning algorithms for many-body quantum systems
Stars: ✭ 256 (-52.5%)
Mutual labels:  convolutional-neural-networks, neural-networks
Hyperdensenet
This repository contains the code of HyperDenseNet, a hyper-densely connected CNN to segment medical images in multi-modal image scenarios.
Stars: ✭ 124 (-76.99%)
Mutual labels:  convolutional-neural-networks, neural-networks
Artificio
Deep Learning Computer Vision Algorithms for Real-World Use
Stars: ✭ 326 (-39.52%)
Mutual labels:  convolutional-neural-networks, neural-networks
Sigmoidal ai
Tutorials on Python, Data Science, Machine Learning, and Deep Learning - Sigmoidal
Stars: ✭ 103 (-80.89%)
Mutual labels:  convolutional-neural-networks, neural-networks
Bender
Easily craft fast Neural Networks on iOS! Use TensorFlow models. Metal under the hood.
Stars: ✭ 1,728 (+220.59%)
Mutual labels:  convolutional-neural-networks, neural-networks
Wav2letter
Speech recognition model based on the FAIR research paper, built using PyTorch.
Stars: ✭ 78 (-85.53%)
Mutual labels:  convolutional-neural-networks, neural-networks
Anime4k
A High-Quality Real Time Upscaler for Anime Video
Stars: ✭ 14,083 (+2512.8%)
Mutual labels:  convolutional-neural-networks, neural-networks
Convisualize nb
Visualisations for Convolutional Neural Networks in Pytorch
Stars: ✭ 57 (-89.42%)
Mutual labels:  convolutional-neural-networks, neural-networks
Cyclegan Qp
Official PyTorch implementation of "Artist Style Transfer Via Quadratic Potential"
Stars: ✭ 59 (-89.05%)
Mutual labels:  convolutional-neural-networks, neural-networks
Cnn face detection
Implementation based on the paper Li et al., "A Convolutional Neural Network Cascade for Face Detection," CVPR 2015
Stars: ✭ 251 (-53.43%)
Mutual labels:  convolutional-neural-networks, neural-networks
Cs231
Complete Assignments for CS231n: Convolutional Neural Networks for Visual Recognition
Stars: ✭ 317 (-41.19%)
Mutual labels:  convolutional-neural-networks, neural-networks

layer - neural network inference from the command line

layer is a program for doing neural network inference the Unix way. Many modern neural network operations can be represented as sequential, unidirectional streams of data processed by pipelines of filters. The computations at each layer in these neural networks are equivalent to an invocation of the layer program, and multiple invocations can be chained together to represent the entirety of such networks.

For example, performing inference on a neural network with two fully-connected layers might look something like this:

$ cat input | layer full -w w.1 --input-shape=2 -f tanh | layer full -w w.2 --input-shape=3 -f sigmoid

layer applies the Unix philosophy to neural network inference. Each type of neural network layer is a distinct subcommand. Simple text streams of delimited numeric values serve as the interface between the layers of a network. Each invocation of layer does one thing: it feeds the numeric input values forward through an instantiation of a neural network layer, then emits the resulting numeric output values.
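This filter model can be sketched in a few lines: each process reads delimited rows from stdin, applies one layer's computation, and writes rows to stdout, so processes compose with ordinary pipes. This is an illustrative sketch, not the actual CHICKEN Scheme implementation; the function names (`run_filter`, `tanh_layer`) are hypothetical.

```python
import sys
import math

def run_filter(transform, in_stream=sys.stdin, out_stream=sys.stdout):
    """Read comma-delimited rows, apply one layer's computation, emit rows."""
    for line in in_stream:
        xs = [float(v) for v in line.strip().split(",") if v]
        ys = transform(xs)
        out_stream.write(",".join(repr(y) for y in ys) + "\n")

# A hypothetical elementwise "activation layer" used as the transform
def tanh_layer(xs):
    return [math.tanh(x) for x in xs]

if __name__ == "__main__":
    run_filter(tanh_layer)
```

Because the interface is just a text stream, such a filter could sit anywhere in a pipeline alongside standard tools like cut, paste, or awk.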

Usage

Example: a convolutional neural network for CIFAR-10.

$ cat cifar10_x.csv \
    | layer convolutional -w w0.csv -b b0.csv --input-shape=32,32,3  --filter-shape=3,3 --num-filters=32 -f relu \
    | layer convolutional -w w1.csv -b b1.csv --input-shape=30,30,32 --filter-shape=3,3 --num-filters=32 -f relu \
    | layer pooling --input-shape=28,28,32 --filter-shape=2,2 --stride=2 -f max
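The --input-shape values in this pipeline follow from simple shape arithmetic: the shapes shown are consistent with "valid" convolutions (no padding, stride 1), where a 3x3 filter shrinks each spatial dimension by 2, and 2x2 max pooling with stride 2 halves it. A quick sketch of that arithmetic (the formulas assume valid convolutions, which matches the shapes in the example):

```python
def conv_out(n, f, stride=1):
    # "valid" convolution: no padding
    return (n - f) // stride + 1

def pool_out(n, f, stride):
    return (n - f) // stride + 1

side = 32
side = conv_out(side, 3)      # 32 -> 30, matches --input-shape=30,30,32
side = conv_out(side, 3)      # 30 -> 28, matches --input-shape=28,28,32
side = pool_out(side, 2, 2)   # 28 -> 14 after 2x2, stride-2 pooling
print(side)
```

Knowing this arithmetic is what lets you write the correct --input-shape for each downstream stage of the pipeline.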

Example: a multi-layer perceptron for XOR.

$ # Fully connected layer with three neurons
$ echo "-2.35546875,-2.38671875,3.63671875,3.521484375,-2.255859375,-2.732421875" > layer1.weights
$ echo "0.7958984375,0.291259765625,1.099609375" > layer1.biases

$ # Fully connected layer with one neuron
$ echo "-5.0625,-3.515625,-5.0625" > layer2.weights
$ echo "1.74609375" > layer2.biases

$ # Compute XOR for all possible binary inputs
$ echo -e "0,0\n0,1\n1,0\n1,1" \
    | layer full -w layer1.weights -b layer1.biases --input-shape=2 -f tanh \
    | layer full -w layer2.weights -b layer2.biases --input-shape=3 -f sigmoid
0.00129012749948779
0.99147053740106
0.991243357927591
0.0111237568184365
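These outputs can be reproduced offline. The sketch below re-implements the two dense layers in plain Python, assuming the weights files are laid out row-major with one row per input dimension (an assumption on my part, but one that reproduces the printed values):

```python
import math

def dense(x, W, b, act):
    """y_j = act(b_j + sum_i x_i * W[i][j]); W has one row per input."""
    return [act(b[j] + sum(x[i] * W[i][j] for i in range(len(x))))
            for j in range(len(b))]

sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))

# layer1.weights / layer1.biases, reshaped to 2 inputs x 3 neurons
w1 = [[-2.35546875, -2.38671875, 3.63671875],
      [3.521484375, -2.255859375, -2.732421875]]
b1 = [0.7958984375, 0.291259765625, 1.099609375]
# layer2.weights / layer2.biases: 3 inputs x 1 neuron
w2 = [[-5.0625], [-3.515625], [-5.0625]]
b2 = [1.74609375]

outputs = []
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    h = dense(x, w1, b1, math.tanh)      # layer full ... -f tanh
    outputs.append(dense(h, w2, b2, sigmoid)[0])  # layer full ... -f sigmoid
print(outputs)
```

Rounding each output gives 0, 1, 1, 0, i.e. XOR of the two inputs.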

Installation

Requirements: BLAS 3.6.0+

  1. Download a release
  2. Install BLAS 3.6.0+
  • On Debian-based systems: apt-get install -y libblas3
  • On RPM-based systems: yum install -y blas
  • On macOS 10.3+, BLAS is pre-installed as part of the Accelerate framework
  3. Unzip the release and run [sudo] ./install.sh, or manually relocate the binaries to the path of your choice.

About

layer is currently a proof of concept and supports a limited set of layer types: only feed-forward layers that can be modeled as sequential, unidirectional pipelines.

Input values, weights and biases for parameterized layers, and output values are all read and written in row-major order, based on the shape parameters specified for each layer.
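For example, a layer with --input-shape=2,3 would read each input as six values in row-major order, i.e. the first row of the 2x3 grid first. A sketch of that flattening (the shape and values here are made up for illustration):

```python
# Flatten a 2x3 grid in row-major order, the order in which layer
# reads and writes delimited values for --input-shape=2,3
grid = [[1, 2, 3],
        [4, 5, 6]]
flat = [v for row in grid for v in row]   # rows first, then columns
line = ",".join(str(v) for v in flat)
print(line)  # 1,2,3,4,5,6
```

The same convention applies to weights and biases for parameterized layers, so files produced by other tools only need to be flattened in this order to be usable.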

layer is implemented in CHICKEN Scheme.

License

Copyright © 2018-2019
