
liangfu / dnn

License: MIT License
A light-weight deep learning framework implemented in C++.

Programming Languages

  • C++
  • C
  • Objective-C++
  • Python
  • CMake
  • Objective-C

Projects that are alternatives to or similar to dnn

openvpn-portable
🚀 OpenVPN portable for Windows
Stars: ✭ 59 (+391.67%)
Mutual labels: portable
lastfm
Portable .NET library for Last.fm
Stars: ✭ 87 (+625%)
Mutual labels: portable
java-portable
Install a portable version of the JDK (and bundled JRE) and run it anywhere without admin rights on Windows.
Stars: ✭ 22 (+83.33%)
Mutual labels: portable
FullProxy
Bind- and reverse-connection based, SOCKS5, HTTP and port-forwarding portable proxy
Stars: ✭ 22 (+83.33%)
Mutual labels: portable
dot-templater
A small, portable Rust program for templating dotfiles across multiple systems.
Stars: ✭ 41 (+241.67%)
Mutual labels: portable
videoMultiGAN
End-to-end learning for video generation from text
Stars: ✭ 53 (+341.67%)
Mutual labels: dnn
OpenCvSharpDNN
Implementation of YoloV3 and Caffe in OpenCvSharp
Stars: ✭ 20 (+66.67%)
Mutual labels: dnn
dbeaver-portable
🚀 DBeaver portable for Windows
Stars: ✭ 23 (+91.67%)
Mutual labels: portable
ttslearn
ttslearn: Library for Pythonで学ぶ音声合成 (Text-to-speech with Python)
Stars: ✭ 158 (+1216.67%)
Mutual labels: dnn
wymlp
Tiny, fast, portable real-time deep neural network for regression and classification within 50 LOC.
Stars: ✭ 36 (+200%)
Mutual labels: portable
pix2pix-tensorflow
A minimal TensorFlow implementation of pix2pix (Image-to-Image Translation with Conditional Adversarial Nets - https://phillipi.github.io/pix2pix/).
Stars: ✭ 22 (+83.33%)
Mutual labels: dnn
systolic-array-dataflow-optimizer
A general framework for optimizing DNN dataflow on systolic arrays
Stars: ✭ 21 (+75%)
Mutual labels: dnn
ShiftCNN
A script to convert floating-point CNN models into the generalized low-precision ShiftCNN representation
Stars: ✭ 54 (+350%)
Mutual labels: dnn
NBrightBuy
NBrightStore - E-Commerce for DNN (NBSv3)
Stars: ✭ 21 (+75%)
Mutual labels: dnn
streamlink-portable
A script to build a portable version of Streamlink for Windows
Stars: ✭ 70 (+483.33%)
Mutual labels: portable
cadru
A Microsoft .NET Framework toolkit
Stars: ✭ 58 (+383.33%)
Mutual labels: portable
gemmini
Berkeley's Spatial Array Generator
Stars: ✭ 290 (+2316.67%)
Mutual labels: dnn
speech-to-text
Mixed-lingual speech recognition system; hybrid (GMM + NNet) model; Kaldi + Keras
Stars: ✭ 61 (+408.33%)
Mutual labels: dnn
cosmonim
A simple example showing how Cosmopolitan Libc can be used with Nim
Stars: ✭ 90 (+650%)
Mutual labels: portable
nextcloud-portable
🚀 Nextcloud portable for Windows
Stars: ✭ 21 (+75%)
Mutual labels: portable

Deep Neural Nets


Introduction

The Deep Neural Nets (DNN) library is a deep learning framework designed to be small in size, computationally efficient and portable.

We started the project as a fork of the popular OpenCV library, removing components that are not closely related to the deep learning framework. Compared to Caffe and many other implementations, DNN is relatively independent of third-party libraries (no Boost or database systems need to be installed before crafting your own network models), so it can be ported more easily to mobile systems such as iOS, Android, and the Raspberry Pi. More importantly, DNN is powerful: it supports both convolutional networks and recurrent networks, as well as combinations of the two.

Available Modules

The following features have been implemented:

  • Mini-batch based learning, with OpenMP support
  • YAML based network definition
  • Gradient checking for all implemented layers

The following modules are implemented in the current version:

Module Name       Description
InputLayer        stores the original input images
ConvolutionLayer  performs 2-D convolution on images
MaxPoolingLayer   performs the max-pooling operation
DenseLayer        fully connected layer (optionally with activation and dropout)
SimpleRNNLayer    processes sequence data
MergeLayer        combines the outputs of multiple layers

More modules will be made available soon!

Options To Define A Specific Layer

Layer Type        Attributes
Input             name, n_input_planes, input_height, input_width, seq_length
Convolution       name, visualize, n_output_planes, ksize
MaxPooling        name, visualize, ksize
SpatialTransform  name, input_layer, n_output_planes, output_height, output_width
Dense             name, input_layer (optional), visualize, n_output_planes, activation
TimeDistributed   name, n_output_planes, output_height, output_width, seq_length, time_index
SimpleRNN         name, n_output_planes, seq_length, time_index, activation
Merge             name, input_layers, visualize, n_output_planes

With the above parameters given in YAML format, one can easily define a network. For instance, a LeNet model can be defined as:

%YAML:1.0
layers:
  - {type: Input, name: input1, n_input_planes: 1, input_height: 28, input_width: 28, seq_length: 1}
  - {type: Convolution, name: conv1, visualize: 0, n_output_planes: 6, ksize: 5, stride: 1}
  - {type: MaxPooling, name: pool1, visualize: 0, ksize: 2, stride: 2}
  - {type: Convolution, name: conv2, visualize: 0, n_output_planes: 16, ksize: 5, stride: 1}
  - {type: MaxPooling, name: pool2, visualize: 0, ksize: 2, stride: 2}
  - {type: Dense, name: fc1, visualize: 0, n_output_planes: 10, activation: softmax}

Then, by running the network training program:

$ network train --solver data/mnist/lenet_solver.xml

one can start training a simple network right away. This is also how the source code and data models are tested in Travis CI (see .travis.yml in the root directory).

Compilation

CMake is required to compile the project.

From the root directory of the project:

$ cd $DNN_ROOT
$ mkdir build && cd build
$ cmake ..
$ make -j4

License

MIT

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].