hpi-xnor / Bmxnet V2

Licence: apache-2.0
BMXNet 2: An Open-Source Binary Neural Network Implementation Based on MXNet

Projects that are alternatives to or similar to Bmxnet V2

Horovod
Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.
Stars: ✭ 11,943 (+5901.51%)
Mutual labels:  mxnet
Gluon Ts
Probabilistic time series modeling in Python
Stars: ✭ 2,373 (+1092.46%)
Mutual labels:  mxnet
Mx Lsoftmax
mxnet version of Large-Margin Softmax Loss for Convolutional Neural Networks.
Stars: ✭ 175 (-12.06%)
Mutual labels:  mxnet
Xlearning
AI on Hadoop
Stars: ✭ 1,709 (+758.79%)
Mutual labels:  mxnet
Ncnn
ncnn is a high-performance neural network inference framework optimized for the mobile platform
Stars: ✭ 13,376 (+6621.61%)
Mutual labels:  mxnet
Deep Learning Containers
AWS Deep Learning Containers (DLCs) are a set of Docker images for training and serving models in TensorFlow, TensorFlow 2, PyTorch, and MXNet.
Stars: ✭ 152 (-23.62%)
Mutual labels:  mxnet
Deep Face Alignment
The MXNet Implementation of Stacked Hourglass and Stacked SAT for Robust 2D and 3D Face Alignment
Stars: ✭ 134 (-32.66%)
Mutual labels:  mxnet
Thinc
🔮 A refreshing functional take on deep learning, compatible with your favorite libraries
Stars: ✭ 2,422 (+1117.09%)
Mutual labels:  mxnet
Djl
An Engine-Agnostic Deep Learning Framework in Java
Stars: ✭ 2,262 (+1036.68%)
Mutual labels:  mxnet
Coach
Reinforcement Learning Coach by Intel AI Lab enables easy experimentation with state of the art Reinforcement Learning algorithms
Stars: ✭ 2,085 (+947.74%)
Mutual labels:  mxnet
Facerecognition
This is an implementation project for face detection and recognition. Face detection uses the MTCNN algorithm, and recognition uses the LightenedCNN algorithm.
Stars: ✭ 137 (-31.16%)
Mutual labels:  mxnet
Machine Learning Using K8s
Train and Deploy Machine Learning Models on Kubernetes using Amazon EKS
Stars: ✭ 145 (-27.14%)
Mutual labels:  mxnet
Mobulaop
A Simple & Flexible Cross Framework Operators Toolkit
Stars: ✭ 161 (-19.1%)
Mutual labels:  mxnet
Single Path One Shot Nas Mxnet
Single Path One-Shot NAS MXNet implementation with full training and searching pipeline. Support both Block and Channel Selection. Searched models better than the original paper are provided.
Stars: ✭ 136 (-31.66%)
Mutual labels:  mxnet
Imgclsmob
Sandbox for training deep learning networks
Stars: ✭ 2,405 (+1108.54%)
Mutual labels:  mxnet
Mxnet Mobilenet V2
Reproduction of MobileNetV2 using MXNet
Stars: ✭ 134 (-32.66%)
Mutual labels:  mxnet
Pyeco
python implementation of efficient convolution operators for tracking
Stars: ✭ 150 (-24.62%)
Mutual labels:  mxnet
Gluon Nlp
NLP made easy
Stars: ✭ 2,344 (+1077.89%)
Mutual labels:  mxnet
Xfer
Transfer Learning library for Deep Neural Networks.
Stars: ✭ 177 (-11.06%)
Mutual labels:  mxnet
Crnn Mxnet Chinese Text Recognition
An implementation of CRNN (CNN + LSTM + warpCTC) on MXNet for Chinese text recognition
Stars: ✭ 161 (-19.1%)
Mutual labels:  mxnet

BMXNet 2 // Hasso Plattner Institute

A fork of the deep learning framework MXNet to study and implement quantization and binarization in neural networks.

This project is based on the first version of BMXNet, but differs in that it reuses more of the original MXNet operators. The aim was to keep changes to the C++ code minimal for better maintainability with future versions of MXNet.

MXNet version

This version of BMXNet 2 is based on MXNet v1.5.1.

News

See all BMXNet changes: Changelog.

  • May 21, 2019
  • Sep 01, 2018
    • We rebuilt BMXNet to utilize the new Gluon API for better maintainability.
    • To build binary neural networks, you can use drop-in replacements of convolution and dense layers (see Usage).
    • Note that this project is still in beta and changes might be frequent.

Setup

If you only want to test the basics, you can also look at our docker setup.

We use CMake to build the project. Make sure to install all the dependencies described here. If you install CUDA 10, you will need CMake >= 3.12.2.

Adjust the settings in CMake (build type Release or Debug; configure CUDA, OpenBLAS or Atlas, OpenCV, OpenMP, etc.).

Further, we recommend Ninja as a build system for faster builds (Ubuntu: sudo apt-get install ninja-build).

git clone --recursive https://github.com/hpi-xnor/BMXNet-v2.git # remember to include the --recursive
cd BMXNet-v2
mkdir build && cd build
cmake .. -G Ninja # if any error occurs, use ccmake or cmake-gui to adjust the cmake config.
ccmake . # or GUI cmake
ninja

Build the MXNet Python binding

Step 1 Install prerequisites: Python, setuptools, pip, and NumPy.

sudo apt-get install -y python-dev python3-dev virtualenv
wget -nv https://bootstrap.pypa.io/get-pip.py
python3 get-pip.py
python2 get-pip.py

Step 1b (Optional) Create or activate a virtualenv.

Step 2 Install the MXNet Python binding.

cd <mxnet-root>/python
pip install -e .

If the MXNet Python binding still does not work, you can add the location of the library to your LD_LIBRARY_PATH, as well as the MXNet python folder to your PYTHONPATH:

$ export LD_LIBRARY_PATH=<mxnet-root>/build/Release
$ export PYTHONPATH=<mxnet-root>/python
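
As a quick sanity check (a minimal snippet, assuming the binding and the paths above are set up correctly), importing the binding should report the bundled MXNet version, which is 1.5.1 for this release:

import mxnet as mx
print(mx.__version__)  # should print 1.5.1 for this BMXNet 2 release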

Training

Make sure that you have an up-to-date version of our examples submodule example/bmxnet-examples:

cd example/bmxnet-examples
git checkout master
git pull

Examples for hyperparameters are documented in the Wiki.

Inference

To speed up inference and compress your model, you need to save it as a symbol (not with Gluon) and then convert it with the model-converter. Please check the corresponding test case.

build/tools/binary_converter/model-converter model-0000.params
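
A minimal sketch of how a symbol and a params file could be obtained from a trained Gluon network before conversion (the tiny placeholder network, the file names, and whether the converter accepts models exported via export() are assumptions here; check the referenced test case for the exact workflow):

import mxnet as mx
from mxnet import gluon

# Placeholder network; a real binary model would use the Q* layers from the Usage section.
net = gluon.nn.HybridSequential()
net.add(gluon.nn.Dense(10))
net.initialize()

net.hybridize()                  # record a symbolic graph instead of running imperatively
net(mx.nd.ones((1, 784)))        # one forward pass to build the graph
net.export("model", epoch=0)     # writes model-symbol.json and model-0000.params

The resulting model-0000.params is what the model-converter call above expects.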

Tests

To run the BMXNet-specific tests, install pytest:

pip install pytest

Then simply run:

pytest tests/binary

Usage

We added binary versions of the following Gluon API layers (a short usage sketch follows the list):

  • gluon.nn.Dense -> gluon.nn.QDense
  • gluon.nn.Conv1D -> gluon.nn.QConv1D
  • gluon.nn.Conv2D -> gluon.nn.QConv2D
  • gluon.nn.Conv3D -> gluon.nn.QConv3D
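
A minimal usage sketch combining the drop-in layers above into a small network (the constructor arguments are assumed to mirror the regular Conv2D/Dense layers, since the Q* layers are described as drop-in replacements; check the examples submodule for the exact signatures and any additional bit-width options):

from mxnet import gluon

net = gluon.nn.HybridSequential()
net.add(
    gluon.nn.Conv2D(64, kernel_size=5, activation="tanh"),  # first layer kept in full precision (a common design choice, not an API requirement)
    gluon.nn.MaxPool2D(pool_size=2),
    gluon.nn.QConv2D(64, kernel_size=5),                    # binary convolution
    gluon.nn.MaxPool2D(pool_size=2),
    gluon.nn.Flatten(),
    gluon.nn.QDense(1000),                                  # binary dense layer
    gluon.nn.Dense(10),                                     # full-precision classifier head
)
net.initialize()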

Overview of Changes

We added three functions det_sign (ada4ea1d), round_ste (044f81f0) and contrib.gradcancel to MXNet (see src/operator/contrib/gradient_cancel[-inl.h|.cc|.cu]).
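
The following is a purely conceptual Python sketch of the semantics these operators provide (deterministic sign in the forward pass, a straight-through gradient that is cancelled where |x| > 1); the actual operators are implemented in C++ in the commits referenced above, and details such as how zero inputs are mapped are assumptions here:

import mxnet as mx
from mxnet import autograd, nd

class DetSignSTE(autograd.Function):
    """Conceptual stand-in for det_sign + contrib.gradcancel (not the real operator)."""
    def forward(self, x):
        self.save_for_backward(x)
        # forward: map inputs to {-1, +1}; mapping zeros to +1 is an assumption
        return nd.where(x >= 0, nd.ones_like(x), -nd.ones_like(x))

    def backward(self, dy):
        x, = self.saved_tensors
        # backward: straight-through estimator, gradient cancelled where |x| > 1
        return dy * (nd.abs(x) <= 1)

x = nd.array([-1.5, -0.3, 0.0, 0.8, 2.0])
x.attach_grad()
with autograd.record():
    y = DetSignSTE()(x)
y.backward()
print(y, x.grad)  # y is in {-1, +1}; gradient is 1 where |x| <= 1, else 0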

The rest of our code resides in the following folders/files:

For more details see the Changelog.

Docker setup

A Docker image for testing BMXNet can be built similarly to our CI script at .gitlab-ci.yml; however, it only supports the CPU, so actual training might be tedious.

cd ci
docker build -f docker/Dockerfile.build.ubuntu_cpu --build-arg USER_ID=1000 --build-arg GROUP_ID=1000 --cache-from bmxnet2-base/build.ubuntu_cpu -t bmxnet2-base/build.ubuntu_cpu docker

Then you can enter the container (and optionally have it deleted automatically afterwards):

docker run --rm -it bmxnet2-base/build.ubuntu_cpu # deletes the container after running
docker run -it bmxnet2-base/build.ubuntu_cpu # keeps the container after running (it needs to be removed manually later)

Inside the container, you can now clone, build, and test BMXNet 2:

# clone
mkdir -p /builds/
cd /builds/
git clone https://github.com/hpi-xnor/BMXNet-v2.git bmxnet --recursive
cd bmxnet
# build
mkdir build
cd build
cmake -DBINARY_WORD_TYPE=uint32 -DUSE_CUDA=OFF -DUSE_MKL_IF_AVAILABLE=OFF -GNinja ..
cd ..
cmake --build build
export PYTHONPATH=/builds/bmxnet/python # add python binding
# run the tests (we need to upgrade pytest first via pip3)
pip3 install pytest --upgrade
pytest tests/binary

You can even train a simple binary MNIST model, but you might need to update the examples to the newest version first (check out the master branch).

cd example/bmxnet-examples/mnist/
git checkout master
pip3 install mxboard
python3 mnist-lenet.py --bits 1 # trains a binary lenet model with 1 bit activations and 1 bit weights on MNIST

Citing BMXNet 2

Please cite our paper about BMXNet 2 in your publications if it helps your research work:

@article{bmxnetv2,
  title = {Training Competitive Binary Neural Networks from Scratch},
  author = {Joseph Bethge and Marvin Bornstein and Adrian Loy and Haojin Yang and Christoph Meinel},
  journal = {ArXiv e-prints},
  archivePrefix = {arXiv},
  eprint = {1812.01965},
  year = {2018}
}
