
plumerai / rethinking-bnn-optimization

License: Apache-2.0
Implementation for the paper "Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization"

Programming Languages

Python

Projects that are alternatives to or similar to rethinking-bnn-optimization

portfolio-optimizer
A library of portfolio optimization algorithms with a Python interface.
Stars: ✭ 19 (-69.35%)
Mutual labels:  optimizer
lookahead tensorflow
Lookahead optimizer ("Lookahead Optimizer: k steps forward, 1 step back") for tensorflow
Stars: ✭ 25 (-59.68%)
Mutual labels:  optimizer
falcon
A WordPress cleanup and performance optimization plugin.
Stars: ✭ 17 (-72.58%)
Mutual labels:  optimizer
daily astroph
Daily dose of astro-ph reading
Stars: ✭ 37 (-40.32%)
Mutual labels:  arxiv
ToyDB
A toy database (for beginners) based on MIT 6.830 and CMU 15-445
Stars: ✭ 25 (-59.68%)
Mutual labels:  optimizer
madam
👩 PyTorch and JAX code for the Madam optimiser.
Stars: ✭ 46 (-25.81%)
Mutual labels:  optimizer
EAGO.jl
A development environment for robust and global optimization
Stars: ✭ 106 (+70.97%)
Mutual labels:  optimizer
goga
goga is a Go library for developing evolutionary and genetic algorithms to solve optimisation problems with (or without) many constraints and many objectives. A further goal is to handle mixed-type representations (reals and integers).
Stars: ✭ 39 (-37.1%)
Mutual labels:  optimizer
AdaBound-tensorflow
An optimizer that trains as fast as Adam and performs as well as SGD, in TensorFlow
Stars: ✭ 44 (-29.03%)
Mutual labels:  optimizer
Windows11-Optimization
Community repository to improve the security and performance of Windows 10 and Windows 11 with tweaks, commands, scripts, registry keys, configuration, tutorials, and more
Stars: ✭ 17 (-72.58%)
Mutual labels:  optimizer
qEEG feature set
NEURAL: a neonatal EEG feature set in Matlab
Stars: ✭ 29 (-53.23%)
Mutual labels:  arxiv
ada-hessian
Easy-to-use AdaHessian optimizer (PyTorch)
Stars: ✭ 59 (-4.84%)
Mutual labels:  optimizer
arxiv leaks
Whisper of the arXiv: read the comments in the TeX source of papers
Stars: ✭ 22 (-64.52%)
Mutual labels:  arxiv
adamwr
Implements the AdamW optimizer (https://arxiv.org/abs/1711.05101), a cosine learning rate scheduler, and "Cyclical Learning Rates for Training Neural Networks" (https://arxiv.org/abs/1506.01186) for the PyTorch framework
Stars: ✭ 130 (+109.68%)
Mutual labels:  optimizer
FormTracer
A Mathematica Tracing Package Using FORM
Stars: ✭ 16 (-74.19%)
Mutual labels:  arxiv
deeplearning-papernotes
Brief summaries of papers on NLP, deep learning, and dialogue agents
Stars: ✭ 17 (-72.58%)
Mutual labels:  arxiv
postcss-clean
PostCSS plugin to minify your CSS with clean-css
Stars: ✭ 41 (-33.87%)
Mutual labels:  optimizer
pigosat
Go (golang) bindings for PicoSAT, the satisfiability solver
Stars: ✭ 15 (-75.81%)
Mutual labels:  optimizer
TAGCN
TensorFlow implementation of the paper "Topology Adaptive Graph Convolutional Networks" (Du et al., 2017)
Stars: ✭ 17 (-72.58%)
Mutual labels:  arxiv
paper-survey
Summary of machine learning papers
Stars: ✭ 26 (-58.06%)
Mutual labels:  arxiv

Rethinking Binarized Neural Network Optimization

arXiv:1906.02107 License: Apache 2.0 Code style: black

Implementation for the paper "Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization".

A poster illustrating the proposed algorithm and its relation to the previous BNN optimization strategy is included at ./poster.pdf.
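
For context when setting the hyperparameters used below: the paper replaces latent-weight updates with a flip rule driven by an exponential moving average of gradients. A sketch of the update in LaTeX, paraphrased from the paper, where τ is the threshold and γ the adaptivity rate (consult the paper for the authoritative statement):

% m_t is the exponential moving average of the gradient g_t on each
% binary weight; a weight flips only when the average is both strong
% (above the threshold tau) and aligned with the weight's current sign.
\begin{align}
  m_t &= (1 - \gamma)\, m_{t-1} + \gamma\, g_t \\
  w_t &=
    \begin{cases}
      -w_{t-1} & \text{if } \lvert m_t \rvert > \tau
                 \text{ and } \operatorname{sign}(m_t) = \operatorname{sign}(w_{t-1}), \\
      w_{t-1}  & \text{otherwise.}
    \end{cases}
\end{align}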

Note: Bop has now been added to Larq, the open-source training library for BNNs. We recommend using the Larq implementation of Bop: it is compatible with more versions of TensorFlow and will be more actively maintained.
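
As a minimal sketch of what using the Larq implementation can look like (separate from this repository's CLI): the Bop API has changed across Larq releases, so the snippet below follows the early interface with an fp_optimizer for real-valued variables, and the toy model is purely illustrative; check the Larq documentation for the exact signature in your version.

import tensorflow as tf
import larq as lq

# Illustrative binarized model in the style of the Larq examples; the
# first layer keeps real-valued inputs and only binarizes its kernel.
model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    lq.layers.QuantDense(256, kernel_quantizer="ste_sign",
                         kernel_constraint="weight_clip"),
    lq.layers.QuantDense(10, input_quantizer="ste_sign",
                         kernel_quantizer="ste_sign",
                         kernel_constraint="weight_clip",
                         activation="softmax"),
])

model.compile(
    # Bop flips a binary weight w when its gradient moving average m
    # satisfies |m| > threshold and sign(m) == sign(w); gamma is the
    # adaptivity rate of the average. Real-valued variables (e.g.
    # biases) fall back to the fp_optimizer.
    optimizer=lq.optimizers.Bop(
        fp_optimizer=tf.keras.optimizers.Adam(0.01),
        threshold=1e-6,
        gamma=1e-3,
    ),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)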

Requirements

You can also check out one of our prebuilt Docker images.

Installation

This is a complete Python module. To install it in your local Python environment, cd into the folder containing setup.py and run:

pip install -e .

Train

To train a model locally, you can use the CLI:

bnno train binarynet --dataset cifar10

Reproduce Paper Experiments

Hyperparameter Analysis (section 5.1)

To reproduce the runs exploring various hyperparameters, run:

bnno train binarynet \
    --dataset cifar10 \
    --preprocess-fn resize_and_flip \
    --hparams-set bop \
    --hparams threshold=1e-6,gamma=1e-3

where you substitute the appropriate values for threshold and gamma; a sketch for sweeping a grid of values follows below.
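
One way to run such a sweep is to loop over the CLI from Python, as sketched here; the grid values are illustrative, not the exact grid from the paper.

import itertools
import subprocess

# Illustrative grids; substitute the values you want to explore.
thresholds = [1e-8, 1e-6, 1e-4]
gammas = [1e-4, 1e-3, 1e-2]

for threshold, gamma in itertools.product(thresholds, gammas):
    # Invokes the same CLI as above, once per (threshold, gamma) pair.
    subprocess.run(
        [
            "bnno", "train", "binarynet",
            "--dataset", "cifar10",
            "--preprocess-fn", "resize_and_flip",
            "--hparams-set", "bop",
            "--hparams", f"threshold={threshold},gamma={gamma}",
        ],
        check=True,
    )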

CIFAR-10 (section 5.2)

To achieve the 91.3% accuracy reported in the paper, run:

bnno train binarynet \
    --dataset cifar10 \
    --preprocess-fn resize_and_flip \
    --hparams-set bop_sec52

ImageNet (section 5.3)

To reproduce the reported results on ImageNet, run:

bnno train alexnet --dataset imagenet2012 --hparams-set bop
bnno train xnornet --dataset imagenet2012 --hparams-set bop
bnno train birealnet --dataset imagenet2012 --hparams-set bop

This should give the results listed below. Click on the TensorBoard icons to see the training and validation accuracy curves of the reported runs.

Network          Bop top-1 accuracy
Binary AlexNet   41.1%  (tensorboard)
XNOR-Net         45.9%  (tensorboard)
Bi-Real Net      56.6%  (tensorboard)