
qinyao-he / Bit Rnn

Licence: apache-2.0
Quantize weights and activations in Recurrent Neural Networks.


Bit-RNN

Source code for the paper "Effective Quantization Methods for Recurrent Neural Networks".

The PTB language model implementation is adapted from the examples in TensorFlow.
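The core idea of the paper is low-bit uniform quantization of weights and activations. A minimal NumPy sketch of the general technique is shown below; this is not the repository's actual TensorFlow code, and the function names and tanh-based weight rescaling follow the DoReFa-style scheme as an assumption:

```python
import numpy as np

def quantize_k(x, k):
    """Uniformly quantize values in [0, 1] to k bits.

    Maps x onto the 2^k evenly spaced levels
    {0, 1/(2^k - 1), ..., 1}.
    """
    n = 2 ** k - 1
    return np.round(x * n) / n

def quantize_weights(w, k):
    """Quantize real-valued weights to k bits (illustrative sketch).

    Weights are squashed into [0, 1] with tanh and an affine
    rescaling, quantized, then mapped back to [-1, 1].
    """
    t = np.tanh(w)
    t = t / (2 * np.max(np.abs(t))) + 0.5   # rescale into [0, 1]
    return 2 * quantize_k(t, k) - 1          # map back to [-1, 1]
```

With k = 2 this yields four weight levels in [-1, 1]; in training, a straight-through estimator is typically used so gradients pass through the non-differentiable rounding.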

Requirements

Currently tested on TensorFlow 1.8 and Python 3.6; see the other branches for legacy support. Download the PTB dataset from http://www.fit.vutbr.cz/~imikolov/rnnlm/simple-examples.tgz.
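The archive can be fetched and unpacked with a few lines of Python; the function names below are illustrative, not part of the repository:

```python
import os
import tarfile
import urllib.request

URL = "http://www.fit.vutbr.cz/~imikolov/rnnlm/simple-examples.tgz"

def ptb_data_dir(dest="data"):
    """Path to the PTB text files after extraction."""
    return os.path.join(dest, "simple-examples", "data")

def fetch_ptb(dest="data"):
    """Download and unpack the PTB archive into `dest` (skips
    the download if the archive is already present)."""
    os.makedirs(dest, exist_ok=True)
    archive = os.path.join(dest, "simple-examples.tgz")
    if not os.path.exists(archive):
        urllib.request.urlretrieve(URL, archive)
    with tarfile.open(archive) as tar:
        tar.extractall(dest)
    return ptb_data_dir(dest)
```

The returned directory is what you would pass as `--data_path` when training.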

Run

python train.py --config=config.gru --data_path=YOUR_DATA_PATH

The default configuration uses 2-bit weights and activations. Edit the config files in the config folder to change the quantization settings.
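The config files are small Python modules selected by the `--config` flag (e.g. `config.gru`). A sketch of what such a module might contain is below; the attribute names are assumptions for illustration and may not match the repository's actual fields:

```python
# Illustrative config module in the spirit of config/gru.py;
# all attribute names here are hypothetical.
class Config:
    cell = "gru"        # recurrent cell type
    w_bit = 2           # weight bit width
    a_bit = 2           # activation bit width
    hidden_size = 650   # units per recurrent layer
    num_layers = 1      # number of stacked layers
    batch_size = 20
    learning_rate = 1.0
```

Raising `w_bit`/`a_bit` trades model size and compute for accuracy closer to the full-precision baseline.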

Support

Submit an issue for problems related to the code itself. Email the author with general questions about the paper.

Citation

Please cite the following if you use our code in your research:

@article{DBLP:journals/corr/HeWZWYZZ16,
  author    = {Qinyao He and
               He Wen and
               Shuchang Zhou and
               Yuxin Wu and
               Cong Yao and
               Xinyu Zhou and
               Yuheng Zou},
  title     = {Effective Quantization Methods for Recurrent Neural Networks},
  journal   = {CoRR},
  volume    = {abs/1611.10176},
  year      = {2016},
  url       = {http://arxiv.org/abs/1611.10176},
  timestamp = {Thu, 01 Dec 2016 19:32:08 +0100},
  biburl    = {http://dblp.uni-trier.de/rec/bib/journals/corr/HeWZWYZZ16},
  bibsource = {dblp computer science bibliography, http://dblp.org}
}