
batzner / Indrnn

Licence: apache-2.0
TensorFlow implementation of Independently Recurrent Neural Networks

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Indrnn

Sars tutorial
Repository for the tutorial on Sequence-Aware Recommender Systems held at TheWebConf 2019 and ACM RecSys 2018
Stars: ✭ 320 (-37.38%)
Mutual labels:  rnn
Text summurization abstractive methods
Multiple implementations for abstractive text summarization, using Google Colab
Stars: ✭ 359 (-29.75%)
Mutual labels:  rnn
Tensorflow Tutorial
Tensorflow tutorial from basic to hard; Chinese AI tutorials by 莫烦Python (MorvanZhou)
Stars: ✭ 4,122 (+706.65%)
Mutual labels:  rnn
Tensorflow poems
An automatic Chinese classical poetry generator based on the TensorFlow 1.10 API; actively maintained and upgraded, star it to stay updated!
Stars: ✭ 3,429 (+571.04%)
Mutual labels:  rnn
Thesemicolon
This repository contains IPython notebooks and datasets for the data analytics YouTube tutorials on The Semicolon.
Stars: ✭ 345 (-32.49%)
Mutual labels:  rnn
Rmdl
RMDL: Random Multimodel Deep Learning for Classification
Stars: ✭ 375 (-26.61%)
Mutual labels:  rnn
Unet Zoo
A collection of UNet and hybrid architectures in PyTorch for 2D and 3D biomedical image segmentation
Stars: ✭ 302 (-40.9%)
Mutual labels:  rnn
Rgan
Recurrent (conditional) generative adversarial networks for generating real-valued time series data.
Stars: ✭ 480 (-6.07%)
Mutual labels:  rnn
Fast Pytorch
Pytorch Tutorial, Pytorch with Google Colab, Pytorch Implementations: CNN, RNN, DCGAN, Transfer Learning, Chatbot, Pytorch Sample Codes
Stars: ✭ 346 (-32.29%)
Mutual labels:  rnn
Tsai
State-of-the-art deep learning with time series and sequences in PyTorch / fastai
Stars: ✭ 407 (-20.35%)
Mutual labels:  rnn
Basicocr
BasicOCR is a project dedicated to research on text recognition algorithms for natural scenes. It was initiated and is maintained by the 佟派 (TongPai) AI team of the Great Wall Digital Big Data Application Technology Research Institute.
Stars: ✭ 336 (-34.25%)
Mutual labels:  rnn
Deeplearningsourceseparation
Deep Recurrent Neural Networks for Source Separation
Stars: ✭ 339 (-33.66%)
Mutual labels:  rnn
Rnn From Scratch
Implementing Recurrent Neural Network from Scratch
Stars: ✭ 377 (-26.22%)
Mutual labels:  rnn
R Net
A Tensorflow Implementation of R-net: Machine reading comprehension with self-matching networks
Stars: ✭ 321 (-37.18%)
Mutual labels:  rnn
Tensorflow Char Rnn
Char-RNN implemented using TensorFlow.
Stars: ✭ 429 (-16.05%)
Mutual labels:  rnn
Neural Symbolic Machines
Neural Symbolic Machines is a framework to integrate neural networks and symbolic representations using reinforcement learning, with applications in program synthesis and semantic parsing.
Stars: ✭ 305 (-40.31%)
Mutual labels:  rnn
Easy Deep Learning With Keras
Keras tutorial for beginners (using TF backend)
Stars: ✭ 367 (-28.18%)
Mutual labels:  rnn
Mozi
This project aims to build a minimal, streamlined, and maintainable react-native project, supporting iOS and Android 🌹
Stars: ✭ 501 (-1.96%)
Mutual labels:  rnn
Srnn
sliced-rnn
Stars: ✭ 462 (-9.59%)
Mutual labels:  rnn
Wavetorch
🌊 Numerically solving and backpropagating through the wave equation
Stars: ✭ 387 (-24.27%)
Mutual labels:  rnn

Independently Recurrent Neural Networks

Simple TensorFlow implementation of Independently Recurrent Neural Network (IndRNN): Building A Longer and Deeper RNN by Shuai Li et al. The authors' original Theano and Lasagne implementation can be found in Sunnydreamrain/IndRNN_Theano_Lasagne.


Summary

In an IndRNN, the neurons of a recurrent layer are independent of each other. A basic RNN computes its hidden state as h = act(W * input + U * state + b). An IndRNN replaces the matrix product U * state with an element-wise product u * state, so each neuron has a single recurrent weight connecting it only to its own previous hidden state.
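
A minimal NumPy sketch contrasting the two update rules (illustrative only; the function and variable names here are not taken from ind_rnn_cell.py):

import numpy as np

def rnn_step(x, h, W, U, b):
    # Basic RNN: the full matrix U mixes all neurons' previous states.
    return np.maximum(0, W @ x + U @ h + b)  # ReLU activation

def ind_rnn_step(x, h, W, u, b):
    # IndRNN: element-wise u * h, so neuron n only sees its own h[n].
    return np.maximum(0, W @ x + u * h + b)

num_inputs, num_units = 4, 8
x, h = np.random.randn(num_inputs), np.zeros(num_units)
W = np.random.randn(num_units, num_inputs)
U = np.random.randn(num_units, num_units)  # RNN: num_units**2 recurrent weights
u = np.random.randn(num_units)             # IndRNN: one recurrent weight per neuron
b = np.zeros(num_units)

print(rnn_step(x, h, W, U, b).shape)      # (8,)
print(ind_rnn_step(x, h, W, u, b).shape)  # (8,)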

The IndRNN

  • can be used efficiently with ReLU activation functions, making it easier to stack multiple recurrent layers without saturating gradients
  • allows for better interpretability, as neurons in the same layer are independent of each other
  • prevents vanishing and exploding gradients by regulating each neuron's recurrent weight (see the sketch below)
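
As a rough intuition for that regulation (a sketch with illustrative numbers, not code from this repo): clipping each recurrent weight u to |u| <= 2**(1 / TIME_STEPS) caps a neuron's recurrent gain over the whole sequence at 2, so its signal can at most double rather than explode.

TIME_STEPS = 784  # e.g. one step per pixel in Sequential MNIST

recurrent_max = 2 ** (1 / TIME_STEPS)  # per-step bound on |u|
print(recurrent_max)                   # ~1.00088
print(recurrent_max ** TIME_STEPS)     # ~2.0: the gain cap over all steps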

Usage

Copy ind_rnn_cell.py into your project.

import tensorflow as tf

from ind_rnn_cell import IndRNNCell

TIME_STEPS = 100  # length of the input sequences (example value)

# Regulate each neuron's recurrent weight as recommended in the paper
recurrent_max = pow(2, 1 / TIME_STEPS)

cell = tf.nn.rnn_cell.MultiRNNCell(
    [IndRNNCell(128, recurrent_max_abs=recurrent_max),
     IndRNNCell(128, recurrent_max_abs=recurrent_max)])

# input_data is a [batch_size, TIME_STEPS, num_features] tensor
output, state = tf.nn.dynamic_rnn(cell, input_data, dtype=tf.float32)
...

Experiments in the paper

Addition Problem

See examples/addition_rnn.py for a script reproducing the "Adding Problem" from the paper. The results below were produced with that script.

https://github.com/batzner/indrnn/raw/master/img/addition/TAll.png
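
For context, here is a minimal sketch of how adding-problem batches are commonly constructed (my own illustration; the parameter choices are assumptions, not taken from examples/addition_rnn.py):

import numpy as np

def adding_problem_batch(batch_size, time_steps):
    # Each input step is a (value, marker) pair; the target is the sum
    # of the two values whose marker is set to 1.
    values = np.random.uniform(0, 1, (batch_size, time_steps))
    markers = np.zeros((batch_size, time_steps))
    for i in range(batch_size):
        a, b = np.random.choice(time_steps, size=2, replace=False)
        markers[i, [a, b]] = 1
    inputs = np.stack([values, markers], axis=-1)  # [batch, time, 2]
    targets = np.sum(values * markers, axis=1)     # [batch]
    return inputs, targets

x, y = adding_problem_batch(batch_size=32, time_steps=100)
print(x.shape, y.shape)  # (32, 100, 2) (32,)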

Sequential MNIST

See examples/sequential_mnist.py for a script reproducing the Sequential MNIST experiment. I let it run for two days and stopped it after 60,000 training steps with a

  • Training error rate of 0.7%
  • Validation error rate of 1.1%
  • Test error rate of 1.1%

https://github.com/batzner/indrnn/raw/master/img/sequential_mnist/errors.png
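
In this setup, each 28x28 MNIST image is read pixel by pixel as a length-784 sequence. A minimal sketch of that reshaping (illustrative only; the actual input pipeline lives in examples/sequential_mnist.py):

import numpy as np

# images: a [batch, 28, 28] array of grayscale digits (stand-in data here)
images = np.random.rand(32, 28, 28)

# Flatten each image into a 784-step sequence of single-pixel inputs,
# shaped [batch, time_steps, features] as expected by tf.nn.dynamic_rnn.
sequences = images.reshape(32, 28 * 28, 1)
print(sequences.shape)  # (32, 784, 1)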

Requirements

  • Python 3.4+
  • TensorFlow 1.5+