
akosiorek / Hart

Licence: gpl-3.0
Hierarchical Attentive Recurrent Tracking

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Hart

automatic-personality-prediction
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Stars: ✭ 43 (-71.14%)
Mutual labels:  rnn, attention-mechanism
Awesome Speech Recognition Speech Synthesis Papers
Automatic Speech Recognition (ASR), Speaker Verification, Speech Synthesis, Text-to-Speech (TTS), Language Modelling, Singing Voice Synthesis (SVS), Voice Conversion (VC)
Stars: ✭ 2,085 (+1299.33%)
Mutual labels:  rnn, attention-mechanism
Poetry Seq2seq
Chinese Poetry Generation
Stars: ✭ 159 (+6.71%)
Mutual labels:  rnn, attention-mechanism
Attend infer repeat
A TensorFlow implementation of Attend, Infer, Repeat
Stars: ✭ 82 (-44.97%)
Mutual labels:  rnn, attention-mechanism
Eeg Dl
A Deep Learning library for EEG Tasks (Signals) Classification, based on TensorFlow.
Stars: ✭ 165 (+10.74%)
Mutual labels:  rnn, attention-mechanism
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell which can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer Network. The LARNN cell with attention can be easily used inside a loop on the cell state, just like any other RNN. (LARNN)
Stars: ✭ 119 (-20.13%)
Mutual labels:  rnn, attention-mechanism
Hierarchical Multi Label Text Classification
Code for the CIKM'19 paper "Hierarchical Multi-label Text Classification: An Attention-based Recurrent Network Approach"
Stars: ✭ 133 (-10.74%)
Mutual labels:  attention-mechanism
Seq2seq chatbot new
A TensorFlow implementation of a simple seq2seq-based dialogue system with embedding, attention, and beam search, trained on the Cornell Movie Dialogs dataset
Stars: ✭ 144 (-3.36%)
Mutual labels:  attention-mechanism
Abstractive Summarization
Implementation of abstractive summarization using LSTM in the encoder-decoder architecture with local attention.
Stars: ✭ 128 (-14.09%)
Mutual labels:  attention-mechanism
Chinese Chatbot
A Chinese chatbot trained on 100,000 dialogue pairs with an attention mechanism; it generates a meaningful reply to most common questions. The trained model is included and can be run directly.
Stars: ✭ 124 (-16.78%)
Mutual labels:  rnn
Pytorch Image Comp Rnn
PyTorch implementation of Full Resolution Image Compression with Recurrent Neural Networks
Stars: ✭ 146 (-2.01%)
Mutual labels:  rnn
Document Classifier Lstm
A bidirectional LSTM with attention for multiclass/multilabel text classification.
Stars: ✭ 136 (-8.72%)
Mutual labels:  attention-mechanism
Adnet
Attention-guided CNN for image denoising (Neural Networks, 2020)
Stars: ✭ 135 (-9.4%)
Mutual labels:  attention-mechanism
Rnn poetry generator
Classical Chinese poetry generation with an RNN
Stars: ✭ 143 (-4.03%)
Mutual labels:  rnn
Perceiver Pytorch
Implementation of Perceiver, General Perception with Iterative Attention, in Pytorch
Stars: ✭ 130 (-12.75%)
Mutual labels:  attention-mechanism
Skip Thoughts.torch
Porting of Skip-Thoughts pretrained models from Theano to PyTorch & Torch7
Stars: ✭ 146 (-2.01%)
Mutual labels:  rnn
Image Caption Generator
A neural network to generate captions for an image using CNN and RNN with BEAM Search.
Stars: ✭ 126 (-15.44%)
Mutual labels:  attention-mechanism
Prediction Flow
Deep-learning-based CTR models implemented in PyTorch
Stars: ✭ 138 (-7.38%)
Mutual labels:  attention-mechanism
Gestureai Coreml Ios
Hand-gesture recognition on iOS app using CoreML
Stars: ✭ 145 (-2.68%)
Mutual labels:  rnn
Qrn
Query-Reduction Networks (QRN)
Stars: ✭ 137 (-8.05%)
Mutual labels:  rnn

Hierarchical Attentive Recurrent Tracking

This is the official TensorFlow implementation of single-object tracking in videos using hierarchical attentive recurrent neural networks, as presented in the following paper:

A. R. Kosiorek, A. Bewley, I. Posner, "Hierarchical Attentive Recurrent Tracking", NIPS 2017.

Installation

Install TensorFlow v1.1 and the following dependencies, preferably via pip install -r requirements.txt, or individually via pip install [package]:

  • matplotlib==1.5.3
  • numpy==1.12.1
  • pandas==0.18.1
  • scipy==0.18.1

Demo

The notebook scripts/demo.ipynb contains a demo that shows how to evaluate the tracker on an arbitrary image sequence. By default, it runs on images located in the imgs folder and uses a pretrained model. Before running the demo, download the AlexNet weights (see the Training section).
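
If you prefer a plain Python script to the notebook, the sketch below shows one way to load the demo image sequence from the imgs folder into a single numpy array. The folder name, file pattern, and (timesteps, batch, height, width, channels) layout are assumptions for illustration only; the notebook remains the authoritative reference for how the tracker is actually invoked.

    # Hedged sketch: load the demo image sequence into one numpy array.
    # Assumes the frames live in ./imgs, share a common size, and that a
    # (timesteps, batch=1, height, width, channels) layout is wanted.
    # Reading JPEGs through matplotlib may additionally require Pillow.
    import glob
    import numpy as np
    import matplotlib.image as mpimg

    def load_sequence(img_dir='imgs', pattern='*.jpg'):
        """Load an image sequence, sorted by filename, as a float32 array."""
        paths = sorted(glob.glob('{}/{}'.format(img_dir, pattern)))
        frames = [mpimg.imread(p).astype(np.float32) for p in paths]
        return np.stack(frames)[:, np.newaxis]  # add a batch axis of size 1

    if __name__ == '__main__':
        seq = load_sequence()
        print('loaded sequence with shape %s' % (seq.shape,))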

Data

  1. Download the KITTI dataset from here. We need the left color images and the tracking labels.
  2. Unpack the data into a data folder; images should go in an image folder and labels in a label folder.
  3. Resize all the images to (height=187, width=621), e.g. by using the scripts/resize_imgs.sh script or the Python sketch after this list.
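
The repository ships scripts/resize_imgs.sh for the resizing step; if you would rather do it from Python, a rough equivalent is sketched below. It assumes Pillow is available (it is not listed in requirements.txt) and that the images sit under data/image/, mirroring the layout described above; each image is overwritten in place with a 621x187 version.

    # Hedged sketch: resize every image to width=621, height=187 in place.
    # Assumes Pillow is installed and the images live under data/image/.
    import os
    from PIL import Image

    TARGET_SIZE = (621, 187)  # PIL expects (width, height)

    for root, _, files in os.walk('data/image'):
        for name in files:
            if not name.endswith('.png'):
                continue
            path = os.path.join(root, name)
            img = Image.open(path)
            if img.size != TARGET_SIZE:
                img.resize(TARGET_SIZE, Image.BILINEAR).save(path)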

Training

  1. Download the AlexNet weights:

    • Execute scripts/download_alexnet.sh or
    • Download the weights from here and put the file in the checkpoints folder.
  2. Run

     python scripts/train_hart_kitti.py --img_dir=path/to/image/folder --label_dir=/path/to/label/folder
    

The training script saves model checkpoints in the checkpoints folder and reports train and test scores every few epochs. You can run TensorBoard in the checkpoints folder to visualise training progress. Training should converge in about 400k iterations, which takes roughly 3 days. Logging messages can be a couple of hours apart, so don't worry if output seems sparse.
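
If you want to inspect or reuse a saved checkpoint outside the training script, the generic TensorFlow 1.x pattern below restores the latest checkpoint from the checkpoints folder. This is a sketch, not code taken from this repository, and it assumes the model graph has already been built (e.g. by the model-construction code in the training script).

    # Generic TF 1.x checkpoint-restore sketch (not taken from this repo).
    # Assumes the model variables already exist in the default graph.
    import tensorflow as tf

    saver = tf.train.Saver()
    checkpoint = tf.train.latest_checkpoint('checkpoints')

    with tf.Session() as sess:
        if checkpoint is not None:
            saver.restore(sess, checkpoint)
            print('Restored weights from %s' % checkpoint)
        else:
            sess.run(tf.global_variables_initializer())
            print('No checkpoint found; starting from scratch.')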

Evaluation on KITTI dataset

The scripts/eval_kitti.ipynb notebook contains the code necessary to prepare (IoU, timesteps) curves for the train and validation sets of KITTI; a minimal reference implementation of the IoU metric is sketched after the checklist below. Before running the evaluation:

  • Download AlexNet weights (described in the Training section).
  • Update image and label folder paths in the notebook.
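
For reference, the IoU used in those curves is the standard intersection-over-union between a predicted and a ground-truth box. A minimal implementation for axis-aligned boxes in (x1, y1, x2, y2) format follows; the box convention is an assumption, and the notebook may represent boxes differently.

    # Minimal IoU sketch for axis-aligned boxes given as (x1, y1, x2, y2).
    # Reference implementation of the metric only; the evaluation notebook
    # may use a different box convention.
    def iou(box_a, box_b):
        ax1, ay1, ax2, ay2 = box_a
        bx1, by1, bx2, by2 = box_b

        inter_w = max(0.0, min(ax2, bx2) - max(ax1, bx1))
        inter_h = max(0.0, min(ay2, by2) - max(ay1, by1))
        inter = inter_w * inter_h

        area_a = (ax2 - ax1) * (ay2 - ay1)
        area_b = (bx2 - bx1) * (by2 - by1)
        union = area_a + area_b - inter
        return inter / union if union > 0 else 0.0

    print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ~= 0.143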

Citation

If you find this repo useful in your research, please consider citing:

@inproceedings{Kosiorek2017hierarchical,
   title = {Hierarchical Attentive Recurrent Tracking},
   author = {Kosiorek, Adam R and Bewley, Alex and Posner, Ingmar},
   booktitle = {Neural Information Processing Systems},
   url = {http://www.robots.ox.ac.uk/~mobile/Papers/2017NIPS_AdamKosiorek.pdf},
   pdf = {http://www.robots.ox.ac.uk/~mobile/Papers/2017NIPS_AdamKosiorek.pdf},
   year = {2017},
   month = {December}
}

License

This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 3 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program. If not, see http://www.gnu.org/licenses/.

Release Notes

Version 1.0

  • Original version from the paper. It contains the KITTI tracking experiment.