
spallas / Time Attention

License: MIT
Implementation of an RNN for time series prediction from the paper "A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction" (https://arxiv.org/abs/1704.02971)

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to Time Attention

Pytorch Learners Tutorial
PyTorch tutorial for learners
Stars: ✭ 97 (+86.54%)
Mutual labels:  deep-neural-networks, lstm, rnn
automatic-personality-prediction
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Stars: ✭ 43 (-17.31%)
Mutual labels:  lstm, rnn, attention
Pytorch Kaldi
pytorch-kaldi is a project for developing state-of-the-art DNN/RNN hybrid speech recognition systems. The DNN part is managed by PyTorch, while feature extraction, label computation, and decoding are performed with the Kaldi toolkit.
Stars: ✭ 2,097 (+3932.69%)
Mutual labels:  deep-neural-networks, lstm, rnn
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+6473.08%)
Mutual labels:  lstm, rnn, attention
Deepseqslam
The Official Deep Learning Framework for Route-based Place Recognition
Stars: ✭ 49 (-5.77%)
Mutual labels:  deep-neural-networks, lstm, rnn
Bitcoin Price Prediction Using Lstm
Bitcoin price prediction (time series) using an LSTM recurrent neural network
Stars: ✭ 67 (+28.85%)
Mutual labels:  deep-neural-networks, lstm, rnn
EBIM-NLI
Enhanced BiLSTM Inference Model for Natural Language Inference
Stars: ✭ 24 (-53.85%)
Mutual labels:  lstm, rnn, attention
Machine Learning
My attempts in the world of ML/DL
Stars: ✭ 78 (+50%)
Mutual labels:  lstm, rnn, attention
Easy Deep Learning With Keras
Keras tutorial for beginners (using TF backend)
Stars: ✭ 367 (+605.77%)
Mutual labels:  deep-neural-networks, lstm, rnn
Base-On-Relation-Method-Extract-News-DA-RNN-Model-For-Stock-Prediction--Pytorch
A dual-stage attention model for stock prediction, based on a relational news-extraction method
Stars: ✭ 33 (-36.54%)
Mutual labels:  lstm, rnn, attention
Rnn For Joint Nlu
Pytorch implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454)
Stars: ✭ 176 (+238.46%)
Mutual labels:  lstm, rnn, attention
Telemanom
A framework for using LSTMs to detect anomalies in multivariate time series data. Includes spacecraft anomaly data and experiments from the Mars Science Laboratory and SMAP missions.
Stars: ✭ 589 (+1032.69%)
Mutual labels:  time-series, lstm, rnn
Chinese Chatbot
A Chinese chatbot trained on 100,000 dialogue pairs, using an attention mechanism to generate a meaningful reply to most everyday questions. The trained model is uploaded and can be run directly (and if it doesn't run, the author promises to livestream eating a keyboard).
Stars: ✭ 124 (+138.46%)
Mutual labels:  lstm, rnn, attention
Deep Learning Time Series
List of papers, code and experiments using deep learning for time series forecasting
Stars: ✭ 796 (+1430.77%)
Mutual labels:  time-series, deep-neural-networks, lstm
Cnn lstm for text classify
Chinese text classification with CNN, LSTM, NBOW, and fastText
Stars: ✭ 90 (+73.08%)
Mutual labels:  lstm, rnn, attention
Chameleon recsys
Source code of CHAMELEON - A Deep Learning Meta-Architecture for News Recommender Systems
Stars: ✭ 202 (+288.46%)
Mutual labels:  deep-neural-networks, lstm, rnn
ConvLSTM-PyTorch
ConvLSTM/ConvGRU (Encoder-Decoder) with PyTorch on Moving-MNIST
Stars: ✭ 202 (+288.46%)
Mutual labels:  time-series, lstm, rnn
Flow Forecast
Deep learning PyTorch library for time series forecasting, classification, and anomaly detection (originally for flood forecasting).
Stars: ✭ 368 (+607.69%)
Mutual labels:  time-series, deep-neural-networks, lstm
Ad examples
A collection of anomaly detection methods (iid/point-based, graph, and time series), including active learning for anomaly detection/discovery, Bayesian rule mining, and descriptions for diversity/explanation/interpretability. Analyzes incorporating label feedback with ensemble and tree-based detectors, and includes adversarial attacks with a Graph Convolutional Network.
Stars: ✭ 641 (+1132.69%)
Mutual labels:  time-series, lstm, rnn
Tf Rnn Attention
TensorFlow implementation of an attention mechanism for text classification tasks.
Stars: ✭ 735 (+1313.46%)
Mutual labels:  rnn, attention

time-attention

Implementation of an RNN for time series prediction from the paper "A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction" (https://arxiv.org/abs/1704.02971).

Some parameter names, variables, and configuration keys follow the paper's notation.
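
The paper's DA-RNN applies attention in two stages: an input attention that re-weights the n driving series at each encoder step, and a temporal attention that re-weights the encoder hidden states during decoding. As a minimal sketch of the shared building block, a softmax over learned alignment scores, here in NumPy with all names and shapes invented for illustration (this is not code from this repository):

  import numpy as np

  def softmax(scores):
      # Turn raw alignment scores into attention weights that sum to 1.
      e = np.exp(scores - scores.max())
      return e / e.sum()

  # Example: n = 5 driving series observed at a single time step t.
  rng = np.random.default_rng(0)
  x_t = rng.standard_normal(5)     # raw inputs x_t^1 ... x_t^n
  scores = rng.standard_normal(5)  # alignment scores (learned from the encoder state in the paper)
  alpha = softmax(scores)          # input-attention weights
  x_tilde = alpha * x_t            # attention-weighted inputs fed to the encoder LSTM
  print(x_tilde)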

Slides introducing the work

Reproducing the results

  • Download the necessary data by running ./get_data.sh.

  • All training parameters and the features used are stored in conf/*.json; an illustrative example follows this list.
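
The exact schema is whatever conf/SML2010.json and its siblings define; purely as an illustration, a DA-RNN-style configuration could look like the following, where the key names are assumptions borrowed from the paper's notation (window length T, encoder size m, decoder size p) rather than the repository's actual keys:

  {
    "data": "data/SML2010.csv",
    "T": 10,
    "m": 64,
    "p": 64,
    "batch_size": 128,
    "learning_rate": 0.001,
    "epochs": 50
  }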

Training script

USAGE: train.py [flags]
flags:

train.py:
  --config: Path to json file with the configuration to be run
    (default: 'conf/SML2010.json')

Try --helpfull to get a list of all flags.
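
For example, to train with the default SML2010 configuration:

  python train.py --config conf/SML2010.json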

Configuration generator

Generates multiple configurations to be run for performance analysis.

usage: generate_configs.py [-h] [--src SRC] [--dest DEST]

Generates config files for multiple configurations. It requires a source
directory containing the JSON files of the base configurations from which the
new configurations are generated.

optional arguments:
  -h, --help   show this help message and exit
  --src SRC    Source directory where base configurations are found
  --dest DEST  Destination directory where the files will be created
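
A typical invocation, with placeholder directory names:

  python generate_configs.py --src conf/base --dest conf/generated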