Transformers for Time Series

License: GPL v3

Implementation of the Transformer model (originally from Attention is All You Need) applied to time series (powered by PyTorch).

Transformer model

Transformers are attention-based neural networks designed to solve NLP tasks. Their key features are:

  • linear complexity in the dimension of the feature vector;
  • parallel computation over a whole sequence, as opposed to sequential computation;
  • long-term memory, as any input time step can be attended to directly.

This repo focuses on their application to time series.
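As an illustrative sketch (not code from this repo), scaled dot-product attention can be written in a few lines of NumPy; note how every output step mixes information from every input step in a single operation, which is what gives the model its long-term memory:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d)) V for a single sequence.

    Q, K, V have shape (seq_len, d): every output position attends to
    *all* input positions at once, instead of propagating information
    step by step as an RNN would.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                            # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V

rng = np.random.default_rng(0)
x = rng.standard_normal((10, 8))     # toy sequence: 10 steps, 8 features
out = scaled_dot_product_attention(x, x, x)
print(out.shape)                     # (10, 8)
```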

Dataset and application as metamodel

Our use case is modeling a numerical simulator for building consumption prediction. To this end, we created a dataset by sampling random inputs (building characteristics and usage, weather, ...) and recording the simulated outputs. We then convert these variables into a time series format and feed them to the Transformer.
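As a rough illustration of this preprocessing (variable names and shapes are hypothetical, not the repo's actual schema), constant building characteristics can be broadcast along the time axis and stacked with genuinely time-varying inputs such as weather:

```python
import numpy as np

T = 24 * 7                               # one week at hourly resolution (hypothetical)
surface = 120.0                          # constant building characteristic
occupancy = np.random.rand(T)            # time-varying usage signal
outdoor_temp = 10 + 5 * np.sin(np.arange(T) * 2 * np.pi / 24)  # toy weather signal

# Broadcast the constant along time, then stack everything into a
# (T, d_input) array, the per-sample format a sequence model expects.
features = np.stack([np.full(T, surface), occupancy, outdoor_temp], axis=-1)
print(features.shape)                    # (168, 3)
```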

Adaptations for time series

In order to perform well on time series, a few adjustments had to be made:

  • The embedding layer is replaced by a generic linear layer;
  • The original positional encodings are removed. A "regular" version, which better matches the day/night patterns of the input sequences, can be used instead;
  • A window is applied to the attention map to limit backward attention and focus on short-term patterns.
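A sketch of the third point, under the assumption that the window limits how far back each step may attend (the window size and mask convention here are ours, not necessarily the repo's):

```python
import numpy as np

def window_mask(seq_len, window):
    """Boolean mask: position i may attend to j only if i - window < j <= i."""
    i = np.arange(seq_len)[:, None]
    j = np.arange(seq_len)[None, :]
    return (j <= i) & (j > i - window)

mask = window_mask(6, window=3)
# Before the softmax, disallowed scores would be set to -inf so their
# attention weight becomes exactly zero:
#   scores = np.where(mask, scores, -np.inf)
print(mask.astype(int))
```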

Installation

All required packages are listed in requirements.txt, and the project expects Python 3.7. Note that you may have to install PyTorch manually if you are not using pip on a Debian distribution: head over to the PyTorch installation page. Here are a few lines to get started with pip and virtualenv:

$ apt-get install python3.7
$ pip3 install --upgrade --user pip virtualenv
$ virtualenv -p python3.7 .env
$ . .env/bin/activate
(.env) $ pip install -r requirements.txt

Usage

Downloading the dataset

The dataset is not included in this repo and must be downloaded manually. It comprises two files: dataset.npz contains all input and output values, and labels.json is a detailed list of the variables. Please refer to #2 for more information.
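A hedged loading sketch (the array keys inside dataset.npz and the shapes below are assumptions; check labels.json and issue #2 for the real layout). Tiny placeholder files are generated first so the snippet is self-contained:

```python
import json
import numpy as np

# Create placeholder files standing in for the real download
# (keys "x"/"y" and all shapes are hypothetical).
np.savez("dataset.npz", x=np.zeros((4, 168, 3)), y=np.zeros((4, 168, 1)))
with open("labels.json", "w") as f:
    json.dump(["surface", "occupancy", "outdoor_temp"], f)

# Loading mirrors what a training script would do.
data = np.load("dataset.npz")
with open("labels.json") as f:
    labels = json.load(f)

print(sorted(data.files), data["x"].shape, len(labels))
```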

Running training script

Using Jupyter, run the default training.ipynb notebook. All adjustable parameters can be found in the second cell. Be careful with BATCH_SIZE, as it is used to parallelize head and time-chunk computations.

Outside usage

The Transformer class can be used out of the box; see the docs for more info.

from tst import Transformer

net = Transformer(d_input, d_model, d_output, q, v, h, N, TIME_CHUNK, pe)

Building the docs

To build the docs:

(.env) $ cd docs && make html