
patrick-kidger / Deep-Signature-Transforms

Licence: Apache-2.0
Code for "Deep Signature Transforms" (NeurIPS 2019)

Programming Languages

Jupyter Notebook
11667 projects
Python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Deep-Signature-Transforms

signatory
Differentiable computations of the signature and logsignature transforms, on both CPU and GPU. (ICLR 2021)
Stars: ✭ 153 (+135.38%)
Mutual labels:  signature, signatures, rough-paths
AutomatedOutlookSignature
PowerShell script to automate the creation of Outlook signatures using Active Directory attributes.
Stars: ✭ 36 (-44.62%)
Mutual labels:  signature, signatures
neuralRDEs
Code for: "Neural Rough Differential Equations for Long Time Series", (ICML 2021)
Stars: ✭ 102 (+56.92%)
Mutual labels:  signatures, rough-paths
Sig
The most powerful and customizable binary pattern scanner
Stars: ✭ 131 (+101.54%)
Mutual labels:  signature, signatures
tscompdata
Time series competition data
Stars: ✭ 17 (-73.85%)
Mutual labels:  time-series
Robust-Deep-Learning-Pipeline
Deep Convolutional Bidirectional LSTM for Complex Activity Recognition with Missing Data. Human Activity Recognition Challenge. Springer SIST (2020)
Stars: ✭ 20 (-69.23%)
Mutual labels:  time-series
KFAS
KFAS: R Package for Exponential Family State Space Models
Stars: ✭ 50 (-23.08%)
Mutual labels:  time-series
Occupancy-Detection
Occupancy detection of an office room from light, temperature, humidity and CO2 measurements
Stars: ✭ 18 (-72.31%)
Mutual labels:  time-series
ml monorepo
super-monorepo for machine learning and algorithmic trading
Stars: ✭ 43 (-33.85%)
Mutual labels:  time-series
wasmsign2
PoC implementation of the WebAssembly Modules Signatures proposal.
Stars: ✭ 18 (-72.31%)
Mutual labels:  signatures
python-makefun
Dynamically create python functions with a proper signature.
Stars: ✭ 62 (-4.62%)
Mutual labels:  signature
SolRsaVerify
Solidity RSA Sha256 Pkcs1 Verification
Stars: ✭ 45 (-30.77%)
Mutual labels:  signature
rsign2
A command-line tool to sign files and verify signatures in pure Rust.
Stars: ✭ 102 (+56.92%)
Mutual labels:  signatures
support resistance line
A well-tuned algorithm to generate & draw support/resistance line on time series. 根据时间序列自动生成支撑线压力线
Stars: ✭ 53 (-18.46%)
Mutual labels:  time-series
yogcrypt
A fast, general purpose crypto library in pure Rust.
Stars: ✭ 18 (-72.31%)
Mutual labels:  signature
svg-time-series
SVG time-series charting library
Stars: ✭ 18 (-72.31%)
Mutual labels:  time-series
sig
Validate Method Arguments & Results in Ruby
Stars: ✭ 54 (-16.92%)
Mutual labels:  signatures
SMC.jl
Sequential Monte Carlo algorithm for approximation of posterior distributions.
Stars: ✭ 53 (-18.46%)
Mutual labels:  time-series
awesome-time-series
Resources for working with time series and sequence data
Stars: ✭ 178 (+173.85%)
Mutual labels:  time-series
pairs trading cryptocurrencies strategy catalyst
Pairs trading strategy example based on Catalyst
Stars: ✭ 34 (-47.69%)
Mutual labels:  time-series

Deep Signature Transforms

Using the signature transform as a pooling layer in a neural network.

This is the code for the paper "Deep Signature Transforms" (Bonnier, Kidger, Perez Arribas, Salvi and Lyons, 2019).

Look at Signatory for a PyTorch implementation of the signature transform.

Overview

If you're coming at this already knowing something about neural networks, then the idea is that the 'signature transform' is a transformation that does a particularly good job of extracting features from streams of data, so it's a natural thing to try to build into our neural network models.

If you're coming at this already knowing something about signatures, then you probably know that they've previously only been used as a feature transformation, on top of which a model is built. But it is actually possible to backpropagate through the signature transform, so as long as you design your model correctly (it has to be 'stream-preserving'; see the paper), then it actually makes sense to embed the signature within a neural network. Learning a nonlinearity before the signature transform provides a compact way to select which terms in the signature (of the original path) are useful for the given dataset.
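As a toy illustration of this idea (not the code from the paper), the sketch below applies the same small linear map at every timestep of a stream — a stream-preserving operation, since the output is still a stream — and then pools it with a depth-2 signature, producing a fixed-size feature vector. The weights here are random stand-ins for learned parameters, and the `sig2` helper is a minimal hand-rolled signature; a real model would use a full signature library such as Signatory or iisignature.

```python
import numpy as np

def sig2(stream):
    """Depth-2 signature of a piecewise-linear path; stream has shape (length, dim)."""
    inc = np.diff(stream, axis=0)          # increments of each linear segment
    dim = stream.shape[1]
    level1 = inc.sum(axis=0)               # first level: total displacement
    level2 = np.zeros((dim, dim))
    running = np.zeros(dim)
    for d in inc:                          # Chen's identity, segment by segment
        level2 += np.outer(running, d) + 0.5 * np.outer(d, d)
        running += d
    return np.concatenate([level1, level2.ravel()])

rng = np.random.default_rng(0)
stream = rng.normal(size=(20, 3)).cumsum(axis=0)   # toy stream: 20 steps, 3 channels

# A 'stream-preserving' layer: the *same* map applied at every timestep,
# so a stream goes in and a stream comes out. (Random weights stand in
# for parameters that would be learned by backpropagation.)
W = rng.normal(size=(2, 3))
hidden = stream @ W.T                              # shape (20, 2): still a stream

features = sig2(hidden)                            # fixed-size: 2 + 2*2 = 6 entries
print(features.shape)                              # (6,)
```

Because the signature depends only on the path traced out, not on how finely it is sampled, a straight line gives the same features whether described by 2 points or 10.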

What are signatures?

The signature of a stream of data is essentially a collection of statistics about that stream of data. This collection of statistics does such a good job of capturing the information about the stream of data that it actually determines the stream of data uniquely. (Up to something called 'tree-like equivalence' anyway, which is really just a technicality. It's an equivalence relation that matters about as much as two functions being equal almost everywhere. That is to say, not much at all.) The signature transform is a particularly attractive tool in machine learning because it is what we call a 'universal nonlinearity': it is sufficiently rich that it captures every possible nonlinear function of the original stream of data, in the sense that any continuous function of a stream can be approximated arbitrarily well by a linear function of its signature. Now for various reasons this is a mathematical idealisation not borne out in practice (which is why we put them in a neural network and don't just use a simple linear model), but they still work very well!
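The "linear on the signature" claim can be checked numerically in a small case. Below, a genuinely nonlinear function of a 2-d stream — the product of its two total increments — is fitted by plain linear regression on depth-2 signature features and recovered essentially exactly (it equals S¹·S² = S¹² + S²¹ by the shuffle identity). The `sig2` helper is a minimal depth-2 implementation written for this example, not a library function.

```python
import numpy as np

def sig2(stream):
    """Depth-2 signature of a piecewise-linear path; stream has shape (length, dim)."""
    inc = np.diff(stream, axis=0)          # increments of each linear segment
    dim = stream.shape[1]
    level1 = inc.sum(axis=0)               # first level: total displacement
    level2 = np.zeros((dim, dim))
    running = np.zeros(dim)
    for d in inc:                          # Chen's identity, segment by segment
        level2 += np.outer(running, d) + 0.5 * np.outer(d, d)
        running += d
    return np.concatenate([level1, level2.ravel()])

rng = np.random.default_rng(0)
paths = rng.normal(size=(200, 10, 2)).cumsum(axis=1)   # 200 random 2-d streams

X = np.stack([sig2(p) for p in paths])                 # signature features, shape (200, 6)
# A nonlinear function of each stream: product of the two total increments.
y = np.array([(p[-1, 0] - p[0, 0]) * (p[-1, 1] - p[0, 1]) for p in paths])

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.abs(X @ beta - y).max())                      # ~0: linear in the signature
```

Higher-degree functions of the stream would need higher-depth signature terms, which is exactly why truncating the signature at a practical depth (and compensating with a learned nonlinearity) matters.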

Directory layout and reproducibility

The src directory contains the scripts for our experiments. Reproducibility should be easy: just run the .ipynb files.

(The packages directory just contains some separate packages that were put together to support this project.)

Dependencies

Python 3.7 was used. Virtual environments and packages were managed with Miniconda. The following external packages were used, and may be installed via pip3 install -r requirements.txt.

fbm==0.2.0 for generating fractional Brownian motion.

gym==0.12.1

pytorch-ignite==0.1.2 is an extension to PyTorch.

iisignature==0.23 for calculating signatures. (iisignature was used because Signatory had not yet been developed.)

jupyter==1.0.0

matplotlib==2.2.4

pandas==0.24.2

torch==1.0.1

scikit-learn==0.20.3

sdepy==1.0.1 for simulating solutions to stochastic differential equations.

tqdm==4.31.1 for progress bars.
