
mblondel / Soft Dtw

License: BSD-2-Clause
Python implementation of soft-DTW.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to Soft Dtw

Sktime Dl
sktime companion package for deep learning based on TensorFlow
Stars: ✭ 379 (+26.33%)
Mutual labels:  time-series, neural-networks
Gluon Ts
Probabilistic time series modeling in Python
Stars: ✭ 2,373 (+691%)
Mutual labels:  time-series, neural-networks
Awesome Ai Ml Dl
Awesome Artificial Intelligence, Machine Learning and Deep Learning as we learn it. Study notes and a curated list of awesome resources of such topics.
Stars: ✭ 831 (+177%)
Mutual labels:  time-series, neural-networks
Lstm anomaly thesis
Anomaly detection for temporal data using LSTMs
Stars: ✭ 178 (-40.67%)
Mutual labels:  time-series, neural-networks
Mckinsey Smartcities Traffic Prediction
Adventure into using multi attention recurrent neural networks for time-series (city traffic) for the 2017-11-18 McKinsey IronMan (24h non-stop) prediction challenge
Stars: ✭ 49 (-83.67%)
Mutual labels:  time-series, neural-networks
Unsupervisedscalablerepresentationlearningtimeseries
Unsupervised Scalable Representation Learning for Multivariate Time Series: Experiments
Stars: ✭ 205 (-31.67%)
Mutual labels:  time-series, neural-networks
Sharpneat
SharpNEAT - Evolution of Neural Networks. A C# .NET Framework.
Stars: ✭ 273 (-9%)
Mutual labels:  neural-networks
Crate
CrateDB is a distributed SQL database that makes it simple to store and analyze massive amounts of data in real-time.
Stars: ✭ 3,254 (+984.67%)
Mutual labels:  time-series
Flux.jl
Relax! Flux is the ML library that doesn't make you tensor
Stars: ✭ 3,358 (+1019.33%)
Mutual labels:  neural-networks
Pycox
Survival analysis with PyTorch
Stars: ✭ 269 (-10.33%)
Mutual labels:  neural-networks
Neuralpde.jl
Physics-Informed Neural Networks (PINN) and Deep BSDE Solvers of Differential Equations for Scientific Machine Learning (SciML) accelerated simulation
Stars: ✭ 295 (-1.67%)
Mutual labels:  neural-networks
Mlpractical
Machine Learning Practical course repository
Stars: ✭ 295 (-1.67%)
Mutual labels:  neural-networks
Uncertainty Baselines
High-quality implementations of standard and SOTA methods on a variety of tasks.
Stars: ✭ 278 (-7.33%)
Mutual labels:  neural-networks
Time Series Deep Learning State Of The Art
Scientific time series and deep learning state of the art
Stars: ✭ 277 (-7.67%)
Mutual labels:  time-series
Deep Learning Papers
Papers about deep learning ordered by task, date. Current state-of-the-art papers are labelled.
Stars: ✭ 3,054 (+918%)
Mutual labels:  neural-networks
Rlgraph
RLgraph: Modular computation graphs for deep reinforcement learning
Stars: ✭ 272 (-9.33%)
Mutual labels:  neural-networks
Nightingale
💡 A Distributed and High-Performance Monitoring System. Prometheus enterprise edition
Stars: ✭ 4,003 (+1234.33%)
Mutual labels:  time-series
Moniel
Interactive Notation for Computational Graphs
Stars: ✭ 272 (-9.33%)
Mutual labels:  neural-networks
Rust Autograd
Tensors and differentiable operations (like TensorFlow) in Rust
Stars: ✭ 278 (-7.33%)
Mutual labels:  neural-networks
Komputation
Komputation is a neural network framework for the Java Virtual Machine written in Kotlin and CUDA C.
Stars: ✭ 295 (-1.67%)
Mutual labels:  neural-networks

.. -*- mode: rst -*-

soft-DTW
========

Python implementation of soft-DTW.

What is it?
-----------

The celebrated dynamic time warping (DTW) [1] defines the discrepancy between two time series, of possibly variable length, as their minimal alignment cost. Although the number of possible alignments is exponential in the lengths of the two time series, [1] showed that DTW can be computed in only quadratic time using dynamic programming.
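The quadratic-time dynamic program can be sketched in a few lines of plain NumPy. This is a minimal illustration of classic DTW, not part of this package:

.. code-block:: python

    import numpy as np

    def dtw(D):
        """Classic DTW: minimal alignment cost over an m x n pairwise cost matrix D."""
        m, n = D.shape
        R = np.full((m + 1, n + 1), np.inf)  # R[i, j]: best cost aligning prefixes of length i and j
        R[0, 0] = 0.0
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                # Each cell extends the cheapest of the three allowed predecessor moves.
                R[i, j] = D[i - 1, j - 1] + min(R[i - 1, j], R[i, j - 1], R[i - 1, j - 1])
        return R[m, n]

    # Two short 1-d series compared with squared differences as the ground cost.
    x = np.array([0.0, 1.0, 2.0])
    y = np.array([0.0, 2.0])
    D = (x[:, None] - y[None, :]) ** 2
    print(dtw(D))  # 1.0: align 0<->0, then both 1 and 2 against 2

Each of the m * n cells is filled once, which is where the quadratic bound comes from.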

Soft-DTW [2] proposes to replace this minimum with a soft minimum. Like the original DTW, soft-DTW can be computed in quadratic time using dynamic programming. The main advantage of soft-DTW, however, is that it is differentiable everywhere and that its gradient can also be computed in quadratic time. This makes it possible to use soft-DTW for time-series averaging, or as a loss function between a ground-truth time series and a time series predicted by a neural network trained end-to-end with backpropagation.
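To illustrate how the soft minimum changes the recursion, here is a hedged pure-NumPy sketch of the forward pass; it is independent of this package's Cython implementation:

.. code-block:: python

    import numpy as np

    def softmin(a, b, c, gamma):
        """Soft minimum: -gamma * log(exp(-a/gamma) + exp(-b/gamma) + exp(-c/gamma))."""
        z = -np.array([a, b, c]) / gamma
        zmax = z.max()  # log-sum-exp trick for numerical stability
        return -gamma * (zmax + np.log(np.exp(z - zmax).sum()))

    def soft_dtw(D, gamma=1.0):
        """Soft-DTW: the same quadratic DP as DTW, with min replaced by softmin."""
        m, n = D.shape
        R = np.full((m + 1, n + 1), np.inf)
        R[0, 0] = 0.0
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                R[i, j] = D[i - 1, j - 1] + softmin(
                    R[i - 1, j], R[i, j - 1], R[i - 1, j - 1], gamma)
        return R[m, n]

    x = np.array([0.0, 1.0, 2.0])
    y = np.array([0.0, 2.0])
    D = (x[:, None] - y[None, :]) ** 2
    print(soft_dtw(D, gamma=0.01))  # close to the hard-DTW value of 1.0
    print(soft_dtw(D, gamma=1.0))   # smoother, and always <= the hard minimum

Since the soft minimum lower-bounds the hard minimum and the recursion is monotone, soft-DTW never exceeds DTW and converges to it as gamma goes to 0.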

Supported features
------------------

* soft-DTW (forward pass) and gradient (backward pass) computations, implemented in Cython for speed
* barycenters (time series averaging)
* dataset loader for the `UCR archive <http://www.cs.ucr.edu/~eamonn/time_series_data/>`_
* `Chainer <http://chainer.org>`_ function

Example
-------

.. code-block:: python

    from sdtw import SoftDTW
    from sdtw.distance import SquaredEuclidean

    # Time series 1: numpy array, shape = [m, d] where m = length and d = dim
    X = ...
    # Time series 2: numpy array, shape = [n, d] where n = length and d = dim
    Y = ...

    # D can also be an arbitrary distance matrix: numpy array, shape [m, n]
    D = SquaredEuclidean(X, Y)
    sdtw = SoftDTW(D, gamma=1.0)
    # soft-DTW discrepancy, approaches DTW as gamma -> 0
    value = sdtw.compute()
    # gradient w.r.t. D, shape = [m, n], which is also the expected alignment matrix
    E = sdtw.grad()
    # gradient w.r.t. X, shape = [m, d]
    G = D.jacobian_product(E)
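For the squared-Euclidean ground cost used above, the chain rule behind the jacobian product can be sketched in plain NumPy. This is a hypothetical re-derivation for illustration; the function name below is not part of this package's API:

.. code-block:: python

    import numpy as np

    # With D[i, j] = ||x_i - y_j||^2, we have dD[i, j]/dx_i = 2 * (x_i - y_j).
    # Combining with the alignment matrix E (the gradient w.r.t. D) gives
    # G[i] = sum_j E[i, j] * 2 * (x_i - y_j), vectorized below.
    def sq_euclidean_jacobian_product(X, Y, E):
        return 2 * (E.sum(axis=1)[:, None] * X - E @ Y)

    X = np.array([[0.0], [1.0], [2.0]])
    Y = np.array([[0.0], [2.0]])
    E = np.array([[0.8, 0.2],
                  [0.5, 0.5],
                  [0.1, 0.9]])  # a made-up soft alignment, rows of X vs rows of Y
    G = sq_euclidean_jacobian_product(X, Y, E)
    print(G)  # shape (3, 1): gradient of the discrepancy w.r.t. each x_i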

Installation
------------

Binary packages are not available.

This project can be installed from its git repository. It is assumed that you have a working C compiler.

  1. Obtain the sources::

       git clone https://github.com/mblondel/soft-dtw.git

     or, if git is unavailable, download the sources as a `ZIP archive from GitHub <https://github.com/mblondel/soft-dtw/archive/master.zip>`_.

  2. Install the dependencies, via pip::

       pip install numpy scipy scikit-learn cython nose

     or via conda::

       conda install numpy scipy scikit-learn cython nose

  3. Build and install soft-dtw::

       cd soft-dtw
       make cython
       python setup.py build
       sudo python setup.py install

References
----------

.. [1] Hiroaki Sakoe, Seibi Chiba. Dynamic programming algorithm optimization for spoken word recognition. IEEE Transactions on Acoustics, Speech, and Signal Processing, 1978.

.. [2] Marco Cuturi, Mathieu Blondel. Soft-DTW: a Differentiable Loss Function for Time-Series. In: Proc. of ICML 2017. [`PDF <https://arxiv.org/abs/1703.01541>`_]

Author
------

* Mathieu Blondel, 2017