
White-Link / Unsupervisedscalablerepresentationlearningtimeseries

Licence: apache-2.0
Unsupervised Scalable Representation Learning for Multivariate Time Series: Experiments

Projects that are alternatives to or similar to Unsupervisedscalablerepresentationlearningtimeseries

Awesome Ai Ml Dl
Awesome Artificial Intelligence, Machine Learning and Deep Learning as we learn it. Study notes and a curated list of awesome resources of such topics.
Stars: ✭ 831 (+305.37%)
Mutual labels:  jupyter-notebook, time-series, neural-networks
Mckinsey Smartcities Traffic Prediction
Adventure into using multi-attention recurrent neural networks for time series (city traffic) for the 2017-11-18 McKinsey IronMan (24h non-stop) prediction challenge
Stars: ✭ 49 (-76.1%)
Mutual labels:  jupyter-notebook, time-series, neural-networks
Lstm anomaly thesis
Anomaly detection for temporal data using LSTMs
Stars: ✭ 178 (-13.17%)
Mutual labels:  jupyter-notebook, time-series, neural-networks
Shape Detection
🟣 Object detection of abstract shapes with neural networks
Stars: ✭ 170 (-17.07%)
Mutual labels:  jupyter-notebook, neural-networks
Asap
ASAP: Prioritizing Attention via Time Series Smoothing
Stars: ✭ 151 (-26.34%)
Mutual labels:  jupyter-notebook, time-series
Motion Sense
MotionSense Dataset for Human Activity and Attribute Recognition (time-series data generated by smartphone sensors: accelerometer and gyroscope)
Stars: ✭ 159 (-22.44%)
Mutual labels:  jupyter-notebook, time-series
Forecasting
Time Series Forecasting Best Practices & Examples
Stars: ✭ 2,123 (+935.61%)
Mutual labels:  jupyter-notebook, time-series
Introduction To Time Series Forecasting Python
Introduction to time series preprocessing and forecasting in Python using AR, MA, ARMA, ARIMA, SARIMA and Prophet model with forecast evaluation.
Stars: ✭ 173 (-15.61%)
Mutual labels:  jupyter-notebook, time-series
Timesynth
A Multipurpose Library for Synthetic Time Series Generation in Python
Stars: ✭ 170 (-17.07%)
Mutual labels:  jupyter-notebook, time-series
Attentionn
All about attention in neural networks. Soft attention, attention maps, local and global attention and multi-head attention.
Stars: ✭ 175 (-14.63%)
Mutual labels:  jupyter-notebook, neural-networks
Deep Learning Notes
My personal notes, presentations, and notebooks on everything Deep Learning.
Stars: ✭ 191 (-6.83%)
Mutual labels:  jupyter-notebook, neural-networks
Hands On Machine Learning With Scikit Learn Keras And Tensorflow
Notes & exercise solutions of Part I from the book: "Hands-On ML with Scikit-Learn, Keras & TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems" by Aurelien Geron
Stars: ✭ 151 (-26.34%)
Mutual labels:  jupyter-notebook, neural-networks
Ml Workspace
🛠 All-in-one web-based IDE specialized for machine learning and data science.
Stars: ✭ 2,337 (+1040%)
Mutual labels:  jupyter-notebook, neural-networks
Fixy
Our goal is to build an open-source spelling assistant/checker that solves many different problems in the Turkish NLP literature at once, proposes novel approaches, and addresses the shortcomings of prior work. It corrects spelling errors in users' texts with a deep learning approach and, by also performing semantic analysis on the texts, detects and fixes errors that arise in that context.
Stars: ✭ 165 (-19.51%)
Mutual labels:  jupyter-notebook, neural-networks
Gluon Ts
Probabilistic time series modeling in Python
Stars: ✭ 2,373 (+1057.56%)
Mutual labels:  time-series, neural-networks
Deep Math Machine Learning.ai
A blog about machine learning and deep learning algorithms, the math behind them, and machine learning algorithms written from scratch.
Stars: ✭ 173 (-15.61%)
Mutual labels:  jupyter-notebook, neural-networks
Choochoo
Training Diary
Stars: ✭ 186 (-9.27%)
Mutual labels:  jupyter-notebook, time-series
Coursera Deep Learning Specialization
Notes, programming assignments and quizzes from all courses within the Coursera Deep Learning specialization offered by deeplearning.ai: (i) Neural Networks and Deep Learning; (ii) Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization; (iii) Structuring Machine Learning Projects; (iv) Convolutional Neural Networks; (v) Sequence Models
Stars: ✭ 188 (-8.29%)
Mutual labels:  jupyter-notebook, neural-networks
Anomaly detection tuto
Anomaly detection tutorial on univariate time series with an auto-encoder
Stars: ✭ 144 (-29.76%)
Mutual labels:  jupyter-notebook, time-series
Vde
Variational Autoencoder for Dimensionality Reduction of Time-Series
Stars: ✭ 148 (-27.8%)
Mutual labels:  jupyter-notebook, time-series

Unsupervised Scalable Representation Learning for Multivariate Time Series -- Code

This repository contains the code for the experiments in "Unsupervised Scalable Representation Learning for Multivariate Time Series" (Jean-Yves Franceschi, Aymeric Dieuleveut and Martin Jaggi) [NeurIPS] [arXiv] [HAL], presented at NeurIPS 2019. An earlier version of this work was presented at the 2nd LLD workshop at ICLR 2019.
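The core of the method is an unsupervised triplet loss on encoder representations: a reference subseries should be close to a positive subseries taken from the same series and far from negatives sampled from other series. A minimal numpy sketch of that loss (plain vectors stand in for encoder outputs; the function name is illustrative, not the repository's API):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def triplet_loss(ref, pos, negs):
    # -log sigma(ref . pos) - sum_k log sigma(-(ref . neg_k)),
    # computed on representations of the reference, positive
    # and negative subseries.
    loss = -np.log(sigmoid(ref @ pos))
    for neg in negs:
        loss -= np.log(sigmoid(-(ref @ neg)))
    return loss

# Toy check with random 8-dimensional "representations".
rng = np.random.default_rng(0)
ref, pos = rng.normal(size=8), rng.normal(size=8)
negs = rng.normal(size=(5, 8))
print(triplet_loss(ref, pos, negs))
```

Minimizing this pushes the reference and positive representations together and the negatives apart, which is what lets the encoder train without any labels.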

Requirements

Experiments were run with Python 3.6 and the following package versions:

  • Numpy (numpy) v1.15.2;
  • Matplotlib (matplotlib) v3.0.0;
  • Orange (Orange) v3.18.0;
  • Pandas (pandas) v0.23.4;
  • python-weka-wrapper3 v0.1.6 for multivariate time series (requires Oracle JDK 8 or OpenJDK 8);
  • PyTorch (torch) v0.4.1 with CUDA 9.0;
  • Scikit-learn (sklearn) v0.20.0;
  • Scipy (scipy) v1.1.0.

This code should execute correctly with updated versions of these packages.
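For reproducibility, the pins above can be collected in a requirements file. This is a sketch: the PyPI distribution name for Orange is assumed to be Orange3, and a CUDA 9.0 build of torch v0.4.1 requires a platform-specific install rather than a plain pin.

```text
numpy==1.15.2
matplotlib==3.0.0
Orange3==3.18.0
pandas==0.23.4
python-weka-wrapper3==0.1.6
torch==0.4.1
scikit-learn==0.20.0
scipy==1.1.0
```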

Datasets

The datasets manipulated in this code can be downloaded from the following locations:

Files

Core

  • losses folder: implements the triplet loss, both for training sets in which all time series have the same length and for training sets with time series of unequal lengths;
  • networks folder: implements the encoder and its building blocks (dilated convolutions, causal CNN);
  • scikit_wrappers.py file: implements classes inheriting from Scikit-learn classifiers that wrap an encoder and an SVM classifier;
  • utils.py file: implements custom PyTorch datasets;
  • default_hyperparameters.json file: example of a JSON file containing the hyperparameters of an (encoder, classifier) pair.
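Such a file maps hyperparameter names to values for the (encoder, classifier) pair. A hypothetical fragment, with key names and values that are illustrative assumptions only (see default_hyperparameters.json for the actual ones):

```json
{
  "batch_size": 10,
  "channels": 40,
  "depth": 10,
  "out_channels": 160,
  "nb_steps": 2000,
  "lr": 0.001
}
```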

Tests

  • ucr.py file: handles learning on the UCR archive (see usage below);
  • uea.py file: handles learning on the UEA archive (see usage below);
  • transfer_ucr.py file: handles transfer learning on the UCR archive (see usage below);
  • combine_ucr.py file: combines learned (encoder, classifier) pairs for the UCR archive (see usage below);
  • combine_uea.py file: combines learned (encoder, classifier) pairs for the UEA archive (see usage below);
  • sparse_labeling.ipynb file: Jupyter notebook reproducing the results of training an SVM on our representations for different numbers of available labels;
  • HouseholdPowerConsumption.ipynb file: Jupyter notebook containing experiments on the Individual Household Electric Power Consumption dataset.

Results and Visualization

  • results_ucr.csv file: CSV file compiling all results (with those of concurrent methods) on the UCR archive;
  • results_uea.csv file: CSV file compiling all results (with those of concurrent methods) on the UEA archive;
  • results_sparse_labeling_TwoPatterns.csv file: CSV file compiling the means and standard deviations over five runs of training an SVM on our representations and on the ResNet architecture described in the paper, for different numbers of available labels;
  • cd.ipynb file: Jupyter notebook containing the code to produce a critical difference diagram;
  • stat_plots.ipynb file: Jupyter notebook containing the code to produce boxplots and histograms on the results for the UCR archive;
  • models folder: contains a pretrained model for the UCR dataset CricketX.

Usage

Training on the UCR and UEA archives

To train a model on the Mallat dataset from the UCR archive:

python3 ucr.py --dataset Mallat --path path/to/Mallat/folder/ --save_path /path/to/save/models --hyper default_hyperparameters.json [--cuda --gpu 0]

Adding the --load option loads an already-trained model from the specified save path instead of training a new one. Training on the UEA archive with uea.py works the same way.

Further Documentation

See the code documentation for more details. ucr.py, uea.py, transfer_ucr.py, combine_ucr.py and combine_uea.py can be called with the -h option for additional help.

Pretrained Models

Pretrained models are downloadable at https://data.lip6.fr/usrlts/.
