hfawaz / Bigdata18

Licence: gpl-3.0
Transfer learning for time series classification


Transfer learning for time series classification

This is the companion repository for our paper, "Transfer learning for time series classification", accepted as a regular paper at the IEEE International Conference on Big Data 2018 and also available on ArXiv.

Architecture

(Figure: the FCN architecture)

Source code

The software was developed using Python 3.5, and the models were trained on a cluster of more than 60 GPUs. You will need the UCR archive to re-run the experiments of the paper.
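The repository does not fix a loader here, but UCR archive splits classically ship as delimited text files with the class label in the first column and the series values in the remaining ones. Assuming that layout, a minimal stdlib-only loader might look like this (file names and paths are illustrative):

```python
import csv

def load_ucr_split(path, delimiter=","):
    """Parse a UCR-style file: one time series per row, label in column 0.

    Returns (labels, series), where series is a list of float lists.
    """
    labels, series = [], []
    with open(path, newline="") as f:
        for row in csv.reader(f, delimiter=delimiter):
            if not row:
                continue  # skip blank lines
            labels.append(row[0])
            series.append([float(v) for v in row[1:]])
    return labels, series
```

For example, `load_ucr_split("UCR/Adiac/Adiac_TRAIN")` would return the training labels and series for that dataset, assuming comma-separated files; newer archive releases use tab-separated `.tsv` files, hence the `delimiter` parameter.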

If you encounter problems with Cython, you can re-generate the C files using the build-cython.sh script.

To train the network from scratch, launch: python3 main.py train_fcn_scratch

To apply transfer learning between each pair of datasets, launch: python3 main.py transfer_learning

To visualize the figures in the paper, launch: python3 main.py visualize_transfer_learning

To generate the inter-dataset similarity matrix, launch: python3 main.py compare_datasets
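The four commands above suggest that main.py simply dispatches on its first command-line argument. The sketch below illustrates that pattern only; the function bodies and messages are placeholders, not the repository's actual code:

```python
import sys

def train_fcn_scratch():
    print("training the FCN from scratch on each dataset")

def transfer_learning():
    print("fine-tuning between each pair of datasets")

def visualize_transfer_learning():
    print("generating the paper's figures")

def compare_datasets():
    print("computing the inter-dataset similarity matrix")

# Map each CLI keyword to its experiment routine.
COMMANDS = {
    "train_fcn_scratch": train_fcn_scratch,
    "transfer_learning": transfer_learning,
    "visualize_transfer_learning": visualize_transfer_learning,
    "compare_datasets": compare_datasets,
}

def main(argv):
    try:
        COMMANDS[argv[1]]()
    except (IndexError, KeyError):
        print("usage: python3 main.py {%s}" % "|".join(COMMANDS))

if __name__ == "__main__":
    main(sys.argv)
```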

Pre-trained and fine-tuned models

You can download from the companion web page all pre-trained and fine-tuned models needed to reproduce the experiments. Feel free to fine-tune them on your own datasets!
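The paper's transfer recipe is: train on a source dataset, replace the output layer, then continue training on the target. The toy numpy sketch below illustrates only that idea with a frozen stand-in feature extractor and a freshly trained softmax head; it is not the repository's FCN, and all names and data here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_head(features, labels, n_classes, steps=200, lr=0.5):
    """Fit a new softmax output layer on frozen features (cross-entropy, GD)."""
    n, d = features.shape
    W = np.zeros((d, n_classes))
    onehot = np.eye(n_classes)[labels]
    for _ in range(steps):
        p = softmax(features @ W)
        W -= lr * features.T @ (p - onehot) / n  # cross-entropy gradient
    return W

# "Pre-trained" feature extractor: a fixed random projection standing in
# for the convolutional layers learned on the source dataset.
W_feat = rng.normal(size=(8, 4))

def extract(x):
    return np.tanh(x @ W_feat)

# Toy target dataset: two noisy classes with different means.
x = np.vstack([rng.normal(-1.0, 0.3, size=(30, 8)),
               rng.normal(+1.0, 0.3, size=(30, 8))])
y = np.array([0] * 30 + [1] * 30)

# Transfer step: keep the extractor frozen, train only the new output layer.
W_out = train_head(extract(x), y, n_classes=2)
pred = softmax(extract(x) @ W_out).argmax(axis=1)
accuracy = (pred == y).mean()
```

In the paper the whole network is fine-tuned rather than frozen; freezing the extractor here just keeps the illustration short.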

Prerequisites

All required Python packages are listed in the pip-requirements.txt file and can be installed simply using the pip command.

Results

You can download here the accuracy variation matrix which corresponds to the raw results of the transfer matrix in the paper.

You can download here the raw results for the accuracy matrix instead of the variation.

You can download here the result of applying the nearest-neighbor algorithm to the inter-dataset similarity matrix. For each dataset in the archive, you will find the 84 most similar datasets. The steps for computing the similarity matrix are presented in Algorithm 1 in our paper.
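The similarity computation in the paper compares time series with Dynamic Time Warping. As a reference point, a generic textbook DTW distance can be written in a few lines; this is not the repository's optimized Cython version, just a minimal sketch:

```python
def dtw(a, b):
    """Dynamic Time Warping distance between two sequences of numbers."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = best cumulative cost of aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = (a[i - 1] - b[j - 1]) ** 2
            cost[i][j] = d + min(cost[i - 1][j],      # step in a only
                                 cost[i][j - 1],      # step in b only
                                 cost[i - 1][j - 1])  # step in both
    return cost[n][m] ** 0.5

# A purely time-warped copy of a series stays close under DTW:
dtw([0, 1, 2, 1, 0], [0, 0, 1, 2, 2, 1, 0])  # → 0.0, despite different lengths
```

In the paper the distance is computed between per-class prototypes of each dataset rather than between raw series, which keeps the pairwise comparison of 85 datasets tractable.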

Accuracy variation matrix

(Figure: accuracy variation matrix)

Generalization with and without transfer learning

(Figures for three transfer pairs: 50words → FISH, FordA → wafer, Adiac → ShapesAll)

Model's accuracy with respect to the source dataset's similarity

(Figures for three target datasets: Herring, BeetleFly, WormsTwoClass)

Reference

If you re-use this work, please cite:

@InProceedings{IsmailFawaz2018transfer,
  author    = {Ismail Fawaz, Hassan and Forestier, Germain and Weber, Jonathan and Idoumghar, Lhassane and Muller, Pierre-Alain},
  title     = {Transfer learning for time series classification},
  booktitle = {IEEE International Conference on Big Data},
  pages     = {1367--1376},
  year      = {2018}
}

Acknowledgement

The authors would like to thank NVIDIA Corporation for the GPU Grant and the Mésocentre of Strasbourg for providing access to the GPU cluster.
