mlds-lab / Interp Net

License: MIT
Interpolation-Prediction Networks for Irregularly Sampled Time Series

Programming Languages

python

Projects that are alternatives to or similar to Interp Net

Predictive Maintenance Using Lstm
Example of Multiple Multivariate Time Series Prediction with LSTM Recurrent Neural Networks in Python with Keras.
Stars: ✭ 352 (+718.6%)
Mutual labels:  timeseries, keras-tensorflow
Kaggle Web Traffic Time Series Forecasting
Solution to Kaggle - Web Traffic Time Series Forecasting
Stars: ✭ 29 (-32.56%)
Mutual labels:  timeseries
React Timeseries Charts
Declarative and modular timeseries charting components for React
Stars: ✭ 690 (+1504.65%)
Mutual labels:  timeseries
Pyts
A Python package for time series classification
Stars: ✭ 895 (+1981.4%)
Mutual labels:  timeseries
Pcp
Performance Co-Pilot
Stars: ✭ 716 (+1565.12%)
Mutual labels:  timeseries
Tf Keras Surgeon
Pruning and other network surgery for trained TF.Keras models.
Stars: ✭ 25 (-41.86%)
Mutual labels:  keras-tensorflow
Kafka Streams Machine Learning Examples
This project contains examples which demonstrate how to deploy analytic models to mission-critical, scalable production environments leveraging Apache Kafka and its Streams API. Models are built with Python, H2O, TensorFlow, Keras, DeepLearning4 and other technologies.
Stars: ✭ 661 (+1437.21%)
Mutual labels:  keras-tensorflow
Unsuprevised seg via cnn
Stars: ✭ 38 (-11.63%)
Mutual labels:  keras-tensorflow
Dncnn
Beyond a Gaussian Denoiser: Residual Learning of Deep CNN for Image Denoising (TIP, 2017)
Stars: ✭ 912 (+2020.93%)
Mutual labels:  keras-tensorflow
Otto
Otto makes machine learning an intuitive, natural language experience. 🏆 Facebook AI Hackathon winner ⭐️ #1 Trending on MadeWithML.com ⭐️ #4 Trending JavaScript Project on GitHub ⭐️ #15 Trending (All Languages) on GitHub
Stars: ✭ 894 (+1979.07%)
Mutual labels:  keras-tensorflow
Node Influx
📈 The InfluxDB Client for Node.js and Browsers
Stars: ✭ 820 (+1806.98%)
Mutual labels:  timeseries
Uplot
📈 A small, fast chart for time series, lines, areas, ohlc & bars
Stars: ✭ 6,808 (+15732.56%)
Mutual labels:  timeseries
Phildb
Timeseries database
Stars: ✭ 25 (-41.86%)
Mutual labels:  timeseries
Go Carbon
Golang implementation of Graphite/Carbon server with classic architecture: Agent -> Cache -> Persister
Stars: ✭ 713 (+1558.14%)
Mutual labels:  timeseries
Densedepth
High Quality Monocular Depth Estimation via Transfer Learning
Stars: ✭ 963 (+2139.53%)
Mutual labels:  keras-tensorflow
Pytorch2keras
PyTorch to Keras model convertor
Stars: ✭ 676 (+1472.09%)
Mutual labels:  keras-tensorflow
Datastream.io
An open-source framework for real-time anomaly detection using Python, ElasticSearch and Kibana
Stars: ✭ 814 (+1793.02%)
Mutual labels:  timeseries
Deep Music Genre Classification
🎵 Using Deep Learning to Categorize Music as Time Progresses Through Spectrogram Analysis
Stars: ✭ 23 (-46.51%)
Mutual labels:  keras-tensorflow
Keras one cycle clr
Keras callbacks for one-cycle training, cyclic learning rate (CLR) training, and learning rate range test.
Stars: ✭ 41 (-4.65%)
Mutual labels:  keras-tensorflow
Nhdrrnet
Keras Implementation of the paper Deep HDR Imaging via A Non-Local Network - TIP 2020
Stars: ✭ 37 (-13.95%)
Mutual labels:  keras-tensorflow

Interpolation-Prediction Networks

In this work, we present a new deep learning architecture for addressing the problem of supervised learning with sparse and irregularly sampled multivariate time series. The architecture is based on a semi-parametric interpolation network followed by a prediction network. The interpolation network allows information to be shared across the dimensions of a multivariate time series during the interpolation stage, while any standard deep learning model can be used for the prediction network.

We use a two-layer interpolation network. The first interpolation layer performs a semi-parametric univariate interpolation for each of the D time series separately, while the second layer merges information across all D time series at each reference time point by taking the correlations among the time series into account.
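As a rough illustration of these two stages, here is a minimal NumPy sketch; it is not the released Keras implementation. The kernel bandwidth alpha and the cross-dimension mixing matrix rho are fixed here, whereas the actual interpolation network learns these parameters and produces additional outputs (e.g. intensity and non-smooth components), and the helper names interpolate_univariate and merge_dimensions are made up for this example.

# Minimal, illustrative NumPy sketch of the two interpolation layers.
import numpy as np

def interpolate_univariate(t_obs, x_obs, ref_points, alpha=1.0):
    # Layer 1: smooth RBF-kernel interpolation of a single dimension
    # onto the shared reference time points.
    dist2 = (ref_points[:, None] - t_obs[None, :]) ** 2
    w = np.exp(-alpha * dist2)                        # kernel weights
    return (w @ x_obs) / np.maximum(w.sum(axis=1), 1e-8)

def merge_dimensions(per_dim_interp, rho):
    # Layer 2: mix the D per-dimension interpolants at each reference
    # point; rho stands in for the learned cross-dimension weights.
    return rho @ per_dim_interp                       # (D, D) @ (D, R) -> (D, R)

refs = np.linspace(0.0, 1.0, 5)                       # R = 5 reference points
series = [(np.array([0.05, 0.4, 0.9]), np.array([1.0, 2.0, 0.5])),   # dim 1
          (np.array([0.2, 0.7]),       np.array([-1.0, 1.5]))]       # dim 2
layer1 = np.stack([interpolate_univariate(t, x, refs, alpha=20.0) for t, x in series])
rho = np.array([[0.8, 0.2], [0.3, 0.7]])              # hypothetical mixing weights
layer2 = merge_dimensions(layer1, rho)                # fed to the prediction network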

Reference

Satya Narayan Shukla and Benjamin Marlin. Interpolation-prediction networks for irregularly sampled time series. In International Conference on Learning Representations, 2019. [pdf]

Requirements

The code requires Python 3.7 or later. The file requirements.txt contains the full list of required Python modules.
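Assuming a suitable Python environment is active, the dependencies can typically be installed with:

pip install -r requirements.txt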

Usage

To run our model on univariate time series (the UWave dataset):

python src/univariate_example.py --epochs 1000 --hidden_units 2048 --ref_points 128 --batch_size 2048

To reproduce the results on the MIMIC-III dataset, you first need access to the dataset, which can be requested here. Once your application to access MIMIC has been approved, you can download the data. MIMIC is provided as a collection of comma-separated value (CSV) files. You can use these scripts to import the CSV files into a database. Assuming you used PostgreSQL when creating the database, you also need to install psycopg2:

pip install psycopg2

Once the database has been created, run the following scripts in order:

python src/mimic_data_extraction.py
python src/multivariate_example.py --epochs 1000 --reference_points 192 --hours_from_adm 48 --batch_size 256 --gpus 4

Data Format Example

The notation here follows Section 3.1 of the paper. For brevity, assume the training set contains just one example and the number of dimensions is d = 2. An input format example is given below.
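As a purely illustrative sketch (the variable names and the exact array layout expected by the scripts are assumptions, not the repository's actual format), such an example could be stored as padded value, time, and mask arrays, where the mask marks which entries are real observations:

# dimension 1 observed at t = 0.5, 1.2, 4.0; dimension 2 observed at t = 2.1 only
values = [[0.8, 1.1, 0.7, 0.0],   # trailing zeros are padding
          [5.2, 0.0, 0.0, 0.0]]
times  = [[0.5, 1.2, 4.0, 0.0],
          [2.1, 0.0, 0.0, 0.0]]
mask   = [[1, 1, 1, 0],           # 1 = observed, 0 = padded/missing
          [1, 0, 0, 0]]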

Contact

For more details, please contact [email protected].
