philipperemy / N Beats

License: MIT
Keras/Pytorch implementation of N-BEATS: Neural basis expansion analysis for interpretable time series forecasting.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to N Beats

Gdrl
Grokking Deep Reinforcement Learning
Stars: ✭ 304 (-13.39%)
Mutual labels:  neural-networks
Dgi
Deep Graph Infomax (https://arxiv.org/abs/1809.10341)
Stars: ✭ 326 (-7.12%)
Mutual labels:  neural-networks
Paragraph Vectors
📄 A PyTorch implementation of Paragraph Vectors (doc2vec).
Stars: ✭ 337 (-3.99%)
Mutual labels:  neural-networks
Inceptiontime
InceptionTime: Finding AlexNet for Time Series Classification
Stars: ✭ 311 (-11.4%)
Mutual labels:  neural-networks
Lightnet
🌓 Bringing pjreddie's DarkNet out of the shadows #yolo
Stars: ✭ 322 (-8.26%)
Mutual labels:  neural-networks
Artificio
Deep Learning Computer Vision Algorithms for Real-World Use
Stars: ✭ 326 (-7.12%)
Mutual labels:  neural-networks
Soft Dtw
Python implementation of soft-DTW.
Stars: ✭ 300 (-14.53%)
Mutual labels:  neural-networks
Cyclegan
Tensorflow implementation of CycleGAN
Stars: ✭ 348 (-0.85%)
Mutual labels:  neural-networks
Probability
Probabilistic reasoning and statistical analysis in TensorFlow
Stars: ✭ 3,550 (+911.4%)
Mutual labels:  neural-networks
Machine learning basics
Plain python implementations of basic machine learning algorithms
Stars: ✭ 3,557 (+913.39%)
Mutual labels:  neural-networks
Neural Pipeline
Neural networks training pipeline based on PyTorch
Stars: ✭ 315 (-10.26%)
Mutual labels:  neural-networks
Pywick
High-level batteries-included neural network training library for Pytorch
Stars: ✭ 320 (-8.83%)
Mutual labels:  neural-networks
Mace Models
Mobile AI Compute Engine Model Zoo
Stars: ✭ 329 (-6.27%)
Mutual labels:  neural-networks
Kraken
OCR engine for all the languages
Stars: ✭ 304 (-13.39%)
Mutual labels:  neural-networks
Tbd Nets
PyTorch implementation of "Transparency by Design: Closing the Gap Between Performance and Interpretability in Visual Reasoning"
Stars: ✭ 345 (-1.71%)
Mutual labels:  neural-networks
Pytorch exercises
Stars: ✭ 304 (-13.39%)
Mutual labels:  neural-networks
Deepspeech
DeepSpeech is an open source embedded (offline, on-device) speech-to-text engine which can run in real time on devices ranging from a Raspberry Pi 4 to high power GPU servers.
Stars: ✭ 18,680 (+5221.94%)
Mutual labels:  neural-networks
Brevitas
Brevitas: quantization-aware training in PyTorch
Stars: ✭ 343 (-2.28%)
Mutual labels:  neural-networks
Amazon Forest Computer Vision
Amazon Forest Computer Vision: Satellite Image tagging code using PyTorch / Keras with lots of PyTorch tricks
Stars: ✭ 346 (-1.42%)
Mutual labels:  neural-networks
Supervisely
AI for everyone! 🎉 Neural networks, tools and a library we use in Supervisely
Stars: ✭ 332 (-5.41%)
Mutual labels:  neural-networks

N-BEATS: Neural basis expansion analysis for interpretable time series forecasting (Keras, Pytorch)

Link to the [paper]. Implementation authors: Philippe Remy and Jean-Sebastien Dhr


N-Beats at the beginning of training

Trust me, after a few more steps, the green curve (predictions) matches the ground truth exactly :-)

Installation

Make sure you are in a virtualenv (recommended) and have python3 installed.
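For example, a virtualenv can be created and activated with python3 -m venv venv && source venv/bin/activate (standard virtualenv commands; adapt to your environment).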

From PyPI

Install the Keras backend: pip install nbeats-keras.

Install the PyTorch backend: pip install nbeats-pytorch.
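
A quick way to confirm the installation (a minimal sanity check; the module paths are the same ones used in the example further below) is to run both imports:

from nbeats_keras.model import NBeatsNet as NBeatsKeras   # Keras backend
from nbeats_pytorch.model import NBeatsNet as NBeatsPytorch  # PyTorch backend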

From the sources

Installation is based on a Makefile.

Command to install N-Beats with Keras: make install-keras

Command to install N-Beats with Pytorch: make install-pytorch

Run on the GPU

To force the use of the GPU with TensorFlow, run: pip uninstall -y tensorflow && pip install tensorflow-gpu.
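
After switching packages, you can confirm that TensorFlow actually sees the GPU with the standard device query (TensorFlow 2.x API, shown here only as a sanity check):

import tensorflow as tf

# A non-empty list means at least one GPU is visible to TensorFlow.
print(tf.config.list_physical_devices('GPU'))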

Example

A Jupyter notebook (NBeats.ipynb) is provided; launch it with make run-jupyter.

Here is a toy example showing how to use this model (training and inference) with the Keras and PyTorch backends:

import warnings

import numpy as np

from nbeats_keras.model import NBeatsNet as NBeatsKeras
from nbeats_pytorch.model import NBeatsNet as NBeatsPytorch

warnings.filterwarnings(action='ignore', message='Setting attributes')


def main():
    # Input follows the Keras recurrent-layer convention
    # (num_samples, time_steps, input_dim): https://keras.io/layers/recurrent/
    num_samples, time_steps, input_dim, output_dim = 50_000, 10, 1, 1

    for BackendType in [NBeatsKeras, NBeatsPytorch]:
        backend = BackendType(
            backcast_length=time_steps, forecast_length=output_dim,
            stack_types=(NBeatsKeras.GENERIC_BLOCK, NBeatsKeras.GENERIC_BLOCK),
            nb_blocks_per_stack=2, thetas_dim=(4, 4), share_weights_in_stack=True,
            hidden_layer_units=64
        )

        # Definition of the objective function and the optimizer.
        backend.compile(loss='mae', optimizer='adam')

        # Definition of the data. The task is to learn a function f such that
        # | f(x) - y | -> 0, where f = np.mean.
        x = np.random.uniform(size=(num_samples, time_steps, input_dim))
        y = np.mean(x, axis=1, keepdims=True)

        # Split data into training and testing datasets.
        c = num_samples // 10
        x_train, y_train, x_test, y_test = x[c:], y[c:], x[:c], y[:c]
        test_size = len(x_test)

        # Train the model.
        print('Training...')
        backend.fit(x_train, y_train, validation_data=(x_test, y_test), epochs=20, batch_size=128)

        # Save the model for later.
        backend.save('n_beats_model.h5')

        # Predict on the testing set (forecast).
        predictions_forecast = backend.predict(x_test)
        np.testing.assert_equal(predictions_forecast.shape, (test_size, backend.forecast_length, output_dim))

        # Predict on the testing set (backcast).
        predictions_backcast = backend.predict(x_test, return_backcast=True)
        np.testing.assert_equal(predictions_backcast.shape, (test_size, backend.backcast_length, output_dim))

        # Load the model.
        model_2 = BackendType.load('n_beats_model.h5')

        np.testing.assert_almost_equal(predictions_forecast, model_2.predict(x_test))


if __name__ == '__main__':
    main()
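
The toy example above only uses generic blocks. N-BEATS also defines interpretable trend and seasonality stacks; the sketch below shows how such a configuration could look with this implementation, assuming the model class exposes TREND_BLOCK and SEASONALITY_BLOCK constants alongside the GENERIC_BLOCK used above (check nbeats_keras/model.py for the exact names):

from nbeats_keras.model import NBeatsNet as NBeatsKeras

# Interpretable configuration: a trend stack followed by a seasonality stack.
# TREND_BLOCK and SEASONALITY_BLOCK are assumed to mirror the GENERIC_BLOCK constant.
model = NBeatsKeras(
    backcast_length=10, forecast_length=1,
    stack_types=(NBeatsKeras.TREND_BLOCK, NBeatsKeras.SEASONALITY_BLOCK),
    nb_blocks_per_stack=3, thetas_dim=(4, 8), share_weights_in_stack=False,
    hidden_layer_units=128,
)
model.compile(loss='mae', optimizer='adam')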

Citation

@misc{NBeatsPRemy,
  author = {Philippe Remy},
  title = {N-BEATS: Neural basis expansion analysis for interpretable time series forecasting},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/philipperemy/n-beats}},
}