
VSainteuf / lightweight-temporal-attention-pytorch

Licence: MIT license
A PyTorch implementation of the Light Temporal Attention Encoder (L-TAE) for satellite image time series classification.

Programming Languages

python

Projects that are alternatives of or similar to lightweight-temporal-attention-pytorch

Modistsp
An "R" package for automatic download and preprocessing of MODIS Land Products Time Series
Stars: ✭ 118 (+174.42%)
Mutual labels:  time-series, remote-sensing, satellite-imagery
pytorch-psetae
PyTorch implementation of the model presented in "Satellite Image Time Series Classification with Pixel-Set Encoders and Temporal Self-Attention"
Stars: ✭ 117 (+172.09%)
Mutual labels:  remote-sensing, time-series-classification, self-attention
Python-for-Remote-Sensing
Python code for remote sensing applications will be uploaded here. I will try to teach everything I learn during my projects.
Stars: ✭ 20 (-53.49%)
Mutual labels:  remote-sensing, satellite-imagery, satellite-data
Torchsat
🔥TorchSat 🌏 is an open-source deep learning framework for satellite imagery analysis based on PyTorch.
Stars: ✭ 261 (+506.98%)
Mutual labels:  satellite, remote-sensing, satellite-imagery
deck.gl-raster
deck.gl layers and WebGL modules for client-side satellite imagery analysis
Stars: ✭ 60 (+39.53%)
Mutual labels:  satellite, remote-sensing, satellite-imagery
Start maja
To process a Sentinel-2 time series with MAJA cloud detection and atmospheric correction processor
Stars: ✭ 47 (+9.3%)
Mutual labels:  time-series, remote-sensing, satellite-imagery
geoblaze
Blazing Fast JavaScript Raster Processing Engine
Stars: ✭ 80 (+86.05%)
Mutual labels:  satellite, remote-sensing, satellite-imagery
modape
MODIS Assimilation and Processing Engine
Stars: ✭ 19 (-55.81%)
Mutual labels:  time-series, satellite, remote-sensing
goes2go
Download and process GOES-16 and GOES-17 data from NOAA's archive on AWS using Python.
Stars: ✭ 77 (+79.07%)
Mutual labels:  satellite, satellite-imagery, satellite-data
s5p-tools
Python scripts to download and preprocess air pollution concentration level data acquired from the Sentinel-5P mission
Stars: ✭ 49 (+13.95%)
Mutual labels:  remote-sensing, satellite-data
query-selector
Long-Term Series Forecasting with Query Selector – Efficient Model of Sparse Attention
Stars: ✭ 63 (+46.51%)
Mutual labels:  time-series, self-attention
sits
Satellite image time series in R
Stars: ✭ 342 (+695.35%)
Mutual labels:  remote-sensing, satellite-imagery
Mintpy
Miami InSAR time-series software in Python
Stars: ✭ 195 (+353.49%)
Mutual labels:  time-series, remote-sensing
Motion Sense
MotionSense Dataset for Human Activity and Attribute Recognition ( time-series data generated by smartphone's sensors: accelerometer and gyroscope)
Stars: ✭ 159 (+269.77%)
Mutual labels:  time-series, deeplearning
iris
Semi-automatic tool for manual segmentation of multi-spectral and geo-spatial imagery.
Stars: ✭ 87 (+102.33%)
Mutual labels:  remote-sensing, satellite-imagery
Java Deep Learning Cookbook
Code for Java Deep Learning Cookbook
Stars: ✭ 156 (+262.79%)
Mutual labels:  time-series, deeplearning
satellite-crosswalk-classification
Deep Learning Based Large-Scale Automatic Satellite Crosswalk Classification (GRSL, 2017)
Stars: ✭ 18 (-58.14%)
Mutual labels:  remote-sensing, satellite-imagery
ChangeFormer
Official PyTorch implementation of our IGARSS'22 paper: A Transformer-Based Siamese Network for Change Detection
Stars: ✭ 220 (+411.63%)
Mutual labels:  remote-sensing, satellite-imagery
Anomaly detection tuto
Anomaly detection tutorial on univariate time series with an auto-encoder
Stars: ✭ 144 (+234.88%)
Mutual labels:  time-series, deeplearning
aitlas
AiTLAS implements state-of-the-art AI methods for exploratory and predictive analysis of satellite images.
Stars: ✭ 134 (+211.63%)
Mutual labels:  remote-sensing, satellite-data

Lightweight Temporal Self-Attention (PyTorch)

A PyTorch implementation of the Light Temporal Attention Encoder (L-TAE) for satellite image time series classification. (see preprint here)

The increasing accessibility and precision of Earth observation satellite data offer considerable opportunities for industrial and state actors alike. This, however, calls for efficient methods able to process time series on a global scale. Building on recent work that employs multi-headed self-attention mechanisms to classify remote sensing time sequences, we propose a modification of the Temporal Attention Encoder. In our network, the channels of the temporal inputs are distributed among several compact attention heads operating in parallel. Each head extracts highly specialized temporal features which are in turn concatenated into a single representation. Our approach outperforms other state-of-the-art time series classification algorithms on an open-access satellite image dataset, while using significantly fewer parameters and with a reduced computational complexity.
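The channel-distribution idea can be sketched in plain NumPy. This is an illustrative simplification, not the repository's ltae.py (the real L-TAE also uses positional encoding, layer normalization, and an output MLP), and the function name `channel_grouped_attention` is made up for the example: each head attends over the dates of its own slice of channels with a learned "master" query, and the head outputs are concatenated.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def channel_grouped_attention(x, n_head, d_k, rng):
    """x: (T, E) time series of T dates with E channels."""
    T, E = x.shape
    assert E % n_head == 0, "channels must split evenly across heads"
    d_h = E // n_head  # each compact head sees only its own channel group
    outputs = []
    for h in range(n_head):
        xh = x[:, h * d_h:(h + 1) * d_h]         # (T, d_h) channel slice
        W_k = rng.standard_normal((d_h, d_k))    # per-head key projection
        q = rng.standard_normal(d_k)             # learned "master" query
        keys = xh @ W_k                          # (T, d_k)
        attn = softmax(keys @ q / np.sqrt(d_k))  # attention over the T dates
        outputs.append(attn @ xh)                # (d_h,) temporal summary
    return np.concatenate(outputs)               # (E,) concatenated heads

rng = np.random.default_rng(0)
x = rng.standard_normal((24, 128))  # 24 acquisition dates, 128 channels
out = channel_grouped_attention(x, n_head=16, d_k=8, rng=rng)
print(out.shape)  # (128,)
```

Because each head projects only E/n_head channels instead of the full input, the per-head weight matrices stay small, which is where the parameter savings come from.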


Requirements

  • PyTorch + Torchnet
  • Numpy + Scipy + scikit-learn

(see requirements.txt)

The code was developed with Python 3.6.10 and PyTorch 1.5.0.
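An illustrative requirements file consistent with these versions — only the torch pin comes from this README; the actual requirements.txt in the repository may pin the other packages differently:

```
torch==1.5.0
torchnet
numpy
scipy
scikit-learn
```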

Downloads

Sentinel-Agri dataset

We use the Sentinel-Agri dataset available on this github repository. The dataset is comprised of time series of satellite images of agricultural parcels. Check the "Data Format" section of the repository for more details on the data.
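Parcels in the PixelSet format are variable-sized sets of pixels, so a classifier typically draws a fixed number of pixels per parcel at load time. A minimal sketch of such sampling, assuming each parcel is stored as an array of shape (T, C, S) — dates, channels, pixels — as described in the repository's Data Format section; the function name and the sample-with-replacement-when-small strategy are assumptions for illustration:

```python
import numpy as np

def sample_pixel_set(parcel, n_pixels, rng):
    """Draw a fixed-size pixel set from a parcel of shape (T, C, S)."""
    s = parcel.shape[-1]
    # Sample with replacement only when the parcel has fewer pixels than requested.
    idx = rng.choice(s, size=n_pixels, replace=s < n_pixels)
    return parcel[:, :, idx]

rng = np.random.default_rng(0)
parcel = rng.standard_normal((24, 10, 37))  # 24 dates, 10 channels, 37 pixels
batch = sample_pixel_set(parcel, n_pixels=64, rng=rng)
print(batch.shape)  # (24, 10, 64)
```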

Pre-trained weights

Pre-trained weights of the PSE+LTAE model available here

Use the models.stclassifier.PseLTae_pretrained class to instantiate the pre-trained model.

Code

This repo contains all the necessary scripts to reproduce the figure below. The implementations of the L-TAE, TAE, GRU and TempCNN temporal modules can be found in models. Each of these four modules is combined with a Pixel-Set Encoder to form a spatio-temporal classifier, directly applicable to the Sentinel-Agri PixelSet dataset. The four architectures are found in models.stclassifier.
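Conceptually, each classifier applies a spatial encoder to every date independently and a temporal module to the resulting sequence of embeddings. A toy NumPy sketch of that composition — mean-pooling stands in for the Pixel-Set Encoder and a plain temporal average for the L-TAE, so all names and shapes here are illustrative, not the repository's API:

```python
import numpy as np

def spatial_encode(parcel_date):
    """Stand-in for the Pixel-Set Encoder: pool the pixel set into one vector."""
    return parcel_date.mean(axis=-1)  # (C, S) -> (C,)

def temporal_encode(embeddings):
    """Stand-in for the temporal module: collapse T date embeddings into one."""
    return embeddings.mean(axis=0)  # (T, C) -> (C,)

def classify(parcel, W):
    """parcel: (T, C, S); W: (C, n_classes) linear decoder weights."""
    emb = np.stack([spatial_encode(parcel[t]) for t in range(parcel.shape[0])])
    feat = temporal_encode(emb)
    return int(np.argmax(feat @ W))

rng = np.random.default_rng(0)
parcel = rng.standard_normal((24, 10, 64))  # 24 dates, 10 channels, 64 pixels
W = rng.standard_normal((10, 20))           # 20 crop classes
print(classify(parcel, W))
```

Swapping the stand-in `temporal_encode` for an L-TAE, TAE, GRU or TempCNN while keeping the spatial encoder fixed is exactly the comparison the four models.stclassifier architectures set up.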

Use the train.py script to train the 150k-parameter L-TAE-based classifier (the default). You only need to specify the path to the dataset folder:

python train.py --dataset_folder path_to_sentinelagri_pixelset_dataset

You can use the same script to play around with the model's hyperparameters, or train an instance of a competing architecture.

To train the precise configurations that were used to produce the figure, add the arguments that are listed in the config_fig2.json file. For example, the following command will train the 9k-parameter L-TAE instance:

python train.py --dataset_folder path_to_sentinelagri_pixelset_dataset --n_head 8 --d_k 8 --mlp3 [128]

Credits

  • The Lightweight Temporal Attention Encoder is heavily inspired by the work of Vaswani et al. on the Transformer, and this PyTorch implementation served as the code base for the ltae.py script.
  • Credits to github.com/clcarwin/ for the PyTorch implementation of the focal loss.

Reference

Please include a citation to the following paper if you use the L-TAE.

@article{garnot2020ltae,
  title={Lightweight Temporal Self-Attention for Classifying Satellite Images Time Series},
  author={Sainte Fare Garnot, Vivien and Landrieu, Loic},
  journal={arXiv preprint arXiv:2007.00586},
  year={2020}
}

Make sure to also include a citation to the PSE+TAE paper below if you are using the Pixel-Set Encoder.

@article{garnot2020psetae,
  title={Satellite Image Time Series Classification with Pixel-Set Encoders and Temporal Self-Attention},
  author={Sainte Fare Garnot, Vivien and Landrieu, Loic and Giordano, Sebastien and Chehata, Nesrine},
  journal={CVPR},
  year={2020}
}
