
likyoo / Multimodal-Remote-Sensing-Toolkit

License: GPL-3.0
A Python tool to perform deep learning experiments on multimodal remote sensing data.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Multimodal-Remote-Sensing-Toolkit

starfm4py
The STARFM fusion model for Python
Stars: ✭ 86 (+145.71%)
Mutual labels:  remote-sensing
arcsi
Software to automate the production of optical analysis ready data (ARD) from Landsat, Sentinel-2 and others.
Stars: ✭ 22 (-37.14%)
Mutual labels:  remote-sensing
TRAR-VQA
[ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (+40%)
Mutual labels:  multi-modal-learning
WhiteboxTools-ArcGIS
ArcGIS Python Toolbox for WhiteboxTools
Stars: ✭ 190 (+442.86%)
Mutual labels:  remote-sensing
deep learning ecology
Educational Resources on Neural Networks for Ecology and Remote Sensing
Stars: ✭ 45 (+28.57%)
Mutual labels:  remote-sensing
deck.gl-raster
deck.gl layers and WebGL modules for client-side satellite imagery analysis
Stars: ✭ 60 (+71.43%)
Mutual labels:  remote-sensing
opticks
Open source remote sensing analysis tool
Stars: ✭ 37 (+5.71%)
Mutual labels:  remote-sensing
CDLab
Yet another repository for developing and benchmarking deep learning-based change detection methods.
Stars: ✭ 59 (+68.57%)
Mutual labels:  remote-sensing
waterquality
Package designed to detect and quantify water quality and cyanobacterial harmful algal blooms (CHABs) from remotely sensed imagery
Stars: ✭ 31 (-11.43%)
Mutual labels:  remote-sensing
sentinel-util
A CLI for downloading, processing, and making a mosaic from Sentinel-1, -2 and -3 data
Stars: ✭ 22 (-37.14%)
Mutual labels:  remote-sensing
3D-PV-Locator
Repo for "3D-PV-Locator: Large-scale detection of rooftop-mounted photovoltaic systems in 3D", based on the Applied Energy publication.
Stars: ✭ 35 (+0%)
Mutual labels:  remote-sensing
Bayesian LSP
A Bayesian hierarchical model that quantifies long-term annual land surface phenology from sparse time series of vegetation indices.
Stars: ✭ 32 (-8.57%)
Mutual labels:  remote-sensing
geospatial-learn
A python library for geo-spatial processing and machine learning
Stars: ✭ 20 (-42.86%)
Mutual labels:  remote-sensing
ChangeFormer
Official PyTorch implementation of our IGARSS'22 paper: A Transformer-Based Siamese Network for Change Detection
Stars: ✭ 220 (+528.57%)
Mutual labels:  remote-sensing
torchgeo
TorchGeo: datasets, samplers, transforms, and pre-trained models for geospatial data
Stars: ✭ 1,125 (+3114.29%)
Mutual labels:  remote-sensing
satellite-crosswalk-classification
Deep Learning Based Large-Scale Automatic Satellite Crosswalk Classification (GRSL, 2017)
Stars: ✭ 18 (-48.57%)
Mutual labels:  remote-sensing
retrievalSystem
The back end of a cross-modal retrieval system, which will contain services such as semantic location, etc.
Stars: ✭ 64 (+82.86%)
Mutual labels:  remote-sensing
GoogleEarthEngine-side-projects
Google Earth Engine side projects and tutorial scripts
Stars: ✭ 23 (-34.29%)
Mutual labels:  remote-sensing
aruco-geobits
geobits: ArUco Ground Control Point Targets and Detection for Aerial Imagery (UAV/MAV).
Stars: ✭ 32 (-8.57%)
Mutual labels:  remote-sensing
xarray-sentinel
Xarray backend to Copernicus Sentinel-1 satellite data products
Stars: ✭ 189 (+440%)
Mutual labels:  remote-sensing

Deep Learning Methods for Multi-modal Remote Sensing Classification



Figure: the Houston2013 dataset.

MMRS is a Python tool to perform deep learning experiments on multi-modal remote sensing data.

This repository is built on top of DeepHyperX.

Models

Currently, the following deep learning methods are available:

Datasets

Quickstart using Colab

You can use MMRS in a Google Colab notebook without any installation. Run all cells without modification to see how everything works.

Usage

Start a Visdom server: python -m visdom.server and go to http://localhost:8097 to see the visualizations.
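If you have not used Visdom before, the minimal sketch below (purely illustrative, not part of the MMRS code base) shows how a client pushes a plot to the server started above; it assumes the visdom package is installed and the server is running on the default port 8097.

```python
# Hypothetical example: send a dummy curve to the Visdom dashboard at
# http://localhost:8097. MMRS sends its own training curves and result
# maps to the same server while main.py runs.
import numpy as np
import visdom

viz = visdom.Visdom(server="http://localhost", port=8097)

epochs = np.arange(1, 11)
loss = 1.0 / epochs  # placeholder values, just to render a line plot
viz.line(X=epochs, Y=loss, win="loss", opts=dict(title="training loss (dummy)"))
```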

Then, run the script main.py.

The most useful arguments are:

  • --model to specify the model (e.g. 'S2ENet', 'Middle_fusion_CNN'),
  • --dataset to specify which dataset to use (e.g. 'Houston2013', 'Trento'),
  • the --cuda switch to run the neural networks on GPU. The tool falls back to CPU if this switch is not specified.

There are more parameters that allow finer control of the tool's behaviour. See python main.py -h for more information.

Examples:

!python main.py --model S2ENet --flip_augmentation --patch_size 7 --epoch 128 --lr 0.001 --batch_size 64 --seed 0 --dataset Houston2013 --folder '../' --train_set '../Houston2013/TRLabel.mat' --test_set '../Houston2013/TSLabel.mat' --cuda 0
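As a quick sanity check before training, you can inspect the ground-truth files passed to --train_set and --test_set. The snippet below is a hedged sketch: the variable names stored inside the .mat files ('TRLabel', 'TSLabel') are assumed from the file names and may differ in your copy of the data.

```python
# Hypothetical sketch: inspect the Houston2013 ground-truth .mat files
# used in the example command above.
import numpy as np
from scipy.io import loadmat

# The dictionary keys 'TRLabel' / 'TSLabel' are assumptions based on the
# file names; adjust them if loadmat() reports different variable names.
train_gt = loadmat("../Houston2013/TRLabel.mat")["TRLabel"]
test_gt = loadmat("../Houston2013/TSLabel.mat")["TSLabel"]

print("train labels:", train_gt.shape, "classes:", np.unique(train_gt))
print("test labels: ", test_gt.shape, "classes:", np.unique(test_gt))
```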

For more features, please refer to DeepHyperX.

Citation

If you find this work valuable or use our code in your own research, please consider citing us:

S. Fang, K. Li and Z. Li, "S²ENet: Spatial–Spectral Cross-Modal Enhancement Network for Classification of Hyperspectral and LiDAR Data," in IEEE Geoscience and Remote Sensing Letters, vol. 19, pp. 1-5, 2022, Art no. 6504205, doi: 10.1109/LGRS.2021.3121028.

BibTeX format:

@ARTICLE{9583936,
  author={Fang, Sheng and Li, Kaiyu and Li, Zhe},
  journal={IEEE Geoscience and Remote Sensing Letters},
  title={S²ENet: Spatial–Spectral Cross-Modal Enhancement Network for Classification of Hyperspectral and LiDAR Data},
  year={2022},
  volume={19},
  number={},
  pages={1-5},
  doi={10.1109/LGRS.2021.3121028}
}
