
KGPML / Hyperspectral

Licence: gpl-3.0
Deep Learning for Land-cover Classification in Hyperspectral Images.

Projects that are alternatives of or similar to Hyperspectral

Skift
scikit-learn wrappers for Python fastText.
Stars: ✭ 213 (-0.93%)
Mutual labels:  jupyter-notebook
Dianjing
Dianjing (点睛) - an AI tool for generating titles for Toutiao articles
Stars: ✭ 214 (-0.47%)
Mutual labels:  jupyter-notebook
Epidemiology101
Epidemic Modeling for Everyone
Stars: ✭ 215 (+0%)
Mutual labels:  jupyter-notebook
Coursera Stanford
Stanford
Stars: ✭ 212 (-1.4%)
Mutual labels:  jupyter-notebook
Bitcoin prediction
This is the code for "Bitcoin Prediction" by Siraj Raval on Youtube
Stars: ✭ 214 (-0.47%)
Mutual labels:  jupyter-notebook
Gaussianprocesses.jl
A Julia package for Gaussian Processes
Stars: ✭ 214 (-0.47%)
Mutual labels:  jupyter-notebook
Squad
Building QA system for Stanford Question Answering Dataset
Stars: ✭ 213 (-0.93%)
Mutual labels:  jupyter-notebook
Understanding tensorflow nn
🔮Getting started with TensorFlow: Classifying Text with Neural Networks
Stars: ✭ 215 (+0%)
Mutual labels:  jupyter-notebook
Pythonnumericaldemos
Well-documented Python demonstrations for spatial data analytics, geostatistical and machine learning to support my courses.
Stars: ✭ 213 (-0.93%)
Mutual labels:  jupyter-notebook
Pytorch Byol
PyTorch implementation of Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning
Stars: ✭ 213 (-0.93%)
Mutual labels:  jupyter-notebook
Tensorflow Without A Phd
A crash course in six episodes for software developers who want to become machine learning practitioners.
Stars: ✭ 2,488 (+1057.21%)
Mutual labels:  jupyter-notebook
Kekoxtutorial
Translates great Keras documentation and tutorials from around the world into Korean, for the benefit of Keras x Korea.
Stars: ✭ 213 (-0.93%)
Mutual labels:  jupyter-notebook
Applied Predictive Modeling With Python
A collection of notebook to learn the Applied Predictive Modeling using Python.
Stars: ✭ 214 (-0.47%)
Mutual labels:  jupyter-notebook
Neural decoding
A python package that includes many methods for decoding neural activity
Stars: ✭ 212 (-1.4%)
Mutual labels:  jupyter-notebook
Pytorch Superpoint
Superpoint Implemented in PyTorch: https://arxiv.org/abs/1712.07629
Stars: ✭ 214 (-0.47%)
Mutual labels:  jupyter-notebook
Tfwss
Weakly Supervised Segmentation with Tensorflow. Implements instance segmentation as described in Simple Does It: Weakly Supervised Instance and Semantic Segmentation, by Khoreva et al. (CVPR 2017).
Stars: ✭ 212 (-1.4%)
Mutual labels:  jupyter-notebook
Tutmom
Tutorial on "Modern Optimization Methods in Python"
Stars: ✭ 214 (-0.47%)
Mutual labels:  jupyter-notebook
Python lectures
A repository collecting the source code and materials used in Python lectures.
Stars: ✭ 214 (-0.47%)
Mutual labels:  jupyter-notebook
Chinese sentiment
Chinese sentiment analysis with CNN and BI-LSTM; text classification.
Stars: ✭ 216 (+0.47%)
Mutual labels:  jupyter-notebook
Stereo Transformer
Official Repo for Stereo Transformer: Revisiting Stereo Depth Estimation From a Sequence-to-Sequence Perspective with Transformers.
Stars: ✭ 211 (-1.86%)
Mutual labels:  jupyter-notebook

This repository has been ⛔️ DEPRECATED. Please take a look at our fairly recent work:

Band-Adaptive Spectral-Spatial Feature Learning Neural Network for Hyperspectral Image Classification [paper] [Code]

Deep Learning for Land-cover Classification in Hyperspectral Images

Hyperspectral images are images captured in multiple bands of the electromagnetic spectrum. This project is focused on developing deep artificial neural networks for robust land-cover classification in hyperspectral images. Land-cover classification is the task of assigning to every pixel a class label that represents the type of land cover present at that pixel's location. It is an image segmentation/scene-labeling task. The following diagram describes the task.



This repository describes our exploration of the performance of Multi-Layer Perceptrons (MLPs) and Convolutional Neural Networks (CNNs) on the task of land-cover classification in hyperspectral images. Currently we perform pixel-wise classification.


Dataset

We have performed our experiments on the Indian Pines Dataset. The following are the particulars of the dataset:

  • Source: AVIRIS sensor
  • Region: Indian Pines test site over north-western Indiana
  • Time of the year: June
  • Wavelength range: 0.4 – 2.5 micron
  • Number of spectral bands: 220
  • Size of image: 145 × 145 pixels
  • Number of land-cover classes: 16
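
For orientation, loading the two .mat files described in the setup section typically looks like the sketch below. The variable names inside the files ('indian_pines' and 'indian_pines_gt') are assumptions based on the usual distribution of this dataset and may differ in your copy.

# Minimal sketch: load the Indian Pines cube and ground truth with SciPy.
# The keys 'indian_pines' and 'indian_pines_gt' are assumed, not guaranteed.
import numpy as np
import scipy.io as sio

data = sio.loadmat('Data/Indian_pines.mat')['indian_pines']          # (145, 145, 220) reflectance cube
labels = sio.loadmat('Data/Indian_pines_gt.mat')['indian_pines_gt']  # (145, 145) class map, 0 = unlabeled

print(data.shape, labels.shape, np.unique(labels))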

Input data format

Each pixel is described by an NxN patch centered at the pixel. N denotes the size of spatial context used for making the inference about a given pixel.

The input data was divided into a training set (75%) and a test set (25%).
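
A minimal sketch of what patch extraction and the 75/25 split might look like is given below. The repository's own logic lives in IndianPines_DataSet_Preparation_Without_Augmentation.ipynb and Spatial_dataset.py, so treat this only as an illustration: it reuses the data and labels arrays from the loading sketch above, and the zero-padding at the image border is an assumption.

from sklearn.model_selection import train_test_split

def extract_patches(cube, labels, n):
    # Return an (n x n x bands) patch and a label for every labeled pixel.
    # Pixels with label 0 (unlabeled background) are skipped; borders are
    # handled by zero-padding, which may differ from the repository's approach.
    pad = n // 2
    padded = np.pad(cube, ((pad, pad), (pad, pad), (0, 0)), mode='constant')
    patches, targets = [], []
    for i in range(cube.shape[0]):
        for j in range(cube.shape[1]):
            if labels[i, j] == 0:
                continue
            patches.append(padded[i:i + n, j:j + n, :])
            targets.append(labels[i, j] - 1)   # shift class labels to 0..15
    return np.asarray(patches), np.asarray(targets)

# 75% training / 25% test, as described above.
X, y = extract_patches(data, labels, n=11)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)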

Hardware used

The neural networks were trained on a machine with dual Intel Xeon E5-2630 v2 CPUs, 32 GB RAM and NVIDIA Tesla K-20C GPU.


Multi-Layer Perceptron

A Multi-Layer Perceptron (MLP) is an artificial neural network with one or more hidden layers of neurons. An MLP can model highly non-linear functions between input and output and forms the basis of deep neural network (DNN) models.

Architecture of Multi-Layer Perceptron used

input- [affine - relu] x 3 - affine - softmax

(Schematic representation below)

N denotes the size of the input patch.
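
For reference, the architecture above could be written roughly as follows with tf.keras. The repository itself builds the graph in low-level TensorFlow inside IndianPinesMLP.py, and the hidden-layer widths below are illustrative assumptions, not the values used in the experiments.

import tensorflow as tf

N, BANDS, NUM_CLASSES = 11, 220, 16   # patch size, spectral bands, land-cover classes (illustrative)

# input - [affine - relu] x 3 - affine - softmax
mlp = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(N, N, BANDS)),
    tf.keras.layers.Dense(500, activation='relu'),   # hidden widths are assumptions
    tf.keras.layers.Dense(500, activation='relu'),
    tf.keras.layers.Dense(500, activation='relu'),
    tf.keras.layers.Dense(NUM_CLASSES, activation='softmax'),
])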


Specifics of the learning algorithm

The following are the details of the learning algorithm used:

  • Parameter update algorithm used: Adagrad
  • Batch size: 200
  • Learning rate: 0.01
  • Number of steps: until best validation performance
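
Continuing the sketches above (mlp, X_train, y_train), training with these hyper-parameters might look roughly like this; the repository instead trains a TensorFlow 1.x graph through a feed dictionary, so this is only an approximation of the procedure.

# Adagrad with learning rate 0.01 and mini-batches of 200, as listed above.
mlp.compile(optimizer=tf.keras.optimizers.Adagrad(learning_rate=0.01),
            loss='sparse_categorical_crossentropy',
            metrics=['accuracy'])

# "Until best validation performance" approximated here with early stopping.
mlp.fit(X_train, y_train,
        batch_size=200,
        epochs=100,                      # upper bound; stops earlier in practice
        validation_split=0.1,            # illustrative validation split
        callbacks=[tf.keras.callbacks.EarlyStopping(patience=5, restore_best_weights=True)])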


Performance

Decoding generated for different input patch sizes:


Convolutional Neural Network

Convolutional Neural Networks (CNNs or ConvNets) are a special category of artificial neural networks designed for processing data with a grid-like structure. The ConvNet architecture is based on sparse interactions and parameter sharing and is highly effective at efficiently learning spatial invariances in images. There are four kinds of layers in a typical ConvNet architecture: convolutional (conv), pooling (pool), fully-connected (affine) and rectified linear unit (ReLU). Each convolutional layer transforms one set of feature maps into another by convolving it with a set of filters.

Architecture of Convolutional Neural Network used

input- [conv - relu - maxpool] x 2 - [affine - relu] x 2 - affine - softmax

(Schematic representation below)

N denotes the size of the input patch.
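
As with the MLP, a rough tf.keras rendering of this architecture is sketched below (reusing N, BANDS and NUM_CLASSES from the earlier sketch); the filter counts, kernel sizes and dense-layer widths are assumptions rather than the repository's exact values.

# input - [conv - relu - maxpool] x 2 - [affine - relu] x 2 - affine - softmax
cnn = tf.keras.Sequential([
    tf.keras.layers.Conv2D(64, kernel_size=3, padding='same', activation='relu',
                           input_shape=(N, N, BANDS)),
    tf.keras.layers.MaxPooling2D(pool_size=2),
    tf.keras.layers.Conv2D(128, kernel_size=3, padding='same', activation='relu'),
    tf.keras.layers.MaxPooling2D(pool_size=2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(256, activation='relu'),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(NUM_CLASSES, activation='softmax'),
])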


Specifics of the learning algorithm

The following are the details of the learning algorithm used:

  • Parameter update algorithm used: Adagrad
  • Batch size: 100
  • Learning rate: 0.01
  • Number of steps: until best validation performance


Performance

Decoding generated for different input patch sizes:



Description of the repository

  • IndianPines_DataSet_Preparation_Without_Augmentation.ipynb - does the following operations:

    • Loads the Indian Pines dataset
    • Scales the input between [0,1]
    • Mean normalizes the channels
    • Makes training and test splits
    • Extracts patches of given size
    • Oversamples the training set to balance the classes (a rough sketch of these preprocessing steps appears after this list)
  • Spatial_dataset.py - provides a highly flexible Dataset class for handling the Indian Pines data.

  • patch_size.py - specify the required patch-size here.

  • IndianPinesCNN.ipynb - builds the TensorFlow Convolutional Neural Network and defines the training and evaluation ops:

    • inference() - builds the model as far as is required for running the network forward to make predictions.
    • loss() - adds to the inference model the layers required to generate loss.
    • training() - adds to the loss model the Ops required to generate and apply gradients.
    • evaluation() - calculates the classification accuracy (a skeleton of these four ops appears after this list)
  • CNN_feed.ipynb - trains and evaluates the Neural Network using a feed dictionary

  • Decoder_Spatial_CNN.ipynb - generates the landcover classification of an input hyperspectral image for a given trained network

  • IndianPinesMLP.py - builds the TensorFlow Multi-layer Perceptron and defines the training and evaluation ops:

    • inference() - builds the model as far as is required for running the network forward to make predictions.
    • loss() - adds to the inference model the layers required to generate loss.
    • training() - adds to the loss model the Ops required to generate and apply gradients.
    • evaluation() - calculates the classification accuracy
  • MLP_feed.ipynb - trains and evaluates the MLP using a feed dictionary

  • Decoder_Spatial_MLP.ipynb - generates the landcover classification of an input hyperspectral image for a given trained network

  • credibility.ipynb - summarizes the predictions of an ensemble and produces the land-cover classification and class-wise confusion matrix.
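
The preprocessing steps performed by IndianPines_DataSet_Preparation_Without_Augmentation.ipynb (scaling to [0, 1], channel-wise mean normalization and class-balancing oversampling) might be sketched as follows; the notebook's actual implementation may differ in detail.

import numpy as np

def preprocess_cube(cube):
    # Scale reflectances to [0, 1], then subtract the per-channel mean.
    cube = cube.astype(np.float32)
    cube = (cube - cube.min()) / (cube.max() - cube.min())
    return cube - cube.mean(axis=(0, 1), keepdims=True)

def oversample(patches, targets, seed=0):
    # Repeat samples of under-represented classes until every class
    # has as many examples as the largest one.
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(targets, return_counts=True)
    idx = np.concatenate([
        rng.choice(np.where(targets == c)[0], counts.max(), replace=True)
        for c in classes
    ])
    return patches[idx], targets[idx]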

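Both IndianPinesCNN.py/.ipynb and IndianPinesMLP.py follow the four-function TensorFlow 1.x pattern named above. A skeleton of that pattern is shown below; the layer definitions inside inference() are placeholders, and the real models are the MLP and CNN architectures described earlier.

# Skeleton of the inference / loss / training / evaluation op pattern
# (TensorFlow 1.x style graph construction via the compat.v1 API).
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

def inference(images, num_classes=16):
    # Forward pass returning unnormalized class scores (logits).
    # A single affine layer stands in for the real, deeper model.
    flat = tf.layers.flatten(images)
    return tf.layers.dense(flat, num_classes)

def loss(logits, labels):
    # Cross-entropy loss attached to the logits.
    return tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits))

def training(loss_op, learning_rate=0.01):
    # Adagrad update op that applies the gradients of the loss.
    return tf.train.AdagradOptimizer(learning_rate).minimize(loss_op)

def evaluation(logits, labels):
    # Number of patches whose top-1 prediction matches the true label.
    correct = tf.nn.in_top_k(predictions=logits, targets=labels, k=1)
    return tf.reduce_sum(tf.cast(correct, tf.int32))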

Setting up the experiment

  • Download the Indian Pines data-set from here.
  • Make a directory named Data within the current working directory and copy the downloaded .mat files Indian_pines.mat and Indian_pines_gt.mat into this directory.

To make sure all the code runs smoothly, your current working directory should contain the following directory subtree:

|-- IndianPines_DataSet_Preparation_Without_Augmentation.ipynb
|-- Decoder_Spatial_CNN.ipynb
|-- Decoder_Spatial_MLP.ipynb
|-- IndianPinesCNN.ipynb
|-- CNN_feed.ipynb
|-- MLP_feed.ipynb
|-- credibility.ipynb
|-- IndianPinesCNN.py
|-- IndianPinesMLP.py
|-- Spatial_dataset.py
|-- patch_size.py
|-- Data
|   |-- Indian_pines_gt.mat
|   |-- Indian_pines.mat


  • Set the required patch-size value (e.g., 11, 21, etc.) in patch_size.py (a minimal example appears after this list) and run the following notebooks in order:
    1. IndianPines_DataSet_Preparation_Without_Augmentation.ipynb
    2. CNN_feed.ipynb OR MLP_feed.ipynb (specify the number of fragments in the training and test data in the variables TRAIN_FILES and TEST_FILES)
    3. Decoder_Spatial_CNN.ipynb OR Decoder_Spatial_MLP.ipynb (set the required checkpoint to be used for decoding in the model_name variable)
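
For example, patch_size.py may contain nothing more than a single assignment; the variable name below is an assumption based on the file's stated purpose.

# patch_size.py -- spatial context used when extracting patches (e.g. 11 or 21)
patch_size = 11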

Outputs will be displayed in the notebooks.


Acknowledgement

This repository was developed by Anirban Santara, Ankit Singh, Pranoot Hatwar and Kaustubh Mani under the supervision of Prof. Pabitra Mitra during June and July 2016 at the Department of Computer Science and Engineering, Indian Institute of Technology Kharagpur, India. The project was funded by the Satellite Applications Centre, Indian Space Research Organization (SAC-ISRO).
