syamkakarla98 / Dimensionality-reduction-and-classification-on-Hyperspectral-Images-Using-Python

License: MIT license
This repository contains the files that implement dimensionality reduction and classification on the hyperspectral image Indian Pines.



Dimensionality reduction and classification on Hyperspectral Image Using Python

Authors

Prerequisites

The prerequisites for better understanding the code and concepts are:

    * Python
    * MATLAB
    * Linear Algebra

Installation

  • This project is fully based on Python, so the modules necessary for computation are:
    * NumPy
    * scikit-learn
    * Matplotlib
    * Pandas
  • The commands needed for installing the above modules on the Windows platform are:
    pip install numpy
    pip install scikit-learn
    pip install matplotlib
    pip install pandas
  • We can verify the installation of the modules by importing them. For example:
    import numpy
    from sklearn.decomposition import PCA
    import matplotlib.pyplot as plt
    import pandas as pd

Results

  • Here we perform dimensionality reduction on one of the most widely used hyperspectral images, Indian Pines.
  1. The result of indian_pines_pca.py is shown below:

    • Its first result is a bar graph of the first 10 principal components according to their variance ratios:

    indian_pines_varianve_ratio

    Since the first two principal components have high variance, we select only those two PCs.

    • Its second result is a scatter plot of the data projected onto the first two principal components:

    indian_pines_after_pca_with_2pc

    • The above program also produces a dimensionally reduced CSV file.
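    The PCA step above can be sketched as follows. This is a minimal, hypothetical reconstruction, not the repository's script: a random array stands in for the real 145 x 145 x 200 Indian Pines cube, which would normally be loaded from a .mat file (e.g. with scipy.io.loadmat).

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    cube = rng.random((145, 145, 200))          # stand-in for the hyperspectral cube

    # Flatten the spatial dimensions: each pixel becomes a 200-band sample
    X = cube.reshape(-1, cube.shape[-1])        # shape (21025, 200)

    pca = PCA(n_components=10)                  # keep the first 10 principal components
    X_pca = pca.fit_transform(X)                # shape (21025, 10)

    # One variance ratio per component; a bar plot of these is the first result,
    # and the first two columns of X_pca give the scatter plot / CSV output.
    print(pca.explained_variance_ratio_)
    ```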
  2. The result of indian_pines_knnc.py is given below:

    • The above program classifies the Indian Pines dataset before Principal Component Analysis (PCA). The classifier used here is the K-Nearest Neighbour Classifier (KNNC).
    • The time taken for classification is:

    indian_pines_classification_before_pca

    • The classification accuracy of the Indian Pines dataset before PCA is:

    indian_pines_accuracy_before_pca
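    A sketch of this classification step, with random stand-in data and labels (the real script uses the Indian Pines bands and their ground-truth classes; the sample count here is reduced so the example runs quickly):

    ```python
    import time
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    X = rng.random((2000, 200))                 # pixels x bands (stand-in data)
    y = rng.integers(0, 16, size=2000)          # 16 classes, as in Indian Pines

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)

    knn = KNeighborsClassifier(n_neighbors=5)   # the 'k' value is tunable
    start = time.time()
    knn.fit(X_train, y_train)
    acc = accuracy_score(y_test, knn.predict(X_test))
    elapsed = time.time() - start

    print(f"accuracy: {acc:.4f}, time taken: {elapsed:.4f} s")
    ```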

  3. The result of indian_pines_knnc_after_pca.py is given below:

    • The resulting classification accuracy of the Indian Pines dataset after PCA is:

      indian_pines_accuracy_after_pca
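      The after-PCA variant can be sketched the same way: the same KNN classifier, but run on the 2-component PCA projection instead of all 200 bands (again with random stand-in data, so the numbers are illustrative only):

      ```python
      import time
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.model_selection import train_test_split
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.metrics import accuracy_score

      rng = np.random.default_rng(0)
      X = rng.random((2000, 200))
      y = rng.integers(0, 16, size=2000)

      X2 = PCA(n_components=2).fit_transform(X)   # 200 bands -> 2 components

      X_train, X_test, y_train, y_test = train_test_split(
          X2, y, test_size=0.3, random_state=0)

      knn = KNeighborsClassifier(n_neighbors=5)
      start = time.time()
      knn.fit(X_train, y_train)
      acc = accuracy_score(y_test, knn.predict(X_test))
      elapsed = time.time() - start

      print(f"accuracy after PCA: {acc:.4f}, time: {elapsed:.4f} s")
      ```

      With only 2 features per sample, each distance computation is far cheaper, which is where the large drop in classification time comes from.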

Conclusion

  • Performing PCA on the corrected Indian Pines dataset yields 100 Principal Components (PCs).

  • Since the first two Principal Components (PCs) account for a 92.018% variance ratio, only those two were selected.

  • The dataset, initially of dimensions 21025 X 200, is drastically reduced to 21025 X 2.

  • The accuracy and time taken for classification before and after Principal Component Analysis (PCA) are:

    Dataset      Accuracy (%)   Time Taken (s)
    Before PCA   72.748890      17.6010
    After PCA    60.098187      0.17700982
  • Hence, the classification time is reduced considerably; the classification accuracy (C.A.) is also reduced, but it can be increased a little by varying the 'k' value.
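    The 'k' tuning mentioned above can be sketched with a simple cross-validated search. This uses random stand-in data for the 2-PC projection; on the real PCA-reduced Indian Pines data, the same loop would pick the k with the best cross-validated accuracy.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)
    X = rng.random((500, 2))                    # stand-in for the 2-PC projection
    y = rng.integers(0, 8, size=500)            # stand-in class labels

    scores = {}
    for k in range(1, 16, 2):                   # try odd k values 1, 3, ..., 15
        knn = KNeighborsClassifier(n_neighbors=k)
        scores[k] = cross_val_score(knn, X, y, cv=5).mean()

    best_k = max(scores, key=scores.get)
    print(f"best k: {best_k} (CV accuracy {scores[best_k]:.4f})")
    ```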

License

This project is licensed under the MIT License - see the LICENSE.md file for details.
