
snatch59 / Cnn Svm Classifier

Licence: apache-2.0
Using Tensorflow and a Support Vector Machine to Create an Image Classifications Engine

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Cnn Svm Classifier

MachineLearningSeries
Videos and code from Universo Discreto teaching the fundamentals of Machine Learning in Python. For more details, follow the listed playlist.
Stars: ✭ 20 (-39.39%)
Mutual labels:  random-forest
Hyperparameter Optimization Of Machine Learning Algorithms
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy&clear)
Stars: ✭ 516 (+1463.64%)
Mutual labels:  random-forest
Grt
gesture recognition toolkit
Stars: ✭ 739 (+2139.39%)
Mutual labels:  random-forest
2018 Machinelearning Lectures Esa
Machine Learning Lectures at the European Space Agency (ESA) in 2018
Stars: ✭ 280 (+748.48%)
Mutual labels:  random-forest
Pytorch classification
A complete PyTorch image classification codebase: training, prediction, TTA, model ensembling, model deployment, CNN feature extraction, classification with SVM or random forest, and model distillation.
Stars: ✭ 395 (+1096.97%)
Mutual labels:  random-forest
Deep Forest
An Efficient, Scalable and Optimized Python Framework for Deep Forest (2021.2.1)
Stars: ✭ 547 (+1557.58%)
Mutual labels:  random-forest
linear-tree
A python library to build Model Trees with Linear Models at the leaves.
Stars: ✭ 128 (+287.88%)
Mutual labels:  random-forest
Awesome Fraud Detection Papers
A curated list of data mining papers about fraud detection.
Stars: ✭ 843 (+2454.55%)
Mutual labels:  random-forest
Machinelearnjs
Machine Learning library for the web and Node.
Stars: ✭ 498 (+1409.09%)
Mutual labels:  random-forest
Awesome Gradient Boosting Papers
A curated list of gradient boosting research papers with implementations.
Stars: ✭ 704 (+2033.33%)
Mutual labels:  random-forest
Rrcf
🌲 Implementation of the Robust Random Cut Forest algorithm for anomaly detection on streams
Stars: ✭ 289 (+775.76%)
Mutual labels:  random-forest
User Machine Learning Tutorial
useR! 2016 Tutorial: Machine Learning Algorithmic Deep Dive http://user2016.org/tutorials/10.html
Stars: ✭ 393 (+1090.91%)
Mutual labels:  random-forest
Thundergbm
ThunderGBM: Fast GBDTs and Random Forests on GPUs
Stars: ✭ 586 (+1675.76%)
Mutual labels:  random-forest
Machine Learning With Python
Python code for common Machine Learning Algorithms
Stars: ✭ 3,334 (+10003.03%)
Mutual labels:  random-forest
Text Classification Benchmark
A text classification benchmark.
Stars: ✭ 18 (-45.45%)
Mutual labels:  random-forest
2020plus
Classifies genes as an oncogene, tumor suppressor gene, or as a non-driver gene by using Random Forests
Stars: ✭ 44 (+33.33%)
Mutual labels:  random-forest
Grf
Generalized Random Forests
Stars: ✭ 532 (+1512.12%)
Mutual labels:  random-forest
Mljar Supervised
Automated Machine Learning Pipeline with Feature Engineering and Hyper-Parameters Tuning 🚀
Stars: ✭ 961 (+2812.12%)
Mutual labels:  random-forest
Jsmlt
🏭 JavaScript Machine Learning Toolkit
Stars: ✭ 22 (-33.33%)
Mutual labels:  random-forest
H2o 3
H2O is an Open Source, Distributed, Fast & Scalable Machine Learning Platform: Deep Learning, Gradient Boosting (GBM) & XGBoost, Random Forest, Generalized Linear Modeling (GLM with Elastic Net), K-Means, PCA, Generalized Additive Models (GAM), RuleFit, Support Vector Machine (SVM), Stacked Ensembles, Automatic Machine Learning (AutoML), etc.
Stars: ✭ 5,656 (+17039.39%)
Mutual labels:  random-forest

cnn-svm-classifier

This example uses a subset of the Caltech image set (http://www.vision.caltech.edu/Image_Datasets/Caltech101/) covering 48 labels, limited to between 40 and 80 images per label. The images are fed to a TensorFlow implementation of Inception V3 with the classification layer removed, producing a set of labelled feature vectors.
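A minimal sketch of this feature-extraction step is shown below. It assumes the stock classify_image_graph_def.pb graph from inception-2015-12-05.tgz, whose 2048-d "bottleneck" layer is exposed as the 'pool_3:0' tensor; the names and exact flow in inception3_svm_classifier.py may differ.

```python
# Sketch: extract 2048-d pool_3 features from the stock Inception V3 graph.
# Tensor names follow the graph shipped in inception-2015-12-05.tgz; the
# repository script may be organised differently.
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

def load_inception_graph(graph_path='imagenet/classify_image_graph_def.pb'):
    with tf.gfile.GFile(graph_path, 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
    tf.import_graph_def(graph_def, name='')

def extract_features(image_paths):
    """Return an (N, 2048) array of pool_3 activations, one row per image."""
    load_inception_graph()
    features = []
    with tf.Session() as sess:
        # 'pool_3:0' is the 2048-d layer just before the classification head.
        pool_3 = sess.graph.get_tensor_by_name('pool_3:0')
        for path in image_paths:
            image_data = tf.gfile.GFile(path, 'rb').read()
            feat = sess.run(pool_3, {'DecodeJpeg/contents:0': image_data})
            features.append(np.squeeze(feat))
    return np.array(features)
```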

Dimensionality reduction is carried out on the 2048-d features using t-distributed stochastic neighbor embedding (t-SNE) to transform them into 2-d features that are easy to visualize. Note that t-SNE is used as an informative step only: if points with the same colour/label mostly cluster together, there is a good chance the features can be used to train a classifier with high accuracy.
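A minimal sketch of this visualisation step, assuming `features` is the (N, 2048) array from the extraction step above and `labels` holds integer class ids:

```python
# Sketch: project the 2048-d features to 2-d with t-SNE and scatter-plot them.
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

embedded = TSNE(n_components=2, init='pca', random_state=0).fit_transform(features)
plt.scatter(embedded[:, 0], embedded[:, 1], c=labels, cmap='tab20', s=8)
plt.title('t-SNE of Inception V3 features')
plt.show()
```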

The 2048-d labelled features are presented to a number of classifiers. Initially the goal of the project was to train a Support Vector Machine to classify images; for comparison, this has been extended to the following (a comparison sketch follows the list):

  • Support Vector Machine (SVM)
  • Extra Trees (ET)
  • Random Forest (RF)
  • K-Nearest Neighbor (KNN)
  • Multi-Layer Perceptron (MLP)
  • Gaussian Naive Bayes (GNB)
  • Linear Discriminant Analysis (LDA)
  • Quadratic Discriminant Analysis (QDA)
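A minimal sketch of such a comparison, assuming the `features` and `labels` from the steps above; the classifier settings here are illustrative defaults, not necessarily those used in inception3_svm_classifier.py:

```python
# Sketch: fit each classifier on a held-out split and report time and accuracy.
import time
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.svm import SVC
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                            QuadraticDiscriminantAnalysis)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, stratify=labels, random_state=0)

classifiers = {
    'SVM': SVC(kernel='linear'),
    'ET': ExtraTreesClassifier(n_estimators=100),
    'RF': RandomForestClassifier(n_estimators=100),
    'KNN': KNeighborsClassifier(),
    'MLP': MLPClassifier(max_iter=500),
    'GNB': GaussianNB(),
    'LDA': LinearDiscriminantAnalysis(),
    'QDA': QuadraticDiscriminantAnalysis(),
}

for name, clf in classifiers.items():
    start = time.time()
    clf.fit(X_train, y_train)
    acc = accuracy_score(y_test, clf.predict(X_test))
    print(f'{name}: {time.time() - start:.2f} sec, {acc:.1%}')
```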

Training and validation time, and the accuracy of each classifier, are displayed. Most classifiers were run with their default tuning values; however, tuning was carried out, where possible, on those classifiers that fell well below 90% accuracy with their defaults, such as Extra Trees and Random Forest (initially in the 75 - 78% region).
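A hedged sketch of how such tuning might be done with a grid search, reusing the split from the comparison sketch above; the parameter grid is illustrative, not the one used in the project:

```python
# Sketch: tune a Random Forest with a small grid search over common parameters.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

param_grid = {'n_estimators': [100, 300, 500],
              'max_features': ['sqrt', 'log2', None]}
search = GridSearchCV(RandomForestClassifier(random_state=0),
                      param_grid, cv=3, n_jobs=-1)
search.fit(X_train, y_train)
print(search.best_params_, search.best_score_)
```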

A summary of the results is as follows (training/test time, accuracy):

  • SVM: 6.77 sec, 96.9%
  • ET: 1.52 sec, 93.2%
  • RF: 16.47 sec, 90.8%
  • KNN: 2.2 sec, 91.5%
  • MLP: 13.83 sec, 97.1%
  • GNB: 1.1 sec, 91.8%
  • LDA: 4.95 sec, 91.0%
  • QDA: 0.84 sec, 5.3% (Variables are collinear warning!)

Note that these results vary between runs, and are just representative.

Quick Start

  1. Unzip the curated image set caltech_101_images.zip. You should then have a directory called caltech_101_images in the same directory as inception3_svm_classifier.py

  2. The imagenet directory already contains classify_image_graph_def.pb. If I've removed it to save space on my GitHub account, download it from http://download.tensorflow.org/models/image/imagenet/inception-2015-12-05.tgz, unzip it, and place classify_image_graph_def.pb in a directory called 'imagenet'.

  3. Run inception3_svm_classifier.py using Python 3. The following packages are required: tensorflow, sklearn (scikit-learn), numpy, matplotlib. Run time (from scratch) was about 28 minutes on my dual core i7 Skylake laptop.

t-SNE

caltech t-SNE plot

Support Vector Machine

caltech SVM confusion matrix

Extra Trees

caltech ET confusion matrix

Random Forest

caltech RF confusion matrix

K-Nearest Neighbor

caltech KNN confusion matrix

Multi-Layer Perceptron

caltech MLP confusion matrix

Gaussian Naive Bayes

caltech GNB confusion matrix

Linear Discriminant Analysis

caltech LDA confusion matrix

Quadratic Discriminant Analysis

caltech QDA confusion matrix
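
The confusion matrices shown above are produced by the script for each classifier. A minimal sketch of how such a plot can be generated, assuming a fitted classifier `clf` and the X_test/y_test split from the comparison sketch earlier:

```python
# Sketch: plot a confusion matrix for a fitted classifier on the held-out set.
import matplotlib.pyplot as plt
from sklearn.metrics import confusion_matrix

cm = confusion_matrix(y_test, clf.predict(X_test))
plt.imshow(cm, interpolation='nearest', cmap='Blues')
plt.title('Confusion matrix')
plt.xlabel('Predicted label')
plt.ylabel('True label')
plt.colorbar()
plt.show()
```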
