
kevinlin311tw / Caffe Deepbinarycode

Licence: other
Supervised Semantics-preserving Deep Hashing (TPAMI18)

Projects that are alternatives of or similar to Caffe Deepbinarycode

Similarity-Adaptive-Deep-Hashing
Unsupervised Deep Hashing with Similarity-Adaptive and Discrete Optimization (TPAMI2018)
Stars: ✭ 18 (-91.26%)
Mutual labels:  hashing, caffe, image-retrieval
Vehicle Retrieval Kcnns
Vehicle image retrieval using an ensemble of k CNNs.
Stars: ✭ 81 (-60.68%)
Mutual labels:  convolutional-neural-networks, caffe, image-retrieval
Feathercnn
FeatherCNN is a high performance inference engine for convolutional neural networks.
Stars: ✭ 1,106 (+436.89%)
Mutual labels:  convolutional-neural-networks, caffe
Deep Ranking
Learning Fine-grained Image Similarity with Deep Ranking is a novel application of neural networks, where the authors use a new multi-scale architecture combined with a triplet loss to create a network that can perform image search. This repository is a simplified implementation of the same.
Stars: ✭ 64 (-68.93%)
Mutual labels:  convolutional-neural-networks, image-retrieval
Core50
CORe50: a new Dataset and Benchmark for Continual Learning
Stars: ✭ 91 (-55.83%)
Mutual labels:  convolutional-neural-networks, caffe
Cnnimageretrieval Pytorch
CNN Image Retrieval in PyTorch: Training and evaluating CNNs for Image Retrieval in PyTorch
Stars: ✭ 931 (+351.94%)
Mutual labels:  convolutional-neural-networks, image-retrieval
Deep Mihash
Code for papers "Hashing with Mutual Information" (TPAMI 2019) and "Hashing with Binary Matrix Pursuit" (ECCV 2018)
Stars: ✭ 13 (-93.69%)
Mutual labels:  hashing, image-retrieval
Malware Classification
Towards Building an Intelligent Anti-Malware System: A Deep Learning Approach using Support Vector Machine for Malware Classification
Stars: ✭ 88 (-57.28%)
Mutual labels:  convolutional-neural-networks, supervised-learning
O Cnn
O-CNN: Octree-based Convolutional Neural Networks for 3D Shape Analysis
Stars: ✭ 432 (+109.71%)
Mutual labels:  convolutional-neural-networks, caffe
Idn Caffe
Caffe implementation of "Fast and Accurate Single Image Super-Resolution via Information Distillation Network" (CVPR 2018)
Stars: ✭ 104 (-49.51%)
Mutual labels:  convolutional-neural-networks, caffe
Keras Video Classifier
Keras implementation of video classifier
Stars: ✭ 100 (-51.46%)
Mutual labels:  convolutional-neural-networks, supervised-learning
Turkce Yapay Zeka Kaynaklari
A page compiling deep learning and machine learning work done in Turkey.
Stars: ✭ 1,900 (+822.33%)
Mutual labels:  convolutional-neural-networks, caffe
All Classifiers 2019
A collection of computer vision projects for Acute Lymphoblastic Leukemia classification/early detection.
Stars: ✭ 22 (-89.32%)
Mutual labels:  convolutional-neural-networks, caffe
Machine Learning Curriculum
💻 Make machines learn so that you don't have to struggle to program them; The ultimate list
Stars: ✭ 761 (+269.42%)
Mutual labels:  convolutional-neural-networks, caffe
Teacher Student Training
This repository stores the files used for my summer internship's work on "teacher-student learning", an experimental method for training deep neural networks using a trained teacher model.
Stars: ✭ 34 (-83.5%)
Mutual labels:  convolutional-neural-networks, caffe
Caffenet Benchmark
Evaluation of the CNN design choices performance on ImageNet-2012.
Stars: ✭ 700 (+239.81%)
Mutual labels:  convolutional-neural-networks, caffe
Cnn Svm
An Architecture Combining Convolutional Neural Network (CNN) and Linear Support Vector Machine (SVM) for Image Classification
Stars: ✭ 170 (-17.48%)
Mutual labels:  convolutional-neural-networks, supervised-learning
Hardnet
Hardnet descriptor model - "Working hard to know your neighbor's margins: Local descriptor learning loss"
Stars: ✭ 350 (+69.9%)
Mutual labels:  convolutional-neural-networks, image-retrieval
Neuralnetwork.net
A TensorFlow-inspired neural network library built from scratch in C# 7.3 for .NET Standard 2.0, with GPU support through cuDNN
Stars: ✭ 392 (+90.29%)
Mutual labels:  convolutional-neural-networks, supervised-learning
Lsuvinit
Reference caffe implementation of LSUV initialization
Stars: ✭ 99 (-51.94%)
Mutual labels:  convolutional-neural-networks, caffe

Caffe-DeepBinaryCode

Supervised Learning of Semantics-Preserving Deep Hashing (SSDH)

Created by Kevin Lin, Huei-Fang Yang, and Chu-Song Chen at Academia Sinica, Taipei, Taiwan.

Introduction

This paper presents a simple yet effective supervised deep hashing approach that constructs binary hash codes from labeled data for large-scale image search. We assume that the semantic labels are governed by several latent attributes, each of which is either on or off, and that classification relies on these attributes. Based on this assumption, our approach, dubbed supervised semantics-preserving deep hashing (SSDH), constructs hash functions as a latent layer in a deep network, and the binary codes are learned by minimizing an objective function defined over classification error and other desirable hash-code properties. With this design, SSDH has the nice property that classification and retrieval are unified in a single learning model. Moreover, SSDH performs joint learning of image representations, hash codes, and classification in a pointwise manner, and is thus scalable to large-scale datasets. SSDH is simple and can be realized by a slight enhancement of an existing deep architecture for classification; yet it is effective and outperforms other hashing approaches on several benchmarks and large datasets. Compared with state-of-the-art approaches, SSDH achieves higher retrieval accuracy while classification performance is not sacrificed.
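
Concretely, let a_n denote the K sigmoid activations of the latent layer for the n-th image (K = 48 in the released model). The training objective combines a classification loss with two hash-related penalties, which correspond to the classfication-error, forcing-binary, and 50%-fire-rate terms in the training log shown later. The sketch below conveys the form of the objective; the exact weights and normalizations are our reading of the loss names (the log reports weight 1 for each term), so consult the paper for the precise formulation:

    % A sketch of the SSDH objective, not the paper's exact equation:
    \min_{W} \; E_{\mathrm{cls}}(W) \;-\; E_{\mathrm{binary}}(W) \;+\; E_{\mathrm{balance}}(W)

    E_{\mathrm{binary}}(W)  = \frac{1}{K} \sum_{n} \lVert \mathbf{a}_n - 0.5\,\mathbf{e} \rVert^2   % push each activation toward 0 or 1
    E_{\mathrm{balance}}(W) = \sum_{n} \Big( \tfrac{1}{K} \textstyle\sum_{k} a_{nk} - 0.5 \Big)^2   % encourage ~50% of the bits to fire

Subtracting E_binary (i.e., maximizing it) drives the activations away from 0.5, which is also why the overall training loss reported by Caffe can be negative.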

The TPAMI pre-print is available as the arXiv preprint cited below. Presentation slides can be found here.

Citing the deep hashing work

If you find our work useful in your research, please consider citing:

Supervised Learning of Semantics-Preserving Hash via Deep Convolutional Neural Networks
Huei-Fang Yang, Kevin Lin, Chu-Song Chen
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2017

Supervised Learning of Semantics-Preserving Hashing via Deep Neural Networks for Large-Scale Image Search
Huei-Fang Yang, Kevin Lin, Chu-Song Chen
arXiv preprint arXiv:1507.00101

Prerequisites

  1. MATLAB (tested with R2012b on 64-bit Linux)
  2. Caffe's prerequisites

Install Caffe-DeepBinaryCode

Adjust Makefile.config and simply run the following commands:

$ make all -j8
$ make test -j8
$ make matcaffe
$ ./prepare.sh

For a faster build, compile in parallel by doing make all -j8 where 8 is the number of parallel threads for compilation (a good choice for the number of threads is the number of cores in your machine).

Demo

Launch matlab and run demo.m. This demo generates 48-bit binary codes for each image using the proposed SSDH.

>> demo
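
Under the hood, the demo forwards each image through the network and thresholds the sigmoid activations of the latent layer at 0.5. The following is a minimal sketch of that idea using the new matcaffe interface; the deploy prototxt path, model filename, and latent blob name ('latent_sigmoid') are assumptions here, so check the files under examples/SSDH and the actual demo.m for the real names:

    % Minimal sketch: extract one 48-bit SSDH code with matcaffe (names are assumptions).
    addpath('./matlab');
    caffe.set_mode_gpu();
    net = caffe.Net('./examples/SSDH/deploy.prototxt', ...      % hypothetical paths
                    './examples/SSDH/SSDH48.caffemodel', 'test');

    im = imread('peppers.png');                                 % any RGB test image
    d  = load('./matlab/+caffe/imagenet/ilsvrc_2012_mean.mat'); % ImageNet mean image
    input = single(im(:, :, [3 2 1]));                          % RGB -> BGR
    input = permute(input, [2 1 3]);                            % HxWxC -> WxHxC (Caffe order)
    input = imresize(input, [227 227]) - imresize(d.mean_data, [227 227]);
    net.forward({input});

    activations = net.blobs('latent_sigmoid').get_data();       % 48 values in [0, 1]
    binary_code = activations > 0.5;                            % threshold -> 48-bit code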

Retrieval evaluation on CIFAR10

Launch matlab and run run_cifar10.m to evaluate precision at k and mean average precision (mAP) at k. In this CIFAR10 demo, we employ all 10,000 test images as the query set and all 50,000 training images as the database (in the paper, only 1,000 test images are used as queries to comply with the settings of other methods). We compute mAP over the entire retrieval list, so we set k = 50,000 in this experiment. The bit length of the binary codes is 48. This process takes around 12 minutes.

>> run_cifar10

You should then get the following mAP result:

>> MAP = 0.913361
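
For reference, precision at k and average precision over a Hamming ranking can be computed along the following lines. This is an illustrative sketch rather than the repo's evaluation code, and all variable names are hypothetical:

    % Sketch: precision@k and AP for one query over a Hamming ranking.
    % B_db:      N x 48 logical matrix of database codes
    % b_query:   48 x 1 logical query code
    % labels_db: N x 1 class labels; label_q: the query's label
    dist = sum(xor(B_db, repmat(b_query', size(B_db, 1), 1)), 2); % Hamming distances
    [~, order] = sort(dist, 'ascend');                            % rank database by distance
    relevant = (labels_db(order) == label_q);                     % 1 where class matches
    k = 50000;                                                    % evaluate the full list
    prec_at_k = cumsum(relevant(1:k)) ./ (1:k)';                  % precision at each cutoff
    AP = sum(prec_at_k .* relevant(1:k)) / max(sum(relevant(1:k)), 1);
    % mAP is the mean of AP over all 10,000 queries.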

Moreover, simply run the following commands to generate the precision at k curves:

$ cd analysis
$ gnuplot plot-p-at-k.gnuplot 

This reproduces the precision curves with respect to different numbers of top retrieved samples when the 48-bit hash codes are used in the evaluation.

Train SSDH on CIFAR10

Simply run the following command to train SSDH:

$ cd examples/SSDH
$ ./train.sh

After 50,000 iterations, the top-1 error rate is around 10% on the CIFAR10 test set:

I1221 16:27:44.764175  2985 solver.cpp:326] Iteration 50000, loss = -0.10567
I1221 16:27:44.764205  2985 solver.cpp:346] Iteration 50000, Testing net (#0)
I1221 16:27:58.907842  2985 solver.cpp:414]     Test net output #0: accuracy = 0.8989
I1221 16:27:58.907877  2985 solver.cpp:414]     Test net output #1: loss: 50%-fire-rate = 0.000621793 (* 1 = 0.000621793 loss)
I1221 16:27:58.907886  2985 solver.cpp:414]     Test net output #2: loss: classfication-error = 0.369317 (* 1 = 0.369317 loss)
I1221 16:27:58.907892  2985 solver.cpp:414]     Test net output #3: loss: forcing-binary = -0.114405 (* 1 = -0.114405 loss)
I1221 16:27:58.907897  2985 solver.cpp:331] Optimization Done.
I1221 16:27:58.907902  2985 caffe.cpp:214] Optimization Done.

The training process takes roughly 2–3 hours on a desktop with a Titan X GPU. You will finally get your model, named SSDH48_iter_xxxxxx.caffemodel, under the folder examples/SSDH/.

To use the model, point model_file in demo.m to your model:

    model_file = './YOUR/MODEL/PATH/filename.caffemodel';

Launch matlab, run demo.m and enjoy!

>> demo

Train SSDH on another dataset

It should be easy to train the model using another dataset as long as that dataset has label annotations.

  1. Convert your training/test set into leveldb/lmdb format using create_imagenet.sh.
  2. Modify the source fields in examples/SSDH/train_val.prototxt to point to your training/test set.
  3. Run ./examples/SSDH/train.sh to start training on your dataset.

Resources

If ./prepare.sh fails to download the data, you may manually download the resources from:

  1. 48-bit SSDH model: MEGA, DropBox

  2. CIFAR10 dataset (jpg format): MEGA, DropBox, BaiduYun

  3. AlexNet pretrained networks: MEGA, DropBox, BaiduYun

FAQ

Q: I followed the instructions in the README and ran the evaluation code. The README says I should get an mAP of around 90%, but I only get about 10%. Could you please give me some suggestions?

A: This usually happens when matlab is not launched from Caffe's root folder; launching it there automatically adds the required folders to the MATLAB path. There are two ways to solve this problem. First, run startup.m before you run run_cifar10.m. Second, open ./matlab/feat_batch.m and change line 36 from d = load('./matlab/+caffe/imagenet/ilsvrc_2012_mean.mat'); to d = load('THE-PATH-OF-THIS-REPO-ON-YOUR-COMPUTER/matlab/+caffe/imagenet/ilsvrc_2012_mean.mat');, then run run_cifar10.m.
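
In other words, something along these lines from the repository root (the path is a placeholder):

    % Workaround sketch: make relative paths resolve before evaluating.
    cd('THE-PATH-OF-THIS-REPO-ON-YOUR-COMPUTER');  % placeholder path to the repo root
    run('startup.m');                              % adds the required folders to the path
    run_cifar10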

Contact

Please feel free to send suggestions or comments to Kevin Lin ([email protected]), Huei-Fang Yang ([email protected]), or Chu-Song Chen ([email protected]).
