shamangary / DeepCD

License: MIT
[ICCV17] DeepCD: Learning Deep Complementary Descriptors for Patch Representations

Programming Languages

Lua, Shell

Projects that are alternatives of or similar to DeepCD

ASV
[CVPR16] Accumulated Stability Voting: A Robust Descriptor from Descriptors of Multiple Scales
Stars: ✭ 26 (-33.33%)
Mutual labels:  matching, feature
Gms Feature Matcher
GMS: Grid-based Motion Statistics for Fast, Ultra-robust Feature Correspondence (CVPR 17 & IJCV 20)
Stars: ✭ 797 (+1943.59%)
Mutual labels:  matching, feature
mods-light-zmq
MODS with external deep descriptors/detectors
Stars: ✭ 46 (+17.95%)
Mutual labels:  matching, descriptor
Unsupervised-Adaptation-for-Deep-Stereo
Code for "Unsupervised Adaptation for Deep Stereo" - ICCV17
Stars: ✭ 59 (+51.28%)
Mutual labels:  iccv, iccv17
Tensorflow-Wide-Deep-Local-Prediction
This project demonstrates how to run and save predictions locally using exported tensorflow estimator model
Stars: ✭ 28 (-28.21%)
Mutual labels:  deep
ProgramUpdater
PUF - Program Updater Framework. A library to easier the task of program updating
Stars: ✭ 14 (-64.1%)
Mutual labels:  patch
Windows10Tools
Tools for Windows 10
Stars: ✭ 45 (+15.38%)
Mutual labels:  patch
deep-learning-platforms
deep-learning platforms,framework,data(深度学习平台、框架、资料)
Stars: ✭ 17 (-56.41%)
Mutual labels:  torch
hawp
Holistically-Attracted Wireframe Parsing
Stars: ✭ 146 (+274.36%)
Mutual labels:  deep
wcwidth-icons
Support fonts with double-width icons in xterm/rxvt-unicode/zsh/vim/…
Stars: ✭ 36 (-7.69%)
Mutual labels:  patch
neural-vqa-attention
❓ Attention-based Visual Question Answering in Torch
Stars: ✭ 96 (+146.15%)
Mutual labels:  torch
fuzzywuzzyR
fuzzy string matching in R
Stars: ✭ 32 (-17.95%)
Mutual labels:  matching
unleash-client-java
Unleash client SDK for Java
Stars: ✭ 86 (+120.51%)
Mutual labels:  feature
snowman
Welcome to Snowman App – a Data Matching Benchmark Platform.
Stars: ✭ 25 (-35.9%)
Mutual labels:  matching
Cross-View-Gait-Based-Human-Identification-with-Deep-CNNs
Code for 2016 TPAMI(IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE) A Comprehensive Study on Cross-View Gait Based Human Identification with Deep CNNs
Stars: ✭ 21 (-46.15%)
Mutual labels:  torch
map-keys-deep-lodash
Map/rename keys recursively with Lodash
Stars: ✭ 16 (-58.97%)
Mutual labels:  deep
WassersteinGAN.torch
Torch implementation of Wasserstein GAN https://arxiv.org/abs/1701.07875
Stars: ✭ 48 (+23.08%)
Mutual labels:  torch
RWMN
🎥 Repository for our ICCV 2017 paper: A Read Write Network for Movie Story Understanding
Stars: ✭ 87 (+123.08%)
Mutual labels:  iccv17
D3Feat.pytorch
[PyTorch] Official Implementation of CVPR'20 oral paper - D3Feat: Joint Learning of Dense Detection and Description of 3D Local Features https://arxiv.org/abs/2003.03164
Stars: ✭ 99 (+153.85%)
Mutual labels:  descriptor
torch-pitch-shift
Pitch-shift audio clips quickly with PyTorch (CUDA supported)! Additional utilities for searching efficient transformations are included.
Stars: ✭ 70 (+79.49%)
Mutual labels:  torch

DeepCD

Code Author: Tsun-Yi Yang

Last update: 2017/08/17 (Training and testing codes are both uploaded.)

Platform: Ubuntu 14.04, Torch7

Paper

[ICCV17] DeepCD: Learning Deep Complementary Descriptors for Patch Representations

Authors: Tsun-Yi Yang, Jo-Han Hsu, Yen-Yu Lin, and Yung-Yu Chuang

PDF:

Code abstract

This is the source code of DeepCD. Training is done on the Brown dataset.

Two distinct descriptors are learned from the same network.

Product late fusion in the distance domain is performed before the final ranking.
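As a rough illustration of product late fusion in the distance domain (the distance values below are hypothetical, and this is a sketch, not the paper's implementation):

```python
# Illustrative sketch of product late fusion in the distance domain.
# d_lead: L2 distances from the leading (real-valued) descriptor.
# d_comp: Hamming distances from the complementary (binary) descriptor.
# Both lists index the same candidate matches; values are made up.
d_lead = [0.8, 0.3, 1.2, 0.5]
d_comp = [14, 6, 20, 25]

# Fused distance: element-wise product of the two distance measures.
d_fused = [dl * dc for dl, dc in zip(d_lead, d_comp)]

# Final ranking: candidate indices sorted by fused distance (smallest first).
ranking = sorted(range(len(d_fused)), key=lambda i: d_fused[i])
```

A candidate must score well under both descriptors to rank highly, which is what makes the two descriptors complementary rather than redundant.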

The DeepCD project is heavily inspired by PN-Net: https://github.com/vbalnt/pnnet

This repository (author: Tsun-Yi Yang)

Related repositories (author: Jo-Han Hsu)

Model

Training with Data-Dependent Modulation (DDM) layer

  • The DDM layer dynamically adapts the learning rate of the complementary stream.

  • It considers information from the whole batch by taking both the leading and complementary distances into account.

The backward gradient is scaled by a factor η (1e-3 to 1e-4). This not only slows down the learning of the fully connected layer inside the DDM layer, but also lets us approximately ignore the effect of the DDM layer on the forward propagation of the complementary stream and treat it as an identity operation. The update equation is essentially the backward equation obtained by multiplying in a parameter w from the previous layer.

-- Identity branch: passes the complementary distances through unchanged.
a_DDM = nn.Identity()
-- Fully connected layer mapping the concatenated (leading + complementary)
-- distances of the whole batch to one modulation weight per pair.
output_layer_DDM = nn.Linear(pT.batch_size*2,pT.batch_size)
output_layer_DDM.weight:fill(0) -- zero initial weights ...
output_layer_DDM.bias:fill(1)   -- ... and unit bias, so the initial modulation is sigmoid(1)
-- Modulation branch: flatten the batch distances, apply the linear layer,
-- and squash the output to (0, 1) with a sigmoid.
b_DDM = nn.Sequential():add(nn.Reshape(pT.batch_size*2,false)):add(output_layer_DDM):add(nn.Sigmoid())
-- Run both branches on the same input ...
DDM_ct1 = nn.ConcatTable():add(a_DDM:clone()):add(b_DDM:clone())
-- ... and combine them in the data-dependent module, which scales the
-- backward gradient by the modulation weights and the learning-rate factor.
DDM_layer = nn.Sequential():add(DDM_ct1):add(nn.DataDependentModule(pT.DDM_LR))
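The effect of the η scaling on the backward pass can be sketched numerically. This is a pure-Python toy, not the Torch module; eta and w are illustrative values:

```python
# Toy sketch of the DDM behaviour described above: the forward pass acts
# (approximately) as an identity on the complementary stream, while the
# backward pass scales the gradient by eta times the modulation weight w.
eta = 1e-3  # scaling factor in the 1e-3 to 1e-4 range
w = 0.7     # hypothetical modulation weight produced by the sigmoid branch

def ddm_forward(x):
    # Forward: approximately an identity operation.
    return x

def ddm_backward(grad_output):
    # Backward: gradient scaled by eta * w, slowing the learning
    # of the complementary stream.
    return eta * w * grad_output
```

With eta at 1e-3, the complementary stream receives gradients roughly three orders of magnitude smaller than the leading stream, which is what lets the forward effect of the layer be ignored.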

Testing stage

  • A hard threshold is applied to the complementary descriptor before the Hamming distance calculation.

  • The DDM layer is not involved in the testing stage, since we only need the trained model from the triplet structure.

  • Product late fusion in the distance domain is computed before the final ranking.
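A minimal sketch of the testing-stage binarization and Hamming distance (the 0.5 threshold is an assumed value for illustration; the real code operates on the network's complementary outputs):

```python
# Binarize a complementary descriptor with a hard threshold, then compute
# the Hamming distance between two binarized descriptors.
THRESHOLD = 0.5  # assumed threshold, for illustration only

def binarize(desc):
    # Hard threshold: each element becomes 0 or 1.
    return [1 if v > THRESHOLD else 0 for v in desc]

def hamming(a, b):
    # Hamming distance: number of positions where the bits differ.
    return sum(x != y for x, y in zip(a, b))

d1 = binarize([0.9, 0.1, 0.6, 0.4])  # -> [1, 0, 1, 0]
d2 = binarize([0.2, 0.8, 0.7, 0.3])  # -> [0, 1, 1, 0]
dist = hamming(d1, d2)               # -> 2
```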

Brown dataset results
