
CompVis / Metric Learning Divide And Conquer

License: LGPL-3.0
Source code for the paper "Divide and Conquer the Embedding Space for Metric Learning", CVPR 2019

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Metric Learning Divide And Conquer

Additive Margin Softmax
This is the implementation of the paper "Additive Margin Softmax for Face Verification"
Stars: ✭ 464 (+100.87%)
Mutual labels:  metric-learning
Pointglr
Global-Local Bidirectional Reasoning for Unsupervised Representation Learning of 3D Point Clouds (CVPR 2020)
Stars: ✭ 86 (-62.77%)
Mutual labels:  metric-learning
Deep metric learning
Deep metric learning methods implemented in Chainer
Stars: ✭ 153 (-33.77%)
Mutual labels:  metric-learning
Prototypical Networks
Code for the NeurIPS 2017 Paper "Prototypical Networks for Few-shot Learning"
Stars: ✭ 705 (+205.19%)
Mutual labels:  metric-learning
Open Reid
Open source person re-identification library in python
Stars: ✭ 1,144 (+395.24%)
Mutual labels:  metric-learning
Negative Margin.few Shot
PyTorch implementation of “Negative Margin Matters: Understanding Margin in Few-shot Classification”
Stars: ✭ 101 (-56.28%)
Mutual labels:  metric-learning
Deep Metric Learning Baselines
PyTorch Implementation for Deep Metric Learning Pipelines
Stars: ✭ 442 (+91.34%)
Mutual labels:  metric-learning
Magnetloss Pytorch
PyTorch implementation of "Magnet Loss", a deep metric learning technique from Facebook AI Research (FAIR) presented at ICLR 2016.
Stars: ✭ 217 (-6.06%)
Mutual labels:  metric-learning
Mvgcn
Multi-View Graph Convolutional Network and Its Applications on Neuroimage Analysis for Parkinson's Disease (AMIA 2018)
Stars: ✭ 81 (-64.94%)
Mutual labels:  metric-learning
Segsort
SegSort: Segmentation by Discriminative Sorting of Segments
Stars: ✭ 130 (-43.72%)
Mutual labels:  metric-learning
Hcn Prototypeloss Pytorch
Hierarchical Co-occurrence Network with Prototype Loss for Few-shot Learning (PyTorch)
Stars: ✭ 17 (-92.64%)
Mutual labels:  metric-learning
Metric Learn
Metric learning algorithms in Python
Stars: ✭ 1,125 (+387.01%)
Mutual labels:  metric-learning
Declutr
The corresponding code from our paper "DeCLUTR: Deep Contrastive Learning for Unsupervised Textual Representations". Do not hesitate to open an issue if you run into any trouble!
Stars: ✭ 111 (-51.95%)
Mutual labels:  metric-learning
Humpback Whale Identification 1st
https://www.kaggle.com/c/humpback-whale-identification
Stars: ✭ 591 (+155.84%)
Mutual labels:  metric-learning
Revisiting deep metric learning pytorch
(ICML 2020) This repo contains code for our paper "Revisiting Training Strategies and Generalization Performance in Deep Metric Learning" (https://arxiv.org/abs/2002.08473) to facilitate consistent research in the field of Deep Metric Learning.
Stars: ✭ 172 (-25.54%)
Mutual labels:  metric-learning
Amsoftmax
A simple yet effective loss function for face verification.
Stars: ✭ 443 (+91.77%)
Mutual labels:  metric-learning
Pvse
Polysemous Visual-Semantic Embedding for Cross-Modal Retrieval (CVPR 2019)
Stars: ✭ 93 (-59.74%)
Mutual labels:  metric-learning
Catalyst
Accelerated deep learning R&D
Stars: ✭ 2,804 (+1113.85%)
Mutual labels:  metric-learning
Pytorch Image Retrieval
A PyTorch framework for an image retrieval task including implementation of N-pair Loss (NIPS 2016) and Angular Loss (ICCV 2017).
Stars: ✭ 203 (-12.12%)
Mutual labels:  metric-learning
Dml cross entropy
Code for the paper "A unifying mutual information view of metric learning: cross-entropy vs. pairwise losses" (ECCV 2020 - Spotlight)
Stars: ✭ 117 (-49.35%)
Mutual labels:  metric-learning

Divide and Conquer the Embedding Space for Metric Learning

About

This repository contains the code for reproducing the results for Divide and Conquer the Embedding Space for Metric Learning (CVPR 2019) with the datasets In-Shop Clothes, Stanford Online Products and PKU VehicleID.

Paper: pdf
Supplementary: pdf

We also applied our method to the Humpback Whale Identification Challenge on Kaggle and finished in 10th place out of 2131 teams.
Slides: link

[Figure: method pipeline]

Requirements
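
At a minimum, the usage below assumes:

  • Python 3
  • PyTorch
  • Faiss (the GPU build for --backend=faiss-gpu; a CPU-only build also works, with --backend=faiss)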

Usage

The following command trains the model with Margin loss on the In-Shop Clothes dataset for 200 epochs with a batch size of 80, splitting the embedding layer into 8 clusters and finetuning the model from epoch 190 onwards. You can use this command to reproduce the results of the paper for all three datasets by simply changing --dataset=inshop to --dataset=sop (Stanford Online Products) or --dataset=vid (Vehicle-ID).

CUDA_VISIBLE_DEVICES=0 python experiment.py --dataset=inshop \
--dir=test --exp=0 --random-seed=0 --nb-clusters=8 --nb-epochs=200 \
--sz-batch=80 --backend=faiss-gpu  --embedding-lr=1e-5 --embedding-wd=1e-4 \
--backbone-lr=1e-5 --backbone-wd=1e-4 --finetune-epoch=190
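
As a rough illustration of what --nb-clusters controls, the sketch below (hypothetical helper functions, not the repository's actual code) clusters the training samples in embedding space with Faiss k-means and assigns each cluster to its own slice ("learner") of the full embedding:

import faiss
import numpy as np
import torch

def cluster_dataset(embeddings: np.ndarray, nb_clusters: int = 8) -> np.ndarray:
    # K-means in embedding space; returns one cluster id per training sample.
    kmeans = faiss.Kmeans(embeddings.shape[1], nb_clusters, niter=20)
    kmeans.train(embeddings.astype(np.float32))
    _, assignments = kmeans.index.search(embeddings.astype(np.float32), 1)
    return assignments.ravel()

def learner_slice(full_embedding: torch.Tensor, cluster_id: int, nb_clusters: int = 8) -> torch.Tensor:
    # Each cluster is trained with its own slice of the embedding vector;
    # e.g. with 8 clusters and a 128-d embedding, each learner owns 16 dimensions.
    sz = full_embedding.shape[1] // nb_clusters
    return full_embedding[:, cluster_id * sz:(cluster_id + 1) * sz]

In the paper's setup, the clusters are recomputed periodically during training, and from --finetune-epoch onwards the full embedding is finetuned jointly.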

The model can be trained without the proposed method by setting the number of clusters to 1 with --nb-clusters=1.
For faster clustering we run Faiss on the GPU. If you installed Faiss without GPU support, use the flag --backend=faiss.
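
For example, a baseline run with a single cluster and the CPU build of Faiss changes only those two flags relative to the command above:

CUDA_VISIBLE_DEVICES=0 python experiment.py --dataset=inshop \
--dir=test --exp=0 --random-seed=0 --nb-clusters=1 --nb-epochs=200 \
--sz-batch=80 --backend=faiss --embedding-lr=1e-5 --embedding-wd=1e-4 \
--backbone-lr=1e-5 --backbone-wd=1e-4 --finetune-epoch=190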

Expected Results

The model checkpoints and log files are saved in the selected log directory. You can print a summary of the results with python browse_results <log path>.
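
For example, assuming the logs from the command above landed under test/ (the directory passed via --dir), the call might look like:

python browse_results test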

You will get slightly higher results than those reported in the paper. For SOP, In-Shop and Vehicle-ID, the Recall@1 results should be around 76.40, 87.36 and 91.54, respectively.

Related Repos

  • Collection of baselines for metric learning from @Confusezius [PyTorch]

License

The code is released under the LGPL-3.0 license; you may find out more about the license here.

Reference

If you use this code, please cite the following paper:

Artsiom Sanakoyeu, Vadim Tschernezki, Uta Büchler, Björn Ommer. "Divide and Conquer the Embedding Space for Metric Learning", CVPR 2019.

@InProceedings{dcesml,
  title={Divide and Conquer the Embedding Space for Metric Learning},
  author={Sanakoyeu, Artsiom and Tschernezki, Vadim and B\"uchler, Uta and Ommer, Bj\"orn},
  booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
  year={2019},
}