
ppriyank / -Online-Soft-Mining-and-Class-Aware-Attention-Pytorch

Licence: other
(PyTorch and TensorFlow) Implementation of Weighted Contrastive Loss (Deep Metric Learning by Online Soft Mining and Class-Aware Attention)

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to -Online-Soft-Mining-and-Class-Aware-Attention-Pytorch

GeDML
Generalized Deep Metric Learning.
Stars: ✭ 30 (+50%)
Mutual labels:  loss-functions, deep-metric-learning
GIouloss CIouloss caffe
Caffe version Generalized & Distance & Complete Iou loss Implementation for Faster RCNN/FPN bbox regression
Stars: ✭ 42 (+110%)
Mutual labels:  loss-functions
ofFaceRecognition
simple example face recognition with deep metric learning to dlib
Stars: ✭ 20 (+0%)
Mutual labels:  deep-metric-learning
Ranked-List-Loss-for-DML
CVPR 2019: Ranked List Loss for Deep Metric Learning, with extension for TPAMI submission
Stars: ✭ 56 (+180%)
Mutual labels:  deep-metric-learning
Pytorch Metric Learning
The easiest way to use deep metric learning in your application. Modular, flexible, and extensible. Written in PyTorch.
Stars: ✭ 3,936 (+19580%)
Mutual labels:  deep-metric-learning
consistency
Implementation of models in our EMNLP 2019 paper: A Logic-Driven Framework for Consistency of Neural Models
Stars: ✭ 26 (+30%)
Mutual labels:  loss-functions
NCE-loss
Tensorflow NCE loss in Keras
Stars: ✭ 30 (+50%)
Mutual labels:  loss-functions
introduction-to-machine-learning
A document covering machine learning basics. 🤖📊
Stars: ✭ 17 (-15%)
Mutual labels:  loss-functions
SphereFace
🍑 TensorFlow Code for CVPR 2017 paper "SphereFace: Deep Hypersphere Embedding for Face Recognition"
Stars: ✭ 110 (+450%)
Mutual labels:  loss-functions
hierarchical-categories-loss-tensorflow
A loss function for categories with a hierarchical structure.
Stars: ✭ 26 (+30%)
Mutual labels:  loss-functions
Addressing-Class-Imbalance-FL
This is the code for Addressing Class Imbalance in Federated Learning (AAAI-2021).
Stars: ✭ 62 (+210%)
Mutual labels:  loss-functions
triplet-loss-pytorch
Highly efficient PyTorch version of the Semi-hard Triplet loss ⚡️
Stars: ✭ 79 (+295%)
Mutual labels:  loss-functions
stylegan-encoder
StyleGAN Encoder - converts real images to latent space
Stars: ✭ 694 (+3370%)
Mutual labels:  loss-functions
SphericalEmbedding
official pytorch implementation of "Deep Metric Learning with Spherical Embedding", NeurIPS 2020
Stars: ✭ 35 (+75%)
Mutual labels:  deep-metric-learning
LinearityIQA
[official] Norm-in-Norm Loss with Faster Convergence and Better Performance for Image Quality Assessment (ACM MM 2020)
Stars: ✭ 73 (+265%)
Mutual labels:  loss-functions
ProxyGML
Official PyTorch Implementation of ProxyGML Loss for Deep Metric Learning, NeurIPS 2020 (spotlight)
Stars: ✭ 44 (+120%)
Mutual labels:  deep-metric-learning
C3Net
C3Net: Demoireing Network Attentive in Channel, Color and Concatenation (CVPRW 2020)
Stars: ✭ 17 (-15%)
Mutual labels:  loss-functions
Image-Segmentation-Loss-Functions
some loss functions of image segmentation
Stars: ✭ 56 (+180%)
Mutual labels:  loss-functions
opl
Official repository for "Orthogonal Projection Loss" (ICCV'21)
Stars: ✭ 61 (+205%)
Mutual labels:  loss-functions
pycsou
Pycsou is a Python 3 package for solving linear inverse problems with state-of-the-art proximal algorithms. The software implements in a highly modular way the main building blocks -cost functionals, penalty terms and linear operators- of generic penalised convex optimisation problems.
Stars: ✭ 37 (+85%)
Mutual labels:  loss-functions

Online-Soft-Mining-and-Class-Aware-Attention

Implementation of Weighted Contrastive Loss from

Deep Metric Learning by Online Soft Mining and Class-Aware Attention (https://arxiv.org/pdf/1811.01459v2.pdf)
Xinshao Wang, Yang Hua, Elyor Kodirov, Guosheng Hu, Neil M. Robertson
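For orientation, a weighted contrastive loss takes the general form L = Σ_ij w_ij [ y_ij d_ij² + (1 − y_ij) max(0, m − d_ij)² ] / Σ_ij w_ij, where y_ij indicates whether a pair shares a class and w_ij is a per-pair weight. The NumPy sketch below illustrates only this generic form; the function name is illustrative, and the uniform weights are placeholders, not the OSM/CAA weights the paper derives from soft mining and class-aware attention.

```python
import numpy as np

def weighted_contrastive_loss(feats, labels, weights, margin=0.5):
    """Generic weighted contrastive loss over all pairs in a batch.

    feats:   (n, d) L2-normalized embeddings
    labels:  (n,) integer class ids
    weights: (n, n) per-pair weights (placeholders here; the paper
             derives them via online soft mining and class-aware attention)
    """
    # Pairwise Euclidean distances between normalized features
    diff = feats[:, None, :] - feats[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1) + 1e-12)
    same = (labels[:, None] == labels[None, :]).astype(feats.dtype)
    # Pull same-class pairs together, push different-class pairs past the margin
    pos = same * dist ** 2
    neg = (1 - same) * np.maximum(0.0, margin - dist) ** 2
    return (weights * (pos + neg)).sum() / (weights.sum() + 1e-12)

# Toy usage with uniform weights
rng = np.random.default_rng(0)
f = rng.normal(size=(8, 16))
f /= np.linalg.norm(f, axis=1, keepdims=True)
y = rng.integers(0, 3, size=8)
w = np.ones((8, 8))
print(weighted_contrastive_loss(f, y, w))
```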

Usage (person re-id), PyTorch:


criterion_osm_caa = OSM_CAA_Loss()
if use_gpu:
    imgs, pids = imgs.cuda(), pids.cuda()
outputs, features = model(imgs)
# The transposed classifier weights serve as the class centers for the loss
if use_gpu:
    # model is wrapped in nn.DataParallel, so the classifier sits under model.module
    loss = criterion_osm_caa(features, pids, model.module.classifier.weight.t())
else:
    loss = criterion_osm_caa(features, pids, model.classifier.weight.t())

TensorFlow:

sess = tf.Session()
x = tf.random.uniform([32, 200])     # batch size = 32, embedding dim = 200
embd = tf.random.uniform([200, 10])  # embedding dim = 200, num of classes = 10
labels = tf.random.uniform([32], maxval=10, dtype=tf.int32)  # class ids

loss = OSM_CAA_Loss()
osm_loss = loss.forward

loss_val = osm_loss(x, labels, embd)
sess.run(loss_val)

If you find any deviation from the paper, please let me know (raise an issue) and I will make the necessary changes.

Comments

dist refers to the n x n matrix of pairwise distances between normalized feature vectors, with dij = dist[i][j]

A refers to the pairwise attention score matrix, with Aij = min(ai, aj)
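Under the definitions above, dist and A can be sketched as follows; the per-sample attention scores a are random placeholders here (in the paper they come from the class-aware attention module):

```python
import numpy as np

n, d = 4, 8
rng = np.random.default_rng(1)
feats = rng.normal(size=(n, d))
feats /= np.linalg.norm(feats, axis=1, keepdims=True)  # normalize features

# dist[i][j] = Euclidean distance between normalized features i and j
diff = feats[:, None, :] - feats[None, :, :]
dist = np.sqrt((diff ** 2).sum(-1))

# A[i][j] = min(a_i, a_j), with placeholder per-sample attention scores a
a = rng.uniform(size=n)
A = np.minimum(a[:, None], a[None, :])
```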
