
zbr17 / GeDML

License: MIT
Generalized Deep Metric Learning.

Programming Languages

python

Projects that are alternatives of or similar to GeDML

Pytorch Metric Learning
The easiest way to use deep metric learning in your application. Modular, flexible, and extensible. Written in PyTorch.
Stars: ✭ 3,936 (+13020%)
Mutual labels:  metric-learning, deep-metric-learning, self-supervised-learning, contrastive-learning
TCE
This repository contains the code implementation used in the paper Temporally Coherent Embeddings for Self-Supervised Video Representation Learning (TCE).
Stars: ✭ 51 (+70%)
Mutual labels:  metric-learning, self-supervised-learning, contrastive-learning
SCL
📄 Spatial Contrastive Learning for Few-Shot Classification (ECML/PKDD 2021).
Stars: ✭ 42 (+40%)
Mutual labels:  self-supervised-learning, contrastive-learning
-Online-Soft-Mining-and-Class-Aware-Attention-Pytorch
(Pytorch and Tensorflow) Implementation of Weighted Contrastive Loss (Deep Metric Learning by Online Soft Mining and Class-Aware Attention)
Stars: ✭ 20 (-33.33%)
Mutual labels:  loss-functions, deep-metric-learning
object-aware-contrastive
Object-aware Contrastive Learning for Debiased Scene Representation (NeurIPS 2021)
Stars: ✭ 44 (+46.67%)
Mutual labels:  self-supervised-learning, contrastive-learning
ViCC
[WACV'22] Code repository for the paper "Self-supervised Video Representation Learning with Cross-Stream Prototypical Contrasting", https://arxiv.org/abs/2106.10137.
Stars: ✭ 33 (+10%)
Mutual labels:  self-supervised-learning, contrastive-learning
Simclr
SimCLRv2 - Big Self-Supervised Models are Strong Semi-Supervised Learners
Stars: ✭ 2,720 (+8966.67%)
Mutual labels:  self-supervised-learning, contrastive-learning
triplet-loss-pytorch
Highly efficient PyTorch version of the Semi-hard Triplet loss ⚡️
Stars: ✭ 79 (+163.33%)
Mutual labels:  metric-learning, loss-functions
GCA
[WWW 2021] Source code for "Graph Contrastive Learning with Adaptive Augmentation"
Stars: ✭ 69 (+130%)
Mutual labels:  self-supervised-learning, contrastive-learning
proxy-synthesis
Official PyTorch implementation of "Proxy Synthesis: Learning with Synthetic Classes for Deep Metric Learning" (AAAI 2021)
Stars: ✭ 30 (+0%)
Mutual labels:  metric-learning, deep-metric-learning
S2-BNN
S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration (CVPR 2021)
Stars: ✭ 53 (+76.67%)
Mutual labels:  self-supervised-learning, contrastive-learning
DisCont
Code for the paper "DisCont: Self-Supervised Visual Attribute Disentanglement using Context Vectors".
Stars: ✭ 13 (-56.67%)
Mutual labels:  self-supervised-learning, contrastive-learning
PIC
Parametric Instance Classification for Unsupervised Visual Feature Learning, NeurIPS 2020
Stars: ✭ 41 (+36.67%)
Mutual labels:  self-supervised-learning, contrastive-learning
CLMR
Official PyTorch implementation of Contrastive Learning of Musical Representations
Stars: ✭ 216 (+620%)
Mutual labels:  self-supervised-learning, contrastive-learning
simclr-pytorch
PyTorch implementation of SimCLR: supports multi-GPU training and closely reproduces results
Stars: ✭ 89 (+196.67%)
Mutual labels:  self-supervised-learning, contrastive-learning
CLSA
official implementation of "Contrastive Learning with Stronger Augmentations"
Stars: ✭ 48 (+60%)
Mutual labels:  self-supervised-learning, contrastive-learning
G-SimCLR
This is the code base for paper "G-SimCLR : Self-Supervised Contrastive Learning with Guided Projection via Pseudo Labelling" by Souradip Chakraborty, Aritra Roy Gosthipaty and Sayak Paul.
Stars: ✭ 69 (+130%)
Mutual labels:  self-supervised-learning, contrastive-learning
AdCo
AdCo: Adversarial Contrast for Efficient Learning of Unsupervised Representations from Self-Trained Negative Adversaries
Stars: ✭ 148 (+393.33%)
Mutual labels:  self-supervised-learning, contrastive-learning
Revisiting-Contrastive-SSL
Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations. [NeurIPS 2021]
Stars: ✭ 81 (+170%)
Mutual labels:  self-supervised-learning, contrastive-learning
FastAP-metric-learning
Code for CVPR 2019 paper "Deep Metric Learning to Rank"
Stars: ✭ 93 (+210%)
Mutual labels:  metric-learning, deep-metric-learning


News

  • [2022-3-22]: v0.2.2 has been released:
    • Fix some bugs.
  • [2021-11-3]: v0.2.0 has been released:
    • New features:
      • Change the format of the link configuration.
  • [2021-10-27]: v0.1.4 has been released:
    • New features:
      • Add contrastive representation learning methods (MoCo-V2).
  • [2021-10-24]: v0.1.2 has been released:
    • New features:
      • Add distributed (DDP) support.
  • [2021-10-7]: v0.1.1 has been released:
    • New features:
      • Change the Cars196 loading method.
  • [2021-9-15]: v0.1.0 has been released:
    • New features:
      • The output_wrapper and the pipeline setting are decoupled for convenience.
      • The pipeline will be stored in the experiment folder as a directed graph.
  • [2021-9-13]: v0.0.1 has been released:
    • New features:
      • A config.yaml file will be created in the experiment folder to store the configuration.
  • [2021-9-6]: v0.0.0 has been released.

Introduction

GeDML is an easy-to-use generalized deep metric learning library, which contains:

  • State-of-the-art DML algorithms: We provide 18+ loss functions and 6+ sampling strategies, and divide these algorithms into three categories (i.e., collectors, selectors, and losses); see the sketch after this list.
  • Bridge between DML and SSL: We attempt to bridge the gap between deep metric learning and self-supervised learning through specially designed modules, such as collectors.
  • Auxiliary modules to assist in building: We also encapsulate a high-level interface so users can start programs quickly, and we separate code from configs to manage hyper-parameters conveniently.
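
The three categories compose into one pipeline: a collector post-processes the batch embeddings (e.g., maintaining proxies or a momentum queue), a selector picks or weights the informative tuples, and a loss reduces them to a scalar. The sketch below illustrates this data flow only; dml_step and its arguments are hypothetical placeholders, not GeDML's actual API.

# Illustrative collector -> selector -> loss data flow.
# All names here are hypothetical placeholders, not GeDML's real API.
def dml_step(collector, selector, loss_fn, embeddings, labels):
    # Collector: may augment the batch (proxies, synthetic samples, momentum queue)
    metric_mat, row_labels, col_labels = collector(embeddings, labels)
    # Selector: mask or weight the informative pairs/triplets
    weights = selector(metric_mat, row_labels, col_labels)
    # Loss: reduce the weighted similarity matrix to a scalar
    return loss_fn(metric_mat, weights)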

Installation

Pip

pip install gedml

Quickstart

Demo 1: deep metric learning

CUDA_VISIBLE_DEVICES=0 python demo.py \
--data_path <path_to_data> \
--save_path <path_to_save> \
--eval_exclude f1_score NMI AMI \
--device 0 --batch_size 128 --test_batch_size 128 \
--setting proxy_anchor --splits_to_eval test --embeddings_dim 128 \
--lr_trunk 0.0001 --lr_embedder 0.0001 --lr_collector 0.01 \
--dataset cub200 --delete_old
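
Here --setting proxy_anchor selects the proxy-anchor configuration, --eval_exclude drops the listed metrics from evaluation, and the trunk, embedder, and collector (which, for proxy-based settings, presumably holds the proxies) each get their own learning rate.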

Demo 2: contrastive representation learning

CUDA_VISIBLE_DEVICES=0 python demo.py \
--data_path <path_to_data> \
--save_path <path_to_save> \
--eval_exclude f1_score NMI AMI \
--device 0 --batch_size 128 --test_batch_size 128 \
--setting mocov2 --splits_to_eval test --embeddings_dim 128 \
--lr_trunk 0.015 --lr_embedder 0.015 \
--dataset imagenet --delete_old

If you want to use our code to run DML or CRL experiments, please refer to the most up-to-date and detailed configurations below: 👇

  • If you use the command line, you can run sample_run.sh to try this project.
  • If you debug with VS Code, you can refer to launch.json to set up .vscode.

API

Initialization

Use ParserWithConvert to parse the parameters:

>>> from gedml.launcher.misc import ParserWithConvert
>>> csv_path = ...
>>> parser = ParserWithConvert(csv_path=csv_path, name="...")
>>> opt, convert_dict = parser.render()
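
The returned opt carries the parsed hyper-parameters, while convert_dict is handed on to ConfigHandler below to map them into the config files.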

Use ConfigHandler to create all objects.

>>> from gedml.launcher.creators import ConfigHandler
>>> link_path = ...
>>> assert_path = ...
>>> param_path = ...
>>> config_handler = ConfigHandler(
...     convert_dict=convert_dict,
...     link_path=link_path,
...     assert_path=assert_path,
...     params_path=param_path,
...     is_confirm_first=True
... )
>>> config_handler.get_params_dict()
>>> objects_dict = config_handler.create_all()
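
After create_all(), objects_dict holds every instantiated component keyed by category (e.g., "managers", "trainers", "testers", "recorders"), which is how the helpers below look them up.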

Start

Use manager to automatically call trainer and tester.

>>> from gedml.launcher.misc import utils
>>> manager = utils.get_default(objects_dict, "managers")
>>> manager.run()

Or directly use trainer and tester.

>>> from gedml.launcher.misc import utils
>>> trainer = utils.get_default(objects_dict, "trainers")
>>> tester = utils.get_default(objects_dict, "testers")
>>> recorder = utils.get_default(objects_dict, "recorders")
# start to train
>>> utils.func_params_mediator(
...     [objects_dict],
...     trainer.__call__
... )
# start to test
>>> metrics = utils.func_params_mediator(
...     [
...         {"recorders": recorder},
...         objects_dict,
...     ],
...     tester.__call__
... )
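
In both cases, utils.func_params_mediator appears to match the callable's expected parameters against the supplied dictionaries, so trainer.__call__ and tester.__call__ receive the objects they need without manual wiring.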

Document

For more information, please refer to: 👉 Docs 📖

Some specific guidance:

Configs

We will continually update the optimal parameters for the different configs in TsinghuaCloud.

Framework

This project is modular in design. The pipeline diagram is as follows:

[Pipeline diagram]

Code structure

Method

Collectors

Method            Description
BaseCollector     Base class
DefaultCollector  Do nothing
ProxyCollector    Maintain a set of proxies
MoCoCollector     paper: Momentum Contrast for Unsupervised Visual Representation Learning
SimSiamCollector  paper: Exploring Simple Siamese Representation Learning
HDMLCollector     paper: Hardness-Aware Deep Metric Learning
DAMLCollector     paper: Deep Adversarial Metric Learning
DVMLCollector     paper: Deep Variational Metric Learning
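
As one concrete example of what a collector encapsulates, MoCo-style methods keep a momentum ("key") encoder that trails the query encoder. The sketch below shows the standard momentum update from the MoCo paper in plain PyTorch; it is illustrative only, not GeDML's MoCoCollector code.

# Standard MoCo-style momentum update (illustrative, not GeDML code):
# the key encoder is an exponential moving average of the query encoder.
import torch

@torch.no_grad()
def momentum_update(encoder_q, encoder_k, m=0.999):
    for q, k in zip(encoder_q.parameters(), encoder_k.parameters()):
        k.data.mul_(m).add_(q.data, alpha=1 - m)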

Losses

classifier-based

Method                  Description
CrossEntropyLoss        Cross entropy loss for unsupervised methods
LargeMarginSoftmaxLoss  paper: Large-Margin Softmax Loss for Convolutional Neural Networks
ArcFaceLoss             paper: ArcFace: Additive Angular Margin Loss for Deep Face Recognition
CosFaceLoss             paper: CosFace: Large Margin Cosine Loss for Deep Face Recognition
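
To make the classifier-based family concrete, the sketch below computes ArcFace-style logits in plain PyTorch: an additive angular margin m is applied to the target class before scaling by s. This is a generic illustration, not GeDML's ArcFaceLoss implementation.

# Generic ArcFace-style logits (illustrative, not GeDML's ArcFaceLoss).
import torch
import torch.nn.functional as F

def arcface_logits(embeddings, class_weights, labels, s=64.0, m=0.5):
    # cosine similarity between L2-normalized embeddings and class weights
    cos = F.linear(F.normalize(embeddings), F.normalize(class_weights))
    theta = torch.acos(cos.clamp(-1 + 1e-7, 1 - 1e-7))
    target = F.one_hot(labels, num_classes=cos.size(1)).bool()
    # add the angular margin m only to the target-class angle
    logits = torch.where(target, torch.cos(theta + m), cos)
    return s * logits  # feed into F.cross_entropy(logits, labels)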

pair-based

Method                  Description
ContrastiveLoss         paper: Learning a Similarity Metric Discriminatively, with Application to Face Verification
MarginLoss              paper: Sampling Matters in Deep Embedding Learning
TripletLoss             paper: Learning local feature descriptors with triplets and shallow convolutional neural networks
AngularLoss             paper: Deep Metric Learning with Angular Loss
CircleLoss              paper: Circle Loss: A Unified Perspective of Pair Similarity Optimization
FastAPLoss              paper: Deep Metric Learning to Rank
LiftedStructureLoss     paper: Deep Metric Learning via Lifted Structured Feature Embedding
MultiSimilarityLoss     paper: Multi-Similarity Loss With General Pair Weighting for Deep Metric Learning
NPairLoss               paper: Improved Deep Metric Learning with Multi-class N-pair Loss Objective
SignalToNoiseRatioLoss  paper: Signal-To-Noise Ratio: A Robust Distance Metric for Deep Metric Learning
PosPairLoss             paper: Exploring Simple Siamese Representation Learning
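
For the pair-based family, the classic contrastive loss is the simplest member: positives are pulled together while negatives are pushed beyond a margin. Below is a minimal generic version in PyTorch, not GeDML's ContrastiveLoss.

# Minimal pair-based contrastive loss (generic sketch, not GeDML code).
import torch

def contrastive_loss(dist, pos_mask, neg_mask, margin=0.5):
    # dist: (N, N) pairwise distance matrix; masks are boolean (N, N)
    pos_term = dist[pos_mask].pow(2)                           # pull positives together
    neg_term = (margin - dist[neg_mask]).clamp(min=0).pow(2)   # push negatives past margin
    return torch.cat([pos_term, neg_term]).mean()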

proxy-based

Method           Description
ProxyLoss        paper: No Fuss Distance Metric Learning Using Proxies
ProxyAnchorLoss  paper: Proxy Anchor Loss for Deep Metric Learning
SoftTripleLoss   paper: SoftTriple Loss: Deep Metric Learning Without Triplet Sampling
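
Proxy-based losses replace pairwise mining with a learnable proxy per class. Here is a ProxyNCA-style sketch for illustration; GeDML's ProxyLoss may differ.

# ProxyNCA-style loss (illustrative sketch, not GeDML's ProxyLoss).
import torch
import torch.nn.functional as F

def proxy_nca_loss(embeddings, proxies, labels):
    # distances between normalized embeddings and one proxy per class
    dist = torch.cdist(F.normalize(embeddings), F.normalize(proxies))
    # softmax over negative distances attracts each sample to its class proxy
    return F.cross_entropy(-dist, labels)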

Selectors

Method                Description
BaseSelector          Base class
DefaultSelector       Do nothing
DenseTripletSelector  Select all triplets
DensePairSelector     Select all pairs
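
Selection itself is simple to picture: given the batch labels, a dense selector marks every positive and negative pair. A generic sketch, not GeDML's DensePairSelector:

# Dense pair selection from batch labels (generic sketch, not GeDML code).
import torch

def dense_pairs(labels):
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    eye = torch.eye(len(labels), dtype=torch.bool, device=labels.device)
    pos_mask = same & ~eye   # same class, self-pairs excluded
    neg_mask = ~same         # different class
    return pos_mask, neg_mask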

Code Reference

TODO:

  • Assert parameters.
  • Write a GitHub Action to automate unit tests, package publishing, and docs building.
  • Add a cross-validation splits protocol.
  • Add a distributed tester for matrix-form input.
  • Add a metrics module.
  • Improve running efficiency.

IMPORTANT TODO:

  • Re-define the pipeline setting!
  • Simplify the distributed setting!