
hologerry / SoCo

License: MIT
[NeurIPS 2021 Spotlight] Aligning Pretraining for Detection via Object-Level Contrastive Learning

Programming Languages

python
shell
Dockerfile

Projects that are alternatives of or similar to SoCo

CLMR
Official PyTorch implementation of Contrastive Learning of Musical Representations
Stars: ✭ 216 (+72.8%)
Mutual labels:  self-supervised-learning, contrastive-learning
GeDML
Generalized Deep Metric Learning.
Stars: ✭ 30 (-76%)
Mutual labels:  self-supervised-learning, contrastive-learning
SCL
📄 Spatial Contrastive Learning for Few-Shot Classification (ECML/PKDD 2021).
Stars: ✭ 42 (-66.4%)
Mutual labels:  self-supervised-learning, contrastive-learning
Simclr
SimCLRv2 - Big Self-Supervised Models are Strong Semi-Supervised Learners
Stars: ✭ 2,720 (+2076%)
Mutual labels:  self-supervised-learning, contrastive-learning
TCE
This repository contains the code implementation used in the paper Temporally Coherent Embeddings for Self-Supervised Video Representation Learning (TCE).
Stars: ✭ 51 (-59.2%)
Mutual labels:  self-supervised-learning, contrastive-learning
DiGCL
The PyTorch implementation of Directed Graph Contrastive Learning (DiGCL), NeurIPS-2021
Stars: ✭ 27 (-78.4%)
Mutual labels:  contrastive-learning, neurips-2021
ViCC
[WACV'22] Code repository for the paper "Self-supervised Video Representation Learning with Cross-Stream Prototypical Contrasting", https://arxiv.org/abs/2106.10137.
Stars: ✭ 33 (-73.6%)
Mutual labels:  self-supervised-learning, contrastive-learning
GCA
[WWW 2021] Source code for "Graph Contrastive Learning with Adaptive Augmentation"
Stars: ✭ 69 (-44.8%)
Mutual labels:  self-supervised-learning, contrastive-learning
Revisiting-Contrastive-SSL
Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations. [NeurIPS 2021]
Stars: ✭ 81 (-35.2%)
Mutual labels:  self-supervised-learning, contrastive-learning
object-aware-contrastive
Object-aware Contrastive Learning for Debiased Scene Representation (NeurIPS 2021)
Stars: ✭ 44 (-64.8%)
Mutual labels:  self-supervised-learning, contrastive-learning
Pytorch Metric Learning
The easiest way to use deep metric learning in your application. Modular, flexible, and extensible. Written in PyTorch.
Stars: ✭ 3,936 (+3048.8%)
Mutual labels:  self-supervised-learning, contrastive-learning
info-nce-pytorch
PyTorch implementation of the InfoNCE loss for self-supervised learning.
Stars: ✭ 160 (+28%)
Mutual labels:  self-supervised-learning, contrastive-learning
PIC
Parametric Instance Classification for Unsupervised Visual Feature Learning, NeurIPS 2020
Stars: ✭ 41 (-67.2%)
Mutual labels:  self-supervised-learning, contrastive-learning
awesome-graph-self-supervised-learning-based-recommendation
A curated list of awesome graph & self-supervised-learning-based recommendation.
Stars: ✭ 37 (-70.4%)
Mutual labels:  self-supervised-learning, contrastive-learning
simclr-pytorch
PyTorch implementation of SimCLR: supports multi-GPU training and closely reproduces results
Stars: ✭ 89 (-28.8%)
Mutual labels:  self-supervised-learning, contrastive-learning
DisCont
Code for the paper "DisCont: Self-Supervised Visual Attribute Disentanglement using Context Vectors".
Stars: ✭ 13 (-89.6%)
Mutual labels:  self-supervised-learning, contrastive-learning
G-SimCLR
This is the code base for paper "G-SimCLR : Self-Supervised Contrastive Learning with Guided Projection via Pseudo Labelling" by Souradip Chakraborty, Aritra Roy Gosthipaty and Sayak Paul.
Stars: ✭ 69 (-44.8%)
Mutual labels:  self-supervised-learning, contrastive-learning
AdCo
AdCo: Adversarial Contrast for Efficient Learning of Unsupervised Representations from Self-Trained Negative Adversaries
Stars: ✭ 148 (+18.4%)
Mutual labels:  self-supervised-learning, contrastive-learning
CLSA
Official implementation of "Contrastive Learning with Stronger Augmentations"
Stars: ✭ 48 (-61.6%)
Mutual labels:  self-supervised-learning, contrastive-learning
S2-BNN
S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration (CVPR 2021)
Stars: ✭ 53 (-57.6%)
Mutual labels:  self-supervised-learning, contrastive-learning

SoCo

[NeurIPS 2021 Spotlight] Aligning Pretraining for Detection via Object-Level Contrastive Learning

By Fangyun Wei*, Yue Gao*, Zhirong Wu, Han Hu, and Stephen Lin.

* Equal contribution.

Introduction

Image-level contrastive representation learning has proven to be highly effective as a generic model for transfer learning. Such generality for transfer learning, however, sacrifices specificity if we are interested in a certain downstream task. We argue that this could be sub-optimal and thus advocate a design principle which encourages alignment between the self-supervised pretext task and the downstream task. In this paper, we follow this principle with a pretraining method specifically designed for the task of object detection. We attain alignment in the following three aspects:

  1. object-level representations are introduced via selective search bounding boxes as object proposals;
  2. the pretraining network architecture incorporates the same dedicated modules used in the detection pipeline (e.g., FPN);
  3. the pretraining is equipped with object detection properties such as object-level translation invariance and scale invariance.

Our method, called Selective Object COntrastive learning (SoCo), achieves state-of-the-art transfer performance on COCO detection using a Mask R-CNN framework.
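For intuition, here is a minimal sketch of the object-level alignment idea, not the repository's code: proposal boxes are pooled from the feature maps of two augmented views with RoIAlign, and the per-object embeddings are aligned with a BYOL-style loss. All names, shapes, and the exact loss form are illustrative assumptions.

```python
import torch.nn.functional as F
from torchvision.ops import roi_align


def object_level_byol_loss(feat_q, feat_k, boxes, spatial_scale=1.0 / 32):
    """Align per-proposal embeddings from two augmented views.

    feat_q, feat_k: (N, C, H, W) feature maps from the online / target encoders.
    boxes: list of N tensors, each (K_i, 4), proposal boxes in input-image coords.
    """
    # Pool a fixed-size feature per proposal from each view, then global-average.
    q = roi_align(feat_q, boxes, output_size=7, spatial_scale=spatial_scale)
    k = roi_align(feat_k, boxes, output_size=7, spatial_scale=spatial_scale)
    q = q.mean(dim=(2, 3))  # (total_boxes, C)
    k = k.mean(dim=(2, 3))
    # BYOL-style loss: negative cosine similarity against a stop-gradient target.
    q = F.normalize(q, dim=1)
    k = F.normalize(k.detach(), dim=1)
    return 2.0 - 2.0 * (q * k).sum(dim=1).mean()
```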

Architecture

Main results

The pretrained and finetuned models, together with their logs, are available on Google Drive and Baidu Pan (code: 4662).

The links below are given as relative paths within the shared folder.

SoCo pre-trained models

| Model | Arch | Epochs | Scripts | Pretrained model (relative path) |
| ----- | ---- | ------ | ------- | -------------------------------- |
| SoCo | ResNet50-C4 | 100 | SoCo_C4_100ep | log: pretrain/SoCo_C4_100ep/log.txt; raw model: pretrain/SoCo_C4_100ep/ckpt_epoch_100.pth; converted d2 model: pretrain/SoCo_C4_100ep/current_detectron2_C4.pkl |
| SoCo | ResNet50-C4 | 400 | SoCo_C4_400ep | log: pretrain/SoCo_C4_400ep/log.txt; raw model: pretrain/SoCo_C4_400ep/ckpt_epoch_400.pth; converted d2 model: pretrain/SoCo_C4_400ep/current_detectron2_C4.pkl |
| SoCo | ResNet50-FPN | 100 | SoCo_FPN_100ep | log: pretrain/SoCo_FPN_100ep/log.txt; raw model: pretrain/SoCo_FPN_100ep/ckpt_epoch_100.pth; converted d2 model: pretrain/SoCo_FPN_100ep/current_detectron2_Head.pkl |
| SoCo | ResNet50-FPN | 400 | SoCo_FPN_400ep | log: pretrain/SoCo_FPN_400ep/log.txt; raw model: pretrain/SoCo_FPN_400ep/ckpt_epoch_400.pth; converted d2 model: pretrain/SoCo_FPN_400ep/current_detectron2_Head.pkl |
| SoCo* | ResNet50-FPN | 400 | SoCo_FPN_Star_400ep | log: pretrain/SoCo_FPN_Star_400ep/log.txt; raw model: pretrain/SoCo_FPN_Star_400ep/ckpt_epoch_400.pth; converted d2 model: pretrain/SoCo_FPN_Star_400ep/current_detectron2_Head.pkl |
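The "converted d2 model" files repackage the raw checkpoints into Detectron2's pickled weight format. As a rough sketch of what such a conversion involves (the key prefix and filenames here are assumptions for illustration, not taken from the repository):

```python
import pickle

import torch

ckpt = torch.load("ckpt_epoch_100.pth", map_location="cpu")
state = ckpt.get("model", ckpt)  # some checkpoints nest weights under "model"

converted = {}
for key, tensor in state.items():
    # Hypothetical renaming: keep encoder weights and strip their prefix so the
    # keys line up with Detectron2's backbone naming heuristics.
    if key.startswith("encoder."):
        converted[key[len("encoder."):]] = tensor.numpy()

with open("current_detectron2_C4.pkl", "wb") as f:
    pickle.dump(
        {"model": converted, "__author__": "SoCo", "matching_heuristics": True},
        f,
    )
```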

Results on LVIS with MaskRCNN R50-FPN

APbb denotes box AP and APmk denotes mask AP; the 50/75 suffixes are AP at IoU thresholds 0.5 and 0.75.

| Methods | Epoch | APbb | APbb50 | APbb75 | APmk | APmk50 | APmk75 | Config | Detectron2 trained (relative path) |
| ------- | ----- | ---- | ------ | ------ | ---- | ------ | ------ | ------ | ----------------------------------- |
| Supervised | 90 | 20.4 | 32.9 | 21.7 | 19.4 | 30.6 | 20.5 | -- | -- |
| SoCo* | 400 | 26.3 | 41.2 | 27.8 | 25.0 | 38.5 | 26.8 | config | log: finetune/mask_rcnn_lvis_SoCo_FPN_Star_400ep_1x/log.txt; model: finetune/mask_rcnn_lvis_SoCo_FPN_Star_400ep_1x/model_final.pth |

Results on COCO with MaskRCNN R50-FPN

| Methods | Epoch | APbb | APbb50 | APbb75 | APmk | APmk50 | APmk75 | Config | Detectron2 trained (relative path) |
| ------- | ----- | ---- | ------ | ------ | ---- | ------ | ------ | ------ | ----------------------------------- |
| Scratch | - | 31.0 | 49.5 | 33.2 | 28.5 | 46.8 | 30.4 | -- | -- |
| Supervised | 90 | 38.9 | 59.6 | 42.7 | 35.4 | 56.5 | 38.1 | -- | -- |
| SoCo | 100 | 42.3 | 62.5 | 46.5 | 37.6 | 59.1 | 40.5 | config | log: finetune/mask_rcnn_coco_SoCo_FPN_100ep_1x/log.txt; model: finetune/mask_rcnn_coco_SoCo_FPN_100ep_1x/model_final.pth |
| SoCo | 400 | 43.0 | 63.3 | 47.1 | 38.2 | 60.2 | 41.0 | config | log: finetune/mask_rcnn_coco_SoCo_FPN_400ep_1x/log.txt; model: finetune/mask_rcnn_coco_SoCo_FPN_400ep_1x/model_final.pth |
| SoCo* | 400 | 43.2 | 63.5 | 47.4 | 38.4 | 60.2 | 41.4 | config | log: finetune/mask_rcnn_coco_SoCo_FPN_Star_400ep_1x/log.txt; model: finetune/mask_rcnn_coco_SoCo_FPN_Star_400ep_1x/model_final.pth |

Results on COCO with MaskRCNN R50-C4

| Methods | Epoch | APbb | APbb50 | APbb75 | APmk | APmk50 | APmk75 | Config | Detectron2 trained (relative path) |
| ------- | ----- | ---- | ------ | ------ | ---- | ------ | ------ | ------ | ----------------------------------- |
| Scratch | - | 26.4 | 44.0 | 27.8 | 29.3 | 46.9 | 30.8 | -- | -- |
| Supervised | 90 | 38.2 | 58.2 | 41.2 | 33.3 | 54.7 | 35.2 | -- | -- |
| SoCo | 100 | 40.4 | 60.4 | 43.7 | 34.9 | 56.8 | 37.0 | config | log: finetune/mask_rcnn_coco_SoCo_C4_100ep_1x/log.txt; model: finetune/mask_rcnn_coco_SoCo_C4_100ep_1x/model_final.pth |
| SoCo | 400 | 40.9 | 60.9 | 44.3 | 35.3 | 57.5 | 37.3 | config | log: finetune/mask_rcnn_coco_SoCo_C4_400ep_1x/log.txt; model: finetune/mask_rcnn_coco_SoCo_C4_400ep_1x/model_final.pth |

Get started

Requirements

A Dockerfile is included; please refer to it for the required environment.
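For reference, a typical build-and-run pair would look like the following; the image tag and mount path are placeholders, not taken from the repository:

```bash
docker build -t soco .
docker run --gpus all -it -v /path/to/imagenet:/data soco
```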

Prepare data with Selective Search

  1. Generate Selective Search proposals (a sketch of what this step computes is shown after this list):
    python selective_search/generate_imagenet_ss_proposals.py
  2. Filter out invalid proposals with the filtering strategy:
    python selective_search/filter_ss_proposals_json.py
  3. Post-process images that have no proposals:
    python selective_search/filter_ss_proposals_json_post_no_prop.py
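As a rough illustration of what step 1 computes per image, the following uses OpenCV's selective search (requires opencv-contrib-python); the repository's script may use a different backend, parameters, and output format:

```python
import cv2


def selective_search_proposals(image_path, fast=True):
    """Return candidate boxes as (x, y, w, h) rows in pixel coordinates."""
    img = cv2.imread(image_path)
    ss = cv2.ximgproc.segmentation.createSelectiveSearchSegmentation()
    ss.setBaseImage(img)
    if fast:
        ss.switchToSelectiveSearchFast()      # fewer proposals, faster
    else:
        ss.switchToSelectiveSearchQuality()   # more thorough, slower
    return ss.process()
```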

Pretrain with SoCo

Using the SoCo FPN 100-epoch configuration as an example:

bash ./tools/SoCo_FPN_100ep.sh
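The other pretraining configurations presumably have matching launch scripts under ./tools/ (e.g., ./tools/SoCo_C4_100ep.sh), following the names in the Scripts column of the pretrained-models table above.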

Finetune detector

  1. Copy the folder detectron2_configs to the root folder of Detectron2.
  2. Train the detectors with Detectron2 (a sketch of a typical launch command follows below).
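Concretely, step 2 is the standard Detectron2 launch. The config filename below is a placeholder inferred from the finetune folder names above, and MODEL.WEIGHTS points at one of the converted d2 checkpoints:

```bash
python tools/train_net.py --num-gpus 8 \
    --config-file detectron2_configs/mask_rcnn_coco_SoCo_FPN_100ep_1x.yaml \
    MODEL.WEIGHTS pretrain/SoCo_FPN_100ep/current_detectron2_Head.pkl
```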

Citation

@article{wei2021aligning,
  title={Aligning Pretraining for Detection via Object-Level Contrastive Learning},
  author={Wei, Fangyun and Gao, Yue and Wu, Zhirong and Hu, Han and Lin, Stephen},
  journal={arXiv preprint arXiv:2106.02637},
  year={2021}
}