
mdiephuis / SimCLR

License: MIT
PyTorch implementation of "A Simple Framework for Contrastive Learning of Visual Representations"

Programming Languages

Python

Projects that are alternatives to or similar to SimCLR

Simclr
SimCLRv2 - Big Self-Supervised Models are Strong Semi-Supervised Learners
Stars: ✭ 2,720 (+4084.62%)
Mutual labels:  representation-learning, unsupervised-learning, simclr, contrastive-learning
Revisiting-Contrastive-SSL
Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations. [NeurIPS 2021]
Stars: ✭ 81 (+24.62%)
Mutual labels:  representation-learning, unsupervised-learning, contrastive-learning
proto
Proto-RL: Reinforcement Learning with Prototypical Representations
Stars: ✭ 67 (+3.08%)
Mutual labels:  representation-learning, unsupervised-learning
object-aware-contrastive
Object-aware Contrastive Learning for Debiased Scene Representation (NeurIPS 2021)
Stars: ✭ 44 (-32.31%)
Mutual labels:  representation-learning, contrastive-learning
State-Representation-Learning-An-Overview
Simplified version of "State Representation Learning for Control: An Overview" bibliography
Stars: ✭ 32 (-50.77%)
Mutual labels:  representation-learning, unsupervised-learning
mirror-bert
[EMNLP 2021] Mirror-BERT: Converting Pretrained Language Models to universal text encoders without labels.
Stars: ✭ 56 (-13.85%)
Mutual labels:  unsupervised-learning, contrastive-learning
Supervised-Contrastive-Learning-in-TensorFlow-2
Implements the ideas presented in https://arxiv.org/pdf/2004.11362v1.pdf by Khosla et al.
Stars: ✭ 117 (+80%)
Mutual labels:  representation-learning, contrastive-learning
VQ-APC
Vector Quantized Autoregressive Predictive Coding (VQ-APC)
Stars: ✭ 34 (-47.69%)
Mutual labels:  representation-learning, unsupervised-learning
Pytorch Byol
PyTorch implementation of Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning
Stars: ✭ 213 (+227.69%)
Mutual labels:  representation-learning, unsupervised-learning
M-NMF
An implementation of "Community Preserving Network Embedding" (AAAI 2017)
Stars: ✭ 119 (+83.08%)
Mutual labels:  representation-learning, unsupervised-learning
ViCC
[WACV'22] Code repository for the paper "Self-supervised Video Representation Learning with Cross-Stream Prototypical Contrasting", https://arxiv.org/abs/2106.10137.
Stars: ✭ 33 (-49.23%)
Mutual labels:  unsupervised-learning, contrastive-learning
awesome-graph-self-supervised-learning
Awesome Graph Self-Supervised Learning
Stars: ✭ 805 (+1138.46%)
Mutual labels:  representation-learning, unsupervised-learning
rl singing voice
Unsupervised Representation Learning for Singing Voice Separation
Stars: ✭ 18 (-72.31%)
Mutual labels:  representation-learning, unsupervised-learning
COCO-LM
[NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Stars: ✭ 109 (+67.69%)
Mutual labels:  representation-learning, contrastive-learning
CLSA
Official implementation of "Contrastive Learning with Stronger Augmentations"
Stars: ✭ 48 (-26.15%)
Mutual labels:  unsupervised-learning, contrastive-learning
Contrastive Predictive Coding Pytorch
Contrastive Predictive Coding for Automatic Speaker Verification
Stars: ✭ 223 (+243.08%)
Mutual labels:  representation-learning, unsupervised-learning
mmselfsup
OpenMMLab Self-Supervised Learning Toolbox and Benchmark
Stars: ✭ 2,315 (+3461.54%)
Mutual labels:  unsupervised-learning, simclr
Variational Ladder Autoencoder
Implementation of VLAE
Stars: ✭ 196 (+201.54%)
Mutual labels:  representation-learning, unsupervised-learning
FUSION
PyTorch code for NeurIPSW 2020 paper (4th Workshop on Meta-Learning) "Few-Shot Unsupervised Continual Learning through Meta-Examples"
Stars: ✭ 18 (-72.31%)
Mutual labels:  representation-learning, unsupervised-learning
TCE
This repository contains the code implementation used in the paper Temporally Coherent Embeddings for Self-Supervised Video Representation Learning (TCE).
Stars: ✭ 51 (-21.54%)
Mutual labels:  representation-learning, contrastive-learning

SimCLR

PyTorch implementation of the paper "A Simple Framework for Contrastive Learning of Visual Representations".

  • Adam optimizer
  • ExponentialLR scheduler. No warmup or other exotics
  • Batch size of 256 via gradient accumulation (see the sketch after this list)
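The pieces behind these bullets can be sketched in a few lines: an NT-Xent contrastive loss plus a four-step gradient accumulation loop (4 x 64 = an effective batch of 256) driven by Adam and ExponentialLR. This is a minimal sketch, not this repo's code; the learning rate, gamma, epoch count, and the loader yielding (view1, view2, label) batches are all assumptions.

import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, tau=0.5):
    # z1, z2: (N, d) projections of two augmented views of the same N images.
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2N, d), unit norm
    sim = z @ z.t() / tau                               # temperature-scaled cosine similarity
    sim.fill_diagonal_(float('-inf'))                   # exclude self-similarity
    # The positive for row i is the other view of the same image: i + n (mod 2N).
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)

# Assumed: model is an encoder + projection head (see the feature-model sketch
# below) and loader yields (view1, view2, label) batches of size 64.
opt = torch.optim.Adam(model.parameters(), lr=3e-4)              # lr is an assumption
sched = torch.optim.lr_scheduler.ExponentialLR(opt, gamma=0.98)  # gamma is an assumption
accumulation_steps = 4                                           # 4 x 64 = 256 effective

for epoch in range(100):
    opt.zero_grad()
    for step, (x1, x2, _) in enumerate(loader):
        loss = nt_xent_loss(model(x1), model(x2), tau=0.5)
        (loss / accumulation_steps).backward()          # accumulate scaled gradients
        if (step + 1) % accumulation_steps == 0:
            opt.step()                                  # one update per 256 images
            opt.zero_grad()
    sched.step()                                        # decay the lr once per epoch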

Feature model

  • ResNet-50, with the first convolutional layer using a kernel size of 3 instead of 7.
  • h() feature dimensionality: 2048
  • z() projection head output dimensionality: 128 (both sketched after this list)
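A minimal sketch of such a feature model on top of torchvision's resnet50. The stride and padding of the replacement stem and the two-layer MLP head are assumptions; the README above only fixes the kernel size and the two dimensionalities.

import torch.nn as nn
from torchvision.models import resnet50

class SimCLRNet(nn.Module):
    def __init__(self, feature_size=128):
        super().__init__()
        net = resnet50(weights=None)
        # 3x3 stem instead of the default 7x7; stride/padding are assumptions.
        net.conv1 = nn.Conv2d(3, 64, kernel_size=3, stride=1, padding=1, bias=False)
        net.fc = nn.Identity()        # expose the 2048-d h() features
        self.encoder = net
        self.head = nn.Sequential(    # z(): projection head, 2048 -> feature_size
            nn.Linear(2048, 2048),
            nn.ReLU(inplace=True),
            nn.Linear(2048, feature_size),
        )

    def forward(self, x):
        h = self.encoder(x)           # h(): features reused for downstream tasks
        return self.head(h)           # z(): used only by the contrastive loss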

Classifier model

  • A simple single-layer network (one linear layer) from the 2048-d features to num_classes, as sketched below
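In code this amounts to a single nn.Linear; a sketch, with num_classes left as a parameter:

import torch.nn as nn

class LinearClassifier(nn.Module):
    # One linear layer on top of the frozen 2048-d h() features.
    def __init__(self, feature_dim=2048, num_classes=10):
        super().__init__()
        self.fc = nn.Linear(feature_dim, num_classes)

    def forward(self, h):
        return self.fc(h)             # logits over num_classes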

Classification Results

Accuracy (%) after 100 and 200 epochs of training:

Epochs      100      200
Paper       83.9     89.2
This repo   87.49    88.16

Run

Train the feature-extraction model (ResNet). Note that CIFAR10C inherits from torchvision's datasets.CIFAR10 and provides the augmented image pairs; a sketch of that idea follows the command.

python train_features.py --batch-size=64 --accumulation-steps=4 --tau=0.5 \
                         --feature-size=128 --dataset-name=CIFAR10C --data-dir=path/to/your/data
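A minimal sketch of the paired-view idea behind CIFAR10C, assuming it subclasses torchvision's datasets.CIFAR10; the augmentation pipeline here follows the SimCLR paper and is an assumption, not this repo's code.

from PIL import Image
from torchvision import datasets
import torchvision.transforms as T

simclr_transform = T.Compose([        # augmentations assumed from the SimCLR paper
    T.RandomResizedCrop(32),
    T.RandomHorizontalFlip(),
    T.RandomApply([T.ColorJitter(0.8, 0.8, 0.8, 0.2)], p=0.8),
    T.RandomGrayscale(p=0.2),
    T.ToTensor(),
])

class CIFAR10C(datasets.CIFAR10):
    # Return two independently augmented views of each image plus its label.
    def __getitem__(self, index):
        img = Image.fromarray(self.data[index])
        return simclr_transform(img), simclr_transform(img), self.targets[index]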

Train the classifier model. It needs a saved feature model to extract features from images; a sketch of that flow follows the command.

python train_classifier.py --load-model=models/modelname_timestamp.pt --dataset-name=CIFAR10 \
                           --data-dir=path/to/your/data
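What that flow might look like, reusing the SimCLRNet and LinearClassifier sketches above; the checkpoint being a plain state_dict, the learning rate, and the single-view CIFAR10 loader are assumptions.

import torch
import torch.nn.functional as F

encoder = SimCLRNet()
state = torch.load('models/modelname_timestamp.pt', map_location='cpu')
encoder.load_state_dict(state)        # assumes the checkpoint is a plain state_dict
encoder.eval()
for p in encoder.parameters():
    p.requires_grad_(False)           # freeze the feature model

clf = LinearClassifier(num_classes=10)
opt = torch.optim.Adam(clf.parameters(), lr=1e-3)   # lr is an assumption

for x, y in loader:                   # assumed: a plain CIFAR10 loader, one view per image
    with torch.no_grad():
        h = encoder.encoder(x)        # 2048-d h() features, bypassing the z() head
    loss = F.cross_entropy(clf(h), y)
    opt.zero_grad()
    loss.backward()
    opt.step()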