
sthalles / SimCLR

License: MIT
PyTorch implementation of SimCLR: A Simple Framework for Contrastive Learning of Visual Representations

Projects that are alternatives to or similar to SimCLR

Pytorch Byol
PyTorch implementation of Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning
Stars: ✭ 213 (-71.6%)
Mutual labels:  jupyter-notebook, unsupervised-learning, representation-learning
Simclr
SimCLRv2 - Big Self-Supervised Models are Strong Semi-Supervised Learners
Stars: ✭ 2,720 (+262.67%)
Mutual labels:  jupyter-notebook, unsupervised-learning, representation-learning
Decagon
Graph convolutional neural network for multirelational link prediction
Stars: ✭ 268 (-64.27%)
Mutual labels:  jupyter-notebook, representation-learning
100 Days Of Ml Code
100-Days-Of-ML-Code (Chinese edition)
Stars: ✭ 16,797 (+2139.6%)
Mutual labels:  jupyter-notebook, unsupervised-learning
Contrastive Predictive Coding
Keras implementation of Representation Learning with Contrastive Predictive Coding
Stars: ✭ 369 (-50.8%)
Mutual labels:  unsupervised-learning, representation-learning
awesome-contrastive-self-supervised-learning
A comprehensive list of awesome contrastive self-supervised learning papers.
Stars: ✭ 748 (-0.27%)
Mutual labels:  representation-learning, unsupervised-learning
ladder-vae-pytorch
Ladder Variational Autoencoders (LVAE) in PyTorch
Stars: ✭ 59 (-92.13%)
Mutual labels:  representation-learning, unsupervised-learning
Pytorch Cortexnet
PyTorch implementation of the CortexNet predictive model
Stars: ✭ 349 (-53.47%)
Mutual labels:  jupyter-notebook, unsupervised-learning
rl singing voice
Unsupervised Representation Learning for Singing Voice Separation
Stars: ✭ 18 (-97.6%)
Mutual labels:  representation-learning, unsupervised-learning
Awesome Vaes
A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (-44.27%)
Mutual labels:  unsupervised-learning, representation-learning
Modelsgenesis
Official Keras & PyTorch Implementation and Pre-trained Models for Models Genesis - MICCAI 2019
Stars: ✭ 416 (-44.53%)
Mutual labels:  jupyter-notebook, representation-learning
Vl Bert
Code for ICLR 2020 paper "VL-BERT: Pre-training of Generic Visual-Linguistic Representations".
Stars: ✭ 493 (-34.27%)
Mutual labels:  jupyter-notebook, representation-learning
SimCLR
Pytorch implementation of "A Simple Framework for Contrastive Learning of Visual Representations"
Stars: ✭ 65 (-91.33%)
Mutual labels:  representation-learning, unsupervised-learning
proto
Proto-RL: Reinforcement Learning with Prototypical Representations
Stars: ✭ 67 (-91.07%)
Mutual labels:  representation-learning, unsupervised-learning
srVAE
VAE with RealNVP prior and Super-Resolution VAE in PyTorch. Code release for https://arxiv.org/abs/2006.05218.
Stars: ✭ 56 (-92.53%)
Mutual labels:  representation-learning, unsupervised-learning
State-Representation-Learning-An-Overview
Simplified version of "State Representation Learning for Control: An Overview" bibliography
Stars: ✭ 32 (-95.73%)
Mutual labels:  representation-learning, unsupervised-learning
Simclr
PyTorch implementation of SimCLR: A Simple Framework for Contrastive Learning of Visual Representations by T. Chen et al.
Stars: ✭ 293 (-60.93%)
Mutual labels:  unsupervised-learning, representation-learning
Lemniscate.pytorch
Unsupervised Feature Learning via Non-parametric Instance Discrimination
Stars: ✭ 532 (-29.07%)
Mutual labels:  unsupervised-learning, representation-learning
awesome-graph-self-supervised-learning
Awesome Graph Self-Supervised Learning
Stars: ✭ 805 (+7.33%)
Mutual labels:  representation-learning, unsupervised-learning
amr
Official adversarial mixup resynthesis repository
Stars: ✭ 31 (-95.87%)
Mutual labels:  representation-learning, unsupervised-learning

PyTorch SimCLR: A Simple Framework for Contrastive Learning of Visual Representations


Blog post with full documentation: Exploring SimCLR: A Simple Framework for Contrastive Learning of Visual Representations

[Figure: SimCLR architecture]
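
As a rough orientation for the figure above: SimCLR encodes two augmented views of each image with a shared ResNet encoder, projects the features through a small MLP head, and compares the projections with the NT-Xent contrastive loss. The sketch below is a minimal illustration of that idea; the class name, helper function, and temperature value are assumptions for illustration, not the repository's actual API.

# Minimal sketch of the SimCLR pipeline (illustrative only; not the repository's exact classes).
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision

class SimCLRNet(nn.Module):
    """ResNet encoder followed by a small MLP projection head."""
    def __init__(self, out_dim=128):
        super().__init__()
        backbone = torchvision.models.resnet18(weights=None)   # torchvision >= 0.13 API
        feat_dim = backbone.fc.in_features                      # 512 for ResNet-18
        backbone.fc = nn.Identity()                              # keep the 512-d features
        self.encoder = backbone
        self.projector = nn.Sequential(
            nn.Linear(feat_dim, feat_dim), nn.ReLU(), nn.Linear(feat_dim, out_dim)
        )

    def forward(self, x):
        h = self.encoder(x)          # representation used later for linear evaluation
        z = self.projector(h)        # projection used only by the contrastive loss
        return F.normalize(z, dim=1)

def nt_xent_loss(z1, z2, temperature=0.07):
    """NT-Xent loss: each view's positive is the other view of the same image."""
    z = torch.cat([z1, z2], dim=0)                       # (2N, D), rows L2-normalized
    sim = z @ z.t() / temperature                        # pairwise cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))           # exclude self-similarity
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)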

See also PyTorch Implementation for BYOL - Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning.

Installation

$ conda env create --name simclr --file env.yml
$ conda activate simclr
$ python run.py

Config file

Before running SimCLR, make sure you choose the correct running configuration. You can change the configuration by passing command-line arguments to run.py.

$ python run.py -data ./datasets --dataset-name stl10 --log-every-n-steps 100 --epochs 100 

If you want to run it on CPU (for debugging purposes), use the --disable-cuda option.

For 16-bit precision GPU training, there is no need to install NVIDIA apex. Just use the --fp16_precision flag and this implementation will use PyTorch's built-in AMP training.
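
Under the hood, --fp16_precision enables PyTorch's native automatic mixed precision. The snippet below is a simplified illustration of what an AMP training step looks like; the model, optimizer, and loss are stand-ins, not the repository's actual training loop.

# Simplified illustration of PyTorch-native AMP; all objects below are stand-ins.
import torch
from torch.cuda.amp import autocast, GradScaler

device = torch.device('cuda')
model = torch.nn.Linear(512, 128).to(device)        # stand-in for the SimCLR network
optimizer = torch.optim.Adam(model.parameters(), lr=3e-4)
scaler = GradScaler()                               # manages loss scaling for float16

features = torch.randn(32, 512, device=device)      # stand-in batch
for _ in range(10):
    optimizer.zero_grad()
    with autocast():                                 # run the forward pass in mixed precision
        out = model(features)
        loss = out.pow(2).mean()                     # placeholder for the contrastive loss
    scaler.scale(loss).backward()                    # scale to avoid float16 gradient underflow
    scaler.step(optimizer)                           # unscales gradients, then steps the optimizer
    scaler.update()                                  # adapt the scale factor for the next step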

Feature Evaluation

Feature evaluation is done using a linear model protocol.

First, we learn features using SimCLR on the STL10 unsupervised set. Then, we train a linear classifier on top of the frozen features from SimCLR. The linear model is trained on features extracted from the STL10 train set and evaluated on the STL10 test set.
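
A minimal sketch of this protocol is shown below, assuming a ResNet-18 backbone with its classification head replaced by an identity; the checkpoint loading is left as a comment because the exact checkpoint format is repository-specific.

# Sketch of linear evaluation: frozen SimCLR encoder + trainable linear classifier.
import torch
import torch.nn as nn
import torchvision

encoder = torchvision.models.resnet18(weights=None)
encoder.fc = nn.Identity()                      # expose the 512-d features
# encoder.load_state_dict(...)                  # load the pretrained SimCLR weights here
encoder.eval()
for p in encoder.parameters():
    p.requires_grad = False                     # freeze the backbone

linear = nn.Linear(512, 10)                     # STL10 has 10 classes
optimizer = torch.optim.Adam(linear.parameters(), lr=3e-4)
criterion = nn.CrossEntropyLoss()

def train_step(images, labels):
    with torch.no_grad():
        feats = encoder(images)                 # frozen SimCLR features
    logits = linear(feats)                      # only the linear head is trained
    loss = criterion(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()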

Check the Open In Colab notebook for reproducibility.

Note that SimCLR benefits from longer training.

| Linear Classification | Dataset | Feature Extractor | Architecture | Feature dimensionality | Projection Head dimensionality | Epochs | Top1 % |
|---|---|---|---|---|---|---|---|
| Logistic Regression (Adam) | STL10 | SimCLR | ResNet-18 | 512 | 128 | 100 | 74.45 |
| Logistic Regression (Adam) | CIFAR10 | SimCLR | ResNet-18 | 512 | 128 | 100 | 69.82 |
| Logistic Regression (Adam) | STL10 | SimCLR | ResNet-50 | 2048 | 128 | 50 | 70.075 |