IgorSusmelj / simsiam-cifar10

License: MIT
Code to train the SimSiam model on CIFAR10 using PyTorch

Programming Languages

python

Projects that are alternatives of or similar to simsiam-cifar10

DisCont
Code for the paper "DisCont: Self-Supervised Visual Attribute Disentanglement using Context Vectors".
Stars: ✭ 13 (-60.61%)
Mutual labels:  self-supervised-learning
mmselfsup
OpenMMLab Self-Supervised Learning Toolbox and Benchmark
Stars: ✭ 2,315 (+6915.15%)
Mutual labels:  self-supervised-learning
mae-scalable-vision-learners
A TensorFlow 2.x implementation of Masked Autoencoders Are Scalable Vision Learners
Stars: ✭ 54 (+63.64%)
Mutual labels:  self-supervised-learning
lossyless
Generic image compressor for machine learning. Pytorch code for our paper "Lossy compression for lossless prediction".
Stars: ✭ 81 (+145.45%)
Mutual labels:  self-supervised-learning
self6dpp
Self6D++: Occlusion-Aware Self-Supervised Monocular 6D Object Pose Estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI) 2021.
Stars: ✭ 45 (+36.36%)
Mutual labels:  self-supervised-learning
info-nce-pytorch
PyTorch implementation of the InfoNCE loss for self-supervised learning.
Stars: ✭ 160 (+384.85%)
Mutual labels:  self-supervised-learning
sesemi
supervised and semi-supervised image classification with self-supervision (Keras)
Stars: ✭ 43 (+30.3%)
Mutual labels:  self-supervised-learning
SoCo
[NeurIPS 2021 Spotlight] Aligning Pretraining for Detection via Object-Level Contrastive Learning
Stars: ✭ 125 (+278.79%)
Mutual labels:  self-supervised-learning
TCE
This repository contains the code implementation used in the paper Temporally Coherent Embeddings for Self-Supervised Video Representation Learning (TCE).
Stars: ✭ 51 (+54.55%)
Mutual labels:  self-supervised-learning
pillar-motion
Self-Supervised Pillar Motion Learning for Autonomous Driving (CVPR 2021)
Stars: ✭ 98 (+196.97%)
Mutual labels:  self-supervised-learning
SimSiam
Exploring Simple Siamese Representation Learning
Stars: ✭ 52 (+57.58%)
Mutual labels:  self-supervised-learning
awesome-graph-self-supervised-learning
Awesome Graph Self-Supervised Learning
Stars: ✭ 805 (+2339.39%)
Mutual labels:  self-supervised-learning
awesome-graph-self-supervised-learning-based-recommendation
A curated list of awesome graph & self-supervised-learning-based recommendation.
Stars: ✭ 37 (+12.12%)
Mutual labels:  self-supervised-learning
CLMR
Official PyTorch implementation of Contrastive Learning of Musical Representations
Stars: ✭ 216 (+554.55%)
Mutual labels:  self-supervised-learning
esvit
EsViT: Efficient self-supervised Vision Transformers
Stars: ✭ 323 (+878.79%)
Mutual labels:  self-supervised-learning
ViCC
[WACV'22] Code repository for the paper "Self-supervised Video Representation Learning with Cross-Stream Prototypical Contrasting", https://arxiv.org/abs/2106.10137.
Stars: ✭ 33 (+0%)
Mutual labels:  self-supervised-learning
GeDML
Generalized Deep Metric Learning.
Stars: ✭ 30 (-9.09%)
Mutual labels:  self-supervised-learning
newt
Natural World Tasks
Stars: ✭ 24 (-27.27%)
Mutual labels:  self-supervised-learning
GCL
List of Publications in Graph Contrastive Learning
Stars: ✭ 25 (-24.24%)
Mutual labels:  self-supervised-learning
Awesome-Vision-Transformer-Collection
Variants of Vision Transformer and its downstream tasks
Stars: ✭ 124 (+275.76%)
Mutual labels:  self-supervised-learning

A minimal PyTorch example of training SimSiam on CIFAR10, with a kNN predictor that reports the accuracy after each epoch. The code uses lightly's implementation of SimSiam.

Final test set accuracy is 91%, similar to the result reported in the original paper.

Installation

pip install -r requirements.txt

Dependencies

This code uses the SimSiam implementation provided by lightly. We use PyTorch Lightning for the training loop. There is another example of using SimSiam with lightly and plain PyTorch here.
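To make the idea concrete, here is a minimal plain-PyTorch sketch of the SimSiam architecture and its symmetrized negative cosine loss with stop-gradient, as described in the paper "Exploring Simple Siamese Representation Learning". This is an illustrative assumption, not the repository's code: the repository uses lightly's building blocks instead, and the class and function names (`SimSiamSketch`, `simsiam_loss`) as well as the exact MLP shapes here are hypothetical.

```python
# Hypothetical sketch of SimSiam in plain PyTorch; the repository itself
# builds on lightly's SimSiam modules rather than this code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimSiamSketch(nn.Module):
    def __init__(self, backbone, feat_dim=512, proj_dim=2048, pred_dim=512):
        super().__init__()
        self.backbone = backbone  # e.g. a ResNet with its classifier removed
        # Projection MLP maps backbone features into the representation space.
        self.projector = nn.Sequential(
            nn.Linear(feat_dim, proj_dim), nn.BatchNorm1d(proj_dim), nn.ReLU(),
            nn.Linear(proj_dim, proj_dim),
        )
        # Prediction MLP is applied to one branch only; the other branch
        # receives a stop-gradient, which is what prevents collapse.
        self.predictor = nn.Sequential(
            nn.Linear(proj_dim, pred_dim), nn.BatchNorm1d(pred_dim), nn.ReLU(),
            nn.Linear(pred_dim, proj_dim),
        )

    def forward(self, x):
        z = self.projector(self.backbone(x))
        p = self.predictor(z)
        return z.detach(), p  # detach == stop-gradient on the target branch

def simsiam_loss(p1, z2, p2, z1):
    # Symmetrized negative cosine similarity between the two augmented views.
    return -0.5 * (F.cosine_similarity(p1, z2, dim=-1).mean()
                   + F.cosine_similarity(p2, z1, dim=-1).mean())
```

In training, each image is augmented twice, both views are passed through the model, and `simsiam_loss` is minimized; because the targets `z1`/`z2` are detached, gradients flow only through the prediction branch.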

Train SimSiam on CIFAR10

Training on a V100 GPU takes around 8 hours for 800 epochs and reaches around 91% accuracy on the test set.

You can run training using the following command:

python main.py

The main.py script is kept very simple. You can modify common parameters such as the number of epochs, batch size, and number of workers.

The default values are:

  • num_workers = 8
  • max_epochs = 800
  • knn_k = 200
  • knn_t = 0.1
  • classes = 10
  • batch_size = 512
  • seed = 1
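The knn_k, knn_t, and classes defaults above feed the kNN monitor. A minimal sketch of the temperature-weighted kNN classifier commonly used to evaluate self-supervised features is shown below; the function name knn_predict and the exact weighting are illustrative assumptions, not necessarily the repository's implementation.

```python
# Hypothetical sketch of a temperature-weighted kNN monitor; parameter
# defaults mirror the values listed above (knn_k=200, knn_t=0.1, classes=10).
import numpy as np

def knn_predict(feature, bank, bank_labels, classes=10, knn_k=200, knn_t=0.1):
    """Classify `feature` (N, D) by cosine-similarity kNN against a feature
    bank (M, D) built from the training set, with soft temperature voting."""
    # L2-normalize so dot products are cosine similarities.
    feature = feature / np.linalg.norm(feature, axis=1, keepdims=True)
    bank = bank / np.linalg.norm(bank, axis=1, keepdims=True)
    sim = feature @ bank.T                              # (N, M) similarities
    k = min(knn_k, bank.shape[0])
    idx = np.argpartition(-sim, k - 1, axis=1)[:, :k]   # top-k neighbors
    sim_k = np.take_along_axis(sim, idx, axis=1)
    weights = np.exp(sim_k / knn_t)                     # temperature weighting
    votes = np.zeros((feature.shape[0], classes))
    for i in range(feature.shape[0]):
        np.add.at(votes[i], bank_labels[idx[i]], weights[i])
    return votes.argmax(axis=1)
```

A smaller knn_t sharpens the vote toward the closest neighbors, while knn_k controls how many training features participate; after each epoch, accuracy is computed by comparing these predictions on the test set against the true labels.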

Tensorboard plots: accuracy and loss of SimSiam on CIFAR10 (image in the original README).

You can access the tensorboard logs using

tensorboard --logdir lightning_logs/