zhirongw / Lemniscate.pytorch

Unsupervised Feature Learning via Non-parametric Instance Discrimination

Programming Languages

python

Projects that are alternatives to or similar to Lemniscate.pytorch

amr
Official adversarial mixup resynthesis repository
Stars: ✭ 31 (-94.17%)
Mutual labels:  representation-learning, unsupervised-learning
SimCLR
Pytorch implementation of "A Simple Framework for Contrastive Learning of Visual Representations"
Stars: ✭ 65 (-87.78%)
Mutual labels:  representation-learning, unsupervised-learning
rl singing voice
Unsupervised Representation Learning for Singing Voice Separation
Stars: ✭ 18 (-96.62%)
Mutual labels:  representation-learning, unsupervised-learning
FUSION
PyTorch code for NeurIPSW 2020 paper (4th Workshop on Meta-Learning) "Few-Shot Unsupervised Continual Learning through Meta-Examples"
Stars: ✭ 18 (-96.62%)
Mutual labels:  representation-learning, unsupervised-learning
Simclr
PyTorch implementation of SimCLR: A Simple Framework for Contrastive Learning of Visual Representations by T. Chen et al.
Stars: ✭ 293 (-44.92%)
Mutual labels:  unsupervised-learning, representation-learning
M-NMF
An implementation of "Community Preserving Network Embedding" (AAAI 2017)
Stars: ✭ 119 (-77.63%)
Mutual labels:  representation-learning, unsupervised-learning
proto
Proto-RL: Reinforcement Learning with Prototypical Representations
Stars: ✭ 67 (-87.41%)
Mutual labels:  representation-learning, unsupervised-learning
Pytorch Byol
PyTorch implementation of Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning
Stars: ✭ 213 (-59.96%)
Mutual labels:  unsupervised-learning, representation-learning
srVAE
VAE with RealNVP prior and Super-Resolution VAE in PyTorch. Code release for https://arxiv.org/abs/2006.05218.
Stars: ✭ 56 (-89.47%)
Mutual labels:  representation-learning, unsupervised-learning
ladder-vae-pytorch
Ladder Variational Autoencoders (LVAE) in PyTorch
Stars: ✭ 59 (-88.91%)
Mutual labels:  representation-learning, unsupervised-learning
VQ-APC
Vector Quantized Autoregressive Predictive Coding (VQ-APC)
Stars: ✭ 34 (-93.61%)
Mutual labels:  representation-learning, unsupervised-learning
Disentangling Vae
Experiments for understanding disentanglement in VAE latent representations
Stars: ✭ 398 (-25.19%)
Mutual labels:  unsupervised-learning, representation-learning
Revisiting-Contrastive-SSL
Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations. [NeurIPS 2021]
Stars: ✭ 81 (-84.77%)
Mutual labels:  representation-learning, unsupervised-learning
awesome-graph-self-supervised-learning
Awesome Graph Self-Supervised Learning
Stars: ✭ 805 (+51.32%)
Mutual labels:  representation-learning, unsupervised-learning
Contrastive Predictive Coding Pytorch
Contrastive Predictive Coding for Automatic Speaker Verification
Stars: ✭ 223 (-58.08%)
Mutual labels:  unsupervised-learning, representation-learning
State-Representation-Learning-An-Overview
Simplified version of "State Representation Learning for Control: An Overview" bibliography
Stars: ✭ 32 (-93.98%)
Mutual labels:  representation-learning, unsupervised-learning
Simclr
SimCLRv2 - Big Self-Supervised Models are Strong Semi-Supervised Learners
Stars: ✭ 2,720 (+411.28%)
Mutual labels:  unsupervised-learning, representation-learning
Variational Ladder Autoencoder
Implementation of VLAE
Stars: ✭ 196 (-63.16%)
Mutual labels:  unsupervised-learning, representation-learning
awesome-contrastive-self-supervised-learning
A comprehensive list of awesome contrastive self-supervised learning papers.
Stars: ✭ 748 (+40.6%)
Mutual labels:  representation-learning, unsupervised-learning
Contrastive Predictive Coding
Keras implementation of Representation Learning with Contrastive Predictive Coding
Stars: ✭ 369 (-30.64%)
Mutual labels:  unsupervised-learning, representation-learning

Unsupervised Feature Learning via Non-parametric Instance Discrimination

This repo contains the PyTorch implementation for the CVPR 2018 unsupervised learning paper (arXiv).

Updated Pretrained Model

An updated instance discrimination model with a memory bank implementation and nce-k=65536 negatives is provided. The updated model is trained with the Softmax-CE loss, as in CPC/MoCo, instead of the original NCE loss.
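
Below is a minimal sketch, not the repo's exact code, of what this objective looks like: each image is its own class, its memory-bank entry is the positive key, nce-k randomly drawn bank entries are negatives, and a temperature-scaled Softmax-CE loss is applied as in CPC/MoCo. The names memory, features, and indices are illustrative assumptions.

    import torch
    import torch.nn.functional as F

    def instance_discrimination_loss(features, indices, memory, nce_k=65536, nce_t=0.07):
        """features: (B, D) L2-normalized embeddings of the current batch.
        indices: (B,) dataset indices of those images (their "classes").
        memory: (N, D) L2-normalized memory bank over all N training images."""
        B, N = features.size(0), memory.size(0)
        pos = memory[indices]                           # positive key: each image's own bank entry, (B, D)
        neg_idx = torch.randint(0, N, (B, nce_k), device=features.device)
        neg = memory[neg_idx]                           # nce_k sampled negatives, (B, K, D)
        l_pos = (features * pos).sum(1, keepdim=True) / nce_t             # scaled cosine similarity, (B, 1)
        l_neg = torch.bmm(neg, features.unsqueeze(2)).squeeze(2) / nce_t  # (B, K)
        logits = torch.cat([l_pos, l_neg], dim=1)       # the positive is always class 0
        target = torch.zeros(B, dtype=torch.long, device=features.device)
        return F.cross_entropy(logits, target)          # Softmax-CE, as in CPC/MoCo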

Oldies: the original releases of ResNet18 and ResNet50 trained with 4096 negatives and the NCE loss. Each tarball contains the feature representations of all ImageNet training images (600 MB) and the model weights (100-200 MB). You can also recompute these representations by forwarding the network over all ImageNet images, as sketched after the list below.

  • ResNet 18 (top-1 nearest neighbor accuracy 41.0%)
  • ResNet 50 (top-1 nearest neighbor accuracy 46.8%)
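
If you prefer to recompute the representations yourself, a minimal sketch follows; the model and data loader are assumptions (a trained network outputting L2-normalized low-dim embeddings, and a standard ImageNet loader), not the repo's exact script.

    import torch
    import torch.nn.functional as F

    @torch.no_grad()
    def extract_features(model, loader, device="cuda"):
        model.eval().to(device)
        feats = []
        for images, _ in loader:                       # standard ImageNet loader
            f = model(images.to(device))               # (B, low_dim) embeddings
            feats.append(F.normalize(f, dim=1).cpu())  # L2-normalize, matching training
        return torch.cat(feats)                        # (N, low_dim) feature bank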

Highlight

  • We formulate unsupervised learning from a completely different non-parametric perspective.
  • Feature encodings can be as compact as 128 dimensions per image.
  • The method enjoys the benefits of advanced architectures and techniques from supervised learning.
  • It runs seamlessly with nearest neighbor classifiers.

Nearest Neighbor

Please follow this link for a list of nearest neighbors on ImageNet. Results are visualized from our ResNet50 model, compared with raw image features and supervised features. The first column is the query image, followed by 20 retrievals ranked by similarity.
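
Retrieval itself is simple: with L2-normalized features, ranking by cosine similarity is a single matrix-vector product. A hedged sketch, assuming query_feat and train_feats come from a feature-extraction pass as above:

    import torch

    def top_k_neighbors(query_feat, train_feats, k=20):
        """query_feat: (D,) normalized query; train_feats: (N, D) normalized bank."""
        sims = train_feats @ query_feat  # cosine similarities, since both are unit length
        return sims.topk(k).indices      # indices of the k closest training images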

Usage

Our code extends the PyTorch implementation of ImageNet classification in the official PyTorch release. Please refer to the official repo for details of data preparation and hardware configuration.

  • Supports Python 2.7 and PyTorch 0.4.

  • If you are looking for PyTorch 0.3, please switch to tag v0.3.

  • Clone this repo: git clone https://github.com/zhirongw/lemniscate.pytorch

  • Training on ImageNet:

    python main.py DATAPATH --arch resnet18 -j 32 --nce-k 4096 --nce-t 0.07 --lr 0.03 --nce-m 0.5 --low-dim 128 -b 256

    • The parameter nce-k controls the number of negative samples. If nce-k is set to 0, the code also supports full softmax learning.
    • nce-t controls the temperature of the distribution; 0.07-0.1 works well in practice.
    • nce-m is the momentum for updating the memory bank, which stabilizes learning; a value of 0.5 works well in practice (see the sketch after this list).
    • The learning rate is initialized to 0.03, a bit smaller than in standard supervised learning.
    • The embedding size is controlled by the parameter low-dim.
  • During training, we monitor the supervised validation accuracy by K nearest neighbor classification with K=1, as it is faster and gives a good estimate of the feature quality (a weighted kNN sketch also follows this list).

  • Testing on ImageNet:

    python main.py DATAPATH --arch resnet18 --resume input_model.pth.tar -e

    This runs testing with the default K=200 neighbors.

  • Training on CIFAR10:

    python cifar.py --nce-k 0 --nce-t 0.1 --lr 0.03
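
As referenced in the parameter list above, nce-m is the momentum with which the memory bank is updated: each image's bank entry is a moving average of its features across epochs, which keeps the negatives slowly varying and stabilizes learning. A minimal sketch, with memory, features, and indices as illustrative names:

    import torch
    import torch.nn.functional as F

    @torch.no_grad()
    def update_memory(memory, features, indices, nce_m=0.5):
        """memory: (N, D) bank; features: (B, D) new embeddings; indices: (B,)."""
        new = nce_m * memory[indices] + (1.0 - nce_m) * features  # moving average
        memory[indices] = F.normalize(new, dim=1)                 # keep entries unit length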
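
The kNN monitoring (K=1 during training) and testing (K=200) both reduce to similarity-weighted voting over the nearest training features. A sketch under the assumption that neighbors vote with weight exp(similarity / temperature), as described in the paper:

    import torch

    def weighted_knn_predict(query, train_feats, train_labels, k=200, t=0.07, num_classes=1000):
        """query: (D,) normalized; train_feats: (N, D) normalized; train_labels: (N,) long."""
        top_sims, top_idx = (train_feats @ query).topk(k)
        weights = (top_sims / t).exp()                 # similarity-weighted votes
        votes = torch.zeros(num_classes)
        votes.scatter_add_(0, train_labels[top_idx], weights)
        return votes.argmax()                          # predicted class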

Citation

@inproceedings{wu2018unsupervised,
  title={Unsupervised Feature Learning via Non-Parametric Instance Discrimination},
  author={Wu, Zhirong and Xiong, Yuanjun and Yu, Stella X. and Lin, Dahua},
  booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
  year={2018}
}

Contact

For any questions, please feel free to reach out to:

Zhirong Wu: [email protected]