
davidtellez / Contrastive Predictive Coding

Keras implementation of Representation Learning with Contrastive Predictive Coding


Projects that are alternatives to or similar to Contrastive Predictive Coding

Contrastive Predictive Coding Pytorch
Contrastive Predictive Coding for Automatic Speaker Verification
Stars: ✭ 223 (-39.57%)
Mutual labels:  unsupervised-learning, representation-learning, predictive-modeling
awesome-graph-self-supervised-learning
Awesome Graph Self-Supervised Learning
Stars: ✭ 805 (+118.16%)
Mutual labels:  representation-learning, unsupervised-learning
M-NMF
An implementation of "Community Preserving Network Embedding" (AAAI 2017)
Stars: ✭ 119 (-67.75%)
Mutual labels:  representation-learning, unsupervised-learning
Pytorch Cortexnet
PyTorch implementation of the CortexNet predictive model
Stars: ✭ 349 (-5.42%)
Mutual labels:  unsupervised-learning, predictive-modeling
VQ-APC
Vector Quantized Autoregressive Predictive Coding (VQ-APC)
Stars: ✭ 34 (-90.79%)
Mutual labels:  representation-learning, unsupervised-learning
FUSION
PyTorch code for NeurIPSW 2020 paper (4th Workshop on Meta-Learning) "Few-Shot Unsupervised Continual Learning through Meta-Examples"
Stars: ✭ 18 (-95.12%)
Mutual labels:  representation-learning, unsupervised-learning
rl singing voice
Unsupervised Representation Learning for Singing Voice Separation
Stars: ✭ 18 (-95.12%)
Mutual labels:  representation-learning, unsupervised-learning
Variational Ladder Autoencoder
Implementation of VLAE
Stars: ✭ 196 (-46.88%)
Mutual labels:  unsupervised-learning, representation-learning
proto
Proto-RL: Reinforcement Learning with Prototypical Representations
Stars: ✭ 67 (-81.84%)
Mutual labels:  representation-learning, unsupervised-learning
SimCLR
Pytorch implementation of "A Simple Framework for Contrastive Learning of Visual Representations"
Stars: ✭ 65 (-82.38%)
Mutual labels:  representation-learning, unsupervised-learning
awesome-contrastive-self-supervised-learning
A comprehensive list of awesome contrastive self-supervised learning papers.
Stars: ✭ 748 (+102.71%)
Mutual labels:  representation-learning, unsupervised-learning
Revisiting-Contrastive-SSL
Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations. [NeurIPS 2021]
Stars: ✭ 81 (-78.05%)
Mutual labels:  representation-learning, unsupervised-learning
srVAE
VAE with RealNVP prior and Super-Resolution VAE in PyTorch. Code release for https://arxiv.org/abs/2006.05218.
Stars: ✭ 56 (-84.82%)
Mutual labels:  representation-learning, unsupervised-learning
Simclr
PyTorch implementation of SimCLR: A Simple Framework for Contrastive Learning of Visual Representations by T. Chen et al.
Stars: ✭ 293 (-20.6%)
Mutual labels:  unsupervised-learning, representation-learning
Pytorch Byol
PyTorch implementation of Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning
Stars: ✭ 213 (-42.28%)
Mutual labels:  unsupervised-learning, representation-learning
amr
Official adversarial mixup resynthesis repository
Stars: ✭ 31 (-91.6%)
Mutual labels:  representation-learning, unsupervised-learning
Autoregressive Predictive Coding
Autoregressive Predictive Coding: An unsupervised autoregressive model for speech representation learning
Stars: ✭ 138 (-62.6%)
Mutual labels:  unsupervised-learning, representation-learning
Simclr
SimCLRv2 - Big Self-Supervised Models are Strong Semi-Supervised Learners
Stars: ✭ 2,720 (+637.13%)
Mutual labels:  unsupervised-learning, representation-learning
State-Representation-Learning-An-Overview
Simplified version of "State Representation Learning for Control: An Overview" bibliography
Stars: ✭ 32 (-91.33%)
Mutual labels:  representation-learning, unsupervised-learning
ladder-vae-pytorch
Ladder Variational Autoencoders (LVAE) in PyTorch
Stars: ✭ 59 (-84.01%)
Mutual labels:  representation-learning, unsupervised-learning

Representation Learning with Contrastive Predictive Coding

This repository contains a Keras implementation of the algorithm presented in the paper Representation Learning with Contrastive Predictive Coding.

The goal of unsupervised representation learning is to capture semantic information about the world, recognizing patterns in the data without using annotations. This paper presents a new method called Contrastive Predictive Coding (CPC) that can do so across multiple applications. The main ideas of the paper are:

  • Contrastive: it is trained using a contrastive approach, that is, the main model has to discern between right and wrong data sequences.
  • Predictive: the model has to predict future patterns given the current context.
  • Coding: the model performs this prediction in a latent space, transforming code vectors into other code vectors (in contrast with predicting high-dimensional data directly).

CPC has to predict the next item in a sequence using only an embedded representation of the data, provided by an encoder. In order to solve the task, this encoder has to learn a meaningful representation of the data space. After training, this encoder can be used for other downstream tasks like supervised classification.

[Figure: CPC algorithm]
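For concreteness, here is a minimal sketch of such a network built with the Keras functional API. The layer sizes, the GRU context network, and the binary positive/negative-sequence scoring are assumptions for illustration only; the actual architecture is defined in the repository code.

import tensorflow as tf
from tensorflow.keras import layers, Model

S, P = 4, 2                 # context length and number of predicted steps
patch_shape = (64, 64, 3)   # modified MNIST patches (64x64 RGB)
code_size = 128             # latent code dimensionality (assumed)

# Encoder: maps a single patch to a code vector.
patch_in = layers.Input(patch_shape)
h = layers.Conv2D(32, 3, strides=2, activation='relu')(patch_in)
h = layers.Conv2D(64, 3, strides=2, activation='relu')(h)
h = layers.GlobalAveragePooling2D()(h)
encoder = Model(patch_in, layers.Dense(code_size)(h), name='encoder')

# Context network: encodes the S context patches and summarizes them.
context_in = layers.Input((S,) + patch_shape)
z_context = layers.TimeDistributed(encoder)(context_in)
context = layers.GRU(256)(z_context)

# Predict the codes of the next P patches from the context.
preds = layers.Reshape((P, code_size))(layers.Dense(P * code_size)(context))

# Encode the P candidate future patches and score them against the
# predictions in latent space (dot product averaged over steps).
future_in = layers.Input((P,) + patch_shape)
z_future = layers.TimeDistributed(encoder)(future_in)
score = layers.Multiply()([preds, z_future])
score = layers.GlobalAveragePooling1D()(score)                        # mean over the P steps
score = layers.Lambda(lambda t: tf.reduce_mean(t, axis=-1, keepdims=True))(score)
prob = layers.Activation('sigmoid')(score)

cpc = Model([context_in, future_in], prob, name='cpc')
cpc.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

Trained this way, the model learns to give high scores to futures whose codes match its predictions, which is exactly the contrastive objective described above.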

To train the CPC algorithm, I have created a toy dataset. It consists of sequences of modified MNIST digits (64x64 RGB patches). Positive samples contain consecutive, sorted digits, whereas negative samples continue with random digits. For example, assume the context sequence length is S=4 and CPC is asked to predict the next P=2 digits: a positive sample could look like [2, 3, 4, 5]->[6, 7], whereas a negative one could be [1, 2, 3, 4]->[0, 8]. Of course, CPC only sees the image patches, not the underlying digit labels.
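As a rough, illustrative sketch (digit labels only; the real data pipeline also renders the corresponding 64x64 RGB patches), a positive/negative sequence generator could look like the following. All names and details here are hypothetical, not the repository's exact implementation.

import numpy as np

def make_sequence(S=4, P=2, positive=True, rng=np.random):
    """Return (context, future, label): S sorted context digits plus P future
    digits that either continue the run (positive) or are random (negative)."""
    start = rng.randint(0, 10 - (S + P))           # so the sorted run fits in 0..9
    context = [start + i for i in range(S)]
    if positive:
        future = [start + S + i for i in range(P)]
        label = 1
    else:
        # Arbitrary digits; in practice one would also check they do not
        # accidentally continue the sorted run.
        future = list(rng.randint(0, 10, size=P))
        label = 0
    return context, future, label

# Example: [2, 3, 4, 5] -> [6, 7] (positive) vs. [2, 3, 4, 5] -> [0, 8] (negative)
print(make_sequence(positive=True))
print(make_sequence(positive=False))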

Disclaimer: this code is provided as-is. If you encounter a bug, please report it as an issue; your help will be much appreciated!

Results

After 10 training epochs, CPC achieves 99% accuracy on the contrastive task. After training, I froze the encoder and trained an MLP on top of it to perform supervised digit classification on the same MNIST data. It reached 90% accuracy after 10 epochs, demonstrating the effectiveness of CPC for unsupervised feature extraction.
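Conceptually, the benchmark step looks roughly like the sketch below: load the trained encoder, freeze its weights, and fit a small MLP classifier on top. File names and layer sizes are assumptions for illustration, not necessarily what benchmark_model.py does.

from tensorflow.keras import layers, models

encoder = models.load_model('encoder.h5')   # hypothetical path to the saved CPC encoder
encoder.trainable = False                    # keep the CPC features fixed

clf = models.Sequential([
    encoder,
    layers.Dense(128, activation='relu'),
    layers.Dense(10, activation='softmax'),  # 10 MNIST digit classes
])
clf.compile(optimizer='adam',
            loss='sparse_categorical_crossentropy',
            metrics=['accuracy'])
# clf.fit(x_train, y_train, epochs=10, validation_data=(x_val, y_val))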

Usage

  • Execute python train_model.py to train the CPC model.
  • Execute python benchmark_model.py to train the MLP on top of the CPC encoder.

Requisites

References

  • van den Oord, A., Li, Y., Vinyals, O. Representation Learning with Contrastive Predictive Coding. arXiv:1806.03748 (2018).
