
nejlag / Semi-Supervised-Learning-GAN

License: MIT
Semi-supervised Learning GAN

Programming Languages

Jupyter Notebook

Projects that are alternatives of or similar to Semi-Supervised-Learning-GAN

GDPP
Generator loss to reduce mode-collapse and to improve the generated samples quality.
Stars: ✭ 32 (-55.56%)
Mutual labels:  generative-adversarial-networks
gan-vae-pretrained-pytorch
Pretrained GANs + VAEs + classifiers for MNIST/CIFAR in pytorch.
Stars: ✭ 134 (+86.11%)
Mutual labels:  generative-adversarial-networks
JCLAL
JCLAL is a general purpose framework developed in Java for Active Learning.
Stars: ✭ 22 (-69.44%)
Mutual labels:  semi-supervised-learning
semi-memory
Tensorflow Implementation on Paper [ECCV2018]Semi-Supervised Deep Learning with Memory
Stars: ✭ 49 (-31.94%)
Mutual labels:  semi-supervised-learning
PCLoc
Pose Correction for Highly Accurate Visual Localization in Large-scale Indoor Spaces (ICCV 2021)
Stars: ✭ 37 (-48.61%)
Mutual labels:  feature-matching
ssdg-benchmark
Benchmarks for semi-supervised domain generalization.
Stars: ✭ 46 (-36.11%)
Mutual labels:  semi-supervised-learning
generative models
Pytorch implementations of generative models: VQVAE2, AIR, DRAW, InfoGAN, DCGAN, SSVAE
Stars: ✭ 82 (+13.89%)
Mutual labels:  semi-supervised-learning
stylegan-v
[CVPR 2022] StyleGAN-V: A Continuous Video Generator with the Price, Image Quality and Perks of StyleGAN2
Stars: ✭ 136 (+88.89%)
Mutual labels:  generative-adversarial-networks
DualStudent
Code for Paper ''Dual Student: Breaking the Limits of the Teacher in Semi-Supervised Learning'' [ICCV 2019]
Stars: ✭ 106 (+47.22%)
Mutual labels:  semi-supervised-learning
DeepAtlas
Joint Semi-supervised Learning of Image Registration and Segmentation
Stars: ✭ 38 (-47.22%)
Mutual labels:  semi-supervised-learning
ETCI-2021-Competition-on-Flood-Detection
Experiments on Flood Segmentation on Sentinel-1 SAR Imagery with Cyclical Pseudo Labeling and Noisy Student Training
Stars: ✭ 102 (+41.67%)
Mutual labels:  semi-supervised-learning
pyprophet
PyProphet: Semi-supervised learning and scoring of OpenSWATH results.
Stars: ✭ 23 (-68.06%)
Mutual labels:  semi-supervised-learning
GAN-keras
tensorflow2.x implementations of Generative Adversarial Networks.
Stars: ✭ 30 (-58.33%)
Mutual labels:  generative-adversarial-networks
EC-GAN
EC-GAN: Low-Sample Classification using Semi-Supervised Algorithms and GANs (AAAI 2021)
Stars: ✭ 29 (-59.72%)
Mutual labels:  semi-supervised-learning
OASIS
Official implementation of the paper "You Only Need Adversarial Supervision for Semantic Image Synthesis" (ICLR 2021)
Stars: ✭ 232 (+222.22%)
Mutual labels:  generative-adversarial-networks
seededlda
Semi-supervised LDA for theory-driven text analysis
Stars: ✭ 46 (-36.11%)
Mutual labels:  semi-supervised-learning
Feature-Detection-and-Matching
Feature Detection and Matching with SIFT, SURF, KAZE, BRIEF, ORB, BRISK, AKAZE and FREAK through the Brute Force and FLANN algorithms using Python and OpenCV
Stars: ✭ 95 (+31.94%)
Mutual labels:  feature-matching
cfg-gan
CFG-GAN: Composite functional gradient learning of generative adversarial models
Stars: ✭ 15 (-79.17%)
Mutual labels:  generative-adversarial-networks
semantic-parsing-dual
Source code and data for ACL 2019 Long Paper ``Semantic Parsing with Dual Learning".
Stars: ✭ 17 (-76.39%)
Mutual labels:  semi-supervised-learning
gcWGAN
Guided Conditional Wasserstein GAN for De Novo Protein Design
Stars: ✭ 38 (-47.22%)
Mutual labels:  generative-adversarial-networks

Semi-supervised Learning with Generative Adversarial Networks (GANs)

Modern deep learning classifiers require a large volume of labeled samples to generalize well. GANs have shown a lot of potential in semi-supervised learning, where the classifier can obtain good performance with very little labeled data (Salimans et al., 2016).

Overview

To train a K-class classifier with a small number of labeled samples, the discriminator (D) in the GAN game is replaced with a (K+1)-class classifier: it receives a data point x as input and outputs a (K+1)-dimensional vector of logits {l_1, ..., l_{K+1}}. These logits can then be turned into class probabilities via a softmax, p_model(y = j | x) = exp(l_j) / Σ_k exp(l_k), where:

p_model(y = K+1 | x) provides the probability that x is fake.

p_model(y = i | x), i ∈ {1, ..., K}, provides the probability that x is real and belongs to class i. The loss of the discriminator can then be written as:

L_D = L_supervised + L_unsupervised

where:

L_supervised = -E_{x,y ~ p_data} log p_model(y | x, y < K+1)

is the standard supervised cross-entropy loss given that the data is real, and:

L_unsupervised = -E_{x ~ p_data} log(1 - p_model(y = K+1 | x)) - E_{x ~ G} log p_model(y = K+1 | x)

is the standard GAN game value, with:

D(x) = 1 - p_model(y = K+1 | x).
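The two discriminator terms above can be sketched with plain NumPy (a minimal illustration, not the repository's actual notebook code; the function names, array shapes, and the 1e-8 stabilizer are my own assumptions):

```python
import numpy as np

def softmax(logits):
    """Row-wise softmax over (batch, K+1) logits."""
    z = logits - logits.max(axis=1, keepdims=True)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def discriminator_loss(logits_labeled, labels, logits_unlabeled, logits_fake, K):
    """L_D = L_supervised + L_unsupervised for a (K+1)-class discriminator.

    Column K (0-indexed) of the logits is the 'fake' class.
    """
    # Supervised term: cross-entropy over the K real classes, conditioned on the data being real.
    p_lab = softmax(logits_labeled)
    p_real = p_lab[:, :K] / p_lab[:, :K].sum(axis=1, keepdims=True)  # p(y=i | x, y < K+1)
    l_sup = -np.mean(np.log(p_real[np.arange(len(labels)), labels] + 1e-8))

    # Unsupervised term: real data should not look fake; generated data should.
    p_fake_on_real = softmax(logits_unlabeled)[:, K]
    p_fake_on_fake = softmax(logits_fake)[:, K]
    l_unsup = (-np.mean(np.log(1 - p_fake_on_real + 1e-8))
               - np.mean(np.log(p_fake_on_fake + 1e-8)))

    return l_sup + l_unsup
```

In practice the supervised term is computed only on the labeled minibatch and the unsupervised term on unlabeled plus generated minibatches, as in Salimans et al. (2016).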

Now, let f(x) denote the activations of an intermediate layer of the discriminator. The feature-matching loss of the generator can be defined as:

L_FM = || E_{x ~ p_data} f(x) - E_{z ~ p_z} f(G(z)) ||_2^2

Feature matching has shown a lot of potential in semi-supervised learning. Its goal is to push the generator to produce data that matches the statistics of real data. The discriminator is used to specify those statistics, since it naturally learns the features that are most discriminative between real data and data generated by the current model.
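The feature-matching objective reduces to comparing batch-mean activations. A minimal NumPy sketch (here the feature arrays stand in for an intermediate discriminator layer's activations; the shapes are assumptions for illustration):

```python
import numpy as np

def feature_matching_loss(features_real, features_fake):
    """|| E_x f(x) - E_z f(G(z)) ||_2^2, with expectations taken over each minibatch.

    features_real, features_fake: (batch, feature_dim) activations of an
    intermediate discriminator layer on real and generated minibatches.
    """
    mean_real = features_real.mean(axis=0)  # minibatch estimate of E_{x~p_data} f(x)
    mean_fake = features_fake.mean(axis=0)  # minibatch estimate of E_{z~p_z} f(G(z))
    return np.sum((mean_real - mean_fake) ** 2)
```

Note that only the means are matched, so the loss is zero whenever the two minibatches share the same average features, even if individual samples differ.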

In this code, I combined L_FM with the known generator cost that maximizes the log-probability of the discriminator being mistaken:

L_unsup = -E_{z ~ p_z} log D(G(z)).

So, the loss of the generator can be written as:

L_G = L_FM + L_unsup
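Combining the two generator terms can be sketched as follows (again a NumPy illustration under assumed shapes and names, not the notebook's actual code; column K of the discriminator logits is taken to be the 'fake' class):

```python
import numpy as np

def softmax(logits):
    """Row-wise softmax over (batch, K+1) logits."""
    z = logits - logits.max(axis=1, keepdims=True)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def generator_loss(features_real, features_fake, logits_fake, K):
    """L_G = L_FM + L_unsup: feature matching plus the 'fool the discriminator' cost."""
    # Feature matching: match batch-mean activations of an intermediate D layer.
    l_fm = np.sum((features_real.mean(axis=0) - features_fake.mean(axis=0)) ** 2)
    # Non-saturating term: maximize log D(G(z)) = log(1 - p_model(y=K+1 | G(z))).
    d_of_fake = 1.0 - softmax(logits_fake)[:, K]
    l_unsup = -np.mean(np.log(d_of_fake + 1e-8))
    return l_fm + l_unsup
```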

Results

The table below shows the cross-validation accuracy of the semi-supervised GAN after 1000 epochs of training when 10% and 100% of the MNIST data are labeled.

| 10% labeled data | 100% labeled data |
|------------------|-------------------|
| 0.9255           | 0.945             |

The figure below shows cross-validation accuracy over 1000 epochs when 10% of the data is labeled. As can be seen, training has not yet reached a plateau, which suggests that further training could yield higher accuracy.

The figures below show some generated samples at different epochs of training when 10% of the data is labeled:

Reference:

Salimans, T., Goodfellow, I., Zaremba, W., Cheung, V., Radford, A., and Chen, X. (2016). Improved Techniques for Training GANs. In Advances in Neural Information Processing Systems (NIPS), pages 2226–2234. (http://papers.nips.cc/paper/6125-improved-techniques-for-training-gans.pdf)
