podgorskiy / Gpnd

Generative Probabilistic Novelty Detection with Adversarial Autoencoders

Programming Languages

python

Projects that are alternatives to or similar to Gpnd

Ad examples
A collection of anomaly detection methods (i.i.d./point-based, graph, and time series), including active learning for anomaly detection/discovery, Bayesian rule-mining, and description for diversity/explanation/interpretability. Includes an analysis of incorporating label feedback with ensemble and tree-based detectors, as well as adversarial attacks with Graph Convolutional Networks.
Stars: ✭ 641 (+472.32%)
Mutual labels:  gan, generative-adversarial-network, anomaly-detection, autoencoder
Awesome Tensorlayer
A curated list of dedicated resources and applications
Stars: ✭ 248 (+121.43%)
Mutual labels:  generative-adversarial-network, autoencoder, mnist, adversarial-learning
Niftynet
[unmaintained] An open-source convolutional neural networks platform for research in medical image analysis and image-guided therapy
Stars: ✭ 1,276 (+1039.29%)
Mutual labels:  deep-neural-networks, gan, autoencoder
Pytorch Mnist Celeba Gan Dcgan
Pytorch implementation of Generative Adversarial Networks (GAN) and Deep Convolutional Generative Adversarial Networks (DCGAN) for MNIST and CelebA datasets
Stars: ✭ 363 (+224.11%)
Mutual labels:  gan, generative-adversarial-network, mnist
Generative Models
Annotated, understandable, and visually interpretable PyTorch implementations of: VAE, BIRVAE, NSGAN, MMGAN, WGAN, WGANGP, LSGAN, DRAGAN, BEGAN, RaGAN, InfoGAN, fGAN, FisherGAN
Stars: ✭ 438 (+291.07%)
Mutual labels:  gan, generative-adversarial-network, autoencoder
Semantic Pyramid for Image Generation
PyTorch reimplementation of the paper: "Semantic Pyramid for Image Generation" [CVPR 2020].
Stars: ✭ 45 (-59.82%)
Mutual labels:  generative-adversarial-network, gan, adversarial-learning
MNIST-invert-color
Invert the color of MNIST images with PyTorch
Stars: ✭ 13 (-88.39%)
Mutual labels:  generative-adversarial-network, gan, mnist
Lggan
[CVPR 2020] Local Class-Specific and Global Image-Level Generative Adversarial Networks for Semantic-Guided Scene Generation
Stars: ✭ 97 (-13.39%)
Mutual labels:  gan, generative-adversarial-network, adversarial-learning
Pytorch Generative Model Collections
Collection of generative models implemented in PyTorch.
Stars: ✭ 2,296 (+1950%)
Mutual labels:  gan, generative-adversarial-network, mnist
Adversarial video generation
A TensorFlow Implementation of "Deep Multi-Scale Video Prediction Beyond Mean Square Error" by Mathieu, Couprie & LeCun.
Stars: ✭ 662 (+491.07%)
Mutual labels:  deep-neural-networks, gan, generative-adversarial-network
Gans In Action
Companion repository to GANs in Action: Deep learning with Generative Adversarial Networks
Stars: ✭ 748 (+567.86%)
Mutual labels:  deep-neural-networks, gan, generative-adversarial-network
P2pala
Page to PAGE Layout Analysis Tool
Stars: ✭ 147 (+31.25%)
Mutual labels:  deep-neural-networks, gan, generative-adversarial-network
Nice Gan Pytorch
Official PyTorch implementation of NICE-GAN: Reusing Discriminators for Encoding: Towards Unsupervised Image-to-Image Translation
Stars: ✭ 140 (+25%)
Mutual labels:  deep-neural-networks, gan, generative-adversarial-network
Alae
[CVPR2020] Adversarial Latent Autoencoders
Stars: ✭ 3,178 (+2737.5%)
Mutual labels:  gan, generative-adversarial-network, autoencoder
Anogan Tf
Unofficial Tensorflow Implementation of AnoGAN (Anomaly GAN)
Stars: ✭ 218 (+94.64%)
Mutual labels:  gan, generative-adversarial-network, anomaly-detection
Tensorflow Tutorial
Tensorflow tutorial from basic to advanced; Chinese AI tutorials by 莫烦Python.
Stars: ✭ 4,122 (+3580.36%)
Mutual labels:  gan, generative-adversarial-network, autoencoder
Repo 2017
Python codes in Machine Learning, NLP, Deep Learning and Reinforcement Learning with Keras and Theano
Stars: ✭ 1,123 (+902.68%)
Mutual labels:  generative-adversarial-network, anomaly-detection, autoencoder
Tensorflow Mnist Gan Dcgan
Tensorflow implementation of Generative Adversarial Networks (GAN) and Deep Convolutional Generative Adversarial Networks (DCGAN) for the MNIST dataset.
Stars: ✭ 163 (+45.54%)
Mutual labels:  gan, generative-adversarial-network, mnist
Tensorflow Tutorials
Provides source code for practicing TensorFlow step by step, from the basics to applied examples.
Stars: ✭ 2,096 (+1771.43%)
Mutual labels:  gan, autoencoder, mnist
Pix2pixhd
Synthesizing and manipulating 2048x1024 images with conditional GANs
Stars: ✭ 5,553 (+4858.04%)
Mutual labels:  deep-neural-networks, gan, generative-adversarial-network

Generative Probabilistic Novelty Detection with Adversarial Autoencoders

Stanislav Pidhorskyi, Ranya Almohsen, Donald A. Adjeroh, Gianfranco Doretto

Lane Department of Computer Science and Electrical Engineering, West Virginia University
Morgantown, WV 26508
{stpidhorskyi, ralmohse, daadjeroh, gidoretto}@mix.wvu.edu

The e-preprint of the article is available on arXiv.

NeurIPS Proceedings.

@inproceedings{pidhorskyi2018generative,
  title={Generative probabilistic novelty detection with adversarial autoencoders},
  author={Pidhorskyi, Stanislav and Almohsen, Ranya and Doretto, Gianfranco},
  booktitle={Advances in neural information processing systems},
  pages={6822--6833},
  year={2018}
}

Content

  • partition_mnist.py - code for preparing the MNIST dataset.
  • train_AAE.py - code for training the autoencoder.
  • novelty_detector.py - code for running the novelty detector.
  • net.py - contains definitions of network architectures.

How to run

You will need to run partition_mnist.py first.

Then run schedule.py. It will run as many concurrent experiments as there are GPUs available. Results will be written to the results.csv file.
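
For instance, here is a minimal sketch of the two-step pipeline in Python (an assumption for illustration; equivalently, just invoke the two scripts from the command line in the repository root):

import subprocess

# Prepare the MNIST folds, then schedule the experiments across the available GPUs.
subprocess.run(["python", "partition_mnist.py"], check=True)
subprocess.run(["python", "schedule.py"], check=True)  # writes results.csv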


Alternatively, you can directly call functions from train_AAE.py and novelty_detector.py.

To train the autoencoder with train_AAE.py, call the train function:

train_AAE.train(
  folding_id,
  inliner_classes,
  ic
)

Args:

  • folding_id: Id of the fold. For MNIST, 5 folds are generated, so folding_id must be in the range [0..5) (i.e., 0 to 4).
  • inliner_classes: List of classes considered inliers.
  • ic: inlier class set index (used to save the model with a unique filename).
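
For example, a minimal usage sketch (the fold index, inlier class, and set index below are illustrative values, not values prescribed by the paper):

import train_AAE

# Illustrative call: fold 0, digit 3 as the only inlier class, class-set index 0.
train_AAE.train(0, [3], 0)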

After the autoencoder has been trained, call the main function from novelty_detector.py:

novelty_detector.main(
  folding_id,
  inliner_classes,
  total_classes,
  mul,
  folds=5
)
  • folding_id: Id of the fold. For MNIST, 5 folds are generated, so folding_id must be in the range [0..5) (i.e., 0 to 4).
  • inliner_classes: List of classes considered inliers.
  • ic: inlier class set index (used to save the model with a unique filename).
  • total_classes: Total count of classes (deprecated, moved to config).
  • mul: multiplier for power correction. The default value is 0.2.
  • folds: Number of folds (deprecated, moved to config).
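
A matching evaluation sketch (again with illustrative values; 10 is the total number of MNIST classes, and 0.2 is the default multiplier mentioned above):

import novelty_detector

# Illustrative call matching the training example above:
# fold 0, digit 3 as the inlier class, 10 classes in total, power-correction multiplier 0.2.
novelty_detector.main(0, [3], 10, 0.2, folds=5)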

Generated/Reconstructed images

MNIST Reconstruction

MNIST Reconstruction. First row - real images, second row - reconstructed.



MNIST Generation

MNIST Generation.



COIL100 Reconstruction

COIL100 Reconstruction, single category. First row - real images, second row - reconstructed. Only 57 images were used for training.



COIL100 Generation

COIL100 Generation. Only 57 images were used for training.



COIL100 Reconstruction

COIL100 Reconstruction, 7 categories. First row - real images, second row - reconstructed. Only about 60 images per category were used for training.



COIL100 Generation

COIL100 Generation. Only about 60 images per category were used for training.



PDF

PDF of the latent space for MNIST. The size of the latent space is 32.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].