kiryor / Nnpulearning

Licence: other
Non-negative Positive-Unlabeled (nnPU) and unbiased Positive-Unlabeled (uPU) learning reproducing code on MNIST and CIFAR10


Chainer implementation of non-negative PU learning and unbiased PU learning

This is reproducing code for non-negative PU (nnPU) learning [1] and unbiased PU (uPU) learning [2], as presented in the paper "Positive-Unlabeled Learning with Non-Negative Risk Estimator" [1].

  • pu_loss.py contains a Chainer implementation of the risk estimators for non-negative PU (nnPU) learning and unbiased PU (uPU) learning; a minimal sketch of these estimators appears after this list.
  • train.py is an example script for nnPU learning and uPU learning. The datasets are MNIST [3], preprocessed so that even digits form the P class and odd digits form the N class, and CIFAR10 [4], preprocessed so that artifacts form the P class and living things form the N class. The default setting uses 100 P samples and 59,900 U samples from MNIST, and the class prior is the ratio of P-class samples within the U data (roughly 0.5 for MNIST, since about half of the digits are even).
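
For orientation, below is a minimal NumPy sketch of the two risk estimators, assuming the sigmoid loss and a real-valued decision function g. It illustrates the formulas of [1] and [2]; it is not the actual code in pu_loss.py.

    import numpy as np

    def sigmoid_loss(z):
        # Sigmoid loss l(z) = 1 / (1 + exp(z)): near 1 when the margin z is
        # very negative (misclassified), near 0 when z is large and positive.
        return 1.0 / (1.0 + np.exp(z))

    def pu_risk(g_p, g_u, prior, non_negative=True):
        # g_p: decision values g(x) on positive (P) samples
        # g_u: decision values g(x) on unlabeled (U) samples
        # prior: class prior pi = p(y = +1)
        r_p_pos = sigmoid_loss(g_p).mean()    # P samples treated as positives
        r_p_neg = sigmoid_loss(-g_p).mean()   # P samples treated as negatives
        r_u_neg = sigmoid_loss(-g_u).mean()   # U samples treated as negatives
        neg_part = r_u_neg - prior * r_p_neg  # unbiased estimate of the negative risk
        if non_negative:
            neg_part = max(neg_part, 0.0)     # nnPU: clip the negative part at zero
        return prior * r_p_pos + neg_part     # uPU risk when non_negative=False

Note that the training procedure in [1] additionally steps against the gradient of the clipped term when it becomes negative; the sketch above shows only the risk value itself.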

Requirements

  • Python == 3.7
  • NumPy == 1.16
  • Chainer == 6.4
  • scikit-learn == 0.21
  • Matplotlib == 3.0
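
One way to install matching versions is with pip; the wildcard pins below are an assumption (any matching patch release should work), not a documented install command:

    pip install "numpy==1.16.*" "chainer==6.4.*" "scikit-learn==0.21.*" "matplotlib==3.0.*"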

Quick start

You can run the MNIST example, which compares the performance of nnPU learning and uPU learning, on a GPU:

python3 train.py -g 0

There are also preset configurations for reproducing the results of [1]; see the example invocation after this list.

  • --preset figure1: the setting of Figure 1 in [1]
  • --preset exp-mnist: the setting of the MNIST experiment in the experiments section of [1]
  • --preset exp-cifar: the setting of the CIFAR10 experiment in the experiments section of [1]
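
For example, to run the MNIST experiment preset on GPU 0 (combining the flags shown above):

python3 train.py -g 0 --preset exp-mnist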

Run python3 train.py --help for additional options.

Example result

After running train.py, two figures and one log file are created in result/ by default. The errors are measured by the zero-one loss.

  • Training error: result/training_error.png
  • Test error: result/test_error.png

Reference

[1] Ryuichi Kiryo, Gang Niu, Marthinus Christoffel du Plessis, and Masashi Sugiyama. "Positive-Unlabeled Learning with Non-Negative Risk Estimator." Advances in Neural Information Processing Systems. 2017.

[2] Marthinus Christoffel du Plessis, Gang Niu, and Masashi Sugiyama. "Convex formulation for learning from positive and unlabeled data." Proceedings of the 32nd International Conference on Machine Learning. 2015.

[3] Yann LeCun. "The MNIST database of handwritten digits." http://yann.lecun.com/exdb/mnist/ (1998).

[4] Alex Krizhevsky and Geoffrey Hinton. "Learning multiple layers of features from tiny images." (2009).
