pywsl: Python code for weakly-supervised learning

License: MIT

This package contains implementations of the following methods:

  • Unbiased PU learning
        from "Convex formulation for learning from positive and unlabeled data", ICML 2015 [uPU]
  • Non-negative PU learning
        from "Positive-unlabeled learning with non-negative risk estimator", NIPS 2017 [nnPU]
  • PU set kernel classifier
        from "Convex formulation of multiple instance learning from positive and unlabeled bags", Neural Networks 2018 [PU-SKC]
  • Class-prior estimation based on energy distance
        from "Computationally efficient class-prior estimation under class balance change using energy distance", IEICE-ED 2016 [CPE-ENE]
  • PNU classification
        from "Semi-supervised classification based on classification from positive and unlabeled data", ICML 2017 [PNU]
  • PNU-AUC optimization
        from "Semi-supervised AUC optimization based on positive-unlabeled learning", MLJ 2018 [PNU-AUC]
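To illustrate the kind of estimator the package implements, here is a minimal NumPy sketch of the non-negative PU risk from the nnPU paper (Kiryo et al., 2017). This is not pywsl's own API; the function name `nnpu_risk` and its signature are illustrative only, and the logistic loss is just one admissible choice of margin loss.

```python
import numpy as np

def nnpu_risk(g_p, g_u, prior, loss=None):
    """Non-negative PU risk estimate (Kiryo et al., 2017) -- illustrative sketch.

    g_p   : classifier scores g(x) on positive samples
    g_u   : classifier scores g(x) on unlabeled samples
    prior : class prior pi = p(y = +1)
    loss  : margin loss ell(z); defaults to the logistic loss
    """
    if loss is None:
        loss = lambda z: np.logaddexp(0.0, -z)  # log(1 + exp(-z)), numerically stable
    g_p = np.asarray(g_p, dtype=float)
    g_u = np.asarray(g_u, dtype=float)
    r_p_pos = np.mean(loss(g_p))    # E_p[ell(g(x))]
    r_p_neg = np.mean(loss(-g_p))   # E_p[ell(-g(x))]
    r_u_neg = np.mean(loss(-g_u))   # E_u[ell(-g(x))]
    # The unbiased estimator [uPU] uses r_u_neg - prior * r_p_neg directly,
    # which can go negative and cause overfitting; nnPU clips it at zero.
    return prior * r_p_pos + max(0.0, r_u_neg - prior * r_p_neg)

# Example: scores drawn from two overlapping Gaussians
rng = np.random.default_rng(0)
scores_pos = rng.normal(1.0, 1.0, size=100)   # positives score higher on average
scores_unl = rng.normal(0.0, 1.0, size=200)   # unlabeled mixture
risk = nnpu_risk(scores_pos, scores_unl, prior=0.3)
```

By construction the clipped estimate is always non-negative, which is exactly the property that distinguishes nnPU from the unbiased uPU estimator when training flexible models.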

Installation

$ pip install pywsl


References

  1. du Plessis, M. C., Niu, G., and Sugiyama, M.
    Convex formulation for learning from positive and unlabeled data.
    In Bach, F. and Blei, D. (Eds.), Proceedings of the 32nd International Conference on Machine Learning, JMLR Workshop and Conference Proceedings, vol.37, pp.1386-1394, Lille, France, Jul. 6-11, 2015.
  2. Kiryo, R., Niu, G., du Plessis, M. C., and Sugiyama, M.
    Positive-unlabeled learning with non-negative risk estimator.
    In Guyon, I., Luxburg, U. V., Bengio, S., Wallach, H., Fergus, R., Vishwanathan, S., and Garnett, R. (Eds.), Advances in Neural Information Processing Systems 30, pp.1674-1684, 2017.
  3. Bao, H., Sakai, T., Sato, I., and Sugiyama, M.
    Convex formulation of multiple instance learning from positive and unlabeled bags.
    Neural Networks, vol.105, pp.132-141, 2018.
  4. Kawakubo, H., du Plessis, M. C., and Sugiyama, M.
    Computationally efficient class-prior estimation under class balance change using energy distance.
    IEICE Transactions on Information and Systems, vol.E99-D, no.1, pp.176-186, 2016.
  5. Sakai, T., du Plessis, M. C., Niu, G., and Sugiyama, M.
    Semi-supervised classification based on classification from positive and unlabeled data.
    In Precup, D. and Teh, Y. W. (Eds.), Proceedings of the 34th International Conference on Machine Learning, Proceedings of Machine Learning Research, vol.70, pp.2998-3006, Sydney, Australia, Aug. 6-12, 2017.
  6. Sakai, T., Niu, G., and Sugiyama, M.
    Semi-supervised AUC optimization based on positive-unlabeled learning.
    Machine Learning, vol.107, no.4, pp.767-794, 2018.