aonotas / Adversarial_text

Code for Adversarial Training Methods for Semi-Supervised Text Classification


Adversarial Training Methods for Semi-Supervised Text Classification


This code reproduces [Miyato et al., 2017] in Chainer.

Setup Environment

Please install Chainer and CuPy.

You can set up the environment easily by following Setup.md.

Download Pre-trained Model

Please download the pre-trained model:

$ wget http://sato-motoki.com/research/vat/imdb_pretrained_lm.model

Result

Model | Error Rate
----- | ----------
Baseline [Miyato et al., 2017] | 7.39
Baseline (Our code) | 6.62
Adversarial [Miyato et al., 2017] | 6.21
Adversarial Training (Our code) | 6.35
Virtual Adversarial Training [TensorFlow code] | 6.40
Virtual Adversarial Training [Miyato et al., 2017] | 5.91
Virtual Adversarial Training (Our code) | 5.82

Run

Pretrain

$ python -u pretrain.py -g 0 --layer 1 --dataset imdb --bproplen 100 --batchsize 32 --out results_imdb_adaptive --adaptive-softmax

Note that this command takes about 30 hours on a single GPU.

Train (VAT: Semi-supervised setting)

$ python train.py --gpu=0 --n_epoch=30 --batchsize 32 --save_name=imdb_model_vat --lower=0 --use_adv=0 --xi_var=5.0  --use_unlabled=1 --alpha=0.001 --alpha_decay=0.9998 --min_count=1 --ignore_unk=1 --pretrained_model imdb_pretrained_lm.model --use_exp_decay=1 --clip=5.0 --batchsize_semi 96 --use_semi_data 1

Note that this command takes about 8 hours on a single GPU.
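
To illustrate what the virtual adversarial loss computes, here is a minimal NumPy sketch of the virtual adversarial perturbation (one power iteration, as in [Miyato et al., 2017]). It uses a toy linear softmax classifier in place of this repo's LSTM, and all names and constants are illustrative, not the script's actual variables; in the real code the perturbation is applied to word embeddings and the gradient comes from backprop rather than a closed form.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def kl(p, q):
    # KL divergence between two categorical distributions
    return float(np.sum(p * (np.log(p) - np.log(q))))

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 5))   # toy linear "model" standing in for the LSTM
x = rng.normal(size=5)        # stands in for a word embedding
xi, eps = 1e-6, 5.0           # finite-difference scale and perturbation norm

p = softmax(W @ x)            # current prediction, treated as a constant

# One power iteration: start from a random unit direction d, take the
# gradient of KL(p || p_{x + xi*d}) with respect to the perturbation.
# For softmax(W(x + r)) that gradient is W.T @ (q - p) in closed form,
# so no autodiff is needed in this toy setting.
d = rng.normal(size=5)
d /= np.linalg.norm(d)
q = softmax(W @ (x + xi * d))
g = W.T @ (q - p)
d = g / np.linalg.norm(g)

r_vadv = eps * d              # virtual adversarial perturbation
vat_loss = kl(p, softmax(W @ (x + r_vadv)))
```

Because the perturbation direction is found from the model's own predictions, no label is needed, which is what lets the unlabeled IMDB data contribute to training in the semi-supervised setting.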

Train (Adversarial Training: Supervised setting)

$ python train.py --gpu=0 --n_epoch=30 --batchsize 32 --save_name=imdb_model_adv --lower=0 --use_adv=1 --xi_var=5.0  --use_unlabled=1 --alpha=0.001 --alpha_decay=0.9998 --min_count=1 --ignore_unk=1 --pretrained_model imdb_pretrained_lm.model --use_exp_decay=1 --clip=5.0

Note that this command takes about 6 hours on a single GPU.
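
The supervised adversarial step can be sketched the same way: take a step of norm eps along the gradient of the loss with respect to the input embedding. Again this is a toy linear stand-in for the repo's LSTM, with illustrative names; the real script backprops through the network to the embedding layer.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def xent(p, y):
    # cross-entropy of prediction p against gold label index y
    return float(-np.log(p[y]))

rng = np.random.default_rng(1)
W = rng.normal(size=(2, 5))   # toy linear classifier standing in for the LSTM
x = rng.normal(size=5)        # stands in for a word embedding
y = 0                         # gold label
eps = 1.0                     # perturbation norm (cf. the paper's epsilon)

p = softmax(W @ x)
clean_loss = xent(p, y)

# Gradient of the cross-entropy w.r.t. the input embedding. For a linear
# softmax model this is W.T @ (p - onehot(y)) in closed form.
onehot = np.eye(2)[y]
g = W.T @ (p - onehot)

# Adversarial perturbation: a step of norm eps in the gradient direction,
# i.e. the direction that locally increases the loss the most.
r_adv = eps * g / np.linalg.norm(g)
adv_loss = xent(softmax(W @ (x + r_adv)), y)
# adv_loss >= clean_loss: the logits are linear in x, so the loss is
# convex in x and a step along the gradient cannot decrease it
```

Training then minimizes the loss at the perturbed embeddings alongside the clean loss, which is what distinguishes this supervised setting from the VAT run above: here the perturbation direction depends on the label.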

Authors

We thank Takeru Miyato (@takerum), who suggested that we reproduce the results of [Miyato et al., 2017].

Reference

[Miyato et al., 2017]: Takeru Miyato, Andrew M. Dai and Ian Goodfellow.
Adversarial Training Methods for Semi-Supervised Text Classification.
International Conference on Learning Representations (ICLR), 2017.