
bzantium / bert-AAD

Licence: other
Adversarial Adaptation with Distillation for BERT Unsupervised Domain Adaptation

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to bert-AAD

gpl
Powerful unsupervised domain adaptation method for dense retrieval. Requires only unlabeled corpus and yields massive improvement: "GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval" https://arxiv.org/abs/2112.07577
Stars: ✭ 216 (+700%)
Mutual labels:  bert, domain-adaptation
chainer-ADDA
Adversarial Discriminative Domain Adaptation in Chainer
Stars: ✭ 24 (-11.11%)
Mutual labels:  adda, domain-adaptation
PDN
The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
Stars: ✭ 44 (+62.96%)
Mutual labels:  bert
transfer-learning-algorithms
Implementation of many transfer learning algorithms in Python with Jupyter notebooks
Stars: ✭ 42 (+55.56%)
Mutual labels:  domain-adaptation
SentimentAnalysis
(BOW, TF-IDF, Word2Vec, BERT) Word Embeddings + (SVM, Naive Bayes, Decision Tree, Random Forest) Base Classifiers + Pre-trained BERT on Tensorflow Hub + 1-D CNN and Bi-Directional LSTM on IMDB Movie Reviews Dataset
Stars: ✭ 40 (+48.15%)
Mutual labels:  bert
MRC Competition Dureader
Machine reading comprehension: champion/runner-up competition code and Chinese pre-trained MRC models
Stars: ✭ 552 (+1944.44%)
Mutual labels:  bert
muse-as-service
REST API for sentence tokenization and embedding using Multilingual Universal Sentence Encoder.
Stars: ✭ 45 (+66.67%)
Mutual labels:  bert
ai web RISKOUT BTS
National defense risk management platform (🏅 Minister of National Defense Award)
Stars: ✭ 18 (-33.33%)
Mutual labels:  bert
DRCN
PyTorch implementation of Deep Reconstruction Classification Networks
Stars: ✭ 31 (+14.81%)
Mutual labels:  domain-adaptation
ganslate
Simple and extensible GAN image-to-image translation framework. Supports natural and medical images.
Stars: ✭ 17 (-37.04%)
Mutual labels:  domain-adaptation
Self-Supervised-Embedding-Fusion-Transformer
The code for our IEEE ACCESS (2020) paper Multimodal Emotion Recognition with Transformer-Based Self Supervised Feature Fusion.
Stars: ✭ 57 (+111.11%)
Mutual labels:  bert
mmrazor
OpenMMLab Model Compression Toolbox and Benchmark.
Stars: ✭ 644 (+2285.19%)
Mutual labels:  knowledge-distillation
contextualSpellCheck
✔️Contextual word checker for better suggestions
Stars: ✭ 274 (+914.81%)
Mutual labels:  bert
DeepNER
An Easy-to-use, Modular and Prolongable package of deep-learning based Named Entity Recognition Models.
Stars: ✭ 9 (-66.67%)
Mutual labels:  bert
Xpersona
XPersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (+100%)
Mutual labels:  bert
game-feature-learning
Code for paper "Cross-Domain Self-supervised Multi-task Feature Learning using Synthetic Imagery", Ren et al., CVPR'18
Stars: ✭ 68 (+151.85%)
Mutual labels:  domain-adaptation
parsbert-ner
🤗 ParsBERT Persian NER Tasks
Stars: ✭ 15 (-44.44%)
Mutual labels:  bert
Tianchi2020ChineseMedicineQuestionGeneration
2020 Alibaba Cloud Tianchi Big Data Competition: Traditional Chinese Medicine Literature Question Generation Challenge
Stars: ✭ 20 (-25.93%)
Mutual labels:  bert
BIFI
[ICML 2021] Break-It-Fix-It: Unsupervised Learning for Program Repair
Stars: ✭ 74 (+174.07%)
Mutual labels:  domain-adaptation
GoEmotions-pytorch
PyTorch implementation of GoEmotions 😍😢😱
Stars: ✭ 95 (+251.85%)
Mutual labels:  bert

Knowledge Distillation for BERT Unsupervised Domain Adaptation

Official PyTorch implementation | Paper

Abstract

A pre-trained language model, BERT, has brought significant performance improvements across a range of natural language processing tasks. Since the model is trained on a large corpus of diverse topics, it shows robust performance for domain shift problems in which data distributions at training (source data) and testing (target data) differ while sharing similarities. Despite its great improvements compared to previous models, it still suffers from performance degradation due to domain shifts. To mitigate such problems, we propose a simple but effective unsupervised domain adaptation method, adversarial adaptation with distillation (AAD), which combines the adversarial discriminative domain adaptation (ADDA) framework with knowledge distillation. We evaluate our approach in the task of cross-domain sentiment classification on 30 domain pairs, advancing the state-of-the-art performance for unsupervised domain adaptation in text sentiment classification.
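
The abstract combines two ingredients: an ADDA-style adversarial objective that aligns target features with source features, and a knowledge-distillation objective that keeps the adapting target model close to the frozen source (teacher) model. The snippet below is a minimal sketch of those two losses in PyTorch, written against generic encoder outputs and a hypothetical discriminator rather than this repository's actual modules; the function names and the temperature value are illustrative assumptions.

import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Soft-label KL divergence between the frozen source model (teacher)
    # and the adapting target model (student); the temperature value is
    # an assumption, not taken from the paper.
    t = temperature
    return F.kl_div(
        F.log_softmax(student_logits / t, dim=-1),
        F.softmax(teacher_logits / t, dim=-1),
        reduction="batchmean",
    ) * (t * t)


def adversarial_losses(discriminator, src_features, tgt_features):
    # ADDA-style objectives: the discriminator learns to tell source
    # features (label 1) from target features (label 0), while the target
    # encoder is updated so that the discriminator mistakes its features
    # for source ones.
    d_src = discriminator(src_features.detach())
    d_tgt = discriminator(tgt_features.detach())
    disc_loss = (
        F.binary_cross_entropy_with_logits(d_src, torch.ones_like(d_src))
        + F.binary_cross_entropy_with_logits(d_tgt, torch.zeros_like(d_tgt))
    )
    gen_loss = F.binary_cross_entropy_with_logits(
        discriminator(tgt_features), torch.ones_like(d_tgt)
    )
    return disc_loss, gen_loss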

Requirements

  • pandas
  • pytorch
  • transformers
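
These can be installed with pip (a minimal, unpinned example; note that PyTorch is published on PyPI as torch):

$ pip install pandas torch transformers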

Run the test

$ python main.py --pretrain --adapt --src books --tgt dvd
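
Here --pretrain presumably fine-tunes the BERT classifier on the labeled source domain, --adapt then runs the adversarial adaptation with distillation stage, and --src/--tgt name the source and target review domains; swapping them (for example, --src dvd --tgt books) should evaluate the reverse direction.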

How to cite

@article{ryu2020knowledge,
  title={Knowledge Distillation for BERT Unsupervised Domain Adaptation},
  author={Ryu, Minho and Lee, Kichun},
  journal={arXiv preprint arXiv:2010.11478},
  year={2020}
}