
XiaoYee / ACAN

License: MIT
Code for NAACL 2019 paper: Adversarial Category Alignment Network for Cross-domain Sentiment Classification

Programming Languages

Python

Projects that are alternatives of or similar to ACAN

DA-RetinaNet
Official Detectron2 implementation of DA-RetinaNet of our Image and Vision Computing 2021 work 'An unsupervised domain adaptation scheme for single-stage artwork recognition in cultural sites'
Stars: ✭ 31 (+34.78%)
Mutual labels:  domain-adaptation, unsupervised-domain-adaptation
CADA
Attending to Discriminative Certainty for Domain Adaptation
Stars: ✭ 17 (-26.09%)
Mutual labels:  domain-adaptation, unsupervised-domain-adaptation
domain-adaptation-capls
Unsupervised Domain Adaptation via Structured Prediction Based Selective Pseudo-Labeling
Stars: ✭ 43 (+86.96%)
Mutual labels:  domain-adaptation, unsupervised-domain-adaptation
SFA
Official Implementation of "Exploring Sequence Feature Alignment for Domain Adaptive Detection Transformers"
Stars: ✭ 79 (+243.48%)
Mutual labels:  domain-adaptation
DAS
Code and datasets for EMNLP2018 paper ‘‘Adaptive Semi-supervised Learning for Cross-domain Sentiment Classification’’.
Stars: ✭ 48 (+108.7%)
Mutual labels:  domain-adaptation
german-sentiment
A dataset and model for German sentiment classification.
Stars: ✭ 37 (+60.87%)
Mutual labels:  sentiment-classification
arabic-sentiment-analysis
Sentiment Analysis in Arabic tweets
Stars: ✭ 64 (+178.26%)
Mutual labels:  sentiment-classification
multichannel-semseg-with-uda
Multichannel Semantic Segmentation with Unsupervised Domain Adaptation
Stars: ✭ 19 (-17.39%)
Mutual labels:  domain-adaptation
CAC-UNet-DigestPath2019
1st place in the MICCAI DigestPath2019 challenge (https://digestpath2019.grand-challenge.org/Home/) on the colonoscopy tissue segmentation and classification task. (MICCAI 2019) https://teacher.bupt.edu.cn/zhuchuang/en/index.htm
Stars: ✭ 83 (+260.87%)
Mutual labels:  domain-adaptation
DCAN
[AAAI 2020] Code release for "Domain Conditioned Adaptation Network" https://arxiv.org/abs/2005.06717
Stars: ✭ 27 (+17.39%)
Mutual labels:  domain-adaptation
AdaptationSeg
Curriculum Domain Adaptation for Semantic Segmentation of Urban Scenes, ICCV 2017
Stars: ✭ 128 (+456.52%)
Mutual labels:  domain-adaptation
german-sentiment-lib
An easy-to-use Python package for deep learning-based German sentiment classification.
Stars: ✭ 33 (+43.48%)
Mutual labels:  sentiment-classification
banglabert
This repository contains the official release of the model "BanglaBERT" and associated downstream finetuning code and datasets introduced in the paper titled "BanglaBERT: Language Model Pretraining and Benchmarks for Low-Resource Language Understanding Evaluation in Bangla" accepted in Findings of the Annual Conference of the North American Chap…
Stars: ✭ 186 (+708.7%)
Mutual labels:  sentiment-classification
brand-sentiment-analysis
Scripts utilizing Heartex platform to build brand sentiment analysis from the news
Stars: ✭ 21 (-8.7%)
Mutual labels:  sentiment-classification
StockerBot
Twitter Bot to follow financial trends in publicly traded companies
Stars: ✭ 77 (+234.78%)
Mutual labels:  sentiment-classification
IAST-ECCV2020
IAST: Instance Adaptive Self-training for Unsupervised Domain Adaptation (ECCV 2020) https://teacher.bupt.edu.cn/zhuchuang/en/index.htm
Stars: ✭ 84 (+265.22%)
Mutual labels:  domain-adaptation
meta-learning-progress
Repository to track the progress in Meta-Learning (MtL), including the datasets and the current state-of-the-art for the most common MtL problems.
Stars: ✭ 26 (+13.04%)
Mutual labels:  domain-adaptation
Transformers-Domain-Adaptation
Adapt Transformer-based language models to new text domains
Stars: ✭ 67 (+191.3%)
Mutual labels:  domain-adaptation
BA3US
code for our ECCV 2020 paper "A Balanced and Uncertainty-aware Approach for Partial Domain Adaptation"
Stars: ✭ 31 (+34.78%)
Mutual labels:  domain-adaptation
Transferable-E2E-ABSA
Transferable End-to-End Aspect-based Sentiment Analysis with Selective Adversarial Learning (EMNLP'19)
Stars: ✭ 62 (+169.57%)
Mutual labels:  domain-adaptation

ACAN

Code for NAACL 2019 paper: "Adversarial Category Alignment Network for Cross-domain Sentiment Classification" (pdf)

Dataset & pretrained word embeddings

You can download the datasets (amazon-benchmark) at [Download]. Decompress the zip file into the root directory.

Download the pre-trained GloVe vectors [glove.840B.300d.zip]. Decompress the zip file and put the txt file in the root directory.
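
The training script loads the embeddings itself; the snippet below is only a hedged sketch (load_glove and the example vocabulary are illustrative, not part of this repository) of how the txt file can be sanity-checked once it is in the root directory:

import io
import numpy as np

def load_glove(path, vocab=None):
    # Read GloVe vectors into a {word: np.ndarray} dict; optionally keep only words in `vocab`.
    embeddings = {}
    with io.open(path, 'r', encoding='utf-8') as f:
        for line in f:
            parts = line.rstrip().split(' ')
            word = ' '.join(parts[:-300])  # a few GloVe tokens contain spaces
            if vocab is not None and word not in vocab:
                continue
            embeddings[word] = np.asarray(parts[-300:], dtype='float32')
    return embeddings

# Example: confirm a few sentiment words are covered by 300-dimensional vectors.
emb = load_glove('glove.840B.300d.txt', vocab={'good', 'bad', 'boring'})
print(len(emb), emb['good'].shape)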

Train & evaluation

Arguments and hyper-parameters, together with their default values, are defined in train_batch.py.

Under code/, use the following command for training any source-target pair from the amazon benchmark:

CUDA_VISIBLE_DEVICES="0" python train_batchs.py \
--emb ../glove.840B.300d.txt \
--dataset amazon \
--source $source \
--target $target \
--n-class 2  \
--lamda1 -0.1 --lamda2 0.1 --lamda3 5 --lamda4 1.5 \
--epochs 30 

where --emb is the path to the pre-trained word embeddings. $source and $target are domains from the amazon benchmark, each one of ['book', 'dvd', 'electronics', 'kitchen']. --n-class, the number of output classes, is set to 2 because only binary classification (positive or negative) is considered on this dataset. All other hyper-parameters are left at their default values.
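
To reproduce results on all twelve source-target pairs, the command can be wrapped in a small driver script such as the sketch below (a hypothetical helper, not part of this repository; it simply shells out to train_batchs.py with the hyper-parameters shown above):

import itertools
import os
import subprocess

DOMAINS = ['book', 'dvd', 'electronics', 'kitchen']
ENV = dict(os.environ, CUDA_VISIBLE_DEVICES='0')  # pin to GPU 0, as in the command above

for source, target in itertools.permutations(DOMAINS, 2):
    print('training pair: %s -> %s' % (source, target))
    subprocess.check_call(
        ['python', 'train_batchs.py',
         '--emb', '../glove.840B.300d.txt',
         '--dataset', 'amazon',
         '--source', source,
         '--target', target,
         '--n-class', '2',
         '--lamda1', '-0.1', '--lamda2', '0.1',
         '--lamda3', '5', '--lamda4', '1.5',
         '--epochs', '30'],
        env=ENV)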

Dependencies

The code was tested only in the environment below (a version-check sketch follows the list):

  • Python 2.7
  • Keras 2.1.2
  • tensorflow 1.4.1
  • numpy 1.13.3
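
A minimal sketch for confirming that the installed versions match the tested ones; it only prints versions, and newer versions may or may not work:

from __future__ import print_function
import sys
import keras
import numpy
import tensorflow

print('Python     :', sys.version.split()[0])   # tested: 2.7
print('Keras      :', keras.__version__)        # tested: 2.1.2
print('TensorFlow :', tensorflow.__version__)   # tested: 1.4.1
print('NumPy      :', numpy.__version__)        # tested: 1.13.3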

Cite

If you use the code, please cite the following paper:

@InProceedings{qu-etal-2019-adversarial,
  author    = {Qu, Xiaoye and Zou, Zhikang and Cheng, Yu and Yang, Yang and Zhou, Pan},
  title     = {Adversarial Category Alignment Network for Cross-domain Sentiment Classification},
  booktitle = {Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics},
  year      = {2019},
  publisher = {Association for Computational Linguistics}
}