
hsqmlzno1 / Hatn

License: MIT
Hierarchical Attention Transfer Network for Cross-domain Sentiment Classification (AAAI'18)

Programming Languages

python

Projects that are alternatives of or similar to Hatn

MGAN
Exploiting Coarse-to-Fine Task Transfer for Aspect-level Sentiment Classification (AAAI'19)
Stars: ✭ 44 (-39.73%)
Mutual labels:  attention, domain-adaptation
Deep Transfer Learning
Deep Transfer Learning Papers
Stars: ✭ 68 (-6.85%)
Mutual labels:  domain-adaptation
Biblosa Pytorch
Re-implementation of Bi-Directional Block Self-Attention for Fast and Memory-Efficient Sequence Modeling (T. Shen et al., ICLR 2018) on Pytorch.
Stars: ✭ 43 (-41.1%)
Mutual labels:  attention
Yolov4 Pytorch
This is a pytorch repository of YOLOv4, attentive YOLOv4 and mobilenet YOLOv4 with PASCAL VOC and COCO
Stars: ✭ 1,070 (+1365.75%)
Mutual labels:  attention
Gvb
Code of Gradually Vanishing Bridge for Adversarial Domain Adaptation (CVPR2020)
Stars: ✭ 52 (-28.77%)
Mutual labels:  domain-adaptation
Global Self Attention Network
A Pytorch implementation of Global Self-Attention Network, a fully-attention backbone for vision tasks
Stars: ✭ 64 (-12.33%)
Mutual labels:  attention
Self Supervised Da
self-supervised domain adaptation
Stars: ✭ 36 (-50.68%)
Mutual labels:  domain-adaptation
Man
Multinomial Adversarial Networks for Multi-Domain Text Classification (NAACL 2018)
Stars: ✭ 72 (-1.37%)
Mutual labels:  domain-adaptation
Cross Domain ner
Cross-domain NER using cross-domain language modeling, code for ACL 2019 paper
Stars: ✭ 67 (-8.22%)
Mutual labels:  domain-adaptation
Fluence
A deep learning library based on Pytorch focussed on low resource language research and robustness
Stars: ✭ 54 (-26.03%)
Mutual labels:  attention
Pointer Networks Experiments
Sorting numbers with pointer networks
Stars: ✭ 53 (-27.4%)
Mutual labels:  attention
Meta Learning Bert
Meta learning with BERT as a learner
Stars: ✭ 52 (-28.77%)
Mutual labels:  domain-adaptation
Deeplearning Nlp Models
A small, interpretable codebase containing the re-implementation of a few "deep" NLP models in PyTorch. Colab notebooks to run with GPUs. Models: word2vec, CNNs, transformer, gpt.
Stars: ✭ 64 (-12.33%)
Mutual labels:  attention
Sentences pair similarity calculation siamese lstm
A Keras Implementation of Attention_based Siamese Manhattan LSTM
Stars: ✭ 48 (-34.25%)
Mutual labels:  attention
Enjoy Hamburger
[ICLR 2021] Is Attention Better Than Matrix Decomposition?
Stars: ✭ 69 (-5.48%)
Mutual labels:  attention
Attentions
PyTorch implementation of some attentions for Deep Learning Researchers.
Stars: ✭ 39 (-46.58%)
Mutual labels:  attention
Text Classification Keras
📚 Text classification library with Keras
Stars: ✭ 53 (-27.4%)
Mutual labels:  attention
Attention Over Attention Tf Qa
Implementation of the AoA model from the paper "Attention-over-Attention Neural Networks for Reading Comprehension"
Stars: ✭ 58 (-20.55%)
Mutual labels:  attention
Absa Pytorch
Aspect Based Sentiment Analysis, PyTorch implementations.
Stars: ✭ 1,181 (+1517.81%)
Mutual labels:  attention
Libtlda
Library of transfer learners and domain-adaptive classifiers.
Stars: ✭ 71 (-2.74%)
Mutual labels:  domain-adaptation

HATN

Data and source code for our AAAI'18 paper "Hierarchical Attention Transfer Network for Cross-domain Sentiment Classification", which is an extension of our IJCAI'17 paper "End-to-End Adversarial Memory Network for Cross-domain Sentiment Classification".

Demo Video

Click the picture to watch a demo of the attention-transfer visualization.

Requirements

  • Python 2.7.5

  • Tensorflow-gpu 1.2.1

  • numpy 1.13.3

  • nltk 3.2.1

  • Google Word2Vec
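
For the last requirement, the pre-trained Google News word2vec binary is commonly loaded with gensim. The sketch below only illustrates that step under those assumptions (gensim as the loader, GoogleNews-vectors-negative300.bin as the file name); it is not necessarily how this repository reads the embeddings.

# Hedged sketch: load pre-trained Google word2vec vectors with gensim.
# gensim and the file name are assumptions, not part of this repository.
from gensim.models import KeyedVectors

w2v = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True)

print(w2v.vector_size)        # 300-dimensional embeddings
print(w2v["excellent"][:5])   # vector of a typical positive pivot word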

Environment

  • OS: CentOS Linux release 7.5.1804
  • GPU: NVIDIA TITAN Xp
  • CUDA: 8.0

Tips for the Amazon small setting

  1. Use a smaller batch size, e.g. bs=20.
  2. Remove the hierarchical position embeddings, which are data-driven.
  3. Apply regularization such as dropout to avoid overfitting (a minimal sketch follows).
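
A minimal TensorFlow 1.x illustration of tip 3 only; the tensor name and keep probability below are hypothetical placeholders, not identifiers from this repository.

# Illustrative dropout on a hidden representation (TF 1.x style).
# "sentence_repr" and keep_prob=0.5 are placeholders, not repo identifiers.
import tensorflow as tf

keep_prob = tf.placeholder(tf.float32, name="keep_prob")
sentence_repr = tf.random_normal([20, 300])        # e.g. a batch of 20 (bs=20)
dropped = tf.nn.dropout(sentence_repr, keep_prob)  # applied only during training

with tf.Session() as sess:
    sess.run(dropped, feed_dict={keep_prob: 0.5})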

Running

Individual attention learning:

The goal is to automatically capture positive and negative pivots that act as a bridge across domains, based on P-net; these pivots provide the inputs and labels for NP-net. If the pivots have already been extracted, you can skip this step.

python extract_pivots.py --train --test -s dvd [source_domain] -t electronics [target_domain] -v [verbose]
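
Conceptually, the pivots are the words that receive the highest P-net attention on labeled source-domain data. The NumPy sketch below only illustrates that top-k idea with made-up values; it is not the extraction logic implemented in extract_pivots.py.

# Conceptual top-k pivot selection from attention weights (illustrative data only).
import numpy as np

words = ["great", "battery", "boring", "excellent", "plot"]
attention = np.array([0.35, 0.05, 0.25, 0.30, 0.05])  # made-up P-net attention weights

k = 3
top_idx = np.argsort(-attention)[:k]
print([words[i] for i in top_idx])  # pivot candidates, e.g. ['great', 'excellent', 'boring']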

Joint attention learning:

P-net and NP-net are jointly trained for cross-domain sentiment classification. When the domain discrepancy is large, joint training demonstrates the efficacy of NP-net.

python train_hatn.py --train --test -s dvd [source_domain] -t electronics [target_domain] -v [verbose]
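
As a conceptual sketch of what joint training means here, the two networks' losses can be summed into a single objective and minimized by one optimizer. The shapes, names, and equal weighting below are illustrative assumptions, not the objective actually implemented in train_hatn.py.

# Illustrative only: "joint training" = minimizing the sum of both networks'
# losses with one optimizer. All names and data are hypothetical placeholders.
import numpy as np
import tensorflow as tf

features = tf.constant(np.random.randn(20, 300).astype(np.float32))  # dummy doc features
labels = tf.constant(np.random.randint(0, 2, size=20))               # dummy sentiment labels

def head_loss(scope):
    # tiny softmax head standing in for the P-net / NP-net classifiers
    with tf.variable_scope(scope):
        logits = tf.layers.dense(features, 2)
    return tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits))

joint_loss = head_loss("p_net") + head_loss("np_net")
train_op = tf.train.AdamOptimizer(1e-4).minimize(joint_loss)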

Training over all transfer pairs:

./all_train.sh
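
all_train.sh is not reproduced here, but given the five Amazon domains in the results table it presumably loops over every ordered source-target pair. A hypothetical Python equivalent, assuming the train_hatn.py interface documented above, might look like this:

# Hypothetical driver mirroring what all_train.sh presumably does:
# run train_hatn.py on all 20 ordered (source, target) domain pairs.
import itertools
import subprocess

domains = ["books", "dvd", "electronics", "kitchen", "video"]

for src, tgt in itertools.permutations(domains, 2):
    subprocess.check_call(
        ["python", "train_hatn.py", "--train", "--test", "-s", src, "-t", tgt])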

Results

The results are obtained with random seed 0 in this implementation.

Task P-net HATN_h (full model)
books-dvd 0.8722 0.8770
books-electronics 0.8388 0.8620
books-kitchen 0.8518 0.8708
books-video 0.8728 0.8735
dvd-books 0.8783 0.8802
dvd-electronics 0.8393 0.8678
dvd-kitchen 0.8467 0.8700
dvd-video 0.8822 0.8897
electronics-books 0.8328 0.8362
electronics-dvd 0.8340 0.8387
electronics-kitchen 0.9010 0.9012
electronics-video 0.8352 0.8345
kitchen-books 0.8398 0.8483
kitchen-dvd 0.8357 0.8473
kitchen-electronics 0.8807 0.8908
kitchen-video 0.8370 0.8403
video-books 0.8682 0.8748
video-dvd 0.8737 0.8760
video-electronics 0.8347 0.8585
video-kitchen 0.8463 0.8602
Average 0.8551 0.8649

Citation

If the data and code are useful for your research, please kindly give us a star and cite our papers as follows:

@inproceedings{li2018hatn,
  author    = {Li, Zheng and Wei, Ying and Zhang, Yu and Yang, Qiang},
  title     = {Hierarchical Attention Transfer Network for Cross-Domain Sentiment Classification},
  booktitle = {Proceedings of the AAAI Conference on Artificial Intelligence (AAAI 2018)},
  year      = {2018}
}
@inproceedings{li2017end,
  author    = {Li, Zheng and Zhang, Yu and Wei, Ying and Wu, Yuxiang and Yang, Qiang},
  title     = {End-to-End Adversarial Memory Network for Cross-Domain Sentiment Classification},
  booktitle = {Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI 2017)},
  year      = {2017}
}