
hsqmlzno1 / MGAN

License: MIT
Exploiting Coarse-to-Fine Task Transfer for Aspect-level Sentiment Classification (AAAI'19)


MGAN

Data for our AAAI'19 oral paper "Exploiting Coarse-to-Fine Task Transfer for Aspect-level Sentiment Classification".

2019/02/24 Update: The data has now been released!

Descriptions

This dataset serves as a highly beneficial source domain for improving the learning of the more fine-grained aspect-term-level (AT) sentiment analysis task. The dataset has three characteristics:

Large-scale: 100k instances per domain

Multi-domain: Restaurant, Hotel, Beautyspa

Aspect-category (AC): Coarse-grained aspects

Even with a simple attention-based model for the AT task, our method achieves state-of-the-art performance by leveraging the knowledge distilled from the AC task.

Data format

Each instance follows the format below:

inst1: ID1/sentence1/aspect1/label1

inst2: ID1/sentence1/aspect2/label2

inst3: ID1/sentence1/aspect3/label3

....

Instances derived from the same sentence (i.e., a sentence containing multiple aspects) are grouped together.

Example

0H0FwmPY78v_5u51r2TQrw i did n't dislike the food , but the menu is n't exactly cohesive ... pizza and asian cuisine . FOOD_SELECTION -1

0H0FwmPY78v_5u51r2TQrw i did n't dislike the food , but the menu is n't exactly cohesive ... pizza and asian cuisine . FOOD_FOOD_DISH -1

0H0FwmPY78v_5u51r2TQrw i did n't dislike the food , but the menu is n't exactly cohesive ... pizza and asian cuisine . RESTAURANT_CUSINE -1

0H0FwmPY78v_5u51r2TQrw i did n't dislike the food , but the menu is n't exactly cohesive ... pizza and asian cuisine . FOOD_FOOD 1

Label Mapping

positive: 1 neutral: 0 negative: -1
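Given the format and label mapping above, a single instance can be parsed with a few lines of Python. This is a minimal sketch, assuming the fields are whitespace-separated with the ID first, the label last, and the aspect category second-to-last, as in the example above; the function name `parse_instance` is our own, not part of any released code.

```python
# Maps the numeric labels in the data to human-readable names.
LABEL_NAMES = {"1": "positive", "0": "neutral", "-1": "negative"}

def parse_instance(line):
    """Split one data line into (id, sentence, aspect, label)."""
    tokens = line.strip().split()
    inst_id, aspect, label = tokens[0], tokens[-2], tokens[-1]
    # Everything between the ID and the aspect is the tokenized sentence.
    sentence = " ".join(tokens[1:-2])
    return inst_id, sentence, aspect, LABEL_NAMES[label]

line = ("0H0FwmPY78v_5u51r2TQrw i did n't dislike the food , but the menu "
        "is n't exactly cohesive ... pizza and asian cuisine . FOOD_FOOD 1")
print(parse_instance(line)[2:])  # ('FOOD_FOOD', 'positive')
```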

Citation

If the data is useful for your research, please kindly star this repository and cite our paper as follows:

@inproceedings{li2019exploiting,
  title={Exploiting Coarse-to-Fine Task Transfer for Aspect-level Sentiment Classification},
  author={Li, Zheng and Wei, Ying and Zhang, Yu and Zhang, Xiang and Li, Xin and Yang, Qiang},
  booktitle={AAAI Conference on Artificial Intelligence},
  year={2019}
}