
BDBC-KG-NLP / Covolution_over_Dependency_Tree_EMNLP2019

Here is the code for the paper "Aspect-Level Sentiment Analysis via Convolution over Dependency Tree", accepted at EMNLP 2019.



Aspect-Level Sentiment Analysis Via Convolution over Dependency Tree

Dataset and code for the paper: Aspect-Level Sentiment Analysis Via Convolution over Dependency Tree. Kai Sun, Richong Zhang, Samuel Mensah, Yongyi Mao, Xudong Liu. EMNLP 2019. [pdf]

Overview

A dependency tree shortens the distance between the aspect and opinion words of a sentence, allowing neural models to capture long-range syntactic dependencies more easily. Moreover, dependency trees have graph-like structures, which makes them natural inputs for graph convolutional networks (GCNs). GCNs have been successful at learning node representations that capture each node's local position in the graph. These observations motivate us to develop a neural model that operates on the dependency tree of a sentence, with the aim of making accurate sentiment predictions with respect to specific aspects. Specifically, a BiLSTM is first employed to capture the contextual information of each word. The BiLSTM embeddings are then enhanced by a multi-layer GCN over the dependency tree. Finally, the aspect-specific embeddings of the last GCN layer are extracted and used for classification.

(Figure: model overview)
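The pipeline above (BiLSTM features, GCN over the dependency tree, aspect-specific extraction) can be sketched in PyTorch. This is a minimal illustration, not the repository's actual model code; the layer sizes, toy dependency edges, and aspect span are made up for the example.

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One graph-convolution step over a dependency-tree adjacency matrix:
    mean-aggregate neighbor features, then apply a linear map and ReLU."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, adj, h):
        # adj: (batch, n, n) adjacency with self-loops; h: (batch, n, in_dim)
        degree = adj.sum(dim=2, keepdim=True).clamp(min=1)   # node degrees
        return torch.relu(self.linear(adj.bmm(h) / degree))  # mean aggregation

torch.manual_seed(0)
n, hidden = 5, 8                       # toy 5-word sentence
h = torch.randn(1, n, hidden)          # stand-in for BiLSTM outputs
adj = torch.eye(n).unsqueeze(0)        # self-loops
for i, j in [(0, 1), (1, 2), (2, 3), (2, 4)]:  # toy dependency edges (undirected)
    adj[0, i, j] = adj[0, j, i] = 1.0

gcn = GCNLayer(hidden, hidden)
out = gcn(adj, h)                      # (1, 5, 8) syntax-enhanced embeddings

# suppose tokens 3..4 form the aspect term: average their GCN states
mask = torch.zeros(1, n, 1)
mask[0, 3:5] = 1.0
aspect_repr = (out * mask).sum(dim=1) / mask.sum(dim=1)  # (1, 8), fed to a classifier
print(out.shape, aspect_repr.shape)
```

A real model would stack several such layers and feed `aspect_repr` through a softmax classifier over the sentiment labels.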

Requirement

  • Python 3.6.7

  • PyTorch 1.2.0

  • NumPy 1.17.2

  • GloVe pre-trained word vectors

  • Reported results in the paper were obtained under a fixed random seed, so results may vary across GPU devices or random seeds. To reproduce the reported results, try training the model several times under different random seeds, e.g. from 0 to 50. The trained models are available at https://drive.google.com/file/d/1ijAnzl1pHtSimRsxBEVoArVg4iJw18zg/view?usp=sharing

  • Note that some sentences in the Rest16 dataset have no aspect and are thus in fact sentence-level; most recent work on ABSA removes these sentences from training and evaluation.
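Seed control as the note above describes can be sketched as follows. `set_seed` is a hypothetical helper shown for illustration, not part of the repository's CLI:

```python
import random
import numpy as np
import torch

def set_seed(seed):
    """Fix all relevant RNGs so a run is repeatable on the same hardware.
    (Hypothetical helper for illustration, not a function from train.py.)"""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)  # safe no-op without a GPU

# sweep a few seeds, as the note suggests; each call makes that run repeatable
for seed in range(3):
    set_seed(seed)
    print(seed, torch.rand(1).item())  # stand-in for launching a training run
```

Even with fixed seeds, exact numbers can still differ across PyTorch versions and GPU models, which is why the note recommends trying a range of seeds.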

Usage

Prepare vocabulary files for the dataset:

python prepare_vocab.py --dataset [dataset]

Train the model:

python train.py --dataset [dataset]

Evaluate a trained model:

python eval.py --model_dir [model_file path]

Citation

If this work is helpful, please cite as:

@inproceedings{Sun2019CDT,
  author    = {Kai Sun and
               Richong Zhang and
               Samuel Mensah and
               Yongyi Mao and
               Xudong Liu},
  title     = {Aspect-Level Sentiment Analysis Via Convolution over Dependency Tree},
  booktitle = {Proceedings of the 2019 Conference on Empirical Methods in Natural
               Language Processing and the 9th International Joint Conference on
               Natural Language Processing, {EMNLP-IJCNLP} 2019, Hong Kong, China,
               November 3-7, 2019},
  pages     = {5678--5687},
  year      = {2019}
}

License

MIT
