
Sshanu / Hierarchical-Word-Sense-Disambiguation-using-WordNet-Senses

License: MIT
Word sense disambiguation using word-specific models, all-words models, and hierarchical models in TensorFlow

Programming Languages

Jupyter Notebook

Projects that are alternatives to or similar to Hierarchical-Word-Sense-Disambiguation-using-WordNet-Senses

datastories-semeval2017-task6
Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-39.39%)
Mutual labels:  lstm, attention, attention-mechanism
Image Caption Generator
A neural network to generate captions for an image using CNN and RNN with BEAM Search.
Stars: ✭ 126 (+281.82%)
Mutual labels:  lstm, attention, attention-mechanism
automatic-personality-prediction
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Stars: ✭ 43 (+30.3%)
Mutual labels:  lstm, attention, attention-mechanism
Multimodal Sentiment Analysis
Attention-based multimodal fusion for sentiment analysis
Stars: ✭ 172 (+421.21%)
Mutual labels:  lstm, attention, attention-mechanism
ntua-slp-semeval2018
Deep-learning models of NTUA-SLP team submitted in SemEval 2018 tasks 1, 2 and 3.
Stars: ✭ 79 (+139.39%)
Mutual labels:  lstm, attention, attention-mechanism
Datastories Semeval2017 Task4
Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (+457.58%)
Mutual labels:  lstm, attention, attention-mechanism
Chinese semantic role labeling
Chinese semantic role labeling based on Bi-LSTM and CRF
Stars: ✭ 60 (+81.82%)
Mutual labels:  crf, lstm
Nlp Journey
Documents, papers, and code related to Natural Language Processing, including topic models, word embeddings, named entity recognition, text classification, text generation, text similarity, machine translation, etc. All code is implemented in TensorFlow 2.0.
Stars: ✭ 1,290 (+3809.09%)
Mutual labels:  crf, attention
Daguan 2019 rank9
Datagrand 2019 information extraction competition, rank 9
Stars: ✭ 121 (+266.67%)
Mutual labels:  crf, lstm
lstm-attention
Attention-based bidirectional LSTM for Classification Task (ICASSP)
Stars: ✭ 87 (+163.64%)
Mutual labels:  attention, attention-mechanism
Rnnsharp
RNNSharp is a toolkit of deep recurrent neural networks that is widely used for many different kinds of tasks, such as sequence labeling, sequence-to-sequence learning, and so on. It is written in C# and based on .NET Framework 4.6 or above. RNNSharp supports many different types of networks, such as forward and bidirectional networks and sequence-to-sequence networks, and different types of layers, such as LSTM, softmax, sampled softmax, and others.
Stars: ✭ 277 (+739.39%)
Mutual labels:  crf, lstm
Multilstm
Keras attentional bi-LSTM-CRF for joint NLU (slot filling and intent detection) with ATIS
Stars: ✭ 122 (+269.7%)
Mutual labels:  crf, lstm
korean ner tagging challenge
KU_NERDY, Dongyub Lee and Heuiseok Lim (gold prize at the 2017 Korean-language information processing system competition) - Conference on Hangul and Korean Information Processing
Stars: ✭ 30 (-9.09%)
Mutual labels:  crf, lstm
Tf Lstm Crf Batch
Tensorflow-LSTM-CRF tool for named entity recognition
Stars: ✭ 59 (+78.79%)
Mutual labels:  crf, lstm
Ner blstm Crf
LSTM-CRF for NER with the CoNLL-2002 dataset
Stars: ✭ 51 (+54.55%)
Mutual labels:  crf, lstm
End To End Sequence Labeling Via Bi Directional Lstm Cnns Crf Tutorial
Tutorial for End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF
Stars: ✭ 87 (+163.64%)
Mutual labels:  crf, lstm
Ner Lstm Crf
An easy-to-use named entity recognition (NER) toolkit implementing the Bi-LSTM+CRF model in TensorFlow.
Stars: ✭ 337 (+921.21%)
Mutual labels:  crf, lstm
Ncrfpp
NCRF++, a neural sequence labeling toolkit. Easy to use for any sequence labeling task (e.g. NER, POS tagging, segmentation). It includes character LSTM/CNN, word LSTM/CNN, and softmax/CRF components.
Stars: ✭ 1,767 (+5254.55%)
Mutual labels:  crf, lstm
NLP-paper
🎨 NLP (Natural Language Processing) tutorial 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-30.3%)
Mutual labels:  crf, attention-mechanism
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+266.67%)
Mutual labels:  attention, attention-mechanism

Word Sense Disambiguation

Word sense disambiguation (WSD) is the task of identifying the meaning of a word in context. We address this problem with a series of end-to-end neural architectures based on bidirectional Long Short-Term Memory (LSTM) networks, and propose two variants for WSD: word-specific neural models and an all-words neural model. The word-specific approach requires training a separate model for every disambiguation target word; the all-words model avoids this by relying on sequence learning. We also used POS tags to improve performance and tried different attention mechanisms for the all-words model. Performance was further boosted by convolutional neural networks (CNNs), which capture local features around each word, much as humans use nearby context to predict a sense. Finally, we improved performance with hierarchical models that use the POS tag as the hierarchy, in two variants: soft masking and hard masking.
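
A minimal TensorFlow/Keras sketch of the general shape of such an all-words model is shown below: a bidirectional LSTM encoder over the sentence, an attention layer over the hidden states, per-token sense logits, and a POS-derived mask applied either hard (incompatible senses removed) or soft (incompatible senses only down-weighted). This is not the repository's code; the vocabulary sizes, layer sizes, and the exact form of the soft mask are illustrative assumptions.

import tensorflow as tf

VOCAB_SIZE = 30000   # placeholder word-vocabulary size
NUM_SENSES = 5000    # placeholder WordNet sense-inventory size
EMB_DIM = 300
HIDDEN = 256

# Inputs: word ids and a precomputed POS-to-sense compatibility mask
word_ids = tf.keras.Input(shape=(None,), dtype=tf.int32, name="word_ids")
pos_mask = tf.keras.Input(shape=(None, NUM_SENSES), name="pos_sense_mask")

# Bidirectional LSTM encoder over the sentence
emb = tf.keras.layers.Embedding(VOCAB_SIZE, EMB_DIM, mask_zero=True)(word_ids)
states = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(HIDDEN, return_sequences=True))(emb)

# Attention over the hidden states; each token is augmented with its
# attention-weighted context vector
context = tf.keras.layers.Attention()([states, states])
features = tf.keras.layers.Concatenate()([states, context])

# Per-token scores over the whole sense inventory
sense_logits = tf.keras.layers.Dense(NUM_SENSES)(features)

# Hierarchical masking by POS tag:
#  - hard masking removes senses incompatible with the token's POS tag
#  - soft masking only down-weights them, so some probability mass remains
hard_masked = tf.keras.layers.Lambda(
    lambda t: t[0] + (1.0 - t[1]) * -1e9, name="hard_pos_mask")(
    [sense_logits, pos_mask])
soft_masked = tf.keras.layers.Lambda(
    lambda t: t[0] + tf.math.log(t[1] * 0.9 + 0.1), name="soft_pos_mask")(
    [sense_logits, pos_mask])

# Swap hard_masked for soft_masked to get the soft-masking variant
sense_probs = tf.keras.layers.Softmax(name="sense_probs")(hard_masked)

model = tf.keras.Model(inputs=[word_ids, pos_mask], outputs=sense_probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()

The hard mask encodes the constraint that, for example, a token tagged as a verb can only receive verb senses, while the soft mask keeps a small amount of probability on incompatible senses so that POS-tagging errors are not fatal.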

Methods

Best Models

Details

For detailed information about the models and results, see:

All-words Models

All-words Hierarchical Model+Soft Masking

All-words Hierarchical Model+Hard Masking

Basic Model

Basic Model+Local Attention

Basic Model+Local Attention+Hidden States

Basic Model+Local Attention+Hidden States+CRF

Basic Model+Gated Attention

Basic Model+CNN

Word Specific Models

Basic Model

Files named Model-1-multigpu-1.ipynb are the basic models.

Basic Model+POS Tags

Files named Model-1-multigpu-2.ipynb are the basic models with POS tags.

Basic Model+POS Tags+CRF

Files named Model-1-multigpu-3.ipynb are the basic models with POS tags and CRF.

Word-specific Hierarchical Model

Files named Model-1-multigpu-4.ipynb are the word-specific hierarchical models.
