nn116003 / Self Attention Classification

document classification using LSTM + self attention

Programming Languages

python

Projects that are alternatives to or similar to Self Attention Classification

iPerceive
Applying Common-Sense Reasoning to Multi-Modal Dense Video Captioning and Video Question Answering | Python3 | PyTorch | CNNs | Causality | Reasoning | LSTMs | Transformers | Multi-Head Self Attention | Published in IEEE Winter Conference on Applications of Computer Vision (WACV) 2021
Stars: ✭ 52 (-38.1%)
Mutual labels:  lstm, attention
dhs summit 2019 image captioning
Image captioning using attention models
Stars: ✭ 34 (-59.52%)
Mutual labels:  lstm, attention
automatic-personality-prediction
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Stars: ✭ 43 (-48.81%)
Mutual labels:  lstm, attention
Machine Learning
My Attempt(s) In The World Of ML/DL....
Stars: ✭ 78 (-7.14%)
Mutual labels:  lstm, attention
Banglatranslator
Bangla Machine Translator
Stars: ✭ 21 (-75%)
Mutual labels:  lstm, attention
Hierarchical-Word-Sense-Disambiguation-using-WordNet-Senses
Word Sense Disambiguation using Word Specific models, All word models and Hierarchical models in Tensorflow
Stars: ✭ 33 (-60.71%)
Mutual labels:  lstm, attention
Base-On-Relation-Method-Extract-News-DA-RNN-Model-For-Stock-Prediction--Pytorch
A dual-stage attention mechanism model for stock prediction based on a relational news-extraction method
Stars: ✭ 33 (-60.71%)
Mutual labels:  lstm, attention
Datastories Semeval2017 Task4
Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (+119.05%)
Mutual labels:  lstm, attention
Text Classification
Implementation of papers for text classification task on DBpedia
Stars: ✭ 682 (+711.9%)
Mutual labels:  lstm, attention
Vad
Voice activity detection (VAD) toolkit including DNN, bDNN, LSTM and ACAM based VAD. We also provide our directly recorded dataset.
Stars: ✭ 622 (+640.48%)
Mutual labels:  lstm, attention
learningspoons
NLP lecture notes and source code
Stars: ✭ 29 (-65.48%)
Mutual labels:  lstm, attention
Text Classification Keras
📚 Text classification library with Keras
Stars: ✭ 53 (-36.9%)
Mutual labels:  lstm, attention
EBIM-NLI
Enhanced BiLSTM Inference Model for Natural Language Inference
Stars: ✭ 24 (-71.43%)
Mutual labels:  lstm, attention
datastories-semeval2017-task6
Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-76.19%)
Mutual labels:  lstm, attention
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+3969.05%)
Mutual labels:  lstm, attention
ntua-slp-semeval2018
Deep-learning models of NTUA-SLP team submitted in SemEval 2018 tasks 1, 2 and 3.
Stars: ✭ 79 (-5.95%)
Mutual labels:  lstm, attention
Rnn For Joint Nlu
Pytorch implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454)
Stars: ✭ 176 (+109.52%)
Mutual labels:  lstm, attention
Deep Time Series Prediction
Seq2Seq, Bert, Transformer, WaveNet for time series prediction.
Stars: ✭ 183 (+117.86%)
Mutual labels:  lstm, attention
Crnn attention ocr chinese
CRNN with attention for OCR, with added Chinese character recognition
Stars: ✭ 315 (+275%)
Mutual labels:  lstm, attention
Time Attention
Implementation of RNN for Time Series prediction from the paper https://arxiv.org/abs/1704.02971
Stars: ✭ 52 (-38.1%)
Mutual labels:  lstm, attention

Document classification using LSTM + self attention

PyTorch implementation of LSTM text classification with self-attention. See "A Structured Self-Attentive Sentence Embedding" (Lin et al., ICLR 2017).
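The core idea of the paper is to replace the final LSTM state with an attention-weighted combination of all hidden states: A = softmax(W_s2 · tanh(W_s1 · Hᵀ)), M = A · H. A minimal sketch of that encoder is below — module and parameter names are hypothetical, not the repo's actual code:

```python
import torch
import torch.nn as nn

class SelfAttentiveEncoder(nn.Module):
    """Structured self-attention sketch after Lin et al. (2017).
    Hypothetical names; hyperparameters are illustrative."""
    def __init__(self, hidden_dim, attn_dim=64, n_heads=1):
        super().__init__()
        self.ws1 = nn.Linear(hidden_dim, attn_dim, bias=False)  # W_s1
        self.ws2 = nn.Linear(attn_dim, n_heads, bias=False)     # W_s2

    def forward(self, H):
        # H: (batch, seq_len, hidden_dim) — LSTM outputs for every time step
        A = torch.softmax(self.ws2(torch.tanh(self.ws1(H))), dim=1)  # (batch, seq_len, n_heads)
        M = torch.bmm(A.transpose(1, 2), H)  # (batch, n_heads, hidden_dim) sentence embedding
        return M, A

enc = SelfAttentiveEncoder(hidden_dim=128)
H = torch.randn(2, 10, 128)
M, A = enc(H)
print(M.shape, A.shape)  # torch.Size([2, 1, 128]) torch.Size([2, 10, 1])
```

With n_heads > 1 the encoder extracts several attention "hops" per sentence, as in the paper; the attention matrix A is what gets visualized later.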

Some results -> my blog post

IMDB Experiments

training

python imdb_attn.py
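A training script like the one above typically wires the attention encoder into an embedding + LSTM + linear classifier and optimizes cross-entropy. A self-contained sketch of that wiring, with hypothetical names and toy hyperparameters (not the repo's actual imdb_attn.py):

```python
import torch
import torch.nn as nn

class AttnClassifier(nn.Module):
    """Toy LSTM classifier with a single-head attention pooling layer."""
    def __init__(self, vocab_size, emb_dim, hidden_dim, n_classes):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim, 1)      # scores each time step
        self.fc = nn.Linear(hidden_dim, n_classes)

    def forward(self, x):
        H, _ = self.lstm(self.emb(x))              # (batch, seq, hidden)
        a = torch.softmax(self.attn(H), dim=1)     # attention over time steps
        ctx = (a * H).sum(dim=1)                   # attention-weighted sentence vector
        return self.fc(ctx), a

model = AttnClassifier(vocab_size=100, emb_dim=16, hidden_dim=32, n_classes=2)
opt = torch.optim.Adam(model.parameters())
x = torch.randint(0, 100, (4, 12))                # toy batch of token ids
y = torch.randint(0, 2, (4,))
logits, attn = model(x)
loss = nn.functional.cross_entropy(logits, y)
loss.backward()
opt.step()
print(logits.shape, attn.shape)
```

The returned attention weights `attn` are kept alongside the logits so they can be dumped for visualization after training.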

visualize attention

python view_attn.py

Results are written to ./attn.html, one line per example: true label \t predicted label \t sentence with attention weights highlighted (<span ....> tags)
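Rendering attention as HTML usually means coloring each token with an opacity proportional to its weight. A minimal sketch of such a helper — hypothetical, the repo's view_attn.py may differ:

```python
import html

def attn_to_html(tokens, weights):
    """Wrap each token in a <span> whose background opacity tracks its
    attention weight (normalized by the max weight in the sentence).
    Hypothetical helper, not the repo's actual code."""
    mx = max(weights) or 1.0
    spans = [
        '<span style="background-color: rgba(255,0,0,{:.2f})">{}</span>'.format(
            w / mx, html.escape(t))
        for t, w in zip(tokens, weights)
    ]
    return " ".join(spans)

print(attn_to_html(["great", "movie"], [0.9, 0.1]))
```

Tokens are passed through html.escape so markup-like input cannot break the generated page.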

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].