831 Open source projects that are alternatives of or similar to Hierarchical-Word-Sense-Disambiguation-using-WordNet-Senses

Lambda Networks
Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
Stars: ✭ 1,497 (+4436.36%)
Mutual labels:  attention, attention-mechanism
AoA-pytorch
A PyTorch implementation of the Attention on Attention module (both self and guided variants) for Visual Question Answering
Stars: ✭ 33 (+0%)
Mutual labels:  attention, attention-mechanism
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+533.33%)
Mutual labels:  attention, attention-mechanism
Graph attention pool
Attention over nodes in Graph Neural Networks using PyTorch (NeurIPS 2019)
Stars: ✭ 186 (+463.64%)
Mutual labels:  attention, attention-mechanism
End To End Sequence Labeling Via Bi Directional Lstm Cnns Crf Tutorial
Tutorial for End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF
Stars: ✭ 87 (+163.64%)
Mutual labels:  crf, lstm
NLP-paper
🎨 NLP (Natural Language Processing) tutorials 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-30.3%)
Mutual labels:  crf, attention-mechanism
dhs summit 2019 image captioning
Image captioning using attention models
Stars: ✭ 34 (+3.03%)
Mutual labels:  lstm, attention
Pointer Networks Experiments
Sorting numbers with pointer networks
Stars: ✭ 53 (+60.61%)
Mutual labels:  lstm, attention
Crnn attention ocr chinese
CRNN with attention for OCR, with added Chinese character recognition
Stars: ✭ 315 (+854.55%)
Mutual labels:  lstm, attention
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop on the cell state, just like any other RNN. (LARNN)
Stars: ✭ 119 (+260.61%)
Mutual labels:  lstm, attention-mechanism
Time Attention
Implementation of an RNN for time series prediction from the paper https://arxiv.org/abs/1704.02971
Stars: ✭ 52 (+57.58%)
Mutual labels:  lstm, attention
Nlp Journey
Documents, papers, and code related to Natural Language Processing, including topic models, word embeddings, named entity recognition, text classification, text generation, text similarity, machine translation, etc. All code is implemented in TensorFlow 2.0.
Stars: ✭ 1,290 (+3809.09%)
Mutual labels:  crf, attention
Daguan 2019 rank9
Rank 9 solution for the DataGrand 2019 information extraction competition
Stars: ✭ 121 (+266.67%)
Mutual labels:  crf, lstm
Rnnsharp
RNNSharp is a toolkit for deep recurrent neural networks that is widely used for many different kinds of tasks, such as sequence labeling and sequence-to-sequence modeling. It is written in C# and targets .NET Framework 4.6 or above. RNNSharp supports many different types of networks, such as forward and bi-directional networks and sequence-to-sequence networks, and different types of layers, such as LSTM, softmax, and sampled softmax.
Stars: ✭ 277 (+739.39%)
Mutual labels:  crf, lstm
Ncrfpp
NCRF++, a neural sequence labeling toolkit. Easy to use for any sequence labeling task (e.g. NER, POS tagging, segmentation). It includes character LSTM/CNN, word LSTM/CNN, and softmax/CRF components.
Stars: ✭ 1,767 (+5254.55%)
Mutual labels:  crf, lstm
LMMS
Language Modelling Makes Sense - WSD (and more) with Contextual Embeddings
Stars: ✭ 79 (+139.39%)
Poetry Seq2seq
Chinese Poetry Generation
Stars: ✭ 159 (+381.82%)
Mutual labels:  lstm, attention-mechanism
Chinese semantic role labeling
Chinese semantic role labeling based on Bi-LSTM and CRF
Stars: ✭ 60 (+81.82%)
Mutual labels:  crf, lstm
Multilstm
Keras attentional bi-LSTM-CRF for joint NLU (slot filling and intent detection) on the ATIS dataset
Stars: ✭ 122 (+269.7%)
Mutual labels:  crf, lstm
lstm-attention
Attention-based bidirectional LSTM for Classification Task (ICASSP)
Stars: ✭ 87 (+163.64%)
Mutual labels:  attention, attention-mechanism
S2VT-seq2seq-video-captioning-attention
S2VT (seq2seq) video captioning with Bahdanau & Luong attention, implemented in TensorFlow
Stars: ✭ 18 (-45.45%)
Mutual labels:  attention-mechanism
algorithmia
No description or website provided.
Stars: ✭ 15 (-54.55%)
Mutual labels:  lstm
Bidirectiona-LSTM-for-text-summarization-
A bidirectional encoder-decoder LSTM neural network trained for text summarization on the CNN/DailyMail dataset (MIT808 project)
Stars: ✭ 73 (+121.21%)
Mutual labels:  lstm
dltf
Hands-on in-person workshop for Deep Learning with TensorFlow
Stars: ✭ 14 (-57.58%)
Mutual labels:  lstm
Attentive-Neural-Process
A PyTorch implementation of the Attentive Neural Process
Stars: ✭ 60 (+81.82%)
Mutual labels:  attention
LSTM-sentiment-analysis
LSTM sentiment analysis. Please see my other repo for SVM and Naive Bayes algorithms.
Stars: ✭ 19 (-42.42%)
Mutual labels:  lstm
lstm har
LSTM-based human activity recognition using a smartphone sensor dataset
Stars: ✭ 20 (-39.39%)
Mutual labels:  lstm
LanguageModel-using-Attention
PyTorch implementation of a basic language model using attention in an LSTM network
Stars: ✭ 27 (-18.18%)
Mutual labels:  attention-mechanism
textaugment
TextAugment: Text Augmentation Library
Stars: ✭ 280 (+748.48%)
Mutual labels:  wordnet
mahjong
Open-source Chinese word segmentation toolkit: Chinese word segmentation Web API, Lucene Chinese tokenization, and mixed Chinese/English segmentation
Stars: ✭ 40 (+21.21%)
Mutual labels:  crf
En-transformer
Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (+296.97%)
Mutual labels:  attention-mechanism
sarcasm-detection-for-sentiment-analysis
Sarcasm Detection for Sentiment Analysis
Stars: ✭ 21 (-36.36%)
Mutual labels:  lstm
free-lunch-saliency
Code for "Free-Lunch Saliency via Attention in Atari Agents"
Stars: ✭ 15 (-54.55%)
Mutual labels:  attention
QTextRecognizer
A GUI for Tesseract OCR with some image preprocessing options (OpenCV) to improve character recognition
Stars: ✭ 27 (-18.18%)
Mutual labels:  lstm
CogNet
CogNet: a large-scale, high-quality cognate database for 338 languages, 1.07M words, and 8.1 million cognates
Stars: ✭ 26 (-21.21%)
Mutual labels:  wordnet
crfsuite-rs
Rust binding to crfsuite
Stars: ✭ 19 (-42.42%)
Mutual labels:  crf
CIAN
Implementation of the Character-level Intra Attention Network (CIAN) for Natural Language Inference (NLI) upon SNLI and MultiNLI corpus
Stars: ✭ 17 (-48.48%)
Mutual labels:  attention-mechanism
dgcnn
Clean & Documented TF2 implementation of "An end-to-end deep learning architecture for graph classification" (M. Zhang et al., 2018).
Stars: ✭ 21 (-36.36%)
Mutual labels:  attention-mechanism
Compact-Global-Descriptor
PyTorch implementation of "Compact Global Descriptor for Neural Networks" (CGD).
Stars: ✭ 22 (-33.33%)
Mutual labels:  attention-mechanism
R Unet
Video prediction using LSTM and U-Net
Stars: ✭ 25 (-24.24%)
Mutual labels:  lstm
xinlp
Java implementations of the algorithms from the later chapters of Li Hang's "Statistical Learning Methods": the box-and-ball EM algorithm (extended to GMM training), HMM word segmentation (with parameter training), and CRF word segmentation (using parameter models trained with CRF++); finally, BiLSTM+CRF implemented with TensorFlow and wrapped as a XinAnalyzer for Lucene
Stars: ✭ 21 (-36.36%)
Mutual labels:  crf
uniformer-pytorch
Implementation of Uniformer, a simple attention and 3D convolutional net that achieved SOTA on a number of video classification tasks, debuted at ICLR 2022
Stars: ✭ 90 (+172.73%)
Mutual labels:  attention-mechanism
5baa61e4c9b93f3f0682250b6cf8331b7ee68fd8
RNN-LSTM that learns passwords from a starting list
Stars: ✭ 35 (+6.06%)
Mutual labels:  lstm
DeepMove
Code for the WWW'18 paper DeepMove: Predicting Human Mobility with Attentional Recurrent Network
Stars: ✭ 120 (+263.64%)
Mutual labels:  attention
Paper-Implementation-DSTP-RNN-For-Stock-Prediction-Based-On-DA-RNN
A trial implementation of the DSTP-RNN paper, based on DA-RNN (Ver 1.0)
Stars: ✭ 62 (+87.88%)
Mutual labels:  lstm
CoreLooper
No description or website provided.
Stars: ✭ 34 (+3.03%)
Mutual labels:  all
deepseg
Chinese word segmentation in TensorFlow 2.x
Stars: ✭ 23 (-30.3%)
Mutual labels:  crf
DeepLearningReading
Deep Learning and Machine Learning mini-projects. Current Project: Deepmind Attentive Reader (rc-data)
Stars: ✭ 78 (+136.36%)
Mutual labels:  attention
wn
A modern, interlingual wordnet interface for Python
Stars: ✭ 119 (+260.61%)
Mutual labels:  wordnet
keras-crf-ner
Keras + bi-LSTM + CRF for Chinese named entity recognition
Stars: ✭ 16 (-51.52%)
Mutual labels:  crf
efficient-attention
An implementation of the efficient attention module.
Stars: ✭ 191 (+478.79%)
Mutual labels:  attention-mechanism
ws4j
WordNet Similarity for Java provides an API for several Semantic Relatedness/Similarity algorithms
Stars: ✭ 41 (+24.24%)
Mutual labels:  wordnet
flow1d
[ICCV 2021 Oral] High-Resolution Optical Flow from 1D Attention and Correlation
Stars: ✭ 91 (+175.76%)
Mutual labels:  attention
Hierarchical-attention-network
My implementation of "Hierarchical Attention Networks for Document Classification" in Keras
Stars: ✭ 26 (-21.21%)
Mutual labels:  attention-mechanism
Gumbel-CRF
Implementation of NeurIPS 20 paper: Latent Template Induction with Gumbel-CRFs
Stars: ✭ 51 (+54.55%)
Mutual labels:  crf
ChangeFormer
Official PyTorch implementation of our IGARSS'22 paper: A Transformer-Based Siamese Network for Change Detection
Stars: ✭ 220 (+566.67%)
Mutual labels:  attention-mechanism
hexia
Mid-level PyTorch-based framework for Visual Question Answering.
Stars: ✭ 24 (-27.27%)
Mutual labels:  attention-mechanism
Wordbook
Wordbook is a dictionary application built for GNOME.
Stars: ✭ 56 (+69.7%)
Mutual labels:  wordnet
organic-chemistry-reaction-prediction-using-NMT
Organic chemistry reaction prediction using NMT with attention
Stars: ✭ 30 (-9.09%)
Mutual labels:  attention-mechanism
LSTM-Attention
A Comparison of LSTMs and Attention Mechanisms for Forecasting Financial Time Series
Stars: ✭ 53 (+60.61%)
Mutual labels:  attention-mechanism
61-120 of 831 similar projects