Ner Bert
BERT-NER (nert-bert) with Google BERT: https://github.com/google-research.
Stars: ✭ 339 (+527.78%)
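For a quick sense of what BERT-based NER looks like in practice, here is a minimal sketch using the Hugging Face transformers pipeline rather than the Google BERT codebase the repo above builds on; the checkpoint name is an assumption, so substitute any token-classification model.

```python
# Not this repo's code: a minimal BERT-NER sketch via Hugging Face
# transformers (Ner Bert itself builds on Google's original BERT code).
# "dslim/bert-base-NER" is an assumed public checkpoint; swap in any
# token-classification model you prefer.
from transformers import pipeline

ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")
for entity in ner("Barack Obama visited Paris with Angela Merkel."):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```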
Encoder decoder
Four styles of encoder-decoder model in Python, Theano, Keras, and Seq2Seq
Stars: ✭ 269 (+398.15%)
Speech Transformer
A PyTorch implementation of Speech Transformer, an end-to-end ASR system with a Transformer network, on Mandarin Chinese.
Stars: ✭ 565 (+946.3%)
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+655.56%)
Seq2seq Summarizer
Pointer-generator reinforced seq2seq summarization in PyTorch
Stars: ✭ 306 (+466.67%)
Isab Pytorch
An implementation of the (Induced) Set Attention Block from the Set Transformer paper
Stars: ✭ 21 (-61.11%)
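The entry above references the Set Transformer paper; below is a minimal PyTorch sketch of the induced-set-attention idea (learned inducing points attend to the set, then the set attends back), not the isab-pytorch repo's actual code, and details such as normalization placement are assumptions.

```python
# A minimal sketch of an Induced Set Attention Block (ISAB), assuming the
# MAB/ISAB definitions from the Set Transformer paper; the real repo may
# differ in details such as normalization placement and feed-forward width.
import torch
import torch.nn as nn

class MAB(nn.Module):
    """Multihead Attention Block: attend queries x to keys/values y."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.ff = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x, y):
        h = self.norm1(x + self.attn(x, y, y, need_weights=False)[0])
        return self.norm2(h + self.ff(h))

class ISAB(nn.Module):
    """Induced Set Attention Block: O(n*m) attention via m inducing points."""
    def __init__(self, dim, num_inducing=16, heads=4):
        super().__init__()
        self.inducing = nn.Parameter(torch.randn(1, num_inducing, dim))
        self.mab1 = MAB(dim, heads)
        self.mab2 = MAB(dim, heads)

    def forward(self, x):                      # x: (batch, n, dim)
        i = self.inducing.expand(x.size(0), -1, -1)
        h = self.mab1(i, x)                    # inducing points attend to the set
        return self.mab2(x, h)                 # the set attends back to them

x = torch.randn(2, 100, 64)
print(ISAB(64)(x).shape)                       # torch.Size([2, 100, 64])
```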
ResUNetPlusPlus
Official code for ResUNet++ for medical image segmentation (TensorFlow implementation, IEEE ISM)
Stars: ✭ 69 (+27.78%)
Chinesenre
Chinese entity relation extraction in PyTorch, using BiLSTM + attention
Stars: ✭ 463 (+757.41%)
Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+629.63%)
Defactonlp
DeFactoNLP: An Automated Fact-checking System that uses Named Entity Recognition, TF-IDF vector comparison and Decomposable Attention models.
Stars: ✭ 30 (-44.44%)
Simplecvreproduction
Reproductions of simple CV projects, including attention modules, classification, object detection, segmentation, keypoint detection, tracking 😄, etc.
Stars: ✭ 602 (+1014.81%)
Mcan Vqa
Deep Modular Co-Attention Networks for Visual Question Answering
Stars: ✭ 273 (+405.56%)
Biblosa Pytorch
Re-implementation of Bi-Directional Block Self-Attention for Fast and Memory-Efficient Sequence Modeling (T. Shen et al., ICLR 2018) in PyTorch.
Stars: ✭ 43 (-20.37%)
Abcnn
Implementation of ABCNN (Attention-Based Convolutional Neural Network) in TensorFlow
Stars: ✭ 264 (+388.89%)
Cell Detr
Official and maintained implementation of the paper Attention-Based Transformers for Instance Segmentation of Cells in Microstructures [BIBM 2020].
Stars: ✭ 26 (-51.85%)
Gansformer
Generative Adversarial Transformers
Stars: ✭ 421 (+679.63%)
Tf Rnn Attention
TensorFlow implementation of an attention mechanism for text classification tasks.
Stars: ✭ 735 (+1261.11%)
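Since this and several nearby entries revolve around attention for text classification, here is a minimal PyTorch sketch of the usual additive-attention pooling over RNN hidden states; it mirrors the idea rather than tf-rnn-attention's actual TensorFlow code, and the layer names and sizes are illustrative.

```python
# A minimal sketch of additive-attention pooling for text classification:
# score each RNN hidden state against a learned context vector, softmax
# over time, and pool a weighted sum. Names and dimensions are illustrative.
import torch
import torch.nn as nn

class AttentionPooling(nn.Module):
    def __init__(self, hidden_dim, attn_dim=64):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, attn_dim)
        self.context = nn.Linear(attn_dim, 1, bias=False)  # learned context vector

    def forward(self, h):                      # h: (batch, seq_len, hidden_dim)
        scores = self.context(torch.tanh(self.proj(h)))    # (batch, seq_len, 1)
        alpha = torch.softmax(scores, dim=1)               # attention weights
        return (alpha * h).sum(dim=1)                      # (batch, hidden_dim)

rnn = nn.GRU(input_size=128, hidden_size=64, bidirectional=True, batch_first=True)
pool = AttentionPooling(hidden_dim=128)        # 2 * 64 for the BiGRU
x = torch.randn(8, 50, 128)
out, _ = rnn(x)
print(pool(out).shape)                         # torch.Size([8, 128])
```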
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
Stars: ✭ 411 (+661.11%)
Attentive Neural Processes
Implementing "recurrent attentive neural processes" to forecast power usage (with an LSTM baseline and MC Dropout)
Stars: ✭ 33 (-38.89%)
Deep learning nlp
Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
Stars: ✭ 407 (+653.7%)
Text Classification
Implementations of papers for the text classification task on DBpedia
Stars: ✭ 682 (+1162.96%)
Transformer Tensorflow
TensorFlow implementation of 'Attention Is All You Need (2017. 6)'
Stars: ✭ 319 (+490.74%)
Vad
Voice activity detection (VAD) toolkit including DNN-, bDNN-, LSTM-, and ACAM-based VAD. We also provide our directly recorded dataset.
Stars: ✭ 622 (+1051.85%)
Deepxi
Deep Xi: A deep learning approach to a priori SNR estimation implemented in TensorFlow 2/Keras. For speech enhancement and robust ASR.
Stars: ✭ 304 (+462.96%)
Abd Net
[ICCV 2019] "ABD-Net: Attentive but Diverse Person Re-Identification" https://arxiv.org/abs/1908.01114
Stars: ✭ 272 (+403.7%)
Attentionwalk
A PyTorch Implementation of "Watch Your Step: Learning Node Embeddings via Graph Attention" (NeurIPS 2018).
Stars: ✭ 266 (+392.59%)
Performer Pytorch
An implementation of Performer, a linear attention-based transformer, in PyTorch
Stars: ✭ 546 (+911.11%)
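To illustrate the linear-attention idea Performer builds on, here is a minimal sketch that replaces the softmax with a positive feature map so attention costs O(n) in sequence length; for clarity it uses the simple elu(x)+1 map (Katharopoulos et al.) rather than Performer's FAVOR+ random-feature softmax approximation, so it is not the performer-pytorch repo's algorithm.

```python
# A minimal non-causal linear-attention sketch. Performer itself uses
# FAVOR+ random features to approximate softmax; the elu(x)+1 feature map
# below is a simpler stand-in that keeps the same O(n) structure.
import torch
import torch.nn.functional as F

def linear_attention(q, k, v, eps=1e-6):
    # q, k: (batch, seq, dim); v: (batch, seq, dim_v)
    q, k = F.elu(q) + 1, F.elu(k) + 1          # positive feature maps
    kv = torch.einsum("bsd,bse->bde", k, v)    # sum_s phi(k_s) v_s^T: O(seq)
    z = k.sum(dim=1)                           # normalizer sum_s phi(k_s)
    out = torch.einsum("bsd,bde->bse", q, kv)  # (batch, seq, dim_v)
    denom = torch.einsum("bsd,bd->bs", q, z).unsqueeze(-1) + eps
    return out / denom

q = k = v = torch.randn(2, 1024, 64)
print(linear_attention(q, k, v).shape)         # torch.Size([2, 1024, 64])
```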
Nlp tensorflow project
Some NLP projects implemented in TensorFlow, e.g. classification, chatbot, NER, attention, QA, etc.
Stars: ✭ 27 (-50%)
mtad-gat-pytorch
PyTorch implementation of MTAD-GAT (Multivariate Time-Series Anomaly Detection via Graph Attention Networks) by Zhao et al. (2020, https://arxiv.org/abs/2009.02040).
Stars: ✭ 85 (+57.41%)
Punctuator2
A bidirectional recurrent neural network model with an attention mechanism for restoring missing punctuation in unsegmented text
Stars: ✭ 483 (+794.44%)
Attention
Code for several different attention mechanisms
Stars: ✭ 17 (-68.52%)
Attentions
PyTorch implementations of several attention mechanisms for deep learning researchers.
Stars: ✭ 39 (-27.78%)
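Most of the attention variants collected in these repos start from scaled dot-product attention, so a minimal reference sketch is worth having alongside the list; this is the textbook formulation from Vaswani et al., not any particular repo's code.

```python
# A minimal sketch of scaled dot-product attention, the building block most
# of the attention variants in this list extend or approximate.
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    # q: (..., q_len, d_k), k: (..., k_len, d_k), v: (..., k_len, d_v)
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)    # attention distribution
    return weights @ v, weights

q = torch.randn(2, 5, 16)
k = v = torch.randn(2, 7, 16)
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape, w.shape)  # torch.Size([2, 5, 16]) torch.Size([2, 5, 7])
```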
Rnn Nlu
A TensorFlow implementation of Recurrent Neural Networks for Sequence Classification and Sequence Labeling
Stars: ✭ 463 (+757.41%)
SBR
⌛ Introducing Self-Attention to Target Attentive Graph Neural Networks (AISP '22)
Stars: ✭ 22 (-59.26%)
Pytorch Gat
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. Both Cora (transductive) and PPI (inductive) examples are supported!
Stars: ✭ 908 (+1581.48%)
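To make the GAT attention rule from the entry above concrete, here is a minimal single-head layer sketch following e_ij = LeakyReLU(a^T [W h_i || W h_j]) with a softmax over each node's neighbors; it uses a dense adjacency matrix for clarity, unlike the sparse implementations in pytorch-GAT, and all names are illustrative.

```python
# A minimal single-head GAT layer sketch (Veličković et al.), using dense
# adjacency for clarity. The concatenation-based score a^T [Wh_i || Wh_j]
# is split into two linear terms, a_src . Wh_i + a_dst . Wh_j.
import torch
import torch.nn as nn

class GATLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a_src = nn.Linear(out_dim, 1, bias=False)
        self.a_dst = nn.Linear(out_dim, 1, bias=False)
        self.leaky = nn.LeakyReLU(0.2)

    def forward(self, h, adj):                 # h: (n, in_dim), adj: (n, n) 0/1
        wh = self.W(h)                         # (n, out_dim)
        e = self.leaky(self.a_src(wh) + self.a_dst(wh).T)   # (n, n) raw scores
        e = e.masked_fill(adj == 0, float("-inf"))          # keep neighbors only
        alpha = torch.softmax(e, dim=-1)       # per-node attention over neighbors
        return alpha @ wh                      # aggregate neighbor features

h = torch.randn(4, 8)
adj = torch.eye(4)                             # self-loops so softmax is defined
print(GATLayer(8, 16)(h, adj).shape)           # torch.Size([4, 16])
```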
Ban Vqa
Bilinear attention networks for visual question answering
Stars: ✭ 449 (+731.48%)
Time Attention
Implementation of an RNN for time-series prediction from the paper https://arxiv.org/abs/1704.02971
Stars: ✭ 52 (-3.7%)
Attentioncluster
TensorFlow implementation of "Attention Clusters: Purely Attention Based Local Feature Integration for Video Classification"
Stars: ✭ 33 (-38.89%)
Mac Network
Implementation for the paper "Compositional Attention Networks for Machine Reasoning" (Hudson and Manning, ICLR 2018)
Stars: ✭ 444 (+722.22%)