Attention Transfer: Improving Convolutional Networks via Attention Transfer (ICLR 2017)
Stars: ✭ 1,231 (+497.57%)
Ccnet Pure Pytorch: Criss-Cross Attention for Semantic Segmentation in pure PyTorch, with a faster and more precise implementation.
Stars: ✭ 124 (-39.81%)
Nlp Journey: Documents, papers and code related to Natural Language Processing, including Topic Models, Word Embeddings, Named Entity Recognition, Text Classification, Text Generation, Text Similarity, Machine Translation, etc. All code is implemented in TensorFlow 2.0.
Stars: ✭ 1,290 (+526.21%)
Attentions: PyTorch implementations of various attention mechanisms for deep learning researchers.
Stars: ✭ 39 (-81.07%)
Image Caption Generator: A neural network that generates captions for an image using a CNN and an RNN with beam search.
Stars: ✭ 126 (-38.83%)
Absa Pytorch: Aspect-Based Sentiment Analysis, PyTorch implementations.
Stars: ✭ 1,181 (+473.3%)
Attentionn: All about attention in neural networks: soft attention, attention maps, local and global attention, and multi-head attention (a minimal soft-attention sketch follows below).
Stars: ✭ 175 (-15.05%)
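Since many entries in this list revolve around the same core operation, here is a minimal sketch of soft (scaled dot-product) attention in PyTorch for reference. The function name `soft_attention` and the tensor shapes are illustrative assumptions, not code taken from any of the listed repositories.

```python
import torch
import torch.nn.functional as F

def soft_attention(query, keys, values):
    """Soft (scaled dot-product) attention over a set of key/value pairs.
    query: (batch, d); keys, values: (batch, n, d).
    Returns the attended context vector and the attention map."""
    d = query.size(-1)
    scores = torch.einsum("bd,bnd->bn", query, keys) / d ** 0.5   # similarity of the query to each key
    attn_map = F.softmax(scores, dim=-1)                          # soft weights over the n positions
    context = torch.einsum("bn,bnd->bd", attn_map, values)        # weighted sum of the values
    return context, attn_map

# Toy usage: one query per batch element attending over 10 positions.
q, k, v = torch.randn(2, 64), torch.randn(2, 10, 64), torch.randn(2, 10, 64)
ctx, attn = soft_attention(q, k, v)
print(ctx.shape, attn.shape)   # torch.Size([2, 64]) torch.Size([2, 10])
```

Multi-head attention runs several such attentions in parallel on learned projections of the inputs and concatenates the results; the returned `attn_map` is the quantity that attention-map visualizations plot.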
Leader Line: Draw a leader line in your web page.
Stars: ✭ 1,872 (+808.74%)
Njunmt Tf: An open-source neural machine translation system developed by the Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (-52.91%)
Prediction Flow: Deep learning based CTR models implemented in PyTorch
Stars: ✭ 138 (-33.01%)
Nlp Tutorial: Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+4703.4%)
Deeplearning Nlp Models: A small, interpretable codebase containing re-implementations of a few "deep" NLP models in PyTorch, with Colab notebooks to run on GPUs. Models: word2vec, CNNs, transformer, gpt.
Stars: ✭ 64 (-68.93%)
Fluence: A deep learning library based on PyTorch, focused on low-resource language research and robustness
Stars: ✭ 54 (-73.79%)
Sightseq: Computer vision tools for fairseq, containing PyTorch implementations of text recognition and object detection
Stars: ✭ 116 (-43.69%)
Attentive Neural Processes: An implementation of "recurrent attentive neural processes" to forecast power usage (with an LSTM baseline and MC Dropout)
Stars: ✭ 33 (-83.98%)
Numpy Ml: Machine learning, in NumPy
Stars: ✭ 11,100 (+5288.35%)
Captcharecognition: End-to-end variable-length captcha recognition using CNN+RNN+Attention/CTC (PyTorch implementation).
Stars: ✭ 97 (-52.91%)
Isab Pytorch: An implementation of the (Induced) Set Attention Block from the Set Transformer paper (a minimal sketch follows below)
Stars: ✭ 21 (-89.81%)
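As a rough orientation for what the block does, below is a minimal, self-contained sketch of an Induced Set Attention Block in PyTorch, following the Set Transformer construction ISAB(X) = MAB(X, MAB(I, X)) with learnable inducing points I. The class names, dimensions and feed-forward design here are illustrative assumptions, not the code of the linked repository.

```python
import torch
import torch.nn as nn

class MAB(nn.Module):
    """Multihead Attention Block: queries attend to a key/value set, followed by a feed-forward step."""
    def __init__(self, dim, num_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.ln1, self.ln2 = nn.LayerNorm(dim), nn.LayerNorm(dim)
        self.ff = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, q, kv):
        h = self.ln1(q + self.attn(q, kv, kv, need_weights=False)[0])
        return self.ln2(h + self.ff(h))

class ISAB(nn.Module):
    """Induced Set Attention Block: the set attends through m learnable inducing points,
    reducing cost from O(n^2) to O(n*m) for a set of n elements."""
    def __init__(self, dim, num_inducing=16, num_heads=4):
        super().__init__()
        self.inducing = nn.Parameter(torch.randn(1, num_inducing, dim))
        self.mab1 = MAB(dim, num_heads)
        self.mab2 = MAB(dim, num_heads)

    def forward(self, x):                                           # x: (batch, n, dim)
        h = self.mab1(self.inducing.expand(x.size(0), -1, -1), x)   # summarize the set: (batch, m, dim)
        return self.mab2(x, h)                                      # broadcast back: (batch, n, dim)

# Toy usage: a set of 100 elements with 64 features each.
x = torch.randn(2, 100, 64)
print(ISAB(64)(x).shape)   # torch.Size([2, 100, 64])
```

Because every element attends only to the m inducing points (and they in turn attend to the set), the block stays permutation-equivariant while avoiding full quadratic self-attention.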
Multihead Siamese Nets: Implementation of Siamese neural networks built upon a multi-head attention mechanism for the text semantic similarity task.
Stars: ✭ 144 (-30.1%)
Eval On Nn Of Rc: Empirical Evaluation on Current Neural Networks on Cloze-style Reading Comprehension
Stars: ✭ 84 (-59.22%)
Vqa regat: Research code for the ICCV 2019 paper "Relation-aware Graph Attention Network for Visual Question Answering"
Stars: ✭ 129 (-37.38%)
Hnatt: Train and visualize Hierarchical Attention Networks
Stars: ✭ 192 (-6.8%)
Chinese Chatbot: A Chinese chatbot trained on 100,000 dialogue pairs using an attention mechanism; it generates a meaningful reply to most ordinary questions. The trained model has been uploaded and can be run directly (and if it doesn't run, the author promises to livestream eating a keyboard).
Stars: ✭ 124 (-39.81%)
Hatn: Hierarchical Attention Transfer Network for Cross-domain Sentiment Classification (AAAI'18)
Stars: ✭ 73 (-64.56%)
Rnn For Joint Nlu: PyTorch implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454)
Stars: ✭ 176 (-14.56%)
Enjoy Hamburger: [ICLR 2021] Is Attention Better Than Matrix Decomposition?
Stars: ✭ 69 (-66.5%)
Absa keras: Keras implementation of Aspect-Based Sentiment Analysis
Stars: ✭ 126 (-38.83%)
Global Self Attention Network: A PyTorch implementation of Global Self-Attention Network, a fully attentional backbone for vision tasks
Stars: ✭ 64 (-68.93%)
Guided Attention Inference Network: Contains an implementation of the Guided Attention Inference Network (GAIN) presented in "Tell Me Where to Look" (CVPR 2018). This repository aims to apply GAIN to the FCN8 architecture used for segmentation.
Stars: ✭ 204 (-0.97%)
Yolov4 Pytorch: A PyTorch repository of YOLOv4, attentive YOLOv4 and MobileNet YOLOv4, with PASCAL VOC and COCO
Stars: ✭ 1,070 (+419.42%)
Fastpunct: Punctuation restoration and spell correction experiments.
Stars: ✭ 121 (-41.26%)
Transformers.jl: Julia implementation of Transformer models
Stars: ✭ 173 (-16.02%)
Time Attention: Implementation of an RNN for time series prediction from the paper https://arxiv.org/abs/1704.02971
Stars: ✭ 52 (-74.76%)
Nlp Models Tensorflow: Gathers machine learning and TensorFlow deep learning models for NLP problems; 1.13 < TensorFlow < 2.0
Stars: ✭ 1,603 (+678.16%)
Biblosa Pytorch: Re-implementation of "Bi-Directional Block Self-Attention for Fast and Memory-Efficient Sequence Modeling" (T. Shen et al., ICLR 2018) in PyTorch.
Stars: ✭ 43 (-79.13%)
Graph attention pool: Attention over nodes in Graph Neural Networks using PyTorch (NeurIPS 2019)
Stars: ✭ 186 (-9.71%)
Attentioncluster: TensorFlow implementation of "Attention Clusters: Purely Attention Based Local Feature Integration for Video Classification"
Stars: ✭ 33 (-83.98%)
Defactonlp: DeFactoNLP, an automated fact-checking system that uses Named Entity Recognition, TF-IDF vector comparison and Decomposable Attention models.
Stars: ✭ 30 (-85.44%)
Hey Jetson: Deep Learning based Automatic Speech Recognition with attention for the Nvidia Jetson.
Stars: ✭ 161 (-21.84%)
Lambda Networks: Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
Stars: ✭ 1,497 (+626.7%)
Doc Han Att: Hierarchical Attention Networks for Chinese Sentiment Classification
Stars: ✭ 206 (+0%)
Graphtransformer: Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Stars: ✭ 187 (-9.22%)
Datastories Semeval2017 Task4: Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (-10.68%)
Medical Transformer: PyTorch code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation"
Stars: ✭ 153 (-25.73%)