seq2seq-autoencoder: Theano implementation of a Sequence-to-Sequence Autoencoder
Stars: ✭ 12 (-66.67%)
Dalle Pytorch: Implementation/replication of DALL-E, OpenAI's text-to-image Transformer, in PyTorch
Stars: ✭ 3,661 (+10069.44%)
Neat Vision: Neat (Neural Attention) Vision is a framework-agnostic visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks.
Stars: ✭ 213 (+491.67%)
Linear Attention Transformer: Transformer based on a variant of attention with linear complexity with respect to sequence length
Stars: ✭ 205 (+469.44%)
Seq2seq Signal Prediction: Signal forecasting with a Sequence-to-Sequence (seq2seq) Recurrent Neural Network (RNN) model in TensorFlow, by Guillaume Chevalier
Stars: ✭ 890 (+2372.22%)
Attention Mechanisms: Implementations of a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
Stars: ✭ 203 (+463.89%)
STAM-pytorch: Implementation of STAM (Space Time Attention Model), a pure and simple attention model that reaches SOTA for video classification
Stars: ✭ 109 (+202.78%)
Csa Inpainting: Coherent Semantic Attention for image inpainting (ICCV 2019)
Stars: ✭ 202 (+461.11%)
Cluener2020: CLUENER2020, Chinese fine-grained named entity recognition
Stars: ✭ 689 (+1813.89%)
NiuTrans.NMT: A fast neural machine translation system, developed in C++ and built on NiuTensor for fast tensor APIs.
Stars: ✭ 112 (+211.11%)
Attentive Gan Derainnet: Unofficial TensorFlow implementation of the "Attentive Generative Adversarial Network for Raindrop Removal from a Single Image" (CVPR 2018) model. https://maybeshewill-cv.github.io/attentive-gan-derainnet/
Stars: ✭ 184 (+411.11%)
Seq2seq Pytorch: Sequence-to-sequence models with PyTorch
Stars: ✭ 678 (+1783.33%)
Datastories Semeval2017 Task4: Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (+411.11%)
Seq2seq Couplet: Play couplets with a seq2seq model (writing Chinese couplets with deep learning).
Stars: ✭ 5,149 (+14202.78%)
Lstm attention: Attention-based LSTM/Dense model implemented in Keras
Stars: ✭ 168 (+366.67%)
Slot Attention: Implementation of Slot Attention from GoogleAI
Stars: ✭ 168 (+366.67%)
tensorflow-chatbot-chinese: Web chatbot; TensorFlow implementation of a seq2seq model with Bahdanau attention and pretrained Word2Vec embeddings
Stars: ✭ 50 (+38.89%)
Nlp pytorch project: Embedding, NMT, Text_Classification, Text_Generation, NER, etc.
Stars: ✭ 153 (+325%)
Tf chatbot seq2seq antilm: Seq2seq chatbot with attention and an anti-language model to suppress generic responses, with an option for further improvement via deep reinforcement learning.
Stars: ✭ 369 (+925%)
Seq2seqchatbots: A wrapper around tensor2tensor to flexibly train, interact with, and generate data for neural chatbots.
Stars: ✭ 466 (+1194.44%)
Hart: Hierarchical Attentive Recurrent Tracking
Stars: ✭ 149 (+313.89%)
Attribute Aware Attention: [ACM MM 2018] Attribute-Aware Attention Model for Fine-grained Representation Learning
Stars: ✭ 143 (+297.22%)
Prediction Flow: Deep-learning-based CTR models implemented in PyTorch
Stars: ✭ 138 (+283.33%)
ChangeFormer: Official PyTorch implementation of the IGARSS'22 paper "A Transformer-Based Siamese Network for Change Detection"
Stars: ✭ 220 (+511.11%)
Image Captioning: Image captioning using InceptionV3 and beam search
Stars: ✭ 290 (+705.56%)
captioning chainer: A fast implementation of Neural Image Caption in Chainer
Stars: ✭ 17 (-52.78%)
Abstractive Summarization: Implementation of abstractive summarization using LSTM in the encoder-decoder architecture with local attention.
Stars: ✭ 128 (+255.56%)
SiGAT: Source code for Signed Graph Attention Networks (ICANN 2019) & SDGNN (AAAI 2021)
Stars: ✭ 37 (+2.78%)
Multiwoz: Source code for the end-to-end dialogue model from the MultiWOZ paper (Budzianowski et al., EMNLP 2018)
Stars: ✭ 384 (+966.67%)
Pygat: PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)
Stars: ✭ 1,853 (+5047.22%)
Multi-task-Conditional-Attention-Networks: A prototype version of the submitted paper "Conversion Prediction Using Multi-task Conditional Attention Networks to Support the Creation of Effective Ad Creatives".
Stars: ✭ 21 (-41.67%)
Overlappredator: [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 106 (+194.44%)
Ylg: [CVPR 2020] Official implementation of "Your Local GAN: Designing Two Dimensional Local Attention Mechanisms for Generative Models".
Stars: ✭ 109 (+202.78%)
question-generation: Neural Models for Key Phrase Detection and Question Generation
Stars: ✭ 29 (-19.44%)
Reformer Pytorch: Reformer, the efficient Transformer, in PyTorch
Stars: ✭ 1,644 (+4466.67%)
Time Series Prediction: A collection of time series prediction methods: RNN, seq2seq, CNN, WaveNet, Transformer, U-Net, N-BEATS, GAN, Kalman filter
Stars: ✭ 351 (+875%)
RETRO-pytorch: Implementation of RETRO, DeepMind's retrieval-based attention network, in PyTorch
Stars: ✭ 473 (+1213.89%)
ai-n-queens: Solving and GUI demonstration of the traditional N-Queens problem using Hill Climbing, Simulated Annealing, Local Beam Search, and a Genetic Algorithm.
Stars: ✭ 30 (-16.67%)
hamnet: PyTorch implementation of the AAAI 2021 paper "A Hybrid Attention Mechanism for Weakly-Supervised Temporal Action Localization"
Stars: ✭ 30 (-16.67%)
LMFD-PAD: Learnable Multi-level Frequency Decomposition and Hierarchical Attention Mechanism for Generalized Face Presentation Attack Detection
Stars: ✭ 27 (-25%)
parallel-corpora-tools: Tools for filtering and cleaning parallel and monolingual corpora for machine translation and other natural language processing tasks.
Stars: ✭ 35 (-2.78%)
hexia: Mid-level PyTorch-based framework for Visual Question Answering.
Stars: ✭ 24 (-33.33%)
deepQA: A deep-learning-based chatbot implemented in TensorFlow with beam search (forked from Conchylicultor/DeepQA)
Stars: ✭ 17 (-52.78%)