transformerNeutron: A PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (+66.67%)
Seq2seq chatbot: A TensorFlow implementation of a simple seq2seq-based dialogue system, with embedding, attention, beam search, and other features; the dataset is Cornell Movie Dialogs.
Stars: ✭ 308 (+755.56%)
Tf Seq2seq: Sequence-to-sequence learning using TensorFlow.
Stars: ✭ 387 (+975%)
Sockeye: A sequence-to-sequence framework with a focus on neural machine translation, based on Apache MXNet.
Stars: ✭ 990 (+2650%)
Seq2seq chatbot new: A TensorFlow implementation of a simple seq2seq-based dialogue system, with embedding, attention, beam search, and other features; the dataset is Cornell Movie Dialogs.
Stars: ✭ 144 (+300%)
TS3000 TheChatBOT: A social-networking chatbot trained on a Reddit dataset. It supports open-ended queries and is built on the concept of neural machine translation. Beware: it can be sarcastic, just like its creator 😝. By the way, it uses the PyTorch framework and Python 3.
Stars: ✭ 20 (-44.44%)
Image Caption Generator: A neural network to generate captions for an image using a CNN and an RNN with beam search.
Stars: ✭ 126 (+250%)
Nmtpytorch: Sequence-to-sequence framework in PyTorch.
Stars: ✭ 392 (+988.89%)
Word-Level-Eng-Mar-NMT: Translating English sentences to Marathi using neural machine translation.
Stars: ✭ 37 (+2.78%)
Image-Caption: Using an LSTM or Transformer to solve image captioning in PyTorch.
Stars: ✭ 36 (+0%)
Video-Cap: 🎬 Video captioning, an implementation of the ICCV '15 paper.
Stars: ✭ 44 (+22.22%)
ttslearn: Library for the book Pythonで学ぶ音声合成 (Text-to-speech with Python).
Stars: ✭ 158 (+338.89%)
Neural sp: End-to-end ASR/LM implementation with PyTorch.
Stars: ✭ 408 (+1033.33%)
Nlp made easy: Explains NLP building blocks in a simple manner.
Stars: ✭ 232 (+544.44%)
Nmt Keras: Neural machine translation with Keras.
Stars: ✭ 501 (+1291.67%)
RNNSearch: An implementation of attention-based neural machine translation using PyTorch.
Stars: ✭ 43 (+19.44%)
SequenceToSequence: A seq2seq-with-attention dialogue/MT model implemented in TensorFlow.
Stars: ✭ 11 (-69.44%)
NLP-paper: 🎨🎨 NLP (natural language processing) tutorial 🎨🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-36.11%)
Im2latex: Image to LaTeX (seq2seq + attention with beam search) - TensorFlow.
Stars: ✭ 342 (+850%)
Joeynmt: Minimalist NMT for educational purposes.
Stars: ✭ 420 (+1066.67%)
neural-chat: An AI chatbot using seq2seq.
Stars: ✭ 30 (-16.67%)
Seq2seq.pytorch: Sequence-to-sequence learning using PyTorch.
Stars: ✭ 514 (+1327.78%)
Xmunmt: An implementation of RNNsearch using TensorFlow.
Stars: ✭ 69 (+91.67%)
Nspm: 🤖 Neural SPARQL Machines for Knowledge Graph Question Answering.
Stars: ✭ 156 (+333.33%)
Seq2seq: A Chinese chatbot based on PyTorch, with an integrated beam-search algorithm.
Stars: ✭ 200 (+455.56%)
dynmt-py: Neural machine translation implementation using DyNet's Python bindings.
Stars: ✭ 17 (-52.78%)
Seq2seq Summarizer: Pointer-generator reinforced seq2seq summarization in PyTorch.
Stars: ✭ 306 (+750%)
Pytorch Seq2seq: Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+9394.44%)
MoChA-pytorch: PyTorch implementation of "Monotonic Chunkwise Attention" (ICLR 2018).
Stars: ✭ 65 (+80.56%)
Openseq2seq: Toolkit for efficient experimentation with speech recognition, text-to-speech, and NLP.
Stars: ✭ 1,378 (+3727.78%)
Tensorflow Shakespeare: Neural machine translation between the writings of Shakespeare and modern English using TensorFlow.
Stars: ✭ 244 (+577.78%)
beam search: Beam search for neural-network sequence-to-sequence (encoder-decoder) models.
Stars: ✭ 31 (-13.89%)
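Several entries in this list (beam search, Im2latex, the seq2seq chatbots) rely on beam-search decoding. A minimal pure-Python sketch of the idea follows; the `toy_model` scoring function is a hypothetical stand-in for a real decoder, which would also carry hidden state:

```python
import math

def beam_search(step_log_probs, beam_width=3, eos=0, max_len=5):
    """Minimal beam search: keep the beam_width highest-scoring partial
    sequences at each step. step_log_probs(seq) returns per-token
    log-probabilities for the next position (hypothetical interface)."""
    beams = [([], 0.0)]  # (token sequence, cumulative log-probability)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq and seq[-1] == eos:  # finished hypotheses pass through unchanged
                candidates.append((seq, score))
                continue
            for tok, lp in enumerate(step_log_probs(seq)):
                candidates.append((seq + [tok], score + lp))
        # prune to the beam_width best hypotheses by total log-probability
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams[0][0]

# Toy "model": prefers token 2 for two steps, then the EOS token (0).
def toy_model(seq):
    if len(seq) >= 2:
        return [math.log(0.9), math.log(0.05), math.log(0.05)]  # favour EOS
    return [math.log(0.05), math.log(0.05), math.log(0.9)]      # favour token 2

print(beam_search(toy_model, beam_width=2))  # → [2, 2, 0]
```

With `beam_width=1` this degenerates to greedy decoding; wider beams trade compute for a better chance of finding a higher-probability sequence.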
2018-dlsl: UPC Deep Learning for Speech and Language 2018.
Stars: ✭ 18 (-50%)
seq3: Source code for the NAACL 2019 paper "SEQ^3: Differentiable Sequence-to-Sequence-to-Sequence Autoencoder for Unsupervised Abstractive Sentence Compression".
Stars: ✭ 121 (+236.11%)
rnnt decoder cuda: An efficient implementation of RNN-T prefix beam search in C++/CUDA.
Stars: ✭ 60 (+66.67%)
Machine-Translation-Hindi-to-english-: Machine translation is the task of converting text from one language to another. Unlike traditional phrase-based translation systems, which consist of many small sub-components tuned separately, neural machine translation attempts to build and train a single, large neural network that reads a sentence and outputs a correct translation.
Stars: ✭ 19 (-47.22%)
CVAE Dial: The CVAE_XGate model from the paper "Xu, Dusek, Konstas, Rieser. Better Conversations by Modeling, Filtering, and Optimizing for Coherence and Diversity".
Stars: ✭ 16 (-55.56%)
resolutions-2019: A list of data mining and machine learning papers that I implemented in 2019.
Stars: ✭ 19 (-47.22%)
h-transformer-1d: Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning.
Stars: ✭ 121 (+236.11%)
visdial: Visual Dialog: Light-weight Transformer for Many Inputs (ECCV 2020).
Stars: ✭ 27 (-25%)
efficient-attention: An implementation of the efficient attention module.
Stars: ✭ 191 (+430.56%)
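Many of the attention-based projects in this list (RNNSearch, MoChA-pytorch, efficient-attention, dodrio) build on the same core operation. A minimal pure-Python sketch of scaled dot-product attention for a single query, purely illustrative and far from a practical batched implementation:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query vector:
    weights_i = softmax(q . k_i / sqrt(d)); output = sum_i weights_i * v_i."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    return [sum(w * v[j] for w, v in zip(weights, values))
            for j in range(len(values[0]))]

# The query is closest to the first key, so the output is
# pulled toward the first value vector.
out = attention([1.0, 0.0],
                [[1.0, 0.0], [0.0, 1.0]],
                [[10.0, 0.0], [0.0, 10.0]])
print(out)
```

In an encoder-decoder model the query comes from the decoder state and the keys/values from the encoder outputs, so the weights express which source positions the decoder attends to at each step.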
NiuTrans.NMT: A fast neural machine translation system, developed in C++ and relying on NiuTensor for fast tensor APIs.
Stars: ✭ 112 (+211.11%)
dodrio: Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+547.22%)
DCAN: [AAAI 2020] Code release for "Domain Conditioned Adaptation Network" https://arxiv.org/abs/2005.06717
Stars: ✭ 27 (-25%)
probabilistic nlg: TensorFlow implementation of Stochastic Wasserstein Autoencoder for Probabilistic Sentence Generation (NAACL 2019).
Stars: ✭ 28 (-22.22%)
SiGAT: Source code for Signed Graph Attention Networks (ICANN 2019) & SDGNN (AAAI 2021).
Stars: ✭ 37 (+2.78%)
Multi-task-Conditional-Attention-Networks: A prototype version of the submitted paper "Conversion Prediction Using Multi-task Conditional Attention Networks to Support the Creation of Effective Ad Creatives".
Stars: ✭ 21 (-41.67%)
lang2logic-PyTorch: PyTorch port of the paper "Language to Logical Form with Neural Attention".
Stars: ✭ 34 (-5.56%)
RETRO-pytorch: Implementation of RETRO, DeepMind's retrieval-based attention net, in PyTorch.
Stars: ✭ 473 (+1213.89%)
chatbot: A Chinese chatbot based on deep learning, with detailed tutorials and code; every piece of code is thoroughly commented, making it a great choice for learning.
Stars: ✭ 94 (+161.11%)