keras-chatbot-web-api: Simple Keras chatbot using a seq2seq model, with the web API served by Flask.
Stars: ✭ 51 (+168.42%)
seq3: Source code for the NAACL 2019 paper "SEQ^3: Differentiable Sequence-to-Sequence-to-Sequence Autoencoder for Unsupervised Abstractive Sentence Compression".
Stars: ✭ 121 (+536.84%)
RNNSearch: An implementation of attention-based neural machine translation in PyTorch.
Stars: ✭ 43 (+126.32%)
skt: Sanskrit compound segmentation using a seq2seq model.
Stars: ✭ 21 (+10.53%)
CVAE Dial: CVAE_XGate model from the paper "Xu, Dusek, Konstas, Rieser. Better Conversations by Modeling, Filtering, and Optimizing for Coherence and Diversity".
Stars: ✭ 16 (-15.79%)
dts: A Keras library for multi-step time-series forecasting.
Stars: ✭ 130 (+584.21%)
Word-Level-Eng-Mar-NMT: Translating English sentences to Marathi using neural machine translation.
Stars: ✭ 37 (+94.74%)
fiction generator: Fiction generator built with TensorFlow; a novel generator that imitates the writing style of Wang Xiaobo.
Stars: ✭ 27 (+42.11%)
YodaSpeak: Translating English to Yoda-style English using sequence-to-sequence models with TensorFlow.
Stars: ✭ 25 (+31.58%)
parse seq2seq: A TensorFlow implementation of a neural sequence-to-sequence parser for converting natural language queries to logical forms.
Stars: ✭ 26 (+36.84%)
transformer: Neutron, a PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (+215.79%)
chatbot: KBQA and task-oriented QA chatbot (tags: seq2seq, IR, Neo4j, Jena, TensorFlow).
Stars: ✭ 32 (+68.42%)
classy: A simple-to-use library for building high-performance machine learning models in NLP.
Stars: ✭ 61 (+221.05%)
sentence2vec: Deep sentence embedding using sequence-to-sequence learning.
Stars: ✭ 23 (+21.05%)
Video-Cap: 🎬 Video captioning, an implementation of an ICCV 2015 paper.
Stars: ✭ 44 (+131.58%)
dynmt-py: Neural machine translation implementation using DyNet's Python bindings.
Stars: ✭ 17 (-10.53%)
torch-asg: Auto Segmentation Criterion (ASG) implemented in PyTorch.
Stars: ✭ 42 (+121.05%)
NLP-paper: 🎨 NLP (natural language processing) tutorial. https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (+21.05%)
NeuralCitationNetwork: Neural Citation Network for Context-Aware Citation Recommendation (SIGIR 2017).
Stars: ✭ 24 (+26.32%)
probabilistic nlg: TensorFlow implementation of "Stochastic Wasserstein Autoencoder for Probabilistic Sentence Generation" (NAACL 2019).
Stars: ✭ 28 (+47.37%)
ttslearn: Library accompanying the book "Pythonで学ぶ音声合成" (Text-to-Speech with Python).
Stars: ✭ 158 (+731.58%)
kospeech: Open-source toolkit for end-to-end Korean automatic speech recognition, built on PyTorch and Hydra.
Stars: ✭ 456 (+2300%)
Nuts: A testbed of solutions for common NLP tasks, mainly text classification, sequence labeling, and question answering.
Stars: ✭ 21 (+10.53%)
DeepLearning-Lab: Code lab for deep learning, covering RNNs, seq2seq, word2vec, cross entropy, bidirectional RNNs, convolution and pooling operations, InceptionV3, and transfer learning.
Stars: ✭ 83 (+336.84%)
avsr-tf1: Audio-visual speech recognition using sequence-to-sequence models.
Stars: ✭ 76 (+300%)
Shakespearizing-Modern-English: Code for "Jhamtani H.*, Gangal V.*, Hovy E., and Nyberg E. Shakespearizing Modern Language Using Copy-Enriched Sequence to Sequence Models", Workshop on Stylistic Variation, EMNLP 2017.
Stars: ✭ 64 (+236.84%)
neural-chat: An AI chatbot using seq2seq.
Stars: ✭ 30 (+57.89%)
adversarial-code-generation: Source code for the ICLR 2021 paper "Generating Adversarial Computer Programs using Optimized Obfuscations".
Stars: ✭ 16 (-15.79%)
transformer: A PyTorch implementation of "Attention Is All You Need".
Stars: ✭ 28 (+47.37%)
minimal-nmt: A minimal NMT example to serve as a seq2seq+attention reference.
Stars: ✭ 36 (+89.47%)
Embedding: A collection of embedding model code and study notes.
Stars: ✭ 25 (+31.58%)
DLCV2018SPRING: Deep Learning for Computer Vision (CommE 5052) at NTU.
Stars: ✭ 38 (+100%)
cnn-seq2seq: No description or website provided.
Stars: ✭ 39 (+105.26%)
TaLKConvolutions: Official PyTorch implementation of Time-aware Large Kernel (TaLK) Convolutions (ICML 2020).
Stars: ✭ 26 (+36.84%)
chatbot: A Chinese chatbot based on deep learning, with detailed tutorials and thoroughly commented code; a good choice for learning.
Stars: ✭ 94 (+394.74%)
lstm-math: Neural network that solves math equations at the character level.
Stars: ✭ 26 (+36.84%)
MoChA-pytorch: PyTorch implementation of "Monotonic Chunkwise Attention" (ICLR 2018).
Stars: ✭ 65 (+242.11%)
SequenceToSequence: A seq2seq-with-attention dialogue/MT model implemented in TensorFlow.
Stars: ✭ 11 (-42.11%)
BERT-NER: Using pre-trained BERT models for Chinese and English NER with 🤗 Transformers.
Stars: ✭ 114 (+500%)
lang2logic-PyTorch: PyTorch port of the paper "Language to Logical Form with Neural Attention".
Stars: ✭ 34 (+78.95%)
ai-visual-storytelling-seq2seq: Implementation of a seq2seq model for the Visual Storytelling Challenge (VIST): http://visionandlanguage.net/VIST/index.html
Stars: ✭ 50 (+163.16%)
FontRNN: Implementation of FontRNN (Computer Graphics Forum, 2019).
Stars: ✭ 27 (+42.11%)
convolutional seq2seq: fairseq's Convolutional Sequence to Sequence Learning (Gehring et al., 2017) implemented in Chainer.
Stars: ✭ 63 (+231.58%)
GAN-RNN Timeseries-imputation: Recurrent GAN for imputation of time-series data, implemented in TensorFlow 2 on the Wikipedia Web Traffic Forecast dataset from Kaggle.
Stars: ✭ 107 (+463.16%)
2D-LSTM-Seq2Seq: PyTorch implementation of a 2D-LSTM seq2seq model for NMT.
Stars: ✭ 25 (+31.58%)
chatbot: 🤖️ A task-oriented chatbot based on PyTorch, supporting private and Docker deployment.
Stars: ✭ 77 (+305.26%)
deep-molecular-optimization: Molecular optimization by capturing chemists' intuition using seq2seq with attention and the Transformer.
Stars: ✭ 60 (+215.79%)
deep-keyphrase: seq2seq-based keyphrase generation model sets, including CopyRNN, CopyCNN, and CopyTransformer.
Stars: ✭ 51 (+168.42%)