Seq2seq chatbots - A wrapper around tensor2tensor to flexibly train, interact with, and generate data for neural chatbots.
Stars: ✭ 466 (+339.62%)
Nlp Tutorials - Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+271.7%)
transformer - A PyTorch implementation of "Attention Is All You Need"
Stars: ✭ 28 (-73.58%)
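Several entries in this list implement the Transformer from "Attention Is All You Need". As a quick reference (a minimal NumPy sketch, not taken from any of these repositories), the paper's scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ / √d_k)·V, can be written as:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V for 2-D arrays."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # row-wise softmax
    return weights @ V, weights                          # weighted values + weights
```

The repositories above build multi-head attention, masking, and batching on top of this core operation.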
Neural sp - End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+284.91%)
Chinese Chatbot - A Chinese chatbot trained on 100,000 dialogue pairs with an attention mechanism; it generates a meaningful reply to most ordinary questions. The trained model is uploaded and runs out of the box (and if it doesn't, the author promises to livestream eating a keyboard).
Stars: ✭ 124 (+16.98%)
Pytorch Seq2seq - Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+3124.53%)
Tensorflow Ml Nlp - Natural language processing with TensorFlow and machine learning, from logistic regression to a Transformer chatbot (in Korean).
Stars: ✭ 176 (+66.04%)
keras-chatbot-web-api - Simple Keras chatbot using a seq2seq model, served over the web with Flask.
Stars: ✭ 51 (-51.89%)
visualization - A collection of visualization functions.
Stars: ✭ 189 (+78.3%)
NLP-paper - 🎨🎨 NLP (natural language processing) tutorials, in Chinese 🎨🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-78.3%)
chatbot - A Chinese chatbot based on deep learning, with detailed tutorials and thoroughly commented code; a good choice for learning.
Stars: ✭ 94 (-11.32%)
Xpersona - XPersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (-49.06%)
Njunmt Tf - An open-source neural machine translation system developed by the Natural Language Processing Group at Nanjing University.
Stars: ✭ 97 (-8.49%)
transformer - Neutron: a PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (-43.4%)
deep-molecular-optimization - Molecular optimization that captures the chemist's intuition using seq2seq with attention and the Transformer.
Stars: ✭ 60 (-43.4%)
Deepqa - My TensorFlow implementation of "A Neural Conversational Model", a deep-learning-based chatbot.
Stars: ✭ 2,811 (+2551.89%)
RNNSearch - An implementation of attention-based neural machine translation in PyTorch.
Stars: ✭ 43 (-59.43%)
Encoder decoder - Four styles of encoder-decoder models in Python, using Theano, Keras, and Seq2Seq.
Stars: ✭ 269 (+153.77%)
Transformer Tensorflow - TensorFlow implementation of "Attention Is All You Need" (June 2017).
Stars: ✭ 319 (+200.94%)
CrabNet - Predict materials properties using only composition information.
Stars: ✭ 57 (-46.23%)
h-transformer-1d - Implementation of H-Transformer-1D, hierarchical attention for sequence learning.
Stars: ✭ 121 (+14.15%)
chatbot - A TensorFlow chatbot combining KBQA and task-oriented QA with seq2seq generation and IR retrieval, backed by Neo4j and Jena.
Stars: ✭ 32 (-69.81%)
Pytorch Original Transformer - My implementation of the original Transformer model (Vaswani et al.). It additionally includes playground.py for visualizing otherwise hard-to-grasp concepts, and currently ships pretrained IWSLT models.
Stars: ✭ 411 (+287.74%)
Practical seq2seq - A simple, minimal wrapper around TensorFlow's seq2seq module for rapid experimentation with datasets.
Stars: ✭ 563 (+431.13%)
Speech Transformer - A PyTorch implementation of Speech Transformer, an end-to-end ASR system with a Transformer network, on Mandarin Chinese.
Stars: ✭ 565 (+433.02%)
Embedding - A collection of embedding model code and study notes.
Stars: ✭ 25 (-76.42%)
chatbot - 🤖️ A task-oriented chatbot based on PyTorch (supports private deployment and Docker deployment).
Stars: ✭ 77 (-27.36%)
Seq2seq chatbot links - Links to implementations of neural conversational models in different frameworks.
Stars: ✭ 270 (+154.72%)
TRAR-VQA - [ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering (official implementation).
Stars: ✭ 49 (-53.77%)
Seq2seq Chatbot For Keras - A new generative chatbot model based on seq2seq modeling.
Stars: ✭ 322 (+203.77%)
Seq2seq chatbot - A TensorFlow implementation of a simple seq2seq dialogue system with embedding, attention, and beam search, trained on the Cornell Movie Dialogs corpus.
Stars: ✭ 308 (+190.57%)
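The entry above pairs attention with beam search at decoding time. As a framework-agnostic sketch (plain Python, not taken from that repository; `step_fn`, `start`, and `eos` are hypothetical names standing in for a real decoder), a basic beam search over log-probabilities looks like:

```python
import math

def beam_search(step_fn, start, eos, beam_width=3, max_len=10):
    """Basic beam search. step_fn(seq) returns (token, log_prob) continuations."""
    beams = [([start], 0.0)]                  # (sequence, cumulative log-probability)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq[-1] == eos:                # finished hypotheses carry over unchanged
                candidates.append((seq, score))
                continue
            for tok, logp in step_fn(seq):
                candidates.append((seq + [tok], score + logp))
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
        if all(seq[-1] == eos for seq, _ in beams):
            break                             # every surviving beam has emitted eos
    return beams[0][0]                        # highest-scoring sequence
```

In a real chatbot, `step_fn` would run the decoder one step and return the top-k next tokens; keeping `beam_width` hypotheses instead of one greedy choice is what lets lower-probability prefixes win overall.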
Joeynmt - Minimalist NMT for educational purposes.
Stars: ✭ 420 (+296.23%)
Seq2seq Summarizer - Pointer-generator reinforced seq2seq summarization in PyTorch.
Stars: ✭ 306 (+188.68%)
Sockeye - A sequence-to-sequence framework focused on neural machine translation, built on Apache MXNet.
Stars: ✭ 990 (+833.96%)
Moel - MoEL: Mixture of Empathetic Listeners
Stars: ✭ 38 (-64.15%)
Nlp tensorflow project - TensorFlow implementations of several NLP tasks, e.g. classification, chatbot, NER, attention, and QA.
Stars: ✭ 27 (-74.53%)
Cell Detr - Official and maintained implementation of the paper "Attention-Based Transformers for Instance Segmentation of Cells in Microstructures" [BIBM 2020].
Stars: ✭ 26 (-75.47%)
Deeplearning Nlp Models - A small, interpretable codebase re-implementing a few "deep" NLP models in PyTorch, with Colab notebooks to run on GPUs. Models: word2vec, CNNs, Transformer, GPT.
Stars: ✭ 64 (-39.62%)
Transformer Temporal Tagger - Code and data from the paper "BERT Got a Date: Introducing Transformers to Temporal Tagging".
Stars: ✭ 55 (-48.11%)
kospeech - An open-source toolkit for end-to-end Korean automatic speech recognition, leveraging PyTorch and Hydra.
Stars: ✭ 456 (+330.19%)