
1584 open-source projects that are alternatives to or similar to Nlp Tutorials

Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+767.51%)
Mutual labels:  tutorial, attention, seq2seq, transformer
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (-73.1%)
Mutual labels:  attention, seq2seq, transformer
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-92.89%)
Mutual labels:  transformer, seq2seq, attention
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+2411.42%)
Mutual labels:  tutorial, attention, transformer
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+3.55%)
Mutual labels:  attention, seq2seq, transformer
Text Classification Models Pytorch
Implementations of state-of-the-art text classification models in PyTorch
Stars: ✭ 379 (-3.81%)
Mutual labels:  attention, seq2seq, transformer
Awesome Fast Attention
A list of efficient attention modules
Stars: ✭ 627 (+59.14%)
Mutual labels:  attention, transformer
Bertqa Attention On Steroids
BertQA - Attention on Steroids
Stars: ✭ 112 (-71.57%)
Mutual labels:  attention, transformer
Sightseq
Computer vision tools for fairseq, containing PyTorch implementation of text recognition and object detection
Stars: ✭ 116 (-70.56%)
Mutual labels:  attention, transformer
Getting Things Done With Pytorch
Jupyter Notebook tutorials on solving real-world problems with Machine Learning & Deep Learning using PyTorch. Topics: Face detection with Detectron 2, Time Series anomaly detection with LSTM Autoencoders, Object Detection with YOLO v5, Build your first Neural Network, Time Series forecasting for Coronavirus daily cases, Sentiment Analysis with BERT.
Stars: ✭ 738 (+87.31%)
Mutual labels:  tutorial, transformer
Chinese Chatbot
A Chinese chatbot trained on 100,000 dialogue pairs using an attention mechanism; it generates a meaningful reply to most general questions. The trained model has been uploaded and can be run directly (the author jokes they will livestream eating a keyboard if it doesn't run).
Stars: ✭ 124 (-68.53%)
Mutual labels:  attention, seq2seq
seq2seq-pytorch
Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (-89.59%)
Mutual labels:  transformer, attention
chinese ancient poetry
Chinese classical poetry generation with seq2seq, attention, TensorFlow, TextRank, and context
Stars: ✭ 30 (-92.39%)
Mutual labels:  seq2seq, attention
Paddlenlp
NLP Core Library and Model Zoo based on PaddlePaddle 2.0
Stars: ✭ 212 (-46.19%)
Mutual labels:  seq2seq, transformer
Pytorch Original Transformer
My implementation of the original Transformer model (Vaswani et al.). I've additionally included playground.py for visualizing concepts that are otherwise hard to grasp. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (+4.31%)
Mutual labels:  attention, transformer
Njunmt Tf
An open-source neural machine translation system developed by Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (-75.38%)
Mutual labels:  attention, transformer
Cell Detr
Official and maintained implementation of the paper Attention-Based Transformers for Instance Segmentation of Cells in Microstructures [BIBM 2020].
Stars: ✭ 26 (-93.4%)
Mutual labels:  attention, transformer
Graphtransformer
Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Stars: ✭ 187 (-52.54%)
Mutual labels:  attention, transformer
Self Attention Cv
Implementations of various self-attention mechanisms focused on computer vision; an ongoing repository.
Stars: ✭ 209 (-46.95%)
Mutual labels:  attention, transformer
Tensorflow Tutorials
Provides source code for practicing TensorFlow step by step, from the basics to applications.
Stars: ✭ 2,096 (+431.98%)
Mutual labels:  tutorial, seq2seq
Machine Translation
Stars: ✭ 51 (-87.06%)
Mutual labels:  seq2seq, transformer
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-69.29%)
Mutual labels:  transformer, attention
learningspoons
NLP lecture notes and source code
Stars: ✭ 29 (-92.64%)
Mutual labels:  transformer, attention
NLP-paper
An NLP (natural language processing) tutorial: https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-94.16%)
Mutual labels:  transformer, seq2seq
classifier multi label seq2seq attention
Multi-label text classification with BERT/ALBERT, using seq2seq, attention, and beam search
Stars: ✭ 26 (-93.4%)
Mutual labels:  seq2seq, attention
chatbot
A Chinese chatbot based on deep learning, with a detailed tutorial and thoroughly commented code; a good choice for learning.
Stars: ✭ 94 (-76.14%)
Mutual labels:  seq2seq, attention
Relation-Extraction-Transformer
NLP: Relation extraction with position-aware self-attention transformer
Stars: ✭ 63 (-84.01%)
Mutual labels:  transformer, attention
RNNSearch
An implementation of attention-based neural machine translation using PyTorch
Stars: ✭ 43 (-89.09%)
Mutual labels:  seq2seq, attention
Kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition.
Stars: ✭ 190 (-51.78%)
Mutual labels:  seq2seq, transformer
Speech Transformer
A PyTorch implementation of Speech Transformer, an end-to-end ASR model using the Transformer network, for Mandarin Chinese.
Stars: ✭ 565 (+43.4%)
Mutual labels:  attention, transformer
Tensorflow Ml Nlp
Natural Language Processing with TensorFlow and Machine Learning (from logistic regression to a Transformer chatbot)
Stars: ✭ 176 (-55.33%)
Mutual labels:  seq2seq, transformer
Deeplearning Nlp Models
A small, interpretable codebase containing re-implementations of a few "deep" NLP models in PyTorch, with Colab notebooks to run on GPUs. Models: word2vec, CNNs, Transformer, GPT.
Stars: ✭ 64 (-83.76%)
Mutual labels:  attention, transformer
Pointer Networks Experiments
Sorting numbers with pointer networks
Stars: ✭ 53 (-86.55%)
Mutual labels:  attention, seq2seq
Asr
Stars: ✭ 54 (-86.29%)
Mutual labels:  seq2seq, transformer
Deep Time Series Prediction
Seq2Seq, Bert, Transformer, WaveNet for time series prediction.
Stars: ✭ 183 (-53.55%)
Mutual labels:  attention, seq2seq
Transformers.jl
Julia implementation of Transformer models
Stars: ✭ 173 (-56.09%)
Mutual labels:  attention, transformer
Jddc solution 4th
4th-place solution in the 2018 JDDC competition
Stars: ✭ 235 (-40.36%)
Mutual labels:  attention, transformer
Medical Transformer
Pytorch Code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation"
Stars: ✭ 153 (-61.17%)
Mutual labels:  attention, transformer
Awesome Chatbot
Awesome chatbot projects, corpora, papers, and tutorials, including Chinese chatbots.
Stars: ✭ 1,785 (+353.05%)
Mutual labels:  tutorial, seq2seq
Machine Learning
My attempts in the world of ML/DL.
Stars: ✭ 78 (-80.2%)
Mutual labels:  tutorial, attention
deep-molecular-optimization
Molecular optimization that captures chemists' intuition using seq2seq with attention and the Transformer
Stars: ✭ 60 (-84.77%)
Mutual labels:  transformer, seq2seq
Visual-Transformer-Paper-Summary
Summary of Transformer applications for computer vision tasks.
Stars: ✭ 51 (-87.06%)
Mutual labels:  transformer, attention
Encoder decoder
Four styles of encoder-decoder model implemented with Python, Theano, Keras, and Seq2Seq
Stars: ✭ 269 (-31.73%)
Mutual labels:  attention, seq2seq
tensorflow-ml-nlp-tf2
Practice materials for Natural Language Processing with TensorFlow 2 and Machine Learning (from logistic regression to BERT and GPT-3)
Stars: ✭ 245 (-37.82%)
Mutual labels:  transformer, seq2seq
TRAR-VQA
[ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (-87.56%)
Mutual labels:  transformer, attention
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (-85.53%)
Mutual labels:  transformer, attention
kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition leveraging PyTorch and Hydra.
Stars: ✭ 456 (+15.74%)
Mutual labels:  transformer, seq2seq
Seq2seq Summarizer
Pointer-generator reinforced seq2seq summarization in PyTorch
Stars: ✭ 306 (-22.34%)
Mutual labels:  attention, seq2seq
visualization
A collection of visualization functions
Stars: ✭ 189 (-52.03%)
Mutual labels:  transformer, attention
Transformer Temporal Tagger
Code and data from the paper "BERT Got a Date: Introducing Transformers to Temporal Tagging"
Stars: ✭ 55 (-86.04%)
Mutual labels:  transformer, seq2seq
pytorch-transformer-chatbot
A simple chitchat chatbot using the Transformer API introduced in PyTorch v1.2
Stars: ✭ 44 (-88.83%)
Mutual labels:  transformer, seq2seq
transformer
Neutron: a PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (-84.77%)
Mutual labels:  transformer, seq2seq
ai challenger 2018 sentiment analysis
Fine-grained Sentiment Analysis of User Reviews --- AI CHALLENGER 2018
Stars: ✭ 16 (-95.94%)
Mutual labels:  transformer, attention
Base-On-Relation-Method-Extract-News-DA-RNN-Model-For-Stock-Prediction--Pytorch
A dual-stage attention mechanism model based on a relational news extraction method for stock prediction
Stars: ✭ 33 (-91.62%)
Mutual labels:  seq2seq, attention
Seq2seqchatbots
A wrapper around tensor2tensor to flexibly train, interact, and generate data for neural chatbots.
Stars: ✭ 466 (+18.27%)
Mutual labels:  seq2seq, transformer
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+151.27%)
Mutual labels:  seq2seq, transformer
tensorflow-chatbot-chinese
Web chatbot: a TensorFlow implementation of a seq2seq model with Bahdanau attention and pretrained Word2Vec embeddings
Stars: ✭ 50 (-87.31%)
Mutual labels:  seq2seq, attention
Embedding
Embedding model code and a summary of study notes
Stars: ✭ 25 (-93.65%)
Mutual labels:  transformer, seq2seq
Keras Transformer
Transformer implemented in Keras
Stars: ✭ 273 (-30.71%)
Mutual labels:  attention, transformer
Transformer Tensorflow
TensorFlow implementation of "Attention Is All You Need" (June 2017)
Stars: ✭ 319 (-19.04%)
Mutual labels:  attention, transformer
1-60 of 1584 similar projects