
MorvanZhou / NLP-Tutorials

License: MIT
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com

Programming Languages

python

Projects that are alternatives to or similar to NLP-Tutorials

Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+767.51%)
Mutual labels:  tutorial, attention, seq2seq, transformer
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+3.55%)
Mutual labels:  attention, seq2seq, transformer
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-92.89%)
Mutual labels:  transformer, seq2seq, attention
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+2411.42%)
Mutual labels:  tutorial, attention, transformer
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (-73.1%)
Mutual labels:  attention, seq2seq, transformer
Text Classification Models Pytorch
Implementation of State-of-the-art Text Classification Models in Pytorch
Stars: ✭ 379 (-3.81%)
Mutual labels:  attention, seq2seq, transformer
Relation-Extraction-Transformer
NLP: Relation extraction with position-aware self-attention transformer
Stars: ✭ 63 (-84.01%)
Mutual labels:  transformer, attention
visualization
a collection of visualization functions
Stars: ✭ 189 (-52.03%)
Mutual labels:  transformer, attention
Embedding
A summary of Embedding model code and study notes
Stars: ✭ 25 (-93.65%)
Mutual labels:  transformer, seq2seq
pytorch-transformer-chatbot
A simple chitchat chatbot built with the Transformer API introduced in PyTorch v1.2
Stars: ✭ 44 (-88.83%)
Mutual labels:  transformer, seq2seq
NLP-paper
🎨 NLP (Natural Language Processing) tutorials 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-94.16%)
Mutual labels:  transformer, seq2seq
Base-On-Relation-Method-Extract-News-DA-RNN-Model-For-Stock-Prediction--Pytorch
A dual-stage attention model based on a relational news-extraction method for stock prediction
Stars: ✭ 33 (-91.62%)
Mutual labels:  seq2seq, attention
deep-molecular-optimization
Molecular optimization by capturing chemist’s intuition using the Seq2Seq with attention and the Transformer
Stars: ✭ 60 (-84.77%)
Mutual labels:  transformer, seq2seq
classifier multi label seq2seq attention
Multi-label text classification with BERT/ALBERT, seq2seq, attention, and beam search
Stars: ✭ 26 (-93.4%)
Mutual labels:  seq2seq, attention
RNNSearch
An implementation of attention-based neural machine translation using Pytorch
Stars: ✭ 43 (-89.09%)
Mutual labels:  seq2seq, attention
chatbot
A Chinese chatbot based on deep learning, with a detailed tutorial and thoroughly commented code; a good choice for learning.
Stars: ✭ 94 (-76.14%)
Mutual labels:  seq2seq, attention
transformer
Neutron: A pytorch based implementation of Transformer and its variants.
Stars: ✭ 60 (-84.77%)
Mutual labels:  transformer, seq2seq
Encoder decoder
Four styles of encoder decoder model by Python, Theano, Keras and Seq2Seq
Stars: ✭ 269 (-31.73%)
Mutual labels:  attention, seq2seq
ai challenger 2018 sentiment analysis
Fine-grained Sentiment Analysis of User Reviews --- AI CHALLENGER 2018
Stars: ✭ 16 (-95.94%)
Mutual labels:  transformer, attention
Keras Transformer
Transformer implemented in Keras
Stars: ✭ 273 (-30.71%)
Mutual labels:  attention, transformer

Natural Language Processing Tutorial

The tutorial in Chinese can be found at mofanpy.com.

This repo includes many simple implementations of models in Natural Language Processing (NLP).

All code implementations in this tutorial are organized as follows:

  1. Search Engine
  2. Understand Word (W2V)
  3. Understand Sentence (Seq2Seq)
  4. All about Attention
  5. Pretrained Models

Thanks to @W1Fl for contributing simplified Keras code in simple_realize.

Installation

$ git clone https://github.com/MorvanZhou/NLP-Tutorials
$ cd NLP-Tutorials/
$ sudo pip3 install -r requirements.txt

TF-IDF

TF-IDF numpy code

TF-IDF short sklearn code

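For a quick sense of what those scripts compute, here is a minimal TF-IDF sketch using scikit-learn's TfidfVectorizer (toy documents of my own, not the repo's exact code):

```python
# Minimal TF-IDF sketch with scikit-learn (illustrative only; the repo also
# includes a pure-numpy version that computes the term weights by hand).
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "it is a good day, I like to stay here",
    "I am happy to be here",
    "it is sunny today",
]
vectorizer = TfidfVectorizer()
tf_idf = vectorizer.fit_transform(docs)      # sparse [n_docs, n_terms] matrix
print(sorted(vectorizer.vocabulary_)[:5])    # a few vocabulary terms
print(tf_idf.toarray().round(2))             # each row is a document vector
```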

Word2Vec

Efficient Estimation of Word Representations in Vector Space

Skip-Gram code

CBOW code

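As a rough illustration of what both models learn from, the sketch below (toy corpus, not from the repo) shows how skip-gram turns a sliding window into (center, context) training pairs:

```python
# Hedged sketch: building skip-gram (center, context) pairs with a window of 2.
corpus = ["i like deep learning", "i like nlp", "i enjoy flying"]
window = 2
pairs = []
for sentence in corpus:
    words = sentence.split()
    for i, center in enumerate(words):
        for j in range(max(0, i - window), min(len(words), i + window + 1)):
            if j != i:
                pairs.append((center, words[j]))   # predict context from center
print(pairs[:6])
```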

Seq2Seq

Sequence to Sequence Learning with Neural Networks

Seq2Seq code

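A minimal encoder-decoder sketch with tf.keras; the toy sizes and layer choices are my assumptions, not the repo's exact architecture:

```python
# Hedged sketch: the encoder compresses the source sentence into a final LSTM
# state, which then initialises the decoder.
from tensorflow import keras

vocab_size, emb_dim, units = 100, 16, 32            # toy sizes

enc_in = keras.Input(shape=(None,), dtype="int32")
enc_emb = keras.layers.Embedding(vocab_size, emb_dim)(enc_in)
_, state_h, state_c = keras.layers.LSTM(units, return_state=True)(enc_emb)

dec_in = keras.Input(shape=(None,), dtype="int32")
dec_emb = keras.layers.Embedding(vocab_size, emb_dim)(dec_in)
dec_seq = keras.layers.LSTM(units, return_sequences=True)(
    dec_emb, initial_state=[state_h, state_c])
logits = keras.layers.Dense(vocab_size)(dec_seq)    # next-token scores per step

model = keras.Model([enc_in, dec_in], logits)
model.compile("adam", keras.losses.SparseCategoricalCrossentropy(from_logits=True))
```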

CNNLanguageModel

Convolutional Neural Networks for Sentence Classification

CNN language model code

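A rough tf.keras sketch of the core idea in the paper: 1-D convolutions over word embeddings followed by max-over-time pooling (sizes are illustrative, not the repo's settings):

```python
# Hedged sketch: Conv1D filters slide over word embeddings; GlobalMaxPooling1D
# keeps the strongest response of each filter ("max-over-time" pooling).
import numpy as np
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Embedding(input_dim=100, output_dim=16),        # token ids -> vectors
    keras.layers.Conv1D(filters=32, kernel_size=3, activation="relu"),
    keras.layers.GlobalMaxPooling1D(),
    keras.layers.Dense(2, activation="softmax"),                  # class probabilities
])
fake_batch = np.random.randint(0, 100, size=(4, 20))              # 4 sentences of 20 ids
print(model(fake_batch).shape)                                    # (4, 2)
```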

Seq2SeqAttention

Effective Approaches to Attention-based Neural Machine Translation

Seq2Seq Attention code

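The heart of this paper is a score between the current decoder state and every encoder state; below is a hedged numpy sketch of the "general" score variant, with random weights standing in for learned ones:

```python
# Hedged sketch of Luong "general" attention: score(h_t, h_s) = h_t^T W_a h_s,
# softmax over source positions, then a weighted sum of encoder states (context).
import numpy as np

S, d = 5, 8                            # 5 encoder steps, hidden size 8 (toy)
h_t = np.random.randn(1, d)            # current decoder hidden state
h_s = np.random.randn(S, d)            # encoder hidden states
W_a = np.random.randn(d, d)            # would be learned; random here

scores = h_t @ W_a @ h_s.T                                          # [1, S]
align = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)  # softmax
context = align @ h_s                                                # [1, d]
print(align.round(2), context.shape)
```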

Transformer

Attention Is All You Need

Transformer code

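At the core of the paper is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V; here is a small numpy sketch with illustrative shapes:

```python
# Hedged numpy sketch of scaled dot-product attention from "Attention Is All You Need".
import numpy as np

def scaled_dot_product_attention(q, k, v):
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d_k)     # [..., len_q, len_k]
    scores -= scores.max(axis=-1, keepdims=True)        # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)      # softmax over keys
    return weights @ v, weights

q = k = v = np.random.randn(2, 4, 8)     # batch=2, seq_len=4, d_k=8 (self-attention)
out, attn = scaled_dot_product_attention(q, k, v)
print(out.shape, attn.shape)             # (2, 4, 8) (2, 4, 4)
```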

ELMO

Deep contextualized word representations

ELMO code

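A rough tf.keras sketch of ELMO's core ingredient: a bidirectional recurrent language model whose per-token hidden states serve as contextual word vectors (single layer and toy sizes are my simplification):

```python
# Hedged sketch: a bidirectional LSTM over token embeddings; each output vector
# depends on the whole sentence, unlike a static Word2Vec embedding.
from tensorflow import keras

ids = keras.Input(shape=(None,), dtype="int32")
emb = keras.layers.Embedding(100, 16)(ids)
ctx = keras.layers.Bidirectional(keras.layers.LSTM(32, return_sequences=True))(emb)
elmo_like = keras.Model(ids, ctx)   # ctx[:, t, :] is a contextual vector for token t
```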

GPT

Improving Language Understanding by Generative Pre-Training

GPT code

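What makes GPT generative is the causal (look-ahead) mask: each position may only attend to earlier positions. A tiny numpy sketch of that mask:

```python
# Hedged sketch: GPT's causal mask; 1 marks a blocked (future) position.
import numpy as np

seq_len = 5
mask = np.triu(np.ones((seq_len, seq_len)), k=1)
print(mask)
# in the model, mask * -1e9 is added to the attention scores before the softmax,
# so future tokens receive ~0 attention weight
```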

BERT

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

BERT code

My new attempt: BERT with a window mask

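BERT's pre-training objective hides a fraction of the input tokens and asks the model to recover them; below is a hedged, simplified sketch of that masking step (the 15% rate follows the paper, the tokens are made up, and the paper's 80/10/10 replacement rule is omitted):

```python
# Hedged sketch of BERT-style masked language modelling: replace ~15% of tokens
# with [MASK] and keep the originals as prediction targets.
import random

tokens = ["the", "cat", "sat", "on", "the", "mat"]
masked, targets = [], []
for i, tok in enumerate(tokens):
    if random.random() < 0.15:
        masked.append("[MASK]")
        targets.append((i, tok))     # the model must predict tok at position i
    else:
        masked.append(tok)
print(masked, targets)
```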