
graykode / Nlp Tutorial

License: MIT
Natural Language Processing Tutorial for Deep Learning Researchers

Programming Languages

Jupyter Notebook
Python

Projects that are alternatives to or similar to Nlp Tutorial

Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (-65.46%)
Mutual labels:  jupyter-notebook, tutorial, attention, transformer
Bertviz
Tool for visualizing attention in the Transformer model (BERT, GPT-2, Albert, XLNet, RoBERTa, CTRL, etc.)
Stars: ✭ 3,443 (-65.2%)
Mutual labels:  jupyter-notebook, natural-language-processing, transformer, bert
Pytorch Sentiment Analysis
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (-67.57%)
Mutual labels:  jupyter-notebook, tutorial, natural-language-processing, bert
Deeplearning Nlp Models
A small, interpretable codebase containing the re-implementation of a few "deep" NLP models in PyTorch. Colab notebooks to run with GPUs. Models: word2vec, CNNs, transformer, gpt.
Stars: ✭ 64 (-99.35%)
Mutual labels:  jupyter-notebook, attention, transformer
Mindspore Nlp Tutorial
Natural Language Processing Tutorial for MindSpore Users
Stars: ✭ 58 (-99.41%)
Mutual labels:  jupyter-notebook, tutorial, natural-language-processing
Multihead Siamese Nets
Implementation of Siamese Neural Networks built upon multihead attention mechanism for text semantic similarity task.
Stars: ✭ 144 (-98.54%)
Mutual labels:  jupyter-notebook, natural-language-processing, attention
Machine Learning Resources
A curated list of awesome machine learning frameworks, libraries, courses, books and many more.
Stars: ✭ 226 (-97.72%)
Mutual labels:  tutorial, paper, natural-language-processing
Jddc solution 4th
4th-place solution to the 2018 JDDC competition
Stars: ✭ 235 (-97.63%)
Mutual labels:  jupyter-notebook, attention, transformer
Deeptoxic
top 1% solution to toxic comment classification challenge on Kaggle.
Stars: ✭ 180 (-98.18%)
Mutual labels:  jupyter-notebook, tutorial, natural-language-processing
Vietnamese Electra
Electra pre-trained model using Vietnamese corpus
Stars: ✭ 55 (-99.44%)
Mutual labels:  jupyter-notebook, natural-language-processing, transformer
Question generation
Neural question generation using transformers
Stars: ✭ 356 (-96.4%)
Mutual labels:  jupyter-notebook, natural-language-processing, transformer
Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (-96.02%)
Mutual labels:  tutorial, attention, transformer
Bertqa Attention On Steroids
BertQA - Attention on Steroids
Stars: ✭ 112 (-98.87%)
Mutual labels:  jupyter-notebook, attention, transformer
Pytorch Pos Tagging
A tutorial on how to implement models for part-of-speech tagging using PyTorch and TorchText.
Stars: ✭ 96 (-99.03%)
Mutual labels:  jupyter-notebook, tutorial, natural-language-processing
Pytorch Question Answering
Important paper implementations for Question Answering using PyTorch
Stars: ✭ 154 (-98.44%)
Mutual labels:  jupyter-notebook, tutorial, natural-language-processing
Machine Learning
My Attempt(s) In The World Of ML/DL....
Stars: ✭ 78 (-99.21%)
Mutual labels:  jupyter-notebook, tutorial, attention
Transformers
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+463.34%)
Mutual labels:  natural-language-processing, transformer, bert
Transformers.jl
Julia Implementation of Transformer models
Stars: ✭ 173 (-98.25%)
Mutual labels:  natural-language-processing, attention, transformer
Code search
Code For Medium Article: "How To Create Natural Language Semantic Search for Arbitrary Objects With Deep Learning"
Stars: ✭ 436 (-95.59%)
Mutual labels:  jupyter-notebook, tutorial, natural-language-processing
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Currently included IWSLT pretrained models.
Stars: ✭ 411 (-95.85%)
Mutual labels:  jupyter-notebook, attention, transformer

nlp-tutorial

nlp-tutorial is a tutorial for those studying NLP (Natural Language Processing) with PyTorch. Most of the models are implemented in fewer than 100 lines of code (excluding comments and blank lines).

  • [08-14-2020] Old TensorFlow v1 code is archived in the archive folder. For beginner readability, only PyTorch version 1.0 or higher is supported.

Curriculum - (Example Purpose)

1. Basic Embedding Model

2. CNN(Convolutional Neural Network)

3. RNN(Recurrent Neural Network)

4. Attention Mechanism

5. Model based on Transformer
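As an illustration of the repository's fewer-than-100-lines philosophy, here is a minimal sketch of scaled dot-product attention, the core operation behind curriculum items 4 and 5. The actual tutorials use PyTorch; this standalone NumPy version is an illustrative assumption, not code from the repository:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V"""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d_k)  # (seq_q, seq_k) similarity scores
    weights = softmax(scores, axis=-1)              # each query's attention distribution
    return weights @ V, weights

# toy example: 3 query positions attending over 4 key/value positions
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out, attn = scaled_dot_product_attention(Q, K, V)
print(out.shape, attn.shape)  # (3, 8) (3, 4)
```

Each row of `attn` sums to 1, so the output is a weighted average of the value vectors; the Transformer-based models in item 5 stack this operation across multiple heads and layers.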

Dependencies

  • Python 3.5+
  • PyTorch 1.0.0+

Author

  • Tae Hwan Jung(Jeff Jung) @graykode
  • Author Email : [email protected]
  • Acknowledgements to mojitok for the NLP Research Internship.