sudharsan13296 / Getting Started With Google Bert

Build and train state-of-the-art natural language processing models using BERT

Projects that are alternatives to, or similar to, Getting Started With Google Bert

Bert Multitask Learning
BERT for Multitask Learning
Stars: ✭ 380 (+255.14%)
Mutual labels:  jupyter-notebook, transformer
Getting Things Done With Pytorch
Jupyter Notebook tutorials on solving real-world problems with Machine Learning & Deep Learning using PyTorch. Topics: Face detection with Detectron 2, Time Series anomaly detection with LSTM Autoencoders, Object Detection with YOLO v5, Build your first Neural Network, Time Series forecasting for Coronavirus daily cases, Sentiment Analysis with BERT.
Stars: ✭ 738 (+589.72%)
Mutual labels:  jupyter-notebook, transformer
Deepsvg
[NeurIPS 2020] Official code for the paper "DeepSVG: A Hierarchical Generative Network for Vector Graphics Animation". Includes a PyTorch library for deep learning with SVG data.
Stars: ✭ 403 (+276.64%)
Mutual labels:  jupyter-notebook, transformer
Demo Chinese Text Binary Classification With Bert
Stars: ✭ 276 (+157.94%)
Mutual labels:  jupyter-notebook, transformer
Indonesian Language Models
Indonesian language models and their usage
Stars: ✭ 64 (-40.19%)
Mutual labels:  jupyter-notebook, transformer
Dab
Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Stars: ✭ 294 (+174.77%)
Mutual labels:  jupyter-notebook, transformer
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've also included the playground.py file for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
Stars: ✭ 411 (+284.11%)
Mutual labels:  jupyter-notebook, transformer
Jddc solution 4th
4th-place solution to the 2018 JDDC competition
Stars: ✭ 235 (+119.63%)
Mutual labels:  jupyter-notebook, transformer
Deeplearning Nlp Models
A small, interpretable codebase containing the re-implementation of a few "deep" NLP models in PyTorch. Colab notebooks to run with GPUs. Models: word2vec, CNNs, transformer, gpt.
Stars: ✭ 64 (-40.19%)
Mutual labels:  jupyter-notebook, transformer
Vietnamese Electra
ELECTRA model pre-trained on a Vietnamese corpus
Stars: ✭ 55 (-48.6%)
Mutual labels:  jupyter-notebook, transformer
Transformer
Implementation of the Transformer model (originally from "Attention Is All You Need") applied to time series.
Stars: ✭ 273 (+155.14%)
Mutual labels:  jupyter-notebook, transformer
Smiles Transformer
Original implementation of the paper "SMILES Transformer: Pre-trained Molecular Fingerprint for Low Data Drug Discovery" by Shion Honda et al.
Stars: ✭ 86 (-19.63%)
Mutual labels:  jupyter-notebook, transformer
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+3094.39%)
Mutual labels:  jupyter-notebook, transformer
Question generation
Neural question generation using transformers
Stars: ✭ 356 (+232.71%)
Mutual labels:  jupyter-notebook, transformer
Bertviz
Tool for visualizing attention in the Transformer model (BERT, GPT-2, Albert, XLNet, RoBERTa, CTRL, etc.)
Stars: ✭ 3,443 (+3117.76%)
Mutual labels:  jupyter-notebook, transformer
Tsai
State-of-the-art deep learning with time series and sequences in PyTorch / fastai
Stars: ✭ 407 (+280.37%)
Mutual labels:  jupyter-notebook, transformer
Sttn
[ECCV'2020] STTN: Learning Joint Spatial-Temporal Transformations for Video Inpainting
Stars: ✭ 211 (+97.2%)
Mutual labels:  jupyter-notebook, transformer
Nn
🧑‍🏫 Over 50 implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠
Stars: ✭ 5,720 (+5245.79%)
Mutual labels:  jupyter-notebook, transformer
Gpt2 French
GPT-2 French demo | Démo française de GPT-2
Stars: ✭ 47 (-56.07%)
Mutual labels:  jupyter-notebook, transformer
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+9147.66%)
Mutual labels:  jupyter-notebook, transformer

Getting started with BERT

Build and train state-of-the-art natural language processing models using BERT

About the book

Book Cover

BERT (Bidirectional Encoder Representations from Transformers) has revolutionized the world of natural language processing (NLP) with promising results. This book is an introductory guide that will help you get to grips with Google's BERT architecture. Starting with a detailed explanation of the transformer architecture, this book will help you understand how the transformer's encoder and decoder work.
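
As a minimal illustration (not code from the book), the encoder-decoder structure described here can be instantiated directly with PyTorch's nn.Transformer module; the dimensions below match the original "base" configuration from the paper:

```python
# A minimal sketch of the encoder-decoder transformer in PyTorch;
# d_model=512, 8 heads, and 6 layers each match the original base model.
import torch
import torch.nn as nn

model = nn.Transformer(d_model=512, nhead=8,
                       num_encoder_layers=6, num_decoder_layers=6)

src = torch.rand(10, 32, 512)  # (source length, batch size, embedding dim)
tgt = torch.rand(20, 32, 512)  # (target length, batch size, embedding dim)

# The encoder builds contextual representations of the source; the decoder
# attends over them while processing the target sequence.
out = model(src, tgt)
print(out.shape)  # torch.Size([20, 32, 512])
```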

You'll explore the BERT architecture by learning how the BERT model is pre-trained, and how to use pre-trained BERT for downstream tasks by fine-tuning it for NLP tasks such as sentiment analysis and text summarization with the Hugging Face transformers library. As you advance, you'll learn about different variants of BERT such as ALBERT, RoBERTa, and ELECTRA, and look at SpanBERT, which is used for NLP tasks like question answering. You'll also cover simpler and faster variants of BERT based on knowledge distillation, such as DistilBERT and TinyBERT.
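
As a taste of that workflow, here is a minimal sketch (an illustration, not code from the book) of running inference with a fine-tuned BERT-style classifier through the transformers pipeline API; with no model argument, the pipeline downloads the library's default sentiment checkpoint, not necessarily the model fine-tuned in the book:

```python
# A hedged sketch of BERT-based sentiment analysis with Hugging Face
# transformers; the checkpoint is the pipeline's default.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("This book makes BERT easy to follow."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```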

The book takes you through multilingual BERT (M-BERT), XLM, and XLM-R in detail and then introduces you to Sentence-BERT, which is used for obtaining sentence representations. Finally, you'll discover domain-specific BERT models such as BioBERT and ClinicalBERT, and explore an interesting variant called VideoBERT.
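
For sentence representations, Sentence-BERT is packaged in the sentence-transformers library. A small sketch, assuming the commonly used all-MiniLM-L6-v2 checkpoint (an assumption for illustration, not a model named in the book):

```python
# Minimal Sentence-BERT sketch using the sentence-transformers library;
# "all-MiniLM-L6-v2" is a popular public checkpoint, assumed here.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(["How are you?", "How is it going?"])
print(embeddings.shape)  # (2, 384): one fixed-size vector per sentence
```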

Get the book


Clone the repo and run in Google Colab

1. A Primer on Transformers

2. Understanding the BERT model

3. Getting hands-on with BERT

4. BERT variants I - ALBERT, RoBERTa, ELECTRA, SpanBERT

5. BERT variants II - Based on knowledge distillation

6. Exploring BERTSUM for text summarization

7. Applying BERT to other languages

8. Exploring Sentence and Domain-Specific BERT

9. Understanding VideoBERT, BART, and more
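
Before working through the notebooks, you can sanity-check your environment with a minimal sketch in the spirit of chapter 3: load pre-trained BERT and extract contextual token embeddings (the standard bert-base-uncased checkpoint is assumed here for illustration):

```python
# A minimal sketch (not code from the book) of extracting contextual
# embeddings from pre-trained BERT with the Hugging Face transformers library.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("I love Paris", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One 768-dimensional vector per token, including [CLS] and [SEP].
print(outputs.last_hidden_state.shape)  # torch.Size([1, 5, 768])
```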
