
sinanuozdemir / oreilly-bert-nlp

Licence: other
This repository contains code for the O'Reilly Live Online Training for BERT

Programming Languages

Jupyter Notebook
11667 projects

Projects that are alternatives of or similar to oreilly-bert-nlp

backprop
Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (+1105.26%)
Mutual labels:  transformers, transfer-learning, bert
Haystack
🔍 Haystack is an open source NLP framework that leverages Transformer models. It enables developers to implement production-ready neural search, question answering, semantic document search and summarization for a wide range of applications.
Stars: ✭ 3,409 (+17842.11%)
Mutual labels:  transformers, transfer-learning, bert
wechsel
Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Stars: ✭ 39 (+105.26%)
Mutual labels:  transformers, transfer-learning, bert
ParsBigBird
Persian BERT for long-range sequences
Stars: ✭ 58 (+205.26%)
Mutual labels:  transformers, transfer-learning, bert
bangla-bert
Bangla-Bert is a pretrained BERT model for the Bengali language
Stars: ✭ 41 (+115.79%)
Mutual labels:  transformers, bert
OpenDialog
An open-source package for a Chinese open-domain conversational chatbot (a Chinese chit-chat dialogue system with one-click deployment as a WeChat chatbot)
Stars: ✭ 94 (+394.74%)
Mutual labels:  transformers, bert
nlp workshop odsc europe20
Extensive tutorials for the Advanced NLP Workshop at the Open Data Science Conference Europe 2020. We leverage machine learning, deep learning and deep transfer learning to solve popular NLP tasks including NER, classification, recommendation / information retrieval, summarization, language translation, Q&A and T…
Stars: ✭ 127 (+568.42%)
Mutual labels:  transformers, transfer-learning
HugsVision
HugsVision is an easy-to-use HuggingFace wrapper for state-of-the-art computer vision
Stars: ✭ 154 (+710.53%)
Mutual labels:  transformers, bert
policy-data-analyzer
Building a model to recognize incentives for landscape restoration in environmental policies from Latin America, the US and India. Bringing NLP to the world of policy analysis through an extensible framework that includes scraping, preprocessing, active learning and text analysis pipelines.
Stars: ✭ 22 (+15.79%)
Mutual labels:  transformers, bert
Tokenizers
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
Stars: ✭ 5,077 (+26621.05%)
Mutual labels:  transformers, bert
Spark Nlp
State of the Art Natural Language Processing
Stars: ✭ 2,518 (+13152.63%)
Mutual labels:  transformers, bert
classy
classy is a simple-to-use library for building high-performance Machine Learning models in NLP.
Stars: ✭ 61 (+221.05%)
Mutual labels:  transformers, bert
text2text
Text2Text: Cross-lingual natural language processing and generation toolkit
Stars: ✭ 188 (+889.47%)
Mutual labels:  transformers, bert
erc
Emotion recognition in conversation
Stars: ✭ 34 (+78.95%)
Mutual labels:  transformers, bert
text2class
Multi-class text categorization using state-of-the-art pre-trained contextualized language models, e.g. BERT
Stars: ✭ 15 (-21.05%)
Mutual labels:  transformers, bert
Text and Audio classification with Bert
Text classification for Turkish texts with BERT
Stars: ✭ 34 (+78.95%)
Mutual labels:  transformers, bert
Clue
Chinese Language Understanding Evaluation Benchmark: datasets, baselines, pre-trained models, corpus and leaderboard
Stars: ✭ 2,425 (+12663.16%)
Mutual labels:  transformers, bert
Pytorch Sentiment Analysis
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (+16789.47%)
Mutual labels:  transformers, bert
gpl
Powerful unsupervised domain adaptation method for dense retrieval. Requires only unlabeled corpus and yields massive improvement: "GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval" https://arxiv.org/abs/2112.07577
Stars: ✭ 216 (+1036.84%)
Mutual labels:  transformers, bert
robo-vln
PyTorch code for the ICRA '21 paper: "Hierarchical Cross-Modal Agent for Robotics Vision-and-Language Navigation"
Stars: ✭ 34 (+78.95%)
Mutual labels:  transformers, bert

BERT Transformer Architecture for NLP

This repository contains code for the O'Reilly Live Online Training for BERT

This training will focus on how BERT is used for a wide variety of NLP tasks, including text classification, question answering, and machine translation. The training will begin with an introduction to the necessary concepts, including language models and transformers, and then build on those concepts to introduce the BERT architecture. We will then move into how BERT is architected as a language model that can be applied to multiple natural language processing tasks, with hands-on examples of fine-tuning pre-trained BERT models.
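
As a taste of the hands-on portion, the sketch below fine-tunes a pre-trained BERT model for binary text classification. It assumes the Hugging Face `transformers` and `datasets` libraries; the IMDB dataset, hyperparameters, and output directory are illustrative placeholders, not necessarily the exact setup used in the course notebooks.

```python
# Minimal sketch: fine-tuning BERT for text classification.
# Assumes: pip install transformers datasets (dataset and hyperparameters are illustrative).
from transformers import (BertTokenizerFast, BertForSequenceClassification,
                          Trainer, TrainingArguments)
from datasets import load_dataset

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# IMDB is used here only as an example corpus.
dataset = load_dataset("imdb")

def tokenize(batch):
    # Truncate/pad each review to a fixed sequence length for BERT.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

encoded = dataset.map(tokenize, batched=True, remove_columns=["text"])

args = TrainingArguments(
    output_dir="bert-imdb",           # where checkpoints are written (illustrative)
    per_device_train_batch_size=16,
    num_train_epochs=1,
    learning_rate=2e-5,               # a typical fine-tuning learning rate for BERT
)

trainer = Trainer(
    model=model,
    args=args,
    # Small subsets keep the example quick to run; use the full splits for real training.
    train_dataset=encoded["train"].shuffle(seed=42).select(range(2000)),
    eval_dataset=encoded["test"].select(range(500)),
)
trainer.train()
```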

BERT is one of the most relevant NLP architectures today and it is closely related to other important NLP deep learning models like GPT-3. Both of these models are derived from the newly invented transformer architecture and represent an inflection point in how machines process language and context.

Notebook 1: [Introduction to BERT](Introduction%20to%20BERT.ipynb)

Notebook 2: [Pre-training and fine-tuning BERT](Fine-tuning%20BERT.ipynb)
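
For a sense of the pre-training objective covered in Notebook 2, the short sketch below runs BERT's masked-language-model head to fill in a masked token from bidirectional context. It assumes PyTorch and the Hugging Face `transformers` library; the model name and example sentence are illustrative only.

```python
# Minimal sketch: BERT's masked-language-model pre-training objective at inference time.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# Mask one token and let BERT predict it from the surrounding context.
text = "BERT is a [MASK] model for natural language processing."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the top-5 predicted vocabulary items.
mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top_ids = logits[0, mask_index].topk(5, dim=-1).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))  # candidate fillers for [MASK]
```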

Instructor

Sinan Ozdemir is currently the Director of Data Science at Directly, managing the AI and machine learning models that power the company’s intelligent customer support platform. Sinan is a former lecturer of Data Science at Johns Hopkins University and the author of multiple textbooks on data science and machine learning. Additionally, he is the founder of the recently acquired Kylie.ai, an enterprise-grade conversational AI platform with RPA capabilities. He holds a Master’s Degree in Pure Mathematics from Johns Hopkins University and is based in San Francisco, CA.
