
amanjeetsahu / Natural Language Processing Specialization

This repo contains my coursework, assignments, and slides for the Natural Language Processing Specialization by deeplearning.ai on Coursera.

Projects that are alternatives to, or similar to, the Natural Language Processing Specialization

Coursera Natural Language Processing Specialization
Programming assignments from all courses in the Coursera Natural Language Processing Specialization offered by deeplearning.ai.
Stars: ✭ 39 (-74.17%)
Mutual labels:  jupyter-notebook, course, coursera, natural-language-processing, nlp-machine-learning, natural-language-understanding
Nlp Conference Compendium
Compendium of the resources available from top NLP conferences.
Stars: ✭ 349 (+131.13%)
Mutual labels:  natural-language-processing, nlp-machine-learning, natural-language-understanding, natural-language-generation
Course Computational Literary Analysis
Course materials for Introduction to Computational Literary Analysis, taught at UC Berkeley in Summer 2018, 2019, and 2020, and at Columbia University in Fall 2020.
Stars: ✭ 74 (-50.99%)
Mutual labels:  jupyter-notebook, course, course-materials, natural-language-processing
Ludwig
Data-centric declarative deep learning framework
Stars: ✭ 8,018 (+5209.93%)
Mutual labels:  natural-language-processing, natural-language-understanding, natural-language-generation
Tensorflow In Practice Specialization
DeepLearning.AI TensorFlow Developer Professional Certificate Specialization
Stars: ✭ 29 (-80.79%)
Mutual labels:  jupyter-notebook, coursera, natural-language-processing
Practical dl
DL course co-developed by YSDA, HSE and Skoltech
Stars: ✭ 1,006 (+566.23%)
Mutual labels:  jupyter-notebook, course, course-materials
Speech Emotion Analyzer
A neural network model capable of detecting five different emotions from male/female speech audio. (Deep Learning, NLP, Python)
Stars: ✭ 633 (+319.21%)
Mutual labels:  jupyter-notebook, natural-language-processing, natural-language-understanding
Python Tutorial Notebooks
Python tutorials as Jupyter Notebooks for NLP, ML, AI
Stars: ✭ 52 (-65.56%)
Mutual labels:  jupyter-notebook, natural-language-processing, natural-language-understanding
Convai Baseline
ConvAI baseline solution
Stars: ✭ 49 (-67.55%)
Mutual labels:  natural-language-processing, natural-language-understanding, natural-language-generation
Intent classifier
Stars: ✭ 67 (-55.63%)
Mutual labels:  natural-language-processing, nlp-machine-learning, natural-language-understanding
Codesearchnet
Datasets, tools, and benchmarks for representation learning of code.
Stars: ✭ 1,378 (+812.58%)
Mutual labels:  jupyter-notebook, natural-language-processing, nlp-machine-learning
Mongolian Bert
Pre-trained Mongolian BERT models
Stars: ✭ 21 (-86.09%)
Mutual labels:  jupyter-notebook, natural-language-processing, natural-language-understanding
Coursera
Quiz & Assignment of Coursera
Stars: ✭ 774 (+412.58%)
Mutual labels:  jupyter-notebook, coursera, natural-language-processing
This Word Does Not Exist
This Word Does Not Exist
Stars: ✭ 640 (+323.84%)
Mutual labels:  natural-language-processing, natural-language-understanding, natural-language-generation
Spark Nlp Models
Models and Pipelines for the Spark NLP library
Stars: ✭ 88 (-41.72%)
Mutual labels:  jupyter-notebook, natural-language-processing, natural-language-understanding
Textaugmentation Gpt2
Fine-tuned pre-trained GPT-2 for custom topic-specific text generation. Such a system can be used for text augmentation.
Stars: ✭ 104 (-31.13%)
Mutual labels:  natural-language-processing, nlp-machine-learning, natural-language-generation
Ppd599
USC urban data science course series with Python and Jupyter
Stars: ✭ 1,062 (+603.31%)
Mutual labels:  jupyter-notebook, course, course-materials
Courses
Quiz & Assignment of Coursera
Stars: ✭ 454 (+200.66%)
Mutual labels:  jupyter-notebook, coursera, natural-language-processing
Ml Mipt
Open Machine Learning course at MIPT
Stars: ✭ 480 (+217.88%)
Mutual labels:  jupyter-notebook, course, natural-language-processing
Deep Nlp Seminars
Materials for deep NLP course
Stars: ✭ 113 (-25.17%)
Mutual labels:  jupyter-notebook, natural-language-processing, natural-language-understanding

Natural Language Processing Specialization

Natural Language Processing (NLP) uses algorithms to understand and manipulate human language. This technology is one of the most broadly applied areas of machine learning. As AI continues to expand, so will the demand for professionals skilled at building models that analyze speech and language, uncover contextual patterns, and produce insights from text and audio. This Specialization will equip you with the state-of-the-art deep learning techniques needed to build cutting-edge NLP systems. By the end of this Specialization, you will be ready to design NLP applications that perform question-answering and sentiment analysis, create tools to translate languages and summarize text, and even build chatbots.

This Specialization is for students of machine learning or artificial intelligence, as well as software engineers looking for a deeper understanding of how NLP models work and how to apply them. Learners should have a working knowledge of machine learning, intermediate Python skills (including experience with a deep learning framework such as TensorFlow or Keras), and proficiency in calculus, linear algebra, and statistics. If you would like to brush up on these skills, we recommend the Deep Learning Specialization, offered by deeplearning.ai and taught by Andrew Ng.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.

Course 1: Classification and Vector Spaces in NLP

This is the first course of the Natural Language Processing Specialization.

Week 1: Logistic Regression for Sentiment Analysis of Tweets

  • Use a simple method to classify positive or negative sentiment in tweets

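As a rough illustration of this week's approach, here is a minimal sentiment classifier built with scikit-learn; the course assignment instead implements logistic regression from scratch in NumPy on hand-crafted tweet features, and the tweets below are made up.

```python
# Toy logistic-regression sentiment classifier (scikit-learn stand-in
# for the course's from-scratch NumPy implementation).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

tweets = ["I love this movie", "what a great day", "I hate waiting", "worst service ever"]
labels = [1, 1, 0, 0]  # 1 = positive, 0 = negative

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(tweets)

clf = LogisticRegression().fit(X, labels)
print(clf.predict(vectorizer.transform(["great movie, love it"])))  # -> [1]
```
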
Week 2: Naïve Bayes for Sentiment Analysis of Tweets

  • Use a more advanced model for sentiment analysis

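A comparable sketch with a Naïve Bayes classifier is shown below; the assignment computes the log prior and per-word log likelihoods by hand, whereas this uses scikit-learn's MultinomialNB on made-up tweets.

```python
# Toy Naive Bayes sentiment classifier; a library stand-in for the
# course's hand-computed log priors and log likelihoods.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

tweets = ["happy happy great fun", "i love it", "sad bad day", "terrible awful service"]
labels = [1, 1, 0, 0]  # 1 = positive, 0 = negative

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(tweets)

nb = MultinomialNB().fit(X, labels)
print(nb.predict(vectorizer.transform(["great happy day"])))  # -> [1]
```
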
Week 3: Vector Space Models

  • Use vector space models to discover relationships between words and use principal component analysis (PCA) to reduce the dimensionality of the vector space and visualize those relationships

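A minimal sketch of the PCA step, using random vectors as stand-ins for real word embeddings:

```python
# Project toy "word embeddings" down to 2-D with PCA so they can be
# plotted; the random vectors below stand in for real 300-d embeddings.
import numpy as np
from sklearn.decomposition import PCA

words = ["king", "queen", "apple", "orange"]
vectors = np.random.default_rng(0).normal(size=(len(words), 300))

coords = PCA(n_components=2).fit_transform(vectors)
for word, (x, y) in zip(words, coords):
    print(f"{word}: ({x:.2f}, {y:.2f})")
```
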
Week 4: Word Embeddings and Locality Sensitive Hashing for Machine Translation

  • Write a simple English-to-French translation algorithm using pre-computed word embeddings and locality sensitive hashing to relate words via approximate k-nearest neighbors search

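The core trick of locality-sensitive hashing with random hyperplanes fits in a few lines; the embeddings below are random placeholders, whereas the assignment uses pre-computed English and French word vectors.

```python
# Random-hyperplane LSH sketch: each vector gets one bit per plane
# (which side of the plane it falls on), and the bits form a bucket id.
# Nearby vectors tend to share buckets, so nearest-neighbor search only
# needs to compare vectors inside the same bucket.
import numpy as np

rng = np.random.default_rng(0)
dim, n_planes = 300, 10
planes = rng.normal(size=(n_planes, dim))

def hash_vector(v):
    bits = (planes @ v >= 0).astype(int)
    return int("".join(map(str, bits)), 2)

embeddings = {w: rng.normal(size=dim) for w in ["cat", "dog", "bonjour", "merci"]}
buckets = {}
for word, vec in embeddings.items():
    buckets.setdefault(hash_vector(vec), []).append(word)
print(buckets)
```
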
Course 2: Probabilistic Models in NLP

This is the second course of the Natural Language Processing Specialization.

Week 1: Auto-correct using Minimum Edit Distance

  • Create a simple auto-correct algorithm using minimum edit distance and dynamic programming

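The dynamic-programming recurrence behind minimum edit distance is short enough to show directly; the insert/delete/substitute costs below (1, 1, 2) are one common convention.

```python
# Minimum edit distance via dynamic programming: D[i][j] is the cheapest
# way to turn the first i characters of `source` into the first j of `target`.
def min_edit_distance(source, target, ins_cost=1, del_cost=1, sub_cost=2):
    m, n = len(source), len(target)
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        D[i][0] = D[i - 1][0] + del_cost
    for j in range(1, n + 1):
        D[0][j] = D[0][j - 1] + ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if source[i - 1] == target[j - 1] else sub_cost
            D[i][j] = min(D[i - 1][j] + del_cost,    # delete from source
                          D[i][j - 1] + ins_cost,    # insert into source
                          D[i - 1][j - 1] + cost)    # match or substitute
    return D[m][n]

print(min_edit_distance("play", "stay"))  # 4: two substitutions at cost 2 each
```
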
Week 2: Part-of-Speech (POS) Tagging

  • Apply the Viterbi algorithm for POS tagging, which is important for computational linguistics

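A toy Viterbi decoder over a two-tag HMM is sketched below; all probabilities are invented purely to illustrate the recursion, not taken from the course's tagged corpus.

```python
# Viterbi over a toy HMM with tags NN and VB: keep the best log-probability
# path ending in each tag, then backtrace the argmax pointers.
import numpy as np

tags = ["NN", "VB"]
trans = np.log(np.array([[0.7, 0.3],      # P(next tag | prev = NN)
                         [0.6, 0.4]]))    # P(next tag | prev = VB)
emit = {"dog":  np.log([0.9, 0.1]),       # P(word | NN), P(word | VB)
        "runs": np.log([0.2, 0.8])}
start = np.log([0.5, 0.5])

def viterbi(words):
    V = start + emit[words[0]]                     # best log-prob per tag so far
    back = []
    for word in words[1:]:
        scores = V[:, None] + trans + emit[word]   # prev_tag x next_tag
        back.append(scores.argmax(axis=0))
        V = scores.max(axis=0)
    path = [int(V.argmax())]
    for ptr in reversed(back):
        path.append(int(ptr[path[-1]]))
    return [tags[t] for t in reversed(path)]

print(viterbi(["dog", "runs"]))  # ['NN', 'VB']
```
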
Week 3: N-gram Language Models

  • Write a better auto-complete algorithm using an N-gram model (similar models are used for translation, determining the author of a text, and speech recognition)

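A bigram version of the idea, with add-one smoothing, fits in a few lines; the corpus here is a toy example.

```python
# Bigram language-model sketch: count bigrams, then rank candidate next
# words by their Laplace-smoothed conditional probability.
from collections import Counter

corpus = "i like green eggs i like ham i am sam".split()
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
vocab = len(unigrams)

def prob(next_word, prev_word):
    # P(next | prev) with add-one smoothing.
    return (bigrams[(prev_word, next_word)] + 1) / (unigrams[prev_word] + vocab)

# Auto-complete: rank candidate continuations of "i".
candidates = sorted(unigrams, key=lambda w: prob(w, "i"), reverse=True)
print(candidates[:3])  # 'like' ranks first in this toy corpus
```
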
Week 4: Word2Vec and Stochastic Gradient Descent

  • Write your own Word2Vec model: a neural network that computes word embeddings using the continuous bag-of-words (CBOW) architecture

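If you just want to experiment with CBOW embeddings, Gensim (4.x assumed here) offers a ready-made implementation; the course assignment builds the network from scratch in NumPy instead.

```python
# CBOW Word2Vec via Gensim (sg=0 selects CBOW); a library stand-in for
# the course's from-scratch NumPy model. The corpus is tiny and toy.
from gensim.models import Word2Vec

sentences = [["i", "am", "happy", "because", "i", "am", "learning"],
             ["natural", "language", "processing", "is", "fun"]]

model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=0)
print(model.wv["happy"][:5])                   # first 5 embedding dimensions
print(model.wv.most_similar("happy", topn=2))  # nearest neighbors in the toy space
```
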
Course 3: Sequence Models in NLP

This is the third course in the Natural Language Processing Specialization.

Week 1: Sentiment with Neural Nets

  • Train a neural network with GloVe word embeddings to perform sentiment analysis of tweets

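The course builds this classifier in Trax; a roughly equivalent Keras sketch is shown below, where in a real run the Embedding layer's weights would be initialized from GloVe vectors (all sizes are placeholders).

```python
# Keras sketch of a small embedding-based sentiment classifier; GloVe
# vectors would be loaded into the Embedding weights in a real run.
import tensorflow as tf

vocab_size, embed_dim, max_len = 10_000, 100, 40   # placeholder sizes
model = tf.keras.Sequential([
    tf.keras.Input(shape=(max_len,), dtype="int32"),
    tf.keras.layers.Embedding(vocab_size, embed_dim),
    tf.keras.layers.GlobalAveragePooling1D(),        # average the word vectors
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # P(positive)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```
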
Week 2: Language Generation Models

  • Generate synthetic Shakespeare text using a Gated Recurrent Unit (GRU) language model

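A character-level GRU language model has roughly the following shape, shown here in Keras as a stand-in for the course's Trax model; the vocabulary size and layer widths are illustrative.

```python
# Character-level GRU language model: predict logits over the next
# character at every position of the input sequence.
import tensorflow as tf

vocab_size = 256   # e.g. raw byte ids
model = tf.keras.Sequential([
    tf.keras.Input(shape=(None,), dtype="int32"),
    tf.keras.layers.Embedding(vocab_size, 128),
    tf.keras.layers.GRU(256, return_sequences=True),
    tf.keras.layers.Dense(vocab_size),               # next-character logits
])
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
model.summary()
```
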
Week 3: Named Entity Recognition (NER)

  • Train a recurrent neural network to perform NER using LSTMs with linear layers

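Structurally, the tagger is an embedding layer, an LSTM, and a per-token linear layer; a Keras sketch is below (the course implements it in Trax, and the sizes and tag count are placeholders).

```python
# LSTM-based NER tagger sketch: one logit per NER tag at every token.
import tensorflow as tf

vocab_size, n_tags = 35_000, 17   # placeholder vocabulary and tag-set sizes
model = tf.keras.Sequential([
    tf.keras.Input(shape=(None,), dtype="int32"),
    tf.keras.layers.Embedding(vocab_size, 50),
    tf.keras.layers.LSTM(50, return_sequences=True),
    tf.keras.layers.Dense(n_tags),                   # per-token tag logits
])
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
model.summary()
```
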
Week 4: Siamese Networks

  • Use so-called ‘Siamese’ LSTM models to compare questions in a corpus and identify those that are worded differently but have the same meaning

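The key point of a Siamese network is that both questions pass through the same encoder; the Keras sketch below shows that weight sharing and scores a pair by cosine similarity, while the course version is written in Trax and trained with a triplet loss.

```python
# Siamese sketch: one shared LSTM encoder applied to both questions,
# with cosine similarity of the two encodings as the duplicate score.
import tensorflow as tf

vocab_size, embed_dim, units = 40_000, 128, 128      # placeholder sizes
encoder = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embed_dim),
    tf.keras.layers.LSTM(units),
])

q1 = tf.keras.Input(shape=(None,), dtype="int32")
q2 = tf.keras.Input(shape=(None,), dtype="int32")
v1, v2 = encoder(q1), encoder(q2)                    # same weights for both inputs
similarity = tf.keras.layers.Dot(axes=1, normalize=True)([v1, v2])  # cosine similarity
model = tf.keras.Model(inputs=[q1, q2], outputs=similarity)
model.summary()
```
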
Course 4: Attention Models in NLP

This is the fourth course in the Natural Language Processing Specialization.

Week 1: Neural Machine Translation with Attention

  • Translate complete English sentences into French using an encoder/decoder attention model

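The mechanism at the heart of this model is scaled dot-product attention; here is a NumPy sketch with random queries, keys, and values standing in for real encoder/decoder activations.

```python
# Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                              # weighted sum of values

Q = np.random.rand(2, 4)   # 2 decoder positions, depth 4
K = np.random.rand(3, 4)   # 3 encoder positions
V = np.random.rand(3, 4)
print(scaled_dot_product_attention(Q, K, V).shape)  # (2, 4)
```
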
Week 2: Summarization with Transformer Models

  • Build a transformer model to summarize text

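The course trains a transformer summarizer from scratch in Trax; for a quick experiment with a pre-trained model, the Hugging Face pipeline below is a convenient stand-in.

```python
# Summarization with a pre-trained transformer via Hugging Face's pipeline
# (downloads a default summarization model on first use).
from transformers import pipeline

summarizer = pipeline("summarization")
article = ("Natural language processing uses algorithms to understand and "
           "manipulate human language, and is one of the most broadly "
           "applied areas of machine learning.")
print(summarizer(article, max_length=30, min_length=5)[0]["summary_text"])
```
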
Week 3: Question-Answering with Transformer Models

  • Use T5 and BERT models to perform question answering

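Similarly, extractive question answering with a pre-trained BERT-style model can be tried in a few lines via Hugging Face; the course assignment instead works with T5 and BERT in Trax.

```python
# Extractive QA with a pre-trained transformer: the model selects the
# answer span inside the supplied context.
from transformers import pipeline

qa = pipeline("question-answering")
result = qa(question="Who teaches the specialization?",
            context="The specialization is taught by Younes Bensouda Mourri "
                    "and Łukasz Kaiser.")
print(result["answer"])
```
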
Week 4: Chatbots with a Reformer Model

  • Build a chatbot using a Reformer model

Specialization Completion Certificate

Certificate
