
ibrahimjelliti / Deeplearning.ai Natural Language Processing Specialization

License: GPL-3.0
This repository contains my full work and notes from Coursera's Natural Language Processing (NLP) Specialization, taught by Younes Bensouda Mourri and Łukasz Kaiser and offered by deeplearning.ai.

Projects that are alternatives to or similar to Deeplearning.ai Natural Language Processing Specialization

Machine learning basics
Plain python implementations of basic machine learning algorithms
Stars: ✭ 3,557 (+652.01%)
Mutual labels:  jupyter-notebook, neural-networks, logistic-regression
Deep Math Machine Learning.ai
A blog about machine learning and deep learning algorithms and the math behind them, plus machine learning algorithms written from scratch.
Stars: ✭ 173 (-63.42%)
Mutual labels:  jupyter-notebook, neural-networks, logistic-regression
Deep Learning Specialization Coursera
Deep Learning Specialization by Andrew Ng on Coursera.
Stars: ✭ 483 (+2.11%)
Mutual labels:  jupyter-notebook, coursera, neural-networks
Deeplearning.ai
This repository contains personal notes and implementation code for the related courses offered by deeplearning.ai.
Stars: ✭ 181 (-61.73%)
Mutual labels:  jupyter-notebook, coursera, logistic-regression
Abstractive Summarization
Implementation of abstractive summarization using LSTM in the encoder-decoder architecture with local attention.
Stars: ✭ 128 (-72.94%)
Mutual labels:  jupyter-notebook, attention-mechanism, encoder-decoder
Coursera Deep Learning Specialization
Notes, programming assignments and quizzes from all courses within the Coursera Deep Learning specialization offered by deeplearning.ai: (i) Neural Networks and Deep Learning; (ii) Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization; (iii) Structuring Machine Learning Projects; (iv) Convolutional Neural Networks; (v) Sequence Models
Stars: ✭ 188 (-60.25%)
Mutual labels:  jupyter-notebook, coursera, neural-networks
Coursera Deep Learning Deeplearning.ai
(Completed) Lecture notes, programming assignments, and exercises for the NetEase Cloud Classroom micro-specialization "Deep Learning Engineer".
Stars: ✭ 344 (-27.27%)
Mutual labels:  jupyter-notebook, coursera
Amazon Forest Computer Vision
Amazon Forest Computer Vision: Satellite Image tagging code using PyTorch / Keras with lots of PyTorch tricks
Stars: ✭ 346 (-26.85%)
Mutual labels:  jupyter-notebook, neural-networks
Text summurization abstractive methods
Multiple implementations of abstractive text summarization, using Google Colab.
Stars: ✭ 359 (-24.1%)
Mutual labels:  jupyter-notebook, encoder-decoder
Start Machine Learning In 2020
A complete guide to start and improve in machine learning (ML) and artificial intelligence (AI) in 2021 without ANY background in the field, and to stay up-to-date with the latest news and state-of-the-art techniques!
Stars: ✭ 357 (-24.52%)
Mutual labels:  coursera, neural-networks
Probability
Probabilistic reasoning and statistical analysis in TensorFlow
Stars: ✭ 3,550 (+650.53%)
Mutual labels:  jupyter-notebook, neural-networks
Easy Deep Learning With Keras
Keras tutorial for beginners (using TF backend)
Stars: ✭ 367 (-22.41%)
Mutual labels:  jupyter-notebook, neural-networks
Deep Reinforcement Learning
Repo for the Deep Reinforcement Learning Nanodegree program
Stars: ✭ 4,012 (+748.2%)
Mutual labels:  jupyter-notebook, neural-networks
Tbd Nets
PyTorch implementation of "Transparency by Design: Closing the Gap Between Performance and Interpretability in Visual Reasoning"
Stars: ✭ 345 (-27.06%)
Mutual labels:  jupyter-notebook, neural-networks
Action Recognition Visual Attention
Action recognition using soft attention based deep recurrent neural networks
Stars: ✭ 350 (-26%)
Mutual labels:  jupyter-notebook, attention-mechanism
Supervisely
AI for everyone! 🎉 Neural networks, tools and a library we use in Supervisely
Stars: ✭ 332 (-29.81%)
Mutual labels:  jupyter-notebook, neural-networks
Tf 2.0 Hacks
Contains my explorations of TensorFlow 2.x
Stars: ✭ 369 (-21.99%)
Mutual labels:  jupyter-notebook, neural-networks
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (-13.11%)
Mutual labels:  jupyter-notebook, attention-mechanism
Neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
Stars: ✭ 400 (-15.43%)
Mutual labels:  neural-networks, encoder-decoder
Edward2
A simple probabilistic programming language.
Stars: ✭ 419 (-11.42%)
Mutual labels:  jupyter-notebook, neural-networks

My GAN Specialization repository


DeepLearning.ai NLP Specialization Courses Notes

This repository contains my personal notes on the DeepLearning.ai NLP Specialization courses.

The specialization consists of four courses, all available on Coursera:

  1. Natural Language Processing with Classification and Vector Spaces
  2. Natural Language Processing with Probabilistic Models
  3. Natural Language Processing with Sequence Models
  4. Natural Language Processing with Attention Models

About This Specialization (From the official NLP Specialization page)

  • Natural Language Processing (NLP) uses algorithms to understand and manipulate human language. This technology is one of the most broadly applied areas of machine learning. As AI continues to expand, so will the demand for professionals skilled at building models that analyze speech and language, uncover contextual patterns, and produce insights from text and audio.

  • By the end of this Specialization, you will be ready to design NLP applications that perform question-answering and sentiment analysis, create tools to translate languages and summarize text, and even build chatbots. These and other NLP applications are going to be at the forefront of the coming transformation to an AI-powered future.

  • This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.

Applied Learning Project

This Specialization will equip you with the state-of-the-art deep learning techniques needed to build cutting-edge NLP systems:

• Use logistic regression, naïve Bayes, and word vectors to implement sentiment analysis, complete analogies, and translate words, and use locality-sensitive hashing for approximate nearest neighbors (see the first sketch after this list).

• Use dynamic programming, hidden Markov models, and word embeddings to autocorrect misspelled words, autocomplete partial sentences, and identify part-of-speech tags for words (see the Viterbi sketch after this list).

• Use dense and recurrent neural networks, LSTMs, GRUs, and Siamese networks in TensorFlow and Trax to perform advanced sentiment analysis, text generation, named entity recognition, and to identify duplicate questions.

• Use encoder-decoder, causal, and self-attention to perform advanced machine translation of complete sentences, text summarization, and question-answering, and to build chatbots. Models covered include T5, BERT, Transformer, Reformer, and more. Enjoy!
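As a concrete taste of the first bullet, here is a minimal sketch of sentiment analysis with logistic regression trained by batch gradient descent. The word-count features, toy data, and function names below are invented for illustration; this is not the course's dataset or assignment code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_regression(X, y, lr=0.1, epochs=1000):
    """Batch gradient descent on the binary cross-entropy loss."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = sigmoid(X @ w)                  # predicted probability of "positive"
        w -= lr * X.T @ (p - y) / len(y)    # gradient step on the loss
    return w

# Toy features per tweet: [bias, positive-word count, negative-word count]
X = np.array([[1., 3., 0.],
              [1., 2., 1.],
              [1., 0., 3.],
              [1., 1., 2.]])
y = np.array([1., 1., 0., 0.])  # 1 = positive sentiment, 0 = negative

w = train_logistic_regression(X, y)
print((sigmoid(X @ w) > 0.5).astype(int))  # -> [1 1 0 0]
```

In the same spirit, the second bullet's part-of-speech tagging can be sketched as Viterbi decoding over a hidden Markov model. The two-tag model and all probabilities here are made up for the example; the course estimates them from tagged corpora.

```python
import numpy as np

tags = ["NN", "VB"]
pi = np.array([0.5, 0.5])             # P(first tag)
A = np.array([[0.6, 0.4],             # P(tag_t | tag_{t-1}), rows = previous tag
              [0.7, 0.3]])
B = {"time":  np.array([0.5, 0.1]),   # P(word | tag), one entry per tag
     "flies": np.array([0.1, 0.4])}

def viterbi(words):
    n, k = len(words), len(tags)
    score = np.zeros((n, k))            # best log-probability ending in each tag
    back = np.zeros((n, k), dtype=int)  # backpointers for path recovery
    score[0] = np.log(pi) + np.log(B[words[0]])
    for t in range(1, n):
        for j in range(k):
            cand = score[t - 1] + np.log(A[:, j]) + np.log(B[words[t]][j])
            back[t, j] = np.argmax(cand)
            score[t, j] = cand[back[t, j]]
    path = [int(np.argmax(score[-1]))]
    for t in range(n - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return [tags[i] for i in reversed(path)]

print(viterbi(["time", "flies"]))  # -> ['NN', 'VB'] with these toy numbers
```

Both sketches use plain NumPy, in the from-scratch spirit of the first two courses; the later courses build the sequence and attention models in TensorFlow and Trax.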

Usage

I share the assignment notebooks with code filled in by me and by contributors, structured as in the course (Course/Week); see the illustrative layout below. The assignment notebooks are subject to change over time.
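A hypothetical layout under that convention (the actual folder and notebook names in this repository may differ):

```
Course 1 - Natural Language Processing with Classification and Vector Spaces/
    Week 1/
        C1_W1_Assignment.ipynb
    Week 2/
        ...
Course 2 - Natural Language Processing with Probabilistic Models/
    Week 1/
        ...
```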

Connect with your mentors and fellow learners on Slack!

Once you have enrolled in the course, you are invited to join the Slack workspace for this specialization at deeplearningai-nlp.slack.com. This Slack workspace includes all courses of the specialization.

Contact Information

Stargazers over Time


Ibrahim Jelliti © 2020
