Natural-Language-Processing-Deeplearning-ai

Repository of assignments completed for the Natural Language Processing Specialization offered by deeplearning.ai.

Course 1: Natural Language Processing with Classification and Vector Spaces

Week 1: Logistic Regression for Sentiment Analysis of Tweets

  • Use a simple machine-learning algorithm, logistic regression, to classify tweets as expressing positive or negative sentiment
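
A minimal NumPy sketch of the idea (not the assignment notebook): logistic regression fit by batch gradient descent on toy per-tweet features, loosely following the bias / positive-frequency / negative-frequency features the course extracts.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, theta, alpha=0.1, iters=5000):
    """Plain batch gradient descent for logistic regression."""
    m = X.shape[0]
    for _ in range(iters):
        h = sigmoid(X @ theta)                   # predicted probabilities
        theta -= (alpha / m) * (X.T @ (h - y))   # gradient of the cross-entropy loss
    return theta

# Toy features: [bias, positive-word count, negative-word count] per tweet
X = np.array([[1.0, 3.0, 0.0], [1.0, 0.0, 2.0], [1.0, 2.0, 1.0], [1.0, 0.0, 3.0]])
y = np.array([1.0, 0.0, 1.0, 0.0])               # 1 = positive sentiment
theta = gradient_descent(X, y, np.zeros(3))
print(sigmoid(X @ theta) > 0.5)                  # predicted labels
```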

Week 2: Naïve Bayes for Sentiment Analysis of Tweets

  • Apply Naïve Bayes, a probabilistic model, to the same tweet sentiment-analysis task
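
A compact sketch of word-level Naïve Bayes with Laplace smoothing on made-up toy tweets; the assignment applies the same idea to a real tweet corpus.

```python
import numpy as np
from collections import Counter

def train_naive_bayes(tweets, labels):
    """Count word frequencies per class and return the log-prior and per-word log-likelihood ratios."""
    pos = Counter(w for t, y in zip(tweets, labels) if y == 1 for w in t.split())
    neg = Counter(w for t, y in zip(tweets, labels) if y == 0 for w in t.split())
    vocab = set(pos) | set(neg)
    n_pos, n_neg, v = sum(pos.values()), sum(neg.values()), len(vocab)
    logprior = np.log(labels.count(1) / labels.count(0))
    loglik = {w: np.log((pos[w] + 1) / (n_pos + v)) - np.log((neg[w] + 1) / (n_neg + v))
              for w in vocab}
    return logprior, loglik

def predict(tweet, logprior, loglik):
    """Positive sentiment if the summed log-likelihood ratio is above zero."""
    return logprior + sum(loglik.get(w, 0.0) for w in tweet.split())

tweets = ["happy great day", "sad bad day", "great happy", "bad sad"]
labels = [1, 0, 1, 0]
logprior, loglik = train_naive_bayes(tweets, labels)
print(predict("happy day", logprior, loglik) > 0)   # True => positive
```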

Week 3: Vector Space Models

  • Use vector space models to discover relationships between words and use principal component analysis (PCA) to reduce the dimensionality of the vector space and visualize those relationships
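
The dimensionality-reduction step can be sketched in a few lines of NumPy: PCA via SVD of the mean-centred embedding matrix, plus the cosine similarity used to compare word vectors. The random toy embeddings below stand in for real pre-trained vectors.

```python
import numpy as np

def pca(X, n_components=2):
    """Project vectors onto their top principal components via SVD of the centred data."""
    X_centered = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return X_centered @ Vt[:n_components].T

def cosine_similarity(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Toy 5-dimensional "embeddings" for four words, reduced to 2-D for plotting
embeddings = np.random.rand(4, 5)
print(pca(embeddings, n_components=2).shape)        # (4, 2)
print(cosine_similarity(embeddings[0], embeddings[1]))
```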

Week 4: Word Embeddings and Locality Sensitive Hashing for Machine Translation

  • Write a simple English-to-French translation algorithm using pre-computed word embeddings and Locality Sensitive Hashing (LSH) to relate words via Approximate K-Nearest Neighbors search
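
A rough sketch of the LSH idea with random hyperplanes: vectors that fall on the same side of every hyperplane share a hash bucket, so approximate nearest-neighbour search only has to scan one bucket instead of the whole vocabulary. The sizes and data below are purely illustrative.

```python
import numpy as np

def hash_vector(v, planes):
    """Hash a vector into a bucket according to which side of each random hyperplane it falls on."""
    signs = (planes @ v) >= 0
    return sum(1 << i for i, s in enumerate(signs) if s)

dim, n_planes = 50, 10
rng = np.random.default_rng(0)
planes = rng.normal(size=(n_planes, dim))

# Bucket a toy set of word vectors; a query then only scans its own bucket
vectors = rng.normal(size=(1000, dim))
buckets = {}
for idx, v in enumerate(vectors):
    buckets.setdefault(hash_vector(v, planes), []).append(idx)

query = rng.normal(size=dim)
candidates = buckets.get(hash_vector(query, planes), [])
print(f"{len(candidates)} candidates instead of {len(vectors)}")
```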

Course 2: Natural Language Processing with Probabilistic Models

Week 1: Auto-correct using Minimum Edit Distance

  • Create a simple Auto-Correct algorithm using Minimum Edit Distance and Dynamic Programming
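
A sketch of the dynamic-programming table behind minimum edit distance; the substitution cost of 2 (a delete plus an insert) is one common convention.

```python
def min_edit_distance(source, target, ins_cost=1, del_cost=1, sub_cost=2):
    """Fill a (len(source)+1) x (len(target)+1) table of cheapest edit costs."""
    m, n = len(source), len(target)
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        D[i][0] = D[i - 1][0] + del_cost
    for j in range(1, n + 1):
        D[0][j] = D[0][j - 1] + ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if source[i - 1] == target[j - 1] else sub_cost
            D[i][j] = min(D[i - 1][j] + del_cost,       # delete from source
                          D[i][j - 1] + ins_cost,       # insert into source
                          D[i - 1][j - 1] + sub)        # substitute (or keep) a character
    return D[m][n]

print(min_edit_distance("play", "stay"))   # 4: two substitutions at cost 2 each
```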

Week 2: Part-of-Speech (POS) Tagging

  • Apply the Viterbi algorithm for POS tagging
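
A minimal log-space Viterbi decoder, assuming the initial, transition, and emission probabilities have already been estimated from a tagged corpus; the toy matrices below are made up purely for illustration.

```python
import numpy as np

def viterbi(obs, states, log_pi, log_A, log_B):
    """Viterbi decoding in log space: obs are word indices, states are POS tags,
    log_pi initial log-probs, log_A[i, j] transition, log_B[j, o] emission."""
    T, N = len(obs), len(states)
    best = np.full((T, N), -np.inf)          # best path score ending in each tag
    back = np.zeros((T, N), dtype=int)       # backpointers for path recovery
    best[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        for j in range(N):
            scores = best[t - 1] + log_A[:, j] + log_B[j, obs[t]]
            back[t, j] = np.argmax(scores)
            best[t, j] = scores[back[t, j]]
    path = [int(np.argmax(best[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return [states[s] for s in reversed(path)]

states = ["NN", "VB"]
log_pi = np.log([0.6, 0.4])
log_A = np.log([[0.3, 0.7], [0.8, 0.2]])     # tag-to-tag transitions
log_B = np.log([[0.5, 0.5], [0.1, 0.9]])     # 2 tags x 2 word types
print(viterbi([0, 1, 0], states, log_pi, log_A, log_B))
```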

Week 3: N-gram Language Models

  • Write a sentence auto-completion algorithm using an N-gram model
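
A bigram sketch of the idea: count bigrams over a toy corpus and suggest the most probable next word with add-one smoothing; the assignment generalizes this to higher-order N-grams.

```python
from collections import Counter, defaultdict

def build_bigram_model(corpus):
    """Count bigrams so we can estimate P(next word | previous word)."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        tokens = ["<s>"] + sentence.split() + ["</s>"]
        for prev, nxt in zip(tokens, tokens[1:]):
            counts[prev][nxt] += 1
    return counts

def suggest(counts, prev_word, k=1, vocab_size=None):
    """Return the k most probable next words under add-one smoothing."""
    vocab_size = vocab_size or len(counts)
    total = sum(counts[prev_word].values())
    scored = {w: (c + 1) / (total + vocab_size) for w, c in counts[prev_word].items()}
    return sorted(scored, key=scored.get, reverse=True)[:k]

corpus = ["i like nlp", "i like dogs", "i love nlp"]
model = build_bigram_model(corpus)
print(suggest(model, "like", k=2))   # ['nlp', 'dogs']
```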

Week 4: Word2Vec and Stochastic Gradient Descent

  • Write your own Word2Vec implementation: a shallow neural network with a Continuous Bag-Of-Words (CBOW) architecture that learns word embeddings
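
A single CBOW training step in NumPy, showing the forward pass (average of the context embeddings, softmax over the vocabulary) and the gradient updates; W1 and W2 are the input and output weight matrices, initialised randomly here.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def cbow_step(context_idx, center_idx, W1, W2, lr=0.05):
    """One CBOW step: average context embeddings, predict the center word, backpropagate."""
    h = W1[context_idx].mean(axis=0)          # hidden layer = mean of context vectors
    y_hat = softmax(W2 @ h)                    # predicted distribution over the vocab
    err = y_hat.copy()
    err[center_idx] -= 1.0                     # gradient of cross-entropy + softmax
    W2 -= lr * np.outer(err, h)                # update output weights
    grad_h = W2.T @ err / len(context_idx)
    W1[context_idx] -= lr * grad_h             # update the context word embeddings
    return -np.log(y_hat[center_idx])          # loss for this example

vocab_size, emb_dim = 10, 8
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(vocab_size, emb_dim))   # input embeddings
W2 = rng.normal(scale=0.1, size=(vocab_size, emb_dim))   # output weights
print(cbow_step(context_idx=[1, 2, 4, 5], center_idx=3, W1=W1, W2=W2))
```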

Course 3: Natural Language Processing with Sequence Models

Week 1: Sentiment with Neural Nets

  • Train a neural network with GloVe word embeddings to perform sentiment analysis of tweets
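
Not the assignment code: a PyTorch sketch of the general architecture, with a (pre-trained) embedding layer, mean pooling over the tweet, and a small classifier head. The random glove_matrix is a placeholder for real GloVe weights loaded from file.

```python
import torch
import torch.nn as nn

class TweetClassifier(nn.Module):
    """Mean-pooled embedding layer followed by a small feed-forward classifier."""
    def __init__(self, embedding_matrix, n_classes=2):
        super().__init__()
        self.embedding = nn.Embedding.from_pretrained(embedding_matrix, freeze=False)
        self.fc = nn.Linear(embedding_matrix.shape[1], n_classes)

    def forward(self, token_ids):                      # token_ids: (batch, seq_len)
        pooled = self.embedding(token_ids).mean(dim=1)
        return self.fc(pooled)                         # raw logits

# Stand-in for a pre-trained GloVe matrix (vocab of 5000, 50-d vectors)
glove_matrix = torch.randn(5000, 50)
model = TweetClassifier(glove_matrix)
logits = model(torch.randint(0, 5000, (8, 20)))        # batch of 8 tweets, 20 tokens each
print(logits.shape)                                    # torch.Size([8, 2])
```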

Week 2: Language Generation Models

  • Generate synthetic Shakespeare text using a Gated Recurrent Unit (GRU) language model
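
A character-level GRU language model and sampling loop, sketched in PyTorch (not the assignment framework); the model is untrained here, so the output is gibberish, but the autoregressive generation loop is the part the assignment exercises on Shakespeare text.

```python
import torch
import torch.nn as nn

class CharGRU(nn.Module):
    """Character-level GRU language model: embed, run a GRU, project to vocab logits."""
    def __init__(self, vocab_size, emb_dim=64, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.gru = nn.GRU(emb_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, x, h=None):
        output, h = self.gru(self.embed(x), h)
        return self.out(output), h

def sample(model, start_id, length):
    """Generate ids by repeatedly sampling the next character from the model."""
    ids, h = [start_id], None
    x = torch.tensor([[start_id]])
    with torch.no_grad():
        for _ in range(length):
            logits, h = model(x, h)
            probs = torch.softmax(logits[0, -1], dim=-1)
            next_id = torch.multinomial(probs, 1).item()
            ids.append(next_id)
            x = torch.tensor([[next_id]])
    return ids

model = CharGRU(vocab_size=80)
print(sample(model, start_id=0, length=20))
```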

Week 3: Named Entity Recognition (NER)

  • Train an RNN to perform NER using LSTMs with linear layers
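
A PyTorch sketch of the architecture described above: embeddings feed an LSTM whose per-token outputs pass through a linear layer to produce one score per entity tag. The vocabulary size and tag count below are illustrative.

```python
import torch
import torch.nn as nn

class NERTagger(nn.Module):
    """Embedding -> LSTM -> linear layer, producing one tag-score vector per token."""
    def __init__(self, vocab_size, n_tags, emb_dim=50, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.fc = nn.Linear(hidden, n_tags)

    def forward(self, token_ids):                      # (batch, seq_len)
        out, _ = self.lstm(self.embed(token_ids))
        return self.fc(out)                            # (batch, seq_len, n_tags)

model = NERTagger(vocab_size=10000, n_tags=9)          # e.g. nine BIO entity tags
logits = model(torch.randint(0, 10000, (4, 12)))
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 9), torch.randint(0, 9, (4 * 12,)))
print(logits.shape, loss.item())
```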

Week 4: Siamese Networks

  • Use ‘Siamese’ LSTM models to compare questions in a corpus and identify those that are worded differently but have the same meaning
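
A sketch of the Siamese setup in PyTorch: one shared encoder embeds both questions, and the cosine similarity of the two unit-length vectors scores whether they are duplicates. The encoder below is untrained; the assignment trains it with a triplet-style loss.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class QuestionEncoder(nn.Module):
    """Shared branch of the Siamese network: embedding + LSTM; the final hidden
    state, L2-normalised, is the question vector."""
    def __init__(self, vocab_size, emb_dim=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True)

    def forward(self, token_ids):
        _, (h, _) = self.lstm(self.embed(token_ids))
        return F.normalize(h[-1], dim=-1)

encoder = QuestionEncoder(vocab_size=8000)
q1 = torch.randint(0, 8000, (4, 10))    # batch of questions
q2 = torch.randint(0, 8000, (4, 10))    # candidate duplicates
v1, v2 = encoder(q1), encoder(q2)
similarity = (v1 * v2).sum(dim=-1)      # cosine similarity, since vectors are unit length
print(similarity)                       # duplicates are trained to score close to 1
```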

Course 4: Natural Language Processing with Attention Models

Week 1: Neural Machine Translation with Attention

  • Translate complete English sentences into French using an encoder/decoder attention model
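
The core attention step can be written in a few lines of NumPy: each decoder query produces a softmax over the encoder positions and a weighted sum of their values. The shapes below are arbitrary.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each decoder query attends over all encoder keys and returns a weighted sum of values."""
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over encoder positions
    return weights @ V, weights

d_model = 16
Q = np.random.rand(3, d_model)        # 3 decoder positions
K = V = np.random.rand(7, d_model)    # 7 encoder positions
context, weights = scaled_dot_product_attention(Q, K, V)
print(context.shape, weights.sum(axis=-1))   # (3, 16), each weight row sums to 1
```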

Week 2: Summarization with Transformer Models

  • Build a transformer model for text summarization
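
As an illustration of the task itself rather than the assignment code, and assuming the Hugging Face transformers package is installed (a pre-trained model is downloaded on first use), a summarization pipeline can be called like this:

```python
from transformers import pipeline

# Downloads a pre-trained summarization model on first run (internet required).
summarizer = pipeline("summarization")

article = (
    "The transformer architecture replaces recurrence with self-attention, "
    "letting every token attend to every other token in a single layer. "
    "This makes training highly parallel and has become the basis of most "
    "modern NLP systems, including summarization models."
)
print(summarizer(article, max_length=30, min_length=10)[0]["summary_text"])
```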

Week 3: Question-Answering with Transformer Models

  • Use T5 and BERT models to perform Question Answering tasks using Transfer Learning
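
Again as an illustration of the task rather than the assignment code (assuming Hugging Face transformers is installed), an extractive question-answering pipeline looks like this; generative answers with T5 follow the same ask-over-a-context pattern.

```python
from transformers import pipeline

# BERT-style extractive QA: the answer is a span copied out of the context.
qa = pipeline("question-answering")
result = qa(
    question="Which models are used for question answering?",
    context="Week 3 of the attention course uses T5 and BERT models, fine-tuned "
            "with transfer learning, to answer questions about a given passage.",
)
print(result["answer"], result["score"])
```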

Week 4: Chatbots with a Reformer Model

  • Build a Chatbot using a Reformer model
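
A chatbot of this kind generates each reply token by token with a Reformer language model. The course trains its own model; purely as a rough stand-in (and assuming the transformers and sentencepiece packages are installed), autoregressive sampling from a publicly available pre-trained Reformer looks roughly like this:

```python
from transformers import ReformerModelWithLMHead, ReformerTokenizer

# A small character-level Reformer published by Google; not the course's chatbot model.
name = "google/reformer-crime-and-punishment"
tokenizer = ReformerTokenizer.from_pretrained(name)
model = ReformerModelWithLMHead.from_pretrained(name)

prompt = "A few months later"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
output = model.generate(input_ids, max_length=100, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0]))
```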
