
JavidChaji/DeepLearning.AI-Natural-Language-Processing-Specialization


Natural Language Processing Specialization

Natural Language Processing with Classification and Vector Spaces

Week 1 : Sentiment Analysis with Logistic Regression

Lecture: Logistic Regression

  • Video : Welcome to the NLP Specialization
  • Video : Welcome to course 1
  • Reading : Acknowledgment - Ken Church
  • Video : Week Introduction
  • Video : Supervised ML & Sentiment Analysis
  • Reading : Supervised ML & Sentiment Analysis
  • Video : Vocabulary & Feature Extraction
  • Reading : Vocabulary & Feature Extraction
  • Video : Negative and Positive Frequencies
  • Video : Feature Extraction with Frequencies
  • Reading : Feature Extraction with Frequencies
  • Video : Preprocessing
  • Reading : Preprocessing
  • Lab : Natural Language preprocessing
  • Video : Putting it All Together
  • Reading : Putting it all together
  • Lab : Visualizing word frequencies
  • Video : Logistic Regression Overview
  • Reading : Logistic Regression Overview
  • Video : Logistic Regression: Training
  • Reading : Logistic Regression: Training
  • Lab : Visualizing tweets and Logistic Regression models
  • Video : Logistic Regression: Testing
  • Reading : Logistic Regression: Testing
  • Video : Logistic Regression: Cost Function
  • Reading : Optional Logistic Regression: Cost Function
  • Video : Week Conclusion
  • Reading : Optional Logistic Regression: Gradient
  • Ungraded App Item : Intake Survey
  • Have questions, issues or ideas? Join our Community!

Lecture Notes (Optional)

  • Reading : Lecture Notes W1

Practice Quiz

  • Practice Quiz : Logistic Regression

Assignment: Sentiment Analysis with Logistic Regression

  • Reading : (Optional) Downloading your Notebook, Downloading your Workspace and Refreshing your Workspace
  • Programming Assignment : Logistic Regression
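As a rough illustration of what this week's assignment covers, here is a minimal sketch of logistic-regression sentiment classification using the frequency-based feature layout taught in the lectures (bias, positive-word frequency, negative-word frequency). The data below is made up for illustration; it is not the course's tweet dataset.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy features: [bias, sum of positive-word freqs, sum of negative-word freqs]
X = np.array([[1.0, 3.0, 1.0],
              [1.0, 0.0, 4.0],
              [1.0, 5.0, 0.0],
              [1.0, 1.0, 3.0]])
y = np.array([1.0, 0.0, 1.0, 0.0])  # 1 = positive sentiment

theta = np.zeros(3)
alpha = 0.1
for _ in range(1000):                       # batch gradient descent
    h = sigmoid(X @ theta)                  # predictions in (0, 1)
    theta -= alpha * (X.T @ (h - y)) / len(y)

preds = (sigmoid(X @ theta) >= 0.5).astype(int)
```

On this separable toy data the model recovers all four labels after training.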

Heroes of NLP: Chris Manning (Optional)

  • Video : Andrew Ng with Chris Manning

Week 2 : Sentiment Analysis with Naïve Bayes

Lecture: Naive Bayes

  • Video : Week Introduction
  • Video : Probability and Bayes’ Rule
  • Reading : Probability and Bayes’ Rule
  • Video : Bayes’ Rule
  • Reading : Bayes' Rule
  • Video : Naïve Bayes Introduction
  • Reading : Naive Bayes Introduction
  • Video : Laplacian Smoothing
  • Reading : Laplacian Smoothing
  • Video : Log Likelihood, Part 1
  • Reading : Log Likelihood, Part 1
  • Video : Log Likelihood, Part 2
  • Reading : Log Likelihood Part 2
  • Video : Training Naïve Bayes
  • Reading : Training naïve Bayes
  • Lab : Visualizing likelihoods and confidence ellipses
  • Video : Testing Naïve Bayes
  • Reading : Testing naïve Bayes
  • Video : Applications of Naïve Bayes
  • Reading : Applications of Naive Bayes
  • Video : Naïve Bayes Assumptions
  • Reading : Naïve Bayes Assumptions
  • Video : Error Analysis
  • Reading : Error Analysis
  • Video : Week Conclusion

Lecture Notes (Optional)

  • Lecture Notes W2

Practice Quiz

  • Naive Bayes

Assignment: Naive Bayes

  • Programming Assignment : Naive Bayes
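A minimal sketch of the week's core technique: Naïve Bayes with Laplacian (add-one) smoothing and log-likelihood scoring, as covered in the lectures. The two-sentence corpus here is hypothetical, chosen only so the arithmetic is easy to follow.

```python
import math
from collections import Counter

# Hypothetical toy corpus (not the course's tweet dataset)
pos_docs = [["great", "movie"], ["great", "fun"]]
neg_docs = [["bad", "movie"], ["boring", "bad"]]

pos_counts = Counter(w for d in pos_docs for w in d)
neg_counts = Counter(w for d in neg_docs for w in d)
vocab = set(pos_counts) | set(neg_counts)
V = len(vocab)
n_pos = sum(pos_counts.values())
n_neg = sum(neg_counts.values())

def loglikelihood(word):
    # Laplacian (add-one) smoothing avoids zero probabilities
    p_w_pos = (pos_counts[word] + 1) / (n_pos + V)
    p_w_neg = (neg_counts[word] + 1) / (n_neg + V)
    return math.log(p_w_pos / p_w_neg)

def predict(doc):
    # log prior is 0 here because the classes are balanced
    score = sum(loglikelihood(w) for w in doc if w in vocab)
    return 1 if score > 0 else 0
```

Words seen equally often in both classes (like "movie" here) contribute a log-likelihood of zero, so only class-distinctive words drive the decision.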

Week 3 : Vector Space Models

Lecture: Vector Space Models

  • Video : Week Introduction
  • Video : Vector Space Models
  • Reading : Vector Space Models
  • Video : Word by Word and Word by Doc.
  • Reading : Word by Word and Word by Doc.
  • Lab : Linear algebra in Python with Numpy
  • Video : Euclidean Distance
  • Reading : Euclidean Distance
  • Video : Cosine Similarity: Intuition
  • Reading : Cosine Similarity: Intuition
  • Video : Cosine Similarity
  • Reading : Cosine Similarity
  • Video : Manipulating Words in Vector Spaces
  • Reading : Manipulating Words in Vector Spaces
  • Lab : Manipulating word embeddings
  • Video : Visualization and PCA
  • Reading : Visualization and PCA
  • Video : PCA Algorithm
  • Reading : PCA algorithm
  • Lab : Another explanation about PCA
  • Reading : The Rotation Matrix (Optional Reading)
  • Video : Week Conclusion

Lecture Notes (Optional)

  • Reading : Lecture Notes W3

Practice Quiz

  • Practice Quiz : Vector Space Models

Assignment: Vector Space Models

  • Programming Assignment : Vector Space Models
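The central measure in this week's vector-space lectures is cosine similarity. A minimal sketch, using made-up 2-D "embeddings" (real word vectors have hundreds of dimensions; the names and numbers below are purely illustrative):

```python
import numpy as np

def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (||a|| * ||b||)
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Hypothetical toy embeddings for illustration
king  = np.array([1.0, 3.0])
queen = np.array([1.2, 3.1])
apple = np.array([4.0, 0.5])
```

Vectors pointing in similar directions score near 1 regardless of their length, which is why cosine similarity is preferred over Euclidean distance when documents or corpora differ in size.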

Week 4 : Machine Translation and Document Search

Lecture: Machine Translation

  • Video : Week Introduction
  • Video : Overview
  • Video : Transforming word vectors
  • Reading : Transforming word vectors
  • Lab : Rotation matrices in R2
  • Video : K-nearest neighbors
  • Reading : K-nearest neighbors
  • Video : Hash tables and hash functions
  • Reading : Hash tables and hash functions
  • Video : Locality sensitive hashing
  • Reading : Locality sensitive hashing
  • Video : Multiple Planes
  • Reading : Multiple Planes
  • Lab : Hash tables
  • Video : Approximate nearest neighbors
  • Reading : Approximate nearest neighbors
  • Video : Searching documents
  • Reading : Searching documents
  • Video : Week Conclusion

Lecture Notes (Optional)

  • Reading : Lecture Notes W4

Practice Quiz

  • Practice Quiz : Hashing and Machine Translation

Assignment: Machine Translation

  • Programming Assignment : Word Translation
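Locality sensitive hashing, the technique behind this week's approximate-nearest-neighbors search, assigns each vector a hash bucket by checking which side of several planes it falls on. A minimal sketch with hand-picked planes in 2-D (the plane values and vectors are made up for illustration):

```python
import numpy as np

def hash_vector(v, planes):
    # Each plane contributes one bit: 1 if v lies on its positive side.
    signs = (planes @ v) >= 0
    return sum(1 << i for i, s in enumerate(signs) if s)

# Three hand-picked planes in 2-D -> buckets 0..7
planes = np.array([[ 1.0, -1.0],
                   [ 0.5,  2.0],
                   [-2.0,  0.3]])

v1 = np.array([1.0, 1.0])
v2 = np.array([1.01, 0.99])   # nearly identical direction -> same bucket
v3 = np.array([-1.0, 1.0])    # very different direction -> different bucket
```

Nearby vectors usually land in the same bucket, so a nearest-neighbor query only needs to scan that bucket instead of the whole vocabulary.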

Acknowledgments and Bibliography

  • Reading : Acknowledgements
  • Reading : Bibliography

Heroes of NLP: Kathleen McKeown

  • Video : Andrew Ng with Kathleen McKeown

Natural Language Processing with Probabilistic Models

Week 1

Lecture: Autocorrect and Minimum Edit Distance

  • Video : Intro to Course 2
  • Video : Week Introduction
  • Video : Overview
  • Reading : Overview
  • Video : Autocorrect
  • Reading : Autocorrect
  • Video : Building the model
  • Reading : Building the model
  • Lab : Lecture notebook: Building the vocabulary
  • Video : Building the model II
  • Reading : Building the model II
  • Lab : Lecture notebook: Candidates from edits
  • Video : Minimum edit distance
  • Reading : Minimum edit distance
  • Video : Minimum edit distance algorithm
  • Reading : Minimum edit distance algorithm
  • Video : Minimum edit distance algorithm II
  • Reading : Minimum edit distance algorithm II
  • Video : Minimum edit distance algorithm III
  • Reading : Minimum edit distance III
  • Video : Week Conclusion
  • Ungraded App Item : [IMPORTANT] Have questions, issues or ideas? Join our Community!

Lecture Notes (Optional)

  • Reading : Lecture Notes W1

Quiz: Auto-correct and Minimum Edit Distance

  • Practice Quiz : Auto-correct and Minimum Edit Distance

Assignment: Autocorrect

  • Reading : (Optional) Downloading your Notebook, Downloading your Workspace and Refreshing your Workspace
  • Programming Assignment : Autocorrect
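The minimum edit distance algorithm from this week's lectures fills a dynamic-programming table using insert, delete, and replace costs. A minimal sketch with the cost scheme used in the course (insert = 1, delete = 1, replace = 2):

```python
def min_edit_distance(source, target, ins_cost=1, del_cost=1, rep_cost=2):
    m, n = len(source), len(target)
    # D[i][j] = cost of turning source[:i] into target[:j]
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        D[i][0] = D[i - 1][0] + del_cost
    for j in range(1, n + 1):
        D[0][j] = D[0][j - 1] + ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            r = 0 if source[i - 1] == target[j - 1] else rep_cost
            D[i][j] = min(D[i - 1][j] + del_cost,      # delete
                          D[i][j - 1] + ins_cost,      # insert
                          D[i - 1][j - 1] + r)         # replace or match
    return D[m][n]

print(min_edit_distance("play", "stay"))  # → 4 (two replacements at cost 2 each)
```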

Week 2

Lecture: Part of Speech Tagging

  • Video : Week Introduction
  • Video : Part of Speech Tagging
  • Reading : Part of Speech Tagging
  • Lab : Lecture Notebook - Working with text files
  • Video : Markov Chains
  • Reading : Markov Chains
  • Video : Markov Chains and POS Tags
  • Reading : Markov Chains and POS Tags
  • Video : Hidden Markov Models
  • Reading : Hidden Markov Models
  • Video : Calculating Probabilities
  • Reading : Calculating Probabilities
  • Video : Populating the Transition Matrix
  • Reading : Populating the Transition Matrix
  • Video : Populating the Emission Matrix
  • Reading : Populating the Emission Matrix
  • Lab : Lecture Notebook - Working with tags and Numpy
  • Video : The Viterbi Algorithm
  • Reading : The Viterbi Algorithm
  • Video : Viterbi: Initialization
  • Reading : Viterbi: Initialization
  • Video : Viterbi: Forward Pass
  • Reading : Viterbi: Forward Pass
  • Video : Viterbi: Backward Pass
  • Reading : Viterbi: Backward Pass
  • Video : Week Conclusion

Lecture Notes (Optional)

  • Reading : Lecture Notes W2

Practice Quiz

  • Practice Quiz : Part of Speech Tagging

Assignment: Part of Speech Tagging

  • Programming Assignment : Part of Speech Tagging
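The Viterbi algorithm from this week's lectures finds the most likely hidden tag sequence via an initialization step, a forward pass, and a backward pass over backpointers. A minimal sketch on a hypothetical two-tag, three-word HMM (all probabilities below are invented for illustration):

```python
import numpy as np

# Hypothetical tiny HMM: 2 hidden tags, 3 observable words
pi = np.array([0.6, 0.4])          # initial tag distribution
A = np.array([[0.7, 0.3],          # transition probabilities
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],     # emission probabilities
              [0.1, 0.3, 0.6]])

def viterbi(obs):
    T, N = len(obs), len(pi)
    C = np.zeros((N, T))                # best log-probabilities
    D = np.zeros((N, T), dtype=int)     # backpointers
    # initialization
    C[:, 0] = np.log(pi) + np.log(B[:, obs[0]])
    # forward pass (log probabilities avoid numerical underflow)
    for t in range(1, T):
        for j in range(N):
            scores = C[:, t - 1] + np.log(A[:, j]) + np.log(B[j, obs[t]])
            D[j, t] = int(np.argmax(scores))
            C[j, t] = scores[D[j, t]]
    # backward pass: follow backpointers from the best final tag
    path = [int(np.argmax(C[:, -1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(D[path[-1], t]))
    return path[::-1]

print(viterbi([0, 1, 2]))  # → [0, 0, 1]
```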

Week 3

Lecture: Autocomplete

  • Video : Week Introduction
  • Video : N-Grams: Overview
  • Reading : N-Grams: Overview
  • Video : N-grams and Probabilities
  • Reading : N-grams and Probabilities
  • Video : Sequence Probabilities
  • Reading : Sequence Probabilities
  • Video : Starting and Ending Sentences
  • Reading : Starting and Ending Sentences
  • Lab : Lecture notebook: Corpus preprocessing for N-grams
  • Video : The N-gram Language Model
  • Reading : The N-gram Language Model
  • Video : Language Model Evaluation
  • Lab : Lecture notebook: Building the language model
  • Reading : Language Model Evaluation
  • Video : Out of Vocabulary Words
  • Reading : Out of Vocabulary Words
  • Video : Smoothing
  • Reading : Smoothing
  • Lab : Lecture notebook: Language model generalization
  • Video : Week Summary
  • Reading : Week Summary
  • Video : Week Conclusion

Lecture Notes (Optional)

  • Reading : Lecture Notes W3

Practice Quiz

  • Practice Quiz : Autocomplete

Assignment: Autocomplete

  • Programming Assignment : Autocomplete
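This week's N-gram language model estimates P(word | context) from counts, with add-k smoothing and sentence-boundary tokens as described in the lectures. A minimal bigram sketch on a hypothetical two-sentence corpus:

```python
from collections import Counter

# Hypothetical tiny corpus; <s> and </s> mark sentence boundaries
sentences = [["<s>", "i", "like", "tea", "</s>"],
             ["<s>", "i", "like", "coffee", "</s>"]]

bigrams = Counter()
unigrams = Counter()
for s in sentences:
    unigrams.update(s[:-1])          # counts of bigram contexts
    bigrams.update(zip(s, s[1:]))    # counts of adjacent word pairs

vocab = {w for s in sentences for w in s}
V = len(vocab)

def prob(word, context, k=1.0):
    # add-k smoothed bigram probability P(word | context)
    return (bigrams[(context, word)] + k) / (unigrams[context] + k * V)
```

With k = 1 the smoothing gives every unseen bigram a small nonzero probability, at the cost of shaving probability mass off seen bigrams: here P("like" | "i") is 0.375 rather than the unsmoothed 1.0.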

Week 4

Lecture: Word Embeddings

  • Video : Week Introduction
  • Video : Overview
  • Reading : Overview
  • Video : Basic Word Representations
  • Reading : Basic Word Representations
  • Video : Word Embeddings
  • Reading : Word Embeddings
  • Video : How to Create Word Embeddings
  • Reading : How to Create Word Embeddings?
  • Video : Word Embedding Methods
  • Reading : Word Embedding Methods
  • Video : Continuous Bag-of-Words Model
  • Reading : Continuous Bag-of-Words Model
  • Video : Cleaning and Tokenization
  • Reading : Cleaning and Tokenization
  • Video : Sliding Window of Words in Python
  • Reading : Sliding Window of Words in Python
  • Video : Transforming Words into Vectors
  • Reading : Transforming Words into Vectors
  • Lab : Lecture Notebook - Data Preparation
  • Video : Architecture of the CBOW Model
  • Reading : Architecture of the CBOW Model
  • Video : Architecture of the CBOW Model: Dimensions
  • Reading : Architecture of the CBOW Model: Dimensions
  • Video : Architecture of the CBOW Model: Dimensions 2
  • Reading : Architecture of the CBOW Model: Dimensions 2
  • Video : Architecture of the CBOW Model: Activation Functions
  • Reading : Architecture of the CBOW Model: Activation Functions
  • Lab : Lecture Notebook - Intro to CBOW model
  • Video : Training a CBOW Model: Cost Function
  • Reading : Training a CBOW Model: Cost Function
  • Video : Training a CBOW Model: Forward Propagation
  • Reading : Training a CBOW Model: Forward Propagation
  • Video : Training a CBOW Model: Backpropagation and Gradient Descent
  • Reading : Training a CBOW Model: Backpropagation and Gradient Descent
  • Lab : Lecture Notebook - Training the CBOW model
  • Video : Extracting Word Embedding Vectors
  • Reading : Extracting Word Embedding Vectors
  • Lab : Lecture Notebook - Word Embeddings
  • Video : Evaluating Word Embeddings: Intrinsic Evaluation
  • Reading : Evaluating Word Embeddings: Intrinsic Evaluation
  • Video : Evaluating Word Embeddings: Extrinsic Evaluation
  • Reading : Evaluating Word Embeddings: Extrinsic Evaluation
  • Lab : Lecture notebook: Word embeddings step by step
  • Video : Conclusion
  • Reading : Conclusion
  • Video : Week Conclusion

Lecture Notes (Optional)

  • Reading : Lecture Notes W4

Practice Quiz

  • Practice Quiz : Word Embeddings

End of access to Lab Notebooks

  • Reading : [IMPORTANT] Reminder about end of access to Lab Notebooks

Assignment: Word Embeddings

  • Programming Assignment : Word Embeddings
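The CBOW training pipeline covered this week begins by sliding a window over the corpus to pair each center word with its surrounding context. A minimal sketch of that step, using the lecture's "i am happy because i am learning" example with a context half-size of C = 2:

```python
def get_windows(words, C=2):
    # yields (context_words, center_word) pairs, C words on each side
    for i in range(C, len(words) - C):
        yield words[i - C:i] + words[i + 1:i + C + 1], words[i]

corpus = "i am happy because i am learning".split()
pairs = list(get_windows(corpus, C=2))
print(pairs[0])  # → (['i', 'am', 'because', 'i'], 'happy')
```

In the full model each context list is averaged into one input vector, pushed through the hidden ReLU layer and softmax output, and the trained weight matrices are then read off as the word embeddings.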

Acknowledgments

  • Reading : Acknowledgments

Natural Language Processing with Sequence Models

Week 1

Week 2

Week 3

Week 4

Natural Language Processing with Attention Models

Week 1

Week 2

Week 3

Week 4
