DeepLearning.AI - Natural Language Processing with Probabilistic Models
- Offered by Coursera
Natural Language Processing with Probabilistic Models at Coursera Overview
Duration | 24 hours |
Start from | Start Now |
Total fee | Free |
Mode of learning | Online |
Difficulty level | Intermediate |
Credential | Certificate |
Natural Language Processing with Probabilistic Models at Coursera Highlights
- Earn a shareable certificate upon completion.
- Flexible deadlines according to your schedule.
Natural Language Processing with Probabilistic Models at Coursera Course details
- In Course 2 of the Natural Language Processing Specialization, offered by deeplearning.ai, you will:
- a) Create a simple auto-correct algorithm using minimum edit distance and dynamic programming,
- b) Apply the Viterbi Algorithm for part-of-speech (POS) tagging, which is important for computational linguistics,
- c) Write a better auto-complete algorithm using an N-gram language model, and
- d) Write your own Word2Vec model that uses a neural network to compute word embeddings using a continuous bag-of-words model.
- Please make sure that you're comfortable programming in Python and have a basic knowledge of machine learning, matrix multiplication, and conditional probability.
- By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!
- This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
Natural Language Processing with Probabilistic Models at Coursera Curriculum
Autocorrect
Intro to Course 2
Overview
Autocorrect
Building the model
Building the model II
Minimum edit distance
Minimum edit distance algorithm
Minimum edit distance algorithm II
Minimum edit distance algorithm III
Connect with your mentors and fellow learners on Slack!
How to Refresh your Workspace
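To make this week's dynamic-programming idea concrete, here is a minimal sketch of minimum edit distance in Python. The per-operation costs shown (insert 1, delete 1, replace 2) are one common convention; the function name and example words are illustrative assumptions, not course materials.

```python
def min_edit_distance(source, target, ins_cost=1, del_cost=1, rep_cost=2):
    """Minimum edit distance via dynamic programming."""
    n, m = len(source), len(target)
    # D[i][j] = cheapest way to turn source[:i] into target[:j]
    D = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):          # delete all of source[:i]
        D[i][0] = D[i - 1][0] + del_cost
    for j in range(1, m + 1):          # insert all of target[:j]
        D[0][j] = D[0][j - 1] + ins_cost
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            r = 0 if source[i - 1] == target[j - 1] else rep_cost
            D[i][j] = min(D[i - 1][j] + del_cost,    # delete
                          D[i][j - 1] + ins_cost,    # insert
                          D[i - 1][j - 1] + r)       # replace or match
    return D[n][m]

print(min_edit_distance("play", "stay"))  # 4: two replacements at cost 2 each
```

Filling the table row by row is what makes the approach efficient: each cell reuses the three neighbouring subproblems instead of re-exploring every edit sequence.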
Part of Speech Tagging and Hidden Markov Models
Part of Speech Tagging
Markov Chains
Markov Chains and POS Tags
Hidden Markov Models
Calculating Probabilities
Populating the Transition Matrix
Populating the Emission Matrix
The Viterbi Algorithm
Viterbi: Initialization
Viterbi: Forward Pass
Viterbi: Backward Pass
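The Viterbi lessons split the algorithm into initialization, a forward pass, and a backward pass. The sketch below mirrors that structure on a hand-made two-tag HMM; the tags, vocabulary, and every probability in the transition and emission matrices are toy assumptions for illustration, not course data.

```python
import numpy as np

tags = ["NN", "VB"]
A = np.array([[0.6, 0.4],      # transition matrix: P(next tag | current tag)
              [0.7, 0.3]])
pi = np.array([0.8, 0.2])      # initial tag distribution
vocab = {"time": 0, "flies": 1}
B = np.array([[0.7, 0.3],      # emission matrix: P(word | tag)
              [0.2, 0.8]])

def viterbi(words):
    T, N = len(words), len(tags)
    best = np.zeros((N, T))             # best path probability ending in tag i at step t
    back = np.zeros((N, T), dtype=int)  # backpointers for the backward pass
    # Initialization
    best[:, 0] = pi * B[:, vocab[words[0]]]
    # Forward pass
    for t in range(1, T):
        for j in range(N):
            scores = best[:, t - 1] * A[:, j] * B[j, vocab[words[t]]]
            back[j, t] = scores.argmax()
            best[j, t] = scores.max()
    # Backward pass: follow backpointers from the best final tag
    path = [best[:, -1].argmax()]
    for t in range(T - 1, 0, -1):
        path.append(back[path[-1], t])
    return [tags[i] for i in reversed(path)]

print(viterbi(["time", "flies"]))  # e.g. ['NN', 'VB'] under these toy numbers
```

In practice the forward pass is done in log space to avoid underflow on long sentences; the plain products above keep the sketch short.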
Autocomplete and Language Models
N-Grams: Overview
N-grams and Probabilities
Sequence Probabilities
Starting and Ending Sentences
The N-gram Language Model
Language Model Evaluation
Out of Vocabulary Words
Smoothing
Week Summary
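As a rough illustration of how this week's pieces fit together, the sketch below pads sentences with start and end tokens, counts bigrams, and applies add-k (Laplace) smoothing so unseen bigrams still get nonzero probability. The tiny corpus and the value of k are assumptions for illustration.

```python
from collections import Counter

# Tiny illustrative corpus (an assumption, not course data)
corpus = [["i", "like", "nlp"], ["i", "like", "dogs"], ["dogs", "like", "nlp"]]

# Pad each sentence with start/end tokens
sentences = [["<s>"] + s + ["</s>"] for s in corpus]

unigrams = Counter(w for s in sentences for w in s)
bigrams = Counter((s[i], s[i + 1]) for s in sentences for i in range(len(s) - 1))
V = len(unigrams)  # vocabulary size, here including <s> and </s>

def bigram_prob(prev, word, k=1.0):
    """P(word | prev) with add-k (Laplace) smoothing."""
    return (bigrams[(prev, word)] + k) / (unigrams[prev] + k * V)

def sentence_prob(words):
    """Probability of a sentence under the smoothed bigram model."""
    s = ["<s>"] + words + ["</s>"]
    p = 1.0
    for prev, word in zip(s, s[1:]):
        p *= bigram_prob(prev, word)
    return p

print(sentence_prob(["i", "like", "nlp"]))
```

Because Counter returns 0 for unseen keys, the same function handles out-of-vocabulary continuations gracefully, which is exactly the gap smoothing is meant to fill.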
Word embeddings with neural networks
Overview
Basic Word Representations
Word Embeddings
How to Create Word Embeddings
Word Embedding Methods
Continuous Bag-of-Words Model
Cleaning and Tokenization
Sliding Window of Words in Python
Transforming Words into Vectors
Architecture of the CBOW Model
Architecture of the CBOW Model: Dimensions
Architecture of the CBOW Model: Dimensions 2
Architecture of the CBOW Model: Activation Functions
Training a CBOW Model: Cost Function
Training a CBOW Model: Forward Propagation
Training a CBOW Model: Backpropagation and Gradient Descent
Extracting Word Embedding Vectors
Evaluating Word Embeddings: Intrinsic Evaluation
Evaluating Word Embeddings: Extrinsic Evaluation
Conclusion
Acknowledgments
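The CBOW lessons walk through the architecture, the cost function, forward propagation, backpropagation, and finally extracting the embedding vectors. The NumPy sketch below follows that outline for a single training step; the vocabulary size, embedding dimension, learning rate, and the ReLU hidden layer are common choices assumed here for illustration, not the course's exact settings.

```python
import numpy as np

rng = np.random.default_rng(0)

V, N = 8, 4          # toy vocabulary size and embedding dimension (assumptions)
W1 = rng.normal(0, 0.1, (N, V)); b1 = np.zeros((N, 1))
W2 = rng.normal(0, 0.1, (V, N)); b2 = np.zeros((V, 1))

def softmax(z):
    e = np.exp(z - z.max(axis=0))
    return e / e.sum(axis=0)

def forward(x):
    """x: (V, 1) average of the one-hot context vectors."""
    h = np.maximum(0, W1 @ x + b1)   # hidden layer with ReLU
    yhat = softmax(W2 @ h + b2)      # predicted distribution over the center word
    return h, yhat

def backward(x, y, h, yhat, lr=0.05):
    """One gradient-descent step on the cross-entropy cost."""
    global W1, b1, W2, b2
    dz2 = yhat - y                   # softmax + cross-entropy gradient
    dW2 = dz2 @ h.T; db2 = dz2
    dz1 = (W2.T @ dz2) * (h > 0)     # ReLU gradient
    dW1 = dz1 @ x.T; db1 = dz1
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# One training step: context words 1 and 3 predict center word 2 (toy indices)
x = np.zeros((V, 1)); x[[1, 3]] = 0.5   # averaged one-hot context vectors
y = np.zeros((V, 1)); y[2] = 1.0        # one-hot center word
h, yhat = forward(x)
backward(x, y, h, yhat)

# Extracting embeddings: one option is averaging the input and output weights
embeddings = (W1.T + W2) / 2            # shape (V, N), one row per word
```

Either weight matrix (or their average, as above) can serve as the embedding table; the intrinsic and extrinsic evaluation lessons cover how to judge which works better for a given task.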