
DeepLearning.AI - Natural Language Processing with Attention Models 

  • Offered by DeepLearning.AI on Coursera

Natural Language Processing with Attention Models at Coursera: Overview

Duration: 31 hours
Total fee: Free
Mode of learning: Online
Difficulty level: Intermediate
Credential: Certificate

Natural Language Processing with Attention Models at Coursera: Highlights

  • Shareable Certificate: earn a certificate upon completion
  • 100% online: start instantly and learn at your own schedule
  • Course 4 of 4 in the Natural Language Processing Specialization
  • Flexible deadlines: reset deadlines in accordance with your schedule
  • Intermediate level
  • Approx. 31 hours to complete
  • Language: English. Subtitles: Arabic, French, Portuguese (European), Italian, Vietnamese, German, Russian, English, Spanish, Japanese

Natural Language Processing with Attention Models at Coursera: Course details

More about this course
  • In Course 4 of the Natural Language Processing Specialization, offered by DeepLearning.AI, you will:
  • a) Translate complete English sentences into German using an encoder-decoder attention model (see the attention sketch after this list),
  • b) Build a Transformer model to summarize text,
  • c) Use T5 and BERT models to perform question-answering, and
  • d) Build a chatbot using a Reformer model.
  • This course is for students of machine learning or artificial intelligence, as well as software engineers looking for a deeper understanding of how NLP models work and how to apply them.
  • By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!
  • Learners should have a working knowledge of machine learning and intermediate Python, including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you've completed Course 3 (Natural Language Processing with Sequence Models) before starting this course.
  • This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
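For readers who want a concrete picture of the attention computation at the heart of this course, here is a minimal scaled dot-product attention sketch in NumPy. It is an illustration only, not the course's own code or assignments; all names and shapes are assumptions.

```python
# A minimal sketch of scaled dot-product attention, for illustration only.
# The function name and shapes are assumptions, not the course's code.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (m, d), K: (n, d), V: (n, d_v) -> (m, d_v)."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)        # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                   # weighted average of the values

# Toy example: 2 decoder queries attending over 3 encoder states.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
print(scaled_dot_product_attention(Q, K, V).shape)  # (2, 4)
```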

Natural Language Processing with Attention Models at Coursera: Curriculum

Neural Machine Translation

Course 4 Introduction
Seq2seq
Alignment
Attention
Setup for Machine Translation
Training an NMT with Attention
Evaluation for Machine Translation
Sampling and Decoding
Andrew Ng with Oren Etzioni
Connect with your mentors and fellow learners on Slack!
Background on seq2seq
(Optional) The Real Meaning of Ich Bin ein Berliner
(Optional) What is Teacher Forcing?
Content Resource
How to Refresh your Workspace
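The "Sampling and Decoding" lesson contrasts greedy decoding, which always picks the highest-scoring token, with temperature sampling, which draws from a rescaled distribution. A minimal sketch of both, with made-up logits (illustrative only, not course code):

```python
# Greedy decoding vs. temperature sampling over hypothetical next-token logits.
import numpy as np

def softmax(logits):
    e = np.exp(logits - np.max(logits))
    return e / e.sum()

def greedy(logits):
    return int(np.argmax(logits))        # deterministic: the top-scoring token

def sample_with_temperature(logits, temperature=1.0, rng=None):
    rng = rng or np.random.default_rng()
    probs = softmax(np.asarray(logits) / temperature)
    return int(rng.choice(len(probs), p=probs))

logits = [2.0, 1.0, 0.5, 0.1]            # made-up scores for 4 candidate tokens
print(greedy(logits))                                     # always token 0
print(sample_with_temperature(logits, temperature=0.7))   # usually token 0
print(sample_with_temperature(logits, temperature=2.0))   # more diverse output
```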

Text Summarization

Transformers vs RNNs
Transformer Applications
Dot-Product Attention
Causal Attention
Multi-head Attention
Transformer Decoder
Transformer Summarizer
Content Resource
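The "Causal Attention" and "Transformer Decoder" lessons rest on masking: position i may attend only to positions up to i, so the decoder cannot peek at future tokens. A minimal NumPy sketch of that masking step (illustrative; the function name and shapes are assumptions, not course code):

```python
# Causal (masked) self-attention: each position attends only to itself and
# earlier positions. Shapes and names are illustrative assumptions.
import numpy as np

def causal_attention(Q, K, V):
    n, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)
    mask = np.triu(np.ones((n, n), dtype=bool), k=1)  # True above the diagonal
    scores = np.where(mask, -1e9, scores)             # block attention to the future
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))         # 5 token positions, model width 8
out = causal_attention(X, X, X)     # self-attention over the sequence
print(out.shape)                    # (5, 8)
```

Multi-head attention (also covered this week) runs several such attention computations in parallel on lower-dimensional slices of the input and concatenates the results.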

Question Answering

Week 3 Overview
Transfer Learning in NLP
ELMo, GPT, BERT, T5
Bidirectional Encoder Representations from Transformers (BERT)
BERT Objective
Fine-tuning BERT
Transformer: T5
Multi-Task Training Strategy
GLUE Benchmark
Question Answering
Content Resource
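The "BERT Objective" lesson centers on masked language modeling: hide a fraction of the input tokens and train the model to predict the originals. A simplified toy sketch of just the masking step (BERT's full recipe additionally replaces some selected tokens with random or unchanged tokens; this illustration is not course code):

```python
# Toy masked-language-model data preparation: mask ~15% of tokens and record
# the originals as prediction targets. Tokenization and the model are omitted.
import random

def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets[i] = tok          # the model must recover this token
            masked.append(mask_token)
        else:
            masked.append(tok)
    return masked, targets

tokens = "the quick brown fox jumps over the lazy dog".split()
masked, targets = mask_tokens(tokens)
print(masked)   # tokens with some positions replaced by [MASK]
print(targets)  # {position: original token} -> where the loss is computed
```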

Chatbot

Tasks with Long Sequences
Transformer Complexity
LSH Attention
Motivation for Reversible Layers: Memory!
Reversible Residual Layers
Reformer
Andrew Ng with Quoc Le
Optional AI Storytelling
Optional KNN & LSH Review
Optional Transformers beyond NLP
Acknowledgments
References
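The "Reversible Residual Layers" lesson explains how the Reformer saves memory: each layer's inputs can be recomputed exactly from its outputs during backpropagation, so activations need not be stored. A minimal sketch of that invertibility, with stand-in functions F and G (illustrative only, not course code):

```python
# Reversible residual layer: y1 = x1 + F(x2), y2 = x2 + G(y1).
# The inverse recovers (x1, x2) from (y1, y2) alone, so activations can be
# recomputed instead of stored. F and G are arbitrary stand-ins here.
import numpy as np

def forward(x1, x2, F, G):
    y1 = x1 + F(x2)
    y2 = x2 + G(y1)
    return y1, y2

def inverse(y1, y2, F, G):
    x2 = y2 - G(y1)       # undo the second residual update
    x1 = y1 - F(x2)       # then undo the first
    return x1, x2

F = lambda x: np.tanh(x)            # stand-in for an attention block
G = lambda x: np.maximum(x, 0.0)    # stand-in for a feed-forward block

rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=(2, 4))
y1, y2 = forward(x1, x2, F, G)
r1, r2 = inverse(y1, y2, F, G)
print(np.allclose(x1, r1) and np.allclose(x2, r2))  # True: inputs recovered
```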

Natural Language Processing with Attention Models at Coursera: Admission Process

Important Dates

May 25, 2024: Course Commencement Date
