Probabilistic Deep Learning with TensorFlow 2
- Offered by Coursera
Probabilistic Deep Learning with TensorFlow 2 at Coursera Overview
| Duration | 53 hours |
| Start from | Start Now |
| Total fee | Free |
| Mode of learning | Online |
| Difficulty level | Advanced |
| Official website | Explore Free Course |
| Credential | Certificate |
Probabilistic Deep Learning with TensorFlow 2 at Coursera Highlights
- Shareable Certificate: earn a certificate upon completion
- 100% online: start instantly and learn on your own schedule
- Course 3 of 3 in the TensorFlow 2 for Deep Learning Specialization
- Flexible deadlines: reset deadlines in accordance with your schedule
- Advanced level; assumed background:
  - Python 3
  - Knowledge of general machine learning concepts
  - Knowledge of the field of deep learning
  - Probability and statistics
- Approx. 53 hours to complete
- Language: English. Subtitles: Arabic, French, Portuguese (European), Italian, Vietnamese, German, Russian, English, Spanish
Probabilistic Deep Learning with TensorFlow 2 at Coursera Course details
- Welcome to this course on Probabilistic Deep Learning with TensorFlow!
- This course builds on the foundational concepts and skills for TensorFlow taught in the first two courses in this specialisation, and focuses on the probabilistic approach to deep learning. This is an increasingly important area of deep learning that aims to quantify the noise and uncertainty that is often present in real-world datasets. This is crucial when deep learning models are used in applications such as autonomous vehicles or medical diagnosis: we need the model to know what it doesn't know.
- You will learn how to develop probabilistic models with TensorFlow, making particular use of the TensorFlow Probability library, which is designed to make it easy to combine probabilistic models with deep learning. As such, this course can also be viewed as an introduction to the TensorFlow Probability library (a short sketch of the library's distribution objects appears at the end of this section).
- You will learn how probability distributions can be represented and incorporated into deep learning models in TensorFlow, including Bayesian neural networks, normalising flows and variational autoencoders. You will also learn how to develop models for uncertainty quantification, as well as generative models that can create new samples similar to those in the dataset, such as images of celebrity faces (a sketch of a probabilistic output layer also appears at the end of this section).
- You will put the concepts you learn into practice straight away in practical, hands-on coding tutorials, guided by a graduate teaching assistant. In addition, there is a series of automatically graded programming assignments for you to consolidate your skills.
- At the end of the course, you will bring many of the concepts together in a Capstone Project, where you will develop a variational autoencoder algorithm to produce a generative model of a synthetic image dataset that you will create yourself.
- This course follows on from the previous two courses in the specialisation, Getting Started with TensorFlow 2 and Customising Your Models with TensorFlow 2. The additional prerequisite knowledge required to be successful in this course is a solid foundation in probability and statistics. In particular, it is assumed that you are familiar with standard probability distributions, probability density functions, and concepts such as maximum likelihood estimation, the change of variables formula for random variables, and the evidence lower bound (ELBO) used in variational inference (restated at the end of this section for reference).
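
As a taste of the library the course is built around, here is a minimal sketch of creating, sampling and evaluating TensorFlow Probability distribution objects; the specific distributions and parameter values are illustrative choices, not course material:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# A univariate standard normal distribution object.
normal = tfd.Normal(loc=0., scale=1.)

samples = normal.sample(5)            # draw 5 independent samples
log_probs = normal.log_prob(samples)  # evaluate the log density at those samples

# A multivariate normal with diagonal covariance, of the kind covered in
# the "Multivariate distributions" tutorial.
mvn = tfd.MultivariateNormalDiag(loc=tf.zeros(3), scale_diag=tf.ones(3))
print(mvn.sample(2).shape)  # (2, 3): two samples of a 3-dimensional event
```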
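
The probabilistic-layers idea from week 2 can be previewed with tfp.layers.DistributionLambda, which lets a Keras model output a distribution rather than a point estimate. This is a minimal sketch with assumed layer sizes; the architecture is illustrative, not the course's assignment:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# A regression model whose final layer parameterises a Normal distribution:
# the Dense(2) layer outputs a mean and an unconstrained scale parameter.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation='relu'),
    tf.keras.layers.Dense(2),
    tfp.layers.DistributionLambda(
        lambda t: tfd.Normal(loc=t[..., :1],
                             scale=tf.math.softplus(t[..., 1:]))),
])

# Train by minimising the negative log-likelihood of the observed targets,
# so the model learns both a prediction and its uncertainty.
nll = lambda y_true, y_dist: -y_dist.log_prob(y_true)
model.compile(optimizer='adam', loss=nll)
```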
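
For reference, the evidence lower bound (ELBO) mentioned in the prerequisites is the standard bound on the log marginal likelihood used in variational inference, for a latent-variable model with prior p(z), likelihood p(x|z) and approximate posterior q(z|x):

```latex
\log p(x) \;\ge\;
\mathbb{E}_{q(z \mid x)}\big[\log p(x \mid z)\big]
\;-\; \mathrm{KL}\big(q(z \mid x)\,\big\|\,p(z)\big)
```

Maximising this bound with respect to the parameters of q and p is the training objective of the variational autoencoders developed in week 4.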
Probabilistic Deep Learning with TensorFlow 2 at Coursera Curriculum
TensorFlow Distributions
Welcome to Probabilistic Deep Learning with TensorFlow 2
Interview with Paige Bailey
The TensorFlow Probability library
Univariate distributions
[Coding tutorial] Univariate distributions
Multivariate distributions
[Coding tutorial] Multivariate distributions
The Independent distribution
[Coding tutorial] The Independent distribution
Sampling and log probs
[Coding tutorial] Sampling and log probs
Trainable distributions
[Coding tutorial] Trainable distributions
Wrap up and introduction to the programming assignment
About Imperial College & the team
How to be successful in this course
Grading policy
Additional readings & helpful references
[Knowledge check] Standard distributions
Probabilistic layers and Bayesian neural networks
Welcome to week 2 - Probabilistic layers and Bayesian neural networks
The need for uncertainty in deep learning models
The DistributionLambda layer
[Coding tutorial] The DistributionLambda layer
Probabilistic layers
[Coding tutorial] Probabilistic layers
The DenseVariational layer
[Coding tutorial] The DenseVariational layer
Reparameterization layers
[Coding tutorial] Reparameterization layers
Wrap up and introduction to the programming assignment
Sources of uncertainty
Bijectors and normalising flows
Welcome to week 3 - Bijectors and normalising flows
Interview with Doug Kelly
Bijectors
[Coding tutorial] Bijectors
The TransformedDistribution class
[Coding tutorial] The TransformedDistribution class
Subclassing bijectors
[Coding tutorial] Subclassing bijectors
Autoregressive flows
RealNVP
[Coding tutorial] Normalising flows
Wrap up and introduction to the programming assignment
Change of variables formula
Variational autoencoders
Welcome to week 4 - Variational autoencoders
Encoders and decoders
[Coding tutorial] Encoders and decoders
Minimising KL divergence
[Coding tutorial] Minimising KL divergence
Maximising the ELBO
[Coding tutorial] Maximising the ELBO
KL divergence layers
[Coding tutorial] KL divergence layers
Wrap up and introduction to the programming assignment
Variational autoencoders
Capstone Project
Welcome to the Capstone Project
Goodbye video