DeepLearning.AI - Custom and Distributed Training with TensorFlow
- Offered by Coursera
Custom and Distributed Training with TensorFlow at Coursera Overview
| Particulars | Details |
| --- | --- |
| Duration | 24 hours |
| Start from | Start Now |
| Total fee | Free |
| Mode of learning | Online |
| Difficulty level | Intermediate |
| Official Website | Explore Free Course |
| Credential | Certificate |
Custom and Distributed Training with TensorFlow at Coursera Highlights
- Shareable Certificate Earn a Certificate upon completion
- 100% online Start instantly and learn at your own schedule.
- Course 2 of 4 in the TensorFlow: Advanced Techniques Specialization
- Flexible deadlines Reset deadlines in accordance with your schedule.
- Intermediate Level: basic calculus, linear algebra, and statistics; knowledge of AI and deep learning; experience with Python, a TF/Keras/PyTorch framework, decorators, and context managers
- Approx. 24 hours to complete
- English Subtitles: English
Custom and Distributed Training with TensorFlow at Coursera Course details
- In this course, you will:
- Learn about Tensor objects, the fundamental building blocks of TensorFlow, understand the difference between the eager and graph modes in TensorFlow, and learn how to use a TensorFlow tool to calculate gradients.
- Build your own custom training loops using GradientTape and TensorFlow Datasets to gain more flexibility and visibility with your model training.
- Learn about the benefits of generating code that runs in graph mode, take a peek at what graph code looks like, and practice generating this more efficient code automatically with TensorFlow's tools.
- Harness the power of distributed training to process more data and train larger models, faster; get an overview of various distributed training strategies; and practice working with a strategy that trains on multiple GPU cores, and another that trains on multiple TPU cores.
- The DeepLearning.AI TensorFlow: Advanced Techniques Specialization introduces the features of TensorFlow that provide learners with more control over their model architecture and tools that help them create and train advanced ML models.
- This Specialization is for early and mid-career software and machine learning engineers with a foundational understanding of TensorFlow who are looking to expand their knowledge and skill set by learning advanced TensorFlow features to build powerful models.
Custom and Distributed Training with TensorFlow at Coursera Curriculum
Differentiation and Gradients
A conversation with Andrew Ng: Overview of course 2
What is a tensor?
Creating tensors in code
Math operations with tensors
Basic Tensors code walkthrough
Broadcasting, operator overloading and Numpy compatibility
Evaluating variables and changing data types
Gradient Tape
Gradient Descent using Gradient Tape
Calculate gradients on higher order functions
persistent=True and higher order gradients
Gradient Tape basics code walkthrough
Connect with your mentors and fellow learners on Slack!
Reference: CNN for visual recognition
Tensors and Gradient Tape
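The module above revolves around tf.GradientTape. As a quick illustration (not part of the course materials), here is a minimal sketch, assuming TensorFlow 2.x, of the ideas the lessons list: creating a variable, nesting tapes for higher-order gradients, and reusing a single tape with persistent=True. The function y = x**3 and the variable names are illustrative only.

```python
import tensorflow as tf

x = tf.Variable(3.0)  # a trainable scalar; tapes watch variables automatically

# Nested tapes: the outer tape differentiates the gradient computed by the inner one.
with tf.GradientTape() as outer_tape:
    with tf.GradientTape() as inner_tape:
        y = x ** 3
    dy_dx = inner_tape.gradient(y, x)    # 3 * x**2 -> 27.0
d2y_dx2 = outer_tape.gradient(dy_dx, x)  # 6 * x    -> 18.0

# persistent=True lets the same tape be queried more than once.
with tf.GradientTape(persistent=True) as tape:
    y = x ** 2
    z = y ** 2
dz_dx = tape.gradient(z, x)      # 4 * x**3 -> 108.0
dy_dx_two = tape.gradient(y, x)  # 2 * x    -> 6.0
del tape  # release the tape's resources once all gradients are taken

print(d2y_dx2.numpy(), dz_dx.numpy(), dy_dx_two.numpy())
```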
Custom Training
Custom Training Loop steps
Loss and gradient descent
Define Training Loop and Validate Model
Training Basics code walkthrough
Training steps and data pipeline
Define the training loop
Gradients, metrics, and validation
Fashion MNIST Custom Training Loop code walkthrough
Reference: tf.keras.metrics
Custom Training
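To make the steps listed in the custom-training module concrete, here is a minimal custom training loop sketch, assuming TensorFlow 2.x: forward pass and loss under a GradientTape, gradients applied with an optimizer, and a Keras metric tracked per epoch. The tiny random dataset and one-layer model are placeholders standing in for the Fashion MNIST pipeline used in the course.

```python
import tensorflow as tf

# Placeholder model, optimizer, loss, and metric (not the course's exact setup).
model = tf.keras.Sequential([tf.keras.layers.Dense(10)])
optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
train_acc = tf.keras.metrics.SparseCategoricalAccuracy()

# Toy data standing in for a tf.data pipeline such as Fashion MNIST.
x = tf.random.normal((256, 20))
y = tf.random.uniform((256,), maxval=10, dtype=tf.int32)
dataset = tf.data.Dataset.from_tensor_slices((x, y)).batch(32)

for epoch in range(2):
    for x_batch, y_batch in dataset:
        with tf.GradientTape() as tape:
            logits = model(x_batch, training=True)   # forward pass
            loss = loss_fn(y_batch, logits)          # compute loss
        grads = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(grads, model.trainable_variables))
        train_acc.update_state(y_batch, logits)      # track accuracy
    print(f"epoch {epoch}: loss={loss.numpy():.4f} acc={train_acc.result().numpy():.4f}")
    train_acc.reset_state()
```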
Graph Mode
Benefits of graph mode
Generating graph code
AutoGraph Basics code walkthrough
Control dependencies and flows
Loops and tracing variables
AutoGraph code walkthrough
Reference: Fizz Buzz
AutoGraph
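The graph-mode module above centers on tf.function and AutoGraph. Below is a minimal sketch, assuming TensorFlow 2.x, of how Python control flow is traced into graph operations and how the generated code can be inspected; the sum_of_multiples function is an illustrative stand-in in the spirit of the Fizz Buzz reference listed above.

```python
import tensorflow as tf

@tf.function
def sum_of_multiples(n, k):
    # AutoGraph rewrites this Python for/if into tf.while_loop / tf.cond ops.
    total = tf.constant(0)
    for i in tf.range(n):
        if i % k == 0:
            total += i
    return total

# Runs as a graph: 0 + 3 + 6 + 9 = 18
print(sum_of_multiples(tf.constant(10), tf.constant(3)))

# Peek at the graph code AutoGraph generates from the original Python function.
print(tf.autograph.to_code(sum_of_multiples.python_function))
```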
Distributed Training
Intro to distribution strategies
Types of distribution strategies
Converting code to the Mirrored Strategy
Mirrored Strategy code walkthrough
Custom Training for Multiple GPU Mirrored Strategy
Multi GPU Mirrored Strategy code walkthrough
TPU Strategy
TPU Strategy code walkthrough
Other Distributed Strategies
References used in Other Distributed Strategies
References
Acknowledgments
Distributed Strategy
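As a companion to the distributed-training module, here is a minimal sketch, assuming TensorFlow 2.x, of a custom training loop under tf.distribute.MirroredStrategy: variables are created inside strategy.scope(), the per-replica step is launched with strategy.run, and per-replica losses are combined with strategy.reduce. The model, data, batch size, and helper names (compute_loss, train_step) are placeholders rather than the course's exact code; the analogous TPU path would swap in tf.distribute.TPUStrategy with a TPU cluster resolver.

```python
import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()  # uses all visible GPUs, or falls back to CPU
print("Replicas in sync:", strategy.num_replicas_in_sync)

GLOBAL_BATCH_SIZE = 64

with strategy.scope():
    # Model, optimizer, and loss object are created inside the strategy scope.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(10),
    ])
    optimizer = tf.keras.optimizers.Adam()
    loss_obj = tf.keras.losses.SparseCategoricalCrossentropy(
        from_logits=True, reduction="none")

def compute_loss(labels, logits):
    # Average per-example losses over the global (not per-replica) batch size.
    per_example = loss_obj(labels, logits)
    return tf.nn.compute_average_loss(per_example, global_batch_size=GLOBAL_BATCH_SIZE)

# Placeholder data standing in for a real tf.data pipeline.
x = tf.random.normal((512, 20))
y = tf.random.uniform((512,), maxval=10, dtype=tf.int32)
dataset = tf.data.Dataset.from_tensor_slices((x, y)).batch(GLOBAL_BATCH_SIZE)
dist_dataset = strategy.experimental_distribute_dataset(dataset)

def train_step(inputs):
    features, labels = inputs
    with tf.GradientTape() as tape:
        logits = model(features, training=True)
        loss = compute_loss(labels, logits)
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

@tf.function
def distributed_train_step(inputs):
    # Run the step on every replica, then sum the per-replica losses.
    per_replica_losses = strategy.run(train_step, args=(inputs,))
    return strategy.reduce(tf.distribute.ReduceOp.SUM, per_replica_losses, axis=None)

for step, batch in enumerate(dist_dataset):
    loss = distributed_train_step(batch)
    print(f"step {step}: loss={loss.numpy():.4f}")
```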