
The Nuts and Bolts of Machine Learning

  • Offered by Coursera

The Nuts and Bolts of Machine Learning at Coursera: Overview

  • Duration: 33 hours
  • Total fee: Free
  • Mode of learning: Online
  • Difficulty level: Advanced
  • Official website: Explore Free Course
  • Credential: Certificate

The Nuts and Bolts of Machine Learning at Coursera: Highlights

  • Flexible deadlines: Reset deadlines in accordance with your schedule.
  • Shareable certificate: Earn a certificate upon completion.
  • 100% online: Start instantly and learn on your own schedule.
  • Coursera Labs: Includes hands-on learning projects.
  • Advanced level
  • Approx. 33 hours to complete
  • English, with English subtitles

The Nuts and Bolts of Machine Learning at Coursera: Course details

More about this course
  • This is the sixth of seven courses in the Google Advanced Data Analytics Certificate. In this course, you’ll learn about machine learning, which uses algorithms and statistics to teach computer systems to discover patterns in data. Data professionals use machine learning to help analyze large amounts of data, solve complex problems, and make accurate predictions. You’ll focus on the two main types of machine learning: supervised and unsupervised. You'll learn how to apply different machine learning models to business problems and become familiar with specific models such as Naive Bayes, decision tree, random forest, and more.
  • Google employees who currently work in the field will guide you through this course by providing hands-on activities that simulate relevant tasks, sharing examples from their day-to-day work, and helping you enhance your data analytics skills to prepare for your career.
  • Learners who complete the seven courses in this program will have the skills needed to apply for data science and advanced data analytics jobs. This certificate assumes prior knowledge of foundational analytical principles, skills, and tools covered in the Google Data Analytics Certificate.
  • By the end of this course, you will:
      - Apply feature engineering techniques using Python
      - Construct a Naive Bayes model (a minimal illustrative sketch follows this list)
      - Describe how unsupervised learning differs from supervised learning
      - Code a K-means algorithm in Python
      - Evaluate and optimize the results of a K-means model
      - Explore decision tree models, how they work, and their advantages over other types of supervised machine learning
      - Characterize bagging in machine learning, specifically for random forest models
      - Distinguish boosting in machine learning, specifically for XGBoost models
      - Explain tuning model parameters and how they affect performance and evaluation metrics
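To make the first few objectives concrete, here is a minimal, illustrative sketch of constructing and evaluating a Naive Bayes classifier with scikit-learn. The tiny DataFrame and column names are hypothetical stand-ins, not data or code from the course itself.

```python
# Minimal sketch: construct and evaluate a Naive Bayes classifier with scikit-learn.
# The example DataFrame below is hypothetical; the course's own datasets and
# notebooks are not reproduced here.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Hypothetical example data: two numeric features and a binary target.
df = pd.DataFrame({
    "feature_a": [1.2, 0.4, 3.1, 2.2, 0.9, 2.8, 1.7, 0.3],
    "feature_b": [0.7, 1.5, 0.2, 0.9, 1.1, 0.4, 0.8, 1.6],
    "target":    [1,   0,   1,   1,   0,   1,   1,   0],
})
X = df[["feature_a", "feature_b"]]
y = df["target"]

# Hold out a test set, fit the model, and score the predictions.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)
model = GaussianNB()
model.fit(X_train, y_train)
y_pred = model.predict(X_test)

print("accuracy :", accuracy_score(y_test, y_pred))
print("precision:", precision_score(y_test, y_pred, zero_division=0))
print("recall   :", recall_score(y_test, y_pred, zero_division=0))
print("f1       :", f1_score(y_test, y_pred, zero_division=0))
```

The same train/fit/predict/score pattern recurs in the later weeks with different model classes.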

The Nuts and Bolts of Machine Learning at Coursera: Curriculum

The different types of machine learning

Introduction to Course 6

Susheela: Delight people with data

Welcome to week 1

The main types of machine learning

Determine when features are infinite

Categorical features and classification models

Guide user interest with recommendation systems

Equity and fairness in machine learning

Build ethical models

Python for machine learning

Different types of Python IDEs

More about Python packages

Resources to answer programming questions

Your machine learning team

Samantha: Connect to the data professional community

Wrap-up

Helpful resources and tips

Course 6 overview

Case study: The Woobles: The power of recommendation systems to drive sales

Reference guide: Python for machine learning

Python libraries and packages

Find solutions online

Glossary terms from week 1

Test your knowledge: Introduction to machine learning

Test your knowledge: Categorical versus continuous data types and models

Test your knowledge: Machine learning in everyday life

Test your knowledge: Ethics in machine learning

Test your knowledge: Utilize the Python toolbelt for machine learning

Test your knowledge: Machine learning resources for data professionals

Weekly challenge 1
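Week 1 closes with the Python tooling topics listed above. As a point of reference only, and assuming a common-practice package stack rather than a list taken from the course materials, a typical import cell for this kind of work looks like the following.

```python
# Illustrative only: a typical import cell for the kind of Python machine
# learning work described in this course. The package choices are assumptions,
# not a list drawn from the course itself.
import numpy as np               # numerical arrays and math
import pandas as pd              # tabular data loading and manipulation
import matplotlib.pyplot as plt  # basic plotting
import seaborn as sns            # statistical visualization built on matplotlib
import sklearn                   # scikit-learn: models, metrics, model selection

print(np.__version__, pd.__version__, sklearn.__version__)
```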

Workflow for building complex models

Welcome to week 2

PACE in machine learning

Plan for a machine learning project

Ganesh: Overcome challenges and learn from your mistakes

Analyze data for a machine learning model

Introduction to feature engineering

Solve issues that come with imbalanced datasets

Feature engineering and class balancing

Introduction to Naive Bayes

Construct a Naive Bayes model with Python

Key evaluation metrics for classification models

Wrap-up

More about planning a machine learning project

Explore feature engineering

More about imbalanced datasets

Follow-along instructions: Feature engineering with Python

Naive Bayes classifiers

Follow-along instructions: Construct a Naive Bayes model with Python

More about evaluation metrics for classification models

Glossary terms from week 2

Test your knowledge: PACE in machine learning: The plan and analyze stages

Test your knowledge: PACE in machine learning: The construct and execute stages

Weekly challenge 2
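The week 2 items above introduce feature engineering and strategies for imbalanced datasets. A minimal sketch of both ideas, using hypothetical column names and simple minority-class upsampling with scikit-learn's resample utility, might look like this; it is illustrative only and does not reproduce the course notebooks.

```python
# Minimal sketch of two week-2 ideas: feature engineering and class balancing.
# Column names and values are hypothetical.
import pandas as pd
from sklearn.utils import resample

df = pd.DataFrame({
    "trip_distance": [1.1, 5.4, 2.3, 8.0, 0.9, 3.2],
    "payment_type":  ["card", "cash", "card", "card", "cash", "card"],
    "high_tip":      [0, 0, 0, 1, 0, 0],   # imbalanced binary target
})

# Feature engineering: derive a new feature and one-hot encode a categorical one.
df["long_trip"] = (df["trip_distance"] > 3.0).astype(int)
df = pd.get_dummies(df, columns=["payment_type"], drop_first=True)

# Class balancing by upsampling the minority class (one common approach).
majority = df[df["high_tip"] == 0]
minority = df[df["high_tip"] == 1]
minority_upsampled = resample(
    minority, replace=True, n_samples=len(majority), random_state=0
)
balanced = pd.concat([majority, minority_upsampled]).sample(frac=1, random_state=0)
print(balanced["high_tip"].value_counts())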

Unsupervised learning techniques

Welcome to week 3

Introduction to K-means

Use K-means for color compression with Python

Key metrics for representing K-means clustering

Inertia and silhouette coefficient metrics

Apply inertia and silhouette score with Python

Wrap-up

More about K-means

Follow-along instructions: Use K-means for color compression with Python

More about inertia and silhouette coefficient metrics

Follow-along instructions: Apply inertia and silhouette score with Python

Glossary terms from week 3

Test your knowledge: Explore unsupervised learning and K-means

Test your knowledge: Evaluate a K-means model

Weekly challenge 3
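Week 3 above centers on K-means and the inertia and silhouette metrics. A brief, illustrative sketch of comparing several values of k on synthetic data (not the course's datasets) could look like this.

```python
# Minimal sketch of the week-3 workflow: fit K-means for several values of k,
# then compare inertia and silhouette score. Synthetic blobs stand in for the
# course's data.
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=300, centers=4, cluster_std=1.0, random_state=42)

for k in range(2, 7):
    km = KMeans(n_clusters=k, n_init=10, random_state=42)
    labels = km.fit_predict(X)
    print(
        f"k={k}  inertia={km.inertia_:.1f}  "
        f"silhouette={silhouette_score(X, labels):.3f}"
    )
```

Lower inertia always accompanies larger k, so the silhouette score (or an elbow plot of inertia) is typically used to choose k.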

Tree-based modeling

Welcome to week 4

Tree-based modeling

Build a decision tree with Python

Tune a decision tree

Verify performance using validation

Tune and validate decision trees with Python

Bootstrap aggregation

Explore a random forest

Tuning a random forest

Build and cross-validate a random forest model with Python

Build and validate a random forest model using a validation data set

Introduction to boosting: AdaBoost

Gradient boosting machines

Tune a GBM model

Build an XGBoost model with Python

Wrap-up

Explore decision trees

Follow-along instructions: Build a decision tree with Python

Hyperparameter tuning

More about validation and cross-validation

Follow-along instructions: Tune and validate decision trees with Python

Bagging: How it works and why to use it

More about random forests

Follow-along instructions: Build and cross-validate a random forest model with Python

Reference guide: Random forest tuning

Case Study: Machine learning model unearths resourcing insights for Booz Allen Hamilton

More about gradient boosting

Reference guide: XGBoost tuning

Follow-along instructions: Build an XGBoost model with Python

Glossary terms from week 4

Test your knowledge: Additional supervised learning techniques

Test your knowledge: Tune tree-based models

Test your knowledge: Bagging

Test your knowledge: Boosting

Weekly challenge 4
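Week 4 above covers decision trees, random forests built with bagging, boosting, and hyperparameter tuning with validation. A compact, illustrative sketch of that workflow in scikit-learn, on synthetic data and with assumed grid values rather than the course's recommended settings, might look like the following.

```python
# Minimal sketch of the week-4 tree-based workflow: a single decision tree,
# then a cross-validated hyperparameter search over a random forest.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score

X, y = make_classification(n_samples=500, n_features=8, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# Baseline: one decision tree with a depth limit to curb overfitting.
tree = DecisionTreeClassifier(max_depth=4, random_state=1).fit(X_train, y_train)
print("decision tree F1:", f1_score(y_test, tree.predict(X_test)))

# Random forest (bagging of decision trees) tuned with 5-fold cross-validation.
param_grid = {
    "n_estimators": [50, 100],
    "max_depth": [4, 8, None],
    "min_samples_leaf": [1, 3],
}
search = GridSearchCV(
    RandomForestClassifier(random_state=1), param_grid, cv=5, scoring="f1"
)
search.fit(X_train, y_train)
print("best params:", search.best_params_)
print("random forest F1:", f1_score(y_test, search.best_estimator_.predict(X_test)))

# Boosted tree models (for example, xgboost.XGBClassifier) follow the same
# fit/predict pattern, with learning_rate and n_estimators among the usual
# first hyperparameters to tune.
```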

Course 6 end-of-course project

Welcome to week 5

Uri: Impress interviewers with your unique solutions

Introduction to Course 6 end-of-course portfolio project

End-of-course project wrap-up and tips for ongoing career success

Course wrap-up

Course 6 end-of-course portfolio project overview: Automatidata

Activity Exemplar: Create your Course 6 Automatidata project exemplar

Course 6 glossary

Get started on the next course

Activity: Create your Course 6 Automatidata project

Assess your Course 6 end-of-course project
