University of Washington - Machine Learning: Regression 

  • Offered by Coursera

Overview

  • Duration: 22 hours
  • Start from: Start Now
  • Total fee: Free
  • Mode of learning: Online
  • Official Website: Explore Free Course
  • Credential: Certificate

Highlights

  • Shareable Certificate: earn a certificate upon completion
  • 100% online: start instantly and learn on your own schedule
  • Course 2 of 4 in the Machine Learning Specialization
  • Flexible deadlines: reset deadlines in accordance with your schedule
  • Approx. 22 hours to complete
  • Language: English. Subtitles: Arabic, French, Portuguese (European), Italian, Vietnamese, Korean, German, Russian, English, Spanish

Course details

More about this course
  • Case Study: Predicting Housing Prices
  • In our first case study, predicting house prices, you will create models that predict a continuous value (price) from input features (square footage, number of bedrooms and bathrooms, ...). This is just one of the many places where regression can be applied. Other applications range from predicting health outcomes in medicine, stock prices in finance, and power usage in high-performance computing, to analyzing which regulators are important for gene expression.
  • In this course, you will explore regularized linear regression models for the tasks of prediction and feature selection. You will be able to handle very large sets of features and select between models of varying complexity. You will also analyze the impact of aspects of your data, such as outliers, on your selected models and predictions. To fit these models, you will implement optimization algorithms that scale to large datasets. (A minimal sketch of the headline prediction task follows this list.)
  • Learning outcomes: by the end of this course, you will be able to:
    - Describe the input and output of a regression model.
    - Compare and contrast bias and variance when modeling data.
    - Estimate model parameters using optimization algorithms.
    - Tune parameters with cross validation.
    - Analyze the performance of the model.
    - Describe the notion of sparsity and how LASSO leads to sparse solutions.
    - Deploy methods to select between models.
    - Exploit the model to form predictions.
    - Build a regression model to predict prices using a housing dataset.
    - Implement these techniques in Python.
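
To make the case study concrete, here is a minimal sketch of the prediction task described above, using NumPy on synthetic data. The feature names, coefficients, and data are illustrative assumptions, not the course's actual housing dataset or assignment code.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
sqft = rng.uniform(500, 4000, n)      # square footage (illustrative)
bedrooms = rng.integers(1, 6, n)      # number of bedrooms (illustrative)
price = 150 * sqft + 10_000 * bedrooms + rng.normal(0, 20_000, n)

# Design matrix with an intercept column of ones.
X = np.column_stack([np.ones(n), sqft, bedrooms])

# Ordinary least squares fit.
w, *_ = np.linalg.lstsq(X, price, rcond=None)
print("intercept, per-sqft, per-bedroom:", w)

# Predict the price of a (hypothetical) 2000 sqft, 3-bedroom house.
print("predicted price:", np.array([1.0, 2000.0, 3.0]) @ w)
```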

Curriculum

Welcome
  • Welcome!
  • What is the course about?
  • Outlining the first half of the course
  • Outlining the second half of the course
  • Assumed background
  • Important Update regarding the Machine Learning Specialization
  • Slides presented in this module
  • Reading: Software tools you'll need

Simple Linear Regression
  • A case study in predicting house prices
  • Regression fundamentals: data & model
  • Regression fundamentals: the task
  • Regression ML block diagram
  • The simple linear regression model
  • The cost of using a given line
  • Using the fitted line
  • Interpreting the fitted line
  • Defining our least squares optimization objective
  • Finding maxima or minima analytically
  • Maximizing a 1d function: a worked example
  • Finding the max via hill climbing
  • Finding the min via hill descent
  • Choosing stepsize and convergence criteria
  • Gradients: derivatives in multiple dimensions
  • Gradient descent: multidimensional hill descent
  • Computing the gradient of RSS
  • Approach 1: closed-form solution
  • Approach 2: gradient descent
  • Comparing the approaches
  • Influence of high leverage points: exploring the data
  • Influence of high leverage points: removing Center City
  • Influence of high leverage points: removing high-end towns
  • Asymmetric cost functions
  • A brief recap
  • Slides presented in this module
  • Optional reading: worked-out example for closed-form solution
  • Optional reading: worked-out example for gradient descent
  • Download notebooks to follow along
  • Fitting a simple linear regression model on housing data
  • Simple Linear Regression (quiz)
  • Fitting a simple linear regression model on housing data (quiz)
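
This module's two fitting approaches can be sketched in a few lines of NumPy: the closed-form least-squares solution, and gradient descent on the residual sum of squares (RSS). The toy data, step size, and iteration count below are illustrative assumptions, not values from the course.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 50)
y = 3.0 + 2.0 * x + rng.normal(0, 1.0, 50)   # true intercept 3, slope 2

# Approach 1: closed-form solution from setting the RSS gradient to zero.
slope = ((x * y).mean() - x.mean() * y.mean()) / ((x ** 2).mean() - x.mean() ** 2)
intercept = y.mean() - slope * x.mean()
print("closed form:", intercept, slope)

# Approach 2: gradient descent on RSS(w0, w1) = sum_i (y_i - w0 - w1 x_i)^2.
w0, w1, step = 0.0, 0.0, 2e-4   # step size chosen small enough to converge
for _ in range(20_000):
    err = y - (w0 + w1 * x)
    w0 -= step * (-2.0 * err.sum())         # partial derivative w.r.t. w0
    w1 -= step * (-2.0 * (err * x).sum())   # partial derivative w.r.t. w1
print("gradient descent:", w0, w1)
```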

Multiple Regression
  • Multiple regression intro
  • Polynomial regression
  • Modeling seasonality
  • Where we see seasonality
  • Regression with general features of 1 input
  • Motivating the use of multiple inputs
  • Defining notation
  • Regression with features of multiple inputs
  • Interpreting the multiple regression fit
  • Rewriting the single observation model in vector notation
  • Rewriting the model for all observations in matrix notation
  • Computing the cost of a D-dimensional curve
  • Computing the gradient of RSS
  • Approach 1: closed-form solution
  • Discussing the closed-form solution
  • Approach 2: gradient descent
  • Feature-by-feature update
  • Algorithmic summary of gradient descent approach
  • A brief recap
  • Slides presented in this module
  • Optional reading: review of matrix algebra
  • Exploring different multiple regression models for house price prediction
  • Numpy tutorial
  • Implementing gradient descent for multiple regression
  • Multiple Regression (quiz)
  • Exploring different multiple regression models for house price prediction (quiz)
  • Implementing gradient descent for multiple regression (quiz)
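
In matrix notation, as developed in this module, both fitting approaches carry over directly. The sketch below assumes a design matrix H with one row per observation, following the lectures' notation; the synthetic data and step size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 200, 3
H = np.column_stack([np.ones(n), rng.normal(size=(n, d))])  # intercept + d features
w_true = np.array([1.0, -2.0, 0.5, 3.0])
y = H @ w_true + rng.normal(0, 0.1, n)

# Approach 1: closed-form solution, w_hat = (H^T H)^{-1} H^T y.
w_closed = np.linalg.solve(H.T @ H, H.T @ y)

# Approach 2: gradient descent; the gradient of RSS(w) is -2 H^T (y - H w).
w = np.zeros(d + 1)
step = 1e-3
for _ in range(5_000):
    w = w + step * 2.0 * H.T @ (y - H @ w)

print(np.allclose(w, w_closed, atol=1e-6))  # both approaches agree
```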

Assessing Performance
  • Assessing performance intro
  • What do we mean by "loss"?
  • Training error: assessing loss on the training set
  • Generalization error: what we really want
  • Test error: what we can actually compute
  • Defining overfitting
  • Training/test split
  • Irreducible error and bias
  • Variance and the bias-variance tradeoff
  • Error vs. amount of data
  • Formally defining the 3 sources of error
  • Formally deriving why 3 sources of error
  • Training/validation/test split for model selection, fitting, and assessment
  • A brief recap
  • Slides presented in this module
  • Polynomial Regression
  • Assessing Performance (quiz)
  • Exploring the bias-variance tradeoff
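
The training/test distinction at the heart of this module can be sketched as follows: training error keeps decreasing as model complexity grows, while test error eventually rises again (overfitting). The polynomial degrees, noise level, and split below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(-1, 1, 60)
y = np.sin(3 * x) + rng.normal(0, 0.2, 60)

# Training/test split (roughly 70/30).
idx = rng.permutation(60)
train, test = idx[:42], idx[42:]

def rss(w, xs, ys):
    # Residual sum of squares of a fitted polynomial on (xs, ys).
    return float(((np.polyval(w, xs) - ys) ** 2).sum())

for degree in (1, 3, 6, 9):
    w = np.polyfit(x[train], y[train], degree)
    print(f"degree {degree}: train RSS {rss(w, x[train], y[train]):.3f}, "
          f"test RSS {rss(w, x[test], y[test]):.3f}")
```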

Ridge Regression
  • Symptoms of overfitting in polynomial regression
  • Overfitting demo
  • Overfitting for more general multiple regression models
  • Balancing fit and magnitude of coefficients
  • The resulting ridge objective and its extreme solutions
  • How ridge regression balances bias and variance
  • Ridge regression demo
  • The ridge coefficient path
  • Computing the gradient of the ridge objective
  • Approach 1: closed-form solution
  • Discussing the closed-form solution
  • Approach 2: gradient descent
  • Selecting tuning parameters via cross validation
  • K-fold cross validation
  • How to handle the intercept
  • A brief recap
  • Slides presented in this module
  • Download the notebook and follow along
  • Download the notebook and follow along
  • Observing effects of L2 penalty in polynomial regression
  • Implementing ridge regression via gradient descent
  • Ridge Regression (quiz)
  • Observing effects of L2 penalty in polynomial regression (quiz)
  • Implementing ridge regression via gradient descent (quiz)
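
A compact sketch of two ideas from this module: the closed-form ridge solution with the intercept left unpenalized (see "How to handle the intercept"), and K-fold cross validation for choosing the L2 penalty strength. The lambda grid, polynomial degree, and data are illustrative assumptions.

```python
import numpy as np

def ridge_fit(H, y, l2):
    # Closed form: w = (H^T H + l2 * I_mod)^{-1} H^T y, where I_mod is the
    # identity with a zero in the intercept position (intercept unpenalized).
    I_mod = np.eye(H.shape[1])
    I_mod[0, 0] = 0.0
    return np.linalg.solve(H.T @ H + l2 * I_mod, H.T @ y)

rng = np.random.default_rng(4)
n = 90
x = rng.uniform(0, 1, n)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n)
H = np.vander(x, 8, increasing=True)  # degree-7 polynomial features

# K-fold cross validation over an illustrative grid of penalty strengths.
k = 5
folds = np.array_split(rng.permutation(n), k)
for l2 in (0.0, 1e-3, 1e-1, 10.0):
    err = 0.0
    for i in range(k):
        val = folds[i]
        tr = np.concatenate([folds[j] for j in range(k) if j != i])
        w = ridge_fit(H[tr], y[tr], l2)
        err += ((H[val] @ w - y[val]) ** 2).sum()
    print(l2, err / n)  # average validation error per point
```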

Feature Selection & Lasso
  • The feature selection task
  • All subsets
  • Complexity of all subsets
  • Greedy algorithms
  • Complexity of the greedy forward stepwise algorithm
  • Can we use regularization for feature selection?
  • Thresholding ridge coefficients?
  • The lasso objective and its coefficient path
  • Visualizing the ridge cost
  • Visualizing the ridge solution
  • Visualizing the lasso cost and solution
  • Lasso demo
  • What makes the lasso objective different
  • Coordinate descent
  • Normalizing features
  • Coordinate descent for least squares regression (normalized features)
  • Coordinate descent for lasso (normalized features)
  • Assessing convergence and other lasso solvers
  • Coordinate descent for lasso (unnormalized features)
  • Deriving the lasso coordinate descent update
  • Choosing the penalty strength and other practical issues with lasso
  • A brief recap
  • Slides presented in this module
  • Download the notebook and follow along
  • Using LASSO to select features
  • Implementing LASSO using coordinate descent
  • Feature Selection and Lasso (quiz)
  • Using LASSO to select features (quiz)
  • Implementing LASSO using coordinate descent (quiz)
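
The lasso coordinate descent update derived in this module is a soft-thresholding step. Below is a minimal sketch for the normalized-features case, with the intercept left unpenalized; the data, penalty value, and fixed sweep count (rather than a convergence check) are illustrative assumptions.

```python
import numpy as np

def lasso_coordinate_descent(H, y, l1, n_sweeps=200):
    # Normalize features to unit 2-norm, as the lectures assume.
    norms = np.linalg.norm(H, axis=0)
    Hn = H / norms
    w = np.zeros(Hn.shape[1])
    for _ in range(n_sweeps):
        for j in range(len(w)):
            # rho_j: correlation of feature j with the residual that
            # excludes feature j's own contribution.
            residual = y - Hn @ w + Hn[:, j] * w[j]
            rho = Hn[:, j] @ residual
            if j == 0:                  # leave the intercept unpenalized
                w[j] = rho
            elif rho < -l1 / 2:
                w[j] = rho + l1 / 2     # soft-thresholding at l1/2
            elif rho > l1 / 2:
                w[j] = rho - l1 / 2
            else:
                w[j] = 0.0              # sparsity: coefficient exactly zero
    return w / norms  # undo the normalization

rng = np.random.default_rng(5)
n, d = 100, 8
H = np.column_stack([np.ones(n), rng.normal(size=(n, d - 1))])
w_true = np.array([2.0, 5.0, 0.0, 0.0, -3.0, 0.0, 0.0, 0.0])  # sparse truth
y = H @ w_true + rng.normal(0, 0.5, n)

w_hat = lasso_coordinate_descent(H, y, l1=20.0)
print(np.round(w_hat, 2))  # irrelevant features driven exactly to zero
```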

Nearest Neighbors & Kernel Regression
  • Limitations of parametric regression
  • 1-Nearest neighbor regression approach
  • Distance metrics
  • 1-Nearest neighbor algorithm
  • k-Nearest neighbors regression
  • k-Nearest neighbors in practice
  • Weighted k-nearest neighbors
  • From weighted k-NN to kernel regression
  • Global fits of parametric models vs. local fits of kernel regression
  • Performance of NN as amount of data grows
  • Issues with high-dimensions, data scarcity, and computational complexity
  • k-NN for classification
  • A brief recap
  • Slides presented in this module
  • Predicting house prices using k-nearest neighbors regression
  • Nearest Neighbors & Kernel Regression (quiz)
  • Predicting house prices using k-nearest neighbors regression (quiz)
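
Both plain and weighted k-NN regression from this module fit in a short sketch. Euclidean distance, the Gaussian kernel weights, k, and the bandwidth below are illustrative choices, not the course's prescribed settings.

```python
import numpy as np

def knn_predict(X_train, y_train, x_query, k=5):
    # Euclidean distances from the query to every training point.
    dists = np.linalg.norm(X_train - x_query, axis=1)
    nearest = np.argsort(dists)[:k]
    # Unweighted k-NN: simple average of the k nearest targets.
    return y_train[nearest].mean()

def weighted_knn_predict(X_train, y_train, x_query, k=5, bandwidth=1.0):
    dists = np.linalg.norm(X_train - x_query, axis=1)
    nearest = np.argsort(dists)[:k]
    # Gaussian kernel weights: closer neighbors count more.
    w = np.exp(-(dists[nearest] ** 2) / bandwidth ** 2)
    return (w @ y_train[nearest]) / w.sum()

rng = np.random.default_rng(6)
X = rng.uniform(0, 10, size=(200, 2))   # two input features, already scaled
y = 3 * X[:, 0] + X[:, 1] + rng.normal(0, 0.5, 200)

q = np.array([5.0, 2.0])
print(knn_predict(X, y, q), weighted_knn_predict(X, y, q))
```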

Closing Remarks
  • Simple and multiple regression
  • Assessing performance and ridge regression
  • Feature selection, lasso, and nearest neighbor regression
  • What we covered and what we didn't cover
  • Thank you!
  • Slides presented in this module

Admission Process

Important Dates
  • May 25, 2024: Course Commencement Date


Students Ratings & Reviews

4/5 (1 verified rating)

Ayush Jain rated Machine Learning: Regression, offered by Coursera, 4/5:
  • Learning Experience: The course is good and well organised, with clearly explained topics and hands-on exercises.
  • Faculty: The faculty are highly professional, there are a lot of practical examples, and all lectures are pre-recorded. The course is structured for a beginner in regression, the assessments are up to the mark, and the assignments help in understanding the topics correctly.
  • Course Support: The course does not provide any job assistance, but it is very helpful in getting relevant jobs.

Reviewed on 29 Aug 2022
