
UCSC - Bayesian Statistics: Mixture Models 

  • Offered by Coursera

Overview

Duration: 22 hours

Start from: Start Now

Total fee: Free

Mode of learning: Online

Difficulty level: Intermediate

Official Website: Explore Free Course

Credential: Certificate

Highlights

  • Earn a shareable certificate upon completion.
  • Flexible deadlines according to your schedule.

Course details

Skills you will learn
More about this course
  • Bayesian Statistics: Mixture Models introduces you to an important class of statistical models. The course is organized into five modules, each of which contains lecture videos, short quizzes, background reading, discussion prompts, and one or more peer-reviewed assignments. Statistics is best learned by doing, not just by watching videos, so the course is structured to help you learn through application.
  • Some exercises require the use of R, a freely available statistical software package. A brief tutorial is provided, but we encourage you to take advantage of the many other resources available online for learning R if you are interested (a short simulation sketch in R follows this list).
  • This is an advanced course, designed as the third in UC Santa Cruz's series on Bayesian statistics, after Herbie Lee's "Bayesian Statistics: From Concept to Data Analysis" and Matthew Heiner's "Bayesian Statistics: Techniques and Models." To succeed in the course, you should have some knowledge of, and comfort with, calculus-based probability, the principles of maximum-likelihood estimation, and Bayesian estimation.
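
For a flavor of the R exercises, the following is a minimal, illustrative sketch (not the course's own code) that simulates draws from a two-component Gaussian mixture; the weights, means, and standard deviations are arbitrary values chosen for illustration.

    # Hypothetical example: simulate n draws from a two-component Gaussian mixture.
    # The weights w, means mu and standard deviations sigma are arbitrary choices.
    set.seed(42)
    n     <- 1000
    w     <- c(0.6, 0.4)     # mixture weights (must sum to 1)
    mu    <- c(-2, 3)        # component means
    sigma <- c(1, 0.5)       # component standard deviations

    # Step 1: draw a component indicator for each observation
    z <- sample(1:2, size = n, replace = TRUE, prob = w)

    # Step 2: draw each observation from its assigned component
    x <- rnorm(n, mean = mu[z], sd = sigma[z])

    # Visual check: the histogram should show two modes, near -2 and 3
    hist(x, breaks = 50, main = "Two-component Gaussian mixture", xlab = "x")

Sampling the component indicator first and then the observation mirrors the hierarchical representation of mixture models discussed in the first module.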

Curriculum

Basic concepts on Mixture Models

  • Welcome to Bayesian Statistics: Mixture Models
  • Installing and using R
  • Basic definitions
  • Mixtures of Gaussians
  • Zero-inflated mixtures
  • Hierarchical representations
  • Sampling from a mixture model
  • The likelihood function
  • Parameter identifiability
  • An Introduction to R
  • Example of a bimodal mixture of Gaussians
  • Example of a unimodal and skewed mixture of Gaussians
  • Example of a unimodal, symmetric and heavy tailed mixture of Gaussians
  • Example of a zero-inflated negative binomial distribution
  • Example of a zero-inflated log Gaussian distribution
  • Sample code for simulating from a Mixture Model
  • Basic definitions
  • Mixtures of Gaussians
  • Zero-inflated distributions
  • Definition of Mixture Models
  • The likelihood function
  • Identifiability
  • Likelihood function for mixture models

Maximum likelihood estimation for Mixture Models

  • EM for general mixtures
  • EM for location mixtures of Gaussians
  • EM example 1
  • EM example 2
  • Sample code for EM example 1
  • Sample code for EM example 2

Bayesian estimation for Mixture Models

  • Markov Chain Monte Carlo algorithms, part 1
  • Markov Chain Monte Carlo algorithms, part 2
  • MCMC for location mixtures of normals, part 1
  • MCMC for location mixtures of normals, part 2
  • MCMC example 1
  • MCMC example 2
  • Sample code for MCMC example 1
  • Sample code for MCMC example 2

Applications of Mixture Models

  • Density estimation using Mixture Models
  • Density Estimation Example
  • Mixture Models for Clustering
  • Clustering example
  • Mixture Models and naive Bayes classifiers
  • Linear and quadratic discriminant analysis in the context of Mixture Models
  • Classification example
  • Sample code for density estimation problems
  • Sample EM algorithm for clustering problems
  • Sample EM algorithm for classification problems

Practical considerations

  • Numerical stability
  • Computational issues associated with multimodality
  • Bayesian Information Criteria (BIC)
  • Bayesian Information Criteria Example
  • Estimating the number of components in Bayesian settings
  • Estimating the full partition structure in Bayesian settings
  • Example: Bayesian inference for the partition structure
  • Sample code to illustrate numerical stability issues
  • Sample code to illustrate multimodality issues 1
  • Sample code to illustrate multimodality issues 2
  • Sample code: Bayesian Information Criteria
  • Sample code for estimating the number of components and the partition structure in Bayesian models
  • Computational considerations for Mixture Models
  • Bayesian Information Criteria (BIC)
  • Estimating the number of components in Bayesian settings
  • Estimating the partition structure in Bayesian models
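
As a rough illustration of the material in the maximum likelihood module, here is a minimal EM sketch for a two-component location mixture of Gaussians with a common standard deviation across components. It is not the course's sample code: the function name em_mixture, the fixed number of iterations, and the starting values are all arbitrary choices, and no convergence check is included.

    # Hypothetical EM sketch for a K = 2 location mixture of Gaussians
    # with a common standard deviation; x is a numeric data vector.
    em_mixture <- function(x, iter = 100) {
      n  <- length(x)
      w  <- c(0.5, 0.5)                  # initial mixture weights
      mu <- quantile(x, c(0.25, 0.75))   # crude initial component means
      s  <- sd(x)                        # initial common standard deviation
      for (t in 1:iter) {
        # E-step: responsibility of each component for each observation
        d <- cbind(w[1] * dnorm(x, mu[1], s), w[2] * dnorm(x, mu[2], s))
        r <- d / rowSums(d)
        # M-step: update weights, means and the common standard deviation
        w  <- colMeans(r)
        mu <- colSums(r * x) / colSums(r)
        s  <- sqrt(sum(r * outer(x, mu, "-")^2) / n)
      }
      list(weights = w, means = mu, sd = s)
    }

Applied to simulated data such as the draws in the sketch above (em_mixture(x)), the estimated means should land close to the true component means.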

Admission Process

    Important Dates

    May 25, 2024: Course Commencement Date
