Master’s Theorem and Its Use in Calculating Time Complexity

Updated on Sep 20, 2022 12:26 IST

Author: Kanika Joshi

The Master’s Theorem is a formula used to analyze the time complexity of recursive relations. Now, the question arises: what are recursive relations?


In this article, we will look at what exactly the Master’s Theorem is and how to calculate the time complexity of any recursive program using it.


Recursive relations or functions are functions that call themselves until a base condition is reached. A recursive relation can lead to infinite recursion if there is no terminating or base condition for the function. The Master’s Theorem is a formula that helps identify the time complexity of a recursive relation quickly and easily.
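For instance, a minimal recursive function with a base condition might look like this (a hypothetical Python sketch, not taken from any particular program):

```python
# A simple recursive function: factorial(n) calls itself on a smaller
# input until the base condition (n <= 1) stops the recursion.
def factorial(n):
    if n <= 1:                       # base condition: terminates the recursion
        return 1
    return n * factorial(n - 1)      # recursive call on a smaller subproblem

print(factorial(5))  # 120
```

Without the `n <= 1` check, the calls would never terminate.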

What is Master’s Theorem?

Recursive functions call themselves repeatedly, and as the recursion grows more complex, calculating the time complexity of the program becomes more complicated. The Master’s Theorem is the easiest and fastest way to calculate the time complexity of recursive functions.

Master’s Theorem can be applied on:

  • Dividing functions
  • Decreasing functions

1. For Dividing Functions

This formula is usually used to calculate the time complexity of “Dividing” recursive relations, which are of the following form:

T(n) = aT(n/b) + f(n) 

where, 

f(n) = θ(n^k log^p n)

n is the input size

a should be greater than or equal to 1, i.e. a >= 1, and a is the number of subproblems in the recursive function.

b should be greater than 1, i.e. b > 1, and n/b is the size of each subproblem.

Master’s theorem on dividing functions can only be applied when the relation is of the form

T(n) = aT(n/b) + f(n), where f(n) = θ(n^k log^p n).

For example,

T(n) = 3T(n/4) + n log n

T(n) = 2T(n/3) + n^2

So, the master’s theorem for dividing function is stated as:

T(n) = aT(n/b) + θ(n^k log^p n)

Case I: If log_b a > k, then

  • T(n) = θ(n^(log_b a))

Case II: If log_b a = k, then

  • If p > -1, then T(n) = θ(n^k log^(p+1) n)
  • If p = -1, then T(n) = θ(n^k log log n)
  • If p < -1, then T(n) = θ(n^k)

Case III: If log_b a < k, then

  • If p >= 0, then T(n) = θ(n^k log^p n)
  • If p < 0, then T(n) = θ(n^k)
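The three cases above can be sketched as a small helper function. This is an illustrative Python sketch only; the name `master_dividing` is our own, not a library routine:

```python
import math

def master_dividing(a, b, k, p):
    """Classify T(n) = a*T(n/b) + θ(n^k log^p n) by the three cases.
    Returns the θ bound as a plain string (illustrative helper only)."""
    c = math.log(a, b)  # c = log_b(a)
    if math.isclose(c, k):                   # Case II: log_b(a) = k
        if p > -1:
            return f"θ(n^{k} log^{p + 1} n)"
        if p == -1:
            return f"θ(n^{k} log log n)"
        return f"θ(n^{k})"
    if c > k:                                # Case I: log_b(a) > k
        return f"θ(n^{c:g})"
    if p > 0:                                # Case III: log_b(a) < k
        return f"θ(n^{k} log^{p} n)"
    return f"θ(n^{k})"  # p = 0 gives log^0 n = 1; p < 0 also drops the log

# T(n) = 3T(n/4) + n log n  ->  a=3, b=4, k=1, p=1
print(master_dividing(3, 4, 1, 1))  # θ(n^1 log^1 n), i.e. θ(n log n)
# T(n) = 2T(n/3) + n^2      ->  a=2, b=3, k=2, p=0
print(master_dividing(2, 3, 2, 0))  # θ(n^2)
```

Reading a, b, k, and p off the recurrence and comparing log_b a with k is exactly the manual procedure used in the worked example below.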

Example: Calculate the time complexity of the following recursive relation:

T(n) = 2T(n/3) + n^2

Solution:

We can compare the recursive relation with the Master’s theorem, and identify the value of a, b, k, and p. 

So, after comparing the above recursive relation with the Master’s theorem form [T(n) = aT(n/b) + θ(n^k log^p n)], the values are a = 2, b = 3, k = 2, p = 0.

Next, compare log_b a with k:

log_3 2 ≈ 0.63 < 2, i.e. a < b^k (since 2 < 3^2 = 9)

From here, we can see that Case III will be applied. 

Since p = 0 (p >= 0), then T(n) = θ(n^k log^p n):

T(n) = θ(n^2 log^0 n)

T(n) = θ(n^2)

Therefore, the time complexity for the above relation is T(n) = θ(n^2).
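As a rough sanity check (an illustration, not a proof), we can evaluate the recurrence numerically, using integer division n//3 to approximate n/3, and watch T(n)/n^2 settle toward a constant:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(n/3) + n^2, with base case T(1) = 1; n//3 approximates n/3
    if n <= 1:
        return 1
    return 2 * T(n // 3) + n * n

for n in (100, 10_000, 1_000_000):
    print(n, T(n) / n**2)  # the ratio approaches a constant, so T(n) = θ(n^2)
```

The base case T(1) = 1 is an assumed value; any constant base case gives the same asymptotic behavior.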


2. For Decreasing Functions

This formula is usually used to calculate the time complexity of “Decreasing” recursive relations, which are of the following form:

T(n) = aT(n-b) + f(n) 

where, 

f(n) = θ(n^k)

a is the number of subproblems, with a > 0

n is the input size

n - b is the size of each subproblem, with b > 0

k >= 0

Here, the value of a (the number of subproblems) decides the time complexity of the relation.

Master’s theorem on decreasing functions can only be applied when the relation is of the form

T(n) = aT(n-b) + f(n), where f(n) = θ(n^k).

For example,

T(n) = 2T(n-4) + n

T(n) = 3T(n-1) + n^2

So, the master’s theorem for decreasing function is stated as:

T(n) = aT(n-b) + θ(n^k)

Case I: If a < 1, then T(n) = θ(n^k)

Case II: If a = 1, then T(n) = θ(n^(k+1))

Case III: If a > 1, then T(n) = θ(a^(n/b) * f(n))
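These three cases can again be sketched as a small helper. This is an illustrative Python sketch; the name `master_decreasing` is our own, not a library API:

```python
def master_decreasing(a, b, k):
    """Classify T(n) = a*T(n-b) + θ(n^k) by the three cases.
    Returns the θ bound as a plain string (illustrative helper only)."""
    if a < 1:
        return f"θ(n^{k})"                  # Case I
    if a == 1:
        return f"θ(n^{k + 1})"              # Case II
    return f"θ({a}^(n/{b}) * n^{k})"        # Case III: exponential in n

# T(n) = T(n-3) + n^2  ->  a=1, b=3, k=2
print(master_decreasing(1, 3, 2))  # θ(n^3)
# T(n) = 2T(n-4) + n   ->  a=2, b=4, k=1
print(master_decreasing(2, 4, 1))  # θ(2^(n/4) * n^1)
```

Note how a > 1 immediately forces exponential growth: each step only shrinks the input by a constant b, so the recursion tree has about n/b levels, each multiplying the work by a.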

Example: Calculate the time complexity of the following recursive relation:

T(n) = T(n-3) + θ(n^2)

Solution: 

First, check that a > 0, b > 0, and k >= 0. Here, a = 1, b = 3, and k = 2, so this can be solved using the master’s theorem for decreasing functions.

Here, a=1, so clearly Case II will be applied. 

T(n) = θ(n^(k+1))

T(n) = θ(n^(2+1))

T(n) = θ(n^3)

Therefore, the time complexity for the above relation is T(n) = θ(n^3).
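As a rough numeric check (an illustration, not a proof), evaluating the recurrence with an assumed base case T(1) = 1 shows T(n)/n^3 settling near the constant 1/9:

```python
def T(n):
    # Evaluate T(n) = T(n-3) + n^2 iteratively, with T(n) = 1 for n <= 1
    # (an iterative loop avoids Python's recursion depth limit).
    total = 1
    while n > 1:
        total += n * n
        n -= 3
    return total

for n in (300, 3000, 30000):
    print(n, T(n) / n**3)  # the ratio settles near 1/9, so T(n) = θ(n^3)
```

The sum n^2 + (n-3)^2 + (n-6)^2 + ... has about n/3 terms, which is why the cubic bound appears.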

Application:

  • Master’s Theorem can be used to solve the recurrences of algorithms such as Merge Sort and Binary Search.
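For instance, reading (a, b, k, p) off each recurrence, both Merge Sort [T(n) = 2T(n/2) + θ(n)] and Binary Search [T(n) = T(n/2) + θ(1)] fall under Case II of the dividing form. A small Python sketch with those values (illustrative only):

```python
import math

# (a, b, k, p) read directly off each algorithm's recurrence:
recurrences = {
    "merge sort":    dict(a=2, b=2, k=1, p=0),  # T(n) = 2T(n/2) + θ(n)
    "binary search": dict(a=1, b=2, k=0, p=0),  # T(n) = T(n/2) + θ(1)
}
bounds = {}
for name, r in recurrences.items():
    # Both satisfy log_b(a) = k with p > -1, i.e. Case II: θ(n^k log^(p+1) n)
    assert math.isclose(math.log(r["a"], r["b"]), r["k"])
    bounds[name] = f"θ(n^{r['k']} log n)"
print(bounds)  # merge sort: θ(n^1 log n); binary search: θ(n^0 log n) = θ(log n)
```

This recovers the familiar bounds θ(n log n) for Merge Sort and θ(log n) for Binary Search.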

Limitations:

  • Master’s Theorem can’t be applied to every recurrence relation of the form T(n) = aT(n/b) + f(n).
  • It can’t be applied when a is less than 1, or when a is not a constant (e.g. when a depends on n).
  • It can’t be applied when f(n) is not a polynomial or when T(n) is not monotonic.

