1. Introduction

Many algorithms work by breaking down a problem of size n into sub-problems of a smaller size. Most divide-and-conquer algorithms, such as Binary Search and Merge Sort, are of this nature. The running time of such algorithms is naturally modeled as a recursive sequence. In this tutorial, we’ll go over the Master Theorem, which is a cookbook method for solving certain recurrence relations.

2. Statement of the Master Theorem

Consider an algorithm that solves a large problem of size n by:

  1. Dividing the problem into a smaller chunks, each of size \frac{n}{b}
  2. Recursively solving the a smaller sub-problems
  3. Recombining the solutions of the sub-problems

Let T(n) be the total work done in solving a problem of size n. Let f(n) be the cost of dividing the problem into sub-problems and then recombining their solutions – steps 1 and 3. Since T(n) is the cost corresponding to a problem of size n, T(n/b) is the cost corresponding to a sub-problem of size n/b. In step 2, we solve a such sub-problems, so the work done in step 2 equals aT\left(\frac{n}{b}\right). Putting everything together, the total work T(n) can be expressed as the sum:

    \[T(n) = aT\left(\frac{n}{b}\right) + f(n)\]
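
For instance, Merge Sort fits this template with a = 2, b = 2, and f(n) = \Theta(n). Here’s a minimal Python sketch (an illustrative implementation of our own, not taken from any particular library) that makes the three steps explicit:

    def merge_sort(xs):
        """Merge Sort: T(n) = 2*T(n/2) + Theta(n), i.e. a = 2, b = 2."""
        if len(xs) <= 1:                  # elementary problem of size 1
            return xs
        mid = len(xs) // 2                # step 1: divide into a = 2 chunks of size n/b
        left = merge_sort(xs[:mid])       # step 2: recursively solve the sub-problems
        right = merge_sort(xs[mid:])
        merged, i, j = [], 0, 0           # step 3: recombine, costing f(n) = Theta(n)
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    print(merge_sort([5, 2, 9, 1, 7]))    # [1, 2, 5, 7, 9]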

2.1. The Runtime for an Algorithm With Recurrence \mathbf{T(n)=aT(n/b)}

For simplicity, let’s first consider an algorithm with a recurrence of the form:

    \[T(n) = aT\left(\frac{n}{b}\right)\]

A recursion tree for the total amount of work done by this algorithm can be graphically drawn as:

[Figure: the root, a problem of size n, with a children, each a sub-problem of size n/b]

The runtime for each of the initial a nodes is T\left(\frac{n}{b}\right). This tree can be drawn further until it bottoms out, that is, until each tiny sub-problem is reduced to an elementary one of size 1:

[Figure: the full recursion tree, expanded level by level until every sub-problem has size 1]

At depth 1, each sub-problem has size \frac{n}{b}; at depth 2, each sub-problem has size \frac{n}{b^2}; and at depth h, each sub-problem has size \frac{n}{b^h}. The height of the tree is given by the expression:

    \[\frac{n}{b^h} = 1 \quad \Longleftrightarrow \quad h = \log_b n\]

So, the tree has height h = \log_b n, and at depth i, it has a^i nodes. Hence, there are a^{\log_b n} = n^{\log_b a} leaf nodes. Each leaf node represents a problem of size 1, which takes T(1), or constant, runtime. Therefore, the total runtime is n^{\log_b a} \times T(1) = \Theta(n^{\log_b a}).
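
We can sanity-check the leaf-count identity a^{\log_b n} = n^{\log_b a} numerically; the values a = 9, b = 3, n = 729 below are arbitrary illustrative choices:

    import math

    # Numerical check of the leaf-count identity a^(log_b n) = n^(log_b a):
    a, b, n = 9, 3, 729
    h = math.log(n, b)            # height of the tree, log_b(n) ~ 6
    print(a ** h)                 # a^(log_b n) ~ 531441
    print(n ** math.log(a, b))    # n^(log_b a) ~ 531441, the same value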

2.2. The Master Theorem

Now, suppose an asymptotically positive function f is added to the recurrence relation:

    \[T(n) = aT\left(\frac{n}{b}\right) + f(n)\]

The Master Theorem asserts that it is possible to deduce the runtime T(n) of the algorithm based on a comparison between f and n^{\log_b a}.

The Master Theorem states the following. Given a recurrence of the form:

    \[T(n) = aT\left(\frac{n}{b}\right) + f(n)\]

For constants a \geq 1 and b > 1 and f asymptotically positive, the following statements are true:

  • Case 1. If f(n) = O(n^{\log_b a - \epsilon}), for some \epsilon > 0, then T(n) = \Theta(n^{\log_b a}).
  • Case 2. If f(n) = \Theta(n^{\log_b a}), then T(n)=\Theta(n^{\log_b a}\log n).
  • Case 3. If f(n) = \Omega(n^{\log_b a + \epsilon}) for some \epsilon > 0, and if af(n/b) \leq c f(n) for some constant c < 1 and all sufficiently large n, then T(n) = \Theta(f(n)).

Simply put, if f(n) is polynomially smaller than n^{\log_b a}, then n^{\log_b a} dominates and the runtime of the algorithm is \Theta(n^{\log_b a}). If f(n) is polynomially larger than n^{\log_b a}, then f(n) dominates and the runtime of the algorithm is \Theta(f(n)). Finally, if f(n) grows as fast as n^{\log_b a}, then the runtime of the algorithm is \Theta(n^{\log_b a}\log n).
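
To make the case analysis concrete, here’s a small hypothetical Python helper (our own illustration, not part of the theorem) restricted to f(n) = n^k, so that comparing f with n^{\log_b a} reduces to comparing the exponent k with \log_b a; the float comparison k == crit is adequate only for clean illustrative values like these:

    import math

    def solve_recurrence(a, b, k):
        """Classify T(n) = a*T(n/b) + Theta(n^k) for a >= 1, b > 1,
        assuming f(n) = n^k."""
        crit = math.log(a, b)          # the critical exponent log_b(a)
        if k < crit:                   # Case 1: f polynomially smaller
            return f"Theta(n^{crit:g})"
        if k == crit:                  # Case 2: f grows as fast as n^(log_b a)
            return f"Theta(n^{crit:g} log n)"
        # Case 3: f polynomially larger; the regularity condition
        # a*(n/b)^k <= c*n^k holds with c = a/b^k < 1 whenever k > log_b(a).
        return f"Theta(n^{k:g})"

    print(solve_recurrence(9, 3, 1))    # Theta(n^2)        (Case 1)
    print(solve_recurrence(1, 1.5, 0))  # Theta(n^0 log n) = Theta(log n)
    print(solve_recurrence(2, 2, 2))    # Theta(n^2)        (Case 3)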

2.3. Polynomially Larger vs. Asymptotically Larger

The Master Theorem does not provide a solution for all f. In particular, if f is smaller or larger than \mathbf{n^{\log_b a}}, but not by a polynomial factor, then none of the three cases is satisfied.

Polynomially larger means that the ratio of the functions falls between two polynomials asymptotically. Specifically, f(n) is polynomially larger than g(n) if and only if there exist generalized polynomials (fractional exponents are allowed) p(n), q(n), with p of positive degree, such that:

    \[p(n) \leq \frac{f(n)}{g(n)} \leq q(n)\]

For example, n\log n is asymptotically larger than n, because:

    \[\lim_{n\to \infty}\frac{n \log n}{n} = \lim_{n \to \infty} \log n = \infty\]

However, n \log n is not larger than n by a polynomial factor. The ratio is \log n, which grows more slowly than every positive power of n, so there is no polynomial p(n) of positive degree such that:

    \[p(n) \leq \frac{n \log n}{n} = \log n\]

Another example of this is e^n, which is not polynomially larger than n^{1000}. The function e^n grows too quickly: it is exponentially larger than n^{1000}. So, while e^n is asymptotically larger than n^{1000}, there is no polynomial q(n) such that:

    \[\frac{e^n}{n^{1000}} \leq q(n)\]
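
Both phenomena are easy to observe numerically. Here’s a quick illustrative check at a few sample points (not a proof); since e^n/n^{1000} overflows floating-point arithmetic, the second loop compares logarithms instead:

    import math

    # The ratio (n log n)/n = log n eventually drops below any fixed positive
    # power of n (here n^0.5), so it has no polynomial lower bound:
    for n in (10**2, 10**6, 10**12):
        print(n, math.log(n) / n**0.5)        # tends to 0

    # log(e^n / n^1000) = n - 1000*log(n) grows linearly in n, while the
    # logarithm of any polynomial q(n) grows only like log n:
    for n in (10**4, 10**5, 10**6):
        print(n, n - 1000 * math.log(n))      # grows without bound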

3. Examples

As a first example, let’s consider the recurrence:

    \[T(n) = 9T\left(\frac{n}{3}\right) + n\]

For this recurrence, we have a = 9, b = 3, and f(n) = n. Since the quantity n^{\log_b a} = n^{\log_3 9} = n^2 is polynomially larger than f(n) = n, Case 1 of the Master Theorem implies that the solution is T(n) = \Theta(n^2).
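
We can corroborate this numerically by evaluating the recurrence exactly at powers of 3 (with the arbitrary base case T(1) = 1):

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n):
        """Exact cost of T(n) = 9*T(n/3) + n, with T(1) = 1."""
        return 1 if n <= 1 else 9 * T(n // 3) + n

    for k in range(2, 8):
        n = 3 ** k
        print(n, T(n) / n**2)   # approaches 1.5, consistent with Theta(n^2)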

Next, we can consider the following:

    \[T(n) = T\left(\frac{2n}{3}\right) + 1\]

For this recurrence, we have a = 1, b = 3/2, and f(n) = 1. Since the quantity n^{\log_b a} = n^{\log_{3/2} 1} = n^0 = 1 grows as fast as f(n) = n^0, Case 2 of the Master Theorem implies that the solution is T(n)=\Theta(\log n).
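
A similar sanity check (with T(1) = 1, and integer division standing in for exact thirds) shows the logarithmic growth:

    import math

    def T(n):
        """Iterative cost of T(n) = T(2n/3) + 1, with T(1) = 1."""
        cost = 1                  # the base case T(1)
        while n > 1:
            cost += 1             # f(n) = 1 is paid at each level
            n = (2 * n) // 3      # the problem shrinks by a factor of b = 3/2
        return cost

    for n in (10**2, 10**4, 10**6):
        print(n, T(n) / math.log(n, 1.5))   # ratio stays near 1: Theta(log n)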

We can also see:

    \[T(n) = 3T\left(\frac{n}{4}\right) + n\log n\]

For this recurrence, we have a = 3, b = 4, and f(n) = n\log n. Let’s look at the ratio of n \log n to n^{\log_b a} = n^{\log_4 3}. We can find nice polynomial bounds for the ratio (n \log n)/(n^{\log_4 3}):

    \[n^{1 - \log_4 (7/2)} \leq \frac{n \log n}{n^{\log_4 3}} = n^{1 - \log_4 3} \log n \leq n^2\]

Consequently, n \log n is polynomially larger than n^{\log_4 3}. The regularity condition of Case 3 also holds, since af(n/b) = 3\left(\frac{n}{4}\right)\log\left(\frac{n}{4}\right) \leq \frac{3}{4}n\log n = cf(n) with c = 3/4 < 1. Case 3 of the Master Theorem therefore implies that the solution to the recurrence is T(n)=\Theta(n \log n).
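
The sandwich above can be spot-checked numerically at a few sample points (illustrative only, not a proof):

    import math

    # Check n^(1 - log_4(7/2)) <= n^(1 - log_4 3) * log(n) <= n^2 at samples:
    lo = 1 - math.log(3.5, 4)     # ~ 0.096, a positive exponent
    mid = 1 - math.log(3, 4)      # ~ 0.208
    for n in (10, 10**3, 10**6):
        print(n**lo <= n**mid * math.log(n) <= n**2)   # True at each sample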

Finally, let’s consider the recurrence:

    \[T(n) = 2T\left(\frac{n}{2}\right) + n\log n\]

For this recurrence, we have a = 2, b = 2, and f(n) = n \log n. As seen earlier, f(n) = n \log n is asymptotically larger than n^{\log_b a} = n^{\log_2 2} = n, but it is not larger by a polynomial factor. Hence, none of the three cases applies, and the Master Theorem cannot solve this recurrence.
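
We can watch Case 3 fail numerically. In the sketch below (with T(1) = 1 chosen arbitrarily), if f(n) = n \log n dominated, the ratio T(n)/(n \log n) would level off; instead, it keeps growing, consistent with the true solution \Theta(n\log^2 n) that other methods, such as the recursion-tree method, yield:

    import math
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n):
        """Exact cost of T(n) = 2*T(n/2) + n*log(n), with T(1) = 1."""
        return 1 if n <= 1 else 2 * T(n // 2) + n * math.log(n)

    for k in (4, 8, 16, 20):
        n = 2 ** k
        print(n, T(n) / (n * math.log(n)))   # keeps growing: not Theta(n log n)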

4. Proof of the Master Theorem

Let’s analyze the recurrence:

    \[T(n) = aT\left(\frac{n}{b}\right) + f(n)\]

The root of the tree now has an add-on cost f(n), and it has a children, each with the cost f\left(\frac{n}{b}\right). Each of these children has a children, resulting in a^2 nodes at depth 2, each with the cost f\left(\frac{n}{b^2}\right). In general, at depth j, there are a^j nodes, each with the cost f\left(\frac{n}{b^j}\right):

[Figure: the recursion tree with per-node costs – a^j nodes at depth j, each costing f(n/b^j)]

The cost of all the nodes at depth j is a^j f\left(\frac{n}{b^j}\right). So, the total cost of all the internal nodes is:

    \[\sum_{j=0}^{\log_b n - 1}a^j f\left(\frac{n}{b^j}\right)\]

As we’ve seen earlier, the total cost of the leaf nodes is \Theta(n^{\log_b a}). Therefore, the total running time T(n) is given by:

    \[T(n) = \Theta(n^{\log_b a}) + \sum_{j=0}^{\log_b n - 1}a^j f\left(\frac{n}{b^j}\right)\]
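
As a sanity check, we can evaluate this formula against the recurrence directly. The sketch below uses the first example’s constants a = 9, b = 3, f(n) = n, with T(1) = 1; the two computations agree exactly whenever n is a power of b:

    from functools import lru_cache

    a, b = 9, 3                   # illustrative constants from the first example
    def f(n): return n            # the divide-and-recombine cost

    @lru_cache(maxsize=None)
    def T(n):
        """Direct recursion: T(n) = a*T(n/b) + f(n), with T(1) = 1."""
        return 1 if n <= 1 else a * T(n // b) + f(n)

    def T_formula(n):
        """Leaf cost a^(log_b n) = n^(log_b a) plus the per-level sums."""
        levels, m = 0, n
        while m > 1:              # count log_b(n) levels without float error
            m //= b
            levels += 1
        leaf_cost = a ** levels   # n^(log_b a) leaves, each costing T(1) = 1
        internal = sum(a**j * f(n // b**j) for j in range(levels))
        return leaf_cost + internal

    n = 3 ** 7
    print(T(n), T_formula(n))     # identical values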

5. Conclusion

In this article, we’ve learned how to practically use the Master Theorem to compute the runtime of a recursive algorithm. Given a recurrence T(n) = aT(n/b) + f(n), the crux of the Master Theorem is to compare n^{\log_b a} with f(n).

Further, the Master Theorem only applies if f(n) grows as fast as n^{\log_b a}, or is polynomially smaller or polynomially larger than it. It’s not too difficult to construct pathological examples where one function is asymptotically larger than the other but not polynomially larger.
