1. Introduction
Many algorithms work by breaking down a problem of size $n$ into sub-problems of a smaller size. Most divide-and-conquer algorithms, such as Binary Search and Merge Sort, are of this nature. The running time of such algorithms is naturally modeled as a recursive sequence. In this tutorial, we’ll go over the master theorem, which is a cookbook method for solving certain recurrence relations.
2. Statement of the Master Theorem
Consider an algorithm that solves a large problem of size $n$ by:
- Dividing the problem into $a$ small chunks, each of size $n/b$
- Recursively solving each of the $a$ smaller sub-problems
- Recombining the solved sub-problems
Let $T(n)$ be the total work done in solving a problem of size $n$. Let $f(n)$ be the cost of dividing the problem into sub-problems and then recombining the sub-problems – steps 1 and 3. As cost $T(n)$ corresponds to size $n$, $T(n/b)$ is the cost corresponding to a sub-problem of size $n/b$. In step 2, we solve $a$ such sub-problems, so the work done in step 2 equals $aT(n/b)$. Putting everything together, the total work $T(n)$ can be expressed as the sum:

$T(n) = aT(n/b) + f(n)$
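For instance, Merge Sort, mentioned above, fits this template with $a = 2$, $b = 2$, and $f(n) = \Theta(n)$, giving the recurrence $T(n) = 2T(n/2) + \Theta(n)$. The sketch below is one minimal way to write it, with comments marking which step contributes which term (the function name is only for illustration):

```python
def merge_sort(xs):
    """Merge sort: a = 2 sub-problems, each of size n/b with b = 2,
    and f(n) = Theta(n) work to split and merge, so T(n) = 2T(n/2) + Theta(n)."""
    if len(xs) <= 1:                 # elementary problem of size 0 or 1
        return xs
    mid = len(xs) // 2               # step 1: divide into a = 2 chunks of size n/2
    left = merge_sort(xs[:mid])      # step 2: solve the sub-problems recursively
    right = merge_sort(xs[mid:])
    # step 3: recombine (merge) -- linear-time work, the f(n) term
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```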
2.1. The Runtime for an Algorithm With Recurrence $T(n) = aT(n/b)$
For simplicity, first, consider an algorithm with a recurrence of the form:

$T(n) = aT(n/b)$
A recursion tree for the total amount of work done by this algorithm can be graphically drawn as:
The runtime for each of the $a$ initial nodes is $T(n/b)$. This tree can be drawn further until it bottoms out, that is, until each tiny sub-problem is reduced to an elementary one of size 1:
At depth 1, each sub-problem has size $n/b$; at depth 2, each sub-problem has size $n/b^2$; and at depth $k$, each sub-problem has size $n/b^k$. The height of the tree, $h$, is given by the expression:

$\frac{n}{b^h} = 1 \implies h = \log_b n$
So, the tree has a height $h = \log_b n$, and at depth $k$, the tree has $a^k$ nodes. So, there are $a^{\log_b n} = n^{\log_b a}$ leaf nodes. Each leaf node represents a problem of size 1 that takes $O(1)$, or constant, runtime. Therefore, the total runtime is $\Theta(n^{\log_b a})$.
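To see this count concretely, here’s a small numeric sketch (with illustrative values $a = 4$ and $b = 2$, so $n^{\log_b a} = n^2$) that evaluates the bare recurrence for powers of $b$ and compares it against $n^{\log_b a}$:

```python
import math
from functools import lru_cache

a, b = 4, 2  # illustrative values, so n^(log_b a) = n^2

@lru_cache(maxsize=None)
def T(n):
    """Work of the bare recurrence T(n) = a * T(n/b), with T(1) = 1."""
    return 1 if n <= 1 else a * T(n // b)

for k in range(1, 8):
    n = b ** k                                  # keep n an exact power of b
    print(n, T(n), round(n ** math.log(a, b)))  # T(n) equals n^(log_b a)
```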
2.2. The Master Theorem
Intuitively speaking, when an asymptotically positive function $f(n)$ is added to the recurrence relation, we have:

$T(n) = aT(n/b) + f(n)$
The Master Theorem asserts that it is possible to deduce the runtime of the algorithm, $T(n)$, based on a relative comparison between $f(n)$ and $n^{\log_b a}$.
The Master Theorem is stated as follows. Given a recurrence of the form:

$T(n) = aT(n/b) + f(n)$
For constants $a \geq 1$ and $b > 1$, and $f(n)$ asymptotically positive, the following statements are true:
- Case 1. If $f(n) = O(n^{\log_b a - \epsilon})$ for some constant $\epsilon > 0$, then $T(n) = \Theta(n^{\log_b a})$.
- Case 2. If $f(n) = \Theta(n^{\log_b a})$, then $T(n) = \Theta(n^{\log_b a} \log n)$.
- Case 3. If $f(n) = \Omega(n^{\log_b a + \epsilon})$ for some constant $\epsilon > 0$, and if $a f(n/b) \leq c f(n)$ for some constant $c < 1$ and all sufficiently large $n$, then $T(n) = \Theta(f(n))$.
Simply put, if $f(n)$ is polynomially smaller than $n^{\log_b a}$, then $n^{\log_b a}$ dominates, and the runtime of the algorithm is $\Theta(n^{\log_b a})$. If $f(n)$ is polynomially larger than $n^{\log_b a}$, then $f(n)$ dominates, and the runtime of the algorithm is $\Theta(f(n))$. Finally, if $f(n)$ eventually grows as fast as $n^{\log_b a}$, then the runtime of the algorithm is $\Theta(n^{\log_b a} \log n)$.
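When $f(n)$ is itself a plain polynomial, say $f(n) = \Theta(n^d)$, “polynomially smaller/larger” reduces to comparing $d$ with $\log_b a$, and the case analysis becomes mechanical. The helper below is a minimal sketch of that special case (the function name and the restriction to polynomial $f(n)$ are assumptions of this illustration, not part of the theorem’s statement):

```python
import math

def master_theorem_polynomial(a, b, d):
    """Asymptotic solution of T(n) = a*T(n/b) + Theta(n^d), for a >= 1, b > 1.

    With a polynomial driving function, "polynomially smaller/larger" reduces
    to comparing d against log_b(a), and the regularity condition of case 3
    holds automatically.
    """
    c = math.log(a, b)                    # the critical exponent log_b a
    if d < c:
        return f"Theta(n^{c:g})"          # case 1: n^(log_b a) dominates
    if math.isclose(d, c):
        return f"Theta(n^{d:g} * log n)"  # case 2: both grow equally fast
    return f"Theta(n^{d:g})"              # case 3: f(n) dominates

print(master_theorem_polynomial(9, 3, 1))   # Theta(n^2)
print(master_theorem_polynomial(2, 2, 1))   # Theta(n^1 * log n)
print(master_theorem_polynomial(3, 4, 2))   # Theta(n^2)
```

For driving functions that aren’t plain polynomials, such as $n\log n$, this shortcut doesn’t apply, and we have to reason about “polynomially larger” directly, as discussed next.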
2.3. Polynomially Large vs. Asymptotically Large
The master theorem does not provide a solution for all $f(n)$. In particular, if $f(n)$ is smaller or larger than $n^{\log_b a}$ by less than a polynomial factor, then none of the three cases is satisfied.
Polynomially larger means that the ratio of the functions falls between two polynomials, asymptotically. Specifically, $f(n)$ is polynomially larger than $g(n)$ if and only if there exist generalized polynomials (fractional exponents are allowed) $n^{c_1}$ and $n^{c_2}$, with $c_1, c_2 > 0$, such that:

$n^{c_1} \leq \frac{f(n)}{g(n)} \leq n^{c_2}$

for all sufficiently large $n$.
For example, $n\log n$ is asymptotically larger than $n$, because:

$\lim_{n \to \infty} \frac{n\log n}{n} = \lim_{n \to \infty} \log n = \infty$
However, $n\log n$ is not larger than $n$ by a polynomial factor. There’s no polynomial $n^{c}$, $c > 0$, such that:

$n^{c} \leq \frac{n\log n}{n} = \log n$

for all sufficiently large $n$.
Another example of this is $2^n$, which is not polynomially larger than $n^2$. The function $2^n$ grows too quickly: it is exponentially larger than $n^2$. So, while $2^n$ is asymptotically larger than $n^2$, there is no polynomial $n^{c}$ such that:

$\frac{2^n}{n^2} \leq n^{c}$

for all sufficiently large $n$.
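The difference is easy to observe numerically. The short check below (an illustration, not a proof) prints the ratio $(n\log n)/n = \log n$ next to the small polynomial $n^{0.25}$, and the ratio $2^n/n^2$ next to $n^{10}$: the first ratio eventually drops below the polynomial, while the second eventually dwarfs it:

```python
import math

# (n log n) / n = log n eventually falls below even a small polynomial like n^0.25
for n in (10**2, 10**4, 10**6, 10**9, 10**12):
    print(f"n = {n:<16g} log n = {math.log(n):8.1f}   n^0.25 = {n ** 0.25:10.1f}")

# 2^n / n^2 eventually exceeds any fixed polynomial, here n^10
for n in (10, 100, 1000):
    print(f"n = {n:<6} 2^n / n^2 = {2**n / n**2:10.3e}   n^10 = {float(n)**10:10.3e}")
```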
3. Examples
As a first example, let’s consider the recurrence:

$T(n) = 9T(n/3) + n$
For this recurrence, we have $a = 9$, $b = 3$, $f(n) = n$. Since the quantity $n^{\log_b a}$, which is $n^{\log_3 9} = n^2$, is polynomially larger than $f(n) = n$, case 1 of the master theorem implies that the solution is $T(n) = \Theta(n^2)$.
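We can sanity-check this asymptotic estimate numerically; the snippet below (an illustrative check, with the base case $T(1) = 1$ chosen arbitrarily) evaluates the recurrence for powers of 3 and shows the ratio $T(n)/n^2$ settling toward a constant:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 9*T(n/3) + n, with an arbitrary base case T(1) = 1."""
    return 1 if n <= 1 else 9 * T(n // 3) + n

for k in range(2, 10):
    n = 3 ** k                  # powers of b = 3, so n // 3 is exact
    print(n, T(n) / n ** 2)     # the ratio levels off, consistent with Theta(n^2)
```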
Next, we can consider the following:

$T(n) = T(2n/3) + 1$
For this recurrence, we have $a = 1$, $b = 3/2$, $f(n) = 1$. Since the quantity $n^{\log_b a} = n^{\log_{3/2} 1} = n^0 = 1$ grows as fast as $f(n) = 1$, case 2 of the master theorem implies that the solution is $T(n) = \Theta(\log n)$.
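Here, each level of recursion contributes one unit of work, so the total work is simply the number of times we can shrink $n$ by a factor of $3/2$ before hitting the base case. A tiny illustrative check:

```python
import math

def work(n):
    """Counts the levels of T(n) = T(2n/3) + 1: one unit of work per level."""
    levels = 0
    while n > 1:
        n = (2 * n) // 3    # shrink the problem to two-thirds of its size
        levels += 1
    return levels

for n in (10, 10**3, 10**6, 10**9):
    print(n, work(n), round(math.log(n, 1.5), 1))  # roughly log_{3/2} n, up to rounding
```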
We can also see:

$T(n) = 3T(n/4) + n\log n$
For this recurrence, we have $a = 3$, $b = 4$, $f(n) = n\log n$. Let’s look at the ratio of the quantities $f(n) = n\log n$ to $n^{\log_b a} = n^{\log_4 3} \approx n^{0.793}$. We can find nice polynomial bounds for the ratio $\frac{n\log n}{n^{\log_4 3}} = n^{1 - \log_4 3}\log n \approx n^{0.21}\log n$:

$n^{0.2} \leq n^{0.21}\log n \leq n^{0.5}$

for all sufficiently large $n$.
Consequently, $f(n) = n\log n$ is polynomially larger than $n^{\log_4 3}$. The regularity condition of case 3 also holds, since $3f(n/4) = 3\,\frac{n}{4}\log\frac{n}{4} \leq \frac{3}{4}\,n\log n = \frac{3}{4}f(n)$. Case 3 of the master theorem therefore implies that the solution to the recurrence is $T(n) = \Theta(n\log n)$.
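As with the first example, a quick numeric check (illustrative only, again with base case $T(1) = 1$) shows the ratio $T(n)/(n\log n)$ slowly tending to a constant:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 3*T(n/4) + n*log(n), with an arbitrary base case T(1) = 1."""
    return 1 if n <= 1 else 3 * T(n // 4) + n * math.log(n)

for k in range(2, 11):
    n = 4 ** k                              # powers of b = 4, so n // 4 is exact
    print(n, T(n) / (n * math.log(n)))      # slowly approaches a constant: Theta(n log n)
```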
Finally, let’s consider the recurrence:

$T(n) = 2T(n/2) + n\log n$
For this recurrence, we have $a = 2$, $b = 2$, and $f(n) = n\log n$. As seen earlier, $n\log n$ is asymptotically larger than $n^{\log_2 2} = n$. But $n\log n$ is not polynomially larger than $n$, so none of the three cases applies, and the master theorem cannot be used to solve this recurrence.
4. Proof of the Master Theorem
Let’s analyze the recurrence:

$T(n) = aT(n/b) + f(n)$
The root of the tree now has an add-on cost $f(n)$, and it has $a$ children, each with the cost $f(n/b)$. Each of these children has $a$ children, resulting in $a^2$ nodes at depth 2, each with the cost $f(n/b^2)$. In general, at depth $k$, there are $a^k$ nodes, each with the cost $f(n/b^k)$:
The cost of all the nodes at depth $k$ is $a^k f(n/b^k)$. So, the total cost of all the internal nodes is:

$\sum_{k=0}^{\log_b n - 1} a^k f\!\left(\frac{n}{b^k}\right)$
As we’ve seen earlier, the total cost of the leaf nodes is $\Theta(n^{\log_b a})$. Therefore, the total running time is given by:

$T(n) = \Theta(n^{\log_b a}) + \sum_{k=0}^{\log_b n - 1} a^k f\!\left(\frac{n}{b^k}\right)$
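As a sanity check, we can evaluate this expression for the first example above, $T(n) = 9T(n/3) + n$ (so $a = 9$, $b = 3$, $f(n) = n$, and we take $n$ to be a power of 3). The internal-node sum is then a geometric series:

$\sum_{k=0}^{\log_3 n - 1} 9^k \cdot \frac{n}{3^k} = n \sum_{k=0}^{\log_3 n - 1} 3^k = n \cdot \frac{3^{\log_3 n} - 1}{3 - 1} = \frac{n^2 - n}{2} = \Theta(n^2)$

Adding the $\Theta(n^{\log_3 9}) = \Theta(n^2)$ contribution of the leaves gives $T(n) = \Theta(n^2)$, in agreement with case 1.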
5. Conclusion
In this article, we’ve learned how to use the Master Theorem in practice to compute the runtime of an algorithm. Given a recurrence $T(n) = aT(n/b) + f(n)$, the crux of the Master Theorem is to compare $f(n)$ with $n^{\log_b a}$.
Further, the Master Theorem only applies if $f(n)$ is polynomially smaller than, grows as fast as, or is polynomially larger than $n^{\log_b a}$. It’s not too difficult to construct pathological examples where one function is asymptotically larger than the other but not polynomially larger.