Com S 311 Section B Introduction to the Design and Analysis of Algorithms
Lecture Two for Week 3
Xiaoqiu (See-ow-chew) Huang
Iowa State University
February 11, 2021
Another example
Consider the recurrence, for some constant d > 0:
T(n) = d(n − 1) if n < 3,
T(n) = T(⌊n/3⌋) + T(⌊2n/3⌋) + dn if n ≥ 3.
The value of this function at any n can be obtained by using the recurrence. For example,
T(3) = T(1) + T(2) + 3d = 0 + d + 3d = 4d.
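As a sanity check, the recurrence can be evaluated directly. A minimal Python sketch (tracking the coefficient of d rather than d itself):

```python
from functools import lru_cache

# Evaluate T(n)/d for the recurrence
#   T(n) = d(n - 1)                        if n < 3
#   T(n) = T(floor(n/3)) + T(floor(2n/3)) + dn  if n >= 3
# Working with T(n)/d avoids carrying the constant d around.
@lru_cache(maxsize=None)
def t(n: int) -> int:
    if n < 3:
        return n - 1
    return t(n // 3) + t(2 * n // 3) + n

print(t(3))  # -> 4, matching T(3) = 4d
```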
Assume that n is a power of 3. Then
T (n) = T (n/3) + T (2n/3) + dn if n ≥ 3.
Replacing n by n/3 and 2n/3 in the above recurrence, we obtain
T(n/3) = T(n/3²) + T(2n/3²) + d(n/3), and
T(2n/3) = T(2n/3²) + T(4n/3²) + d(2n/3).
The sum of all costs at this level is d(n/3) + d(2n/3) = dn.
At the next level, we have
T(n/3²) = T(n/3³) + T(2n/3³) + d(n/3²),
T(2n/3²) = T(2n/3³) + T(4n/3³) + d(2n/3²),
T(4n/3²) = T(4n/3³) + T(8n/3³) + d(4n/3²).
The sum of all costs at this level is
d(n/3²) + 2d(2n/3²) + d(4n/3²) = dn.
The longest simple path from the root to a leaf is n → (2/3)n → (2/3)²n → ... → 1,
where (2/3)^k n = 1, or (3/2)^k = n.
Thus the height of the tree is k = log_(3/2) n.
The sum of costs over all levels is at most the number of levels times the cost per level, or O(dn log_(3/2) n) = O(n lg n).
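The O(n lg n) bound can be checked numerically against the recurrence itself. A sketch in Python; the helper t computes T(n)/d as defined above:

```python
import math
from functools import lru_cache

# T(n)/d for T(n) = T(floor(n/3)) + T(floor(2n/3)) + dn, with T(n) = d(n-1) for n < 3.
@lru_cache(maxsize=None)
def t(n: int) -> int:
    if n < 3:
        return n - 1
    return t(n // 3) + t(2 * n // 3) + n

# If T(n) = O(n lg n), the ratio T(n)/(d * n lg n) should stay bounded as n grows.
for n in (3**3, 3**6, 3**9, 3**12):
    print(n, t(n) / (n * math.log2(n)))
```

The ratios stay bounded by a small constant, consistent with the recursion-tree argument.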
Proving by mathematical induction
We prove by mathematical induction that T(n) ≤ cn lg n for some constant c > 0.
Base: For 1 ≤ n < 3, we have
T(n) = d(n − 1) ≤ cn lg n if c ≥ d.
Induction: Assume that for every 1 ≤ m < n, we have T(m) ≤ cm lg m. Setting m = ⌊n/3⌋ and m = ⌊2n/3⌋, respectively, we obtain
T(⌊n/3⌋) ≤ c⌊n/3⌋ lg⌊n/3⌋,
T(⌊2n/3⌋) ≤ c⌊2n/3⌋ lg⌊2n/3⌋.
Then we have
T(n) = T(⌊n/3⌋) + T(⌊2n/3⌋) + dn
≤ c(⌊n/3⌋) lg(⌊n/3⌋) + c(⌊2n/3⌋) lg(⌊2n/3⌋) + dn
≤ c(n/3) lg(n/3) + c(2n/3) lg(n/(3/2)) + dn
= [c(n/3) lg n − c(n/3) lg 3] + [c(2n/3) lg n − c(2n/3) lg(3/2)] + dn
= cn lg n − c[(n/3) lg 3 + (2n/3) lg(3/2)] + dn
= cn lg n − c[(n/3) lg 3 + (2n/3) lg 3 − (2n/3) lg 2] + dn
= cn lg n − cn[lg 3 − 2/3] + dn
≤ cn lg n,
if −cn[lg 3 − 2/3] + dn ≤ 0, i.e., c ≥ d/(lg 3 − (2/3)).
Note that lg 3 ≈ 1.585 > 2/3, so lg 3 − (2/3) > 0 and such a constant c exists.
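The constant from the proof can be tested numerically. A sketch with d = 1, so the proof's requirement becomes c = 1/(lg 3 − 2/3) ≈ 1.089 (which also satisfies the base-case requirement c ≥ d):

```python
import math
from functools import lru_cache

# T(n)/d: T(n) = d(n-1) if n < 3, else T(floor(n/3)) + T(floor(2n/3)) + dn.
@lru_cache(maxsize=None)
def t(n: int) -> int:
    if n < 3:
        return n - 1
    return t(n // 3) + t(2 * n // 3) + n

# With d = 1, the induction requires c >= d / (lg 3 - 2/3) ~= 1.089,
# and c >= d for the base case; this c satisfies both.
c = 1 / (math.log2(3) - 2 / 3)
assert all(t(n) <= c * n * math.log2(n) for n in range(2, 5000))
print("T(n) <= c n lg n holds for 2 <= n < 5000")
```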
The master method for solving recurrences
The master method is used to determine the running time T(n) of an algorithm that divides a problem of size n into a subproblems, each of size ⌈n/b⌉ or ⌊n/b⌋, where a ≥ 1 and b > 1 are positive constants.
The a subproblems are solved recursively, each in T(⌈n/b⌉) or T (⌊n/b⌋) time. Let the asymptotically positive function f (n) denote the cost of dividing the problem and of combining the results of the subproblems.
Then T(n) is defined by the following recurrence:
T(n) = aT(⌈n/b⌉) + f(n), or
T(n) = aT(⌊n/b⌋) + f(n).
The master method is based on the following theorem.
Master Theorem
Theorem 4.1 (Master theorem)
Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined by the above recurrence.
1. If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if
af(⌈n/b⌉) ≤ cf(n) (or af(⌊n/b⌋) ≤ cf(n)) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
Understanding the theorem
In other words, the solution to the recurrence is determined by the larger of f(n) and n^(log_b a).
In case 1, the function n^(log_b a) is the larger, and the solution is T(n) = Θ(n^(log_b a)).
In case 3, the function f(n) is the larger, and the solution is T(n) = Θ(f(n)).
In case 2, the two functions are equal to within a constant factor, and the solution T(n) = Θ(f(n) lg n) includes an extra logarithmic factor.
In case 1, f(n) must be asymptotically smaller than n^(log_b a) by a factor of n^ε for some constant ε > 0.
In case 3, f(n) must be asymptotically larger than n^(log_b a) by a factor of n^ε for some constant ε > 0, and in addition satisfy the 'regularity' condition that af(n/b) ≤ cf(n).
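For polynomial driving functions f(n) = n^k, the three cases reduce to comparing k with log_b a. A hypothetical helper (master_case is not a standard function, just a sketch of the comparison):

```python
import math

def master_case(a: int, b: int, k: float) -> str:
    """Classify T(n) = a T(n/b) + n^k by the master theorem.

    Handles only polynomial driving functions f(n) = n^k; for such f,
    the case-3 regularity condition a(n/b)^k <= c n^k holds automatically
    with c = a / b^k < 1.
    """
    e = math.log(a, b)  # log_b a
    if math.isclose(k, e):
        return f"case 2: T(n) = Theta(n^{e:g} lg n)"
    if k < e:
        return f"case 1: T(n) = Theta(n^{e:g})"
    return f"case 3: T(n) = Theta(n^{k:g})"

print(master_case(4, 2, 1.5))  # case 1
print(master_case(4, 2, 2.0))  # case 2
print(master_case(4, 2, 2.5))  # case 3
```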
Using the master method
Consider the recurrence
T(n) = 4T(⌊n/2⌋) + f(n), where
(a) f(n) = n^(3/2),
(b) f(n) = n²,
(c) f(n) = n^(5/2), or
(d) f(n) = n² lg n.
In this recurrence, a = 4 ≥ 1 and b = 2 > 1, and f(n) in each case is asymptotically positive, so the assumptions of the master theorem are met.
We compare f(n) with n^(log_b a) = n^(log_2 4) = n².
In case (a), f(n) = n^(3/2) = O(n^(2−ε)) for ε = 1/2. Case 1 of the master theorem applies to the function n^(3/2), so we have the solution T(n) = Θ(n²).
In case (b), f(n) = n² = Θ(n^(log_b a)). Case 2 of the master theorem applies to the function n², so we have the solution T(n) = Θ(n² lg n).
In case (c), f(n) = n^(5/2) = Ω(n^(2+ε)) for ε = 1/2. Moreover,
af(⌊n/b⌋) = 4⌊n/2⌋^(5/2) ≤ 4(n/2)^(5/2) = (4/2^(5/2)) n^(5/2) = cn^(5/2) for c = 4/2^(5/2) = 2^(−1/2) < 1,
so the regularity condition holds. Case 3 of the master theorem applies to the function n^(5/2), so we have the solution T(n) = Θ(n^(5/2)).
In case (d), f(n) = n² lg n. The master theorem does not apply: the ratio f(n)/n^(log_b a) = (n² lg n)/n² = lg n is asymptotically less than n^ε for every positive constant ε, so f(n) is larger than n² but not polynomially larger. This recurrence falls into the gap between case 2 and case 3.
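The gap can be illustrated numerically. A small sketch with n = 2^p, so lg n = p while n^ε = 2^(pε), for an illustrative ε = 0.01:

```python
# lg n = o(n^eps) for every fixed eps > 0: n^eps eventually overtakes lg n,
# however small eps is. Hence f(n) = n^2 lg n is asymptotically larger than
# n^2 but not polynomially larger, so neither case 2 nor case 3 applies.
eps = 0.01
for p in (10, 100, 1000, 10000):  # n = 2**p, so lg n = p and n^eps = 2**(p*eps)
    print(f"lg n = {p:>5}, n^eps = {2 ** (p * eps):.4g}")
```

For small p, lg n dominates n^0.01, but by p = 10000 the power n^ε is vastly larger, as the asymptotics require.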