CSC373 Fall’20 Tutorial 1 Tuesday, Sept. 15, 2020
Master Theorem (General Version):
For constants a ≥ 1 and b > 1, and an asymptotically positive function f(n), the recurrence relation T(n) ≤ a · T(n/b) + O(f(n)) has the following solution.
1. If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = O(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a) · log^k n) for some constant k ≥ 0, then T(n) = O(n^(log_b a) · log^(k+1) n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0 and f satisfies the regularity condition*, then T(n) = O(f(n)).
(*Regularity condition: For some constant c < 1 and all sufficiently large n, a · f(n/b) ≤ c · f(n).)
Note: There are recurrence relations which do not fall under any of these three cases (e.g. the recurrence relation T(n) ≤ T(n/5) + T(7n/10) + O(n) from QuickSelect, where the smaller instances are not of uniform size, or the recurrence relation T(n) ≤ √n · T(√n) + O(n), where a and b are not constants). If you’re interested in how more general recurrences can be solved, there are some excellent resources available online [1, 2].
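As a quick illustration (not part of the original handout), here is a small Python sketch that applies the theorem to recurrences of the form T(n) ≤ a · T(n/b) + O(n^d · log^k n); the function name master_case and the restriction to such polylogarithmic driving functions are assumptions made for this example.

import math

def master_case(a, b, d, k=0):
    """Classify T(n) <= a*T(n/b) + O(n^d * log^k n) via the Master Theorem.

    Sketch only: assumes a >= 1, b > 1, k >= 0, and f(n) = Theta(n^d * log^k n).
    Returns a human-readable description of the resulting bound.
    """
    crit = math.log(a, b)                 # critical exponent log_b(a)
    if math.isclose(d, crit):             # Case 2: f matches n^(log_b a) up to a log^k factor
        return f"Case 2: T(n) = O(n^{crit:.3f} * log^{k + 1} n)"
    if d < crit:                          # Case 1: f is polynomially smaller
        return f"Case 1: T(n) = O(n^{crit:.3f})"
    # Case 3: f is polynomially larger; regularity holds here since a / b^d < 1.
    return f"Case 3: T(n) = O(n^{d} * log^{k} n)" if k > 0 else f"Case 3: T(n) = O(n^{d})"

# Mergesort-style recurrence T(n) <= 2*T(n/2) + O(n):
print(master_case(a=2, b=2, d=1))         # Case 2: T(n) = O(n^1.000 * log^1 n)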
Q1 Practicing Recurrence Relations
Find the best possible asymptotic upper bound for T(n) under the following recurrence relations [3].
(a) T(n) ≤ 3 · T(n/2) + O(n log^3 n)
(b) T(n) ≤ 4 · T(n/2) + O(n^2)
(c) T(n) ≤ 2 · T(n/2) + O(n log^2 n)
(d) T(n) ≤ 2 · T(n/4) + O(n^0.5001)
Q2 Monotonic Function Evaluation
Consider a monotonically decreasing function f : N → Z (that is, a function defined on natural numbers which takes integer values and satisfies f(i) > f(i + 1) for all i ∈ N). Assuming we can evaluate f at any point i in constant time, we want to find n = min{i ∈ N | f(i) ≤ 0} (that is, we want to find the first point where f becomes non-positive). Note that n is not given to us, but we are told that some point i with f(i) ≤ 0 exists (i.e. n is well-defined), and we are allowed to express the running time of our algorithm in terms of n.
[1] http://jeffe.cs.illinois.edu/teaching/algorithms/notes/99-recurrences.pdf
[2] http://web.csulb.edu/~tebert/teaching/lectures/528/recurrence/recurrence.pdf
[3] Note that when proving an upper bound on the worst-case running time of an algorithm, you would encounter equations of the form T(n) ≤ … rather than T(n) = …, yielding T(n) = O(·) rather than T(n) = Θ(·). For a lower bound, you would need to explicitly construct instances on which the algorithm takes the desired amount of time.
We can obviously solve the problem in O(n) time by simply evaluating f(1),f(2),f(3),…,f(n). Describe an O(log n) time algorithm.
[Hint: Try to quickly get an estimate of n, and then precisely pinpoint the exact value of n in the range you estimated.]
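One way to realize the hint, sketched in Python below (not the official solution; the names f and first_nonpositive are placeholders): double an index until f becomes non-positive to bracket n within a factor of 2, then binary search inside that bracket. Both phases use O(log n) evaluations of f.

def first_nonpositive(f):
    """Return min{i : f(i) <= 0} for a strictly decreasing integer-valued f.

    Sketch of the doubling + binary search idea; assumes such an index exists
    and that the domain starts at i = 1.
    """
    # Phase 1: doubling. After this loop, f(hi) <= 0, so n lies in (hi // 2, hi].
    hi = 1
    while f(hi) > 0:
        hi *= 2
    # Phase 2: binary search for the first non-positive value in (hi // 2, hi].
    lo = hi // 2 + 1
    while lo < hi:
        mid = (lo + hi) // 2
        if f(mid) <= 0:
            hi = mid          # mid works; the answer is at or before mid
        else:
            lo = mid + 1      # mid is still positive; the answer is after mid
    return lo

# Toy check: f(i) = 100 - i first becomes non-positive at i = 100.
print(first_nonpositive(lambda i: 100 - i))   # prints 100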
Q3 Maximum Subarray Sum
You are given an array A[1…n], and you are asked to find the maximum subarray sum, that is, the maximum value of Σ_{t=i}^{j} A[t] over all possible (i, j) with 1 ≤ i ≤ j ≤ n. Design an O(n) time divide and conquer algorithm for the problem.
[Hint: Once you divide the array into two equal halves, say A[1…mid] and A[mid+1…n], you will get the maximum subarray sum within each half. What extra information do you need from each half?
If you spend O(n) time in the merge step to calculate this extra information, you will get O(n log n) running time. Can you get your recursive algorithm to return this information instead?]
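The Python sketch below (with an assumed interface returning four values; the handout does not fix one) shows how the strengthened recursion from the hint can look: each call reports the best subarray sum together with the best prefix sum, best suffix sum, and total of its half, so the combine step is O(1) and the recurrence becomes T(n) ≤ 2 · T(n/2) + O(1) = O(n).

def max_subarray_info(A, lo, hi):
    """Return (best, prefix, suffix, total) over the non-empty subarrays of A[lo..hi].

    best   = maximum subarray sum within A[lo..hi]
    prefix = maximum sum of a subarray starting at index lo
    suffix = maximum sum of a subarray ending at index hi
    total  = sum of all of A[lo..hi]
    """
    if lo == hi:                          # base case: a single element
        return A[lo], A[lo], A[lo], A[lo]
    mid = (lo + hi) // 2
    lb, lp, ls, lt = max_subarray_info(A, lo, mid)
    rb, rp, rs, rt = max_subarray_info(A, mid + 1, hi)
    best = max(lb, rb, ls + rp)           # entirely left, entirely right, or straddling mid
    prefix = max(lp, lt + rp)             # stays in the left half or spills into the right
    suffix = max(rs, rt + ls)             # symmetric for suffixes
    return best, prefix, suffix, lt + rt

# Example: the best subarray of [-2, 1, -3, 4, -1, 2, 1, -5, 4] is [4, -1, 2, 1], with sum 6.
A = [-2, 1, -3, 4, -1, 2, 1, -5, 4]
print(max_subarray_info(A, 0, len(A) - 1)[0])   # prints 6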