Algorithms COMP3121/9101
THE UNIVERSITY OF NEW SOUTH WALES
Algorithms
COMP3121/9101
School of Computer Science and Engineering
The University of New South Wales, Sydney
3. RECURRENCES
COMP3121/9101 1 / 22
Asymptotic notation
“Big Oh” notation: f(n) = O(g(n)) is an abbreviation for:
“There exist positive constants c and n0 such that
0 ≤ f(n) ≤ c g(n) for all n ≥ n0”.
In this case we say that g(n) is an asymptotic upper bound for f(n).
f(n) = O(g(n)) means that f(n) does not grow substantially faster
than g(n), because a multiple of g(n) eventually dominates f(n).
Clearly, the multiplicative constants c of interest will be larger than 1,
thus “enlarging” g(n).
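For a concrete f and g the definition can be checked mechanically. A minimal Python sketch (the functions and the witnesses c, n0 are my own illustrative choices, not from the slides):

```python
# Witnessing f(n) = O(g(n)) for f(n) = 3n + 10 and g(n) = n:
# choose c = 4 and n0 = 10; then 0 <= 3n + 10 <= 4n whenever n >= 10.

def f(n):
    return 3 * n + 10

def g(n):
    return n

c, n0 = 4, 10

def witness_holds(upto=10**6):
    """Finite sanity check of 0 <= f(n) <= c*g(n) for sampled n >= n0.

    This does not prove the bound -- the one-line algebra above does --
    but it catches a wrongly chosen witness quickly."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, upto, 997))
```

Note that the bound fails just below n0: f(9) = 37 > 36 = 4·g(9), which is why the definition only demands the inequality from some n0 on.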
Asymptotic notation
“Omega” notation: f(n) = Ω(g(n)) is an abbreviation for:
“There exist positive constants c and n0 such that
0 ≤ c g(n) ≤ f(n) for all n ≥ n0.”
In this case we say that g(n) is an asymptotic lower bound for f(n).
f(n) = Ω(g(n)) essentially says that f(n) grows at least as fast as
g(n), because f(n) eventually dominates a multiple of g(n).
Since c g(n) ≤ f(n) if and only if g(n) ≤ (1/c) f(n), we have
f(n) = Ω(g(n)) if and only if g(n) = O(f(n)).
“Theta” notation: f(n) = Θ(g(n)) if and only if f(n) = O(g(n))
and f(n) = Ω(g(n)); thus, f(n) and g(n) have the same
asymptotic growth rate.
Recurrences
Recurrences are important to us because they arise when estimating
the time complexity of divide-and-conquer algorithms.
Merge-Sort(A, p, r)   ▷ sorts A[p..r]
1 if p < r
2 then q ← ⌊(p + r)/2⌋
3 Merge-Sort(A, p, q)
4 Merge-Sort(A, q + 1, r)
5 Merge(A, p, q, r)
Since Merge(A, p, q, r) runs in linear time, the runtime T(n) of
Merge-Sort(A, p, r) satisfies
T(n) = 2 T(n/2) + Θ(n).
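The pseudocode above translates directly into runnable Python; a minimal sketch (the function names and in-place interface mirror the slide's Merge-Sort(A, p, r)):

```python
def merge(A, p, q, r):
    """Merge the sorted runs A[p..q] and A[q+1..r] (inclusive) in linear time."""
    left, right = A[p:q + 1], A[q + 1:r + 1]
    i = j = 0
    for k in range(p, r + 1):
        # Take from left while right is exhausted or left's head is smaller.
        if j >= len(right) or (i < len(left) and left[i] <= right[j]):
            A[k] = left[i]
            i += 1
        else:
            A[k] = right[j]
            j += 1

def merge_sort(A, p, r):
    """Sort A[p..r] in place; the runtime satisfies T(n) = 2 T(n/2) + Theta(n)."""
    if p < r:
        q = (p + r) // 2
        merge_sort(A, p, q)
        merge_sort(A, q + 1, r)
        merge(A, p, q, r)
```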
Recurrences
Let a ≥ 1 be an integer and b > 1 a real number, and assume that a
divide-and-conquer algorithm:
reduces a problem of size n to a many problems of smaller size n/b;
incurs an overhead cost of f(n) for splitting up the problem and
combining the solutions for size n/b into a solution for size n.
Then the time complexity of such an algorithm satisfies
T(n) = a T(n/b) + f(n).
Note: strictly speaking, we should be writing
T(n) = a T(⌊n/b⌋) + f(n),
but it can be shown that ignoring the integer parts and additive
constants is OK when it comes to obtaining the asymptotics.
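That the floors are harmless can be illustrated numerically (a sanity check, not a proof; the base case T(1) = 1 is my own choice):

```python
from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 2 T(floor(n/2)) + n with T(1) = 1, evaluated exactly."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# For exact powers of two the solution is Theta(n log n); with floors the
# ratio T(n) / (n log2 n) stays squeezed between fixed positive constants.
ratios = [T(n) / (n * log2(n)) for n in (10**3, 10**4, 10**5, 10**6)]
```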
Unwinding T(n) = a T(n/b) + f(n) gives a recursion tree:
level 0: 1 instance of size n;
level 1: a instances of size n/b;
level 2: a² instances of size n/b²;
. . .
last level: instances of size 1.
Some recurrences can be solved explicitly, but this tends to be
difficult.
Fortunately, to estimate the efficiency of an algorithm we do not need
the exact solution of a recurrence.
We only need to find:
1 the growth rate of the solution, i.e., its asymptotic behaviour;
2 the (approximate) sizes of the constants involved (more about
that later).
This is what the Master Theorem provides (when it is applicable).
Master Theorem:
Let a ≥ 1 be an integer and b > 1 a real number;
let f(n) > 0 be a non-decreasing function;
let T(n) be the solution of the recurrence T(n) = a T(n/b) + f(n). Then:
1 If f(n) = O(n^(log_b a − ε)) for some ε > 0, then T(n) = Θ(n^(log_b a));
2 If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) log_2 n);
3 If f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and for some c < 1 and some n0,
a f(n/b) ≤ c f(n) holds for all n > n0, then T(n) = Θ(f(n));
4 If none of these conditions holds, the Master Theorem is NOT applicable.
(But often the proof of the Master Theorem can be tweaked to obtain the
asymptotics of the solution T(n) even when the Master Theorem does
not apply; an example is T(n) = 2T(n/2) + n log n.)
Master Theorem – a remark
Note that for any b > 1,
log_b n = log_b 2 · log_2 n.
Since b > 1 is a constant (does not depend on n), we have, for
c = log_b 2 > 0,
log_b n = c log_2 n;
thus
log_b n = Θ(log_2 n) and log_2 n = Θ(log_b n).
So whenever we have f(n) = Θ(g(n) log n) we do not have to specify
the base of the log — all bases produce equivalent asymptotic
estimates (but we do have to specify b in expressions such as
n^(log_b a)).
Master Theorem – Examples
Let T(n) = 4T(n/2) + n;
then n^(log_b a) = n^(log_2 4) = n²;
thus f(n) = n = O(n^(2−ε)) for any ε with 0 < ε ≤ 1 (e.g., ε = 1).
The condition of Case 1 is satisfied; thus T(n) = Θ(n²).

Let T(n) = 2T(n/2) + 5n;
then n^(log_b a) = n^(log_2 2) = n¹ = n;
thus f(n) = 5n = Θ(n) = Θ(n^(log_2 2)).
Thus, the condition of Case 2 is satisfied, and so
T(n) = Θ(n^(log_2 2) log n) = Θ(n log n).
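Both conclusions can be sanity-checked numerically (a rough growth-rate check under the assumed base case T(1) = 1, not a proof):

```python
from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T1(n):
    """T(n) = 4 T(n/2) + n; Case 1 predicts Theta(n^2)."""
    return 1 if n <= 1 else 4 * T1(n // 2) + n

@lru_cache(maxsize=None)
def T2(n):
    """T(n) = 2 T(n/2) + 5n; Case 2 predicts Theta(n log n)."""
    return 1 if n <= 1 else 2 * T2(n // 2) + 5 * n

n = 2 ** 20
quadratic_ratio = T1(2 * n) / T1(n)   # doubling n should roughly quadruple T1
nlogn_ratio = T2(n) / (n * log2(n))   # should flatten out at a constant
```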
Master Theorem - Examples
Let T(n) = 3T(n/4) + n;
then n^(log_b a) = n^(log_4 3) < n^(0.8);
thus f(n) = n = Ω(n^(0.8+ε)) for any ε < 0.2.
Also, a f(n/b) = 3 f(n/4) = (3/4) n < c n = c f(n) for c = 0.9 < 1.
Thus, Case 3 applies, and T(n) = Θ(f(n)) = Θ(n).
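Here too the conclusion can be checked numerically (a sketch; the base case T(1) = 1 is my own choice): the ratio T(n)/n should approach the constant 1/(1 − 3/4) = 4.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 3 T(n/4) + n with T(1) = 1; Case 3 predicts Theta(n)."""
    return 1 if n <= 1 else 3 * T(n // 4) + n

# On exact powers of 4 the ratio T(n)/n climbs toward 4 from below.
ratio = T(4 ** 15) / 4 ** 15
```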
Let T(n) = 2T(n/2) + n log_2 n;
then n^(log_b a) = n^(log_2 2) = n¹ = n.
Thus, f(n) = n log_2 n = Ω(n).
However, f(n) = n log_2 n ≠ Ω(n^(1+ε)), no matter how small ε > 0 is.
This is because for every ε > 0 and every c > 0, no matter how
small, log_2 n < c · n^ε for all sufficiently large n.
Homework: Prove this.
Hint: Use L’Hôpital’s Rule to show that log n / n^ε → 0.
Thus, in this case the Master Theorem does not apply!
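Even though the theorem does not apply here, unwinding the recurrence (done at the end of this lecture) gives T(n) = Θ(n log² n); a quick numeric check under the assumed base case T(1) = 1:

```python
from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 2 T(n/2) + n log2(n) with T(1) = 1."""
    return 1 if n <= 1 else 2 * T(n // 2) + n * log2(n)

# T(n) / (n (log2 n)^2) should level off near 1/2, matching
# T(n) = n T(1) + n((log n)^2 / 2 + (log n)/2).
ratios = [T(2 ** k) / (2 ** k * k ** 2) for k in (10, 15, 20)]
```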
Master Theorem - Proof:
T(n) = a T(n/b) + f(n)    (1)
implies (by applying (1) to n/b in place of n)
T(n) = a² T(n/b²) + a f(n/b) + f(n),
and (by applying (1) to n/b² in place of n)
T(n) = a³ T(n/b³) + a² f(n/b²) + a f(n/b) + f(n),
and so on. Continuing in this way ⌊log_b n⌋ many times we get
T(n) = a^⌊log_b n⌋ T(n/b^⌊log_b n⌋) + a^(⌊log_b n⌋−1) f(n/b^(⌊log_b n⌋−1)) + . . . + a f(n/b) + f(n)
≈ a^(log_b n) T(1) + ∑_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i).
We now use a^(log_b n) = n^(log_b a):
T(n) ≈ n^(log_b a) T(1) + ∑_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i).
Note that so far we did not use any assumptions on f(n) . . .
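For n an exact power of b the unwinding is an identity, which can be confirmed directly; a sketch with arbitrary test choices (a = 3, b = 2, f(n) = n², T(1) = 1 are mine):

```python
# Identity check: for n = b^k,
#   T(n) = n^{log_b a} T(1) + sum_{i=0}^{log_b n - 1} a^i f(n / b^i),
# using n^{log_b a} = a^{log_b n} = a^k.
a, b = 3, 2

def f(n):
    return n * n  # arbitrary overhead function for the check

def T(n):
    return 1 if n == 1 else a * T(n // b) + f(n)

def unwound(n):
    k = n.bit_length() - 1  # log_b n, valid because b = 2 and n is a power of 2
    return a ** k + sum(a ** i * f(n // b ** i) for i in range(k))
```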
Master Theorem Proof:
Recall from the unwinding:
T(n) ≈ n^(log_b a) T(1) + ∑_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i).

Case 1: f(m) = O(m^(log_b a − ε)). Then
∑_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i) = O( ∑_{i=0}^{⌊log_b n⌋−1} a^i (n/b^i)^(log_b a − ε) )
= O( n^(log_b a − ε) ∑_{i=0}^{⌊log_b n⌋−1} a^i / (b^i)^(log_b a − ε) )
= O( n^(log_b a − ε) ∑_{i=0}^{⌊log_b n⌋−1} ( a / (b^(log_b a) · b^(−ε)) )^i )
= O( n^(log_b a − ε) ∑_{i=0}^{⌊log_b n⌋−1} (b^ε)^i )   (we are using b^(log_b a) = a)
= O( n^(log_b a − ε) · (b^(ε⌊log_b n⌋) − 1) / (b^ε − 1) )   (sum of a geometric series)
= O( n^(log_b a − ε) · (n^ε − 1) / (b^ε − 1) )   (since b^(ε⌊log_b n⌋) ≤ b^(ε log_b n) = n^ε)
= O( (n^(log_b a) − n^(log_b a − ε)) / (b^ε − 1) )
= O( n^(log_b a) ).
Since we had T(n) ≈ n^(log_b a) T(1) + ∑_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i), we get
T(n) ≈ n^(log_b a) T(1) + O(n^(log_b a)) = Θ(n^(log_b a)).
Master Theorem Proof:
Case 2: f(m) = Θ(m^(log_b a)). Then
∑_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i) = Θ( ∑_{i=0}^{⌊log_b n⌋−1} a^i (n/b^i)^(log_b a) )
= Θ( n^(log_b a) ∑_{i=0}^{⌊log_b n⌋−1} ( a / b^(log_b a) )^i )
= Θ( n^(log_b a) ∑_{i=0}^{⌊log_b n⌋−1} 1 )   (since b^(log_b a) = a)
= Θ( n^(log_b a) ⌊log_b n⌋ )
= Θ( n^(log_b a) log_2 n ),
because log_b n = log_2 n · log_b 2 = Θ(log_2 n). Since we had
T(n) ≈ n^(log_b a) T(1) + ∑_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i),
we get
T(n) ≈ n^(log_b a) T(1) + Θ( n^(log_b a) log_2 n ) = Θ( n^(log_b a) log_2 n ).
Master Theorem Proof:
Case 3: f(m) = Ω(m^(log_b a + ε)), and a f(n/b) ≤ c f(n) for some 0 < c < 1.
The second condition gives, by substitution, f(n/b) ≤ (c/a) f(n).
By chaining these inequalities we get
f(n/b²) ≤ (c/a) f(n/b) ≤ (c/a)² f(n),
f(n/b³) ≤ (c/a) f(n/b²) ≤ (c/a)³ f(n),
and, in general, f(n/b^i) ≤ (c/a)^i f(n). Consequently,
∑_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i) ≤ ∑_{i=0}^{⌊log_b n⌋−1} a^i (c/a)^i f(n)
= f(n) ∑_{i=0}^{⌊log_b n⌋−1} c^i < f(n) · 1/(1 − c).
Since we had
T(n) ≈ n^(log_b a) T(1) + ∑_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i),
and since f(n) = Ω(n^(log_b a + ε)) implies n^(log_b a) T(1) = O(f(n)), we get
T(n) = n^(log_b a) T(1) + O(f(n)) = O(f(n));
but we also have
T(n) = a T(n/b) + f(n) > f(n);
thus
T(n) = Θ(f(n)).
Master Theorem Proof: Homework
Exercise 1: Show that the condition
f(n) = Ω(n^(log_b a + ε)) for some ε > 0
follows from the condition
a f(n/b) ≤ c f(n) for some 0 < c < 1.

Example: Let us estimate the asymptotic growth rate of T(n) which satisfies
T(n) = 2T(n/2) + n log n.
Note: we have seen that the Master Theorem does NOT apply, but the
technique used in its proof still works! So let us just unwind the
recurrence and sum up the logarithmic overheads (all logs base 2):
T(n) = 2T(n/2) + n log n
= 4T(n/4) + n log(n/2) + n log n
= 8T(n/8) + n log(n/4) + n log(n/2) + n log n
= . . .
= n T(1) + n( log n + log(n/2) + . . . + log 2 )
= n T(1) + n( (log n)² − (1 + 2 + . . . + (log n − 1)) )
= n T(1) + n( (log n)² − (log n)(log n − 1)/2 )
= n T(1) + n( (log n)²/2 + (log n)/2 )
= Θ( n (log n)² ).

A puzzle
Five pirates have to split 100 bars of gold. They all line up and proceed as follows:
1 The first pirate in line gets to propose a way to split up the gold (for example: everyone gets 20 bars).
2 The pirates, including the one who proposed, vote on whether to accept the proposal. If the proposal is rejected, the pirate who made it is killed.
3 The next pirate in line then makes his proposal, and the 4 remaining pirates vote again. If the vote is tied (2 vs 2), the proposing pirate is still killed; only a majority can accept a proposal.
4 The process continues until a proposal is accepted or there is only one pirate left.
Assume that every pirate:
above all wants to live;
given that he will be alive, wants to get as much gold as possible;
given the maximal possible amount of gold, wants to see any other pirate killed, just for fun;
knows his exact position in line;
and all of the pirates are excellent puzzle solvers.
Question: What proposal should the first pirate make?
Hint: Assume first that there are only two pirates, and see what happens. Then assume that there are three pirates and that they have figured out what happens if there were only two pirates, and try to see what they would do.
Further, assume that there are four pirates and that they have figured out what happens if there were only three pirates, and try to see what they would do. Finally, assume there are five pirates and that they have figured out what happens if there were only four pirates.
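Following the hint, the backward induction can be automated. A Python sketch under my reading of the stated preferences (the function and its tie-breaking rules are my own, not from the slides: life first, then gold, then bloodthirst, so a voter rejects unless the proposal makes him strictly better off):

```python
DEAD = None

def solve(k, gold=100):
    """Backward-induction payoffs for pirates 1..k (pirate 1 proposes first).

    Each entry of the result is the number of bars that pirate receives,
    or DEAD if he is killed. Voting model: a pirate votes yes only if the
    proposal makes him strictly better off (life first, then gold; on equal
    gold he votes no, since he enjoys seeing pirates killed)."""
    if k == 1:
        return [gold]
    # Outcome for pirates 2..k if the current proposal is rejected.
    after_reject = solve(k - 1, gold)
    # Price of each vote: a pirate who would die votes yes for free;
    # a survivor demands strictly more than his reject payoff.
    prices = [0 if p is DEAD else p + 1 for p in after_reject]
    needed = k // 2  # proposer's own vote + k//2 others = strict majority of k
    cheapest = sorted(range(k - 1), key=lambda i: prices[i])[:needed]
    cost = sum(prices[i] for i in cheapest)
    if cost > gold:
        # No affordable majority: the proposer is killed and play continues.
        return [DEAD] + after_reject
    payoffs = [0] * (k - 1)
    for i in cheapest:
        payoffs[i] = prices[i]
    return [gold - cost] + payoffs
```

Under these assumptions the solver reproduces the chain of reasoning the hint asks for: with two pirates the proposer cannot buy a majority and dies, and with five pirates the first pirate keeps 97 bars, buying the votes of pirates 3 and 4 for 1 and 2 bars respectively.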