
Algorithms: COMP3121/9101

THE UNIVERSITY OF
NEW SOUTH WALES

Algorithms:
COMP3121/9101

School of Computer Science and Engineering
University of New South Wales

3. RECURRENCES – part A

COMP3121/3821/9101/9801 1 / 1

Asymptotic notation

“Big Oh” notation: f(n) = O(g(n)) is an abbreviation for:

“There exist positive constants c and n0 such that
0 ≤ f(n) ≤ c g(n) for all n ≥ n0”.

In this case we say that g(n) is an asymptotic upper bound for
f(n).

f(n) = O(g(n)) means that f(n) does not grow substantially faster
than g(n) because a multiple of g(n) eventually dominates f(n).

Clearly, the multiplicative constants c of interest will be larger than 1,
thus “enlarging” g(n).
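The definition can be checked mechanically for a concrete pair of witnesses c and n0; a small sketch (the functions and witnesses below are illustrative choices, not part of the lecture):

```python
# Witness check for f(n) = O(g(n)) with f(n) = 3n^2 + 10n and g(n) = n^2.
# The witnesses c = 4 and n0 = 10 work because 10n <= n^2 for all n >= 10.

def f(n):
    return 3 * n**2 + 10 * n

def g(n):
    return n**2

c, n0 = 4, 10

# 0 <= f(n) <= c*g(n) must hold for every n >= n0; we spot-check a range.
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
```

Below n0 the inequality may fail (for example f(5) = 125 > 4·g(5) = 100), which is exactly why the definition only requires it for all sufficiently large n.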


Asymptotic notation

“Omega” notation: f(n) = Ω(g(n)) is an abbreviation for:

“There exist positive constants c and n0 such that
0 ≤ c g(n) ≤ f(n) for all n ≥ n0.”

In this case we say that g(n) is an asymptotic lower bound for
f(n).

f(n) = Ω(g(n)) essentially says that f(n) grows at least as fast as
g(n), because f(n) eventually dominates a multiple of g(n).

Since c g(n) ≤ f(n) if and only if g(n) ≤ (1/c) f(n), we have

f(n) = Ω(g(n)) if and only if g(n) = O(f(n)).

“Theta” notation: f(n) = Θ(g(n)) if and only if f(n) = O(g(n))
and f(n) = Ω(g(n)); thus, f(n) and g(n) have the same
asymptotic growth rate.


Recurrences

Recurrences are important to us because they arise when estimating the
time complexity of divide-and-conquer algorithms.
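A canonical divide-and-conquer example is Merge-Sort, shown next in pseudocode; a minimal runnable Python version of the same scheme (a sketch using 0-based indices, not the lecture's exact pseudocode):

```python
def merge(A, p, q, r):
    """Merge the two sorted runs A[p..q] and A[q+1..r] in linear time."""
    left, right = A[p:q + 1], A[q + 1:r + 1]
    i = j = 0
    for k in range(p, r + 1):
        # Take from the left run while the right is exhausted or left is smaller.
        if j >= len(right) or (i < len(left) and left[i] <= right[j]):
            A[k] = left[i]
            i += 1
        else:
            A[k] = right[j]
            j += 1

def merge_sort(A, p, r):
    """Sort A[p..r] in place; the runtime satisfies T(n) = 2T(n/2) + c*n."""
    if p < r:
        q = (p + r) // 2          # split point, q = floor((p + r)/2)
        merge_sort(A, p, q)       # recursively sort the left half
        merge_sort(A, q + 1, r)   # recursively sort the right half
        merge(A, p, q, r)         # combine the halves in linear time
```

Calling `merge_sort(A, 0, len(A) - 1)` sorts the whole array; the two recursive calls on halves plus the linear-time merge are precisely what produce the recurrence discussed below.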

Merge-Sort(A, p, r) *sorting A[p..r]*

if p < r
  then q ← ⌊(p + r)/2⌋
       Merge-Sort(A, p, q)
       Merge-Sort(A, q + 1, r)
       Merge(A, p, q, r)

Since Merge(A, p, q, r) runs in linear time, the runtime T(n) of
Merge-Sort(A, p, r) satisfies

T(n) = 2T(n/2) + c n

Recurrences

Let a ≥ 1 be an integer and b > 1 a real number;
Assume that a divide-and-conquer algorithm:

reduces a problem of size n to a many problems of smaller size n/b;
the overhead cost of splitting up/combining the solutions for size
n/b into a solution for size n is f(n),

then the time complexity of such algorithm satisfies

T(n) = a T(n/b) + f(n)

Note: we should be writing

T(n) = a T(⌈n/b⌉) + f(n)

but it can be shown that ignoring the integer parts and additive
constants is OK when it comes to obtaining the asymptotics.
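The claim that the ceiling does not affect the asymptotics can be probed numerically; a sketch (the particular recurrence T(n) = 4T(⌈n/2⌉) + n and the base case T(1) = 1 are illustrative choices):

```python
from functools import lru_cache
from math import ceil

def make_T(a, b, f, T1=1):
    """Evaluator for T(n) = a*T(ceil(n/b)) + f(n) with T(1) = T1."""
    @lru_cache(maxsize=None)
    def T(n):
        if n <= 1:
            return T1
        return a * T(ceil(n / b)) + f(n)
    return T

# T(n) = 4T(ceil(n/2)) + n has solution Theta(n^2), so T(n)/n^2
# should approach a constant (for this base case it tends to 2).
T = make_T(4, 2, lambda n: n)
ratios = [T(2**k) / (2**k)**2 for k in range(5, 11)]
```

Printing `ratios` shows the quotient settling down, which is the numerical face of “the integer parts and additive constants only perturb lower-order terms”.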


T(n) = a T(n/b) + f(n)

[Figure: the recursion tree of this recurrence — the instance of size n
spawns a many instances of size n/b, each of which spawns a many
instances of size n/b², and so on down to instances of size 1; the
depth of the recursion is log_b n.]

Some recurrences can be solved explicitly, but this tends to be
tricky.

Fortunately, to estimate the efficiency of an algorithm we do not need
the exact solution of a recurrence.

We only need to find:

1. the growth rate of the solution, i.e., its asymptotic behaviour;

2. the (approximate) sizes of the constants involved (more about
that later).

This is what the Master Theorem provides (when it is
applicable).


Master Theorem:

Let:

a ≥ 1 be an integer and b > 1 a real;
f(n) > 0 be a non-decreasing function;

T (n) be the solution of the recurrence T (n) = aT (n/b) + f(n);

Then:

1. If f(n) = O(n^{log_b a − ε}) for some ε > 0, then T(n) = Θ(n^{log_b a});

2. If f(n) = Θ(n^{log_b a}), then T(n) = Θ(n^{log_b a} log_2 n);

3. If f(n) = Ω(n^{log_b a + ε}) for some ε > 0, and for some c < 1 and some n0,
a f(n/b) ≤ c f(n) holds for all n > n0, then T(n) = Θ(f(n));

4. If none of these conditions hold, the Master Theorem is NOT applicable.

(But often the proof of the Master Theorem can be tweaked to obtain the
asymptotics of the solution T(n) even when the Master Theorem does
not apply; an example is T(n) = 2T(n/2) + n log n.)


Master Theorem – a remark

Note that for any b > 1,

log_b n = log_b 2 · log_2 n.

Since b > 1 is a constant (it does not depend on n), we have, for
c = log_b 2 > 0,

log_b n = c log_2 n;        log_2 n = (1/c) log_b n.

Thus,
log_b n = Θ(log_2 n)

and also
log_2 n = Θ(log_b n).

So whenever we have f(n) = Θ(g(n) log n) we do not have to specify
the base of the log – all bases produce equivalent asymptotic
estimates (but we do have to specify b in expressions such as
n^{log_b a}).
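The base-change identity behind this remark is easy to check numerically (the base b = 7 and the values of n below are arbitrary illustrations):

```python
from math import log

b = 7.0
c = log(2) / log(b)          # c = log_b 2, a positive constant
for n in [10, 10**3, 10**6, 10**9]:
    log_b_n = log(n) / log(b)
    log_2_n = log(n) / log(2)
    # log_b n = c * log_2 n for every n: the two logs differ only
    # by the fixed constant factor c, which is what Theta absorbs.
    assert abs(log_b_n - c * log_2_n) < 1e-9
```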


Master Theorem – Examples

Let T(n) = 4T(n/2) + n;

then n^{log_b a} = n^{log_2 4} = n²;

thus f(n) = n = O(n^{2−ε}) for any ε < 1. The condition of case 1 is
satisfied; thus, T(n) = Θ(n²).

Let T(n) = 2T(n/2) + 5n;

then n^{log_b a} = n^{log_2 2} = n¹ = n;

thus f(n) = 5n = Θ(n) = Θ(n^{log_2 2}). The condition of case 2 is
satisfied; and so, T(n) = Θ(n^{log_2 2} log n) = Θ(n log n).

Master Theorem – Examples

Let T(n) = 3T(n/4) + n;

then n^{log_b a} = n^{log_4 3} < n^{0.8};

thus f(n) = n = Ω(n^{0.8+ε}) for any ε < 0.2. Also,
a f(n/b) = 3 f(n/4) = (3/4) n < c n = c f(n) for c = 0.9 < 1.
Thus, case 3 applies, and T(n) = Θ(f(n)) = Θ(n).

Let T(n) = 2T(n/2) + n log_2 n;

then n^{log_b a} = n^{log_2 2} = n¹ = n. Thus, f(n) = n log_2 n = Ω(n).
However, f(n) = n log_2 n ≠ Ω(n^{1+ε}), no matter how small ε > 0.

This is because for every ε > 0 and every c > 0, no matter how small,
log_2 n < c · n^ε holds for all sufficiently large n.

Homework: Prove this. Hint: Use de L’Hôpital’s Rule to show that
log n / n^ε → 0.

Thus, in this case the Master Theorem does not apply!

Master Theorem – Proof:

Since
T(n) = a T(n/b) + f(n)                              (1)

implies (by applying it to n/b in place of n)

T(n/b) = a T(n/b²) + f(n/b)                         (2)

and (by applying (1) to n/b² in place of n)

T(n/b²) = a T(n/b³) + f(n/b²)                       (3)

and so on, we get

T(n) = a T(n/b) + f(n)
     = a (a T(n/b²) + f(n/b)) + f(n)
     = a² T(n/b²) + a f(n/b) + f(n)
     = a² (a T(n/b³) + f(n/b²)) + a f(n/b) + f(n)
     = a³ T(n/b³) + a² f(n/b²) + a f(n/b) + f(n)
     = ...

Continuing in this way ⌊log_b n⌋ − 1 many times we get

T(n) = a^{⌊log_b n⌋} T(n/b^{⌊log_b n⌋})
       + a^{⌊log_b n⌋−1} f(n/b^{⌊log_b n⌋−1}) + ... + a² f(n/b²) + a f(n/b) + f(n)
     ≈ a^{log_b n} T(n/b^{log_b n}) + Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i)

We now use a^{log_b n} = n^{log_b a}:

T(n) ≈ n^{log_b a} T(1) + Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i)        (4)

Note that so far we did not use any assumptions on f(n) . . .
T(n) ≈ n^{log_b a} T(1) + Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i)

Case 1: f(m) = O(m^{log_b a − ε})

Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i)
  = Σ_{i=0}^{⌊log_b n⌋−1} a^i O((n/b^i)^{log_b a − ε})
  = O(n^{log_b a − ε} Σ_{i=0}^{⌊log_b n⌋−1} a^i / (b^i)^{log_b a − ε})
  = O(n^{log_b a − ε} Σ_{i=0}^{⌊log_b n⌋−1} (a / (b^{log_b a} b^{−ε}))^i)
  = O(n^{log_b a − ε} Σ_{i=0}^{⌊log_b n⌋−1} (a b^ε / a)^i)
  = O(n^{log_b a − ε} Σ_{i=0}^{⌊log_b n⌋−1} (b^ε)^i)
  = O(n^{log_b a − ε} · ((b^ε)^{⌊log_b n⌋} − 1) / (b^ε − 1));

here we are using Σ_{i=0}^{m−1} q^i = (q^m − 1)/(q − 1).

Case 1 – continued:

Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i)
  = O(n^{log_b a − ε} · ((b^{⌊log_b n⌋})^ε − 1) / (b^ε − 1))
  = O(n^{log_b a − ε} · (n^ε − 1) / (b^ε − 1))
  = O((n^{log_b a} − n^{log_b a − ε}) / (b^ε − 1))
  = O(n^{log_b a})

Since we had (4):

T(n) ≈ n^{log_b a} T(1) + Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i)

we get:

T(n) ≈ n^{log_b a} T(1) + O(n^{log_b a}) = Θ(n^{log_b a})

Case 2: f(m) = Θ(m^{log_b a})

Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i)
  = Σ_{i=0}^{⌊log_b n⌋−1} a^i Θ((n/b^i)^{log_b a})
  = Θ(n^{log_b a} Σ_{i=0}^{⌊log_b n⌋−1} (a / b^{log_b a})^i)
  = Θ(n^{log_b a} Σ_{i=0}^{⌊log_b n⌋−1} 1)
  = Θ(n^{log_b a} ⌊log_b n⌋)

Case 2 – continued: Thus,

Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i) = Θ(n^{log_b a} log_b n) = Θ(n^{log_b a} log_2 n)

because log_b n = log_2 n · log_b 2 = Θ(log_2 n). Since we had (4):

T(n) ≈ n^{log_b a} T(1) + Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i)

we get:

T(n) ≈ n^{log_b a} T(1) + Θ(n^{log_b a} log_2 n) = Θ(n^{log_b a} log_2 n)

Case 3: f(m) = Ω(m^{log_b a + ε}) and a f(n/b) ≤ c f(n) for some 0 < c < 1.

We get by substitution:

f(n/b)  ≤ (c/a) f(n)
f(n/b²) ≤ (c/a) f(n/b)
f(n/b³) ≤ (c/a) f(n/b²)
. . .
f(n/b^i) ≤ (c/a) f(n/b^{i−1})

By chaining these inequalities we get

f(n/b²) ≤ (c/a) f(n/b)       ≤ (c/a) · (c/a) f(n)             = (c²/a²) f(n)
f(n/b³) ≤ (c/a) f(n/b²)      ≤ (c/a) · (c²/a²) f(n)           = (c³/a³) f(n)
. . .
f(n/b^i) ≤ (c/a) f(n/b^{i−1}) ≤ (c/a) · (c^{i−1}/a^{i−1}) f(n) = (c^i/a^i) f(n)

Case 3 – continued: We got f(n/b^i) ≤ (c^i/a^i) f(n). Thus,

Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i) ≤ Σ_{i=0}^{⌊log_b n⌋−1} a^i (c^i/a^i) f(n)
                                   < f(n) Σ_{i=0}^{∞} c^i = f(n) / (1 − c)

Since we had (4):

T(n) ≈ n^{log_b a} T(1) + Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i)

and since f(n) = Ω(n^{log_b a + ε}), we get:

T(n) < n^{log_b a} T(1) + O(f(n)) = O(f(n))

but we also have

T(n) = a T(n/b) + f(n) > f(n)

thus,
T (n) = Θ (f(n))
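The case conclusions can be spot-checked by unwinding the earlier example recurrences numerically; a sketch (the base case T(1) = 1 is an arbitrary choice — only the growth rates matter):

```python
from functools import lru_cache
from math import ceil, log2

def solve(a, b, f):
    """Memoized evaluator for T(n) = a*T(ceil(n/b)) + f(n), T(1) = 1."""
    @lru_cache(maxsize=None)
    def T(n):
        return 1 if n <= 1 else a * T(ceil(n / b)) + f(n)
    return T

n = 4**6  # 4096, a power of both 2 and 4, so the divisions are exact

# Case 1: T(n) = 4T(n/2) + n  -> Theta(n^2): T(n)/n^2 stays bounded.
r1 = solve(4, 2, lambda n: n)(n) / n**2

# Case 2: T(n) = 2T(n/2) + 5n -> Theta(n log n): T(n)/(n log n) stays bounded.
r2 = solve(2, 2, lambda n: 5 * n)(n) / (n * log2(n))

# Case 3: T(n) = 3T(n/4) + n  -> Theta(n): T(n)/n stays bounded.
r3 = solve(3, 4, lambda n: n)(n) / n
```

Each ratio approaches a constant as n grows, in line with the Θ-bounds the three cases predict.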


Master Theorem Proof: Homework

Exercise 1: Show that the condition

f(n) = Ω(n^{log_b a + ε})

follows from the condition

a f(n/b) ≤ c f(n) for some 0 < c < 1.

Example: Let us estimate the asymptotic growth rate of T(n) which satisfies

T(n) = 2T(n/2) + n log n

Note: we have seen that the Master Theorem does NOT apply, but the
technique used in its proof still works! So let us just unwind the
recurrence and sum up the logarithmic overheads:

T(n) = 2T(n/2) + n log n
     = 2(2T(n/2²) + (n/2) log(n/2)) + n log n
     = 2² T(n/2²) + n log(n/2) + n log n
     = 2²(2T(n/2³) + (n/2²) log(n/2²)) + n log(n/2) + n log n
     = 2³ T(n/2³) + n log(n/2²) + n log(n/2) + n log n
     . . .
     = 2^{log n} T(n/2^{log n}) + n log(n/2^{log n − 1}) + ... + n log(n/2²) + n log(n/2) + n log n
     = n T(1) + n(log n · log n − log 2^{log n − 1} − ... − log 2² − log 2)
     = n T(1) + n((log n)² − ((log n − 1) + ... + 2 + 1))
     = n T(1) + n((log n)² − (log n)(log n − 1)/2)
     = n T(1) + n((log n)²/2 + (log n)/2)
     = Θ(n (log n)²).

PUZZLE!

Five pirates have to split 100 bars of gold. They all line up and
proceed as follows:

1. The first pirate in line gets to propose a way to split up the gold
(for example: everyone gets 20 bars).

2. The pirates, including the one who proposed, vote on whether to
accept the proposal. If the proposal is rejected, the pirate who made
the proposal is killed.

3. The next pirate in line then makes his proposal, and the 4 pirates
vote again. If the vote is tied (2 vs 2) then the proposing pirate is
still killed. Only a majority can accept a proposal.

The process continues until a proposal is accepted or there is only
one pirate left.

Assume that every pirate:
above all wants to live;
given that he will be alive, wants to get as much gold as possible;
given the maximal possible amount of gold, wants to see any other pirate killed, just for fun;
knows his exact position in line;
is an excellent puzzle solver.

Question: What proposal should the first pirate make?

Hint: Assume first that there are only two pirates, and see what
happens. Then assume that there are three pirates and that they have
figured out what happens if there were only two pirates, and try to
see what they would do. Further, assume that there are four pirates
and that they have figured out what happens if there were only three
pirates, and try to see what they would do. Finally, assume there are
five pirates and that they have figured out what happens if there were
only four pirates.
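The backward induction described in the hint can be mechanized; a sketch (here `None` marks a pirate whose proposal cannot pass, so he is killed — run it only after attempting the puzzle yourself, since it computes the answer):

```python
def split_gold(k, gold=100):
    """Allocation proposed by the first of k pirates, by backward induction.
    Position 0 is the proposer; None means his proposal fails and he dies.
    Preferences (lexicographic): stay alive > more gold > see others killed."""
    if k == 1:
        return [gold]                      # a lone pirate keeps everything
    nxt = split_gold(k - 1, gold)          # outcome if the proposer is killed
    # Price of pirate i's "yes" vote: one bar more than he would get next
    # round (equal gold is not enough -- he'd rather watch the proposer die);
    # a pirate doomed to die next round votes yes even for 0 bars.
    price = sorted((0 if nxt[i - 1] is None else nxt[i - 1] + 1, i)
                   for i in range(1, k))
    needed = k // 2                        # extra votes for a strict majority
    cost = sum(p for p, _ in price[:needed])
    if cost > gold:
        return [None] + nxt                # cannot buy enough votes: killed
    alloc = [0] * k
    alloc[0] = gold - cost                 # proposer keeps the rest
    for p, i in price[:needed]:
        alloc[i] = p
    return alloc
```

Under these assumptions the first pirate can secure almost all of the gold; evaluate `split_gold(2)`, `split_gold(3)`, … in turn to retrace exactly the chain of reasoning in the hint.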