
Algorithms: COMP3121/9101
School of Computer Science and Engineering, University of New South Wales
3. RECURRENCES

Asymptotic notation
“Big Oh” notation: f(n) = O(g(n)) is an abbreviation for: “There exist positive constants c and n0 such that
0 ≤ f(n) ≤ c g(n) for all n ≥ n0.”
In this case we say that g(n) is an asymptotic upper bound for f(n).
f(n) = O(g(n)) means that f(n) does not grow substantially faster than g(n), because a multiple of g(n) eventually dominates f(n).
Clearly, the constants c of interest are those larger than 1, which “enlarge” g(n).
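For instance, f(n) = 3n^2 + 7n satisfies 3n^2 + 7n ≤ 4n^2 for all n ≥ 7, so f(n) = O(n^2) with c = 4 and n0 = 7.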

Asymptotic notation
“Omega” notation: f(n) = Ω(g(n)) is an abbreviation for: “There exist positive constants c and n0 such that
0 ≤ c g(n) ≤ f(n) for all n ≥ n0.”
In this case we say that g(n) is an asymptotic lower bound for f(n).
f(n) = Ω(g(n)) essentially says that f(n) grows at least as fast as g(n), because f(n) eventually dominates a multiple of g(n).
Since c g(n) ≤ f(n) if and only if g(n) ≤ (1/c) f(n), we have
f(n) = Ω(g(n)) if and only if g(n) = O(f(n)).
“Theta” notation: f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)); thus, f(n) and g(n) have the same asymptotic growth rate.
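For instance, f(n) = 3n^2 + 7n also satisfies f(n) ≥ n^2 for all n ≥ 1, so f(n) = Ω(n^2); combined with f(n) = O(n^2) from above, this gives f(n) = Θ(n^2).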

Recurrences
Recurrences are important to us because they arise when estimating the time complexity of divide-and-conquer algorithms.
Merge-Sort(A, p, r)
    if p < r
        q ← ⌊(p + r)/2⌋
        Merge-Sort(A, p, q)
        Merge-Sort(A, q + 1, r)
        Merge(A, p, q, r)

Let a ≥ 1 be an integer and b > 1 a real number. Assume that a divide-and-conquer algorithm:
reduces a problem of size n to a many problems of smaller size n/b;
the overhead cost of splitting up/combining the solutions for size n/b into a solution for size n is f(n);
then the time complexity of such an algorithm satisfies
T(n) = a T(n/b) + f(n).
Note: we should be writing
T(n) = a T(⌊n/b⌋) + f(n),
but it can be shown that ignoring the integer parts and additive constants is OK when it comes to obtaining the asymptotics.
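As an illustration (a minimal Python sketch added here, not from the original slides; the function name merge_sort is ours), the two recursive calls on halves of the array plus the linear-time merge are exactly what give Merge-Sort the recurrence T(n) = 2 T(n/2) + Θ(n):

    def merge_sort(a):
        """Sort the list a; the running time satisfies T(n) = 2 T(n/2) + Theta(n)."""
        if len(a) <= 1:                      # base case: a list of length 0 or 1 is sorted
            return a
        mid = len(a) // 2
        left = merge_sort(a[:mid])           # recursive call on the first half:  T(n/2)
        right = merge_sort(a[mid:])          # recursive call on the second half: T(n/2)
        merged, i, j = [], 0, 0              # merge step: the Theta(n) overhead f(n)
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    # Example: merge_sort([5, 2, 4, 7, 1, 3, 2, 6]) returns [1, 2, 2, 3, 4, 5, 6, 7]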

T(n) = a T(n/b) + f(n)

[Recursion tree: the root is the instance of size n; every instance spawns a many instances of the next smaller size, so the levels consist of instances of size n, n/b, n/b^2, ..., down to instances of size 1; the depth of the recursion is log_b n.]
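Reading the tree level by level (a remark added here for clarity): level i contains a^i instances, each of size n/b^i and each incurring overhead f(n/b^i), so level i contributes a^i f(n/b^i); the number of leaves is a^{log_b n} = n^{log_b a}. Summing over all levels gives exactly the expression that appears in the proof of the Master Theorem below.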

Some recurrences can be solved explicitly, but this tends to be tricky.
Fortunately, to estimate the efficiency of an algorithm we do not need the exact solution of a recurrence.
We only need to find:
1 the growth rate of the solution, i.e., its asymptotic behaviour;
2 the (approximate) sizes of the constants involved (more about that later).
This is what the Master Theorem provides (when it is applicable).

Master Theorem:
Let:
a ≥ 1 be an integer and b > 1 a real;
f(n) > 0 be a non-decreasing function;
T(n) be the solution of the recurrence T(n) = a T(n/b) + f(n);
Then:
1 If f(n) = O(n^{log_b a − ε}) for some ε > 0, then T(n) = Θ(n^{log_b a});
2 If f(n) = Θ(n^{log_b a}), then T(n) = Θ(n^{log_b a} log_2 n);
3 If f(n) = Ω(n^{log_b a + ε}) for some ε > 0, and for some c < 1 and some n0, a f(n/b) ≤ c f(n) holds for all n > n0, then T(n) = Θ(f(n));
4 If none of these conditions hold, the Master Theorem is NOT applicable.
(But often the proof of the Master Theorem can be tweaked to obtain the asymptotics of the solution T(n) in a case when the Master Theorem does not apply; an example is T(n) = 2 T(n/2) + n log n.)
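The sketch below (an illustration added here, not part of the slides; the function name master_theorem is ours) classifies recurrences T(n) = a T(n/b) + n^k, under the simplifying assumption that the driving function is a plain polynomial f(n) = n^k, so the regularity condition of case 3 holds automatically:

    import math

    def master_theorem(a, b, k):
        """Classify T(n) = a*T(n/b) + n**k by the Master Theorem (polynomial f only)."""
        crit = math.log(a, b)                 # the critical exponent log_b a
        if math.isclose(k, crit):             # case 2: f(n) = Theta(n^{log_b a})
            return f"Theta(n^{crit:.3f} * log n)"
        if k < crit:                          # case 1: f grows polynomially slower
            return f"Theta(n^{crit:.3f})"
        # case 3: f grows polynomially faster; a*(n/b)^k <= (a/b**k)*n^k with a/b**k < 1
        return f"Theta(n^{k})"

    # Examples from these slides:
    #   master_theorem(4, 2, 1)  ->  Theta(n^2.000)
    #   master_theorem(2, 2, 1)  ->  Theta(n^1.000 * log n)
    #   master_theorem(3, 4, 1)  ->  Theta(n^1)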

Master Theorem – a remark
Note that for any b > 1,
log_b n = log_b 2 · log_2 n.
Since b > 1 is a constant (it does not depend on n), we have, for the constant c = log_b 2 > 0,
log_b n = c · log_2 n and log_2 n = (1/c) · log_b n.
Thus,
log_b n = Θ(log_2 n)
and also
log_2 n = Θ(log_b n).
So whenever we have f(n) = Θ(g(n) log n) we do not have to specify what base the log is – all bases produce equivalent asymptotic estimates.
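For instance, log_2 n = log_10 n / log_10 2 ≈ 3.32 · log_10 n, so switching from base 10 to base 2 only changes the constant hidden in the Θ.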

Master Theorem – Examples
Let T(n) = 4 T(n/2) + n; then
n^{log_b a} = n^{log_2 4} = n^2;
thus f(n) = n = O(n^{2 − ε}) for any ε < 1.
The condition of case 1 is satisfied; thus, T(n) = Θ(n^2).

Let T(n) = 2 T(n/2) + c n; then
n^{log_b a} = n^{log_2 2} = n^1 = n;
thus f(n) = c n = Θ(n) = Θ(n^{log_2 2}).
Thus, the condition of case 2 is satisfied; and so, T(n) = Θ(n^{log_2 2} log n) = Θ(n log n).

Master Theorem - Examples

Let T(n) = 3 T(n/4) + n; then
n^{log_b a} = n^{log_4 3} < n^{0.8};
thus f(n) = n = Ω(n^{0.8 + ε}) for any ε < 0.2.
Also, a f(n/b) = 3 f(n/4) = (3/4) n ≤ c n for c = 3/4 < 1, so the condition of case 3 is satisfied; thus, T(n) = Θ(n).

Let T(n) = 2 T(n/2) + n log_2 n; then n^{log_b a} = n^{log_2 2} = n, but f(n) = n log_2 n is not Ω(n^{1 + ε}) for any ε > 0.
This is because for every ε > 0, and every c > 0, no matter how small, log_2 n < c · n^ε for all sufficiently large n.
Homework: Prove this. Hint: Use de L'Hôpital's Rule to show that log n / n^ε → 0.
Thus, in this case the Master Theorem does not apply!

Master Theorem - Proof:

Since
T(n) = a T(n/b) + f(n)                                            (1)
implies (by applying it to n/b in place of n)
T(n/b) = a T(n/b^2) + f(n/b)                                      (2)
and (by applying (1) to n/b^2 in place of n)
T(n/b^2) = a T(n/b^3) + f(n/b^2)                                  (3)
and so on ..., we get
T(n) = a T(n/b) + f(n)
     = a ( a T(n/b^2) + f(n/b) ) + f(n)                           (using (2))
     = a^2 T(n/b^2) + a f(n/b) + f(n)
     = a^2 ( a T(n/b^3) + f(n/b^2) ) + a f(n/b) + f(n)            (using (3))
     = a^3 T(n/b^3) + a^2 f(n/b^2) + a f(n/b) + f(n) = ...

Master Theorem Proof:

Continuing in this way ⌊log_b n⌋ − 1 many times we get
T(n) = a^3 T(n/b^3) + a^2 f(n/b^2) + a f(n/b) + f(n) = ...
     = a^{⌊log_b n⌋} T(n/b^{⌊log_b n⌋}) + a^{⌊log_b n⌋ − 1} f(n/b^{⌊log_b n⌋ − 1}) + ... + a^3 f(n/b^3) + a^2 f(n/b^2) + a f(n/b) + f(n)
     ≈ a^{log_b n} T(1) + Σ_{i=0}^{⌊log_b n⌋ − 1} a^i f(n/b^i).
We now use a^{log_b n} = n^{log_b a}:
T(n) ≈ n^{log_b a} T(1) + Σ_{i=0}^{⌊log_b n⌋ − 1} a^i f(n/b^i)    (4)
Note that so far we did not use any assumptions on f(n) ...

Master Theorem Proof: Case 1: f(m) = O(m^{log_b a − ε})

Σ_{i=0}^{⌊log_b n⌋ − 1} a^i f(n/b^i)
  = Σ_{i=0}^{⌊log_b n⌋ − 1} a^i O( (n/b^i)^{log_b a − ε} )
  = O( n^{log_b a − ε} Σ_{i=0}^{⌊log_b n⌋ − 1} a^i / (b^i)^{log_b a − ε} )
  = O( n^{log_b a − ε} Σ_{i=0}^{⌊log_b n⌋ − 1} ( a / b^{log_b a − ε} )^i )
  = O( n^{log_b a − ε} Σ_{i=0}^{⌊log_b n⌋ − 1} ( a b^ε / b^{log_b a} )^i )
  = O( n^{log_b a − ε} Σ_{i=0}^{⌊log_b n⌋ − 1} (b^ε)^i )           (since b^{log_b a} = a)
  = O( n^{log_b a − ε} · ((b^ε)^{⌊log_b n⌋} − 1) / (b^ε − 1) );
we are using Σ_{i=0}^{m} q^i = (q^{m+1} − 1) / (q − 1).

Master Theorem Proof: Case 1 - continued:

Σ_{i=0}^{⌊log_b n⌋ − 1} a^i f(n/b^i)
  = O( n^{log_b a − ε} · ((b^ε)^{⌊log_b n⌋} − 1) / (b^ε − 1) )
  = O( n^{log_b a − ε} · ((b^{⌊log_b n⌋})^ε − 1) / (b^ε − 1) )
  = O( n^{log_b a − ε} · (n^ε − 1) / (b^ε − 1) )
  = O( (n^{log_b a} − n^{log_b a − ε}) / (b^ε − 1) )
  = O( n^{log_b a} )
Since we had:
T(n) ≈ n^{log_b a} T(1) + Σ_{i=0}^{⌊log_b n⌋ − 1} a^i f(n/b^i),
we get:
T(n) ≈ n^{log_b a} T(1) + O(n^{log_b a}) = Θ(n^{log_b a}).

Master Theorem Proof: Case 2: f(m) = Θ(m^{log_b a})

Σ_{i=0}^{⌊log_b n⌋ − 1} a^i f(n/b^i)
  = Σ_{i=0}^{⌊log_b n⌋ − 1} a^i Θ( (n/b^i)^{log_b a} )
  = Θ( n^{log_b a} Σ_{i=0}^{⌊log_b n⌋ − 1} a^i / (b^i)^{log_b a} )
  = Θ( n^{log_b a} Σ_{i=0}^{⌊log_b n⌋ − 1} ( a / b^{log_b a} )^i )
  = Θ( n^{log_b a} Σ_{i=0}^{⌊log_b n⌋ − 1} 1 )
  = Θ( n^{log_b a} ⌊log_b n⌋ )

Master Theorem Proof: Case 2 (continued):

Thus,
Σ_{i=0}^{⌊log_b n⌋ − 1} a^i f(n/b^i) = Θ( n^{log_b a} log_b n ) = Θ( n^{log_b a} log_2 n )
because log_b n = log_2 n · log_b 2 = Θ(log_2 n).
Since we had (4):
T(n) ≈ n^{log_b a} T(1) + Σ_{i=0}^{⌊log_b n⌋ − 1} a^i f(n/b^i),
we get:
T(n) ≈ n^{log_b a} T(1) + Θ( n^{log_b a} log_2 n ) = Θ( n^{log_b a} log_2 n ).

Master Theorem Proof: Case 3: f(m) = Ω(m^{log_b a + ε}) and a f(n/b) ≤ c f(n) for some 0 < c < 1.

We get by substitution:
f(n/b)   ≤ (c/a) f(n)
f(n/b^2) ≤ (c/a) f(n/b)
f(n/b^3) ≤ (c/a) f(n/b^2)
...
f(n/b^i) ≤ (c/a) f(n/b^{i−1})
By chaining these inequalities we get
f(n/b^2) ≤ (c/a) f(n/b)       ≤ (c/a) · (c/a) f(n)               = (c^2/a^2) f(n)
f(n/b^3) ≤ (c/a) f(n/b^2)     ≤ (c/a) · (c^2/a^2) f(n)           = (c^3/a^3) f(n)
...
f(n/b^i) ≤ (c/a) f(n/b^{i−1}) ≤ (c/a) · (c^{i−1}/a^{i−1}) f(n)   = (c^i/a^i) f(n)

Master Theorem Proof: Case 3 (continued):

We got f(n/b^i) ≤ (c^i/a^i) f(n). Thus,
Σ_{i=0}^{⌊log_b n⌋ − 1} a^i f(n/b^i) ≤ Σ_{i=0}^{⌊log_b n⌋ − 1} a^i (c^i/a^i) f(n)
  = f(n) Σ_{i=0}^{⌊log_b n⌋ − 1} c^i ≤ f(n) Σ_{i=0}^{∞} c^i = f(n) / (1 − c).
Since we had (4):
T(n) ≈ n^{log_b a} T(1) + Σ_{i=0}^{⌊log_b n⌋ − 1} a^i f(n/b^i),
the sum is at least its i = 0 term f(n) and at most f(n)/(1 − c), while f(n) = Ω(n^{log_b a + ε}) guarantees n^{log_b a} T(1) = O(f(n)); consequently
T(n) = Θ(f(n)).
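To sanity-check these bounds empirically, the following small Python sketch (added here for illustration; the helper name check and the choice T(1) = 1 are ours) evaluates the example recurrences directly and prints T(n) divided by the predicted growth rate; the ratios settle towards a constant as n grows:

    from functools import lru_cache
    import math

    def check(a, b, f, guess, sizes):
        """Evaluate T(n) = a*T(n // b) + f(n) with T(1) = 1 and print T(n) / guess(n)."""
        @lru_cache(maxsize=None)
        def T(n):
            if n <= 1:
                return 1
            return a * T(n // b) + f(n)
        for n in sizes:
            print(n, T(n) / guess(n))

    sizes = [2**k for k in range(5, 20)]
    check(4, 2, lambda n: n, lambda n: n**2, sizes)               # case 1: Theta(n^2)
    check(2, 2, lambda n: n, lambda n: n * math.log2(n), sizes)   # case 2: Theta(n log n)
    check(3, 4, lambda n: n, lambda n: n, sizes)                  # case 3: Theta(n)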

Master Theorem Proof: Homework
Exercise 1: Show that the condition
f(n) = Ω(n^{log_b a + ε})
follows from the condition
a f(n/b) ≤ c f(n) for some 0 < c < 1.