
EECS 3101
Prof. Andy Mirzaian

STUDY MATERIAL:
• [CLRS] Appendix A, chapter 4
• Lecture Note 3
2

Summations:
∑_{i=1}^{n} f(i) = f(1) + f(2) + ··· + f(n) = Θ(?)

∑_{i=1}^{n} i·2^i = Θ(n·2^n)
Recurrence Relations:
T(n) = { T(n-1) + f(n)  ∀n ≥ 1 ;   0  ∀n < 1 }    ⟹    T(n) = ∑_{i=1}^{n} f(i)
T(n) = { 2T(n-1) + 1  ∀n ≥ 1 ;   0  ∀n < 1 }    ⟹    T(n) = Θ(2^n)
3

TOPICS
Summations & Recurrence Relations arise in the running time analysis of algorithms.
 SUMMATIONS:
   Classic Methods: Arithmetic, Geometric, Harmonic, Bounded Tail
   Approximating Summation by Integration
   Summation by Parts
   ASSUME: ASymptotic SUmmation Made Easy
 RECURRENCE RELATIONS:
   Iteration method
   Recursion tree method
   Divide-&-Conquer Recurrence: The Master Method
   Guess-&-Verify method
   Full History Recurrences
   Variable Substitution method
4

SUMMATION
f : N → R
S(n) = ∑_{i=1}^{n} f(i) = f(1) + f(2) + f(3) + ··· + f(n)
5

Classic Methods
 Arithmetic (or polynomial): S(n) = Θ(n·f(n))
   f(n) = n:     1 + 2 + ··· + n = n(n+1)/2 = Θ(n^2)
   f(n) = n^2:   1^2 + 2^2 + ··· + n^2 = n(n+1)(2n+1)/6 = Θ(n^3)
   f(n) = n^d:   1^d + 2^d + ··· + n^d = Θ(n^{d+1}), for any constant real d > -1
 Geometric (increasing): S(n) = Θ(f(n))
   1 + x + x^2 + ··· + x^n = Θ(x^n) if x > 1
   f(n) = 2^n:   1 + 2 + 2^2 + ··· + 2^n = Θ(2^n)   (with x = 2)
 Harmonic: S(n) = Θ(log n)
   f(n) = 1/n:   H(n) = 1 + 1/2 + 1/3 + ··· + 1/n ≈ ln n = Θ(log n)
 Bounded Tail: S(n) = Θ(1)
   f(n) = x^n, 0 < x < 1:   1 + x + x^2 + ··· + x^n = (1 - x^{n+1})/(1 - x) = Θ(1)

Lower bound for H(n) by grouping terms:
   1/3 + 1/4 > 2(1/4) = 1/2
   1/5 + ··· + 1/8 > 4(1/8) = 1/2
   1/9 + ··· + 1/16 > 8(1/16) = 1/2 ,  and so on
 ⟹  H(n) ≥ 1 + (1/2)·log n ∈ Ω(log n)
 ⟹  H(n) = Θ(log n)
11
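A quick numeric illustration of these four classes (a Python sketch of mine, not from the slides; the helper S and the sample sizes are arbitrary choices): each sum is computed by brute force and divided by its claimed order of growth, and the printed ratios settle toward constants.

import math

def S(f, n):
    # S(n) = f(1) + f(2) + ... + f(n), summed by brute force
    return sum(f(i) for i in range(1, n + 1))

for n in (100, 1000, 10000):
    arith = S(lambda i: i, n)            # arithmetic:   S(n) = Theta(n*f(n)) = Theta(n^2)
    geom  = S(lambda i: 2**i, n)         # geometric:    S(n) = Theta(f(n))   = Theta(2^n)
    harm  = S(lambda i: 1.0 / i, n)      # harmonic:     S(n) = Theta(log n)
    tail  = S(lambda i: 0.5**i, n)       # bounded tail: S(n) = Theta(1)
    print(f"n={n:6d}  arith/n^2={arith / n**2:.3f}  geom/2^n={geom / 2**n:.3f}  "
          f"harm/ln(n)={harm / math.log(n):.3f}  tail={tail:.3f}")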

ASSUME : ASymptotic SUmmation Made Easy
f : N → R⁺
S(n) = f(1) + f(2) + f(3) + ··· + f(n)
   max_n(f) = max { f(1), f(2), …, f(n) }
   min_n(f) = min { f(1), f(2), …, f(n) }
   ave_n(f) = average { f(1), f(2), …, f(n) } ,   so   S(n) = n · ave_n(f)
0 ≤ min_n(f) ≤ ave_n(f) ≤ max_n(f)
f ↑  ⟹  min_n(f) = f(1) , max_n(f) = f(n)        f ↓  ⟹  min_n(f) = f(n) , max_n(f) = f(1)
max{ n · min_n(f) , max_n(f) }  ≤  S(n)  ≤  n · max_n(f)
 ⟹  S(n) = Θ( g(n) · max_n(f) )  for some  1 ≤ g(n) ≤ n
ASSUME: a method for finding g(n) (and hence, S(n)).
12

ASSUME on monotone f(n)
f : N → R⁺ ,   S(n) = f(1) + f(2) + f(3) + ··· + f(n)
FACT: f is monotone  ⟹  exactly one of the following holds:
(1) f(n) = 0  ∀n
(2) f(n) = Θ(1)
(3) f(n) = ω(1)  and  f ↑
(4) f(n) = o(1)  and  f ↓  and  f(1) > 0
Cases (1) & (2):  g(n) = Θ(n),  S(n) = Θ(g(n)·f(n)) = Θ(n·f(n)).
Case (3):  Considered next.
Case (4):  Similar methods apply. Exercise.
13
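The sandwich max{ n·min_n(f), max_n(f) } ≤ S(n) ≤ n·max_n(f) is easy to check numerically. A small Python sketch (the helper name and test functions are my own choices):

import math

def bounds_check(f, n):
    vals = [f(i) for i in range(1, n + 1)]
    s = sum(vals)
    lo = max(n * min(vals), max(vals))    # max{ n*min_n(f), max_n(f) }
    hi = n * max(vals)                    # n*max_n(f)
    assert lo <= s <= hi
    return lo, s, hi

n = 100
for name, f in [("arithmetic f(i)=i^2",  lambda i: float(i * i)),
                ("harmonic   f(i)=1/i",  lambda i: 1.0 / i),
                ("geometric  f(i)=1.5^i", lambda i: 1.5 ** i)]:
    lo, s, hi = bounds_check(f, n)
    print(f"{name:22s}  lower={lo:.4g}  S(n)={s:.4g}  upper={hi:.4g}")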

ASSUME for f ↑ and f(n) = ω(1)
[Slides 14-22: for increasing f(n) = ω(1), pick m so that f(m) ≤ c·f(n) ≤ f(m+1) for a suitable constant c, and take g(n) = n - m; worked examples such as f(n) = n^{3.5}·log n and f(n) = n^2 + log n, with the resulting classifications of S(n).]
22

A Simple form
The SAFs we encounter usually have the following simple form:
   f(n) = Θ( t^n · n^d · log^e n )   for some real constants t > 0, d, e.
We classify S(n) for such f(n) as:
   t > 1                Geometric (increasing)
   t = 1 , d > -1       Arithmetic
   t = 1 , d = -1       quasi Harmonic
   t = 1 , d < -1       Bounded tail (polynomially decreasing)
   0 < t < 1            Bounded tail (geometrically decreasing)
23

RECURRENCE RELATIONS
   T(n) = 2T(n/2) + Θ(n)   for n > O(1)
   T(n) = O(1)             for n = O(1)
   ⟹   T(n) = Θ(n log n)
24

The Iteration Method
   T(n) = 0                for n ≤ 0
   T(n) = T(n-1) + f(n)    for n ≥ 1
The iteration method (or unwinding the recurrence):
   T(n) = f(n) + T(n-1)
        = f(n) + f(n-1) + T(n-2)
        = f(n) + f(n-1) + f(n-2) + T(n-3)
        = ...
        = f(n) + f(n-1) + f(n-2) + ··· + f(3) + f(2) + T(1)
        = f(n) + f(n-1) + f(n-2) + ··· + f(3) + f(2) + f(1) + T(0)
        = f(n) + f(n-1) + f(n-2) + ··· + f(3) + f(2) + f(1).
f(n) is the driving function of the recurrence.
Example:  f(n) = Θ(n)  ⟹  T(n) = Θ(n^2).     f(n) = Θ(2^n)  ⟹  T(n) = Θ(2^n).
25

The Iteration Method
   T(n) = 0                for n ≤ 1
   T(n) = 2T(n/2) + n      for n > 1
   T(n) = n + 2T(n/2)
        = n + 2[ n/2 + 2T(n/2^2) ]   = 2n + 2^2 T(n/2^2)
        = 2n + 2^2 [ n/2^2 + 2T(n/2^3) ]   = 3n + 2^3 T(n/2^3)
        = 3n + 2^3 [ n/2^3 + 2T(n/2^4) ]   = 4n + 2^4 T(n/2^4)
        = ...
        = kn + 2^k T(n/2^k)            take k = 1 + log n, so that T(n/2^k) = 0
        = n(1 + log n) = Θ(n log n)
26

The Iteration Method
   T(n) = 0                for n ≤ 1
   T(n) = 2T(n/2) + n^2    for n > 1
   T(n) = n^2 + 2T(n/2)
        = n^2 + 2[ (n/2)^2 + 2T(n/2^2) ]   = n^2 (1 + 1/2) + 2^2 T(n/2^2)
        = n^2 (1 + 1/2) + 2^2 [ (n/2^2)^2 + 2T(n/2^3) ]   = n^2 (1 + 1/2 + 1/2^2) + 2^3 T(n/2^3)
        = n^2 (1 + 1/2 + 1/2^2) + 2^3 [ (n/2^3)^2 + 2T(n/2^4) ]   = n^2 (1 + 1/2 + 1/2^2 + 1/2^3) + 2^4 T(n/2^4)
        = ...
        = n^2 (1 + 1/2 + 1/2^2 + 1/2^3 + ··· + 1/2^k) + 2^{k+1} T(n/2^{k+1})      take k = log n, so that T(n/2^{k+1}) = 0
        = n^2 (1 + 1/2 + 1/2^2 + 1/2^3 + ··· + 1/2^k)
        = n^2 · Θ(1)          (geometric decreasing)
        = Θ(n^2)
27

The Recursion Tree Method
   T(n) = 0                 for n ≤ 1
   T(n) = 2T(n/2) + f(n)    for n > 1
Level-by-level costs of the recursion tree:  f(n),  2f(n/2),  4f(n/4),  8f(n/8),  …
   T(n) = f(n) + 2f(n/2) + 4f(n/4) + 8f(n/8) + ··· = ∑_{i=0}^{log n} 2^i f(n/2^i)
[Figure: the recursion tree, with each node T(n/2^i) expanded into a cost f(n/2^i) plus two children T(n/2^{i+1}).]
28

The Recursion Tree Method
   T(n) = 0                      for n ≤ 4
   T(n) = T(n/4) + T(n/2) + n    for n > 4
CLAIM: T(n) = Θ(n).
Lower bound:  T(n) ≥ n, so T(n) = Ω(n): obvious.
Upper bound:  the level-by-level costs of the recursion tree are at most n, (3/4)n, (3/4)^2 n, (3/4)^3 n, …
   T(n) ≤ n + (3/4)n + (3/4)^2 n + (3/4)^3 n + ··· ≤ n · 1/(1 - 3/4) = 4n = O(n).
[Figure: the recursion tree; its depth is between log_4 n and log_2 n.]
29

The Recursion Tree Method
This is a special case of the following recurrence:
   T(n) = T(an) + T(bn) + Θ(n),   where 0 ≤ a < 1 and 0 ≤ b < 1 are real constant parameters.
FACT:
   (1) a + b < 1   ⟹   T(n) = Θ(n)   [linear]
   (2) a + b = 1   ⟹   T(n) = Θ(n log n)
   (3) a + b > 1   ⟹   T(n) = Θ(n^d)   [super-linear poly], where d > 1 is the unique constant
       that satisfies the equation a^d + b^d = 1.
[See the guess-&-verify method later & Exercise 9.]
30
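The FACT above can be probed numerically. The sketch below is my own code (not from the slides); it uses an integer floor discretization T(n) = T(⌊an⌋) + T(⌊bn⌋) + n with T(n) = 0 for n ≤ 1, evaluates it with memoization, and divides T(n) by the predicted growth rate in each of the three cases. The printed ratios come out roughly constant.

from functools import lru_cache
import math

def make_T(a, b):
    @lru_cache(maxsize=None)
    def T(n):
        # discretized version of T(n) = T(a*n) + T(b*n) + n, with T(n) = 0 for n <= 1
        return 0 if n <= 1 else T(int(a * n)) + T(int(b * n)) + n
    return T

n = 10**6
for a, b in [(0.25, 0.5), (0.5, 0.5), (0.75, 0.5)]:
    T = make_T(a, b)
    if a + b < 1:
        ratio = T(n) / n                     # case (1): Theta(n), ratio levels off at a constant
    elif a + b == 1:
        ratio = T(n) / (n * math.log(n))     # case (2): Theta(n log n)
    else:
        lo, hi = 1.0, 10.0                   # case (3): solve a^d + b^d = 1 for d by bisection
        for _ in range(100):
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if a**mid + b**mid > 1 else (lo, mid)
        ratio = T(n) / n ** ((lo + hi) / 2)
    print(f"a={a}, b={b}:  T(n) / predicted growth = {ratio:.3f}")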

Divide-&-Conquer
MergeSort is the prototypical divide-&-conquer algorithm. Here is a high level description:
Algorithm MergeSort( S )        § John von Neumann [1945]
  Pre-Condition: input S is a sequence of numbers
  Post-Condition: output is a sorted permutation of the input sequence
  Base:     if |S| ≤ 1 then return S
  Divide:   divide S into its left and right halves S = L, R, with |L| ≈ |R| ≈ (1/2)|S|
  Conquer:  L’ ← MergeSort( L ) ;  R’ ← MergeSort( R )
  Combine:  S’ ← Merge( L’, R’ )
  Output:   return S’
end
31

MergeSort Example
INPUT:   23 15 41 82 37 92 66 31 34 73 25 19 33 15 16 58
[Figure: the MergeSort recursion tree on this input. The sequence is halved repeatedly, then sorted sublists are merged back up level by level, e.g. 15 23 41 82 | 31 37 66 92 | 19 25 34 73 | 15 16 33 58, then 15 23 31 37 41 66 82 92 | 15 16 19 25 33 34 58 73, and finally the full sorted output.]
OUTPUT:  15 15 16 19 23 25 31 33 34 37 41 58 66 73 82 92
32

Algorithm MergeSort( A[1..n] )
  B[0..n+1] ← auxiliary array for merging
  MS( 1, n )
end

procedure MS( s, f )                  § sort A[s..f]
  Base:     if s ≥ f then return
  Divide:   m ← ⌈(s+f)/2⌉
  Conquer:  MS( s, m-1 ) ;  MS( m, f )
  Combine:  Merge( s, m, f )
end

procedure Merge( s, m, f )
  B[s-1 .. m-2] ← A[s .. m-1]         § copy left half
  B[m .. f] ← A[m .. f]               § copy right half
  B[m-1] ← B[f+1] ← ∞                 § barrier sentinels
  i ← s-1 ;  j ← m
  for k ← s .. f do
      if B[i] ≤ B[j]
        then A[k] ← B[i]; i++
        else A[k] ← B[j]; j++
end

Merge time = Θ(n), where n = f - s + 1.

[Figure: MERGE loop invariant. A[s..k-1] holds the elements merged so far; the unmerged left half sits in B[s-1..m-2] (scanned by index i) and the unmerged right half in B[m..f] (scanned by index j), with the sentinels at B[m-1] and B[f+1].]
33
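For reference, here is a runnable Python rendering of the same scheme. It is a sketch of mine, not the course's official code: it is 0-indexed and uses float('inf') as the barrier sentinel in temporary lists instead of the shared auxiliary array B[0..n+1].

import math

def merge_sort(A):
    # Sort the list A in place, mirroring MS / Merge from the slide (0-indexed).
    def ms(s, f):                        # sort A[s..f]
        if s >= f:                       # Base
            return
        m = (s + f + 1) // 2             # Divide: m = ceil((s+f)/2)
        ms(s, m - 1); ms(m, f)           # Conquer
        merge(s, m, f)                   # Combine

    def merge(s, m, f):
        left  = A[s:m]     + [math.inf]  # left half  + barrier sentinel
        right = A[m:f + 1] + [math.inf]  # right half + barrier sentinel
        i = j = 0
        for k in range(s, f + 1):        # Theta(f - s + 1) time
            if left[i] <= right[j]:
                A[k] = left[i]; i += 1
            else:
                A[k] = right[j]; j += 1

    ms(0, len(A) - 1)
    return A

print(merge_sort([23, 15, 41, 82, 37, 92, 66, 31, 34, 73, 25, 19, 33, 15, 16, 58]))

Running it on the example input above prints the OUTPUT row shown two slides earlier.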

(Same MergeSort / MS / Merge pseudocode as on the previous slide.)

MergeSort Recurrence:
   T(n) = Θ(1)                              n ≤ 1
   T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + Θ(n)        n > 1
Simplified Recurrence:
   T(n) = 2T(n/2) + Θ(n)    for n > 1
   T(n) = Θ(1)              for n ≤ 1
Solution:   T(n) = Θ(n log n).
34
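To see the Θ(n log n) solution emerge from the exact (un-simplified) recurrence, a short memoized computation can compare T(n) against n·log_2 n. This is a sketch of mine, taking the Θ(n) term as exactly n and the base cost as 0.

from functools import lru_cache
import math

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(ceil(n/2)) + T(floor(n/2)) + n, with T(n) = 0 for n <= 1
    return 0 if n <= 1 else T((n + 1) // 2) + T(n // 2) + n

for n in (10**3, 10**5, 10**7):
    print(n, round(T(n) / (n * math.log2(n)), 4))    # ratio settles near 1 as n grows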

Divide-&-Conquer Recurrence
   T(n) = a·T(n/b) + f(n)   ∀n > 1
   T(n) = Θ(1)              ∀n ≤ 1
a = # of recursive calls (a > 0)
n/b = size of each sub-instance called recursively (b > 1)
f(n) = cost of divide + combine steps (everything except the recursive calls).
In MergeSort we have:
   a = 2 recursive calls, of size n/b = n/2 each.
   Cost of Divide = Θ(1).
   Cost of Combine, i.e., merge, is Θ(n).  So, f(n) = Θ(n).
35

T(n) = a·T(n/b) + f(n)  for n > 1 ;    T(n) = Θ(1)  for n ≤ 1

Recursion Tree:
   level 0:      1 node of cost f(n)                          total  f(n)
   level 1:      a nodes, each of cost f(n/b)                 total  a·f(n/b)
   level 2:      a^2 nodes, each of cost f(n/b^2)             total  a^2·f(n/b^2)
   ...
   level k-1:    a^{k-1} nodes, each of cost f(n/b^{k-1})     total  a^{k-1}·f(n/b^{k-1})
   leaves:       a^k nodes, each of cost Θ(1)
Depth:  k = log_b n = Θ(log n).
Number of leaves:  a^k = a^{log_b n} = n^{log_b a}    (compare their logarithms).
Contribution of the leaves:  h(n) = a^k · Θ(1) = Θ(a^k) = Θ(a^{log_b n}) = Θ(n^{log_b a}).
36

T(n)aT(bn)f(n) n1 Q(1) n  1
T(n) = S(n) + h(n) = Q( max{S(n), h(n)}) h(n) = Q n logba )
S(n)= S f(n)af(n)a2f(n)ak1fn 
S ( n )  k  1 a i f  n 
f(n)
af(n/b)
a2f(n/b2)
bi i0
ak-1f(n/bk-1)
b
b2 bk1
a logb n = n logb a (compare their logarithms)
h(n) = ak ·Q(1) = Q(ak) = Q(alogbn ) = Qalogbn) = Qnlogba)
k = logbn = Q( log n). 37
h(n) = Q(n logba )

T(n)aT(bn)f(n) n1 Q(1) n1
T(n) = S(n) + h(n) Special Solution: contribution of the internal nodes of the recursion tree.
Homogeneous Solution: contribution of the leaves of the recursion tree.
h(n)ah(bn) n1 h(n)=Qnlogba) Q(1) n1
T(n) = h(n)( Q(n) + 1 )
Define: Q(n)S(n), r(n) f(n) h(n) h(n)
Q(n)r(n)rbnrn rn  b2 b3
Q(n)Q(bn)r(n) n1  0  n  1
f(n) is a SAF  r(n) is a SAF
38

Q(n) S(n), r(n) f(n) h(n) h(n)
T(n) = h(n)( Q(n) + 1 )
f(n) is a SAF  r(n) is a SAF
↪ Let n=bk forintegerk=Q(logn).
k Q(n)  Qrbi 
 i1 
r(n)
r(bi)
Q(n)=Q( ? )
T(n)=Q( ? )
W(n+e)
W(b+ie)
r(n)
f(n)
O(n-e)
O(b-ie)
1
h(n)
Q(loge n) e > -1
Q(ie)
r(n)  k
f(n)log n
Q(log-1 n)
Q(1/i)
log k
h(n)log log n
O(loge n) e < -1 O(i-e) 1 h(n) 39 T ( n )   a T ( bn )  f ( n )  n  1 Q(1) n1 h(n) = Q n logba ) THE MASTER THEOREM Suppose f(n) is a SAF, and e  0 & e are constant reals. 𝒇(𝒏) / 𝒉(𝒏) 𝛀 𝒏+𝜺 Compare with CLRS. See Exercise 4. 𝚯 𝐥𝐨𝐠𝒆𝒏 (e > -1)
𝚯 𝐥𝐨𝐠𝒆𝒏 (e = -1)
𝐎 𝐥𝐨𝐠𝒆𝒏 (e < -1) 𝐎 𝒏−𝜺 f(n) » h(n) f(n)  h(n) f(n) « h(n) 𝑻(𝒏) 𝛩(𝑓(𝑛)) 𝛩(𝑓(𝑛)log𝑛) = 𝛩(𝑛log𝑏𝑎log𝑒+1𝑛) 𝛩(h(𝑛) log log 𝑛) = 𝛩(𝑛log𝑏𝑎 log log 𝑛) 𝛩(h(𝑛)) = 𝛩(𝑛log𝑏𝑎) 40 T ( n )   a T ( bn )  f ( n )  n  1 Q(1) n1 h(n) = Q n logba ) f(n) = Q( tn nd log e n) constants t > 0, d, e.
   t > 1                                        T(n) = Θ( f(n) )
   t = 1 ,  b^d > a                             T(n) = Θ( f(n) )
   t = 1 ,  b^d = a   (i.e., d = log_b a) :
        e > -1                                  T(n) = Θ( f(n)·log n )
        e = -1                                  T(n) = Θ( h(n)·log log n )
        e < -1                                  T(n) = Θ( h(n) )
   t = 1 ,  b^d < a                             T(n) = Θ( h(n) )
   0 < t < 1                                    T(n) = Θ( h(n) )
41

EECS 3101  Sums & Recurrences Fact Sheet

Logarithms:
   log(xy) = log x + log y ;    log(x^y) = y·log x ;    log_y x = log x / log y ;    log(1 ± x) = ±Θ(x)  for x = o(1)

Sums of powers:
   ∑_{i=1}^{n} i^d  =  Θ(n^{d+1}) if d > -1 ;   Θ(log n) if d = -1 ;   Θ(1) if d < -1

Geometric series:
   ∑_{i=0}^{n} x^i  =  (x^{n+1} - 1)/(x - 1)  if 0 ≠ |x| ≠ 1 ;    = Θ(x^n) if x > 1 ;    = Θ(1) if 0 < x < 1

Approximation by integrals:
   For f ↑:   ∫_{m-1}^{n} f(x) dx  ≤  ∑_{i=m}^{n} f(i)  ≤  ∫_{m}^{n+1} f(x) dx
   For f ↓:   ∫_{m-1}^{n} f(x) dx  ≥  ∑_{i=m}^{n} f(i)  ≥  ∫_{m}^{n+1} f(x) dx

ASSUME [ASymptotic SUmmation Made Easy]:
   S(n) = ∑_{i=1}^{n} f(i)  for  f(n) ↑ = ω(1) ;    g(n) ← n - m  where  log f(m) = log f(n) - Θ(1) :
   f ↑ and g ↑    ⟹    S(n) = Θ( g(n)·f(n) ).

   Sum Type        f(n) : a SAF                                          S(n) = ∑_{i=1}^{n} f(i)
   Geometric       2^{Ω(n)}                                               Θ( f(n) )
   Arithmetic      n^{Θ(1) - 1}                                           Θ( n·f(n) )
   Harmonic        Θ(n^{-1}·log^e n) , e = a real constant                Θ(log^{e+1} n) if e > -1 ;  Θ(log log n) if e = -1 ;  Θ(1) if e < -1
   Bounded Tail    n^{-Ω(1) - 1} , or even O(n^{-1}·log^e n) , e < -1     Θ(1)

MASTER METHOD:   T(n) = a·T(n/b) + f(n)  ∀n > 1 ;   Θ(1)  ∀n ≤ 1 ;    h(n) = Θ(n^{log_b a}) ;
                 f(n) : a SAF ;  constants ε > 0 & e.
   f(n) : h(n)        f(n)/h(n) =                                T(n) =
   f(n) ≫ h(n)        Ω(n^{ε})                                   Θ( f(n) )
   f(n) ≈ h(n)        Θ(log^e n) , e > -1                        Θ( f(n)·log n )
                      Θ(log^{-1} n) , e = -1                     Θ( h(n)·log log n )
   f(n) ≪ h(n)        O(n^{-ε}) , or even O(log^e n) , e < -1    Θ( h(n) )
42

T(n) = a·T(n/b) + f(n)  for n > 1 ;    T(n) = Θ(1)  for n ≤ 1 ;    h(n) = Θ(n^{log_b a})

Example 1:  Assume a = 2, b = 4, f(n) = Θ(n^d·log^2 n). Express T(n) based on the constant parameter d.
Solution:   log_b a = log_4 2 = 1/2    ⟹    h(n) = Θ(√n)
   d > 1/2    ⟹    T(n) = Θ( f(n) ) = Θ( n^d·log^2 n )
   d = 1/2    ⟹    T(n) = Θ( f(n)·log n ) = Θ( √n·log^3 n )
   d < 1/2    ⟹    T(n) = Θ( h(n) ) = Θ( √n )
43

T(n) = a·T(n/b) + f(n)  for n > 1 ;    T(n) = Θ(1)  for n ≤ 1 ;    h(n) = Θ(n^{log_b a})

Example 2:  Assume b = 3, f(n) = Θ(n^4 / log n). Express T(n) based on the constant parameter a.
Solution:   h(n) = Θ(n^{log_3 a}) ,   b^d = 3^4 = 81
   a < 81    ⟹    T(n) = Θ( f(n) ) = Θ( n^4 / log n )
   a = 81    ⟹    T(n) = Θ( h(n)·log log n ) = Θ( n^4·log log n )
   a > 81    ⟹    T(n) = Θ( h(n) ) = Θ( n^{log_3 a} )
44
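The classification in the table and in Examples 1-2 is mechanical enough to script. The following Python sketch is a helper of mine (not course code); it covers only the t = 1 sub-cases and uses a floating-point tolerance in place of exact comparison of d with log_b a. It reproduces both examples.

import math

def master(a, b, d, e, eps=1e-9):
    # Classify T(n) = a*T(n/b) + Theta(n^d * log^e n): the t = 1 cases of the table above.
    p = math.log(a, b)                      # h(n) = Theta(n^p),  p = log_b a
    if d > p + eps:                         # b^d > a : the driving function dominates
        return f"Theta(n^{d:g} log^{e:g} n)"
    if d < p - eps:                         # b^d < a : the leaves dominate
        return f"Theta(n^{p:g})"
    if e > -1:                              # b^d = a and e > -1
        return f"Theta(n^{p:g} log^{e + 1:g} n)"
    if e == -1:                             # b^d = a and e = -1
        return f"Theta(n^{p:g} loglog n)"
    return f"Theta(n^{p:g})"                # b^d = a and e < -1

for d in (1, 0.5, 0.25):                    # Example 1: a = 2, b = 4, f(n) = Theta(n^d log^2 n)
    print("Example 1, d =", d, "->", master(2, 4, d, 2))
for a in (80, 81, 82):                      # Example 2: b = 3, f(n) = Theta(n^4 / log n)
    print("Example 2, a =", a, "->", master(a, 3, 4, -1))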

Guess-&-Verify
BASIC INGREDIENTS:
 Guess a solution to the recurrence.
 Verify it by mathematical induction.
The induction variable must be a natural number.
E.g., height of the recursion tree (max recursion depth), or
size of the recursion tree (# times you apply the recurrence). n may also be a candidate if it “stays” integer.
 If you spot a place where the verification does not go through, examine it to help revise your guess, and try again.
 One key guiding principle, taken with a grain of salt, is to first make sure the leading term (with its constant coefficient) matches on the LHS and RHS of the recurrence. That will show you whether to guess higher or lower next time! After figuring out the leading term, you can apply the same principle to find the lower order terms.
45

Guess-&-Verify: Example 1
T(n) = 4T(n/2) + n^2  for n > 1 ;    T(n) = 1  for n ≤ 1

• Clearly T(n) = Ω(n^2).  Is T(n) = O(n^2)?
• Guess #1:   T(n) = c·n^2 + L(n),
  const. c > 0,  L(n) = o(n^2) lower order terms (to be determined).
• Plug this guess into the recurrence and verify:   T(n) =? 4T(n/2) + n^2
     c·n^2 + L(n)  =?  4[ c·(n/2)^2 + L(n/2) ] + n^2  =  (c + 1)·n^2 + 4L(n/2)
• LHS leading term < RHS leading term  ⟹  guess is too low.
46

Guess-&-Verify: Example 1
T(n) = 4T(n/2) + n^2  for n > 1 ;    T(n) = 1  for n ≤ 1

• Guess #2:   T(n) = c·n^{2+ε} + L(n),
  const. c > 0, ε > 0,  L(n) lower order terms (to be determined).
• Plug this guess into the recurrence and verify:   T(n) =? 4T(n/2) + n^2
     c·n^{2+ε} + L(n)  =?  4[ c·(n/2)^{2+ε} + L(n/2) ] + n^2  =  (1/2^ε)·c·n^{2+ε} + 4L(n/2) + n^2
• LHS leading term > RHS leading term  ⟹  guess is too high.
47

Guess-&-Verify: Example 1
T(n) = 4T(n/2) + n^2  for n > 1 ;    T(n) = 1  for n ≤ 1

• Guess #3:   T(n) = c·n^2·log n + L(n),
  const. c > 0,  L(n) lower order terms (to be determined).
• Plug this guess into the recurrence and verify:   T(n) =? 4T(n/2) + n^2
     c·n^2·log n + L(n)  =?  4[ c·(n/2)^2·log(n/2) + L(n/2) ] + n^2  =  c·n^2·log n + 4L(n/2) + (1 - c)·n^2
• LHS leading term = RHS leading term  ⟹  correct guess of the high order term.
• Solve the residual recurrence for L(n) (with revised boundary condition):
     L(n) = 4L(n/2) + (1 - c)·n^2
  Claim: c = 1. Why?  If 1 - c > 0, then L(n) is not lower order.
  If 1 - c < 0, then L(n) becomes negative but the leading term gets larger.
  Residual recurrence:   L(n) = 4L(n/2)   (plus revised boundary condition)
  Solution:   L(n) = O(n^2)  and  T(n) = n^2·log n + O(n^2) = Θ(n^2·log n).
48

Guess-&-Verify: Example 2
Recurrence:   T(n) = 2T(⌊n/2⌋) + n^2 ,   T(0) = 0
T(n) ≥ n^2  ⟹  Ω(n^2) is obvious. Let’s see if O(n^2) is an upper bound:
Guess:   T(n) ≤ a·n^2 + b·n + c    (for some constants a > 0, b, c to be determined)
Basis (n = 0):   T(0) = 0 ≤ a·0^2 + b·0 + c ;  we need c ≥ 0.
Ind. Step (n ≥ 1):   T(n) = 2T(⌊n/2⌋) + n^2
     ≤ 2[ a·⌊n/2⌋^2 + b·⌊n/2⌋ + c ] + n^2         by induction hypothesis
if n is even:   = 2[ a·(n/2)^2 + b·(n/2) + c ] + n^2 = (1 + a/2)·n^2 + b·n + 2c
     (note: not “≤” for odd n, if b < 0)
  We want  (1 + a/2)·n^2 + b·n + 2c  ≤  a·n^2 + b·n + c   (our guessed upper bound),
  i.e., for all even n ≥ 1:   (a/2 - 1)·n^2 - c ≥ 0.
49

if n is odd:   = 2[ a·((n-1)/2)^2 + b·((n-1)/2) + c ] + n^2 = (1 + a/2)·n^2 + (b - a)·n + (2c - b + a/2)
  We want this  ≤  a·n^2 + b·n + c   (our guessed upper bound),
  i.e., for all odd n ≥ 1:   (a/2 - 1)·n^2 + a·n + (b - c - a/2) ≥ 0.
The choice  a = 2, b = -1, c = 0  works for all n ≥ 0:
     n^2 ≤ T(n) ≤ 2n^2 - n
50

Guess-&-Verify: Example 2
Recurrence:   T(n) = 2T(⌊n/2⌋) + n^2 ,   T(0) = 0
a = 2, b = -1, c = 0 works for all n ≥ 0:   n^2 ≤ T(n) ≤ 2n^2 - n.
Exercise 8: Using this same method, verify the following tighter LOWER BOUND:
     n ≥ 1:   2n^2 - n - 2n·log n  ≤  T(n)  ≤  2n^2 - n
  ⟹  T(n) = 2n^2 - n - O(n log n).
For more examples of guess-&-verify see Lecture Note 3 & Sample Solutions.
51

Full History Recurrence
Such recurrences often arise in average-case analysis.
Example 1:   T(n) = ∑_{i=0}^{n-1} T(i) + f(n) ,   n ≥ 0
Values of T(n) for some small n:
T(0) = f(0)
T(1) = T(0) + f(1) = f(0) + f(1)
T(2) = T(0) + T(1) + f(2) = 2f(0) + f(1) + f(2)
T(3) = T(0) + T(1) + T(2) + f(3) = 4f(0) + 2f(1) + f(2) + f(3)
T(4) = T(0) + T(1) + T(2) + T(3) + f(4) = 8f(0) + 4f(1) + 2f(2) + f(3) + f(4)

Example 2:   T(n) = (2/n)·∑_{i=0}^{n-1} T(i) + n ,   n > 0
This is the QuickSort expected time recurrence (details shown later in LS5).
52

Full History Recurrence
Example 1:   T(n) = ∑_{i=0}^{n-1} T(i) + f(n) ,   n ≥ 0
A Solution Method:
n → n-1:     T(n-1) = ∑_{i=0}^{n-2} T(i) + f(n-1) ,   n-1 ≥ 0  (i.e., n ≥ 1)
Subtract:    T(n) - T(n-1) = T(n-1) + f(n) - f(n-1) ,   n ≥ 1
Rearrange:   T(n) = 2T(n-1) + f(n) - f(n-1)   for n ≥ 1 ;    T(n) = f(0)   for n = 0
Now there is only one appearance of T(·) on the RHS. Continue with conventional methods.
53
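A small Python check (my own code) that the original full-history form and the telescoped form produce identical values, plus a direct evaluation of Example 2's QuickSort-style recurrence showing the Θ(n log n) behaviour.

import math

def full_history(f, N):
    # T(n) = sum_{i=0}^{n-1} T(i) + f(n), computed directly from the definition
    T = []
    for n in range(N + 1):
        T.append(sum(T) + f(n))
    return T

def telescoped(f, N):
    # after subtracting: T(n) = 2*T(n-1) + f(n) - f(n-1) for n >= 1, T(0) = f(0)
    T = [f(0)]
    for n in range(1, N + 1):
        T.append(2 * T[n - 1] + f(n) - f(n - 1))
    return T

f = lambda n: n * n                          # an arbitrary driving function for the check
assert full_history(f, 30) == telescoped(f, 30)

# Example 2 (QuickSort-style): T(n) = (2/n) * sum_{i<n} T(i) + n  ->  Theta(n log n)
S, Tn = 0.0, 0.0
for n in range(1, 100001):
    S += Tn                                  # S = T(0) + ... + T(n-1)
    Tn = 2.0 * S / n + n
print(Tn / (100000 * math.log(100000)))      # ratio slowly approaches 2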

Variable Substitution: Example 1
Sometimes we can considerably simplify the recurrence by a change of variable.
   T(n) = T(n/2) + log n        [assume the usual boundary condition T(O(1)) = O(1).]
Change of variable: n = 2^m, i.e., m = log n.
Now T(n) = T(2^m). It is some function of m. Rename it S(m). That is, T(n) = T(2^m) = S(m).
The recurrence becomes:
   S(m) = S(m-1) + m.
By the iteration method:  S(m) = Θ(m^2).
So, T(n) = T(2^m) = S(m) = Θ(m^2) = Θ(log^2 n).
   T(n) = Θ(log^2 n).
54
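A quick numeric confirmation of this answer (a sketch of mine; the boundary value T(1) = 0 is an arbitrary Θ(1) choice):

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(n/2) + log n, with T(1) = 0 as the boundary
    return 0.0 if n <= 1 else T(n // 2) + math.log2(n)

for n in (2**10, 2**20, 2**40):
    print(n, round(T(n) / math.log2(n) ** 2, 3))    # ratio tends to 1/2, so T(n) = Theta(log^2 n)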

Variable Substitution: How?
Recurrence:            T(n) = T(n/2) + f(n)
Boundary Condition:    T(n) = 0 for n < 1
Rename:   n = g(m) ,  n/2 = g(m-1) ,  S(m) = T(g(m)) ,  h(m) = f(g(m))
Now solve 2 recurrences:
  1)  g(m) = 2g(m-1)   ⟹   g(m) = 2^m·g(0) = 2^m    (our choice: g(0) = 1)
  2)  S(m) = S(m-1) + h(m)   ⟹   S(m) = h(0) + h(1) + ··· + h(m) ,   with S(0) = T(g(0)) = T(1) = f(1) = h(0)
Now back substitute:  T(n) = T(g(m)) = S(m) = S(g^{-1}(n)),  where g^{-1} is the functional inverse of g.
See another example on the next page.
55

Variable Substitution: Example 2
T(n) = T(√n) + 1
Rename:   n = g(m) ,  √n = g(m-1) ,  S(m) = T(g(m)).
g(m) = g(m-1)^2 = g(m-2)^{2^2} = g(m-3)^{2^3} = ··· = g(0)^{2^m}
g(0) = 2   ⟹   n = g(m) = 2^{2^m}   ⟹   m = log log n.
S(m) = S(m-1) + 1   ⟹   S(m) = Θ(m) = Θ(log log n) ,   with S(0) = T(2) = Θ(1).
T(n) = Θ(log log n).
56

Exercises
57

1. For each pseudo-code below derive the simplified asymptotic running time in Θ(?) notation.
(a) for i ← 1..n do for j ← 1..2*i do print( i+j )
(b) for i ← 1..n do for j ← 2*i..n do print( i+j )
(c) for i ← 1..n do { j ← 1; while j ≤ i do j ← j+3 }
(d) for i ← 1..n do { j ← 1; while i+j ≤ n do j ← j+3 }
(e) for i ← 1..n do { j ← 1; while j ≤ i do j ← j+j }
(f) for i ← 1..n do { j ← 1; while j*j ≤ i do j ← j+1 }
(g) for i ← 1..n do { j ← 2; while j ≤ n do j ← j*j }
(h) for i ← 1..n do { j ← 2; while j ≤ i do j ← j*j }
(i) for i ← 1..n do { j ← 1; while i*j ≤ n do j ← j+1 }
(j) for i ← 1..n do { j ← 1; while i*i ≥ j do j ← j+1 }
(k) for i ← 1..n do { j ← n; while i < j*j do j ← j - 2 }
(l) for i ← 1..n do { j ← n; while i ≤ j*j do j ← j div 2 }
58

59

4. The Master Theorem: CLRS versus these Slides:
The case “f(n)/h(n) = Ω(n^ε)” of the Master Theorem [page 94 of CLRS] has the following extra condition:
   (*)   a·f(n/b) < c·f(n)  for some constant c < 1 and all sufficiently large n.
The same theorem on page 40 of these slides instead uses the condition:
   (**)  f(n) is a SAF.
Show that (**) implies (*) in the case “f(n)/h(n) = Ω(n^ε)”.
[Hint: Consider the SAF r(n) = f(n)/h(n) ∈ Ω(n^{+ε}). So r(b^i) = Ω(b^{+iε}) is increasing at least
exponentially as a function of i. Then, with some extra derivation, show that r(n/b)/r(n) < c < 1 for some
constant c and all sufficiently large n.  c = b^{-ε/2} works.]

5. Exercise 6 of Lecture Note 3 (page 9) solves the recurrence T(n) = 3T(n/3 + 5) + n/2 by the guess-&-verify
method. Such recurrences can also be solved by the variable substitution method. Consider the somewhat
generalized recurrence T(n) = a·T(n/b + c) + f(n), where a > 0, b > 1, and c are constants, and f(n) is the
driving function which is a SAF.
a) Show the substitution S(n) = T(n + cb/(b-1)) results in the new recurrence S(n) = aS(n/b) + f(n + cb/(b-1)). [Now this can be solved by the Master Theorem.]
b) Use this method to solve the recurrence T(n) = 4T(n/2 + 7) + n^2.
60

(m)   T(n) = ∑_{i=0}^{n-1} i·T(i) + n
61

(n)   T(n) = ( n·T(n-1) + 1 ) / (n - 1)
(o)   T(n) = ( (n-1)·T(n-1) + 1 ) / n
62

8. Show that the recurrence   T(n) = 2T(⌊n/2⌋) + n^2 ,   T(0) = 0
   has the solution:   T(n) = 2n^2 - n - O(n log n).

9. Show that the recurrence
      T(n) = T(an) + T(bn) + n    for n ≥ n₀
      T(n) = c                    for 0 ≤ n < n₀
   for any real constants 0 ≤ a < 1, 0 ≤ b < 1, c > 0, n₀ > 0, has the following solutions:
   a)  a + b < 1   ⟹   T(n) = Θ(n)
   b)  a + b = 1   ⟹   T(n) = Θ(n log n)
   c)  a + b > 1   ⟹   T(n) = Θ(n^d),  where d > 1 is the unique constant
       that satisfies the equation:  a^d + b^d = 1.
63

64

12. Derive the most simplified answers to the following summations. Mention which methods you used to derive your solution.
a)  ∑_{i=1}^{log n} ( i·2^i + 7i^2·5^i + 4i^6·log i ) = Θ(?)
b)  ∑_{i=1}^{n} (3i+4) · ∑_{j=1}^{2i} (2j+7)^5 / (9j-1)^3 = Θ(?)
c)  ∑_{i=1}^{2n} (8^i + 3i^5) · ∑_{j=1}^{n} 2^{i+j} / j^4 = Θ(?)
d)  ∑_{i=1}^{n} ∑_{j=1}^{i^4} (i^2 + j)^4 / (2^i·log i + 5j^2) = Θ(?)
e)  ∑_{i=1}^{n^2} ∑_{j=1}^{log i} (7·3^j + 5j^2) / (2j^3 + 9j·log j) = Θ(?)
f)  ∑_{i=1}^{n} ∑_{j=1}^{i·log i} (i·log j) / (j + log i) = Θ(?)
65

13. Recursion Time Analysis:
A certain recursive algorithm takes an input list of n elements. It divides the list into √n sub-lists, each
with √n elements, recursively solves each of these √n smaller sub-instances, and then spends an additional
Θ(n) time to combine the solutions of these sub-instances to obtain the solution of the main instance.
As a base case, if the size of the input list is at most a specified positive constant, then the algorithm
solves such a small instance directly in Θ(1) time.
a) Express the recurrence relation that governs T(n), the time complexity of this algorithm.
b) Derive the solution to this recurrence relation: T(n) = Θ(?). Mention which methods you used to derive your solution.

14. The expected-case running time of a particular algorithm is given by the following recurrence relation
(with the usual boundary condition T(O(1)) = Θ(1)):
      T(n) = (1/2)·[ T(3n/4) + T(n/4) ] + (1/2)·T(n-1) + Θ(n).
Show that the solution to this recurrence is T(n) = Θ(n log n).
66

[Akra-Bazzi] A Generalization of the Master Theorem [ See LN3a for details]:
Consider the following (continuous) recurrence
      T(x) = ∑_{i=1}^{k} a_i·T( b_i·x + ε_i(x) ) + f(x)
where k is fixed, a_i > 0, 0 < b_i < 1, |ε_i(x)| = O( x / log^2 x ), and f(x) has polynomial growth rate.

Theorem [Akra-Bazzi, 1996]:  The asymptotic solution to the above recurrence is
      T(x) = Θ( x^p · ( 1 + ∫_1^x  f(u) / u^{1+p}  du ) )
where p is the unique real number such that  ∑_{i=1}^{k} a_i·b_i^p = 1.

15. Use the Akra-Bazzi Theorem to show (or answer) the following:
a)  T(x) = 2T(x/4) + 3T(x/6) + Θ(x·log x)                       ⟹  p = 1  and  T(x) = Θ(x·log^2 x).
b)  T(x) = 2T(x/2) + (8/9)·T(3x/4) + Θ(x^2 / log x)             ⟹  p = 2  and  T(x) = Θ(x^2·log log x).
c)  T(x) = T(x/2) + Θ(log x)                                    ⟹  p = 0  and  T(x) = Θ(log^2 x).
d)  T(x) = (1/2)·T(x/2) + Θ(1/x)                                ⟹  p = -1  and  T(x) = Θ( (log x) / x ).
e)  T(x) = 4T(x/2) + Θ(x)                                       ⟹  p = 2  and  T(x) = Θ(x^2).
f)  T(x) = 2T(x/4 + 5√x) + 3T(x/6 - 7·log x) + Θ(x^2·log x)     ⟹  p = ???  and  T(x) = Θ(???).
67

END
68