
Com S 311 Section B Introduction to the Design and Analysis of Algorithms
Lecture One for Week 2
Xiaoqiu (See-ow-chew) Huang
Iowa State University
February 2, 2021

// Sorts the elements in the subarray A[p .. r]
MERGE-SORT(A, p, r)
if p < r
    q = ⌊(p + r)/2⌋
    MERGE-SORT(A, p, q)
    MERGE-SORT(A, q + 1, r)
    MERGE(A, p, q, r)

The worst-case running time T(n) of MERGE-SORT on an input of size n satisfies the following recurrence for some constant d > 0:
T(1) = d,
T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + dn if n > 1.
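The pseudocode above can be sketched as a runnable Python version (not part of the slides; the bounds p and r are inclusive, as in the pseudocode):

```python
def merge(A, p, q, r):
    """Merge the sorted subarrays A[p..q] and A[q+1..r] (inclusive bounds)."""
    left = A[p:q + 1]
    right = A[q + 1:r + 1]
    i = j = 0
    for k in range(p, r + 1):
        # Take from left if right is exhausted, or if left's head is smaller.
        if j >= len(right) or (i < len(left) and left[i] <= right[j]):
            A[k] = left[i]
            i += 1
        else:
            A[k] = right[j]
            j += 1

def merge_sort(A, p, r):
    """Sort A[p..r] in place, mirroring MERGE-SORT(A, p, r)."""
    if p < r:
        q = (p + r) // 2          # q = floor((p + r)/2)
        merge_sort(A, p, q)
        merge_sort(A, q + 1, r)
        merge(A, p, q, r)
```

Calling merge_sort(A, 0, len(A) - 1) sorts the whole array.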

Analysis of MERGE-SORT
Assume that n is a perfect power of 2, i.e., n = 2^k for some k ≥ 0. Then the recurrence for T(n) becomes
T(n) = 2T(n/2) + dn if n > 1.
Repeatedly replacing the left side by the right side, we obtain
T(n) = 2T(n/2) + dn = 2[2T(n/2^2) + d(n/2)] + dn
= [2^2 T(n/2^2) + dn] + dn = 2^2 T(n/2^2) + 2dn
= 2^h T(n/2^h) + hdn for 1 ≤ h ≤ k
= 2^k T(n/2^k) + kdn = 2^k T(1) + kdn = nd + kdn = dn(k + 1)
= dn(1 + log n) ≤ dn(log n + log n) = 2dn log n = cn log n for n ≥ 2, with c = 2d.
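The unrolling can be checked numerically. This sketch (not from the slides; d = 3 is an arbitrary sample constant) evaluates the exact recurrence and compares it with the closed form dn(k + 1) at powers of 2:

```python
from functools import lru_cache

d = 3  # arbitrary positive constant

@lru_cache(maxsize=None)
def T(n):
    """Exact recurrence: T(1) = d, T(n) = T(ceil(n/2)) + T(floor(n/2)) + dn."""
    if n == 1:
        return d
    return T(-(-n // 2)) + T(n // 2) + d * n  # -(-n // 2) is ceil(n/2)

# At n = 2^k the unrolling gives T(n) = dn(k + 1), and dn(k + 1) <= 2dn lg n for n > 2.
for k in range(0, 12):
    n = 2 ** k
    assert T(n) == d * n * (k + 1)
    if n > 2:
        assert T(n) <= 2 * d * n * k  # k = lg n here
```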

Analysis of MERGE-SORT
If n is not a perfect power of 2, then 2^k < n < 2^(k+1) for some k ≥ 0. Since T(n) is monotonically increasing (shown later), T(n) ≤ T(2^(k+1)) = d 2^(k+1)(k + 2) ≤ (2dn)(log n + 2) ≤ 6dn log n for n ≥ 2.
Thus, T (n) is in O(n log n).

Asymptotic Notation
The asymptotic efficiency of algorithms is concerned with how the running time of an algorithm increases with the input size tending towards infinity (increasing without bound).
An algorithm that is asymptotically more efficient will be the best choice for all but sufficiently small inputs.
The asymptotic running time of an algorithm is described in terms of functions whose domains are the set of natural numbers
N = {0, 1, 2, …}.
We assume that every function in every asymptotic notation is asymptotically nonnegative.

Θ-Notation
For a given function g(n), let Θ(g(n)) denote the following set of functions:
Θ(g(n)) = {f (n) : 0 ≤ c1g(n) ≤ f (n) ≤ c2g(n) for all n ≥ n0
for some positive constants c1, c2, and n0 }.
In other words, for all n ≥ n0, the function f (n) is equal to g(n) to within a constant factor.
The function g (n) is an asymptotically tight bound for f (n).

Example
Show that f(n) = an^2 + bn + c is in Θ(n^2), where a, b, and c are constants with a > 0.
We need to find positive constants c1, c2, and n0 such that for all n ≥ n0,
c1 n^2 ≤ an^2 + bn + c ≤ c2 n^2,
which is the same as
(a − c1)n^2 ≥ −bn − c, and
(c2 − a)n^2 ≥ bn + c.

Example
We can make the two previous inequalities hold by requiring
(a − c1)n^2 ≥ |b|n + |c| (≥ −bn − c),
and
(c2 − a)n^2 ≥ |b|n + |c| (≥ bn + c).
Set a − c1 = c2 − a and choose c1 = a/4. Then c2 = 7a/4. We just need to show that
(3a/4)n^2 ≥ |b|n + |c|.
The inequality holds if we require
(2a/4)n^2 ≥ |b|n, and
(a/4)n^2 ≥ |c|.
This is the same as n ≥ 2(|b|/a), and n ≥ 2√(|c|/a).

Example
Choose n0 = 1 + 2 max(|b|/a, √(|c|/a)). Then for all n ≥ n0 > 0, the previous two inequalities hold.
Note that we have chosen c1 = a/4 and c2 = 7a/4.
Note that since a > 0, c1 and c2 are positive, and that n0 is positive.
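The choice of constants can be sanity-checked numerically. A sketch with arbitrary sample coefficients (a = 4, b = −5, c = 3 are assumptions, not from the slides):

```python
import math

a, b, c = 4.0, -5.0, 3.0              # sample quadratic with a > 0
c1, c2 = a / 4, 7 * a / 4             # constants from the argument above
n0 = 1 + 2 * max(abs(b) / a, math.sqrt(abs(c) / a))

# Verify c1*n^2 <= an^2 + bn + c <= c2*n^2 for all integer n >= n0 (up to a cutoff).
for n in range(math.ceil(n0), 1000):
    f = a * n * n + b * n + c
    assert c1 * n * n <= f <= c2 * n * n
```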

O -Notation
For a given function g(n), let O(g(n)) denote the following set of
functions:
O(g(n)) = {f (n) : 0 ≤ f (n) ≤ cg(n) for all n ≥ n0 for some positive constants c and n0 }.
In other words, for all n ≥ n0, the function f (n) is bounded from above by a constant times g(n).
The function g (n) is an asymptotically upper bound for f (n).
By definition, all functions in Θ(g(n)) are in O(g(n)).
Thus, any quadratic function an^2 + bn + c with a > 0 is in O(n^2).
As another example, an + b with a > 0 is in O(n^2), because
0 ≤ an + b ≤ an + |b| ≤ an + |b|n = (a + |b|)n ≤ cn^2
for all n ≥ n0, where n0 = max(1, −b/a) and c = a + |b| > 0.
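This bound can also be spot-checked. A sketch with arbitrary sample coefficients (a = 2, b = −7 are assumptions):

```python
import math

a, b = 2.0, -7.0            # sample linear function with a > 0
c = a + abs(b)              # c = a + |b|
n0 = max(1.0, -b / a)       # n0 = max(1, -b/a)

# Verify 0 <= an + b <= c*n^2 for all integer n >= n0 (up to a cutoff).
for n in range(math.ceil(n0), 1000):
    f = a * n + b
    assert 0 <= f <= c * n * n
```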

Ω-Notation
For a given function g(n), let Ω(g(n)) denote the following set of functions:
Ω(g(n)) = {f (n) : 0 ≤ cg(n) ≤ f (n) for all n ≥ n0
for some positive constants c and n0 }.
In other words, for all n ≥ n0, the function f (n) is bounded from below by a constant times g(n).
The function g (n) is an asymptotically lower bound for f (n).

Ω-Notation
The following theorem follows from the definitions of the three asymptotic notations.
Theorem 3.1
For any two functions f (n) and g (n), f (n) is in Θ(g (n)) if and only if f (n) is in O(g(n)) and is in Ω(g(n)).
Thus, any quadratic function an^2 + bn + c with a > 0 is in Ω(n^2).
In practice, we often use the theorem to prove asymptotically tight bounds from asymptotic upper and lower bounds.
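Theorem 3.1 can be illustrated numerically: for a quadratic with a > 0, one pair of constants witnesses both the lower (Ω) and the upper (O) bound. A sketch with arbitrary sample coefficients (a = 3, b = −4, c = 2 are assumptions):

```python
import math

a, b, c = 3.0, -4.0, 2.0                 # sample quadratic with a > 0
lower, upper = a / 4, 7 * a / 4          # constants from the earlier Theta example
n0 = math.ceil(1 + 2 * max(abs(b) / a, math.sqrt(abs(c) / a)))

for n in range(n0, 2000):
    f = a * n * n + b * n + c
    assert lower * n * n <= f            # witnesses f in Omega(n^2)
    assert f <= upper * n * n            # witnesses f in O(n^2)
```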

Monotonicity
A function f (n) is monotonically increasing if f (m) ≤ f (n) for all m ≤ n.
Let d be a positive constant. It can be shown by mathematical induction that the function T(n) defined below is monotonically increasing:
T(1) = d,
T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + dn if n > 1.
A function f(n) is strictly increasing if f(m) < f(n) for all m < n.
A function f(n) is monotonically decreasing if f(m) ≥ f(n) for all m ≤ n.
A function f(n) is strictly decreasing if f(m) > f(n) for all m < n.

Floors and Ceilings
For any real number x, let ⌊x⌋ denote the greatest integer less than or equal to x, and let ⌈x⌉ be the least integer greater than or equal to x.
For all real x, we have
x − 1 < ⌊x⌋ ≤ x ≤ ⌈x⌉ < x + 1. (3.3)
For the last inequality, write x = k + s with k an integer and 0 ≤ s < 1; then ⌈x⌉ = k + 1 ≤ x + 1 if s > 0, or ⌈x⌉ = k < k + 1 = x + 1 if s = 0.

Floors and Ceilings
For any integer n, we have ⌈n/2⌉ + ⌊n/2⌋ = n. This was proved before.
For any real number x ≥ 0 and integers a, b > 0, we have
⌈⌈x/a⌉/b⌉ = ⌈x/(ab)⌉, (3.4)
⌊⌊x/a⌋/b⌋ = ⌊x/(ab)⌋. (3.5)
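Identities (3.4) and (3.5) can be spot-checked with exact rational arithmetic. A sketch using Python's Fraction (the ranges and seed are arbitrary):

```python
import math
import random
from fractions import Fraction

random.seed(311)
for _ in range(1000):
    a = random.randint(1, 20)
    b = random.randint(1, 20)
    x = Fraction(random.randint(0, 10**6), random.randint(1, 100))   # real x >= 0
    # (3.4): ceil(ceil(x/a)/b) == ceil(x/(ab))
    assert math.ceil(Fraction(math.ceil(x / a), b)) == math.ceil(x / (a * b))
    # (3.5): floor(floor(x/a)/b) == floor(x/(ab))
    assert math.floor(Fraction(math.floor(x / a), b)) == math.floor(x / (a * b))
```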

Floors and Ceilings
Define k = ⌊x/(ab)⌋. Then x = abk + s for some real number s with 0 ≤ s < ab. If s = 0, then each side in (3.4) and (3.5) becomes k, so (3.4) and (3.5) hold.
Consider the case where 0 < s < ab. Note that ⌈x/(ab)⌉ = k + 1. Dividing by a, we have x/a = bk + s/a with 0 < s/a < b. Since bk is an integer, ⌈x/a⌉ = bk + ⌈s/a⌉, and by (3.3) and the definition of ⌈s/a⌉ we have 1 ≤ ⌈s/a⌉ ≤ b. Hence
⌈⌈x/a⌉/b⌉ = k + ⌈⌈s/a⌉/b⌉ = k + 1 = ⌈x/(ab)⌉,
which proves (3.4). Similarly, ⌊x/a⌋ = bk + ⌊s/a⌋ with 0 ≤ ⌊s/a⌋ ≤ b − 1, so
⌊⌊x/a⌋/b⌋ = k + ⌊⌊s/a⌋/b⌋ = k = ⌊x/(ab)⌋,
which proves (3.5).

Floors and Ceilings
For any integers a ≥ 0 and b > 0, we prove that
⌈a/b⌉ ≤ (a + (b − 1))/b, (3.6)
⌊a/b⌋ ≥ (a − (b − 1))/b. (3.7)
By (3.3), we have ⌈a/b⌉ < a/b + 1, so b⌈a/b⌉ < a + b; since both sides are integers, b⌈a/b⌉ ≤ a + (b − 1), which gives (3.6). Likewise, ⌊a/b⌋ > a/b − 1 gives b⌊a/b⌋ > a − b, so b⌊a/b⌋ ≥ a − (b − 1), which gives (3.7).

Polynomials
Given a nonnegative integer d, a polynomial in n of degree d is a function of the form
p(n) = Σ_{i=0}^{d} a_i n^i,
where the constants a_0, a_1, …, a_d are the coefficients of the polynomial and a_d ≠ 0. If a_d > 0, then the polynomial p(n) is asymptotically positive and we can show that p(n) is in Θ(n^d).
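Bounds (3.6) and (3.7) can be spot-checked with pure integer arithmetic, writing ⌈a/b⌉ as −(−a // b) to stay exact (ranges and seed are arbitrary):

```python
import random

random.seed(0)
for _ in range(1000):
    a = random.randint(0, 10**6)
    b = random.randint(1, 100)
    # (3.6): ceil(a/b) <= (a + (b - 1))/b, cross-multiplied by b to avoid floats
    assert -(-a // b) * b <= a + (b - 1)
    # (3.7): floor(a/b) >= (a - (b - 1))/b, cross-multiplied by b
    assert (a // b) * b >= a - (b - 1)
```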
A function f(n) is polynomially bounded if f(n) is in O(n^k) for some constant k.

Exponentials
For any real number a > 0 and any integers m and n, the following identities hold:
a^0 = 1,
a^1 = a,
a^(−1) = 1/a,
(a^m)^n = a^(mn) = (a^n)^m,
a^m a^n = a^(m+n).
Any exponential function with a base strictly greater than 1 grows faster than any polynomial function. For example, 2n grows much faster than nd for any nonnegative integer d.
For any real number x, the exponential function e^x is of the following form:
e^x = 1 + x + x^2/2! + x^3/3! + ⋯ = Σ_{i=0}^{∞} x^i/i!,
where e ≈ 2.71828 is the base of the natural logarithm function.
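The series can be compared against math.exp. A sketch using partial sums (60 terms is an arbitrary cutoff, ample for the sample inputs):

```python
import math

def exp_series(x, terms=60):
    """Partial sum of e^x = sum_{i>=0} x^i / i!, accumulated term by term."""
    total, term = 0.0, 1.0      # term starts at x^0 / 0! = 1
    for i in range(terms):
        total += term
        term *= x / (i + 1)     # x^(i+1)/(i+1)! from x^i/i!
    return total

for x in [-2.0, -0.5, 0.0, 1.0, 3.0]:
    assert abs(exp_series(x) - math.exp(x)) < 1e-9
```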

Logarithms
The following notations are used in the analysis of algorithms:
lg n = log_2 n (binary logarithm),
ln n = log_e n (natural logarithm),
lg^k n = (lg n)^k (exponentiation),
lg lg n = lg(lg n) (composition).
An important notational convention: lg n + k = (lg n) + k, not lg(n + k).

Logarithms
For all real a > 0, b > 0, c > 0, and n, the following identities are useful:
a = b^(log_b a),
log_c(ab) = log_c a + log_c b,
log_b a^n = n log_b a,
log_b a = (log_c a)/(log_c b),
log_b(1/a) = −log_b a,
log_b a = 1/(log_a b),
a^(log_b c) = c^(log_b a),
where the logarithm bases are not 1.
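These identities can be spot-checked with math.log. A sketch over random arguments (ranges and seed are arbitrary, chosen to keep bases away from 1):

```python
import math
import random

def close(u, v, tol=1e-9):
    """Relative comparison that tolerates values of very different magnitude."""
    return abs(u - v) <= tol * max(1.0, abs(u), abs(v))

random.seed(311)
for _ in range(200):
    a = random.uniform(0.5, 50.0)
    b = random.uniform(1.5, 50.0)   # base b > 1, so b != 1
    c = random.uniform(2.0, 50.0)   # base c > 1, so c != 1
    n = random.uniform(-3.0, 3.0)
    assert close(b ** math.log(a, b), a)                              # a = b^(log_b a)
    assert close(math.log(a * c, b), math.log(a, b) + math.log(c, b)) # product rule
    assert close(math.log(a ** n, b), n * math.log(a, b))             # power rule
    assert close(math.log(a, b), math.log(a, c) / math.log(b, c))     # change of base
    assert close(math.log(1 / a, b), -math.log(a, b))                 # reciprocal
    if abs(a - 1.0) > 0.1:                                            # 1/log_a b needs a != 1
        assert close(math.log(b, a), 1.0 / math.log(a, b))
    assert close(a ** math.log(c, b), c ** math.log(a, b))            # exponent swap
```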