Analysis of Algorithms, I
CSOR W4231
Computer Science Department
Columbia University
Asymptotic notation, mergesort, recurrences
1 Asymptotic notation
2 The divide & conquer principle; application: mergesort
3 Solving recurrences and running time of mergesort
Review of the last lecture
Introduced the problem of sorting.
Analyzed insertion-sort.
Worst-case running time: T(n) = 3n²/2 + 7n/2 − 4
Space: in-place algorithm
Worst-case running time analysis: a reasonable measure of
algorithmic efficiency.
Defined polynomial-time algorithms as “efficient”.
Argued that detailed characterizations of running times are not convenient for understanding scalability of algorithms.
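For reference, a minimal Python sketch of insertion sort consistent with that analysis (the function name and list-based interface are illustrative assumptions):

    def insertion_sort(A):
        # Sorts the list A in place. In the worst case (reverse-sorted
        # input) the inner loop runs j times for each j, giving Θ(n²) steps.
        for j in range(1, len(A)):
            key = A[j]
            i = j - 1
            # Shift every element larger than key one position to the right.
            while i >= 0 and A[i] > key:
                A[i + 1] = A[i]
                i -= 1
            A[i + 1] = key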
Running time in terms of # primitive steps
We need a coarser classification of running times of algorithms; exact characterizations
are too detailed;
do not reveal similarities between running times in an immediate way as n grows large;
are often meaningless: high-level language steps will expand by a constant factor that depends on the hardware.
1 Asymptotic notation
2 The divide & conquer principle; application: mergesort
3 Solving recurrences and running time of mergesort
Asymptotic analysis
A framework that will allow us to compare the rate of growth of different running times as the input size n grows.
We will express the running time as a function of
the number of primitive steps; the latter is a function of the input size n.
To compare functions expressing running times, we will ignore their low-order terms and focus solely on the highest-order term.
Asymptotic upper bounds: Big-O notation
Definition 1 (O).
We say that T(n) = O(f(n)) if there exist constants c > 0 and n0 ≥0s.t. foralln≥n0,wehaveT(n)≤c·f(n).
Examples: Show that T(n) = O(f(n)) when
T(n) = an² + b, a, b > 0 constants, and f(n) = n².
T(n) = an² + b and f(n) = n³.
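One workable argument for the first example (the constants below are one of many valid choices): for n ≥ 1,

T(n) = an² + b ≤ an² + bn² = (a + b)n²,

so the definition holds with c = a + b and n0 = 1. The second example follows since n² ≤ n³ for n ≥ 1.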
Asymptotic lower bounds: Big-Ω notation
Definition 2 (Ω).
We say that T(n) = Ω(f(n)) if there exist constants c > 0 and n0 ≥ 0 s.t. for all n ≥ n0, we have T(n) ≥ c · f(n).
Examples: Show that T(n) = Ω(f(n)) when
T(n) = an² + b, a, b > 0 constants, and f(n) = n².
T(n) = an² + b, a, b > 0 constants, and f(n) = n.
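One workable argument for the first example: for all n ≥ 0, T(n) = an² + b ≥ an², so c = a and n0 = 0 satisfy the definition. The second example follows since n² ≥ n for n ≥ 1.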
Asymptotic tight bounds: Θ notation
Definition 3 (Θ).
We say that T(n) = Θ(f(n)) if there exist constants c1,c2 > 0
and n0 ≥ 0 s.t. for all n ≥ n0, we have
c1 · f(n) ≤ T(n) ≤ c2 · f(n).
Equivalent definition
T(n) = Θ(f(n)) if T(n) = O(f(n)) and T(n) = Ω(f(n))
Notational convention: log n stands for log_2 n
Examples: Show that T(n) = Θ(f(n)) when
T(n) = an² + b, a, b > 0 constants, and f(n) = n²
T(n) = n log n + n and f(n) = n log n
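A sketch for the second example (one valid choice of constants): for n ≥ 2 we have log n ≥ 1, hence

n log n ≤ n log n + n ≤ 2n log n,

so c1 = 1, c2 = 2, n0 = 2 satisfy the definition. The first example combines the O and Ω arguments shown above.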
Asymptotic upper bounds that are not tight: little-o
Definition 4 (o).
We say that T(n) = o(f(n)) if, for any constant c > 0, there exists a constant n0 ≥ 0 such that for all n ≥ n0, we have
T(n) < c · f(n).
Intuitively, T(n) becomes insignificant relative to f(n) as n → ∞.
Proof by showing that lim_{n→∞} T(n)/f(n) = 0 (if the limit exists).
Examples: Show that T(n) = o(f(n)) when
T(n) = an² + b, a, b > 0 constants, and f(n) = n³.
T(n) = n log n and f(n) = n².
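For instance, in the first example

lim_{n→∞} (an² + b)/n³ = lim_{n→∞} (a/n + b/n³) = 0,

and in the second, (n log n)/n² = (log n)/n → 0 as n → ∞.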
Asymptotic lower bounds that are not tight: little-ω
Definition 5 (ω).
We say that T(n) = ω(f(n)) if, for any constant c > 0, there exists a constant n0 ≥ 0 such that for all n ≥ n0, we have
T(n) > c · f(n).
Intuitively, T(n) becomes arbitrarily large relative to f(n) as n → ∞.
T(n) = ω(f(n)) implies that lim_{n→∞} T(n)/f(n) = ∞, if the limit exists. Then f(n) = o(T(n)).
Examples: Show that T(n) = ω(f(n)) when
T(n) = n² and f(n) = n log n.
T(n) = 2ⁿ and f(n) = n⁵.
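For instance, n²/(n log n) = n/log n → ∞ as n → ∞; and 2ⁿ/n⁵ → ∞ as well, since taking logarithms gives log(2ⁿ/n⁵) = n − 5 log n → ∞.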
Basic rules for omitting low order terms from functions
1. Ignore multiplicative factors: e.g., 10n³ becomes n³
2. n^a dominates n^b if a > b: e.g., n² dominates n
3. Exponentials dominate polynomials: e.g., 2ⁿ dominates n⁴
4. Polynomials dominate logarithms: e.g., n dominates log³ n
⇒ For large enough n, log n grows more slowly than any polynomial n^a (a > 0), which in turn grows more slowly than any exponential b^n (b > 1).
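To see where such comparisons are used, consider mergesort, the divide & conquer algorithm from the outline: recursively sort each half of the input, then merge the two sorted halves. A minimal Python sketch (the function names and list-based interface are illustrative assumptions):

    def merge_sort(A):
        # Divide: two recursive calls on halves -> the 2T(n/2) term.
        if len(A) <= 1:
            return A
        mid = len(A) // 2
        left = merge_sort(A[:mid])
        right = merge_sort(A[mid:])
        # Conquer: merging two sorted lists costs linear time -> the cn term.
        return merge(left, right)

    def merge(left, right):
        # Repeatedly move the smaller front element of the two sorted lists.
        out, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                out.append(left[i])
                i += 1
            else:
                out.append(right[j])
                j += 1
        out.extend(left[i:])
        out.extend(right[j:])  # one of the two lists may have leftovers
        return out

Counting primitive steps, the two recursive calls contribute 2T(n/2) and the merge contributes cn for a constant c > 0, so T(n) = 2T(n/2) + cn for n ≥ 2, with T(1) = c.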
This structure is typical of recurrence relations:
an inequality or equation bounds T(n) in terms of an expression involving T(m) for m < n;
a base case generally says that T(n) is constant for small constant n.
We ignore floor and ceiling notations.
A recurrence does not provide an asymptotic bound for T (n): to this end, we must solve the recurrence.
1 Asymptotic notation
2 The divide & conquer principle; application: mergesort
3 Solving recurrences and running time of mergesort
Solving recurrences, method 1: recursion trees
The technique consists of three steps
1. Analyze the first few levels of the tree of recursive calls
2. Identify a pattern
3. Sum the work spent over all levels of recursion
Example: give an asymptotic bound for the recurrence describing the running time of mergesort
T(n) = 2T(n/2) + cn, for n ≥ 2, constant c > 0
T(1) = c
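One way the three steps play out here (a sketch of the standard argument): at level i of the recursion tree there are 2^i subproblems, each of size n/2^i and cost c · n/2^i, so every level costs cn in total; the tree has log n + 1 levels (subproblem sizes n, n/2, ..., 1), giving

T(n) = cn(log n + 1) = Θ(n log n).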
A general recurrence and its solution
The running times of many recursive algorithms can be expressed by the following recurrence
T(n) = aT(n/b) + cn^k, for a, c > 0, b > 1, k ≥ 0
What is the recursion tree for this recurrence?
a is the branching factor
b is the factor by which the size of each subproblem shrinks
⇒ at level i, there are a^i subproblems, each of size n/b^i
⇒ each subproblem at level i requires c(n/b^i)^k work
the height of the tree is log_b n levels
⇒ Total work: Σ_{i=0}^{log_b n} a^i · c(n/b^i)^k = cn^k · Σ_{i=0}^{log_b n} (a/b^k)^i
Solving recurrences, method 2: Master theorem
Theorem 6 (Master theorem).
If T(n) = aT(⌈n/b⌉) + O(n^k) for some constants a > 0, b > 1, k ≥ 0, then
T(n) = O(n^{log_b a}), if a > b^k
T(n) = O(n^k log n), if a = b^k
T(n) = O(n^k), if a < b^k
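As a quick consistency check, the mergesort recurrence has a = 2, b = 2, k = 1; then a = b^k, and the theorem gives T(n) = O(n^k log n) = O(n log n), matching the recursion-tree analysis.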