O(n log n) Time
O(n log n) time, also referred to as linearithmic time, arises naturally in divide-and-conquer algorithms.
Sorting. Mergesort and heapsort are sorting algorithms that perform O(n log n) comparisons.
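As a minimal sketch (not from the slides; function and variable names are my own), mergesort's divide-and-conquer structure can be written in Python — the array is split in half, each half sorted recursively, and the sorted halves merged in linear time, giving O(n log n) comparisons overall:

```python
def merge_sort(a):
    """Sort a list with mergesort: O(n log n) comparisons."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # sort left half recursively
    right = merge_sort(a[mid:])   # sort right half recursively
    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```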
Calculating the time complexity in two ways
Way 1: counting iterations.
At each iteration, the array is halved. Let the length of the array be n.
At iteration 1, length of array = n
At iteration 2, length of array = n/2
At iteration 3, length of array = (n/2)/2 = n/2^2
Therefore, after iteration k, length of array = n/2^k
After k divisions the length of the array becomes 1, so n/2^k = 1 => n = 2^k.
Applying log2 on both sides: log2(n) = log2(2^k) = k·log2(2) = k, since loga(a) = 1.
Therefore k = log2(n). Hence, the time complexity of binary search is T(n) = log2(n) = O(log n).
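The analysis above is of binary search; a minimal Python sketch (the function name and interface are illustrative, not from the slides) makes the halving explicit:

```python
def binary_search(a, target):
    """Return the index of target in sorted list a, or -1 if absent.

    The search range is halved each iteration, so the loop runs
    at most log2(n) + 1 times: O(log n).
    """
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2   # halve the remaining range
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1       # discard the left half
        else:
            hi = mid - 1       # discard the right half
    return -1
```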
Way 2: solving the recurrence.
T(1) = 1, (a)
T(n) = c + T(n/2), when n > 1. (b)
T(n/2) = c + T(n/4) (c)
Substitute (c) in (b):
T(n) = c + c + T(n/4) = 2c + T(n/4) (d)
T(n/4) = c + T(n/8) (e)
Substitute (e) in (d):
T(n) = 2c + c + T(n/8) = 3c + T(n/8) (f)
After i substitutions: T(n) = ic + T(n/2^i) (g)
At some point the argument reaches the base case T(1), i.e. n/2^i = 1.
Taking log2 on both sides: log2(n) = log2(2^i), so i = log2(n). (h)
Substitute (h) in (g):
T(n) = c·log2(n) + T(n/2^log2(n)) = c·log2(n) + T(n/n) = c·log2(n) + T(1)
= c·log2(n) + 1 = O(log n).
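The substitution argument can be sanity-checked numerically: counting how many times n can be halved before reaching 1 gives exactly i = log2(n) when n is a power of two. A small sketch (my own helper, not from the slides):

```python
import math

def halvings(n):
    """Count how many times n can be halved (integer division) before reaching 1."""
    i = 0
    while n > 1:
        n //= 2
        i += 1
    return i
```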
Asymptotic Order of Growth
Upper bounds. f(n) is O(g(n)) if there exist constants c > 0 and n0 ≥ 0 such that for all n ≥ n0 we have f(n) ≤ c·g(n).
Lower bounds. f(n) is Ω(g(n)) if there exist constants c > 0 and n0 ≥ 0 such that for all n ≥ n0 we have f(n) ≥ c·g(n).
Tight bounds. f(n) is Θ(g(n)) if f(n) is both O(g(n)) and Ω(g(n)).
Ex: f(n) = 32n^2 + 17n + 32.
f(n) is O(n^2), O(n^3), Ω(n^2), Ω(n), and Θ(n^2). f(n) is not O(n), Ω(n^3), Θ(n), or Θ(n^3).
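The example bounds can be checked empirically. The witness constants below (c = 81 for the upper bound, c = 32 for the lower bound, with n0 = 1) are one valid choice I picked for illustration, not the only one:

```python
def f(n):
    # The example function from the slides: f(n) = 32n^2 + 17n + 32.
    return 32 * n * n + 17 * n + 32

def is_squeezed(n):
    # 32*n^2 <= f(n) <= 81*n^2 for all n >= 1 witnesses f(n) = Theta(n^2):
    # the lower bound gives Omega(n^2), the upper bound gives O(n^2).
    return 32 * n * n <= f(n) <= 81 * n * n
```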
Transitivity.
! If f = O(g) and g = O(h) then f = O(h).
! If f = Ω(g) and g = Ω(h) then f = Ω(h).
! If f = Θ(g) and g = Θ(h) then f = Θ(h).
Additivity.
! If f = O(h) and g = O(h) then f + g = O(h).
! If f = Ω(h) and g = Ω(h) then f + g = Ω(h).
! If f = Θ(h) and g = O(h) then f + g = Θ(h).
Properties
Slight abuse of notation. T(n) = O(f(n)).
! Not transitive:
– f(n) = 5n^3; g(n) = 3n^2
– f(n) = O(n^3) = g(n)
– but f(n) ≠ g(n).
! Better notation: T(n) ∈ O(f(n)).
Meaningless statement. Any comparison-based sorting algorithm requires at least O(n log n) comparisons.
! Statement doesn’t “type-check.”
! Use W for lower bounds.
Asymptotic Bounds for Some Common Functions
Polynomials. a_0 + a_1 n + … + a_d n^d is Θ(n^d) if a_d > 0.
Polynomial time. Running time is O(nd) for some constant d independent
of the input size n.
Logarithms. O(log_a n) = O(log_b n) for any constants a, b > 1, so we can avoid specifying the base.
Logarithms. For every x > 0, log n = O(n^x).
log grows slower than every polynomial
Exponentials. For every r > 1 and every d > 0, n^d = O(r^n).
every exponential grows faster than every polynomial
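A quick numerical sketch (my own example values, not from the slides) of these orderings, comparing log n, the polynomial n^3, and the exponential 1.1^n. Note that the exponential only overtakes the polynomial *eventually* — at n = 100 the polynomial is still larger:

```python
import math

def growth(n):
    """Return (log2 n, n^3, 1.1^n) for comparison."""
    return math.log2(n), n ** 3, 1.1 ** n

# At n = 100 the polynomial n^3 still dominates the exponential 1.1^n;
# by n = 1000 the exponential has overtaken every fixed polynomial.
log100, poly100, exp100 = growth(100)
log1000, poly1000, exp1000 = growth(1000)
```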
Polynomial-Time
Brute force. For many non-trivial problems, there is a natural brute force search algorithm that checks every possible solution.
Typically takes 2^N time or worse for inputs of size N.
Unacceptable in practice.
Ex: n! for stable matching with n men and n women.
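To see where the n! comes from: every perfect matching of n men to n women corresponds to a permutation, so a brute-force search must consider n! candidate solutions. A small sketch (the function name is my own; this only enumerates candidates, it does not check stability):

```python
from itertools import permutations
from math import factorial

def candidate_matchings(n):
    """All perfect matchings of n men to n women.

    Each tuple perm assigns woman perm[i] to man i, so there are
    exactly n! candidates a brute-force algorithm would check.
    """
    return list(permutations(range(n)))
```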
Desirable scaling property. When the input size doubles, the algorithm should only slow down by some constant factor C.
There exist constants c > 0 and d > 0 such that on every input of size N, the running time is bounded by c·N^d steps.
Def. An algorithm is poly-time if the above scaling property holds.
(choose C = 2^d)
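The choice C = 2^d can be illustrated numerically: for a running time bounded by c·N^d, doubling the input multiplies the bound by exactly 2^d, a constant independent of N. A sketch with arbitrary illustrative constants (c = 3, d = 2, not from the slides):

```python
def steps(N, c=3, d=2):
    """Hypothetical running-time bound c * N^d (c, d chosen for illustration)."""
    return c * N ** d

# Doubling the input size multiplies the step count by C = 2^d = 4,
# the same constant factor regardless of N.
ratios = [steps(2 * N) // steps(N) for N in (10, 100, 1000)]
```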
Worst-Case Analysis
Worst case running time. Obtain bound on largest possible running time of algorithm on input of a given size N.
! Generally captures efficiency in practice.
! Draconian view, but hard to find effective alternative.
Average case running time. Obtain bound on running time of algorithm on random input as a function of input size N.
! Hard (or impossible) to accurately model real instances by random distributions.
! Algorithm tuned for a certain distribution may perform poorly on other inputs.
Worst-Case Polynomial-Time
Def. An algorithm is efficient if its running time is polynomial. Justification: It really works in practice!
Although 6.02 × 10^23 × N^20 is technically poly-time, it would be useless in practice.
! In practice, the poly-time algorithms that people develop almost
always have low constants and low exponents.
! Breaking through the exponential barrier of brute force typically
exposes some crucial structure of the problem.
Exceptions.
! Some poly-time algorithms do have high constants and/or exponents, and are useless in practice.
! Some exponential-time (or worse) algorithms are widely used
because the worst-case instances seem to be rare.
Ex: the simplex method, Unix grep.
Why It Matters