COMP20007 Design of Algorithms
Analysis of Algorithms
Lars Kulik Lecture 4 Semester 1, 2020
Establishing Growth Rate
In the last lecture we proved t(n) ∈ O(g(n)) for some cases of t and g, using the definition of O directly: exhibit constants c and n0 such that

n > n0 ⇒ t(n) ≤ c · g(n)
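Such a proof can be spot-checked by direct computation. A minimal sketch, using hypothetical witnesses c = 6 and n0 = 3 for t(n) = 5n + 3 and g(n) = n (these example values are not from the slides):

```python
# Spot-checking a claimed big-O witness: t(n) = 5n + 3 is in O(n).
def t(n):
    return 5 * n + 3

def g(n):
    return n

c, n0 = 6, 3  # hypothetical witness constants for illustration

# The definition requires t(n) <= c * g(n) for ALL n > n0;
# a program can only check finitely many values of n.
assert all(t(n) <= c * g(n) for n in range(n0 + 1, 10_000))
```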
Example: Finding the Maximum Element

function MaxElement(A[0..n − 1])
    max ← A[0]
    for i ← 1 to n − 1 do
        if A[i] > max then
            max ← A[i]
    return max

We count the number of comparisons executed for a list of size n:

C(n) = ∑_{i=1}^{n−1} 1 = n − 1 = Θ(n)
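The count C(n) = n − 1 is easy to confirm empirically. A sketch in Python, with an added counter for the key comparisons:

```python
def max_element(A):
    """Return the maximum of A together with the key-comparison count."""
    maximum = A[0]
    comparisons = 0
    for i in range(1, len(A)):
        comparisons += 1          # one comparison per loop iteration
        if A[i] > maximum:
            maximum = A[i]
    return maximum, comparisons

# C(n) = n - 1 regardless of input order
for n in range(1, 50):
    _, c = max_element(list(range(n)))
    assert c == n - 1
```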
Example: Selection Sort
function SelSort(A[0..n − 1])
    for i ← 0 to n − 2 do
        min ← i
        for j ← i + 1 to n − 1 do
            if A[j] < A[min] then
                min ← j
        swap A[i] and A[min]
We count the number of comparisons executed for a list of size n:

C(n) = ∑_{i=0}^{n−2} ∑_{j=i+1}^{n−1} 1 = ∑_{i=0}^{n−2} (n − 1 − i) = ∑_{i=0}^{n−2} (n − 1) − ∑_{i=0}^{n−2} i

     = (n − 1)² − (n − 2)(n − 1)/2 = n(n − 1)/2 = Θ(n²)
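The closed form n(n − 1)/2 can be checked by instrumenting the algorithm. A sketch (the counter is an addition for illustration, not part of the pseudocode):

```python
def sel_sort(A):
    """Selection sort in place; returns the number of key comparisons."""
    n = len(A)
    comparisons = 0
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):
            comparisons += 1      # the basic operation
            if A[j] < A[min_idx]:
                min_idx = j
        A[i], A[min_idx] = A[min_idx], A[i]
    return comparisons

# The comparison count is n(n-1)/2 for every input of size n.
for n in range(2, 40):
    A = list(range(n))[::-1]      # worst-ordered input (reversed)
    c = sel_sort(A)
    assert A == sorted(A)
    assert c == n * (n - 1) // 2  # matches the closed form
```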
Example: Matrix Multiplication
function MatrixMult(A[0..n − 1, 0..n − 1], B[0..n − 1, 0..n − 1])
    for i ← 0 to n − 1 do
        for j ← 0 to n − 1 do
            C[i, j] ← 0.0
            for k ← 0 to n − 1 do
                C[i, j] ← C[i, j] + A[i, k] · B[k, j]
    return C

The number of multiplications executed for two n × n matrices is:

M(n) = ∑_{i=0}^{n−1} ∑_{j=0}^{n−1} ∑_{k=0}^{n−1} 1 = n³ = Θ(n³)
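A direct translation of the pseudocode confirms M(n) = n³. A sketch with a multiplication counter added for illustration:

```python
def matrix_mult(A, B):
    """Naive n x n matrix product; returns (C, multiplication count)."""
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    mults = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
                mults += 1        # the basic operation
    return C, mults

n = 5
I = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
A = [[float(i * n + j) for j in range(n)] for i in range(n)]
C, m = matrix_mult(A, I)
assert C == A          # multiplying by the identity leaves A unchanged
assert m == n ** 3     # M(n) = n^3
```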
Analysing Recursive Algorithms
Let us start with a simple example:
function F(n)
    if n = 0 then return 1
    else return F(n − 1) · n
The basic operation here is the multiplication. We express the cost recursively as well:
M(0) = 0
M(n) = M(n − 1) + 1   for n > 0
To find a closed form, that is, one without recursion, we usually try “telescoping”, or “backward substitutions” in the recursive part.
Telescoping
The recursive equation was:
M(n) = M(n − 1) + 1   (for n > 0)
Use the fact M(n − 1) = M(n − 2) + 1 to expand the right-hand side:
M(n) = [M(n − 2) + 1] + 1 = M(n − 2) + 2

and keep going:

… = [M(n − 3) + 1] + 2 = M(n − 3) + 3 = … = M(n − n) + n = n

where we used the base case M(0) = 0 to finish.
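The closed form M(n) = n can be confirmed by running the recursive algorithm with a multiplication counter. A sketch (the counter is an addition for illustration):

```python
import math

def factorial(n):
    """Compute n! recursively, returning (value, multiplication count)."""
    if n == 0:
        return 1, 0                # base case: M(0) = 0
    value, mults = factorial(n - 1)
    return value * n, mults + 1    # M(n) = M(n-1) + 1

for n in range(0, 20):
    value, mults = factorial(n)
    assert value == math.factorial(n)
    assert mults == n              # closed form M(n) = n
```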
A Second Example: Binary Search in Sorted Array
function BinSearch(A[], lo, hi, key)
    if lo > hi then return −1
    mid ← lo + (hi − lo)/2
    if A[mid] = key then return mid
    else if A[mid] > key then
        return BinSearch(A, lo, mid − 1, key)
    else
        return BinSearch(A, mid + 1, hi, key)
The basic operation is the key comparison. The cost, recursively, in the worst case:
C(0) = 0
C(n) = C(n/2) + 1   for n > 0
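The worst-case count can be observed directly. A sketch that tallies one (three-way) key comparison per recursive call, matching the analysis model; the `count` list is an added instrument, not part of the pseudocode:

```python
def bin_search(A, lo, hi, key, count=None):
    """Recursive binary search; count[0] tallies key comparisons
    (one three-way comparison per recursive call)."""
    if count is None:
        count = [0]
    if lo > hi:
        return -1, count[0]
    count[0] += 1
    mid = lo + (hi - lo) // 2
    if A[mid] == key:
        return mid, count[0]
    if A[mid] > key:
        return bin_search(A, lo, mid - 1, key, count)
    return bin_search(A, mid + 1, hi, key, count)

# Worst case (key absent, larger than everything), n a power of 2:
for k in range(0, 8):
    n = 2 ** k
    A = list(range(n))
    idx, c = bin_search(A, 0, n - 1, n + 1, [0])
    assert idx == -1
    assert c == k + 1   # C(n) = 1 + log2 n for n = 2^k
```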
Telescoping
A smoothness rule allows us to assume that n is a power of 2. The recursive equation was:

C(n) = C(n/2) + 1   (for n > 0)

Use the fact C(n/2) = C(n/4) + 1 to expand, and keep going:

C(n) = C(n/2) + 1
     = [C(n/4) + 1] + 1
     = [[C(n/8) + 1] + 1] + 1
     ⋮
     = [[… [[C(0) + 1] + 1] + ⋯ + 1] + 1]   (1 + log₂ n ones)

Hence C(n) = 1 + log₂ n = Θ(log n).
Logarithmic Functions Have Same Rate of Growth
In O-expressions we can just write “log” for any logarithmic function, no matter what its base is.
Asymptotically, all logarithmic behaviour is the same, since

log_a x = (log_a b) · (log_b x)

So, for example, if ln is the natural logarithm then

log_2 n ∈ O(ln n)
ln n ∈ O(log_2 n)

Also note that since log n^c = c · log n, we have, for all constants c,

log n^c ∈ O(log n)
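The change-of-base identity is easy to verify numerically. A sketch showing that the ratio between two logarithms is a constant (here 1/ln 2, independent of x):

```python
import math

# log_a x = (log_a b) * (log_b x): changing base multiplies by a constant,
# so all logarithmic functions have the same asymptotic growth rate.
for x in [2.0, 10.0, 1000.0, 1e6]:
    assert math.isclose(math.log2(x), math.log2(math.e) * math.log(x))
    # the ratio log2(x) / ln(x) is the constant 1 / ln 2 for every x
    assert math.isclose(math.log2(x) / math.log(x), 1 / math.log(2))
```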
Summarising Reasoning with Big-Oh
O(f(n)) + O(g(n)) = O(max{f(n), g(n)})
c · O(f(n)) = O(f(n))
O(f(n)) · O(g(n)) = O(f(n) · g(n))
The first equation justifies throwing smaller summands away.
The second says that constants can be thrown away too.
The third may be used with some nested loops. Suppose we have a loop which is executed O(f(n)) times, and each execution takes time O(g(n)). Then the execution of the loop takes time O(f(n) · g(n)).
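The product rule for nested loops can be illustrated by counting basic steps directly. A sketch with hypothetical f(n) = 2n and g(n) = 3n (example functions chosen for illustration, not from the slides):

```python
# Product rule: an outer loop running f(n) times whose body does g(n)
# steps of work performs f(n) * g(n) basic steps in total.
def nested_loop_steps(n):
    steps = 0
    for _ in range(2 * n):        # executed f(n) = 2n times
        for _ in range(3 * n):    # g(n) = 3n steps of work each time
            steps += 1
    return steps

for n in range(1, 20):
    assert nested_loop_steps(n) == (2 * n) * (3 * n)   # f(n) * g(n)
```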
Some Useful Formulas
From Stirling’s formula:

n! ∈ O(n^(n+1/2))

Some useful sums:

∑_{i=0}^{n} i² = n(n + 1)(2n + 1)/6
∑_{i=0}^{n} (2i + 1) = (n + 1)²
∑_{i=1}^{n} 1/i = O(log n)

See also Levitin’s Appendix A.
Levitin’s Appendix B is a tutorial on recurrence relations.
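The exact sums above can be spot-checked by computation, and the harmonic sum can be seen to track ln n. A sketch:

```python
import math

# Spot-checking the closed forms for small n.
for n in range(0, 200):
    assert sum(i * i for i in range(n + 1)) == n * (n + 1) * (2 * n + 1) // 6
    assert sum(2 * i + 1 for i in range(n + 1)) == (n + 1) ** 2

# The harmonic sum H(n) grows like ln n: for n >= 2 it satisfies
# ln n < H(n) < ln n + 1, hence H(n) = O(log n).
for n in [10, 100, 1000]:
    H = sum(1 / i for i in range(1, n + 1))
    assert math.log(n) < H < math.log(n) + 1
```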
The Road Ahead
You will become much more familiar with asymptotic analysis as we use it on algorithms that we meet.
We shall begin the study of algorithms by looking at brute force approaches.