
Tutorial
COMP20007 Design of Algorithms
Workshop 7 Solutions
1. Negative edge weights
Your friend's algorithm might sound like a good idea, but it sadly won't cure Dijkstra's algorithm of its inability to handle negative edge weights properly. Simply adding a constant value to the weight of each edge distorts the length of paths differently, depending on how many edges they contain. Therefore the shortest paths found by Dijkstra's algorithm in the modified graph might not correspond to true shortest paths in the original graph.
As an example, consider the graph below.
[Graph: vertices A, B, C, D with edges A–B (weight −5), B–C (weight 1), C–D (weight 1) and B–D (weight 3).]
The shortest path from A to D is A, B, C, D, with total weight −5 + 1 + 1 = −3 (compared with −5 + 3 = −2 for A, B, D). However, when you add 5 to every edge, the shortest path becomes A, B, D, with weight 0 + 8 = 8 (compared with 0 + 6 + 6 = 12 for A, B, C, D):
[The same graph with 5 added to every edge: A–B (weight 0), B–C (weight 6), C–D (weight 6) and B–D (weight 8).]
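To make the numbers concrete, here is a minimal Python sketch (not part of the original solution) that sums the weights along the two candidate paths in both graphs above; the edge weights are simply the ones from this example written out by hand.

# Edge weights of the example graph above.
original = {("A", "B"): -5, ("B", "C"): 1, ("C", "D"): 1, ("B", "D"): 3}

# The "fixed" graph: add 5 to every edge weight.
shifted = {edge: w + 5 for edge, w in original.items()}

def path_length(weights, path):
    # Sum the edge weights along a path given as a list of vertices.
    return sum(weights[(u, v)] for u, v in zip(path, path[1:]))

for weights, label in [(original, "original"), (shifted, "shifted by +5")]:
    print(label,
          "A,B,C,D =", path_length(weights, ["A", "B", "C", "D"]),   # 3 edges
          "A,B,D =", path_length(weights, ["A", "B", "D"]))          # 2 edges

# original:      A,B,C,D = -3, A,B,D = -2  (A,B,C,D is shorter)
# shifted by +5: A,B,C,D = 12, A,B,D = 8   (A,B,D is now shorter)

The three-edge path is penalised by 3 × 5 = 15 while the two-edge path is only penalised by 2 × 5 = 10, which is exactly the distortion described above.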
(Interestingly, however, a similar idea forms the basis of a fast all-pairs shortest paths algorithm called Johnson's algorithm. See https://en.wikipedia.org/wiki/Johnson%27s_algorithm for details; it's an interesting read!)
2. Master Theorem
Note for this question that comparing a and b^d is the same as comparing log_b(a) and d.
(a) T(n) = 9T(n/3) + n^3, T(1) = 1
We have a = 9, b = 3, d = 3 and c = 1. So log_b(a) = log_3(9) = 2.
Also 2 < 3 = d, so the Θ(n^3) term is the dominating term. So T(n) ∈ Θ(n^3).

(b) T(n) = 64T(n/4) + n + log n, T(1) = 1
We have a = 64 and b = 4, so log_b(a) = log_4(64) = 3. Also, since n + log n ∈ Θ(n^1), d = 1 < 3, so T(n) ∈ Θ(n^3).

(c) T(n) = 2T(n/2) + n, T(1) = 1
We have a = b = 2 so log_b(a) = log_2(2) = 1. Also n ∈ Θ(n^1) so d = 1 as well. So T(n) ∈ Θ(n^d log n) = Θ(n log n).

(d) T(n) = 2T(n/2) + Θ(1), T(1) = 1
Here a = b = 2 and d = 0, so b^d = 1 < 2 = a, thus we get Θ(n^(log_2(2))) = Θ(n).

3. Mergesort Time Complexity
Recurrence relation: T(n) = T(n/2) + T(n/2) + Θ(n) = 2T(n/2) + Θ(n), where T(n) is the runtime of mergesort sorting n elements. The first T(n/2) is the time it takes to sort the left half of the input using mergesort. The other T(n/2) is the time it takes to sort the right half. Θ(n) is a bound on the time it takes to merge the two halves together.

Recall that the Master Theorem states that if we have a recurrence relation T(n) such that T(n) = aT(n/b) + Θ(n^d), T(1) = c, then:
T(n) ∈ Θ(n^d) if a < b^d,
T(n) ∈ Θ(n^d log n) if a = b^d,
T(n) ∈ Θ(n^(log_b(a))) if a > b^d.
We can recognise that the mergesort recurrence relation fits the form required by the Master Theorem,
with constants a = 2, b = 2, and d = 1.
Since b^d = 2^1 = 2 = a, by the Master Theorem, T(n) ∈ Θ(n^d log n) = Θ(n log n).
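As a quick sanity check on Questions 2 and 3, the following is a small Python sketch (the function name and output strings are just illustrative) that applies the three Master Theorem cases mechanically to each recurrence.

import math

def master_theorem(a, b, d):
    # Classify T(n) = a*T(n/b) + Theta(n^d) using the Master Theorem.
    log_b_a = math.log(a, b)
    if math.isclose(log_b_a, d):      # a = b^d
        return f"Theta(n^{d} log n)"
    elif log_b_a < d:                 # a < b^d
        return f"Theta(n^{d})"
    else:                             # a > b^d
        return f"Theta(n^{log_b_a:g})"

print(master_theorem(9, 3, 3))    # 2(a): Theta(n^3)
print(master_theorem(64, 4, 1))   # 2(b): Theta(n^3)
print(master_theorem(2, 2, 1))    # 2(c) and mergesort: Theta(n^1 log n) = Theta(n log n)
print(master_theorem(2, 2, 0))    # 2(d): Theta(n^0 log n)? No: a > b^d, so Theta(n^1) = Theta(n)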
4. Lower bound for the Closest Pairs problem
We can use the closest pair algorithm from class to solve the element distinction problem like so, where the input is a collection of elements C = {c1, . . . , cn} and the output is Distinct or NotDistinct:
function ElementDistinction(C = {c1, . . . , cn})
    Points ← {(c1, 0), . . . , (cn, 0)}
    Distance ← ClosestPair(Points)
    if Distance is 0 then
        return NotDistinct
    else
        return Distinct
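For concreteness, here is a hedged Python sketch of the same reduction. The closest_pair function below is only a stand-in (a brute-force O(n^2) version so the sketch runs); in the actual reduction it would be the O(n log n) divide-and-conquer closest-pair algorithm from class.

from itertools import combinations
from math import dist

def closest_pair(points):
    # Stand-in for the O(n log n) closest-pair algorithm: brute force over all pairs.
    return min(dist(p, q) for p, q in combinations(points, 2))

def element_distinction(c):
    # Reduction: place each element on the x-axis and ask for the closest pair.
    points = [(x, 0) for x in c]
    return "NotDistinct" if closest_pair(points) == 0 else "Distinct"

print(element_distinction([3, 1, 4, 1, 5]))  # NotDistinct (1 appears twice)
print(element_distinction([3, 1, 4, 2, 5]))  # Distinct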
So, we can see that we can solve the element distinction problem using the closest pair algorithm. This is called a reduction from element distinction to closest pair.
We know that ElementDistinction is Ω(n log n), and want to prove that ClosestPair is also Ω(n log n).
We assume, for the sake of contradiction, that ClosestPair can be solved in a time complexity smaller than (i.e., asymptotically faster than) n log n. Since we have exhibited a reduction from ElementDistinction to ClosestPair, and the reduction itself only adds Θ(n) time (building the points and checking the returned distance), this would also give us an algorithm for ElementDistinction which is asymptotically faster than n log n.
This contradicts the fact that ElementDistinction is Ω(n log n), and as a result our assumption that ClosestPair can be solved asymptotically faster than n log n must be false.
Hence ClosestPair cannot be solved in time asymptotically faster than n log n, and is therefore Ω(n log n).