
Graph Algorithms (III)

1

COMP9024: Data Structures and
Algorithms

Graphs (III)

2

Contents

 Shortest Paths
 Minimum Spanning Trees

3

Shortest Paths

[Figure: example weighted graph on vertices A–F with edge weights, labeled with shortest-path distances from the start vertex]

Weighted Graphs

 In a weighted graph, each edge has an associated numerical value, called the weight of the edge
 Edge weights may represent distances, costs, etc.
 Example:
 In a flight route graph, the weight of an edge represents the distance in miles between the endpoint airports

[Figure: flight route graph; the vertices are airports (ORD, PVD, MIA, DFW, SFO, LAX, LGA, HNL) and the edge weights are distances in miles]
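To make this concrete, here is a tiny C sketch of one way to store a weighted graph as an edge list, with each edge carrying a numerical weight such as a distance in miles. The struct name and the sample values are illustrative only, not data taken from the figure.

#include <stdio.h>

typedef struct {
    int u, v;      /* endpoint vertices, numbered 0..n-1 */
    int weight;    /* e.g. distance in miles */
} Edge;

int main(void)
{
    /* two illustrative flight legs forming a path of two edges */
    Edge legs[] = { {0, 1, 1846}, {1, 2, 849} };
    int length = 0;
    for (int i = 0; i < 2; i++)
        length += legs[i].weight;   /* path length = sum of edge weights */
    printf("total length of the path: %d miles\n", length);
    return 0;
}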

5

Shortest Paths
 Given a weighted graph and two vertices u and v, we want to find a path of minimum total weight between u and v.
 The length of a path is the sum of the weights of its edges.

 Example:
 Shortest path between Providence and Honolulu

 Applications
 Internet packet routing
 Flight reservations
 Driving directions

[Figure: flight route graph; the shortest path between Providence (PVD) and Honolulu (HNL) is highlighted]

6

Shortest Path Properties
Property 1:
A subpath of a shortest path is itself a shortest path

Property 2:
There is a tree of shortest paths from a start vertex to all the other vertices

Example:
Tree of shortest paths from Providence

[Figure: tree of shortest paths from Providence (PVD) in the flight route graph]

7

Dijkstra’s Algorithm

 The distance of a vertex v from a vertex s is the length of a shortest path between s and v
 Dijkstra’s algorithm computes the distances of all the vertices from a given start vertex s
 Assumptions:
 the graph is connected
 the edges are undirected
 the edge weights are nonnegative

 We grow a “cloud” of vertices, beginning with s and eventually covering all the vertices
 We store with each vertex v a label D(v) representing the distance of v from s in the subgraph consisting of the cloud and its adjacent vertices
 At each step:
 We add to the cloud the vertex u outside the cloud with the smallest distance label, D(u)
 We update the labels of the vertices that are adjacent to u and not in the cloud

8

Edge Relaxation

 Consider an edge e = (u,z) such that
 u is the vertex most recently added to the cloud
 z is not in the cloud

 The relaxation of edge e updates distance D(z) as follows:
D(z) = min{D(z), D(u) + weight(e)}

[Figure: before relaxation, D(u) = 50 and D(z) = 75; after edge e = (u,z) is relaxed, D(z) = 60]
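The relaxation rule above is the single step that both Dijkstra’s and Bellman-Ford’s algorithms repeat. A minimal C sketch of it, assuming distances are kept in an array D[] and INF stands in for +∞ (both names are illustrative):

#include <limits.h>

#define INF INT_MAX   /* stands in for +infinity */

/* Relax edge e = (u,z) of the given weight:
 * D(z) = min{ D(z), D(u) + weight(e) }.
 * The D[u] != INF test avoids overflowing INF + weight. */
void relax(int u, int z, int weight, int D[])
{
    if (D[u] != INF && D[u] + weight < D[z])
        D[z] = D[u] + weight;
}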

9

Example (1/2)

[Figure: four snapshots of Dijkstra’s algorithm on the example graph (vertices A–F); nodes are labeled with their current D(v) values as the cloud grows and edges are relaxed]

10

Example (2/2)

[Figure: the final two snapshots; every vertex is in the cloud with its final distance label]

11

Dijkstra’s Algorithm
Algorithm DijkstraDistances(G, s)

Input: A simple undirected weighted graph G with nonnegative edge weights,
and a distinguished vertex s.

Output: A label D[u], for each vertex u, such that D[u] is the length of a shortest
path from s to u in G.

{   for each v ∈ G do
        if ( v = s )
            D[v] = 0;
        else
            D[v] = +∞;
    Create a priority queue Q containing all the vertices of G, using the D labels as keys;
    while Q is not empty do
    {   u = Q.removeMin();
        for each z ∈ Q such that z is adjacent to u do
            if ( D[u] + w((u,z)) < D[z] )      // relax edge (u,z)
            {   D[z] = D[u] + w((u,z));
                Change the key of vertex z in Q to D[z];
            }
    }
    return the label D[u] of each vertex u;
}

12

Analysis of Dijkstra’s Algorithm

 Creating the priority queue Q takes O(n log n) time if using an adaptable priority queue, or O(n) time by using bottom-up heap construction.
 At each iteration of the while loop, we spend O(log n) time to remove vertex u from Q and O(degree(u) log n) time to perform the relaxation procedure on the edges incident on u.
 The overall running time of the while loop is O(Σu (1 + degree(u)) log n) = O((n + m) log n). (Recall that Σu degree(u) = 2m.)
 The running time can also be expressed as O(m log n) since the graph is connected.

13

Why Dijkstra’s Algorithm Works

 Dijkstra’s algorithm is based on the greedy method. It adds vertices by increasing distance.

[Figure: the cloud just before a vertex v is added, with u the previous vertex on the true shortest path from the start vertex to v]

 Suppose it didn’t find all shortest distances. Let v be the first wrong vertex the algorithm processed.
 When the previous vertex, u, on the true shortest path was considered, its distance was correct.
 But the edge (u,v) was relaxed at that time!
 Thus, so long as D(v) ≥ D(u), v’s distance cannot be wrong. That is, there is no wrong vertex.
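To connect the pseudocode and analysis above to running code, here is a hedged C sketch of Dijkstra’s algorithm. For simplicity it scans an array for the minimum label instead of using an adaptable priority queue, so it runs in O(n²) time rather than the O((n + m) log n) bound above; the constant N, the adjacency matrix and the edge weights are illustrative only.

#include <stdio.h>
#include <stdbool.h>

#define N 6          /* number of vertices (illustrative) */
#define INF 1000000  /* stands in for +infinity */

void dijkstra(int w[N][N], int s, int D[N])
{
    bool inCloud[N] = {false};
    for (int v = 0; v < N; v++)
        D[v] = (v == s) ? 0 : INF;

    for (int i = 0; i < N; i++) {
        /* pick the vertex u outside the cloud with the smallest label D[u] */
        int u = -1;
        for (int v = 0; v < N; v++)
            if (!inCloud[v] && (u == -1 || D[v] < D[u]))
                u = v;
        inCloud[u] = true;
        /* relax every edge (u,z) with z still outside the cloud */
        for (int z = 0; z < N; z++)
            if (!inCloud[z] && w[u][z] != INF && D[u] + w[u][z] < D[z])
                D[z] = D[u] + w[u][z];
    }
}

int main(void)
{
    /* adjacency matrix; w[u][z] = INF means "no edge" (illustrative data) */
    int w[N][N];
    for (int u = 0; u < N; u++)
        for (int z = 0; z < N; z++)
            w[u][z] = (u == z) ? 0 : INF;
    /* undirected edges: store both directions */
    int edges[][3] = { {0,1,4},{0,2,2},{1,2,1},{1,3,5},{2,4,3},{3,4,2},{3,5,2},{4,5,9} };
    for (int i = 0; i < 8; i++) {
        w[edges[i][0]][edges[i][1]] = edges[i][2];
        w[edges[i][1]][edges[i][0]] = edges[i][2];
    }
    int D[N];
    dijkstra(w, 0, D);
    for (int v = 0; v < N; v++)
        printf("D[%d] = %d\n", v, D[v]);
    return 0;
}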

14

Why It Doesn’t Work for Negative-Weight Edges

 If a node with a negative incident edge were to be added late to the cloud, it could mess up distances for vertices already in the cloud.

[Figure: example graph with negative-weight edges on vertices A–F; run from the start vertex, Dijkstra’s algorithm pulls C into the cloud with label 5]

Dijkstra’s algorithm is based on the greedy method. It adds vertices by increasing distance.

C’s true distance is 1, but it is already in the cloud with D(C) = 5!

15

Bellman-Ford Algorithm

 Works even with negative-weight edges
 Must assume directed edges (for otherwise we would have negative-weight cycles)
 Iteration i finds all shortest paths that use at most i edges
 Running time: O(nm)
 Can be extended to detect a negative-weight cycle if it exists
 How?
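The pseudocode for the algorithm appears below. As a concrete counterpart, here is a hedged C sketch over an edge list that also answers the “How?”: after the n-1 rounds of relaxation, one extra pass over the edges finds a negative-weight cycle exactly when some edge can still be relaxed. The Edge type, the INF value and the sample graph are all illustrative.

#include <stdio.h>
#include <stdbool.h>

#define INF 1000000   /* stands in for +infinity */

typedef struct { int u, v, w; } Edge;   /* directed edge u -> v of weight w */

/* Returns true if D[] holds correct distances from s,
 * false if a negative-weight cycle is reachable from s. */
bool bellmanFord(int n, Edge e[], int m, int s, int D[])
{
    for (int v = 0; v < n; v++)
        D[v] = (v == s) ? 0 : INF;
    for (int i = 1; i <= n - 1; i++)          /* n-1 rounds of relaxing every edge */
        for (int j = 0; j < m; j++)
            if (D[e[j].u] != INF && D[e[j].u] + e[j].w < D[e[j].v])
                D[e[j].v] = D[e[j].u] + e[j].w;
    for (int j = 0; j < m; j++)               /* extra round: negative-cycle check */
        if (D[e[j].u] != INF && D[e[j].u] + e[j].w < D[e[j].v])
            return false;                     /* still improvable => negative cycle */
    return true;
}

int main(void)
{
    Edge e[] = { {0,1,4}, {0,2,2}, {1,2,-2}, {2,3,3}, {3,1,1} };   /* illustrative */
    int D[4];
    if (bellmanFord(4, e, 5, 0, D))
        for (int v = 0; v < 4; v++)
            printf("D[%d] = %d\n", v, D[v]);
    else
        printf("negative-weight cycle detected\n");
    return 0;
}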

Algorithm BellmanFord(G, s)
{   for each v ∈ G do
        if ( v = s )
            D[v] = 0;
        else
            D[v] = +∞;
    for ( i = 1; i ≤ n-1; i++ )
        for each e ∈ G.edges() do
        // relax edge e
        {   u = G.origin(e);
            z = G.opposite(u,e);
            r = D[u] + weight(e);
            if ( r < D[z] )
                D[z] = r;
        }
}

16

Bellman-Ford Example

[Figure: four snapshots of Bellman-Ford on a directed graph with negative-weight edges; nodes are labeled with their D(v) values, which start at ∞ (0 at the source) and decrease as edges are relaxed in successive rounds]

17

DAG-based Algorithm (not in book)

 Works even with negative-weight edges
 Uses topological order
 Doesn’t use any fancy data structures
 Is much faster than Dijkstra’s algorithm
 Running time: O(n+m)

Algorithm DagDistances(G, s)
{   for all v ∈ G.vertices()
        if ( v = s )
            D[v] = 0;
        else
            D[v] = +∞;
    Perform a topological sort of the vertices;
    for ( u = 1; u ≤ n; u++ )      // in topological order
        for each e ∈ G.outEdges(u) do
        // relax edge e
        {   z = G.opposite(u,e);
            r = D[u] + weight(e);
            if ( r < D[z] )
                D[z] = r;
        }
}

18

DAG Example

[Figure: four snapshots of the DAG-based algorithm on a DAG with negative-weight edges; vertices are processed in topological order (1, 2, …, 6) and nodes are labeled with their D(v) values]

19

Minimum Spanning Trees

[Figure: weighted graph of distances between nine US airports (BOS, BWI, DFW, JFK, LAX, MIA, ORD, PVD, SFO)]

20

Minimum Spanning Trees

Spanning subgraph
 Subgraph of a graph G containing all the vertices of G
Spanning tree
 Spanning subgraph that is itself a (free) tree
Minimum spanning tree (MST)
 Spanning tree of a weighted graph with minimum total edge weight

 Applications
 Communications networks
 Transportation networks

[Figure: example weighted graph on vertices ORD, PIT, ATL, STL, DEN, DFW, DCA with its MST highlighted]

21

Cycle Property

Cycle Property:
 Let T be a minimum spanning tree of a weighted graph G
 Let e be an edge of G that is not in T and C be the cycle formed by e with T
 For every edge f of C, weight(f) ≤ weight(e)

Proof:
 By contradiction
 If weight(f) > weight(e), we can get a spanning tree of smaller weight by replacing f with e

[Figure: the cycle C formed by e with T in an example graph; replacing f with e yields a better spanning tree]

22


Partition Property
Partition Property:

 Consider a partition of the vertices of G into subsets U and V
 Let e be an edge of minimum weight across the partition
 There is a minimum spanning tree of G containing edge e

Proof:
 Let T be an MST of G
 If T does not contain e, consider the cycle C formed by e with T and let f be an edge of C across the partition
 By the cycle property, weight(f) ≤ weight(e)
 Thus, weight(f) = weight(e)
 We obtain another MST by replacing f with e

[Figure: the vertices of G partitioned into sets U and V, with e a minimum-weight edge across the partition; replacing f with e yields another MST]

23

Kruskal’s Algorithm

A priority queue stores the edges outside the cloud
 Key: weight
 Element: edge

At the end of the algorithm
 We are left with one cloud that encompasses the MST
 A tree T which is our MST

Algorithm KruskalMST(G)
{

for each vertex v in G do
define a Cloud(v) of {v};

let Q be a priority queue;
Insert all edges into Q using their
weights as the key;
T = ∅;
while T has fewer than n-1 edges do

{ edge e = Q.removeMin();
Let u, v be the endpoints of e;
if ( Cloud(v) ≠ Cloud(u) )

{ Add edge e to T;
Merge Cloud(v) and Cloud(u);

}
}

return T;
}

24

Data Structure for Kruskal Algorithm

 The algorithm maintains a forest of trees
 An edge is accepted if it connects distinct trees
 We need a data structure that maintains a partition, i.e., a collection of disjoint sets, with the operations:
- find(u): return the set storing u
- union(u,v): replace the sets storing u and v with their union

25

Representation of a Partition

 Each set is stored in a sequence
 Each element has a reference back to the set
 Operation find(u) takes O(1) time, and returns the set of which u is a member
 In operation union(u,v), we move the elements of the smaller set to the sequence of the larger set and update their references
 The time for operation union(u,v) is O(min(nu, nv)), where nu and nv are the sizes of the sets storing u and v
 Whenever an element is moved by a union, it goes into a set of at least double the size, hence each element is moved at most log n times
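A small C sketch of the sequence-based partition just described: find(u) is O(1) via a set-id array, and union moves the members of the smaller set into the larger one. The capacity MAXN, the names (setId, members, unionSets) and the demo in main are illustrative.

#include <stdio.h>

#define MAXN 16   /* illustrative capacity */

static int setId[MAXN];            /* setId[u] = which set u currently belongs to */
static int members[MAXN][MAXN];    /* members[s] = the elements stored in set s   */
static int size[MAXN];             /* size[s] = number of elements in set s       */

void makeSets(int n)               /* each element starts in its own set */
{
    for (int u = 0; u < n; u++) {
        setId[u] = u;
        members[u][0] = u;
        size[u] = 1;
    }
}

int find(int u) { return setId[u]; }   /* O(1) */

void unionSets(int u, int v)
{
    int a = find(u), b = find(v);
    if (a == b) return;
    if (size[a] < size[b]) { int t = a; a = b; b = t; }   /* a is the larger set */
    for (int i = 0; i < size[b]; i++) {                   /* move b's members into a */
        int x = members[b][i];
        setId[x] = a;
        members[a][size[a]++] = x;
    }
    size[b] = 0;
}

int main(void)
{
    makeSets(6);
    unionSets(0, 1);
    unionSets(2, 3);
    unionSets(1, 3);
    printf("find(0) == find(2)? %s\n", find(0) == find(2) ? "yes" : "no");
    return 0;
}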

26

Partition-Based Implementation
 A partition-based version of Kruskal’s Algorithm performs cloud merges as unions and tests as finds.
Algorithm Kruskal(G):

Input: A weighted graph G.
Output: An MST T for G.

{ Let P be a partition of the vertices of G, where each vertex forms a separate set;
Let Q be a priority queue storing the edges of G, sorted by their weights;
Let T be an initially-empty tree;
while Q is not empty do

{ (u,v) = Q.removeMinElement();
if ( P.find(u) != P.find(v) )

{ Add (u,v) to T;
P.union(u,v);

}
}

return T;
}

Running time: O((m+n) log n)
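A hedged C sketch of the partition-based algorithm above. Sorting the edge array with qsort stands in for the priority queue, and a simple parent-array disjoint-set structure with union by size stands in for the partition P; both are common substitutions, not the only choices. The Edge type and the sample graph are illustrative.

#include <stdio.h>
#include <stdlib.h>

typedef struct { int u, v, w; } Edge;

static int parent[64], setSize[64];

static int find(int u)                          /* follow parent links to the root */
{
    while (parent[u] != u) u = parent[u];
    return u;
}

static void unionSets(int a, int b)             /* a, b are set representatives */
{
    if (setSize[a] < setSize[b]) { int t = a; a = b; b = t; }
    parent[b] = a;
    setSize[a] += setSize[b];
}

static int byWeight(const void *x, const void *y)
{
    return ((const Edge *)x)->w - ((const Edge *)y)->w;
}

/* Fills T with the accepted MST edges and returns how many were accepted. */
int kruskal(int n, Edge e[], int m, Edge T[])
{
    for (int v = 0; v < n; v++) { parent[v] = v; setSize[v] = 1; }
    qsort(e, m, sizeof(Edge), byWeight);        /* examine edges by increasing weight */
    int k = 0;
    for (int i = 0; i < m && k < n - 1; i++) {
        int a = find(e[i].u), b = find(e[i].v);
        if (a != b) {                           /* endpoints lie in different clouds */
            T[k++] = e[i];
            unionSets(a, b);
        }
    }
    return k;
}

int main(void)
{
    Edge e[] = { {0,1,7}, {0,2,2}, {1,2,3}, {1,3,5}, {2,3,8}, {3,4,4} };   /* illustrative */
    Edge T[8];
    int k = kruskal(5, e, 6, T);
    for (int i = 0; i < k; i++)
        printf("(%d,%d) weight %d\n", T[i].u, T[i].v, T[i].w);
    return 0;
}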

27

Kruskal Example (1/14) – (14/14)

[Figures: fourteen snapshots of Kruskal’s algorithm on the flight-route graph (BOS, BWI, DFW, JFK, LAX, MIA, ORD, PVD, SFO); edges are examined in nondecreasing order of weight and accepted whenever they connect two different clouds, until the MST has n-1 edges]

Prim-Jarnik’s Algorithm (1/2)
 Similar to Dijkstra’s algorithm (for a connected graph)
 We pick an arbitrary vertex s and grow the MST as a cloud of vertices, starting from s
 We store with each vertex u a label D(u) = the smallest weight of an edge connecting u to a vertex in the cloud

At each step:
 We add to the cloud the vertex u outside the cloud with the smallest distance label
 We update the labels of the vertices adjacent to u
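The pseudocode is on the next slide; first, a hedged C sketch of the idea just described, written in the simple O(n²) array-scanning style rather than with the adaptable priority queue that gives the O((n + m) log n) bound. The constant N, the adjacency matrix and the sample edges are illustrative only.

#include <stdio.h>
#include <stdbool.h>

#define N 6          /* number of vertices (illustrative) */
#define INF 1000000  /* stands in for +infinity */

/* parent[z] records the tree edge (parent[z], z) chosen for z */
void primJarnik(int w[N][N], int s, int parent[N])
{
    int D[N];
    bool inCloud[N] = {false};
    for (int v = 0; v < N; v++) { D[v] = INF; parent[v] = -1; }
    D[s] = 0;
    for (int i = 0; i < N; i++) {
        int u = -1;
        for (int v = 0; v < N; v++)             /* cheapest vertex outside the cloud */
            if (!inCloud[v] && (u == -1 || D[v] < D[u]))
                u = v;
        inCloud[u] = true;
        for (int z = 0; z < N; z++)             /* update labels of u's neighbours */
            if (!inCloud[z] && w[u][z] < D[z]) {
                D[z] = w[u][z];
                parent[z] = u;
            }
    }
}

int main(void)
{
    int w[N][N];
    for (int u = 0; u < N; u++)
        for (int z = 0; z < N; z++)
            w[u][z] = INF;                      /* INF means "no edge" */
    int edges[][3] = { {0,1,7},{0,2,2},{1,2,3},{1,3,5},{2,4,8},{3,4,4},{3,5,9},{4,5,6} };
    for (int i = 0; i < 8; i++) {
        w[edges[i][0]][edges[i][1]] = edges[i][2];
        w[edges[i][1]][edges[i][0]] = edges[i][2];
    }
    int parent[N];
    primJarnik(w, 0, parent);
    for (int v = 0; v < N; v++)
        if (parent[v] != -1)
            printf("tree edge (%d,%d)\n", parent[v], v);
    return 0;
}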

42

Prim-Jarnik’s Algorithm (2/2)
Algorithm PrimJarnikMST(G)

{
Pick any vertex v of G;
D[v] = 0;
for each u ∈ G with u ≠ v do

D[u] = + ∞;
T = ∅ ;
Create a priority queue Q with an entry
((u, null), D[u]) for each vertex u, where
(u, null) is the element and D[u] is the key;
while Q is not Empty do

{ (u, e) = Q.removeMin();
add vertex u and edge e to T;
for each vertex z in Q such that z is adjacent to u do

            if ( w((u,z)) < D[z] )
            {   D[z] = w((u,z));
                Change the element of vertex z in Q to (z, (u,z));
                Change the key of vertex z in Q to D[z];
            }
    }
    return T;
}

43

Example (1/2)

[Figure: four snapshots of Prim-Jarnik’s algorithm on a weighted graph with vertices A–F; nodes are labeled with their D(v) values as the cloud grows]

44

Example (2/2)

[Figure: the final two snapshots; every vertex is in the cloud and the MST edges are highlighted]

45

Analysis

 Graph operations
 Method incidentEdges is called once for each vertex
 Label operations
 We set/get the distance, parent and locator labels of vertex z O(deg(z)) times
 Setting/getting a label takes O(1) time
 Priority queue operations
 Each vertex is inserted once into and removed once from the priority queue, where each insertion or removal takes O(log n) time
 The key of a vertex w in the priority queue is modified at most deg(w) times, where each key change takes O(log n) time
 Prim-Jarnik’s algorithm runs in O((n + m) log n) time provided the graph is represented by the adjacency list structure
 Recall that Σv deg(v) = 2m
 The running time is O(m log n) since the graph is connected

46

Boruvka’s Algorithm

 Like Kruskal’s Algorithm, Boruvka’s algorithm grows many “clouds” at once.
 Each iteration of the while-loop halves the number of connected components in T.
 The running time is O(m log n).

Algorithm BoruvkaMST(G)
{   T = V;   // just the vertices of G
    while T has fewer than n-1 edges do
        for each connected component C in T do
        {   Let edge e be the smallest-weight edge from C to another component in T;
            if e is not already in T then
                Add edge e to T;
        }
    return T;
}

47

Boruvka Example (1/3) – (3/3)

[Figures: three snapshots of Boruvka’s algorithm on the flight-route graph (BOS, BWI, DFW, JFK, LAX, MIA, ORD, PVD, SFO); in each round every component adds its smallest-weight edge to another component]

Summary

• Shortest path problem
 Dijkstra’s algorithm
 Bellman-Ford algorithm
• Minimum spanning tree problem
 Kruskal’s algorithm
 Prim-Jarnik’s algorithm
 Boruvka’s algorithm

 Suggested reading (Sedgewick):
 Ch.20
 Ch.21