COMP20007 Design of Algorithms
Dynamic Programming Part 1: Warshall and Floyd algorithms
Daniel Beck
Lecture 19
Semester 1, 2020
Fibonacci Numbers
0, 1, 1, 2, 3, 5, 8, 13, . . .
F(n) = F(n−1) + F(n−2), n > 1,
F(0) = 0, F(1) = 1.
function Fibonacci(n)
    if n == 0 or n == 1 then return n
    return Fibonacci(n − 1) + Fibonacci(n − 2)
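As a concrete sketch of this recursion in Python (the name fib_naive is mine, using the convention F(0) = 0, F(1) = 1 that matches the sequence above): each call spawns two more, so the running time grows exponentially in n.

```python
def fib_naive(n):
    """Direct translation of the recurrence; recomputes the same
    subproblems over and over (e.g. F(n-2) is computed twice)."""
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)
```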
Storing Intermediate Solutions
• Allocate an array of size n + 1 to store the solutions to the subproblems F(0), . . . , F(n).
function FibonacciDP(n)
    F[0] ← 0
    F[1] ← 1
    for i ← 2 to n do
        F[i] ← F[i−1] + F[i−2]
    return F[n]
• From exponential to linear complexity.
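The tabulated pseudocode translates directly to Python; a minimal sketch (the name fib_dp is mine, with F(0) = 0, F(1) = 1 to match the sequence 0, 1, 1, 2, . . .):

```python
def fib_dp(n):
    """Bottom-up Fibonacci: fill a table of subproblem solutions
    so each F(i) is computed exactly once (linear time)."""
    if n < 2:
        return n
    f = [0] * (n + 1)   # f[i] will hold F(i)
    f[1] = 1
    for i in range(2, n + 1):
        f[i] = f[i - 1] + f[i - 2]
    return f[n]
```

Only the two most recent entries are ever read, so the table could be shrunk to two variables; the array version mirrors the pseudocode above.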
Dynamic Programming
• The solution to a problem can be broken into solutions to subproblems (recurrence relations).
• Solutions to subproblems can overlap (Fibonacci calls F for every value less than n).
• Allocates extra memory to store solutions to subproblems.
• DP is mostly used for optimisation problems (but not always; see Fibonacci).
• The optimal solution must be obtainable from optimal solutions to subproblems (this is not always the case).
Transitive Closure using DP
Goal: find all node pairs that have a path between them.
• The solution to a problem can be broken into solutions to subproblems.
• If there’s a path between two nodes i and j that are not directly connected, that path must pass through at least one other node k. Therefore, we only need to determine whether the pairs (i, k) and (k, j) have paths.
• Solutions to subproblems can overlap.
• If the pairs (i,j1) and (i,j2) have paths that go through k, then finding if the pair (i,k) has a path is part of the solutions for both problems.
Warshall’s Algorithm
• Assume nodes can be numbered from 1 to n, with A being the adjacency matrix.
R0[i, j] = A[i, j]
Rk[i, j] = Rk−1[i, j] or (Rk−1[i, k] and Rk−1[k, j])
function Warshall(A[1..n, 1..n])
    R0 ← A
    for k ← 1 to n do
        for i ← 1 to n do
            for j ← 1 to n do
                Rk[i, j] ← Rk−1[i, j] or (Rk−1[i, k] and Rk−1[k, j])
    return Rn
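A concrete Python version of this pseudocode, as a sketch (the name warshall is mine; the matrix uses 0/1 entries):

```python
def warshall(a):
    """Transitive closure of a directed graph given as a 0/1
    adjacency matrix; returns r with r[i][j] == 1 iff a path i -> j exists."""
    n = len(a)
    r = [row[:] for row in a]   # R0 is a copy of the adjacency matrix
    for k in range(n):
        for i in range(n):
            for j in range(n):
                # A path i -> j using intermediates from {1..k} exists if one
                # already did, or if paths i -> k and k -> j both exist.
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return r
```

This keeps a single matrix rather than n+1 of them, anticipating the in-place simplification on the next slide.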
Warshall’s Algorithm
• We can allow input A to be used for the output, simplifying things.
• Namely, if Rk−1[i, k] (that is, A[i, k]) is 0, the assignment does nothing; if it is 1 and A[k, j] is also 1, then A[i, j] is set to 1.
for k ← 1 to n do
    for i ← 1 to n do
        for j ← 1 to n do
            if A[i, k] then
                if A[k, j] then
                    A[i, j] ← 1
Warshall’s Algorithm
• Now we notice that A[i, k] does not depend on j, so testing it can be moved outside the innermost loop.
for k ← 1 to n do
    for i ← 1 to n do
        if A[i, k] then
            for j ← 1 to n do
                if A[k, j] then
                    A[i, j] ← 1
• Can use bitstring operations.
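A sketch combining both ideas in Python, with each row stored as one arbitrary-precision integer so the inner j loop becomes a single bitwise or (the name warshall_bits and the encoding are my own choices):

```python
def warshall_bits(a):
    """Warshall's algorithm on a 0/1 adjacency matrix, using one Python
    int per row as a bitstring: bit j of rows[i] means a path i -> j."""
    n = len(a)
    rows = [sum(a[i][j] << j for j in range(n)) for i in range(n)]
    for k in range(n):
        for i in range(n):
            if rows[i] >> k & 1:    # test A[i, k] once, outside the j loop
                rows[i] |= rows[k]  # whole-row "or" replaces the j loop
    return [[rows[i] >> j & 1 for j in range(n)] for i in range(n)]
```

On a machine word (or Python's big integers), the row-wide or processes many j values at once, which is the point of the bitstring remark.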
Analysis of Warshall’s Algorithm
• Straightforward analysis: Θ(n³) in all cases.
• In practice:
• Ideal for dense graphs.
• Not the best for sparse graphs (#edges ∈ O(n)): DFS from each node tends to perform better.
Floyd’s Algorithm: All-Pairs Shortest-Paths
• Floyd’s algorithm solves the all-pairs shortest-path problem for weighted graphs with positive weights.
• Similar to Warshall’s, but uses a weight matrix W instead of the adjacency matrix A (with ∞ entries for missing edges).
• It works for directed as well as undirected graphs.
Floyd’s Algorithm
• The recurrence follows Warshall’s closely:
D0[i, j] = W[i, j]
Dk[i, j] = min(Dk−1[i, j], Dk−1[i, k] + Dk−1[k, j])
function Floyd(W[1..n, 1..n])
    D ← W
    for k ← 1 to n do
        for i ← 1 to n do
            for j ← 1 to n do
                D[i, j] ← min(D[i, j], D[i, k] + D[k, j])
    return D
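The pseudocode translates directly to Python; a minimal sketch using math.inf for missing edges (the name floyd is mine):

```python
import math

def floyd(w):
    """All-pairs shortest paths. w[i][j] is the weight of edge i -> j,
    math.inf if the edge is absent, and 0 on the diagonal."""
    n = len(w)
    d = [row[:] for row in w]   # D0 is a copy of the weight matrix
    for k in range(n):
        for i in range(n):
            for j in range(n):
                # Going via intermediate node k may beat the best path so far.
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```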
Negative weights
• Negative weights are not necessarily a problem, but negative cycles are.
• These let the costs of the paths involved decrease without bound.
• Floyd’s algorithm can be adapted to detect negative cycles (by checking whether any diagonal entry D[i, i] becomes negative).
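A sketch of that detection idea, assuming a weight matrix with math.inf for missing edges and 0 on the diagonal (the name has_negative_cycle is mine):

```python
import math

def has_negative_cycle(w):
    """Run Floyd's updates and report whether a diagonal entry goes
    negative. d[i][i] starts at 0, so it can only drop below 0 if
    node i lies on a cycle whose total weight is negative."""
    n = len(w)
    d = [row[:] for row in w]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return any(d[i][i] < 0 for i in range(n))
```

Note that when a negative cycle exists, the off-diagonal "distances" produced by the loop are not meaningful shortest paths; only the detection result is used.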
Summary
• Dynamic programming is another technique that trades memory for speed.
• Breaks a problem into subproblems.
• Stores solutions to overlapping subproblems in memory.
• Warshall’s algorithm: find the transitive closure of a graph.
• Floyd’s algorithm: all-pairs shortest paths.
Next lecture: Dynamic Programming part 2.