
Lecture 22
Backtracking, Branch and Bound Algorithms

EECS 281: Data Structures & Algorithms

Outline
• Review
  – Constraint Satisfaction
  – Optimization
• Backtracking
  – General Form
  – n Queens
• Branch and Bound
  – Traveling salesperson problem

Data Structures & Algorithms

Backtracking

Types of Algorithm Problems
• Constraint satisfaction problems
  – Can we satisfy all given constraints?
  – If yes, how do we satisfy them?
• Need a specific solution
  – May have more than one solution
  – Examples: sorting, puzzles, GRE/analytical
• Optimization problems
  – Must satisfy all constraints (can we?) and
  – Must minimize an objective function subject to those constraints

Types of Algorithm Problems
• Constraint satisfaction problems
  – Go over all possible solutions
  – Does a given input combination satisfy all constraints?
  – Can stop when a satisfying solution is found
• Optimization problems
  – Similar, except we also need to compute the objective function every time
  – Stopping early = possible non-optimal solution

Types of Algorithm Problems
• Constraint satisfaction problems
  – Can rely on Backtracking algorithms
• Optimization problems
  – Can rely on Branch and Bound algorithms

For particular problems there may be much more efficient approaches, but think of these as a fallback: a more sophisticated version of a brute-force approach.

General Form: Backtracking

Algorithm checknode(node v)
  if (promising(v))
    if (solution(v))
      write solution*
    else
      for each node u adjacent to v
        checknode(u)

* Can exit here if only the existence of a solution is needed

Alternate Form: Backtracking

Algorithm checknode(node v)
  if (solution(v))
    write solution*
  else
    for each node u adjacent to v
      if (promising(u))
        checknode(u)

* Can exit here if only the existence of a solution is needed

General Form: Backtracking

solution(v)
• Check ‘depth’ of solution (constraint satisfaction)

promising(v)
• Different for each application

checknode(v)
• Called only if the partial solution is both promising and not a solution

Specific Example: n Queens
• n = 1: Can 1 queen be placed on a 1×1 board so that it doesn’t threaten another?
• n = 2
• n = 3
• n = 4
• n = 5
• …

8 Queens: Search Space
• Brute force checks 1.785×10¹⁴ possibilities, including many ridiculous board configurations
• A sensible backtracking search space is still fairly large:
  – 16,772,216 possibilities
  – 92 solutions
• How can the search space be further reduced?

Backtracking

[Board figure: placing queens row by row]
1. Start here
2. Threatened
3. Try here
4. Nothing promising in row 3, backtrack to row 2!

Backtracking

[Board figure: three queens placed]
5. Try here
6. Row 3, one choice
7. Nothing promising in rows 2-4, backtrack to row 1!

Constraint Satisfaction: n Queens
• Could require one solution
• Could require several solutions
• Could require all solutions

Specific Form: n Queens

solution(v)
• Check ‘depth’ of solution (constraint satisfaction)
• Placed a queen on each row
• That is, depth = N

checknode(v)
• Called only if promising and not a solution
• Recursive call to all positions (columns) of the queen within the row

Specific Form: n Queens

promising(row, col)
• Called for each node of the search tree
• Assume data structures that can tell you if:
  – column[col] // is column ‘col’ available
  – leftDiagonal[x] // is the upper-left to lower-right diagonal available
  – rightDiagonal[y] // is the upper-right to lower-left diagonal available
• NOT promising if any of these are unavailable
  – We’ll see what ‘x’ and ‘y’ are soon…

Summary: 4 Queens
For 4 Queens
• The entire tree has 256 leaves
• Backtracking searches only 19 branches before finding the first solution
• Promising:
  – May lead to a solution
• Not promising:
  – Will never lead to a solution
  – Therefore should be pruned

4 Queens Branches

Board positions (row by row):
A B C D
E F G H
I J K L
M N O P

Branches searched:
1. A->E = vert. threat
2. A->F = diag. threat
3. A->G->I = vert. threat
4. A->G->J = diag. threat
5. A->G->K = 2 threats
6. A->G->L = diag. threat
7. A->H->I = vert. threat
8. A->H->J->M = 2 threats
9. A->H->J->N = 2 threats
10. A->H->J->O = diag. threat
11. A->H->J->P = 2 threats
12. A->H->K = 2 threats
13. A->H->L = vert. threat
14. B->E = diag. threat
15. B->F = vert. threat
16. B->G = diag. threat
17. B->H->I->M = vert. threat
18. B->H->I->N = 3 threats
19. B->H->I->O = SOLUTION

Search Tree: n Queens

[Search-tree figure for N = 4: level k places queen Qk in row k, each node branching over columns 1–4; successive levels have 4, 16, and 64 branches, and the full tree has 256 leaves]

Summary: Backtracking
• Backtracking allows pruning of branches that are not promising
• All backtracking algorithms have a similar form
• Often, the most difficult part is determining the nature of promising()

Data Structures & Algorithms

Backtracking

Data Structures & Algorithms

Branch and Bound


Branch-and-Bound, a.k.a. B&B
• The idea of backtracking extended to optimization problems
• You are minimizing a function with this useful property:
  – A partial solution is pruned if its cost ≥ the cost of the best known complete solution
  – e.g., the length of a path or tour
• If the cost of a partial solution is too big, drop this partial solution

General Form: Branch & Bound

Algorithm checknode(Node v, Best currBest)
  if (promising(v, currBest))
    if (solution(v)) then
      update(currBest)
    else
      for each child u of v
        checknode(u, currBest)
  return currBest

General Form: Branch & Bound

solution()
• Check ‘depth’ of solution (constraint satisfaction)

update()
• If the new solution is better than the current solution, then update (optimization)

checknode()
• Called only if promising and not a solution

General Form: Branch & Bound

lowerbound()
• Estimate of the solution, based upon
  – Cost so far, plus
  – An underestimate of the cost remaining (aka the bound)

promising()
• Different for each application, but must return true when lowerbound() < currBest
• A return of false (lowerbound() ≥ currBest) is what causes pruning

The Key to B&B is the Bound
• The efficiency of B&B is based on “bounding away” (aka “pruning”) unpromising partial solutions
• The earlier you know a solution is not promising, the less time you spend on it
• The more accurately you can compute partial costs, the earlier you can prune
• Sometimes it’s worth spending extra effort to compute better bounds

Minimizing With B&B
• Start with an “infinity” bound
• Find the first complete solution
  – Use its cost as an upper bound to prune the rest of the search
• Measure each partial solution and calculate a lower bound: an estimate of the cost needed to complete the solution
• Prune partial solutions whose lower bounds exceed the current upper bound
• If another complete solution yields a lower cost, that becomes the new upper bound
• When the search is done, the current upper bound is the cost of a minimal solution

Maximizing With B&B
• Start with a “zero” bound
• Find the first complete solution
  – Use its cost as a lower bound to prune the rest of the search
• Measure each partial solution and calculate an upper bound: an estimate of the value achievable by completing the solution
• Prune partial solutions whose upper bounds are less than the current lower bound
• If another complete solution yields a larger value, that becomes the new lower bound
• When the search is done, the current lower bound is the value of a maximal solution

Summary: Branch and Bound
• A method to prune the search space for optimization problems
• Need to keep the current best solution
• Measure partial solutions and combine with optimistic estimates of their completions
• If the estimate is not an improvement, the actual cannot be either, so prune

Data Structures & Algorithms

Branch and Bound

Data Structures & Algorithms

Traveling Salesperson Problem (TSP)

TSP Defined
• Hamiltonian Cycle
  – Definition: given a graph G = (V, E), find a cycle that traverses each node exactly once
  – No vertex may appear twice, except the first/last
  – Constraint satisfaction problem
• Traveling Salesperson Problem
  – Definition: a Hamiltonian cycle with least weight
  – Optimization problem

TSP Illustrated
Find a tour of minimum length starting and ending in the same city and visiting every city exactly once
[Graph figure: vertices A–F with weighted edges]

TSP: (NP) Hard Problem!
1954: n = 49
2004: n = 24,978
http://www.math.uwaterloo.ca/tsp/sweden/index.html

TSP with Backtracking
A dead end in the graph = an unpromising partial solution (all adjacent vertices are already visited)
[Graph figure: vertices A–F with weighted edges]

Advantage of TSP with B&B
[Graph figure: a dead end, annotated with the minimum cost of completing a cycle from each partial solution; if that cost > 27, the best solution so far → unpromising partial solution]
Note: usually you don’t have these min-cost values and thus need to design a function to calculate a bound given a partial solution.

Bounding Function
• The estimate must be ≤ reality
• The bounding function must have complexity better than just continuing TSP for the k vertices not yet visited:
  – For instance, O(k²) is better than O(k!) for most values of k
• What method can we use to find the lowest-cost way to connect k vertices together in O(k²) time?

Bounding Function
• Some vertices are connected so far, some vertices are not
• For ONLY the unvisited vertices, connect them together with the lowest possible cost
• Then connect the visited vertices to the unvisited
• Yes, this function considers solutions that violate constraints, but it’s a LOWER bound, so it’s OK

Partial TSP Example

[Graph figure: vertices A–G]
Current path: A – B – C
Unvisited vertices: D, E, F, and G

What’s the best way to connect D, E, F, and G to each other?

Connect Unvisited Nodes Together

[Graph figure: vertices A–G]
Current path: A – B – C
Unvisited vertices: D, E, F, and G

How many edges are we missing? A full TSP tour would have |V| edges (7); currently we have 5…

Connect Partial Tour to Unvisited

[Graph figure: vertices A–G]
Current path: A – B – C
Unvisited vertices: D, E, F, and G

Connect from A-B-C to D-E-F-G in the best, cheapest, fastest way possible

Generating Permutations

template <typename T>
void genPerms(vector<T> &path, size_t permLength) {
    if (permLength == path.size()) {
        // Do something with the path
        return;
    }  // if
    if (!promising(path, permLength))
        return;
    for (size_t i = permLength; i < path.size(); ++i) {
        swap(path[permLength], path[i]);
        genPerms(path, permLength + 1);
        swap(path[permLength], path[i]);
    }  // for
}  // genPerms()

Optimal TSP With B&B
• Given n vertices, we need to find the best path out of (n – 1)! options; use genPerms()
• Start with an upper bound that is “infinity”, or better yet a fast calculation of a path that is guaranteed not shorter than optimal
• Use the upper bound to prune the rest of the search, lowering it every time a shorter, complete path is found
• Measure each partial solution: the path length of the first k points plus an estimate of the cheapest cost to connect the remaining n – k points; this is the lower bound
• Prune a partial solution if its lower bound exceeds the current upper bound
• If another complete path is shorter than the upper bound, save the path and replace the upper bound
• When the search is done, the current upper bound will be a shortest path

Data Structures & Algorithms

Traveling Salesperson Problem (TSP)

Branch and Bound & Traveling Salesperson Problem
http://xkcd.com/399

Data Structures & Algorithms

n Queens Demo

NQueens Implementation
• We know that:
  – Each row will have exactly one queen
  – Each column will have exactly one queen
  – Each diagonal will have at most one queen
• Don’t model the chessboard as a 2D array!
  – Instead, use 1D arrays of row position, column availability, and diagonal availabilities
• To simplify the presentation, we will study a board of size 4×4

Implementing the Chessboard
First: we need to define an array to store the location of queens placed so far
positionInRow: [1 3 0 2]
[Board figure: a queen in each row at those columns]

Implementing the Chessboard (cont.)
We need an array to keep track of the availability status of each column as we assign queens; suppose that we have placed two queens
column: [F T F T]
[Board figure: two queens placed]

Implementing the Chessboard (cont.)
We have 7 left diagonals (2 * N - 1); we want to keep track of available diagonals after queens are placed (start indexing at the upper left)
Diagonal index = row + col
leftDiagonal: [T F T T F T T]
[Board figure: two queens placed]

Implementing the Chessboard (cont.)
We also have 7 right diagonals (start indexing at the upper right)
Diagonal index = (row - col) + (n - 1)
rightDiagonal: [T F F T T T T]
[Board figure: two queens placed]

The promising() Function

bool NQueens::promising(uint32_t row, uint32_t col) {
    return column[col] == AVAILABLE
        && leftDiagonal[row + col] == AVAILABLE
        && rightDiagonal[row - col + (n - 1)] == AVAILABLE;
}  // promising()

The Recursive putQueen() Function

void NQueens::putQueen(uint32_t row) {
    // Check for solution
    if (row == n) {
        cout << "solution found" << endl;
        return;
    }  // if
    // Check every column in this row
    for (uint32_t col = 0; col < n; ++col)
        if (promising(row, col)) {
            // Make the move: place a piece @(row, col),
            // then make a recursive call to the next move
            positionInRow[row] = col;
            column[col] = !AVAILABLE;
            leftDiagonal[row + col] = !AVAILABLE;
            rightDiagonal[row - col + (n - 1)] = !AVAILABLE;
            putQueen(row + 1);

            // Undo this move (remove the piece @(row, col)) and thus backtrack
            column[col] = AVAILABLE;
            leftDiagonal[row + col] = AVAILABLE;
            rightDiagonal[row - col + (n - 1)] = AVAILABLE;
        }  // if
}  // putQueen()

NQueens Demo
From a web browser: bit.ly/eecs281-nqueens-demo
From a terminal: wget bit.ly/eecs281-nqueens-demo -O nqdemo.tgz
At the command line:
  tar xvzf nqdemo.tgz
  g++ -std=c++1z -O3 *.cpp -o nqueens
  ./nqueens

Data Structures & Algorithms

n Queens Demo