Introduction to Artificial Intelligence with Python
Optimization

optimization
choosing the best option from a set of options
local search
search algorithms that maintain a single node and search by moving to a neighboring node
[Figure: an example configuration, with Cost: 17]
state-space landscape

objective function: a function we try to maximize; its highest point is the global maximum
cost function: a function we try to minimize; its lowest point is the global minimum
current state: the state currently being considered
neighbors: the states the current state can move to
Hill Climbing

function HILL-CLIMB(problem):
    current = initial state of problem
    repeat:
        neighbor = highest valued neighbor of current
        if neighbor not better than current:
            return current
        current = neighbor
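The pseudocode above can be expressed in Python as a small generic function. In this sketch, `value` and `neighbors` are assumed problem-specific helpers, and the toy landscape is illustrative, not from the lecture:

```python
def hill_climb(state, value, neighbors):
    """Generic hill climbing: repeatedly move to the best neighbor
    until no neighbor is better than the current state."""
    current = state
    while True:
        # Highest-valued neighbor of the current state.
        best = max(neighbors(current), key=value)
        if value(best) <= value(current):
            return current  # a local maximum (not necessarily global)
        current = best

# Toy landscape: maximize f(x) = -(x - 3)^2 over the integers,
# where the neighbors of x are x - 1 and x + 1.
f = lambda x: -(x - 3) ** 2
result = hill_climb(0, f, lambda x: [x - 1, x + 1])
# result == 3, the global maximum of this landscape
```

Because the landscape here has a single peak, hill climbing finds the global maximum; on landscapes with local maxima, it can get stuck, which motivates the variants below.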
[Walkthrough: hill climbing improves the configuration one step at a time, lowering the cost from 17 to 15, 13, 11, and finally 9]
Features of a state-space landscape: global maximum, local maxima, global minimum, local minima, flat local maximum, shoulder
Hill Climbing Variants

steepest-ascent: choose the highest-valued neighbor
stochastic: choose randomly from higher-valued neighbors
first-choice: choose the first higher-valued neighbor
random-restart: conduct hill climbing multiple times
local beam search: choose the k highest-valued neighbors
Simulated Annealing

• Early on, higher "temperature": more likely to accept neighbors that are worse than current state
• Later on, lower "temperature": less likely to accept neighbors that are worse than current state
function SIMULATED-ANNEALING(problem, max):
    current = initial state of problem
    for t = 1 to max:
        T = TEMPERATURE(t)
        neighbor = random neighbor of current
        ΔE = how much better neighbor is than current
        if ΔE > 0:
            current = neighbor
        else with probability e^(ΔE/T):
            set current = neighbor
    return current
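One way to translate the pseudocode into Python, using the same toy landscape as before. The cooling schedule `T = 1/t` is an assumption for illustration; the lecture leaves TEMPERATURE unspecified:

```python
import math
import random

random.seed(0)  # seeded only so the example is reproducible

def simulated_annealing(state, value, neighbors, max_steps=10_000):
    """Follows the pseudocode: always accept a better neighbor, and
    accept a worse one with probability e^(ΔE/T), where T falls over time."""
    current = state
    for t in range(1, max_steps + 1):
        T = 1 / t                                # simple cooling schedule (assumption)
        neighbor = random.choice(neighbors(current))
        delta_e = value(neighbor) - value(current)
        # Worse moves (delta_e <= 0) are accepted with probability e^(ΔE/T),
        # which shrinks toward 0 as the temperature drops.
        if delta_e > 0 or random.random() < math.exp(delta_e / T):
            current = neighbor
    return current

f = lambda x: -(x - 3) ** 2
result = simulated_annealing(0, f, lambda x: [x - 1, x + 1])
```

Early in the run, the algorithm wanders freely; as T drops it behaves like hill climbing, so on this single-peak landscape it settles at the maximum.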
Traveling Salesman Problem
Linear Programming

• Minimize a cost function c1x1 + c2x2 + … + cnxn
• With constraints of form a1x1 + a2x2 + … + anxn ≤ b or of form a1x1 + a2x2 + … + anxn = b
• With bounds for each variable li ≤ xi ≤ ui
Linear Programming Example

• Two machines X1 and X2. X1 costs $50/hour to run, X2 costs $80/hour to run. Goal is to minimize cost.
• X1 requires 5 units of labor per hour. X2 requires 2 units of labor per hour. Total of 20 units of labor to spend.
• X1 produces 10 units of output per hour. X2 produces 12 units of output per hour. Company needs 90 units of output.
Translating the example into a linear program:

• Cost Function: 50x1 + 80x2
• Constraint (labor): 5x1 + 2x2 ≤ 20
• Constraint (output): 10x1 + 12x2 ≥ 90, rewritten in ≤ form as (−10x1) + (−12x2) ≤ −90
Linear Programming Algorithms

• Simplex
• Interior-Point
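Rather than implementing these algorithms by hand, the machine example can be handed to an off-the-shelf solver. This sketch assumes scipy is installed and uses `scipy.optimize.linprog`, whose default bounds already require each variable to be nonnegative:

```python
import scipy.optimize

# Minimize 50x1 + 80x2
# subject to 5x1 + 2x2 <= 20 and (-10x1) + (-12x2) <= -90.
result = scipy.optimize.linprog(
    [50, 80],                    # coefficients of the cost function
    A_ub=[[5, 2], [-10, -12]],   # coefficients of the <= constraints
    b_ub=[20, -90],              # right-hand sides of the <= constraints
)

if result.success:
    print(f"X1: {result.x[0]:.2f} hours")  # 1.50 hours
    print(f"X2: {result.x[1]:.2f} hours")  # 6.25 hours
```

Running X1 for 1.5 hours and X2 for 6.25 hours meets both constraints (exactly 20 units of labor and exactly 90 units of output) at the minimum cost of $575.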
Constraint Satisfaction
Example: four students are each taking several of the classes A–G, and each class's exam must be scheduled into a slot. No student should have two exams in the same slot.

Student 1: classes A, B, C
Student 2: classes B, D, E
Student 3: classes C, E, F
Student 4: classes E, F, G

Exam slots: Monday, Tuesday, Wednesday
[Figure: constraint graph with nodes A–G and an edge between any two classes that share a student]
Constraint Satisfaction Problem

• Set of variables {X1, X2, …, Xn}
• Set of domains for each variable {D1, D2, …, Dn}
• Set of constraints C
[Figure: a partially filled Sudoku puzzle]

Variables
{(0, 2), (1, 1), (1, 2), (2, 0), …}

Domains
{1, 2, 3, 4, 5, 6, 7, 8, 9} for each variable

Constraints
{(0, 2) ≠ (1, 1) ≠ (1, 2) ≠ (2, 0), …}
For the exam scheduling problem:

Variables
{A, B, C, D, E, F, G}

Domains
{Monday, Tuesday, Wednesday} for each variable

Constraints
{A≠B, A≠C, B≠C, B≠D, B≠E, C≠E, C≠F, D≠E, E≠F, E≠G, F≠G}
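The same problem can be written down as plain Python data structures. The names below (`VARIABLES`, `DOMAINS`, `CONSTRAINTS`, `consistent`) are illustrative, not from any particular library:

```python
VARIABLES = ["A", "B", "C", "D", "E", "F", "G"]
DOMAINS = {var: {"Monday", "Tuesday", "Wednesday"} for var in VARIABLES}
CONSTRAINTS = [  # each pair of classes sharing a student must get different slots
    ("A", "B"), ("A", "C"), ("B", "C"), ("B", "D"), ("B", "E"),
    ("C", "E"), ("C", "F"), ("D", "E"), ("E", "F"), ("E", "G"), ("F", "G"),
]

def consistent(assignment):
    """True if no constraint between two assigned variables is violated."""
    return all(
        assignment[x] != assignment[y]
        for x, y in CONSTRAINTS
        if x in assignment and y in assignment
    )
```

A partial assignment such as `{"A": "Monday", "B": "Monday"}` is inconsistent, since A and B share a student.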
hard constraints
constraints that must be satisfied in a correct solution
soft constraints
constraints that express some notion of which solutions are preferred over others
unary constraint
constraint involving only one variable, e.g. {A ≠ Monday}

binary constraint
constraint involving two variables, e.g. {A ≠ B}
node consistency
when all the values in a variable’s domain satisfy the variable’s unary constraints
Example: variables A and B each start with domain {Mon, Tue, Wed}, under constraints {A ≠ Mon, B ≠ Tue, B ≠ Mon, A ≠ B}.

Enforcing node consistency on A removes Mon, leaving A = {Tue, Wed}.
Enforcing node consistency on B removes Tue and Mon, leaving B = {Wed}.
arc consistency
when all the values in a variable’s domain satisfy the variable’s binary constraints
arc consistency
To make X arc-consistent with respect to Y, remove elements from X’s domain until every choice for X has a possible choice for Y
Example: with A = {Tue, Wed} and B = {Wed}, making A arc-consistent with respect to B removes Wed from A's domain, since choosing A = Wed would leave no possible choice for B. This leaves A = {Tue}, B = {Wed}.
Arc Consistency

function REVISE(csp, X, Y):
    revised = false
    for x in X.domain:
        if no y in Y.domain satisfies constraint for (X, Y):
            delete x from X.domain
            revised = true
    return revised
function AC-3(csp):
    queue = all arcs in csp
    while queue non-empty:
        (X, Y) = DEQUEUE(queue)
        if REVISE(csp, X, Y):
            if size of X.domain == 0:
                return false
            for each Z in X.neighbors – {Y}:
                ENQUEUE(queue, (Z, X))
    return true
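A direct Python rendering of REVISE and AC-3, specialized (as an assumption) to binary ≠ constraints like the ones in this lecture; the names and data layout are illustrative:

```python
from collections import deque

def revise(domains, X, Y):
    """Make X arc-consistent with respect to Y under a != constraint:
    drop any x that leaves no possible choice for Y."""
    revised = False
    for x in set(domains[X]):          # iterate over a copy while mutating
        if not any(y != x for y in domains[Y]):
            domains[X].remove(x)
            revised = True
    return revised

def ac3(domains, constraints):
    """Enforce arc consistency over every arc; False means some domain emptied."""
    neighbors = {v: set() for v in domains}
    for x, y in constraints:
        neighbors[x].add(y)
        neighbors[y].add(x)
    queue = deque((x, y) for x in domains for y in neighbors[x])
    while queue:
        X, Y = queue.popleft()
        if revise(domains, X, Y):
            if not domains[X]:
                return False           # no value left for X: unsolvable
            for Z in neighbors[X] - {Y}:
                queue.append((Z, X))   # X changed, so recheck arcs into X
    return True

# The two-variable example from above, after node consistency:
domains = {"A": {"Tue", "Wed"}, "B": {"Wed"}}
ac3(domains, [("A", "B")])
# domains["A"] is now {"Tue"}
```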
[Figure: the exam constraint graph, with every variable's domain initially {Mon, Tue, Wed}]
Search Problems

• initial state
• actions
• transition model
• goal test
• path cost function
CSPs as Search Problems

• initial state: empty assignment (no variables)
• actions: add a {variable = value} to assignment
• transition model: shows how adding an assignment changes the assignment
• goal test: check if all variables assigned and constraints all satisfied
• path cost function: all paths have same cost
Backtracking Search

function BACKTRACK(assignment, csp):
    if assignment complete:
        return assignment
    var = SELECT-UNASSIGNED-VAR(assignment, csp)
    for value in DOMAIN-VALUES(var, assignment, csp):
        if value consistent with assignment:
            add {var = value} to assignment
            result = BACKTRACK(assignment, csp)
            if result ≠ failure:
                return result
            remove {var = value} from assignment
    return failure
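The pseudocode maps closely onto Python. This sketch solves the exam-scheduling CSP with binary ≠ constraints; the variable selection and value ordering are naive (no heuristics yet), and all names are illustrative:

```python
def backtrack(assignment, variables, domains, constraints):
    """Depth-first search over assignments, undoing each choice that fails."""
    if len(assignment) == len(variables):
        return assignment                      # assignment complete
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        assignment[var] = value
        # Value is consistent if no constraint between assigned variables breaks.
        if all(x not in assignment or y not in assignment
               or assignment[x] != assignment[y]
               for x, y in constraints):
            result = backtrack(assignment, variables, domains, constraints)
            if result is not None:
                return result
        del assignment[var]                    # backtrack
    return None                                # failure

variables = ["A", "B", "C", "D", "E", "F", "G"]
domains = {v: ["Monday", "Tuesday", "Wednesday"] for v in variables}
constraints = [("A", "B"), ("A", "C"), ("B", "C"), ("B", "D"), ("B", "E"),
               ("C", "E"), ("C", "F"), ("D", "E"), ("E", "F"), ("E", "G"),
               ("F", "G")]
solution = backtrack({}, variables, domains, constraints)
```

Because backtracking is complete, `solution` is a full assignment satisfying every constraint whenever one exists, as it does here.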
[Walkthrough: backtracking search on the exam-scheduling problem. Starting from A = Mon and B = Tue, the search tries values for the remaining variables, backtracking whenever every value for some variable conflicts with the assignment so far, until it reaches the solution A = Mon, B = Tue, C = Wed, D = Wed, E = Mon, F = Tue, G = Wed]
Inference
[Walkthrough: after assigning A = Mon and B = Tue, enforcing arc consistency lets us infer the rest without any search: C = {Wed}, then E = {Mon}, D = {Wed}, F = {Tue}, and G = {Wed}, yielding the same solution with no backtracking]
maintaining arc-consistency
algorithm for enforcing arc-consistency every time we make a new assignment

When we make a new assignment to X, call AC-3, starting with a queue of all arcs (Y, X) where Y is a neighbor of X
function BACKTRACK(assignment, csp):
    if assignment complete:
        return assignment
    var = SELECT-UNASSIGNED-VAR(assignment, csp)
    for value in DOMAIN-VALUES(var, assignment, csp):
        if value consistent with assignment:
            add {var = value} to assignment
            inferences = INFERENCE(assignment, csp)
            if inferences ≠ failure:
                add inferences to assignment
            result = BACKTRACK(assignment, csp)
            if result ≠ failure:
                return result
            remove {var = value} and inferences from assignment
    return failure
SELECT-UNASSIGNED-VAR

• minimum remaining values (MRV) heuristic: select the variable that has the smallest domain
• degree heuristic: select the variable that has the highest degree
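A common way to combine the two heuristics is MRV first, with degree breaking ties. The helper below is a sketch under that assumption; its name and arguments are illustrative:

```python
def select_unassigned_variable(assignment, domains, constraints):
    """MRV heuristic, with the degree heuristic as a tie-breaker."""
    degree = {v: 0 for v in domains}
    for x, y in constraints:
        degree[x] += 1
        degree[y] += 1
    unassigned = [v for v in domains if v not in assignment]
    # Smallest domain first; among ties, highest degree first.
    return min(unassigned, key=lambda v: (len(domains[v]), -degree[v]))
```

For example, if C's domain has been narrowed to a single value, C is chosen next; if all domains are equal, the most-constrained variable wins the tie.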
[Example: after inference narrows C's domain to {Wed} while other domains stay larger, the MRV heuristic selects C next. When every domain is still {Mon, Tue, Wed}, the degree heuristic selects E, the variable constrained with the most other variables]
DOMAIN-VALUES

• least-constraining values heuristic: return values in order by number of choices that are ruled out for neighboring variables
• try least-constraining values first
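A sketch of the heuristic in Python: count, for each candidate value, how many choices it would remove from the domains of unassigned neighbors, and try the smallest counts first. The function name and `neighbors` mapping are illustrative:

```python
def order_domain_values(var, assignment, domains, neighbors):
    """Least-constraining-values heuristic: order var's values by how many
    options they rule out for var's unassigned neighbors (fewest first)."""
    def ruled_out(value):
        return sum(
            value in domains[n]          # this value would be excluded for n
            for n in neighbors[var]
            if n not in assignment
        )
    return sorted(domains[var], key=ruled_out)
```

If Mon appears in two neighbors' domains but Tue in only one, Tue is tried before Mon.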
[Example: when choosing a value for C, assigning Wed rules out fewer options for C's remaining neighbors than Tue would, so Wed is tried first]
[Solution: A = Mon, B = Tue, C = Wed, D = Wed, E = Mon, F = Tue, G = Wed]
Problem Formulation

Cost: 50x1 + 80x2
Constraints: 5x1 + 2x2 ≤ 20 and (−10x1) + (−12x2) ≤ −90
Optimization

• Local Search
• Linear Programming
• Constraint Satisfaction