
CSC242: Introduction to Artificial Intelligence
Lecture 2.3
Please put away all electronic devices

Boolean CSP
• All variables are Boolean
• All domains are { true, false }
• Constraints identify the allowed combinations of the Boolean variables

Propositional Logic
Aristotle (384–322 BC)    George Boole (1815–1864)

Propositional Logic
• Propositions: things that are true or false
• Connectives: combine propositions into larger propositions
• Sentences: statements about the world (can be true or false)
• Boolean functions of Boolean variables

Truth Table
L1,1 ∧ (W1,2 ∨ W2,1)

L1,1 | W1,2 | W2,1 | W1,2∨W2,1 | L1,1∧(W1,2∨W2,1)
F | F | F | F | F
F | F | T | T | F
F | T | F | T | F
F | T | T | T | F
T | F | F | F | F
T | F | T | T | T
T | T | F | T | T
T | T | T | T | T

Propositional Logic
• Possible worlds
• Models
• Satisfiability
• Unsatisfiable

Background Knowledge
B1,1 ⇔ P1,2 ∨ P2,1
B1,2 ⇔ P1,1 ∨ P2,2 ∨ P3,1
B2,2 ⇔ P1,2 ∨ P2,3 ∨ P3,2 ∨ P2,1

S1,1 ⇔ W1,2 ∨ W2,1
OK1,1 ⇔ ¬(P1,1 ∨ W1,1)
OK1,2 ⇔ ¬(P1,2 ∨ W1,2)
OK2,1 ⇔ ¬(P2,1 ∨ W2,1)
S1,2 ⇔ W1,1 ∨ W2,2 ∨ W3,1
S2,2 ⇔ W1,2 ∨ W2,3 ∨ W3,2 ∨ W2,1 …
W1,1 ∨W1,2 ∨…∨W3,4 ∨W4,4
¬(W1,1 ∧ W1,2), ¬(W1,1 ∧ W1,3), …, ¬(W3,4 ∧ W4,4)

Knowledge-Based Agents
Perceive → KB → Infer → Act

Perception
[Figure: the 4×4 Wumpus world, with pits, stenches, breezes, and gold; the agent starts at [1,1]]
¬B1,1, ¬S1,1

• Given what I know…
• What should I do?
See AIMA 7.7
Inference

Inference
Given what I know… Is there no pit in room [2,1]?
R1: ¬P1,1
R2: B1,1 ⇔ (P1,2 ∨ P2,1)
¬B1,1
¬P2,1?

Entailment
• α entails β: α ⊨ β
• Every model of α is also a model of β
• Whenever α is true, so is β
• β is true in every world consistent with α
• Models(α) ⊆ Models(β)
• β logically follows from α

Model Checking
• Given knowledge α and query β
• For every possible world w
  • If α is satisfied by w
    • If β is not satisfied by w
      • Conclude that α ⊭ β
• Otherwise conclude that α ⊨ β

Model Checking
• Given knowledge α and query β
• For every possible world w
  • If α is satisfied by w
    • If β is not satisfied by w
      • Conclude that α ⊭ β
• Otherwise conclude that α ⊨ β
AIMA Fig. 7.10
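The enumeration procedure above (AIMA's TT-Entails) can be sketched in a few lines of Python. This is only an illustrative sketch: sentences are represented as Boolean functions of a world (a dict from symbol names to truth values), and the symbol names are mine, not the slides'.

```python
from itertools import product

def tt_entails(kb, query, symbols):
    """Model checking: KB entails query iff query is true in every
    possible world (truth assignment) that satisfies the KB."""
    for values in product([False, True], repeat=len(symbols)):
        world = dict(zip(symbols, values))
        if kb(world) and not query(world):
            return False    # a model of KB in which the query fails
    return True             # query holds in every model of KB

# R1: not P1,1;  R2: B1,1 <=> (P1,2 or P2,1);  percept: not B1,1
kb = lambda w: (not w["P11"]) and (w["B11"] == (w["P12"] or w["P21"])) \
               and not w["B11"]
no_pit_21 = lambda w: not w["P21"]

print(tt_entails(kb, no_pit_21, ["P11", "P12", "P21", "B11"]))  # True
```

Enumerating all 2^n worlds is fine for four symbols but hopeless for hundreds, which is exactly the intractability the later slides quantify.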

Model Checking
Possible Worlds

P1,1 | P1,2 | P2,1 | B1,1
F | F | F | F
F | F | F | T
F | F | T | F
… | … | … | …
T | T | F | T
T | T | T | F
T | T | T | T

Model Checking
Knowledge

P1,1 | P1,2 | P2,1 | B1,1 | R1 | R2 | ¬B1,1 | ¬P2,1
F | F | F | F | T | T | T | T
F | F | F | T | T | F | F | T
F | F | T | F | T | F | T | F
… | … | … | … | … | … | … | …
T | T | F | T | F | T | F | T
T | T | T | F | F | F | T | F
T | T | T | T | F | T | F | F

R1, R2, and ¬B1,1 are all satisfied only in the first row.



Model Checking
Query

P1,1 | P1,2 | P2,1 | B1,1 | R1 | R2 | ¬B1,1 | ¬P2,1
F | F | F | F | T | T | T | T
F | F | F | T | T | F | F | T
F | F | T | F | T | F | T | F
… | … | … | … | … | … | … | …
T | T | F | T | F | T | F | T
T | T | T | F | F | F | T | F
T | T | T | T | F | T | F | F

¬P2,1 is true in every world that satisfies the knowledge, so KB ⊨ ¬P2,1.

Entailment
Given what I know… Is there no pit in room [2,1]?
R1: ¬P1,1
R2: B1,1 ⇔ (P1,2 ∨ P2,1)
¬B1,1
¬P2,1?
KB ⊨ ¬P2,1

Entailment
Given what I know… Is there no pit in room [1,2]?
R1: ¬P1,1
R2: B1,1 ⇔ (P1,2 ∨ P2,1)
¬B1,1
¬P1,2? KB ⊨ ¬P1,2
R3: B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
B2,1
P2,2? KB ⊭ P2,2 and KB ⊭ ¬P2,2

But…

Model Checking
• Given knowledge α and query β
• For every possible world w
  • If α is satisfied by w
    • If β is not satisfied by w
      • Conclude that α ⊭ β
• Otherwise conclude that α ⊨ β

Model Checking
n propositions
m sentences, O(k) connectives each

[Truth table: each possible world over the n propositions (P1,1, P1,2, …, OK1,1, OK2,1) is checked against each sentence (R1, R2, …, ¬B1,1, ¬S1,1, OK2,1)]

Model Checking
n propositions
m sentences, O(k) connectives each

[Truth table as before]

Evaluating one sentence in one world: O(k)

Model Checking
n propositions
m sentences, O(k) connectives each

[Truth table as before]

Evaluating all m sentences in one world: O(mk)

Model Checking
n propositions
m sentences, O(k) connectives each

[Truth table as before]

Number of possible worlds: O(2^n)

Model Checking
n propositions
m sentences, O(k) connectives each

[Truth table as before]

Total: O(2^n · m · k)
Intractable!

Entailment
• α entails β: α ⊨ β
• Every model of α is also a model of β
• Whenever α is true, so is β
• β is true in every world consistent with α
• Models(α) ⊆ Models(β)
• β logically follows from α
co-NP-complete!

Propositional Logic
• Programming language for knowledge
• Factored representation (Boolean CSP)
• Propositions, connectives, sentences
• Possible worlds, satisfiability, models
• Entailment: α ⊨ β
  • Every model of α is a model of β
• Model checking intractable!

[Truth-table residue: the model-checking table again]
Darn model checking… so intractable

Rule: If you know α, then you also know β.

Rule: If you know α, then you also know β. No model checking!

Rule: If you know α, then you also know β. No model checking!
Seems impossible…

KB = { Hungry ⇒ Cranky, Hungry }

KB = { Hungry ⇒ Cranky, Hungry }
Cranky

KB = { Hungry ⇒ Cranky, Hungry }
Cranky?

Hungry | Cranky | Hungry ⇒ Cranky
F | F | T
F | T | T
T | F | F
T | T | T


KB = { Hungry ⇒ Cranky, Hungry }
Cranky? KB ⊨ Cranky

Hungry | Cranky | Hungry ⇒ Cranky
F | F | T
F | T | T
T | F | F
T | T | T

Inference Rule
Hungry ⇒ Cranky, Hungry
Cranky

Inference Rule
Premises (Antecedents): Hungry ⇒ Cranky, Hungry
Cranky

Inference Rule
Premises (Antecedents): Hungry ⇒ Cranky, Hungry
Conclusion (Consequent): Cranky

Inference Rule
Hungry ⇒ Cranky, Hungry
Cranky

Hungry | Cranky | Hungry ⇒ Cranky
F | F | T
F | T | T
T | F | F
T | T | T

Inference Rule
φ ⇒ ψ, φ
ψ

φ | ψ | φ⇒ψ
F | F | T
F | T | T
T | F | F
T | T | T

Modus Ponens
“mode that affirms”
MP:
φ ⇒ ψ, φ
ψ

Derivation
{ Hungry ⇒ Cranky, Hungry } ⊢MP Cranky

Derivation
{ Hungry ⇒ Cranky, Hungry } ⊢ Cranky

Derivation
{ Hungry ⇒ Cranky, Hungry } ⊢ Cranky
α ⊢ β ⊢ γ ⊢ … ⊢ ω
α⊢ω

α ⊨ β: β logically follows from α (intractable to compute)
α ⊢ β: β is derived from α (easier to compute)
But does β really follow from α?

Soundness
If α ⊢ β then α ⊨ β

Soundness
If α ⊢ β then α ⊨ β
Modus Ponens is sound

Hungry ⇒ Cranky, Cranky
Hungry
φ ⇒ ψ, ψ
φ

Hungry ⇒ Cranky, Cranky
Hungry
φ ⇒ ψ, ψ
φ

φ | ψ | φ⇒ψ
F | F | T
F | T | T
T | F | F
T | T | T


Hungry ⇒ Cranky, Cranky
Hungry
φ ⇒ ψ, ψ
φ
{ φ ⇒ ψ, ψ } ⊭ φ

φ | ψ | φ⇒ψ
F | F | T
F | T | T
T | F | F
T | T | T


Affirming the Consequent
Hungry ⇒ Cranky, Cranky
Hungry
φ ⇒ ψ, ψ
φ
{ φ ⇒ ψ, ψ } ⊭ φ
Unsound!

φ | ψ | φ⇒ψ
F | F | T
F | T | T
T | F | F
T | T | T

Hungry ⇒ Cranky, ¬Cranky
¬Hungry
φ ⇒ ψ, ¬ψ
¬φ

Hungry ⇒ Cranky, ¬Cranky
¬Hungry
φ ⇒ ψ, ¬ψ
¬φ

φ | ψ | φ⇒ψ | ¬φ | ¬ψ
F | F | T | T | T
F | T | T | T | F
T | F | F | F | T
T | T | T | F | F


Hungry ⇒ Cranky, ¬Cranky
¬Hungry
φ ⇒ ψ, ¬ψ
¬φ
{ φ ⇒ ψ, ¬ψ } ⊨ ¬φ
Sound!

φ | ψ | φ⇒ψ | ¬φ | ¬ψ
F | F | T | T | T
F | T | T | T | F
T | F | F | F | T
T | T | T | F | F

Modus Tollens
MT: Hungry ⇒ Cranky, ¬Cranky
    ¬Hungry
MT: φ ⇒ ψ, ¬ψ
    ¬φ
{ φ ⇒ ψ, ¬ψ } ⊨ ¬φ
Sound!

φ | ψ | φ⇒ψ | ¬φ | ¬ψ
F | F | T | T | T
F | T | T | T | F
T | F | F | F | T
T | T | T | F | F

Equivalences For any sentences φ and ψ
If φ ≡ ψ then
φ ⊢ ψ and ψ ⊢ φ
are inference rules

Equivalences
AIMA Fig. 7.11
(α ∧ β) ≡ (β ∧ α)    Commutativity of ∧
(α ∨ β) ≡ (β ∨ α)    Commutativity of ∨
((α ∧ β) ∧ γ) ≡ (α ∧ (β ∧ γ))    Associativity of ∧
((α ∨ β) ∨ γ) ≡ (α ∨ (β ∨ γ))    Associativity of ∨
¬(¬α) ≡ α    Double-negation elimination
(α ⇒ β) ≡ (¬β ⇒ ¬α)    Contrapositive
(α ⇒ β) ≡ (¬α ∨ β)    Implication elimination
(α ⇔ β) ≡ ((α ⇒ β) ∧ (β ⇒ α))    Biconditional elimination
¬(α ∧ β) ≡ (¬α ∨ ¬β)    De Morgan’s law
¬(α ∨ β) ≡ (¬α ∧ ¬β)    De Morgan’s law
(α ∧ (β ∨ γ)) ≡ ((α ∧ β) ∨ (α ∧ γ))    Distributivity of ∧ over ∨
(α ∨ (β ∧ γ)) ≡ ((α ∨ β) ∧ (α ∨ γ))    Distributivity of ∨ over ∧

Inference Rules

And-elimination:
α ∧ β
α

Double negation:
¬¬α
α

Modus Ponens:
φ ⇒ ψ, φ
ψ

De Morgan’s Laws:
¬(α ∧ β)        ¬(α ∨ β)
¬α ∨ ¬β        ¬α ∧ ¬β

Definition of biconditional:
α ⇔ β                        (α ⇒ β) ∧ (β ⇒ α)
(α ⇒ β) ∧ (β ⇒ α)        α ⇔ β

[Figure: the 4×4 Wumpus world again]

Proof
• Inference rules produce theorems derived from other sentences
• The sequence of inference rule applications used in the derivation constitutes a proof of the theorem

Proof
Given what I know… Is there no pit in room [1,2]?
1: ¬P1,1
2: B1,1 ⇔ (P1,2 ∨ P2,1)
3: B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
4: ¬B1,1
5: B2,1
¬P1,2?

1: ¬P1,1
2: B1,1 ⇔ (P1,2 ∨ P2,1)
3: B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
4: ¬B1,1
5: B2,1

Rule | Premises | Conclusion
Bicond elim | 2 | 6: (B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)
And elim | 6 | 7: ((P1,2 ∨ P2,1) ⇒ B1,1)
Contrapositive | 7 | 8: (¬B1,1 ⇒ ¬(P1,2 ∨ P2,1))
MP | 8, 4 | 9: ¬(P1,2 ∨ P2,1)
De Morgan | 9 | 10: ¬P1,2 ∧ ¬P2,1
And elim | 10 | 11: ¬P1,2

Proof
• Each step’s premises must be in the KB already
• Each step’s conclusion is added to the KB
• The last step derives the query
KB ⊢ ¬P1,2

Proof
• Each step’s premises must be in the KB already
• Each step’s conclusion is added to the KB
• The last step derives the query
KB ⊢ ¬P1,2
• If all the inference rules are sound
KB ⊨ ¬P1,2

Proof as Search
• States are sets of sentences (KBs)
• Actions are applying inference rules
  • Actions(s) = { rm | Match(Premises(r), KB) = m }
  • Result(rm, s) = s ∪ Subst(m, Conclusions(r))
• Initial state: initial KB
• Goal test: query ∈ KB

Theorem Proving
• Searching for proofs is an alternative to enumerating models
• “In many practical cases, finding a proof can be more efficient because the proof can ignore irrelevant propositions, no matter how many of them there are.”

Theorem Proving
• States are sets of sentences (KBs)
• Actions are applying inference rules
  • Actions(s) = { rm | Match(Premises(r), KB) = m }
  • Result(rm, s) = s ∪ Subst(m, Conclusions(r))
• Initial state: initial KB
• Goal test: query ∈ KB

Theorem Proving
• Need a complete search strategy

Theorem Proving
• Need a complete search strategy
• Need a complete set of inference rules

Completeness
If α ⊨ β then α ⊢ β

Completeness
If α ⊨ β then α ⊢ β
MP:
φ ⇒ ψ, φ
ψ
Modus Ponens is not complete

Proof
Rule | Premises | Conclusion
Bicond elim | 2 | 6: (B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)
And elim | 6 | 7: ((P1,2 ∨ P2,1) ⇒ B1,1)
Contrapositive | 7 | 8: (¬B1,1 ⇒ ¬(P1,2 ∨ P2,1))
MP | 8, 4 | 9: ¬(P1,2 ∨ P2,1)
De Morgan | 9 | 10: ¬P1,2 ∧ ¬P2,1
And elim | 10 | 11: ¬P1,2

Theorem Proving
• Need a complete search strategy
• Need a complete set of inference rules

Theorem Proving
• Need a complete search strategy
• Need a complete set of inference rules
• Or a single complete inference rule

Hungry ∨ Cranky, ¬Hungry
Cranky

[Figure (AIMA), panels (a) and (b): Wumpus world after visiting [1,1] and [2,1]; legend: A = Agent, B = Breeze, G = Glitter/Gold, OK = Safe square, P = Pit, S = Stench, V = Visited, W = Wumpus. From B2,1 the agent infers P1,1 ∨ P2,2 ∨ P3,1; with ¬P1,1 and ¬P2,2, it concludes P3,1.]

Reasoning By Cases
If A or B is true and you know it’s not A, then it must be B

l1 ∨ … ∨ li ∨ … ∨ lk,   ¬li
l1 ∨ … ∨ li-1 ∨ li+1 ∨ … ∨ lk


l1 ∨ … ∨ li ∨ … ∨ lk,   ¬li
l1 ∨ … ∨ li-1 ∨ li+1 ∨ … ∨ lk
li is gone

Literals
l1 ∨ … ∨ li ∨ … ∨ lk,   ¬li
l1 ∨ … ∨ li-1 ∨ li+1 ∨ … ∨ lk

Literals
l1 ∨ … ∨ li ∨ … ∨ lk,   m
l1 ∨ … ∨ li-1 ∨ li+1 ∨ … ∨ lk
li and m are complementary: m = ¬li or li = ¬m

Clause: l1 ∨ … ∨ li ∨ … ∨ lk    Unit clause: m
l1 ∨ … ∨ li-1 ∨ li+1 ∨ … ∨ lk
li and m are complementary literals: m = ¬li or li = ¬m

Unit Resolution
Clause: l1 ∨ … ∨ li ∨ … ∨ lk    Unit clause: m
l1 ∨ … ∨ li-1 ∨ li+1 ∨ … ∨ lk
l1, …, lk and m are literals; li and m are complementary
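Unit resolution is a one-liner once clauses are represented as sets of literals. In this sketch literals are strings with '~' standing in for ¬; the representation is an assumption for illustration, not part of the slides.

```python
def negate(lit):
    """Complement of a literal ('~' stands in for the ¬ sign)."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def unit_resolve(clause, unit):
    """If the clause contains the complement of the unit literal,
    return the clause with that literal removed; otherwise None."""
    comp = negate(unit)
    return clause - {comp} if comp in clause else None

# Hungry ∨ Cranky with ¬Hungry resolves to Cranky
print(unit_resolve({"Hungry", "Cranky"}, "~Hungry"))   # {'Cranky'}
```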

1. Hungry ∨ Cranky
2. ¬Hungry
Q: Cranky?

Unit Res on Hungry (1, 2):
3. Cranky

KB ⊢UR Cranky

[Figure (AIMA): Wumpus world after visiting [1,1] and [2,1]; legend: A = Agent, B = Breeze, G = Glitter/Gold, OK = Safe square, P = Pit, S = Stench, V = Visited, W = Wumpus]

1. P1,1 ∨ P2,2 ∨ P3,1
2. ¬P1,1
3. ¬P2,2
Q: P3,1?

Unit Res on P1,1 (1, 2):
4. P2,2 ∨ P3,1
Unit Res on P2,2 (4, 3):
5. P3,1

KB ⊢UR P3,1

Unit Resolution
• Sound: if α ⊢ β then α ⊨ β (easy to show)
• Not complete: α ⊨ β does not guarantee α ⊢ β (give a counterexample)

Hungry ∨ Cranky, ¬Hungry ∨ Sleepy
Cranky ∨ Sleepy

Resolution
l1 ∨ … ∨ li ∨ … ∨ lk,   m1 ∨ … ∨ mj ∨ … ∨ mn
l1 ∨ … ∨ li-1 ∨ li+1 ∨ … ∨ lk ∨ m1 ∨ … ∨ mj-1 ∨ mj+1 ∨ … ∨ mn
l1, …, lk, m1, …, mn are literals; li and mj are complementary
li and mj are gone
Technical note: Resulting clause must be factored to contain only one copy of each literal. (See AIMA)

1. Hungry ∨ Cranky
2. ¬Sleepy ∨ ¬Hungry
3. Cranky ∨ Sleepy

Rule | Premises | Conclusion
Resolution on Hungry | 1, 2 | 4. Cranky ∨ ¬Sleepy
Resolution on Sleepy | 3, 4 | 5. Cranky ∨ Cranky
Factoring | 5 | 6. Cranky

KB ⊢ Cranky

Resolution
• Sound: if α ⊢ β then α ⊨ β (easy to show)

Resolution
• Sound: if α ⊢ β then α ⊨ β (easy to show)
• Not complete: α ⊨ β does not guarantee α ⊢ β (give a counterexample)

Resolution is Refutation-Complete
• If a set of clauses is unsatisfiable, then resolution can derive the empty clause
(□)
AIMA p. 255

Resolution
l1 ∨ … ∨ li ∨ … ∨ lk,   m1 ∨ … ∨ mj ∨ … ∨ mn
l1 ∨ … ∨ li-1 ∨ li+1 ∨ … ∨ lk ∨ m1 ∨ … ∨ mj-1 ∨ mj+1 ∨ … ∨ mn
l1, …, lk, m1, …, mn are literals; li and mj are complementary
Technical note: Resulting clause must be factored to contain only one copy of each literal. (See AIMA)

Challenges for Using Resolution
• Only works on clauses
• Only refutation-complete

Conjunctive Normal Form (CNF)
• Any sentence of propositional logic can be converted into an equivalent conjunction (set) of clauses

Conjunctive Normal Form (CNF)
• Eliminate ⇔: α⇔β → (α⇒β) ∧ (β⇒α)
• Eliminate ⇒: α⇒β → ¬α ∨ β
• Move negation inward:
  ¬¬α → α
  ¬(α∨β) → (¬α∧¬β)
  ¬(α∧β) → (¬α∨¬β)
• Distribute ∨ over ∧:
  (α ∨ (β ∧ γ)) → ((α ∨ β) ∧ (α ∨ γ))
AIMA p. 253-254
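The four conversion steps can be implemented directly by structural recursion. A sketch with formulas encoded as nested tuples like ('implies', 'p', ('and', 'q', 'r')); the encoding and function name are my own, for illustration:

```python
def to_cnf(f):
    """Convert a propositional formula (nested tuples) to CNF by the
    textbook steps: eliminate <=>, eliminate =>, push negation in,
    then distribute OR over AND."""
    if isinstance(f, str):                   # atomic proposition
        return f
    op = f[0]
    if op == "iff":                          # a <=> b -> (a => b) & (b => a)
        a, b = f[1], f[2]
        return to_cnf(("and", ("implies", a, b), ("implies", b, a)))
    if op == "implies":                      # a => b -> ~a | b
        return to_cnf(("or", ("not", f[1]), f[2]))
    if op == "not":
        g = f[1]
        if isinstance(g, str):
            return f                         # literal: done
        if g[0] == "not":                    # ~~a -> a
            return to_cnf(g[1])
        if g[0] == "and":                    # De Morgan
            return to_cnf(("or", ("not", g[1]), ("not", g[2])))
        if g[0] == "or":                     # De Morgan
            return to_cnf(("and", ("not", g[1]), ("not", g[2])))
        return to_cnf(("not", to_cnf(g)))    # eliminate =>/<=> first
    a, b = to_cnf(f[1]), to_cnf(f[2])
    if op == "or":                           # distribute | over &
        if not isinstance(a, str) and a[0] == "and":
            return to_cnf(("and", ("or", a[1], b), ("or", a[2], b)))
        if not isinstance(b, str) and b[0] == "and":
            return to_cnf(("and", ("or", a, b[1]), ("or", a, b[2])))
    return (op, a, b)

print(to_cnf(("implies", "p", ("and", "q", "r"))))
# ('and', ('or', ('not', 'p'), 'q'), ('or', ('not', 'p'), 'r'))
```

Each recursive step either eliminates a connective or pushes it inward, so the process terminates; note that distribution can blow the formula up exponentially in the worst case.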

Challenges for Using Resolution
• Only works on clauses
• Convert KB & query to clauses (CNF)
• Only refutation-complete


Resolution is Refutation-Complete
• If a set of clauses is unsatisfiable, then resolution can derive the empty clause (□)
• The empty clause: a clause with zero literals

Entailment and Satisfiability
• If a set of clauses is unsatisfiable, then resolution can derive the empty clause (□)
• KB ⊨ β
  • iff every model of KB is a model of β
  • iff no model of KB is a model of ¬β
  • iff there are no models of KB ∪ { ¬β }
  • iff KB ∪ { ¬β } is unsatisfiable

Resolution Refutation
• Convert KB ∪ { ¬β } to CNF
• Apply the resolution rule until:
  • No new clauses can be added: conclude KB ⊭ β
  • Two clauses resolve to yield the empty clause (contradiction): conclude KB ⊨ β

Resolution Refutation
Given what I know… Is there no pit in room [1,2]?
2: B1,1 ⇔ (P1,2 ∨ P2,1) 4: ¬B1,1
¬P1,2?

Resolution Refutation
KB = { B1,1 ⇔ (P1,2 ∨ P2,1), ¬B1,1 } Query: ¬P1,2?

Add ¬Query to KB
B1,1 ⇔ (P1,2 ∨ P2,1), ¬B1,1, ¬¬P1,2

Convert to CNF
B1,1 ⇔ (P1,2 ∨ P2,1), ¬B1,1, ¬¬P1,2
B1,1⇒(P1,2∨P2,1), (P1,2∨P2,1)⇒B1,1, ¬B1,1, ¬¬P1,2
¬B1,1∨(P1,2∨P2,1), ¬(P1,2∨P2,1)∨B1,1, ¬B1,1, ¬¬P1,2
¬B1,1∨(P1,2∨P2,1), (¬P1,2∧¬P2,1)∨B1,1, ¬B1,1, P1,2
¬B1,1∨(P1,2∨P2,1), (¬P1,2∨B1,1), (¬P2,1∨B1,1), ¬B1,1, P1,2
¬B1,1∨P1,2∨P2,1, ¬P1,2∨B1,1, ¬P2,1∨B1,1, ¬B1,1, P1,2

Resolution Refutation
1: ¬B1,1∨P1,2∨P2,1
2: ¬P1,2∨B1,1
3: ¬P2,1∨B1,1
4: ¬B1,1
5: P1,2

Premises | Literal | Result
2, 4 | B1,1 | 6: ¬P1,2
5, 6 | P1,2 | 7: □
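The whole refutation loop fits in a short sketch. Clauses are frozensets of string literals ('~' standing in for ¬, an encoding chosen here for illustration), so factoring happens automatically; the clause list below encodes KB ∪ { ¬query } from this slide.

```python
from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """Yield all binary resolvents of two clauses (sets of literals)."""
    for lit in c1:
        if negate(lit) in c2:
            yield frozenset((c1 - {lit}) | (c2 - {negate(lit)}))

def pl_resolution(clauses):
    """True iff the clause set is unsatisfiable (derives the empty clause)."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:               # empty clause: contradiction found
                    return True
                new.add(r)
        if new <= clauses:              # saturated with no new clauses
            return False
        clauses |= new

# KB ∪ {¬query} in CNF, from the slides (query: ¬P1,2)
clauses = [{"~B11", "P12", "P21"}, {"~P12", "B11"},
           {"~P21", "B11"}, {"~B11"}, {"P12"}]
print(pl_resolution(clauses))   # True: unsatisfiable, so KB ⊨ ¬P1,2
```

This is AIMA's PL-Resolution in miniature: saturate the clause set, succeeding as soon as the empty clause appears.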

Resolution Refutation
1: ¬B1,1∨P1,2∨P2,1   2: ¬P1,2∨B1,1   3: ¬P2,1∨B1,1   4: ¬B1,1   5: P1,2

[Proof tree: 2 and 4 resolve to 6: ¬P1,2; then 6 and 5 resolve to 7: □]

Resolution Refutation
• Convert KB ∪ { ¬β } to CNF
• Apply the resolution rule until:
  • No new clauses can be added: conclude KB ⊭ β
  • Two clauses resolve to yield the empty clause (contradiction): conclude KB ⊨ β

Resolution
• Complete when used in a refutation (proof by contradiction)
• Search challenges remaining:
• Which clauses to resolve?
• On which complementary literals?

Resolution
• Complete when used in a refutation (proof by contradiction)
• Search challenges remaining:
• Which clauses to resolve?
• On which complementary literals?
Intractable!

But wait there’s more…

Theorem Proving
• States are sets of sentences (KBs)
• Actions are applying inference rules
  • Actions(s) = { rm | Match(Premises(r), KB) = m }
  • Result(rm, s) = s ∪ Subst(m, Conclusions(r))
• Initial state: initial KB
• Goal test: query ∈ KB

Inference for Knowledge-Based Agents
• Data-driven (forward chaining)
• Goal-directed (backward chaining)

Forward Chaining
• Given new fact φ (often from perception)
• Add φ to agenda
• While agenda is not empty
  • Remove sentence α from agenda
  • Add α to KB
  • Apply inference using only rules whose premises include α
  • Add each conclusion β to agenda
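For definite-clause KBs this agenda loop is essentially AIMA's PL-FC-Entails, which runs in time linear in the KB size. A sketch with rules as (premises, conclusion) pairs; the example propositions are made up for illustration:

```python
def forward_chain(rules, facts, query):
    """Data-driven inference for definite clauses.
    rules: list of (set_of_premises, conclusion); facts: known atoms."""
    count = {i: len(p) for i, (p, _) in enumerate(rules)}  # unmet premises
    inferred, agenda = set(), list(facts)
    while agenda:
        p = agenda.pop()
        if p == query:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (prem, concl) in enumerate(rules):
            if p in prem:
                count[i] -= 1
                if count[i] == 0:          # all premises known: fire rule
                    agenda.append(concl)
    return False

rules = [({"Hungry"}, "Cranky"), ({"Cranky", "Tired"}, "Napping")]
print(forward_chain(rules, {"Hungry", "Tired"}, "Napping"))   # True
```

Each rule fires at most once, which is where the linear-time bound comes from.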

Forward Chaining
• Reasons forward from new facts
• Data-driven
• Done by humans, to some extent
• When to stop?

Forward Chaining
• Reasons forward from new facts
• Data-driven
• Done by humans, to some extent
• When to stop?
• For KBs using only definite clauses:
  • Sound, complete, linear time
AIMA 7.5.4

Backward Chaining
• In order to prove β
  • Find an implication whose conclusion is β
  • Try to prove its premises α (recursively)
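The recursive goal-directed version can be sketched the same way; the `seen` set, an addition beyond the slide's outline, guards against looping on cyclic rules. The example propositions are made up for illustration:

```python
def back_chain(rules, facts, goal, seen=frozenset()):
    """Goal-directed inference for definite clauses: prove goal by finding
    a rule concluding it and recursively proving that rule's premises."""
    if goal in facts:
        return True
    if goal in seen:                       # avoid infinite regress
        return False
    return any(all(back_chain(rules, facts, p, seen | {goal}) for p in prem)
               for prem, concl in rules if concl == goal)

rules = [({"Hungry"}, "Cranky"), ({"Cranky", "Tired"}, "Napping")]
print(back_chain(rules, {"Hungry", "Tired"}, "Napping"))   # True
```

Unlike forward chaining, only rules relevant to the query are ever touched, which is the point of goal-directed inference.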

Backward Chaining
• Reasons backward from query
• Goal-directed
• Useful for answering specific questions

Backward Chaining
• Reasons backward from query
• Goal-directed
• Useful for answering specific questions
• For KBs using only definite clauses:
  • Sound, complete, linear time
AIMA 7.5.4

Propositional Theorem Proving
• Inference rules: Soundness, Completeness
• Proof: α ⊢ β
  • Searching for proofs is an alternative to enumerating models; “can be more efficient”
• Resolution is a sound and complete inference rule
  • Works on clauses (CNF); requires refutation
• Forward and backward chaining

For next time:
AIMA 8.0-8.3; 8.1.1-8.1.2 fyi