
CSC242 Lecture 2.3 Propositional Theorem Proving

CSC242:

Introduction to

Artificial Intelligence

Lecture 2.3

Please put away all electronic devices

Boolean CSP
• All variables must be Booleans

• Domains all { true, false }

• Constraints: Identify possible
combinations of the boolean variables

Propositional Logic

Aristotle

(384 BC – 322 BC)

George Boole

(1815-1864)

Propositional Logic
• Propositions: things that are true or

false

• Connectives: combine propositions into

larger propositions

• Sentences: statements about the world

(can be true or false)

• Boolean functions of Boolean variables

Truth Table

L1,1  W1,2  W2,1  W1,2∨W2,1  L1,1∧(W1,2∨W2,1)
F     F     F     F          F
F     F     T     T          F
F     T     F     T          F
F     T     T     T          F
T     F     F     F          F
T     F     T     T          T
T     T     F     T          T
T     T     T     T          T

L1,1 ∧ (W1,2∨W2,1)
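A truth table like this one can be enumerated mechanically; a minimal sketch in Python (variable names are ours, matching the slide's symbols):

```python
# Enumerate all 2^3 possible worlds for L1,1 & (W1,2 | W2,1),
# reproducing the truth table above row by row.
from itertools import product

def sentence(l11, w12, w21):
    return l11 and (w12 or w21)

rows = []
for l11, w12, w21 in product([False, True], repeat=3):
    rows.append((l11, w12, w21, sentence(l11, w12, w21)))

assert len(rows) == 8
# The sentence is true in exactly 3 of the 8 possible worlds (the last column).
assert sum(1 for *_, v in rows if v) == 3
```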

Propositional Logic
• Possible worlds

• Models

• Satisfiability

• Unsatisfiable

Background Knowledge
B1,1 ⇔ P1,2 ∨ P2,1
B1,2 ⇔ P1,1 ∨ P2,2 ∨ P3,1
B2,2 ⇔ P1,2 ∨ P2,3 ∨ P3,2 ∨ P2,1

S1,1 ⇔ W1,2 ∨ W2,1
S1,2 ⇔ W1,1 ∨ W2,2 ∨ W3,1
S2,2 ⇔ W1,2 ∨ W2,3 ∨ W3,2 ∨ W2,1

OK1,1 ⇔ ¬(P1,1 ∨ W1,1)
OK1,2 ⇔ ¬(P1,2 ∨ W1,2)
OK2,1 ⇔ ¬(P2,1 ∨ W2,1)

W1,1 ∨ W1,2 ∨ … ∨ W3,4 ∨ W4,4
¬(W1,1 ∧ W1,2), ¬(W1,1 ∧ W1,3), …, ¬(W3,4 ∧ W4,4)
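The last two lines ("at least one wumpus" as one big disjunction, and pairwise "not both" sentences for "at most one wumpus") follow a mechanical pattern; a sketch, with our own string encoding of the propositions:

```python
# Generate the at-least-one and at-most-one wumpus sentences for a 4x4 cave.
from itertools import combinations, product

squares = list(product(range(1, 5), repeat=2))   # all (x, y) squares

def W(x, y):
    return f"W{x},{y}"

# At least one wumpus: a single clause with all 16 W literals.
at_least_one = [W(x, y) for (x, y) in squares]

# At most one wumpus: for every pair of squares, not both.
at_most_one = [(f"~{W(*a)}", f"~{W(*b)}") for a, b in combinations(squares, 2)]

assert len(at_least_one) == 16
assert len(at_most_one) == 16 * 15 // 2   # 120 pairwise clauses
```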

Knowledge-Based Agents
Perceive → Infer → Act

KB

Perception

[Wumpus World map — AIMA Fig. 7.2: 4×4 grid, agent starts at [1,1]; pits, breezes, stenches, the Wumpus, and the gold as shown]

¬B1,1, ¬S1,1

Inference
• Given what I know…

• What should I do?

See AIMA 7.7

Inference

Given what I know… Is there no pit in room [2,1]?

R1: ¬P1,1
R2: B1,1 ⇔ (P1,2 ∨ P2,1)
¬B1,1

¬P2,1?

Entailment
• α entails β : α ⊨ β

• Every model of α is also a model of β

• Whenever α is true, so is β

• β is true in every world consistent with α

• β logically follows from α

Models(α) ⊆ Models(β)

Model Checking
• Given knowledge α and query β

• For every possible world w

• If α is satisfied by w but β is not satisfied by w

• Conclude that α ⊭ β

• If no such w exists, conclude that α ⊨ β
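The loop above can be sketched directly (cf. TT-ENTAILS?, AIMA Fig. 7.10); representing sentences as Python predicates over a world dictionary is our own choice:

```python
# Model checking: alpha entails beta iff no world satisfies alpha but not beta.
from itertools import product

def tt_entails(alpha, beta, symbols):
    """Return True iff every model of alpha is also a model of beta."""
    for values in product([False, True], repeat=len(symbols)):
        w = dict(zip(symbols, values))
        if alpha(w) and not beta(w):
            return False        # a model of alpha that is not a model of beta
    return True                 # alpha |= beta

# The slides' pit example: R1 ∧ R2 ∧ ¬B1,1 entails ¬P2,1.
kb = lambda w: (not w["P11"]) and (w["B11"] == (w["P12"] or w["P21"])) \
               and not w["B11"]
query = lambda w: not w["P21"]
assert tt_entails(kb, query, ["P11", "P12", "P21", "B11"])
```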

Model Checking
• Same procedure as above; see AIMA Fig. 7.10 (TT-ENTAILS?)

Model Checking

P1,1 P1,2 P2,1 B1,1
F F F F
F F F T
F F T F
… …
T T F T
T T T F
T T T T

Possible Worlds

Model Checking

P1,1 P1,2 P2,1 B1,1 R1 R2 ¬B1,1 ¬P2,1
F F F F T T T
F F F T T F F
F F T F T F T
… …
T T F T F T F
T T T F F F T
T T T T F T F

Knowledge


Model Checking

P1,1 P1,2 P2,1 B1,1 R1 R2 ¬B1,1 ¬P2,1
F F F F T T T T
F F F T T F F
F F T F T F T
… …
T T F T F T F
T T T F F F T
T T T T F T F

Query

Entailment

Given what I know… Is there no pit in room [2,1]?

R1: ¬P1,1
R2: B1,1 ⇔ (P1,2 ∨ P2,1)
¬B1,1

¬P2,1?

KB ⊨ ¬P2,1

Entailment

Given what I know… Is there no pit in room [1,2]?

R1: ¬P1,1
R2: B1,1 ⇔ (P1,2 ∨ P2,1)
R3: B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
¬B1,1
B2,1

¬P1,2?

KB ⊨ ¬P1,2

KB ⊭ ¬P2,2
KB ⊭ P2,2

But…

Model Checking
• Given knowledge α and query β

• For every possible world w

• If α is satisfied by w but β is not satisfied by w

• Conclude that α ⊭ β

• If no such w exists, conclude that α ⊨ β

Model Checking

P1,1 P1,2 … OK1,1 OK2,1 R1 R2 … ¬B1,1 ¬S1,1 OK2,1
F F … F F F … … … T
… …
F F … T T T T … … T T
… …
T T … F T F … … … F
T T … T F T F … … T
T T … T T T T … … F

n propositions; m sentences, each with O(k) connectives

• Evaluating one sentence in one world: O(k)

• Evaluating all m sentences in one world: O(mk)

• Number of possible worlds: O(2^n)

• Total: O(2^n mk). Intractable!

Entailment
• α entails β : α ⊨ β

• Every model of α is also a model of β

• Whenever α is true, so is β

• β is true in every world consistent with α

• β logically follows from α

Models(α) ⊆ Models(β)

co-NP-complete!

Propositional Logic
• Programming language for knowledge

• Factored representation (Boolean CSP)

• Propositions, connectives, sentences

• Possible worlds, satisfiability, models

• Entailment: α ⊨ β

• Every model of α is a model of β

• Model checking intractable!


Darn model checking…
so intractable

Rule: If you know α, then you also know β.

No model checking!

Seems impossible…

KB = { Hungry ⇒ Cranky,
Hungry }

Cranky?

Hungry Cranky Hungry ⇒ Cranky

F F T
F T T
T F F
T T T

Only the last row satisfies both KB sentences, and Cranky is true there.

KB ⊨ Cranky

Inference Rule

Hungry ⇒ Cranky, Hungry
Cranky

Premises (Antecedents) above the line;
Conclusions (Consequents) below.

Hungry Cranky Hungry ⇒ Cranky

F F T
F T T
T F F
T T T

Inference Rule

φ ⇒ ψ, φ
ψ

φ ψ φ ⇒ ψ
F F T
F T T
T F F
T T T

Modus Ponens

φ ⇒ ψ, φ
ψ

MP:

“mode that affirms”
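Soundness of Modus Ponens can be checked by brute force: in every world where both premises hold, the conclusion holds too. A minimal sketch:

```python
# Verify Modus Ponens over all four (phi, psi) worlds.
from itertools import product

def implies(p, q):
    return (not p) or q

for phi, psi in product([False, True], repeat=2):
    if implies(phi, psi) and phi:   # both premises hold in this world
        assert psi                  # the conclusion never fails
```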

Derivation

{ Hungry ⇒ Cranky, Hungry } ⊢MP Cranky

Derivation

{ Hungry ⇒ Cranky, Hungry } ⊢ Cranky


α ⊢ β ⊢ γ ⊢ … ⊢ ω

α ⊢ ω

α ⊨ β: β logically follows from α (intractable to compute)

α ⊢ β: β is derivable from α (easier to compute)

But does β really
follow from α ?


Soundness

If α ⊢ β then α ⊨ β

Modus Ponens is sound

Affirming the Consequent

φ ⇒ ψ, ψ
φ

Hungry ⇒ Cranky, Cranky
Hungry

φ ψ φ ⇒ ψ
F F T
F T T
T F F
T T T

Unsound! In row 2 (φ=F, ψ=T) both premises hold but the conclusion fails:

{ φ ⇒ ψ, ψ } ⊭ φ

Modus Tollens

MT:
φ ⇒ ψ, ¬ψ
¬φ

Hungry ⇒ Cranky, ¬Cranky
¬Hungry

φ ψ φ ⇒ ψ ¬φ ¬ψ
F F T T T
F T T T F
T F F F T
T T T F F

Sound! Only row 1 (φ=F, ψ=F) satisfies both premises, and ¬φ holds there:

{ φ ⇒ ψ, ¬ψ } ⊨ ¬φ
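The same brute-force check distinguishes the sound rule (Modus Tollens) from the unsound one (affirming the consequent); a sketch, with rules passed in as predicates:

```python
# A rule is sound iff its conclusion holds in every world where its premises hold.
from itertools import product

def sound(premises, conclusion):
    return all(conclusion(phi, psi)
               for phi, psi in product([False, True], repeat=2)
               if premises(phi, psi))

implies = lambda p, q: (not p) or q

# Modus Tollens: { phi => psi, ~psi } entails ~phi.
assert sound(lambda p, q: implies(p, q) and not q, lambda p, q: not p)

# Affirming the consequent: { phi => psi, psi } does NOT entail phi.
assert not sound(lambda p, q: implies(p, q) and q, lambda p, q: p)
```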

Equivalences

For any sentences φ and ψ: if φ ≡ ψ, then

φ
ψ

and

ψ
φ

are inference rules

Equivalences
(α ∧ β) ≡ (β ∧ α) Commutativity of ∧
(α ∨ β) ≡ (β ∨ α) Commutativity of ∨

((α ∧ β) ∧ γ) ≡ (α ∧ (β ∧ γ)) Associativity of ∧

((α ∨ β) ∨ γ) ≡ (α ∨ (β ∨ γ)) Associativity of ∨

¬(¬α) ≡ α Double-negation elimination
(α ⇒ β) ≡ (¬β ⇒ ¬α) Contrapositive
(α ⇒ β) ≡ (¬α ∨ β) Implication elimination

(α ⇔ β) ≡ ((α ⇒ β) ∧ (β ⇒ α)) Biconditional elimination
¬(α ∧ β) ≡ (¬α ∨ ¬β) De Morgan’s law
¬(α ∨ β) ≡ (¬α ∧ ¬β) De Morgan’s law

(α ∧ (β ∨ γ)) ≡ ((α ∧ β) ∨ (α ∧ γ)) Distributivity of ∧ over ∨
(α ∨ (β ∧ γ)) ≡ ((α ∨ β) ∧ (α ∨ γ)) Distributivity of ∨ over ∧

AIMA Fig. 7.11

Inference Rules

And-elimination:
α ∧ β
α

Definition of biconditional:
α ⇔ β
(α ⇒ β) ∧ (β ⇒ α)

(α ⇒ β) ∧ (β ⇒ α)
α ⇔ β

Double negation:
¬¬α
α

De Morgan’s Laws:
¬(α ∨ β)
¬α ∧ ¬β

¬(α ∧ β)
¬α ∨ ¬β

Modus Ponens:
φ ⇒ ψ, φ
ψ

[Wumpus World map — AIMA Fig. 7.2]

Proof
• Inference rules produce theorems derived from other sentences

• The sequence of inference rule applications used in the derivation constitutes a proof of the theorem

Proof

Given what I know… Is there no pit in room [1,2]?

1: ¬P1,1
2: B1,1 ⇔ (P1,2 ∨ P2,1)
3: B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
4: ¬B1,1
5: B2,1

¬P1,2?

Rule Premises Conclusion

Bicond elim 2 6: (B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)
And elim 6 7: ((P1,2 ∨ P2,1) ⇒ B1,1)
Contrapositive 7 8: (¬B1,1 ⇒ ¬(P1,2 ∨ P2,1))
MP 8, 4 9: ¬(P1,2 ∨ P2,1)

De Morgan 9 10: ¬P1,2 ∧ ¬P2,1
And elim 10 11: ¬P1,2



Proof
• Each step’s premises must be in the KB already

• Each step’s conclusion is added to the KB

• The last step derives the query

• If all the inference rules are sound:

KB ⊢ ¬P1,2

KB ⊨ ¬P1,2

Proof as Search
• States are sets of sentences (KBs)

• Actions are applying inference rules

• Actions(s) = { rm | Match(Premises(r), KB)=m }

• Result(rm, s) = s ∪ Subst(m, Conclusions(r))

• Initial state: initial KB

• Goal test: query ∈ KB

Theorem Proving
• Searching for proofs is an alternative to enumerating models

• “In many practical cases, finding a proof can be more efficient because the proof can ignore irrelevant propositions, no matter how many of them there are.”



Theorem Proving
• Need a complete search strategy

• Need a complete set of inference rules


Completeness

If α ⊨ β then α ⊢ β

Modus Ponens is not complete

φ ⇒ ψ, φ
ψ

MP:

Proof

Rule Premises Conclusion

Bicond elim 2 6: (B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)
And elim 6 7: ((P1,2 ∨ P2,1) ⇒ B1,1)
Contrapositive 7 8: (¬B1,1 ⇒ ¬(P1,2 ∨ P2,1))
MP 8, 4 9: ¬(P1,2 ∨ P2,1)

De Morgan 9 10: ¬P1,2 ∧ ¬P2,1
And elim 10 11: ¬P1,2


Theorem Proving
• Need a complete search strategy

• Need a complete set of inference rules

• Or a single complete inference rule

Hungry ∨ Cranky, ¬Hungry
Cranky

[Wumpus World maps (a) and (b) — AIMA Fig. 7.4. Legend: A = Agent, B = Breeze, G = Glitter/Gold, P = Pit, S = Stench, W = Wumpus, OK = Safe square, V = Visited, P? = possible pit]

B2,1

P1,1 ∨ P2,2 ∨ P3,1

¬P1,1

P2,2 ∨ P3,1

¬P2,2

P3,1

Reasoning By Cases

If A or B is true and you know it’s not A, then it must be B

l1 ∨ … ∨ li ∨ … ∨ lk        ¬li
l1 ∨ … ∨ li-1 ∨ li+1 ∨ … ∨ lk

li is gone; l1, …, lk and ¬li are literals

More generally, for a clause and a literal m:

l1 ∨ … ∨ li ∨ … ∨ lk        m
l1 ∨ … ∨ li-1 ∨ li+1 ∨ … ∨ lk

where li and m are complementary: m = ¬li or li = ¬m

Unit Resolution

l1 ∨ · · · ∨ li ∨ · · · ∨ lk,   m
l1 ∨ · · · ∨ li-1 ∨ li+1 ∨ · · · ∨ lk

l1, …, lk and m are literals

li and m are complementary

(a clause and a unit clause)

1. Hungry ∨ Cranky
2. ¬Hungry
Q: Cranky?

Unit Res: Hungry 1,2 3. Cranky

KB ⊢UR Cranky
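This derivation can be sketched in code; representing clauses as frozensets of literal strings with "~" for negation is our own encoding, not the slides':

```python
# Unit resolution: drop the complement of the unit literal from the clause.
def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def unit_resolve(clause, unit):
    """Resolve a clause against a unit clause {m}."""
    m = next(iter(unit))
    assert negate(m) in clause, "no complementary literal"
    return clause - {negate(m)}

c1 = frozenset({"Hungry", "Cranky"})
c2 = frozenset({"~Hungry"})
assert unit_resolve(c1, c2) == frozenset({"Cranky"})
```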

1. P1,1 ∨ P2,2 ∨ P3,1
2. ¬P1,1
3. ¬P2,2
Q: P3,1?

[Wumpus World maps — AIMA Fig. 7.4]

Unit Res: P1,1 1,2 4. P2,2 ∨ P3,1

Unit Res: P2,2 4,3 5. P3,1

KB ⊢UR P3,1

Unit Resolution
• Sound: if α ⊢ β then α ⊨ β

• Easy to show

• Not complete: α ⊨ β does not guarantee α ⊢ β

• Give a counterexample

Hungry ∨ Cranky, ¬Hungry ∨ Sleepy
Cranky ∨ Sleepy

Resolution

l1 ∨ · · · ∨ li ∨ · · · ∨ lk,   m1 ∨ · · · ∨ mj ∨ · · · ∨ mn
l1 ∨ · · · ∨ li-1 ∨ li+1 ∨ · · · ∨ lk ∨ m1 ∨ · · · ∨ mj-1 ∨ mj+1 ∨ · · · ∨ mn

l1, …, lk, m1, …, mn are literals

li and mj are complementary (li and mj are gone from the resolvent)

Technical note: the resulting clause must be factored to contain only one copy of each literal. (See AIMA)

Rule Premises Conclusion

Resolution: Hungry 1,2 4. Cranky ∨ ¬Sleepy
Resolution: Sleepy 3,4 5. Cranky ∨ Cranky
Factoring 5 6. Cranky

1. Hungry ∨ Cranky
2. ¬Sleepy ∨ ¬Hungry
3. Cranky ∨ Sleepy

KB ⊢ Cranky
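The full rule can be sketched the same way; using sets of literals makes the required factoring automatic (duplicate literals collapse):

```python
# General resolution of two clauses (frozensets of literals, "~" = negation).
def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(ci, cj):
    """All resolvents of clauses ci and cj, one per complementary pair."""
    out = []
    for li in ci:
        if negate(li) in cj:
            out.append(frozenset((ci - {li}) | (cj - {negate(li)})))
    return out

c1 = frozenset({"Hungry", "Cranky"})
c2 = frozenset({"~Sleepy", "~Hungry"})
c3 = frozenset({"Cranky", "Sleepy"})

(c4,) = resolve(c1, c2)            # 4: Cranky v ~Sleepy
assert c4 == frozenset({"Cranky", "~Sleepy"})
(c6,) = resolve(c3, c4)            # Cranky v Cranky, factored to Cranky
assert c6 == frozenset({"Cranky"})
```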

Resolution
• Sound: if α ⊢ β then α ⊨ β

• Easy to show

• Not complete: α ⊨ β does not guarantee α ⊢ β

• Give a counterexample

Resolution is

Refutation-Complete

• If a set of clauses is unsatisfiable, then
resolution can derive the empty clause
(□)

AIMA p. 255


Challenges for Using
Resolution

• Only works on clauses

• Only refutation-complete

Conjunctive Normal Form (CNF)

• Any sentence of propositional logic can be converted into an equivalent conjunction (set) of clauses

Conjunctive Normal Form
(CNF)

• Eliminate ⇔: α⇔β → (α⇒β) ∧ (β⇒α)

• Eliminate ⇒: α⇒β → ¬α ∨ β

• Move negation in:

• ¬¬α → α

• ¬(α ∨ β) → (¬α ∧ ¬β)

• ¬(α ∧ β) → (¬α ∨ ¬β)

• Distribute ∨ over ∧:

• (α ∨ (β ∧ γ)) → ((α ∨ β ) ∧ (α ∨ γ))

AIMA p. 253-254
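The four steps can be sketched as a recursive rewriter; the tuple encoding of sentences below is our own, not the slides':

```python
# CNF conversion over sentences built from tuples:
# ("~", a), ("&", a, b), ("|", a, b), ("=>", a, b), ("<=>", a, b), or an atom.
def to_cnf(s):
    if isinstance(s, str):
        return s
    op = s[0]
    if op == "<=>":                              # eliminate <=>
        a, b = s[1], s[2]
        return to_cnf(("&", ("=>", a, b), ("=>", b, a)))
    if op == "=>":                               # eliminate =>
        return to_cnf(("|", ("~", s[1]), s[2]))
    if op == "~":                                # move negation in
        a = s[1]
        if isinstance(a, str):
            return s
        if a[0] == "~":                          # double negation
            return to_cnf(a[1])
        if a[0] == "&":                          # De Morgan
            return to_cnf(("|", ("~", a[1]), ("~", a[2])))
        if a[0] == "|":
            return to_cnf(("&", ("~", a[1]), ("~", a[2])))
        return to_cnf(("~", to_cnf(a)))          # push past => / <=> first
    a, b = to_cnf(s[1]), to_cnf(s[2])
    if op == "|":                                # distribute | over &
        if isinstance(a, tuple) and a[0] == "&":
            return to_cnf(("&", ("|", a[1], b), ("|", a[2], b)))
        if isinstance(b, tuple) and b[0] == "&":
            return to_cnf(("&", ("|", a, b[1]), ("|", a, b[2])))
    return (op, a, b)

# ~(P | Q) becomes ~P & ~Q:
assert to_cnf(("~", ("|", "P", "Q"))) == ("&", ("~", "P"), ("~", "Q"))
```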

Challenges for Using
Resolution

• Only works on clauses

• Convert KB & query to clauses (CNF)

• Only refutation-complete


Resolution is

Refutation-Complete

• If a set of clauses is unsatisfiable, then
resolution can derive the empty clause
(□)

a clause with zero literals

Entailment and
Satisfiability

• If a set of clauses is unsatisfiable, then
resolution can derive the empty clause (□)

• KB ⊨ β
• iff every model of KB is a model of β

• iff no model of KB is a model of ¬β

• iff there are no models of KB ∪ { ¬β }

• iff KB ∪ { ¬β } is unsatisfiable

Resolution Refutation
• Convert KB ∪ { ¬β } to CNF

• Apply resolution rule until:

• No new clauses can be added: KB ⊭ β

• Two clauses resolve to yield the empty clause (contradiction): KB ⊨ β
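The loop above can be sketched directly; the clause encoding (frozensets of literal strings, "~" for negation) is our own:

```python
# Resolution refutation: the clause set is unsatisfiable iff the empty
# clause is derivable.
from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolvents(ci, cj):
    for li in ci:
        if negate(li) in cj:
            yield frozenset((ci - {li}) | (cj - {negate(li)}))

def pl_resolution(clauses):
    """Return True iff the clause set is unsatisfiable."""
    clauses = set(clauses)
    while True:
        new = set()
        for ci, cj in combinations(clauses, 2):
            for r in resolvents(ci, cj):
                if not r:
                    return True          # derived the empty clause
                new.add(r)
        if new <= clauses:
            return False                 # no new clauses can be added
        clauses |= new

# KB = { B1,1 <=> (P1,2 v P2,1), ~B1,1 } in CNF, plus the negated query P1,2:
cnf = [frozenset({"~B11", "P12", "P21"}),
       frozenset({"~P12", "B11"}),
       frozenset({"~P21", "B11"}),
       frozenset({"~B11"}),
       frozenset({"P12"})]
assert pl_resolution(cnf)                # contradiction, so KB |= ~P1,2
```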

Resolution Refutation

Given what I know… Is there no pit in room [1,2]?

2: B1,1 ⇔ (P1,2 ∨ P2,1)
4: ¬B1,1

¬P1,2?

Resolution Refutation

KB = { B1,1 ⇔ (P1,2 ∨ P2,1), ¬B1,1 }

Query: ¬P1,2?

Add ¬Query to KB

B1,1 ⇔ (P1,2 ∨ P2,1), ¬B1,1, ¬¬P1,2

Convert to CNF

B1,1 ⇔ (P1,2 ∨ P2,1), ¬B1,1, ¬¬P1,2
B1,1⇒(P1,2∨P2,1), (P1,2∨P2,1)⇒B1,1, ¬B1,1, ¬¬P1,2
¬B1,1∨(P1,2∨P2,1), ¬(P1,2∨P2,1)∨B1,1, ¬B1,1, ¬¬P1,2

¬B1,1∨(P1,2∨P2,1), (¬P1,2∧¬P2,1)∨B1,1, ¬B1,1, P1,2

¬B1,1∨(P1,2∨P2,1), (¬P1,2∨B1,1), (¬P2,1∨B1,1), ¬B1,1, P1,2

¬B1,1∨P1,2∨P2,1, ¬P1,2∨B1,1, ¬P2,1∨B1,1, ¬B1,1, P1,2

Resolution Refutation
1: ¬B1,1∨P1,2∨P2,1
2: ¬P1,2∨B1,1
3: ¬P2,1∨B1,1
4: ¬B1,1
5: P1,2

Premises Literal Result

2,4 B1,1 6: ¬P1,2
5,6 P1,2 7: □

Resolution Refutation
1: ¬B1,1∨P1,2∨P2,1
2: ¬P1,2∨B1,1
3: ¬P2,1∨B1,1
4: ¬B1,1
5: P1,2

[Resolution tree: 2 and 4 resolve to 6: ¬P1,2; then 6 and 5 resolve to 7: □]



Resolution
• Complete when used in a refutation

(proof by contradiction)

• Search challenges remaining:

• Which clauses to resolve?

• On which complementary literals?

Intractable!

But wait, there’s more…

Theorem Proving
• States are sets of sentences (KBs)

• Actions are applying inference rules

• Actions(s) = { rm | Match(Premises(r), KB)=m }

• Result(rm, s) = s ∪ Subst(m, Conclusions(r))

• Initial state: initial KB

• Goal test: query ∈ KB

Inference for
Knowledge-Based Agents

• Data-driven (forward chaining)

• Goal-directed (backward chaining)

Forward Chaining
• Given new fact φ (often perception)

• Add φ to agenda

• While agenda is not empty

• Remove sentence α from agenda

• Add α to KB

• Apply inference using only rules whose premises include α

• Add conclusion β to agenda


Forward Chaining
• Reasons forward from new facts

• Data-driven

• Done by humans—to some extent

• When to stop?

• For KBs using only definite clauses

• Sound, complete, linear time
AIMA 7.5.4
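The agenda loop above can be sketched for definite clauses (cf. PL-FC-ENTAILS?, AIMA 7.5.4); the rules and atoms in the example are hypothetical:

```python
# Forward chaining for definite clauses: rules are (premises, conclusion)
# pairs over atoms; a premise counter tells us when a rule fires.
from collections import deque

def fc_entails(rules, facts, query):
    count = {i: len(prem) for i, (prem, _) in enumerate(rules)}
    inferred = set()
    agenda = deque(facts)
    while agenda:
        p = agenda.popleft()
        if p == query:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (prem, concl) in enumerate(rules):
            if p in prem:
                count[i] -= 1
                if count[i] == 0:        # all premises now known
                    agenda.append(concl)
    return False

rules = [({"Hungry"}, "Cranky"), ({"Cranky", "Tired"}, "Napping")]
assert fc_entails(rules, ["Hungry", "Tired"], "Napping")
```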

Backward Chaining
• In order to prove β

• Find an implication whose conclusion is β

• Try to prove its premises α (recursively)


Backward Chaining
• Reasons backward from query

• Goal-directed

• Useful for answering specific questions

• For KBs using only definite clauses

• Sound, complete, linear time

AIMA 7.5.4
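The recursive goal-directed strategy can be sketched for the same definite-clause representation; the example atoms are hypothetical:

```python
# Backward chaining: to prove a goal, either find it among the facts or
# find a rule concluding it and prove all its premises recursively.
def bc_entails(rules, facts, goal, seen=frozenset()):
    if goal in facts:
        return True
    if goal in seen:                     # avoid looping on a repeated goal
        return False
    return any(all(bc_entails(rules, facts, p, seen | {goal}) for p in prem)
               for prem, concl in rules if concl == goal)

rules = [({"Hungry"}, "Cranky"), ({"Cranky", "Tired"}, "Napping")]
assert bc_entails(rules, {"Hungry", "Tired"}, "Napping")
```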

Propositional Theorem Proving

• Inference rules: Soundness, Completeness

• Proof: α ⊢ β

• Searching for proofs is an alternative to enumerating models; “can be more efficient”

• Resolution is a sound and complete inference rule

• Works on clauses (CNF); requires refutation

• Forward and backward chaining

For next time:

AIMA 8.0-8.3;

8.1.1-8.1.2 fyi