Introduction to AI
Logic for Knowledge Representation and Automated Reasoning
Francesca Toni
Outline
• Resolution and unification and their use for
automated reasoning
• Foundations of logic programming for
knowledge representation and automated
reasoning
Recommended reading:
(most of) Chapters 7-9
Additional reading:
Chapter 5
Knowledge representation and
automated reasoning
Problem → (Symbolic) Representation → (Automated) Reasoning → output → Solution
Automated Reasoning
In order to find a solution to a problem:
1. find a suitable representation for the problem, equipped with an
automated reasoning mechanism (for automatic computation of outputs)
2. compute the output
3. map the output back into a solution to the original problem
Example
Problem: is Francesca covered?
(Symbolic) representation: logic program (“pure” Prolog program):
covered(X) ← ah(X), tr(X,C), pr(C), not ¬covered(X,C)
¬covered(X,C) ← ¬reg(C,X), not in(X,C)
ah(ft)   tr(ft,alpha)   pr(alpha)
¬reg(alpha,ft)   in(ft,alpha)
(Automated) reasoning: SLDNF (Prolog), with query covered(ft)?
Output: success (or failure)
Solution: Yes (or No)
From knowledge representation and
automated reasoning to…
• Verification
– Model checking
– …
• Explanation
• Machine Learning
– Inductive Logic Programming   https://www.doc.ic.ac.uk/~shm/ilp.html
– Differentiable Inductive Logic Programming   https://arxiv.org/abs/1711.04574
– Neural Inductive Logic Programming
– Logic Tensor Networks
– …
https://arxiv.org/abs/1705.11040
https://arxiv.org/abs/1705.08968
https://www.ijcai.org/proceedings/2017/0158.pdf
Logic for
Knowledge Representation and (Automated) Reasoning
• How to represent knowledge underlying a (solution
to a) problem in a form that a machine can
automatically reason with?
• Logic-based mechanisms, as logic is equipped with
– formal languages (representation)
– automated theorem provers (reasoning)
A brief history of reasoning
450b.c. Stoics propositional logic, inference (maybe)
322b.c. Aristotle “syllogisms” (inference rules), quantifiers
1565 Cardano probability theory (propositional logic + uncertainty)
1847 Boole propositional logic (again)
1879 Frege first-order logic (FOL)
1922 Wittgenstein proof by truth tables
1930 Gödel ∃ complete algorithm for FOL
1930 Herbrand complete algorithm for FOL (reduce to propositional)
1931 Gödel ∄ complete algorithm for arithmetic
1960 Davis/Putnam “practical” algorithm for propositional logic
1965 Robinson “practical” algorithm for FOL—resolution
1982 Martelli/Montanari “practical” algorithm for unification
Clausal form
• Resolution works with (sets/conjunctions of) clauses:
¬p1 ∨ … ∨¬pm ∨ q1 ∨ … ∨ qn
where each pi and each qj is an atom, m ≥ 0, n ≥ 0
(m = n = 0: empty clause/false/contradiction,
often written □)
• Every clause can be written equivalently as an
implication:
p1 ∧ … ∧ pm → q1 ∨ … ∨ qn
often written as:
q1 ∨ … ∨ qn ← p1 ∧ … ∧ pm
Clausal form: note
• every formula in propositional logic is logically equivalent to a conjunction of
clauses (conjunctive normal form), e.g.:
(A ∧ B) ∨ ¬C is equivalent to (A ∨ ¬C) ∧ (B ∨ ¬C)
(A ∧ (B → C)) → D is equivalent to (¬A ∨ B ∨ D) ∧ (¬A ∨ ¬C ∨ D)
• every sentence in first-order logic can be written equivalently as a conjunction
of clauses (universal quantification + conjunctive normal form + Skolemization),
e.g.:
∃x P(x) becomes P(SK1)
∃x (P(x) ∧ ∀y Q(x, y)) becomes ∀y (P(SK2) ∧ Q(SK2, y))
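The propositional half of this conversion can be sketched in code. The following Python is my own illustration (the slides give no algorithm); the tuple representation and function names are assumptions. It applies the standard three steps: eliminate →, push ¬ inwards, distribute ∨ over ∧.

```python
# A minimal propositional CNF converter. Formulas are nested tuples:
# ('imp', a, b), ('and', a, b), ('or', a, b), ('not', a); atoms are strings.

def elim_imp(f):
    if isinstance(f, str):
        return f
    if f[0] == 'imp':                    # a -> b  ==  ~a v b
        return ('or', ('not', elim_imp(f[1])), elim_imp(f[2]))
    return (f[0],) + tuple(elim_imp(x) for x in f[1:])

def push_not(f):
    """Push negations down to atoms (negation normal form)."""
    if isinstance(f, str):
        return f
    if f[0] == 'not':
        g = f[1]
        if isinstance(g, str):
            return f
        if g[0] == 'not':                # double negation
            return push_not(g[1])
        if g[0] == 'and':                # De Morgan
            return ('or', push_not(('not', g[1])), push_not(('not', g[2])))
        if g[0] == 'or':
            return ('and', push_not(('not', g[1])), push_not(('not', g[2])))
    return (f[0],) + tuple(push_not(x) for x in f[1:])

def distribute(f):
    """Distribute v over ^ so that the result is a conjunction of clauses."""
    if isinstance(f, str) or f[0] == 'not':
        return f
    a, b = distribute(f[1]), distribute(f[2])
    if f[0] == 'or':
        if not isinstance(a, str) and a[0] == 'and':
            return ('and', distribute(('or', a[1], b)), distribute(('or', a[2], b)))
        if not isinstance(b, str) and b[0] == 'and':
            return ('and', distribute(('or', a, b[1])), distribute(('or', a, b[2])))
    return (f[0], a, b)

def cnf(f):
    return distribute(push_not(elim_imp(f)))

# The slide's example: (A ^ B) v ~C  ==>  (A v ~C) ^ (B v ~C)
print(cnf(('or', ('and', 'A', 'B'), ('not', 'C'))))
```

The second slide example works too: `cnf(('imp', ('and', 'A', ('imp', 'B', 'C')), 'D'))` yields (¬A ∨ B ∨ D) ∧ (¬A ∨ ¬C ∨ D), nested as binary tuples.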
Resolution inference rule:
propositional case
This rule combines two clauses to make a new one.
Basic propositional version:
α ∨ β, ¬β ∨ γ        or equivalently        ¬α → β, β → γ
α ∨ γ                                       ¬α → γ
It is applied repeatedly until the empty clause is derived.
e.g. given ¬ A ∨ B, ¬ B, ¬ C ∨ A, C:
• resolution with ¬ A ∨ B and ¬ C ∨ A gives B ∨ ¬ C
• resolution with C and B ∨ ¬ C gives B
• resolution with B and ¬ B gives □
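The repeated application of the rule can be sketched as a tiny prover. This Python is my own illustration, not the slides' code; the encoding of ¬A as the string "~A" is an assumption.

```python
# A small propositional resolution prover: resolve pairs of clauses until
# the empty clause appears. Clauses are frozensets of literals.

def negate(lit):
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolve(c1, c2):
    """All resolvents of two clauses."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return out

def refutes(clauses):
    """True iff the empty clause is derivable (the set is unsatisfiable)."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for c1 in clauses:
            for c2 in clauses:
                for r in resolve(c1, c2):
                    if not r:               # empty clause derived
                        return True
                    new.add(frozenset(r))
        if new <= clauses:                  # saturation: satisfiable
            return False
        clauses |= new

# The slide's example: {~A v B, ~B, ~C v A, C} is unsatisfiable
print(refutes([{'~A', 'B'}, {'~B'}, {'~C', 'A'}, {'C'}]))   # True
```

On this input the prover finds exactly the slide's steps: B ∨ ¬C, then B, then the empty clause.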
Completeness of resolution
• If a set of propositional clauses is unsatisfiable, then
resolution will eventually return the empty clause.
• Thus, to prove that P (the query/goal) is entailed by a
set of sentences S (i.e. S ╞ P):
1. compute the conjunctive normal form S′ of S
2. compute the conjunctive normal form NP′ of ¬P
3. apply resolution to S′ and NP′ to obtain □
• Issues:
– First-order case?
– Search for “good” sequence of resolution steps?
First-order case: Universal instantiation
Every instantiation of a universally quantified sentence is entailed by it:
From ∀v α infer α{v/g}, for any variable v and term g ({v/g} is a substitution)
• If g is a ground term (with no variables): ground instantiation
• αθ, for θ a substitution, is the formula obtained from α by applying θ
E.g.: ∀x ∀y (Father(x,y) ∧ Happy(y) → Happy(x)) yields instantiations:
Father(Joe,Joe) ∧ Happy(Joe) → Happy(Joe) substitution {x/Joe, y/Joe}
Father(Joe,Ann) ∧ Happy(Ann) → Happy(Joe) substitution {x/Joe, y/Ann}
Father(Joe,Bob) ∧ Happy(Bob) → Happy(Joe) substitution {x/Joe, y/Bob}
∀x (Father(x,Ann) ∧ Happy(Ann) → Happy(x)) substitution {y/Ann}
∀z (Father(Mary,z) ∧ Happy(z) → Happy(Mary)) substitution {x/Mary, y/z}
∀x′∀y′ (Father(x′,y′) ∧ Happy(y′) → Happy(x′)) substitution {x/x′, y/y′}
First-order case: reduction to propositional inference
For a small set of sentences S, one way to proceed:
1. replace all sentences in S by their ground instantiations
2. now just use inference methods for propositional logic
But …
With p predicates of arity k and n constants, there are
p · n^k ground instantiations!
Worse still, if there are function symbols in S (the
given set of sentences), there are infinitely many
instantiations:
• e.g. A, F(A), F(F(A)), F(F(F(A))),… are all ground terms.
First-order case: Unification
A substitution θ unifies atomic sentences
p and q if pθ = qθ
p                  q                      θ
Knows(John, x)     Knows(John, Jane)      {x/Jane}
Knows(John, x)     Knows(y, OJ)           {x/OJ, y/John}
Knows(John, x)     Knows(y, Mother(y))    {y/John, x/Mother(John)}
Unification+resolution
Idea: Unify rule premises with known facts, apply
unifier to conclusion.
E.g. from Knows(John, Jane)
Knows(John,OJ)
Knows(John,Mother(John))
and ∀ x(Knows(John, x) → Likes(John, x))
we can conclude Likes(John, Jane)
Likes(John,OJ)
Likes(John,Mother(John))
Most general unifier
Example: Knows(John, x) and Knows(John, y)
The following are all unifiers:
{x/John, y/John} {x/Jane, y/Jane}
{x/Mother(John), y/Mother(John)}
{x/Mother(z), y/Mother(z)} {x/z, y/z}
Only {x/z, y/z} is a most general unifier (mgu).
θ is a most general unifier (mgu) of formulas p and q if
1. θ is a unifier of formulas p and q, i.e. pθ = qθ, and
2. if σ is any other unifier of p and q (pσ = qσ) then σ is an
instance of θ, i.e. σ = θθ′ for some substitution θ′.
Unification algorithm
There is a (very efficient) unification algorithm which checks whether any two
formulas can be unified, and produces a most general unifier if they can. (Details
omitted – but Prolog implements (most of) it)
Unification is very powerful. Some examples:
p(x, y, F(z)) and p(F(y), A, x) – unifier {x/F(A), y/A, z/A}
p(x, x, F(F(A))) and p(y, F(z), F(y)) – unifier {x/F(A), y/F(A), z/A}
p(x, F(A), y) and p(F(y), x, B) – cannot unify: A ≠ B for constants A and B
p(x, x, F(A)) and p(F(y), y, z) – cannot unify: would require y = F(y) – ‘occurs check’
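Since the slides omit the algorithm's details, here is a sketch of mine (the term representation is an assumption): variables are lowercase strings, constants uppercase strings, compound terms tuples (functor, arg1, …).

```python
# Unification with occurs check. unify returns an mgu as a dict of
# bindings, or None if unification fails.

def is_var(t):
    return isinstance(t, str) and t[0].islower()

def walk(t, s):
    """Chase variable bindings in substitution s."""
    while is_var(t) and t in s:
        t = s[t]
    return t

def occurs(v, t, s):
    """Occurs check: does variable v occur in term t (under s)?"""
    t = walk(t, s)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, s) for a in t[1:])

def unify(t1, t2, s=None):
    s = dict(s or {})
    t1, t2 = walk(t1, s), walk(t2, s)
    if t1 == t2:
        return s
    if is_var(t1):
        if occurs(t1, t2, s):            # e.g. y = F(y) must fail
            return None
        s[t1] = t2
        return s
    if is_var(t2):
        return unify(t2, t1, s)
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        for a, b in zip(t1[1:], t2[1:]):
            s = unify(a, b, s)
            if s is None:
                return None
        return s
    return None                          # e.g. distinct constants A, B

def apply_subst(t, s):
    """Fully apply a substitution to a term."""
    t = walk(t, s)
    if isinstance(t, tuple):
        return (t[0],) + tuple(apply_subst(a, s) for a in t[1:])
    return t

# First example above: p(x, y, F(z)) and p(F(y), A, x)
s = unify(('p', 'x', 'y', ('F', 'z')), ('p', ('F', 'y'), 'A', 'x'))
print({v: apply_subst(v, s) for v in s})   # {'x': ('F', 'A'), 'y': 'A', 'z': 'A'}
```

The two failing examples above fail here too: distinct constants A, B do not match, and `p(x, x, F(A))` against `p(F(y), y, z)` is rejected by the occurs check.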
Resolution inference rule:
first-order case
Basic first-order version:
p ∨ q, ¬q′ ∨ r        where θ is a mgu of q, q′
(p ∨ r)θ
Full first-order version:
l1 ∨ … ∨ lj−1 ∨ lj ∨ lj+1 ∨ … ∨ lm,   m1 ∨ … ∨ mk−1 ∨ mk ∨ mk+1 ∨ … ∨ mn
(l1 ∨ … ∨ lj−1 ∨ lj+1 ∨ … ∨ lm ∨ m1 ∨ … ∨ mk−1 ∨ mk+1 ∨ … ∨ mn)θ
where θ is a mgu of lj, ¬mk
Special case: definite clauses
For many practical purposes it is sufficient to restrict attention to the
special case of definite clauses :
p ← q1, q2, …, qn [equivalently ¬q1∨…∨¬qn∨p]
where p, q1, …, qn are all atoms, (n ≥ 0).
– p is the head and q1, …, qn the body of the clause.
– if n = 0, the clause p ← can be written as p – called a fact.
Note: Definite clauses are often called ‘Horn clauses’. Strictly speaking
though this is incorrect, as Horn clauses also include ← q1, …, qn .
← q1, …, qn is logically equivalent to ¬q1∨…∨¬qn, which is logically
equivalent to ¬(q1 ∧ … ∧ qn).
Note: a set of definite clauses is often referred to as a (positive) logic
program
Generalized Modus Ponens (GMP)
For S a set of definite clauses, Generalized Modus Ponens is given by:
p′1, p′2, …, p′n,   (q ← p1, p2, …, pn)
qθ
where p′iθ = piθ for all i (θ is the composition of the mgus for all p′i, pi)
(Note: this is a special case of (several steps of) resolution.)
E.g.
Faster(Bob,Pat), Faster(Pat,Steve), ∀x,y,z (Faster(x,y) ← Faster(x,z), Faster(z,y))
Faster(x,y)θ = Faster(Bob,Steve)
where θ = {x/Bob, z/Pat, y/Steve}
From propositional to first-order
resolution: summary
• To prove that S ╞ P:
1. compute the conjunctive normal form S′ of S
2. compute the conjunctive normal form NP′ of ¬P
3. apply first-order resolution to S′ and NP′
– If S′ is a set of definite clauses (a logic program) and NP′ is a
Horn clause ← q1, …, qn then apply GMP to derive q1, …, qn
from S′
• Issue: Search for “good” sequence of
resolution/GMP steps?
Definite clauses and GMP:
Example
The US law says that it is a crime for an
American to sell weapons to hostile nations. The
country Nono, an enemy of America, has some
missiles of type M1, and all of its missiles were
sold to it by Colonel West, an American.
Show that Colonel West is a criminal.
Definite clauses for Colonel West
• It is a crime for an American to sell weapons to hostile nations:
Criminal(x) ← American(x), Weapon(y), Sells(x, y, z), Hostile(z)
• Nono has missiles of type M1:
Owns(Nono,M1)
Missile(M1)
• All of Nono’s missiles were sold to it by Colonel West, who is an American:
Sells(West, x,Nono) ← Owns(Nono, x),Missile(x)
American(West)
• Missiles are weapons, while an enemy of America counts as “hostile”:
Weapon(x) ← Missile(x)
Hostile(x) ← Enemy(x,America)
• Nono is an enemy of America:
Enemy(Nono,America)
Reasoning using Resolution:
forward chaining (bottom-up computation)
• To prove that S ╞ P:
1. compute the conjunctive normal form S′ of S
2. compute the conjunctive normal form NP′ of ¬P
3. If S′ is a set of definite clauses and NP′ is a Horn clause ← q1, …, qn
then apply GMP to derive q1, …, qn from S′
• Forward chaining:
– Split S′ into a set of facts E and a set of rules (definite
clauses) Pr.
– Apply the rules in Pr to the facts in E to derive (using GMP)
a new set of implied facts E′
– Add E′ to E.
– Repeat until no new facts are generated.
– (If q1, …, qn are all in E, succeed.)
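The loop above can be sketched as a naive bottom-up engine. This Python is my own illustration, not the slides' code; atoms are tuples, lowercase strings are variables, and matching is done against ground facts only.

```python
# Naive forward chaining on the Colonel West program: apply every rule to
# the known facts (GMP) until no new fact is generated.

def match(pat, fact, s):
    """Extend substitution s so that pattern pat equals the ground fact."""
    if pat[0] != fact[0] or len(pat) != len(fact):
        return None
    s = dict(s)
    for p, f in zip(pat[1:], fact[1:]):
        if isinstance(p, str) and p[0].islower():   # p is a variable
            if s.get(p, f) != f:
                return None                         # clashing binding
            s[p] = f
        elif p != f:                                # constants must agree
            return None
    return s

def subst(atom, s):
    return (atom[0],) + tuple(s.get(a, a) for a in atom[1:])

def forward_chain(rules, facts):
    facts = set(facts)
    while True:
        new = set()
        for head, body in rules:
            subs = [{}]                             # all ways to match the body
            for atom in body:
                subs = [s2 for s in subs for f in facts
                        for s2 in [match(atom, f, s)] if s2 is not None]
            for s in subs:
                new.add(subst(head, s))             # add the instantiated head
        if new <= facts:                            # no new facts: stop
            return facts
        facts |= new

rules = [
    (('Criminal', 'x'), [('American', 'x'), ('Weapon', 'y'),
                         ('Sells', 'x', 'y', 'z'), ('Hostile', 'z')]),
    (('Sells', 'West', 'y', 'Nono'), [('Owns', 'Nono', 'y'), ('Missile', 'y')]),
    (('Weapon', 'x'), [('Missile', 'x')]),
    (('Hostile', 'x'), [('Enemy', 'x', 'America')]),
]
facts = {('American', 'West'), ('Missile', 'M1'),
         ('Owns', 'Nono', 'M1'), ('Enemy', 'Nono', 'America')}

print(('Criminal', 'West') in forward_chain(rules, facts))   # True
```

Note the recomputation the next slide warns about: every iteration re-matches every rule against all facts, not just the newly derived ones.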
Forward chaining for Colonel West
Iteration 0 (given facts): American(West), Missile(M1), Owns(Nono,M1), Enemy(Nono,America)
Iteration 1 adds: Weapon(M1), Sells(West,M1,Nono), Hostile(Nono)
Iteration 2 adds: Criminal(West)
Forward chaining: observations
1. If a rule matched the facts on iteration k then it
will still match the facts on iteration k + 1. (Lots
of recomputation!) However…
2. In iteration k+1 it is only necessary to consider
rules which have at least one condition in their
body matching a fact obtained at iteration k.
3. If we have a particular query in mind that we
want to answer, bottom up computation is likely
to produce a lot of irrelevant facts.
Reasoning using Resolution:
backward chaining (top-down computation)
• To prove that S ╞ P:
1. compute the conjunctive normal form S′ of S
2. compute the conjunctive normal form NP′ of ¬P
3. If S′ is a set of definite clauses and NP′ is a Horn clause ← q1, …, qn then
apply GMP backwards to derive q1, …, qn from S′
• Backward chaining: To solve goal G wrt θ:
– if there is a matching fact G′ in S′, “add” the mgu μ to θ (Gμ = G′μ)
– for each rule G′ ← G1, …, Gm in S′ whose head G′ matches G
via mgu θ′ (Gθ′ = G′θ′), solve goals G1θ′, …, Gmθ′ wrt θ after
“adding” θ′ to θ
– Repeat until there are no goals to solve, return θ
– (Initially the goals to be solved are q1, …, qn and θ = {})
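The goal-reduction loop can be sketched in a few lines. To stay short, this illustration of mine is propositional: the program is pre-grounded, so no unification or substitution bookkeeping is needed, only the reduction of the selected goal to a clause body.

```python
# Minimal backward chaining on ground definite clauses: depth-first,
# leftmost-goal-first search for an SLD refutation.

def solve(goals, clauses):
    if not goals:
        return True                           # no goals left: success
    first, rest = goals[0], goals[1:]
    for head, body in clauses:
        if head == first:                     # match goal with a clause head
            if solve(body + rest, clauses):   # replace goal by clause body
                return True
    return False                              # all choices failed

# Ground version of the Colonel West program (facts have empty bodies)
clauses = [
    ('Criminal(West)', ['American(West)', 'Weapon(M1)',
                        'Sells(West,M1,Nono)', 'Hostile(Nono)']),
    ('Sells(West,M1,Nono)', ['Owns(Nono,M1)', 'Missile(M1)']),
    ('Weapon(M1)', ['Missile(M1)']),
    ('Hostile(Nono)', ['Enemy(Nono,America)']),
    ('American(West)', []), ('Missile(M1)', []),
    ('Owns(Nono,M1)', []), ('Enemy(Nono,America)', []),
]
print(solve(['Criminal(West)'], clauses))   # True
```

As the slides note later, the choice of matching clause can affect termination: on a left-recursive program this depth-first sketch can loop forever.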
Backward chaining for Colonel West
Goal: Criminal(West), θ = {}
– using Criminal(x) ← American(x), Weapon(y), Sells(x,y,z), Hostile(z), mgu {x/West}:
goals become American(West), Weapon(y), Sells(West,y,z), Hostile(z)
– American(West) matches the fact American(West), mgu {}
– using Weapon(x′) ← Missile(x′), mgu {x′/y}: Weapon(y) becomes Missile(y)
– Missile(y) matches the fact Missile(M1), mgu {y/M1}
– using Sells(West,x″,Nono) ← Owns(Nono,x″), Missile(x″), mgu {x″/M1, z/Nono}:
Sells(West,M1,z) becomes Owns(Nono,M1), Missile(M1)
– Owns(Nono,M1) and Missile(M1) match the corresponding facts, mgu {}
– using Hostile(x‴) ← Enemy(x‴,America), mgu {x‴/Nono}:
Hostile(Nono) becomes Enemy(Nono,America)
– Enemy(Nono,America) matches the fact Enemy(Nono,America), mgu {}
Answer =
restriction of the computed substitution to the variables in Criminal(West) =
{}
Another way of depicting backward chaining for Colonel West
Criminal(West)
|
American(West), Weapon(y), Sells(West, y, z),Hostile(z)
|
Weapon(y), Sells(West, y, z),Hostile(z)
|
Missile(y), Sells(West, y, z),Hostile(z)
{y/M1} |
Sells(West,M1, z),Hostile(z)
{z/Nono} |
Owns(Nono,M1),Missile(M1),Hostile(Nono)
|
Hostile(Nono)
|
Enemy(Nono,America)
|
□
Answer: substitution {}
Backward chaining for Colonel West – strictly speaking
Criminal(West)
{x/West} |
American(West), Weapon(y), Sells(West, y, z),Hostile(z)
{} |
Weapon(y), Sells(West, y, z),Hostile(z)
{x’/y} |
Missile(y), Sells(West, y, z),Hostile(z)
{y/M1} |
Sells(West,M1, z),Hostile(z)
{z/Nono} |
Owns(Nono,M1),Missile(M1),Hostile(Nono)
{} |
Hostile(Nono)
{} |
Enemy(Nono,America)
{} |
□
Resolution view of Colonel West Example
¬ American(x) ∨¬Weapon(y) ∨¬ Sells(x,y,z) ∨¬ Hostile(z) ∨ Criminal(x)
American(West)
¬Missile(x) ∨ Weapon(x)
Missile(M1)
¬Missile(x) ∨¬ Owns(Nono,x) ∨ Sells(West,x,Nono)
Missile(M1)
Owns(Nono,M1)
¬ Enemy(x,America) ∨ Hostile(x)
Enemy(Nono,America)
¬ Criminal(West)
¬ American(West) ∨¬Weapon(y) ∨¬ Sells(West,y,z) ∨¬ Hostile(z)
¬Weapon(y) ∨¬ Sells(West,y,z) ∨¬ Hostile(z)
¬Missile(y) ∨¬ Sells(West,y,z) ∨¬ Hostile(z)
¬ Sells(West,M1,z) ∨¬ Hostile(z)
¬Missile(M1) ∨¬ Owns(Nono,M1) ∨¬ Hostile(Nono)
¬ Owns(Nono,M1) ∨¬ Hostile(Nono)
¬ Hostile(Nono)
¬ Enemy(Nono,America)
□
SLD resolution
• The kind of resolution in the resolution view of
Colonel West via backward chaining is SL
(Selective Linear) resolution for Definite clauses –
SLD resolution:
¬a1 ∨ … ∨ ¬aj ∨ … ∨ ¬am,   b ∨ ¬b1 ∨ … ∨ ¬bn
(¬a1 ∨ … ∨ ¬aj−1 ∨ ¬b1 ∨ … ∨ ¬bn ∨ ¬aj+1 ∨ … ∨ ¬am)θ
where θ is a mgu of aj, b
(the first premise is the GOAL, the second a definite clause in the logic program)
Alternative view of SLD resolution
The computation of a goal (query) G is a series of
derivation steps:
G →θ1 G1 →θ2 … Gn−1 →θn □
• θi is the mgu at the i-th derivation step
• The answer computed is (the restriction to the vars of G of) the composition
of all these mgus: θ = θ1 ∘ … ∘ θn
Alternative view of SLD resolution
• Each derivation step looks like this:
L1, …, Lj−1, B, Lj+1, …, Ln
(L1, …, Lj−1, M1, …, Mk, Lj+1, …, Ln)θi
where θi matches B with the head B′ of a clause B′ ← M1, …, Mk (Bθi = B′θi)
The sub-goal B selected for matching can be any one of the
sub-goals in the current goal
– e.g. always choose the leftmost sub-goal.
– the answers computed are the same, whichever sub-goal is
selected!
• Many possible choices for matching clause.
– The choice might affect termination
Semantics of definite clauses/logic programs
• Classical models
• Herbrand models
• Immediate consequence operator
Note: semantically, each set of definite clauses S can be
equated to the set of all its ground instances over the
underlying Herbrand universe, i.e. the (possibly infinite) set
of all ground terms that can be constructed from constant
and function symbols in S
e.g. the Herbrand universe of S = {P(x) ← Q(F(x)),   R(1)} is
{1, F(1), F(F(1)), …}
From now on each set of definite clauses stands for the
set of all its ground instances over its Herbrand universe
Classical models
Interpretations of (set of definite clauses) S:
• mappings from ground terms of S to elements of
some domain D (ground terms denote elements of D)
e.g. for S = {P(x) ← Q(x), R(x),   Q(1),   Q(Succ(1)),   R(1)}
D may be {1, 2, 3, …} where Succ(1) denotes 2, etc.
Models of S
• interpretations of S in which every clause of S is true
Herbrand models
• These are models where ground terms denote
themselves, i.e. whose domain is the Herbrand
universe of (the given set of definite clauses) S
e.g. for S = {P(x) ← Q(x), R(x),   Q(1),   Q(Succ(1)),   R(1)}
both {P(1), Q(1), R(1), Q(Succ(1))}
and {P(1), Q(1), R(1), Q(Succ(1)), P(Succ(1))}
are Herbrand models (but only the former is minimal)
If S is a set of definite clauses, then
S has a model iff S has a Herbrand model
iff S has a minimal Herbrand model
So we can restrict attention to minimal Herbrand models
The immediate consequence operator
Let HB (the Herbrand Base of S) be the set of all ground
atoms constructed from predicate symbols in S over the
Herbrand universe of a set of definite clauses S:
for X⊆ 𝐻𝐵:
TS(X) = {a ∈ HB | a ← b1, …, bm ∈ S, {b1, …, bm} ⊆ X}
e.g. for S = {P(x) ← Q(x), R(x),   Q(1),   Q(Succ(1)),   R(1)}
TS({}) = {R(1), Q(1), Q(Succ(1))}
TS({R(1), Q(1)}) = {R(1), Q(1), P(1), Q(Succ(1))}
TS is continuous and admits a least fixed point, given
by TS↑ω, and this is the minimal Herbrand model of S
Example of TS↑ω
S = {P(x) ← Q(x), R(x),   Q(1),   Q(Succ(1)),   R(1)}
TS↑1 = TS({}) = {R(1), Q(1), Q(Succ(1))}
TS↑2 = TS(TS↑1) = {R(1), Q(1), Q(Succ(1)), P(1)}
TS↑3 = TS(TS↑2) = TS↑2
…
TS↑ω = TS↑2 = minimal Herbrand model of S
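The operator and its iteration are easy to run. This Python sketch of mine works on a ground program; the slides' example S is pre-grounded by hand over the relevant part of its Herbrand universe.

```python
# The immediate consequence operator T_S on a ground program, iterated
# from {} to its least fixed point (the minimal Herbrand model).

def T(program, X):
    """One application: heads of all clauses whose body is contained in X."""
    return {head for head, body in program if set(body) <= X}

def least_fixed_point(program):
    X = set()
    while True:
        nxt = T(program, X)
        if nxt == X:          # T_S has reached its fixed point (T_S up to omega)
            return X
        X = nxt

S = [
    ('P(1)', ['Q(1)', 'R(1)']),                    # P(x) <- Q(x), R(x), x = 1
    ('P(Succ(1))', ['Q(Succ(1))', 'R(Succ(1))']),  # same rule, x = Succ(1)
    ('Q(1)', []), ('Q(Succ(1))', []), ('R(1)', []),
]
print(least_fixed_point(S))   # the minimal Herbrand model, as on the slide
```

P(Succ(1)) never appears, because R(Succ(1)) is never derivable: this matches the minimal Herbrand model {R(1), Q(1), Q(Succ(1)), P(1)} above.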
SLD resolution is complete
If S ╞ Pσ (i.e. Pσ belongs to the minimal Herbrand model of S,
or to the least fixed point of TS) then there exists an SLD-
refutation of P (to obtain □) with answer θ and a substitution
ξ such that Pσ = Pθξ
Example: S = {P(x) ← Q(x), R(x),   Q(1),   Q(Succ(1)),   R(1)}
P(x)
Q(x), R(x)
R(1)
□
θ = {x/1}
P(1) belongs to the minimal Herbrand model of S = {R(1), Q(1),
Q(Succ(1)), P(1)} – so S ╞ P(1)
SLD resolution is complete
If S ╞ Pσ (i.e. Pσ belongs to the minimal Herbrand model
of S, or to the least fixed point of TS) then there exists an
SLD-refutation of P (to obtain □) with answer θ and a
substitution ξ such that Pσ = Pθξ
Example: S = {P(x,y) ← Q(x),   Q(1),   R(2)}
P(x,y)
Q(x)
□
θ = {x/1}
P(1,2) belongs to the minimal Herbrand model of S = {Q(1),
R(2), P(1,1), P(1,2)} – so S ╞ P(1,2), with ξ = {y/2}
Summary – Logic for
Knowledge Representation and
Automated Reasoning
• Automated reasoning in FOL often needs one (or
more) of:
– Unification
– Resolution
– Generalized Modus Ponens/SLD resolution – Definite
Clauses
– Forward chaining (bottom-up computations)
– Backward chaining (top-down, goal-directed
computations)
• Completeness of (SLD) resolution
Note: Logic programming vs Prolog
• Prolog: the most widely used programming language based upon
logic programming.
– a programming language!
– Program = set of clauses: head :- literal1,…,literaln.
– literals might include:
• negation as failure (also in logic programming)
• findall, assert, I/O features, etc. (not in logic programming)
• conventions on variables/constants etc
• Prolog has:
– Efficient unification
– Efficient retrieval of matching clauses by indexing techniques.
– Depth-first, left-to-right search (with backtracking)
– Built-in predicates for arithmetic etc., e.g., X is Y*Z+3
Colonel West in Prolog
criminal(X):- american(X), weapon(Y), sells(X,Y,Z), hostile(Z).
sells(west,Y,nono):- owns(nono,Y), missile(Y).
…
enemy(nono, america).
Query:
?- criminal(X).
X = west;
no.