CM3112 Artificial Intelligence
Probability theory: Inference by enumeration
Steven Schockaert
SchockaertS1@cardiff.ac.uk
School of Computer Science & Informatics, Cardiff University
Inference by enumeration
We consider formulas over the set of atoms {toothache, cavity, catch}. The probability distribution over the possible worlds is encoded in the following table.

Start with the joint distribution P(Toothache, Catch, Cavity):
              toothache              ¬toothache
          catch     ¬catch       catch     ¬catch
cavity     .108      .012         .072      .008
¬cavity    .016      .064         .144      .576
This probability distribution is called the joint distribution because it considers all properties together.

For any proposition φ, sum the atomic events where it is true:

P(φ) = Σ_{ω : ω |= φ} P(ω)
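As an illustration of this summation, here is a minimal Python sketch (not part of the slides; the table `joint` and the helper `prob` are illustrative names): the joint distribution is stored as a dictionary over possible worlds, and a query sums the probabilities of the worlds that satisfy a proposition.

```python
# Minimal sketch (illustrative, not from the slides): the joint distribution
# over {toothache, catch, cavity}, with each possible world encoded as a
# triple of truth values (toothache, catch, cavity).
joint = {
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}

def prob(phi):
    """P(phi): sum P(omega) over all worlds omega in which phi is true.

    phi is any function mapping (toothache, catch, cavity) to True/False.
    """
    return sum(p for world, p in joint.items() if phi(*world))

# The probabilities of the eight possible worlds sum to 1.
assert abs(sum(joint.values()) - 1.0) < 1e-9
```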
From the joint distribution, we can derive the probability that formulas over these atoms are satisfied by enumerating the possible worlds in which they are satisfied. A comma inside P(·) abbreviates a conjunction, e.g. P(rain, sunny) = P(rain ∧ sunny) and P(Marks = high | Functionality = excellent, Design = good) = P(Marks = high | Functionality = excellent ∧ Design = good).

For example:

P(toothache) = P(toothache, catch, cavity) + P(toothache, catch, ¬cavity)
             + P(toothache, ¬catch, cavity) + P(toothache, ¬catch, ¬cavity)
             = 0.108 + 0.012 + 0.016 + 0.064
             = 0.2
P(toothache ∨ cavity) = 0.108 + 0.012 + 0.016 + 0.064 + 0.072 + 0.008 = 0.28
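With the illustrative `prob` function sketched earlier, these two sums become one-line enumeration queries:

```python
# P(toothache): sum over the four worlds in which toothache holds.
print(round(prob(lambda toothache, catch, cavity: toothache), 3))            # 0.2

# P(toothache ∨ cavity): worlds in which at least one of the two holds.
print(round(prob(lambda toothache, catch, cavity: toothache or cavity), 3))  # 0.28
```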
Conditional probabilities can be found in a similar way:

P(¬cavity | toothache) = P(¬cavity ∧ toothache) / P(toothache)
                       = (0.016 + 0.064) / (0.108 + 0.012 + 0.016 + 0.064)
                       = 0.4
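With the same illustrative `prob` helper, a conditional probability is just a ratio of two enumeration queries:

```python
# P(¬cavity | toothache) = P(¬cavity ∧ toothache) / P(toothache)
num = prob(lambda toothache, catch, cavity: not cavity and toothache)
den = prob(lambda toothache, catch, cavity: toothache)
print(round(num / den, 3))  # 0.4
```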
Marginalisation

Marginalisation: project the joint distribution onto a subset of the variables. From a probability distribution over a set of atoms A, we can derive a probability distribution over a subset A' of A. Such a probability distribution over A' is called a marginal distribution.

Sum over all value combinations of the eliminated variables:

P(X, Y) = Σ_z P(X, Y, z) = Σ_z Σ_w P(X, Y, z, w)

For example, marginalising the joint distribution P(Toothache, Catch, Cavity) above gives P(Catch, Cavity):

           cavity    ¬cavity
catch        .18       .16
¬catch       .02       .64

and, summing out Catch as well, P(Cavity): cavity .2, ¬cavity .8.
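A sketch of the same marginalisation in Python (again using the illustrative `joint` table from earlier): summing out Toothache for every combination of Catch and Cavity gives P(Catch, Cavity).

```python
from collections import defaultdict

# P(Catch, Cavity): sum the joint probabilities over both values of Toothache.
p_catch_cavity = defaultdict(float)
for (toothache, catch, cavity), p in joint.items():
    p_catch_cavity[(catch, cavity)] += p

print({k: round(v, 2) for k, v in p_catch_cavity.items()})
# {(True, True): 0.18, (False, True): 0.02, (True, False): 0.16, (False, False): 0.64}
```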
Inference by enumeration
By enumerating all possibilities and adding the corresponding probabilities, we can infer all probabilities of interest
However, the amount of information that needs to be specified is exponential in the number of random variables
Likewise, the time complexity is exponential in the number of random variables
This method is therefore only practical for very simple problems (real-world problems may have hundreds or thousands of random variables)
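As a back-of-the-envelope illustration of this blow-up: a full joint distribution over n Boolean random variables has 2^n entries, which is already around a billion for n = 30.

```python
# Size of a full joint distribution table over n Boolean random variables.
for n in (3, 10, 30, 100):
    print(n, 2 ** n)  # 8, 1024, 1073741824, ...
```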