COMP9414: Artificial Intelligence
Tutorial 5: Reasoning with Uncertainty
1. Show how to derive Bayes’ Rule from the definition P(A ∧ B) = P(A|B)P(B).
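Before writing the derivation down, you can convince yourself numerically that the rule holds. The following is a minimal sketch on a made-up joint distribution over two Boolean variables (the numbers are arbitrary, chosen only for illustration):

# A toy joint distribution over two Boolean variables A and B (values chosen arbitrarily)
joint = {(True, True): 0.2, (True, False): 0.3,
         (False, True): 0.1, (False, False): 0.4}

p_A = sum(p for (a, b), p in joint.items() if a)   # P(A)
p_B = sum(p for (a, b), p in joint.items() if b)   # P(B)
p_A_and_B = joint[(True, True)]                    # P(A ∧ B)

p_B_given_A = p_A_and_B / p_A     # definition: P(B|A) = P(A ∧ B) / P(A)
p_A_given_B = p_A_and_B / p_B     # definition: P(A|B) = P(A ∧ B) / P(B)
bayes = p_A_given_B * p_B / p_A   # Bayes' Rule: P(A|B) P(B) / P(A)

print(p_B_given_A, bayes)         # the two values agree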
2. Suppose you are given the following information:
Mumps causes fever 75% of the time.
The chance of a patient having mumps is 1/15000.
The chance of a patient having fever is 1/1000.
Determine the conditional probability of a patient suffering from mumps given that they
do not have a fever, i.e. P(Mumps | ¬Fever).
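Once you have derived an expression by hand, a few lines of Python are a convenient way to check the arithmetic. This is only a sketch using the figures quoted above:

p_fever_given_mumps = 0.75   # mumps causes fever 75% of the time
p_mumps = 1 / 15000          # prior probability of mumps
p_fever = 1 / 1000           # prior probability of fever

# P(¬Fever | Mumps) and P(¬Fever) follow by complementation
p_not_fever_given_mumps = 1 - p_fever_given_mumps
p_not_fever = 1 - p_fever

# Bayes' Rule: P(Mumps | ¬Fever) = P(¬Fever | Mumps) P(Mumps) / P(¬Fever)
print(p_not_fever_given_mumps * p_mumps / p_not_fever)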
3. Consider the following statements:
Headaches and blurred vision may be the result of sitting too close to a monitor.
Headaches may also be caused by bad posture. Headaches and blurred vision may
cause nausea. Headaches may also lead to blurred vision.
(i) Represent the causal links in a Bayesian network. Let H stand for “headache”, B for
“blurred vision”, S for “sitting too close to a monitor”, P for “bad posture” and N for
“nausea”. In terms of conditional probabilities, write a formula for the event that all
five variables are true, i.e. P(H ∧ B ∧ S ∧ P ∧ N).
(ii) Suppose the following probabilities are given:
P(H|S,P) = 0.8      P(H|¬S,P) = 0.4
P(H|S,¬P) = 0.6     P(H|¬S,¬P) = 0.02
P(B|S,H) = 0.4      P(B|¬S,H) = 0.3
P(B|S,¬H) = 0.2     P(B|¬S,¬H) = 0.01
P(S) = 0.1
P(P) = 0.2
P(N|H,B) = 0.9      P(N|¬H,B) = 0.3
P(N|H,¬B) = 0.5     P(N|¬H,¬B) = 0.7
Furthermore, assume that some patient is suffering from headaches but not from nausea.
Calculate joint probabilities for the 8 remaining possibilities (that is, according to
whether S, B, P are true or false). A sketch of this enumeration is given after part (iii).
(iii) What is the probability that the patient suffers from bad posture given that they are
suffering from headaches but not from nausea?
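To check your hand calculations for parts (ii) and (iii), a minimal sketch that enumerates the eight joint probabilities and normalises, assuming the factorisation P(H ∧ B ∧ S ∧ P ∧ N) = P(S)P(P)P(H|S,P)P(B|S,H)P(N|H,B) that part (i) should yield:

# Conditional probability tables from part (ii): each entry is P(var = True | parents)
P_S = 0.1
P_P = 0.2
P_H = {(True, True): 0.8, (False, True): 0.4,      # P(H | S, P), keyed by (S, P)
       (True, False): 0.6, (False, False): 0.02}
P_B = {(True, True): 0.4, (False, True): 0.3,      # P(B | S, H), keyed by (S, H)
       (True, False): 0.2, (False, False): 0.01}
P_N = {(True, True): 0.9, (False, True): 0.3,      # P(N | H, B), keyed by (H, B)
       (True, False): 0.5, (False, False): 0.7}

def pr(p_true, value):
    # Probability that a Boolean variable takes 'value', given P(var = True)
    return p_true if value else 1 - p_true

# Evidence: H = True, N = False.  Enumerate the 8 settings of S, B, P.
joint = {}
for S in (True, False):
    for B in (True, False):
        for P in (True, False):
            joint[(S, B, P)] = (pr(P_S, S) * pr(P_P, P)
                                * P_H[(S, P)]                 # P(H = True | S, P)
                                * pr(P_B[(S, True)], B)       # P(B | S, H = True)
                                * pr(P_N[(True, B)], False))  # P(N = False | H = True, B)
            print('S =', S, 'B =', B, 'P =', P, joint[(S, B, P)])

# Part (iii): P(P = True | H = True, N = False) by normalising over the enumerated cases
print(sum(v for (S, B, P), v in joint.items() if P) / sum(joint.values()))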
4. Consider the “burglar alarm” Bayesian network from the lectures. Derive, using Bayes’ Rule,
an expression for P (Burglary|Alarm) in terms of the conditional probabilities represented
in the network. Then calculate the value of this probability.
Is this number what you expected? Explain what is going on.
Check your answer using the AIPython program probVE.py. Answers to method calls such
as bn4v.query(B,{A:True}) are returned as a list [F,T] giving the probability that B is
false and the probability that B is true, given the list of conditions (here, that A is true).
The desired answer is then obtained by normalization. The Bayesian network is encoded
as bn4 in probGraphicalModels.py.
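A minimal sketch of the method calls described above; the import targets and the VE constructor name are assumptions based on the standard AIPython distribution and may differ in the files supplied with the course:

# Assumption: bn4 and its Burglary/Alarm variables B and A are exported by
# probGraphicalModels.py, and probVE.py provides a variable-elimination class VE,
# as in the standard AIPython code.
from probGraphicalModels import bn4, B, A
from probVE import VE

bn4v = VE(bn4)                    # variable elimination on the burglar alarm network
print(bn4v.query(B, {A: True}))   # list [F, T] for B given A = True; normalise to get P(Burglary | Alarm)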
5. Prove the conditional version of Bayes’ Rule: P(B|A,C) = P(A|B,C)P(B|C) / P(A|C). Here C is an
added condition to all terms in the original version of Bayes’ Rule.
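As with Question 1, you can sanity-check the identity numerically before proving it. A minimal sketch on an arbitrary (randomly generated) joint distribution over three Boolean variables:

import itertools, random

random.seed(0)
# An arbitrary joint distribution over three Boolean variables A, B, C
weights = {abc: random.random() for abc in itertools.product((True, False), repeat=3)}
total = sum(weights.values())
joint = {abc: w / total for abc, w in weights.items()}

def p(event):
    # Probability of the event described by the predicate event(a, b, c)
    return sum(pr for (a, b, c), pr in joint.items() if event(a, b, c))

lhs = p(lambda a, b, c: a and b and c) / p(lambda a, b, c: a and c)    # P(B | A, C)
rhs = (p(lambda a, b, c: a and b and c) / p(lambda a, b, c: b and c)   # P(A | B, C)
       * p(lambda a, b, c: b and c) / p(lambda a, b, c: c)             # ... * P(B | C)
       / (p(lambda a, b, c: a and c) / p(lambda a, b, c: c)))          # ... / P(A | C)
print(lhs, rhs)   # the two values agree, as the identity asserts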
6. Programming. Try out the bigram tagger from NLTK. Here you can load a corpus of text,
create the bigram model (this can take several seconds), and then use the model to tag a
given sentence. The example below from the NLTK book uses the Brown corpus. You can
also print words that occur in similar contexts to a given word.
import nltk
from nltk.corpus import brown

# Wrap the corpus as an nltk.Text object (used below for the similar-words query)
text = nltk.Text(word.lower() for word in brown.words())

# Train a bigram tagger on 90% of the tagged 'news' sentences
brown_tagged_sents = brown.tagged_sents(categories='news')
brown_sents = brown.sents(categories='news')
size = int(len(brown_tagged_sents) * 0.9)
train_sents = brown_tagged_sents[:size]
bigram_tagger = nltk.BigramTagger(train_sents)

# Tag a sentence; words appearing in unseen bigram contexts are tagged None
print(bigram_tagger.tag(brown_sents[2007]))

# similar() prints its output and returns None, so it should not be wrapped in print()
text.similar('woman')
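To get a sense of how well the bigram model generalises, you might also score it on the 10% of tagged sentences held out from training. This is a suggested extension rather than part of the NLTK book example; note that the tagger method evaluate() was renamed accuracy() in recent NLTK releases:

# Score the bigram tagger on the held-out 10% of tagged sentences
test_sents = brown_tagged_sents[size:]
print(bigram_tagger.evaluate(test_sents))   # on NLTK >= 3.6 use bigram_tagger.accuracy(test_sents)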