COMP3308/3608 Artificial Intelligence
Week 11 Tutorial exercises Probabilistic Reasoning. Bayesian Networks.
Exercise 1. Bayesian network (Homework)
Consider the Bayesian network below where all variables are binary:

Compute the following probability and show your calculations: P(M=T, E=T, A=F, C=T).
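Since the network figure is not reproduced here, the following is a generic sketch of how any such joint-probability entry is evaluated: factorise the joint as a product of P(node | parents) terms given by the network's CPTs. The structure (E → A, A → M, A → C) and all numbers below are hypothetical placeholders, not the actual exercise values.

```python
# Hypothetical chain-rule evaluation of P(M=T, E=T, A=F, C=T).
# Structure and CPT numbers are ILLUSTRATIVE ONLY (the actual network
# figure is not reproduced here): assume E -> A, A -> M, A -> C.
p_E = {True: 0.1, False: 0.9}            # placeholder prior P(E=T)/P(E=F)
p_A_given_E = {True: 0.7, False: 0.2}    # placeholder P(A=T | E)
p_M_given_A = {True: 0.9, False: 0.05}   # placeholder P(M=T | A)
p_C_given_A = {True: 0.8, False: 0.3}    # placeholder P(C=T | A)

def joint(m, e, a, c):
    """P(M=m, E=e, A=a, C=c) as a product of P(node | parents) terms."""
    pe = p_E[e]
    pa = p_A_given_E[e] if a else 1 - p_A_given_E[e]
    pm = p_M_given_A[a] if m else 1 - p_M_given_A[a]
    pc = p_C_given_A[a] if c else 1 - p_C_given_A[a]
    return pe * pa * pm * pc

print(joint(True, True, False, True))  # 0.1 * 0.3 * 0.05 * 0.3 = 0.00045
```

Your own calculation should follow the same pattern, but with the factorisation and CPT values read off the actual figure.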
Exercise 2. Probabilistic reasoning
Consider the dental example discussed in the lecture; its full joint probability distribution is shown below:
Calculate the following:
a) P(toothache)
b) P(Cavity)
c) P(Toothache | cavity)
d) P(Cavity | toothache ∨ catch)
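These quantities can all be read off the joint table by summing the relevant entries (and dividing for conditionals). The table below assumes the lecture uses the standard AIMA dental joint distribution over (Toothache, Catch, Cavity); the connective in part d) appears garbled in this copy, so the disjunction toothache ∨ catch from the standard example is assumed.

```python
# Full joint distribution P(Toothache, Catch, Cavity), assumed to match
# the standard AIMA dental example. Keys: (toothache, catch, cavity).
joint = {
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}

def p(pred):
    """Sum the joint entries where pred(toothache, catch, cavity) holds."""
    return sum(v for (t, c, cav), v in joint.items() if pred(t, c, cav))

p_toothache = p(lambda t, c, cav: t)                          # a) 0.2
p_cavity = (p(lambda t, c, cav: cav),
            p(lambda t, c, cav: not cav))                     # b) (0.2, 0.8)
# c) P(Toothache=T | cavity): divide the joint event by P(cavity)
p_t_given_cav = p(lambda t, c, cav: t and cav) / p(lambda t, c, cav: cav)  # 0.6
# d) P(Cavity=T | toothache OR catch)
p_cav_given_tc = (p(lambda t, c, cav: cav and (t or c))
                  / p(lambda t, c, cav: t or c))              # ~0.4615
print(p_toothache, p_cavity, p_t_given_cav, p_cav_given_tc)
```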
Exercise 3. Bayesian networks
(Based on J. Kelleher, B. Mac Namee and A. D’Arcy, Fundamentals of Machine Learning for Predictive Data Analytics, MIT Press, 2015)
Consider the following description of the causal relationships between storms, burglars, cats and house alarms:
COMP3308/3608 Artificial Intelligence, s1 2022
1. Stormy nights are rare.

2. Burglary is also rare, and if it is a stormy night, burglars are likely to stay at home (burglars don’t like going out in storms).
3. Cats don’t like storms either, and if there is a storm, they like to go inside.
4. The alarm on your house is designed to be triggered if a burglar breaks into your house, but sometimes it can be set off by your cat coming into the house, and sometimes it might not be triggered even if a burglar breaks in (it could be faulty, or the burglar might be very good).
a) Define the topology of a Bayesian network that encodes these relationships.
b) Using the data from the table below, create the Conditional Probability tables (CPTs) for the Bayesian network from the previous step.
c) Compute the probability that the alarm will be on, given that there is a storm but we don’t know if a burglar has broken in or where the cat is.
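For part c), the query P(Alarm=T | Storm=T) is answered by summing out the unobserved variables: P(A=T | S=T) = Σ_b Σ_c P(b | S=T) · P(c | S=T) · P(A=T | b, c), using the topology implied by the description (Storm → Burglar, Storm → Cat, Burglar → Alarm ← Cat). The CPT numbers below are placeholders only, since the data table is not reproduced here; substitute the CPTs you built in part b).

```python
# Enumeration sketch for P(Alarm=T | Storm=T).
# Topology from the description: Storm -> Burglar, Storm -> Cat,
# Burglar -> Alarm <- Cat. CPT values are PLACEHOLDERS.
p_B_given_S = {True: 0.05, False: 0.2}   # placeholder P(Burglar=T | Storm)
p_C_given_S = {True: 0.8,  False: 0.3}   # placeholder P(Cat inside=T | Storm)
p_A_given_BC = {                          # placeholder P(Alarm=T | Burglar, Cat)
    (True,  True):  0.95, (True,  False): 0.9,
    (False, True):  0.4,  (False, False): 0.01,
}

def p_alarm_given_storm(s=True):
    """P(A=T | s) = sum over b, c of P(b|s) * P(c|s) * P(A=T|b,c)."""
    total = 0.0
    for b in (True, False):
        pb = p_B_given_S[s] if b else 1 - p_B_given_S[s]
        for c in (True, False):
            pc = p_C_given_S[s] if c else 1 - p_C_given_S[s]
            total += pb * pc * p_A_given_BC[(b, c)]
    return total

print(p_alarm_given_storm(True))
```

The nested loops make the "sum out Burglar and Cat" step explicit; with only two hidden binary variables, full enumeration is only four terms.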
Exercise 4. Using Bayesian networks for classification
The figure below shows the structure and probability tables of a Bayesian network. Battery (B), Gauge (G) and Fuel (F) are attributes, and Start (S) is the class. If the evidence E is B=good, F=empty, G=empty, which class will the Bayesian network predict: S=yes or S=no?
Hint: Compute P(E,S=yes) and P(E, S=no), normalize them to obtain the conditional probabilities P(S=yes|E) and P(S=no|E), compare them and take the class with the higher probability as the class for the new example.
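The hint's recipe (compute the two joint probabilities, normalize, compare) can be sketched as below. The two joint values are placeholders, since the network's probability tables are not reproduced here; substitute the products P(E, S=yes) and P(E, S=no) you obtain from the actual CPTs.

```python
# Sketch of the classification recipe with PLACEHOLDER joint values.
p_e_yes = 0.0006  # placeholder for P(B=good, F=empty, G=empty, S=yes)
p_e_no  = 0.0054  # placeholder for P(B=good, F=empty, G=empty, S=no)

# Normalize the joints to obtain conditional probabilities P(S | E):
# they must sum to 1, so divide each by their sum.
z = p_e_yes + p_e_no
p_yes_given_e = p_e_yes / z
p_no_given_e = p_e_no / z

# Predict the class with the higher posterior probability.
predicted = "yes" if p_yes_given_e > p_no_given_e else "no"
print(p_yes_given_e, p_no_given_e, predicted)
```

Note that normalization never changes which class wins; comparing the unnormalized joints P(E, S=yes) and P(E, S=no) gives the same prediction.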

Exercise 5. Using Weka – Naïve Bayes and Bayesian networks
Load the weather nominal data and choose 10-fold cross-validation. Run the Naïve Bayes classifier and note the accuracy. Then run BayesNet with the default parameters. Right-click on the history item and select Visualize graph. The graph corresponds to the Naïve Bayes classifier, which is why the accuracy results are the same: the maximum number of parents per node (the maxNrOfParents option of the K2 algorithm used to automatically learn the network structure) defaults to 1. Change it to 2 (for example) and a more complex Bayesian network will be generated; the accuracy will change (it improves in this case).
Exercise 6. Probabilistic reasoning (Advanced only) – to be done at home
Twenty people flip a fair coin. What is the probability that at least 4 of them will get heads?
(From J. Kelleher, B. Mac Namee and A. D’Arcy, Fundamentals of Machine Learning for Predictive Data Analytics, MIT Press, 2015)
Hint: The probability of getting k outcomes, each with a probability p, in a sequence of n binary experiments is:
n k n−k k*p *(1−p)