
CISC 6525 Artificial Intelligence

Fall 2017 Final Exam

Thursday December 14th 2017

In class, closed book and notes. Do all questions.

Q1. Finding oil is an uncertain business. Oil is more likely Found in rocks of Type shale than in other sedimentary rocks, and more likely in rocks of Age younger than 100M years than in older rocks. The decision to Drill for oil depends on how likely oil is to be found and whether there is a Glut of oil or not. Answer the following:

a. Draw a Bayesian network to represent the joint distribution P(Type, Age, Found, Glut, Drill) and explain what kind of variables these are. [8 points]

Network: Type → Found ← Age, and Found → Drill ← Glut (Type, Age, and Glut have no parents; Found depends on Type and Age; Drill depends on Found and Glut).

All five are Boolean random variables: Type (shale vs. not shale), Age (< 100M vs. >= 100M years), Found, Glut, and Drill (each true or false). [2 points]

b. Write the formula to evaluate the joint probability of finding oil in shale older than 100M years and drilling for it in an oil glut. [5 points]

P(shale, >100M, found, glut, drill) = P(shale) P(>100M) P(glut) P(found | shale, >100M) P(drill | found, glut)
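
A minimal numeric sketch of this factorization in Python; every probability value below is hypothetical, since the exam supplies no numbers:

```python
# Hypothetical CPT values for the Q1(a) network -- for illustration only.
p_shale = 0.3                     # P(Type = shale)
p_older = 0.6                     # P(Age > 100M years)
p_glut  = 0.5                     # P(Glut = true)

# P(Found = true | Type, Age), keyed by (is_shale, is_older_than_100M)
p_found = {(True, True): 0.2, (True, False): 0.4,
           (False, True): 0.05, (False, False): 0.1}
# P(Drill = true | Found, Glut), keyed by (found, glut)
p_drill = {(True, True): 0.6, (True, False): 0.9,
           (False, True): 0.05, (False, False): 0.2}

# P(shale, >100M, found, glut, drill) from the factorization above
joint = p_shale * p_older * p_glut * p_found[(True, True)] * p_drill[(True, True)]
print(joint)   # 0.0108 with these made-up numbers
```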

c. Define and compute the compactness ratio for the network. [3 points]

Compactness = (number of entries in the CPTs and priors) / (size of the full joint) = (1 + 1 + 1 + 4 + 4) / 2^5 = 11/32 ≈ 0.34 (or 11/31 ≈ 0.35 counting only the 31 independent joint entries).
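
The count can be reproduced mechanically; a small sketch assuming one entry per row of each Boolean node's CPT:

```python
# Number of parents of each Boolean node in the Q1(a) network.
# A node with k Boolean parents needs 2**k CPT entries.
parents = {"Type": 0, "Age": 0, "Glut": 0, "Found": 2, "Drill": 2}

cpt_entries = sum(2 ** k for k in parents.values())        # 1 + 1 + 1 + 4 + 4 = 11
joint_size = 2 ** len(parents)                             # 2**5 = 32
print(cpt_entries, joint_size, cpt_entries / joint_size)   # 11 32 0.34375
```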

d. Write a formula to evaluate P(drill | shale). [7 points]

P(drill | shale) = P(drill, shale) / P(shale)

= α Σ_{Age, Found, Glut} P(shale, Age, Found, Glut, drill)

= α Σ_{Age, Found, Glut} P(shale) P(Age) P(Glut) P(Found | shale, Age) P(drill | Found, Glut),   where α = 1/P(shale)
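
This sum can also be checked by brute-force enumeration; a sketch reusing the hypothetical CPT values from the Q1(b) sketch above:

```python
from itertools import product

# Hypothetical CPTs (same illustrative numbers as the Q1(b) sketch).
p_type  = {True: 0.3, False: 0.7}                      # P(Type = shale)
p_age   = {True: 0.6, False: 0.4}                      # P(Age > 100M)
p_glut  = {True: 0.5, False: 0.5}                      # P(Glut)
p_found = {(True, True): 0.2, (True, False): 0.4,      # P(Found = true | Type, Age)
           (False, True): 0.05, (False, False): 0.1}
p_drill = {(True, True): 0.6, (True, False): 0.9,      # P(Drill = true | Found, Glut)
           (False, True): 0.05, (False, False): 0.2}

# Sum out Age, Found, Glut with Type = shale and Drill = true fixed.
num = 0.0
for age, found, glut in product([True, False], repeat=3):
    p_f = p_found[(True, age)] if found else 1 - p_found[(True, age)]
    num += (p_type[True] * p_age[age] * p_glut[glut]
            * p_f * p_drill[(found, glut)])

# alpha = 1 / P(shale): conditioning on Type = shale divides out its prior.
print(num / p_type[True])
```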

Q2. You are writing an AI surveillance program for monitoring a suspect. You need to be able to predict the probability of the suspect being at his house. However, you can’t directly monitor the house, and the suspect only goes out at night when you can’t see him. All you can do is monitor whether he has taken in his daily newspaper each morning or not. However, sometimes the newspaper is just not delivered. Refer to whether or not the suspect is in the house on day t as St and whether or not the newspaper was taken in on that day as Nt. Answer the following:

a. What kind of Bayesian network do you need to model this problem? [3 points]

Dynamic Bayesian Network (answering only "HMM": −1)

b. Draw the network. [7 points]

St → St+1 → St+2 → … (state chain), with St → Nt at each time step.
S: Boolean random variable, true if the suspect is in the house.
N: Boolean random variable, true if the newspaper was taken in.

c. Describe what conditional probability tables you need to fully describe this problem. [5 points]

Need the transition model P(St+1 | St) and the sensor model P(Nt | St).

d. Assume you have these tables, as well as the prior probability of the suspect being in the house, and you observe whether or not the newspaper is taken in each day for a week. What probabilistic method can you apply to determine the probability of the suspect being at home at the end of the week given the evidence, P(St | N1:t)? [2 points]

Probabilistic filtering on the Dynamic Bayesian Network. [2 points]

e. Derive or write down the recursive formula for this method. [8 points]

P(St | N1:t) = α P(Nt | St) Σ_{St−1} P(St | St−1) P(St−1 | N1:t−1)

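A minimal sketch of one step of this recursion for the Boolean state St; the function name and structure are illustrative, and the model numbers are the ones Q3 below supplies, so the output can be checked against Q3(c):

```python
# One step of the filtering recursion:
# P(St | N1:t) = alpha * P(Nt | St) * sum_{St-1} P(St | St-1) * P(St-1 | N1:t-1)
P_STAY      = 0.7   # P(home today | home yesterday)   (from Q3)
P_RETURN    = 0.4   # P(home today | away yesterday)   (from Q3)
P_GONE_HOME = 0.9   # P(paper taken | home)            (from Q3)
P_GONE_AWAY = 0.2   # P(paper taken | away)            (from Q3)

def filter_step(belief_home, paper_taken):
    """belief_home = P(St-1 = home | N1:t-1); returns P(St = home | N1:t)."""
    # Prediction: sum over the previous state St-1.
    pred_home = P_STAY * belief_home + P_RETURN * (1 - belief_home)
    pred_away = 1 - pred_home
    # Update: multiply by the sensor likelihood, then normalise (the alpha).
    like_home = P_GONE_HOME if paper_taken else 1 - P_GONE_HOME
    like_away = P_GONE_AWAY if paper_taken else 1 - P_GONE_AWAY
    num_home = like_home * pred_home
    num_away = like_away * pred_away
    return num_home / (num_home + num_away)

print(filter_step(0.5, paper_taken=True))   # ~0.846, matching Q3(c)
```
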
Q3. Represent the surveillance problem in Q2 as a Hidden Markov Model (HMM) using the following information: The probability of the suspect being at home today given he was at home yesterday is 0.7, but given he was not at home it is 0.4. The probability of the newspaper being gone given the suspect is at home is 0.9, but given he was not at home it is 0.2. Answer the following:

a. Write the transition model matrix for the HMM. [7 points]

T = [ 0.7  0.3 ]
    [ 0.4  0.6 ]

(row = yesterday's state, column = today's state, ordered home, not home)

b. Write the sensor model matrix for the HMM given the newspaper is not there (i.e., it has been taken in). [8 points]

O = [ 0.9  0   ]
    [ 0    0.2 ]

(diagonal entries: P(paper gone | home) = 0.9, P(paper gone | not home) = 0.2)

c. Evaluate P(S1 | N1:1) using these matrices, a prior value for S0 of 0.5, and the observation that the newspaper was not there that first day (it had been taken in, so N1 is true). [10 points; −4 for wrong order of matrices]

P(S1 | N1:1) = f1:1 = α O Tᵀ f0 = α [0.9 0; 0 0.2] [0.7 0.4; 0.3 0.6] [0.5; 0.5] = α [0.495; 0.09] = [0.846; 0.154]
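
The same update in matrix form, as a sketch assuming NumPy is available; it follows the α O Tᵀ f0 ordering (the order the −4 penalty refers to):

```python
import numpy as np

T  = np.array([[0.7, 0.3],
               [0.4, 0.6]])     # transition model (rows: St = home, away)
O  = np.diag([0.9, 0.2])        # sensor model for the "paper taken" evidence
f0 = np.array([0.5, 0.5])       # prior P(S0)

f1 = O @ T.T @ f0               # unnormalised forward message
f1 = f1 / f1.sum()              # the alpha normalisation
print(f1)                       # [0.846... 0.153...]
```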

Q4. Answer the following questions about Particle Filtering:

a. What is particle filtering in the context of a Dynamic Bayesian Network? [5 points]

It is an approximate (inexact) method of computing the filtered distribution: instead of calculating it exactly, the distribution is represented by a population of samples (particles).

b. What advantages does it have? [5 points]

It has much better computational complexity than exact calculation.

c. Explain each of the steps in the particle filtering algorithm. [10 points]

1. Create an initial population of N samples drawn from the prior P(X0).

2. Propagate each sample forward using the transition model P(Xt+1 | Xt).

3. Weight each sample by evaluating the new evidence under the sensor model, P(et+1 | Xt+1).

4. Resample: draw a new population of N samples in proportion to the weights, so higher-weight samples appear more often than lower-weight ones.

5. Go back to step 2 until all time steps are done.

d. What step focuses the samples on the high probability portion of the state space? [5 points]

The resampling step: resampling according to the weights generates samples that represent the evidence well, thus keeping the predictions tied to reality.
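
A minimal sketch of these five steps applied to the newspaper HMM of Q3; the particle count, helper name, and use of Python are illustrative choices, not part of the exam answer:

```python
import random

N_PARTICLES = 10_000
P_STAY, P_RETURN = 0.7, 0.4            # transition model from Q3
P_GONE_HOME, P_GONE_AWAY = 0.9, 0.2    # sensor model from Q3

def particle_filter(observations, n=N_PARTICLES):
    # Step 1: sample the initial population from the prior P(S0) = 0.5.
    particles = [random.random() < 0.5 for _ in range(n)]
    for paper_taken in observations:
        # Step 2: propagate each particle through the transition model.
        particles = [random.random() < (P_STAY if home else P_RETURN)
                     for home in particles]
        # Step 3: weight each particle by the sensor model for this observation.
        weights = []
        for home in particles:
            p_taken = P_GONE_HOME if home else P_GONE_AWAY
            weights.append(p_taken if paper_taken else 1 - p_taken)
        # Step 4: resample in proportion to the weights.
        particles = random.choices(particles, weights=weights, k=n)
        # Step 5: continue with the next time step.
    return sum(particles) / n          # estimate of P(St = home | N1:t)

print(particle_filter([True]))         # ~0.85, close to the exact 0.846 from Q3(c)
```

With enough particles the estimate approaches the exact filtered value of about 0.846 from Q3(c).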