COMP30024 Artificial Intelligence
Week 12 Problem Sheet
Weekly Topic Review
Bayes’ Theorem allows us to update uncertainty as new information is acquired. Assume we hold some hypothesis $H$ and observe a series of independent measurements $\{x_1, x_2, \dots, x_T\}$. For example, in a robotics context $H$ could reflect beliefs about the current environmental state, and the sequence $\{x_t\}_{t=1}^{T}$ could be sensor measurements. We want to understand how our uncertainty about $H$ evolves given these observations, using the following ingredients: a) an observation model $p(x_t \mid H)$, and b) a prior over hypotheses $p(H)$. Our goal is to find the time-dependent posterior over hypotheses, $p(H \mid x_1, x_2, \dots, x_t)$.
We let $\mathbf{x}_n = (x_1, x_2, \dots, x_n)$ denote the history of measurements up to step $n$. Bayes’ theorem says the posterior at time $t$ is $p(H \mid \mathbf{x}_t) \propto p(\mathbf{x}_t \mid H)\,p(H)$. Similarly, at time $t+1$, the posterior is $p(H \mid \mathbf{x}_{t+1}) \propto p(\mathbf{x}_{t+1} \mid H)\,p(H)$. How do we get from $p(H \mid \mathbf{x}_t)$ to $p(H \mid \mathbf{x}_{t+1})$? Do we have to start over? A straightforward invocation of Bayes’ theorem, together with the chain rule of probability, obviates this computational difficulty:
\begin{align*}
p(H \mid \mathbf{x}_{t+1}) &\propto p(\mathbf{x}_{t+1} \mid H)\,p(H) \\
&= p(x_{t+1}, \mathbf{x}_t \mid H)\,p(H) \\
&= p(x_{t+1} \mid \mathbf{x}_t, H)\,p(\mathbf{x}_t \mid H)\,p(H) \\
&\propto p(x_{t+1} \mid H)\,p(H \mid \mathbf{x}_t)
\end{align*}
The final line uses the conditional independence of $x_{t+1}$ and $\mathbf{x}_t$ given $H$, together with $p(H \mid \mathbf{x}_t) \propto p(\mathbf{x}_t \mid H)\,p(H)$. Schematically:
Posterior ∝ Likelihood × Prior
New Posterior ∝ Likelihood × Current Posterior
We return to the question asked originally – how does uncertainty in $H$ evolve given new observations? The answer is to reuse the current posterior distribution as the prior distribution in the next time step, and normalize appropriately.
$p(H \mid \mathbf{x}_1) \to p(H \mid \mathbf{x}_2) \to \cdots \to p(H \mid \mathbf{x}_T)$
Incidentally, this demonstrates the coherency property of Bayes’ theorem – if we acquire multiple pieces of information and wish to update our probabilities to reflect our uncertainty incorporating all of it, then updating our beliefs sequentially, one observation at a time, is equivalent to updating them simultaneously, taking all the information into account at once.
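To make the recursion concrete, here is a minimal Python sketch (not part of the original sheet): it reuses the current posterior as the prior at each step. The hypothesis labels, likelihood table, prior, and observation sequence are illustrative placeholders only.

```python
# Minimal sketch of sequential Bayesian updating over a discrete hypothesis space.
# The hypotheses, likelihood table and observation sequence are placeholders,
# not values from this problem sheet.

def bayes_update(prior, likelihood, observation):
    """One Bayes step: posterior(H) proportional to p(observation | H) * prior(H)."""
    unnormalised = {h: likelihood[h][observation] * p for h, p in prior.items()}
    z = sum(unnormalised.values())            # normalising constant p(observation)
    return {h: v / z for h, v in unnormalised.items()}

# p(x | H) for a binary observation x in {"+", "-"} under two hypotheses.
likelihood = {
    "H":     {"+": 0.7, "-": 0.3},
    "not H": {"+": 0.3, "-": 0.7},
}

belief = {"H": 0.1, "not H": 0.9}             # prior p(H)
for x in ["+", "+", "-"]:
    belief = bayes_update(belief, likelihood, x)   # posterior becomes the next prior
    print(x, belief)
```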
1. Robotics [T] Suppose that, for some environment, the odds of an obstacle being present are 1 in 10, and that a range sensor has a false positive rate of 30% and a false negative rate of 30%. (A small numerical check is sketched after part (b).)
(a) Use the incremental form of Bayes’ Theorem to find the probability that an obstacle is present if the detector returns three positive detections in a row.
(b) What is the probability of no obstacle being present if the detector returns a positive detection followed by a negative detection?
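After attempting Question 1 by hand, the arithmetic can be checked with the sketch below. It assumes the “1 in 10” figure is read as $p(\text{obstacle}) = 0.1$ and the error rates as $p(+ \mid \text{no obstacle}) = p(- \mid \text{obstacle}) = 0.3$; the helper name posterior_obstacle is an illustrative choice, not from the sheet.

```python
# Rough numerical check for Question 1, assuming p(obstacle) = 0.1 and
# p(+ | no obstacle) = p(- | obstacle) = 0.3 (symmetric 30% error rates).

def posterior_obstacle(prior, detections):
    """Posterior p(obstacle | detections) via the odds form of Bayes' theorem."""
    odds = prior / (1 - prior)
    for d in detections:
        # multiply by the likelihood ratio p(d | obstacle) / p(d | no obstacle)
        odds *= (0.7 / 0.3) if d == "+" else (0.3 / 0.7)
    return odds / (1 + odds)

print("(a) p(obstacle | +, +, +) =", posterior_obstacle(0.1, "+++"))
print("(b) p(no obstacle | +, -) =", 1 - posterior_obstacle(0.1, "+-"))
```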
2. Leckieitis, the Return [T] After your yearly checkup, the doctor has bad news and good news. The bad news is that you tested positive for the serious disease Leckieitis, and the test is 99% ’accurate’ (i.e., the probability of testing positive given that you have the disease is 0.99, as is the probability of testing negative if you don’t have the disease). The good news is that this is a rare disease, striking only one in 10,000 people. (A small numerical check is sketched after part (c).)
(a) What are the chances that you actually have the disease?
(b) You decide to get tested a second time. The new test is independent of the original test (given your disease status), and has the same ’accuracy’ as the original test. Unfortunately, you test positive a second time. Find the probability that you have the disease in two ways:
(i) In one step, conditioning on both test results simultaneously.
(ii) In two steps, first updating beliefs based on the first test result, then again based on the second test result.
(c) To be thorough, you visit n Leckieitis screening sites throughout Melbourne, receiving p positive results and n − p negative results. Assuming all tests have identical accuracy and are independent given your disease status, what is the posterior probability that you have Leckieitis?
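Similarly, a rough numerical check for Question 2, assuming the prior is 1/10,000 and that “99% accurate” means $p(+ \mid \text{disease}) = p(- \mid \text{no disease}) = 0.99$, with all tests conditionally independent given disease status. Under these assumptions the same function also evaluates the general case asked for in part (c); the name posterior_disease is an illustrative choice.

```python
# Rough numerical check for Question 2, assuming p(disease) = 1/10000 and
# p(+ | disease) = p(- | no disease) = 0.99, with tests conditionally
# independent given disease status.

def posterior_disease(n, p, prior=1e-4, acc=0.99):
    """Posterior p(disease | p positives and n - p negatives), via the odds form."""
    odds = prior / (1 - prior)
    odds *= (acc / (1 - acc)) ** p          # p positive results
    odds *= ((1 - acc) / acc) ** (n - p)    # n - p negative results
    return odds / (1 + odds)

print("(a) one positive test:  ", posterior_disease(n=1, p=1))
print("(b) two positive tests: ", posterior_disease(n=2, p=2))
```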