
CIS 471/571 (Fall 2020): Introduction to Artificial Intelligence
Lecture 12: Probability
Thanh H. Nguyen
Source: http://ai.berkeley.edu/home.html

Reminder
§ Project 3: Reinforcement Learning
  § Deadline: Nov 10th, 2020
§ Homework 3: MDPs and Reinforcement Learning
  § Deadline: Nov 10th, 2020

Today
§ Probability
§ Random Variables
§ Joint and Marginal Distributions
§ Conditional Distribution
§ Product Rule, Chain Rule, Bayes' Rule
§ Inference
§ Independence
§ You'll need all this stuff A LOT for the next few weeks, so make sure you go over it now!

Uncertainty
§ General situation:
§ Observed variables (evidence): Agent knows certain things about the state of the world (e.g., sensor readings or symptoms)
§ Unobserved variables: Agent needs to reason about other aspects (e.g. where an object is or what disease is present)
§ Model: Agent knows something about how the known variables relate to the unknown variables
§ Probabilistic reasoning gives us a framework for managing our beliefs and knowledge

Random Variables
§ A random variable is some aspect of the world about which we (may) have uncertainty
§ R = Is it raining?
§ T = Is it hot or cold?
§ D = How long will it take to drive to work?
§ L = Where is the ghost?
§ We denote random variables with capital letters
§ Like variables in a CSP, random variables have domains:
  § R in {true, false} (often written as {+r, -r})
  § T in {hot, cold}
  § D in [0, ∞)
  § L in possible locations, maybe {(0,0), (0,1), …}

Probability Distributions
§ Associate a probability with each value

§ Temperature: P(T)

    T     P
    hot   0.5
    cold  0.5

§ Weather: P(W)

    W       P
    sun     0.6
    rain    0.1
    fog     0.3
    meteor  0.0

Probability Distributions
§ Unobserved random variables have distributions

    T     P
    hot   0.5
    cold  0.5

    W       P
    sun     0.6
    rain    0.1
    fog     0.3
    meteor  0.0

§ Shorthand notation: P(hot) = P(T = hot), P(rain) = P(W = rain)
  § OK if all domain entries are unique
§ A distribution is a TABLE of probabilities of values
§ A probability (lower case value) is a single number, e.g., P(W = rain) = 0.1
§ Must have: P(X = x) ≥ 0 for all x, and Σ_x P(X = x) = 1
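A minimal Python sketch of the two constraints above, checking that a table of probabilities is a valid distribution (the function name is illustrative, not from the lecture):

```python
def is_valid_distribution(dist, tol=1e-9):
    """All probabilities non-negative, and the total sums to 1."""
    return all(p >= 0 for p in dist.values()) and abs(sum(dist.values()) - 1.0) < tol

# The P(W) table from the slides
P_W = {"sun": 0.6, "rain": 0.1, "fog": 0.3, "meteor": 0.0}
print(is_valid_distribution(P_W))  # True
```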

Joint Distributions
§ A joint distribution over a set of random variables X_1, X_2, ..., X_n specifies a real number for each assignment (or outcome):
  P(X_1 = x_1, X_2 = x_2, ..., X_n = x_n)
§ Must obey:
  P(x_1, x_2, ..., x_n) ≥ 0
  Σ over all (x_1, ..., x_n) of P(x_1, x_2, ..., x_n) = 1
§ Size of distribution if n variables with domain sizes d? d^n
§ For all but the smallest distributions, impractical to write out!

    T     W     P
    hot   sun   0.4
    hot   rain  0.1
    cold  sun   0.2
    cold  rain  0.3
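A minimal sketch of the joint table above stored as a Python dictionary keyed by outcome (the representation is illustrative, not from the lecture):

```python
# Joint distribution P(T, W) from the table above, keyed by (t, w) outcomes
P_TW = {
    ("hot", "sun"): 0.4,
    ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2,
    ("cold", "rain"): 0.3,
}

# A joint distribution must sum to 1 over all outcomes
assert abs(sum(P_TW.values()) - 1.0) < 1e-9

# n variables with domain size d give a table of d**n entries:
# here n = 2 and d = 2, so 4 rows -- exponential in n in general
print(len(P_TW))  # 4
```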

Probabilistic Models
§ A probabilistic model is a joint distribution over a set of random variables
§ Probabilistic models:
§ (Random) variables with domains
§ Assignments are called outcomes
§ Joint distributions: say whether assignments (outcomes) are likely
§ Normalized: sum to 1.0
§ Ideally: only certain variables directly interact
§ Constraint satisfaction problems:
§ Variables with domains
§ Constraints: state whether assignments are possible
§ Ideally: only certain variables directly interact
Distribution over T,W:

    T     W     P
    hot   sun   0.4
    hot   rain  0.1
    cold  sun   0.2
    cold  rain  0.3

Constraint over T,W:

    T     W     P
    hot   sun   T
    hot   rain  F
    cold  sun   F
    cold  rain  T

Events
§ An event is a set E of outcomes
§ From a joint distribution, we can calculate the probability of any event:
  P(E) = Σ over outcomes (x_1, ..., x_n) in E of P(x_1, ..., x_n)
  § Probability that it's hot AND sunny?
  § Probability that it's hot?
  § Probability that it's hot OR sunny?
§ Typically, the events we care about are partial assignments, like P(T=hot)

    T     W     P
    hot   sun   0.4
    hot   rain  0.1
    cold  sun   0.2
    cold  rain  0.3
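A minimal sketch of computing the three event probabilities above by summing matching outcomes (function name and predicate style are illustrative assumptions):

```python
P_TW = {
    ("hot", "sun"): 0.4,
    ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2,
    ("cold", "rain"): 0.3,
}

def prob_event(joint, predicate):
    """P(E): sum the probabilities of every outcome in the event."""
    return sum(p for outcome, p in joint.items() if predicate(outcome))

print(prob_event(P_TW, lambda o: o == ("hot", "sun")))             # hot AND sunny: 0.4
print(prob_event(P_TW, lambda o: o[0] == "hot"))                   # hot: 0.5
print(prob_event(P_TW, lambda o: o[0] == "hot" or o[1] == "sun"))  # hot OR sunny: 0.7 (up to float rounding)
```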

Quiz: Events
§P(+x, +y) ? §P(+x) ?
§P(-y OR +x) ?
X
Y
P
+x
+y
0.2
+x
-y
0.3
-x
+y
0.4
-x
-y
0.1

Marginal Distributions
§ Marginal distributions are sub-tables which eliminate variables
§ Marginalization (summing out): combine collapsed rows by adding:
  P(X = x) = Σ_y P(X = x, Y = y)

Joint distribution P(T, W):

    T     W     P
    hot   sun   0.4
    hot   rain  0.1
    cold  sun   0.2
    cold  rain  0.3

P(T):
    T     P
    hot   0.5
    cold  0.5

P(W):
    W     P
    sun   0.6
    rain  0.4
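A minimal sketch of summing out one variable from the joint table above (the helper name and tuple-index convention are illustrative):

```python
from collections import defaultdict

P_TW = {
    ("hot", "sun"): 0.4,
    ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2,
    ("cold", "rain"): 0.3,
}

def marginal(joint, index):
    """Keep the variable at `index` of each outcome tuple; sum out the rest."""
    result = defaultdict(float)
    for outcome, p in joint.items():
        result[outcome[index]] += p
    return dict(result)

print(marginal(P_TW, 0))  # P(T): {'hot': 0.5, 'cold': 0.5}
print(marginal(P_TW, 1))  # P(W): {'sun': 0.6, 'rain': 0.4} (up to float rounding)
```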

Quiz: Marginal Distributions
    X   Y   P
    +x  +y  0.2
    +x  -y  0.3
    -x  +y  0.4
    -x  -y  0.1

P(X):
    X   P
    +x  ?
    -x  ?

P(Y):
    Y   P
    +y  ?
    -y  ?

Conditional Probabilities
§ A simple relation between joint and marginal probabilities
§ In fact, this is taken as the definition of a conditional probability:
  P(a | b) = P(a, b) / P(b)

    T     W     P
    hot   sun   0.4
    hot   rain  0.1
    cold  sun   0.2
    cold  rain  0.3
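A minimal sketch of this definition applied to the joint table above (the function name and index-based arguments are illustrative assumptions):

```python
P_TW = {
    ("hot", "sun"): 0.4,
    ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2,
    ("cold", "rain"): 0.3,
}

def conditional(joint, a_index, a_value, b_index, b_value):
    """P(A = a | B = b) = P(a, b) / P(b)."""
    p_ab = sum(p for o, p in joint.items()
               if o[a_index] == a_value and o[b_index] == b_value)
    p_b = sum(p for o, p in joint.items() if o[b_index] == b_value)
    return p_ab / p_b

# P(W = sun | T = cold) = 0.2 / 0.5 = 0.4
print(conditional(P_TW, 1, "sun", 0, "cold"))
```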

Quiz: Conditional Probabilities
§ P(+x | +y) ?
§ P(-x | +y) ?
§ P(-y | +x) ?

    X   Y   P
    +x  +y  0.2
    +x  -y  0.3
    -x  +y  0.4
    -x  -y  0.1

Conditional Distributions
§ Conditional distributions are probability distributions over some variables given fixed values of others

Joint distribution P(T, W):

    T     W     P
    hot   sun   0.4
    hot   rain  0.1
    cold  sun   0.2
    cold  rain  0.3

Conditional distributions:

P(W | T = hot):
    W     P
    sun   0.8
    rain  0.2

P(W | T = cold):
    W     P
    sun   0.4
    rain  0.6


Normalization Trick
§ Example: compute P(W | T = cold) from the joint distribution P(T, W):

    T     W     P
    hot   sun   0.4
    hot   rain  0.1
    cold  sun   0.2
    cold  rain  0.3

SELECT the joint probabilities matching the evidence (T = cold):

    T     W     P
    cold  sun   0.2
    cold  rain  0.3

NORMALIZE the selection (make it sum to one):

P(W | T = cold):
    W     P
    sun   0.4
    rain  0.6

§ Why does this work? Sum of selection is P(evidence)! (P(T=c), here)
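A minimal sketch of the select-then-normalize procedure on the joint table above (the function name and evidence convention are illustrative assumptions):

```python
P_TW = {
    ("hot", "sun"): 0.4,
    ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2,
    ("cold", "rain"): 0.3,
}

def select_then_normalize(joint, evidence_index, evidence_value):
    """SELECT entries matching the evidence, then NORMALIZE them to sum to one."""
    selected = {o: p for o, p in joint.items() if o[evidence_index] == evidence_value}
    z = sum(selected.values())  # the sum of the selection is P(evidence)
    return {o: p / z for o, p in selected.items()}

# P(W | T = cold): {('cold', 'sun'): 0.4, ('cold', 'rain'): 0.6}
print(select_then_normalize(P_TW, 0, "cold"))
```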

Quiz: Normalization Trick
§ P(X | Y = -y) ?

SELECT the joint probabilities matching the evidence, then NORMALIZE the selection (make it sum to one).

    X   Y   P
    +x  +y  0.2
    +x  -y  0.3
    -x  +y  0.4
    -x  -y  0.1

Probabilistic Inference
§ Probabilistic inference: compute a desired probability from other known probabilities (e.g. conditional from joint)
§ We generally compute conditional probabilities
  § P(on time | no reported accidents) = 0.90
§ These represent the agent’s beliefs given the evidence
§ Probabilities change with new evidence:
§ P(on time | no accidents, 5 a.m.) = 0.95
§ P(on time | no accidents, 5 a.m., raining) = 0.80
§ Observing new evidence causes beliefs to be updated

Inference by Enumeration
§ General case:
  § Evidence variables: E_1 ... E_k = e_1 ... e_k
  § Query* variable: Q
  § Hidden variables: H_1 ... H_r
  (together, these are all the variables X_1 ... X_n)
§ We want: P(Q | e_1 ... e_k)
§ Step 1: Select the entries consistent with the evidence
§ Step 2: Sum out H to get the joint of the query and the evidence:
  P(Q, e_1 ... e_k) = Σ over h_1 ... h_r of P(Q, h_1, ..., h_r, e_1, ..., e_k)
§ Step 3: Normalize:
  Z = Σ_q P(q, e_1 ... e_k);  P(Q | e_1 ... e_k) = (1/Z) × P(Q, e_1 ... e_k)

* Works fine with multiple query variables, too

Inference by Enumeration
§ P(W)?
§ P(W | winter)?
§ P(W | winter, hot)?

    S       T     W     P
    summer  hot   sun   0.30
    summer  hot   rain  0.05
    summer  cold  sun   0.10
    summer  cold  rain  0.05
    winter  hot   sun   0.10
    winter  hot   rain  0.05
    winter  cold  sun   0.15
    winter  cold  rain  0.20
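As a concrete illustration of the three enumeration steps, a minimal Python sketch run on the joint table above (the function name and its index-keyed evidence convention are illustrative assumptions, not from the lecture):

```python
from collections import defaultdict

# Joint distribution P(S, T, W) from the table above
P_STW = {
    ("summer", "hot", "sun"): 0.30, ("summer", "hot", "rain"): 0.05,
    ("summer", "cold", "sun"): 0.10, ("summer", "cold", "rain"): 0.05,
    ("winter", "hot", "sun"): 0.10, ("winter", "hot", "rain"): 0.05,
    ("winter", "cold", "sun"): 0.15, ("winter", "cold", "rain"): 0.20,
}

def enumerate_inference(joint, query_index, evidence):
    """P(Q | evidence): select consistent entries, sum out hidden vars, normalize."""
    # Steps 1 and 2: keep entries consistent with the evidence and
    # accumulate over hidden variables to get P(Q, evidence)
    unnormalized = defaultdict(float)
    for outcome, p in joint.items():
        if all(outcome[i] == v for i, v in evidence.items()):
            unnormalized[outcome[query_index]] += p
    # Step 3: normalize (Z is P(evidence))
    z = sum(unnormalized.values())
    return {q: p / z for q, p in unnormalized.items()}

print(enumerate_inference(P_STW, 2, {}))                       # P(W): sun 0.65, rain 0.35
print(enumerate_inference(P_STW, 2, {0: "winter"}))            # P(W | winter): sun 0.5, rain 0.5
print(enumerate_inference(P_STW, 2, {0: "winter", 1: "hot"}))  # P(W | winter, hot): sun 2/3, rain 1/3
```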

Inference by Enumeration
§ Obvious problems:
§ Worst-case time complexity O(d^n)
§ Space complexity O(d^n) to store the joint distribution

The Product Rule
§ Sometimes have conditional distributions but want the joint:
  P(x, y) = P(x | y) P(y)

The Product Rule
§ Example: P(D, W) = P(D | W) P(W)

P(W):
    W     P
    sun   0.8
    rain  0.2

P(D | W):
    D    W     P
    wet  sun   0.1
    dry  sun   0.9
    wet  rain  0.7
    dry  rain  0.3

P(D, W):
    D    W     P
    wet  sun   0.08
    dry  sun   0.72
    wet  rain  0.14
    dry  rain  0.06
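A minimal sketch of the product rule applied to the tables above (variable names are illustrative):

```python
# P(W) and P(D | W) from the example above
P_W = {"sun": 0.8, "rain": 0.2}
P_D_given_W = {
    ("wet", "sun"): 0.1, ("dry", "sun"): 0.9,
    ("wet", "rain"): 0.7, ("dry", "rain"): 0.3,
}

# Product rule: P(d, w) = P(d | w) * P(w)
P_DW = {(d, w): p * P_W[w] for (d, w), p in P_D_given_W.items()}
print(P_DW)  # matches the joint table above (up to float rounding)
```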

The Chain Rule
§ More generally, we can always write any joint distribution as an incremental product of conditional distributions:
  P(x_1, x_2, ..., x_n) = Π_i P(x_i | x_1, ..., x_{i-1})
  e.g., P(x_1, x_2, x_3) = P(x_1) P(x_2 | x_1) P(x_3 | x_1, x_2)
§ Why is this always true? (See the derivation below.)
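To answer the slide's question: the chain rule is just the product rule applied repeatedly. A short derivation for three variables (standard, not from the slides):

```latex
\begin{align*}
P(x_1, x_2, x_3) &= P(x_3 \mid x_1, x_2)\, P(x_1, x_2)
                    && \text{product rule} \\
                 &= P(x_3 \mid x_1, x_2)\, P(x_2 \mid x_1)\, P(x_1)
                    && \text{product rule again}
\end{align*}
```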

Bayes' Rule
§ Two ways to factor a joint distribution over two variables:
  P(x, y) = P(x | y) P(y) = P(y | x) P(x)
§ Dividing, we get:
  P(x | y) = P(y | x) P(x) / P(y)
§ Why is this at all helpful?
  § Lets us build one conditional from its reverse
  § Often one conditional is tricky but the other one is simple
  § In the running for most important AI equation!

Quiz
§ Given:

P(W):
    W     P
    sun   0.8
    rain  0.2

P(D | W):
    D    W     P
    wet  sun   0.1
    dry  sun   0.9
    wet  rain  0.7
    dry  rain  0.3

§ What is P(W | dry)?

Inference with Bayes’ Rule
§ Diagnostic probability from causal probability:
  P(cause | effect) = P(effect | cause) P(cause) / P(effect)

§ Example:
  § M: meningitis, S: stiff neck

  Example givens:
  P(+m) = 0.0001
  P(+s | +m) = 0.8
  P(+s | -m) = 0.01

  P(+m | +s) = P(+s | +m) P(+m) / P(+s)
             = P(+s | +m) P(+m) / [P(+s | +m) P(+m) + P(+s | -m) P(-m)]
             = (0.8 × 0.0001) / (0.8 × 0.0001 + 0.01 × 0.9999)
             ≈ 0.0079

§ Note: posterior probability of meningitis still very small
§ Note: you should still get stiff necks checked out! Why?
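A minimal sketch of the same computation in Python, using the givens above (variable names are illustrative):

```python
# Givens from the meningitis example
p_m = 0.0001            # P(+m)
p_s_given_m = 0.8       # P(+s | +m)
p_s_given_not_m = 0.01  # P(+s | -m)

# P(+s) by total probability, then Bayes' rule for the posterior
p_s = p_s_given_m * p_m + p_s_given_not_m * (1 - p_m)
p_m_given_s = p_s_given_m * p_m / p_s
print(p_m_given_s)  # ~0.0079: the posterior is still very small
```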