Quantifying Uncertainty with Probability
CSci 5512: Artificial Intelligence II
Instructor:
January 25, 2022
Probability
Sample space Ω of events
Each “event” ω ∈ Ω has an associated “measure”
Probability of the event P(ω)
Axioms of Probability:
∀ω, P(ω) ∈ [0, 1]
P(Ω) = 1
P(ω1 ∪ ω2) = P(ω1) + P(ω2) − P(ω1 ∩ ω2)
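The axioms above can be checked numerically on a small finite sample space. The sketch below uses a fair six-sided die as an illustrative Ω (the die and the events `evens`/`highs` are assumptions, not from the slides):

```python
from fractions import Fraction

# Sample space Ω for one fair six-sided die (illustrative assumption)
omega = {1, 2, 3, 4, 5, 6}
P = lambda event: Fraction(len(event), len(omega))  # uniform measure

evens = {2, 4, 6}
highs = {4, 5, 6}

# Axiom: every probability lies in [0, 1], and P(Ω) = 1
assert 0 <= P(evens) <= 1
assert P(omega) == 1

# Inclusion-exclusion: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
assert P(evens | highs) == P(evens) + P(highs) - P(evens & highs)
print(P(evens | highs))  # 2/3
```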
Random Variables
Random variables (r.v.s) are mappings of events (to real numbers)
Mapping X : Ω → R
Any event ω maps to X(ω)
Tossing a coin has two possible outcomes
Denoted by {H,T} or {0,1}
Fair coin has uniform probabilities
P(X = 0) = 1/2, P(X = 1) = 1/2
Random variables can be
Discrete, e.g., Bernoulli
Continuous, e.g., Gaussian
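The coin-toss r.v. can be simulated directly: outcomes in Ω = {H, T} are mapped to {0, 1}, and for a fair coin the empirical frequency of heads approaches P(X = 1) = 1/2 (the simulation and seed are illustrative assumptions):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible (assumption)

# X maps each outcome in Ω = {"H", "T"} to a real number: H → 1, T → 0
X = {"H": 1, "T": 0}

flips = [random.choice(["H", "T"]) for _ in range(100_000)]
freq_heads = sum(X[w] for w in flips) / len(flips)

# For a fair coin the empirical frequency is close to P(X = 1) = 1/2
print(freq_heads)
```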
Distribution, Density
For a continuous r.v.
Distribution function F(x) = P(X ≤ x)
Corresponding density function f(x) = dF(x)/dx
For a discrete r.v.
Probability mass function f (x) = P(X = x) = P(x)
We will call this the probability of a discrete event
Distribution function F(x) = P(X ≤ x)
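For a discrete r.v., the distribution function is just the accumulated PMF. A minimal sketch, using a fair die as an assumed example:

```python
from fractions import Fraction

# PMF of a fair six-sided die (illustrative assumption): f(x) = P(X = x)
f = {x: Fraction(1, 6) for x in range(1, 7)}

# Distribution function F(x) = P(X ≤ x), accumulated from the PMF
def F(x):
    return sum(p for k, p in f.items() if k <= x)

assert F(3) == Fraction(1, 2)  # half the outcomes are ≤ 3
assert F(6) == 1               # the whole sample space has probability 1
print(F(3), F(6))
```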
Joint Distributions, Marginals
For two continuous r.v.s. X1, X2
Joint distribution F(x1, x2) = P(X1 ≤ x1, X2 ≤ x2)
Joint density function f(x1, x2) can be defined as before
The marginal probability density f(x1) = ∫ f(x1, x2) dx2
For two discrete r.v.s. X1, X2
Joint probability f(x1, x2) = P(X1 = x1, X2 = x2) = P(x1, x2)
The marginal probability P(X1 = x1) = Σx2 P(X1 = x1, X2 = x2)
Can be extended to joint distributions over several r.v.s.
Many hard problems involve computing marginals
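In the discrete case, marginalization is a sum over the eliminated variable. A sketch with a hypothetical joint table over two binary variables (the numbers are made up for illustration):

```python
from fractions import Fraction

# Hypothetical joint distribution P(X1, X2) over two binary variables
joint = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
    (1, 0): Fraction(2, 8), (1, 1): Fraction(2, 8),
}

# Marginal: P(X1 = x1) = Σ_{x2} P(X1 = x1, X2 = x2)
def marginal_x1(x1):
    return sum(p for (a, _), p in joint.items() if a == x1)

assert marginal_x1(0) == Fraction(1, 2)
assert marginal_x1(1) == Fraction(1, 2)
assert sum(joint.values()) == 1  # the joint itself is normalized
```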
Expectation
The expected value of a r.v. X
For continuous r.v.s. E[X] = ∫ x p(x) dx
For discrete r.v.s. E[X] = Σi xi pi
Expectation is a linear operator
E[aX + bY + c] = aE[X] + bE[Y] + c
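Linearity can be verified exactly on a finite example. The sketch below uses two independent fair dice and arbitrary constants a, b, c (all assumptions for illustration):

```python
from fractions import Fraction
from itertools import product

# Two independent fair dice X, Y: uniform joint over 36 outcomes (assumption)
values = range(1, 7)
p = Fraction(1, 36)

def E(g):
    """Expectation of g(X, Y) under the uniform joint distribution."""
    return sum(g(x, y) * p for x, y in product(values, values))

a, b, c = 2, 3, 5
lhs = E(lambda x, y: a * x + b * y + c)
rhs = a * E(lambda x, y: x) + b * E(lambda x, y: y) + c
assert lhs == rhs  # linearity: E[aX + bY + c] = aE[X] + bE[Y] + c
print(lhs)  # E[X] = E[Y] = 7/2, so 2·(7/2) + 3·(7/2) + 5 = 45/2
```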
Independence
Joint probability P(X1 = x1, X2 = x2)
X1, X2 are different dice
X1 denotes if grass is wet, X2 denotes if sprinkler was on
Two r.v.s. are independent if
P(X1 = x1, X2 = x2) = P(X1 = x1)P(X2 = x2)
Two different dice are independent
If sprinkler was on, then grass will be wet ⇒ dependent
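Both examples can be checked against the definition: the product rule holds for every entry of the two-dice joint, but fails for a sprinkler/grass joint (the sprinkler numbers below are made up for illustration):

```python
from fractions import Fraction
from itertools import product

# Two fair dice: uniform joint over 36 outcomes
dice_joint = {(x1, x2): Fraction(1, 36)
              for x1, x2 in product(range(1, 7), repeat=2)}

def marg(joint, axis, v):
    """Marginal probability that the variable at `axis` equals v."""
    return sum(p for xs, p in joint.items() if xs[axis] == v)

# Independence: P(X1 = x1, X2 = x2) = P(X1 = x1) P(X2 = x2) everywhere
assert all(p == marg(dice_joint, 0, x1) * marg(dice_joint, 1, x2)
           for (x1, x2), p in dice_joint.items())

# Hypothetical sprinkler/wet-grass joint (made-up numbers): dependent
sw_joint = {(1, 1): Fraction(4, 10), (1, 0): Fraction(1, 10),
            (0, 1): Fraction(1, 10), (0, 0): Fraction(4, 10)}
assert any(p != marg(sw_joint, 0, w) * marg(sw_joint, 1, s)
           for (w, s), p in sw_joint.items())  # product rule fails
```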
Conditional Probability, Bayes Rule
[Table: joint probabilities over {Sprinkler On, Sprinkler Off} × {Grass Wet, Grass Dry}]
Inference problems:
Given ‘grass wet’, what is P(“sprinkler on” | “grass wet”)?
Given ‘symptom’, what is P(“disease” | “symptom”)?
For any r.v.s. X, Y, the conditional probability P(x | y) = P(x, y) / P(y)
Since P(x, y) = P(y | x)P(x), we have
P(x | y) = P(y | x)P(x) / P(y)
Expressing ‘posterior’ in terms of ‘conditional’: Bayes Rule
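The sprinkler inference can be carried out with Bayes rule directly. The prior and conditionals below are hypothetical numbers (the slide's table entries are not given):

```python
from fractions import Fraction

# Hypothetical numbers for the sprinkler/grass example (assumptions)
P_sprinkler = Fraction(3, 10)                # prior P(sprinkler on)
P_wet_given = {True: Fraction(9, 10),        # P(grass wet | sprinkler on)
               False: Fraction(2, 10)}       # P(grass wet | sprinkler off)

# P(wet), by marginalizing over the sprinkler state
P_wet = (P_wet_given[True] * P_sprinkler
         + P_wet_given[False] * (1 - P_sprinkler))

# Bayes rule: P(sprinkler on | wet) = P(wet | on) P(on) / P(wet)
posterior = P_wet_given[True] * P_sprinkler / P_wet
print(posterior)  # 27/41
```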
Product Rule & Independence
Product Rule:
For X1, X2, P(X1, X2) = P(X1)P(X2|X1)
For X1, X2, X3, P(X1, X2, X3) = P(X1)P(X2 | X1)P(X3 | X1, X2)
In general, the chain rule:
P(X1, …, Xn) = ∏i=1..n P(Xi | X1, …, Xi−1)
Joint distribution of n Boolean variables
Specification requires 2^n − 1 parameters
Recall Independence:
For X1, X2, P(X1, X2) = P(X1)P(X2)
In general
P(X1, …, Xn) = ∏i=1..n P(Xi)
Independence reduces specification to n parameters
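The parameter counts are simple arithmetic and can be tabulated for a few values of n:

```python
# Parameter counts for a joint distribution over n Boolean variables:
# the full joint needs 2^n − 1 free parameters (one per entry, minus
# normalization); under full independence, one P(Xi = true) per variable.
for n in (2, 4, 10):
    full_joint = 2 ** n - 1
    independent = n
    print(n, full_joint, independent)
# prints:
# 2 3 2
# 4 15 4
# 10 1023 10
```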
Independence
Consider 4 variables: Toothache, Catch, Cavity, Weather
Independence implies
P(Toothache,Catch,Cavity,Weather)
= P(Toothache,Catch,Cavity)P(Weather)
In terms of the joint distribution:
32 entries reduce to 12 (8 for P(Toothache, Catch, Cavity) plus 4 for P(Weather))
For Boolean variables, 2^n − 1 reduces to n
Absolute independence powerful but rare
Conditional Independence
X and Y are conditionally independent given Z if
P(X, Y | Z) = P(X | Z)P(Y | Z)
P(Toothache,Catch|Cavity) = P(Toothache|Cavity)P(Catch|Cavity)
Conditional independence simplifies joint distributions
Often reduces from exponential to linear in n
P(X,Y,Z) = P(Z)P(X|Z)P(Y|Z)
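The factorization P(X, Y, Z) = P(Z)P(X | Z)P(Y | Z) can be built from factor tables and checked against the definition. The factor values below are made-up illustrative numbers:

```python
from fractions import Fraction
from itertools import product

# Hypothetical factors over binary variables (assumed numbers)
P_z = {0: Fraction(2, 5), 1: Fraction(3, 5)}
P_x_z = {0: {0: Fraction(1, 2), 1: Fraction(1, 2)},
         1: {0: Fraction(1, 4), 1: Fraction(3, 4)}}
P_y_z = {0: {0: Fraction(1, 3), 1: Fraction(2, 3)},
         1: {0: Fraction(3, 5), 1: Fraction(2, 5)}}

# Build the joint via P(X, Y, Z) = P(Z) P(X|Z) P(Y|Z)
joint = {(x, y, z): P_z[z] * P_x_z[z][x] * P_y_z[z][y]
         for x, y, z in product((0, 1), repeat=3)}

# Verify P(X, Y | Z) = P(X | Z) P(Y | Z) for every assignment
for x, y, z in product((0, 1), repeat=3):
    assert joint[(x, y, z)] / P_z[z] == P_x_z[z][x] * P_y_z[z][y]
assert sum(joint.values()) == 1  # a valid joint distribution
```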
Naive Bayes Model
If X1, …, Xn are independent given Y
P(Y, X1, …, Xn) = P(Y) ∏i=1..n P(Xi | Y)
P(Cavity,Toothache,Catch)
= P(Cavity)P(Toothache|Cavity)P(Catch|Cavity)
More generally
P(Cause, Effect1, …, Effectn) = P(Cause) ∏i P(Effecti | Cause)
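Inference in the naive Bayes model is a product of factors followed by normalization. A sketch for the cavity example, with made-up conditional probabilities (the slide gives none):

```python
from fractions import Fraction

# Naive Bayes: Cause = Cavity, Effects = (Toothache, Catch).
# All numbers below are hypothetical, for illustration only.
P_cavity = Fraction(1, 5)
P_tooth = {True: Fraction(3, 5), False: Fraction(1, 10)}  # P(toothache | cavity)
P_catch = {True: Fraction(9, 10), False: Fraction(1, 5)}  # P(catch | cavity)

def joint(cavity, toothache, catch):
    """P(Cause) ∏ P(Effect_i | Cause) for one full assignment."""
    prior = P_cavity if cavity else 1 - P_cavity
    pt = P_tooth[cavity] if toothache else 1 - P_tooth[cavity]
    pc = P_catch[cavity] if catch else 1 - P_catch[cavity]
    return prior * pt * pc

# Posterior P(cavity | toothache, catch) by normalizing over Cavity
num = joint(True, True, True)
den = num + joint(False, True, True)
print(num / den)  # 27/31
```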