Probability
Nishant Mehta Lecture 15 – Part I
Sample space
The sample space, or outcome space, is the set of all possible
outcomes. We denote it as Ω.
Suppose we flip a coin once. Then the sample space is
Ω = {H, T}
If instead we flip a coin twice, then the sample space is
Ω = {H, T}² = {HH, HT, TH, TT}
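As a small illustrative aside (not from the original slides), the two-flip sample space is just the Cartesian product {H, T} × {H, T}, which can be enumerated directly; the variable names below are my own.

    # Sketch: enumerating the sample space for two coin flips
    from itertools import product

    omega_one_flip = {"H", "T"}
    omega_two_flips = {a + b for a, b in product(omega_one_flip, repeat=2)}
    print(omega_two_flips)  # {'HH', 'HT', 'TH', 'TT'}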
Probability Distribution
First two of Kolmogorov’s probability axioms
(1) For any outcome a ∈ Ω, P(a) ≥ 0
(2) P(Ω) = 1 (with probability 1, some outcome must happen)
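As a quick sketch (illustrative only, not part of the slides), a finite distribution stored as a dictionary from outcomes to probabilities can be checked against these two axioms:

    # Check axiom (1): every P(a) >= 0, and axiom (2): P(Omega) = 1
    import math

    def satisfies_axioms(dist):
        nonneg = all(p >= 0 for p in dist.values())
        sums_to_one = math.isclose(sum(dist.values()), 1.0)
        return nonneg and sums_to_one

    print(satisfies_axioms({"H": 0.5, "T": 0.5}))  # True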
Coin-flipping example
Suppose we flip a coin once, so Ω = {H, T}.
The probability distribution of the outcome is specified by the Bernoulli distribution.
Let P(H) = p. We call p the success probability.
A fair coin corresponds to p = 1/2
Dice example
Suppose we roll a pair of dice; then Ω = {1, 2, …, 6}².
The probability distribution for the outcome (a pair of numbers) is the uniform distribution.
The uniform distribution satisfies P(a) = P(b) for all a, b ∈ Ω
Therefore, we have P(a) = 1/|Ω| for all a ∈ Ω
In the dice example, P(i, j) = 1/36 for any i, j ∈ {1, 2, …, 6}
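A small sketch (not from the slides) that builds this uniform distribution by enumerating all 36 ordered pairs:

    from itertools import product

    omega = list(product(range(1, 7), repeat=2))   # all 36 ordered pairs (i, j)
    p = {outcome: 1 / len(omega) for outcome in omega}
    print(len(omega), p[(3, 5)])                   # 36, 0.0277... = 1/36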
Events
An event A ⊆ Ω is a subset of the sample space
Suppose we flip a coin twice. Then {HT, TH} is an event
The probability of an event A is P(A) = ∑_{a ∈ A} P(a)
Suppose we roll one die. What is the probability of rolling an even number? We can use shorthand:
P(a is even) = P({a ∈ Ω: a is even}) = P({2, 4, 6})
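Summing the uniform probabilities gives P({2, 4, 6}) = 3/6 = 1/2. A quick check (illustrative, not from the slides):

    # P(a is even) for one fair die, by summing P(a) over the event
    omega = range(1, 7)
    p = {a: 1 / 6 for a in omega}                  # uniform distribution
    event = {a for a in omega if a % 2 == 0}       # {2, 4, 6}
    print(sum(p[a] for a in event))                # 0.5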
Random variables
A random variable X is a function from the sample space to V ⊆ ℝ
X: Ω → V
Suppose we roll a pair of dice and then win an amount of dollars
equal to the sum of the rolls.
If the outcome is (a, b), then the amount we win is given by the random variable X = a + b.
Example 1
Random variables
A random variable X is a function from the sample space to V ⊆ ℝ
X: Ω → V
Suppose that K horses are racing, and we bet money on horse j.
If horse j wins the race, we win $100; otherwise, we win $0.
Formally, we have sample space Ω = {1, 2, …, K}, where the outcome is i if horse i wins.
The amount we win is given by the random variable:
X = 100 · 1[horse j wins] = 100 · 1[a = j]
Example 2
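An illustrative sketch of this indicator-style random variable (the specific numbers K = 5 and j = 2 are my own, not from the slides):

    K = 5   # assumed number of horses, for illustration
    j = 2   # the horse we bet on

    def winnings(a):
        # X(a) = 100 * 1[a = j]: $100 if horse j wins, $0 otherwise
        return 100 * int(a == j)

    print([winnings(a) for a in range(1, K + 1)])  # [0, 100, 0, 0, 0]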
Events based on random variables
We can define events in terms of random variables
Example: For v ∈ V, we can define the event X = v
Formally, we have P(X = v) = P({a ∈ Ω: X(a) = v})
Let X be the sum of the numbers when rolling a pair of dice
P(X = 3) = P({(a, b) ∈ {1, 2, …, 6}²: a + b = 3})
         = P({(1, 2), (2, 1)})
         = 2/36
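A quick enumeration check of this value (illustrative, not from the slides):

    from itertools import product

    omega = list(product(range(1, 7), repeat=2))
    event = [(a, b) for a, b in omega if a + b == 3]
    print(event)                    # [(1, 2), (2, 1)]
    print(len(event) / len(omega))  # 2/36 ≈ 0.0556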
Events based on random variables
We can define events in terms of random variables
Example: For U ⊆ V, we can define the event X ∈ U
Formally, we have P(X ∈ U) = P({a ∈ Ω: X(a) ∈ U})
Let X be the sum of the numbers when rolling a pair of dice
P(X ≤ 3) = P(X = 2) + P(X = 3)
         = P({(1, 1)}) + P({(1, 2), (2, 1)})
         = 3/36
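The same enumeration idea verifies this value (illustrative, not from the slides):

    from itertools import product

    omega = list(product(range(1, 7), repeat=2))
    event = [(a, b) for a, b in omega if a + b <= 3]
    print(event)                    # [(1, 1), (1, 2), (2, 1)]
    print(len(event) / len(omega))  # 3/36 ≈ 0.0833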
Expected value
For a random variable X, we defined the expected value as
E[X] = ∑_{v ∈ V} v · P(X = v)
This can be re-expressed as
E[X] = ∑_{v ∈ V} v · P(X = v) = ∑_{v ∈ V} v · P({a ∈ Ω: X(a) = v})
     = ∑_{v ∈ V} ∑_{a ∈ Ω: X(a) = v} v · P(a)
     = ∑_{v ∈ V} ∑_{a ∈ Ω: X(a) = v} X(a) · P(a)
     = ∑_{a ∈ Ω} X(a) · P(a)
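A sketch (not from the slides) confirming that the two expressions agree on the dice-sum example, computing E[X] once over values and once over outcomes:

    from itertools import product
    from collections import defaultdict

    omega = list(product(range(1, 7), repeat=2))
    p = {a: 1 / len(omega) for a in omega}      # uniform distribution on 36 outcomes
    X = {a: a[0] + a[1] for a in omega}         # X(a, b) = a + b

    # sum over outcomes: sum of X(a) * P(a)
    e_over_outcomes = sum(X[a] * p[a] for a in omega)

    # sum over values: sum of v * P(X = v)
    p_of_v = defaultdict(float)
    for a in omega:
        p_of_v[X[a]] += p[a]
    e_over_values = sum(v * pv for v, pv in p_of_v.items())

    print(e_over_outcomes, e_over_values)       # both 7.0 (up to rounding)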
Expected value - Exercise
Suppose that you are playing the game blackjack, and there are
three outcomes: {blackjack-win, win, lose}.
If the outcome is “blackjack-win”, you win $150.
If the outcome is “win”, you win $100.
If the outcome is “lose”, you win -$100 (so, you lose $100).
Let the probability of blackjack-win be 0.02, the probability of win be 0.48, and the probability of lose be 0.5.
Let the random variable X be the amount of money you win. What is the expected value of X?
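One way to check your answer (illustrative sketch, not from the slides) is to apply E[X] = ∑_v v · P(X = v) directly to the three payoffs:

    payoff = {"blackjack-win": 150, "win": 100, "lose": -100}
    prob = {"blackjack-win": 0.02, "win": 0.48, "lose": 0.5}
    print(sum(payoff[o] * prob[o] for o in prob))   # ≈ 1.0, i.e. you expect to win about $1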
Expected value
Linearity of expectation:
For random variables X and Y and constants a, b, c, we have:
E[aX] = a·E[X] and E[X + Y] = E[X] + E[Y]
so, in particular,
E[aX + bY + c] = a·E[X] + b·E[Y] + c
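A quick Monte Carlo sanity check of this identity (illustrative, not from the slides), with X and Y two independent fair die rolls and arbitrary constants of my own choosing:

    import random

    random.seed(0)
    n = 100_000
    a_coef, b_coef, c_coef = 2.0, -1.0, 5.0
    xs = [random.randint(1, 6) for _ in range(n)]
    ys = [random.randint(1, 6) for _ in range(n)]

    lhs = sum(a_coef * x + b_coef * y + c_coef for x, y in zip(xs, ys)) / n
    rhs = a_coef * (sum(xs) / n) + b_coef * (sum(ys) / n) + c_coef
    print(lhs, rhs)   # both close to 2·3.5 - 1·3.5 + 5 = 8.5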
Independence
Two events A and B are independent if P(A ∩ B) = P(A) · P(B)
Two random variables X and Y are independent if,
for any u, v ∈ V, the events [X = u] and [Y = v] are independent
Example: 3 coin flips
If the flips are independent, we have P(HHT) = P(H) · P(H) · P(T)
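For fair, independent flips this product is (1/2)³ = 1/8, matching the uniform probability over the 8 outcomes in {H, T}³. A small sketch (not from the slides):

    from itertools import product

    p = {"H": 0.5, "T": 0.5}
    omega = ["".join(flips) for flips in product("HT", repeat=3)]
    prob_product = p["H"] * p["H"] * p["T"]
    print(prob_product, 1 / len(omega))   # 0.125 0.125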