
MAST30001 Stochastic Modelling

Tutorial Sheet 2

1. Show that the Markov property does not in general imply that for any sets A,B
and C,

P(Xn+1 ∈ C|Xn ∈ B,Xn−1 ∈ A) = P(Xn+1 ∈ C|Xn ∈ B).
(That is, define a Markov chain and sets A,B,C where the equality doesn’t hold.)
You can see there’s a problem since if S denotes the state space and we set B = S,
then the relation above becomes

P(Xn+1 ∈ C|Xn−1 ∈ A) = P(Xn+1 ∈ C);

that is, that Xn+1 and Xn−1 are independent, which isn’t true in general. To be
concrete, take simple symmetric random walk on Z starting from 0 (with S = Z).
Set A = {n − 1}, B = S and C = {n + 1}. Then the left hand side equals
P(Xn+1 = n + 1 | Xn−1 = n − 1) = (1/2)(1/2) = 1/4, since two further up-steps are needed,
while the right hand side is P(Xn+1 = n + 1) = (1/2)^(n+1), which is less than 1/4 for all n ≥ 2.
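
For a concrete sanity check, here is a short Python sketch (not part of the original
solution; the choice n = 2 and all names are mine) that enumerates the first n + 1 steps
of the walk and compares the two sides exactly:

    from itertools import product

    # Simple symmetric random walk on Z started at 0: compare
    # P(X_{n+1} = n+1 | X_{n-1} = n-1) with P(X_{n+1} = n+1).
    n = 2  # any n >= 2 shows the gap

    steps = list(product([-1, 1], repeat=n + 1))   # all step sequences of length n+1
    prob = 0.5 ** (n + 1)                          # each sequence is equally likely

    def position(seq, k):
        """Position of the walk after its first k steps."""
        return sum(seq[:k])

    lhs_num = sum(prob for s in steps
                  if position(s, n - 1) == n - 1 and position(s, n + 1) == n + 1)
    lhs_den = sum(prob for s in steps if position(s, n - 1) == n - 1)

    print(lhs_num / lhs_den)                                        # 0.25
    print(sum(prob for s in steps if position(s, n + 1) == n + 1))  # 0.125 = (1/2)^3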

2. Let (Xn)n∈Z+ be a Markov chain with state space {1, 2, 3} and transition matrix
 0 1/3 2/31/4 3/4 0

2/5 0 3/5


(a) Draw the transition diagram corresponding to this chain.

[Transition diagram: states 1, 2, 3 with arrows 1 → 2 (1/3), 1 → 3 (2/3), 2 → 1 (1/4), 2 → 2 (3/4), 3 → 1 (2/5), 3 → 3 (3/5).]

(b) Compute P(X3 = 1, X2 = 2, X1 = 2|X0 = 1).
p1,2 p2,2 p2,1 = (1/3)(3/4)(1/4) = 1/16.
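
(For a quick numerical check, here is a small Python/numpy sketch, not part of the
original solution; P below is the matrix above, with row/column i corresponding to
state i + 1.)

    import numpy as np

    # One-step transition matrix on {1, 2, 3}; row/column i is state i + 1.
    P = np.array([[0,   1/3, 2/3],
                  [1/4, 3/4, 0  ],
                  [2/5, 0,   3/5]])

    # P(X3 = 1, X2 = 2, X1 = 2 | X0 = 1) = p1,2 p2,2 p2,1
    print(P[0, 1] * P[1, 1] * P[1, 0])   # 0.0625 = 1/16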

(c) If X0 is uniformly distributed on {1, 2, 3}, compute P(X3 = 1, X2 = 2, X1 = 2).

∑_{i=1}^{3} (1/3) pi,2 p2,2 p2,1 = (1/3)(1/3)(3/4)(1/4) + (1/3)(3/4)(3/4)(1/4).

(d) Now assuming that P(X0 = 1) = P(X0 = 2) = 1/2, compute P(X1 = 1, X4 =
1, X7 = 1).
Firstly, P(X1 = 1|X0 = 1) = 0. Next, the only ways to move from state 1 to
state 1 in 3 steps are to go 1-2-2-1, or 1-3-3-1. Thus

P(X3 = 1|X0 = 1) = (1/3)(3/4)(1/4) + (2/3)(3/5)(2/5) = 89/400.

From the above computations and the Markov property, conditioning on X0 gives
P(X1 = 1) = (1/2) · 0 + (1/2) p2,1 = 1/8, while by time-homogeneity
P(X7 = 1|X4 = 1) = P(X4 = 1|X1 = 1) = P(X3 = 1|X0 = 1) = 89/400. Hence

P(X1 = 1, X4 = 1, X7 = 1) = P(X7 = 1|X4 = 1) P(X4 = 1|X1 = 1) P(X1 = 1) = (1/8)(89/400)^2.
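
(A hedged numerical check in Python/numpy, not part of the original solution, verifying
89/400 and the final answer.)

    import numpy as np

    P = np.array([[0,   1/3, 2/3],
                  [1/4, 3/4, 0  ],
                  [2/5, 0,   3/5]])

    P3 = np.linalg.matrix_power(P, 3)
    print(P3[0, 0])                      # P(X3 = 1 | X0 = 1) = 89/400 = 0.2225

    mu0 = np.array([1/2, 1/2, 0])        # initial distribution on {1, 2, 3}
    p1 = mu0 @ P[:, 0]                   # P(X1 = 1) = 1/8
    print(p1 * P3[0, 0] ** 2)            # (1/8)(89/400)^2 ≈ 0.00619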

3. A simplified model for the spread of a contagion in a small population of size 4 is
as follows. At each discrete time unit, two individuals in the population are chosen
uniformly at random (independent of the past) to meet. If one of these persons is
healthy and the other has the contagion, then with probability 1/4 (independent of
the past) the healthy person becomes sick. Otherwise the system stays the same.

(a) If Xn is the number of healthy people at step n, then explain why X0, X1 . . .
is a Markov chain.
The sequence of variables X0, X1 . . . either stays the same or decreases by one
at each step. The chance that the chain decreases by one from step n to n + 1
given the entire history up to n is the chance a healthy person and a sick person
are chosen times 1/4:

P(Xn+1 = Xn − 1|Xn, . . . , X0) = [Xn(4 − Xn)/6] × (1/4),

which only depends on the history through Xn.

(This can be written as

P(Xn+1 = xn − 1|Xn = xn, . . . , X0 = x0) = [xn(4 − xn)/6] × (1/4),

for any sequence (x0, . . . , xn) such that the left hand side is well defined.)

(b) Specify the transition probabilities of Xn.
Using the formula above,

pi,i−1 = 1 − pi,i = i(4 − i)/24, i = 0, . . . , 4,

and the remaining probabilities are zero.
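
(A minimal Python sketch, not part of the original solution, that builds this transition
matrix; states are indexed 0–4 by the number of healthy people.)

    import numpy as np

    # Contagion chain on {0, 1, 2, 3, 4} = number of healthy people:
    # p(i, i-1) = i(4 - i)/24 and p(i, i) = 1 - i(4 - i)/24.
    P = np.zeros((5, 5))
    for i in range(5):
        down = i * (4 - i) / 24
        if i > 0:
            P[i, i - 1] = down
        P[i, i] = 1 - down
    print(P)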

(c) Draw the transition diagram for this chain.

[Transition diagram: states 4, 3, 2, 1, 0 in a line; self-loop probabilities 1, 7/8, 5/6, 7/8, 1 respectively; arrows 3 → 2 (1/8), 2 → 1 (1/6), 1 → 0 (1/8).]

(d) If initially the chance that a given person in the population has the disease
equals 1/2, determined independently, then what is the chance everyone has
the disease after two steps in the process?

We’re looking for P(X2 = 0), which, by the Chapman-Kolmogorov equations and
the fact that the initial distribution is binomial(4, 1/2), is

∑_{k=0}^{4} P(X2 = 0|X0 = k) P(X0 = k) = ∑_{k=0}^{4} (P^2)k,0 (4 choose k) (1/2)^4,

where P is the one-step transition matrix of the chain, found in the previous
part of the problem.
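
(A short numpy check of this sum, not part of the original solution, using the matrix P
from part (b).)

    import numpy as np
    from math import comb

    P = np.zeros((5, 5))                 # transition matrix from part (b)
    for i in range(5):
        down = i * (4 - i) / 24
        if i > 0:
            P[i, i - 1] = down
        P[i, i] = 1 - down

    P2 = np.linalg.matrix_power(P, 2)
    # X0 ~ Binomial(4, 1/2), so P(X0 = k) = C(4, k)(1/2)^4
    print(sum(P2[k, 0] * comb(4, k) * 0.5 ** 4 for k in range(5)))  # 33/256 ≈ 0.129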

(e) Now suppose that exactly one person is infected at time 0. Find the expected
time until everyone is infected.
We are starting from state 3. The time to reach state 2 is G1 ~ Geometric(1/8),
from state 2 the additional time to reach state 1 is G2 ~ Geometric(1/6), and from
state 1 the additional time to reach state 0 is G3 ~ Geometric(1/8). The total time
is G1 + G2 + G3, which has expectation 8 + 6 + 8 = 22.
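
(An equivalent check via first-step analysis, not part of the original solution: writing
h(i) for the expected time to reach state 0 from state i, h(i) = h(i − 1) + 1/p(i, i − 1).)

    # Expected hitting time of state 0 from state i:
    # h(i) = 1 + p(i,i) h(i) + p(i,i-1) h(i-1)  =>  h(i) = h(i-1) + 1/p(i,i-1).
    h = {0: 0.0}
    for i in (1, 2, 3):
        h[i] = h[i - 1] + 1 / (i * (4 - i) / 24)
    print(h[3])   # 8 + 6 + 8 = 22.0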

4. Let (Yn)n≥0 be i.i.d. random variables with P(Yi = 1) = P(Yi = −1) = 1/2 and let
Xn = (Yn+1 + Yn)/2.

(a) Find the transition probabilities P(Xn+m = k|Xn = j) for m = 1, 2, . . . and
j, k = 0,±1.
Since the Yi are i.i.d. and Xn is a function of (Yn, Yn+1) only, the variables Xn
and Xn+m are independent for m ≥ 2. Thus for m ≥ 2,

P(Xn+m = k|Xn = j) = P(Xn+m = k) = 1/4 for k = ±1, and 1/2 for k = 0.

When m = 1, the event Xn = 1 corresponds to Yn = Yn+1 = 1 and so

P(Xn+1 = k|Xn = 1) = P(Xn+1 = k|Yn+1 = 1) = 1/2 for k = 1, and 1/2 for k = 0.

Similarly,

P(Xn+1 = k|Xn = −1) = 1/2 for k = −1, and 1/2 for k = 0.

When Xn = 0, then either Yn = 1 and Yn+1 = −1 or Yn = −1 and Yn+1 = 1,
and these events have equal probability. So

P(Xn+1 = 1|Xn = 0) = P(Yn+2 = 1, Yn+1 = 1, Yn = −1) / P((Yn+1 + Yn)/2 = 0) = (1/8)/(1/2) = 1/4,

and similarly

P(Xn+1 = k|Xn = 0) = 1/4 for k = ±1, and 1/2 for k = 0.
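
(A Monte Carlo sketch in Python/numpy checking these conditional probabilities; the
sample size and seed are arbitrary choices of mine, not part of the original solution.)

    import numpy as np

    rng = np.random.default_rng(0)
    N = 10**6
    Y = rng.choice([-1, 1], size=N + 2)        # Y_0, ..., Y_{N+1}
    X = (Y[1:] + Y[:-1]) / 2                   # X_n = (Y_{n+1} + Y_n)/2, n = 0, ..., N

    n = np.arange(N)                           # indices with both X_n and X_{n+1}
    for j in (-1, 0, 1):
        sel = X[n] == j
        for k in (-1, 0, 1):
            print(j, k, np.mean(X[n + 1][sel] == k))   # ≈ probabilities above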

(b) Show that (Xn)n≥0 is not a Markov chain.
The Markov property is not satisfied since

P(Xn+1 = 1|Xn = 0, Xn−1 = 1) = 0 ≠ 1/4 = P(Xn+1 = 1|Xn = 0).

(Indeed, Xn−1 = 1 forces Yn = 1, so Xn = 0 then forces Yn+1 = −1, and hence
Xn+1 = (Yn+2 − 1)/2 ≤ 0.)
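
(The same simulation idea as above, not part of the original solution, exhibits the
failure of the Markov property numerically.)

    import numpy as np

    rng = np.random.default_rng(1)
    N = 10**6
    Y = rng.choice([-1, 1], size=N + 3)
    X = (Y[1:] + Y[:-1]) / 2                   # X_0, ..., X_{N+1}

    n = np.arange(1, N)                        # indices with X_{n-1}, X_n, X_{n+1}
    print(np.mean(X[n + 1][X[n] == 0] == 1))                      # ≈ 1/4
    print(np.mean(X[n + 1][(X[n] == 0) & (X[n - 1] == 1)] == 1))  # = 0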