
MAST30001 Stochastic Modelling

Tutorial Sheet 3

1. Let Y1, Y2, . . . be i.i.d. random variables with probability mass function given by the
following table.

k          0     1     2     3
P(Y = k)   0.2   0.2   0.3   0.3

Set X0 = 0 and let Xn = max{Y1, . . . , Yn} be the largest Yi observed up to time n.

(a) Explain why (Xn) is a Markov chain and determine its transition matrix P .

(b) Find all communicating classes for this chain and all absorbing states.

(c) Describe the long-run behavior of the chain. In particular, can you
determine the matrix lim_{n→∞} P^n?

(d) Find the expected time to reach state 3, starting from state 0.
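Not a substitute for the first-step analysis asked for above, but a quick Monte Carlo sanity check for part (d) can be run using only the pmf from the table; the trial count and seed below are arbitrary choices, not part of the question.

```python
import random

# Estimate the expected time for X_n = max(Y_1, ..., Y_n) to first hit
# state 3, by repeatedly sampling the Y_i from the given pmf.
VALUES = [0, 1, 2, 3]
PROBS = [0.2, 0.2, 0.3, 0.3]

def hitting_time_of_3(rng):
    """Draw Y_1, Y_2, ... until the running maximum first equals 3."""
    n = 0
    current_max = 0  # X_0 = 0
    while current_max < 3:
        y = rng.choices(VALUES, weights=PROBS)[0]
        current_max = max(current_max, y)
        n += 1
    return n

rng = random.Random(0)
trials = 100_000
estimate = sum(hitting_time_of_3(rng) for _ in range(trials)) / trials
print(round(estimate, 3))
```

Compare the printed estimate with the exact value you obtain from first-step analysis.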

2. A game involves rolling a ball around a table, starting from a fixed position. Three
players, Alex, Bobby and Célia, take turns attempting to grab the ball as it passes
them. If Alex is unsuccessful then Bobby attempts to grab the ball and if he is also
unsuccessful then Célia attempts to grab the ball. If they are all unsuccessful, the
ball is rolled again from the starting position. The game stops as soon as any player
is successful (and that player wins!). Suppose that on each attempt, Alex, Bobby
and Célia have probabilities p1, p2 and p3 (all in (0, 1)), respectively, of
grabbing the ball, independent of previous attempts.

(a) Using Markov chain theory, find the probability that Alex wins the game.

(b) What condition(s) do p1, p2 and p3 have to satisfy so that each player is equally
likely to win the game? (Include the largest possible values of p1, p2, and p3 in
your answer.)
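A simulation sketch can be used to check the formula you derive in part (a); the particular values p1 = 0.2, p2 = 0.3, p3 = 0.4 below are illustrative only and not part of the question.

```python
import random

def play(p1, p2, p3, rng):
    """Play one game; return the winner: 0 = Alex, 1 = Bobby, 2 = Célia."""
    while True:  # each pass of the loop is one roll from the start position
        for player, p in enumerate((p1, p2, p3)):
            if rng.random() < p:
                return player

rng = random.Random(1)
p1, p2, p3 = 0.2, 0.3, 0.4  # illustrative values
trials = 100_000
wins = [0, 0, 0]
for _ in range(trials):
    wins[play(p1, p2, p3, rng)] += 1
print([w / trials for w in wins])
```

The first entry of the printed list should be close to your answer for Alex's win probability at these parameter values.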

3. Two dogs named Fluffy and Duffy have m fleas distributed between them. At
discrete time steps a flea (chosen uniformly at random) jumps to the other dog. Let
Xn be the number of fleas on Fluffy after the nth time step.

(a) Is Xn a Markov chain? What are the transition probabilities?

(b) If X0 has a binomial distribution with parameters (m, 1/2) (meaning that each
flea tosses a fair coin to determine on which dog it starts), what is the distri-
bution of X10? [Hint: First compute the distribution of X1.]

(c) If m = 3, and P(X0 = 1) = 1, find the probability that Fluffy becomes free
of fleas before Duffy does.
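Part (b) can be checked numerically once you have the transition probabilities from part (a): push the Binomial(m, 1/2) distribution through one step of the chain and compare. The sketch below assumes the "one uniformly chosen flea jumps" dynamics stated in the question; treat it as a check on your derivation rather than the derivation itself.

```python
from math import comb

m = 3
# Binomial(m, 1/2) pmf: each flea tosses a fair coin for its starting dog.
pi = [comb(m, k) / 2**m for k in range(m + 1)]

def step(dist):
    """One step of the flea chain: from k fleas on Fluffy, a flea leaves
    Fluffy with probability k/m and arrives with probability (m - k)/m."""
    new = [0.0] * (m + 1)
    for k, mass in enumerate(dist):
        if k > 0:
            new[k - 1] += mass * k / m
        if k < m:
            new[k + 1] += mass * (m - k) / m
    return new

print(step(pi))  # compare with pi itself
```

If the distribution is unchanged after one step, the hint tells you the distribution of X10 immediately.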

4. Let (Xn)n≥1 be a Markov chain with state space {1, . . . , k} for some k ≥ 2. Show
that if i and j communicate, then the probability that the chain started in state i
reaches state j in fewer than k steps is greater than 0.
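The statement in question 4 can be illustrated (not proved) computationally: on the directed graph whose edges are the positive-probability transitions, a shortest path between communicating states never revisits a state, so it uses at most k − 1 edges. The matrix P below is an arbitrary irreducible example, not taken from the question.

```python
from collections import deque

# An arbitrary 4-state chain in which all states communicate.
P = [
    [0.5, 0.5, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.3, 0.0, 0.0, 0.7],
    [0.0, 1.0, 0.0, 0.0],
]
k = len(P)

def shortest_path_length(i, j):
    """BFS from i over positive-probability transitions; return the
    number of steps to reach j, or None if j is unreachable from i."""
    dist = {i: 0}
    queue = deque([i])
    while queue:
        s = queue.popleft()
        for t, p in enumerate(P[s]):
            if p > 0 and t not in dist:
                dist[t] = dist[s] + 1
                queue.append(t)
    return dist.get(j)

for i in range(k):
    for j in range(k):
        print(i, j, shortest_path_length(i, j))
```

Every printed distance is strictly less than k = 4, consistent with the claim to be proved.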