MAST30001 Stochastic Modelling

Tutorial Sheet 6

You’ve probably already seen/done some of these before, but it’s useful to do them
yourselves/see them again!

1. Let X ∼Exponential(λ), with λ > 0. Prove that P(X > s + t|X > t) = P(X > s)
for every s, t > 0.

P(X > s+t | X > t) = P(X > s+t, X > t)/P(X > t) = P(X > s+t)/P(X > t)
                   = e^{−λ(s+t)}/e^{−λt} = e^{−λs} = P(X > s).
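
As a quick numerical sanity check of the memoryless property (a minimal sketch, assuming NumPy is available; λ, s, t and the sample size are arbitrary illustrative choices):

import numpy as np

rng = np.random.default_rng(0)
lam, s, t = 1.5, 0.7, 1.2                      # illustrative parameters
x = rng.exponential(scale=1/lam, size=1_000_000)

cond = (x > s + t).sum() / (x > t).sum()       # empirical P(X > s+t | X > t)
uncond = (x > s).mean()                        # empirical P(X > s)
print(cond, uncond, np.exp(-lam * s))          # all three should be close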

2. Let (Xi)i∈N be independent random variables with Xi ∼Exponential(λi). Find the
distribution of Yn = min_{i≤n} Xi.

P(Yn > y) = P(∩_{i=1}^n {Xi > y}) = ∏_{i=1}^n P(Xi > y) = ∏_{i=1}^n e^{−λi y} = e^{−(∑_{i=1}^n λi) y},

so Yn ∼ Exponential(∑_{i=1}^n λi).
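
The conclusion can also be illustrated numerically (a sketch assuming NumPy; the rates λi below are arbitrary), comparing the empirical tail of the minimum with e^{−(∑_{i=1}^n λi) y}:

import numpy as np

rng = np.random.default_rng(1)
rates = np.array([0.5, 1.0, 2.5])                        # illustrative rates λ_i
samples = rng.exponential(scale=1/rates, size=(1_000_000, len(rates)))
y_min = samples.min(axis=1)                              # Y_n = min_i X_i

y = 0.4
print((y_min > y).mean(), np.exp(-rates.sum() * y))      # should be close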

3. Let (Ti)i∈N be i.i.d. Exponential(λ) random variables, and let N be a Geometric(p)
random variable that is independent of the other variables.

(a) Find the moment generating function E[e^{tT1}] of T1.

E[e^{tT1}] = ∫_0^∞ e^{tx} λe^{−λx} dx = ∫_0^∞ e^{(t−λ)x} λ dx.

If t ≥ λ then this is infinite. Otherwise it is λ/(λ− t).
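
A Monte Carlo check of this MGF formula for one choice of t < λ (a sketch assuming NumPy; λ and t are illustrative):

import numpy as np

rng = np.random.default_rng(2)
lam, t = 2.0, 0.8                                # requires t < lam
T1 = rng.exponential(scale=1/lam, size=1_000_000)
print(np.exp(t * T1).mean(), lam / (lam - t))    # sample MGF vs lam/(lam - t)
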
(b) Let Y = ∑_{i=1}^N Ti. Find the distribution of Y.

We’ll find the MGF of Y (which characterizes its distribution).

E[e^{tY}] = E[e^{t ∑_{i=1}^N Ti}] = ∑_{n=1}^∞ E[e^{t ∑_{i=1}^N Ti} | N = n] P(N = n)
          = ∑_{n=1}^∞ E[e^{t ∑_{i=1}^n Ti} | N = n] P(N = n) = ∑_{n=1}^∞ E[e^{t ∑_{i=1}^n Ti}] P(N = n),

where we have used the independence of N from the Ti variables in the last equality.
Now use the fact that the Ti are i.i.d. to see that for t < λ the above equals

∑_{n=1}^∞ (E[e^{tT1}])^n P(N = n) = ∑_{n=1}^∞ (λ/(λ−t))^n P(N = n) = (p/(1−p)) ∑_{n=1}^∞ (λ(1−p)/(λ−t))^n,

using P(N = n) = p(1−p)^{n−1}. The geometric series converges exactly when λ(1−p)/(λ−t) < 1, i.e. when t < λp, and summing it gives λp/(λp−t). This is the MGF of an Exponential(λp) distribution, hence Y ∼ Exponential(λp).
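
This can also be checked by simulation (a sketch assuming NumPy; λ, p and the sample size are arbitrary choices), comparing the empirical tail of Y with that of an Exponential(λp):

import numpy as np

rng = np.random.default_rng(3)
lam, p, n = 1.0, 0.3, 100_000                   # illustrative choices
N = rng.geometric(p, size=n)                    # P(N = k) = p(1-p)^(k-1), k >= 1
Y = np.array([rng.exponential(scale=1/lam, size=k).sum() for k in N])

y = 2.0
print((Y > y).mean(), np.exp(-lam * p * y))     # empirical vs Exponential(lam*p) tail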

4. Let X ≥ 0 be a random variable satisfying

(∗) P(X > s+t | X > t) = P(X > s), for all s, t ≥ 0.

Show that X ∼ Exponential(λ), for some λ ≥ 0.
Let G(s) = P(X > s). [Note that G(s) > 0 for every s otherwise the conditioning
in (∗) above is not well defined.] The property (∗) can be rewritten as G(t + s) =
G(t)G(s) for every t, s ≥ 0. By induction G(nt) = G(t)^n for each n and each t ≥ 0.
This shows that G(n) = G(1)^n and G(1/n) = G(1)^{1/n} for every n ∈ N. It follows
that for any non-negative rational number r = m/n,

G(r) = G(m/n) = G(1/n)^m = (G(1)^{1/n})^m = G(1)^r.

Now note that G is right continuous (since 1−G is a cdf and cdfs are right continuous),
so for any t ≥ 0 and rationals tn ↓ t we have

G(t) = lim_{n→∞} G(tn) = lim_{n→∞} G(1)^{tn} = G(1)^t.

If G(1) = 1 then P(X > n) = 1 for all n, so P(X = ∞) = 1; this corresponds to
the case λ = 0. Otherwise 0 < G(1) < 1 and we can write G(1) = e^{−λ} for some
λ > 0, and G(t) = G(1)^t says that P(X > t) = e^{−λt} for each t ≥ 0, as required.