
MATH3075/3975 Financial Derivatives

Tutorial 1: Solutions

Exercise 1 (a) The conditional distributions of X given Y = j are:

for j = 1: (1/5, 3/5, 1/5),

for j = 2: (2/3, 0, 1/3),

for j = 3: (0, 3/5, 2/5),

and thus the conditional expectations EP(X|Y = j) are computed as follows:

EP(X|Y = 1) = 1 · 1/5 + 2 · 3/5 + 3 · 1/5 = 2,
EP(X|Y = 2) = 1 · 2/3 + 2 · 0 + 3 · 1/3 = 5/3,
EP(X|Y = 3) = 1 · 0 + 2 · 3/5 + 3 · 2/5 = 12/5.
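These values can be reproduced with a short computation; the sketch below (in Python, with exact rational arithmetic) simply re-evaluates the three sums from the conditional distributions listed above.

```python
from fractions import Fraction as F

# Conditional distributions of X given Y = j, as listed above:
# each vector gives P(X = 1 | Y = j), P(X = 2 | Y = j), P(X = 3 | Y = j).
cond_dist = {
    1: [F(1, 5), F(3, 5), F(1, 5)],
    2: [F(2, 3), F(0), F(1, 3)],
    3: [F(0), F(3, 5), F(2, 5)],
}

for j, probs in cond_dist.items():
    e = sum(i * p for i, p in zip([1, 2, 3], probs))
    print(f"E(X | Y = {j}) = {e}")   # prints 2, 5/3, 12/5
```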

(b) The marginal distributions of X and Y are:

for X : (4/18, 9/18, 5/18),

for Y : (10/18, 3/18, 5/18).

On the one hand, we obtain

EP(X) = ∑_{i=1}^{3} i P(X = i) = 1 · 4/18 + 2 · 9/18 + 3 · 5/18 = 37/18.

On the other hand, we get

EP(EP(X|Y)) = ∑_{j=1}^{3} EP(X|Y = j) P(Y = j) = 2 · 10/18 + 5/3 · 3/18 + 12/5 · 5/18 = 37/18.

Hence the equality EP(X) = EP(EP(X|Y)) holds, as was expected.
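The same kind of exact check confirms the tower property numerically; the marginals and conditional expectations below are exactly the ones computed above.

```python
from fractions import Fraction as F

p_X = [F(4, 18), F(9, 18), F(5, 18)]      # marginal distribution of X
p_Y = [F(10, 18), F(3, 18), F(5, 18)]     # marginal distribution of Y
cond_E = [F(2), F(5, 3), F(12, 5)]        # E(X | Y = j) for j = 1, 2, 3

lhs = sum(i * p for i, p in zip([1, 2, 3], p_X))   # E(X)
rhs = sum(e * p for e, p in zip(cond_E, p_Y))      # E(E(X | Y))
print(lhs, rhs, lhs == rhs)                        # 37/18 37/18 True
```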

(c) Random variables X and Y are not independent since the condition

P(X = i, Y = j) = P(X = i)P(Y = j), ∀ i, j = 1, 2, 3,

is not satisfied. For instance, if we take i = 1 and j = 3, then

P(X = 1, Y = 3) = 0 ≠ 4/18 · 5/18 = P(X = 1)P(Y = 3).
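For completeness, the sketch below rebuilds the joint probabilities P(X = i, Y = j) from the conditional distributions in part (a) and the marginal of Y (the table implied by the values above, which the exercise presumably states directly), and exhibits the failing factorisation.

```python
from fractions import Fraction as F

# Joint probabilities implied by part (a): P(X = i, Y = j) = P(X = i | Y = j) P(Y = j).
p_Y = {1: F(10, 18), 2: F(3, 18), 3: F(5, 18)}
cond = {1: [F(1, 5), F(3, 5), F(1, 5)],
        2: [F(2, 3), F(0), F(1, 3)],
        3: [F(0), F(3, 5), F(2, 5)]}
joint = {(i, j): cond[j][i - 1] * p_Y[j] for j in p_Y for i in (1, 2, 3)}
p_X = {i: sum(joint[(i, j)] for j in p_Y) for i in (1, 2, 3)}

# The factorisation fails at (i, j) = (1, 3), so X and Y are dependent.
print(joint[(1, 3)], p_X[1] * p_Y[3])   # 0 vs 5/81
```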

Exercise 2 (a) We need to show that

∫_{−∞}^{∞} ∫_{−∞}^{∞} f(X,Y)(x, y) dx dy = ∫_0^∞ ∫_0^∞ (1/y) e^(−x/y) e^(−y) dx dy = 1.


We first compute the marginal density of Y. For y ≥ 0, we obtain

fY(y) = ∫_0^∞ (1/y) e^(−x/y) e^(−y) dx = (1/y) e^(−y) ∫_0^∞ e^(−x/y) dx = (1/y) e^(−y) [−y e^(−x/y)]_0^∞ = e^(−y).

Of course, we have fY(y) = 0 for all y < 0. Therefore,

∫_0^∞ ∫_0^∞ (1/y) e^(−x/y) e^(−y) dx dy = ∫_0^∞ fY(y) dy = ∫_0^∞ e^(−y) dy = 1.

(b) For any fixed y ≥ 0, the conditional density of X given Y = y equals

fX|Y(x|y) = f(X,Y)(x, y) / fY(y) = (1/y) e^(−x/y), ∀ x ≥ 0,

and fX|Y(x|y) = 0 for x < 0. Consequently, for every y ≥ 0,

EP(X|Y = y) = ∫_0^∞ x fX|Y(x|y) dx = ∫_0^∞ (x/y) e^(−x/y) dx = y ∫_0^∞ z e^(−z) dz = y.

Exercise 3 We first compute the conditional cumulative distribution function of X given the event {X < 0.5}

FX|X<0.5(x) := P(X ≤ x | X < 0.5), ∀ x ∈ R.

We obtain

FX|X<0.5(x) = P(X ≤ x, X < 0.5) / P(X < 0.5) =
    0, if x ≤ 0,
    2 P(X ≤ x) = 2x, if x ∈ (0, 0.5),
    1, if x ≥ 0.5,

so that the conditional density of X given the event {X < 0.5} equals

fX|X<0.5(x) =
    0, if x ≤ 0,
    2, if x ∈ (0, 0.5),
    0, if x ≥ 0.5.

Therefore,

EP(X|X < 0.5) = ∫_{−∞}^{∞} x fX|X<0.5(x) dx = ∫_0^0.5 2x dx = 0.25.
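A numerical sanity check of both exercises is sketched below using scipy; note that treating X in Exercise 3 as uniform on (0, 1) is an assumption suggested by the identity P(X ≤ x) = x used in the solution.

```python
import numpy as np
from scipy.integrate import quad

# Exercise 2: for a few values of y, check numerically that the marginal
# density of Y equals e^(-y) and that E(X | Y = y) = y.
for y in (0.5, 1.0, 3.0):
    f_joint = lambda x: (1.0 / y) * np.exp(-x / y) * np.exp(-y)
    marginal, _ = quad(f_joint, 0, np.inf)                                # should equal e^(-y)
    cond_mean, _ = quad(lambda x: x * (1.0 / y) * np.exp(-x / y), 0, np.inf)  # should equal y
    print(y, marginal, np.exp(-y), cond_mean)

# Exercise 3 (assuming X uniform on (0, 1)): the conditional density on {X < 0.5}
# is 2 on (0, 0.5), so E(X | X < 0.5) = integral of 2x over (0, 0.5) = 0.25.
e3, _ = quad(lambda x: 2.0 * x, 0, 0.5)
print(e3)   # 0.25
```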

Exercise 4 We have

fX(x) = (1/λ) e^(−x/λ), ∀ x > 0,

and thus

P(X > x) = 1 − FX(x) = ∫_x^∞ fX(u) du = e^(−x/λ), ∀ x > 0.

Consequently,

P(X > x | X > 1) = P(X > x, X > 1) / P(X > 1) =
    P(X > x) / P(X > 1) = e^(−(x−1)/λ), if x ≥ 1,
    1, if x < 1.

Hence the conditional density equals

fX|X>1(x) =
    (1/λ) e^(−(x−1)/λ), if x ≥ 1,
    0, if x < 1,

and thus

EP(X|X > 1) = ∫_1^∞ (x/λ) e^(−(x−1)/λ) dx = 1 + λ = 1 + EP(X).
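The identity EP(X|X > 1) = 1 + λ can be checked numerically for a few values of λ; the sketch below evaluates EP(X · 1_{X>1}) / P(X > 1) directly.

```python
import numpy as np
from scipy.integrate import quad

# Exponential density (1/lam) e^(-x/lam): check E(X | X > 1) = 1 + lam.
for lam in (0.5, 1.0, 2.0):
    num, _ = quad(lambda x: x * np.exp(-x / lam) / lam, 1, np.inf)   # E[X 1_{X>1}]
    den, _ = quad(lambda x: np.exp(-x / lam) / lam, 1, np.inf)       # P(X > 1)
    print(lam, num / den, 1 + lam)   # the two values should agree
```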

Exercise 5 We have

Cov(X, Y) = EP(XY) − EP(X)EP(Y) = EP(X³) − EP(X)EP(X²) = 0,

since EP(X) = EP(X³) = 0. Therefore, the random variables X and Y are uncorrelated. They are
not independent, however, since, for instance,

EP(Y|X = 1) = 1 ≠ 4 = EP(Y|X = 2).

Recall that under independence of X and Y we have EP(Y |X) = EP(Y ) and EP(X|Y ) = EP(X).
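A concrete check is sketched below; it assumes, as the quoted conditional expectations suggest, that Y = X² with X symmetric, and takes X uniform on {−2, −1, 1, 2} purely for illustration.

```python
from fractions import Fraction as F

# An instance consistent with the solution: Y = X^2 with X uniform on
# {-2, -1, 1, 2} (any symmetric X works, since then E(X) = E(X^3) = 0).
support = [-2, -1, 1, 2]
p = F(1, 4)

E = lambda g: sum(g(x) * p for x in support)
cov = E(lambda x: x * x**2) - E(lambda x: x) * E(lambda x: x**2)
print(cov)   # 0, so X and Y = X^2 are uncorrelated

# But they are not independent: P(Y = 1 | X = 1) = 1, while P(Y = 1) = 1/2.
print(sum(p for x in support if x**2 == 1))   # P(Y = 1) = 1/2 != 1
```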

Exercise 6 (MATH3975) We have

Cov(X, Y) = EP(XY) − EP(X)EP(Y)
          = EP((U + V)(U − V)) − EP(U + V)EP(U − V)
          = EP(U² − V²) − EP(U + V)(EP(U) − EP(V)) = 0,

since EP(U) = EP(V) and EP(U²) = EP(V²). Hence the random variables X and Y are uncorrelated.
To check whether X and Y are independent, we first need to specify the joint distribution of X
and Y. For instance, if we take U = V, then X = 2U and Y = 0, so that X and Y are independent
(Y is constant). However, if we take U and V to be independent (but not deterministic), then X and Y
need not be independent. For instance, we may take U and V to be i.i.d. Bernoulli random variables with
P(U = 1) = p = 1 − P(U = 0) for some p ∈ (0, 1). It is then easy to check that X and Y are not
independent: for example, P(X = 2, Y = 1) = 0, whereas P(X = 2)P(Y = 1) = p² · p(1 − p) > 0.
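The Bernoulli example can be verified exactly; the sketch below uses the arbitrary choice p = 1/3, but any p in (0, 1) works.

```python
from fractions import Fraction as F
from itertools import product

# U, V i.i.d. Bernoulli(p); X = U + V, Y = U - V.  Exact check that
# Cov(X, Y) = 0 while P(X = 2, Y = 1) = 0 differs from P(X = 2) P(Y = 1).
p = F(1, 3)   # arbitrary choice in (0, 1)
outcomes = [((u, v), (p if u else 1 - p) * (p if v else 1 - p))
            for u, v in product((0, 1), repeat=2)]

E = lambda g: sum(g(u, v) * w for (u, v), w in outcomes)
cov = (E(lambda u, v: (u + v) * (u - v))
       - E(lambda u, v: u + v) * E(lambda u, v: u - v))
print(cov)   # 0

p_X2 = sum(w for (u, v), w in outcomes if u + v == 2)
p_Y1 = sum(w for (u, v), w in outcomes if u - v == 1)
p_X2_Y1 = sum(w for (u, v), w in outcomes if u + v == 2 and u - v == 1)
print(p_X2_Y1, p_X2 * p_Y1)   # 0 vs a strictly positive number
```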
