7CCMFM18 Machine Learning
King’s College London
Academic year 2019-2020
Lecturer: Blanka Horvath
Example: Deep Hedging in the Black-Scholes Model
2 March 2020
Let us first import the necessary libraries and functions, and set the plotting style.
In [1]:
import numpy as np
import numpy.random as npr
from scipy.stats import norm
import tensorflow.keras as keras
import tensorflow.keras.backend as kb
import matplotlib.pyplot as plt
plt.style.use('ggplot')
We define auxiliary functions for the Black-Scholes model that compute the call option price and delta.
In [2]:
def BlackScholes(S0, r, sigma, T, K):
    # Black-Scholes price of a European call: spot S0, rate r, volatility sigma,
    # time to maturity T, strike K.
    d1 = 1 / (sigma * np.sqrt(T)) * (np.log(S0 / K) + (r + sigma**2 / 2) * T)
    d2 = d1 - sigma * np.sqrt(T)
    return norm.cdf(d1) * S0 - norm.cdf(d2) * K * np.exp(-r * T)

# callprice = BlackScholes(S0, 0, sigma, 1, K)

def BlackScholesCallDelta(S0, r, sigma, T, K):
    # Black-Scholes delta (hedge ratio) of the call: Phi(d1).
    d1 = 1 / (sigma * np.sqrt(T)) * (np.log(S0 / K) + (r + sigma**2 / 2) * T)
    return norm.cdf(d1)
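As a quick sanity check of these formulas, the closed-form price can be compared with a crude Monte Carlo estimate under the risk-neutral dynamics. The following is only an illustrative sketch, using the parameter values adopted later ($S_0=1$, $r=0$, $\sigma=0.5$, $T=1$, $K=1$) and the imports above.
# Monte Carlo cross-check of the closed-form call price (an illustrative sketch):
# under the risk-neutral measure with r = 0, S_T = S0 * exp(-sigma^2/2 + sigma * Z).
z = npr.normal(0, 1, 500000)
ST_mc = np.exp(-0.5 * 0.5**2 + 0.5 * z)
print(BlackScholes(1, 0, 0.5, 1, 1))        # closed form, approximately 0.1974
print(np.mean(np.maximum(ST_mc - 1, 0)))    # Monte Carlo estimate, should be close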
We construct a price process $S = (S_t)_{t=0}^T$ as follows: \begin{equation} S_t = S_0 \exp\bigg(\mu\frac{t}{T}+\sigma \sum_{i=1}^t \xi_i\bigg), \quad t = 0,1,\ldots,T, \end{equation} where $\mu>0$, $\sigma>0$ and $S_0>0$ are constants, and $\xi_1,\ldots,\xi_T$ are mutually independent $N(0,\frac{1}{T})$-distributed random variables. When $T \rightarrow \infty$, the process $S$ approximates the continuous-time Black-Scholes price process \begin{equation} S^{\mathrm{BS}}_t = S_0 \exp(\mu t + \sigma W_t), \quad t \in [0,1], \end{equation} where $W = (W_t)_{t \in [0,1]}$ is a standard Brownian motion.
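In fact, since the $\xi_i$ are Gaussian, $\sum_{i=1}^t \xi_i$ has exactly the distribution of $W_{t/T}$, so $S_t$ agrees in distribution with $S^{\mathrm{BS}}_{t/T}$ at the grid points; the approximation concerns only the restriction to $T$ discrete trading dates. As an illustrative check of the variance scaling:
# Empirical check that the scaled random walk matches Brownian variance (a sketch):
xi_check = npr.normal(0, np.sqrt(1 / 100), (100000, 100))
print(xi_check.sum(axis=1).var())   # should be close to Var(W_1) = 1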
Next, we set the parameter values:
In [3]:
mu = 0.1
sigma = 0.5
T = 100
S_0 = 1
We generate $N$ independent samples $S^0,\ldots,S^{N-1}$ of the price path $S = (S_t)_{t=0}^T$ and store them in an $N \times (T+1)$ array.
In [4]:
N = 100000
xi = npr.normal(0, np.sqrt(1 / T), (N, T))     # increments xi_i ~ N(0, 1/T)
W = np.apply_along_axis(np.cumsum, 1, xi)      # scaled random walk
W = np.concatenate((np.zeros((N, 1)), W), 1)   # prepend W_0 = 0
drift = np.linspace(0, mu, T + 1)              # mu * t / T for t = 0, ..., T
drift = np.reshape(drift, (1, T + 1))
drift = np.repeat(drift, N, axis=0)
S = S_0 * np.exp(drift + sigma * W)            # price paths, shape (N, T+1)
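The drift construction above can also be written more compactly with NumPy broadcasting; the following equivalent sketch avoids materialising $N$ copies of the drift row.
# Equivalent construction via broadcasting (a sketch): the (1, T+1) drift row
# broadcasts against the (N, T+1) array W without being copied N times.
S_alt = S_0 * np.exp(np.linspace(0, mu, T + 1)[None, :] + sigma * W)
assert np.allclose(S, S_alt)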
For later use, we compute an $N\times T$ array of price increments $S_{t+1} - S_t$.
In [5]:
dS = np.diff(S, 1, 1)   # dS[:, t] = S_{t+1} - S_t, shape (N, T)
We also create a list of matrices \begin{equation} \boldsymbol{X} := \begin{bmatrix} \begin{bmatrix} 0 & S^0_0 \\ 0 & S^1_0 \\ \vdots & \vdots \\ 0 & S^{N-1}_0 \end{bmatrix}, \begin{bmatrix} \frac{1}{T} & S^0_1 \\ \frac{1}{T} & S^1_1 \\ \vdots & \vdots \\ \frac{1}{T} & S^{N-1}_1 \end{bmatrix}, \ldots, \begin{bmatrix} \frac{T-1}{T} & S^0_{T-1} \\ \frac{T-1}{T} & S^1_{T-1} \\ \vdots & \vdots \\ \frac{T-1}{T} & S^{N-1}_{T-1} \end{bmatrix} \end{bmatrix} \end{equation} which will form the features of our training data.
In [6]:
tim = np.linspace(0, 1, T + 1)
X = []
for i in range(T):
    timv = np.repeat(tim[i], N)
    timv = np.reshape(timv, (N, 1))
    Sv = np.reshape(S[:, i], (N, 1))
    X.append(np.concatenate((timv, Sv), 1))
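As a quick check of the data layout, $\boldsymbol{X}$ should be a length-$T$ list of $(N,2)$ arrays, one per hedging date, matching the $T$ separate input layers of the network constructed below:
# The features form a list of T arrays of shape (N, 2), holding (t/T, S_t) per path.
print(len(X), X[0].shape)   # expected: 100 (100000, 2)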
Before proceeding further, it is useful to plot a couple of price paths in our data set.
In [7]:
plt.plot(tim, S[0], label="$i=0$")
plt.plot(tim, S[1], label="$i=1$")
plt.plot(tim, S[2], label="$i=2$")
plt.xlabel(r"$\frac{t}{T}$")
plt.ylabel(r"$S^i_t$")
plt.legend()
plt.show()
[Figure: three simulated price paths $S^i_t$, $i=0,1,2$, plotted against $\frac{t}{T}$.]
Our aim is to hedge the call option \begin{equation} (S_T - K)^+, \end{equation} written on $S$; that is, to develop an adapted, self-financing trading strategy in the underlying stock whose terminal wealth matches the option payoff as closely as possible. Such a strategy is specified by its initial wealth $x \in \mathbb{R}$ and its position $\gamma_t$ in the underlying stock at each time $t = 0,1,\ldots,T-1$. By adaptedness, $\gamma_t$ must be a function of the past prices $S_t,S_{t-1},\ldots,S_0$ only. Assuming zero interest rate, by the self-financing property the terminal wealth of the strategy can be expressed as \begin{equation} V_T = x + \sum_{t=1}^T \gamma_{t-1} (S_t - S_{t-1}). \end{equation} The option hedger's profit and loss is then \begin{equation} \mathrm{PnL} = V_T - (S_T - K)^+. \end{equation} Note that, with $x$ and $S_0$ fixed, we can view $\mathrm{PnL}$ as a function of the trading strategy $\gamma_0,\gamma_1,\ldots,\gamma_{T-1}$ and the price increments $S_1-S_0,S_2-S_1,\ldots,S_T-S_{T-1}$, since \begin{equation} \mathrm{PnL}(\gamma_0,\gamma_1,\ldots,\gamma_{T-1};S_1-S_0,S_2-S_1,\ldots,S_T-S_{T-1}) = x + \sum_{t=1}^T \gamma_{t-1} (S_t-S_{t-1}) - \bigg(S_0 + \sum_{t=1}^T (S_t-S_{t-1}) - K \bigg)^+. \end{equation}
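To make this formula concrete, here is an illustrative NumPy version of $\mathrm{PnL}$ (the helper name pnl is ours, purely for illustration), taking the positions $\gamma$ and the increments as $N \times T$ arrays:
# Illustrative NumPy version of the PnL formula above (a sketch):
def pnl(x, gamma, dS, S0, K):
    V_T = x + np.sum(gamma * dS, axis=1)                     # terminal wealth
    payoff = np.maximum(S0 + np.sum(dS, axis=1) - K, 0.0)    # (S_T - K)^+
    return V_T - payoff                                      # per-path PnL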
The key insight of deep hedging is to represent the trading strategy as a neural network whose inputs are the available market data and whose output is the hedging position, that is, \begin{equation} \gamma_t = f_t(S_t,S_{t-1},\ldots,S_0), \end{equation} where $f_t$ is a neural network for each $t = 0,1,\ldots,T-1$. Here, since we know that $S$ is a Markov process, we can simplify the problem slightly (although this is not necessary in general) by seeking a single network $f : [0,1]\times \mathbb{R} \rightarrow \mathbb{R}$ such that \begin{equation} \gamma_t = f\bigg(\frac{t}{T},S_t\bigg), \quad t = 0,1,\ldots,T-1. \end{equation} However, to evaluate $\mathrm{PnL}$, we need all values $f(0,S_0),f(\frac{1}{T},S_1),\ldots,f(\frac{T-1}{T},S_{T-1})$, so we create a large hedging network $F$ by concatenating $f(\frac{t}{T},S_t)$ over $t = 0,1,\ldots,T-1$, so that our feedforward network is the map \begin{equation} F : \begin{bmatrix} (0, S_0) & \Big(\frac{1}{T},S_1\Big) & \cdots & \Big(\frac{T-1}{T},S_{T-1}\Big)\end{bmatrix} \mapsto \begin{bmatrix} f(0,S_0) & f\Big(\frac{1}{T},S_1\Big) & \cdots & f\Big(\frac{T-1}{T},S_{T-1}\Big) \end{bmatrix}. \end{equation} This network is not fully connected and has shared layers (since $f$ is repeated), so we need the Functional API in Keras to specify it. For $f$, we choose \begin{equation} f \in \mathcal{N}_4(2,100,100,100,1; \mathrm{ReLU},\mathrm{ReLU},\mathrm{ReLU},\mathrm{Sigmoid}), \end{equation} where the final $\mathrm{Sigmoid}$ reflects the financial intuition that the hedging position should lie between $0$ and $1$. (This choice helps training, but is not strictly necessary.)
In [8]:
inputs = []
predictions = []
layer1 = keras.layers.Dense(100, activation='relu')
layer2 = keras.layers.Dense(100, activation='relu')
layer3 = keras.layers.Dense(100, activation='sigmoid') if False else keras.layers.Dense(100, activation='relu')
layer4 = keras.layers.Dense(1, activation='sigmoid')
for i in range(T):
    sinput = keras.layers.Input(shape=(2,))
    x = layer1(sinput)       # shared layers: the same f is applied at every t
    x = layer2(x)
    x = layer3(x)
    sprediction = layer4(x)
    inputs.append(sinput)
    predictions.append(sprediction)
predictions = keras.layers.Concatenate(axis=-1)(predictions)
model = keras.models.Model(inputs=inputs, outputs=predictions)
model.summary()
Model: "model"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
input_1 (InputLayer)            [(None, 2)]          0
...
input_100 (InputLayer)          [(None, 2)]          0
__________________________________________________________________________________________________
dense (Dense)                   (None, 100)          300         input_1[0][0] ... input_100[0][0]
__________________________________________________________________________________________________
dense_1 (Dense)                 (None, 100)          10100       dense[0][0] ... dense[99][0]
__________________________________________________________________________________________________
dense_2 (Dense)                 (None, 100)          10100       dense_1[0][0] ... dense_1[99][0]
__________________________________________________________________________________________________
dense_3 (Dense)                 (None, 1)            101         dense_2[0][0] ... dense_2[99][0]
__________________________________________________________________________________________________
concatenate (Concatenate)       (None, 100)          0           dense_3[0][0] ... dense_3[99][0]
==================================================================================================
Total params: 20,601
Trainable params: 20,601
Non-trainable params: 0
__________________________________________________________________________________________________
We train $f$, and thereby $F$, so that the quadratic hedging error $\mathrm{PnL}^2$ is empirically minimised. To this end, we define the loss function \begin{equation} \ell\big((\hat{y}_0,\hat{y}_1,\ldots,\hat{y}_{T-1}),(y_0,y_1,\ldots,y_{T-1})\big) := \mathrm{PnL}(\hat{y}_0,\hat{y}_1,\ldots,\hat{y}_{T-1};y_0,y_1,\ldots,y_{T-1})^2. \end{equation} We fix $x$ as the corresponding Black-Scholes call price $\mathrm{BS}(S_0,K,1)$. The loss function $\ell$ is a custom one, so it needs to be implemented separately. When implementing it, it is important to use functions from the Keras backend; these are functions that TensorFlow can differentiate algorithmically.
In [0]:
K = 1
callprice = BlackScholes(S_0, 0, sigma, 1, K)
def loss_call(y_true, y_pred):
    # y_pred: hedge positions f(t/T, S_t); y_true: increments S_{t+1} - S_t
    return (callprice + kb.sum(y_pred * y_true, axis=-1)
            - kb.maximum(S_0 + kb.sum(y_true, axis=-1) - K, 0.))**2
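As a spot check, the same loss can be mirrored in plain NumPy and evaluated on a few sample paths with a hand-picked constant hedge (an illustrative sketch):
# NumPy mirror of loss_call for spot-checking (a sketch):
def loss_call_np(y_true, y_pred):
    return (callprice + np.sum(y_pred * y_true, axis=-1)
            - np.maximum(S_0 + np.sum(y_true, axis=-1) - K, 0.0))**2

# e.g. a constant half-unit hedge on the first 5 sample paths:
print(loss_call_np(dS[:5], 0.5 * np.ones((5, T))))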
We now train the network using the Adam optimisation algorithm with minibatch size $100$, over $4$ epochs. Note that in training, \begin{equation} \hat{y}_t = f\Big(\frac{t}{T},S^i_t\Big), \quad y_t = S^i_{t+1} - S^i_t, \quad t=0,1,\ldots,T-1. \end{equation} Technically, the features are provided via the list $\boldsymbol{X}$ constructed above.
In [10]:
epochs = 4
model.compile(optimizer='adam', loss=loss_call, metrics=[])
model.fit(X, dS, batch_size=100, epochs=epochs)
Train on 100000 samples
Epoch 1/4
100000/100000 [==============================] - 36s 356us/sample - loss: 0.0016
Epoch 2/4
100000/100000 [==============================] - 30s 301us/sample - loss: 3.5289e-04
Epoch 3/4
100000/100000 [==============================] - 30s 300us/sample - loss: 3.4036e-04
Epoch 4/4
100000/100000 [==============================] - 30s 304us/sample - loss: 3.3498e-04
Out[10]:
Since for large $T$ the price process $S$ (under time rescaling) is close to the Black-Scholes price process $S^{\mathrm{BS}}$, the hedging strategy $\gamma_t = f(\frac{t}{T},S_t)$ should be close to the (continuous-time) Black-Scholes delta hedging strategy \begin{equation} \gamma^\mathrm{BS}_t = \frac{\partial}{\partial S}\mathrm{BS}\bigg(S_t,K,1-\frac{t}{T}\bigg), \end{equation} which amounts to perfect replication, $\mathrm{PnL}=0$.
Let us examine whether this is the case:
In [11]:
t = 0.7
tStest = []
Sval = np.linspace(0, 2, num=T)   # note: Sval[0] = 0 triggers a harmless log(0) warning below
for i in range(T):
    z = (t, Sval[i])
    z = np.reshape(z, (1, 2))
    tStest.append(z)
Delta_learn = np.reshape(model.predict(tStest), (T,))
Delta_BS = BlackScholesCallDelta(Sval, 0, sigma, 1 - t, K)
plt.plot(Sval, Delta_learn, label=r"$f(\frac{t}{T},S_{t})$")
plt.plot(Sval, Delta_BS, "b--", label=r"$\frac{\partial}{\partial S}\mathrm{BS}(S_t,K,1-\frac{t}{T})$")
plt.xlabel(r"$S_t$ (spot price)")
plt.ylabel(r"$\gamma_t$ (hedge ratio)")
plt.title(r'$\frac{t}{T}=$%1.2f' % t, loc='left', fontsize=11)
plt.title(r'$K=$%1.2f' % K, loc='right', fontsize=11)
plt.legend()
plt.show()
/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:7: RuntimeWarning: divide by zero encountered in log
[Figure: learned hedge ratio $f(\frac{t}{T},S_t)$ versus the Black-Scholes delta, plotted against the spot price $S_t$; the two curves nearly coincide.]
We have "derived" the Black-Scholes delta hedge by deep learning! Note that this is unsupervised learning: we did not tell the network $f$ what the Black-Scholes delta hedge is; it learned it by PnL optimisation.
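As a final illustrative check (reusing the pnl sketch from above), we can evaluate the realised PnL of the learned hedge on the simulated paths; a standard deviation close to zero indicates near-replication, and its square should roughly match the final training loss, since the mean is near zero.
# Realised PnL of the learned hedge on the training paths (a sketch):
gamma_learn = model.predict(X)                       # (N, T) hedge positions
pnl_learn = pnl(callprice, gamma_learn, dS, S_0, K)
print(pnl_learn.mean(), pnl_learn.std())             # std**2 roughly equals the final loss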