
Chapter 4
Generating Random Processes
4.1 Stochastic Processes
A stochastic process is a collection {X(t) : t ∈ T} of random variables indexed by the set T, which usually represents time. The index set T could be discrete or continuous. The set of possible values X(t) can take is the state space, which also can be discrete or continuous. Ross [251] is an excellent introduction to stochastic processes, and includes a chapter on simulation.
A counting process records the number of events or arrivals that occur by time t. A counting process has independent increments if the number of arrivals in disjoint time intervals are independent. A counting process has stationary increments if the number of events occurring in an interval depends only on the length of the interval. An example of a counting process is a Poisson process.
To study a counting process through simulation, we can generate a realization of the process that records events for a finite period of time. The set of times of consecutive arrivals records the outcome and determines the state X(t) at any time t. In a simulation, the sequence of arrival times must be finite. One method of simulation for a counting process is to choose a sufficiently long time interval and generate the arrival times or the interarrival times in this interval.
4.1.1 Poisson Processes
A homogeneous Poisson process {N(t),t ≥ 0} with rate λ is a counting process, with independent increments, such that N(0) = 0 and
P(N(s + t) − N(s) = n) = e^(−λt) (λt)^n / n!,   n ≥ 0,  t, s > 0.   (4.1)
Thus, a homogeneous Poisson process has stationary increments and the number of events N(t) in [0, t] has the Poisson(λt) distribution. If T1 is the time until the first arrival,
P(T1 > t) = P(N(t) = 0) = e^(−λt),   t ≥ 0,
Rizzo, Maria L.. Statistical Computing with R, Second Edition, CRC Press LLC, 2019. ProQuest Ebook Central, http://ebookcentral.proquest.com/lib/ualberta/detail.action?docID=5731927.
Copyright © 2019. CRC Press LLC. All rights reserved.

so T1 is exponentially distributed with rate λ. The interarrival times T1, T2, . . . are the times between successive arrivals. The interarrival times are iid exponentials with rate λ, which follows from (4.1) and the memoryless property of the exponential distribution.
One method of simulating a Poisson process is to generate the interarrival times. Then the time of the nth arrival is the sum Sn = T1 + · · · + Tn (the waiting time until the nth arrival). The sequence of interarrival times {Tn, n ≥ 1} or the sequence of arrival times {Sn, n ≥ 1} is a realization of the process. Thus, a realization is an infinite sequence, rather than a single number. In a simulation, the finite sequence of interarrival times T1, . . . , TN or arrival times S1, . . . , SN is a simulated realization of the process on the interval [0, SN).
Another method of simulating a Poisson process is to use the fact that the conditional distribution of the (unordered) arrival times given N(t) = n is the same as that of a random sample of size n from a Uniform(0, t) distribution.
The state of the process at a given time t is equal to the number of arrivals in [0, t], which is min{k : Sk > t} − 1. That is, N(t) = n − 1, where Sn is the smallest arrival time exceeding t.
Algorithm for simulating a homogeneous Poisson process on an in- terval [0, t0] by generating interarrival times.
1. Set S1 = 0.
2. For j = 1, 2, . . . while Sj ≤ t0:
   (a) Generate Tj ∼ Exp(λ).
   (b) Set Sj = T1 + · · · + Tj.
3. Set N(t0) = min{j : Sj > t0} − 1.
It is inefficient to implement this algorithm in R using a for loop. It should
be translated into vectorized operations, as shown in the next example.
Example 4.1 (Poisson process). This example illustrates a simple approach to simulation of a Poisson process with rate λ. Suppose we need N(3), the number of arrivals in [0, 3]. Generate iid exponential times Ti with rate λ and find the index n where the cumulative sum Sn = T1 + · · · + Tn first exceeds 3. It follows that the number of arrivals in [0, 3] is n − 1. On average this number is E[N(3)] = 3λ.
lambda <- 2
t0 <- 3
Tn <- rexp(100, lambda)     #interarrival times
Sn <- cumsum(Tn)            #arrival times
n <- min(which(Sn > t0))    #arrivals+1 in [0, t0]

Results from two runs are shown below.

> n-1
[1] 8
> round(Sn[1:n], 4)
[1] 1.2217 1.3307 1.3479 1.4639 1.9631 2.0971
[7] 2.3249 2.3409 3.9814
> n-1
[1] 5
> round(Sn[1:n], 4)
[1] 0.4206 0.8620 1.0055 1.6187 2.6418 3.4739
For this example, the average of simulated values N(3) = n − 1 for a large number of runs should be close to E[N(3)] = 3λ = 6. ⋄
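To make this check concrete, the simulation above can be replicated (a sketch of our own, not from the text; the replication count 10000 is an arbitrary choice):

```r
# Sketch: replicate the simulation of Example 4.1 to estimate E[N(3)]
lambda <- 2
t0 <- 3
pp <- replicate(10000, {
  Sn <- cumsum(rexp(100, lambda))   #arrival times
  min(which(Sn > t0)) - 1           #arrivals in [0, t0]
})
mean(pp)   #should be close to 3 * lambda = 6
```

The Monte Carlo standard error of the mean here is about sqrt(6/10000) ≈ 0.025, so the estimate is typically within a few hundredths of 6.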
An alternate method of generating the arrival times of a Poisson process is based on the fact that given the number of arrivals in an interval (0, t), the conditional distribution of the unordered arrival times is uniform on (0, t). That is, given that the number of arrivals in (0, t) is n, the arrival times S1, . . . , Sn are jointly distributed as an ordered random sample of size n from a Uniform(0, t) distribution.
Applying the conditional distribution of the arrival times, it is possible to simulate a Poisson(λ) process on an interval (0, t) by first generating a random observation n from the Poisson(λt) distribution, then generating a random sample of n Uniform(0,t) observations and ordering the uniform sample to obtain the arrival times.
Example 4.2 (Poisson process, cont.). Returning to Example 4.1, simulate a Poisson(λ) process and find N(3), using the conditional distribution of the arrival times. As a check, we estimate the mean and variance of N(3) from 10000 replications.
lambda <- 2
t0 <- 3
upper <- 100
pp <- numeric(10000)
for (i in 1:10000) {
    N <- rpois(1, lambda * upper)
    Un <- runif(N, 0, upper)    #unordered arrival times
    Sn <- sort(Un)              #arrival times
    n <- min(which(Sn > t0))    #arrivals+1 in [0, t0]
    pp[i] <- n - 1              #arrivals in [0, t0]
}

Alternately, the loop can be replaced by replicate, as shown.

pp <- replicate(10000, expr = {
    N <- rpois(1, lambda * upper)
    Un <- runif(N, 0, upper)    #unordered arrival times
    Sn <- sort(Un)              #arrival times
    n <- min(which(Sn > t0))    #arrivals+1 in [0, t0]
    n - 1                       #arrivals in [0, t0]
})
The mean and variance should both be equal to λt = 6 in this example. Here the sample mean and sample variance of the generated values N(3) are indeed very close to 6.
> c(mean(pp), var(pp))
[1] 5.977100 5.819558
Actually, it is possible that none of the generated arrival times exceed the time t0 = 3. In this case, the process needs to be simulated for a longer time than the value in upper. Therefore, in practice, one should choose upper according to the parameters of the process, and do some error checking. For example, if we need N(t0), one approach is to wrap the min(which()) step with try and check that the result of try is an integer using is.integer. See the corresponding help topics for details.
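One way such error checking might look is sketched below (our own sketch, not the book's code; the policy of doubling upper on failure is an assumption of ours):

```r
# Sketch: error checking for the case that no arrival exceeds t0.
# If which() returns an empty vector, min() warns and returns Inf
# (a double), so is.integer() detects the failure; we then retry
# with a larger value of upper (our own choice of retry policy).
lambda <- 2
t0 <- 3
upper <- 100
repeat {
  N <- rpois(1, lambda * upper)
  Sn <- sort(runif(N, 0, upper))   #ordered arrival times
  n <- try(suppressWarnings(min(which(Sn > t0))), silent = TRUE)
  if (is.integer(n)) break         #a valid arrival index was found
  upper <- 2 * upper               #simulate over a longer interval
}
n - 1   #arrivals in [0, t0]
```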
Ross [251] discusses the computational efficiency of the two methods applied in Examples 4.1 and 4.2. The second method is considerably slower (by a factor of 4 or 5) than the method of Example 4.1 when coded in R. The rexp generator is almost as fast as runif, while the sort operation adds O(n log n) time. Some performance improvement might be gained if this algorithm is coded in C and a faster sorting algorithm designed for uniform numbers is used. ⋄
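The relative speed of the two methods can be explored with system.time (a rough sketch of our own; the observed ratio depends on the machine and the R version):

```r
# Sketch: rough timing comparison of the two simulation methods
# (the replication count m is an arbitrary choice)
lambda <- 2
t0 <- 3
upper <- 100
m <- 2000
t1 <- system.time(    #method of Example 4.1: exponential interarrivals
  replicate(m, min(which(cumsum(rexp(100, lambda)) > t0)) - 1)
)
t2 <- system.time(    #method of Example 4.2: sorted uniforms
  replicate(m, {
    Sn <- sort(runif(rpois(1, lambda * upper), 0, upper))
    min(which(Sn > t0)) - 1
  })
)
c(method1 = t1[["elapsed"]], method2 = t2[["elapsed"]])
```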
Nonhomogeneous Poisson Processes
A counting process is a Poisson process with intensity function λ(t), t ≥ 0, if N(0) = 0, N(t) has independent increments, and for h > 0,
P(N(t + h) − N(t) ≥ 2) = o(h), and
P(N(t + h) − N(t) = 1) = λ(t)h + o(h).
The Poisson process N(t) is nonhomogeneous if the intensity function λ(t) is not constant. A nonhomogeneous Poisson process has independent increments but does not have stationary increments. The distribution of
N(s + t) − N(s) is Poisson with mean ∫_s^(s+t) λ(y) dy. The function m(t) = E[N(t)] = ∫_0^t λ(y) dy is called the mean value function of the process. Note that m(t) = λt in the case of the homogeneous Poisson process, where the intensity function is a constant.
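For a concrete intensity function, the mean value function can be evaluated numerically with integrate (a sketch of our own; the intensity λ(t) = 3 cos²(t) and the function names are our illustration, anticipating Example 4.3):

```r
# Sketch: evaluate the mean value function m(t) numerically
intensity <- function(y) 3 * cos(y)^2
m <- function(t) integrate(intensity, lower = 0, upper = t)$value
m(2 * pi)   #equals 3*pi (about 9.42), since cos^2 averages to 1/2
```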
Every nonhomogeneous Poisson process with a bounded intensity function can be obtained by time sampling a homogeneous Poisson process. Suppose

that λ(t) ≤ λ < ∞ for all t ≥ 0. Then sampling from a Poisson(λ) process such that an event happening at time t is accepted or counted with probability λ(t)/λ generates the nonhomogeneous process with intensity function λ(t). To see this, let N(t) be the number of accepted events in [0, t]. Then N(t) has the Poisson distribution with mean

E[N(t)] = ∫_0^t λ (λ(y)/λ) dy = ∫_0^t λ(y) dy.

To simulate a nonhomogeneous Poisson process on an interval [0, t0], find λ0 < ∞ such that λ(t) ≤ λ0, 0 ≤ t ≤ t0. Then generate the arrival times {Sj} from the homogeneous Poisson(λ0) process, and accept each arrival with probability λ(Sj)/λ0. The steps to simulate the process on an interval [0, t0] are as follows.

Algorithm for simulating a nonhomogeneous Poisson process on an interval [0, t0] by sampling from a homogeneous Poisson process.

1. Set S1 = 0.
2. For j = 1, 2, . . . while Sj ≤ t0:
   (a) Generate Tj ∼ Exp(λ0) and set Sj = T1 + · · · + Tj.
   (b) Generate Uj ∼ Uniform(0, 1).
   (c) If Uj ≤ λ(Sj)/λ0, accept (count) this arrival and set Ij = 1; otherwise set Ij = 0.
3. Deliver the arrival times {Sj : Ij = 1}.

Although this algorithm is quite simple, for implementation in R it is more efficient if translated into vectorized operations. This is shown in the next example.

Example 4.3 (Nonhomogeneous Poisson process). Simulate a realization from a nonhomogeneous Poisson process with intensity function λ(t) = 3 cos²(t). Here the intensity function is bounded above by λ0 = 3, so the jth arrival is accepted if Uj ≤ 3 cos²(Sj)/3 = cos²(Sj).

lambda <- 3
upper <- 100
N <- rpois(1, lambda * upper)
Tn <- rexp(N, lambda)
Sn <- cumsum(Tn)
Un <- runif(N)
keep <- (Un <= cos(Sn)^2)   #indicator, as logical vector
Sn[keep]

Now the values in Sn[keep] are the ordered arrival times of the nonhomogeneous Poisson process.
> round(Sn[keep], 4)
[1] 0.0237 0.5774 0.5841 0.6885 2.3262 2.4403 2.9984 3.4317 3.7588 3.9297
[11] 4.2962 6.2602 6.2862 6.7590 6.8354 7.0150 7.3517 8.3844 9.4499 9.4646
. . .
To determine the state of the process at time t = 2π, for example, refer to the entries of Sn indexed by keep.
> sum(Sn[keep] <= 2*pi)
[1] 12
> table(keep)/N
keep
    FALSE      TRUE
0.4969325 0.5030675
Thus N(2π) = 12, and in this example approximately 50% of the arrivals were counted. ⋄
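As a further check (our own sketch, not part of the text), the mean of the thinned process can be compared with m(2π) = ∫_0^(2π) 3 cos²(y) dy = 3π ≈ 9.42:

```r
# Sketch: estimate E[N(2*pi)] for the thinned (nonhomogeneous) process
# with intensity lambda(t) = 3 cos^2(t); replication count is arbitrary
lambda <- 3
upper <- 100
nt <- replicate(2000, {
  N <- rpois(1, lambda * upper)
  Sn <- cumsum(rexp(N, lambda))   #homogeneous arrival times
  Un <- runif(N)
  keep <- (Un <= cos(Sn)^2)       #thinning: accept w.p. cos^2(Sn)
  sum(Sn[keep] <= 2 * pi)         #accepted arrivals in [0, 2*pi]
})
mean(nt)   #should be close to 3 * pi, about 9.42
```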
4.1.2 Renewal Processes
A renewal process is a generalization of the Poisson process. If {N(t), t ≥ 0} is a counting process such that the sequence of nonnegative interarrival times T1, T2, . . . is iid (but not necessarily exponentially distributed), then {N(t), t ≥ 0} is a renewal process. The function m(t) = E[N(t)] is called the mean value function of the process, and it uniquely determines the distribution of the interarrival times.
If the distribution FT (t) of the iid interarrival times is specified, then a renewal process can be simulated by generating the sequence of interarrival times, by a method similar to Example 4.1.
Example 4.4 (Renewal process). Suppose the interarrival times of a renewal process have the geometric distribution with success probability p. (This example is discussed in [251].) Then the interarrival times are nonnegative integers, and Sj = T1 + · · · + Tj has the negative binomial distribution with size parameter r = j and probability p. The process can be simulated by generating geometric interarrival times and computing the consecutive arrival times by the cumulative sum of interarrival times.
t0 <- 5
Tn <- rgeom(100, prob = .2)   #interarrival times
Sn <- cumsum(Tn)              #arrival times
n <- min(which(Sn > t0))      #arrivals+1 in [0, t0]
The distribution of N(t0) can be estimated by replicating the simulation above.

Nt0 <- replicate(1000, expr = {
    Sn <- cumsum(rgeom(100, prob = .2))
    min(which(Sn > t0)) - 1
})
table(Nt0)/1000

Nt0
    0     1     2     3     4     5     6     7
0.273 0.316 0.219 0.108 0.053 0.022 0.007 0.002
To estimate the means E[N(t)], vary the time t0.

t0 <- seq(0.1, 30, .1)
mt <- numeric(length(t0))
for (i in 1:length(t0)) {
    mt[i] <- mean(replicate(1000, {
        Sn <- cumsum(rgeom(100, prob = .2))
        min(which(Sn > t0[i])) - 1
    }))
}
plot(t0, mt, type = "l", xlab = "t", ylab = "mean")
Let us compare with the homogeneous Poisson process, where the interarrival times have a constant mean. Here we have p = 0.2 so the average interarrival time is 0.8/0.2 = 4. The Poisson process that has mean interarrival time 4 has Poisson parameter λt = t/4. We added a reference line to the plot corresponding to the Poisson process mean λt = t/4 using abline(0, .25).
The plot is shown in Figure 4.1. It should not be surprising that the mean of the renewal process is very close to λt, because the geometric distribution is the discrete analog of the exponential; it has the memoryless property. That is, if X ∼ Geometric(p), then for all j, k = 0, 1, 2, . . .

P(X > j + k | X > j) = (1 − p)^(j+k) / (1 − p)^j = (1 − p)^k = P(X > k). ⋄
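The memoryless property can be confirmed numerically with pgeom (a sketch of our own; note that in R's parametrization X counts failures before the first success, so P(X ≥ m) = (1 − p)^m is obtained as pgeom(m − 1, p, lower.tail = FALSE)):

```r
# Sketch: numerical check of the geometric memoryless property
# In R's parametrization, P(X >= m) = (1-p)^m, and this tail
# probability is pgeom(m - 1, p, lower.tail = FALSE).
p <- 0.2
j <- 3
k <- 5
lhs <- pgeom(j + k - 1, p, lower.tail = FALSE) /
       pgeom(j - 1, p, lower.tail = FALSE)      #P(X >= j+k | X >= j)
rhs <- pgeom(k - 1, p, lower.tail = FALSE)      #P(X >= k) = (1-p)^k
isTRUE(all.equal(lhs, rhs))                     #TRUE
```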
4.1.3 Symmetric Random Walk

Let X1, X2, . . . be a sequence of iid random variables with probability distribution P(Xi = 1) = P(Xi = −1) = 1/2. Define the partial sum Sn = X1 + · · · + Xn. The process {Sn, n ≥ 0} is called a symmetric random walk. For example, if a gambler bets $1 on repeated trials of coin flipping, then Sn represents the gain/loss after n tosses.

FIGURE 4.1: Sequence of sample means of a simulated renewal process in Example 4.4. The reference line corresponds to the mean λt = t/4 of a homogeneous Poisson process.
Example 4.5 (Plot a partial realization of a random walk). It is very simple to generate a symmetric random walk process over a short time span.
n <- 400
incr <- sample(c(-1, 1), size = n, replace = TRUE)
S <- as.integer(c(0, cumsum(incr)))
plot(0:n, S, type = "l", main = "", xlab = "i")

A partial realization of the symmetric random walk process starting at S0 = 0 is shown in Figure 4.2. The process has returned to 0 several times within time [1, 400].

> which(S == 0)
[1]   1   3  27  29  31  37  41  95 225 229 233 237 239 241
The value of Sn can be determined by the partial random walk starting at the most recent time the process returned to 0. ⋄
If the state of the symmetric random walk Sn at time n is required, but not the history up to time n, then for large n it may be more efficient to generate Sn as follows.
Assume that S0 = 0 is the initial state of the process. If the process has

FIGURE 4.2: Partial realization of a symmetric random walk in Example 4.5.
returned to the origin before time n, then to generate Sn we can ignore the past history up until the time the process most recently hit 0. Let T be the time until the first return to the origin. Then to generate Sn, one can simplify the problem by first generating the waiting times T until the total time first exceeds n. Then starting from the last return to the origin before time n, generate the increments Xi and sum them.
Algorithm to simulate the state Sn of a symmetric random walk
The following algorithm is adapted from [72, XIV.6].
Let Wj be the waiting time until the jth return to the origin.
1. Set W1 = 0.
2. For j = 1, 2, . . . while Wj ≤ n:
   (a) Generate a random Tj from the distribution of the time until the first return to 0.
   (b) Set Wj = T1 + · · · + Tj.
3. Set t0 = Wj − Tj (the time of the last return to 0 in time n).
4. Set s1 = 0.
5. Generate the increments from time t0 + 1 until time n: For i = 1, 2, . . . , n − t0:
   (a) Generate a random increment xi with P(xi = 1) = P(xi = −1) = 1/2.

   (b) Set si = x1 + · · · + xi.
   (c) If si = 0, reset the counter to i = 1 (another return to 0 is not accepted, so reject this partial random walk and generate a new sequence of increments starting again from time t0 + 1).
6. Deliver si.
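Steps 4 through 6 can be sketched as a rejection loop (the helper function walk_tail and its name are our own, not from the text):

```r
# Sketch of Steps 4-6: generate the end state S_n of the walk, given
# that the last return to 0 before time n occurred at time t0
walk_tail <- function(n, t0) {
  repeat {
    x <- sample(c(-1, 1), size = n - t0, replace = TRUE)  #increments
    s <- cumsum(x)                                        #partial sums
    if (all(s != 0)) return(s[n - t0])  #reject any walk that hits 0
  }
}
walk_tail(200, 168)   #S_200, given the last return to 0 was at 168
```

The loop terminates with probability 1; each attempt of length m avoids 0 with probability P(T > m), so only a few attempts are needed on average for moderate m.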
To implement the algorithm, one needs to provide a generator for T, the time until the next return of the process to 0. The probability distribution of T [72, Thm. 6.1] is given by
P(T = 2n) = p2n = C(2n − 2, n − 1) (1 / (n 2^(2n−1))) = Γ(2n − 1) / (n 2^(2n−1) Γ(n)^2),   n ≥ 1,
P(T = 2n + 1) = 0,   n ≥ 0,

where C(m, k) denotes the binomial coefficient.
Example 4.6 (Generator for the time until return to origin). An efficient algorithm for generating from the distribution of T is given by Devroye [72, p. 754]. Here we apply an inefficient version that is easily implemented in R. Notice that p2n equals 1/(2n) times the probability P(X = n − 1), where X ∼ Binomial(2n − 2, p = 1/2).
The following methods are equivalent.
#compute the probabilities directly
n <- 1:10000
p2n <- exp(lgamma(2*n-1) - log(n) - (2*n-1)*log(2) - 2*lgamma(n))

#or compute using dbinom
P2n <- (.5/n) * dbinom(n-1, size = 2*n-2, prob = 0.5)

Recall that if X is a discrete random variable, the inverse transform method generates X by looking up a Uniform(0,1) variate u in the vector of cumulative probabilities, here pP2n <- cumsum(P2n).
Here are two examples to illustrate the method of looking up the solution FX(xi−1) < u ≤ FX(xi) in the probability vector.

#first part of pP2n
[1] 0.5000000 0.6250000 0.6875000 0.7265625 0.7539062 0.7744141

In the first example u = 0.6612458 and the first return to the origin occurs at time n = 6, and in the second example u = 0.5313384 and the next return to 0 occurs at time n = 4 after the first return to 0. Thus the second return to the origin occurs at time 10. (The case u > max(pP2n) must be handled separately.)
Suppose now that n is given and we need to compute the time of the last return to 0 in (0, n].
n <- 200
sumT <- 0
while (sumT <= n) {
    u <- runif(1)
    s <- sum(u > pP2n)
    if (s == length(pP2n)) warning("T is truncated")
    Tj <- 2 * (1 + s)
    #print(c(Tj, sumT))
    sumT <- sumT + Tj
}
sumT - Tj

In case the random uniform number exceeds the maximal value in the cdf vector pP2n, a warning is issued. Here, instead of issuing a warning, one could append to the vector and return a valid T. We leave that as an exercise. A better algorithm is suggested by Devroye [72, p. 754].

One run of the simulation above generates the times 110, 128, 162, 164, 166, 168, and 210 at which the process visits 0 (uncomment the print statement to print the times). Therefore the last visit to 0 before n = 200 is at time 168. Finally, S200 can be generated by simulating a symmetric random walk starting from S168 = 0 for t = 169, . . . , 200 (rejecting the partial random walk if it hits 0). ⋄

4.2 Brownian Motion

One of the most fundamental models in finance and science is the Brownian motion or Wiener process. The real-valued stochastic process {Wt, t ≥ 0} is a Brownian motion if

1. W0 = 0 almost surely,
2. Wt is almost surely continuous,
3. Wt − Ws ∼ N(0, t − s), 0 ≤ s < t,
4. if 0 ≤ s0 ≤ t0 ≤ s1 ≤ t1, then the increments Wt0 − Ws0 and Wt1 − Ws1 are independent.