Finite Difference Methods
MIE 1624H February 1, 2022
Finite difference methods:
Finite difference methods are based on the idea of approximating each partial derivative by a difference quotient. A Taylor series expansion states that a function f(x) may be represented as

f(x + h) = f(x) + h f′(x) + (1/2) h² f′′(x)
If we neglect the term of order h², we get
f′(x) = (f(x + h) − f(x)) / h
This is the forward approximation for the derivative; the derivative is defined as the limit of the difference quotient above as h → 0. There are alternative ways to approximate first-order derivatives. By similar reasoning, we may write
f(x − h) = f(x) − h f′(x) + (1/2) h² f′′(x)
from which we obtain the backward approximation,
f′(x) = (f(x) − f(x − h)) / h
In both cases the truncation error is of order O(h) (why?). A better approximation can be obtained by subtracting the expansion of f(x − h) from the expansion of f(x + h) and rearranging:
f′(x) = (f(x + h) − f(x − h)) / (2h).
This is the central, or symmetric, approximation; for small h it is a better approximation, since the truncation error is O(h²) (why?). The figure below illustrates the computations.
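As a quick numerical check of these error orders (my own sketch, not part of the original notes; the test function f(x) = eˣ and the point x = 1 are arbitrary choices), the Python snippet below compares the forward, backward, and central approximations of f′(x) against the exact derivative as h shrinks.

```python
import math

def f(x):
    return math.exp(x)          # test function; its exact derivative is also exp(x)

x = 1.0
exact = math.exp(x)             # f'(x) = e^x

print(f"{'h':>8} {'forward err':>14} {'backward err':>14} {'central err':>14}")
for h in [1e-1, 1e-2, 1e-3, 1e-4]:
    fwd = (f(x + h) - f(x)) / h             # forward difference,  error O(h)
    bwd = (f(x) - f(x - h)) / h             # backward difference, error O(h)
    ctr = (f(x + h) - f(x - h)) / (2 * h)   # central difference,  error O(h^2)
    print(f"{h:>8.0e} {abs(fwd - exact):>14.2e} "
          f"{abs(bwd - exact):>14.2e} {abs(ctr - exact):>14.2e}")
```

Each tenfold reduction of h should shrink the forward and backward errors by roughly a factor of 10 and the central error by roughly a factor of 100, consistent with truncation errors of O(h) and O(h²).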

The reasoning may be extended to second-order derivatives. An approximation is obtained by adding the expansions of f(x − h) and f(x + h), which yields
f(x + h) + f(x − h) = 2 f(x) + h² f′′(x)
and rearranging yields
f′′(x) = (f(x + h) − 2 f(x) + f(x − h)) / h².
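As a small illustration (my own sketch, not from the notes; f(x) = sin x and x = 0.5 are arbitrary), the second-difference formula above should reproduce f′′(x) = −sin x with an error of order O(h²):

```python
import math

def f(x):
    return math.sin(x)          # test function; exact second derivative is -sin(x)

x = 0.5
exact = -math.sin(x)

for h in [1e-1, 1e-2, 1e-3]:
    # central second-difference approximation of f''(x)
    approx = (f(x + h) - 2 * f(x) + f(x - h)) / h**2
    print(f"h = {h:.0e}   approx = {approx:+.8f}   error = {abs(approx - exact):.2e}")
```

Note that in floating-point arithmetic h cannot be made arbitrarily small: the numerator suffers cancellation, so below some h the roundoff error dominates the O(h²) truncation error.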
Convex functions:
Definition. A function f is convex on a convex set C if for any x1, x2 ∈ C and 0 ≤ λ ≤ 1,
f(λ x1 + (1 − λ) x2) ≤ λ f(x1) + (1 − λ) f(x2).
Hessian ∇²f(x) is positive definite (PD) =⇒ f is strictly convex. Hessian ∇²f(x) is positive semidefinite (PSD) =⇒ f is convex.
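As an aside (my own sketch, not part of the notes), these conditions can be checked numerically for a quadratic f(x) = (1/2) xᵀQx + cᵀx, whose Hessian is the constant matrix Q; the matrix below is an arbitrary example.

```python
import numpy as np

# Hessian of the quadratic f(x) = 0.5 * x^T Q x + c^T x is the constant matrix Q
# (an arbitrary symmetric example here)
Q = np.array([[2.0, 0.5],
              [0.5, 1.0]])

eigvals = np.linalg.eigvalsh(Q)                                 # eigenvalues of a symmetric matrix
print("eigenvalues           :", eigvals)
print("positive definite     :", bool(np.all(eigvals > 0)))     # PD  => strictly convex
print("positive semidefinite :", bool(np.all(eigvals >= 0)))    # PSD => convex
```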
For a convex function of one variable it holds that f′′(x) ≥ 0.
From the finite-difference approximation of the second derivative and h > 0, the expression
f′′(x) = (f(x + h) − 2 f(x) + f(x − h)) / h² ≥ 0
can be rearranged as
f(x) ≤ (f(x + h) + f(x − h)) / 2,
which is the defining inequality of a convex function with x1 = x + h, x2 = x − h, and λ = 1/2.
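To connect this back to computation, here is a small sketch (my own example; the sample functions, grid, and tolerance are arbitrary) that tests the rearranged midpoint inequality f(x) ≤ (f(x + h) + f(x − h)) / 2, i.e. nonnegativity of the central second difference, on a grid of points.

```python
import numpy as np

def looks_convex(f, xs, h=1e-4, tol=1e-10):
    """Check the midpoint inequality f(x) <= (f(x+h) + f(x-h)) / 2 on a grid.

    Equivalently, the central second-difference approximation of f'' must be
    nonnegative (up to a small tolerance for roundoff)."""
    second_diff = (f(xs + h) - 2 * f(xs) + f(xs - h)) / h**2
    return bool(np.all(second_diff >= -tol))

xs = np.linspace(-3.0, 3.0, 601)
print("x**2    convex on [-3, 3]?", looks_convex(lambda x: x**2, xs))      # True
print("exp(x)  convex on [-3, 3]?", looks_convex(lambda x: np.exp(x), xs)) # True
print("sin(x)  convex on [-3, 3]?", looks_convex(lambda x: np.sin(x), xs)) # False
```

Such a grid check only provides evidence of convexity at the sampled points; it is not a proof.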
