Numerical Methods & Scientific Computing: lecture notes
IVPs
Other MATLAB solvers
Multistep methods
The other major class of nonstiff solvers uses previous values y_k. The best-behaved are the Adams methods (Adams, 1883).
Basic idea: go back to the quadrature formula
y_{n+1} = y_n + ∫_{t_n}^{t_{n+1}} f(τ, y(τ)) dτ
then
- use previous f values f_k ≡ f(t_k, y_k), k = n, n-1, ..., to construct a polynomial interpolant
- integrate the interpolant over [t_n, t_{n+1}] (extrapolation) to get y_{n+1}
Adams-Bashforth methods
A family of explicit multistep methods
Our previous derivation used the left-hand (LH) rectangle rule.
Example
Use f_n only → Euler's method = AB1
Example
Use f_n, f_{n-1} → linear interpolant → 2nd order 2-step method AB2
y_{n+1} = y_n + h[(3/2) f_n - (1/2) f_{n-1}]
Note: we need 2 starting values y_0, y_1 to start this
Multistep methods are not self-starting
Need to use another method to start them off.
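As a concrete illustration of the last two points (one new f-evaluation per step, plus a separate starting step), here is a minimal fixed-step AB2 sketch. The scalar test problem, the stepsize and the choice of a single Euler starter step are illustrative, not part of the notes.

% Fixed-step AB2 sketch for a scalar ODE (illustrative problem y' = -y + sin(t)).
% One forward Euler step supplies the second starting value y_1; a single
% 1st-order step does not spoil the overall 2nd-order convergence.
f = @(t,y) -y + sin(t);
h = 0.1;  t = (0:h:10)';  N = numel(t);
y = zeros(N,1);  y(1) = 1;
fprev = f(t(1), y(1));                      % f_0
y(2) = y(1) + h*fprev;                      % Euler starter step
for n = 2:N-1
    fn = f(t(n), y(n));                     % the only new f-evaluation this step
    y(n+1) = y(n) + h*(1.5*fn - 0.5*fprev); % AB2 update
    fprev = fn;                             % reuse f_n at the next step
end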
Why bother?
Same idea → AB3:
y_{n+1} = y_n + (h/12)[23 f_n - 16 f_{n-1} + 5 f_{n-2}]
Proceeding in the same way gives AB4, AB5, ... Why bother?
AB methods need only 1 new function evaluation per step
⟹ very efficient compared to RK methods, which need several f-evaluations per step!
Properties of AB methods
Stability analysis is more complicated for multistep methods, i.e. it is harder to guarantee 0-stability.
But Adams methods are well-behaved …
All Adams methods are 0-stable
⟹ all Adams methods are convergent
For Adams-Bashforth methods, regions of absolute stability shrink rapidly with increasing order (Diagram)
To improve the linear stability behavior, we look at implicit methods.
Adams-Moulton methods
Back to the derivation, but now include f_{n+1} in the interpolant
→ y_{n+1} appears on both sides of the equation, so we must solve a nonlinear equation (or system) at each time step
→ a family of implicit multistep methods: the Adams-Moulton methods (Adams, 1883)
Backward Euler method
Example
0th degree interpolant through f_{n+1}: use the right-hand (RH) rectangle rule
y_{n+1} = y_n + h f_{n+1} = y_n + h f(t_{n+1}, y_{n+1})
Backward Euler method = AM0 (1st order implicit method)
RAS: { hλ ∈ ℂ : |1 - hλ| > 1 }
Backward Euler method is 1st order convergent and A-stable!
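Where the RAS comes from (a standard one-line check): for the test equation y′ = λy, backward Euler gives y_{n+1} = y_n + hλ y_{n+1}, i.e. y_{n+1} = y_n/(1 - hλ), so |y_{n+1}| < |y_n| exactly when |1 - hλ| > 1. This is the exterior of the unit disc centred at hλ = 1, which contains the whole open left half-plane, hence A-stability.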
Trapezoid method
Example
Put a linear interpolant through f_n, f_{n+1} and integrate over [t_n, t_{n+1}] (trapezoid rule) → implicit trapezoid rule = AM1
y_{n+1} = y_n + (h/2)[f_n + f_{n+1}] = y_n + (h/2)[f(t_n, y_n) + f(t_{n+1}, y_{n+1})]
Local error of the trapezoid rule ∼ h^3 → 2nd order method. RAS: { hλ ∈ ℂ : Re(hλ) < 0 }, the open left half-plane
The trapezoid method is 2nd order convergent and A-stable! Implemented in a variable-step manner in MATLAB's ode23t
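To see what "implicit" means in practice, one trapezoid step for a scalar ODE can be solved numerically with MATLAB's fzero; the right-hand side and the values of t_n, y_n, h below are made up for illustration.

% One implicit trapezoid (AM1) step for a scalar ODE, with the nonlinear
% equation for y_{n+1} solved by fzero.  RHS and numbers are illustrative.
f  = @(t,y) -100*(y - cos(t));                       % a moderately stiff scalar RHS
tn = 0;  yn = 1;  h = 0.05;
g  = @(Y) Y - yn - (h/2)*( f(tn,yn) + f(tn+h,Y) );   % residual; its root is y_{n+1}
ynp1 = fzero(g, yn);                                 % y_n serves as the initial guess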
Predictor-corrector methods
In practice, we don't solve AM methods to convergence; instead we just do 1 iteration of the fixed point iteration.
Example
Solve the trapezoid method
y_{n+1} = y_n + (h/2)[f(t_n, y_n) + f(t_{n+1}, y_{n+1})] ≡ g(y_{n+1})
using the fixed point iteration
y_{n+1}^{(k+1)} = g(y_{n+1}^{(k)}) = y_n + (h/2)[f(t_n, y_n) + f(t_{n+1}, y_{n+1}^{(k)})]
but we need an initial guess y_{n+1}^{(0)}!
Use the corresponding AB method (explicit) of the same order to get the initial guess y_{n+1}^{(0)}
→ a Predictor-Corrector pair (Moulton, Milne 1926)
Adams-Bashforth-Moulton
Use AB (explicit) to get an initial guess ŷ_{n+1}: the predictor step
then use the AM method (1 iteration of the fixed point iteration) to get y_{n+1}: the corrector step
Example
AB2-AM1 (both 2nd order)
ŷ_{n+1} = y_n + h[(3/2) f_n - (1/2) f_{n-1}]   Predict
f̂_{n+1} = f(t_{n+1}, ŷ_{n+1})   Evaluate
y_{n+1} = y_n + (h/2)[f_n + f̂_{n+1}]   Correct
f_{n+1} = f(t_{n+1}, y_{n+1})   Evaluate (ready for the next step)
PECE
Overall → an explicit method with 2 function evaluations per step, but with much bigger stability regions than AB.
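A minimal fixed-step sketch of this AB2-AM1 PECE scheme; the scalar test problem, the stepsize and the Euler starter step are illustrative choices, not from the notes.

% Fixed-step AB2-AM1 (PECE) sketch for an illustrative scalar ODE.
f = @(t,y) -y + sin(t);
h = 0.1;  t = (0:h:10)';  N = numel(t);
y = zeros(N,1);  y(1) = 1;
fprev = f(t(1), y(1));
y(2) = y(1) + h*fprev;                     % Euler starter step
for n = 2:N-1                              % 2 f-evaluations per step: fn and fhat
    fn   = f(t(n), y(n));                  % Evaluate (f at the latest corrected value)
    yhat = y(n) + h*(1.5*fn - 0.5*fprev);  % Predict  (AB2)
    fhat = f(t(n+1), yhat);                % Evaluate (f at the predicted value)
    y(n+1) = y(n) + (h/2)*(fn + fhat);     % Correct  (one trapezoid / AM1 iteration)
    fprev = fn;
end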
Modern Adams codes
In practice, modern Adams codes are variable-step (messy) and variable-order: not for amateurs.
Example
Mathematica’s NDSolve (500,000 lines of C)
Example
MATLAB’s ode113 — Adams methods of orders 1–13 (700 lines of MATLAB)
Summary of nonstiff solvers
1-step (RK):
- expensive for high order p
- p Fevals per step
- stability region grows with order
- 0-stable ⟹ convergent
- variable-step → simple code
- self-starting
- general purpose solver

Multistep (Adams):
- easy to get high order
- 2 Fevals per step (Pred-Corr)
- stability region shrinks with order
- 0-stable ⟹ convergent
- variable step/order → complex code
- start with low order or RK
- best if the problem is very smooth, or tolerances are tight, or f is very expensive
Stiff solvers
The methods we have seen so far (except the implicit backward Euler (AM0) and trapezoid (AM1) methods) have had finite regions of absolute stability
⟹ if the Jacobian J has eigenvalues λ with Re(λ) ≪ 0, the stepsize will need to be tiny for a stable numerical solution
⟹ h is determined by stability, not accuracy
What happens if you try a nonstiff variable-step solver (e.g. ode23) on a stiff problem? Demo
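The demo might look something like this (my sketch, not necessarily the one used in lectures), using the classic stiff van der Pol equation with μ = 1000:

% Nonstiff solver (ode23) vs stiff solver (ode15s) on the stiff van der Pol
% equation.  Illustrative parameters; watch the step counts and run times.
mu = 1000;
vdp = @(t,y) [ y(2); mu*(1 - y(1)^2)*y(2) - y(1) ];
tspan = [0 100];  y0 = [2; 0];
tic; [t1, ~] = ode23 (vdp, tspan, y0); t_ns = toc;   % takes a huge number of tiny steps
tic; [t2, ~] = ode15s(vdp, tspan, y0); t_s  = toc;   % handles the stiffness easily
fprintf('ode23 : %7d steps, %6.2f s\n', numel(t1)-1, t_ns);
fprintf('ode15s: %7d steps, %6.2f s\n', numel(t2)-1, t_s);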
A nonstiff solver on a stiff problem
If hλ is outside the stability region, the two solution estimates will differ a lot (due to error growth)
→ the solver thinks h is too big (for accuracy reasons) so it cuts h down
⟹ the solver keeps going happily, just with very small steps, so as to keep hλ inside the stability region
Moral: a nonstiff solver will solve a stiff problem; it just takes a long time to do it!
The Dahlquist Barrier
To avoid this problem we need methods with unbounded stability domains, such as A-stable methods.
The backward Euler and implicit trapezoid methods are A-stable, but only 1st and 2nd order respectively.
Can we do better? NO
Theorem (Dahlquist 1963)
No A-stable multistep method has order > 2.
To get around the Dahlquist barrier, we weaken our requirement from A-stability.
Problems with sharp transients typically have eigenvalues near the negative real axis, not near the imaginary axis (which corresponds to oscillatory ODEs)
⟹ A-stability is too strong for most stiff problems
A(α)-stability
⟹ weaken A-stability to require
|y_{n+1}| < |y_n|
for hλ in a wedge about the negative real axis, i.e. a region that does not include the neighbourhood of the imaginary axis → A(α)-stability
then the Dahlquist barrier doesn't apply
→ higher order methods?
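More precisely (the standard definition, included here for completeness): a method is A(α)-stable, 0 < α ≤ π/2, if its region of absolute stability contains the infinite wedge { hλ ∈ ℂ : |arg(-hλ)| < α }; A-stability is the limiting case α = π/2.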
BDF methods
The first common class of methods with these properties were the BDF methods (Curtiss & Hirschfelder 1953, Gear 1971), which we get by approximating y′, not ∫_{t_n}^{t_{n+1}} f dτ. Start from
y′(t_{n+1}) = f(t_{n+1}, y_{n+1})
⟹ we'll have only one f value, namely f_{n+1} → an implicit method
Now approximate the LHS by the derivative of the polynomial interpolant through t_{n+1}, t_n, t_{n-1}, ...
Example
Use t_{n+1}, t_n → backward difference formula
y′ ≈ (y_{n+1} - y_n)/h
→ backward Euler method = AM0 = BDF1
BDF1-5
Example
Use t_{n+1}, t_n, t_{n-1} →
y_{n+1} - (4/3) y_n + (1/3) y_{n-1} = (2h/3) f_{n+1}
= BDF2, a 2nd order A-stable, L-stable method
Proceeding in the same way → BDF3, BDF4, BDF5, with stability regions showing A(α)-stability.
BDF methods have traded A-stability near the imaginary axis for L-stability → the first general-purpose stiff solvers
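As a sketch of how cheap a BDF step can be for a linear problem y′ = Ay, where the implicit equation at each step reduces to a linear solve; the stiff matrix A, stepsize and starting values below are illustrative.

% Fixed-step BDF2 sketch for the linear system y' = A*y.  For a linear RHS
% each implicit step is just a linear solve.
A = [-1 1; 0 -1000];                 % illustrative stiff matrix (eigenvalues -1 and -1000)
h = 0.01;  N = 500;
y = zeros(2, N+1);  y(:,1) = [1; 1];
y(:,2) = (eye(2) - h*A) \ y(:,1);    % one backward Euler (BDF1) step to get started
for n = 2:N
    rhs = (4/3)*y(:,n) - (1/3)*y(:,n-1);
    y(:,n+1) = (eye(2) - (2*h/3)*A) \ rhs;   % BDF2: y_{n+1} - (4/3)y_n + (1/3)y_{n-1} = (2h/3) f_{n+1}
end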
Solving implicit methods
These stability regions only hold if we solve the nonlinear equations to convergence. How do we do this?
Can we use fixed point iteration?
y_{n+1} = y_n + h f(t_{n+1}, y_{n+1}) ≡ g(y_{n+1})
Fixed point iteration only converges if |g′(x*)| < 1 (a contraction mapping)
In our case g′ = h ∂f/∂y = hJ, so the restriction |g′(x*)| < 1 requires
|hJ| < 1
which puts the same restriction on the stepsize that we are trying to lift in order to solve stiff problems
Example
Use backward Euler to illustrate:
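For the test equation y′ = λy, backward Euler's fixed point iteration is y^{(k+1)} = y_n + hλ y^{(k)}, which is a contraction (and so converges) only if |hλ| < 1: exactly the kind of stepsize restriction we are trying to escape. (In practice, stiff codes therefore use Newton-type iterations instead, which do not suffer this restriction.)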
Moral: we cannot use fixed-point iteration to solve the implicit equations for stiff problems.
Modern stiff solvers
Variable-step variable-order codes based on BDF or similar methods
Example
Mathematica's NDSolve switches between Adams methods and BDF methods when it thinks the DE has become stiff.
Example
MATLAB's ode15s uses similar methods (NDFs, with an option to use BDFs)
All of MATLAB's IVP solvers have the same calling sequence
⟹ just replace ode23 by ode113 or ode15s to use a different solver!
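For example (the right-hand side here is just illustrative; the calling sequence is identical in each case):

f = @(t,y) -y + sin(t);
[t, y] = ode23 (f, [0 10], 1);   % low-order RK pair (nonstiff)
[t, y] = ode113(f, [0 10], 1);   % variable-order Adams (nonstiff)
[t, y] = ode15s(f, [0 10], 1);   % variable-order NDF/BDF (stiff)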
Is that all?
NO
Implicit RK solvers (Butcher, 1964):
are all convergent
can be A-stable and high-order
can be L-stable and high-order
can be proved stable for nonlinear nonautonomous problems
BUT they take a lot more work! 🙁
Example
MATLAB's ode23tb (TR-BDF2): 2nd order, A-stable AND L-stable
End of Week 11