NUMERICAL OPTIMISATION
ASSIGNMENT 5
MARTA BETCKE
KIKO RUL·LAN
EXERCISE 1
(a) Implement the BFGS method by modifying the descentLineSearch function. More help is provided inside Cody Coursework.
Submit your solution via Cody Coursework. [20pt]
(b) Make your implementation efficient as explained in the lecture, i.e. avoid explicitly forming the inverse Hessian matrix $H_k$. Copy the code lines implementing the update of $H_k$ into your report and briefly explain what makes the implementation efficient.
Submit your solution via TurnitIn. [20pt]
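For orientation, a minimal MATLAB-style sketch of one common efficient form of the rank-two update of $H_k$ is given below. It updates $H_k$ directly, without any matrix inversion, using only matrix-vector and outer products; the variable names s, y, H are illustrative assumptions and do not come from the course template.

    % Sketch of an efficient BFGS update of the inverse Hessian approximation H_k.
    % Assumed variables (illustrative, not from the course template):
    %   H : current approximation H_k of the inverse Hessian
    %   s : step             s_k = x_{k+1} - x_k
    %   y : gradient change  y_k = grad f(x_{k+1}) - grad f(x_k)
    rho = 1 / (y' * s);              % curvature scalar; requires y'*s > 0
    Hy  = H * y;                     % one matrix-vector product, O(n^2)
    H   = H - rho * (s * Hy' + Hy * s') ...
            + (rho^2 * (y' * Hy) + rho) * (s * s');   % rank-two correction, O(n^2)

Expanding $(I - \rho_k s_k y_k^T) H_k (I - \rho_k y_k s_k^T) + \rho_k s_k s_k^T$ into the form above keeps the per-iteration cost at $O(n^2)$; forming the factors $(I - \rho_k s_k y_k^T)$ explicitly and multiplying them out would cost $O(n^3)$.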
EXERCISE 2
Implement the SR-1 method by modifying the trustRegion function. More help is provided inside Cody Coursework. Note: here you are not expected to provide an efficient implementation, as it would require some changes to solverCM2dSubspaceExt which are out of scope at this point.
Submit your solution via Cody Coursework. [20pt]
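For reference, a minimal MATLAB-style sketch of the SR1 update of $B_k$, including the usual safeguard that skips the update when the denominator is close to zero, is shown below; the variable names B, s, y and the tolerance r are illustrative assumptions rather than part of the course template.

    % Sketch of the SR1 update of the Hessian approximation B_k.
    % Assumed variables (illustrative, not from the course template):
    %   B : current Hessian approximation B_k
    %   s : trust-region step s_k
    %   y : gradient change   y_k = grad f(x_k + s_k) - grad f(x_k)
    r = 1e-8;                               % skipping tolerance (assumed value)
    v = y - B * s;                          % residual of the secant condition
    if abs(s' * v) >= r * norm(s) * norm(v)
        B = B + (v * v') / (v' * s);        % symmetric rank-one correction
    end                                     % otherwise keep B_{k+1} = B_k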
EXERCISE 3
(a) Minimise the function
$$f(x, y) = (x - 3y)^2 + x^4$$
using BFGS (Ex 1a) and SR1 (Ex 2) methods starting from $x_0 = (10, 10)^T$. Compare the performance of the methods. To this end provide any parameters and plots that you consider relevant.
Submit your solution via TurnitIn. [20pt]
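As a starting point, the objective together with its gradient and Hessian could be coded along the following lines in MATLAB; the structure name F and the field names f, df, d2f are illustrative assumptions, since the exact interface expected by descentLineSearch and trustRegion is fixed by the course template.

    % f(x,y) = (x - 3y)^2 + x^4 with gradient and Hessian
    % (structure and field names are assumed, not prescribed by the template).
    F.f   = @(x) (x(1) - 3*x(2))^2 + x(1)^4;
    F.df  = @(x) [2*(x(1) - 3*x(2)) + 4*x(1)^3; -6*(x(1) - 3*x(2))];
    F.d2f = @(x) [2 + 12*x(1)^2, -6; -6, 18];
    x0 = [10; 10];                          % starting point (10, 10)^T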
(b) Both implementations return a sequence of matrices as a field of the info structure:
(i) $\{H_k^{\mathrm{BFGS}}\}_{k \geq 0}$ when using BFGS,
(ii) $\{B_k^{\mathrm{SR1}}\}_{k \geq 0}$ when using SR1.
Plot the error of these sequences obtained in Ex 3a with respect to the matrices they approximate. In particular, plot
(i) $\{\| I - H_k^{\mathrm{BFGS}} \nabla^2 f(x_k) \|_2\}_{k \geq 0}$,
(ii) $\{\| B_k^{\mathrm{SR1}} - \nabla^2 f(x_k) \|_2\}_{k \geq 0}$,
and explain your results.
Submit your solution via TurnitIn. [20pt]
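One possible way to compute the requested error sequences in MATLAB is sketched below, reusing the F structure from the sketch in Ex 3a; the field names infoBFGS.H (cell array of the $H_k$ matrices) and infoBFGS.xs (iterates stored column-wise) are hypothetical and should be replaced by whatever your implementation actually returns.

    % Sketch: error of the BFGS inverse Hessian approximations (assumed field names).
    nIter = numel(infoBFGS.H);              % infoBFGS.H: assumed cell array {H_k}
    errH  = zeros(1, nIter);
    for k = 1:nIter
        xk      = infoBFGS.xs(:, k);        % assumed column-wise iterate storage
        errH(k) = norm(eye(2) - infoBFGS.H{k} * F.d2f(xk), 2);
    end
    semilogy(0:nIter-1, errH);              % the SR1 case is analogous with
                                            % norm(infoSR1.B{k} - F.d2f(xk), 2)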
Remark. The submission to TurnitIn should not exceed 4 pages. Avoid submitting code unless
explicitly asked for and focus on explaining your results.