MIE 1621 Computational Project Part 1
Due March 10, 2020 by 5 PM. Bring your report to MC 320 and slide it under the door if I am not in. E-mail a soft copy of your report and code (and script) to Paz at pazinski.hong@mail.utoronto.ca.
Write a program in MATLAB or PYTHON for minimizing a multivariate function f(x) using a gradient-based method with backtracking. You must code your gradient method from scratch and not use any existing function for gradient methods. You need to write a brief report that summarizes your results as required below. Also, your report must include a printout of your code (use good programming practice, such as commenting your code). Finally, send a soft copy of your code to the TA along with a script so that the TA can easily execute your code and reproduce the results in your report.
(a) Use backtracking as described in class to compute step lengths (so you need to set the parameters of the backtracking rule: the initial trial step $s$, the step-reduction factor, and the sufficient-decrease constant).
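For concreteness, here is a minimal Python (NumPy) sketch of an Armijo-type backtracking routine of the kind item (a) asks for; the parameter names (s, beta, sigma) and their default values are illustrative placeholders, not values prescribed by the assignment.

import numpy as np

def backtracking(f, grad, x, d, s=1.0, beta=0.5, sigma=1e-4):
    """Armijo backtracking on NumPy arrays: shrink the trial step t until
    f(x + t*d) <= f(x) + sigma * t * grad(x)^T d holds."""
    t = s
    fx = f(x)
    slope = float(np.dot(grad(x), d))   # directional derivative; negative for a descent direction
    while f(x + t * d) > fx + sigma * t * slope:
        t *= beta                       # reduce the step and test the condition again
    return t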
(b) Use as a stopping condition $\|\nabla f(x)\| / (1 + |f(x)|) \le \epsilon$ with $\epsilon = 10^{-5}$, or stop if the number of iterations hits 1000.
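One possible way to code the stopping test in (b), again only as a sketch (the helper name `converged` is an illustrative choice):

import numpy as np

EPS = 1e-5        # tolerance epsilon from item (b)
MAX_ITERS = 1000  # iteration cap from item (b)

def converged(f, grad, x, eps=EPS):
    """Relative-gradient test: ||grad f(x)|| / (1 + |f(x)|) <= eps."""
    return np.linalg.norm(grad(x)) / (1.0 + abs(f(x))) <= eps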
(c) Print the initial point, and for each iteration print the search direction, the step length, and the new iterate $x^{(k+1)}$. If the number of iterations is more than 15, then print out the details of just the first 10 iterations as well as the details of the last 5 iterations before the stopping condition is met. Indicate if the iteration maximum is reached.
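One way the main loop and the printing rules in (c) might be organized, reusing the `backtracking` and `converged` sketches above; buffering only the most recent five iteration records is just one convenient implementation choice.

import numpy as np

def gradient_descent(f, grad, x0, max_iters=MAX_ITERS):
    x = np.asarray(x0, dtype=float)
    print("initial point:", x)
    tail = []                                   # records of the most recent 5 iterations
    for k in range(max_iters):
        if converged(f, grad, x):
            break                               # stopping condition from (b) met
        d = -grad(x)                            # steepest-descent search direction
        t = backtracking(f, grad, x, d)         # step length from (a)
        x = x + t * d
        if k < 10:                              # always show the first 10 iterations
            print(f"iter {k}: direction={d}, step={t:.4g}, x={x}")
        else:
            tail = (tail + [(k, d, t, x.copy())])[-5:]
    else:
        print("iteration maximum reached")      # loop ran to max_iters without breaking
    for k, d, t, xk in tail:                    # last 5 iterations before stopping
        print(f"iter {k}: direction={d}, step={t:.4g}, x={xk}")
    return x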
(d) Test your algorithm on the following test problems:
$f_1(x) = x_1^2 + x_2^2 + x_3^2$ with $x^{(0)} = (1, 1, 1)^T$
$f_2(x) = x_1^2 + 2x_2^2 - 2x_1x_2 - 2x_2$ with $x^{(0)} = (0, 0)^T$
$f_3(x) = 100(x_2 - x_1^2)^2 + (1 - x_1)^2$ with $x^{(0)} = (-1.2, 1)^T$
$f_4(x) = (x_1 + x_2)^4 + x_2^2$ with $x^{(0)} = (2, 2)^T$
$f_5(x) = (x_1 - 1)^2 + (x_2 - 1)^2 + c(x_1^2 + x_2 - 0.25)^2$ with $x^{(0)} = (1, 1)^T$
For $f_5(x)$, test the following three different settings of the parameter $c$: $c = 1$, $c = 10$, and $c = 100$. Comment on how a larger $c$ affects the performance of the algorithm.
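As an illustration of how a test problem and its analytic gradient might be supplied to the routines sketched above, here are $f_3$ and $f_5$ (with its penalty weight c passed in through a small factory function); the function names and the factory pattern are illustrative choices, and the formulas simply transcribe the definitions in (d).

import numpy as np

def f3(x):
    """Rosenbrock function: 100*(x2 - x1^2)^2 + (1 - x1)^2."""
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def grad_f3(x):
    """Analytic gradient of f3."""
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

def make_f5(c):
    """Return (f5, grad_f5) for a given penalty weight c."""
    def f5(x):
        p = x[0] ** 2 + x[1] - 0.25
        return (x[0] - 1.0) ** 2 + (x[1] - 1.0) ** 2 + c * p ** 2
    def grad_f5(x):
        p = x[0] ** 2 + x[1] - 0.25
        return np.array([
            2.0 * (x[0] - 1.0) + 4.0 * c * x[0] * p,
            2.0 * (x[1] - 1.0) + 2.0 * c * p,
        ])
    return f5, grad_f5

# Example run: x_star = gradient_descent(f3, grad_f3, [-1.2, 1.0])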
(e) Are your computational results consistent with the theory of gradient-based methods?