ECS130 Homework Assignment #6 Due: 11:59pm, March 5, 2022
1. Newton’s method can be used to compute reciprocals without division. To compute 1/a for a ∈ R, a ≠ 0, let f(x) = 1/x − a, so that f(x) = 0 when x = 1/a.
(a) Write down the Newton iteration
(b) Compute (by hand or with a calculator) the first four Newton iterations for approximating 1/3, starting with x0 = 0.5 (without using any division).
(c) Take xk to be the estimate of 1/a from the k-th iteration, and define ek = a·xk − 1. Show that e_{k+1} = −e_k^2.
(d) Is this method always convergent, regardless of the initial guess x0 for 1/a?
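As a quick numerical check of parts (a)–(c) (not a substitute for the hand computation), the division-free Newton update x_{k+1} = x_k·(2 − a·x_k) can be sketched in Python; the function name is my own:

```python
def reciprocal_newton(a, x0, n):
    """First n Newton iterates for 1/a: x_{k+1} = x_k * (2 - a*x_k)."""
    xs = [x0]
    for _ in range(n):
        x = xs[-1]
        xs.append(x * (2.0 - a * x))  # no division anywhere
    return xs

# Approximating 1/3 from x0 = 0.5:
print(reciprocal_newton(3.0, 0.5, 4))
# -> [0.5, 0.25, 0.3125, 0.33203125, 0.3333282470703125]
```

Note how the number of correct digits roughly doubles per step, consistent with the quadratic error relation e_{k+1} = −e_k^2 in part (c).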
2. Consider the following cubic polynomial
p(x) = 816x^3 − 3835x^2 + 6000x − 3125. It has three closely spaced roots: 25/15, 25/16, 25/17.
(a) Plot p(x) for 1.43 ≤ x ≤ 1.71. Show the location of the three roots.
(b) Starting with the interval [1,2], show the computed root and the number of iterations
of the bisection method.
(c) Starting with x0 = 1.5, show the computed root and the number of iterations of Newton’s method.
(d) Starting with x0 = 1 and x1 = 2, show the computed root and the number of iterations of the secant method.
Remark: use the machine precision eps = 2.2 × 10^−16 as the stopping criterion for (b)–(d).
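A sketch of how part (b) might be set up (the helper names and the stopping logic are my own; the same p(x) can be reused for parts (c) and (d)):

```python
def p(x):
    # Horner evaluation of 816x^3 - 3835x^2 + 6000x - 3125
    return ((816.0 * x - 3835.0) * x + 6000.0) * x - 3125.0

def bisection(f, a, b, tol=2.2e-16):
    """Bisect [a, b] until the interval width reaches tol (or the
    midpoint is no longer representable strictly between a and b)."""
    fa = f(a)
    k = 0
    while b - a > tol:
        m = a + (b - a) / 2.0
        if m <= a or m >= b:  # hit floating-point resolution
            break
        if fa * f(m) <= 0:
            b = m
        else:
            a, fa = m, f(m)
        k += 1
    return m, k

root, iters = bisection(p, 1.0, 2.0)
print(root, iters)  # converges to the root nearest 25/17 ≈ 1.4706
```

Since p(1) < 0 and p(1.5) > 0, the first bisection step discards [1.5, 2], so the method homes in on the smallest root, 25/17. Near such closely spaced roots the computed sign of p(x) is unreliable within a tiny band around each root, which is why the loop also stops when the midpoint is no longer representable.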
3. Investigate the convergence of the secant method on the function
f(x) = sign(x − 2)·sqrt(|x − 2|)
with the initial guesses x0 = 0 and x1 = 1.
Remark: we have shown that Newton’s method fails for this function; see “zeroseg3.m” available on the course website.
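A minimal sketch for problem 3 (the iteration cap, the guard against a flat secant, and the names are mine). For this f, Newton’s update simplifies to x_{k+1} = x_k − 2(x_k − 2) = 4 − x_k, which oscillates forever, so it is worth watching whether the secant iterates behave differently:

```python
import math

def f(x):
    # sign(x - 2) * sqrt(|x - 2|)
    return math.copysign(math.sqrt(abs(x - 2.0)), x - 2.0)

def secant(f, x0, x1, max_iter=40):
    iterates = [x0, x1]
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        if f1 == f0:          # flat secant: cannot continue
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        iterates.append(x2)
        x0, f0, x1, f1 = x1, f1, x2, f(x2)
    return iterates

xs = secant(f, 0.0, 1.0)
print(xs[-1])  # the iterates creep toward the root x = 2
```

Printing the full list of iterates shows a non-monotone but steadily shrinking error, in contrast to Newton’s permanent oscillation.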
4. Consider the gradient descent algorithm for solving the following optimization problem with a quadratic objective
min over x ∈ Rn of f(x) = xT Ax + 2bT x + c,
where A ∈ Rn×n is symmetric, b ∈ Rn, and c ∈ R. Show that the stepsize τk determined by the exact line search
min over τ of f(xk − τ·∇f(xk))
is given by
τk = ∥∇f(xk)∥^2 / (2·∇f(xk)T A ∇f(xk)).
5. Consider the quadratic function
f(x1, x2) = (1/2)x1^2 + (9/2)x2^2.
It is easy to see that the minimizer is x∗ = (0, 0).
(a) Investigate the gradient descent method with constant stepsizes τk = 0.1, 0.2, 0.3 for finding the minimizer, with the starting point (5, 5) and the stopping criterion ∥∇f(xk)∥ ≤ ε = 10^−5.
(b) Apply the gradient descent method with line search for finding the minimizer, with the starting point (5, 5) and the stopping criterion ∥∇f(xk)∥ ≤ ε = 10^−5.
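A sketch of parts (a) and (b), assuming the objective in problem 5 reads f(x1, x2) = (1/2)x1^2 + (9/2)x2^2 (the extracted text is ambiguous); all names are mine. For this f, ∇f(x) = (x1, 9x2), and the exact line-search stepsize from problem 4 simplifies to τk = ∥g∥^2 / (g1^2 + 9·g2^2):

```python
import numpy as np

grad = lambda x: np.array([x[0], 9.0 * x[1]])  # gradient of (1/2)x1^2 + (9/2)x2^2

def gradient_descent(step, x0=(5.0, 5.0), eps=1e-5, max_iter=20000):
    """step(x, g) -> stepsize; returns (x, iterations, converged)."""
    x = np.array(x0)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= eps:
            return x, k, True
        if not np.all(np.isfinite(x)):
            return x, k, False        # diverged (overflow)
        x = x - step(x, g) * g
    return x, max_iter, False

# (a) constant stepsizes: tau = 0.3 exceeds 2/9 (two over the largest
# Hessian eigenvalue) and diverges
for tau in (0.1, 0.2, 0.3):
    print(tau, gradient_descent(lambda x, g: tau)[1:])

# (b) exact line search: tau_k = ||g||^2 / (g1^2 + 9*g2^2)
ls = lambda x, g: (g @ g) / (g[0]**2 + 9.0 * g[1]**2)
print("line search:", gradient_descent(ls)[1:])
```

The constant-stepsize runs illustrate the stability threshold τ < 2/λmax for a quadratic, while the line-search run adapts τk at every step.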
6. Consider the function
f(x1, x2) = x1^4 + x1·x2 + (1 + x2)^2.
Apply Newton’s method for minimizing f(x1, x2) with the starting point (3, −3) and the stopping criterion ∥∇f(xk)∥ ≤ ε = 10^−8.
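Newton’s method for this minimization solves H(xk)·dk = ∇f(xk) and sets x_{k+1} = xk − dk, where ∇f = (4x1^3 + x2, x1 + 2(1 + x2)) and the Hessian is H = [[12x1^2, 1], [1, 2]]. A sketch (the loop structure and names are mine):

```python
import numpy as np

def grad(x):
    # gradient of x1^4 + x1*x2 + (1 + x2)^2
    return np.array([4.0 * x[0]**3 + x[1], x[0] + 2.0 * (1.0 + x[1])])

def hess(x):
    return np.array([[12.0 * x[0]**2, 1.0], [1.0, 2.0]])

def newton_min(x0, eps=1e-8, max_iter=50):
    x = np.array(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= eps:
            return x, k
        x = x - np.linalg.solve(hess(x), g)  # Newton step
    return x, max_iter

x, k = newton_min((3.0, -3.0))
print(x, k)  # stationary point near (0.696, -1.348)
```

Setting the gradient to zero gives x2 = −4x1^3 and 8x1^3 − x1 − 2 = 0, whose unique real root is x1 ≈ 0.696, so the iteration has a single stationary point to find.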