
ENGG7302 Tutorial Optimisation

Exercises:
OP1.1
Revise important definitions, and practise analytical optimisation. Heath, Chapter 6, Exercises 6.1-6.7.
If you are having trouble with the calculus, you may want to do some revision questions from an engineering calculus textbook (e.g. Kreyszig, Stewart).
Existence of Minimum
If f is continuous on a closed and bounded set S ⊆ Rⁿ, then f has a global minimum on S.
If S is not closed or is unbounded, then f may have no local or global minimum on S.
A continuous function f on an unbounded set S ⊆ Rⁿ is coercive if
 lim f(x) = +∞ as ‖x‖ → ∞
i.e., f(x) must be large whenever ‖x‖ is large.
If f is coercive on a closed, unbounded set S ⊆ Rⁿ, then f has a global minimum on S.
[Figure: examples of coercive and non-coercive functions on unbounded sets]
(Michael T. Heath, Scientific Computing, slide 8/74)

OP1.2 (See also, Question 5, Second Semester Examinations, November 2007)
Consider the function f(x) = 5x₁² + x₂³ + 3x₁x₂ + 5.
(a) Calculate the gradient of this function, ∇f(x).
(b) f(x) has two critical points at x₁ = [0 0] and x₂ = [−9/50, 3/5]. Calculate the Hessian matrix, Hf, and evaluate it at each of the critical points.
(c) It can be shown that Hf(x₂) is positive definite, while Hf(x₁) is not positive definite. What type of critical point is at x₁ and at x₂?

OP1.3 (See also, Question 5, Second Semester Examinations, November 2007)

Uniqueness of Minimum
A set S ⊆ Rⁿ is convex if it contains the line segment between any two of its points.
A function f : S ⊆ Rⁿ → R is convex on a convex set S if its graph along any line segment in S lies on or below the chord connecting the function values at the endpoints of the segment.
Any local minimum of a convex function f on a convex set S ⊆ Rⁿ is a global minimum of f on S.
Any local minimum of a strictly convex function f on a convex set S ⊆ Rⁿ is the unique global minimum of f on S.
[Figure: examples of convex, strictly convex, and nonconvex functions]
(Michael T. Heath, Scientific Computing, slide 10/74)

For (a): f′(x) = 2x
 f′(0) = 0 → x = 0 is a critical point
 f′′(x) = 2 > 0
 So x = 0 is a minimum.
For (b): f′(x) = 3x²
 f′(0) = 0 → x = 0 is a critical point
 f′′(x) = 6x: f′′(0⁺) > 0, f′′(0⁻) < 0
 So x = 0 is a saddle point.
For (c): f′(x) = 4x³
 f′(0) = 0 → x = 0 is a critical point
 f′′(x) = 12x²: f′′(0⁺) > 0, f′′(0⁻) > 0
 So x = 0 is a minimum.

For (a), set f′(x) = 0:
 f′(x) = 3x² + 12x − 15, f′′(x) = 6x + 12
 3x² + 12x − 15 = 0 → (x + 5)(x − 1) = 0 → x = −5 or x = 1
When x = −5: f′′(−5) = −18 < 0 → x = −5 is a maximum
When x = 1: f′′(1) = 18 > 0 → x = 1 is a minimum

OP1.2 (See also, Question 5, Second Semester Examinations, November 2007)
Consider the function f(x) = 5x₁² + x₂³ + 3x₁x₂ + 5.
(a) Calculate the gradient of this function, ∇f(x).
(b) f(x) has two critical points at x₁ = [0 0] and x₂ = [−9/50, 3/5]. Calculate the Hessian matrix, Hf, and evaluate it at each of the critical points.

(a) ∇f(x) = [df/dx₁; df/dx₂] = [10x₁ + 3x₂; 3x₂² + 3x₁]

(b) Hf(x₁, x₂) = [d²f/dx₁² d²f/dx₁dx₂; d²f/dx₂dx₁ d²f/dx₂²] = [10 3; 3 6x₂]

For x₁ = [0 0]: Hf(0, 0) = [10 3; 3 0]
For x₂ = [−9/50, 3/5]: Hf(−9/50, 3/5) = [10 3; 3 18/5]
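The definiteness of the two evaluated Hessians can be checked with a short script. This is a sketch, not part of the original solutions (function names are illustrative); it classifies a symmetric 2×2 Hessian from its leading entry and determinant:

```python
def hessian(x2):
    # Hessian from OP1.2: [10 3; 3 6*x2]
    return [[10.0, 3.0], [3.0, 6.0 * x2]]

def classify_2x2(H):
    """Classify a symmetric 2x2 Hessian via its leading principal minors."""
    A = H[0][0]
    B = H[1][1]
    C = H[0][1]
    D = A * B - C * C  # determinant
    if D > 0 and A > 0:
        return "positive definite (local minimum)"
    if D > 0 and A < 0:
        return "negative definite (local maximum)"
    if D < 0:
        return "indefinite (saddle point)"
    return "inconclusive"

at_x1 = classify_2x2(hessian(0.0))        # critical point x1 = [0, 0]
at_x2 = classify_2x2(hessian(3.0 / 5.0))  # critical point x2 = [-9/50, 3/5]
```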

Stationary Values for the 2D case
Step 1: Calculate the second derivatives A = d²z/dx², B = d²z/dy², C = d²z/dxdy.
Step 2: Check the value of D = A·B − C²:
 If D > 0 and A > 0, B > 0, then the stationary point is a true minimum;
 If D > 0 and A < 0, B < 0, then the stationary point is a true maximum;
 If D < 0, then the stationary point is a saddle point;
 If D = 0, the test is inconclusive.

OP1.3 (See also, Question 5, Second Semester Examinations, November 2007)
Consider the constrained optimisation problem:
 Minimize f(x) = 3x₁² + x₂²
 Subject to h₁(x) = x₂ − x₁ + 2 = 0
(a) Use the Lagrangian function to find the solution to this constrained optimisation problem.

 ∇f(x) = [df/dx₁; df/dx₂] = [6x₁; 2x₂]
 Hf(x₁, x₂) = [d²f/dx₁² d²f/dx₁dx₂; d²f/dx₂dx₁ d²f/dx₂²] = [6 0; 0 2]
 Jacobian matrix: Jh₁(x) = [dh₁/dx₁ dh₁/dx₂] = [−1 1]
Lagrangian function:
 ∇L(x, λ) = [∇f(x) + λ Jh₁ᵀ(x); h₁(x)] = [6x₁ − λ; 2x₂ + λ; x₂ − x₁ + 2] = 0
 { 6x₁ − λ = 0, 2x₂ + λ = 0, x₂ − x₁ + 2 = 0 } → x₁ = 0.5, x₂ = −1.5, λ = 3
Second-order check:
 B(x, λ) = Hf(x) + Σᵢ₌₁ᵐ λᵢ Hgᵢ(x) = Hf(x) + λ Hh₁(x) = [6 0; 0 2] + [0 0; 0 0] = [6 0; 0 2]
Find a vector z that belongs to the null space of Jh₁(x) = [−1 1]:
 Jh₁(x) z = 0 → −z₁ + z₂ = 0, so null(Jh₁(x)) = span{[1 1]ᵀ}; let z = [1 1]ᵀ.
 zᵀ·B·z = [1 1]·[6 0; 0 2]·[1; 1] = [6 2]·[1; 1] = 8 > 0
So B is positive definite on the null space of Jh₁(x).
The critical point [0.5, −1.5] is a minimum, and f_min = f(0.5, −1.5) = 3.
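The three KKT equations for OP1.3(a) are linear, so the solution and the projected second-order check can be verified in a few lines. This is a sketch, not part of the original solutions; function names are illustrative:

```python
def solve_op13a():
    """Solve the linear system: 6*x1 - lam = 0, 2*x2 + lam = 0, x2 - x1 + 2 = 0."""
    # Substitute lam = 6*x1 and x2 = x1 - 2 into 2*x2 + lam = 0:
    # 2*(x1 - 2) + 6*x1 = 0  ->  8*x1 = 4
    x1 = 4.0 / 8.0
    x2 = x1 - 2.0
    lam = 6.0 * x1
    return x1, x2, lam

x1, x2, lam = solve_op13a()
f_min = 3 * x1**2 + x2**2

# Second-order check: z^T B z for z = [1, 1] spanning null(Jh1) = null([-1, 1]).
B = [[6.0, 0.0], [0.0, 2.0]]
z = [1.0, 1.0]
zBz = sum(z[i] * B[i][j] * z[j] for i in range(2) for j in range(2))
```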

Constrained Optimality, continued
The Lagrangian function L : Rⁿ⁺ᵐ → R is defined by
 L(x, λ) = f(x) + λᵀ g(x)
Its gradient is given by
 ∇L(x, λ) = [∇f(x) + Jgᵀ(x) λ; g(x)]
Its Hessian is given by
 HL(x, λ) = [B(x, λ) Jgᵀ(x); Jg(x) O]
where
 B(x, λ) = Hf(x) + Σᵢ₌₁ᵐ λᵢ Hgᵢ(x)
(Michael T. Heath, Scientific Computing, slide 14/74)

Constrained Optimality, continued
Together, the necessary condition and feasibility imply a critical point of the Lagrangian function,
 ∇L(x, λ) = [∇f(x) + Jgᵀ(x) λ; g(x)] = 0
The Hessian of the Lagrangian is symmetric, but not positive definite, so a critical point of L is a saddle point rather than a minimum or maximum.
A critical point (x*, λ*) of L is a constrained minimum of f if B(x*, λ*) is positive definite on the null space of Jg(x*).
If the columns of Z form a basis for the null space, then test the projected Hessian ZᵀBZ for positive definiteness.
(Michael T. Heath, Scientific Computing, slide 15/74)
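The saddle-point property of the Lagrangian Hessian can be illustrated on OP1.3(a) by assembling HL = [B Jgᵀ; Jg O] and inspecting its leading principal minors. This sketch is not from the original notes; it relies on the fact that a symmetric matrix with a positive leading minor and a negative determinant must be indefinite:

```python
def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# HL = [[B, Jg^T], [Jg, 0]] for OP1.3(a): B = [6 0; 0 2], Jg = [-1 1]
HL = [[6.0, 0.0, -1.0],
      [0.0, 2.0,  1.0],
      [-1.0, 1.0, 0.0]]

minor1 = HL[0][0]                              # leading 1x1 minor
minor2 = HL[0][0] * HL[1][1] - HL[0][1] * HL[1][0]  # leading 2x2 minor
minor3 = det3(HL)                              # full determinant
# minor1 > 0 but minor3 < 0, so HL has eigenvalues of both signs:
# the critical point of L is a saddle point of L, as the slide states.
indefinite = minor1 > 0 and minor3 < 0
```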

Consider the constrained optimisation problem:
 Minimize f(x) = 3x₁² + x₂²
 Subject to h₁(x) = x₂ − x₁ + 2 ≤ 0
      h₂(x) = x₂ + x₁ + 2 ≤ 0
(b) Write down the Lagrangian function for this problem.
(c) Draw a diagram showing the constraints and contours of the objective function in the (x₁, x₂) solution space. Indicate by shading the feasible region of the space.

(b) First convert to standard form:
 Maximize f(x) = −3x₁² − x₂²
 Subject to h₁(x) ≤ 0 and h₂(x) ≤ 0
Lagrangian function (in the maximisation form the multiplier terms are subtracted, so that μ₁, μ₂ ≥ 0):
 L(x, μ₁, μ₂) = −3x₁² − x₂² − μ₁(x₂ − x₁ + 2) − μ₂(x₂ + x₁ + 2)
Setting the partial derivatives of L to zero (both constraints are active at the solution):
 ∂L/∂x₁ = −6x₁ + μ₁ − μ₂ = 0
 ∂L/∂x₂ = −2x₂ − μ₁ − μ₂ = 0
 x₂ − x₁ + 2 = 0
 x₂ + x₁ + 2 = 0
 → x₁ = 0, x₂ = −2, μ₁ = 2, μ₂ = 2
So the optimal solution is (0, −2).
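The candidate point and multipliers can be verified against the KKT conditions. This is a sketch, not part of the original solutions (function names are illustrative); it assumes the maximisation form with the multiplier terms subtracted, L = −3x₁² − x₂² − μ₁h₁ − μ₂h₂ with μ₁, μ₂ ≥ 0:

```python
def check_kkt(x, mu):
    """Verify KKT conditions for: maximize -3*x1^2 - x2^2
       subject to h1 = x2 - x1 + 2 <= 0 and h2 = x2 + x1 + 2 <= 0."""
    x1, x2 = x
    mu1, mu2 = mu
    h1 = x2 - x1 + 2
    h2 = x2 + x1 + 2
    # Stationarity of L = -3*x1^2 - x2^2 - mu1*h1 - mu2*h2 in x1 and x2.
    stationarity = (abs(-6 * x1 + mu1 - mu2) < 1e-12 and
                    abs(-2 * x2 - mu1 - mu2) < 1e-12)
    feasible = h1 <= 1e-12 and h2 <= 1e-12          # primal feasibility
    dual_feasible = mu1 >= 0 and mu2 >= 0           # dual feasibility
    complementary = (abs(mu1 * h1) < 1e-12 and      # complementary slackness
                     abs(mu2 * h2) < 1e-12)
    return stationarity and feasible and dual_feasible and complementary

ok = check_kkt((0.0, -2.0), (2.0, 2.0))
```

Both constraints are active at (0, −2), so complementary slackness holds with strictly positive multipliers.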

(c) The constraint boundaries are the lines L1: x₂ − x₁ + 2 = 0 and L2: x₂ + x₁ + 2 = 0.
[Figure: contours of f(x) = 3x₁² + x₂² in the (x₁, x₂) plane. L1 passes through (2, 0) and (0, −2); L2 passes through (−2, 0) and (0, −2). The feasible region, where h₁(x) ≤ 0 and h₂(x) ≤ 0, is the shaded wedge below both lines, which meet at (0, −2).]
The optimal solution is (0, −2).