ECEGY 6233
System Optimization Methods
http://engineering.nyu.edu/people/zhong-ping-jiang
Z. P. Jiang
Department of Electrical and Computer Engineering NYU Tandon School of Engineering
Rm 1001/ 370 Jay St. Tel: (646) 997 3646; Email: zjiang@nyu.edu
© No reproduction permitted without written permission of the author.
2/1/2021 ©2020 New York University Tandon School of Engineering
What is Optimization?
Selection of the best possible solution from within a set of allowed alternatives.
Course Objective:
Introduction to Practical Optimization Methods
* linear and nonlinear programming
* calculus of variations
* dynamic programming
* other optimization problems & methods
Features of this Course
• Balance between theory and applications
• Mix of academic examples and puzzles
• Basic homework vs. open-ended projects
• Independent reading vs. team work
• Numerical algorithms
What are the basic ingredients of optimization?
• Problem description
• Data and information collection
• Definition of design variables
• Optimization criterion
• Formulation of constraints
Motivational Examples of Optimization
Linear Equations
Linear Programming
Least Squares Data Fitting
Nonlinear Programming
Linear Equations
Goal: a first pass at data fitting.
Given three points (2, 1), (3, 6), and (5, 4), find the corresponding quadratic function
f(t) = x1 + x2 t + x3 t^2
with unknowns xi, 1 ≤ i ≤ 3.
Linear Equations
So, we arrive at the following linear equations:
x1 + 2x2 + 4x3 = 1,
x1 + 3x2 + 9x3 = 6,
x1 + 5x2 + 25x3 = 4.
Do you know how to solve linear systems like the one above? Linear algebra or matrix theory is a prerequisite for this course.
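As a sketch of how such a system can be solved numerically (assuming NumPy is available):

```python
import numpy as np

# Coefficient matrix and right-hand side of the three equations above:
# each row is (1, t, t^2) for t = 2, 3, 5.
A = np.array([[1.0, 2.0, 4.0],
              [1.0, 3.0, 9.0],
              [1.0, 5.0, 25.0]])
b = np.array([1.0, 6.0, 4.0])

x = np.linalg.solve(A, b)   # unique solution since A is nonsingular
print(x)                    # coefficients (x1, x2, x3) of f(t)
```

For these data the solution is x = (−21, 15, −2), i.e., f(t) = −21 + 15t − 2t^2, which indeed passes through all three points.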
Comment
A constrained optimization problem often involves
• linear equations
Linear Programming
Manufacturing "bookshelves (x1)", "cabinets with doors (x2)", "cabinets with drawers (x3)", and "customer-designed cabinets (x4)".
Objective: maximize the weekly revenue
Cabinet          Wood  Labor  Revenue
Bookshelf          10     2      100
With Doors         12     4      150
With Drawers       25     8      200
Custom-designed    20    12      400
Only 5000 units of wood and 1500 units of labor are available.
LP Model
Solve the following linear programming (LP) problem with constraints:
maximize   P = 100x1 + 150x2 + 200x3 + 400x4
subject to 10x1 + 12x2 + 25x3 + 20x4 ≤ 5000
           2x1 + 4x2 + 8x3 + 12x4 ≤ 1500
           x1, x2, x3, x4 ≥ 0.
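As a sketch, this LP can be solved with SciPy's linprog (assuming SciPy is installed); linprog minimizes, so we negate the revenue coefficients:

```python
import numpy as np
from scipy.optimize import linprog

# Maximize 100x1 + 150x2 + 200x3 + 400x4  <=>  minimize the negated objective.
c = [-100.0, -150.0, -200.0, -400.0]
A_ub = [[10.0, 12.0, 25.0, 20.0],   # wood usage per cabinet type
        [ 2.0,  4.0,  8.0, 12.0]]  # labor usage per cabinet type
b_ub = [5000.0, 1500.0]            # available wood and labor

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
print(res.x, -res.fun)  # optimal production plan and weekly revenue
```

With these coefficients the optimum is x = (375, 0, 0, 62.5) with revenue 62500; note that an LP relaxation may return fractional quantities.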
Comment 1
An optimization problem usually involves
Objective function
Variables or unknowns
Comment 2
Optimization problems often involve
Constraints (on the unknowns)
Least Squares Data Fitting
Assume that some, or all, measurements are subject to error and that there are more data points than unknowns.
Goal: minimize the observation errors
e_i = m_i − ( x1 + x2 t_i + x3 t_i^2 ),  1 ≤ i ≤ N,  N > 3
Formulation of the Problem
Mathematically, we want to solve
minimize over x:
Σ_{i=1}^{N} e_i^2 = Σ_{i=1}^{N} ( m_i − x1 − x2 t_i − x3 t_i^2 )^2
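A sketch with numpy.linalg.lstsq; the measurement values below are made up for illustration and are not from the lecture:

```python
import numpy as np

# Hypothetical data: N = 6 measurement times t_i and values m_i,
# built from an exact quadratic plus small "observation errors".
t = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
m = 2.0 - 1.0 * t + 0.5 * t**2
m = m + np.array([0.1, -0.05, 0.02, -0.1, 0.08, -0.03])

# Design matrix with rows (1, t_i, t_i^2); lstsq minimizes sum of e_i^2.
A = np.vstack([np.ones_like(t), t, t**2]).T
x, res_sq, rank, _ = np.linalg.lstsq(A, m, rcond=None)
print(x)  # least-squares estimates of (x1, x2, x3)
```

Because the noise is small, the fitted coefficients come out close to the generating values (2, −1, 0.5).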
Comment
Optimization problems may be
Nonlinear (without constraints)
In this case, the problem is referred to as:
Unconstrained Nonlinear Programming
Nonlinear Programming
Electrical connections:
[Figure: four wires w1, w2, w3, w4 joining four terminals to an unknown connection point "?"]
NP Model
minimize   P = w_1 + w_2 + w_3 + w_4
subject to w_i = sqrt( (x_i − x_0)^2 + (y_i − y_0)^2 ),  1 ≤ i ≤ 4,
           2 ≤ x_1 ≤ 4,   2 ≤ y_1 ≤ 4,
           5 ≤ x_2 ≤ 9,   1 ≤ y_2 ≤ 5,
           2 ≤ x_3 ≤ 4,  −3 ≤ y_3 ≤ −1,
           6 ≤ x_4 ≤ 8,  −2 ≤ y_4 ≤ 2,
where (x_0, y_0) is the unknown connection point.
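A sketch of solving such a wire-length problem numerically with scipy.optimize.minimize; the box bounds below follow one reading of the constraints above, the junction (x0, y0) is unconstrained, and the initial guess is arbitrary:

```python
import numpy as np
from scipy.optimize import minimize

def total_wire_length(z):
    # z = (x0, y0, x1, y1, ..., x4, y4): junction point plus 4 terminals
    x0, y0 = z[0], z[1]
    pts = z[2:].reshape(4, 2)
    return np.sum(np.hypot(pts[:, 0] - x0, pts[:, 1] - y0))

# Box constraints on each terminal; the junction point is free.
bounds = [(None, None), (None, None),
          (2, 4), (2, 4),
          (5, 9), (1, 5),
          (2, 4), (-3, -1),
          (6, 8), (-2, 2)]

z0 = np.array([0.0, 0.0, 3, 3, 7, 3, 3, -2, 7, 0])  # feasible initial guess
res = minimize(total_wire_length, z0, method="L-BFGS-B", bounds=bounds)
print(res.x[:2], res.fun)  # best junction location and total wire length
```

This is a constrained nonlinear program: a nonsmooth, nonconvex-looking objective over box constraints, which is exactly the setting the lecture flags next.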
Comment
Optimization problems may be
Nonlinear (with constraints)
In this case, the problem is referred to as:
Constrained Nonlinear Programming
“Blood Diamond” Game
Context: 5 pirates have stolen 100 diamonds and decide to distribute them by the following rules:
• Draw lots to assign each pirate a number from 1 to 5.
• First, pirate No. 1 proposes a distribution plan, on which the group of five votes. His proposal is approved only if it wins a majority of the votes. Otherwise, he is thrown into the sea.
“Blood Diamond” (cont’d)
• After pirate No. 1 dies, No. 2 proposes his distribution plan, on which the group of four votes. His proposal is approved only if it wins a majority of the votes. Otherwise, he too is thrown into the sea.
• Continue in this way for No. 3 and No. 4.
“Blood Diamond” (cont’d)
Hypothesis: all five pirates are equally smart and always make the right judgement.
“Blood Diamond” (cont’d)
Problem: what should pirate No. 1 propose to maximize his profit (and, of course, his chance of survival)?
o Application of Dynamic Programming
o Solution (and systematic theory!) at the end of this semester
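The backward-induction idea behind dynamic programming can be sketched in code. This is not the official solution developed later in the semester, and it bakes in explicit assumptions not stated on the slides: a strict majority is required to pass a proposal, the proposer votes for himself, a voter accepts only when offered strictly more than his expected continuation payoff, and survival outranks diamonds.

```python
def pirate_split(n=5, coins=100):
    """Backward-induction sketch of the pirate game.

    Assumptions (not from the slides): a proposal needs a STRICT
    majority; the proposer votes for himself; a voter accepts only if
    offered strictly more than his continuation payoff; being thrown
    into the sea (marked -1) is worse than receiving 0 diamonds.
    """
    DEAD = -1

    def outcome(k):
        # Payoffs of the k most junior pirates, most senior first.
        if k == 1:
            return [coins]                  # last pirate keeps everything
        cont = outcome(k - 1)               # payoffs if the proposer dies
        # Minimal bribe to win each voter: one diamond more than his
        # continuation payoff, or nothing if he would die anyway.
        cost = [0 if c == DEAD else c + 1 for c in cont]
        buy = sorted(range(k - 1), key=lambda i: cost[i])[:k // 2]
        total = sum(cost[i] for i in buy)
        if total > coins:
            return [DEAD] + cont            # cannot secure a majority
        offer = [0] * (k - 1)
        for i in buy:
            offer[i] = cost[i]
        return [coins - total] + offer

    return outcome(n)

print(pirate_split())   # pirate No. 1's proposal under these assumptions
```

Under these particular assumptions the sketch proposes (97, 0, 1, 2, 0) for pirates No. 1 through 5; different voting or tie-break rules change the answer, which is part of what the dynamic-programming lectures will make precise.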
Preliminaries
• Basic Notions of Optimality
• Convexity
• Taylor Series
• The General Optimization Algorithm
• Rates of Convergence
Basic Notions of Optimality
Consider an optimization problem
minimize   f(x),  x ∈ S        f(x): performance index
subject to g_i(x) = 0,  i ∈ I   equality constraints
           g_j(x) ≥ 0,  j ∈ J   inequality constraints.
It is worth noting that
maximize_{x ∈ S} f(x) = − minimize_{x ∈ S} ( −f(x) ).
Feasibility
• A point satisfying all the constraints is said to be feasible.
• Feasible set S: the set of all feasible points.
• Active constraint: g_j(x) = 0.
• Inactive constraint: g_j(x) > 0.
• Boundary: the set of feasible points at which at least one inequality constraint is active.
• Interior points: all feasible points other than boundary points.
An Example
Constraints:
g_1(x) = x_1 + 2x_2 + 3x_3 − 6 = 0,
g_2(x) = x_1 ≥ 0,  g_3(x) = x_2 ≥ 0,  g_4(x) = x_3 ≥ 0.
Points:
A = (0, 0, 2),  B = (3, 0, 1),  C = (1, 1, 1).
[Figure: the feasible set in (x_1, x_2, x_3) space with points A, B, C marked]
Minimizers
• Global minimizer x*: if
f(x*) ≤ f(x),  ∀ x ∈ S.
• Strict global minimizer: if, additionally,
f(x*) < f(x),  ∀ x ∈ S,  x ≠ x*.
• Local minimizer: if ∃ ε > 0 such that
f(x*) ≤ f(x),  ∀ x ∈ S with |x − x*| < ε.
• Strict local minimizer: if ∃ ε > 0 such that
f(x*) < f(x),  ∀ x ∈ S with x ≠ x*, |x − x*| < ε.
Stationary Points
• For unconstrained problems, a stationary point x of f(x) is such that
∇f(x) = 0.
• A minimizer (or a maximizer) is a stationary point, but not vice versa.
• Saddle point: a stationary point that is neither a minimizer nor a maximizer.
Convexity
• Convex set S: if x, y ∈ S, then
αx + (1 − α)y ∈ S  for all 0 ≤ α ≤ 1.
• Convex function f: if
f( αx + (1 − α)y ) ≤ α f(x) + (1 − α) f(y),
∀ x, y ∈ S and 0 ≤ α ≤ 1.
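A quick numerical sanity check of the convex-function inequality for f(x) = x^2, sampled at random points (an illustrative check, not a proof of convexity):

```python
import random

def f(x):
    return x * x  # a convex function

random.seed(0)
for _ in range(1000):
    x, y = random.uniform(-10, 10), random.uniform(-10, 10)
    a = random.random()  # alpha in [0, 1]
    lhs = f(a * x + (1 - a) * y)
    rhs = a * f(x) + (1 - a) * f(y)
    assert lhs <= rhs + 1e-9  # the convexity inequality holds
print("convexity inequality verified on 1000 random samples")
```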
Convex Programming Problem
For a convex set S and a convex function f over S, consider
min_{x ∈ S} f(x).
For example, such a set S can be defined via concave functions g_i:
S = { x ∈ R^n | g_i(x) ≥ 0 }.
Special case of convex programming: Linear Programming (LP), where both the objective and the constraints are linear.
An Important Result
A local minimizer x* of a convex programming problem is a global minimizer.
Moreover, if the cost function is strictly convex, then x* is the unique global minimizer.
Proof. By contradiction; see the book.
Taylor Series
Goal: approximate a (sufficiently smooth) function by a polynomial around a point x_0.
1 variable:
f(x_0 + p) = f(x_0) + p f'(x_0) + (1/2) p^2 f''(x_0) + … + (1/n!) p^n f^(n)(x_0) + …
n variables:
f(x_0 + p) = f(x_0) + p^T ∇f(x_0) + (1/2) p^T ∇^2 f(x_0) p + …
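A quick numerical sketch of the 1-variable case, using f(x) = e^x around x_0 = 0 (all derivatives of exp equal 1 there); the error of the truncated polynomial shrinks as the order n grows:

```python
import math

x0, p = 0.0, 0.5
true_val = math.exp(x0 + p)

# Partial sums of the Taylor series of exp around x0 = 0:
# f(x0 + p) ~ sum_{k=0}^{n} p^k / k!
approx, term = 0.0, 1.0
for n in range(6):
    approx += term                    # add the degree-n term p^n / n!
    term *= p / (n + 1)               # next term of the series
    print(n, abs(true_val - approx))  # error decreases as n grows
```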
Notation:
The first-order derivative (gradient):
∇f(x) = ( ∂f(x)/∂x_1, ∂f(x)/∂x_2, …, ∂f(x)/∂x_n )^T
The second-order derivative (Jacobian of ∇f(x), or Hessian of f):
∇^2 f(x) = [ ∂^2 f(x) / ∂x_i ∂x_j ] ∈ R^{n×n}.
Note that the Hessian matrix is symmetric; see Theorem 9.14 in: Rudin, “Principles of Mathematical Analysis”, 1976.
Example
f(x) = x_1^2 + 3 x_1 x_2 + x_2^2
Compute its 1st-order derivative ∇f(x) and its 2nd-order derivative ∇^2 f(x) at x = (1, 1).
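A sketch of the computation (assuming NumPy): the gradient and Hessian are derived by hand, and the gradient is double-checked by central finite differences.

```python
import numpy as np

def f(x):
    return x[0]**2 + 3 * x[0] * x[1] + x[1]**2

def grad_f(x):
    # hand-computed gradient: (2*x1 + 3*x2, 3*x1 + 2*x2)
    return np.array([2 * x[0] + 3 * x[1], 3 * x[0] + 2 * x[1]])

hess_f = np.array([[2.0, 3.0],   # hand-computed Hessian (constant here,
                   [3.0, 2.0]])  # and symmetric, as expected)

x = np.array([1.0, 1.0])
print(grad_f(x))   # -> [5. 5.]

# Finite-difference check of the gradient at x:
h = 1e-6
fd = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h) for e in np.eye(2)])
print(np.allclose(fd, grad_f(x)))   # -> True
```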
Necessary Condition
A necessary condition for x_0 to be a local minimizer is:
∇f(x_0) = 0.
Sufficient Conditions
Sufficient conditions for x_0 to be a strict local minimizer are:
∇f(x_0) = 0,   ∇^2 f(x_0) > 0  (positive definite).
Comments
A matrix M ∈ R^{n×n} is said to be positive definite if, ∀ x ∈ R^n, x ≠ 0, it holds that
x^T M x > 0.
The eigenvalues λ of M are all such that
det( λI − M ) = 0.
For a symmetric matrix M, M > 0 if and only if all its eigenvalues are positive.
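These facts can be checked numerically; as a sketch, take the (symmetric) Hessian of the earlier example f(x) = x_1^2 + 3 x_1 x_2 + x_2^2:

```python
import numpy as np

M = np.array([[2.0, 3.0],
              [3.0, 2.0]])  # Hessian of f(x) = x1^2 + 3*x1*x2 + x2^2

eigs = np.linalg.eigvalsh(M)  # eigenvalues of a symmetric matrix
print(eigs)                   # one negative, one positive eigenvalue

# One eigenvalue is negative, so M is NOT positive definite:
# the stationary point of f at the origin is a saddle point.
print(bool(np.all(eigs > 0)))  # -> False
```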
General Optimization Algorithm
Step 0: Specify an initial guess of the solution, x_0.
Step k (k = 0, 1, …): If x_k is optimal, STOP. Otherwise, determine an improved estimate of the solution:
x_{k+1} = x_k + α_k p_k
with step length α_k and search direction p_k.
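As a sketch, the scheme above with a constant step length α_k and the steepest-descent direction p_k = −∇f(x_k), applied to a simple convex quadratic (an illustrative function, not from the slides):

```python
import numpy as np

def grad(x):
    # gradient of f(x) = (x1 - 1)^2 + 2*(x2 + 3)^2, minimized at (1, -3)
    return np.array([2 * (x[0] - 1), 4 * (x[1] + 3)])

x = np.array([10.0, 10.0])   # Step 0: initial guess x_0
alpha = 0.1                  # fixed step length alpha_k
for k in range(200):
    p = -grad(x)             # search direction p_k (steepest descent)
    if np.linalg.norm(p) < 1e-8:
        break                # x_k is (numerically) optimal: STOP
    x = x + alpha * p        # x_{k+1} = x_k + alpha_k * p_k
print(x)                     # close to the minimizer (1, -3)
```

Practical methods replace the fixed α_k with a line search and p_k with smarter directions (Newton, quasi-Newton), which is what makes the next topic, rates of convergence, interesting.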
Rates of Convergence
• Does an algorithm converge?
• How fast does it converge?
• A sequence {x_k} converges to x* with rate r and rate constant C if
lim_{k→∞} |e_{k+1}| / |e_k|^r = C,  where e_k = x_k − x*.
• Linear convergence if r = 1:
1) 0 < C < 1: the sequence converges;
2) C > 1: the sequence diverges.
• Superlinear convergence if r = 1 and C = 0.
• Quadratic convergence if r = 2.
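These rates can be observed numerically (illustrative sequences, not from the slides): a linearly convergent sequence has |e_{k+1}|/|e_k| tending to C, while Newton's method for computing sqrt(2) shows |e_{k+1}|/|e_k|^2 approaching a constant.

```python
import math

# Linear convergence: x_{k+1} = (x_k + 1)/2 converges to x* = 1
# with e_{k+1} = e_k / 2, i.e., r = 1 and C = 0.5.
x, xs = 10.0, 1.0
e_prev = abs(x - xs)
for _ in range(10):
    x = (x + 1) / 2
    e = abs(x - xs)
    print(e / e_prev)     # ratio stays at 0.5
    e_prev = e

# Quadratic convergence: Newton's method for x^2 = 2,
# x_{k+1} = (x_k + 2/x_k)/2; here |e_{k+1}|/|e_k|^2 -> 1/(2*sqrt(2)).
x, xs = 2.0, math.sqrt(2.0)
for _ in range(4):
    e_prev = abs(x - xs)
    x = (x + 2.0 / x) / 2.0
    print(abs(x - xs) / e_prev**2)
```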
A Brief Introduction to Lagrange Multiplier Method
Lagrange multiplier theory essentially targets constrained optimization problems. For example,
max P = f_0(x),  x ∈ R^n,
subject to the constraint c = f_1(x).
To this end, introduce the augmented performance index
P_a = f_0 + λ^T ( c − f_1(x) )
with λ the vector of Lagrange multipliers.
A Brief Introduction to Lagrange Multiplier Method (cont’d)
Then, necessary conditions for optimality are
∂P_a / ∂x = 0   and   c = f_1(x).
For more details, wait for the lectures on “nonlinear programming”.
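As a small illustrative instance (not from the slides), assuming SymPy is available: maximize f_0(x) = x_1 x_2 subject to x_1 + x_2 = 10, by solving the necessary conditions symbolically.

```python
import sympy as sp

x1, x2, lam = sp.symbols("x1 x2 lam", real=True)

f0 = x1 * x2            # performance index to maximize
c, f1 = 10, x1 + x2     # constraint c = f1(x)

Pa = f0 + lam * (c - f1)   # augmented performance index

# Necessary conditions: dPa/dx = 0 together with the constraint.
eqs = [sp.diff(Pa, x1), sp.diff(Pa, x2), c - f1]
sol = sp.solve(eqs, [x1, x2, lam], dict=True)
print(sol)   # -> [{lam: 5, x1: 5, x2: 5}]
```

The stationarity equations give x_2 = λ and x_1 = λ, and the constraint then forces x_1 = x_2 = 5, the familiar result that a fixed sum is split evenly to maximize a product.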