
Model Predictive Control Prof. Manfred Morari

ESE619, Spring 2018 Due: Feb 14 in class

Exercise sheet 2 Optimization

Instructions: You are not allowed to use a calculator / computer unless specified.

Exercise 1 Properties of Sets and Functions

Let x1, x2 ∈ Rn be in the feasible set X of an optimization problem, i.e.
x1, x2 ∈ X = {x ∈ Rn | gi(x) ≤ 0, i = 1,…,m, hj(x) = 0, j = 1,…,q} .

  1. Can you find simple conditions on the functions gi(x), i = 1,…,m, and hj(x), j = 1,…,q, such that the feasible set X is convex? [2 pts]
  2. Let z = θx1 + (1 − θ)x2, θ ∈ [0, 1], be any point on the line segment between points x1 and x2. Can you find conditions on the function f(x) such that ‘z is better than the worst of x1 and x2’, i.e.

    f(z) ≤ max{f(x1), f(x2)} ?

    If yes, can it happen that point z is better than both of them? Try to sketch such a situation.

    [5 pts]

  3. Which property of the feasible set X would ensure that point y = x1 + x2 is feasible, given x1, x2 ∈ X? [3 pts]
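For intuition on part 2, here is a small numerical illustration (our own example with a hypothetical f and endpoints, not part of the exercise): for a convex function, the value along the segment never exceeds the worse endpoint, and an interior point can in fact beat both endpoints.

```python
# Our own illustration (not part of the exercise): f(x) = x^2 is convex,
# so along any segment its value never exceeds the worse endpoint.
import numpy as np

f = lambda x: x**2
x1, x2 = -1.0, 2.0                     # hypothetical endpoints (our choice)
for th in np.linspace(0.0, 1.0, 11):
    z = th * x1 + (1.0 - th) * x2
    assert f(z) <= max(f(x1), f(x2)) + 1e-12

# an interior point can even beat BOTH endpoints:
z = 0.5 * x1 + 0.5 * x2                # z = 0.5
print(f(z), min(f(x1), f(x2)))         # 0.25 is below both f(x1)=1 and f(x2)=4
```

Sketching f(x) = x² over [−1, 2] shows exactly the situation part 2 asks about: the minimum of the segment lies strictly between the two endpoints.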

Exercise 2 Checking Convexity of Sets

Which of the following sets are convex? Give reasons for your answers.

  1. A slab, i.e. the set {x ∈ Rn | α ≤ aTx ≤ β}, where a ∈ Rn and α, β ∈ R?

    Hint: Try to sketch this set for n = 2! [3 pts]

  2. Let s : Rn → R be any function with domain dom s ⊆ Rn. Let set S be any subset of dom s, though not necessarily a convex one. Is the set M defined as

    M := {x | ∥x − y∥ ≤ s(y) for all y ∈ S}

    a convex set? Note that ∥·∥ denotes any norm on Rn. [4 pts]

  3. Consider s and S as above; again ∥·∥ denotes any norm on Rn. We consider the set M̃ defined as

    M̃ := {x | ∃y ∈ S s.t. ∥x − y∥ ≤ s(y)} .

    Is M̃ a convex set, assuming that s is a convex function? [4 pts]

  4. The set of points closer to a given point x0 than to a given set Q ⊆ Rn, i.e.

    {x | ∥x − x0∥2 ≤ ∥x − y∥2 for all y ∈ Q} ,

    where ∥·∥2 denotes the standard 2-norm (also called the Euclidean norm)? [4 pts]
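As a quick sanity check for part 1 (our own illustration with an arbitrarily chosen a, α, β; random sampling is evidence, not a proof), random convex combinations of slab points stay inside the slab:

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.standard_normal(2)            # arbitrary a in R^2 (our choice)
alpha, beta = -1.0, 1.0               # arbitrary bounds (our choice)
in_slab = lambda x, tol=1e-12: alpha - tol <= a @ x <= beta + tol

# collect random points in the slab, then test convex combinations
pts = []
while len(pts) < 100:
    x = rng.uniform(-5.0, 5.0, size=2)
    if in_slab(x):
        pts.append(x)
for _ in range(1000):
    x1, x2 = pts[rng.integers(100)], pts[rng.integers(100)]
    th = rng.random()
    z = th * x1 + (1.0 - th) * x2
    # a@z = th*(a@x1) + (1-th)*(a@x2) is a convex combination of two
    # numbers in [alpha, beta], hence it stays in [alpha, beta]
    assert in_slab(z)
print("all convex combinations stayed in the slab")
```

The comment inside the loop is exactly the one-line argument the exercise is after: aᵀz is a convex combination of aᵀx1 and aᵀx2.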


Exercise 3 1-Norm, ∞-Norm

Instead of the standard 2-norm, the 1-norm or the ∞-norm is sometimes used in the MPC cost function. In this exercise you should show that both a minimization problem with a 1-norm objective and one with an ∞-norm objective

min_x ∥Ax∥p , p ∈ {1, ∞}

can be recast as a linear program (LP)

min_y bT y
s.t. F y ≤ g ,
with vectors b, g and matrix F defined appropriately. [10 pts]
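One standard route (offered here as a sketch under our own notation, not the full derivation the exercise asks for) introduces an epigraph variable t bounding the norm. For p = ∞:

```latex
\min_{x,\,t}\ t
\quad\text{s.t.}\quad a_i^{\mathsf T}x \le t,\ \ -a_i^{\mathsf T}x \le t,
\quad i = 1,\dots,N,
```

i.e. y = (x, t), b = (0, …, 0, 1)ᵀ, F = [A, −1; −A, −1] (with 1 the all-ones vector) and g = 0. For p = 1, one instead uses a vector t ∈ RN with one bound |aiᵀx| ≤ ti per row and minimizes 1ᵀt; working out b, F, g for that case is left as in the exercise.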

Note that we assume matrix A ∈ RN×n, whose N row vectors are denoted aiT, i = 1,…,N, i.e.

A = [ a1T ; … ; aNT ] (rows stacked).

Exercise 4 Linear Regression with 1- and ∞-Norms

We now use the ideas from Exercise 3 to solve a linear regression problem: use the command linprog from MATLAB to solve the LPs stemming from a reformulation of the linear regression problems

min_x ∥Ax − b∥p , p ∈ {1, ∞} ,

where A is a random matrix and b is a random vector, e.g. created in MATLAB with the commands A=rand(10,5) and b=rand(10,1). [10 pts]

Note: What we are actually doing here is solving the overdetermined system of linear equations Ax = b. The solution for the case p = 2 is called the linear least-squares solution and is given directly in MATLAB by A\b.
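The sheet asks for MATLAB's linprog; purely as an illustration, the same construction can be sketched in Python with scipy.optimize.linprog (the stacked-variable layout below is one standard epigraph reformulation, offered as a sketch rather than the required derivation):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
A = rng.random((10, 5))   # analogous to MATLAB's rand(10,5)
b = rng.random(10)        # analogous to rand(10,1)
N, n = A.shape

# inf-norm: min t  s.t.  -t*1 <= A x - b <= t*1, with variables z = [x; t]
c_inf = np.r_[np.zeros(n), 1.0]
A_ub = np.vstack([np.hstack([A, -np.ones((N, 1))]),
                  np.hstack([-A, -np.ones((N, 1))])])
b_ub = np.r_[b, -b]
res_inf = linprog(c_inf, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * (n + 1))   # free variables
x_inf = res_inf.x[:n]

# 1-norm: min sum(t)  s.t.  -t <= A x - b <= t, with variables z = [x; t1..tN]
c_1 = np.r_[np.zeros(n), np.ones(N)]
A_ub1 = np.vstack([np.hstack([A, -np.eye(N)]),
                   np.hstack([-A, -np.eye(N)])])
res_1 = linprog(c_1, A_ub=A_ub1, b_ub=np.r_[b, -b],
                bounds=[(None, None)] * (n + N))
x_1 = res_1.x[:n]

print("inf-norm residual:", np.max(np.abs(A @ x_inf - b)))
print("1-norm residual:  ", np.sum(np.abs(A @ x_1 - b)))
```

At the LP optimum the epigraph variables are tight, so the reported objective equals the corresponding norm of the residual; comparing against the least-squares solution (A\b in MATLAB, np.linalg.lstsq here) shows each LP does at least as well in its own norm.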

Optional: Try to install e.g. Yalmip (http://users.isy.liu.se/johanl/yalmip/) which helps you to model optimization problems in MATLAB in a more convenient way than before. Also, have a look at different LP solvers (under ‘Solvers’ on the latter website) that are more powerful than MATLAB’s linprog.

Exercise 5 Quadratic Program

Consider the optimization problem:

min  (1/2)(x1^2 + x2^2 + 0.1 x3^2) + 0.55 x3
s.t. x1 + x2 + x3 = 1
     x1 ≥ 0
     x2 ≥ 0
     x3 ≥ 0
1. Show that x∗ = (0.5, 0.5, 0) is a local minimum. [5 pts]
2. Is x∗ also a global minimum? Explain why, or why not. [5 pts]
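As a numerical cross-check (our own sketch using scipy's generic constrained solver, not part of the exercise and no substitute for the optimality argument asked for), a solver started from a feasible point recovers the claimed minimizer:

```python
import numpy as np
from scipy.optimize import minimize

# objective (1/2)(x1^2 + x2^2 + 0.1 x3^2) + 0.55 x3 from the exercise
def f(x):
    return 0.5 * (x[0]**2 + x[1]**2 + 0.1 * x[2]**2) + 0.55 * x[2]

res = minimize(
    f,
    x0=np.full(3, 1/3),                                  # feasible start
    constraints=[{"type": "eq", "fun": lambda x: x.sum() - 1.0}],
    bounds=[(0.0, None)] * 3,                            # x_i >= 0
)
print(np.round(res.x, 4), round(res.fun, 4))  # expect ~[0.5, 0.5, 0], value 0.25
```

Checking the KKT conditions at x∗ = (0.5, 0.5, 0) by hand (gradient (0.5, 0.5, 0.55), equality multiplier −0.5, multiplier 0.05 ≥ 0 on the active bound x3 ≥ 0) is the route part 1 is after.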
