IE 426

Optimization models and applications

Lecture 2 — August 28, 2008

Convexity, relaxations

Convex sets

Def.: A set S ⊆ R^n is convex if any two points x′ and x′′ of S are joined by a segment entirely contained in S:

∀ x′, x′′ ∈ S, α ∈ [0, 1]: αx′ + (1 − α)x′′ ∈ S

The intersection of two convex sets is convex.

Examples: Convex sets

◮ R, R^2, R^3, etc. are convex

◮ [a, b] is convex

◮ {4} is convex

Examples: Nonconvex sets


◮ {0, 1} is nonconvex


◮ {x ∈ R : x ≤ 2 ∨ x ≥ 3} is nonconvex

◮ Z is nonconvex

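A quick numerical way to probe the definition is to sample pairs of points in a candidate set and check whether their convex combinations stay inside. The sketch below is illustrative only (the interval [0, 1] standing in for [a, b], the sampling ranges, and the trial count are assumptions); random sampling can expose a violation of convexity but never proves it:

```python
# Minimal sketch: sample the convexity condition for a set given by a
# membership test. Sets and sampling ranges are illustrative assumptions.
import random

def looks_convex(is_member, sample, trials=10_000):
    """Sample pairs x', x'' in the set and alpha in [0, 1]; report a
    violation of alpha*x' + (1 - alpha)*x'' being in the set, if found."""
    for _ in range(trials):
        xp, xpp = sample(), sample()
        if not (is_member(xp) and is_member(xpp)):
            continue
        alpha = random.random()
        xm = alpha * xp + (1 - alpha) * xpp
        if not is_member(xm):
            return False, (xp, xpp, alpha)
    return True, None

# [a, b] = [0, 1]: every sampled combination stays in the set.
print(looks_convex(lambda x: 0 <= x <= 1, lambda: random.uniform(-2, 2))[0])

# {x : x <= 2 or x >= 3}: combinations of points from the two pieces
# can land in (2, 3), exposing nonconvexity.
print(looks_convex(lambda x: x <= 2 or x >= 3, lambda: random.uniform(0, 5))[0])
```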


Convex functions

Def.: A function f : R^n → R is convex if, for any two points x′, x′′ ∈ R^n and for any α ∈ [0, 1],

f(αx′ + (1 − α)x′′) ≤ αf(x′) + (1 − α)f(x′′)

◮ The sum of convex functions is a convex function

◮ Multiplying a convex function by a positive scalar gives a

convex function

◮ Linear functions a1x1 + · · · + akxk are convex, irrespective of the sign of the ai's.

Definition

(Figure: the graph of a convex function f(x); for two points x′ and x′′, the chord value αf(x′) + (1 − α)f(x′′) lies above the function value f(αx′ + (1 − α)x′′) at the point αx′ + (1 − α)x′′.)

Examples

◮ The function f(x) = x is convex

◮ The function f(x1, x2) = x1 + x2 is convex

◮ The function f(x1, x2) = x1^2 + x2 is convex

◮ The function f(x1, x2) = 5x1^2 + 3x2^2 is convex

◮ The function f(x1, x2) = x1^2 + x2^2 − x1x2 is convex

◮ The function f(x1, x2) = x1^2 + x2^2 + 5x1x2 is nonconvex

◮ The function f(x1, x2) = x1^2 − x2^2 is nonconvex

◮ The function f(x1, x2) = x1x2 is nonconvex

◮ The function f(x) = sin x, for x ∈ [0, 2π], is nonconvex

◮ The function f(x) = −x^2 is nonconvex
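The same kind of sampling test can be applied to the defining inequality for functions. This is an illustrative sketch (the box size, trial count, and tolerance are arbitrary choices); a detected violation proves nonconvexity, while the absence of one only suggests convexity:

```python
# Randomly test f(a x' + (1-a) x'') <= a f(x') + (1-a) f(x'') for some of
# the examples above. Sampling parameters are illustrative assumptions.
import random

def violates_convexity(f, dim, trials=20_000, box=5.0, tol=1e-9):
    for _ in range(trials):
        xp = [random.uniform(-box, box) for _ in range(dim)]
        xpp = [random.uniform(-box, box) for _ in range(dim)]
        a = random.random()
        xm = [a * u + (1 - a) * v for u, v in zip(xp, xpp)]
        if f(xm) > a * f(xp) + (1 - a) * f(xpp) + tol:
            return True
    return False

print(violates_convexity(lambda x: x[0]**2 + x[1]**2 - x[0]*x[1], 2))  # False: convex
print(violates_convexity(lambda x: x[0]*x[1], 2))                      # True: nonconvex
print(violates_convexity(lambda x: x[0]**2 - x[1]**2, 2))              # True: nonconvex
```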


Convex constraints

◮ A constraint g(x) ≤ b, with g : R^n → R, defines a subset S of R^n, that is,

S = {x ∈ R^n : g(x) ≤ b}

◮ the constraint g(x) ≤ b is convex if the set S is convex.

◮ if the function g(x) is convex, the constraint g(x) ≤ b is convex.

◮ linear constraints a1x1 + · · · + akxk ≤ b, = b, or ≥ b are convex


The general optimization problem

Consider a vector x ∈ R n of variables.

An optimization problem can be expressed as:

P : minimize f0(x)

such that f1(x) ≤ b1

f2(x) ≤ b2

⋮

fm(x) ≤ bm
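As an illustration of this general form, a problem with one inequality constraint might be passed to a generic solver as in the sketch below. This assumes SciPy is available; the particular f0, f1, b1, and starting point are made-up examples, not part of the lecture:

```python
# Hedged sketch of the general form P with SciPy's minimize.
from scipy.optimize import minimize

f0 = lambda x: (x[0] - 1)**2 + (x[1] - 2)**2   # objective f0(x) (illustrative)
f1 = lambda x: x[0]**2 + x[1]**2               # constraint function f1(x) (illustrative)
b1 = 1.0

# SciPy expects inequality constraints as g(x) >= 0, so f1(x) <= b1
# is passed as b1 - f1(x) >= 0.
res = minimize(f0, x0=[0.0, 0.0],
               constraints=[{"type": "ineq", "fun": lambda x: b1 - f1(x)}])
print(res.x, res.fun)
```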

Convex constraints

(Figure: a convex function g(x) and a level b; the set {x ∈ R : g(x) ≤ b} is the interval [l, u].)

Note: if the function g(x) is convex, the constraint g(x) ≥ b may be nonconvex!
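A tiny check of this note, using the assumed example g(x) = x^2 and b = 1: the set {x : g(x) ≥ b} is {x ≤ −1} ∪ {x ≥ 1}, and it fails the segment test:

```python
# Sketch: a convex g can still give a nonconvex constraint g(x) >= b.
g = lambda x: x**2
b = 1.0
xp, xpp, alpha = -1.0, 1.0, 0.5      # both endpoints satisfy g(x) >= b
xm = alpha * xp + (1 - alpha) * xpp  # midpoint 0.0
print(g(xm) >= b)                    # False: the midpoint leaves the set
```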

Feasible solutions, local and global optima

Define F = {x ∈ R^n : f1(x) ≤ b1, f2(x) ≤ b2, ..., fm(x) ≤ bm}; that is, F is the feasible set of an optimization problem.

All points x ∈ F are called feasible solutions.

A vector x^l ∈ R^n is a local optimum if

◮ x^l ∈ F

◮ there is a neighbourhood N of x^l with no better point than x^l:

∀x ∈ N ∩ F, f0(x) ≥ f0(x^l)

A vector x^g ∈ R^n is a global optimum if

◮ x^g ∈ F

◮ there is no x ∈ F better than x^g, i.e.,

f0(x) ≥ f0(x^g) ∀x ∈ F



Local optima, global optima

(Figure: a function with several local optima and a single global optimum.)

Examples of convex problems

P1 : minimize x1^2 + 2x2^2
     such that x1^2 + x2^2 ≤ 1
               0 ≤ x1 ≤ 2
               1 ≤ x2 ≤ 5
               x2 ∈ Z ← nonconvex!

P2 : minimize x1 − 2x2^2 ← nonconvex objective!
     such that x1^2 + x2^2 ≤ 1
               x2 = 0
               0 ≤ x1 ≤ 5

Convex problems

Def.: An optimization problem is convex if

◮ the objective function is convex

◮ all constraints are convex

Convex optimization problems are easy: any local optimum is

also a global optimum.

(Hint) When modeling an optimization problem, it is desirable to end up with a convex problem.
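One way to see why convexity matters in practice: a local solver applied to a nonconvex function can stop at different points depending on where it starts, whereas on a convex problem any local optimum it finds is already global. The sketch below assumes SciPy and reuses the nonconvex example f(x) = sin x on [0, 2π]:

```python
# Illustrative sketch: a local solver on a nonconvex function gives
# different answers from different starting points.
import numpy as np
from scipy.optimize import minimize

f = lambda x: np.sin(x[0])
bounds = [(0.0, 2 * np.pi)]

for x0 in (0.5, 4.0):
    res = minimize(f, x0=[x0], bounds=bounds, method="L-BFGS-B")
    print(f"start {x0}: x* = {res.x[0]:.3f}, f* = {res.fun:.3f}")
# Starting at 0.5 ends at the boundary local optimum x = 0 (f = 0);
# starting at 4.0 finds the global optimum near 3*pi/2 (f = -1).
```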

Your first Optimization model

Variables: r: radius of the can's base

h: height of the can

Objective: 2πrh + 2πr^2 (minimize)

Constraints: πr^2 h = V

h > 0

r > 0
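Since the equality constraint fixes h = V/(πr^2), the model can be reduced to a one-variable minimization over r. The following sketch assumes SciPy and an arbitrarily chosen volume V = 330; it is one possible way to solve the model numerically, not the method prescribed in the lecture:

```python
# Eliminate h = V / (pi r^2) and minimize the surface area over r > 0.
import math
from scipy.optimize import minimize_scalar

V = 330.0  # assumed volume for illustration
area = lambda r: 2 * V / r + 2 * math.pi * r**2  # 2*pi*r*h + 2*pi*r^2 with h substituted

res = minimize_scalar(area, bounds=(1e-6, 100.0), method="bounded")
r = res.x
h = V / (math.pi * r**2)
print(r, h, h / r)
```

The numerical optimum satisfies h ≈ 2r, the familiar proportions of the minimal-surface can.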


Relaxation of an Optimization problem

Consider an optimization problem

P : minimize f0(x)

such that f1(x) ≤ b1

f2(x) ≤ b2

⋮

fm(x) ≤ bm,

or P : min{f0(x) : x ∈ F} for short.

A problem P′ : min{f0′(x) : x ∈ F′} is a relaxation of P if:

◮ F′ ⊇ F

◮ f0′(x) ≤ f0(x) for all x ∈ F.¹

If P′ is a relaxation of a problem P, then the global optimum of P′ is ≤ the global optimum of P.

¹ We don't care what f0′(x) is outside of F.

Relaxations of an Optimization problem

(Figure: four plots of a function f(x) over an interval [l, u], illustrating relaxations of an optimization problem.)

Examples

◮ min{f(x) : −1 ≤ x ≤ 1} is a relaxation of min{f(x) : x = 0}

◮ min{f(x) : −1 ≤ x ≤ 1} is a relaxation of min{f(x) : 0 ≤ x ≤ 1}

◮ min{f(x) : −1 ≤ x ≤ 1} is not a relaxation of min{f(x) : −2 ≤ x ≤ 1}

◮ min{f(x) : g(x) ≤ b} is a relaxation of min{f(x) : g(x) ≤ b − 1}

◮ min{f(x) − 1 : g(x) ≤ b} is a relaxation of min{f(x) : g(x) ≤ b}
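A quick numerical check of the first example, with an arbitrarily chosen f (an assumption for illustration): the relaxation's optimal value is a lower bound on the original problem's value.

```python
# min{f(x) : -1 <= x <= 1} is a relaxation of min{f(x) : x = 0},
# so its optimal value can only be lower or equal.
from scipy.optimize import minimize_scalar

f = lambda x: (x - 0.4)**2 + 1.0  # illustrative objective

relaxed = minimize_scalar(f, bounds=(-1.0, 1.0), method="bounded").fun
original = f(0.0)                 # the feasible set {0} has a single point
print(relaxed <= original)        # True: the relaxation gives a lower bound
```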

Relaxations

Consider again a problem

P : min{f0(x) : f1(x) ≤ b1, f2(x) ≤ b2, ..., fm(x) ≤ bm}, or P : min{f0(x) : x ∈ F} for short.

◮ deleting a constraint from P provides a relaxation of P.

◮ adding a constraint fm+1(x) ≤ bm+1 to a problem P does the opposite:

F′′ = {x ∈ R^n : f1(x) ≤ b1, f2(x) ≤ b2, ..., fm(x) ≤ bm, fm+1(x) ≤ bm+1} ⊆ F

and therefore

min{f0(x) : x ∈ F′′} ≥ min{f0(x) : x ∈ F}
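A tiny numerical illustration of this inequality (the objective and constraints below are made up): adding a constraint shrinks the feasible set, so the minimum can only stay the same or increase.

```python
# Adding a constraint can only raise (or keep) the optimal value.
from scipy.optimize import minimize

f0 = lambda x: (x[0] - 2)**2
g1 = {"type": "ineq", "fun": lambda x: 3 - x[0]}   # constraint x <= 3
g2 = {"type": "ineq", "fun": lambda x: 1 - x[0]}   # extra constraint x <= 1

v_F  = minimize(f0, x0=[0.0], constraints=[g1]).fun       # min over F
v_F2 = minimize(f0, x0=[0.0], constraints=[g1, g2]).fun   # min over F'' ⊆ F
print(v_F, v_F2, v_F2 >= v_F)   # roughly 0.0, 1.0, True
```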
