
the property that linear interpolations never underestimate the functions. Alternatively, linear supports never overestimate the functions. These and related characterizations are illustrated in Exercise ME-8. Problems of minimizing convex functions subject to constraints requiring that a set of convex functions not exceed zero possess two important properties. First, the set of feasible points determined by the constraints is a convex set. Second, local minima are always global minima. For applications, certain generalizations of the convexity concept are needed.
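In symbols, the two characterizations above read as follows for a convex function f (a sketch in standard notation, which the text itself does not fix):

```latex
% Linear interpolations never underestimate a convex f:
f(\lambda x + (1-\lambda)y) \le \lambda f(x) + (1-\lambda) f(y),
  \qquad 0 \le \lambda \le 1.
% Linear supports never overestimate a differentiable convex f:
f(y) \ge f(x) + \nabla f(x)^{\top}(y - x).
```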

Mangasarian's first article introduces pseudo-convex functions. These functions are differentiable and possess both of the properties mentioned above, yet are not necessarily convex. They are essentially defined by the property that if a directional derivative is positive (points up), then the function continues to increase in the given direction. An even wider class of functions, called quasi-convex, is defined by retaining only the first property. There are several alternative ways to define quasi-convex functions, and they are explored in Exercise ME-11. These generalized functions, along with other related functions, are discussed in Exercises CR-15 and ME-9. In Exercises CR-8 and 9 the reader is asked to investigate the convexity and generalized convexity properties of simple functions in one and several dimensions, respectively.
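As a sketch of the formal definitions (the cited exercises give equivalent characterizations), for a differentiable function f:

```latex
% Pseudo-convex: a nonnegative directional derivative forces nondecrease
\nabla f(x)^{\top}(y - x) \ge 0 \;\Longrightarrow\; f(y) \ge f(x).
% Quasi-convex: values on a segment never exceed the larger endpoint value
f(\lambda x + (1-\lambda)y) \le \max\{f(x),\, f(y)\},
  \qquad 0 \le \lambda \le 1.
```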

Exercise ME-12 shows how one can verify whether a given function is convex or generalized convex by examining its Hessian matrix of second partial derivatives, or a Hessian matrix bordered by first partials.
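The Hessian test can be illustrated numerically. Below is a minimal sketch for functions of two variables, using the fact that a symmetric 2×2 matrix is positive semidefinite (hence the function convex) exactly when its trace and determinant are both nonnegative; the example functions, step size, and tolerance are illustrative choices, not from the text:

```python
def hessian_2d(f, x, y, h=1e-4):
    """Finite-difference approximation to the Hessian of f at (x, y)."""
    fxx = (f(x + h, y) - 2 * f(x, y) + f(x - h, y)) / h**2
    fyy = (f(x, y + h) - 2 * f(x, y) + f(x, y - h)) / h**2
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h**2)
    return fxx, fxy, fyy

def is_psd_2x2(a, b, c, tol=1e-6):
    """[[a, b], [b, c]] is positive semidefinite iff trace and det are >= 0."""
    return a + c >= -tol and a * c - b * b >= -tol

convex = lambda x, y: x * x + x * y + y * y   # Hessian [[2, 1], [1, 2]]: convex
saddle = lambda x, y: x * x - y * y           # Hessian [[2, 0], [0, -2]]: not

print(is_psd_2x2(*hessian_2d(convex, 0.3, -0.7)))  # True
print(is_psd_2x2(*hessian_2d(saddle, 0.3, -0.7)))  # False
```

For quadratic functions the finite differences recover the Hessian essentially exactly, so the trace/determinant test is decisive here.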

The most useful results concerning minimization problems are surely the Kuhn-Tucker necessary and sufficient conditions. In Mangasarian's article it is shown that the Kuhn-Tucker conditions are sufficient if the objective is pseudo-convex and the constraints are quasi-concave. The conditions are necessary if a mild constraint qualification is met. This is discussed in Exercise ME-13, where a proof of their necessity that utilizes Farkas' lemma is outlined.
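A minimal sketch of checking the Kuhn-Tucker conditions at a candidate point, on a made-up one-dimensional problem (minimize (x - 2)^2 subject to x - 1 <= 0) rather than one drawn from the exercises:

```python
# Illustrative problem: minimize f(x) = (x - 2)**2 subject to g(x) = x - 1 <= 0.
# The constrained minimizer is x* = 1 with multiplier 2.

def f_prime(x):       # derivative of the objective
    return 2 * (x - 2)

def g_prime(x):       # derivative of the constraint
    return 1.0

def kuhn_tucker_holds(x, lam, g, tol=1e-9):
    """Check stationarity, feasibility, sign, and complementary slackness."""
    stationarity = abs(f_prime(x) + lam * g_prime(x)) <= tol
    feasible = g(x) <= tol
    comp_slack = abs(lam * g(x)) <= tol
    return stationarity and feasible and lam >= 0 and comp_slack

g = lambda x: x - 1
print(kuhn_tucker_holds(1.0, 2.0, g))  # True: x* = 1 with multiplier 2
print(kuhn_tucker_holds(2.0, 0.0, g))  # False: unconstrained minimum infeasible
```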

Exercises CR-10 and 13 illustrate how the Kuhn-Tucker conditions may be used to solve simple mean-variance tradeoff models and other optimization problems, respectively. The determination of the correct sign of the multipliers is considered in Exercise CR-16. The Kuhn-Tucker conditions are intimately related to the Lagrange function. Exercise ME-10 illustrates the relationship between saddle points of the Lagrangian and solutions of the minimization problem. The primal problem may be considered as a minimax of the Lagrangian, where the min is with respect to the original (primal) variables and the max is with respect to the Lagrange (dual) variables. A dual problem results when one maximins. When the functions are convex and differentiable, one obtains the Wolfe dual. Mangasarian presents some results related to this dual problem in his first paper; other results are developed in Exercise ME-14.
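For the differentiable convex case, the Wolfe dual of a primal problem min f(x) subject to g_i(x) <= 0 can be sketched as follows (using the sign convention g_i(x) <= 0; the text's own convention may differ):

```latex
\max_{x,\,\lambda}\; f(x) + \sum_i \lambda_i g_i(x)
\quad \text{subject to} \quad
\nabla f(x) + \sum_i \lambda_i \nabla g_i(x) = 0,
\qquad \lambda_i \ge 0.
```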

The reader is asked to determine the actual dual problems in some special

