Global optimality conditions in non-convex optimization

Panos M. Pardalos
Distinguished Professor, Center for Applied Optimization, Industrial and Systems Engineering, University of Florida, Florida, USA
National Research University Higher School of Economics, Laboratory of Algorithms and Technologies for Network Analysis, Nizhny Novgorod, Russia



Outline

1 Introduction

2 Complexity Issues
  Complexity of Kuhn-Tucker Conditions
  Complexity of local minimization
  Complexity of checking convexity of a function

3 Optimality Conditions
  Optimality conditions in combinatorial optimization
  Min-max optimality conditions
  Optimality Conditions for Lipschitz functions

4 References


Introduction

Greek mathematicians solved optimally some problems related to their geometrical studies.

Euclid considered the minimal distance between a point and a line.

Heron proved that light travels between two points through the path with shortest length when reflecting from a mirror.

Optimality in Nature

Fermat’s principle (principle of least time)

Hamilton’s principle (principle of stationary action)

Maupertuis’ principle (principle of least action)

Optimality in Biology:

Optimality Principles in Biology (R. Rosen, New York: Plenum Press, 1967).

Example: What determines the radius of the aorta? The human aortic radius is about 1.5 cm (it minimizes the power dissipated through blood flow).

1951: H.W. Kuhn and A.W. Tucker, optimality conditions for nonlinear programming problems. F. John in 1948 and W. Karush in 1939 had presented similar conditions.


Complexity of Kuhn-Tucker Conditions

Consider the following quadratic problem:

    min  f(x) = c^T x + (1/2) x^T Q x
    s.t.  x ≥ 0,

where Q is an arbitrary n × n symmetric matrix and x ∈ R^n. The KKT optimality conditions for this problem become the so-called linear complementarity problem LCP(Q, c).


The linear complementarity problem LCP(Q, c) is formulated as follows. Find x ∈ R^n (or prove that no such x exists) such that:

    Qx + c ≥ 0,   x ≥ 0,
    x^T (Qx + c) = 0.
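As a quick illustration, the following Python/NumPy sketch (our own; the helper name is_lcp_solution is made up for this note) checks whether a candidate point satisfies the three LCP(Q, c) conditions up to a numerical tolerance.

import numpy as np

def is_lcp_solution(Q, c, x, tol=1e-9):
    """Check the LCP(Q, c) conditions: Qx + c >= 0, x >= 0, x^T (Qx + c) = 0."""
    w = Q @ x + c
    return (w >= -tol).all() and (x >= -tol).all() and abs(x @ w) <= tol

# Tiny example: Q = I, c = (-1, 2); x = (1, 0) gives w = (0, 2) and x^T w = 0.
Q = np.eye(2)
c = np.array([-1.0, 2.0])
print(is_lcp_solution(Q, c, np.array([1.0, 0.0])))   # True
print(is_lcp_solution(Q, c, np.array([2.0, 0.0])))   # False: complementarity fails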


Theorem (Horst, Pardalos, Thoai, 1994 - [5])

The problem LCP(Q, c) is NP-hard.

Proof.

Consider the following LCP(Q, c) problem in R^{n+3}, defined by

    Q_{(n+3)×(n+3)} =  [ −I_n    e_n   −e_n   0_n ]
                       [  e_n^T  −1    −1     −1  ]
                       [ −e_n^T  −1    −1     −1  ]
                       [  0_n^T  −1    −1     −1  ],      c_{n+3}^T = (a_1, ..., a_n, −b, b, 0),

where a_i, i = 1, ..., n, and b are positive integers, I_n is the n × n unit matrix, and the vectors e_n, 0_n ∈ R^n are defined by e_n^T = (1, 1, ..., 1), 0_n^T = (0, 0, ..., 0).


Consider the following knapsack problem: find a feasible solution to the system

    ∑_{i=1}^n a_i x_i = b,   x_i ∈ {0, 1}  (i = 1, ..., n).

This problem is known to be NP-complete. We will show that LCP(Q, c) is solvable iff the associated knapsack problem is solvable.
If x solves the knapsack problem, then y = (a_1 x_1, ..., a_n x_n, 0, 0, 0)^T solves LCP(Q, c).
Conversely, assume y solves the considered LCP(Q, c). This implies that ∑_{i=1}^n y_i = b and 0 ≤ y_i ≤ a_i. Finally, if y_i < a_i, then y^T (Qy + c) = 0 enforces y_i = 0. Hence x = (y_1/a_1, ..., y_n/a_n) solves the knapsack problem.
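A small sketch of this reduction, assuming the construction above: it assembles Q and c from a knapsack instance (a, b) and confirms that a knapsack solution x yields the LCP solution y = (a_1 x_1, ..., a_n x_n, 0, 0, 0)^T.

import numpy as np

def build_lcp_from_knapsack(a, b):
    """Build Q ((n+3) x (n+3)) and c from a knapsack instance: sum a_i x_i = b, x_i in {0,1}."""
    n = len(a)
    e, z = np.ones(n), np.zeros(n)
    Q = np.zeros((n + 3, n + 3))
    Q[:n, :n] = -np.eye(n)
    Q[:n, n], Q[:n, n + 1], Q[:n, n + 2] = e, -e, z          # columns e_n, -e_n, 0_n
    Q[n, :n], Q[n + 1, :n], Q[n + 2, :n] = e, -e, z          # rows e_n^T, -e_n^T, 0_n^T
    Q[n:, n:] = -1.0                                         # bottom-right 3x3 block of -1's
    c = np.concatenate([np.asarray(a, float), [-b, b, 0.0]])
    return Q, c

a, b = [2, 3, 5], 8                       # knapsack: x = (0, 1, 1) satisfies 3 + 5 = 8
x = np.array([0, 1, 1])
Q, c = build_lcp_from_knapsack(a, b)
y = np.concatenate([np.asarray(a) * x, [0, 0, 0]]).astype(float)
w = Q @ y + c
print((w >= -1e-9).all(), (y >= 0).all(), abs(y @ w) < 1e-9)   # expected: True True True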


Complexity of local minimization

Consider the following quadratic problem:

    min  f(x)
    s.t.  Ax ≥ b,  x ≥ 0,

where f(x) is an indefinite quadratic function. We showed that the problem of checking local optimality of a feasible point and the problem of checking whether a local minimum is strict are NP-hard.


3-satisfiability problem

Consider the 3-satisfiability (3-SAT) problem: given a set of Boolean variables x_1, ..., x_n and a Boolean expression S (in conjunctive normal form) with exactly 3 literals per clause,

    S = ∧_{i=1}^m ( l_{i1} ∨ l_{i2} ∨ l_{i3} ),   l_{ij} ∈ { x_k, x̄_k : k = 1, ..., n },

is there a truth assignment for the variables x_i which makes S true?

The 3-SAT problem is known to be NP-complete.


Construction of indefinite quadratic problem instances

For each instance of a 3-SAT problem we construct an instance of an optimization problem in the real variables x_0, ..., x_n. To each clause in S we associate a linear inequality in the following way:

  If l_{ij} = x_k, we retain x_k.
  If l_{ij} = x̄_k, we use 1 − x_k.

We add an additional variable x_0 and require that the corresponding sum is greater than or equal to 3/2. Thus we associate to S a system of linear inequalities A_S x ≥ (3/2 + c). Let D(S) ⊂ R^{n+1} be the set of feasible points satisfying these constraints.
With a given instance of the 3-SAT problem we associate the following indefinite quadratic problem:

    min_{x ∈ D(S)}  f(x) = − ∑_{i=1}^n ( x_i − (1/2 − x_0) )( x_i − (1/2 + x_0) ).
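Setting the clause constraints aside for a moment, the algebra behind the theorem below can be checked numerically: at x* = (0, 1/2, ..., 1/2)^T the objective equals 0, and along the perturbation x_0 = t, x_i = 1/2 ± t suggested by a truth assignment it stays equal to 0, so x* cannot be a strict minimum whenever such a perturbation remains feasible; whether it does is exactly where satisfiability enters. The sketch below is our own illustration, not the authors' code.

import numpy as np

def f(x0, x):
    """Indefinite objective  -(sum over i) (x_i - (1/2 - x0)) (x_i - (1/2 + x0))."""
    return -np.sum((x - (0.5 - x0)) * (x - (0.5 + x0)))

n = 4
assignment = np.array([1, 0, 1, 1])          # a hypothetical truth assignment
print(f(0.0, np.full(n, 0.5)))               # value at x*: 0.0
for t in (0.1, 0.01):
    # move each x_i toward its Boolean value and raise x0 by the same amount t
    x = 0.5 + t * (2 * assignment - 1)
    print(f(t, x))                           # stays 0.0 along this direction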


Complexity of local minimization

Theorem (Pardalos, Schnitger, 1988 - [6])

S is satisfiable iff x* = (0, 1/2, ..., 1/2)^T is not a strict minimum.
S is satisfiable iff x* = (0, 1/2, ..., 1/2)^T is not a local minimum.

Corollary

For an indefinite quadratic problem, the problem of checking local optimality of a feasible point and the problem of checking whether a local minimum is strict are NP-hard.


Complexity of checking convexity of a function

The role of convexity in modern day mathematical programming has proven to be fundamental.

"The great watershed in optimization is not between linearity and nonlinearity, but convexity and nonconvexity" (R. Rockafellar).

The tractability of a problem is often assessed by whether theproblem has some sort of underlying convexity.

Can we decide in an efficient manner if a given optimizationproblem is convex?
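No efficient exact test is known in general (see the results below), but a cheap necessary check is to sample the Hessian. The following sketch is a heuristic of our own, not a decision procedure: it looks for a sampled point where the Hessian has a negative eigenvalue, which certifies non-convexity; failure to find one proves nothing.

import numpy as np

def sampled_nonconvexity_witness(hessian, dim, trials=1000, box=2.0, seed=0):
    """Heuristic: search for a point where the Hessian has a negative eigenvalue.
    Returns such a point (a certificate of non-convexity) or None if none was found."""
    rng = np.random.default_rng(seed)
    for _ in range(trials):
        x = rng.uniform(-box, box, dim)
        if np.linalg.eigvalsh(hessian(x)).min() < -1e-9:
            return x
    return None   # inconclusive: the function may or may not be convex

# Example: f(x, y) = x^4 + y^4 - 4 x y has an indefinite Hessian near the origin.
hess = lambda p: np.array([[12 * p[0]**2, -4.0], [-4.0, 12 * p[1]**2]])
print(sampled_nonconvexity_witness(hess, 2))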


One of seven open problems in complexity theory for numericaloptimization (Pardalos, Vavasis, 1992 - [7]):

Given a degree-4 polynomial in n variables, what is the complexity of determining whether this polynomial describes a convex function?


Theorem (Ahmadi et al., 2011)

Deciding convexity of degree four polynomials is strongly NP-hard. This is true even when the polynomials are restricted to be homogeneous (all terms with nonzero coefficients have the same total degree).

Corollary (Ahmadi et al., 2011)

It is NP-hard to check convexity of polynomials of any fixed even degree d ≥ 4.


Theorem (Ahmadi et al., 2011)

It is NP-hard to decide strong convexity of polynomials of any fixed even degree d ≥ 4.

Theorem (Ahmadi et al., 2011)

It is NP-hard to decide strict convexity of polynomials of any fixed even degree d ≥ 4.


Theorem (Ahmadi et al., 2011)

For any fixed odd degree d, the quasiconvexity of polynomials of degree d can be checked in polynomial time.

Corollary (Ahmadi et al., 2011)

For any fixed odd degree d, the pseudoconvexity of polynomials of degree d can be checked in polynomial time.


Theorem (Ahmadi et al., 2011)

It is NP-hard to check quasiconvexity/pseudoconvexity of degree four polynomials. This is true even when the polynomials are restricted to be homogeneous.

Corollary (Ahmadi et al., 2011)

It is NP-hard to decide quasiconvexity of polynomials of any fixed even degree d ≥ 4.


The complexity results described above can be summarized as follows (see [1] for the complete table):

    Property            odd degree d       even degree d ≥ 4
    convexity           -                  NP-hard (strongly, for d = 4)
    strict convexity    -                  NP-hard
    strong convexity    -                  NP-hard
    quasiconvexity      polynomial time    NP-hard
    pseudoconvexity     polynomial time    NP-hard


Linear Complementarity problem formulated as MIP

Theorem (Pardalos, Rosen, 1988)

The following mixed integer linear program

    max  α
    s.t.  0 ≤ My + αq ≤ z,
          0 ≤ y ≤ e − z,
          z ∈ {0, 1}^n,  0 ≤ α ≤ 1,

associated to the LCP(M, q) with q ≠ 0 has an optimal solution (y*, z*, α*) satisfying α* ≥ 0.
The LCP has a solution, and x* = y*/α* is a solution, if and only if α* > 0.

Thus, the LCP is equivalent to a linear (mixed) integer feasibility problem.
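A minimal sketch of this mixed integer program using scipy.optimize.milp (assuming SciPy ≥ 1.9 is available; the variable ordering [y, z, α] and the small test instance are our own):

import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

def solve_lcp_pardalos_rosen(M, q):
    """Pardalos-Rosen MIP for LCP(M, q); decision vector is [y (n), z (n), alpha]."""
    n = len(q)
    I, Z = np.eye(n), np.zeros((n, n))
    cost = np.zeros(2 * n + 1)
    cost[-1] = -1.0                                   # maximize alpha
    A = np.block([
        [M, Z, q.reshape(-1, 1)],                     # My + alpha q >= 0
        [M, -I, q.reshape(-1, 1)],                    # My + alpha q - z <= 0
        [I, I, np.zeros((n, 1))],                     # y + z <= e
    ])
    lb = np.concatenate([np.zeros(n), -np.inf * np.ones(2 * n)])
    ub = np.concatenate([np.inf * np.ones(n), np.zeros(n), np.ones(n)])
    integrality = np.concatenate([np.zeros(n), np.ones(n), [0]])   # only z is binary
    bounds = Bounds(np.zeros(2 * n + 1), np.ones(2 * n + 1))
    res = milp(cost, constraints=LinearConstraint(A, lb, ub),
               integrality=integrality, bounds=bounds)
    if res.x is None:
        return None
    y, alpha = res.x[:n], res.x[-1]
    return y / alpha if alpha > 1e-9 else None        # LCP solution iff alpha > 0

M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-5.0, -6.0])
print(solve_lcp_pardalos_rosen(M, q))                 # approx [1.3333, 2.3333]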


Motzkin–Straus formulation for maximum clique

Let A be the adjacency matrix of a graph G. We write u ∼ v if u and v are distinct and adjacent, and u ≁ v otherwise. Consider the Motzkin–Straus formulation (also called the Motzkin–Straus QP) of the Maximum Clique Problem:

    max  x^T A x / 2                                (P)
    s.t.  e^T x = 1,
          x ≥ 0.

We denote the simplex of feasible solutions of (P) by Δ_n.
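By the Motzkin–Straus theorem, the optimal value of (P) is (1/2)(1 − 1/ω(G)), where ω(G) is the clique number of G. The short sketch below (our own) evaluates the objective at the uniform characteristic vector of a clique:

import numpy as np

def ms_value(A, support):
    """Evaluate x^T A x / 2 at x uniform on `support` and zero elsewhere."""
    x = np.zeros(A.shape[0])
    x[list(support)] = 1.0 / len(support)
    return 0.5 * x @ A @ x

# 5-cycle plus the chord (0, 2): maximum clique {0, 1, 2} of size 3.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]
A = np.zeros((5, 5))
for u, v in edges:
    A[u, v] = A[v, u] = 1.0
print(ms_value(A, {0, 1, 2}))     # (1/2)(1 - 1/3) = 0.333...
print(ms_value(A, {3, 4}))        # smaller clique: (1/2)(1 - 1/2) = 0.25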


Necessary Optimality Conditions

First Order (KKT) Conditions:

    −Ax + λe − µ = 0,   x^T e = 1,   x ≥ 0,   µ ≥ 0,   µ_u x_u = 0  ∀u.        (1)

Second Order (KKT) Conditions: let Z(x) = {u | x_u = 0}; then

    −y^T A y ≥ 0   ∀y s.t. y^T e = 0, y_u = 0 ∀u ∈ Z(x).                       (2)


Since the constraints of P are simplex constraints, regularity conditions always hold, and hence both (1) and (2) are necessary for a solution to be locally (globally) optimal.

The following chain of inclusions takes place:

    global optimal solutions ⊂ locally optimal solutions ⊂ second order points ⊂ first order points.


A key semidefiniteness lemma

Lemma (L.E. Gibbons, D.W. Hearn, P.M. Pardalos, M.V. Ramana, 1997)

Let G = (V, E) be a graph whose adjacency matrix is A. Let S ⊂ V, T = V \ S, and define the polyhedral cone

    Y = { y | y^T e = 0, y_v ≥ 0 ∀v ∈ T }.

Then the following are equivalent.


(1) A is negative semidefinite over the cone Y, i.e., y^T A y ≤ 0 ∀y ∈ Y.
(2) If u, v, w ∈ V, u ≁ v, u ≁ w, v ∼ w, then either u, v ∈ T or u, w ∈ T.
(3) The node sets S, T can be partitioned as

    S = S_1 ∪ S_2 ∪ ... ∪ S_l,   T = T_0 ∪ T_1 ∪ ... ∪ T_l,

such that
  (a) Each S_i is nonempty.
  (b) S_i ∪ T_i is an independent set for every i = 1, ..., l.
  (c) For all i ≠ j, every vertex in S_i ∪ T_i is adjacent to every vertex in S_j.
  (d) Every node of T_0 is adjacent to every node in S.


First order solutions

Theorem

Let x ∈ Δ_n, and let S(x) = {u | x_u > 0}. Then x is a first order point if and only if the vector

    µ = (x^T A x) e − Ax

satisfies µ ≥ 0 and µ_u = 0 ∀u ∈ S(x).
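This characterization is easy to test numerically. A self-contained sketch of our own, reusing the 5-vertex example graph from the earlier Motzkin–Straus sketch:

import numpy as np

def is_first_order_point(A, x, tol=1e-9):
    """First-order test for (P): mu = (x^T A x) e - A x must be >= 0 and vanish on the support of x."""
    mu = (x @ A @ x) * np.ones(len(x)) - A @ x
    return (mu >= -tol).all() and np.allclose(mu[x > tol], 0.0, atol=1e-8)

# Same 5-vertex graph: 5-cycle plus the chord (0, 2).
A = np.zeros((5, 5))
for u, v in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]:
    A[u, v] = A[v, u] = 1.0

print(is_first_order_point(A, np.array([1/3, 1/3, 1/3, 0, 0])))   # True: maximal clique {0, 1, 2}
print(is_first_order_point(A, np.array([1/2, 1/2, 0, 0, 0])))     # False: {0, 1} extends to {0, 1, 2}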


Second order solutions

Theorem

Let x be a feasible solution to P, and let H be the induced subgraph of G indexed on S(x), the support of x. Then x is a second order point if and only if the following hold:
(1) H is a complete l-partite graph (for some l), with the partition S(x) = S_1 ∪ ... ∪ S_l and each S_i nonempty.
(2) ∑_{u ∈ S_j} x_u = 1/l  ∀j = 1, ..., l.
(3) Ax ≤ (1 − 1/l) e.
Also, if x is a second order point, then f(x) = (1/2)(1 − 1/l) and λ = 1 − 1/l, where λ is the Lagrange multiplier associated with the constraint e^T x = 1 in P.


Locally optimal solutions

Theorem

Let x ∈ Δ_n. Then x is a local maximum of P if and only if there exists an integer l such that, with

    λ := 1 − 1/l,   S := {u | x_u > 0},   T := {u | x_u = 0, (Ax)_u = λ},

we have
(1) Ax ≤ λe.
(2) (Ax)_u = λ  ∀u ∈ S.
(3) There exist partitions of S and T,

    S = S_1 ∪ ... ∪ S_l,   T = T_1 ∪ ... ∪ T_l,

such that
  (a) S_i ≠ ∅  ∀i = 1, ..., l.
  (b) S_i ∪ T_i is independent for every i.
  (c) i ≠ j, u ∈ S_i ∪ T_i, v ∈ S_j ⇒ u ∼ v.


Minimizing the rank of a matrix

Consider the problem of minimizing the rank of a matrix:

    min  f(A) := rank(A)
    s.t.  A ∈ C,

where C is a subset of M_{m,n}(R), the vector space of m × n real matrices.

Theorem (Hiriart-Urruty, 2011)

Every admissible point in the above problem is a local minimizer.

Comment: The result is somewhat strange: the various generalized subdifferentials, including Clarke's, all coincide and, of course, contain the zero element.
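The claim rests on the lower semicontinuity of the rank: every matrix sufficiently close to A has rank at least rank(A), so the objective cannot decrease in a neighborhood of a feasible point. A quick numerical illustration of our own:

import numpy as np

rng = np.random.default_rng(1)
A = np.outer(rng.standard_normal(4), rng.standard_normal(4))   # a rank-1 4x4 matrix
base = np.linalg.matrix_rank(A)
ranks = [np.linalg.matrix_rank(A + 1e-8 * rng.standard_normal((4, 4))) for _ in range(100)]
print(base, min(ranks))   # perturbed ranks never drop below rank(A) = 1 (generically they jump up)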


Min-max optimality conditions

Consider the problem

    min_{x ∈ X} max_{i ∈ I} f_i(x),

where X is a convex region in n-dimensional Euclidean space R^n, I is a finite index set, and the f_i(x) are continuous functions over X. A subset Z of X is called an extreme subset of X if

    x, y ∈ X and λx + (1 − λ)y ∈ Z for some 0 < λ < 1   ⇒   x, y ∈ Z.


Theorem (Du, Pardalos, 1993)

Let f(x, y) be a continuous function on X × Y, where X is a polytope in R^m and Y is a compact set in R^n. Let g(x) = max_{y∈Y} f(x, y). If f(x, y) is concave with respect to x, then the minimum value of g(x) over X is achieved at some point x̂ satisfying the following condition:
(*) There exists an extreme subset Z of X such that x̂ ∈ Z and the set I(x̂) = {y | g(x̂) = f(x̂, y)} is maximal over Z.


General localization theorem

Theorem (Georgiev, Pardalos, Chinchuluun, 2008 - [4])

Suppose that Y is a compact topological space, X ⊂ R^n is a compact polyhedron, f : X × Y → R is a continuous function, and f(·, y) is concave for every y ∈ Y. Define g(x) = max_{y∈Y} f(x, y). Then the minimum of g over X is attained at some generalized vertex x̂.


Optimality Conditions for Lipschitz functions

We use the following notation:

E - Banach space with dual E ∗;

‖ · ‖ and ‖ · ‖∗ - the norms in E and E ∗ respectively;

BE and BE∗ - closed unit balls of E and E ∗ respectively;

D - closed and nonempty subset of E .


Clarke’s subdifferential

Consider a locally Lipschitz function f : E → R. Clarke's subdifferential is given by (Clarke, 1990 - [2]):

    ∂f(x) = { x* ∈ E* : ⟨x*, h⟩ ≤ f°(x; h) ∀h ∈ E },

where

    f°(x; h) = limsup_{u→x, t↓0} [f(u + th) − f(u)] / t

is the Clarke directional derivative of f at x in the direction h.
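For intuition, f°(x; h) can be roughly approximated by sampling the difference quotients with u near x and small t > 0. The sketch below is our own crude approximation, not a rigorous computation; for f(x) = |x| it recovers f°(0; 1) = 1.

import numpy as np

def clarke_directional_derivative(f, x, h, radius=1e-4, samples=2000, seed=0):
    """Rough estimate of f°(x; h) = limsup of (f(u + t h) - f(u)) / t over u -> x, t -> 0+."""
    rng = np.random.default_rng(seed)
    u = x + radius * rng.uniform(-1.0, 1.0, samples)
    t = radius * rng.uniform(1e-3, 1.0, samples)
    return np.max((f(u + t * h) - f(u)) / t)

print(clarke_directional_derivative(np.abs, 0.0, 1.0))   # close to f°(0; 1) = 1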


Clarke’s regularity

If the one-sided directional derivative of f at x in direction h,

    f′(x; h) = lim_{t↓0} [f(x + th) − f(x)] / t,

exists and f′(x; h) = f°(x; h) for every h ∈ E, then f is called regular in the sense of Clarke.


Suppose the following locally Lipschitz functions are given:

    g_i : E → R,  i ∈ I ∪ {0},
    h_j : E → R,  j ∈ J,

where I = {1, 2, ..., n} and J = {1, 2, ..., m}.


Consider the following minimization problem:

    minimize  g_0(x)
    subject to  x ∈ D,  g_i(x) ≤ 0,  h_j(x) = 0  (i ∈ I, j ∈ J).        (1)


Denote by N_D(x) the generalized normal cone in the sense of Clarke at x ∈ D (Clarke, 1990):

    N_D(x) = { x* ∈ E* : ∃ t > 0, x* ∈ t ∂d(x, D) },

where d(·, D) is the distance function

    d(x, D) = inf{ ‖x − y‖ : y ∈ D }.

When D is convex, N_D(x) coincides with the classical normal cone in the sense of convex analysis, N(D, x) [2].


Theorem (Georgiev, Chinchuluun, Pardalos, 2011 - [3])

Consider the following minimization problem:

    minimize  f(x)
    subject to  x ∈ D,                                                  (2)

where f : D → R is a locally Lipschitz function and D is a nonempty closed convex subset of a Banach space E. Denote the level set of f at c on D by

    D_c(f) = { y ∈ D : f(y) = c }.


a) Sufficient condition for a global minimum.
If the following qualification condition holds:

    ∂f(y) ∩ N(D, y) = ∅   ∀y ∈ D_{f(z)}(f),                             (3)

then the condition

    −∂f(y) ⊂ N(D, y)   ∀y ∈ D_{f(z)}(f)                                 (4)

is sufficient for z ∈ D to be a global solution to Problem (2).

b) Necessary condition for a global minimum.
If −f is regular in the sense of Clarke, then condition (4) is necessary for z ∈ D to be a global solution to Problem (2).


Special Case

Suppose the function f is concave and the space is finite-dimensional and Euclidean. Then condition (4) was refined (Strekalovsky, 1987) by replacing the set D_{f(z)}(f) with the set

    D^{concave}_{f(z)}(f) = { y ∈ D : f(y) = f(z), y is an extreme point of D }.

This refinement reflects the fact that the minimum of a concavefunction over a convex set is attained at an extreme point.


Special Case

The function f is a maximum of concave functions:

    f(x) = max_{y ∈ Y} g(x, y),

where g(·, y) is concave for every y ∈ Y, Y is a compact topological space, and g : X × Y → R is continuous.


Open Question

Is it possible to refine condition (4) by replacing the set D_{f(z)}(f) with the following set:

    D^{max,conc}_{f(z)}(f) = { y ∈ D : f(y) = f(z), y is a generalized vertex of D }?

The motivation for this question is the following result (Georgiev, Pardalos, Chinchuluun, 2008 - [4]): the global minimum of f over D is attained at some generalized vertex of D.


References

[1] A.A. Ahmadi, A. Olshevsky, P.A. Parrilo, and J.N. Tsitsiklis. NP-hardness of deciding convexity of quartic polynomials and related problems. Mathematical Programming, pages 1-24, 2011.

[2] F.H. Clarke. Optimization and Nonsmooth Analysis, volume 5 of Classics in Applied Mathematics. SIAM, 1990.

[3] P. Georgiev, A. Chinchuluun, and P. Pardalos. Optimality conditions of first order for global minima of locally Lipschitz functions. Optimization, 60(1-2):277-282, 2011.

[4] P.G. Georgiev, P.M. Pardalos, and A. Chinchuluun. Localization of minimax points. Journal of Global Optimization, 40(1):489-494, 2008.

[5] R. Horst, P.M. Pardalos, and N.V. Thoai. Introduction to Global Optimization. Springer, 2000.

[6] P.M. Pardalos and G. Schnitger. Checking local optimality in constrained quadratic programming is NP-hard. Operations Research Letters, 7(1):33-35, 1988.

[7] P.M. Pardalos and S.A. Vavasis. Open questions in complexity theory for numerical optimization. Mathematical Programming, 57(1):337-339, 1992.
