
Nonlinear Conic Optimization
—why and how—
Michal Kočvara
School of Mathematics, The University of Birmingham
Manchester, 19 March 2014
Michal Kočvara (University of Birmingham) | New Directions in Nonlinear Optimization
Conic optimization

"generalized" mathematical optimization problem

    min f(x)
    subject to g_i(x) ∈ K_i,   i = 1, …, m

K_i — convex cone ⊂ R^{n_i}, n_i ≤ n

f, g_i —
    linear → linear conic optimization, LCO
    convex → convex conic optimization, CCO
    non-convex → non-convex conic optimization, NCO
Linear conic optimization

    min c^T x
    subject to A^{(i)} x − b^{(i)} ∈ K_i,   i = 1, …, m

K_i —
    non-negative orthant → linear optimization, LP
    second-order cone → second-order cone optimization, SOCP
    semidefinite cone → semidefinite optimization, SDP

Algorithms

K_i —
    non-negative orthant → simplex, interior-point
    second-order cone → interior-point, augmented Lagrangian
    semidefinite cone → interior-point, augmented Lagrangian
Linear conic optimization

Software

K_i —
    non-negative orthant → CPLEX, Gurobi, MOSEK, …
    second-order cone → … MOSEK, SeDuMi, SDPT3, …
    semidefinite cone → MOSEK, SeDuMi, SDPT3, …
Convex conic optimization

    min f(x)
    subject to g_i(x) ∈ K_i,   i = 1, …, m

f, g_i — (smooth) convex functions

Algorithms, software

CVX (Michael Grant, Steve Boyd):
convex functions (like −log, exp) supported via a "successive approximation" approach: the problem is approximated by a linear conic optimization problem and solved by LCO software. The solver is called multiple times to refine the solution to the required accuracy.
May be slow; not all convex functions are supported.
Convex conic optimization

Algorithms, software

Interior-point methods theoretically supported (Nesterov–Nemirovski).
No efficient implementation.
Non-convex conic optimization

    min f(x)
    subject to g_i(x) ∈ K_i,   i = 1, …, m

K_i … non-negative orthant → nonlinear optimization
    SQP, interior-point methods, augmented Lagrangian, …
    SNOPT, IPOPT, KNITRO, …

K_i … second-order or semidefinite cone
    Algorithms? Software? Do we need it at all?
Semidefinite programming (SDP)

"generalized" mathematical optimization problem

    min f(x)
    subject to g(x) ≥ 0
               A(x) ⪰ 0

A(x) — (non)linear matrix operator R^n → S^m
(linear case: A(x) = A_0 + Σ_i x_i A_i)
Linear SDP: primal-dual pair

    inf_X  ⟨C, X⟩ := Tr(CX)                                  (P)
    s.t.   A*(X) = b        [⟨A_i, X⟩ = b_i,  i = 1, …, n]
           X ⪰ 0

    sup_{y,S}  ⟨b, y⟩ := Σ_i b_i y_i                         (D)
    s.t.       A(y) + S = C    [Σ_i y_i A_i + S = C]
               S ⪰ 0
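For any primal-feasible X and dual-feasible (y, S), the duality gap is ⟨C, X⟩ − ⟨b, y⟩ = ⟨S, X⟩ ≥ 0. A small numpy sketch (illustrative only, with randomly constructed feasible points, not part of the slides) makes this concrete:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 3  # matrix size, number of constraints

def sym(M):
    return (M + M.T) / 2

# Random data: A_i symmetric, X = R R^T is primal feasible by construction
A = [sym(rng.standard_normal((n, n))) for _ in range(m)]
R = rng.standard_normal((n, n))
X = R @ R.T                                   # X is PSD
b = np.array([np.trace(Ai @ X) for Ai in A])  # <A_i, X> = b_i

# Dual point: pick y and a PSD slack S, then define C so that A(y) + S = C
y = rng.standard_normal(m)
Q = rng.standard_normal((n, n))
S = Q @ Q.T                                   # S is PSD
C = sum(yi * Ai for yi, Ai in zip(y, A)) + S

gap = np.trace(C @ X) - b @ y                 # <C,X> - <b,y>
print(gap, np.trace(S @ X))                   # gap equals <S,X>, nonnegative
```

The identity follows by substituting C = Σ y_i A_i + S into ⟨C, X⟩ and using ⟨A_i, X⟩ = b_i.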
Linear Semidefinite Programming
Vast area of applications. . .
• LP and CQP are special cases of SDP
• eigenvalue optimization
• robust programming
• control theory
• relaxations of integer optimization problems
• approximations to combinatorial optimization problems
• structural optimization
• chemical engineering
• machine learning
• many many others. . .
Why nonlinear SDP?
Problems from
• Structural optimization
• Control theory
• Examples below
There are more, but the researchers just don't know about
them yet…
Free Material Optimization
Aim:
Given an amount of material, boundary conditions and external
load f , find the material (distribution) so that the body is as stiff
as possible under f .
The design variables are the material properties at each point of
the structure.
M. P. Bendsøe, J. M. Guedes, R. B. Haber, P. Pedersen and J. E. Taylor: An analytical model to predict optimal material properties in the context of optimal structural design. J. Applied Mechanics, 61 (1994) 930–937.
FMO primal formulation

FMO problem with vibration/buckling constraint:

    min_{u∈R^n, E_1,…,E_m}  Σ_{i=1}^m Tr E_i
    subject to
        E_i ⪰ 0,  ρ̲ ≤ Tr E_i ≤ ρ̄,   i = 1, …, m
        f^⊤ u ≤ C
        A(E)u = f
        A(E) + G(E, u) ⪰ 0

… a nonlinear, non-convex semidefinite problem
Nonlinear SDP: algorithms, software?

• Interior point methods (log-det barrier; Jarre '98, Leibfritz '02, …)
• Sequential SDP methods (Correa–Ramirez '04, Jarre, Diehl)

None of these work (well) for practical problems.

• (Generalized) Augmented Lagrangian methods (Apkarian–Noll '02–'05, D. Sun, J. Sun '08, MK–Stingl '02–)
PENNON collection
PENNON (PENalty methods for NONlinear optimization)
a collection of codes for NLP, (linear) SDP and BMI
– one algorithm to rule them all –
READY
• PENNLP AMPL, MATLAB, C/Fortran
• PENSDP MATLAB/YALMIP, SDPA, C/Fortran
• PENBMI MATLAB/YALMIP, C/Fortran
• PENNON (NLP + SDP) extended AMPL, MATLAB, C/Fortran
NEW
• PENLAB (NLP + SDP) open source MATLAB
implementation
IN PROGRESS
• PENSOC (nonlinear SOCP) ( + NLP + SDP)
PENNON: the problem

Nonlinear conic optimization problem with three cones:

    min_{x∈R^n, Y∈S^{p_1}×…×S^{p_k}}  f(x, Y)
    subject to
        g_i(x, Y) ≥ 0,    i = 1, …, m_NLP
        h_i(x, Y) ∈ L_i,  i = 1, …, m_SOCP        (NLP-SDP)
        A_i(x, Y) ⪰ 0,    i = 1, …, m_SDP
The algorithm

Based on penalty/barrier functions ϕ_g, ϕ_h : R → R and Φ_P : S^p → S^p:

    g_i(x, Y) ≥ 0   ⟺  p_i ϕ_g(g_i(x, Y)/p_i) ≥ 0,     i = 1, …, m_NLP
    h_i(x, Y) ∈ L_i ⟺  p_i ϕ_h(h_i(x, Y)/p_i) ∈ L_i,   i = 1, …, m_SOCP
    A_i(x, Y) ⪰ 0   ⟺  Φ_P(A_i(x, Y)) ⪰ 0,             i = 1, …, m_SDP

together with a number of other properties. For instance:

    Φ_P(A) = p²(pI − A)⁻¹ − pI
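A quick numpy check (not from the slides) of the key property, with the sign convention Φ_P(A) = p²(pI − A)⁻¹ − pI: Φ_P acts on the eigenvalues of A through the scalar map t ↦ p²/(p − t) − p = pt/(p − t), which vanishes at 0 and has the sign of t for t < p, so Φ_P(A) ⪰ 0 exactly when A ⪰ 0:

```python
import numpy as np

p = 2.0
rng = np.random.default_rng(1)

def phi_matrix(A, p):
    # Matrix penalty function: Phi_P(A) = p^2 (pI - A)^{-1} - p I
    n = A.shape[0]
    return p**2 * np.linalg.inv(p * np.eye(n) - A) - p * np.eye(n)

# Random symmetric A scaled so all eigenvalues lie below p
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2
A = A / (np.abs(np.linalg.eigvalsh(A)).max() + 1) * (p - 0.5)

eigs_A = np.linalg.eigvalsh(A)
eigs_Phi = np.linalg.eigvalsh(phi_matrix(A, p))

# Phi_P acts spectrally: eigenvalues map through t -> p^2/(p - t) - p,
# a monotone function on (-inf, p) that preserves the sign of t
print(np.allclose(p**2 / (p - eigs_A) - p, eigs_Phi))
print(np.all(np.sign(eigs_Phi) == np.sign(eigs_A)))
```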
The algorithm

Augmented Lagrangian of (NLP-SDP):

    F(x, Y, u, v, U, p) = f(x, Y) + Σ_{i=1}^{m_NLP} u_i p_i ϕ_g(g_i(x, Y)/p_i)
                        + Σ_{i=1}^{m_SOCP} v_i p_i ϕ_h(h_i(x, Y)/p_i)
                        + Σ_{i=1}^{m_SDP} ⟨U_i, Φ_P(A_i(x, Y))⟩ ;

here u, v, U are Lagrange multipliers.
The algorithm

A generalized Augmented Lagrangian algorithm (based on R. Polyak '92, Ben-Tal–Zibulevsky '94, Stingl '05):

Given x¹, Y¹, u¹, v¹, U¹; p_i¹ > 0, ∀i.
For k = 1, 2, … repeat till a stopping criterion is reached:

(i)   find x^{k+1} and Y^{k+1} s.t. ‖∇_x F(x^{k+1}, Y^{k+1}, u^k, v^k, U^k, p^k)‖ ≤ K
(ii)  u_i^{k+1} = u_i^k ϕ'_g(g_i(x^{k+1})/p_i^k),          i = 1, …, m_NLP
      v_i^{k+1} = v_i^k ϕ'_h(h_i(x^{k+1})/p_i^k),          i = 1, …, m_SOCP
      U_i^{k+1} = D_A Φ_P(A_i(x^{k+1}, Y^{k+1}); U_i^k),   i = 1, …, m_SDP
(iii) p_i^{k+1} < p_i^k,  ∀i
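A minimal scalar sketch of steps (i)–(iii) (illustrative only; it uses Polyak's modified log-barrier, not PENNON's actual ϕ, and the problem is made up): minimize f(x) = x² subject to g(x) = x − 1 ≥ 0, whose solution is x* = 1 with multiplier u* = 2.

```python
import math

# Illustrative scalar version of the algorithm above (not PENNON itself):
# minimize f(x) = x^2  subject to  g(x) = x - 1 >= 0,
# with the modified barrier  F(x, u, p) = f(x) - u * p * log(1 + g(x)/p).

u, p = 1.0, 1.0
for k in range(60):
    # (i) inner minimization: grad F = 2x - u*p/(p + x - 1) = 0 is a
    #     quadratic in x; take its positive root (closed form here,
    #     Newton's method in general)
    a, b, c = 2.0, 2.0 * (p - 1.0), -u * p
    x = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)
    # (ii) multiplier update (modified-barrier form): u <- u / (1 + g(x)/p)
    u = u / (1.0 + (x - 1.0) / p)
    # (iii) decrease the penalty parameter
    p = max(0.7 * p, 1e-8)

print(x, u)  # -> close to 1.0 and 2.0
```

At the fixed point the update forces g(x) = 0, and stationarity 2x = u then recovers the KKT pair (x*, u*) = (1, 2).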
PENNON theory

Assume
1. f, g, h, A ∈ C²
2. the feasible set Ω (in (x, Y)) is nonempty and bounded
3. a Constraint Qualification holds

Then there exists an index set K so that:
• x^k → x̂,  k ∈ K
• U^k → Û,  k ∈ K
• (x̂, Û) satisfies the first-order optimality conditions
What's new

PENNON is being implemented in the NAG (The Numerical Algorithms Group) library.
The first routines should appear in the NAG Fortran Library (Autumn 2014).

By-product:
PENLAB — free, open, fully functional version of PENNON coded in MATLAB
PENLAB
PENLAB — free, open source, fully functional version of PENNON coded in MATLAB
Designed and coded by Jan Fiala (NAG)
• Open source, all in MATLAB (one MEX function)
• The basic algorithm is identical
• Some data handling routines not (yet?) implemented
• PENLAB runs just like PENNON but is slower
Pre-programmed procedures for
• standard NLP (with AMPL input!)
• linear SDP (reading SDPA input files)
• bilinear SDP (=BMI)
• SDP with polynomial MI (PMI)
• easy to add more (QP, robust QP, SOF, TTO. . . )
PENLAB

The problem:

    min_{x∈R^n, Y_1∈S^{p_1},…,Y_k∈S^{p_k}}  f(x, Y)
    subject to
        g_i(x, Y) ≤ 0,   i = 1, …, m_g
        h_i(x, Y) = 0,   i = 1, …, m_h        (NLP-SDP)
        A_i(x, Y) ⪯ 0,   i = 1, …, m_A
        λ̲_i I ⪯ Y_i ⪯ λ̄_i I,   i = 1, …, k

A_i(x, Y) … nonlinear matrix operators
PENLAB
Solving a problem:
• prepare a structure penm containing basic problem data
• >> prob = penlab(penm); MATLAB class containing
all data
• >> solve(prob);
• results are returned in the class prob
The user has to provide MATLAB functions for
• function values
• gradients
• Hessians (for nonlinear functions)
of all f , g, A.
Structure penm and f/g/h functions

Example:  min x₁ + x₂  s.t.  x₁² + x₂² ≤ 1,  x₁ ≥ −0.5

penm = [];
penm.Nx = 2;
penm.lbx = [-0.5 ; -Inf];
penm.NgNLN = 1;
penm.ubg = [1];
penm.objfun  = @(x,Y) deal(x(1) + x(2));
penm.objgrad = @(x,Y) deal([1 ; 1]);
penm.confun  = @(x,Y) deal([x(1)^2 + x(2)^2]);
penm.congrad = @(x,Y) deal([2*x(1) ; 2*x(2)]);
penm.conhess = @(x,Y) deal([2 0 ; 0 2]);
% set starting point
penm.xinit = [2,1];
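The same toy problem can be cross-checked with any NLP solver. A sketch using scipy (not part of PENLAB; the optimum is x = (−0.5, −√0.75)):

```python
from scipy.optimize import minimize

# Same toy problem as the PENLAB example above, solved with scipy for
# comparison: min x1 + x2  s.t.  x1^2 + x2^2 <= 1,  x1 >= -0.5
res = minimize(
    fun=lambda x: x[0] + x[1],
    x0=[2.0, 1.0],                        # same starting point as penm.xinit
    bounds=[(-0.5, None), (None, None)],  # x1 >= -0.5
    constraints=[{"type": "ineq",         # scipy convention: fun(x) >= 0
                  "fun": lambda x: 1.0 - x[0]**2 - x[1]**2}],
)
print(res.x)  # -> approximately [-0.5, -0.866]
```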
PENLAB gives you a free MATLAB solver for large-scale NLP!
Toy NLP-SDP example

    min_{x∈R²}  ½ (x₁² + x₂²)

    subject to
        [  1      x₁−1    0  ]
        [ x₁−1     1     x₂  ]  ⪰ 0
        [  0      x₂      1  ]
B = sparse([1 -1 0; -1 1 0; 0 0 1]);
A{1} = sparse([0 1 0; 1 0 0; 0 0 0]);
A{2} = sparse([0 0 0; 0 0 1; 0 1 0]);
...
penm.NALIN=1;
...
penm.mconfun =@(x,Y,k)deal(B+A{1}*x(1)+A{2}*x(2))
penm.mcongrad=@(x,Y,k,i) deal(A{i})
Example: nearest correlation matrix

Find a nearest correlation matrix:

    min_X  Σ_{i,j=1}^n (X_ij − H_ij)²        (1)
    subject to
        X_ii = 1,   i = 1, …, n
        X ⪰ 0
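For comparison, problem (1) can also be solved by alternating projections with Dykstra's correction (Higham's method, not the PENLAB route). An illustrative numpy sketch with made-up data:

```python
import numpy as np

def nearest_correlation(H, iters=200):
    """Nearest correlation matrix to symmetric H in the Frobenius norm,
    via alternating projections with Dykstra's correction (Higham 2002).
    Illustrative sketch, not the PENLAB/PENCOR solver."""
    Y = H.copy()
    dS = np.zeros_like(H)
    for _ in range(iters):
        R = Y - dS                       # apply Dykstra's correction
        # Projection onto the PSD cone: clip negative eigenvalues
        w, V = np.linalg.eigh((R + R.T) / 2)
        X = (V * np.clip(w, 0, None)) @ V.T
        dS = X - R
        # Projection onto the unit-diagonal affine set
        Y = X.copy()
        np.fill_diagonal(Y, 1.0)
    return Y

H = np.array([[1.0, 0.9, 0.7],
              [0.9, 1.0, 0.3],
              [0.7, 0.3, 1.0]])          # indefinite "pseudo-correlation" data
X = nearest_correlation(H)
print(np.diag(X))                        # unit diagonal
print(np.linalg.eigvalsh(X).min())       # no significantly negative eigenvalues
```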
Example: nearest correlation matrix

Additionally, the condition number of the nearest correlation matrix must be bounded by κ.

Using the transformation of the variable X: z X̃ = X.

The new problem:

    min_{z, X̃}  Σ_{i,j=1}^n (z X̃_ij − H_ij)²      (2)
    subject to
        z X̃_ii = 1,   i = 1, …, n
        I ⪯ X̃ ⪯ κI
Example: nearest correlation matrix

Cooperation with Allianz SE, Munich: matrices of size up to 3500 × 3500.

Code PENCOR:
• C code, data in xml format
• feasibility analysis
• sensitivity analysis w.r.t. bounds on matrix elements
NLP with AMPL input
Pre-programmed. All you need to do:
>> penm=nlp_define(’datafiles/chain100.nl’);
>> prob=penlab(penm);
>> prob.solve();
NLP with AMPL input

    problem       vars   constr.  constr.     PENNON          PENLAB
                                  type        sec   iter.     sec   iter.
    chain800      3199   2400     =             1   14/23       6   24/56
    pinene400     8000   7995     =             1   7/7        11   17/17
    channel800    6398   6398     =             3   3/3         1   3/3
    torsion100    5000   10000    ≤             1   17/17      17   26/26
    lane emd10    4811   21       ≤           217   30/86      64   25/49
    dirichlet10   4491   21       ≤           151   33/71      73   32/68
    henon10       2701   21       ≤            57   49/128     63   76/158
    minsurf100    5000   5000     box           1   20/20      97   203/203
    gasoil400     4001   3998     =&b           3   34/34      13   59/71
    duct15        2895   8601     =&≤           6   19/19       9   11/11
    marine400     6415   6392     ≤&b           2   39/39      22   35/35
    steering800   3999   3200     ≤&b           1   9/9         7   19/40
    methanol400   4802   4797     =&b           2   24/24      16   47/67
Linear SDP with SDPA input

Pre-programmed. All you need to do:

>> sdpdata=readsdpa('datafiles/arch0.dat-s');
>> penm=sdp_define(sdpdata);
>> prob=penlab(penm);
>> prob.solve();
Bilinear matrix inequalities (BMI)

Pre-programmed. All you need to do:

>> bmidata=define_my_problem; %matrices A, K, ...
>> penm=bmi_define(bmidata);
>> prob=penlab(penm);
>> prob.solve();

    min_{x∈R^n}  c^T x
    s.t.  A_0^i + Σ_{k=1}^n x_k A_k^i + Σ_{k=1}^n Σ_{ℓ=1}^n x_k x_ℓ K_{kℓ}^i ⪯ 0,   i = 1, …, m
Polynomial matrix inequalities (PMI)

Pre-programmed. All you need to do:

>> load datafiles/example_pmidata;
>> penm = pmi_define(pmidata);
>> problem = penlab(penm);
>> problem.solve();

    min_{x∈R^n}  ½ x^T H x + c^T x
    subject to  b_low ≤ Bx ≤ b_up
                A^i(x) ⪯ 0,   i = 1, …, m

with

    A(x) = Σ_i x^{κ(i)} Q_i

where κ(i) is a multi-index with possibly repeated entries and x^{κ(i)} is a product of elements with indices in κ(i).
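To make the multi-index notation concrete, here is a tiny numpy evaluation of A(x) = Σ_i x^{κ(i)} Q_i (illustrative only; the data Q_i and κ(i) below are made up, with 0-based indices):

```python
import math
import numpy as np

def pmi_operator(x, kappa, Q):
    """Evaluate A(x) = sum_i x^{kappa(i)} Q_i, where x^{kappa(i)} is the
    product of x_j over the (possibly repeated) indices j in kappa(i).
    An empty multi-index gives the constant term Q_i."""
    return sum(math.prod(x[j] for j in k) * Qi for k, Qi in zip(kappa, Q))

# Hypothetical data: A(x) = Q0 + x0*Q1 + x0*x1*Q2 + x1^2*Q3
kappa = [(), (0,), (0, 1), (1, 1)]
Q = [np.eye(2),
     np.array([[0.0, 1.0], [1.0, 0.0]]),
     np.array([[1.0, 0.0], [0.0, -1.0]]),
     np.array([[0.0, 0.0], [0.0, 1.0]])]

x = np.array([2.0, 3.0])
A = pmi_operator(x, kappa, Q)
# Q0 + 2*Q1 + 6*Q2 + 9*Q3 = [[7, 2], [2, 4]]
print(A)
```

Repeated entries in a multi-index give monomial powers, e.g. κ = (1, 1) contributes x₁²; the BMI form of the previous slide is the special case of multi-indices of length at most two.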
YALMIP interface

Define your problem using YALMIP, call yalmip2bmi, solve by PENLAB.

%YALMIP commands
A = [-1 2;-3 -4]; B=-[1;1];
P = sdpvar(2,2); K = sdpvar(1,2);
F = [(A+B*K)'*P+P*(A+B*K) < -eye(2); P>eye(2)]
yalpen=export(F,trace(P),sdpsettings('solver','penlab'));

%PENLAB commands
bmi=yalmip2bmi(yalpen);
penm = bmi_define(bmi);
prob = penlab(penm);
solve(prob);
K = (prob.x(4:5))';
Other pre-programmed modules
• Nearest correlation matrix
• Truss topology optimization (stability constraints)
• Static output feedback (input from COMPlib, formulated as
PMI)
• Robust QP
Other Applications, Availability

• polynomial matrix inequalities (with Didier Henrion)
• nearest correlation matrix
• sensor network localization
• approximation by nonnegative splines
• financial mathematics (with Ralf Werner)
• Mathematical Programming with Equilibrium Constraints (with Danny Ralph)
• structural optimization with matrix variables and nonlinear matrix constraints (PLATO-N EU FP6 project)
• approximation of arrival rate function of a non-homogeneous Poisson process (F. Alizadeh, J. Eckstein)
• Conic Geometric Programming (Venkat Chandrasekaran)

Many other applications… any hint welcome

Free academic version of the code PENNON available
PENLAB available for download from my website