Mesh adaptive direct search algorithms for constrained optimization

Cited: 871
Authors
Audet, C
Dennis, JE
Affiliations
[1] Ecole Polytech Montreal, GERAD, Montreal, PQ H3C 3A7, Canada
[2] Ecole Polytech Montreal, Dept Math & Genie Ind, Montreal, PQ H3C 3A7, Canada
[3] Rice Univ, Dept Appl & Computat Math, Houston, TX 77005 USA
Keywords
mesh adaptive direct search algorithms (MADS); convergence analysis; constrained optimization; nonsmooth analysis; Clarke derivatives; hypertangent; contingent cone
DOI
10.1137/040603371
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
This paper addresses the problem of minimizing a nonsmooth function under general nonsmooth constraints when no derivatives of the objective or constraint functions are available. We introduce the mesh adaptive direct search (MADS) class of algorithms, which extends the generalized pattern search (GPS) class by allowing local exploration, called polling, in an asymptotically dense set of directions in the space of optimization variables. This means that under certain hypotheses, including a weak constraint qualification due to Rockafellar, MADS can treat constraints by the extreme barrier approach of setting the objective to infinity for infeasible points and treating the problem as unconstrained. The main GPS convergence result identifies limit points $\hat{x}$ at which the Clarke generalized derivatives are nonnegative in a finite set of directions, called refining directions. Although in the unconstrained case nonnegative combinations of these directions span the whole space, the fact that there can be only finitely many GPS refining directions limits rigorous justification of the barrier approach to finitely many linear constraints for GPS. The main result of this paper is that the general MADS framework is flexible enough to allow the generation of an asymptotically dense set of refining directions along which the Clarke derivatives are nonnegative. We propose an instance of MADS for which the refining directions are dense in the hypertangent cone at $\hat{x}$ with probability 1 whenever the iterates associated with the refining directions converge to a single $\hat{x}$. This instance of MADS is compared to versions of GPS on some test problems. We also illustrate the limitations of our results with examples.
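The extreme barrier approach and the poll step described in the abstract are easy to state in code. The following is a minimal Python sketch, not the authors' implementation: the helper names (extreme_barrier, poll), the toy problem, and the fixed coordinate poll directions are assumptions for illustration; in particular, fixed directions are GPS-style, and the randomized direction generation that makes the MADS poll directions asymptotically dense is omitted.

    import numpy as np

    def extreme_barrier(f, is_feasible, x):
        # Barrier objective f_Omega: f(x) on the feasible set Omega, +infinity
        # outside, so the constrained problem can be treated as unconstrained.
        return f(x) if is_feasible(x) else float("inf")

    def poll(f, is_feasible, x, directions, step):
        # One poll step: evaluate the barrier objective at the mesh points
        # x + step * d. Return an improving point, or None on a failed poll
        # (after which the mesh is refined; MADS, unlike GPS, would also draw
        # new poll directions so that their union is asymptotically dense).
        fx = extreme_barrier(f, is_feasible, x)
        for d in directions:
            trial = x + step * d
            if extreme_barrier(f, is_feasible, trial) < fx:
                return trial
        return None

    # Toy usage: minimize ||x||^2 subject to x[0] >= 1, polling along +/- axes.
    f = lambda x: float(np.dot(x, x))
    feasible = lambda x: x[0] >= 1.0
    x = np.array([2.0, 1.0])
    dirs = [np.array(d) for d in ([1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0])]
    step = 0.5
    for _ in range(50):
        better = poll(f, feasible, x, dirs, step)
        if better is None:
            step *= 0.5   # poll failed: refine the mesh
        else:
            x = better    # poll succeeded: move to the improving point
    print(x)              # approaches the constrained minimizer (1, 0)

On this toy problem the iterates march to the boundary point (1, 0), where the barrier makes further feasible descent along the coordinate directions impossible; the paper's analysis explains why a dense set of refining directions, rather than this fixed finite set, is needed to certify such limit points for general nonlinear constraints.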
Pages: 188-217
Number of pages: 30