Local Superlinear Convergence of Polynomial-Time Interior-Point Methods for Hyperbolicity Cone Optimization Problems

2016 ◽  
Vol 26 (1) ◽  
pp. 139-170 ◽  
Author(s):  
Yu. Nesterov ◽  
L. Tunçel

2001 ◽ 
Vol 53 (3) ◽  
pp. 470-488 ◽  
Author(s):  
Heinz H. Bauschke ◽  
Osman Güler ◽  
Adrian S. Lewis ◽  
Hristo S. Sendov

Abstract: A homogeneous real polynomial p is hyperbolic with respect to a given vector d if the univariate polynomial t ⟼ p(x − td) has all real roots for all vectors x. Motivated by partial differential equations, Gårding proved in 1951 that the largest such root is a convex function of x, and showed various ways of constructing new hyperbolic polynomials. We present a powerful new such construction, and use it to generalize Gårding’s result to arbitrary symmetric functions of the roots. Many classical and recent inequalities follow easily. We develop various convex-analytic tools for such symmetric functions, of interest in interior-point methods for optimization problems over related cones.
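The standard motivating example (not part of this abstract, but classical): p(X) = det(X) on symmetric matrices is hyperbolic with respect to d = I, since the roots of t ⟼ det(X − tI) are the eigenvalues of X and hence all real; Gårding's theorem then says the largest root, λ_max, is convex. A minimal numerical illustration of both facts, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_sym(n):
    """A random n-by-n real symmetric matrix."""
    A = rng.standard_normal((n, n))
    return (A + A.T) / 2

n = 4
X, Y = random_sym(n), random_sym(n)

# Hyperbolicity of p = det with respect to d = I: every root of the
# univariate polynomial t -> det(X - t I) is real, because those roots
# are exactly the eigenvalues of the symmetric matrix X.
roots = np.roots(np.poly(X))  # roots of the characteristic polynomial of X
roots_real = bool(np.all(np.abs(roots.imag) < 1e-6))

# Garding (1951): the largest root lambda_max(X) is a convex function of X.
lam = lambda M: np.linalg.eigvalsh(M).max()
garding_convex = all(
    lam(t * X + (1 - t) * Y) <= t * lam(X) + (1 - t) * lam(Y) + 1e-10
    for t in np.linspace(0.0, 1.0, 11)
)
```

Here `roots_real` and `garding_convex` both come out true; the paper's contribution extends this convexity from the largest root to arbitrary symmetric functions of all the roots.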


2007 ◽  
Vol 49 (2) ◽  
pp. 259-270 ◽  
Author(s):  
Keyvan Amini ◽ 
Arash Haseli

Abstract: Interior-Point Methods (IPMs) are not only very effective in practice for solving linear optimization problems but also have polynomial-time complexity. Despite the practical efficiency of large-update algorithms, from a theoretical point of view, these algorithms have a weaker iteration bound than small-update algorithms. In fact, there is a significant gap between theory and practice for large-update algorithms. By introducing self-regular barrier functions, Peng, Roos and Terlaky narrowed this gap to a factor of log n. However, checking that a function is self-regular is not simple, and proofs of theorems involving these functions are very complicated. Roos et al., by presenting a new class of barrier functions which are not necessarily self-regular, achieved very good results through much simpler theorems. In this paper we introduce a new kernel function in this class which yields the best known complexity bound, both for large-update and small-update methods.
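For context (this example is not the paper's new kernel, which the abstract does not reproduce): in the kernel-function framework of Peng, Roos and Terlaky, the barrier is built coordinate-wise from a univariate kernel ψ with ψ(1) = 0, ψ′(1) = 0, and ψ strictly convex on (0, ∞), so the barrier vanishes exactly at the central path. A minimal sketch using the classical logarithmic kernel, assuming NumPy:

```python
import numpy as np

def psi(t):
    # Classical logarithmic kernel: psi(t) = (t^2 - 1)/2 - ln t.
    # psi(1) = 0 and psi'(1) = 0, so t = 1 is its global minimizer.
    return (t ** 2 - 1) / 2 - np.log(t)

def dpsi(t):
    # psi'(t) = t - 1/t; psi''(t) = 1 + 1/t^2 >= 1, so psi is strictly convex.
    return t - 1.0 / t

def Psi(v):
    # Barrier for the scaled centering vector v > 0: Psi(v) = sum_i psi(v_i).
    # Psi(v) = 0 iff v is the all-ones vector, i.e. the iterate is on the
    # central path; large-update IPMs drive Psi back toward 0 after each
    # reduction of the barrier parameter.
    return psi(np.asarray(v, dtype=float)).sum()
```

For example, `Psi(np.ones(5))` is 0, while any deviation from the all-ones vector gives a strictly positive value; kernel functions such as the one proposed in this paper replace ψ to improve the large-update iteration bound.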


2003 ◽  
Vol 18 (1) ◽  
pp. 238-244 ◽  
Author(s):  
J. Riquelme Santos ◽  
A. Troncoso Lora ◽  
A. Gomez Exposito ◽ 
J.L. Martinez Ramos

Acta Numerica ◽  
1996 ◽  
Vol 5 ◽  
pp. 149-190 ◽  
Author(s):  
Adrian S. Lewis ◽  
Michael L. Overton

Optimization problems involving eigenvalues arise in many different mathematical disciplines. This article is divided into two parts. Part I gives a historical account of the development of the field. We discuss various applications that have been especially influential, from structural analysis to combinatorial optimization, and we survey algorithmic developments, including the recent advance of interior-point methods for a specific problem class: semidefinite programming. In Part II we primarily address optimization of convex functions of eigenvalues of symmetric matrices subject to linear constraints. We derive a fairly complete mathematical theory, some of it classical and some of it new. Using the elegant language of conjugate duality theory, we highlight the parallels between the analysis of invariant matrix norms and weakly invariant convex matrix functions. We then restrict our attention further to linear and semidefinite programming, emphasizing the parallel duality theory and comparing primal-dual interior-point methods for the two problem classes. The final section presents some apparently new variational results about eigenvalues of nonsymmetric matrices, unifying known characterizations of the spectral abscissa (related to Lyapunov theory) and the spectral radius (as an infimum of matrix norms).
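The convexity that underlies much of Part II of this survey rests on the variational characterization λ_max(A) = sup over unit vectors x of xᵀAx: each map A ⟼ xᵀAx is linear in A, so λ_max is a supremum of linear functions and hence convex. A minimal numerical check of that characterization (an illustration, not from the article), assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
A = rng.standard_normal((n, n))
A = (A + A.T) / 2  # symmetric, so eigenvalues are real

w, V = np.linalg.eigh(A)   # ascending eigenvalues, orthonormal eigenvectors
lam_max = w[-1]
v_top = V[:, -1]

# The supremum sup_{||x|| = 1} x^T A x is attained at the top eigenvector.
attained = abs(v_top @ A @ v_top - lam_max) < 1e-10

# And x^T A x <= lambda_max for every unit vector x (sampled at random):
# since each A -> x^T A x is linear, lambda_max is a sup of linear
# functions of A, hence convex in A.
dominates = True
for _ in range(200):
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    dominates &= bool(x @ A @ x <= lam_max + 1e-10)
```

Both `attained` and `dominates` come out true; the analogous infimum representation makes λ_min concave, which is why eigenvalue optimization problems of the kind surveyed here fit the convex-duality framework.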

