Cutting Plane Methods
Recently Published Documents


TOTAL DOCUMENTS: 46 (FIVE YEARS: 1)

H-INDEX: 12 (FIVE YEARS: 0)

Author(s): Yuzhu Wang ◽ Akihiro Tanaka ◽ Akiko Yoshise

Abstract
We develop techniques to construct a series of sparse polyhedral approximations of the semidefinite cone. Motivated by the semidefinite (SD) bases proposed by Tanaka and Yoshise (Ann Oper Res 265:155–182, 2018), we propose a simple expansion of SD bases that preserves the sparsity of the matrices composing them. We prove that the polyhedral approximation built from our expanded SD bases contains the set of all diagonally dominant matrices and is contained in the set of all scaled diagonally dominant matrices. We also prove that the set of all scaled diagonally dominant matrices can be expressed using an infinite number of expanded SD bases. We use our approximations as the initial approximation in cutting plane methods for solving a semidefinite relaxation of the maximum stable set problem. The proposed methods with expanded SD bases prove significantly more efficient than methods using other existing approximations or solving the semidefinite relaxation directly.
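The inner approximation above rests on a standard fact: a symmetric diagonally dominant matrix with nonnegative diagonal is positive semidefinite, so the cone of such matrices is a polyhedral inner approximation of the semidefinite cone. A minimal NumPy sketch of that inclusion (an illustration only, not the authors' expanded SD bases; the function name and test matrix are mine):

```python
import numpy as np

def is_diagonally_dominant(A, tol=1e-12):
    """Check |a_ii| >= sum_{j != i} |a_ij| for every row of A."""
    d = np.abs(np.diag(A))
    off = np.sum(np.abs(A), axis=1) - d
    return bool(np.all(d >= off - tol))

# A symmetric diagonally dominant matrix with nonnegative diagonal;
# diagonal dominance certifies positive semidefiniteness without
# computing eigenvalues, which is what makes the DD cone a cheap
# polyhedral inner approximation of the semidefinite cone.
A = np.array([[3.0, 1.0, 1.0],
              [1.0, 2.0, 0.5],
              [1.0, 0.5, 2.0]])
assert is_diagonally_dominant(A)
assert np.all(np.linalg.eigvalsh(A) >= 0)  # indeed PSD
```

Linear constraints of this kind can seed a cutting plane scheme: one optimizes over the polyhedral outer/inner set and adds cuts only where the semidefinite constraint is violated.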


Author(s): Dimitris Bertsimas ◽ Nishanth Mundru

We consider the problem of best [Formula: see text]-subset convex regression using [Formula: see text] observations in [Formula: see text] variables. For the case without sparsity, we develop a scalable algorithm that obtains high-quality solutions in practical times that compare favorably with other state-of-the-art methods. We show that, by using a cutting plane method, the least squares convex regression problem can be solved for sizes [Formula: see text] in minutes and [Formula: see text] in hours. Our algorithm can be adapted to solve variants such as finding the best convex or concave function with coordinate-wise monotonicity or norm-bounded subgradients, and minimizing the [Formula: see text] loss, all with scalability similar to that of the least squares convex regression problem. Under sparsity, we propose algorithms that iteratively solve for the best subset of features based on first-order and cutting plane methods. We show that our methods scale for sizes [Formula: see text] in minutes and [Formula: see text] in hours. We demonstrate that these methods effectively control the false discovery rate.
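The cutting plane idea underlying such algorithms is Kelley-style: replace a convex objective by the maximum of linear under-estimators collected at the iterates, and re-solve a growing LP. A hedged one-dimensional sketch of that generic scheme (not the paper's algorithm; the function names and the toy objective are my own illustration):

```python
import numpy as np
from scipy.optimize import linprog

def kelley(f, grad, lo, hi, iters=30):
    """Minimize a convex f on [lo, hi] by accumulating linear cuts
    t >= f(x_k) + f'(x_k) * (x - x_k) and minimizing the LP model."""
    x = (lo + hi) / 2.0
    cuts = []  # each cut stored as (slope g, intercept b): t >= g*x + b
    for _ in range(iters):
        g = grad(x)
        cuts.append((g, f(x) - g * x))
        # Variables (x, t): minimize t subject to g*x - t <= -b for each cut.
        A_ub = [[gk, -1.0] for gk, _ in cuts]
        b_ub = [-bk for _, bk in cuts]
        res = linprog([0.0, 1.0], A_ub=A_ub, b_ub=b_ub,
                      bounds=[(lo, hi), (None, None)])
        x = res.x[0]  # minimizer of the current piecewise-linear model
    return x

# Toy problem: minimize (x - 1)^2 on [-1, 2]; the cuts drive x toward 1.
x_star = kelley(lambda x: (x - 1.0) ** 2,
                lambda x: 2.0 * (x - 1.0), -1.0, 2.0)
```

Each LP solve is cheap, and the cut model tightens monotonically, which is what makes such schemes scale to the problem sizes the abstract reports.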


2019 ◽ Vol 13 (7) ◽ pp. 1677-1692
Author(s): Xi Chen ◽ Ji-hong Zhang ◽ Xiao-song Ding ◽ Tian Yang ◽ Jing-yi Qian

2016 ◽ Vol 11 (3) ◽ pp. 483-495
Author(s): Ji-hong Zhang ◽ Xi Chen ◽ Xiao-song Ding

Author(s): Adil Bagirov ◽ Napsu Karmitsa ◽ Marko M. Mäkelä

2013 ◽ Vol 135 (10)
Author(s): Wenshan Wang ◽ Vincent Y. Blouin ◽ Melissa K. Gardenghi ◽ Georges M. Fadel ◽ Margaret M. Wiecek ◽ ...

Analytical target cascading (ATC), a hierarchical, multilevel, multidisciplinary coordination method, has proven to be an effective decomposition approach for large-scale engineering optimization problems. In recent years, augmented Lagrangian relaxation methods have received renewed interest as dual update methods for solving ATC-decomposed problems. These problems can be solved using the subgradient optimization algorithm, whose application includes three schemes for updating dual variables. To address the slow convergence of the existing dual update schemes, this paper investigates two new schemes, the linear and the proximal cutting plane methods, implemented in conjunction with augmented Lagrangian coordination for ATC-decomposed problems. Three nonconvex nonlinear example problems show that these two cutting plane methods can significantly reduce the number of iterations and the number of function evaluations compared with the traditional subgradient update methods. These methods are also compared with the method of multipliers and its variants, showing similar performance.
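The baseline the paper improves on, projected subgradient ascent on the dual variables, can be sketched on a toy problem. This is a generic illustration under my own assumptions (a one-variable constrained quadratic), not the paper's ATC formulation:

```python
def dual_subgradient(alpha=0.1, iters=200):
    """Toy dual update for: minimize x^2 subject to x >= 1.
    Lagrangian L(x, lam) = x^2 + lam * (1 - x); the inner minimizer is
    x = lam / 2, and the constraint violation 1 - x is a subgradient of
    the dual function, so the dual variable is updated by projected
    subgradient ascent: lam <- max(0, lam + alpha * (1 - x))."""
    lam = 0.0
    for _ in range(iters):
        x = lam / 2.0                     # argmin_x L(x, lam)
        g = 1.0 - x                       # dual subgradient (violation)
        lam = max(0.0, lam + alpha * g)   # projected ascent step
    return lam, lam / 2.0

lam, x = dual_subgradient()  # converges toward lam* = 2, x* = 1
```

A cutting plane dual update would instead build a piecewise-linear model of the dual function from these same subgradients and maximize the model at each step, which is the source of the iteration savings the paper reports.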

