Smart “Predict, then Optimize”

Author(s):  
Adam N. Elmachtoub ◽  
Paul Grigas

Many real-world analytics problems involve two significant challenges: prediction and optimization. Because of the typically complex nature of each challenge, the standard paradigm is predict-then-optimize. By and large, machine learning tools are intended to minimize prediction error and do not account for how the predictions will be used in the downstream optimization problem. In contrast, we propose a new and very general framework, called Smart “Predict, then Optimize” (SPO), which directly leverages the optimization problem structure—that is, its objective and constraints—for designing better prediction models. A key component of our framework is the SPO loss function, which measures the decision error induced by a prediction. Training a prediction model with respect to the SPO loss is computationally challenging, and, thus, we derive, using duality theory, a convex surrogate loss function, which we call the SPO+ loss. Most importantly, we prove that the SPO+ loss is statistically consistent with respect to the SPO loss under mild conditions. Our SPO+ loss function can tractably handle any polyhedral, convex, or even mixed-integer optimization problem with a linear objective. Numerical experiments on shortest-path and portfolio-optimization problems show that the SPO framework can lead to significant improvement under the predict-then-optimize paradigm, in particular, when the prediction model being trained is misspecified. We find that linear models trained using SPO+ loss tend to dominate random-forest algorithms, even when the ground truth is highly nonlinear. This paper was accepted by Yinyu Ye, optimization.
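For a problem with a linear objective, the SPO loss and its SPO+ surrogate can be written down directly. The NumPy sketch below evaluates both over a toy feasible set of enumerated decisions (the instance, the three "paths", and all names are illustrative, not from the paper):

```python
import numpy as np

# Toy decision problem: min_w c^T w over a finite set of feasible
# decisions W (e.g., the paths of a tiny shortest-path instance).
W = np.array([[1., 0., 1.],   # path A uses edges 1 and 3
              [0., 1., 1.],   # path B uses edges 2 and 3
              [1., 1., 0.]])  # path C uses edges 1 and 2

def w_star(c):
    """An optimal decision for cost vector c (argmin over W)."""
    return W[np.argmin(W @ c)]

def spo_loss(c_hat, c):
    """Decision error: true cost of the decision induced by the
    prediction c_hat, minus the true optimal cost."""
    return c @ w_star(c_hat) - c @ w_star(c)

def spo_plus_loss(c_hat, c):
    """Convex surrogate: max_w (c - 2 c_hat)^T w + 2 c_hat^T w*(c) - z*(c),
    which upper-bounds the SPO loss and vanishes at c_hat = c."""
    z_true = c @ w_star(c)
    return np.max(W @ (c - 2. * c_hat)) + 2. * c_hat @ w_star(c) - z_true
```

A perfect prediction incurs zero loss under both, while a prediction that induces a suboptimal decision is charged its realized decision error (SPO) or a convex upper bound of it (SPO+).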

Author(s):  
Alexander Murray ◽  
Timm Faulwasser ◽  
Veit Hagenmeyer ◽  
Mario E. Villanueva ◽  
Boris Houska

This paper presents a novel partially distributed outer approximation algorithm, named PaDOA, for solving a class of structured mixed-integer convex programming problems to global optimality. The proposed scheme uses an iterative outer approximation method for coupled mixed-integer optimization problems with separable convex objective functions, affine coupling constraints, and compact domain. PaDOA proceeds by alternating between solving large-scale structured mixed-integer linear programming problems and partially decoupled mixed-integer nonlinear programming subproblems that comprise far fewer integer variables. We establish conditions under which PaDOA converges to global minimizers after a finite number of iterations and verify these properties with applications to thermostatically controlled loads and to mixed-integer regression.
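The master/subproblem alternation that outer approximation rests on can be illustrated on a toy example. The sketch below runs a plain, centralized outer-approximation loop on a one-dimensional mixed-integer convex problem; the instance, the enumeration-based "master", and the termination test are illustrative and far simpler than PaDOA's partially distributed scheme:

```python
# Toy 1-D mixed-integer convex problem: min f(x), x integer in [0, 5].
def f(x):  return (x - 2.6) ** 2
def df(x): return 2 * (x - 2.6)

domain = range(6)
cuts = []                      # (slope, intercept) of gradient cuts
x_k = 0                        # initial integer iterate
for _ in range(10):
    # add the linearization (gradient cut) of f at the current iterate
    cuts.append((df(x_k), f(x_k) - df(x_k) * x_k))
    # MILP master (here solved by brute-force enumeration): minimize
    # the piecewise-linear outer approximation, i.e. the max of all cuts
    x_next = min(domain, key=lambda x: max(a * x + b for a, b in cuts))
    if x_next == x_k:          # iterate repeats: approximation is tight
        break
    x_k = x_next
# x_k is now a global minimizer of f over the integer domain
```

Each pass tightens the polyhedral underestimate of the convex objective, so the loop terminates finitely on this toy instance; PaDOA applies the same principle with structured MILP masters and partially decoupled MINLP subproblems.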


2018 ◽  
Vol 8 (11) ◽  
pp. 2080 ◽  
Author(s):  
Enrique Cortés-Toro ◽  
Broderick Crawford ◽  
Juan Gómez-Pulido ◽  
Ricardo Soto ◽  
José Lanza-Gutiérrez

In this article, a novel optimization metaheuristic based on the vapour–liquid equilibrium is described to solve highly nonlinear optimization problems in continuous domains. During the search for the optimum, the procedure faithfully simulates the vapour–liquid equilibrium state of multiple binary chemical systems. Each decision variable of the optimization problem behaves as the molar fraction of the lightest component of a binary chemical system. The equilibrium state of each system is modified several times, independently and gradually, in two opposite directions and at different rates. The best thermodynamic equilibrium conditions of each system are searched and evaluated to identify the next step towards the solution of the optimization problem. While the search is carried out, the algorithm randomly accepts inadequate solutions. This is done in a controlled way, by setting a minimum acceptance probability, to restart the exploration in other areas and prevent the search from becoming trapped in locally optimal solutions. Moreover, the range of each decision variable is reduced autonomously during the search. On several benchmark functions, the algorithm achieves results competitive with those obtained by other stochastic algorithms, which allows us to conclude that our metaheuristic is a promising alternative in the optimization field.
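The controlled acceptance of worse ("inadequate") solutions described above can be sketched as a single decision rule: a worse candidate is accepted with a probability that never falls below a floor, so exploration can restart elsewhere. The exponential schedule and all parameter names below are illustrative assumptions, not the paper's exact rule:

```python
import math
import random

def accept(delta, temperature, p_min=0.05, rng=random.random):
    """Accept a candidate whose objective worsens by delta (> 0 means
    worse). A minimum acceptance probability p_min keeps the search
    able to escape locally optimal solutions."""
    if delta <= 0:                      # candidate is no worse: accept
        return True
    p = max(p_min, math.exp(-delta / temperature))
    return rng() < p
```

With a very large `delta` and a small `temperature`, the exponential term is essentially zero, yet the candidate is still accepted with probability `p_min`, which is what distinguishes this rule from plain simulated annealing.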


Author(s):  
Stephane Fliscounakis ◽  
Fabrice Zaoui ◽  
Marie-Pierre Houry ◽  
Emilie Milin

2013 ◽  
Vol 300-301 ◽  
pp. 645-648 ◽  
Author(s):  
Yung Chien Lin

Evolutionary algorithms (EAs) are population-based global search methods. Memetic algorithms (MAs) are hybrid EAs that combine genetic operators with local search methods. By coupling global exploration with local exploitation of the search space, MAs are capable of obtaining higher-quality solutions. On the other hand, mixed-integer hybrid differential evolution (MIHDE), an EA-based search algorithm, has been successfully applied to many mixed-integer optimization problems. In this paper, a mixed-integer memetic algorithm based on MIHDE is developed for solving mixed-integer constrained optimization problems. The proposed algorithm is implemented and applied to the optimal design of batch processes. Experimental results show that the proposed algorithm finds a better solution than several other search algorithms.
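The global-exploration/local-exploitation split that defines a memetic algorithm can be shown in a minimal skeleton: a population-based mutation step followed by a local refinement of each offspring. The objective, move rules, and parameters below are illustrative; this is not MIHDE itself:

```python
import random

def f(x, z):
    """Toy mixed-integer objective: continuous x, integer z."""
    return (x - 0.3) ** 2 + (z - 2) ** 2

def local_search(x, z):
    """Local exploitation: one coordinate-descent step on the integer
    variable (try z-1, z, z+1 and keep the best)."""
    return min(((x, k) for k in (z - 1, z, z + 1)), key=lambda s: f(*s))

def memetic(seed=0, pop_size=8, gens=30):
    rng = random.Random(seed)
    pop = [(rng.uniform(-5, 5), rng.randint(-5, 5)) for _ in range(pop_size)]
    for _ in range(gens):
        children = []
        for x, z in pop:
            # global exploration: Gaussian mutation (MIHDE would use a
            # differential-evolution style recombination here)
            x2 = x + rng.gauss(0, 0.5)
            z2 = z + rng.choice((-1, 0, 1))
            children.append(local_search(x2, z2))
        # elitist selection over parents and refined offspring
        pop = sorted(pop + children, key=lambda s: f(*s))[:pop_size]
    return pop[0]
```

Because every offspring is handed to `local_search` before selection, the integer coordinate snaps to its locally best value quickly, while the population-level mutation keeps searching the continuous coordinate globally.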


Author(s):  
Tetiana Lebedeva ◽  
Natalia Semenova ◽  
Tetiana Sergienko

The article is devoted to the study of the influence of uncertainty in the initial data on the solutions of mixed-integer vector optimization problems. In optimization problems, including problems with a vector criterion, small perturbations of the initial data can result in solutions strongly different from the true ones. The stability of the indicated problems is studied together with the directly related question of the stability of solutions belonging to certain subsets of the feasible set.


2021 ◽  
Author(s):  
Yunda Si ◽  
Chengfei Yan

Deep residual learning has shown great success in protein contact prediction. In this study, a new deep residual learning-based protein contact prediction model was developed. Compared with previous models, a new type of residual block hybridizing 1D and 2D convolutions was designed to increase the effective receptive field of the residual network, and a new loss function emphasizing the easily misclassified residue pairs was proposed to enhance the model training. The developed protein contact prediction model, referred to as DRN-1D2D, was first evaluated on 105 CASP 11 targets, 76 CAMEO hard targets, and 398 membrane proteins, together with two in-house reference models based on either the standard 2D residual block or the traditional BCE loss function, from which we confirmed that the dimensional hybrid residual block and the singularity-enhanced loss function can both be employed to improve model performance for protein contact prediction. DRN-1D2D was further evaluated on 39 CASP 13 and CASP 14 free-modeling targets together with the two reference models and four state-of-the-art protein contact prediction models, including DeepCov, DeepCon, RaptorX-Contact and TripleRes. The results show that DRN-1D2D consistently achieved the best performance among all these models.
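The abstract does not give the exact form of the singularity-enhanced loss, but the general mechanism it describes, re-weighting the binary cross-entropy so that misclassified residue pairs contribute more, can be sketched generically. The focal-style weight `(1 - p_t)**gamma` below is an assumption for illustration, not the paper's formula:

```python
import numpy as np

def weighted_bce(p, y, gamma=2.0, eps=1e-12):
    """Binary cross-entropy re-weighted toward hard examples.
    p: predicted contact probabilities, y: 0/1 labels."""
    p = np.clip(p, eps, 1 - eps)
    p_t = np.where(y == 1, p, 1 - p)   # probability of the true class
    w = (1 - p_t) ** gamma             # up-weight misclassified pairs
    return float(np.mean(-w * np.log(p_t)))
```

A confidently correct pair (high `p_t`) gets weight near zero, so the gradient signal concentrates on the pairs the model currently gets wrong, which is the training emphasis the abstract describes.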


Author(s):  
Josef Jablonský

Linear programming (LP) and mixed-integer linear programming (MILP) problems form a very important class of problems with applications in many managerial contexts. The aim of this paper is to discuss the computational performance of current optimization packages for solving large-scale LP and MILP optimization problems. The current market for LP and MILP solvers is quite extensive. GUROBI 6.0, IBM ILOG CPLEX 12.6.1, and XPRESS Optimizer 27.01 are probably among the most powerful solvers. Their attractiveness for academic research stems not only from their computational performance but also from their free availability for academic purposes. The solvers are tested on a set of problems selected from the MIPLIB 2010 library, which contains 361 test instances of varying hardness (easy, hard, and not yet solved).


Author(s):  
Merve Bodur ◽  
Timothy C. Y. Chan ◽  
Ian Yihang Zhu

Inverse optimization—determining parameters of an optimization problem that render a given solution optimal—has received increasing attention in recent years. Although significant inverse optimization literature exists for convex optimization problems, there have been few advances for discrete problems, despite the ubiquity of applications that fundamentally rely on discrete decision making. In this paper, we present a new set of theoretical insights and algorithms for the general class of inverse mixed integer linear optimization problems. Specifically, a general characterization of optimality conditions is established and leveraged to design new cutting plane solution algorithms. Through an extensive set of computational experiments, we show that our methods provide substantial improvements over existing methods in solving the largest and most difficult instances to date.
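The core idea, iteratively generating optimality cuts until the target solution becomes optimal, can be sketched on a toy instance with a finite decision set. The minimal-perturbation step below is a simple L2 projection onto each violated cut; the instance and the projection rule are illustrative, whereas the paper's algorithms handle general inverse mixed-integer linear problems:

```python
import numpy as np

W = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [1., 1., 0.]])          # enumerated feasible decisions
x0 = W[0]                             # decision we want to make optimal
c = np.array([1., 2., 3.])            # prior cost estimate c_hat

for _ in range(100):
    w = W[np.argmin(W @ c)]           # current minimizer under c
    d = x0 - w                        # violated cut requires c @ d <= 0
    if c @ d <= 1e-9:                 # x0 already optimal: done
        break
    c = c - (c @ d) / (d @ d) * d     # minimal L2 change onto the cut
```

Each projection is the smallest change to `c` that removes the current certificate of suboptimality of `x0`; the loop stops once no feasible decision beats `x0` under the adjusted costs.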

