Mathematical Modeling and Optimal Blank Generation in Glass Manufacturing

2014, Vol 2014, pp. 1-12
Author(s): Raymond Phillips, Matthew Woolway, Dario Fanucchi, M. Montaz Ali

This paper discusses the stock size selection problem (Chambers and Dyson, 1976), which is of relevance in the float glass industry. Given a fixed integer N, generally between 2 and 6 (but potentially larger), we find the N best sizes for intermediate stock from which to cut a roster of orders. An objective function is formulated with the purpose of minimizing wastage, and the problem is phrased as a combinatorial optimization problem involving the selection of columns of a cost matrix. Some bounds and heuristics are developed, and two exact algorithms (depth-first search and branch-and-bound) are applied to the problem, as well as one approximate algorithm (NOMAD). It is found that wastage reduces dramatically as N increases, but this trend becomes less pronounced for larger values of N (beyond 6 or 7). For typical values of N, branch-and-bound is able to find the exact solution within a reasonable amount of time.
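As a rough illustration of the column-selection formulation, the sketch below brute-forces the choice of N columns of a small wastage matrix (rows are orders, columns are candidate stock sizes). The matrix values are invented for the example; on realistic instances the paper's depth-first search and branch-and-bound would replace the exhaustive loop.

```python
from itertools import combinations

def best_stock_sizes(cost, n_sizes):
    """Exhaustively pick n_sizes columns of the wastage matrix `cost`
    (rows = orders, columns = candidate stock sizes), minimising total
    wastage when each order is cut from its cheapest selected size."""
    n_candidates = len(cost[0])
    best_cols, best_waste = None, float("inf")
    for cols in combinations(range(n_candidates), n_sizes):
        waste = sum(min(row[j] for j in cols) for row in cost)
        if waste < best_waste:
            best_cols, best_waste = cols, waste
    return best_cols, best_waste

# Toy example: 3 orders, 4 candidate stock sizes, pick N = 2.
cost = [
    [5, 2, 9, 4],
    [1, 7, 3, 6],
    [8, 2, 2, 5],
]
print(best_stock_sizes(cost, 2))  # -> ((0, 1), 5)
```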

2015, Vol 6 (1), pp. 35-46
Author(s): Yong Wang

The traveling salesman problem (TSP) is a classic combinatorial optimization problem. The time complexity of exact algorithms is generally an exponential function of the scale of the TSP. This work gives an approximate algorithm based on a four-vertex-three-line inequality for the triangle TSP. The algorithm runs in O(n^2) time and generates an approximation less than 2 times the optimal solution. The paper designs a simple algorithm using the inequality and compares it with the double-nearest-neighbor algorithm. The experimental results illustrate that the algorithm finds better approximations than the double-nearest-neighbor algorithm for most TSP instances.
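For context, a minimal nearest-neighbour tour construction for a Euclidean TSP instance is sketched below; it is a simple baseline in the spirit of the double-nearest-neighbor comparison, not the paper's inequality-based algorithm, and the instance is invented.

```python
import math

def nearest_neighbour_tour(points, start=0):
    """Plain nearest-neighbour construction for a Euclidean TSP instance:
    from the current city, always move to the closest unvisited city."""
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: math.dist(points[last], points[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

points = [(0, 0), (0, 3), (4, 0), (4, 3)]
tour = nearest_neighbour_tour(points)
length = sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
             for i in range(len(tour)))
print(tour, round(length, 2))  # -> [0, 1, 3, 2] 14.0
```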


Author(s): Nihal Berktaş, Hande Yaman

This paper presents an exact algorithm for the team formation problem, in which the aim is, given a project and its required skills, to construct a capable team that can communicate and collaborate effectively. This combinatorial optimization problem is modeled as a quadratic set covering problem. The study provides a novel branch-and-bound algorithm where a reformulation of the problem is relaxed so that it decomposes into a series of linear set covering problems, and the relaxed constraints are imposed through branching. The algorithm is able to solve instances that are intractable for commercial solvers. The study illustrates the efficient use of algorithmic methods and modeling techniques for an operations research problem. It contributes to the field of computational optimization by proposing a new application and a new algorithm to solve a quadratic version of a classical combinatorial optimization problem.
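The linear set covering subproblems into which the relaxation decomposes can be attacked with the classic greedy heuristic; the sketch below applies it to a toy team-formation instance with hypothetical skills and costs, and is illustrative only, not the paper's exact branch-and-bound.

```python
def greedy_set_cover(universe, subsets, costs):
    """Classic greedy heuristic for (linear) set covering: repeatedly pick
    the subset with the lowest cost per newly covered element."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = min(
            (i for i in range(len(subsets)) if subsets[i] & uncovered),
            key=lambda i: costs[i] / len(subsets[i] & uncovered),
        )
        chosen.append(best)
        uncovered -= subsets[best]
    return chosen

# Skills required by the project and candidates' skill sets (hypothetical).
skills = {"ml", "nlp", "db", "ui"}
candidates = [{"ml", "nlp"}, {"db"}, {"ui", "db"}, {"nlp", "ui"}]
costs = [3.0, 1.0, 2.0, 2.5]
print(greedy_set_cover(skills, candidates, costs))  # -> [1, 3, 0]
```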


2018, Vol 54 (5), pp. 72
Author(s): Quoc, H.D., Kien, N.T., Thuy, T.T.C., Hai, L.H., Thanh, V.N.

2021, Vol 24 (2), pp. 1-35
Author(s): Isabel Wagner, Iryna Yevseyeva

The ability to measure privacy accurately and consistently is key in the development of new privacy protections. However, recent studies have uncovered weaknesses in existing privacy metrics, as well as weaknesses caused by the use of only a single privacy metric. Metrics suites, or combinations of privacy metrics, are a promising mechanism to alleviate these weaknesses, if we can solve two open problems: which metrics should be combined and how. In this article, we tackle the first problem, i.e., the selection of metrics for strong metrics suites, by formulating it as a knapsack optimization problem with both single and multiple objectives. Because solving this problem exactly is difficult due to the large number of combinations and many qualities/objectives that need to be evaluated for each metrics suite, we apply 16 existing evolutionary and metaheuristic optimization algorithms. We solve the optimization problem for three privacy application domains: genomic privacy, graph privacy, and vehicular communications privacy. We find that the resulting metrics suites have better properties, i.e., higher monotonicity, diversity, evenness, and shared value range, than previously proposed metrics suites.
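As a simplified illustration of the knapsack formulation, the sketch below selects a metrics suite with a random-restart bit-flip hill climber under a single quality objective and a size budget. The scores, costs, and the algorithm itself are hypothetical stand-ins for the paper's multi-objective models and its 16 evolutionary and metaheuristic algorithms.

```python
import random

def knapsack_hill_climb(values, weights, capacity, iters=2000, seed=0):
    """Bit-flip hill climber for a 0/1 knapsack: select a suite (subset of
    metrics) maximising total quality `values` under budget `capacity`."""
    rng = random.Random(seed)
    n = len(values)

    def score(sel):
        w = sum(weights[i] for i in range(n) if sel[i])
        return sum(values[i] for i in range(n) if sel[i]) if w <= capacity else -1

    best = [0] * n
    for _ in range(iters):
        cand = best[:]
        cand[rng.randrange(n)] ^= 1  # flip one metric in/out of the suite
        if score(cand) >= score(best):
            best = cand
    return best, score(best)

values = [6, 5, 8, 9, 6, 7]   # hypothetical metric-quality scores
weights = [2, 3, 6, 7, 5, 9]  # hypothetical evaluation costs
print(knapsack_hill_climb(values, weights, capacity=15))  # suite as 0/1 vector, quality
```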


Author(s): Jing Tang, Xueyan Tang, Andrew Lim, Kai Han, Chongshou Li, ...

Monotone submodular maximization with a knapsack constraint is NP-hard. Various approximation algorithms have been devised to address this optimization problem. In this paper, we revisit the widely known modified greedy algorithm. First, we show that this algorithm can achieve an approximation factor of 0.405, which significantly improves the known factors of 0.357 given by Wolsey and (1 − 1/e)/2 ≈ 0.316 given by Khuller et al. More importantly, our analysis closes a gap in Khuller et al.'s proof for the extensively mentioned approximation factor of (1 − 1/√e) ≈ 0.393 in the literature, clarifying a long-standing misconception on this issue. Second, we enhance the modified greedy algorithm to derive a data-dependent upper bound on the optimum. We empirically demonstrate the tightness of our upper bound with a real-world application. The bound enables us to obtain a data-dependent ratio, typically much higher than 0.405, between the solution value of the modified greedy algorithm and the optimum. It can also be used to significantly improve the efficiency of algorithms such as branch and bound.
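A minimal sketch of the standard modified greedy scheme, instantiated for weighted coverage (a monotone submodular function): grow the solution by best marginal gain per unit cost, then return the better of that solution and the best single affordable element. The instance data are invented, and the sketch reflects the textbook algorithm rather than the paper's refined analysis.

```python
def modified_greedy(universe_weights, sets, costs, budget):
    """Modified greedy for monotone submodular maximisation under a
    knapsack constraint, shown for weighted coverage."""
    def f(sol):  # weighted coverage: monotone and submodular
        covered = set().union(*(sets[i] for i in sol)) if sol else set()
        return sum(universe_weights[e] for e in covered)

    sol, spent = [], 0.0
    while True:
        base = f(sol)
        best_i, best_ratio = None, 0.0
        for i in range(len(sets)):
            if i in sol or spent + costs[i] > budget:
                continue
            gain = f(sol + [i]) - base
            if gain / costs[i] > best_ratio:
                best_i, best_ratio = i, gain / costs[i]
        if best_i is None:
            break
        sol.append(best_i)
        spent += costs[best_i]

    # Return the better of the greedy solution and the best single element.
    singles = [i for i in range(len(sets)) if costs[i] <= budget]
    best_single = max(singles, key=lambda i: f([i]), default=None)
    return max([sol, [best_single]], key=f) if best_single is not None else sol

weights = {1: 4.0, 2: 1.0, 3: 3.0, 4: 2.0}
sets = [{1, 2}, {2, 3}, {3, 4}, {1, 4}]
costs = [2.0, 1.0, 1.5, 2.5]
print(modified_greedy(weights, sets, costs, budget=3.0))  # -> [1, 0]
```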


2021, Vol 17 (4), pp. 1-20
Author(s): Serena Wang, Maya Gupta, Seungil You

Given a classifier ensemble and a dataset, many examples can be confidently and accurately classified after only a subset of the base models in the ensemble has been evaluated. Dynamically deciding to classify early can reduce both mean latency and CPU usage without harming the accuracy of the original ensemble. To achieve such gains, we propose jointly optimizing the evaluation order of the base models and the early-stopping thresholds. Our proposed objective is a combinatorial optimization problem, but we provide a greedy algorithm that achieves a 4-approximation of the optimal solution under certain assumptions, which is also the best achievable polynomial-time approximation bound. Experiments on benchmark and real-world problems show that the proposed Quit When You Can (QWYC) algorithm can speed up average evaluation time by 1.8–2.7 times even on jointly trained ensembles, which are more difficult to speed up than independently or sequentially trained ensembles. QWYC's joint optimization of ordering and thresholds also performed better in experiments than previous fixed orderings, including gradient boosted trees' ordering.
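A schematic of the early-exit evaluation that QWYC optimizes: base models are scored in a fixed order, and prediction stops as soon as the running score clears a per-stage confidence band. The ordering, thresholds, and toy models below are hypothetical; the paper's contribution is the joint optimization of both.

```python
def early_exit_predict(models, thresholds, x):
    """Evaluate an additive ensemble in order, stopping early once the
    running score leaves the per-stage band thresholds[k] = (lo, hi)."""
    score = 0.0
    for k, model in enumerate(models):
        score += model(x)
        lo, hi = thresholds[k]
        if score <= lo:  # confidently negative: stop early
            return -1, k + 1
        if score >= hi:  # confidently positive: stop early
            return +1, k + 1
    return (1 if score >= 0 else -1), len(models)

# Toy ensemble of three scorers on a 2-feature input (hypothetical values).
models = [lambda x: 1.0 if x[0] > 0 else -1.0,
          lambda x: 0.5 * x[1],
          lambda x: -0.2]
thresholds = [(-1.5, 1.5), (-1.0, 1.0), (0.0, 0.0)]
label, used = early_exit_predict(models, thresholds, (2.0, 3.0))
print(label, used)  # -> 1 2  (stopped after two of the three models)
```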


Symmetry, 2020, Vol 12 (1), pp. 94
Author(s): Dario Fasino, Franca Rinaldi

The core–periphery structure is one of the key concepts in the structural analysis of complex networks. It consists of a partitioning of the node set of a given graph or network into two groups, called core and periphery, where the core nodes induce a well-connected subgraph and share connections with peripheral nodes, while the peripheral nodes are loosely connected to the core nodes and other peripheral nodes. We propose a polynomial-time algorithm to detect core–periphery structures in networks having a symmetric adjacency matrix. The core set is defined as the solution of a combinatorial optimization problem, which has a pleasant symmetry with respect to graph complementation. We provide a complete description of the optimal solutions to that problem and an exact and efficient algorithm to compute them. The proposed approach is extended to networks with loops and oriented edges. Numerical simulations are carried out on both synthetic and real-world networks to demonstrate the effectiveness and practicability of the proposed algorithm.
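As a rough illustration of the core–periphery idea (not the paper's exact algorithm), the sketch below orders nodes by degree and picks the prefix that minimizes violations of the ideal pattern: missing core-core edges plus present periphery-periphery edges. The adjacency matrix is a toy example.

```python
import itertools

def core_periphery_by_degree(adj):
    """Degree-based heuristic for core-periphery detection on a symmetric
    adjacency matrix: try each degree-ordered prefix as the core and keep
    the one with the fewest pattern violations."""
    n = len(adj)
    order = sorted(range(n), key=lambda v: -sum(adj[v]))
    best_core, best_viol = set(), float("inf")
    for k in range(n + 1):
        core = set(order[:k])
        viol = 0
        for u, v in itertools.combinations(range(n), 2):
            if u in core and v in core and not adj[u][v]:
                viol += 1  # core pair that should be connected
            if u not in core and v not in core and adj[u][v]:
                viol += 1  # peripheral pair that should not be connected
        if viol < best_viol:
            best_core, best_viol = core, viol
    return best_core

# 5-node graph: nodes 0 and 1 form a connected core, 2-4 hang off the core.
adj = [[0, 1, 1, 1, 0],
       [1, 0, 0, 1, 1],
       [1, 0, 0, 0, 0],
       [1, 1, 0, 0, 0],
       [0, 1, 0, 0, 0]]
print(core_periphery_by_degree(adj))  # -> {0, 1}
```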


2015, Vol 1120-1121, pp. 670-674
Author(s): Abdelmadjid Ait Yala, Abderrahmanne Akkouche

The aim of this work is to define a general method for the optimization of composite patch repair. Fracture mechanics theory shows that the stress intensity factor tends towards an asymptotic limit K∞. This limit is given by Rose's formula and is a function of the thicknesses and mechanical properties of the cracked plate, the composite patch, and the adhesive. The proposed approach consists in considering this limit as an objective function to be minimized: lowering the asymptote reduces the values of the stress intensity factor and hence optimizes the repair. However, to be effective, the resulting design must also satisfy the stiffness-ratio criterion. Solving this two-objective optimization problem with a MATLAB program allowed us to determine the geometric and mechanical properties that yield the optimum design, that is, the selection of the adhesive, the patch, and their respective thicknesses.
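A schematic sketch of the optimization loop described above, with a placeholder surrogate standing in for Rose's formula (the actual expression is not reproduced here) and hypothetical material values. It grid-searches patch thicknesses whose stiffness ratio stays within an assumed admissible band and returns the thickness minimizing the asymptotic limit.

```python
import math

# Hypothetical material data: aluminium plate repaired with a stiffer patch.
E_PLATE, T_PLATE = 70e9, 3e-3   # plate modulus (Pa) and thickness (m)
E_PATCH = 210e9                 # candidate patch modulus (Pa)
SIGMA = 100e6                   # remote stress (Pa)

def k_inf(t_patch):
    """Placeholder monotone surrogate for the asymptotic limit K-infinity:
    a stiffer patch (larger E*t) lowers the asymptote. NOT Rose's actual
    formula, which depends on plate, patch, and adhesive properties."""
    s = (E_PATCH * t_patch) / (E_PLATE * T_PLATE)  # stiffness ratio
    return SIGMA * math.sqrt(math.pi * T_PLATE / (1.0 + s))

def optimise_patch(ratio_lo=1.0, ratio_hi=1.5, steps=200):
    """Grid search over patch thicknesses whose stiffness ratio stays in
    an assumed admissible band [ratio_lo, ratio_hi]."""
    best_t, best_k = None, float("inf")
    for i in range(steps + 1):
        s = ratio_lo + (ratio_hi - ratio_lo) * i / steps
        t_patch = s * E_PLATE * T_PLATE / E_PATCH
        k = k_inf(t_patch)
        if k < best_k:
            best_t, best_k = t_patch, k
    return best_t, best_k

t_opt, k_opt = optimise_patch()
print(f"patch thickness: {t_opt * 1e3:.2f} mm, K_inf surrogate: {k_opt:.3e}")
```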

