Automatically Designing State-of-the-Art Multi- and Many-Objective Evolutionary Algorithms

2020 ◽  
Vol 28 (2) ◽  
pp. 195-226 ◽  
Author(s):  
Leonardo C. T. Bezerra ◽  
Manuel López-Ibáñez ◽  
Thomas Stützle

A recent comparison of well-established multiobjective evolutionary algorithms (MOEAs) has helped better identify the current state-of-the-art by considering (i) parameter tuning through automatic configuration, (ii) a wide range of different setups, and (iii) various performance metrics. Here, we automatically devise MOEAs with verified state-of-the-art performance for multi- and many-objective continuous optimization. Our work is based on two main considerations. The first is that high-performing algorithms can be obtained from a configurable algorithmic framework in an automated way. The second is that multiple performance metrics may be required to guide this automatic design process. In the first part of this work, we extend our previously proposed algorithmic framework, increasing the number of MOEAs, underlying evolutionary algorithms, and search paradigms that it comprises. These components can be combined following a general MOEA template, and an automatic configuration method is used to instantiate high-performing MOEA designs that optimize a given performance metric and present state-of-the-art performance. In the second part, we propose a multiobjective formulation for the automatic MOEA design, which proves critical for the context of many-objective optimization due to the disagreement of established performance metrics. Our proposed formulation leads to an automatically designed MOEA that presents state-of-the-art performance according to a set of metrics, rather than a single one.
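The metric disagreement that motivates the multiobjective design formulation is easiest to see once a metric is computed explicitly. As a minimal illustration (not the paper's implementation; the front and reference point below are invented), here is a two-objective hypervolume computation for a minimization problem:

```python
# Minimal 2-D hypervolume sketch (minimization). Assumes every point in
# `front` is mutually non-dominated and dominates the reference point.

def hypervolume_2d(front, ref):
    """Area dominated by a 2-D Pareto front w.r.t. a reference point."""
    hv = 0.0
    prev_f1 = ref[0]
    # Sweep points in order of decreasing first objective, accumulating
    # the rectangle each point adds to the dominated region.
    for f1, f2 in sorted(front, reverse=True):
        hv += (prev_f1 - f1) * (ref[1] - f2)
        prev_f1 = f1
    return hv

front = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
print(hypervolume_2d(front, ref=(5.0, 5.0)))  # → 11.0
```

An automatic design process guided by a single such metric optimizes only that notion of quality; the multiobjective formulation proposed above instead treats several metrics as simultaneous objectives.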

Author(s):  
K. Liagkouras ◽  
K. Metaxiotis

This paper provides a systematic study of the technologies and algorithms associated with the implementation of multiobjective evolutionary algorithms (MOEAs) for solving the portfolio optimization problem. Based on an examination of the state of the art, we provide best practices for dealing with the complexities of the constrained portfolio optimization problem (CPOP). In particular, rigorous algorithmic and technical treatment is provided for the efficient incorporation of a wide range of real-world constraints into the MOEAs. Moreover, we address special configuration issues related to the application of MOEAs to solving the CPOP. Finally, by examining the state of the art, we identify the most appropriate performance metrics for evaluating the results of applying MOEAs to the solution of the CPOP.


2021 ◽  
Author(s):  
Leila Zahedi ◽  
Farid Ghareh Mohammadi ◽  
M. Hadi Amini

Machine learning (ML) techniques lend themselves as promising decision-making and analytic tools in a wide range of applications. Different ML algorithms expose various hyper-parameters, and tailoring an ML model to a specific application requires tuning a large number of them. Hyper-parameter tuning directly affects performance (accuracy and run-time). However, for large-scale search spaces, efficiently exploring the vast number of hyper-parameter combinations is computationally challenging, and existing automated hyper-parameter tuning techniques suffer from high time complexity. In this paper, we propose HyP-ABC, an innovative hybrid hyper-parameter optimization algorithm based on a modified artificial bee colony approach, and use it to optimize the classification accuracy of three ML algorithms: random forest, extreme gradient boosting, and support vector machine. Compared to state-of-the-art techniques, HyP-ABC is more efficient and has only a limited number of parameters of its own to tune, making it practical for real-world hyper-parameter optimization problems. We further compare our proposed HyP-ABC algorithm with state-of-the-art techniques. To ensure the robustness of the proposed method, the algorithm explores a wide range of feasible hyper-parameter values and is tested on a real-world educational dataset.
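As a rough sketch of how an artificial-bee-colony loop can drive hyper-parameter search (this is a generic ABC illustration, not the authors' HyP-ABC; the search space, scoring function, and all names below are invented for the example):

```python
# Generic artificial-bee-colony (ABC) hyper-parameter search sketch.
import random

random.seed(0)

# Hypothetical continuous search space; real integer hyper-parameters
# would be rounded before training a model.
SPACE = {"n_estimators": (10, 200), "max_depth": (2, 20)}

def score(cfg):
    # Stand-in for cross-validated accuracy, peaking at an arbitrary optimum.
    return -((cfg["n_estimators"] - 120) ** 2 / 1e4
             + (cfg["max_depth"] - 8) ** 2 / 1e2)

def random_cfg():
    return {k: random.uniform(lo, hi) for k, (lo, hi) in SPACE.items()}

def neighbour(cfg):
    # Perturb one hyper-parameter by up to 10% of its range, clamped.
    k = random.choice(list(SPACE))
    lo, hi = SPACE[k]
    new = dict(cfg)
    new[k] = min(hi, max(lo, cfg[k] + random.uniform(-1, 1) * (hi - lo) * 0.1))
    return new

def abc_search(n_bees=10, iters=50, limit=10):
    foods = [random_cfg() for _ in range(n_bees)]
    fits = [score(f) for f in foods]
    trials = [0] * n_bees
    best_cfg, best_fit = max(zip(foods, fits), key=lambda t: t[1])
    for _ in range(iters):
        for i in range(n_bees):
            # Employed/onlooker phases collapsed: greedily try a neighbour.
            cand = neighbour(foods[i])
            cand_fit = score(cand)
            if cand_fit > fits[i]:
                foods[i], fits[i], trials[i] = cand, cand_fit, 0
            else:
                trials[i] += 1
            if fits[i] > best_fit:
                best_cfg, best_fit = foods[i], fits[i]
            # Scout phase: abandon stagnant food sources.
            if trials[i] > limit:
                foods[i] = random_cfg()
                fits[i], trials[i] = score(foods[i]), 0
    return best_cfg, best_fit

best_cfg, best_fit = abc_search()
print(best_cfg, best_fit)
```

The appeal noted in the abstract is visible even in this sketch: besides the space itself, only colony size, iteration budget, and the abandonment limit need to be chosen.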


2020 ◽  
Vol 7 (7) ◽  
pp. 1901-1911 ◽  
Author(s):  
Aobo Ren ◽  
Jihua Zou ◽  
Huagui Lai ◽  
Yixuan Huang ◽  
Liming Yuan ◽  
...  

Solution-processed MXene–perovskite image sensor arrays are realized by a top-down method, which combine desirable manufacturing advantages and state-of-the-art performance metrics.


2020 ◽  
pp. postgradmedj-2019-137254
Author(s):  
Noirin O’Herlihy ◽  
Sarah Griffin ◽  
Patrick Henn ◽  
Robert Gaffney ◽  
Mary Rose Cahill ◽  
...  

Aims: The purpose of this study was to (1) characterise the procedure of phlebotomy, deconstruct it into its constituent parts and develop a performance metric for training healthcare professionals in a large teaching hospital, and (2) evaluate the construct validity of the phlebotomy metric and establish a proficiency benchmark. Method: By engaging a multidisciplinary team with a wide range of experience of preanalytical errors in phlebotomy and observing video recordings of the procedure performed in the actual working environment, we defined a performance metric. This was brought to a modified Delphi meeting, where consensus was reached by an expert panel. To demonstrate construct validity, we used the metric to objectively assess the performance of novices and expert practitioners. Results: A phlebotomy metric consisting of 11 phases and 77 steps was developed. The mean inter-rater reliability was 0.91 (min 0.83, max 0.95). The expert group completed more steps of the procedure (72 vs 69), made fewer errors (13 vs 19, p=0.014) and fewer critical errors (1 vs 4, p=0.002) than the novice group. Conclusions: The metric demonstrated construct validity, and the proficiency benchmark was established as a minimum of 69 steps observed, with no critical errors and no more than 13 errors in total.
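Applying the proficiency benchmark described above is a simple rule over an assessed performance: at least 69 of the 77 steps completed, no critical errors, and no more than 13 errors in total. A minimal sketch (the record format and function name are invented for illustration):

```python
# Proficiency-benchmark check for the phlebotomy metric described above.

def meets_benchmark(steps_completed, errors, critical_errors,
                    min_steps=69, max_errors=13):
    """True if an assessed performance meets the proficiency benchmark."""
    return (steps_completed >= min_steps
            and critical_errors == 0
            and errors <= max_errors)

print(meets_benchmark(72, 13, 0))  # expert-like performance → True
print(meets_benchmark(69, 19, 4))  # novice-like performance → False
```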


2018 ◽  
Vol 26 (4) ◽  
pp. 621-656 ◽  
Author(s):  
Leonardo C. T. Bezerra ◽  
Manuel López-Ibáñez ◽  
Thomas Stützle

Research on multi-objective evolutionary algorithms (MOEAs) has produced over the past decades a large number of algorithms and a rich literature on performance assessment tools to evaluate and compare them. Yet, newly proposed MOEAs are typically compared against very few, often a decade older MOEAs. One reason for this apparent contradiction is the lack of a common baseline for comparison, with each subsequent study often devising its own experimental scenario, slightly different from other studies. As a result, the state of the art in MOEAs is a disputed topic. This article reports a systematic, comprehensive evaluation of a large number of MOEAs that covers a wide range of experimental scenarios. A novelty of this study is the separation between the higher-level algorithmic components related to multi-objective optimization (MO), which characterize each particular MOEA, and the underlying parameters—such as evolutionary operators, population size, etc.—whose configuration may be tuned for each scenario. Instead of relying on a common or “default” parameter configuration that may be low-performing for particular MOEAs or scenarios and unintentionally biased, we tune the parameters of each MOEA for each scenario using automatic algorithm configuration methods. Our results confirm some of the assumed knowledge in the field, while at the same time they provide new insights on the relative performance of MOEAs for many-objective problems. For example, under certain conditions, indicator-based MOEAs are more competitive for such problems than previously assumed. We also analyze problem-specific features affecting performance, the agreement between performance metrics, and the improvement of tuned configurations over the default configurations used in the literature. Finally, the data produced is made publicly available to motivate further analysis and a baseline for future comparisons.
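The "agreement between performance metrics" analysed above can be quantified, for instance, by a rank correlation between the rankings two metrics induce on the same set of algorithms. A minimal Kendall-tau sketch (the scores below are invented; this is an illustration of the idea, not the article's analysis):

```python
# Kendall rank correlation between two metrics' scores for the same
# set of algorithms: +1 means identical rankings, -1 reversed rankings.

def kendall_tau(a, b):
    """Kendall tau over two equal-length score lists (no tie correction)."""
    n = len(a)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (a[i] - a[j]) * (b[i] - b[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

hv_scores  = [0.91, 0.85, 0.78, 0.66]  # e.g., hypervolume per algorithm
eps_scores = [0.90, 0.70, 0.80, 0.60]  # e.g., a second metric's scores
print(kendall_tau(hv_scores, eps_scores))
```

A value well below 1 signals that the two metrics would crown different "best" algorithms, which is precisely why metric choice matters when declaring a state of the art.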


Computers ◽  
2019 ◽  
Vol 8 (2) ◽  
pp. 32 ◽  
Author(s):  
Chiman Kwan ◽  
Jude Larkin ◽  
Bence Budavari ◽  
Bryan Chou ◽  
Eric Shang ◽  
...  

Since lossless compression can only achieve two to four times data compression, it may not be efficient to deploy in bandwidth-constrained applications. It is more economical to adopt perceptually lossless compression, which can attain ten times or more compression without losing important information, allowing more images to be transmitted over bandwidth-limited channels. In this research, we first aimed to compare compression algorithms from the literature and select the best one, targeting a compression ratio of 0.1 and 40 dB or more on a performance metric known as the human visual system model (HVSm), for maritime and sonar images. Our second objective was to demonstrate error-concealment algorithms that can handle pixels corrupted by transmission errors in interference-prone communication channels. Using four state-of-the-art codecs, we demonstrated that perceptually lossless compression can be achieved for realistic maritime and sonar images, and we selected the best codec for this purpose using four performance metrics. Finally, error concealment was shown to be useful in recovering pixels lost to transmission errors.
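The HVSm scores quoted above come from a perceptual quality model not reproduced here; as a rough, non-perceptual stand-in that shares the same dB scale, the sketch below computes PSNR for a pair of 8-bit images (the pixel arrays are invented):

```python
# PSNR for 8-bit images: a simple, non-perceptual quality metric in dB,
# used here only as a stand-in for perceptual metrics such as HVSm.
import math

def psnr(original, compressed, peak=255):
    """Peak signal-to-noise ratio in dB between two same-sized images."""
    n = 0
    se = 0.0
    for row_o, row_c in zip(original, compressed):
        for a, b in zip(row_o, row_c):
            se += (a - b) ** 2
            n += 1
    mse = se / n
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)

orig = [[52, 55, 61], [62, 59, 55], [63, 65, 66]]
comp = [[52, 54, 61], [62, 59, 56], [63, 65, 66]]
print(round(psnr(orig, comp), 1))  # → 54.7
```

A "40 dB or more" target, as in the study, means the mean squared reconstruction error must stay very small relative to the 8-bit peak value.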


2020 ◽  
Vol 40 (3) ◽  
pp. 55-69
Author(s):  
Luis Felipe Ariza Vesga ◽  
Johan Sebastián Eslava Garzón ◽  
Rafael Puerta Ramirez

Multi-objective and many-objective optimization problems have been solved extensively through evolutionary algorithms over the past few decades. Although NSGA-II and NSGA-III are frequently employed as references in comparative evaluations of new evolutionary algorithms, the latter is proprietary. In this paper, we used the basic framework of NSGA-II, which is very similar to that of NSGA-III, with significant changes to its selection operator: we take the first front generated by the non-dominated sorting procedure to obtain nonnegative, nonrepeated extreme points. This open-source version of NSGA-III is called EF1-NSGA-III, and its implementation does not start from scratch; instead, we took the NSGA-II code from the authors' repository at the Kanpur Genetic Algorithms Laboratory, extended it into EF1-NSGA-III by adjusting the selection operator from diversity preservation based on crowding distance to one based on reference points, and preserved its parameters. We then developed the adaptive EF1-NSGA-III (A-EF1-NSGA-III) and the efficient adaptive EF1-NSGA-III (A2-EF1-NSGA-III), and we also explain how to generate different types of reference points. The proposed algorithms solve constrained optimization problems with up to 10 objective functions. We tested them on a wide range of benchmark problems, where they showed notable improvements in convergence and diversity as measured by the Inverted Generational Distance (IGD) and HyperVolume (HV) performance metrics. EF1-NSGA-III is aimed at the power-consumption problem in Centralized Radio Access Networks and the Bi-Objective Minimum Diameter-Cost Spanning Tree problem.
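The structured reference points used by NSGA-III-style selection are commonly generated with the Das-Dennis simplex-lattice construction. A minimal sketch of that construction (a standard technique, not the paper's exact code; parameter values are illustrative):

```python
# Das-Dennis reference points: all points on the unit simplex whose
# coordinates are nonnegative multiples of 1/n_div, for n_obj objectives.
from itertools import combinations

def das_dennis(n_obj, n_div):
    points = []
    # "Stars and bars": choose n_obj-1 bar positions among
    # n_div + n_obj - 1 slots; gaps between bars give the coordinates.
    for bars in combinations(range(n_div + n_obj - 1), n_obj - 1):
        prev = -1
        coords = []
        for b in bars:
            coords.append((b - prev - 1) / n_div)
            prev = b
        coords.append((n_div + n_obj - 2 - prev) / n_div)
        points.append(coords)
    return points

pts = das_dennis(3, 4)
print(len(pts))  # C(4+3-1, 3-1) = C(6, 2) = 15 points
```

Each point sums to 1 and defines one reference direction; the niche-preservation step of the selection operator then associates population members with these directions.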


Electronics ◽  
2021 ◽  
Vol 10 (14) ◽  
pp. 1731
Author(s):  
Muhammad Abdullah ◽  
Slawomir Koziel ◽  
Stanislaw Szczepanski

The development of diffusion metasurfaces has created new opportunities to improve the stealth of combat aircraft. Despite the potential significance of metasurfaces, rigorous design methodologies for them are still lacking, especially for meticulous control over the scattering of electromagnetic (EM) waves through geometry parameter tuning. Another practical issue is the insufficiency of existing performance metrics, specifically monostatic and bistatic evaluation of reflectivity, especially at the design stage: both provide limited insight into RCS-reduction properties, and the latter depends on the selection of the planes over which the evaluation takes place. This paper introduces a novel performance metric for evaluating the scattering characteristics of a metasurface, referred to as the Normalized Partial Scattering Cross Section (NPSCS). The metric involves integration of the scattered energy over a specific solid angle, which allows a comprehensive assessment of the structure's performance in a format largely independent of the particular arrangement of the scattering lobes. We demonstrate the utility of the introduced metric using two specific metasurface architectures. In particular, we show that the integral-based metric can discriminate between surface configurations (e.g., checkerboard versus random) that cannot be conclusively compared using traditional methods. Consequently, the proposed approach can be a useful tool for benchmarking the radar-cross-section-reduction performance of metamaterial-based and other scattering structures.
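The integral character of a metric like NPSCS can be sketched numerically: integrate the scattered power over a solid-angle cap and normalize by the total scattered power over the hemisphere. The sketch below is an illustration of that idea only, not the paper's definition; the scattering pattern is a toy stand-in, not measured data:

```python
# Fraction of scattered power inside the cone theta <= theta_max,
# normalized by the total power over the upper hemisphere
# (midpoint-rule integration in spherical coordinates).
import math

def npscs(pattern, theta_max, n_theta=200, n_phi=200):
    """`pattern(theta, phi)` is scattered power per unit solid angle."""
    def integrate(t_max):
        total = 0.0
        dt = t_max / n_theta
        dp = 2 * math.pi / n_phi
        for i in range(n_theta):
            theta = (i + 0.5) * dt
            for j in range(n_phi):
                phi = (j + 0.5) * dp
                # dOmega = sin(theta) dtheta dphi
                total += pattern(theta, phi) * math.sin(theta) * dt * dp
        return total

    return integrate(theta_max) / integrate(math.pi / 2)

def lobe(theta, phi):
    # Toy specular-like lobe concentrated near the surface normal.
    return math.cos(theta) ** 8

print(round(npscs(lobe, math.radians(30)), 3))
```

Because the result depends only on the chosen solid angle, not on any particular cut plane, such an integral metric sidesteps the plane-selection problem of bistatic reflectivity evaluation noted above.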


2020 ◽  
Vol 12 ◽  
Author(s):  
Francisco Basílio ◽  
Ricardo Jorge Dinis-Oliveira

Background: Pharmacobezoars are a specific type of bezoar formed when medicines, such as tablets, suspensions, and/or drug delivery systems, aggregate; they may cause death by occluding the airways with tenacious material or by eluting drugs that reach toxic or lethal blood concentrations. Objective: This work aims to fully review the state of the art regarding the pathophysiology, diagnosis, treatment and other relevant clinical and forensic features of pharmacobezoars. Results: Patients of a wide range of ages and of both sexes present with signs and symptoms of intoxication or, more commonly, gastrointestinal obstruction. The exact mechanisms of pharmacobezoar formation are unknown but are likely multifactorial. Diagnosis and treatment depend on the gastrointestinal segment affected and should be personalized to the medication and the underlying factors. A complete history, physical examination, imaging tests, upper endoscopy and, for the lower tract, surgery through laparotomy are useful for diagnosis and treatment. Conclusion: Pharmacobezoars are rarely seen in clinical and forensic practice. They are associated with controlled- or immediate-release formulations, liquid or non-digestible substances, normal or altered digestive motility/anatomy, and overdoses or therapeutic doses, and should be suspected in the presence of risk factors or in patients taking drugs known to form pharmacobezoars.


This volume vividly demonstrates the importance and increasing breadth of quantitative methods in the earth sciences. With contributions from an international cast of leading practitioners, chapters cover a wide range of state-of-the-art methods and applications, including computer modeling and mapping techniques. Many chapters also contain reviews and extensive bibliographies which serve to make this an invaluable introduction to the entire field. In addition to its detailed presentations, the book includes chapters on the history of geomathematics and on R.G.V. Eigen, the "father" of mathematical geology. Written to commemorate the 25th anniversary of the International Association for Mathematical Geology, the book will be sought after by both practitioners and researchers in all branches of geology.

