Adaptive Randomness: A New Population Initialization Method

2014 ◽  
Vol 2014 ◽  
pp. 1-14 ◽  
Author(s):  
Weifeng Pan ◽  
Kangshun Li ◽  
Muchou Wang ◽  
Jing Wang ◽  
Bo Jiang

Population initialization is a crucial task in population-based optimization methods, as it affects both the convergence speed and the quality of the final solutions. Generally, if no a priori information about the solutions is available, the initial population is selected randomly using random numbers. This paper presents a new initialization method that applies the concept of adaptive randomness (AR) to spread the individuals as evenly as possible over the search space. To verify the performance of AR, a comprehensive set of 34 benchmark functions with a wide range of dimensions is utilized. The experiments demonstrate that AR-based population initialization outperforms other population initialization methods, such as random population initialization, opposition-based population initialization, and generalized opposition-based population initialization, in both convergence speed and the quality of the final solutions. Further, the influences of the problem dimensionality, the new control parameter, and the number of trial individuals are also investigated.
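The exact AR procedure is given in the paper body, not the abstract; the sketch below is one plausible reading of the idea, stated here as an assumption: for each new member, several trial individuals are drawn and the one farthest from the population built so far is kept. The function name and the `num_trials` parameter are illustrative.

```python
import random

def ar_initialize(pop_size, dim, bounds, num_trials=5):
    """Build an initial population whose members are spread out:
    for each new member, draw several random trial individuals and
    keep the one farthest from the population built so far."""
    lo, hi = bounds
    population = [[random.uniform(lo, hi) for _ in range(dim)]]
    while len(population) < pop_size:
        trials = [[random.uniform(lo, hi) for _ in range(dim)]
                  for _ in range(num_trials)]

        def min_dist(ind):
            # distance from a trial to its nearest existing member
            return min(sum((a - b) ** 2 for a, b in zip(ind, p)) ** 0.5
                       for p in population)

        population.append(max(trials, key=min_dist))
    return population
```

A larger `num_trials` spaces the population out more at the cost of extra distance computations, which matches the abstract's note that the number of trial individuals is itself a parameter worth studying.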

2013 ◽  
Vol 109 (5) ◽  
pp. 1259-1267 ◽  
Author(s):  
Devika Narain ◽  
Robert J. van Beers ◽  
Jeroen B. J. Smeets ◽  
Eli Brenner

In the course of its interaction with the world, the human nervous system must constantly estimate various variables in the surrounding environment. Past research indicates that environmental variables may be represented as probabilistic distributions of a priori information (priors). Priors for environmental variables that change little over time have been widely studied. Little is known, however, about how priors develop in environments with nonstationary statistics. We examine whether humans change their reliance on the prior based on recent changes in environmental variance. Through experimentation, we obtain an online estimate of the human sensorimotor prior (prediction) and compare it to similar online predictions made by various nonadaptive and adaptive models. Simulations show that models that rapidly adapt to nonstationary components in the environment predict the stimuli better than models that do not take the changing statistics of the environment into consideration. We found that adaptive models best predict participants' responses in most cases. However, we find no support for the idea that this is a consequence of increased reliance on recent experience just after a systematic change in the environment.
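The authors' specific models are not given in the abstract; the following is a generic sketch, with all names assumed, of the underlying principle: a precision-weighted Bayesian estimate in which the prior is tracked over a recent window of stimuli, so that a rise in environmental variance automatically lowers reliance on the prior.

```python
from collections import deque

def bayes_estimate(x, sigma_like, mu_prior, sigma_prior):
    """Precision-weighted combination of a noisy observation with a
    Gaussian prior; higher prior variance means less prior reliance."""
    w_like = 1.0 / sigma_like ** 2
    w_prior = 1.0 / sigma_prior ** 2
    return (w_like * x + w_prior * mu_prior) / (w_like + w_prior)

class AdaptivePrior:
    """Track the prior from the last `window` stimuli, so a recent
    change in environmental variance changes the prior's weight."""
    def __init__(self, window=10):
        self.recent = deque(maxlen=window)

    def update(self, stimulus):
        self.recent.append(stimulus)

    @property
    def mean(self):
        return sum(self.recent) / len(self.recent)

    @property
    def std(self):
        m = self.mean
        var = sum((s - m) ** 2 for s in self.recent) / len(self.recent)
        return var ** 0.5 or 1e-6  # avoid a zero-width prior
```

A nonadaptive model would instead fix `sigma_prior` from the whole stimulus history, which is the contrast the abstract's simulations draw.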


2020 ◽  
Author(s):  
Emily S. Kappenman ◽  
Jaclyn Farrens ◽  
Wendy Zhang ◽  
Andrew X. Stewart ◽  
Steven J. Luck

Event-related potentials (ERPs) are noninvasive measures of human brain activity that index a range of sensory, cognitive, affective, and motor processes. Despite their broad application across basic and clinical research, there is little standardization of ERP paradigms and analysis protocols across studies. To address this, we created ERP CORE (Compendium of Open Resources and Experiments), a set of optimized paradigms, experiment control scripts, data processing pipelines, and sample data (N = 40 neurotypical young adults) for seven widely used ERP components: N170, mismatch negativity (MMN), N2pc, N400, P3, lateralized readiness potential (LRP), and error-related negativity (ERN). This resource makes it possible for researchers to 1) employ standardized ERP paradigms in their research, 2) apply carefully designed analysis pipelines and use a priori selected parameters for data processing, 3) rigorously assess the quality of their data, and 4) test new analytic techniques with standardized data from a wide range of paradigms.


Author(s):  
Hicham El Hassani ◽  
Said Benkachcha ◽  
Jamal Benhra

Inspired by nature, genetic algorithms (GAs) are among the most effective metaheuristic optimization methods and have proved their worth on conventional NP-hard problems, especially the traveling salesman problem (TSP), one of the most studied supply chain management problems. This paper proposes a new crossover operator called Jump Crossover (JMPX) for solving the TSP with a GA to near-optimality, and assesses its efficiency against the solution quality of other conventional operators on the same problem, namely partially matched crossover (PMX), edge recombination crossover (ERX), and the r-opt heuristic, with computational overhead taken into account. We adopt the path representation for our chromosome, which is the most direct representation, and a low mutation rate, to isolate the search-space exploration ability of each crossover. The experimental results show that in most cases JMPX remarkably improves the solution quality of the GA compared with the two classic crossover operators and the r-opt heuristic.
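JMPX itself is defined in the paper body, not the abstract; for concreteness, here is a minimal sketch of the classic PMX baseline it is compared against, operating on the path representation (a tour is a permutation of city indices).

```python
import random

def pmx(parent1, parent2, rng=random):
    """Partially matched crossover (PMX) on path-represented tours:
    copy a slice from parent1, then fill the remaining positions from
    parent2, resolving duplicates through the slice's value mapping."""
    n = len(parent1)
    a, b = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[a:b + 1] = parent1[a:b + 1]
    # map each value in the copied slice to parent2's value there
    mapping = {parent1[i]: parent2[i] for i in range(a, b + 1)}
    for i in list(range(0, a)) + list(range(b + 1, n)):
        city = parent2[i]
        while city in child[a:b + 1]:  # already used by the slice
            city = mapping[city]
        child[i] = city
    return child
```

Because the fill step follows the mapping until it leaves the copied slice, the child is always a valid permutation, i.e. a legal tour.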


Author(s):  
Nacéra Bennacer ◽  
Guy Vidal-Naquet

This paper proposes an Ontology-driven and Community-based Web Services (OCWS) framework that aims at automating the discovery, composition, and execution of web services. The purpose is to validate and execute a user's request built from the composition of a set of OCWS descriptions and a set of user constraints. The framework clearly separates the external OCWS descriptions from the internal, concrete implementations of e-services. It identifies three levels: the knowledge level, the community level, and the e-services level, and it uses different participant agents deployed in a distributed architecture. First, the reasoner agent uses a description logic extended for actions to reason about (i) the consistency of the pre-conditions and post-conditions of the OCWS descriptions and the user constraints with the ontology semantics, and (ii) the consistency of the workflow matching assertions and the execution dependency graph. Then the execution plan model is generated automatically and run by the composer agents using the dynamic execution plan algorithm (DEPA), according to the workflow matching and the established execution order. The community composer agents invoke the appropriate e-services and ensure that the non-functional constraints are satisfied. The DEPA algorithm works dynamically, without a priori information about e-service states, and has interesting properties such as accounting for the non-determinism of e-services and reducing the search space.
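DEPA itself is not specified in the abstract; purely as a hypothetical illustration of the simpler, static part of the problem, deriving an execution order for e-services from an execution dependency graph can be done with Kahn's algorithm (all names below are assumed).

```python
from collections import defaultdict, deque

def execution_order(dependencies):
    """Kahn's algorithm: derive a valid execution order for services
    from a dependency graph {service: [prerequisite services]}."""
    indegree = {s: len(deps) for s, deps in dependencies.items()}
    dependents = defaultdict(list)
    for s, deps in dependencies.items():
        for d in deps:
            dependents[d].append(s)
            indegree.setdefault(d, 0)
    ready = deque(s for s, k in indegree.items() if k == 0)
    order = []
    while ready:
        s = ready.popleft()
        order.append(s)
        for t in dependents[s]:
            indegree[t] -= 1
            if indegree[t] == 0:
                ready.append(t)
    if len(order) != len(indegree):
        raise ValueError("cyclic dependency graph")
    return order
```

A dynamic algorithm in the spirit of DEPA would additionally re-plan at run time when an e-service behaves non-deterministically, rather than fixing the whole order up front.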


2010 ◽  
Vol 3 (1) ◽  
pp. 209-232 ◽  
Author(s):  
M. Reuter ◽  
M. Buchwitz ◽  
O. Schneising ◽  
J. Heymann ◽  
H. Bovensmann ◽  
...  

Abstract. An optimal estimation based retrieval scheme for satellite retrievals of XCO2 (the dry-air column-averaged mixing ratio of atmospheric CO2) is presented that enables accurate retrievals even in the presence of thin clouds. The proposed method is designed to analyze near-infrared nadir measurements of the SCIAMACHY instrument in the CO2 absorption band at 1580 nm and in the O2-A absorption band around 760 nm. The algorithm accounts for scattering in an optically thin cirrus cloud layer and by aerosols with a default profile. The scattering information is mainly obtained from the O2-A band, and a merged fit windows approach enables the transfer of information between the O2-A and CO2 bands. Via the optimal estimation technique, the algorithm can use a priori information to further constrain the inversion. Test scenarios of simulated SCIAMACHY sun-normalized radiance measurements are analyzed to quantify the quality of the proposed method. In contrast to existing algorithms for SCIAMACHY retrievals, the systematic errors due to cirrus clouds with optical thicknesses up to 1.0 are reduced to below 4 ppm for most of the analyzed scenarios. This shows that the proposed method has the potential to reduce the uncertainties of SCIAMACHY-retrieved XCO2, making this data product potentially useful for surface flux inverse modeling.
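The abstract gives no equations; in standard optimal estimation (in the sense of Rodgers), a Gauss-Newton step of the maximum a posteriori retrieval combines the measured radiances y, the forward model F with Jacobian K, the measurement-noise covariance S_ε, and the a priori state x_a with covariance S_a:

```latex
\hat{x} = x_a + \left( K^{T} S_{\epsilon}^{-1} K + S_a^{-1} \right)^{-1}
          K^{T} S_{\epsilon}^{-1} \left( y - F(x_a) \right)
```

The a priori term S_a^{-1} regularizes the inversion, which is the sense in which a priori information "further constrains" the retrieval above.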


Author(s):  
Anatoly Maslak

The quality of a questionnaire as a measuring tool largely determines the relevance of the results. The aim of this work is to analyze the quality of the questionnaire used to evaluate the latent variable "leadership qualities of students". The study was conducted within the framework of the theory of measurement of latent variables, which has important advantages. First of all, the latent variable is defined operationally, through a set of indicators (questionnaire items); the more indicators, the higher the accuracy of the latent variable measurement. The latent variable and the indicators are measured on the same interval scale, in logits, which allows a wide range of statistical procedures to be used in the analysis of the measurement results. The following aspects of the questionnaire's quality as a measuring tool were analyzed: the presence of extreme indicators in the test, the compatibility of the set of indicators, the match between the questionnaire and the students' levels on the measured latent variable, and the uniformity of the distribution of the indicators on the interval scale. The indicators that differentiate students with high and low levels of leadership qualities better than the others are highlighted. Recommendations are given on adjusting the questionnaire as a measuring tool for assessing the leadership qualities of students.
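The measurement framework described, with persons and items on one interval scale in logits, is characteristic of the Rasch model; the abstract does not name it explicitly, so this is an assumption. A minimal sketch of the dichotomous case:

```python
import math

def rasch_probability(theta, delta):
    """Dichotomous Rasch model: probability that a person at level
    `theta` endorses an indicator of difficulty `delta` (in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - delta)))
```

When theta equals delta the endorsement probability is 0.5, which is why person levels and indicator difficulties can share one logit scale, and why an indicator far from all student levels shows up as "extreme" in the quality analysis above.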


2013 ◽  
Vol 2013 ◽  
pp. 1-9 ◽  
Author(s):  
Lihong Guo ◽  
Gai-Ge Wang ◽  
Heqi Wang ◽  
Dinan Wang

A hybrid metaheuristic, HS/FA, is proposed for function optimization by hybridizing harmony search (HS) and the firefly algorithm (FA). In HS/FA, the exploration of HS and the exploitation of FA are fully exerted, so HS/FA converges faster than either HS or FA alone. In addition, a top-fireflies scheme is introduced to reduce the running time, and HS is used to mutate between fireflies during the firefly update. The HS/FA method is verified on various benchmarks. The experiments show that HS/FA performs better than the standard FA and eight other optimization methods.
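The abstract gives no update equations; the sketch below is a hypothetical illustration, with all parameter names assumed, of how the two mechanisms could be combined: FA's attraction move supplies exploitation, while a harmony-search-style mutation replaces FA's random walk to supply exploration.

```python
import math
import random

def hsfa_step(pop, fitness, beta0=1.0, gamma=1.0, alpha=0.2,
              hmcr=0.9, par=0.3, bw=0.05, rng=random):
    """One illustrative HS/FA-style step over a population of
    real-valued vectors, minimizing `fitness`."""
    scores = [fitness(x) for x in pop]
    new_pop = []
    for i in range(len(pop)):
        xi = list(pop[i])
        for j, xj in enumerate(pop):
            if scores[j] < scores[i]:  # j is brighter (minimization)
                r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
                beta = beta0 * math.exp(-gamma * r2)
                xi = [a + beta * (b - a) for a, b in zip(xi, xj)]
        # HS-style mutation in place of FA's random walk
        for d in range(len(xi)):
            if rng.random() < hmcr:
                # harmony memory consideration: reuse a stored value
                xi[d] = pop[rng.randrange(len(pop))][d]
                if rng.random() < par:
                    xi[d] += bw * (2 * rng.random() - 1)  # pitch adjust
            else:
                xi[d] += alpha * (2 * rng.random() - 1)
        new_pop.append(xi)
    return new_pop
```

A top-fireflies scheme as mentioned in the abstract would restrict the inner attraction loop to the few best fireflies instead of all brighter ones, cutting the quadratic cost of that loop.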


2020 ◽  
Vol 13 (6) ◽  
pp. 168-178
Author(s):  
Pyae Cho ◽  
Thi Nyunt

Differential Evolution (DE) has become an advanced, robust, and proficient alternative technique for clustering owing to its population-based, stochastic, heuristic search manner. Balancing the exploitation and exploration power of the DE algorithm well is important because this ability influences its performance. Besides, seeding the initial population with superior solutions raises the probability of finding better solutions and the rate of convergence. In this paper, an enhanced DE algorithm is introduced for clustering to offer better cluster solutions with faster convergence. The proposed algorithm performs a modified mutation strategy to improve DE's search behavior and exploits Quasi-Opposition-based Learning (QBL) to choose fitter initial solutions. The mutation strategy, which uses the best solution as the target solution and applies three differentials, helps avoid local optima traps and slow convergence. The QBL-based initialization method also contributes to increasing the quality of the clustering results and the convergence rate. An experimental analysis was conducted on seven real datasets from the UCI repository to evaluate the performance of the proposed clustering algorithm. The results showed that the proposed algorithm achieves more compact clusters and more stable solutions than the competing conventional DE variants. Moreover, the performance of the proposed algorithm was compared with existing state-of-the-art DE-based clustering techniques. The corresponding results indicate that the proposed algorithm is comparable to other DE-based clustering approaches in terms of objective function values. Therefore, the proposed algorithm can be regarded as an efficient clustering tool.
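The modified mutation strategy is specific to the paper, but quasi-opposition-based initialization has a commonly used form: each quasi-opposite point is drawn uniformly between the interval midpoint and the opposite point, and the fittest members of the merged random and quasi-opposite populations are kept. A minimal sketch (function names assumed):

```python
import random

def quasi_opposite(x, lo, hi, rng=random):
    """Quasi-opposite value: uniform between the interval midpoint
    and the opposite point (lo + hi - x)."""
    center = (lo + hi) / 2.0
    opposite = lo + hi - x
    a, b = sorted((center, opposite))
    return rng.uniform(a, b)

def qbl_initialize(pop_size, dim, lo, hi, fitness, rng=random):
    """QBL initialization: generate a random population and its
    quasi-opposite counterpart, keep the pop_size fittest members."""
    pop = [[rng.uniform(lo, hi) for _ in range(dim)]
           for _ in range(pop_size)]
    qpop = [[quasi_opposite(x, lo, hi, rng) for x in ind] for ind in pop]
    return sorted(pop + qpop, key=fitness)[:pop_size]
```

For clustering, each vector would encode a set of candidate cluster centers and `fitness` would be the clustering objective (e.g. within-cluster sum of squared distances).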


2020 ◽  
Vol 321 ◽  
pp. 03015
Author(s):  
Anthony Beckers ◽  
Gokula Krishna Muralidharan ◽  
Karel Lietaert ◽  
Nachiketa Ray ◽  
Pierre Van Cauwenbergh ◽  
...  

Direct Metal Printing (DMP), or Laser Powder Bed Fusion (L-PBF), enables the manufacturing of highly complex geometries used in a wide range of applications, from healthcare to aerospace. Producing these products with excellent and consistent part quality in terms of density and mechanical properties is key. The DMP ProX® 320 machine has been used for this purpose for over 10 years. In this study, the key improvements made to process stability, targeting consistent build quality across the build platform and repeatability, are evaluated. The quality is assessed by determining the density, mechanical properties, and surface roughness of direct metal printed LaserForm® Ti gr23 (A). The main finding of the study is that the optimized gas flow enables production of LaserForm® Ti gr23 (A) with consistent and improved mechanical properties across the whole build platform. Moreover, hot isostatic pressing is no longer needed to ensure good fatigue properties. The elongation strain to failure increased by 15 % to 20 %, which is 4-5 % higher than the ASTM F3001 specification. The axial fatigue limit (5×10^6 loading cycles) was 637 MPa (R = 0.1), which is as good as or better than annealed wrought Ti6Al4V.


Broadband wireless access has drawn considerable attention due to the wide range of data requirements and constant user mobility. WiMAX, based on the IEEE 802.16 standards, provides high Quality of Experience (QoE) for several services such as data, video, and audio. To deliver an effective and smooth experience, QoS scheduling plays a critical part. Several mechanisms for effective scheduling have been proposed in the past; however, our research shows that they can be further improved. We therefore propose a mechanism named OUS (Optimized Uplink Scheduling) that improves QoS. We present a novel feedback architecture and an optimized scheduling scheme that computes the bandwidth request, which in turn reduces delay as well as jitter. The performance evaluation is carried out through extensive simulation, varying the number of subscriber stations (SS) and the frequency, and the result analysis confirms that our mechanism performs considerably better than the existing algorithm.
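The OUS mechanism itself is not detailed in the abstract; purely as a hypothetical illustration of uplink grant allocation from bandwidth requests, a weight-proportional scheduler with leftover redistribution might look like this (all names and fields below are assumed, not taken from the paper):

```python
from collections import namedtuple

# One bandwidth request per subscriber station (SS)
Request = namedtuple("Request", "ss_id qos_weight bytes_requested")

def allocate_uplink(requests, frame_capacity):
    """Split the uplink frame capacity across stations in proportion
    to QoS weight, never granting more than requested; leftover
    capacity goes to still-unsatisfied stations, heaviest first."""
    total_w = sum(r.qos_weight for r in requests)
    grants = {}
    for r in requests:
        share = frame_capacity * r.qos_weight // total_w
        grants[r.ss_id] = min(share, r.bytes_requested)
    leftover = frame_capacity - sum(grants.values())
    for r in sorted(requests, key=lambda r: -r.qos_weight):
        if leftover <= 0:
            break
        extra = min(leftover, r.bytes_requested - grants[r.ss_id])
        grants[r.ss_id] += extra
        leftover -= extra
    return grants
```

Weighting grants by QoS class is one way a scheduler can trade delay and jitter against throughput, which is the trade-off the abstract's evaluation measures.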

