Fuzzy set based error measure for hydrologic model evaluation

2005 ◽  
Vol 7 (3) ◽  
pp. 199-208 ◽  
Author(s):  
Ramesh Teegavarapu ◽  
Amin Elshorbagy

Traditional error measures (e.g. mean squared error, mean relative error) are often used in the field of water resources to evaluate the performance of models of various hydrological processes. However, these measures may not always provide a comprehensive assessment of a model's performance for a specific application. A new error measure is proposed and developed in this paper to fill the gap left by existing traditional error measures. The measure quantifies the error that corresponds to the hydrologic condition and model application under consideration and also facilitates selection of the best model whenever multiple models are available for that application. Fuzzy set theory is used to model the modeler's perceptions of predictive accuracy in specific applications. The error measure is primarily intended for use with models that provide hydrologic time series predictions. Hypothetical and real-life examples are used to illustrate and evaluate this measure. Results indicate that use of this measure is rational and meaningful in the selection of an appropriate model from a set of competing models.
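As a minimal sketch of the general idea (not the paper's exact formulation), a fuzzy membership function over the observed flows can encode the modeler's perception of which hydrologic conditions matter most, and the squared errors can be weighted accordingly. The membership breakpoints below are illustrative assumptions.

import numpy as np

def membership(flow, low=10.0, high=100.0):
    # Shoulder-shaped membership emphasizing high (peak) flows;
    # the breakpoints are illustrative, not taken from the paper.
    return np.clip((flow - low) / (high - low), 0.0, 1.0)

def fuzzy_weighted_error(obs, sim):
    # Squared errors weighted by membership of the observed flow.
    mu = membership(obs)
    return np.sum(mu * (obs - sim) ** 2) / np.sum(mu)

obs = np.array([12.0, 45.0, 150.0, 80.0])
sim = np.array([15.0, 40.0, 120.0, 85.0])
print(fuzzy_weighted_error(obs, sim))  # penalizes the peak-flow miss most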

2021 ◽  
Vol 23 (04) ◽  
pp. 211-224
Author(s):  
Gurcharan Singh ◽  
Baljodh Singh ◽  
Neelam Kumari ◽  
...  

This paper shows how pentagonal fuzzy numbers can be employed, and discusses systematic outcomes, in real-life situations. Fuzzy set theory is combined with well-established classical queuing theory, since classical queuing theory alone is far removed from real-life situations. In this approach, fuzzy theory and probability theory are used together, with the help of the α-cut technique, to make the work more realistic. Symmetric pentagonal fuzzy numbers are used to describe the state of the queue in linguistic terms.
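As a brief sketch (conventions for pentagonal fuzzy numbers vary, and this may not match the paper's exact definition), an α-cut of a pentagonal fuzzy number (a1, a2, a3, a4, a5), with membership rising linearly from 0 at a1 through 0.5 at a2 to 1 at a3 and falling symmetrically, reduces the fuzzy quantity to a crisp interval on which classical queuing formulas can operate.

def pentagonal_alpha_cut(a, alpha):
    # a = (a1, a2, a3, a4, a5); membership 0 -> 0.5 -> 1 at a1, a2, a3,
    # then symmetric descent. One common convention, assumed here.
    a1, a2, a3, a4, a5 = a
    if alpha <= 0.5:
        lo = a1 + 2 * alpha * (a2 - a1)
        hi = a5 - 2 * alpha * (a5 - a4)
    else:
        lo = a2 + (2 * alpha - 1) * (a3 - a2)
        hi = a4 - (2 * alpha - 1) * (a4 - a3)
    return lo, hi

# Crisp interval for a fuzzy arrival rate at alpha = 0.6
print(pentagonal_alpha_cut((1, 2, 3, 4, 5), 0.6))  # (2.2, 3.8)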


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Mustafa Said Yurtyapan ◽  
Erdal Aydemir

Purpose – Enterprise Resource Planning (ERP) software, a knowledge-based design built on the interconnected communication of business units and information sharing, ensures that business processes such as finance, production, purchasing, sales, logistics and human resources are integrated and gathered under one roof. This integrated system allows the company to make fast and accurate decisions and increases its competitiveness. Therefore, for an enterprise, choosing the suitable ERP software is extremely important. The aim of this study is to present new research on the ERP software selection process by clarifying the uncertainties and finding suitable software in a computational way. Design/methodology/approach – The ERP selection problem design captures uncertainties in expert opinions and criteria values by applying intuitionistic fuzzy set theory and interval grey numbers to the MACBETH multi-criteria decision-making method. In this paper, a new interval grey MACBETH method approach is proposed, and the degree-of-greyness approach is used for clarifying the uncertainties. Using this new approach in which grey numbers are used, the aim is to observe the changes in the importance of the alternatives. Moreover, the intuitionistic fuzzy set method is applied by considering the importance of expert opinions separately. Findings – The proposed method is based on quantitative decision making derived from qualitative judgments. The results obtained under uncertain conditions are compared with the results obtained under crisp conditions of the same methods. With the qualitative levels of experts reflected in the decision process, it is clearly seen that the ERP software selection problem has more effective alternative decision solutions in an uncertain environment, and decision makers should not undervalue the unsteadiness of criteria during the ERP software selection process. Originality/value – This study contributes to the relevant literature by (1) utilizing the MACBETH method in the selection of the ERP software by optimization, and (2) validating the importance of expert opinions with uncertainties on a proper ERP software selection procedure. The findings of this study can thus help decision makers evaluate ERP selection under uncertain conditions.
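As a minimal sketch of how interval grey numbers can represent uncertain expert judgements (the whitenization and degree-of-greyness definitions below are common grey-systems conventions, not necessarily the authors' exact formulation):

class GreyNumber:
    # Interval grey number [a, b] for an uncertain MACBETH-style score.
    def __init__(self, a, b):
        self.a, self.b = a, b

    def whitened(self, t=0.5):
        # Equal-weight whitenization (midpoint) by default.
        return (1 - t) * self.a + t * self.b

    def degree_of_greyness(self):
        # Interval width relative to the whitened value; one common
        # definition among several in the grey-systems literature.
        return (self.b - self.a) / self.whitened()

score = GreyNumber(3.0, 5.0)  # expert judges "between 3 and 5"
print(score.whitened(), score.degree_of_greyness())  # 4.0 0.5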


Kybernetes ◽  
2016 ◽  
Vol 45 (3) ◽  
pp. 474-489 ◽  
Author(s):  
Moloud sadat Asgari ◽  
Abbas Abbasi ◽  
Moslem Alimohamadlou

Purpose – In the contemporary global market, supplier selection represents a crucial process for enhancing firms’ competitiveness. It is a multi-criteria decision-making problem involving the consideration of multiple criteria, and therefore requires reliable methods for selecting the best suppliers. The purpose of this paper is to examine and propose an appropriate method for selecting suppliers. Design/methodology/approach – ANFIS and fuzzy analytic hierarchy process-fuzzy goal programming (FAHP-FGP) are new methods for evaluating and selecting the best suppliers. These methods are used in this study for evaluating suppliers of dairy industries, and the results obtained from the two methods are compared using performance measures such as mean squared error, root mean squared error, normalized root mean squared error, mean absolute error, minimum absolute error and R2. Findings – The results indicate that the ANFIS method provides better performance than the FAHP-FGP method, with the selected suppliers scoring higher on all the performance measures. Practical implications – The proposed method could help companies select the best supplier by avoiding the influence of personal judgment. Originality/value – This study uses the well-structured fuzzy Delphi method to determine the supplier evaluation criteria, as well as the recent ANFIS and FAHP-FGP methods for supplier selection. In addition, unlike most other studies, it performs the selection process among all available suppliers.
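For reference, the comparison metrics listed above can be computed as follows (a sketch; NRMSE is normalized here by the range of the actual values, and other normalizations are possible):

import numpy as np

def performance_measures(actual, predicted):
    e = actual - predicted
    mse = np.mean(e ** 2)
    rmse = np.sqrt(mse)
    nrmse = rmse / (actual.max() - actual.min())  # range-normalized
    mae = np.mean(np.abs(e))
    min_ae = np.min(np.abs(e))
    r2 = 1 - np.sum(e ** 2) / np.sum((actual - actual.mean()) ** 2)
    return {"MSE": mse, "RMSE": rmse, "NRMSE": nrmse,
            "MAE": mae, "MinAE": min_ae, "R2": r2}

actual = np.array([7.1, 8.4, 6.9, 9.2])     # observed supplier scores
predicted = np.array([7.0, 8.0, 7.2, 9.0])  # model-estimated scores
print(performance_measures(actual, predicted))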


Biosensors ◽  
2018 ◽  
Vol 8 (4) ◽  
pp. 130 ◽  
Author(s):  
Georgina Ross ◽  
Maria Bremer ◽  
Jan Wichers ◽  
Aart van Amerongen ◽  
Michel Nielen

Lateral Flow Immunoassays (LFIAs) allow rapid, low-cost screening of many biomolecules, such as food allergens. Despite being classified as rapid tests, many LFIAs take 10–20 min to complete. For a truly high-speed LFIA, it is necessary to assess antibody association kinetics. Using a label-free optical technique such as Surface Plasmon Resonance (SPR), it is possible to screen crude monoclonal antibody (mAb) preparations for their association rates against a target. Herein, we describe an SPR-based method for screening and selecting crude anti-hazelnut antibodies based on their relative association rates, cross-reactivity and sandwich-pairing capabilities, for subsequent application in a rapid ligand binding assay. Thanks to the SPR selection process, only the fast (F-50-6B12) and slow (S-50-5H9) mAbs needed purification for labelling with carbon nanoparticles in high-speed LFIA prototypes. The kinetics observed in SPR were reflected in LFIA: the test line appeared within 30 s, almost two times faster when F-50-6B12 was used than with S-50-5H9. Additionally, the LFIAs demonstrated their applicability to real-life samples by detecting hazelnut in the sub-ppm range in a cookie matrix. Finally, these LFIAs not only provide a qualitative result when read visually, but also generate semi-quantitative data when exploiting freely downloadable smartphone apps.
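As an illustrative sketch of the kinetics screening step (a textbook 1:1 pseudo-first-order binding model; the authors' exact fitting procedure is not specified here), the observed association rate can be estimated from an SPR sensorgram and used to rank candidate mAbs:

import numpy as np
from scipy.optimize import curve_fit

def association(t, r_eq, k_obs):
    # 1:1 binding: R(t) = R_eq * (1 - exp(-k_obs * t)),
    # where k_obs = ka * C + kd.
    return r_eq * (1.0 - np.exp(-k_obs * t))

t = np.linspace(0, 120, 60)  # s
signal = association(t, 85.0, 0.04) + np.random.normal(0, 1.0, t.size)
(r_eq, k_obs), _ = curve_fit(association, t, signal, p0=(50.0, 0.01))
print(f"k_obs = {k_obs:.3f} 1/s")  # rank mAbs by relative k_obs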


2021 ◽  
Vol 19 (1) ◽  
pp. 2-20
Author(s):  
Piyush Kant Rai ◽  
Alka Singh ◽  
Muhammad Qasim

This article introduces calibration estimators under different distance measures based on two auxiliary variables in stratified sampling. The theory of the calibration estimator is presented, and the calibrated weights based on different distance functions are derived. A simulation study is carried out to judge the performance of the proposed estimators based on the minimum relative root mean squared error criterion. A real-life data set is also used to confirm the superiority of the proposed method.
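As a minimal sketch of the calibration idea (shown with a single auxiliary variable and the chi-square distance; the paper works with two auxiliary variables and several distance functions), the design weights are adjusted as little as possible while exactly reproducing a known auxiliary total:

import numpy as np

def chi_square_calibrated_weights(d, x, x_total):
    # Minimize sum((w - d)**2 / d) subject to sum(w * x) == x_total.
    # Closed form: w = d * (1 + lam * x).
    lam = (x_total - np.sum(d * x)) / np.sum(d * x ** 2)
    return d * (1.0 + lam * x)

d = np.array([10.0, 10.0, 20.0])  # design weights
x = np.array([2.0, 4.0, 3.0])     # auxiliary variable
w = chi_square_calibrated_weights(d, x, x_total=130.0)
print(w, np.sum(w * x))           # calibration constraint holds: 130.0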


2015 ◽  
Vol 7 (1) ◽  
pp. 15-30 ◽  
Author(s):  
Ksenija Mandić ◽  
Boris Delibašić ◽  
Dragan Radojević

The supplier selection process has attracted considerable attention in the business management literature. It takes several quantitative and qualitative variables into consideration and is usually modeled as a multi-attribute decision making (MADM) problem. A recognized shortcoming of classical MADM methods in the literature is that they do not permit the identification of interdependencies among attributes. Therefore, the aim of this study is to propose a model for selecting suppliers of telecommunications equipment that includes the interaction between attributes. This interaction can model the hidden knowledge needed for efficient decision-making. To model interdependencies among attributes, the authors use a recently proposed consistent fuzzy logic, interpolative Boolean algebra (IBA). For ranking alternatives they use the classical MADM method TOPSIS. The proposed model was evaluated on a real-life application. The conclusion is that decision makers were able to integrate their reasoning into the MADM model using interpolative Boolean algebra.
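For reference, a compact sketch of the classical TOPSIS ranking step (the IBA attribute-interaction modeling is the paper's contribution and is not reproduced here; the numbers are hypothetical):

import numpy as np

def topsis(matrix, weights, benefit):
    # Rows = alternatives, columns = attributes.
    m = matrix / np.linalg.norm(matrix, axis=0)  # vector normalization
    v = m * weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)  # closeness; higher is better

scores = topsis(np.array([[250.0, 7.0], [220.0, 9.0]]),
                weights=np.array([0.4, 0.6]),
                benefit=np.array([False, True]))  # cost, quality
print(np.argsort(scores)[::-1])  # supplier ranking, best first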


2019 ◽  
Vol 85 (1) ◽  
pp. 231-235
Author(s):  
Nikki de Rouw ◽  
Sabine Visser ◽  
Stijn L. W. Koolen ◽  
Joachim G. J. V. Aerts ◽  
Michel M. van den Heuvel ◽  
...  

Purpose – Pemetrexed is a widely used cytostatic agent with an established exposure–response relationship. Although dosing is based on body surface area (BSA), large interindividual variability in pemetrexed plasma concentrations is observed. Therapeutic drug monitoring (TDM) can be a feasible strategy to reduce variability in specific cases, potentially optimizing pemetrexed treatment. The aim of this study was to develop a limited sampling schedule (LSS) for the assessment of pemetrexed pharmacokinetics. Methods – Based on two real-life datasets, several limited sampling designs were evaluated for predicting clearance using NONMEM, based on the mean prediction error (MPE %) and normalized root mean squared error (NRMSE %). The predefined criteria for an acceptable LSS were a maximum of four sampling time points within 8 h, with an MPE and NRMSE ≤ 20%. Results – Only four samples within a convenient 8 h window were required for accurate and precise estimation of clearance (MPE and NRMSE of 3.6% and 5.7% for dataset 1 and of 15.5% and 16.5% for dataset 2). A single sample at t = 24 h also performed within the criteria, with MPE and NRMSE of 5.8% and 8.7% for dataset 1 and of 11.5% and 16.4% for dataset 2. Bias increased when patients had lower creatinine clearance. Conclusions – We presented two limited sampling designs for estimation of pemetrexed pharmacokinetics. Either one can be used based on preference and feasibility.
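For reference, the two evaluation metrics are commonly defined as follows for predicted clearances $\hat{CL}_i$ and reference clearances $CL_i$ over $n$ patients (the normalization of NRMSE can vary between studies):

$$\mathrm{MPE\,\%} = \frac{100}{n}\sum_{i=1}^{n}\frac{\hat{CL}_i - CL_i}{CL_i}, \qquad \mathrm{NRMSE\,\%} = \frac{100}{\overline{CL}}\sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(\hat{CL}_i - CL_i\right)^2}$$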


2020 ◽  
Vol 74 (7) ◽  
pp. 791-798
Author(s):  
Carl Emil Eskildsen ◽  
Tormod Næs

In applied spectroscopy, the purpose of multivariate calibration is almost exclusively to relate analyte concentrations and spectroscopic measurements. The multivariate calibration model provides estimates of analyte concentrations based on the spectroscopic measurements. Predictive performance is often evaluated based on a mean squared error. While this average measure can be used in model selection, it is not satisfactory for evaluating the uncertainty of individual predictions. For a calibration, the uncertainties are sample-specific. This is especially true for multivariate calibration, where interfering compounds may be present. Consider in-line spectroscopic measurements during a chemical reaction, production, etc. Here, reference values are not necessarily available. Hence, one should know the uncertainty of a given prediction in order to use that prediction for telling the state of the chemical reaction, adjusting the process, etc. In this paper, we discuss the influence of variance and bias on sample-specific prediction errors in multivariate calibration. We compare theoretical formulae with results obtained on experimental data. The results indicate that the bias contribution cannot necessarily be neglected when assessing sample-specific prediction ability in practice.
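The underlying decomposition is the standard one: for a given sample $i$ with reference value $y_i$ and prediction $\hat{y}_i$, the expected squared prediction error splits into a variance term and a squared bias term, where $\operatorname{Bias}(\hat{y}_i) = \mathbb{E}[\hat{y}_i] - y_i$; in multivariate calibration the bias of sample $i$ can arise, for example, from interfering compounds not spanned by the calibration set:

$$\mathbb{E}\big[(\hat{y}_i - y_i)^2\big] = \operatorname{Var}(\hat{y}_i) + \operatorname{Bias}(\hat{y}_i)^2$$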


Water ◽  
2019 ◽  
Vol 11 (7) ◽  
pp. 1340
Author(s):  
Woodson ◽  
Adams ◽  
Dymond

Quantitative precipitation estimation (QPE) remains a key area of uncertainty in hydrological modeling and prediction, particularly in small, urban watersheds, which respond rapidly to precipitation and can experience significant spatial variability in rainfall fields. Few studies have compared QPE methods in small, urban watersheds, and studies that have examined this topic only compared model results on an event basis using a small number of storms. This study sought to compare the efficacy of multiple QPE methods when simulating discharge in a small, urban watershed on a continuous basis using an operational hydrologic model and QPE forcings. The research distributed hydrologic model (RDHM) was used to model a basin in Roanoke, Virginia, USA, forced with QPEs from four methods: mean field bias (MFB) correction of radar data, kriging of rain gauge data, uncorrected radar data, and a basin-uniform estimate from a single gauge inside the watershed. Based on comparisons between simulated and observed discharge at the basin outlet for a six-month period in 2018, simulations forced with the uncorrected radar QPE had the highest accuracy, as measured by root mean squared error (RMSE) and peak flow relative error, despite systematic underprediction of the mean areal precipitation (MAP). Simulations forced with MFB-corrected radar data consistently and significantly overpredicted discharge, but had the highest accuracy in predicting the timing of peak flows.
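As a minimal sketch of the mean field bias (MFB) correction referenced above (operational schemes typically filter gauge–radar pairs for quality and update the factor over time; this is not the exact RDHM/Roanoke configuration, and the numbers are hypothetical):

import numpy as np

def mean_field_bias(gauge, radar_at_gauges):
    # Single multiplicative factor from collocated gauge/radar pairs.
    return np.sum(gauge) / np.sum(radar_at_gauges)

gauge = np.array([5.2, 3.1, 4.4])  # mm, gauge accumulations
radar = np.array([4.0, 2.5, 3.6])  # mm, radar estimates at gauge pixels
beta = mean_field_bias(gauge, radar)
corrected = beta * np.array([[3.8, 4.1], [2.9, 3.3]])  # apply to full grid
print(beta)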


2014 ◽  
Vol 24 (5) ◽  
pp. 418-433 ◽  
Author(s):  
John E.G. Bateson ◽  
Jochen Wirtz ◽  
Eugene Burke ◽  
Carly Vaughan

Purpose – Service employees in subordinate service roles are crucial for operational efficiency and service quality. However, the stressful nature of these roles, inappropriate hire selection, and the proliferation of job boards have created massive recruitment problems for HR departments. The purpose of this paper is to highlight the growing costs of recruiting the right candidates for service roles while offering an alternative approach to recruitment that is more efficient and effective than the traditional approach. Design/methodology/approach – The study offers empirical evidence of five instances in which the use of psychometric sifting procedures reduced recruitment costs while improving the quality of the resultant hires. Findings – By standing the traditional recruitment process “on its head” and using psychometric tests at the start of the selection process, recruitment can be significantly improved. Such tests efficiently weed out unsuitable candidates before they even enter the recruitment process, leaving a smaller, better-qualified pool for possible recruitment. Practical implications – Firms can safely use psychometric sifts to select applicants according to their operational efficiency, customer orientation, and overall performance. This paper illustrates the use of both traditional questionnaire measures and situational judgment tests to remove unsuitable applicants at the start of the selection process. A real-life case study suggests that such an approach improves the hiring success ratio from 6:1 to 2:1. In the opening of a new supermarket by a UK group, this process saved 73,000 hours of managers’ time, representing $1.8 million in savings on opening costs. Originality/value – The paper offers a viable cost-saving alternative to a growing problem for HR departments in service firms and provides directions for further research.

