The performance of semi-distributed modelling in urban drainage: the trade-off between hydrological measurements and geographical data

2013 ◽  
Vol 8 (3-4) ◽  
pp. 479-486
Author(s):  
C. Bonhomme ◽  
G. Petrucci

Nowadays, high-resolution geographical data are widely available for research and operational purposes. Including these data in hydrological models might be expected to improve their performance directly and clearly. Different configurations and model structures, incorporating increasing amounts of geographical information, are tested using the widely used Stormwater Management Model 5 on a 2.5 km2 pilot catchment (located in Sucy-en-Brie, near Paris, France). The Nash-Sutcliffe criterion is used to estimate the goodness of fit between model simulations and available measurements. While including some basic spatial information on land use clearly improves the performance of the uncalibrated model, the gain in performance is less obvious as the user continues to refine the geographical description of the catchment. Moreover, predictions from a model calibrated with an efficient calibration procedure are comparable to those of a more physical approach with a fine spatial description and no calibration. Finally, the quality of the data used for calibration and validation appears to be a key factor in obtaining a good fit between measurements and model predictions.
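The Nash-Sutcliffe criterion mentioned above has a simple closed form: 1 minus the ratio of the residual variance to the variance of the observations. The sketch below computes it; the flow series are invented for illustration and are not data from the study.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 = perfect fit,
    0 = no better than predicting the mean of the observations."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    residual_var = np.sum((observed - simulated) ** 2)
    total_var = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - residual_var / total_var

# Hypothetical flow series (m3/s), for illustration only
obs = [0.2, 0.5, 1.1, 0.9, 0.4]
sim = [0.25, 0.45, 1.0, 0.95, 0.5]
print(nash_sutcliffe(obs, sim))
```

Values close to 1 indicate a good fit; negative values mean the model performs worse than the mean of the measurements.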

Author(s):  
Raul E. Avelar ◽  
Karen Dixon ◽  
Boniphace Kutela ◽  
Sam Klump ◽  
Beth Wemple ◽  
...  

The calibration of safety performance functions (SPFs) is a mechanism included in the Highway Safety Manual (HSM) to adjust its SPFs for use in a target jurisdiction. Critically, the quality of the calibration procedure must be assessed before the calibrated SPFs are used. Multiple resources to aid practitioners in calibrating SPFs have been developed in the years since the publication of the HSM 1st edition. Similarly, the literature suggests multiple ways to assess the goodness of fit (GOF) of a calibrated SPF to a data set from a given jurisdiction. This paper uses the results of calibrating multiple intersection SPFs to a large Mississippi safety database to examine the relationships among multiple GOF metrics. The goal is to develop a sensible single index that leverages the joint information from multiple GOF metrics to assess the overall quality of a calibration. A factor analysis applied to the calibration results revealed three underlying factors explaining 76% of the variability in the data. From these results, the authors developed an index and performed a sensitivity analysis. The key metrics were found to be, in descending order of importance: the deviation of the cumulative residual (CURE) plot from the 95% confidence area, the mean absolute deviation, the modified R-squared, and the value of the calibration factor. This paper also presents comparisons between the index and alternative scoring strategies, as well as an effort to verify the results using synthetic data. The developed index is recommended for comprehensively assessing the quality of calibrated intersection SPFs.
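Two of the metrics named above can be illustrated concretely. The sketch below computes an HSM-style calibration factor (total observed over total predicted crashes) and the share of a CURE plot that falls outside a 95% confidence band, using the common variance adjustment for cumulative residuals; the function names and the ordering variable (AADT) are our own illustrative choices, not the paper's implementation.

```python
import numpy as np

def calibration_factor(observed, predicted):
    """HSM-style calibration factor: total observed / total predicted crashes."""
    return np.sum(observed) / np.sum(predicted)

def cure_exceed_percent(observed, predicted, aadt):
    """Percent of CURE plot points outside a 95% confidence band.
    Residuals are accumulated in order of increasing AADT."""
    order = np.argsort(aadt)
    resid = np.asarray(observed, float)[order] - np.asarray(predicted, float)[order]
    cum_resid = np.cumsum(resid)
    cum_sq = np.cumsum(resid ** 2)
    total_sq = cum_sq[-1]
    # Adjusted standard deviation of the cumulative residuals,
    # shrinking to zero at the right-hand end of the plot
    sigma = np.sqrt(cum_sq * (1.0 - cum_sq / total_sq))
    outside = (cum_resid > 1.96 * sigma) | (cum_resid < -1.96 * sigma)
    return 100.0 * outside.mean()
```

A well-calibrated SPF yields a calibration factor near 1.0 and a CURE plot that stays almost entirely inside the band.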


2012 ◽  
Vol 13 (2) ◽  
pp. 297 ◽  
Author(s):  
Y. ISSARIS ◽  
S. KATSANEVAKIS ◽  
M. PANTAZI ◽  
V. VASSILOPOULOU ◽  
P. PANAYOTIDIS ◽  
...  

Mapping of ecosystem components (natural and socioeconomic) is a prerequisite for ecosystem-based marine spatial management (EB-MSM). To initiate the process of EB-MSM in the Greek Ionian Sea and the adjacent gulfs, the main relevant ecosystem components were mapped based on existing spatial information and expert judgment. The natural components mapped included habitat types and species targeted for conservation, according to national and European legislation and international agreements. The main human activities/pressures related to fisheries, aquaculture, tourism, and industry were also mapped. To assess the quality of the data used to map the ecosystem components, and thereby account for the inherent uncertainty, five semi-quantitative data indicators based on a pedigree matrix were evaluated. Through this qualitative approach we gained information on the sources, acquisition and verification procedures, statistical properties, temporal and geographical correlation, and collection process quality of the data on the ecosystem components under study. Substantial overlap between ecological features and human activities was identified, confirming the need for a well-planned approach to marine space management in order to mitigate conflicts over marine resources and conserve marine ecosystems and their associated goods and services.
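A pedigree-matrix assessment like the one described can be reduced to a single score per data source. The sketch below is a hypothetical aggregation over the five indicators named in the abstract; the 0-4 rating scale and the equal weighting are our assumptions, not the authors' scheme.

```python
# The five indicators from the abstract; names are paraphrased identifiers.
INDICATORS = (
    "source",
    "acquisition_and_verification",
    "statistical_properties",
    "temporal_and_geographical_correlation",
    "collection_process",
)

def pedigree_score(ratings):
    """Mean of the five indicator ratings (each 0 = weak to 4 = strong),
    normalised to the 0-1 range. Equal weights are an assumption."""
    if set(ratings) != set(INDICATORS):
        raise ValueError("expected exactly one rating per indicator")
    return sum(ratings[k] for k in INDICATORS) / (4.0 * len(INDICATORS))

# A hypothetical habitat-type data layer rated by experts
layer = {"source": 3, "acquisition_and_verification": 2,
         "statistical_properties": 2,
         "temporal_and_geographical_correlation": 4,
         "collection_process": 3}
print(pedigree_score(layer))  # a value between 0 and 1
```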


2021 ◽  
Vol 21 (2) ◽  
pp. 47-58
Author(s):  
Daniel Lapresa Ajamil ◽  
Alba Otero ◽  
Javier Arana ◽  
Ildefonso Álvarez ◽  
María Teresa Anguera

The reliability of datasets in observational methodology is typically tested using coefficients of agreement, correlation coefficients, or generalizability theory. Another increasingly popular method used to demonstrate the quality of data is the consensus agreement method, in which two or more observers agree on their coding decisions while creating the dataset. Although the consensus agreement method is being increasingly used in observational studies, few studies have conducted an in-depth analysis of how this qualitative procedure is approached or of how it can be optimized.
In this study, in addition to presenting a practical example of the application of the consensus agreement method, we compare the results from three groups (of two, three, and four observers) to analyze performance in terms of the time required to code the data and goodness of fit with respect to an optimal dataset. No significant differences were found between the three groups for either of the variables analyzed. Prior calculation of the sample size required to detect significant differences between the groups adds strength to our conclusions regarding the efficiency of the consensus agreement method.


2021 ◽  
Vol 9 ◽  
Author(s):  
Marelyn Telun Daniel ◽  
Tham Fatt Ng ◽  
Mohd. Farid Abdul Kadir ◽  
Joy Jacqueline Pereira

Landslide susceptibility assessment was conducted in Canada Hill, Sarawak, Malaysia through a combined bivariate-statistics and expert-consultation approach using a geographical information system, capturing landslide-conditioning parameters specific to the study area to ensure its usefulness in practice. Over the past four decades, many landslide parameters and increasingly sophisticated statistical methods have been used in landslide research. However, the findings have had very limited use in practice, as actual ground conditions are not well represented. This weakness stems from poor data quality in landslide inventories and inadequate understanding of landslide-conditioning parameters. In this study, a bivariate statistical method was used in conjunction with an iterative process of expert consultation. Thirteen original landslide-conditioning parameters were narrowed down to six, with the addition of a unique parameter, planar failure potential, selected based on expert input. This parameter captures planar failure landslides, which have the highest impact in the study area, causing loss of life and property destruction. The inaugural landslide susceptibility map for the study area has five classes: very low, low, moderate, high, and very high susceptibility. All major planar failures and most smaller circular failures fall within the very high susceptibility class, with a success rate of 75.8%. The approach used in this study has improved the quality of the landslide inventory and delineated the key conditioning parameters. The resultant map captures local conditions, which is useful for landslide management.
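The bivariate statistical step can be illustrated with the frequency-ratio form commonly used in landslide susceptibility mapping; the abstract does not specify which bivariate statistic was applied, so this is an assumed example on invented raster data.

```python
import numpy as np

def frequency_ratio(class_map, landslide_mask):
    """Frequency ratio per class of one conditioning parameter:
    (landslide pixels in class / all landslide pixels)
    divided by (pixels in class / all pixels).
    FR > 1 means the class is more landslide-prone than average."""
    class_map = np.asarray(class_map).ravel()
    mask = np.asarray(landslide_mask, dtype=bool).ravel()
    total_pixels = class_map.size
    total_landslides = mask.sum()
    fr = {}
    for c in np.unique(class_map):
        in_class = class_map == c
        landslide_share = mask[in_class].sum() / total_landslides
        area_share = in_class.sum() / total_pixels
        fr[c] = landslide_share / area_share
    return fr

# Toy slope-class raster (1 = gentle, 2 = steep) and a landslide inventory mask
slope_classes = [[1, 1], [2, 2]]
landslides = [[0, 0], [1, 0]]
print(frequency_ratio(slope_classes, landslides))
```

Summing the per-parameter FR values for each pixel gives a susceptibility index, which is then binned into classes such as the five used in the study.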


Author(s):  
Amir Masoud Forati ◽  
Rina Ghose

Recent advancements in web-based geospatial software and smartphone technology have popularized the voluntary production and sharing of geospatial data by individual citizens. Through such Volunteered Geographic Information (VGI) activities, people across the world participate in online mapping projects (such as OpenStreetMap) to contribute their spatial information. The quality of the data generated by these VGI activities has profound impacts on online mapping projects and their spatial databases. In this study, we examine the VGI contribution pattern in OpenStreetMap through three case-study neighborhoods located in three major cities: Tehran, London, and Los Angeles, and investigate how it might affect the quality assessment of VGI.


Author(s):  
B. L. Armbruster ◽  
B. Kraus ◽  
M. Pan

One goal in electron microscopy of biological specimens is to improve the quality of data to equal the resolution capabilities of modern transmission electron microscopes. Radiation damage and beam-induced movement caused by charging of the sample, low image contrast at high resolution, and sensitivity to external vibration and drift in side-entry specimen holders limit the effective resolution one can achieve. Several methods have been developed to address these limitations: cryomethods are widely employed to preserve and stabilize specimens against some of the adverse effects of the vacuum and electron beam irradiation; spot-scan imaging reduces charging and the associated beam-induced movement; and energy-filtered imaging removes the "fog" caused by inelastic scattering of electrons, which is particularly pronounced in thick specimens. Although most cryoholders can easily achieve a 3.4 Å resolution specification, information perpendicular to the goniometer axis may be degraded by vibration. Absolute drift after mechanical and thermal equilibration, as well as drift after movement of a holder, may cause loss of resolution in any direction.


Author(s):  
Nur Maimun ◽  
Jihan Natassa ◽  
Wen Via Trisna ◽  
Yeye Supriatin

Accuracy in assigning diagnosis codes is an important matter for medical recorders, and data quality is central to the health information management they perform. This study aims to assess coder competency regarding the accuracy and precision of ICD-10 use at X Hospital in Pekanbaru. The study used a qualitative method with a case-study design involving five informants. The results show that medical personnel (doctors) have never received training on coding; doctors' handwriting is hard to read; diagnosis or procedure codes are sometimes assigned incorrectly; doctors use customary abbreviations that are not standard; some officers do not understand the nomenclature or have not mastered anatomical pathology; and the available facilities and infrastructure support accurate and precise coding. Coding errors occur because of human error. Accuracy and precision in coding strongly influence INA-CBGs costs; the medical committee handled most of the work in severity level III cases, while medical records played a role in monitoring and evaluating the implementation of coding. If a medical resume is unclear, the case-mix team checks the medical record file to verify the diagnosis or code. Keywords: coder competency, accuracy and precision of coding, ICD 10


2017 ◽  
Vol 4 (1) ◽  
pp. 25-31 ◽  
Author(s):  
Diana Effendi

The Information Product Approach (IP Approach) is an information management approach that can be used to manage product information and analyze data quality. Organizations can use IP-Maps to facilitate knowledge management in collecting, storing, maintaining, and using data in an organized way. The data management process for academic activities at X University has not yet used the IP approach. X University has not paid attention to the quality of its information management, concerning itself only with the system applications used to automate data management in academic activities. The IP-Map produced in this paper can serve as a basis for analyzing the quality of data and information. With the IP-Map, X University is expected to identify which parts of the process need improvement in the management of data and information quality. Index terms: IP Approach, IP-Map, information quality, data quality.


2019 ◽  
Author(s):  
Bogdan Corneliu Andor ◽  
Dionisio Franco Barattini ◽  
Dumitru Emanuel Dogaru ◽  
Simone Guadagna ◽  
Serban Rosu

BACKGROUND Osteoarthritis (OA) is one of the top five most disabling conditions, and it affects more than one third of persons over 65 years of age. Currently, 80% of persons affected by OA report some movement limitation, 20% are not able to perform major activities of daily living, and about 11% of the total affected population need personal care. In 2014, the European Society for Clinical and Economic Aspects of Osteoporosis and Osteoarthritis (ESCEO) suggested, as the first step of pharmacological treatment for knee OA, a background therapy with chronic symptomatic slow-acting drugs for osteoarthritis (SYSADOAs), such as glucosamine sulphate, chondroitin sulphate, and hyaluronic acid (HA). In studies of oral HA, symptoms of OA are often measured using subjective parameters such as the visual analog scale (VAS) or quality of life questionnaires (QoL), while objective measurements such as ultrasonography (US) or range of motion (ROM) are employed in very few trials. This affects the quality of data in the literature. OBJECTIVE The primary objective of this work is to assess the feasibility of implementing US and ROM as objective measurements to correlate the improvement of knee mobility with pain reduction, evaluated using a subjective scale (VAS), in patients taking a nutraceutical containing HA. The secondary objective is to evaluate the enrollment rate over one month to verify the feasibility, in terms of time and budget, of the planned future main study. The explorative objective of the trial is to obtain preliminary data on the efficacy of the tested product. METHODS This open-label pilot trial is performed in an orthopedic clinic (Timisoara, Romania). Male and female subjects (50 to 70 years of age) diagnosed with symptomatic OA of the knee, with mild joint discomfort for at least 6 months, are included. Following the protocol, 8 patients are administered Syalox® 300 Plus (River Pharma, Italy), a product based on HA of high molecular weight, for 8 weeks.
Baseline and final visit assessments include an orthopedic assessment, US, the Knee injury and Osteoarthritis Outcome Score (KOOS) questionnaire, VAS, and ROM of the knee. RESULTS Data collection occurred between February 2018 and June 2018. All results are expected to be available by the end of 2018. CONCLUSIONS This pilot trial will be the first study to analyze the potential correlation between subjective evaluations (VAS, KOOS questionnaire) and objective measurements (US, ROM, and actigraphy). The data from this study will assess the feasibility of the planned monthly recruitment rate and the necessary time and budget, and should provide preliminary information on the efficacy of the tested product. CLINICALTRIAL ClinicalTrials.gov (NCT number: NCT03421054).


2020 ◽  
Author(s):  
Juqing Zhao ◽  
Pei Chen ◽  
Guangming Wan

BACKGROUND There has been an increasing number of eHealth and mHealth interventions aimed at supporting symptom management among cancer survivors. However, patient engagement has not been guaranteed or standardized in these interventions. OBJECTIVE The objective of this review was to address how patient engagement has been defined and measured in eHealth and mHealth interventions designed to improve symptoms and quality of life for cancer patients. METHODS Searches were performed in MEDLINE, PsychINFO, Web of Science, and Google Scholar to identify eHealth and mHealth interventions designed specifically to improve symptom management for cancer patients. The definition and measurement of engagement and engagement-related outcomes of each intervention were synthesized. This integrated review was conducted using Critical Interpretive Synthesis to ensure the quality of data synthesis. RESULTS A total of 792 intervention studies were identified through the searches; 10 research papers met the inclusion criteria. Most of them (6/10) were randomized trials, 2 were single-group trials, 1 used a qualitative design, and 1 used mixed methods. The majority of the identified papers defined patient engagement as the usage of an eHealth or mHealth intervention, operationalized through different variables (e.g., usage time, number of log-ins, participation rate). Engagement has also been described as the subjective experience of interacting with the intervention. The measurement of engagement follows from its definition and can be categorized into objective and subjective measures. Among the identified papers, 5 used system usage data, 2 used self-reported questionnaires, 1 used sensor data, and 3 used qualitative methods. Almost all studies reported engagement at a moment-to-moment level, but measurement of long-term engagement is lacking. CONCLUSIONS There have been calls to develop a standard definition and measurement of patient engagement in eHealth and mHealth interventions.
In addition, it is important to provide cancer patients with more tailored and engaging eHealth and mHealth interventions to sustain long-term engagement.

