Development of Quality Control Requirements for Improving the Quality of Architectural Design Based on BIM

2020, Vol 10 (20), pp. 7074
Author(s): Jungsik Choi, Sejin Lee, Inhan Kim

Building information modeling (BIM) technology has been utilized increasingly in quantitative ways in the architecture, engineering, and construction disciplines. However, owing to increasing requirements for improvement in qualitative factors in BIM-based design projects, it has become necessary to develop a checking and evaluation process for BIM-based architectural design. The purpose of this study is to develop and apply BIM-based quality control requirements for improving the quality of architectural design. To achieve this, the research investigated case studies of BIM data quality control and classified quality control targets according to physical/logical quality and data quality. Quality check criteria and checklists were developed by reconfiguring the deduced quality control targets according to each requirement. The research then developed a rule-based quality checking system that uses these requirements for efficient BIM-based quality control.
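The rule-based checking described above can be sketched as predicates evaluated against model elements. This is a minimal illustration only: the element fields, rule names, and requirements below are assumptions for demonstration, not the authors' actual checklists, and a real system would read IFC model data rather than plain dicts.

```python
# Hedged sketch of a rule-based BIM quality check: each rule is a named
# predicate over an element; violations form the checklist report.
# All rule names and element fields here are illustrative assumptions.

def make_rule(name, predicate, message):
    return {"name": name, "check": predicate, "message": message}

RULES = [
    make_rule(
        "logical-quality/space-name",
        lambda e: e["type"] != "IfcSpace" or bool(e.get("name")),
        "Every space must have a name.",
    ),
    make_rule(
        "physical-quality/wall-height",
        lambda e: e["type"] != "IfcWall" or e.get("height", 0) > 0,
        "Walls must have a positive height.",
    ),
]

def run_checks(elements, rules=RULES):
    """Return a list of (element id, rule name, message) violations."""
    violations = []
    for e in elements:
        for rule in rules:
            if not rule["check"](e):
                violations.append((e["id"], rule["name"], rule["message"]))
    return violations

model = [
    {"id": "s1", "type": "IfcSpace", "name": "Lobby"},
    {"id": "s2", "type": "IfcSpace", "name": ""},     # fails: unnamed space
    {"id": "w1", "type": "IfcWall", "height": 3.0},
    {"id": "w2", "type": "IfcWall", "height": 0.0},   # fails: zero height
]
print(run_checks(model))
```

Keeping rules as data (rather than hard-coded branches) is what lets such a system be reconfigured per requirement, as the study describes.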

2012, Vol 2012, pp. 1-8
Author(s): Janet E. Squires, Alison M. Hutchinson, Anne-Marie Bostrom, Kelly Deis, Peter G. Norton, ...

Researchers strive to optimize data quality in order to ensure that study findings are valid and reliable. In this paper, we describe a data quality control program designed to maximize quality of survey data collected using computer-assisted personal interviews. The quality control program comprised three phases: (1) software development, (2) an interviewer quality control protocol, and (3) a data cleaning and processing protocol. To illustrate the value of the program, we assess its use in the Translating Research in Elder Care Study. We utilize data collected annually for two years from computer-assisted personal interviews with 3004 healthcare aides. Data quality was assessed using both survey and process data. Missing data and data errors were minimal. Mean and median values and standard deviations were within acceptable limits. Process data indicated that in only 3.4% and 4.0% of cases was the interviewer unable to conduct interviews in accordance with the details of the program. Interviewers’ perceptions of interview quality also significantly improved between Years 1 and 2. While this data quality control program was demanding in terms of time and resources, we found that the benefits clearly outweighed the effort required to achieve high-quality data.
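A data cleaning and processing protocol of the kind described can be pictured as simple per-record checks for missingness and out-of-range answers. The field names and valid ranges below are assumptions for illustration, not the study's actual codebook.

```python
# Illustrative sketch of one data-cleaning pass: flag missing values and
# out-of-range answers in survey records. Fields and ranges are assumed.

VALID_RANGES = {"age": (18, 90), "satisfaction": (1, 5)}

def clean_record(record):
    """Return lists of missing and out-of-range fields for one record."""
    missing = [f for f in VALID_RANGES if record.get(f) is None]
    out_of_range = [
        f for f, (lo, hi) in VALID_RANGES.items()
        if record.get(f) is not None and not (lo <= record[f] <= hi)
    ]
    return missing, out_of_range

records = [
    {"age": 34, "satisfaction": 4},
    {"age": None, "satisfaction": 6},  # missing age; satisfaction out of range
]
for i, r in enumerate(records):
    print(i, clean_record(r))
```

Running such checks at collection time (inside the computer-assisted interview software) rather than after the fact is one way programs like this keep missing data and data errors minimal.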


2017, Vol 46 (2), pp. 69-77
Author(s): Beth A Reid, Lee Ridoutt, Paul O’Connor, Deirdre Murphy

Introduction: This article presents some of the results of a year-long project in the Republic of Ireland to review the quality of the hospital inpatient enquiry data for its use in activity-based funding (ABF). This is the first of two papers regarding best practice in the management of clinical coding services. Methods: Four methods were used to address this aspect of the project, namely a literature review, a workshop, an assessment of the coding services in 12 Irish hospitals by structured interviews of the clinical coding managers, and a medical record audit of the clinical codes in 10 hospitals. Results: The results included here are those relating to the quality of the medical records, coding work allocation and supervision processes, data quality control measures, communication with clinicians, and the visibility of clinical coders, their managers, and the coding service. Conclusion: The project found instances of best practice in the study hospitals but also found several areas needing improvement. These included improving the structure and content of the medical record, clinician engagement with the clinical coding teams and the ABF process, and the use of data quality control measures.


2021
Author(s): Francesco Battocchio, Jaijith Sreekantan, Arghad Arnaout, Abed Benaichouche, Juma Sulaiman Al Shamsi, ...

Abstract Drilling data quality is notoriously a challenge for any analytics application, owing to the complexity of the real-time data acquisition system, which routinely generates: (i) time-related issues caused by irregular sampling; (ii) channel-related issues in the form of non-uniform names and units, and missing or wrong values; and (iii) depth-related issues caused by block position resets and depth compensation (for floating rigs). On the other hand, artificial intelligence drilling applications typically require a consistent stream of high-quality data as input for their algorithms, as well as for visualization. In this work we present an automated workflow, enhanced by data-driven techniques, that resolves complex quality issues, harmonizes sensor drilling data, and reports the quality of the dataset to be used for advanced analytics. The approach proposes an automated data quality workflow which formalizes the characteristics, requirements, and constraints of sensor data within the context of drilling operations. The workflow leverages machine learning algorithms, statistics, signal processing, and rule-based engines to detect data quality issues, including erroneous values, outliers, bias, drift, noise, and missing values. Once data quality issues are classified, they are scored and treated on a context-specific basis in order to recover the maximum volume of data while avoiding information loss. The result is a data quality and preparation engine that organizes drilling data for further advanced analytics and reports the quality of the dataset through key performance indicators. This novel data processing workflow allowed more than 90% of a drilling dataset comprising 18 offshore wells to be recovered that otherwise could not have been used for analytics.
This was achieved by resolving specific issues, including resampling time series with gaps and differing sampling rates, and smart imputation of wrong/missing data while preserving the consistency of the dataset across all channels. A further improvement would be to recover data values that fell outside a meaningful range because of sensor drift or depth resets. The present work automates the end-to-end workflow for quality control of drilling sensor data by leveraging advanced artificial intelligence (AI) algorithms. It detects and classifies patterns of wrong/missing data and recovers them through a context-driven approach that prevents information loss. As a result, the maximum amount of data is recovered for artificial intelligence drilling applications. The workflow also enables optimal time synchronization of different sensors streaming data at different frequencies within discontinuous time intervals.
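Two of the steps mentioned above, resampling an irregularly sampled channel onto a uniform time grid and imputing short gaps, can be sketched as follows. The nearest-sample strategy, grid step, and gap tolerance are assumptions for illustration; the authors' actual resampling and smart-imputation methods are not detailed in the abstract.

```python
# Sketch: resample an irregular channel onto a uniform grid, then fill
# short gaps by linear interpolation. Parameters are illustrative only.

def resample_uniform(times, values, step):
    """Nearest-sample resampling of (times, values) onto a uniform grid."""
    grid = []
    t = times[0]
    while t <= times[-1]:
        grid.append(t)
        t += step
    out = []
    for g in grid:
        # pick the nearest raw sample; mark as missing if it is too far away
        j = min(range(len(times)), key=lambda i: abs(times[i] - g))
        out.append(values[j] if abs(times[j] - g) <= step / 2 else None)
    return grid, out

def impute_linear(values):
    """Fill runs of None by interpolating between surrounding values."""
    vals = list(values)
    for i, v in enumerate(vals):
        if v is None:
            left = next((j for j in range(i - 1, -1, -1) if vals[j] is not None), None)
            right = next((j for j in range(i + 1, len(vals)) if vals[j] is not None), None)
            if left is not None and right is not None:
                frac = (i - left) / (right - left)
                vals[i] = vals[left] + frac * (vals[right] - vals[left])
    return vals

times = [0.0, 1.1, 2.0, 5.0, 6.1]      # irregular sampling, gap at 3-4 s
values = [10.0, 11.0, 12.0, 15.0, 16.0]
grid, resampled = resample_uniform(times, values, step=1.0)
print(impute_linear(resampled))
```

In practice the same uniform grid would be applied to every channel, which is what makes cross-channel consistency and time synchronization possible.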


Author(s): C. X. Chen, H. Zhang, K. Jiang, H. T. Zhao, W. Xie, ...

Abstract. In recent years, China has promulgated the "Civil Code of the People's Republic of China", the "Implementation Rules of the Provisional Regulations on Real Estate Registration", and other laws and regulations, which protect citizens' real estate rights and obligations within the legal system. This underscores the importance of the quality of real estate registration data. At present, however, there is no set of standards for evaluating the quality of real estate registration data. This article sorts out the production process of real estate registration data and focuses on four production stages: digitization results; field survey and surveying-and-mapping results; group building results; and integration and association. On this basis, the main points of real estate registration data quality control are put forward and a quality evaluation model is developed. Taking the quality inspection of Beijing's integrated historical real estate registration archives as an application case, the model was successfully applied to an actual project, ensuring the quality of Beijing's real estate registration data. It also provides a reference for the next step in China's quality control of the unified registration of natural resources confirmation.
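A stage-based quality evaluation model of the kind described could take a weighted combination of per-stage error rates. The weights, error rates, and score scale below are purely illustrative assumptions; the abstract does not give the model's actual form.

```python
# Hedged sketch of a stage-weighted quality evaluation: each production
# stage contributes an error rate, and stages are weighted into one score.
# Weights and rates are assumptions for illustration only.

STAGE_WEIGHTS = {
    "digitization": 0.25,
    "field_survey_mapping": 0.30,
    "group_building": 0.20,
    "integration_association": 0.25,
}

def quality_score(error_rates):
    """Weighted pass rate in [0, 100]; lower error rates score higher."""
    score = 0.0
    for stage, weight in STAGE_WEIGHTS.items():
        score += weight * (1.0 - error_rates[stage]) * 100.0
    return round(score, 2)

rates = {
    "digitization": 0.02,
    "field_survey_mapping": 0.01,
    "group_building": 0.05,
    "integration_association": 0.03,
}
print(quality_score(rates))
```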


2018, Vol 13 (2), pp. 131-146
Author(s): Mirwan Rofiq Ginanjar, Sri Mulat Yuningsih

Planning and management of water resources depend on the quality of hydrological data, which plays an important role in hydrological analysis. The availability of good, qualified hydrological data is one of the determinants of the quality of hydrological analysis results. However, the facts indicate that much of the available data does not fit its ideal state. To solve this problem, a hydrological data quality control model should be established in order to improve the quality of national hydrological data. The scope includes quality control of rainfall and discharge data. Quality control analysis of rainfall data was conducted on 58 rainfall stations spread across the island of Java. The analysis shows that 41 stations are categorized as good, 14 as moderate, and 3 as bad. Based on these results, a light improvement scenario was performed: the good category increased to 46 stations, the moderate category decreased to 11, and the bad category was reduced to 1. Quality control analysis of discharge data was conducted on 14 discharge stations spread across Java Island. Analyses were performed for QC1, QC2, and QC3, which were then combined into a final QC value. The final QC results show no stations in the good category, 2 in the moderate category, and 12 in the bad category. Based on the results of the analysis, a light improvement scenario was performed, in which 5 bad-category stations improved to good, 7 bad-category stations improved to moderate, and 1 moderate-category station improved.
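The combination of QC1, QC2, and QC3 into a final category can be sketched as an aggregation followed by thresholding. The averaging rule and the category thresholds below are assumptions for illustration; the paper's actual scoring rules are not given in the abstract.

```python
# Sketch: combine per-test QC scores (QC1-QC3) into a final station
# category. The averaging and thresholds are illustrative assumptions.

def final_category(qc1, qc2, qc3):
    """Average three QC scores (0-100) and map the result to a category."""
    final = (qc1 + qc2 + qc3) / 3
    if final >= 80:
        return "good"
    if final >= 60:
        return "moderate"
    return "bad"

stations = {"A": (90, 85, 88), "B": (70, 65, 60), "C": (40, 55, 50)}
print({name: final_category(*scores) for name, scores in stations.items()})
```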


Author(s): Seunghwa Park, Inhan Kim

Today’s buildings are getting larger and more complex. As a result, the traditional method of manually checking the design of a building is no longer efficient, since such a process is time-consuming and laborious. It is becoming increasingly important to establish and automate processes for checking the quality of buildings. By automatically checking whether buildings satisfy requirements, Building Information Modeling (BIM) allows for rapid decision-making and evaluation. In this context, the work presented here focuses on resolving building safety issues via a proposed BIM-based quality checking process. Through use case studies, the efficiency and usability of the devised strategy are evaluated. This research can be beneficial in promoting the efficient use of BIM-based communication and collaboration among the project parties concerned for improving safety management. In addition, the work presented here has the potential to expand research efforts in BIM-based quality checking processes.


Author(s): Antonella D. Pontoriero, Giovanna Nordio, Rubaida Easmin, Alessio Giacomel, Barbara Santangelo, ...
