Managing data entry of a large-scale interview project with optical scanning hardware and software

1991, Vol 23 (2), pp. 214-218
Author(s): Cole Barton, Chris Hatcher, Karen Schurig, Paul Marciano, Kathryn Wilcox, ...
1975, Vol 14 (01), pp. 32-34
Author(s): Elisabeth Schach

Data from experience with an optical mark page reader (IBM 1231 N1) are presented. Information from 52,000 persons was gathered in seven countries, coded decentrally, and processed centrally. Reader performance rates (i.e., sheets read per hour, sheet rejection rates, and reading error rates) and costs (coding, verification, reading, etc.) are given.


Electronics, 2019, Vol 8 (11), pp. 1227
Author(s): Carrasco, Álvarez, Velázquez, Concha, Pérez-Cotapos

One of the most widely used electro-mechanical systems in large-scale mining is the electric motor. This device is employed in practically every phase of production. For this reason, it needs to be inspected regularly to maintain maximum operability and avoid unplanned stoppages. To identify potential faults, regular check-ups are performed to measure the internal parameters of the components, especially the brushes and brush-holders. Both components must be properly aligned and calibrated to prevent electric arcs from reaching the motor's internal insulation. Although there is an increasing effort to improve inspection tasks, most inspection procedures are manual, leading to unnecessary costs in inspection time, errors in data entry, and, in extreme cases, measurement errors. This research presents the design, development, and assessment of an integrated measurement prototype for measuring spring tension and other key parameters in brush-holders used in electric motors. It aims to provide the mining industry with a new, fully automatic inspection system that will facilitate maintenance and checking. Our development research was carried out specifically on the brush system of a SAG grinding mill motor. These machines commonly use SIEMENS motors; however, the instrument can be easily adapted to any motor by simply changing the physical dimensions of the prototype.


1992, Vol 7 (1), pp. 63-78
Author(s): Magda Stouthamer-Loeber, Welmoet van Kammen, Rolf Loeber

Studies that assess large numbers of subjects for longitudinal research, for epidemiological purposes, or for the evaluation of prevention and intervention efforts are very costly and should be undertaken with the greatest care to ensure their success. The success of a study, apart from its scientific merit, depends largely on the ability of the researcher to plan and set up a smoothly running operation. However, the skills required for such a task are often not acquired in academic training, nor do scientific journals abound with information on the practical aspects of running a large study. This paper summarizes the experience gained in executing a longitudinal study and covers aspects of planning, hiring of staff, training and supervision of interviewers, data collection, and data entry and management. The importance of the computer as a management tool is stressed.


PLoS ONE, 2021, Vol 16 (10), e0254293
Author(s): Matteo Di Bernardo, Timothy A. Crombie, Daniel E. Cook, Erik C. Andersen

Large-scale ecological sampling can be difficult and costly, especially for organisms that are too small to be easily identified in a natural environment by eye. Typically, these microscopic flora and fauna are sampled by collecting substrates from nature and then separating organisms from substrates in the laboratory. In many cases, diverse organisms can be identified to the species level using molecular barcodes. To facilitate large-scale ecological sampling of microscopic organisms, we used a geographic data-collection platform for mobile devices called Fulcrum that streamlines the organization of geospatial sampling data, substrate photographs, and environmental data at natural sampling sites. These sampling data are then linked to organism isolation data from the laboratory. Here, we describe the easyFulcrum R package, which can be used to clean, process, and visualize ecological field sampling and isolation data exported from the Fulcrum mobile application. We developed this package for wild nematode sampling, but it can be used with other organisms. The advantages of using Fulcrum combined with easyFulcrum are (1) the elimination of transcription errors by replacing manual data entry and/or spreadsheets with a mobile application, (2) the ability to clean, process, and visualize sampling data using a standardized set of functions in the R software environment, and (3) the ability to join disparate data to each other, including environmental data from the field and the molecularly defined identities of individual specimens isolated from samples.
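The join the abstract describes in point (3) can be sketched as follows. This is a minimal illustration in Python, not the easyFulcrum package's actual API: the field names (`c_label`, `s_label`, etc.) are hypothetical stand-ins for a collection label shared between field records and laboratory isolation records.

```python
# Hypothetical sketch of joining field sampling records (from the mobile
# app) to laboratory isolation records by a shared collection label.
# Field names are illustrative, not the package's actual schema.

field_samples = [
    {"c_label": "C-0001", "latitude": 36.97, "longitude": -122.02,
     "substrate": "rotting fruit"},
    {"c_label": "C-0002", "latitude": 36.99, "longitude": -122.06,
     "substrate": "leaf litter"},
]

isolation_records = [
    {"c_label": "C-0001", "s_label": "S-0001",
     "species_id": "Caenorhabditis elegans"},
    {"c_label": "C-0001", "s_label": "S-0002",
     "species_id": "Oscheius tipulae"},
]

def join_samples(field, isolations):
    """Attach field metadata to each isolated specimen."""
    by_label = {rec["c_label"]: rec for rec in field}
    joined = []
    for iso in isolations:
        meta = by_label.get(iso["c_label"], {})
        joined.append({**meta, **iso})  # isolation fields win on conflict
    return joined

for row in join_samples(field_samples, isolation_records):
    print(row["s_label"], row["species_id"], row["substrate"])
```

A dictionary keyed on the collection label gives a constant-time lookup per specimen, which is the same left-join shape a spreadsheet VLOOKUP or an R `dplyr::left_join` would produce.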


2008, Vol 1 (1)
Author(s): Dina L. Kountoupes, Karen Oberhauser

Citizen science projects in which members of the public participate in large-scale science research programs are excellent ways for universities to engage the broader community in authentic science research. The Monarch Larva Monitoring Project (MLMP) is such a project. It involves hundreds of individuals throughout the United States and southern Canada in a study of monarch butterfly distribution and abundance. This program, run by faculty, graduate students, and staff at the University of Minnesota, provides research opportunities for volunteer monitors. We used mixed methods to understand contexts, outcomes, and promising practices for engaging youth in this project. Slightly over a third of our adult volunteers engaged youth in monitoring activities. They reported that the youth were successful at and enjoyed project activities, with the exception of data entry. Adults' innovations increased the success and educational value of the project for children without compromising data integrity. Many adults engaged in extension activities, including independent research that built on their monitoring observations. This project provides an excellent forum for science and environmental education through investigation, direct and long-term interactions with natural settings, and data analysis.


Trials, 2019, Vol 20 (1)
Author(s): Jessica E. Lockery, Taya A. Collyer, Christopher M. Reid, Michael E. Ernst, ...

Abstract Background Large-scale studies risk generating inaccurate and missing data due to the complexity of data collection. Technology has the potential to improve data quality by providing operational support to data collectors. However, this potential is under-explored in community-based trials. The ASPirin in Reducing Events in the Elderly (ASPREE) trial developed a data suite that was specifically designed to support data collectors: the ASPREE Web Accessible Relational Database (AWARD). This paper describes AWARD and the impact of system design on data quality. Methods AWARD's operational requirements, conceptual design, key challenges and design solutions for data quality are presented. Impact of design features is assessed through comparison of baseline data collected prior to implementation of key functionality (n = 1000) with data collected post implementation (n = 18,114). Overall data quality is assessed according to data category. Results At baseline, implementation of user-driven functionality reduced staff error (from 0.3% to 0.01%), out-of-range data entry (from 0.14% to 0.04%) and protocol deviations (from 0.4% to 0.08%). In the longitudinal data set, which contained more than 39 million data values collected within AWARD, 96.6% of data values were entered within specified query range or found to be accurate upon querying. The remaining data were missing (3.4%). Participant non-attendance at scheduled study activity was the most common cause of missing data. Costs associated with cleaning data in ASPREE were lower than expected compared with reports from other trials. Conclusions Clinical trials undertake complex operational activity in order to collect data, but technology rarely provides sufficient support. We find the AWARD suite provides proof of principle that designing technology to support data collectors can mitigate known causes of poor data quality and produce higher-quality data. Health information technology (IT) products that support the conduct of scheduled activity in addition to traditional data entry will enhance community-based clinical trials. A standardised framework for reporting data quality would aid comparisons across clinical trials. Trial registration International Standard Randomized Controlled Trial Number Register, ISRCTN83772183. Registered on 3 March 2005.
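The "out-of-range data entry" the Results section measures is typically caught at the point of entry by query-range checks. The following is an illustrative sketch of that pattern, not AWARD's actual code; the field names and range bounds are assumptions for the example.

```python
# Illustrative sketch (not the AWARD system's code) of query-range
# validation at data entry: values outside a specified range are
# flagged for querying rather than silently accepted.

QUERY_RANGES = {
    "systolic_bp_mmHg": (70, 250),  # assumed plausible-range bounds
    "weight_kg": (30, 200),
}

def validate_entry(field, value):
    """Classify an entered value as ok, missing, or needing a query."""
    low, high = QUERY_RANGES[field]
    if value is None:
        return "missing"
    if low <= value <= high:
        return "ok"
    return "query"  # prompt staff to confirm or correct before saving

assert validate_entry("systolic_bp_mmHg", 120) == "ok"
```

Flagging at entry time, rather than during later cleaning, is what lets a value be "found to be accurate upon querying" while the data collector is still with the participant.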


2016, Vol 31 (5), pp. 539-546
Author(s): Tadashi Ishii, Masaharu Nakayama, Michiaki Abe, Shin Takayama, Takashi Kamei, ...

Abstract Introduction: There were 5,385 deceased and 710 missing in the Ishinomaki medical zone following the Great East Japan Earthquake that occurred in Japan on March 11, 2011. The Ishinomaki Zone Joint Relief Team (IZJRT) was formed to unify the relief teams of all organizations joining in support of the Ishinomaki area. The IZJRT expanded relief activity as they continued to manually collect and analyze assessments of essential information for maintaining health in all 328 shelters using a paper-type survey. However, the IZJRT spent an enormous amount of time and effort entering and analyzing these data because the work was vastly complex. Therefore, an assessment system must be developed that can tabulate shelter assessment data correctly and efficiently. The objective of this report was to describe the development and verification of a system to rapidly assess evacuation centers in preparation for the next major disaster. Report: Based on experiences with the complex work during the disaster, software called the “Rapid Assessment System of Evacuation Center Condition featuring Gonryo and Miyagi” (RASECC-GM) was developed to enter, tabulate, and manage the shelter assessment data. Further, a verification test was conducted during a large-scale Self-Defense Force (SDF) training exercise to confirm its feasibility, usability, and accuracy. The RASECC-GM comprises three screens: (1) the “Data Entry screen,” allowing for quick entry on tablet devices of 19 assessment items, including shelter administrator, living and sanitary conditions, and a tally of the injured and sick; (2) the “Relief Team/Shelter Management screen,” for registering information on relief teams and shelters; and (3) the “Data Tabulation screen,” which allows tabulation of the data entered for each shelter, as well as viewing and sorting from a disaster headquarters’ computer. During the verification test, data of mock shelters entered online were tabulated quickly and accurately on a mock disaster headquarters’ computer. Likewise, data entered offline also were tabulated quickly on the mock disaster headquarters’ computer when the tablet device was moved into an online environment. Conclusions: The RASECC-GM, a system for rapidly assessing the condition of evacuation centers, was developed. Tests verify that users of the system would be able to easily, quickly, and accurately assess vast quantities of data from multiple shelters in a major disaster and immediately manage the inputted data at the disaster headquarters.
Ishii T, Nakayama M, Abe M, Takayama S, Kamei T, Abe Y, Yamadera J, Amito K, Morino K. Development and verification of a mobile shelter assessment system “Rapid Assessment System of Evacuation Center Condition featuring Gonryo and Miyagi (RASECC-GM)” for major disasters. Prehosp Disaster Med. 2016;31(5):539–546.
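The offline behavior the verification test exercises is the classic offline-first pattern: entries made without a connection are queued on the tablet and forwarded to headquarters once the device comes online. A minimal sketch of that pattern, with invented class and field names (this is not RASECC-GM's code):

```python
# Illustrative offline-first sketch: shelter assessments entered on a
# tablet are queued locally while offline and tabulated at headquarters
# once a connection becomes available. All names are hypothetical.

class ShelterAssessmentClient:
    def __init__(self):
        self.online = False
        self.pending = []       # entries queued while offline
        self.headquarters = []  # stands in for the HQ database

    def enter(self, assessment):
        if self.online:
            self.headquarters.append(assessment)
        else:
            self.pending.append(assessment)

    def go_online(self):
        self.online = True
        self.headquarters.extend(self.pending)  # flush queued entries
        self.pending.clear()

client = ShelterAssessmentClient()
client.enter({"shelter": "A", "evacuees": 120, "sick": 3})  # offline
client.go_online()
print(len(client.headquarters))  # → 1: queued entry reached HQ
```

The design choice matters in a disaster setting: data entry must never block on connectivity, so the local queue is the source of truth until it is flushed.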


ZooKeys, 2012, Vol 209, pp. 75-86
Author(s): Riitta Tegelberg, Jaana Haapala, Tero Mononen, Mika Pajari, Hannu Saarenmaa

Digitarium is a joint initiative of the Finnish Museum of Natural History and the University of Eastern Finland. It was established in 2010 as a dedicated shop for the large-scale digitisation of natural history collections. Digitarium offers service packages based on the digitisation process, including tagging, imaging, data entry, georeferencing, filtering, and validation. During the process, all specimens are imaged, and distance workers take care of the data entry from the images. The customer receives the data in Darwin Core Archive format, as well as images of the specimens and their labels. Digitarium also offers the option of publishing images through Morphbank, sharing data through GBIF, and archiving data for long-term storage. Service packages can also be designed on demand to respond to the specific needs of the customer. The paper also discusses logistics, costs, and intellectual property rights (IPR) issues related to the work that Digitarium undertakes.
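The Darwin Core Archive delivery format mentioned above is, at its core, CSV files whose columns are standard Darwin Core terms. A minimal sketch of emitting one specimen record in that style (the specimen values are invented for illustration; the column names are standard Darwin Core terms):

```python
# Minimal sketch of writing specimen data as a Darwin Core-style CSV,
# the exchange format the abstract mentions. Values are invented;
# column names are standard Darwin Core terms.
import csv
import io

records = [
    {"occurrenceID": "example:0001",          # hypothetical identifier
     "scientificName": "Parnassius apollo",
     "eventDate": "1956-07-02",
     "decimalLatitude": 62.60,
     "decimalLongitude": 29.76,
     "basisOfRecord": "PreservedSpecimen"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(records[0]))
writer.writeheader()
writer.writerows(records)
print(buf.getvalue())
```

A full Darwin Core Archive additionally bundles such data files with a `meta.xml` descriptor mapping each column to its term definition, which is what makes the package self-describing for consumers such as GBIF.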

