Application of statistical models to decomposition of systematic and random error in low-voltage SEM metrology

1992 ◽  
Author(s):  
Kevin M. Monahan ◽  
Sadri Khalessi

2002 ◽  
Vol 5 (6a) ◽  
pp. 969-976 ◽  
Author(s):  
Rudolf Kaaks ◽  
Pietro Ferrari ◽  
Antonio Ciampi ◽  
Martyn Plummer ◽  
Elio Riboli

Abstract

Objective: To examine statistical models that account for correlation between the random errors of different dietary assessment methods in dietary validation studies.

Setting: In nutritional epidemiology, sub-studies on the accuracy of dietary questionnaire measurements are used to correct for biases in relative risk estimates induced by dietary assessment errors. Generally, such validation studies are based on the comparison of questionnaire measurements (Q) with food consumption records or 24-hour diet recalls (R). In recent years, the statistical analysis of such studies has been formalised in terms of statistical models, which has made the crucial model assumptions more explicit. One key assumption is that random errors must be uncorrelated between measurements Q and R, as well as between replicate measurements R1 and R2 within the same individual. These assumptions may not hold in practice, however. More complex statistical models have therefore been proposed to validate measurements Q by simultaneous comparison with measurements R plus a biomarker M, accounting for correlations between the random errors of Q and R.

Conclusions: The more complex models accounting for random error correlations may work only for validation studies that include markers of diet based on physiological knowledge about the quantitative recovery, e.g. in urine, of specific elements such as nitrogen or potassium, or of stable isotopes administered to the study subjects (e.g. the doubly labelled water method for assessment of energy expenditure). This type of marker, however, eliminates the problem of correlated random errors between Q and R by simply taking the place of R, thus rendering the complex statistical models unnecessary.
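The failure mode described above can be illustrated numerically. The following sketch (hypothetical variable names, error magnitudes, and a shared-error structure chosen for illustration, not taken from the paper) simulates a questionnaire Q and reference method R whose random errors share a common component, plus a recovery biomarker M whose error is independent of Q's:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# True habitual intake T (arbitrary units) -- unobservable in practice.
T = rng.normal(100, 15, n)

# Q and R share a correlated error component (e.g. common reporting bias).
shared = rng.normal(0, 8, n)
Q = 0.7 * T + shared + rng.normal(0, 10, n)
R = T + shared + rng.normal(0, 6, n)

# Recovery biomarker M: error independent of Q's error.
M = T + rng.normal(0, 6, n)

true_validity = np.corrcoef(Q, T)[0, 1]  # the quantity of interest
naive = np.corrcoef(Q, R)[0, 1]          # inflated by the shared errors
via_marker = np.corrcoef(Q, M)[0, 1]     # no shared-error inflation
```

Because Q and R share an error component, their observed correlation overstates the true validity corr(Q, T); the correlation with the marker M carries no such inflation (it is instead slightly attenuated by M's own independent error), which is why the marker can simply replace R.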


2010 ◽  
Vol 15 (8) ◽  
pp. 990-1000 ◽  
Author(s):  
Nathalie Malo ◽  
James A. Hanley ◽  
Graeme Carlile ◽  
Jing Liu ◽  
Jerry Pelletier ◽  
...  

Identification of active compounds in high-throughput screening (HTS) contexts can be substantially improved by applying classical experimental design and statistical inference principles to all phases of HTS studies. The authors present both experimental and simulated data to illustrate how true-positive rates can be maximized without increasing false-positive rates by the following analytical process. First, the use of robust data preprocessing methods reduces unwanted variation by removing row, column, and plate biases. Second, replicate measurements allow estimation of the magnitude of the remaining random error and the use of formal statistical models to benchmark putative hits relative to what is expected by chance. Receiver Operating Characteristic (ROC) analyses revealed superior power for data preprocessed by a trimmed-mean polish method combined with the RVM t-test, particularly for small- to moderate-sized biological hits.
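As a rough illustration of the preprocessing step, here is a minimal trimmed-mean polish in the spirit of Tukey's median polish, using a trimmed mean as the robust location estimate. The trim proportion, iteration count, plate dimensions, and simulated biases are all assumptions for the sketch, not the authors' settings, and the RVM t-test step on replicates is omitted:

```python
import numpy as np

def trimmed_mean(x, prop=0.1):
    """Mean after discarding the top and bottom `prop` fraction of values."""
    x = np.sort(x)
    k = int(len(x) * prop)
    return x[k:len(x) - k].mean()

def trimmed_mean_polish(plate, prop=0.1, n_iter=10):
    """Iteratively subtract robust row and column effects from a plate of
    raw readings, leaving residuals relative to the plate background."""
    residual = plate.astype(float).copy()
    for _ in range(n_iter):
        row_eff = np.apply_along_axis(trimmed_mean, 1, residual, prop)
        residual -= row_eff[:, None]
        col_eff = np.apply_along_axis(trimmed_mean, 0, residual, prop)
        residual -= col_eff[None, :]
    return residual

rng = np.random.default_rng(1)
plate = rng.normal(0, 1, (8, 12))          # simulated 96-well plate
plate += np.linspace(0, 3, 8)[:, None]     # simulated row (edge) bias
plate += np.linspace(0, 2, 12)[None, :]    # simulated column bias
plate[3, 5] += 10.0                        # one strong simulated "hit"

clean = trimmed_mean_polish(plate)
hit = np.unravel_index(np.argmax(clean), clean.shape)  # recovers (3, 5)
```

Because the trimmed mean discards extreme wells before estimating each row and column effect, a genuine hit does not drag the background estimate upward, so the row/column biases are removed while the hit survives in the residuals.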


10.28945/3062 ◽  
2007 ◽  
Author(s):  
Jim Everett

Bauxite is mined and transported by conveyor to a processing plant, screened and washed, then placed into blended stockpiles to feed the alumina refinery. While being stacked onto the stockpile, the ore is sampled. Completed stockpiles must be acceptably close to target grade (composition), not only in alumina but also in residual silica, carbon and sodium carbonate. The mine is an open-cut pit. Each day, the choice of ore to mine from multiple locations in the pit is based upon estimates of grade. Estimated grade, from exploration drilling of the area before mining, has both systematic and random error. This paper describes an information system to guide the daily choice of ore to mine. By continually updating the comparison between forecasts and sampled product, the system provides adjusted forecasts. Ore is selected to bring the exponentially smoothed grade to target in each of the control minerals.
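A minimal sketch of the two control ideas described above: exponentially smoothing the forecast error to correct systematic bias in drilling-based grade estimates, and selecting the ore block that brings the smoothed stockpile grade closest to target. The grades, smoothing constants, and function names are hypothetical, not taken from the paper's system:

```python
import numpy as np

def adjust_bias(bias, forecast, sampled, alpha=0.2):
    """Exponentially smooth the forecast error (sampled minus forecast)
    to track systematic bias in drilling-based grade estimates."""
    return bias + alpha * ((sampled - forecast) - bias)

def pick_block(smoothed, target, block_grades, alpha=0.3):
    """Choose the ore block whose estimated grade moves the
    exponentially smoothed stockpile grade closest to target."""
    candidates = smoothed + alpha * (block_grades - smoothed)
    i = int(np.argmin(np.abs(candidates - target)))
    return i, candidates[i]

# Hypothetical alumina grades (%) estimated from exploration drilling.
target = 45.0
smoothed = 46.2                          # running smoothed grade so far
blocks = np.array([43.1, 44.8, 46.5, 47.2])
i, new_smoothed = pick_block(smoothed, target, blocks)
```

With the stockpile currently running high (46.2% against a 45.0% target), the rule picks the lowest-grade block to pull the smoothed grade back toward target; in a multi-mineral setting the same selection would weigh deviations in each control mineral.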


Author(s):  
Mayuresh Virkar ◽  
N Arul Kumar ◽  
Pranav Chadha ◽  
Reuben Jake Rodrigues ◽  
Anup Kharde

Introduction: The aim of the present study was to compare setup errors between two immobilization systems in targeted radiotherapy. Methods: A retrospective analysis was performed for patients undergoing radiotherapy from May 2012 to December 2018 at our institution. Immobilization was performed on 30 patients (vacuum cushion, i.e., Vac-Lok™ = 15; thermoplastic mould, i.e., Pelvicast pelvic masks = 15). A total of 763 cone-beam scans were analysed. The target lesion location was verified, and displacements were assessed, by cone-beam computed tomography (CBCT) prior to each treatment session. Systematic setup errors, random setup errors, and isocentre deviations of the patient position were calculated in the medio-lateral (ML), supero-inferior (SI), antero-posterior (AP), and rotational (yaw) directions. Results: On comparing the Vac-Lok™ and Pelvicast pelvic mask groups with respect to systematic and random error in the lateral, longitudinal, vertical and yaw directions, no statistically significant difference was seen except for the random error in the yaw direction (P=0.037, unpaired t-test). No difference was observed in the isocentric deviation. Conclusion: It was concluded that using a vacuum cushion for pelvic radiotherapy provides no added benefit compared to a thermoplastic mould. A thermoplastic mould is recommended for patients receiving pelvic radiotherapy to improve overall reproducibility.

Keywords: Rotational therapy; Radiotherapy; Systematic error; Random error; Thermoplastic mould; Vacuum cushion.
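Population systematic and random setup errors are conventionally computed per direction as the standard deviation of the per-patient mean displacements and the root mean square of the per-patient standard deviations, respectively. The abstract does not state its exact formulas, so this sketch assumes those conventional definitions, with hypothetical displacement data:

```python
import numpy as np

def setup_errors(disp):
    """disp: dict mapping patient id -> 1-D array of displacements (mm)
    in one direction across that patient's treatment sessions.
    Returns (Sigma, sigma): population systematic error (SD of
    per-patient means) and random error (RMS of per-patient SDs)."""
    means = np.array([d.mean() for d in disp.values()])
    sds = np.array([d.std(ddof=1) for d in disp.values()])
    Sigma = means.std(ddof=1)
    sigma = np.sqrt(np.mean(sds ** 2))
    return Sigma, sigma

# Hypothetical CBCT displacements (mm) for three patients in one direction:
disp = {
    "p1": np.array([1.0, 1.5, 0.5, 1.0]),
    "p2": np.array([-0.5, 0.0, -1.0, -0.5]),
    "p3": np.array([0.2, 0.4, 0.0, 0.2]),
}
Sigma, sigma = setup_errors(disp)
```

These per-direction Sigma and sigma values are what the two immobilization groups would then be compared on, e.g. with an unpaired t-test as in the study.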

