The impact of upgrading the background covariance matrices in NOAA Microwave Integrated Retrieval System (MIRS) (Conference Presentation)

Author(s):  
Junye Chen ◽  
Quanhua Liu ◽  
Mohar Chattopadhyay ◽  
Kevin L. Garrett ◽  
Christopher Grassotti ◽  
...  
1995 ◽  
Vol 20 (1) ◽  
pp. 69-82 ◽  
Author(s):  
David Kaplan

This article considers the impact of missing data arising from balanced incomplete block (BIB) spiraled designs on the chi-square goodness-of-fit test in factor analysis. Specifically, data arising from BIB designs possess a unique pattern of missing data that can be characterized as missing completely at random (MCAR). Standard approaches to factor analyzing such data rest on forming pairwise available case (PAC) covariance matrices. Developments in statistical theory for missing data show that PAC covariance matrices may not satisfy Wishart distribution assumptions underlying factor analysis, thus impacting tests of model fit. One approach, advocated by Muthén, Kaplan, and Hollis (1987) for handling missing data in structural equation modeling, is proposed as a possible solution to these problems. This study compares the new approach to the standard PAC approach in a Monte Carlo framework. Results show that tests of goodness-of-fit are very sensitive to PAC approaches even when data are MCAR, as is the case for BIB designs. The new approach is shown to outperform the PAC approach for continuous variables and is comparatively better for dichotomous variables.
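The pairwise available case (PAC) approach at issue can be sketched in a few lines: each covariance entry is computed from only those cases where both variables are observed. The BIB-style missingness below is a toy illustration, not the authors' simulation design.

```python
import numpy as np

def pairwise_available_case_cov(X):
    """Pairwise available-case (PAC) covariance: entry (j, k) uses every
    row where both variable j and variable k are observed (non-NaN)."""
    n_vars = X.shape[1]
    cov = np.full((n_vars, n_vars), np.nan)
    for j in range(n_vars):
        for k in range(j, n_vars):
            mask = ~np.isnan(X[:, j]) & ~np.isnan(X[:, k])
            if mask.sum() > 1:
                c = np.cov(X[mask, j], X[mask, k])[0, 1]
                cov[j, k] = cov[k, j] = c
    return cov

# Toy data with a BIB-style missingness pattern: each block of cases
# omits one variable entirely, so the data are MCAR by construction.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))
X[:100, 0] = np.nan    # block 1 never sees variable 0
X[100:200, 1] = np.nan # block 2 never sees variable 1
X[200:, 2] = np.nan    # block 3 never sees variable 2
S = pairwise_available_case_cov(X)
```

Because each entry of S is based on a different subsample, S need not behave like a covariance matrix from a single complete sample, which is exactly why the Wishart assumption can fail.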


2021 ◽  
Vol 13 (16) ◽  
pp. 3124
Author(s):  
Jakob Raschhofer ◽  
Gabriel Kerekes ◽  
Corinna Harmening ◽  
Hans Neuner ◽  
Volker Schwieger

A flexible approach for the geometric modelling of point clouds obtained from Terrestrial Laser Scanning (TLS) is the use of B-splines. These functions have gained popularity in engineering geodesy, as they provide a suitable basis for a spatially continuous and parametric deformation analysis. Most studies on the geometric modelling of point clouds by B-splines assume uncorrelated and equally weighted measurements. To overcome this limitation, the theory of elementary errors is applied to establish fully populated covariance matrices of TLS observations that account for correlations in the observed point clouds. In this article, a systematic approach for establishing realistic synthetic variance–covariance matrices (SVCMs) is presented and subsequently used to model TLS point clouds by B-splines. Additionally, three criteria are selected to analyze the impact of different SVCMs on the functional and stochastic components of the estimation results. Plausible levels for variances and covariances are obtained using a test specimen several decimetres in size, which is used to identify the most dominant elementary errors under laboratory conditions. Starting values for the variance level are obtained from a TLS calibration. The impact of SVCMs with different structures and different numeric values is investigated comparatively. The main findings are that, for the analyzed object size and distances, the structure of the covariance matrix does not significantly affect the location of the estimated surface control points, but it does affect their precision in terms of the corresponding standard deviations. Regarding the latter, properly setting the main-diagonal terms of the SVCM is considerably more important than setting the off-diagonal ones. The investigation of individual errors revealed that the influence of their standard deviations on the precision of the estimated parameters depends primarily on the scanning distance. At a fixed distance, increasing the standard deviations shifts the precision of the estimated control points in one direction only.
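The estimation step can be illustrated with a 1-D analogue (a curve rather than a surface): control points fitted by generalized least squares with a fully populated observation covariance matrix standing in for the SVCM. The knot layout and exponential correlation model below are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy.interpolate import BSpline

def gls_bspline_fit(x, y, knots, degree, Sigma):
    """GLS estimate of B-spline control points under observation
    covariance Sigma, returning estimates and their covariance."""
    n_ctrl = len(knots) - degree - 1
    # Identity-coefficient trick: column i of A is the i-th B-spline
    # basis function evaluated at x.
    A = BSpline(knots, np.eye(n_ctrl), degree)(x)
    W = np.linalg.inv(Sigma)                 # weight matrix P = Sigma^-1
    N = A.T @ W @ A                          # normal-equation matrix
    c_hat = np.linalg.solve(N, A.T @ W @ y)  # estimated control points
    cov_c = np.linalg.inv(N)                 # their covariance matrix
    return c_hat, cov_c

# Clamped cubic spline with 10 control points over [0, 1].
degree = 3
knots = np.concatenate([[0.0] * degree, np.linspace(0, 1, 8), [1.0] * degree])
x = np.linspace(0, 1, 60)
# Synthetic fully populated covariance: exponentially decaying correlation.
d = np.abs(x[:, None] - x[None, :])
Sigma = 0.01 * np.exp(-d / 0.1)
y = np.sin(2 * np.pi * x)                    # noise-free test profile
c_hat, cov_c = gls_bspline_fit(x, y, knots, degree, Sigma)
```

The diagonal of `cov_c` is where the paper's main finding shows up: changing the structure of Sigma moves these standard deviations far more than it moves `c_hat` itself.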


2019 ◽  
Vol 490 (2) ◽  
pp. 2606-2626 ◽  
Author(s):  
Hao-Yi Wu ◽  
David H Weinberg ◽  
Andrés N Salcedo ◽  
Benjamin D Wibking ◽  
Ying Zu

ABSTRACT Next-generation optical imaging surveys will revolutionize the observations of weak gravitational lensing by galaxy clusters and provide stringent constraints on growth of structure and cosmic acceleration. In these experiments, accurate modelling of covariance matrices of cluster weak lensing plays the key role in obtaining robust measurements of the mean mass of clusters and cosmological parameters. We use a combination of analytical calculations and high-resolution N-body simulations to derive accurate covariance matrices that span from the virial regime to linear scales of the cluster-matter cross-correlation. We validate this calculation using a public ray-tracing lensing simulation and provide a software package for calculating covariance matrices for a wide range of cluster and source sample choices. We discuss the relative importance of shape noise and density fluctuations, the impact of radial bin size, and the impact of off-diagonal elements. For a weak lensing source density ns = 10 arcmin−2, shape noise typically dominates the variance on comoving scales $r_{\rm p}\lesssim 5\ h^{-1} \, \rm Mpc$. However, for ns = 60 arcmin−2, potentially achievable with future weak lensing experiments, density fluctuations typically dominate the variance at $r_{\rm p}\gtrsim 1\ h^{-1} \, \rm Mpc$ and remain comparable to shape noise on smaller scales.
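The scaling of the shape-noise term can be seen with a back-of-the-envelope sketch: the variance of the mean tangential shear in a radial annulus scales as sigma_eps^2 / (n_s A). This uses angular bins purely for illustration (the paper works in comoving h^-1 Mpc), and sigma_eps = 0.3 per component is an assumed, typical value.

```python
import numpy as np

def shape_noise_var(r_in, r_out, n_s, sigma_eps=0.3):
    """Shape-noise variance of the mean shear in an annulus:
    sigma_eps^2 / (n_s * area), with n_s in arcmin^-2."""
    area = np.pi * (r_out**2 - r_in**2)   # annulus area, arcmin^2
    return sigma_eps**2 / (n_s * area)

edges = np.array([1.0, 2.0, 4.0, 8.0])    # illustrative radial bin edges, arcmin
var10 = shape_noise_var(edges[:-1], edges[1:], n_s=10)
var60 = shape_noise_var(edges[:-1], edges[1:], n_s=60)
# Raising n_s from 10 to 60 arcmin^-2 lowers the shape-noise variance by
# a factor of 6 in every bin, which is why density fluctuations take over
# as the dominant term in deeper surveys.
```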


1984 ◽  
Vol 8 (3) ◽  
pp. 113-121 ◽  
Author(s):  
Linda C. Smith ◽  
Amy J. Warner

In IR there is a growing body of empirical data and practical experience with various representations. Taxonomy may be defined as 'all the various activities involved in the construction of classificatory systems.' There are three stages in our research: identification, characterization, and comparison/evaluation. Identification involves an enumeration both of the categories of representations which could in principle be part of an IR system and of the different members in each category. Characterization involves identification of properties which could be used to characterize the members of each category. Comparison/evaluation involves identification of measures which could be used in comparing the members within each category and/or in evaluating the impact of variations in representations on system performance.


2013 ◽  
Vol 51 (14) ◽  
pp. 4336-4348 ◽  
Author(s):  
A. Regattieri ◽  
G. Santarelli ◽  
R. Manzini ◽  
A. Pareschi

2001 ◽  
Vol 29 (3) ◽  
pp. 253-261 ◽  
Author(s):  
Linda Shirato ◽  
Sarah Cogan ◽  
Sandra Yee

In June 1998, the Bruce T. Halle Library opened on Eastern Michigan University's campus and began using an automated storage and retrieval system (AS/RS) for low-use books and periodicals. Approximately one third of the library's total collection was placed into this storage system, freeing floor space for many new activities in the library. The system, linked to the library's online catalog, could retrieve items requested by a patron in less than ten minutes. While the AS/RS performed well, other start-up problems of a new building and public perceptions of the AS/RS made its introduction a challenge. Planning, implementation, and public reaction and acceptance are discussed.


2019 ◽  
Author(s):  
Michael Maraun ◽  
Moritz Heene ◽  
Philipp Sckopke

The behavioural scientist who requires an estimate of narrow heritability, h2, will conduct a twin study and input the resulting estimated covariance matrices into a particular mode of estimation, the latter derived under the supposition of the standard biometric model (SBM). It is now widely acknowledged that the SBM can be expected to misrepresent, in manifold ways, the phenotypic (genetic) architecture of human traits. The impact of this misrepresentation on the accuracy of h2 estimation is unknown. Herein, we aimed to shed some light on this general issue by undertaking three simulation studies. In each, the parameter recovery performance of five modes (Falconer's coefficient and the SEM models ACDE, ADE, ACE, and AE) was investigated when they encountered a constructed, non-SBM architecture under a particular informational input. In study 1, the architecture was single-locus with dominance effects and genetic–environment covariance, and the input was {ΣMZ,T, ΣDZ,T, ΣMZ,A, ΣDZ,A}; in study 2, the architecture was identical to that of study 1, but the informational input was {ΣMZ,T, ΣDZ,T}; and in study 3, the architecture was multi-locus with dominance effects, genetic–environment covariance, and epistatic interactions, and the informational input was {ΣMZ,T, ΣDZ,T, ΣMZ,A, ΣDZ,A}. The results suggest that conclusions regarding the recovery of h2 must be drawn conditional on (a) the general class of generating architecture in play; (b) the specifics of the architecture's parametric instantiations; (c) the informational input into a mode of estimation; and (d) the particular mode of estimation employed. In general, the more complicated the generating architecture, the poorer a mode's h2 recovery performance. Random forest analyses furthermore revealed that, depending on the genetic architecture, h2, the dominance and locus-additive parameters, and the proportions of alleles were involved in complex interaction effects that impacted the h2 recovery performance of a mode of estimation. Data and materials: https://osf.io/aq9sx/
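The simplest of the five modes compared, Falconer's coefficient, makes the issue concrete: it estimates h2 as twice the difference between the MZ and DZ twin correlations, and is exact only under SBM assumptions. The numerical values below are illustrative, not taken from the study.

```python
import numpy as np

def falconer_h2(r_mz, r_dz):
    """Falconer's coefficient: h2_hat = 2 * (r_MZ - r_DZ)."""
    return 2.0 * (r_mz - r_dz)

# Under the SBM with purely additive genetics, r_MZ = h2 and r_DZ = h2/2,
# so the estimator recovers h2 exactly:
assert np.isclose(falconer_h2(0.6, 0.3), 0.6)
# A non-SBM architecture, e.g. dominance inflating the MZ correlation,
# biases the estimate upward:
print(round(falconer_h2(0.7, 0.3), 2))  # 0.8, overestimating the true 0.6
```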


2020 ◽  
Author(s):  
Wenkai MA ◽  
Yanyan WANG ◽  
Jinchang HU ◽  
Yaohua WU

Abstract A new automatic warehouse sorting system, the crane & shuttle based storage and retrieval system (C&SBS/RS), is proposed in this paper. In the C&SBS/RS, the crane-based storage and retrieval system (CBS/RS) in the pallet storage area handles pallet picking, while the shuttle-based storage and retrieval system (SBS/RS) in the tote storage area handles case and item picking. When the inventory in the SBS/RS falls below safety stock, the SBS/RS initiates a replenishment transaction. In addition, an order matrix is proposed to study order structure parameters such as the order density, the order strength, and the wave size. Moreover, this paper analyzes the influence of order structure on replenishment with four evaluation parameters: the workload of the CBS/RS, the workload of the SBS/RS, the number of used storage positions in the SBS/RS, and the replenishment time. Numerical experiments are carried out to analyze the impact of the wave size and the proportion of high-turnover SKUs on these four evaluation parameters under multiple order structures, helping warehouse operations managers decide replenishment strategy parameters.
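The order matrix idea can be sketched as follows; the paper's exact parameter definitions are not given in the abstract, so the density and strength computations below are one plausible reading, labeled as assumptions.

```python
import numpy as np

# Illustrative order matrix for one wave: rows are orders, columns are
# SKUs, and entry (i, j) is the quantity of SKU j requested by order i.
# The Poisson fill rate and wave dimensions are arbitrary toy values.
rng = np.random.default_rng(1)
order_matrix = rng.poisson(0.3, size=(40, 25))    # 40 orders, 25 SKUs

wave_size = order_matrix.shape[0]                 # orders released per wave
# Order density: fraction of (order, SKU) cells that are nonzero
# (assumed definition).
order_density = np.count_nonzero(order_matrix) / order_matrix.size
# Order strength: mean units requested per order (assumed definition).
order_strength = order_matrix.sum(axis=1).mean()
```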


Author(s):  
Pilar Garcés ◽  
David López-Sanz ◽  
Fernando Maestú ◽  
Ernesto Pereda

Background: Modern MEG devices include 102 sensor triplets, each containing one magnetometer and two planar gradiometers. The first processing step is often a signal space separation (SSS), which provides powerful noise reduction. A question commonly raised by researchers and reviewers is which data should be employed in source reconstruction: (1) magnetometers only, (2) gradiometers only, or (3) magnetometers and gradiometers together. The MEG community is currently divided on the proper answer, and strong arguments for and against each of these three approaches are often expressed. Methods: First, we provide theoretical evidence that gradiometers and magnetometers contain the same information after SSS, and argue that both result from the backprojection of the same SSS components. Then, we compare beamforming source reconstructions from magnetometers and gradiometers in real MEG recordings before and after SSS. Results: Without SSS, the correlation between source time series extracted from magnetometers and gradiometers was high, with Pearson correlation coefficients of r = 0.5 to 0.8. After SSS, these correlation values increased dramatically, reaching over 0.90 across all cortical areas. Conclusions: After SSS, almost identical source reconstructions (r > 0.9) can be obtained with magnetometers and gradiometers, as long as the regularization is selected appropriately to account for the different properties of the magnetometer and gradiometer covariance matrices.
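The Results comparison reduces to a Pearson correlation between the two reconstructed source time series. The synthetic stand-ins below mimic the post-SSS situation, where both sensor types share the same backprojected components plus small independent residuals; the signal model is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)
common = rng.normal(size=5000)                 # shared SSS-component signal
s_mag = common + 0.1 * rng.normal(size=5000)   # magnetometer reconstruction
s_grad = common + 0.1 * rng.normal(size=5000)  # gradiometer reconstruction

# Pearson correlation between the two reconstructions: close to 1 when
# the independent residuals are small, mirroring the reported r > 0.9.
r = np.corrcoef(s_mag, s_grad)[0, 1]
```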

