normalization problem
Recently Published Documents

TOTAL DOCUMENTS: 18 (five years: 0)
H-INDEX: 7 (five years: 0)

Author(s): Qiang Han, Xianguo Tuo, Da Lin

Attitude determination is a significant aspect of navigation technology. Determining attitude from fused accelerometer and magnetometer measurements is essentially an applied mathematical problem. In this article, we derive a quaternionic symbolic solution for accelerometer and magnetometer fusion that differs markedly from existing methods. The proposed symbolic solution offers high accuracy and consumes much less computation time. In addition, the normalization problem of the input data is discussed. Biquaternions were introduced to describe the existence of the complex quaternion. Several experiments were carried out to verify the proposed symbolic solution.
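The normalization of input data mentioned in the abstract amounts to scaling the raw accelerometer and magnetometer vectors to unit length before fusion, since only their directions carry attitude information. A minimal sketch with hypothetical sensor readings (not the paper's algorithm):

```python
import numpy as np

def normalize(v):
    """Return v scaled to unit length; reject near-zero vectors."""
    n = np.linalg.norm(v)
    if n < 1e-12:
        raise ValueError("cannot normalize a near-zero vector")
    return v / n

# Hypothetical raw readings: accelerometer in m/s^2, magnetometer in uT.
acc = np.array([0.1, 0.2, 9.7])
mag = np.array([22.0, 5.0, -43.0])

# Unit-length inputs for the attitude-determination step.
acc_n = normalize(acc)
mag_n = normalize(mag)
```

Guarding against near-zero vectors matters in practice: a free-falling accelerometer or a degaussed magnetometer yields a reading whose direction is meaningless.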


2018, Vol 438, pp. 58-72
Author(s): M. Buzzanca, V. Carchiolo, A. Longheu, M. Malgeri, G. Mangioni

2017, Vol 69 (5), pp. 557-575
Author(s): Rodrigo Costas, Antonio Perianes-Rodríguez, Javier Ruiz-Castillo

Purpose: The introduction of "altmetrics" as new tools to analyze scientific impact within the reward system of science has challenged the hegemony of citations as the predominant source for measuring scientific impact. Mendeley readership has been identified as one of the most important altmetric sources, with several features similar to citations. The purpose of this paper is to perform an in-depth analysis of the differences and similarities between the distributions of Mendeley readership and citations across fields.

Design/methodology/approach: The authors analyze two issues, using in each case a common analytical framework for both metrics: the shape of the distributions of readership and citations, and the field normalization problem generated by differences in citation and readership practices across fields. For the first issue the authors use the characteristic scores and scales method, and for the second the measurement framework introduced in Crespo et al. (2013).

Findings: There are three main results. First, the citation and Mendeley readership distributions exhibit a strikingly similar degree of skewness in all fields. Second, the results on "exchange rates" (ERs) for Mendeley readership empirically support the possibility of comparing readership counts across fields, as well as the field normalization of readership distributions using ERs as normalization factors. Third, field normalization using field mean readerships as normalization factors leads to comparably good results.

Originality/value: These findings open up challenging new questions, particularly regarding the possibility of obtaining conflicting results from field-normalized citation and Mendeley readership indicators; this suggests the need to better determine the role of the two metrics in capturing scientific recognition.
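Field normalization using field mean readerships, the paper's third finding, is simple to sketch: each paper's count is divided by the mean count of its field, so a value of 1.0 means "average for the field". The field labels and counts below are invented for illustration:

```python
from collections import defaultdict

def field_mean_normalize(records):
    """records: list of (field, count) pairs.
    Returns each count divided by the mean count of its field."""
    sums = defaultdict(float)
    ns = defaultdict(int)
    for field, count in records:
        sums[field] += count
        ns[field] += 1
    means = {f: sums[f] / ns[f] for f in sums}
    return [count / means[field] for field, count in records]

# Hypothetical readership counts in two fields with very different practices.
papers = [("physics", 10), ("physics", 30), ("biology", 100), ("biology", 300)]
print(field_mean_normalize(papers))  # [0.5, 1.5, 0.5, 1.5]
```

Note how the raw 10x gap between fields disappears after normalization: a paper read twice as often as its field's average scores 1.5 in either field.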


2015, Vol 36 (5), pp. 662-667
Author(s): Kaifu Chen, Zheng Hu, Zheng Xia, Dongyu Zhao, Wei Li, ...

Genome-wide analyses of changes in gene expression, transcription factor occupancy on DNA, histone modification patterns on chromatin, genomic copy number variation, and nucleosome positioning have become popular in many modern laboratories, yielding a wealth of information in health and disease states. However, most of these studies have overlooked an inherent normalization problem that must be corrected with spike-in controls. Here we describe why spike-in controls are so important and explain how to appropriately design and use them for normalization. We also suggest ways to retrospectively renormalize data sets that were wrongly interpreted due to the omission of spike-in controls.
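The basic idea behind spike-in normalization is that material added in a known, equal amount to every sample should read out equally after scaling; the scale factors derived from the spike-in are then applied to the samples' own signal. A minimal sketch with hypothetical read totals (not the authors' protocol):

```python
def spikein_scale_factors(spike_counts):
    """Scale factors that equalize spike-in read totals across samples.
    Each sample's signal is multiplied by its factor before comparison."""
    ref = min(spike_counts.values())  # arbitrary reference: smallest total
    return {sample: ref / count for sample, count in spike_counts.items()}

# Hypothetical spike-in read totals per sample (e.g. from a ChIP-seq run).
spikes = {"control": 1_000_000, "treated": 2_000_000}
factors = spikein_scale_factors(spikes)
# The treated sample is scaled down 2x, since its spike-in was read
# twice as often despite an identical input amount.
```

Without the spike-in, a global change in signal (e.g. genome-wide loss of a histone mark) is silently normalized away, because standard depth normalization assumes total signal is constant across samples.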


Robotica, 2014, Vol 33 (6), pp. 1250-1280
Author(s): Luca Carlone, Vito Macchia, Federico Tibaldi, Basilio Bona

In this work, we investigate a quaternion-based formulation of 3D Simultaneous Localization and Mapping with an Extended Kalman Filter (EKF-SLAM) using relative pose measurements. We introduce a discrete-time derivation that avoids the normalization problem that often arises when using unit quaternions in Kalman filters, and we study its observability properties. The consistency of the estimation errors with the corresponding covariance matrices is also evaluated. The approach is further tested on real data from the Rawseeds dataset and applied within a delayed-state EKF architecture for estimating a dense 3D map of an unknown environment. The contribution is motivated by the possibility of abstracting multi-sensorial information in terms of relative pose measurements and by its straightforward extension to the multi-robot case.
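The normalization problem the paper avoids can be illustrated directly: repeated quaternion updates with increments that are not exactly unit length drift the state off the unit sphere, and the brute-force remedy is to renormalize after each update. A minimal sketch of the problem itself (not the paper's discrete-time derivation):

```python
import numpy as np

def quat_multiply(q, r):
    """Hamilton product of quaternions given as [w, x, y, z]."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0*w1 - x0*x1 - y0*y1 - z0*z1,
        w0*x1 + x0*w1 + y0*z1 - z0*y1,
        w0*y1 - x0*z1 + y0*w1 + z0*x1,
        w0*z1 + x0*y1 - y0*x1 + z0*w1,
    ])

q = np.array([1.0, 0.0, 0.0, 0.0])  # identity attitude
for _ in range(1000):
    # Small rotation increment, deliberately left un-normalized,
    # as happens when an EKF update perturbs the quaternion state.
    dq = np.array([1.0, 0.0005, 0.0, 0.0])
    q = quat_multiply(q, dq)

drifted = np.linalg.norm(q)  # slightly above 1 after many updates
q = q / drifted              # ad hoc fix: renormalize the state
```

Because the quaternion norm is multiplicative under the Hamilton product, the error compounds every step; the explicit renormalization restores the unit constraint but is exactly the awkward step the paper's formulation is designed to avoid.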


Author(s): E. J. Cross, G. Manson, K. Worden, S. G. Pierce

This paper explores and compares the application of three different approaches to the data normalization problem in structural health monitoring (SHM), which concerns the removal of confounding trends induced by varying operational conditions from a measured structural response that correlates with damage. The methodologies for singling out or creating damage-sensitive features that are insensitive to environmental influences explored here include cointegration, outlier analysis and an approach relying on principal component analysis. The application of cointegration is a new idea for SHM from the field of econometrics, and this is the first work in which it has been comprehensively applied to an SHM problem. Results when applying cointegration are compared with results from the more familiar outlier analysis and an approach that uses minor principal components. The ability of these methods for removing the effects of environmental/operational variations from damage-sensitive features is demonstrated and compared with benchmark data from the Brite-Euram project DAMASCOS (BE97 4213), which was collected from a Lamb-wave inspection of a composite panel subject to temperature variations in an environmental chamber.
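The minor-principal-components approach mentioned above can be sketched in a few lines: the dominant components are assumed to capture environmental variation (temperature trends and the like) and are discarded, keeping only the least-variant directions as damage-sensitive features. Synthetic data below, not the DAMASCOS benchmark:

```python
import numpy as np

def minor_components_projection(X, n_minor):
    """Project rows of X onto the n_minor least-variant principal directions.
    Dominant components, assumed to carry environmental trends, are discarded."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered data: rows of Vt are principal directions,
    # ordered by decreasing variance.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    minor = Vt[-n_minor:]   # the least-variant directions
    return Xc @ minor.T     # candidate damage-sensitive features

# Hypothetical feature matrix: 100 observations of 5 Lamb-wave features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
feats = minor_components_projection(X, n_minor=2)
```

The design choice is the flip side of ordinary PCA: instead of keeping high-variance components, one keeps low-variance ones, on the assumption that slow environmental drift dominates the variance while damage perturbs directions the environment leaves alone.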


2007, Vol 16 (02n03), pp. 445-452
Author(s): L. Marassi, J. A. S. Lima

The Press–Schechter (PS) formalism yields the number density of bound objects formed during the evolution of the Universe, which can be compared with the great clusters of galaxies observed today. This basic approach has an intrinsic normalization problem (the so-called missing factor 2). We argue here that this problem is related to specific choices of the statistical distribution describing the initial density fluctuations. The fudge factor 2 occurs even in the context of the q-nonextensive statistics, which is the simplest one-parametric extension of the Gaussian distribution. In general, other distributions require different corrections, as in the case of the log-normal distribution. However, by assuming that the perturbations can be described by the Burr distribution, we prove that the PS approach in this case is not plagued by the normalization problem.
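For readers unfamiliar with the "missing factor 2", the standard Gaussian case can be summarized as follows (textbook material, not taken from this paper). For a Gaussian field smoothed on mass scale $M$ with variance $\sigma^2(M)$, the fraction of mass above the collapse threshold $\delta_c$ is

```latex
F(>M) = \frac{1}{2}\,\operatorname{erfc}\!\left(\frac{\delta_c}{\sqrt{2}\,\sigma(M)}\right),
```

which tends to $1/2$ rather than $1$ as $\sigma \to \infty$: only half the mass is ever assigned to bound objects. Press and Schechter restored the normalization by inserting a factor 2 by hand, giving the standard mass function

```latex
n(M)\,\mathrm{d}M
  = \sqrt{\frac{2}{\pi}}\,\frac{\bar\rho}{M^{2}}\,
    \frac{\delta_c}{\sigma(M)}
    \left|\frac{\mathrm{d}\ln\sigma}{\mathrm{d}\ln M}\right|
    \exp\!\left(-\frac{\delta_c^{2}}{2\sigma^{2}(M)}\right)\mathrm{d}M .
```

The abstract's point is that this ad hoc factor is tied to the Gaussian (and q-nonextensive) choice of initial statistics, and disappears for the Burr distribution.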

