Design of an FMCW Radar Altimeter for Wide-Range and Low Measurement Error

2015, Vol 64 (12), pp. 3517–3525
Author(s): Jae-Hyun Choi, Jong-Hun Jang, Jin-Eep Roh

2014, Vol 50 (25), pp. 1975–1977
Author(s): J.H. Choi, J.H. Jang, J.E. Roh

Author(s): Marek Ciesielski, Krzysztof Stasiak, Mariia Khyzhniak, Marcin Zywek, Marek Rupniewski

2021
Author(s): Midhunkrishna P R, Lal M J, Sherly Joy, Mukundan K K, Narayanan Namboothiripad M

2020, Vol 130 (4), pp. 800–812
Author(s): Juan Vrdoljak, Kevin Imanol Sanchez, Roberto Arreola-Ramos, Emilce Guadalupe Diaz Huesa, Alejandro Villagra, ...

Abstract: The repeatability of findings is the key factor behind scientific reliability, and the failure to reproduce scientific findings has been termed the ‘replication crisis’. Geometric morphometrics is an established tool in evolutionary biology. However, different operators (and/or different methods) can act as large sources of variation in the data obtained. Here, we investigated inter-operator error in geometric morphometric protocols on complex shapes of Liolaemus lizards, as well as measurement error in three taxa varying in their difficulty of digitization. We also examined the potential of these protocols to discriminate among complex shapes in closely related species. We found a wide range of inter-operator error, contributing between 19.5% and 60% of the total variation. Moreover, measurement error increased with the complexity of the quantified shape. All protocols were able to discriminate between species, but the use of more landmarks did not imply better performance. We present evidence that complex shapes reduce repeatability, highlighting the need to explore different sources of variation that could lead to such low repeatability. Lastly, we suggest some recommendations to improve the repeatability and reliability of geometric morphometric results.
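The kind of variance decomposition behind figures like "inter-operator error contributed 19.5–60% of total variation" can be sketched numerically. The example below is illustrative only and is not the authors' protocol: it uses an ordinary Procrustes fit to a fixed reference (rather than full generalized Procrustes analysis), and the function names, data shapes, and error metric are assumptions. It aligns each operator's digitization of each specimen and reports the percentage of total shape variation contributed by within-specimen (inter-operator) scatter.

```python
import numpy as np

def align(ref, Y):
    """Ordinary Procrustes fit of landmark configuration Y onto a
    centred, unit-size reference: translation, scale and rotation are
    removed (reflections are not excluded in this simple sketch)."""
    Yc = Y - Y.mean(axis=0)
    Yc = Yc / np.linalg.norm(Yc)
    U, _, Vt = np.linalg.svd(Yc.T @ ref)   # orthogonal Procrustes rotation
    return Yc @ (U @ Vt)

def operator_error_pct(configs):
    """configs: (n_specimens, n_operators, n_landmarks, 2) array of raw
    digitizations.  Returns the percentage of total Procrustes variation
    contributed by within-specimen (inter-operator) scatter."""
    configs = np.asarray(configs, dtype=float)
    n, m, k, d = configs.shape
    ref = configs[0, 0] - configs[0, 0].mean(axis=0)
    ref = ref / np.linalg.norm(ref)
    A = np.array([[align(ref, configs[i, j]) for j in range(m)]
                  for i in range(n)])
    grand = A.reshape(-1, k, d).mean(axis=0)   # overall mean shape
    spec_means = A.mean(axis=1)                # per-specimen mean shape
    ss_total = ((A - grand) ** 2).sum()        # all shape variation
    ss_within = ((A - spec_means[:, None]) ** 2).sum()  # operator scatter
    return 100.0 * ss_within / ss_total
```

Because group sizes are equal, total sum of squares splits exactly into between-specimen and within-specimen (operator) components, so the returned percentage is bounded by 0 and 100.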


Author(s): Patricia Penabad Durán, Paolo Di Barba, Xose Lopez-Fernandez, Janusz Turowski

Purpose – The purpose of this paper is to describe a parameter identification method based on multiobjective (MO) deterministic and non-deterministic optimization algorithms to compute the temperature distribution on transformer tank covers.
Design/methodology/approach – The strategy for implementing the parameter identification process consists of three main steps. First, the most appropriate objective function is defined and the identification problem is solved for the chosen parameters using single-objective (SO) optimization algorithms. Then the sensitivity of the computational model to measurement error is assessed, and finally this sensitivity is included as an additional objective function, making the identification problem a MO one.
Findings – Computations with identified/optimal parameters yield accurate results for a wide range of current values and different conductor arrangements. From the numerical solution of the temperature field, decisions on dimensions and materials can be taken to avoid overheating of transformer covers.
Research limitations/implications – The accuracy of the model depends on its parameters, such as heat exchange coefficients and material properties, which are difficult to determine from formulae or from the literature. Thus the goal of the presented technique is to achieve the best possible agreement between measured and numerically calculated temperature values.
Originality/value – Differing from previous works found in the literature, sensitivity to measurement error is considered in the parameter identification technique as an additional objective function. Thus, solutions less sensitive to measurement error, at the expense of some degradation in accuracy, are identified by means of MO optimization algorithms.
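The three-step strategy (fit by SO optimization, assess sensitivity to measurement error, then fold that sensitivity in as a second objective) can be sketched with a toy scalarized trade-off. Everything below is an assumption for illustration: the lumped heating model `T = T_amb + a * I**b`, the condition number of the model Jacobian as a sensitivity proxy, and a crude random search standing in for the paper's deterministic and non-deterministic MO optimizers.

```python
import numpy as np

T_AMB = 25.0  # hypothetical ambient temperature, deg C

def model(p, I):
    """Hypothetical lumped cover-heating model: T = T_amb + a * I**b."""
    a, b = p
    return T_AMB + a * I ** b

def misfit(p, I, T_meas):
    """SO objective: squared error between model and measured temperatures."""
    return float(np.sum((model(p, I) - T_meas) ** 2))

def sensitivity(p, I):
    """Proxy for sensitivity of the identified parameters to measurement
    error: condition number of the model Jacobian d T / d (a, b)."""
    a, b = p
    J = np.stack([I ** b, a * I ** b * np.log(I)], axis=1)
    return float(np.linalg.cond(J))

def identify(I, T_meas, w, n_iter=4000, seed=0):
    """Weighted-sum scalarization of (misfit, log-sensitivity), minimized
    by random search; sweeping w in [0, 1) traces an approximate
    Pareto front of accuracy vs. robustness to measurement error."""
    rng = np.random.default_rng(seed)
    best_p, best_f = None, np.inf
    for _ in range(n_iter):
        p = (rng.uniform(0.001, 0.2), rng.uniform(0.5, 3.0))
        f = (1 - w) * misfit(p, I, T_meas) + w * np.log(sensitivity(p, I))
        if f < best_f:
            best_p, best_f = p, f
    return best_p
```

With w = 0 this reduces to the plain SO fit of step one; increasing w trades fit accuracy for parameter sets whose identification is less sensitive to measurement noise, mirroring the paper's MO formulation.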


1993, Vol 264 (6), pp. E902–E911
Author(s): D. C. Bradley, G. M. Steil, R. N. Bergman

We introduce a novel technique for estimating measurement error in time courses and other continuous curves. This error estimate is used to reconstruct the original (error-free) curve. The measurement error of the data is initially assumed, and the data are smoothed with "Optimal Segments" such that the smooth curve misses the data points by an average amount consistent with the assumed measurement error. Thus the differences between the smooth curve and the data points (the residuals) are tentatively assumed to represent the measurement error. This assumption is checked by testing the residuals for randomness. If the residuals are nonrandom, it is concluded that they do not resemble measurement error, and a new measurement error is assumed. This process continues reiteratively until a satisfactory (i.e., random) group of residuals is obtained. In this case the corresponding smooth curve is taken to represent the original curve. Monte Carlo simulations of selected typical situations demonstrated that this new method ("OOPSEG") estimates measurement error accurately and consistently in 30- and 15-point time courses (r = 0.91 and 0.78, respectively). Moreover, smooth curves calculated by OOPSEG were shown to accurately recreate (predict) original, error-free curves for a wide range of measurement errors (2-20%). We suggest that the ability to calculate measurement error and reconstruct the error-free shape of data curves has wide applicability in data analysis and experimental design.
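The iterate-until-residuals-look-random idea can be sketched compactly. The code below is loosely inspired by the described procedure, not a reimplementation of OOPSEG: a centered moving average stands in for "Optimal Segments", a Wald–Wolfowitz runs statistic is one plausible randomness check (the paper does not specify its test here), and the sweep keeps the smoothing level whose residuals look most like white noise; the residual SD is then the measurement-error estimate and the smooth curve the reconstructed signal.

```python
import numpy as np

def smooth(y, window):
    """Centered moving average with edge padding (a crude stand-in
    for the paper's 'Optimal Segments' smoother)."""
    pad = window // 2
    yp = np.pad(y, pad, mode="edge")
    return np.convolve(yp, np.ones(window) / window, mode="valid")[:len(y)]

def runs_z(resid):
    """Standardized Wald-Wolfowitz runs statistic on residual signs:
    z near 0 means the residuals look like random noise."""
    s = np.sign(resid)
    s = s[s != 0]
    n_pos, n_neg = int(np.sum(s > 0)), int(np.sum(s < 0))
    n = n_pos + n_neg
    if n_pos == 0 or n_neg == 0 or n < 3:
        return np.inf
    runs = 1 + int(np.sum(s[1:] != s[:-1]))
    mu = 2.0 * n_pos * n_neg / n + 1.0
    var = (mu - 1.0) * (mu - 2.0) / (n - 1.0)
    return (runs - mu) / np.sqrt(var) if var > 0 else np.inf

def estimate_error(y, windows=range(3, 16, 2)):
    """Sweep smoothing levels; keep the one whose residuals are most
    consistent with randomness.  Returns (smooth curve, error SD)."""
    best = None
    for w in windows:
        fit = smooth(y, w)
        resid = y - fit
        z = abs(runs_z(resid))
        if best is None or z < best[0]:
            best = (z, fit, float(np.std(resid)))
    return best[1], best[2]
```

Too little smoothing leaves negatively autocorrelated residuals (too many sign runs), too much smoothing leaves structured residuals (too few runs); the selected level sits between the two, in the spirit of the reiterative check described above.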

