Localization uncertainty in area-based stereo algorithms

1995 ◽  
Vol 25 (12) ◽  
pp. 1628-1634 ◽  
Author(s):  
B. Volpel ◽  
W. M. Theimer

2021 ◽
Author(s):  
Sanghun Park ◽  
Kunhee Kim ◽  
Eunseop Lee ◽  
Daijin Kim

2019 ◽  
Vol 35 (5) ◽  
pp. 1123-1135
Author(s):  
Piotr R. Slawinski ◽  
Nabil Simaan ◽  
Addisu Z. Taddese ◽  
Keith L. Obstein ◽  
Pietro Valdastri

2020 ◽  
Vol 497 (1) ◽  
pp. 204-209 ◽  
Author(s):  
Hai Yu ◽  
Pengjie Zhang ◽  
Fa-Yin Wang

ABSTRACT Standard siren cosmology of gravitational wave (GW) merger events relies on the identification of host galaxies and their redshifts. This can be highly challenging due to the numerous candidate galaxies in the GW localization area. We point out that the number of candidates can be reduced by orders of magnitude for strongly lensed GW events, thanks to the extra observational constraints they carry. For next-generation GW detectors such as the Einstein Telescope (ET), we estimate that this number is usually significantly less than one, as long as the GW localization uncertainty is better than $\sim 10\, \rm deg^2$. This implies that unique identification of the host galaxy of a lensed GW event detected by ET and Cosmic Explorer (CE) is possible, providing a promising opportunity to measure the redshift of the GW event and facilitate standard siren cosmology. We also discuss potential applications in understanding the evolution process and environment of GW events.
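The order-of-magnitude argument can be sketched as a back-of-envelope count. Every number below (galaxy surface density, distance-cut fraction, lensing selection factor) is an illustrative assumption, not a value from the abstract:

```python
# Back-of-envelope host-candidate counting for a GW event.
# All numbers are illustrative assumptions, not values from the paper.

def candidate_count(area_deg2, galaxies_per_deg2, distance_cut_fraction):
    """Galaxies inside the sky-localization area that also pass the
    GW luminosity-distance (redshift) cut."""
    return area_deg2 * galaxies_per_deg2 * distance_cut_fraction

# Unlensed event: ~10 deg^2 localization, an assumed 1e4 galaxies/deg^2
# in the relevant magnitude range, ~10% consistent with the distance cut.
n_unlensed = candidate_count(10.0, 1e4, 0.1)

# Lensed event: the host must additionally sit behind a strong lens whose
# image configuration and time delays match the repeated GW signals,
# modeled here as an extra (assumed) 1-in-1e4 selection factor.
n_lensed = n_unlensed * 1e-4

print(n_unlensed, n_lensed)  # 10000.0 1.0
```

The point is not the specific numbers but the multiplicative structure: each independent observational constraint cuts the candidate list by its own selection factor, so a strong-lensing requirement can bring thousands of candidates down to of order one.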


Author(s):  
Sungjoon Choi ◽  
Mahdi Jadaliha ◽  
Jongeun Choi ◽  
Songhwai Oh

In this paper, we propose distributed Gaussian process regression (GPR) for resource-constrained distributed sensor networks under localization uncertainty. The proposed distributed algorithm, which combines Jacobi over-relaxation (JOR) and discrete-time average consensus (DAC), can effectively handle localization uncertainty as well as the limited communication and computation capabilities of distributed sensor networks. We also extend the proposed method hierarchically using sparse GPR to improve its scalability. The performance of the proposed method is verified in numerical simulations against the centralized maximum a posteriori (MAP) solution and a quick-and-dirty solution. We show that the proposed method outperforms the quick-and-dirty solution and achieves an accuracy comparable to that of the centralized solution.
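The two building blocks named in the abstract can be sketched in isolation. The snippet below is a minimal illustration under simplified, assumed settings (a toy RBF kernel, a fixed doubly stochastic weight matrix), not the authors' implementation:

```python
import numpy as np

def jor_solve(A, b, omega=0.5, iters=500):
    """Jacobi over-relaxation (JOR) for A x = b:
    x <- (1 - omega) x + omega * D^{-1} (b - R x), with A = D + R.
    In GP regression this solves (K + sigma^2 I) alpha = y."""
    d = np.diag(A)
    R = A - np.diag(d)
    x = np.zeros_like(b)
    for _ in range(iters):
        x = (1 - omega) * x + omega * (b - R @ x) / d
    return x

def dac_step(x, W):
    """One discrete-time average consensus (DAC) step: each node mixes
    its value with its neighbors' using doubly stochastic weights W."""
    return W @ x

# Toy GP system: RBF kernel on 5 scalar inputs plus observation noise.
X = np.linspace(0.0, 1.0, 5)
K = np.exp(-0.5 * (X[:, None] - X[None, :])**2 / 0.2**2)
A = K + 0.1 * np.eye(5)
y = np.sin(2.0 * np.pi * X)
alpha = jor_solve(A, y)           # approximates np.linalg.solve(A, y)

# Toy consensus on a 4-node ring: values converge to their average.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])
v = np.array([1.0, 2.0, 3.0, 4.0])
for _ in range(100):
    v = dac_step(v, W)            # v -> [2.5, 2.5, 2.5, 2.5]
```

Both primitives need only local matrix-vector products and neighbor-to-neighbor averaging, which is what makes them attractive for networks with limited communication and computation.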


Author(s):  
Rupeng Yuan ◽  
Fuhai Zhang ◽  
Jiadi Qu ◽  
Guozhi Li ◽  
Yili Fu

Purpose
The purpose of this paper is to propose an enhanced pose tracking method using progressive scan matching, focusing on accuracy, time efficiency and robustness.

Design/methodology/approach
The general purpose of localization algorithms is to dynamically track a robot rather than to globally locate one. In this paper, progressive scan matching is used to improve the performance of pose tracking. Rotational and translational samples are generated separately to accelerate the calculation and to increase accuracy. Progressive iteration of the sample generation ensures that localization achieves a specified precision. The direction of localization uncertainty is taken into consideration to increase robustness, and nonlinear optimization is adopted to achieve a more precise result.

Findings
The proposed method was implemented on a self-made mobile robot. Two experiments were conducted to test the accuracy and time efficiency of the method, and the comparison with basic Monte Carlo localization shows its advantages. Another two experiments were conducted to test its robustness; the results show that the method can relocate a robot from an inaccurate pose if the offset is moderate.

Originality/value
An enhanced pose tracking method is proposed that improves performance by separately processing rotational and translational samples, progressively iterating the sample generation, taking the direction of localization uncertainty into consideration and adopting nonlinear optimization. The proposed method enables a robot to locate itself in the environment accurately, quickly and robustly.

