IMU-Vision based Localization Algorithm for Lunar Rover

Author(s):  
Hosun Kang ◽  
Jongwoo An ◽  
Jangmyung Lee
2019 ◽  
Vol 14 (1) ◽  
pp. 65-73

Author(s):  
Hosun Kang ◽  
Jongwoo An ◽  
Hyunsoo Lim ◽  
Seulwoo Hwang ◽  
...  

Author(s):  
Zhuorui Yang ◽  
Aura Ganz

In this paper, we introduce an egocentric, landmark-based guidance system that enables visually impaired users to interact with indoor environments. The user, who wears Google Glass, captures the surroundings within his field of view. Using this information, we provide the user with an accurate landmark-based description of the environment, including his relative distance and orientation to each landmark. To achieve this functionality, we developed a near-real-time, accurate, vision-based localization algorithm. Since the users are visually impaired, our algorithm accounts for images captured with Google Glass that exhibit severe blurriness, motion blur, low illumination intensity and crowd obstruction. We tested the algorithm's performance in a 12,000 ft² open indoor environment. For mint (undistorted) query images, our algorithm obtains a mean location accuracy within 5 ft, a mean orientation accuracy of less than 2 degrees, and reliability above 88%. After applying deformation effects to the query images, such as blurriness, motion blur and illumination changes, we observe that the reliability remains above 75%.
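The abstract above reports each landmark's distance and orientation relative to the user. The paper's actual computation is not given here, so as an illustrative sketch only, assuming known 2-D landmark coordinates and an estimated user position and heading (all names and coordinates below are hypothetical), the egocentric geometry could look like:

```python
import math

def landmark_guidance(user_xy, heading_deg, landmarks):
    """For each named landmark, return (distance, relative bearing in degrees).

    Hypothetical helper: user_xy is the estimated user position, heading_deg
    the user's facing direction (counter-clockwise from the +x axis), and
    landmarks a dict of name -> (x, y) coordinates.
    """
    ux, uy = user_xy
    results = {}
    for name, (lx, ly) in landmarks.items():
        dx, dy = lx - ux, ly - uy
        dist = math.hypot(dx, dy)
        # Absolute bearing of the landmark, counter-clockwise from +x
        abs_bearing = math.degrees(math.atan2(dy, dx))
        # Relative bearing wrapped into [-180, 180); positive = to the left
        rel = (abs_bearing - heading_deg + 180.0) % 360.0 - 180.0
        results[name] = (dist, rel)
    return results

# Example: user at the origin facing +x; a "door" 5 ft away, up and to the left
guidance = landmark_guidance((0.0, 0.0), 0.0,
                             {"door": (3.0, 4.0), "desk": (5.0, 0.0)})
```

Under this sketch, the "door" landmark would be reported as 5 ft away at roughly 53° to the user's left, and the "desk" as 5 ft straight ahead, which is the kind of relative distance/orientation description the system speaks to the user.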


2014 ◽  
Vol 44 (10) ◽  
pp. 1097-1104
Author(s):  
XinChao XU ◽  
Bo WEN ◽  
Ke WU ◽  
QunZhi LI ◽  
ShiYan WEI ◽  
...  

2018 ◽  
pp. 1483-1499
Author(s):  
Zhuorui Yang ◽  
Aura Ganz



2015 ◽  
Vol 10 (10) ◽  
pp. 1062
Author(s):  
A. Mesmoudi ◽  
Mohammed Feham ◽  
Nabila Labraoui ◽  
Chakib Bekara
