An ARCore Based User Centric Assistive Navigation System for Visually Impaired People

2019, Vol 9 (5), pp. 989
Author(s): Xiaochen Zhang, Xiaoyu Yao, Yi Zhu, Fei Hu

In this work, we propose an assistive navigation system for visually impaired people (ANSVIP) that takes advantage of ARCore to acquire robust computer vision-based localization. To complete the system, we propose adaptive artificial potential field (AAPF) path planning that considers both efficiency and safety. We also propose a dual-channel human–machine interaction mechanism that delivers accurate, continuous directional micro-instructions via a haptic interface and long-term macro planning and situational awareness via audio. Our system user-centrically incorporates haptic interfaces to provide fluent, continuous guidance superior to the conventional turn-by-turn audio-guiding method; moreover, the continuous guidance keeps the path fully under control, steering the user away from obstacles and risky places. The system prototype is implemented with full functionality. Unit tests and simulations evaluating the localization, path planning, and human–machine interaction show that the proposed solutions outperform present state-of-the-art solutions. Finally, integrated tests are carried out with low-vision and blind subjects to verify the proposed system.
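The paper's adaptive APF tunes its parameters online; as a rough illustration of the underlying artificial potential field idea only, a minimal classic planner might look like the sketch below. The gains, the point-obstacle model, and the fixed step size are all assumptions for illustration, not the authors' AAPF implementation.

```python
import math

def apf_step(pos, goal, obstacles, k_att=1.0, k_rep=100.0, rho0=2.0, step=0.2):
    """One step of a classic artificial potential field planner.

    pos, goal: (x, y) tuples; obstacles: list of (x, y) point obstacles.
    The attractive force pulls toward the goal; each obstacle within
    influence radius rho0 adds a repulsive force pushing away from it.
    """
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        rho = math.hypot(dx, dy)
        if 0 < rho <= rho0:
            mag = k_rep * (1.0 / rho - 1.0 / rho0) / (rho ** 2)
            fx += mag * dx / rho
            fy += mag * dy / rho
    norm = math.hypot(fx, fy) or 1.0          # avoid division by zero
    return (pos[0] + step * fx / norm, pos[1] + step * fy / norm)

def plan(start, goal, obstacles, tol=0.3, max_iter=500):
    """Follow the force field until the goal is within tol."""
    path = [start]
    while math.hypot(goal[0] - path[-1][0],
                     goal[1] - path[-1][1]) > tol and len(path) < max_iter:
        path.append(apf_step(path[-1], goal, obstacles))
    return path
```

A known weakness of plain APF is getting trapped in local minima between obstacles; an adaptive variant such as the paper's can rebalance the attractive and repulsive terms to trade off efficiency against safety.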

Author(s): Tee Zhi Heng, Ang Li Minn, Seng Kah Phooi

This chapter presents a novel application of wireless technology to assist visually impaired people. As an alternative to the medical model of rehabilitation, the information explosion era provides the foundation for a technological solution that leads the visually impaired to more independent lives in the community by minimizing the obstacles of daily living. A "SmartGuide" caregiver monitoring system is built as a standalone portable handheld device. The objective of this system is to help blind and low-vision people walk around independently, especially in dynamically changing environments. Navigation assistance is accomplished by providing speech guidance on how to move to a particular location. Information about changing environments, such as road blockages and road closures, together with intelligent navigation aids, is provided to guide the user safely to his or her destination. The system also includes a camera sensor network to enhance monitoring capabilities for an extra level of security and reliability.


Electronics, 2019, Vol 8 (6), pp. 697
Author(s): Jinqiang Bai, Zhaoxiang Liu, Yimin Lin, Ye Li, Shiguo Lian, ...

Assistive devices for visually impaired people (VIP) that support daily traveling and improve social inclusion are developing fast. Most of them try to solve the problem of navigation or obstacle avoidance, while other works focus on helping VIP recognize their surrounding objects. However, very few couple both capabilities (i.e., navigation and recognition). Aiming at the above needs, this paper presents a wearable assistive device that allows VIP to (i) navigate safely and quickly in unfamiliar environments, and (ii) recognize objects in both indoor and outdoor environments. The device consists of a consumer Red, Green, Blue and Depth (RGB-D) camera and an Inertial Measurement Unit (IMU), which are mounted on a pair of eyeglasses, and a smartphone. The device leverages the ground-height continuity among adjacent image frames to segment the ground accurately and rapidly, and then searches for a walkable moving direction on the segmented ground. A lightweight Convolutional Neural Network (CNN)-based object recognition system is developed and deployed on the smartphone to increase the perception ability of VIP and support the navigation system. It can provide semantic information about the surroundings, such as the categories, locations, and orientations of objects. Human–machine interaction is performed through an audio module (a beeping sound for obstacle alerts, speech recognition for understanding user commands, and speech synthesis for expressing semantic information about the surroundings). We evaluated the performance of the proposed system through many experiments conducted in both indoor and outdoor scenarios, demonstrating the efficiency and safety of the proposed assistive system.
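The paper's ground segmentation exploits height continuity across adjacent frames; a much-simplified, single-image sketch of the continuity idea is shown below. The height-map input, the seeding from the bottom image row, and the 3 cm step threshold are all illustrative assumptions, not the authors' method.

```python
import numpy as np
from collections import deque

def segment_ground(height_map, max_step=0.03):
    """Sketch of ground segmentation by height continuity.

    height_map: 2-D array of per-pixel heights (metres) above the ground
    plane, e.g. derived from an RGB-D depth image. Region-grow from the
    bottom row (assumed to be ground directly in front of the walker),
    accepting a pixel if its height differs from an already-accepted
    neighbour by less than max_step.
    """
    h, w = height_map.shape
    ground = np.zeros((h, w), dtype=bool)
    queue = deque()
    for c in range(w):                      # seed with the bottom row
        ground[h - 1, c] = True
        queue.append((h - 1, c))
    while queue:                            # 4-connected region growing
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and not ground[nr, nc]:
                if abs(height_map[nr, nc] - height_map[r, c]) < max_step:
                    ground[nr, nc] = True
                    queue.append((nr, nc))
    return ground
```

Anything excluded from the resulting mask (steps, raised obstacles) breaks the height continuity, so the free moving direction can then be searched within the mask.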


2000, Vol 94 (1), pp. 34-41
Author(s): Ng Sau Fun Frency, Hui Chi Leung Patrick, Choy Lin Foong May

This study analyzes the decision-making process for selecting and purchasing clothing of 81 people in Hong Kong who are visually impaired. Data were collected through personal interviews. The results show that problems such as unsatisfactory sales services and insufficient clothing information still exist for people with visual impairments (both those who are blind and those with low vision), and also reveal that people who are visually impaired weigh the selection criteria for purchasing clothing differently than do their sighted peers.

