Cognitive Load Assessment from EEG and Peripheral Biosignals for the Design of Visually Impaired Mobility Aids

2018 ◽  
Vol 2018 ◽  
pp. 1-9 ◽  
Author(s):  
Charalampos Saitis ◽  
Mohammad Zavid Parvez ◽  
Kyriaki Kalimeri

Reliable detection of cognitive load would benefit the design of intelligent assistive navigation aids for visually impaired people (VIP). Ten participants with various degrees of sight loss navigated unfamiliar indoor and outdoor environments while their electroencephalogram (EEG) and electrodermal activity (EDA) signals were recorded. In this study, the cognitive load of the tasks was assessed in real time based on a modification of the well-established event-related (de)synchronization (ERD/ERS) index. We present an in-depth analysis of the environments that most challenge people with certain categories of sight loss, together with an automatic classification of the perceived difficulty at each time instance, inferred from their biosignals. Despite the limited size of our sample, our findings suggest significant differences across environments for the various categories of sight loss. Moreover, we exploit cross-modal relations to predict cognitive load in real time from features extracted from the EDA. This possibility paves the way for the design of less invasive, wearable assistive devices that take the well-being of the VIP into consideration.
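The classic ERD/ERS index the abstract builds on expresses band power during a task relative to a resting baseline. A minimal sketch, assuming the standard Pfurtscheller-style definition rather than the authors' specific modification (function names and sample data are illustrative):

```python
# Hedged sketch of the classic ERD/ERS index, not the authors' exact
# modification. Inputs are assumed to be already band-filtered EEG segments.

def band_power(samples):
    """Mean squared amplitude of a band-filtered EEG segment."""
    return sum(s * s for s in samples) / len(samples)

def erd_ers_index(reference, activity):
    """ERD/ERS% = (A - R) / R * 100, with R the baseline band power
    and A the band power during the task. Negative values indicate
    desynchronization (ERD); positive values indicate synchronization (ERS)."""
    r = band_power(reference)
    a = band_power(activity)
    return (a - r) / r * 100.0

# Example: alpha-band power dropping during a demanding navigation task.
rest = [1.0, -1.0, 1.0, -1.0]        # R = 1.0
task = [0.5, -0.5, 0.5, -0.5]        # A = 0.25
print(erd_ers_index(rest, task))     # -75.0, i.e., strong ERD
```

A drop in alpha-band power relative to rest (a negative index) is the pattern conventionally associated with increased cognitive engagement.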

Author(s):  
John Nicholson ◽  
Vladimir Kulyukin

Limited sensory information about a new environment often requires people with a visual impairment to rely on sighted guides for showing or describing routes around the environment. However, route descriptions provided by other blind independent navigators (e.g., over a cell phone) can also be used to guide a traveler along a previously unknown route. A visually impaired guide can often describe a route as well as or better than a sighted person, since the guide is familiar with the issues of blind navigation. This chapter introduces the Collaborative Route Information Sharing System (CRISS), a collaborative online environment where visually impaired and sighted people will be able to share and manage route descriptions for indoor and outdoor environments. It then describes the system's Route Analysis Engine module, which uses information extraction techniques to find landmarks in natural language route descriptions written by independent blind navigators.


2015 ◽  
Vol 21 (1-2) ◽  
pp. 179-187 ◽  
Author(s):  
Ivan Lazovic ◽  
Milena Jovasevic-Stojanovic ◽  
Marija Zivkovic ◽  
Visa Tasic ◽  
Zarko Stevanovic

Indoor air quality (IAQ) is very important for children's health and well-being, since children are particularly vulnerable and sensitive to the presence of air pollutants. This study was performed in two naturally ventilated schools located in the same municipality. The first school is located in an urban area, at a residential-industrial site, while the other is situated in a rural area. The school buildings were chosen based on their urban environment features. The measurements were carried out in both the heating and non-heating periods, over five consecutive working days each. The objective of the study was to analyze IAQ in the classrooms, with special emphasis on the levels and diurnal variations of particulate matter (PM10 and PM2.5), carbon dioxide (CO2), and nitrogen dioxide (NO2) in occupied and unoccupied school classrooms. The CO2 concentrations were measured in both indoor and outdoor environments. Concentrations of CO2 higher than 1000 ppm were regularly detected in the classrooms during teaching hours. Indoor concentrations of PM10 did not exceed the daily average guideline value of 50 µg/m³. Concentrations of PM2.5 exceeded the daily average guideline value of 25 µg/m³ in both schools during the heating period. Concentrations of NO2 did not exceed the guideline value of 200 µg/m³. Ventilation rates were calculated and compared with the prescribed limits. In both occupied and unoccupied periods, a high correlation between CO2 and PM concentrations was found.
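Ventilation rates of the kind the abstract mentions are commonly estimated from the decay of CO2 during an unoccupied period. A minimal sketch of the standard tracer-gas decay method (not necessarily the authors' exact procedure; the example concentrations are illustrative):

```python
# Hedged sketch: air-change rate (ACH) from CO2 decay in an unoccupied
# classroom, using the standard tracer-gas decay method.
# Concentrations in ppm, time in hours.
import math

def air_changes_per_hour(c_start, c_end, c_outdoor, hours):
    """ACH = ln((C_start - C_out) / (C_end - C_out)) / t,
    assuming well-mixed air and constant outdoor concentration."""
    return math.log((c_start - c_outdoor) / (c_end - c_outdoor)) / hours

# Example: CO2 decays from 1400 ppm to 700 ppm over one hour,
# with 400 ppm outdoors.
ach = air_changes_per_hour(1400, 700, 400, 1.0)
print(round(ach, 2))  # about 1.2 air changes per hour
```

The resulting ACH can then be compared against prescribed classroom ventilation limits, as done in the study.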


Sensors ◽  
2019 ◽  
Vol 19 (12) ◽  
pp. 2771 ◽  
Author(s):  
Simona Caraiman ◽  
Otilia Zvoristeanu ◽  
Adrian Burlacu ◽  
Paul Herghelegiu

The development of computer vision based systems dedicated to helping visually impaired people perceive the environment, orient themselves, and navigate has been the main subject of much research in recent years. Substantial resources have been employed to support the development of sensory substitution devices (SSDs) and electronic travel aids for the rehabilitation of the visually impaired. The Sound of Vision (SoV) project used a comprehensive approach to develop such an SSD, tackling all the challenging aspects that have so far restrained large scale adoption of such systems by the intended audience: wearability, real-time operation, pervasiveness, usability, and cost. This article presents the artificial vision based component of the SoV SSD that performs scene reconstruction and segmentation in outdoor environments. In contrast with the indoor use case, where the system acquires depth input from a structured light camera, outdoors SoV relies on stereo vision to detect the elements of interest and provide an audio and/or haptic representation of the environment to the user. Our stereo-based method is designed to work with wearable acquisition devices and still provide a real-time, reliable description of the scene in the context of unreliable depth input from the stereo correspondence and of the complex 6 DOF motion of the head-worn camera. We quantitatively evaluate our approach on a custom benchmarking dataset acquired with SoV cameras and provide the highlights of the usability evaluation with visually impaired users.
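Behind any stereo-correspondence pipeline such as SoV's lies the textbook pinhole-stereo relation between disparity and depth. A minimal sketch (camera parameters here are illustrative, not taken from the paper), which also shows why unreliable matches, i.e., near-zero disparity, yield no usable depth:

```python
# Hedged sketch: depth from disparity in a rectified stereo pair,
# Z = f * B / d, with focal length f in pixels, baseline B in metres,
# and disparity d in pixels. Parameters below are illustrative only.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Return depth in metres; non-positive disparity means an
    unreliable or missing stereo match, so no finite depth."""
    if disparity_px <= 0:
        return float("inf")
    return focal_px * baseline_m / disparity_px

# Example: with a 700 px focal length and a 12 cm baseline, a 20 px
# disparity places an obstacle at 4.2 m.
print(depth_from_disparity(700.0, 0.12, 20.0))
```

The inverse relation between disparity and depth is also why stereo depth degrades rapidly for distant objects: a one-pixel matching error shifts far-away estimates much more than nearby ones.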


Developing surveillance systems for indoor and outdoor environments using currently available wireless sensor technology, without violating privacy, is a challenging task. Passive Infrared (PIR) detectors are suitable for such systems provided solutions to their technical limitations are implemented. In the proposed work, the development of a human tracking system using analogue PIR detectors and currently available wireless sensor technology is presented. Performance is evaluated by conducting real-time tests in different environmental scenarios. Analysis of experimental results of human sensing signals indicates that performance is affected by environmental parameters. These findings will be helpful to researchers implementing a real-time system in the field.


2020 ◽  
Vol 10 (2) ◽  
pp. 523 ◽  
Author(s):  
Santiago Real ◽  
Alvaro Araujo

In this paper, the Virtually Enhanced Senses (VES) system is described. It is an ARCore-based, mixed-reality system meant to assist the navigation of blind and visually impaired people. VES operates in indoor and outdoor environments without any previous in-situ installation. It provides users with specific, runtime-configurable stimuli according to their pose, i.e., position and orientation, and the information of the environment recorded in a virtual replica. It implements three output data modalities: wall-tracking assistance, an acoustic compass, and a novel sensory substitution algorithm, Geometry-based Virtual Acoustic Space (GbVAS). The multimodal output of this algorithm takes advantage of natural human perceptual encoding of spatial data. Preliminary experiments with GbVAS have been conducted with sixteen subjects in three different scenarios, demonstrating basic orientation and mobility skills after six minutes of training.

