Obstacle Avoidance in Simulated Environment Using Eye Tracking Technologies

Author(s):  
Csaba Antonya ◽  
Florin Barbuceanu ◽  
Zoltán Rusák ◽  
Doru Talaba ◽  
Silviu Butnariu ◽  
...  

The paper investigates the relationship between human eye movements, correlated with the visual perception of a computer-generated scene, and obstacle avoidance strategies during the process of driving a computer game-like car. Several issues were investigated regarding how the driver's gaze fixation point moves during obstacle avoidance maneuvers, and the relevance of each issue to decision making was assessed. The main goal is to establish a correlation (mapping) system between gaze fixation parameters and obstacle avoidance strategies in order to develop cognitive algorithms for driver assistance in real-world driving conditions, to monitor the driver's vigilance and, ultimately, to enable progress towards an autonomous vehicle that can avoid possible obstacles or resolve hazardous traffic situations simply by monitoring the eye movements of the driver.
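As an illustration of the kind of gaze processing such a mapping system would need, a standard dispersion-threshold (I-DT) fixation detector can be sketched as follows. This is a generic technique, not the authors' method, and the threshold values are assumptions:

```python
def detect_fixations(samples, max_dispersion=30.0, min_samples=5):
    """Dispersion-threshold (I-DT) fixation detection.

    samples: list of (x, y) gaze points at a fixed sampling rate.
    Grow a window while its dispersion (bounding-box width + height)
    stays under max_dispersion; emit the window centroid as a
    fixation once the window holds at least min_samples points.
    """
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        j = i + 1
        while j <= n:
            xs = [p[0] for p in samples[i:j]]
            ys = [p[1] for p in samples[i:j]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        j -= 1  # last window size that satisfied the threshold
        if j - i >= min_samples:
            xs = [p[0] for p in samples[i:j]]
            ys = [p[1] for p in samples[i:j]]
            fixations.append((sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j
        else:
            i += 1
    return fixations
```

The resulting fixation centroids are the raw material from which fixation parameters (duration, location relative to an obstacle) could be extracted.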

2019 ◽  
Vol 121 (5) ◽  
pp. 1967-1976 ◽  
Author(s):  
Niels Gouirand ◽  
James Mathew ◽  
Eli Brenner ◽  
Frederic R. Danion

Adapting hand movements to changes in our body or the environment is essential for skilled motor behavior. Although eye movements are known to assist hand movement control, how eye movements might contribute to the adaptation of hand movements remains largely unexplored. To determine to what extent eye movements contribute to visuomotor adaptation of hand tracking, participants were asked to track a visual target that followed an unpredictable trajectory with a cursor using a joystick. During blocks of trials, participants were either allowed to look wherever they liked or required to fixate a cross at the center of the screen. Eye movements were tracked to ensure gaze fixation as well as to examine free gaze behavior. The cursor initially responded normally to the joystick, but after several trials, the direction in which it responded was rotated by 90°. Although fixating the eyes had a detrimental influence on hand tracking performance, participants exhibited a rather similar time course of adaptation to rotated visual feedback in the gaze-fixed and gaze-free conditions. More importantly, there was extensive transfer of adaptation between the gaze-fixed and gaze-free conditions. We conclude that although eye movements are relevant for the online control of hand tracking, they do not play an important role in the visuomotor adaptation of such tracking. These results suggest that participants do not adapt by changing the mapping between eye and hand movements, but rather by changing the mapping between hand movements and the cursor’s motion independently of eye movements.

NEW & NOTEWORTHY Eye movements assist hand movements in everyday activities, but their contribution to visuomotor adaptation remains largely unknown. We compared adaptation of hand tracking under free gaze and fixed gaze. Although our results confirm that following the target with the eyes increases the accuracy of hand movements, they unexpectedly demonstrate that gaze fixation does not hinder adaptation. These results suggest that eye movements have distinct contributions for online control and visuomotor adaptation of hand movements.
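The 90° perturbation described above amounts to rotating the joystick-to-cursor mapping. A minimal sketch of such a mapping (illustrative only, not the authors' experimental code):

```python
import math

def cursor_velocity(joystick_x, joystick_y, rotation_deg=0.0):
    """Map joystick deflection to cursor velocity, optionally rotated.

    With rotation_deg=0 the cursor follows the hand; with
    rotation_deg=90 it moves perpendicular to the hand's direction,
    the perturbation used in visuomotor-rotation paradigms.
    """
    theta = math.radians(rotation_deg)
    vx = joystick_x * math.cos(theta) - joystick_y * math.sin(theta)
    vy = joystick_x * math.sin(theta) + joystick_y * math.cos(theta)
    return vx, vy
```

Adapting to the rotation then means learning to deflect the joystick 90° away from the intended cursor direction, a remapping the study finds is acquired similarly whether gaze is free or fixed.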


Electronics ◽  
2020 ◽  
Vol 9 (6) ◽  
pp. 883 ◽  
Author(s):  
Khayyam Masood ◽  
Rezia Molfino ◽  
Matteo Zoppi

Freight Urban Robotic Vehicle (FURBOT) is an autonomous vehicle designed to transport last-mile freight to designated urban stations. It is a slow vehicle designed to negotiate the urban environment with complete autonomy, and a slow vehicle may require slightly different strategies for avoiding obstacles: unlike on a highway, it has to deal with pedestrians, traffic lights and slower vehicles while maintaining smoothness in its drive. To tackle obstacle avoidance for this vehicle, sensor-feedback-based strategies have been formulated for a smooth drive and obstacle avoidance. A full mathematical model of the vehicle is formulated and simulated in the MATLAB environment. The model uses velocity control for obstacle avoidance without steering control: avoidance is attained through velocity control, and strategies are formulated through velocity profiling. Innovative techniques are formulated for creating simulated sensory feedback of the environment. Using this feedback, a correct velocity profile is autonomously created as input to the velocity controller. Proximity measurements are assumed to be available for the vehicle within its given range of drive. Novelty lies in manipulating the velocity profile without prior knowledge of the environment. Four different types of obstacles are modeled in the simulated environment of the vehicle; these obstacles are randomly placed in the path of the vehicle, and autonomous velocity profiling is verified in simulation. The simulated results show satisfactory velocity profiling for controller input. The technique helps to tune the existing controller, aids the design of a better velocity controller for the autonomous vehicle, and bridges the gap between sensor feedback and controller input. Moreover, accurate input profiling puts less strain on the system and brings smoothness to the drive for an overall safer environment.
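The core idea, speed commanded from proximity feedback rather than steering, can be illustrated with a simple ramp-down rule. The thresholds `d_stop` and `d_slow` are assumptions for illustration; the paper's actual velocity profiles are more elaborate:

```python
def velocity_setpoint(v_max, distance, d_stop=2.0, d_slow=10.0):
    """Scale the commanded speed by obstacle proximity.

    v_max:    nominal cruise speed (m/s)
    distance: proximity-sensor range to the nearest obstacle (m)
    Returns full speed beyond d_slow, a linear ramp-down between
    d_slow and d_stop, and zero inside the stopping distance.
    """
    if distance <= d_stop:
        return 0.0
    if distance >= d_slow:
        return v_max
    return v_max * (distance - d_stop) / (d_slow - d_stop)
```

Feeding such a setpoint to the velocity controller yields gradual deceleration toward obstacles instead of steering maneuvers, matching the smooth-drive objective described above.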


Sensors ◽  
2021 ◽  
Vol 21 (6) ◽  
pp. 2244
Author(s):  
S. M. Yang ◽  
Y. A. Lin

A safe path planning method for obstacle avoidance in autonomous vehicles has been developed. Based on the Rapidly-exploring Random Trees (RRT) algorithm, an improved algorithm integrating path pruning, smoothing, and optimization with geometric collision detection is shown to improve planning efficiency. Path pruning, a prerequisite to path smoothing, removes the redundant points generated by the random trees for a new path without colliding with the obstacles. Path smoothing modifies the path so that it becomes continuously differentiable, with curvature implementable by the vehicle. Optimization selects a “near”-optimal path of the shortest distance among the feasible paths for motion efficiency. In the experimental verification, a pure pursuit steering controller and a proportional–integral speed controller are applied to keep an autonomous vehicle tracking the path planned by the improved RRT algorithm. It is shown that the vehicle can track the path efficiently and reach the destination safely, with an average tracking control deviation of 5.2% of the vehicle width. The path planning is also applied to lane changes, and the average deviation from the lane during and after lane changes remains within 8.3% of the vehicle width.
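The pruning step described here can be sketched as a greedy shortcut pass over the RRT waypoints. This is a common formulation, not necessarily the authors' exact implementation, and the `collision_free` predicate (segment-vs-obstacle test) is assumed to be supplied by the geometric collision detector:

```python
def prune_path(path, collision_free):
    """Remove redundant RRT waypoints by greedy shortcutting.

    path:           list of waypoints, e.g. (x, y) tuples
    collision_free: predicate on a straight segment (a, b)
    From each kept waypoint, jump to the farthest later waypoint
    reachable by a collision-free straight segment. Consecutive
    RRT waypoints are collision-free by construction, so the
    inner loop always terminates with a valid segment.
    """
    pruned = [path[0]]
    i = 0
    while i < len(path) - 1:
        j = len(path) - 1
        while j > i + 1 and not collision_free(path[i], path[j]):
            j -= 1
        pruned.append(path[j])
        i = j
    return pruned
```

The shortened waypoint list is then handed to the smoothing stage, which restores continuous differentiability within the vehicle's curvature limits.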


Sensors ◽  
2021 ◽  
Vol 21 (15) ◽  
pp. 5178
Author(s):  
Sangbong Yoo ◽  
Seongmin Jeong ◽  
Seokyeon Kim ◽  
Yun Jang

Gaze movement and visual stimuli have been utilized to analyze human visual attention intuitively. Gaze behavior studies mainly show statistical analyses of eye movements and human visual attention. During these analyses, eye movement data and the saliency map are presented to the analysts as separate views or merged views. However, the analysts become frustrated when they need to memorize all of the separate views, or when the eye movements obscure the saliency map in the merged views. It is therefore not easy to analyze how visual stimuli affect gaze movements, since existing techniques focus excessively on the eye movement data. In this paper, we propose a novel visualization technique for analyzing gaze behavior that uses saliency features as visual clues to express the visual attention of an observer. The visual clues representing visual attention are analyzed to reveal which saliency features are prominent for the visual stimulus analysis. We visualize the gaze data together with the saliency features to interpret visual attention, and we analyze gaze behavior with the proposed visualization to show that embedding saliency features within the visualization helps analysts understand the visual attention of an observer.
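One standard way to quantify how well fixations align with a saliency map, related to but not necessarily the technique proposed in this paper, is the Normalized Scanpath Saliency (NSS), sketched here for a plain nested-list map:

```python
def nss(saliency_map, fixations):
    """Normalized Scanpath Saliency.

    saliency_map: 2-D list indexed as saliency_map[y][x]
    fixations:    list of fixated pixel coordinates (x, y)
    Z-score the whole map, then average the z-values at the
    fixated pixels. Values above 0 mean fixations land on
    above-average saliency.
    """
    flat = [v for row in saliency_map for v in row]
    mean = sum(flat) / len(flat)
    var = sum((v - mean) ** 2 for v in flat) / len(flat)
    std = var ** 0.5 or 1.0  # guard against a constant map
    return sum(
        (saliency_map[y][x] - mean) / std for x, y in fixations
    ) / len(fixations)
```

Scores like this can label which saliency features a given observer's gaze actually tracked, the kind of relationship the proposed visualization makes visible directly.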


1995 ◽  
Vol 118 (3) ◽  
pp. 280-286 ◽  
Author(s):  
James R. Hopfenbeck ◽  
Deborah S. Cowley ◽  
Allen Radant ◽  
Peter P. Roy-Byrne ◽  
David J. Greenblatt

2015 ◽  
Vol 34 (10) ◽  
pp. 915-922 ◽  
Author(s):  
M. P. Bijman ◽  
J. J. Fisher ◽  
L. A. Vallis

Author(s):  
Ryan P. Shaw ◽  
David M. Bevly

This paper presents a new approach for the guidance and control of an unmanned ground vehicle (UGV). An obstacle avoidance algorithm was developed using an integrated system involving proportional navigation (PN) and a nonlinear model predictive controller (NMPC). An obstacle avoidance variant of the classical proportional navigation law generates commanded lateral accelerations to avoid obstacles, while the NMPC, which utilizes a lateral vehicle dynamic model, tracks the reference trajectory given by the PN. Obstacle avoidance has become a popular area of research for both unmanned aerial vehicles and unmanned ground vehicles; in this application, an obstacle avoidance algorithm can take over control of a vehicle until the obstacle is no longer a threat. The performance of the obstacle avoidance algorithm is evaluated through simulation, and the results show the approach to be promising for conditionally engaged obstacle avoidance.
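The classical PN law referenced above commands lateral acceleration proportional to the line-of-sight (LOS) rotation rate. A planar sketch of the standard formulation (illustrative; the paper's avoidance variant modifies this law, and the navigation gain here is an assumed value):

```python
def los_rate(rx, ry, vx, vy):
    """Line-of-sight rotation rate (rad/s) in the plane.

    (rx, ry): obstacle position relative to the vehicle
    (vx, vy): obstacle velocity relative to the vehicle
    Derivative of atan2(ry, rx): (rx*vy - ry*vx) / |r|^2.
    """
    return (rx * vy - ry * vx) / (rx * rx + ry * ry)

def pn_command(nav_gain, closing_velocity, lam_dot):
    """Classical PN: a_cmd = N * Vc * lambda_dot.

    Lateral acceleration command proportional to the LOS rate;
    a typical navigation gain N is 3-5.
    """
    return nav_gain * closing_velocity * lam_dot
```

In the avoidance setting the sign is arranged to rotate the LOS *away* from the obstacle rather than onto it, and the resulting lateral-acceleration reference is what the NMPC tracks.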


2018 ◽  
Vol 71 (9) ◽  
pp. 1860-1872 ◽  
Author(s):  
Stephen RH Langton ◽  
Alex H McIntyre ◽  
Peter JB Hancock ◽  
Helmut Leder

Research has established that a perceived eye gaze produces a concomitant shift in a viewer’s spatial attention in the direction of that gaze. The two experiments reported here investigate the extent to which the nature of the eye movement made by the gazer contributes to this orienting effect. On each trial in these experiments, participants were asked to make a speeded response to a target that could appear in a location toward which a centrally presented face had just gazed (a cued target) or in a location that was not the recipient of a gaze (an uncued target). The gaze cues consisted of either fast saccadic eye movements or slower smooth pursuit movements. Cued targets were responded to faster than uncued targets, and this gaze-cued orienting effect was found to be equivalent for each type of gaze shift, both when the gazes were unpredictive of target location (Experiment 1) and when they were counterpredictive of it (Experiment 2). The results offer no support for the hypothesis that motion speed modulates gaze-cued orienting. However, they do suggest that motion of the eyes per se, regardless of the type of movement, may be sufficient to trigger an orienting effect.
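The orienting effect reported here is the standard cueing-effect measure: the difference between mean reaction times to uncued and cued targets. A minimal sketch of that computation (illustrative, not the authors' analysis code):

```python
def cueing_effect(rt_cued, rt_uncued):
    """Gaze-cueing effect in the same units as the input RTs.

    rt_cued:   reaction times (e.g. ms) for cued-target trials
    rt_uncued: reaction times for uncued-target trials
    A positive value means cued targets were responded to faster,
    i.e. attention shifted toward the gazed-at location.
    """
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rt_uncued) - mean(rt_cued)
```

Comparing this quantity across the saccade and smooth-pursuit cue conditions is what shows the two movement types produce equivalent orienting.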

