The Influence of Restricted Viewing Conditions on Egocentric Distance Perception: Implications for Real and Virtual Indoor Environments

Perception ◽  
10.1068/p5144 ◽  
2005 ◽  
Vol 34 (2) ◽  
pp. 191-204 ◽  
Author(s):  
Sarah H Creem-Regehr ◽  
Peter Willemsen ◽  
Amy A Gooch ◽  
William B Thompson
2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Bo Dong ◽  
Airui Chen ◽  
Yuting Zhang ◽  
Yangyang Zhang ◽  
Ming Zhang ◽  
...  

Abstract
Inaccurate egocentric distance perception and inaccurate speed perception are the two main explanations for the high accident rate associated with driving in foggy weather. The effect of fog on perceived speed has been well studied; its effect on egocentric distance perception, however, is poorly understood. Previous studies measured perceived egocentric distance with verbal estimation rather than a nonverbal paradigm. In the current research, a nonverbal paradigm, the visual matching task, was used. Our results from the nonverbal task revealed a robust fog effect on egocentric distance: observers overestimated egocentric distance in foggy weather compared with clear weather, and the higher the fog concentration, the more severe the overestimation. This effect was not limited to a particular distance range but held in both action space and vista space. Our findings confirm the fog effect with a nonverbal paradigm and suggest that people's perception of egocentric distance in foggy weather may be captured more "accurately" with a nonverbal measure than with a verbal estimation task.


2021 ◽  
Vol 18 (2) ◽  
pp. 1-16
Author(s):  
Holly C. Gagnon ◽  
Carlos Salas Rosales ◽  
Ryan Mileris ◽  
Jeanine K. Stefanucci ◽  
Sarah H. Creem-Regehr ◽  
...  

Augmented reality (AR) is important for training complex tasks, such as navigation, assembly, and medical procedures. The effectiveness of such training may depend on accurate spatial localization of AR objects in the environment. This article presents two experiments that test egocentric distance perception in augmented reality within and at the boundaries of action space (up to 35 m) in comparison with distance perception in a matched real-world (RW) environment. Using the Microsoft HoloLens, in Experiment 1, participants in two different RW settings judged egocentric distances (ranging from 10 to 35 m) to an AR avatar or a real person using a visual matching measure. Distances to augmented targets were underestimated compared to real targets in the two indoor, RW contexts. Experiment 2 aimed to generalize the results to an absolute distance measure using verbal reports in one of the indoor environments. Similar to Experiment 1, distances to augmented targets were underestimated compared to real targets. We discuss these findings with respect to the importance of methodologies that directly compare performance in real and mediated environments, as well as the inherent differences present in mediated environments that are "matched" to the real world.


2012 ◽  
Vol 9 (4) ◽  
pp. 1-17 ◽  
Author(s):  
Marc Rébillat ◽  
Xavier Boutillon ◽  
Étienne Corteel ◽  
Brian F. G. Katz

Displays ◽  
2013 ◽  
Vol 34 (2) ◽  
pp. 153-164 ◽  
Author(s):  
Ivelina V. Piryankova ◽  
Stephan de la Rosa ◽  
Uwe Kloos ◽  
Heinrich H. Bülthoff ◽  
Betty J. Mohler

Perception ◽  
2020 ◽  
Vol 49 (9) ◽  
pp. 940-967
Author(s):  
Ilja T. Feldstein ◽  
Felix M. Kölsch ◽  
Robert Konrad

Virtual reality systems are a popular tool in behavioral sciences. The participants’ behavior is, however, a response to cognitively processed stimuli. Consequently, researchers must ensure that virtually perceived stimuli resemble those present in the real world to ensure the ecological validity of collected findings. Our article provides a literature review relating to distance perception in virtual reality. Furthermore, we present a new study that compares verbal distance estimates within real and virtual environments. The virtual space—a replica of a real outdoor area—was displayed using a state-of-the-art head-mounted display. Investigated distances ranged from 8 to 13 m. Overall, the results show no significant difference between egocentric distance estimates in real and virtual environments. However, a more in-depth analysis suggests that the order in which participants were exposed to the two environments may affect the outcome. Furthermore, the study suggests that a rising experience of immersion leads to an alignment of the estimated virtual distances with the real ones. The results also show that the discrepancy between estimates of real and virtual distances increases with the incongruity between virtual and actual eye heights, demonstrating the importance of an accurately set virtual eye height.


2021 ◽  
Author(s):  
Rebecca L. Hornsey ◽  
Paul B. Hibbard

Abstract
We assessed the contribution of binocular disparity and the pictorial cues of linear perspective, texture, and scene clutter to the perception of distance in consumer virtual reality. As additional cues are made available, distance perception is predicted to improve, as measured by a reduction in systematic bias and an increase in precision. We assessed (1) whether space is nonlinearly distorted; (2) the degree of size constancy across changes in distance; and (3) the weighting of pictorial versus binocular cues in VR. In the first task, participants positioned two spheres so as to divide the egocentric distance to a reference stimulus (presented between 3 and 11 m) into three equal thirds. In the second and third tasks, participants set the size of a sphere, presented at the same distances and at eye height, to match that of a hand-held football. Each task was performed in four environments varying in the available cues. We measured accuracy by identifying systematic biases in responses, and precision as the standard deviation of these responses. While there was no evidence of nonlinear compression of space, participants did tend to underestimate distance linearly, but this bias was reduced with the addition of each cue. The addition of binocular cues, when rich pictorial cues were already available, reduced both the bias and the variability of estimates. These results show that linear perspective and binocular cues, in particular, improve the accuracy and precision of distance estimates in virtual reality across a range of distances typical of many indoor environments.
