P-44: Low Resistance Optical Transparent Electrode for Augmented Reality Near-Eye Display and Eye Tracking Module

2021, Vol 52 (1), pp. 1224-1227
Author(s): Ming-Jaan Ho, Cloud F.Y. Shen, Tina H.T. Hsu, Wen-Zhu Wei
Sensors, 2021, Vol 21 (6), pp. 2234
Author(s): Sebastian Kapp, Michael Barz, Sergey Mukhametov, Daniel Sonntag, Jochen Kuhn

Currently, an increasing number of head-mounted displays (HMDs) for virtual and augmented reality (VR/AR) are equipped with integrated eye trackers. Use cases of these integrated eye trackers include rendering optimization and gaze-based user interaction. In addition, visual attention in VR and AR is of interest for applied eye-tracking research, for example in the cognitive or educational sciences. While some research toolkits for VR already exist, only a few target AR scenarios. In this work, we present an open-source eye tracking toolkit for reliable gaze data acquisition in AR based on Unity 3D and the Microsoft HoloLens 2, as well as an R package for seamless data analysis. Furthermore, we evaluate the spatial accuracy and precision of the integrated eye tracker for fixation targets at different distances and angles relative to the user (n = 21). On average, we found that gaze estimates are reported with an angular accuracy of 0.83 degrees and a precision of 0.27 degrees while the user is resting, which is on par with state-of-the-art mobile eye trackers.
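The accuracy and precision figures above can be computed from raw gaze samples. As a minimal sketch (not the toolkit's actual API; the function names and the choice of RMS sample-to-sample distance as the precision measure are assumptions), angular accuracy is the mean angular offset between each gaze direction and the true target direction, and precision reflects the dispersion of successive samples during a fixation:

```python
import numpy as np

def angle_deg(u, v):
    # Angle between two 3-D direction vectors, in degrees.
    u = u / np.linalg.norm(u)
    v = v / np.linalg.norm(v)
    return np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0)))

def accuracy_precision(gaze_dirs, target_dir):
    """gaze_dirs: (N, 3) gaze direction samples for one fixation target;
    target_dir: (3,) true direction from the eye to the target.
    Returns (accuracy, precision) in degrees."""
    # Accuracy: mean angular offset from the target direction.
    offsets = np.array([angle_deg(g, target_dir) for g in gaze_dirs])
    accuracy = offsets.mean()
    # Precision: RMS of successive sample-to-sample angular distances.
    s2s = np.array([angle_deg(a, b)
                    for a, b in zip(gaze_dirs[:-1], gaze_dirs[1:])])
    precision = np.sqrt(np.mean(s2s ** 2))
    return accuracy, precision
```

A perfectly steady gaze aligned with the target yields zero for both measures; real recordings yield values like the 0.83°/0.27° reported above.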


Information, 2021, Vol 12 (6), pp. 226
Author(s): Lisa-Marie Vortmann, Leonid Schwenke, Felix Putze

Augmented reality is the fusion of virtual components with our real surroundings. The simultaneous visibility of generated and natural objects often requires users to direct their selective attention to a specific target that is either real or virtual. In this study, we investigated whether the attended target is real or virtual by using machine learning techniques to classify electroencephalographic (EEG) and eye-tracking data collected in augmented reality scenarios. A shallow convolutional neural network classified 3-second EEG data windows from 20 participants in a person-dependent manner with an average accuracy above 70% when the training and testing data came from different trials. This accuracy could be significantly increased to 77% using a multimodal late-fusion approach that incorporated the recorded eye-tracking data. Person-independent EEG classification was possible above chance level for 6 out of 20 participants. Thus, the reliability of such a brain–computer interface is high enough for it to be treated as a useful input mechanism for augmented reality applications.
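Late fusion combines the per-class probability outputs of the unimodal classifiers rather than their raw features. A minimal sketch, assuming each classifier emits an (n_windows, n_classes) probability matrix and a weighted average as the fusion rule (the study's exact fusion scheme is not specified here, so the weights and function name are assumptions):

```python
import numpy as np

def late_fusion(p_eeg, p_eye, w_eeg=0.5):
    """Fuse per-class probabilities from an EEG classifier and an
    eye-tracking classifier by weighted averaging.
    p_eeg, p_eye: (n_windows, n_classes) probability matrices.
    Returns fused class decisions, one per window."""
    fused = w_eeg * p_eeg + (1.0 - w_eeg) * p_eye
    return fused.argmax(axis=1)  # pick the most probable class per window
```

Because fusion happens at the decision level, each modality can be trained and calibrated independently; the eye-tracking stream simply re-weights the EEG classifier's output, which is how the combined model can exceed the EEG-only accuracy.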


Author(s): H. Serhat Cerci, A. Selcuk Koyluoglu

The purpose of this chapter is to measure where and how consumers focus within an advertising brochure, which visuals are most striking, and how much eye strain (twitching) occurs, by recording gaze density and visual attention with an eye-tracking device during individual examination. An experimental neuromarketing research laboratory was used for this study. After participants viewed the videos and images in the eye-tracking module, general evaluations were collected to determine what they remembered, which made comparison possible. According to the findings, logos and photographs are more effective than text; viewers read large text and skip small text. Suggestions for future research are presented in the chapter.


2020, Vol 44 (11)
Author(s): Shang Lu, Yerly Paola Sanchez Perdomo, Xianta Jiang, Bin Zheng

2020, Vol 20 (24), pp. 15204-15212
Author(s): Johannes Meyer, Thomas Schlebusch, Wolfgang Fuhl, Enkelejda Kasneci

2019, Vol 26 (4), pp. 208-222
Author(s): Sheree Josephson, Melina Myers

2020, Vol 28 (20), pp. 29788
Author(s): Jin-ho Lee, Igor Yanusik, Yoonsun Choi, Byongmin Kang, Chansol Hwang, ...
