Using Microsoft HoloLens to improve memory recall in anatomy and physiology: A pilot study to examine the efficacy of using augmented reality in education

Author(s):  
Chen Chen ◽  
Lei Zhang ◽  
Tony Luczak ◽  
Eboni Smith ◽  
Reuben F. Burch


Author(s):  
Christen E. Sushereba ◽  
Laura G. Militello

In this session, we will demonstrate the Virtual Patient Immersive Trainer (VPIT). The VPIT system uses augmented reality (AR) to allow medics and medical students to experience a photorealistic, life-sized virtual patient. The VPIT supports learners in obtaining the perceptual skills required to recognize and interpret subtle perceptual cues critical to assessing a patient’s condition. We will conduct an interactive demonstration of the virtual patient using both a tablet (for group interaction) and an AR-enabled headset (Microsoft HoloLens) for individual interaction. In addition, we will demonstrate use of the instructor tablet to control what the learner sees (e.g., injury types, severity of injury) and to monitor student performance.


Sensors ◽  
2021 ◽  
Vol 21 (6) ◽  
pp. 2234
Author(s):  
Sebastian Kapp ◽  
Michael Barz ◽  
Sergey Mukhametov ◽  
Daniel Sonntag ◽  
Jochen Kuhn

Currently, an increasing number of head-mounted displays (HMDs) for virtual and augmented reality (VR/AR) are equipped with integrated eye trackers. Use cases of these integrated eye trackers include rendering optimization and gaze-based user interaction. In addition, visual attention in VR and AR is of interest for applied eye-tracking research in the cognitive and educational sciences, for example. While some research toolkits for VR already exist, only a few target AR scenarios. In this work, we present an open-source eye tracking toolkit for reliable gaze data acquisition in AR based on Unity 3D and the Microsoft HoloLens 2, as well as an R package for seamless data analysis. Furthermore, we evaluate the spatial accuracy and precision of the integrated eye tracker for fixation targets at different distances and angles to the user (n=21). On average, we found that gaze estimates are reported with an angular accuracy of 0.83 degrees and a precision of 0.27 degrees while the user is resting, which is on par with state-of-the-art mobile eye trackers.
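The accuracy and precision figures quoted above are standard eye-tracking metrics: accuracy is the mean angular offset between gaze samples and the known target, and precision is the sample-to-sample angular dispersion (here the RMS variant). A minimal sketch of these computations, using hypothetical toy gaze vectors rather than the toolkit's actual API or data:

```python
import math

def angle_deg(u, v):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def accuracy_deg(gaze_dirs, target_dir):
    """Accuracy: mean angular offset of gaze samples from the true target."""
    return sum(angle_deg(g, target_dir) for g in gaze_dirs) / len(gaze_dirs)

def precision_rms_deg(gaze_dirs):
    """Precision: RMS of successive sample-to-sample angular differences."""
    diffs = [angle_deg(a, b) for a, b in zip(gaze_dirs, gaze_dirs[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Toy fixation: four samples scattered ~1 degree around a straight-ahead target.
target = (0.0, 0.0, 1.0)
samples = [(0.017, 0.0, 1.0), (0.0, 0.017, 1.0),
           (-0.017, 0.0, 1.0), (0.0, -0.017, 1.0)]
print(round(accuracy_deg(samples, target), 2))
print(round(precision_rms_deg(samples), 2))
```

In a real evaluation these statistics would be aggregated per fixation target and then averaged across the participants, as in the study's n=21 protocol.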


2021 ◽  
Vol 18 (2) ◽  
pp. 1-16
Author(s):  
Holly C. Gagnon ◽  
Carlos Salas Rosales ◽  
Ryan Mileris ◽  
Jeanine K. Stefanucci ◽  
Sarah H. Creem-Regehr ◽  
...  

Augmented reality (AR) is important for training complex tasks, such as navigation, assembly, and medical procedures. The effectiveness of such training may depend on accurate spatial localization of AR objects in the environment. This article presents two experiments that test egocentric distance perception in augmented reality within and at the boundaries of action space (up to 35 m) in comparison with distance perception in a matched real-world (RW) environment. Using the Microsoft HoloLens, in Experiment 1, participants in two different RW settings judged egocentric distances (ranging from 10 to 35 m) to an AR avatar or a real person using a visual matching measure. Distances to augmented targets were underestimated compared to real targets in the two indoor, RW contexts. Experiment 2 aimed to generalize the results to an absolute distance measure using verbal reports in one of the indoor environments. Similar to Experiment 1, distances to augmented targets were underestimated compared to real targets. We discuss these findings with respect to the importance of methodologies that directly compare performance in real and mediated environments, as well as the inherent differences present in mediated environments that are “matched” to the real world.
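Underestimation in studies like this is typically summarized as the mean signed error of judged distance relative to actual distance. A hedged sketch with made-up numbers (not the study's data) for targets in the 10–35 m range described above:

```python
def underestimation_percent(judged, actual):
    """Mean signed error as a percentage of actual distance.
    Negative values indicate underestimation."""
    errors = [(j - a) / a * 100.0 for j, a in zip(judged, actual)]
    return sum(errors) / len(errors)

# Hypothetical verbal distance reports (metres) for six target distances.
actual = [10, 15, 20, 25, 30, 35]
judged = [9.0, 13.5, 17.0, 21.0, 26.0, 30.0]
print(round(underestimation_percent(judged, actual), 1))  # negative → underestimated
```

Computing the error per trial before averaging (rather than averaging distances first) keeps near and far targets on an equal footing, which matters when the tested range spans 10 to 35 m.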


2021 ◽  
pp. 103841
Author(s):  
Hisham Iqbal ◽  
Fabio Tatti ◽  
Ferdinando Rodriguez y Baena

2021 ◽  
Vol 82 (4) ◽  
pp. 186
Author(s):  
Kathleen Phillips ◽  
Valerie A. Lynn ◽  
Amie Yenser ◽  
Christina Wissinger

Current teaching practice in undergraduate higher education anatomy and physiology courses incorporates the use of various instructional methodologies to reinforce the anatomical relationships between structures.1,2 These methods can include basic hands-on physical models, human and animal dissection labs, and interactive technology. Technological advances continue to drive the production of innovative anatomy and physiology electronic tools, including: virtual dissection in 3-D (e.g., Virtual Dissection Boards from Anatomage, 3D4Medical, and Anatomy.TV), augmented reality (AR) (e.g., Human Anatomy Atlas), mixed reality (e.g., the Microsoft HoloLens Case Western Reserve Medical School and Cleveland Clinic digital anatomy app), and 3-D virtual reality (VR) (e.g., 3D Organon VR Anatomy and YOU by Sharecare apps).


PEDIATRICS ◽  
2021 ◽  
Vol 147 (3) ◽  
pp. e2020005009
Author(s):  
Patricia L. Dias ◽  
Rachel G. Greenberg ◽  
Ronald N. Goldberg ◽  
Kimberley Fisher ◽  
David T. Tanaka

2019 ◽  
Vol 9 (11) ◽  
pp. 2225 ◽  
Author(s):  
Francesco Osti ◽  
Gian Maria Santi ◽  
Gianni Caligiana

In this paper, we present a solution for photorealistic ambient-light rendering of holograms in dynamic real scenes for augmented reality applications. Based on Microsoft HoloLens, we achieved this result with an Image-Based Lighting (IBL) approach. The real-time image capture we designed is able to automatically locate and position directional lights, providing the right illumination for the holograms. We also implemented a negative “shadow drawing” shader that contributes to the final photorealistic and immersive effect of holograms in real life. The main focus of this research was to achieve superior photorealism through the combination of real-time light placement and the negative “shadow drawing” shader. The solution was evaluated in various augmented reality case studies, from classical ones (using the Vuforia Toolkit) to innovative applications (using HoloLens).
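Locating a directional light from a captured image, as described above, is often done with a brightest-region heuristic: threshold the luminance, take the weighted centroid of the bright region, and map its pixel offset from the image centre to an angular light direction. A hypothetical sketch of that idea (not the authors' implementation, which runs on live HoloLens camera frames):

```python
def dominant_light_direction(luma, fov_deg=90.0):
    """Estimate a directional light from a 2D luminance grid:
    luminance-weighted centroid of the brightest pixels, mapped to
    yaw/pitch offsets (degrees) from the camera's forward axis."""
    h, w = len(luma), len(luma[0])
    peak = max(max(row) for row in luma)
    threshold = 0.9 * peak  # keep only the brightest region
    wsum = x_sum = y_sum = 0.0
    for y, row in enumerate(luma):
        for x, v in enumerate(row):
            if v >= threshold:
                wsum += v
                x_sum += v * x
                y_sum += v * y
    cx, cy = x_sum / wsum, y_sum / wsum
    # Map the centroid's offset from the image centre to angles.
    yaw = (cx - (w - 1) / 2.0) / w * fov_deg
    pitch = ((h - 1) / 2.0 - cy) / h * fov_deg
    return yaw, pitch

# Toy 5x5 luminance grid with a bright patch in the upper-right corner.
grid = [[0.1] * 5 for _ in range(5)]
grid[0][4] = grid[1][4] = 1.0
yaw, pitch = dominant_light_direction(grid)
```

A production IBL pipeline would also estimate the light's intensity and colour from the same capture, and the negative shadow shader would then darken real surfaces where that light is occluded by the hologram.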


2020 ◽  
Vol 7 ◽  
Author(s):  
David Asgar-Deen ◽  
Jay Carriere ◽  
Ericka Wiebe ◽  
Lashan Peiris ◽  
Aalo Duha ◽  
...  

2020 ◽  
Vol 49 (1) ◽  
pp. 287-298
Author(s):  
Peng Liu ◽  
Chenmeng Li ◽  
Changlin Xiao ◽  
Zeshu Zhang ◽  
Junqi Ma ◽  
...  
