How does mixed reality affect quiet stance?

2017 ◽  
Author(s):  
Gaiqing Kong ◽  
Kunlin Wei ◽  
Konrad P. Kording

Abstract. Mixed reality (MR) holds promise for learning, design, entertainment, and everyday use. However, when we interact with objects in mixed reality, will moving objects perturb our postural stability or even make us fall? To address this question, we recruited participants, instructed them to stand quietly, and measured how much virtual objects presented in mixed reality (Microsoft HoloLens) affected their stance. We analyzed the effects of a solid object and of text, in both static and dynamic settings. Mixed reality events induced some movement, but the effect, while significant, was exceptionally small (perturbations of <1 mm in mean distance and <0.5° in rotation angle). We conclude that induced movement in “real reality” should not be too much of a concern when designing mixed reality applications.
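The two sway metrics reported above (mean distance and mean rotation angle) can be sketched from tracked head-pose samples as follows; the function, its inputs, and the quaternion convention are illustrative assumptions, not the authors' analysis code:

```python
import numpy as np

def sway_metrics(positions, quaternions):
    """Summarize quiet-stance perturbation from tracked head poses.

    positions: (N, 3) array of head positions in metres.
    quaternions: (N, 4) array of unit quaternions (w, x, y, z).
    Returns the mean distance from the mean position (mm) and the
    mean absolute rotation angle from the first sample (degrees).
    """
    centred = positions - positions.mean(axis=0)
    mean_dist_mm = np.linalg.norm(centred, axis=1).mean() * 1000.0

    # Angle of each quaternion relative to the reference sample:
    # theta = 2 * arccos(|q_ref . q_i|) for unit quaternions.
    q_ref = quaternions[0] / np.linalg.norm(quaternions[0])
    q = quaternions / np.linalg.norm(quaternions, axis=1, keepdims=True)
    dots = np.clip(np.abs(q @ q_ref), -1.0, 1.0)
    mean_angle_deg = np.degrees(2.0 * np.arccos(dots)).mean()
    return mean_dist_mm, mean_angle_deg
```

A perfectly still participant (identical poses throughout) would yield 0 mm and 0° under this definition, so both metrics read directly as perturbation magnitudes.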

2021 ◽  
Vol 1 ◽  
pp. 2107-2116
Author(s):  
Agnese Brunzini ◽  
Alessandra Papetti ◽  
Michele Germani ◽  
Erica Adrario

Abstract. In the medical education field, highly sophisticated simulators and extended reality (XR) simulations allow trainees to practice complex procedures and to acquire new knowledge and attitudes. XR is considered useful for enhancing healthcare education; however, several issues need further research. The main aim of this study is to define a comprehensive method for designing and optimizing any kind of simulator and simulation, integrating all the relevant elements of scenario design and prototype development. A complete framework for the design of any kind of advanced clinical simulation is proposed, and it has been applied to realize a mixed reality (MR) prototype for the simulation of rachicentesis. The purpose of the MR application is to immerse trainees in a more realistic environment and to put them under pressure during the simulation, as in real practice. The application was tested with two different devices: the Vox Gear Plus headset for smartphones and the Microsoft HoloLens. Eighteen sixth-year students of the Medicine and Surgery course were enrolled in the study. Results compare the user experience across the two devices and report simulation performance with the HoloLens.


2008 ◽  
Vol 02 (02) ◽  
pp. 207-233
Author(s):  
SATORU MEGA ◽  
YOUNES FADIL ◽  
ARATA HORIE ◽  
KUNIAKI UEHARA

Human-computer interaction systems developed in recent years use multimedia techniques to create mixed-reality environments in which users can train themselves. Although most of these systems interact closely with users and take users' states into account, they still do not consider users' preferences when assisting them. In this paper, we introduce an Action Support System for Interactive Self-Training (ASSIST) in cooking. ASSIST focuses on recognizing users' cooking actions, as well as the real objects involved in those actions, in order to provide accurate and useful assistance. Before the recognition and instruction processes, it takes users' cooking preferences and, by collaborative filtering, suggests one or more recipes likely to satisfy those preferences. When the cooking process starts, ASSIST recognizes the user's hand movements using a similarity-measure algorithm called AMSS. When the recognized cooking action is correct, ASSIST instructs the user on the next cooking procedure through virtual objects. When a cooking action is incorrect, the cause of the failure is analyzed, and ASSIST provides support information according to that cause so the user can correct the action. Furthermore, we construct parallel transition models from cooking recipes for more flexible instruction. This lets users perform the necessary cooking actions in any order they want, allowing more flexible learning.
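The recipe-suggestion step described above can be sketched as user-based collaborative filtering over a ratings matrix; the matrix layout, the cosine-similarity choice, and the function name are assumptions for illustration, not ASSIST's actual implementation:

```python
import numpy as np

def suggest_recipes(ratings, user, k=2, n=3):
    """User-based collaborative filtering over a user x recipe matrix.

    ratings: (U, R) array; 0 means "not yet rated".
    Returns the indices of the top-n unrated recipes for `user`,
    scored by a similarity-weighted average of the k nearest users.
    """
    target = ratings[user]
    # Cosine similarity between the target user and every user.
    norms = np.linalg.norm(ratings, axis=1) * np.linalg.norm(target)
    sims = ratings @ target / np.where(norms == 0, 1.0, norms)
    sims[user] = -1.0  # exclude the target user from the neighbourhood

    neighbours = np.argsort(sims)[-k:]  # k most similar users
    w = sims[neighbours]

    # Predicted score: similarity-weighted mean of neighbour ratings.
    scores = w @ ratings[neighbours] / max(w.sum(), 1e-9)
    scores[target > 0] = -np.inf  # only recommend unrated recipes
    return [int(i) for i in np.argsort(scores)[::-1][:n]]
```

With this scheme, a recipe that similar users rate highly but the target user has not yet tried rises to the top of the suggestion list.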


2019 ◽  
Vol 2 ◽  
pp. 1-7
Author(s):  
Mathias Jahnke ◽  
Edyta P. Bogucka ◽  
Maria Turchenko

<p><strong>Abstract.</strong> Mixed reality is a rather new technology, but it owes its current success to the availability of devices such as the Microsoft HoloLens, which make the medium easy for users and developers to work with. Visualization specialists such as cartographers have consequently paid attention to the interaction possibilities these devices provide and to the wide range of opportunities they open up. Their applicability within the cartographic domain, however, needs further investigation.</p><p>The main goal of this contribution is to evaluate the applicability of a mixed reality device to spatio-temporal representations, using a space-time cube that shows cultural landscape changes as an example. The hologram of the space-time cube presents the changes of the Royal Castle in Warsaw and its surrounding elements. It incorporates the different buildings of the castle, space-time prisms, and space-time links that connect building elements over the years. The visual variables colour hue, colour value, and transparency are mainly used to make the space-time prisms distinguishable and to show the space-time links. Different colour schemes were developed to suit the characteristics of a mixed reality device. The possible input actions range from gaze/head movement to gesture and voice.</p><p>The usability evaluation of the mixed reality hologram assessed the overall comfort of the interactions and the perception of the visual components of the space-time cube, and identified advantageous features and limitations of the technology. Most of the limitations found are tied to current devices, e.g. their resolution or field of view. One important finding is that the experience a user has with such devices and technology plays an important role in successfully using such applications and discovering knowledge with them.</p>


2021 ◽  
Vol 82 (4) ◽  
pp. 186
Author(s):  
Kathleen Phillips ◽  
Valerie A. Lynn ◽  
Amie Yenser ◽  
Christina Wissinger

Current teaching practice in undergraduate higher education anatomy and physiology courses incorporates various instructional methodologies to reinforce the anatomical relationships between structures.1,2 These methods can include basic hands-on physical models, human and animal dissection labs, and interactive technology. Technological advances continue to drive the production of innovative anatomy and physiology electronic tools, including: virtual dissection in 3-D (e.g., Virtual Dissection Boards from Anatomage, 3D4Medical, and Anatomy.TV); augmented reality (AR) (e.g., Human Anatomy Atlas); mixed reality (e.g., the Microsoft HoloLens digital anatomy app from Case Western Reserve Medical School and the Cleveland Clinic); and 3-D virtual reality (VR) (e.g., the 3D Organon VR Anatomy and YOU by Sharecare apps).


2011 ◽  
Vol 12 (8) ◽  
pp. 911-919 ◽  
Author(s):  
Rogério Pessoto Hirata ◽  
Ulysses Fernandes Ervilha ◽  
Lars Arendt-Nielsen ◽  
Thomas Graven-Nielsen

2020 ◽  
Vol 10 (16) ◽  
pp. 5436 ◽  
Author(s):  
Dong-Hyun Kim ◽  
Yong-Guk Go ◽  
Soo-Mi Choi

A drone must be able to fly without colliding, both to preserve its surroundings and for its own safety. It must also offer numerous features of interest to drone users. In this paper, an aerial mixed-reality environment for first-person-view drone flying is proposed to provide an immersive experience and a safe environment for drone users by creating additional virtual obstacles when flying a drone in an open area. The proposed system is effective in perceiving the depth of obstacles, and it enables bidirectional interaction between the real and virtual worlds using a drone equipped with a stereo camera modeled on human binocular vision. In addition, it synchronizes the parameters of the real and virtual cameras to create virtual objects in real space effectively and naturally. Based on user studies that included both general and expert users, we confirm that the proposed system successfully creates a mixed-reality environment with a flying drone by quickly recognizing real objects and stably combining them with virtual objects.
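Synchronizing the real and virtual cameras, as described above, typically means giving the virtual camera the same frustum as the physical one. A minimal sketch, assuming pinhole intrinsics and an OpenGL-style clip space (the parameter values and sign conventions are placeholders, not the paper's actual calibration):

```python
import numpy as np

def intrinsics_to_projection(fx, fy, cx, cy, width, height, near, far):
    """Build an OpenGL-style 4x4 projection matrix from pinhole
    camera intrinsics (focal lengths fx/fy and principal point cx/cy
    in pixels), so a virtual camera renders with the same frustum as
    the real one. Axis conventions vary between renderers; this
    assumes a camera looking down -Z with Y up.
    """
    return np.array([
        [2 * fx / width, 0.0, 1.0 - 2 * cx / width, 0.0],
        [0.0, 2 * fy / height, 2 * cy / height - 1.0, 0.0],
        [0.0, 0.0, -(far + near) / (far - near),
         -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ])
```

When the matrix is correct, a point on the near plane projects to normalized-device-coordinate depth -1, which is an easy sanity check before wiring the matrix into a renderer.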


2019 ◽  
Vol 30 (1) ◽  
pp. 173-178 ◽  
Author(s):  
M. Stojanovska ◽  
G. Tingle ◽  
L. Tan ◽  
L. Ulrey ◽  
S. Simonson-Shick ◽  
...  
