Device Logic and Program Structures in Mixed Reality Simulation Environment

Author(s):  
B. Eisen ◽  
M. Jansen ◽  
U. Hoppe
Author(s):  
Elias B. Sayah ◽  
Kishor P

This research presented a practically achievable scenario of a smart warehouse receiving smart containers for efficient, technology-assisted inspection. The research had four objectives: review the role of augmented and virtual realities in construction warehousing; define a scenario in an example construction warehousing layout and create a modelling and simulation environment in Blender; simulate the scenario in interactive mode and record the features and experiences; and make recommendations for its practical implementation. A 3D model of an example container-inspection scenario was created in Blender, and its Augmented Reality (AR) functionalities were configured for a simulated experience. An example process of inspecting containers offloaded in a temporary construction warehouse was walked through within the 3D model, and the key features and experiences were recorded.


Author(s):  
Kelly A. Burke ◽  
Tal Oron-Gilad ◽  
Gareth Conway ◽  
Peter A. Hancock

The current dismounted soldier, and the soldier of the future, will be “loaded” with additional information-processing tasks while performing shooting tasks. It is conceivable that some increased level of cognitive tasking may be performed simultaneously with required shooting tasks. The current study was conducted in a high-fidelity mixed reality simulation environment (SAST-II). The study was designed to examine the soldier's ability to perform friend-foe target discrimination and maintain shooting accuracy under varying target exposure times, friendly target signatures, and cognitive load demands. Analysis of variance revealed significant differences in the memory recall task between shooting and non-shooting conditions. Furthermore, results showed that workload increased as a function of task demand, with associated decreases in shooting performance.
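As an illustration only (not the authors' analysis or data), a one-way analysis of variance of the kind mentioned above reduces to comparing between-group and within-group mean squares. A minimal sketch with invented recall scores for hypothetical shooting and non-shooting conditions:

```python
import numpy as np

def one_way_anova_f(*groups):
    """One-way ANOVA F statistic: between-group MS / within-group MS."""
    all_data = np.concatenate(groups)
    grand_mean = all_data.mean()
    k = len(groups)                       # number of conditions
    n_total = all_data.size
    # Sum of squares between conditions and within conditions
    ss_between = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between, df_within = k - 1, n_total - k
    return (ss_between / df_between) / (ss_within / df_within)

rng = np.random.default_rng(7)
# Hypothetical recall accuracy (proportion correct); means and spreads are invented
recall_shooting = rng.normal(0.62, 0.08, 20)      # dual-task (shooting) condition
recall_non_shooting = rng.normal(0.78, 0.08, 20)  # single-task condition
print(f"F = {one_way_anova_f(recall_shooting, recall_non_shooting):.2f}")
```

The resulting F would then be compared against the F distribution with (k − 1, N − k) degrees of freedom to judge significance; the study's actual factors, sample sizes, and effect sizes are not reported in this abstract.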


Author(s):  
Jacqueline A. Towson ◽  
Matthew S. Taylor ◽  
Diana L. Abarca ◽  
Claire Donehower Paul ◽  
Faith Ezekiel-Wilder

Purpose: Communication among allied health professionals, teachers, and family members is a critical skill when addressing and providing for the individual needs of patients. Graduate students in speech-language pathology programs often have limited opportunities to practice these skills prior to or during externship placements. The purpose of this study was to investigate a mixed reality simulator as a viable option for speech-language pathology graduate students to practice interprofessional communication (IPC) skills when delivering diagnostic information to different stakeholders, compared to traditional role-play scenarios.

Method: Eighty graduate students (N = 80) completing their third semester in one speech-language pathology program were randomly assigned to one of four conditions: mixed-reality simulation with or without coaching, or role play with or without coaching. Data were collected on students' self-efficacy, IPC skills pre- and postintervention, and perceptions of the intervention.

Results: Students in the two coaching groups scored significantly higher than students in the noncoaching groups on observed IPC skills. There were no significant differences in students' self-efficacy. Students' responses on social validity measures showed that both interventions, including coaching, were acceptable and feasible.

Conclusions: Findings indicated that coaching paired with either mixed-reality simulation or role play is a viable method to target improvement of IPC skills for graduate students in speech-language pathology. These findings are particularly relevant given the recent approval for students to obtain clinical hours in simulated environments.


2006 ◽  
Author(s):  
Karim Abdel-Malek ◽  
Jasbir Arora ◽  
Jingzhou Yang ◽  
Timothy Marler ◽  
Steve Beck ◽  
...  

2019 ◽  
Vol 2019 (1) ◽  
pp. 237-242
Author(s):  
Siyuan Chen ◽  
Minchen Wei

Color appearance models have been extensively studied for characterizing and predicting the perceived color appearance of physical color stimuli under different viewing conditions. These stimuli are either surface colors reflecting illumination or self-luminous emitting radiations. With the rapid development of augmented reality (AR) and mixed reality (MR), it is critically important to understand how the color appearance of objects produced by AR and MR is perceived, especially when these objects are overlaid on the real world. In this study, nine lighting conditions, with different correlated color temperature (CCT) levels and light levels, were created in a real-world environment. Under each lighting condition, human observers adjusted the color appearance of a virtual stimulus, which was overlaid on the real-world luminous environment, until it appeared the whitest. It was found that the CCT and light level of the real-world environment significantly affected the color appearance of the white stimulus, especially when the light level was high. Moreover, a lower degree of chromatic adaptation was found when viewing the virtual stimulus overlaid on the real world.
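The "degree of chromatic adaptation" mentioned above is commonly modeled as a factor D in a von Kries-type transform (e.g., the CAT02 transform from CIECAM02), where D = 1 means complete adaptation to the adapting white and D = 0 means none. As a hedged sketch, not the study's actual model or fitted values:

```python
import numpy as np

# CAT02 matrix from CIECAM02: maps CIE XYZ to sharpened cone-like responses
M_CAT02 = np.array([
    [ 0.7328,  0.4296, -0.1624],
    [-0.7036,  1.6975,  0.0061],
    [ 0.0030,  0.0136,  0.9834],
])

def von_kries_adapt(xyz, white_src, white_dst, D=1.0):
    """Von Kries-style chromatic adaptation with degree-of-adaptation D.

    D = 1.0 fully adapts the source white to the destination white;
    D = 0.0 leaves the stimulus unchanged (no adaptation).
    """
    lms = M_CAT02 @ xyz
    lms_src = M_CAT02 @ white_src
    lms_dst = M_CAT02 @ white_dst
    # Incomplete adaptation: blend the full von Kries gain with unity gain by D
    gain = D * (lms_dst / lms_src) + (1.0 - D)
    return np.linalg.solve(M_CAT02, gain * lms)
```

A lower fitted D for virtual stimuli overlaid on the real world, as the study reports, would mean the predicted white point shifts less with the ambient illumination than for physical surfaces; the specific D values and fitting procedure are not given in this abstract.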


2017 ◽  
Author(s):  
Dirk Schart ◽  
Nathaly Tschanz

Author(s):  
S Leinster-Evans ◽  
J Newell ◽  
S Luck

This paper looks to expand on the INEC 2016 paper ‘The future role of virtual reality within warship support solutions for the Queen Elizabeth Class aircraft carriers’, presented by Ross Basketter, Craig Birchmore and Abbi Fisher from BAE Systems in May 2016, and the EAAW VII paper ‘Testing the boundaries of virtual reality within ship support’, presented by John Newell from BAE Systems and Simon Luck from BMT DSL in June 2017. BAE Systems and BMT have developed a 3D walkthrough training system that supports the teams working closely with the QEC aircraft carriers in Portsmouth; this work was presented at EAAW VII and has since been extended to demonstrate the art of the possible on Type 26. This latter piece of work is designed to explore the role of 3D immersive environments in the development and fielding of support and training solutions, across the range of support disciplines. The combined team is looking at how this digital thread leads from the design of platforms, both surface and subsurface, through build and into in-service support and training. The paper proposes ways in which this rich data could be used across the whole lifecycle of the ship, from design and development (for spatial acceptance, HazID, etc.) through to operational support and maintenance (in conjunction with big data coming off the ship, coupled with digital tech docs for maintenance procedures), using constantly developing technologies such as 3D visualisation, Virtual Reality, Augmented Reality and Mixed Reality. The drive towards gamification in the training environment, to keep younger recruits interested and to shorten course lengths, will be explored. The paper develops the options and looks at how this technology can be used and where the value proposition lies.

