Perceptual Limits of Optical See-Through Visors for Augmented Reality Guidance of Manual Tasks

2020 ◽  
Vol 67 (2) ◽  
pp. 411-419 ◽  
Author(s):  
Sara Condino ◽  
Marina Carbone ◽  
Roberta Piazza ◽  
Mauro Ferrari ◽  
Vincenzo Ferrari
Sensors ◽  
2020 ◽  
Vol 20 (6) ◽  
pp. 1612 ◽  
Author(s):  
Sara Condino ◽  
Benish Fida ◽  
Marina Carbone ◽  
Laura Cercenelli ◽  
Giovanni Badiali ◽  
...  

Augmented reality (AR) head-mounted displays (HMDs) are emerging as the most efficient output medium to support manual tasks performed under direct vision. Despite this, technological and human-factor limitations still hinder their routine use for aiding high-precision manual tasks in the peripersonal space. To overcome these limitations, this work presents the results of a user study aimed at validating, qualitatively and quantitatively, a recently developed AR platform specifically conceived for guiding complex 3D trajectory tracing tasks. The AR platform comprises a new-concept AR video see-through (VST) HMD and a dedicated software framework for the effective deployment of the AR application. In the experiments, subjects were asked to perform 3D trajectory tracing tasks on 3D-printed replicas of planar structures or more elaborate bony anatomies. The accuracy of the traced trajectories was evaluated using templates designed ad hoc to match the surface of the phantoms. The quantitative results suggest that the AR platform can be used to guide high-precision tasks: on average, more than 94% of the traced trajectories stayed within an error margin of less than 1 mm. These results confirm that the proposed AR platform can boost the profitable adoption of AR HMDs for guiding high-precision manual tasks in the peripersonal space.
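The "fraction of points within a 1 mm error margin" metric reported above can be sketched as follows. This is a minimal illustration, not the authors' evaluation code: `fraction_within_margin` is a hypothetical helper that measures, for each traced point, the distance to the nearest point of a reference trajectory, standing in for the physical templates used in the study.

```python
import numpy as np

def fraction_within_margin(traced, reference, margin_mm=1.0):
    """Fraction of traced points whose distance to the nearest
    reference point is below the error margin (1 mm in the study)."""
    traced = np.asarray(traced, dtype=float)
    reference = np.asarray(reference, dtype=float)
    # Pairwise distances: for each traced point, keep the closest reference point.
    diffs = traced[:, None, :] - reference[None, :, :]
    dists = np.linalg.norm(diffs, axis=2).min(axis=1)
    return float((dists < margin_mm).mean())

# Toy example: a straight reference line and a trace offset by 0.5 mm.
ref = np.stack([np.linspace(0, 10, 50), np.zeros(50), np.zeros(50)], axis=1)
traced = ref + np.array([0.0, 0.5, 0.0])
print(fraction_within_margin(traced, ref))  # 1.0: every point within 1 mm
```

In practice the reference would be sampled from the template surface; the nearest-neighbour distance is a common proxy for tracing error when point correspondences are unknown.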


Sensors ◽  
2020 ◽  
Vol 20 (5) ◽  
pp. 1444 ◽  
Author(s):  
Fabrizio Cutolo ◽  
Virginia Mamone ◽  
Nicola Carbonaro ◽  
Vincenzo Ferrari ◽  
Alessandro Tognetti

The increasing capability of mobile computing and graphics hardware has made possible the release of self-contained augmented reality (AR) headsets featuring efficient head-anchored tracking solutions. Ego-motion estimation based on well-established infrared tracking of markers ensures sufficient accuracy and robustness. Unfortunately, wearable visible-light stereo cameras with a short baseline, operating under uncontrolled lighting conditions, suffer from tracking failures and ambiguities in pose estimation. To improve the accuracy of optical self-tracking and its resilience to marker occlusions, degraded camera calibrations, and inconsistent lighting, this work proposes a sensor fusion approach based on Kalman filtering that integrates optical tracking data with inertial tracking data when computing motion correlation. To measure improvements in AR overlay accuracy, experiments were performed with a custom-made AR headset designed to support complex manual tasks performed under direct vision. The experimental results show that the proposed solution improves head-mounted display (HMD) tracking accuracy by one third. It also improves robustness by capturing the orientation of the target scene when some of the markers are occluded, or when the optical tracking yields unstable and/or ambiguous results owing to the limitations of head-anchored stereo tracking cameras under uncontrollable lighting conditions.
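The core idea of optical-inertial fusion via Kalman filtering can be sketched in one dimension. This is a hypothetical simplification, not the authors' filter: a single yaw angle is predicted by integrating a gyroscope rate and corrected by optical measurements when the markers are visible (`None` stands in for an occluded marker); the noise variances `q` and `r` are assumed values.

```python
def kalman_fuse(optical_yaw, gyro_rate, dt, q=0.01, r=0.05):
    """Scalar Kalman filter: gyro integration as prediction,
    optical yaw measurements (or None when occluded) as correction."""
    x, p = optical_yaw[0], 1.0        # state (yaw, rad) and its variance
    fused = [x]
    for z, w in zip(optical_yaw[1:], gyro_rate[1:]):
        # Predict: integrate the inertial rate; process noise inflates variance.
        x, p = x + w * dt, p + q
        if z is not None:             # optical markers visible: correct
            k = p / (p + r)           # Kalman gain
            x, p = x + k * (z - x), (1 - k) * p
        fused.append(x)
    return fused

# Noisy optical readings with one occlusion, head held still.
estimates = kalman_fuse([0.0, 0.1, 0.0, None, 0.05], [0.0] * 5, dt=0.01)
```

During the occluded step the filter coasts on the inertial prediction alone, which is exactly the resilience to marker occlusion described above; the full 6-DOF case replaces the scalar state with pose and the gain with a matrix.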


Author(s):  
Gianluca Rho ◽  
Alejandro Luis Callara ◽  
Sara Condino ◽  
Shadi Ghiasi ◽  
Mimma Nardelli ◽  
...  

2021 ◽  
Vol 8 (10) ◽  
pp. 131
Author(s):  
Nadia Cattari ◽  
Sara Condino ◽  
Fabrizio Cutolo ◽  
Mauro Ferrari ◽  
Vincenzo Ferrari

Augmented Reality (AR) headsets have become the most ergonomic and efficient visualization devices to support complex manual tasks performed under direct vision. Their ability to provide hands-free interaction with the augmented scene makes them ideal for manual procedures such as surgery. This study demonstrates the reliability of an AR head-mounted display (HMD), conceived for surgical guidance, in navigating in-depth high-precision manual tasks guided by a 3D ultrasound imaging system. The integration between the AR visualization system and the ultrasound imaging system provides the surgeon with real-time intra-operative information on unexposed soft tissues, spatially registered with the surrounding anatomic structures. The efficacy of the AR guidance system was quantitatively assessed in an in vitro study simulating a biopsy intervention, aimed at determining the achievable level of accuracy. In the experiments, 10 subjects were asked to perform the biopsy on four spherical lesions of decreasing size (10, 7, 5, and 3 mm). The experimental results showed that 80% of the subjects were able to successfully perform the biopsy on the 5 mm lesion, with a 2.5 mm system accuracy. The results confirm that the proposed integrated system can be used for navigation during in-depth high-precision manual tasks.
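The relationship between lesion size and biopsy success rate reported above can be explored with a simple Monte Carlo sketch. This is an illustration under assumed conditions, not the study's analysis: targeting error is modelled as an isotropic 3D Gaussian whose standard deviation `sigma_mm` is a free parameter (the study's "2.5 mm system accuracy" is not necessarily a Gaussian sigma), and a hit means the needle tip lands inside the spherical lesion.

```python
import numpy as np

rng = np.random.default_rng(0)

def hit_rate(lesion_diameter_mm, sigma_mm, n=100_000):
    """Monte Carlo estimate of the probability that a needle tip with
    isotropic 3D Gaussian targeting error lands inside a spherical lesion."""
    err = rng.normal(0.0, sigma_mm, size=(n, 3))
    inside = np.linalg.norm(err, axis=1) <= lesion_diameter_mm / 2
    return float(inside.mean())

for d in (10, 7, 5, 3):  # lesion sizes from the study, in mm
    print(d, "mm:", hit_rate(d, sigma_mm=1.5))
```

As expected, the hit probability falls sharply once the lesion radius approaches the targeting error, which matches the qualitative finding that the smallest (3 mm) lesion is the hardest target.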


ASHA Leader ◽  
2013 ◽  
Vol 18 (9) ◽  
pp. 14-14 ◽  

Amp Up Your Treatment With Augmented Reality


2003 ◽  
Vol 15 (2) ◽  
pp. 141-156 ◽  
Author(s):  
Ève Coste-Manière ◽  
Louai Adhami ◽  
Fabien Mourgues ◽  
Alain Carpentier
