A Projection-based Medical Augmented Reality System

Author(s):  
Jiann-Der Lee ◽  
Hao-Che Lee ◽  
Chung-Hung Hsieh ◽  
Chieh-Tsai Wu ◽  
Shin-Tseng Lee
2019 ◽  
Vol 9 (13) ◽  
pp. 2732 ◽  
Author(s):  
Radosław Gierwiało ◽  
Marcin Witkowski ◽  
Maciej Kosieradzki ◽  
Wojciech Lisik ◽  
Łukasz Groszkowski ◽  
...  

This paper presents a projection-based augmented-reality system (MARVIS) that supports the visualization of internal structures on the surface of a liver phantom. MARVIS has three key features: real-time tracking of the spatial relationship between the phantom and the operator’s head; monoscopic projection of internal liver structures onto the phantom surface, providing 3D perception without additional head-mounted devices; and an electronic circuit inside the phantom that assesses the accuracy of a syringe-guidance system. An initial validation was carried out by 25 medical students (12 male, 13 female; mean age, 23.12 years; SD, 1.27 years) and 3 male surgeons (mean age, 43.66 years; SD, 7.57 years). The validation results show that the ratio of failed syringe insertions fell from 50% to 30% when the MARVIS projection was used. The proposed system thus enhances a surgeon’s spatial perception of the phantom’s internal structure.
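Projecting internal structures onto the phantom surface requires mapping 3D points in the phantom’s coordinate frame into projector pixels. The abstract does not give the projection model, so the following is a minimal sketch assuming a standard calibrated pinhole model; the intrinsic matrix `K`, extrinsics `R`, `t`, and all numeric values are hypothetical, not taken from the paper.

```python
import numpy as np

def project_point(point_phantom, R, t, K):
    """Map a 3D point (phantom frame, meters) to projector pixel coords
    using a pinhole model: pixel ~ K (R x + t)."""
    p_cam = R @ point_phantom + t          # phantom frame -> projector frame
    u, v, w = K @ p_cam                    # homogeneous image coordinates
    return np.array([u / w, v / w])        # perspective divide

# Hypothetical calibration values for illustration only:
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])     # focal lengths + principal point
R = np.eye(3)                              # projector aligned with phantom axes
t = np.array([0.0, 0.0, 0.5])              # projector 0.5 m from phantom origin

pixel = project_point(np.array([0.0, 0.0, 0.0]), R, t, K)
# The phantom origin lands on the principal point: [320.0, 240.0]
```

In a head-tracked system such as the one described, `R` and `t` would be updated each frame from the tracked phantom pose, and the rendered view of the internal anatomy would be warped accordingly before projection.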


2015 ◽  
Vol 9 (1L) ◽  
pp. 97-104 ◽  
Author(s):  
Tzyh-Chyang Chang ◽  
Chung-Hung Hsieh ◽  
Chung-Hsien Huang ◽  
Ji-Wei Yang ◽  
Shih-Tseng Lee ◽  
...  

Sensors ◽  
2021 ◽  
Vol 21 (9) ◽  
pp. 3061
Author(s):  
Alice Lo Valvo ◽  
Daniele Croce ◽  
Domenico Garlisi ◽  
Fabrizio Giuliano ◽  
Laura Giarré ◽  
...  

In recent years, we have witnessed impressive advances in augmented-reality systems and computer-vision algorithms based on image processing and artificial intelligence. Thanks to these technologies, mainstream smartphones can estimate their own motion in 3D space with high accuracy. In this paper, we exploit such technologies to support the autonomous mobility of people with visual disabilities, identifying pre-defined virtual paths and providing context information, thus reducing the distance between the digital and real worlds. In particular, we present ARIANNA+, an extension of ARIANNA, a system explicitly designed for indoor and outdoor localization and navigation of visually impaired people. While ARIANNA assumes that landmarks, such as QR codes, and physical paths (colored tapes, painted lines, or tactile pavings) are deployed in the environment and recognized by the camera of a common smartphone, ARIANNA+ eliminates the need for any physical support thanks to the ARKit library, which we exploit to build a completely virtual path. Moreover, ARIANNA+ lets users interact more richly with the surrounding environment through convolutional neural networks (CNNs) trained to recognize objects or buildings, enabling access to content associated with them. By using a common smartphone as a mediation instrument with the environment, ARIANNA+ leverages augmented reality and machine learning to enhance physical accessibility. The proposed system allows visually impaired people to navigate easily in indoor and outdoor scenarios simply by loading a previously recorded virtual path; automatic guidance is then provided along the route through haptic, speech, and sound feedback.
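Guiding a user along a previously recorded virtual path comes down to comparing the device’s estimated position against the path geometry and turning the deviation into feedback. The abstract does not specify the guidance computation, so this is a minimal sketch under the assumption that the recorded path is a 2D polyline in a local floor-plane frame and that guidance is driven by the signed lateral deviation from the nearest segment; the function name, the 0.2 m corridor threshold, and the sample coordinates are all hypothetical.

```python
import numpy as np

def path_deviation(position, path):
    """Signed lateral deviation (meters) of `position` from the nearest
    segment of a 2D polyline `path` (N x 2 array of waypoints).
    Negative = left of the direction of travel, positive = right."""
    best = None
    for a, b in zip(path[:-1], path[1:]):
        ab = b - a
        # Parameter of the closest point on segment [a, b], clamped to [0, 1].
        s = np.clip(np.dot(position - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        dist = np.linalg.norm(position - (a + s * ab))
        # 2D cross product sign: which side of the segment the user is on.
        side = ab[0] * (position - a)[1] - ab[1] * (position - a)[0]
        signed = -dist if side > 0 else dist
        if best is None or abs(signed) < abs(best):
            best = signed
    return best

# Hypothetical recorded path: 5 m north, then 3 m east.
path = np.array([[0.0, 0.0], [0.0, 5.0], [3.0, 5.0]])
dev = path_deviation(np.array([0.4, 2.0]), path)   # 0.4 m right of the path
feedback = "left" if dev < -0.2 else "right" if dev > 0.2 else "on path"
```

In a system like the one described, `position` would come from the ARKit pose estimate each frame, and `feedback` would be rendered as haptic, speech, or sound cues rather than a string.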


Author(s):  
Ae Kyeong Lim ◽  
Junsun Ryu ◽  
Hong Man Yoon ◽  
Hee Chul Yang ◽  
Seok-ki Kim

2013 ◽  
Vol 60 (9) ◽  
pp. 2636-2644 ◽  
Author(s):  
Hussam Al-Deen Ashab ◽  
Victoria A. Lessoway ◽  
Siavash Khallaghi ◽  
Alexis Cheng ◽  
Robert Rohling ◽  
...  

Author(s):  
Xiang Wang ◽  
Severine Habert ◽  
Meng Ma ◽  
Chun-Hao Huang ◽  
Pascal Fallavollita ◽  
...  

2009 ◽  
Vol 5 (4) ◽  
pp. 415-422 ◽  
Author(s):  
Ramesh Thoranaghatte ◽  
Jaime Garcia ◽  
Marco Caversaccio ◽  
Daniel Widmer ◽  
Miguel A. Gonzalez Ballester ◽  
...  
