Augmented reality visualization of ultrasound images: system description, calibration, and features

Author(s):  
F. Sauer ◽  
A. Khamene ◽  
B. Bascle ◽  
L. Schimmang ◽  
F. Wenzel ◽  
...  
Author(s):  
Adrian David Cheok

In this chapter, we explore the applications of mixed reality technology for future social and physical entertainment systems. Through the case studies presented here, we show the broad and significant impact of mixed reality technology on a variety of aspects of human interactivity in entertainment. On the technological side, the systems we touch on incorporate different technologies, ranging from current mainstream ones such as GPS tracking, Bluetooth, and RFID to pioneering research in vision-based tracking, augmented reality, tangible interaction techniques, and 3D live mixed reality capture systems. We discuss each project in detail in terms of the motivations and requirements of its particular application domain, its system description and design decisions, and its future impact on the field of human social and physical entertainment.


2015 ◽  
Vol 1 (1) ◽  
pp. 196-197 ◽  
Author(s):  
Stefan Maas ◽  
Marvin Ingler ◽  
Heinrich Martin Overhoff

Abstract Ultrasound has been established as a diagnostic tool in a wide range of applications. Especially for beginners, aligning sectional images with the patient's spatial anatomy can be cumbersome. A direct view onto the patient's anatomy while inspecting ultrasound images may help overcome this unergonomic examination. To address these issues, an affordable augmented reality system using smart glasses was created that displays a (virtual) ultrasound image beneath the (real) ultrasound transducer.
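Displaying the virtual image beneath the real transducer reduces to composing tracked poses: the image plane, calibrated relative to the probe, is carried through the tracker's world frame into the glasses' view. A minimal sketch of that pose chain with plain 4x4 homogeneous matrices (all frame names and numeric values are illustrative assumptions, not values from the paper):

```python
def matmul4(a, b):
    """Multiply two 4x4 homogeneous transforms given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    """Build a 4x4 homogeneous translation matrix."""
    return [[1, 0, 0, x],
            [0, 1, 0, y],
            [0, 0, 1, z],
            [0, 0, 0, 1]]

# Hypothetical pose chain: glasses_from_image places the ultrasound image
# plane at the tracked transducer pose, as seen from the smart glasses.
glasses_from_world = translation(0, 0, -0.5)   # tracker output (assumed)
world_from_probe   = translation(0.1, 0, 0)    # tracker output (assumed)
probe_from_image   = translation(0, 0, 0.02)   # calibration result (assumed)

glasses_from_image = matmul4(glasses_from_world,
                             matmul4(world_from_probe, probe_from_image))
```

In a real system the rotational parts of these matrices would come from the tracker and the probe calibration; pure translations are used here only to keep the sketch readable.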


2012 ◽  
Vol 98 (6) ◽  
pp. 775-782 ◽  
Author(s):  
Rosario Francesco Grasso ◽  
Giacomo Luppi ◽  
Roberto Luigi Cazzato ◽  
Eliodoro Faiella ◽  
Francesco D'Agostino ◽  
...  

Aims and background “Augmented reality” is a technique for creating a composite view by augmenting the real intervention field, as visualized by the doctor, with additional information from a virtual volume generated from computed tomography (CT), magnetic resonance, or ultrasound images previously acquired from the same patient. In the present study we verified the accuracy and validated the clinical use of an augmented reality navigation system developed to perform percutaneous CT-guided lung biopsies. Methods One hundred eighty consecutive patients with solitary parenchymal lung lesions, enrolled using a nonrandom enrollment system, underwent percutaneous CT-guided aspiration and core biopsy using either a traditional technique (group C, 90 patients) or navigation system assistance (group S, 90 patients). For each patient we recorded the largest lesion diameter, procedure time, overall number of CT scans, radiation dose, and complications. The entire experimental project was evaluated and approved by the local institutional review board (ethics committee). Results Each procedure was concluded successfully, and a pathological diagnosis was reached in 96% of cases in group S and 90% of cases in group C. Procedure time, overall number of CT scans, and incident x-ray radiation dose (CTDIvol) were significantly lower in navigation system-assisted procedures (P < 0.001; z = 5.64) than in traditional CT-guided procedures. The rate of procedural complications was 14% in group S and 17% in group C. Conclusion The augmented reality navigation system used in this study was a safe, technically reliable, and effective support tool for percutaneous CT-guided lung biopsy, shortening the procedure time and reducing both the incident x-ray radiation dose to patients and the rate of insufficient specimens. It also has the potential to increase the number of procedures performed in the allocated time without increasing the number of complications.
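The reduction is reported as significant with P < 0.001 and z = 5.64. The specific test is not named in the abstract, but z statistics of this form commonly come from the normal approximation to a rank-based two-sample test such as the Mann-Whitney U test. A minimal sketch under that assumption (ties ignored for simplicity; the function name is hypothetical):

```python
import math

def mann_whitney_z(x, y):
    """Normal-approximation z statistic for the Mann-Whitney U test,
    comparing samples x and y. Assumes no tied values."""
    combined = sorted((v, grp) for grp, vals in ((0, x), (1, y)) for v in vals)
    # Rank sum of group x (ranks start at 1 over the pooled, sorted sample).
    rank_sum_x = sum(rank for rank, (_, g) in enumerate(combined, start=1)
                     if g == 0)
    n1, n2 = len(x), len(y)
    u = rank_sum_x - n1 * (n1 + 1) / 2          # U statistic for group x
    mean_u = n1 * n2 / 2                        # mean of U under H0
    sd_u = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)  # sd of U under H0
    return (u - mean_u) / sd_u
```

With the study's two groups of 90 procedure-time measurements each, a call like `mann_whitney_z(times_group_s, times_group_c)` would yield a z value of the kind quoted; in practice a library routine with tie correction would be preferred.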


1998 ◽  
Vol 17 (5) ◽  
pp. 681-693 ◽  
Author(s):  
Y. Sato ◽  
M. Nakamoto ◽  
Y. Tamaki ◽  
T. Sasama ◽  
I. Sakita ◽  
...  

2003 ◽  
Author(s):  
Frank Sauer ◽  
Uwe J. Schoepf ◽  
Ali Khamene ◽  
Sebastian Vogt ◽  
Marco Das ◽  
...  

Diagnostics ◽  
2021 ◽  
Vol 11 (3) ◽  
pp. 441
Author(s):  
Christopher Mela ◽  
Francis Papay ◽  
Yang Liu

A novel multimodal, multiscale imaging system with augmented reality capability was developed and characterized. The system offers 3D color reflectance imaging, 3D fluorescence imaging, and augmented reality in real time. Multiscale fluorescence imaging was enabled by developing and integrating an in vivo fiber-optic microscope. Real-time ultrasound-fluorescence multimodal imaging used optically tracked fiducial markers for registration, and tomographic data were incorporated using the same optically tracked fiducial markers. Furthermore, we characterized system performance and registration accuracy in a benchtop setting. The multiscale fluorescence imaging facilitated assessing the functional status of tissues, extending the minimal resolution of fluorescence imaging to ~17.5 µm. The system achieved a mean target registration error of less than 2 mm for registering fluorescence images to ultrasound images and an MRI-based 3D model, which is within the clinically acceptable range. The low latency and high frame rate of the prototype system have shown the promise of applying the reported techniques in clinically relevant settings in the future.
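The sub-2 mm figure is a target registration error (TRE): the residual distance at target points after the fiducial-based rigid registration has been applied. A minimal sketch of that metric, assuming a rigid transform has already been estimated from the fiducials (all names and values here are illustrative, not from the paper):

```python
import math

def apply_rigid(rot, trans, p):
    """Map point p through a rigid transform: 3x3 rotation rot, then trans."""
    return [sum(rot[i][j] * p[j] for j in range(3)) + trans[i]
            for i in range(3)]

def target_registration_error(rot, trans, targets_src, targets_dst):
    """Mean Euclidean distance between mapped source targets and their
    known positions in the destination space (the TRE at those targets)."""
    dists = [math.dist(apply_rigid(rot, trans, p), q)
             for p, q in zip(targets_src, targets_dst)]
    return sum(dists) / len(dists)

IDENTITY = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
# A registration that is off by 1 mm along x yields a TRE of 1 mm.
tre = target_registration_error(IDENTITY, [1.0, 0.0, 0.0],
                                [[0, 0, 0], [10, 20, 30]],
                                [[0, 0, 0], [10, 20, 30]])  # → 1.0
```

Crucially, TRE is measured at targets that were not used to estimate the transform, which is why it, rather than the fiducial residual, is the clinically meaningful accuracy figure.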


Radiology ◽  
2006 ◽  
Vol 240 (1) ◽  
pp. 230-235 ◽  
Author(s):  
Marco Das ◽  
Frank Sauer ◽  
U. Joseph Schoepf ◽  
Ali Khamene ◽  
Sebastian K. Vogt ◽  
...  
