Integrated diagnostic workstation

1999 ◽  
Vol 12 (S1) ◽  
pp. 160-162 ◽  
Author(s):  
Candice K. Bergsneider ◽  
David Piraino ◽  
Michael Recht ◽  
Bradford Richmond ◽  
Doreen Dackiewicz

1998 ◽  
Author(s):  
Fei Cao ◽  
H. K. Huang ◽  
Ewa Pietka ◽  
Vicente Gilsanz ◽  
Steven Ominsky

1997 ◽  
Vol 10 (S1) ◽  
pp. 171-174 ◽  
Author(s):  
Srinka Ghosh ◽  
Katherine P. Andriole ◽  
David E. Avrin ◽  
Ronald L. Arenson

1990 ◽  
Author(s):  
Kazuo Aisaka ◽  
Kazuko Terada ◽  
Akihide Hashizume ◽  
Ryuuichi Suzuki ◽  
Masahiro Ishii ◽  
...  

2001 ◽  
Vol 14 (S1) ◽  
pp. 199-201 ◽  
Author(s):  
David S. Hirschorn ◽  
Clay R. Hinrichs ◽  
Devang M. Gor ◽  
Kartik Shah ◽  
George Visvikis

2003 ◽  
Vol 9 (4) ◽  
pp. 225-229 ◽  
Author(s):  
Luca Pagani ◽  
Lasse Jyrkinen ◽  
Jaakko Niinimäki ◽  
Jarmo Reponen ◽  
Ari Karttunen ◽  
...  

A wireless hand-held Webpad device was used to review a sample set of cranial computerized tomography (CT) studies to assess its diagnostic capabilities and its feasibility as a portable diagnostic workstation for radiology. The dataset consisted of 30 head CT studies of emergency cases. Two neuroradiologists and a senior radiologist evaluated the portable workstation using a Web-based viewer that we developed, which provided all the major functionality required for radiological image review. The reported radiological findings and diagnoses were compared with a gold standard: a set of diagnoses previously formulated by a consensus panel of radiologists who had reviewed the original studies. The diagnoses made using the Webpad were correct (no major discrepancies) in 82 of 90 interpretations (91%), comparable to the accuracy reported for image review on a conventional radiological workstation. The average total working time per diagnosis was 5 min 25 s (range 2–12 min). The system's simplicity of use and low cost make it suitable for distributing radiological studies within hospital facilities.
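The 91% figure (82 of 90 correct interpretations) can be checked, and an uncertainty range attached, with a short pure-Python sketch. The Wilson score interval below is our illustration; the study reports only the point estimate:

```python
# Diagnostic accuracy as a binomial proportion, with a Wilson score interval.
# Illustrative sketch: the interval is our addition, not reported by the study.
from math import sqrt

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

p = 82 / 90                      # observed accuracy, about 0.911
lo, hi = wilson_interval(82, 90)
print(round(p, 3), round(lo, 3), round(hi, 3))
```

With only 90 interpretations the interval is fairly wide, which is why such feasibility studies compare against accuracy reported for conventional workstations rather than claiming equivalence outright.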


2018 ◽  
pp. 254-260 ◽  
Author(s):  
Antonio J Salazar ◽  
Nicolás Useche ◽  
Manuel F Granja ◽  
Aníbal J Morillo ◽  
Sonia Bermúdez ◽  
...  

Aim: This study compares the reliability of brain CT interpretations performed on a diagnostic workstation and on a mobile tablet computer in a telestroke context. Methods: A factorial design with 1,452 interpretations was used. Reliability was evaluated with Fleiss' kappa coefficient on the agreement of interpretation results for lesion classification, for the presence of imaging contraindications to intravenous recombinant tissue-type plasminogen activator (t-PA) administration, and for the Alberta Stroke Program Early CT Score (ASPECTS). Results: The intra-observer agreements were as follows: good agreement on the overall lesion classification (κ = 0.63, p < 0.001), very good agreement on hemorrhagic lesions (κ = 0.89, p < 0.001), and moderate agreement on both the "no acute lesion" and the acute ischemic lesion classifications (κ = 0.59 and κ = 0.58 respectively, p < 0.001). There was good intra-observer agreement on the dichotomized ASPECTS (κ = 0.65, p < 0.001). Conclusions: Our results indicate that the mobile solution is reliable for interpreting brain CT images of patients with acute stroke, which would allow efficient, low-cost telestroke services.
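Fleiss' kappa generalizes chance-corrected agreement to a fixed number of raters per subject. A minimal pure-Python sketch of the coefficient, on hypothetical rating counts rather than the study's 1,452 interpretations:

```python
# Fleiss' kappa for n raters assigning each subject to one of k categories.
# The rating counts below are hypothetical, for illustration only.

def fleiss_kappa(counts):
    """counts[i][j] = number of raters placing subject i in category j.
    Assumes every subject is rated by the same number of raters."""
    N = len(counts)                    # number of subjects
    n = sum(counts[0])                 # raters per subject
    k = len(counts[0])                 # number of categories
    # Per-subject observed agreement P_i.
    P = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]
    P_bar = sum(P) / N
    # Chance agreement from overall category proportions p_j.
    p = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    P_e = sum(pj * pj for pj in p)
    return (P_bar - P_e) / (1 - P_e)

# 4 subjects, 3 raters; categories: no acute lesion / ischemic / hemorrhagic
ratings = [[3, 0, 0], [2, 1, 0], [0, 0, 3], [1, 1, 1]]
print(round(fleiss_kappa(ratings), 3))
```

The κ bands quoted in the abstract (moderate around 0.4–0.6, good around 0.6–0.8, very good above 0.8) follow the conventional Landis–Koch-style interpretation of this coefficient.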


2015 ◽  
Vol 66 (4) ◽  
pp. 363-367
Author(s):  
Umer Salati ◽  
Sum Leong ◽  
John Donnellan ◽  
Hong Kuan Kok ◽  
Orla Buckley ◽  
...  

Purpose: The purpose was to compare the performance of diagnostic workstation monitors and the Apple iPad 2 (Cupertino, CA) in the interpretation of emergency computed tomography (CT) brain studies. Methods: Two experienced radiologists interpreted 100 random emergency CT brain studies both on on-site diagnostic workstation monitors and on the iPad 2 via remote access. The radiologists were blinded to patient clinical details and to each other's interpretations, and the study list was randomized between interpretations on the different modalities. Interobserver agreement between the radiologists and intraobserver agreement between the modalities were determined, and Cohen kappa coefficients were calculated for each. Performance on urgent and nonurgent abnormalities was assessed separately. Results: There was substantial intraobserver agreement for both radiologists between the modalities, with overall kappa values of 0.959 and 0.940 for detecting acute abnormalities and perfect agreement on hemorrhage. Intraobserver agreement kappa values were 0.939 and 0.860 for nonurgent abnormalities. Interobserver agreement between the 2 radiologists was also substantial for both the diagnostic monitors and the iPad 2, ranging from 0.821 to 0.860. Conclusions: The iPad 2 is a reliable modality for the interpretation of CT brain studies in the emergency setting and for the detection of acute and chronic abnormalities, with performance comparable to standard diagnostic workstation monitors.
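Cohen's kappa, used here for the two-rater comparisons, corrects observed agreement for the agreement expected from each rater's marginal category frequencies. A short pure-Python sketch on made-up labels (the readings below are illustrative, not the study's data):

```python
# Cohen's kappa for two raters labeling the same set of studies.
# The labels below are hypothetical, for illustration only.
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e) for two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of cases where the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal category frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

a = ["acute", "acute", "normal", "chronic", "normal", "acute"]
b = ["acute", "normal", "normal", "chronic", "normal", "acute"]
print(round(cohen_kappa(a, b), 3))
```

Kappa values above about 0.8, like the 0.940–0.959 intraobserver figures reported here, indicate near-perfect agreement on the usual scale, which is what makes the tablet a credible stand-in for the workstation monitors.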

