Implementation of the structural SIMilarity (SSIM) index as a quantitative evaluation tool for dose distribution error detection

2020 ◽  
Vol 47 (4) ◽  
pp. 1907-1919 ◽  
Author(s):  
Jiayuan Peng ◽  
Chengyu Shi ◽  
Eric Laugeman ◽  
Weigang Hu ◽  
Zhen Zhang ◽  
...  
Computation ◽  
2020 ◽  
Vol 8 (2) ◽  
pp. 31 ◽  
Author(s):  
Lenuta Pana ◽  
Simona Moldovanu ◽  
Nilanjan Dey ◽  
Amira S. Ashour ◽  
Luminita Moraru

Background: The purpose of this article is to provide a new evaluation tool based on skeleton maps to assess the tumoral and non-tumoral regions of 2D PD-weighted (proton density) and T2-weighted (T2w) brain MRI images. Methods: The proposed method investigated inter-hemisphere brain tissue similarity using a mask in the right hemisphere and its mirror reflection in the left one. At the hemisphere level and for each ROI (region of interest), a morphological skeleton algorithm was used to efficiently investigate the similarity between hemispheres. Two datasets with 88 T2w and PD images, from healthy patients and from patients diagnosed with glioma, were investigated: D1 contains the original raw images affected by Rician noise, and D2 consists of the same images pre-processed for noise removal. Results: The investigation was based on structural similarity assessment using the Structural Similarity Index (SSIM) and a modified Jaccard metric. A novel S-Jaccard (Skeleton Jaccard) metric was proposed. Cluster accuracy was estimated with the silhouette value (SV) method, and the silhouette coefficient (SC) indicates the quality of the clustering process for SSIM and S-Jaccard. To assess overall classification accuracy, an ROC curve analysis was carried out. Conclusions: Consistent results were obtained for healthy patients and for PD images of glioma. We demonstrated that the S-Jaccard metric based on skeletal similarity is an efficient tool for inter-hemisphere brain similarity evaluation. The accuracy of the proposed skeletonization method was lower for the original images affected by Rician noise (AUC = 0.883 (T2w) and 0.904 (PD)) but increased for denoised images (AUC = 0.951 (T2w) and 0.969 (PD)).
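The two metrics named in the abstract can be illustrated with a minimal NumPy sketch. This is an assumption-laden simplification, not the authors' implementation: `ssim_global` computes the standard SSIM formula over a single global window (the published index averages local windows), and `skeleton_jaccard` computes the plain Jaccard index |A ∩ B| / |A ∪ B| on two binary skeleton maps, which is the core idea behind an S-Jaccard-style comparison; the paper's mirroring, ROI masking, and skeletonization steps are not reproduced here.

```python
import numpy as np

def ssim_global(x, y, data_range=255.0, k1=0.01, k2=0.03):
    """Single-window SSIM between two grayscale images.

    Uses the standard formula
    SSIM = ((2*mu_x*mu_y + c1) * (2*cov_xy + c2)) /
           ((mu_x^2 + mu_y^2 + c1) * (var_x + var_y + c2)),
    computed once over the whole image rather than averaged
    over local windows as in the published index.
    """
    x = np.asarray(x, dtype=np.float64)
    y = np.asarray(y, dtype=np.float64)
    c1 = (k1 * data_range) ** 2
    c2 = (k2 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / (
        (mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

def skeleton_jaccard(skel_a, skel_b):
    """Jaccard index |A & B| / |A | B| on two binary skeleton maps."""
    a = np.asarray(skel_a, dtype=bool)
    b = np.asarray(skel_b, dtype=bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 1.0  # two empty skeletons are trivially identical
    return np.logical_and(a, b).sum() / union
```

For identical inputs, `ssim_global(x, x)` evaluates to 1.0, and the Jaccard score drops toward 0 as the two skeletons diverge, which is what makes both usable as similarity scores between a hemisphere ROI and its mirrored counterpart.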


2008 ◽  
Vol 5 (1) ◽  
pp. 1-17 ◽  
Author(s):  
Martha Todd ◽  
Julie A Manz ◽  
Kim S Hawkins ◽  
Mary E Parsons ◽  
Maribeth Hercinger

Author(s):  
Jorge Castellini ◽  
Natalia Rosli ◽  
María Gala Santini Araujo ◽  
Horacio Sixto Herrera ◽  
Mauro Vivas ◽  
...  

For validity and credibility purposes, the presentation of scientific work in a report requires a certain layout, which we could call the "anatomy of the report," as well as internal dynamics, which we consider the "physiology of the report." But how are the scientific papers presented at our conferences evaluated? What are the criteria used to determine whether a paper is accepted or not? The Research Committee of the Argentine Association of Orthopedics and Traumatology (AAOT) developed evaluation grids. The aim of publishing these grids is to provide an objective tool for the qualitative and quantitative evaluation of all submitted papers in their different genres. The grids can be used by reviewers both as a feedback instrument and as an objective evaluation tool, and by authors as a guide for preparing the report of their research work.


Author(s):  
Anke Wind ◽  
René Limbeek ◽  
Henrike Bretveld ◽  
Robert van Schijndel ◽  
Daan Smits ◽  
...  

Background: Networks are promoted as an organizational form that enables integrated care as well as enhanced patient outcomes. However, implementing networks is complex. It is therefore important to evaluate the quality and effectiveness of networks to ensure they are worth developing and maintaining. This article describes the development of an evaluation tool for cancer care networks and the results of a pilot study with a regional lung cancer care network. Methods: This study used a combination of qualitative and quantitative evaluation methods. The qualitative evaluation was based on a framework with 10 standards for the organization of an oncological (tumor-specific) care network. Data for the quantitative evaluation were obtained from the Dutch Cancer Registry. The evaluation was performed at a network of three hospitals collaborating in the field of lung oncology. Results: The qualitative evaluation framework consisted of 10 standards/questions, which were divided into 38 sub-questions. The evaluation showed that, in general, patients are satisfied with the collaboration in the network. However, some points for improvement were found, such as the need for more attention to the implementation and periodic evaluation of a regional care pathway. The start of a regional multidisciplinary meeting has been a major step in improving the collaboration. Conclusion: An evaluation tool for (lung) cancer care networks was successfully developed and piloted within a cancer care network. The tool has proven to be a useful method for evaluating collaboration within an oncological network. It helped network partners to understand what they see as important and allowed them to learn about their program's dynamics. Improvement opportunities were successfully identified. To keep the tool up to date, continuous improvement is needed, following the Plan-Do-Check-Act (PDCA) cycle.

