Deep learning and level set approach for liver and tumor segmentation from CT scans

2020 ◽  
Vol 21 (10) ◽  
pp. 200-209
Author(s):  
Omar Ibrahim Alirr
Author(s):  
Layth Kamil Adday Almajmaie ◽  
Ahmed Raad Raheem ◽  
Wisam Ali Mahmood ◽  
Saad Albawi

Segmenting brain tissues from magnetic resonance images (MRI) poses substantive challenges to the clinical research community, especially when precise estimation of those tissues is required. In recent years, advances in deep learning, more specifically in fully convolutional neural networks (FCN), have yielded breakthrough results in segmenting brain tumour tissue with high accuracy and precision, much to the benefit of clinical physicians and researchers alike. A new hybrid deep learning architecture combining SegNet and U-Net to segment brain tissue is proposed here. The skip connections of the U-Net are suitably exploited: the multi-scale information generated by the SegNet is used to recover precise tissue boundaries from the brain images. Further, to ensure that the segmentation method produces precisely delineated contours, the output is incorporated as a level set layer in the deep learning network. The method was evaluated on the brain tumour segmentation (BraTS) 2017 and BraTS 2018 datasets, dedicated MRI brain tumour benchmarks. The results indicate better performance in segmenting brain tumours than existing methods.
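The abstract gives no code; as a rough illustration of the U-Net-style skip connection it exploits (a coarse decoder feature map upsampled and concatenated with the same-resolution encoder map), here is a minimal NumPy sketch. All shapes, channel counts, and function names are hypothetical, not from the paper:

```python
import numpy as np

def upsample2x(x):
    """Nearest-neighbour 2x upsampling of a (C, H, W) feature map."""
    return x.repeat(2, axis=1).repeat(2, axis=2)

def skip_connect(decoder_feat, encoder_feat):
    """U-Net-style skip connection: upsample the coarse decoder
    feature map to the encoder's resolution, then concatenate the
    two maps along the channel axis so fine spatial detail from the
    encoder reaches the decoder."""
    up = upsample2x(decoder_feat)
    assert up.shape[1:] == encoder_feat.shape[1:], "spatial sizes must match"
    return np.concatenate([up, encoder_feat], axis=0)

# Toy shapes: a 64-channel 8x8 decoder map meets a 32-channel 16x16 encoder map.
dec = np.random.rand(64, 8, 8)
enc = np.random.rand(32, 16, 16)
merged = skip_connect(dec, enc)
print(merged.shape)  # (96, 16, 16)
```

In a real network the concatenated map would then pass through further convolutions; this sketch only shows the tensor bookkeeping that makes multi-scale information available to the decoder.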


Author(s):  
Mamta Raju Jotkar ◽  
Daniel Rodriguez ◽  
Bruno Marins Soares

2020 ◽  
Vol 152 ◽  
pp. S949
Author(s):  
L. Bokhorst ◽  
M.H.F. Savenije ◽  
M.P.W. Intven ◽  
C.A.T. Van den Berg

Author(s):  
Vlad Vasilescu ◽  
Ana Neacsu ◽  
Emilie Chouzenoux ◽  
Jean-Christophe Pesquet ◽  
Corneliu Burileanu

2020 ◽  
Vol 22 (Supplement_3) ◽  
pp. iii359-iii359
Author(s):  
Lydia Tam ◽  
Edward Lee ◽  
Michelle Han ◽  
Jason Wright ◽  
Leo Chen ◽  
...  

Abstract
BACKGROUND: Brain tumors are the most common solid malignancies in childhood, many of which develop in the posterior fossa (PF). Manual tumor measurements are frequently required to optimize registration into surgical navigation systems or for surveillance of nonresectable tumors after therapy. With recent advances in artificial intelligence (AI), automated MRI-based tumor segmentation is now feasible without requiring manual measurements. Our goal was to create a deep learning model for automated PF tumor segmentation that can register into navigation systems and provide volume output.
METHODS: 720 pre-surgical MRI scans from five pediatric centers were divided into training, validation, and testing datasets. The study cohort comprised four PF tumor types: medulloblastoma, diffuse midline glioma, ependymoma, and brainstem or cerebellar pilocytic astrocytoma. Manual segmentation of the tumors by an attending neuroradiologist served as “ground truth” labels for model training and evaluation. We used 2D U-Net, an encoder-decoder convolutional neural network architecture, with a pre-trained ResNet50 encoder. We assessed segmentation accuracy on a held-out test set using the Dice similarity coefficient (0–1) and compared tumor volume calculations between manual and model-derived segmentations using linear regression.
RESULTS: Compared to the ground-truth expert human segmentation, the overall Dice score for model accuracy was 0.83 for automatic delineation of the four tumor types.
CONCLUSIONS: In this multi-institutional study, we present a deep learning algorithm that automatically delineates PF tumors and outputs volumetric information. Our results demonstrate applied AI that is clinically applicable, potentially augmenting radiologists, neuro-oncologists, and neurosurgeons for tumor evaluation, surveillance, and surgical planning.
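The Dice similarity coefficient reported above is straightforward to compute from a pair of binary masks. A minimal NumPy sketch follows; the toy masks and the voxel spacing used for the volume conversion are illustrative, not values from the study:

```python
import numpy as np

def dice(pred, truth):
    """Dice similarity coefficient between two binary masks:
    2*|A ∩ B| / (|A| + |B|), ranging from 0 (no overlap) to 1."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * np.logical_and(pred, truth).sum() / denom

# Toy 1D example: 3 overlapping voxels, 4 predicted, 4 in ground truth.
pred = np.array([1, 1, 1, 1, 0, 0])
truth = np.array([0, 1, 1, 1, 1, 0])
print(dice(pred, truth))  # 2*3 / (4+4) = 0.75

# The volume output follows from the voxel count and voxel spacing
# (a hypothetical 1 mm isotropic spacing here).
voxel_volume_mm3 = 1.0 * 1.0 * 1.0
volume_ml = pred.sum() * voxel_volume_mm3 / 1000.0
```

The same formula applies voxel-wise in 3D; only the mask shapes change.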


2021 ◽  
Vol 352 ◽  
pp. 109091
Author(s):  
Asieh Khosravanian ◽  
Mohammad Rahmanimanesh ◽  
Parviz Keshavarzi ◽  
Saeed Mozaffari
