
Deep network-based comprehensive parotid gland tumor detection

Access

info:eu-repo/semantics/closedAccess

Date

2023

Author

Sünnetçi, Kubilay Muhammed
Kaba, Esat
Çeliker, Fatma Beyazal
Alkan, Ahmet


Citation

Sunnetci, K. M., Kaba, E., Celiker, F. B., & Alkan, A. (2023). Deep Network-Based Comprehensive Parotid Gland Tumor Detection. Academic Radiology, S1076-6332(23)00226-X. Advance online publication. https://doi.org/10.1016/j.acra.2023.04.028

Abstract

Rationale and Objectives: Salivary gland tumors constitute 2%-6% of all head and neck tumors and most commonly arise in the parotid gland. Magnetic resonance (MR) imaging is the most sensitive imaging modality for their diagnosis. Tumor type, localization, and relationship to surrounding structures are important factors in treatment planning, which makes parotid gland tumor segmentation important. Specialists still rely widely on manual segmentation for diagnosis and treatment; however, given the current state of artificial intelligence, AI-based automatic segmentation models can replace this time-consuming manual technique. In this paper, we therefore segmented parotid gland tumors (PGT) using deep learning-based architectures.
Materials and Methods: The dataset used in the study includes 102 T1-weighted (T1-w), 102 contrast-enhanced T1-w (T1C-w), and 102 T2-w MR images. After cropping the raw images and the images manually segmented by experts, we obtained the masks of these images. After standardizing the image sizes, we split them into approximately 80% training and 20% test sets. We then trained six models on these images using ResNet18- and Xception-based DeepLab v3+ architectures, and prepared a user-friendly Graphical User Interface (GUI) application that includes each of these models.
Results: The accuracy and weighted Intersection over Union (IoU) values of the ResNet18-based DeepLab v3+ architecture trained on T1C-w images, the most successful model in the study, are 0.96153 and 0.92601, respectively. Compared with the literature, the proposed system is competitive both in its use of MR images and in training the models independently for T1-w, T1C-w, and T2-w. Since PGT is usually segmented manually in the literature, we expect our study to contribute significantly to it.
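The reported accuracy and weighted IoU are standard semantic-segmentation metrics. The paper does not publish its evaluation code, so the helper below is a hypothetical numpy sketch of how these two numbers are conventionally computed: global pixel accuracy, and per-class IoU weighted by each class's pixel frequency in the ground truth.

```python
import numpy as np

def accuracy_and_weighted_iou(pred, target, num_classes=2):
    """Global pixel accuracy and frequency-weighted IoU for a pair of
    segmentation masks (illustrative helper, not the authors' code)."""
    pred = np.asarray(pred).ravel()
    target = np.asarray(target).ravel()

    # Global accuracy: fraction of pixels labeled correctly.
    acc = float(np.mean(pred == target))

    ious, weights = [], []
    total = target.size
    for c in range(num_classes):
        inter = np.sum((pred == c) & (target == c))
        union = np.sum((pred == c) | (target == c))
        if union == 0:
            continue  # class absent from both masks: skip it
        ious.append(inter / union)
        # Weight each class by its pixel frequency in the ground truth.
        weights.append(np.sum(target == c) / total)
    weighted_iou = float(np.dot(ious, weights) / np.sum(weights))
    return acc, weighted_iou
```

For a binary tumor/background task like this one, weighting by class frequency means the (usually much larger) background class dominates the score, which is why weighted IoU tends to sit close to global accuracy.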
Conclusion: In this study, we prepared and presented a software application that users can easily apply for automatic PGT segmentation. Beyond the expected reduction in cost and workload, the developed models achieve performance metrics that are meaningful relative to the literature.
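The approximately 80%/20% split of the 102 image-mask pairs described in Materials and Methods can be sketched as follows; the function name and the fixed seed are illustrative, and the authors' exact shuffling procedure may differ.

```python
import numpy as np

def train_test_split_images(images, masks, test_frac=0.2, seed=0):
    """Shuffle paired images/masks and split them ~80/20
    (illustrative sketch of the split described in the abstract)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(images))          # shuffled indices
    n_test = int(round(test_frac * len(images)))
    test_idx, train_idx = idx[:n_test], idx[n_test:]
    pick = lambda seq, ids: [seq[i] for i in ids]
    return (pick(images, train_idx), pick(masks, train_idx),
            pick(images, test_idx), pick(masks, test_idx))
```

With 102 images per sequence, this yields 82 training and 20 test pairs, matching the "approximately 80%/20%" description; shuffling images and masks with the same index permutation keeps each image aligned with its mask.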

Source

Academic Radiology

URI

https://doi.org/10.1016/j.acra.2023.04.028
https://hdl.handle.net/11436/8384

Collections

  • PubMed İndeksli Yayınlar Koleksiyonu [2443]
  • Scopus İndeksli Yayınlar Koleksiyonu [5931]
  • TF, Dahili Tıp Bilimleri Bölümü Koleksiyonu [1559]
  • WoS İndeksli Yayınlar Koleksiyonu [5260]



DSpace software copyright © 2002-2015 DuraSpace
Contact Us | Send Feedback
Theme by @mire NV