
Deep network-based comprehensive parotid gland tumor detection

Access

info:eu-repo/semantics/closedAccess

Date

2023

Authors

Sünnetçi, Kubilay Muhammed
Kaba, Esat
Çeliker, Fatma Beyazal
Alkan, Ahmet


Citation

Sunnetci, K. M., Kaba, E., Celiker, F. B., & Alkan, A. (2023). Deep Network-Based Comprehensive Parotid Gland Tumor Detection. Academic Radiology, S1076-6332(23)00226-X. Advance online publication. https://doi.org/10.1016/j.acra.2023.04.028

Abstract

Rationale and Objectives: Salivary gland tumors constitute 2%-6% of all head and neck tumors and are most common in the parotid gland. Magnetic resonance (MR) imaging is the most sensitive imaging modality for diagnosis. Because tumor type, localization, and relationship with surrounding structures are important factors for treatment, parotid gland tumor segmentation matters. Specialists widely use manual segmentation in diagnosis and treatment, but manual segmentation is time-consuming, and today's artificial intelligence-based models can perform the segmentation automatically. In this paper, we therefore segmented parotid gland tumors (PGT) using deep learning-based architectures.

Materials and Methods: The dataset includes 102 T1-w, 102 contrast-enhanced T1-w (T1C-w), and 102 T2-w MR images. After cropping the raw images and the expert-annotated manual segmentations, we obtained the corresponding masks. After standardizing the image sizes, we split the images into approximately 80% training and 20% test sets. We then trained six models on these images using ResNet18- and Xception-based DeepLab v3+, and prepared a user-friendly graphical user interface (GUI) application that includes each of these models.

Results: The accuracy and weighted Intersection over Union (IoU) values of the ResNet18-based DeepLab v3+ architecture trained on T1C-w, the most successful model in the study, are 0.96153 and 0.92601, respectively. Compared with the literature, the proposed system is competitive both in using MR images and in training the models independently for T1-w, T1C-w, and T2-w. Because PGT is usually segmented manually in the literature, we expect the study to contribute significantly to the field.

Conclusion: We prepared and presented a software application that users can easily operate for automatic PGT segmentation. Beyond the expected reduction in costs and workload, the developed models achieve performance metrics that are meaningful relative to the literature.
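To make the reported evaluation concrete, the sketch below shows one way to compute the two quoted metrics (pixel accuracy and weighted Intersection over Union) from a predicted and a ground-truth label mask. It is a minimal illustration, not code from the paper: it assumes the common definition of weighted IoU as the per-class IoU averaged with weights proportional to each class's ground-truth pixel count, and the function and variable names are hypothetical.

    import numpy as np

    def segmentation_metrics(y_true, y_pred, num_classes=2):
        """Pixel accuracy and class-frequency-weighted IoU for label maps.
        y_true / y_pred: integer arrays of identical shape holding class
        indices in [0, num_classes), e.g. 0 = background, 1 = tumor."""
        y_true = np.asarray(y_true).ravel()
        y_pred = np.asarray(y_pred).ravel()

        # Global pixel accuracy: fraction of pixels labelled correctly.
        accuracy = float(np.mean(y_true == y_pred))

        ious, weights = [], []
        for c in range(num_classes):
            gt, pr = (y_true == c), (y_pred == c)
            union = np.logical_or(gt, pr).sum()
            if union == 0:               # class absent from both masks: skip it
                continue
            ious.append(np.logical_and(gt, pr).sum() / union)
            weights.append(gt.sum())     # weight = ground-truth pixel count

        weights = np.asarray(weights, dtype=float)
        weighted_iou = float(np.dot(ious, weights / weights.sum()))
        return accuracy, weighted_iou

    # Toy 4x4 example: ground truth vs. an imperfect prediction.
    gt_mask   = np.array([[0, 0, 1, 1]] * 4)
    pred_mask = np.array([[0, 1, 1, 1]] * 4)
    print(segmentation_metrics(gt_mask, pred_mask))   # -> (0.75, 0.5833...)

Because the weights follow class pixel counts, the large background class dominates the weighted IoU; with a small tumor region, the unweighted mean IoU would typically be lower than the weighted value reported above.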

Source

Academic Radiology

Links

https://doi.org/10.1016/j.acra.2023.04.028
https://hdl.handle.net/11436/8384

Collections

  • PubMed Indexed Publications Collection [2443]
  • Scopus Indexed Publications Collection [5931]
  • Faculty of Medicine, Department of Internal Medical Sciences Collection [1559]
  • WoS Indexed Publications Collection [5260]


