CNN-Based Prostate Zonal Segmentation on T2-Weighted MR Images: A Cross-Dataset Study

Salvatore Vitabile, Hideki Nakayama, Maria Carla Gilardi, Marco S. Nobile, Carmelo Militello, Leonardo Rundo, Andrea Tangherloni, Claudio Ferretti, Changhee Han, Ryuichiro Hataya, Yudai Nagano, Jin Zhang, Giancarlo Mauri

Research output: Chapter in Book/Report/Conference proceeding › Chapter

4 Citations (Scopus)

Abstract

Prostate cancer is the most common cancer among US men. However, prostate imaging remains challenging despite advances in multi-parametric magnetic resonance imaging (MRI), which provides both morphologic and functional information about pathological regions. Along with whole prostate gland segmentation, distinguishing between the central gland (CG) and the peripheral zone (PZ) can guide differential diagnosis, since the frequency and severity of tumors differ between these regions; however, their boundary is often weak and fuzzy. This work presents a preliminary deep learning study on automatically delineating the CG and PZ, aiming to evaluate the generalization ability of convolutional neural networks (CNNs) on two multi-centric prostate MRI datasets. Specifically, we compared three CNN-based architectures: SegNet, U-Net, and pix2pix. In this context, the segmentation performance achieved with and without pre-training was compared under 4-fold cross-validation. In general, U-Net outperforms the other methods, especially when training and testing are performed on multiple datasets.
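The abstract evaluates each architecture under 4-fold cross-validation. As a minimal, hypothetical sketch of that protocol (the patient IDs, fold assignment, and fold count here are illustrative, not taken from the paper), each fold serves once as the held-out test set while the remaining folds form the training set:

```python
def k_fold_splits(items, k=4):
    """Partition `items` into k folds and yield (train, test) pairs.

    Illustrative sketch only: the actual fold assignment used in the
    study (e.g., patient-level stratification) is not specified here.
    """
    folds = [items[i::k] for i in range(k)]  # round-robin assignment
    for i in range(k):
        test = folds[i]
        train = [x for j, fold in enumerate(folds) if j != i for x in fold]
        yield train, test

# Example: 8 hypothetical patient IDs split into 4 folds of 2
patients = [f"patient_{n:02d}" for n in range(8)]
for train, test in k_fold_splits(patients, k=4):
    print(len(train), len(test))  # each split: 6 train, 2 test
```

Splitting at the patient level (rather than the slice level) avoids leaking slices of the same subject into both the training and test sets, which matters when assessing cross-dataset generalization.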
Original language: English
Title of host publication: Smart Innovation, Systems and Technologies
Pages: 269-280
Number of pages: 12
Publication status: Published - 2020

Publication series

Name: Smart Innovation, Systems and Technologies

All Science Journal Classification (ASJC) codes

  • Decision Sciences (all)
  • Computer Science (all)

