Deep learning for real-time, automatic, and scanner-adapted prostate (zone) segmentation of transrectal ultrasound, for example, magnetic resonance imaging–transrectal ultrasound fusion prostate biopsy

Ruud J.G. van Sloun (Corresponding author), Rogier R. Wildeboer, Christophe K. Mannaerts, Arnoud W. Postema, Maudy Gayet, Harrie P. Beerlage, Georg Salomon, Hessel Wijkstra, Massimo Mischi

Research output: Contribution to journal › Article › Academic › peer-review

4 Citations (Scopus)

Abstract

Background: Although recent advances in multiparametric magnetic resonance imaging (MRI) have led to an increase in MRI–transrectal ultrasound (TRUS) fusion prostate biopsies, these procedures are time consuming, laborious, and costly. Introducing a deep-learning approach could improve prostate segmentation.

Objective: To exploit deep learning to perform automatic, real-time prostate (zone) segmentation on TRUS images from different scanners.

Design, setting, and participants: Three datasets of TRUS images were collected at different institutions, using an iU22 (Philips Healthcare, Bothell, WA, USA), a Pro Focus 2202a (BK Medical), and an Aixplorer (SuperSonic Imagine, Aix-en-Provence, France) ultrasound scanner. The datasets contained 436 images from 181 men.

Outcome measurements and statistical analysis: Manual delineations from an expert panel were used as ground truth. The (zonal) segmentation performance was evaluated in terms of pixel-wise accuracy, Jaccard index, and Hausdorff distance.

Results and limitations: The developed deep-learning approach significantly improved prostate segmentation compared with a conventional automated technique, reaching a median accuracy of 98% (95% confidence interval 95–99%), a Jaccard index of 0.93 (0.80–0.96), and a Hausdorff distance of 3.0 (1.3–8.7) mm. Zonal segmentation yielded pixel-wise accuracies of 97% (95–99%) and 98% (96–99%) for the peripheral and transition zones, respectively. Supervised domain adaptation retained high performance when applied to images from different ultrasound scanners (p > 0.05). Moreover, the algorithm's assessment of its own segmentation performance showed a strong correlation with the actual segmentation performance (Pearson's correlation 0.72, p
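The three evaluation metrics named in the abstract are standard for segmentation. As a minimal sketch (not the authors' implementation), the snippet below computes pixel-wise accuracy, the Jaccard index, and a brute-force symmetric Hausdorff distance between two binary masks; the toy masks, isotropic pixel spacing, and function names are illustrative assumptions only.

```python
import numpy as np

def pixel_accuracy(pred, gt):
    # Fraction of pixels on which the two binary masks agree.
    return float(np.mean(pred == gt))

def jaccard_index(pred, gt):
    # Intersection over union of the foreground regions.
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return float(inter / union) if union else 1.0

def hausdorff_distance(pred, gt, spacing=1.0):
    # Symmetric Hausdorff distance between the foreground point sets,
    # computed by brute force (fine for small toy masks).
    pa = np.argwhere(pred)
    ga = np.argwhere(gt)
    d = np.linalg.norm(pa[:, None, :] - ga[None, :, :], axis=-1)
    return spacing * max(d.min(axis=1).max(), d.min(axis=0).max())

# Toy example: two overlapping 5x5 square "prostate" masks on a 10x10 grid.
pred = np.zeros((10, 10), dtype=bool); pred[2:7, 2:7] = True
gt = np.zeros((10, 10), dtype=bool); gt[3:8, 3:8] = True
```

In practice the Hausdorff distance would be reported in millimetres by passing the scanner's pixel spacing, and large masks would use an efficient distance-transform implementation rather than the pairwise computation shown here.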
Original language: English
Journal: European Urology Focus
Early online date: 23 Apr 2019
DOIs
Publication status: E-pub ahead of print - 23 Apr 2019

Keywords

  • Deep learning
  • Prostate cancer
  • Segmentation
  • Ultrasound
  • Magnetic resonance imaging–transrectal ultrasound fusion biopsy

