Abstract
BACKGROUND: Although recent advances in multiparametric magnetic resonance imaging (MRI) have led to an increase in MRI-transrectal ultrasound (TRUS) fusion prostate biopsies, these procedures are time consuming, laborious, and costly. The introduction of a deep-learning approach could improve prostate segmentation.
OBJECTIVE: To exploit deep learning to perform automatic, real-time prostate (zone) segmentation on TRUS images from different scanners.
DESIGN, SETTING, AND PARTICIPANTS: Three datasets with TRUS images were collected at different institutions, using an iU22 (Philips Healthcare, Bothell, WA, USA), a Pro Focus 2202a (BK Medical), and an Aixplorer (SuperSonic Imagine, Aix-en-Provence, France) ultrasound scanner. The datasets contained 436 images from 181 men.
OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS: Manual delineations from an expert panel were used as ground truth. The (zonal) segmentation performance was evaluated in terms of the pixel-wise accuracy, Jaccard index, and Hausdorff distance.
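For readers unfamiliar with these metrics, the sketch below shows how they can be computed for a pair of binary segmentation masks with NumPy and SciPy. This is a minimal illustration, not the evaluation code used in the study: the function name `segmentation_metrics` and the `pixel_spacing_mm` parameter (needed because the Hausdorff distance is reported in millimetres) are our own assumptions, and the Hausdorff distance is approximated over all foreground pixels rather than extracted contours.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def segmentation_metrics(pred, gt, pixel_spacing_mm=1.0):
    """Illustrative metrics for two 2D binary masks (predicted vs.
    expert ground truth). All names are hypothetical, not from the paper."""
    pred, gt = pred.astype(bool), gt.astype(bool)

    # Pixel-wise accuracy: fraction of pixels labelled identically.
    accuracy = np.mean(pred == gt)

    # Jaccard index: intersection over union of the two masks.
    jaccard = np.logical_and(pred, gt).sum() / np.logical_or(pred, gt).sum()

    # Symmetric Hausdorff distance, approximated over all foreground
    # pixel coordinates and converted from pixels to millimetres.
    p, g = np.argwhere(pred), np.argwhere(gt)
    hausdorff_mm = max(directed_hausdorff(p, g)[0],
                       directed_hausdorff(g, p)[0]) * pixel_spacing_mm

    return accuracy, jaccard, hausdorff_mm
```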
RESULTS AND LIMITATIONS: The developed deep-learning approach was demonstrated to significantly improve prostate segmentation compared with a conventional automated technique, reaching a median accuracy of 98% (95% confidence interval 95-99%), a Jaccard index of 0.93 (0.80-0.96), and a Hausdorff distance of 3.0 (1.3-8.7) mm. Zonal segmentation yielded pixel-wise accuracies of 97% (95-99%) and 98% (96-99%) for the peripheral and transition zones, respectively. Supervised domain adaptation resulted in retention of high performance when applied to images from different ultrasound scanners (p > 0.05). Moreover, the algorithm's assessment of its own segmentation performance showed a strong correlation with the actual segmentation performance (Pearson's correlation 0.72, p < 0.001), indicating that potentially incorrect segmentations can be identified swiftly.
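The self-assessment mechanism can be illustrated in the same spirit: given per-image quality estimates produced by the model alongside the Jaccard indices actually achieved, one can check their correlation and flag low-confidence cases for review. The sketch below is purely illustrative; the data, the `REVIEW_THRESHOLD` value, and all variable names are assumptions of ours, not taken from the paper.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-image data: the model's self-assessed quality and the
# Jaccard index actually achieved against the expert delineation.
predicted_quality = np.array([0.95, 0.91, 0.62, 0.88, 0.97, 0.55])
achieved_jaccard = np.array([0.93, 0.90, 0.70, 0.85, 0.94, 0.48])

# Association between self-assessment and true performance; the paper
# reports Pearson's r = 0.72 (p < 0.001) on its own test data.
r, p = pearsonr(predicted_quality, achieved_jaccard)
print(f"Pearson's r = {r:.2f}, p = {p:.3g}")

# Flag segmentations whose self-assessed quality falls below an assumed
# threshold so they can be reviewed before use in fusion biopsy.
REVIEW_THRESHOLD = 0.75  # illustrative value, not from the paper
print("Flagged images:", np.where(predicted_quality < REVIEW_THRESHOLD)[0])
```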
CONCLUSIONS: Fusion-guided prostate biopsies, targeting suspicious lesions on MRI using TRUS, are increasingly performed. The requirement for (semi)manual prostate delineation places a substantial burden on clinicians. Deep learning provides a means for fast and accurate (zonal) prostate segmentation of TRUS images that translates to different scanners.
PATIENT SUMMARY: Artificial intelligence for automatic delineation of the prostate on ultrasound was shown to be reliable and applicable to different scanners. This method can, for example, be applied to speed up, and possibly improve, guided prostate biopsies using magnetic resonance imaging-transrectal ultrasound fusion.
Original language | English |
---|---|
Pages (from-to) | 78-85 |
Number of pages | 8 |
Journal | European Urology Focus |
Volume | 7 |
Issue number | 1 |
Early online date | 23 Apr 2019 |
DOIs | |
Status | Published - Jan 2021 |
Funding
Prostate segmentation in TRUS B-mode images is challenging and has been studied extensively because of its clinical relevance, for example, for MRI-TRUS fusion biopsy. We presented a deep-learning-based approach for fully automatic (zonal) prostate segmentation and demonstrated its speed, reliability, and ability to adapt to different scanners on a large set of TRUS images from different institutions. Given the image variety in the current datasets, we expect that this adaptation ability can be extrapolated to other manufacturers.

Author contributions: Ruud J.G. van Sloun had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.
Study concept and design: van Sloun, Wildeboer.
Acquisition of data: Mannaerts, Postema, Gayet.
Analysis and interpretation of data: van Sloun, Wildeboer.
Drafting of the manuscript: van Sloun, Wildeboer.
Critical revision of the manuscript for important intellectual content: All authors.
Statistical analysis: van Sloun, Wildeboer.
Obtaining funding: Salomon, Wijkstra, Beerlage, Mischi.
Administrative, technical, or material support: None.
Supervision: Salomon, Wijkstra, Beerlage, Mischi.
Other: None.

Financial disclosures: Ruud J.G. van Sloun certifies that all conflicts of interest, including specific financial interests and relationships and affiliations relevant to the subject matter or materials discussed in the manuscript (eg, employment/affiliation, grants or funding, consultancies, honoraria, stock ownership or options, expert testimony, royalties, or patents filed, received, or pending), are the following: Hessel Wijkstra: unrestricted grant from the Dutch Cancer Society (#UVA2013-5941). Massimo Mischi: European Research Council Starting Grant (#280209). Maudy Gayet: research grant from Astellas Pharma Netherlands B.V. Rogier R. Wildeboer: IMPULS2 program within the Eindhoven University of Technology in collaboration with Philips.

Funding/Support and role of the sponsor: The authors would like to acknowledge NVIDIA Corporation for granting the Titan XP GPU processor.