Feasibility study of catheter segmentation in 3D Frustum Ultrasounds by DCNN

Lan Min, Hongxu Yang, Caifeng Shan, Alexander F. Kolen, Peter de With

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review



3D ultrasound (US) has developed rapidly for medical interventions such as cardiac catheterization. Image-based catheter detection has been studied to help sonographers localize the instrument in 3D US images in a timely manner. Conventionally, 3D imaging methods operate in the Cartesian domain, which is limited by bandwidth and suffers information loss when the data are converted from the original acquisition space, the Frustum domain. Catheter segmentation directly in the Frustum space therefore helps to reduce computational cost and improve efficiency. In this paper, we present a catheter segmentation method for 3D Frustum images via a deep convolutional neural network (DCNN). To accelerate prediction on the whole US Frustum volume, a filter-based pre-selection is applied to reduce the computational cost of the DCNN. In experiments on an ex-vivo dataset, the proposed method segments the catheter in Frustum images with a Dice score of 0.67 within 3 seconds.
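The abstract describes a two-stage pipeline: a cheap voxel filter first discards most of the volume, and the DCNN only classifies the remaining candidates. A minimal sketch of such a filter-based pre-selection is shown below; the smoothed-intensity filter, `sigma`, and `keep_fraction` are illustrative assumptions, not the paper's exact design.

```python
# Hedged sketch of filter-based voxel pre-selection before DCNN patch
# classification. The specific filter and threshold are assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

def preselect_voxels(volume, sigma=1.0, keep_fraction=0.01):
    """Return coordinates of the strongest-responding voxels.

    A Gaussian-smoothed intensity response stands in for a
    catheter-enhancing filter; only the top `keep_fraction` of
    voxels would be passed on to the DCNN.
    """
    response = gaussian_filter(volume.astype(np.float32), sigma=sigma)
    threshold = np.quantile(response, 1.0 - keep_fraction)
    return np.argwhere(response >= threshold)

# Synthetic stand-in for a Frustum volume (axes: depth, elevation, azimuth)
rng = np.random.default_rng(0)
vol = rng.random((32, 32, 32)).astype(np.float32)
vol[10:20, 16, 16] += 5.0          # bright line mimicking a catheter
candidates = preselect_voxels(vol)
print(len(candidates), vol.size)   # far fewer candidates than total voxels
```

Because the DCNN then runs only on patches around these candidate voxels rather than on every voxel of the Frustum volume, the expensive network inference scales with `keep_fraction` instead of the full volume size, which is consistent with the reported sub-3-second prediction time.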
Original language: English
Title of host publication: Medical Imaging 2020
Subtitle of host publication: Image-Guided Procedures, Robotic Interventions, and Modeling
Editors: Baowei Fei, Cristian A. Linte
Number of pages: 6
ISBN (Electronic): 9781510633971
Publication status: Published - 16 Mar 2020
Event: SPIE Medical Imaging 2020 - Houston, United States
Duration: 15 Feb 2020 - 20 Feb 2020

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
ISSN (Print): 0277-786X
ISSN (Electronic): 1996-756X


Conference: SPIE Medical Imaging 2020
Country/Territory: United States


Keywords:

  • 3D Frustum ultrasound
  • Catheter segmentation
  • DCNN
  • Ex-vivo dataset


