Robust and semantic needle detection in 3D ultrasound using orthogonal-plane convolutional neural networks

Research output: Contribution to journal › Article › Academic › peer-review

5 Citations (Scopus)

Abstract

Purpose: During needle interventions, successful automated detection of the needle immediately after insertion is necessary to allow the physician to identify and correct any misalignment between the needle and the target at an early stage, which reduces the number of needle passes and improves health outcomes. Methods: We present a novel approach to localize partially inserted needles in a 3D ultrasound volume with high precision using convolutional neural networks. We propose two methods based on patch classification and semantic segmentation of the needle from orthogonal 2D cross-sections extracted from the volume. For patch classification, each voxel is classified from locally extracted raw data of three orthogonal planes centered on it. We propose a bootstrap resampling approach to enhance the training on our highly imbalanced data. For semantic segmentation, parts of a needle are detected in cross-sections perpendicular to the lateral and elevational axes. We propose to exploit the structural information in the data with a novel thick-slice processing approach for efficient modeling of the context. Results: The introduced methods successfully detect 17 and 22 G needles with a single trained network, showing a robust, generalized approach. Extensive ex-vivo evaluations on datasets of chicken breast and porcine leg show F1-scores of 80 and 84%, respectively. Furthermore, very short needles are detected with tip localization errors of less than 0.7 mm for lengths of only 5 and 10 mm at voxel sizes of 0.2 and 0.36 mm, respectively. Conclusion: Our method accurately detects even very short needles, ensuring that the needle and its tip are maximally visible in the visualized plane during the entire intervention, thereby eliminating the need for advanced bi-manual coordination of the needle and transducer.
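The abstract names three computational ideas that lend themselves to a compact illustration: tri-planar patch extraction around a voxel, bootstrap resampling to balance the rare needle class during training, and thick-slice inputs that give a 2D segmentation network out-of-plane context. The Python/NumPy sketch below shows the general shape of each; the helper names, the 32x32 patch size, the (axial, lateral, elevational) axis ordering, and the 5-slice thickness are all illustrative assumptions, not the authors' implementation.

import numpy as np

def extract_orthogonal_patches(volume, center, half=16):
    """Extract three orthogonal 2D patches centered on one voxel, the input
    form used for tri-planar patch classification. `volume` is a 3D array
    indexed (axial, lateral, elevational); `center` is (z, y, x). Borders
    are zero-padded. Names and patch size are illustrative assumptions."""
    v = np.pad(volume, half, mode="constant")
    z, y, x = (c + half for c in center)
    p_ax = v[z, y - half:y + half, x - half:x + half]    # axial plane
    p_la = v[z - half:z + half, y, x - half:x + half]    # lateral plane
    p_el = v[z - half:z + half, y - half:y + half, x]    # elevational plane
    return np.stack([p_ax, p_la, p_el])                  # (3, 2*half, 2*half)

def bootstrap_batch(patches, labels, batch_size, rng):
    """Draw a class-balanced mini-batch: needle voxels (a tiny minority of
    the volume) are bootstrap-resampled with replacement. This is a generic
    balancing scheme; the paper's exact resampling procedure may differ."""
    pos = np.flatnonzero(labels == 1)                    # needle voxels
    neg = np.flatnonzero(labels == 0)                    # tissue voxels
    k = batch_size // 2
    idx = np.concatenate([rng.choice(pos, k, replace=True),
                          rng.choice(neg, batch_size - k, replace=False)])
    rng.shuffle(idx)
    return patches[idx], labels[idx]

def thick_slice(volume, index, axis, thickness=5):
    """Stack `thickness` adjacent cross-sections along `axis` (1 = lateral,
    2 = elevational) as input channels, so a 2D segmentation network also
    sees out-of-plane context. Border slices are repeated via clamping."""
    n = volume.shape[axis]
    lo = index - thickness // 2
    planes = [np.take(volume, min(max(lo + i, 0), n - 1), axis=axis)
              for i in range(thickness)]
    return np.stack(planes)                              # (thickness, H, W)

# Toy usage on a random volume standing in for a B-mode acquisition:
vol = np.random.default_rng(0).random((64, 64, 64), dtype=np.float32)
triplanar = extract_orthogonal_patches(vol, (32, 32, 32))   # (3, 32, 32)
slab = thick_slice(vol, 20, axis=1)                         # (5, 64, 64)

Balanced sampling matters here because needle voxels typically occupy far less than one percent of an ultrasound volume, so a network trained on uniformly drawn patches would tend to collapse to the background class.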

Language: English
Pages: 1321-1333
Number of pages: 13
Journal: International Journal of Computer Assisted Radiology and Surgery
Volume: 13
Issue number: 9
DOIs: 10.1007/s11548-018-1798-3
State: Published - 1 Sep 2018

Fingerprint

Semantics
Needles
Ultrasonics
Neural networks
Transducers
Chickens
Leg
Breast
Swine
Health
Physicians

Keywords

  • 3D ultrasound
  • Convolutional neural networks
  • Needle detection

Cite this

@article{f46d8de09c7d4ec6bcee7849d75d8dbd,
title = "Robust and semantic needle detection in 3D ultrasound using orthogonal-plane convolutional neural networks",
abstract = "Purpose: During needle interventions, successful automated detection of the needle immediately after insertion is necessary to allow the physician to identify and correct any misalignment between the needle and the target at an early stage, which reduces the number of needle passes and improves health outcomes. Methods: We present a novel approach to localize partially inserted needles in a 3D ultrasound volume with high precision using convolutional neural networks. We propose two methods based on patch classification and semantic segmentation of the needle from orthogonal 2D cross-sections extracted from the volume. For patch classification, each voxel is classified from locally extracted raw data of three orthogonal planes centered on it. We propose a bootstrap resampling approach to enhance the training on our highly imbalanced data. For semantic segmentation, parts of a needle are detected in cross-sections perpendicular to the lateral and elevational axes. We propose to exploit the structural information in the data with a novel thick-slice processing approach for efficient modeling of the context. Results: The introduced methods successfully detect 17 and 22 G needles with a single trained network, showing a robust, generalized approach. Extensive ex-vivo evaluations on datasets of chicken breast and porcine leg show F1-scores of 80 and 84{\%}, respectively. Furthermore, very short needles are detected with tip localization errors of less than 0.7 mm for lengths of only 5 and 10 mm at voxel sizes of 0.2 and 0.36 mm, respectively. Conclusion: Our method accurately detects even very short needles, ensuring that the needle and its tip are maximally visible in the visualized plane during the entire intervention, thereby eliminating the need for advanced bi-manual coordination of the needle and transducer.",
keywords = "3D ultrasound, Convolutional neural networks, Needle detection",
author = "Arash Pourtaherian and {Ghazvinian Zanjani}, Farhad and Svitlana Zinger and Nenad Mihajlovic and Ng, {Gary C.} and Korsten, {Hendrikus H.M.} and {de With}, {Peter H.N.}",
year = "2018",
month = "9",
day = "1",
doi = "10.1007/s11548-018-1798-3",
language = "English",
volume = "13",
pages = "1321--1333",
journal = "International Journal of Computer Assisted Radiology and Surgery",
issn = "1861-6410",
publisher = "Springer",
number = "9",

}

TY - JOUR

T1 - Robust and semantic needle detection in 3D ultrasound using orthogonal-plane convolutional neural networks

AU - Pourtaherian, Arash

AU - Ghazvinian Zanjani, Farhad

AU - Zinger, Svitlana

AU - Mihajlovic, Nenad

AU - Ng, Gary C.

AU - Korsten, Hendrikus H.M.

AU - de With, Peter H.N.

PY - 2018/9/1

Y1 - 2018/9/1

N2 - Purpose: During needle interventions, successful automated detection of the needle immediately after insertion is necessary to allow the physician to identify and correct any misalignment between the needle and the target at an early stage, which reduces the number of needle passes and improves health outcomes. Methods: We present a novel approach to localize partially inserted needles in a 3D ultrasound volume with high precision using convolutional neural networks. We propose two methods based on patch classification and semantic segmentation of the needle from orthogonal 2D cross-sections extracted from the volume. For patch classification, each voxel is classified from locally extracted raw data of three orthogonal planes centered on it. We propose a bootstrap resampling approach to enhance the training on our highly imbalanced data. For semantic segmentation, parts of a needle are detected in cross-sections perpendicular to the lateral and elevational axes. We propose to exploit the structural information in the data with a novel thick-slice processing approach for efficient modeling of the context. Results: The introduced methods successfully detect 17 and 22 G needles with a single trained network, showing a robust, generalized approach. Extensive ex-vivo evaluations on datasets of chicken breast and porcine leg show F1-scores of 80 and 84%, respectively. Furthermore, very short needles are detected with tip localization errors of less than 0.7 mm for lengths of only 5 and 10 mm at voxel sizes of 0.2 and 0.36 mm, respectively. Conclusion: Our method accurately detects even very short needles, ensuring that the needle and its tip are maximally visible in the visualized plane during the entire intervention, thereby eliminating the need for advanced bi-manual coordination of the needle and transducer.

KW - 3D ultrasound

KW - Convolutional neural networks

KW - Needle detection

UR - http://www.scopus.com/inward/record.url?scp=85047919138&partnerID=8YFLogxK

DO - 10.1007/s11548-018-1798-3

M3 - Article

VL - 13

SP - 1321

EP - 1333

JO - International Journal of Computer Assisted Radiology and Surgery

JF - International Journal of Computer Assisted Radiology and Surgery

SN - 1861-6410

IS - 9

ER -