The automatic classification of breast tumors in ultrasound images is of great significance for improving doctors' efficiency and reducing the rate of misdiagnosis. Novel 3D breast ultrasound data contain more diagnostic information, but images acquired from different directions have distinct appearances as a result of the ultrasound imaging mechanism. For such breast ultrasound data, this paper designed three kinds of convolutional neural network models, exploiting the flexibility of CNNs and their ability to learn features automatically: the three models accept transverse plane images; transverse plane and coronal plane images; and images together with annotation information, respectively. The effects of fusing different information on the accuracy of breast tumor classification were investigated. A dataset containing 880 images (401 benign, 479 malignant) and their annotations was employed, and 5-fold cross-validation was performed to calculate the accuracy and AUC of each model. The experimental results indicated that the models designed in this paper can process images and annotations simultaneously. Compared with the single-input model, the multi-information fusion model improved classification accuracy by 2.91%, achieving an accuracy of 75.11% and an AUC of 0.8294. The proposed models provide a reference for classification applications of convolutional neural networks with multi-information fusion.
Number of pages: 9
Journal: Chinese Journal of Biomedical Engineering
Publication status: Published - 20 Aug 2018
- 3D breast ultrasound
- Convolutional neural networks
- Medical image classification
- Multi-information fusion
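The multi-information fusion idea described in the abstract (separate network branches for each image plane plus annotation features, merged before the final classifier) can be sketched as below. This is a minimal illustrative PyTorch model, not the paper's actual architecture: the layer sizes, the number of annotation features, and the class name `FusionNet` are all assumptions for demonstration.

```python
import torch
import torch.nn as nn

class FusionNet(nn.Module):
    """Illustrative multi-input fusion classifier: one small CNN branch per
    image plane (transverse, coronal) and an MLP for annotation features;
    branch outputs are concatenated before the benign/malignant classifier.
    Layer sizes are hypothetical, not taken from the paper."""

    def __init__(self, n_annot_features=8):
        super().__init__()

        def image_branch():
            # Tiny CNN feature extractor; real models would be deeper.
            return nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten())  # -> 32 features

        self.transverse = image_branch()
        self.coronal = image_branch()
        self.annot = nn.Sequential(
            nn.Linear(n_annot_features, 16), nn.ReLU())  # -> 16 features
        # Fused feature vector: 32 + 32 + 16 = 80 -> 2 classes
        self.classifier = nn.Linear(32 + 32 + 16, 2)

    def forward(self, x_transverse, x_coronal, annotations):
        fused = torch.cat([self.transverse(x_transverse),
                           self.coronal(x_coronal),
                           self.annot(annotations)], dim=1)
        return self.classifier(fused)

model = FusionNet()
logits = model(torch.randn(4, 1, 64, 64),   # transverse plane batch
               torch.randn(4, 1, 64, 64),   # coronal plane batch
               torch.randn(4, 8))           # annotation features
print(tuple(logits.shape))  # prints (4, 2)
```

Dropping the coronal branch (or the annotation branch) from the concatenation recovers the single-input and two-input variants the abstract compares.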