U-Net transfer learning backbones for lesion segmentation in breast ultrasound images

Mohamed Bal-Ghaoui, My Hachem El Yousfi Alaoui, Abdelilah Jilbab, Abdennacer Bourouhou

Abstract


Breast ultrasound images are highly valuable for the early detection of breast cancer. However, they suffer from low resolution and speckle noise, which limit their interpretability and make their reading dependent on radiologists' expertise. Like most medical imaging data, breast ultrasound datasets are scarce and imbalanced, and annotating them is tedious and time-consuming. Transfer learning, a deep learning technique, can help overcome this shortage of available images. This paper implements transfer learning U-Net backbones for the automatic segmentation of breast ultrasound lesions and adds a threshold selection mechanism to deliver well-generalized segmentation of breast tumors. The work uses the public breast ultrasound images (BUSI) dataset and evaluates ten state-of-the-art candidate models as U-Net backbones. These models were trained with five-fold cross-validation on 630 images containing benign and malignant cases. Five of the ten models performed well, and the best U-Net backbone was DenseNet121, which achieved an average Dice coefficient of 0.7370 and a sensitivity of 0.7255. The model's robustness was also evaluated against normal cases, where it produced accurate results on 72 out of 113 images, more than the other four best models.
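The abstract does not include code; the following is a minimal Python sketch of the kind of pipeline it describes: a U-Net with a pretrained DenseNet121 encoder and a Dice coefficient computed at a chosen probability threshold. It assumes the segmentation_models_pytorch library; the model settings, input size, and threshold value are illustrative and are not the authors' configuration.

# Illustrative sketch (not the authors' code), assuming segmentation_models_pytorch.
import torch
import segmentation_models_pytorch as smp

# U-Net decoder on top of a DenseNet121 encoder pretrained on ImageNet.
model = smp.Unet(
    encoder_name="densenet121",
    encoder_weights="imagenet",
    in_channels=3,   # grayscale ultrasound frames replicated to 3 channels
    classes=1,       # single foreground class: lesion vs. background
)

def dice_coefficient(logits, target, threshold=0.5, eps=1e-7):
    """Dice coefficient between a thresholded probability map and a binary mask."""
    pred = (torch.sigmoid(logits) > threshold).float()
    intersection = (pred * target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

# Example forward pass on a dummy 256x256 image and mask.
x = torch.randn(1, 3, 256, 256)
mask = (torch.rand(1, 1, 256, 256) > 0.5).float()
print(dice_coefficient(model(x), mask))

The threshold argument is where a selection mechanism such as the one described in the paper would plug in: rather than fixing it at 0.5, one would sweep candidate thresholds on validation folds and keep the value that maximizes the average Dice coefficient.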


Keywords


breast ultrasound images; deep learning; lesion segmentation; transfer learning; U-Net backbones



DOI: http://doi.org/10.11591/ijece.v13i5.pp5747-5754

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578

This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).