Neural networks model based on an automated multi-scale method for mammogram classification

Abstract:

Breast cancer is the most commonly diagnosed cancer among women. Convolutional neural network (CNN)-based mammogram classification plays a vital role in early breast cancer detection. However, existing methods pay too much attention to local lesions and ignore the global characteristics of the breast. In the process of diagnosis, doctors not only examine the features of local lesions but also compare them against the global characteristics of the breast. Mammogram images have a notable property: the original image is large, while the lesions are relatively small, which makes lesions easy to overlook. This paper proposes an automated multi-scale end-to-end deep neural network model for mammogram classification that requires only mammogram images and class labels (no ROI annotations). The proposed model generates feature maps at three scales, allowing the classifier to combine global information with local lesion features for classification. Moreover, the images processed by our method contain fewer non-breast pixels and retain as much small-lesion information as possible, which helps the model focus on small lesions. The performance of our method is verified on the INbreast dataset, where our model outperforms other state-of-the-art mammogram classification algorithms. Furthermore, when the multi-scale method is applied to networks with fewer parameters, it achieves comparable performance while saving 60% of the computing resources. This shows that the multi-scale method benefits both performance and computational efficiency.
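The paper's architecture is not detailed in this abstract, but the core idea of fusing three scales of feature maps into a single classifier input can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the channel counts and the use of global average pooling followed by concatenation are assumptions chosen for clarity.

```python
import numpy as np

def global_avg_pool(feature_map):
    """Collapse a (channels, height, width) feature map to a (channels,) vector."""
    return feature_map.mean(axis=(1, 2))

def multi_scale_fusion(feature_maps):
    """Pool each scale's feature map and concatenate into one descriptor.

    feature_maps: list of (C_i, H_i, W_i) arrays, e.g. taken from three
    stages of a CNN backbone (hypothetical stages, not from the paper).
    """
    return np.concatenate([global_avg_pool(f) for f in feature_maps])

# Toy example: three scales with hypothetical channel counts.
# Coarser scales have more channels but smaller spatial extent, so the
# fused descriptor carries both global context and fine local detail.
fmaps = [
    np.random.rand(64, 56, 56),    # fine scale: small lesions
    np.random.rand(128, 28, 28),   # intermediate scale
    np.random.rand(256, 14, 14),   # coarse scale: global breast context
]
descriptor = multi_scale_fusion(fmaps)
print(descriptor.shape)  # (448,) = 64 + 128 + 256 channels
```

The fused descriptor would then feed a standard classification head; the key point mirrored from the abstract is that no ROI annotation is needed, since all three scales are produced by the network itself.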

Keywords: Breast cancer, Mammogram classification, Small lesions, Convolutional neural networks, Multi-scale feature

Article history: Received 15 October 2019, Revised 10 July 2020, Accepted 2 September 2020, Available online 19 September 2020, Version of Record 19 September 2020.

DOI: https://doi.org/10.1016/j.knosys.2020.106465