TY - JOUR
T1 - Automatic detection of breast lesions in automated 3D breast ultrasound with cross-organ transfer learning
AU - BAO, Lingyun
AU - HUANG, Zhengrui
AU - LIN, Zehui
AU - SUN, Yue
AU - CHEN, Hui
AU - LI, You
AU - LI, Zhang
AU - YUAN, Xiaochen
AU - XU, Lin
AU - TAN, Tao
N1 - Publisher Copyright:
© 2024 Beijing Zhongke Journal Publishing Co. Ltd
PY - 2024/6
Y1 - 2024/6
N2 - Background: Deep convolutional neural networks have garnered considerable attention in numerous machine learning applications, particularly in visual recognition tasks such as image and video analysis. There is growing interest in applying this technology to diverse problems in medical image analysis. Automated three-dimensional breast ultrasound is a vital tool for detecting breast cancer, and computer-aided diagnosis software based on deep learning can effectively assist radiologists in diagnosis. However, such network models are prone to overfitting during training owing to challenges such as insufficient training data. This study attempts to solve the problems caused by small datasets and to improve model detection performance. Methods: We propose a deep-learning-based breast cancer detection framework that combines a transfer learning method based on cross-organ cancer detection with a contrastive learning method based on the Breast Imaging Reporting and Data System (BI-RADS). Results: With cross-organ transfer learning and BI-RADS-based contrastive learning, the average sensitivity of the model increased by up to 16.05%. Conclusion: Our experiments demonstrate that parameters and experience from cancer detection in different organs can be mutually referenced, and that the BI-RADS-based contrastive learning method can improve the detection performance of the model.
AB - Background: Deep convolutional neural networks have garnered considerable attention in numerous machine learning applications, particularly in visual recognition tasks such as image and video analysis. There is growing interest in applying this technology to diverse problems in medical image analysis. Automated three-dimensional breast ultrasound is a vital tool for detecting breast cancer, and computer-aided diagnosis software based on deep learning can effectively assist radiologists in diagnosis. However, such network models are prone to overfitting during training owing to challenges such as insufficient training data. This study attempts to solve the problems caused by small datasets and to improve model detection performance. Methods: We propose a deep-learning-based breast cancer detection framework that combines a transfer learning method based on cross-organ cancer detection with a contrastive learning method based on the Breast Imaging Reporting and Data System (BI-RADS). Results: With cross-organ transfer learning and BI-RADS-based contrastive learning, the average sensitivity of the model increased by up to 16.05%. Conclusion: Our experiments demonstrate that parameters and experience from cancer detection in different organs can be mutually referenced, and that the BI-RADS-based contrastive learning method can improve the detection performance of the model.
KW - Automated 3D breast ultrasound
KW - Breast cancers
KW - Breast ultrasound
KW - Computer-aided diagnosis
KW - Convolutional neural networks
KW - Cross-organ learning
KW - Deep learning
KW - Transfer learning
UR - http://www.scopus.com/inward/record.url?scp=85196934782&partnerID=8YFLogxK
U2 - 10.1016/j.vrih.2024.02.001
DO - 10.1016/j.vrih.2024.02.001
M3 - Article
AN - SCOPUS:85196934782
SN - 2096-5796
VL - 6
SP - 239
EP - 251
JO - Virtual Reality and Intelligent Hardware
JF - Virtual Reality and Intelligent Hardware
IS - 3
ER -