A Practical Implementation of Deep Learning Method for Supporting the Classification of Breast Lesions in Ultrasound Images

  • Han, Seokmin (Department of Computer Science and Information Engineering, Korea National University of Transportation) ;
  • Lee, Suchul (Department of Computer Science and Information Engineering, Korea National University of Transportation) ;
  • Lee, Jun-Rak (Division of Liberal Studies, Kangwon National University)
  • Received : 2019.01.07
  • Accepted : 2019.01.20
  • Published : 2019.03.31


In this research, a practical deep learning framework for differentiating benign and malignant breast lesions in ultrasound images is proposed. A total of 7408 ultrasound breast images from 5151 patient cases were collected. All cases were biopsy-proven, and the lesions were semi-automatically segmented. To compensate for shift introduced during segmentation, the boundary of each lesion was drawn with a Fully Convolutional Network (FCN) segmentation method seeded by a point specified by the radiologist. The data set consists of 4254 benign and 3154 malignant lesions; of the 7408 images, 6579 were used for training and 829 for testing. The images were processed through histogram equalization, image cropping, and margin augmentation, in which the training set was augmented by varying the margin between the boundary of each lesion and the boundary of the cropped image. The networks trained with and without augmentation both achieved an AUC over 0.95, with about 90% accuracy, 0.86 sensitivity, and 0.95 specificity. Although the proposed framework still requires a radiologist to point to the location of the target ROI, it showed promising results: it supports the human radiologist in achieving reliable diagnostic performance and helps create a fluent diagnostic workflow that meets the fundamental purpose of CADx.
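The preprocessing described above (histogram equalization followed by cropping the lesion with varying margins) can be sketched as below. This is a minimal illustration, not the authors' implementation: the margin values, the box format, and the simple global equalization are assumptions chosen for clarity; the paper's exact crop parameters are not reproduced here.

```python
import numpy as np

def equalize_histogram(img):
    """Simple global histogram equalization for an 8-bit grayscale image."""
    hist, _ = np.histogram(img.flatten(), bins=256, range=(0, 256))
    cdf = hist.cumsum()
    # Ignore empty bins so the mapping spans the full 0-255 range.
    cdf_masked = np.ma.masked_equal(cdf, 0)
    cdf_scaled = (cdf_masked - cdf_masked.min()) * 255 / (cdf_masked.max() - cdf_masked.min())
    lut = np.ma.filled(cdf_scaled, 0).astype(np.uint8)
    return lut[img]

def crop_with_margins(img, box, margins=(10, 20, 30)):
    """Crop a lesion bounding box (y0, x0, y1, x1) with several pixel
    margins to augment the training set; crops are clipped to the image
    border. The margin values are illustrative, not the paper's."""
    y0, x0, y1, x1 = box
    crops = []
    for m in margins:
        ys, xs = max(0, y0 - m), max(0, x0 - m)
        ye, xe = min(img.shape[0], y1 + m), min(img.shape[1], x1 + m)
        crops.append(img[ys:ye, xs:xe])
    return crops
```

In a pipeline like the one described, each crop would then be resized to the classification network's input resolution (e.g. 224x224 for GoogLeNet) before training.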


Breast Cancer;Deep Learning;FCN;segmentation

Figure 1. The conceptual architecture of the proposed deep learning CAD framework

Figure 2. The conceptual architecture of the proposed deep learning CAD framework

Figure 4. ROC curves of GoogLeNets trained and tested on the images without a margin (black) and with a margin (red).

Figure 5. Saliency map examples that show where the important information exists in the image.

Figure 6. ROC curves of GoogLeNet without and with data augmentation.

Figure 7. Perturbation of the center location specified by the radiologist does not significantly affect performance.

Figure 8. Implementation examples of (a) a possibly benign lesion and (b) a possibly malignant lesion.

Figure 3. (a) and (b) are examples of segmented boundaries of breast lesions.

Table 1. Overview of the lesion size attributes of training data and test data.

Table 2. Comparison of the required time for learning and the number of training images.

Table 3. Performance comparison of a CNN trained on the images with a margin to a network trained on images without a margin. GLN refers to GoogLeNet.

Table 4. Diagnostic performances of CNN networks.

Table 5. Diagnostic performances on centered images and on perturbed images.


Supported by: National Research Foundation of Korea (NRF), Korea National University of Transportation, Kangwon National University


  1. J. Long, E. Shelhamer, and T. Darrell, "Fully Convolutional Networks for Semantic Segmentation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.39, April 2017, pp.640-651.
  3. A. Kornecki, "Current Status of Breast Ultrasound," Can. Assoc. Radiol. J., vol.62, 2011, pp.31-40.
  4. L. Tabár, B. Vitak, T.H. Chen, A.M. Yen, A. Cohen, T. Tot, S.Y. Chiu, S.L. Chen, J.C. Fann, J. Rosell, H. Fohlin, R.A. Smith, and S.W. Duffy, "Swedish two-county trial: impact of mammographic screening on breast cancer mortality during 3 decades," Radiology, vol.260(3), 2011, pp.658-663.
  5. K. Doi, "Computer-aided diagnosis in medical imaging: historical review, current status and future potential," Comput Med Imaging Graph, vol.31, 2007, pp.198-211.
  6. B. van Ginneken, C.M. Schaefer-Prokop, and M. Prokop, "Computer-aided diagnosis: how to move from the laboratory to the clinic," Radiology, vol.261, 2011, pp.719-732.
  7. M.L. Giger, H. Chan, and J. Boone, "Anniversary paper: history and status of CAD and quantitative image analysis: the role of medical physics and AAPM," Med Phys, vol.35, 2008, pp.5799-5820.
  8. J. Cheng et al., "Computer-aided US diagnosis of breast lesions by using cell-based contour grouping," Radiology, vol.255, 2010, pp.746-754.
  9. M.L. Giger, N. Karssemeijer, and J.A. Schnabel, "Breast image analysis for risk assessment, detection, diagnosis, and treatment of cancer," Annu Rev Biomed Eng, vol.15, 2013, pp.327-357.
  10. S. Joo, Y.S. Yang, W.K. Moon, and H.C. Kim, "Computer-aided diagnosis of solid breast nodules: use of an artificial neural network based on multiple sonographic features," IEEE Trans Med Imag, vol.23, 2004, pp.1292-1300.
  11. C.M. Chen et al., "Breast Lesions on Sonograms: Computer-aided Diagnosis with Nearly Setting-Independent Features and Artificial Neural Networks," Radiology, vol.226, 2003, pp.504-514.
  12. K. Drukker, C. Sennett, and M.L. Giger, "Automated method for improving system performance of computer-aided diagnosis in breast ultrasound," IEEE Trans Med Imag, vol.28, 2009, pp.122-128.
  13. K. Awai et al., "Pulmonary Nodules: Estimation of Malignancy at Thin-Section Helical CT—Effect of Computer-aided Diagnosis on Performance of Radiologists," Radiology, vol.239, 2006, pp.276-284.
  14. M.B. McCarville et al., "Distinguishing Benign from Malignant Pulmonary Nodules with Helical Chest CT in Children with Malignant Solid Tumors," Radiology, vol.239, 2006, pp.514-520.
  15. I.C. Sluimer, P.F. van Waes, M.A. Viergever, and B. van Ginneken, "Computer-aided diagnosis in high resolution CT of the lungs," Med Phys, vol.30, 2003, pp.3081-3090.
  16. T. Sun, R. Zhang, J. Wang, X. Li, and X. Guo, "Computer-aided diagnosis for early-stage lung cancer based on longitudinal and balanced data," PLoS ONE, vol.8, 2013, e63559.
  17. T.W. Way et al., "Computer-aided diagnosis of pulmonary nodules on CT scans: improvement of classification performance with nodule surface features," Med Phys, vol.36, 2009, pp.3086-3098.
  18. S.G. Armato III and W.F. Sensakovic, "Automated lung segmentation for thoracic CT: Impact on computer-aided diagnosis," Acad Radiol, vol.11, 2004, pp.1011-1021.
  19. T.W. Way et al., "Computer-aided diagnosis of pulmonary nodules on CT scans: segmentation and classification using 3D active contours," Med Phys, vol.33, 2006, pp.2323-2337.
  20. J. Wang et al., "Discrimination of Breast Cancer with Microcalcifications on Mammography by Deep Learning," Sci Rep, vol.6, 2016.
  21. T. Ayer et al., "Computer-aided diagnostic models in breast cancer screening," Imaging Med., vol.2(3), 2010, pp.313-323.
  22. J.A. Cruz and D.S. Wishart, "Applications of Machine Learning in Cancer Prediction and Prognosis," Cancer Inform., vol.2, 2006, pp.59-77.
  23. V. Vishrutha and M. Ravishankar, "Early Detection and Classification of Breast Cancer," Proceedings of the 3rd International Conference on Frontiers of Intelligent Computing: Theory and Applications (FICTA), 2014, pp.413-419.
  24. M. Krishnan et al., "Statistical analysis of mammographic features and its classification using support vector machine," Expert Systems with Applications, vol.37(1), 2010, pp.470-478.
  25. Y. Bengio, A. Courville, and P. Vincent, "Representation learning: a review and new perspectives," IEEE Trans Pattern Anal Mach Intell, vol.35(8), 2013, pp.1798-1828.
  26. W. Zhang et al., "Deep convolutional neural networks for multi-modality isointense infant brain image segmentation," NeuroImage, vol.108, 2015, pp.214-224.
  27. H.-C. Shin, M.R. Orton, D.J. Collins, S.J. Doran, and M.O. Leach, "Stacked autoencoders for unsupervised feature learning and multiple organ detection in a pilot study using 4D patient data," IEEE Trans Pattern Anal Mach Intell, vol.35, 2013, pp.1930-1943.
  28. H. Roth et al., "Improving Computer-aided Detection using Convolutional Neural Networks and Random View Aggregation," IEEE Trans Med Imag, vol.35(5), 2016, pp.1170-1181.
  29. A. Seff et al., "Leveraging Mid-Level Semantic Boundary Cues for Automated Lymph Node Detection," Med Image Comput Comput Assist Interv (MICCAI), vol.9350, 2015, pp.53-61.
  30. N. Tajbakhsh, M.B. Gotway, and J. Liang, "Computer-aided pulmonary embolism detection using a novel vessel-aligned multi-planar image representation and convolutional neural networks," Med Image Comput Comput Assist Interv (MICCAI), 2015.
  31. J. Arevalo, A. Cruz-Roa, and F.A. Gonzalez, "Hybrid image representation learning model with invariant features for basal cell carcinoma detection," Proc SPIE 8922, 2013, pp.89220M-89220M-6.
  32. A.A. Cruz-Roa, J.E.A. Ovalle, A. Madabhushi, and F.A.G. Osorio, "A deep learning architecture for image representation, visual interpretability and automated basal-cell carcinoma cancer detection," Medical Image Computing and Computer-Assisted Intervention (MICCAI) 2013, 2013, pp.403-410.
  33. H. Chen et al., "Automatic Fetal Ultrasound Standard Plane Detection Using Knowledge Transferred Recurrent Neural Networks," Med Image Comput Comput Assist Interv (MICCAI), vol.9349, 2015, pp.507-514.
  34. A. Prasoon, K. Petersen, C. Igel, F. Lauze, E. Dam, and M. Nielsen, "Deep feature learning for knee cartilage segmentation using a triplanar convolutional neural network," Medical Image Computing and Computer-Assisted Intervention (MICCAI 2013), Vol. 8150 of Lecture Notes in Computer Science, 2013, pp.246-253.
  35. H.-I. Suk and D. Shen, "Deep learning-based feature representation for AD/MCI classification," Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol.8150, 2013, pp.583-590.
  36. H.-I. Suk, S.-W. Lee, and D. Shen, "Latent feature representation with stacked auto-encoder for AD/MCI diagnosis," Brain Struct Funct, 2013, pp.1-19.
  37. H.-I. Suk, S.-W. Lee, and D. Shen, "Hierarchical feature representation and multimodal fusion with deep learning for AD/MCI diagnosis," Neuroimage, vol.101, 2014, pp.569-582.
  38. F. Li, L. Tran, K.-H. Thung, S. Ji, D. Shen, and J. Li, "A robust deep learning for improved classification of AD/MCI patients," IEEE J. Biomed. Health Inform., 2015, pp.1610-1616.
  39. A. Jalalian, S.B. Mashohor, H.R. Mahmud, M.I.B. Saripan, A.R.B. Ramli, and B. Karasfi, "Computer-aided detection/diagnosis of breast cancer in mammography and ultrasound: a review," Clin Imaging, vol.37(3), 2013, pp.420-426.
  40. J.Z. Cheng et al., "Computer-Aided Diagnosis with Deep Learning Architecture: Applications to Breast Lesions in US Images and Pulmonary Nodules in CT Scans," Sci Rep, 2016.
  41. K. Simonyan, A. Vedaldi, and A. Zisserman, "Deep Inside Convolutional Networks: Visualising Image Classification Models and Saliency Maps," arXiv preprint arXiv:1312.6034, 2013.
  42. S. Han, H.K. Kang, J.Y. Jeong, M.H. Park, W. Kim, W.C. Bang, and Y.K. Seong, "A Deep Learning Framework for Supporting the Classification of Breast Lesions in Ultrasound Images," Phys. Med. Biol., vol.62, 2017.
  43. S. Han, J. Jeong, and H. Kim, "An Implementation of Deep Learning Method of Breast Lesion Classification," Proceedings of the Joint Conference on Communications and Information 2019, submitted.
  44. Y. Jia, E. Shelhamer, J. Donahue, S. Karayev, J. Long, R.B. Girshick, S. Guadarrama, and T. Darrell, "Caffe: Convolutional Architecture for Fast Feature Embedding," in ACM Multimedia, vol.2, 2014.