Background

Breast cancer is one of the leading causes of cancer death among women, and one in eight women in the United States will develop breast cancer during her lifetime. In clinical routine, tumor segmentation is a critical but challenging step for further cancer diagnosis and treatment planning.

Many breast ultrasound (BUS) segmentation approaches have been proposed in the last two decades, but most of them have been assessed on relatively small private datasets using different quantitative metrics, which makes performance comparisons inconsistent. There is therefore a pressing need for a benchmark that compares existing methods objectively on a public dataset, determines the performance of the best breast tumor segmentation algorithms available today, and identifies which segmentation strategies are valuable in clinical practice and theoretical study. In this work, we publish a B-mode BUS image segmentation benchmark (BUSIS) with 562 images and quantitatively compare fourteen state-of-the-art BUS segmentation methods.
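For reference, two region-based metrics widely used to evaluate BUS segmentation are the Dice similarity coefficient (DSC) and the Jaccard index (JI); the benchmark paper cited below defines the full metric set. The following Python sketch shows how they can be computed for one image, assuming the prediction and ground truth are binary NumPy masks (the function name and mask convention are illustrative assumptions, not part of the benchmark code):

    import numpy as np

    def dice_and_jaccard(pred, gt):
        """Compute the Dice similarity coefficient and Jaccard index
        between a predicted binary mask and a ground-truth mask."""
        pred = np.asarray(pred, dtype=bool)
        gt = np.asarray(gt, dtype=bool)
        intersection = np.logical_and(pred, gt).sum()
        union = np.logical_or(pred, gt).sum()
        # Guard against empty masks to avoid division by zero.
        dice = 2.0 * intersection / max(pred.sum() + gt.sum(), 1)
        jaccard = intersection / max(union, 1)
        return float(dice), float(jaccard)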

Dataset, Ground Truth and Metrics

Basic dataset information:
The 562 images were collected and prepared through the joint efforts of scientists and researchers from the following institutions:
Utah State University
University of Idaho
Harbin Institute of Technology
The Second Affiliated Hospital of Harbin Medical University
The Affiliated Hospital of Qingdao University
The Second Hospital of Hebei Medical University

The images came from different sources and were collected using multiple ultrasound devices:

GE VIVID 7
LOGIQ E9
Hitachi EUB-6500
Philips iU22
Siemens ACUSON S2000

Informed consent to the protocol was obtained from all patients, and patient privacy is well protected.

Ground truth generation:
Four experienced radiologists were involved in the ground truth generation: three radiologists read and delineated each tumor boundary individually, and the fourth (a senior expert) judged whether the majority voting results needed adjustment. The complete procedure for ground truth generation is as follows; a minimal code sketch of the voting and labeling steps appears after the list.
Step 1: each of the three experienced radiologists delineates each tumor boundary manually, producing three delineation results for each BUS image.
Step 2: treat all pixels inside or on the boundary as the tumor region and all outside pixels as background; conduct pixel-wise majority voting to generate the preliminary result for each BUS image.
Step 3: a senior expert reads each BUS image, refers to its corresponding preliminary result, and decides whether it needs any adjustment.
Step 4: label tumor pixels as 1 and background pixels as 0, and save the ground truth for each BUS image as a binary, uncompressed image.
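As a concrete illustration of Steps 1, 2, and 4, the following Python sketch performs the pixel-wise majority voting and saves the binary result; the file names and the use of NumPy/Pillow are illustrative assumptions, not the actual tools used in the annotation pipeline (Step 3, the senior expert's review, is manual):

    import numpy as np
    from PIL import Image

    def majority_vote(masks):
        """Pixel-wise majority voting over three binary delineations
        (1 = tumor, 0 = background): a pixel is labeled tumor when at
        least two of the three radiologists marked it as tumor."""
        votes = np.stack([np.asarray(m, dtype=np.uint8) for m in masks]).sum(axis=0)
        return (votes >= 2).astype(np.uint8)

    # Step 1: load the three manual delineations (file names are hypothetical).
    masks = [np.array(Image.open(f"reader{i}_mask.png")) // 255 for i in (1, 2, 3)]

    # Step 2: majority voting yields the preliminary result.
    preliminary = majority_vote(masks)

    # Step 4: store the 0/1 labels in a losslessly saved binary image
    # (scaled to 0/255 here so the mask is viewable).
    Image.fromarray((preliminary * 255).astype(np.uint8)).save("ground_truth.png")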

Release Agreement Download:

If you would like to access this dataset, please sign the dataset release agreement and email it to us. We will send you a link to the dataset after receiving the signed agreement. Please allow up to 10 business days for processing.

References

If you use the dataset and results of this work, please cite the following papers:

  1. M. Xian, Y. Zhang, and H. D. Cheng, "Fully automatic segmentation of breast ultrasound images based on breast characteristics in space and frequency domains," Pattern Recognit., vol. 48, no. 2, pp. 485-497, 2015.
  2. H. D. Cheng, J. Shan, W. Ju, Y. Guo, and L. Zhang, "Automated breast cancer detection and classification using ultrasound images: A survey," Pattern Recognit., vol. 43, no. 1, pp. 299-317, 2010.
  3. Y. Zhang, M. Xian, H. D. Cheng, B. Shareef, J. Ding, F. Xu, K. Huang, B. Zhang, C. Ning, and Y. Wang, "BUSIS: A benchmark for breast ultrasound image segmentation," Healthcare, vol. 10, no. 4, p. 729, 2022.
  4. M. Xian, Y. Zhang, H. D. Cheng, F. Xu, B. Zhang, and J. Ding, "Automatic breast ultrasound image segmentation: A survey," Pattern Recognit., vol. 79, pp. 340-355, 2018.

Main Contributors:

H. D. Cheng, Fei Xu, and Kuan Huang, Utah State University
Min Xian, Boyu Zhang, University of Idaho
Yingtao Zhang, Jianrui Ding, Harbin Institute of Technology
Chunping Ning, The Affiliated Hospital of Qingdao University
Ying Wang, The Second Hospital of Hebei Medical University