WO2012109643A2 - Systems, methods and computer readable storage mediums storing instructions for classifying breast ct images - Google Patents

Systems, methods and computer readable storage mediums storing instructions for classifying breast ct images Download PDF

Info

Publication number
WO2012109643A2
WO2012109643A2 (PCT/US2012/024824)
Authority
WO
WIPO (PCT)
Prior art keywords
image
tissue
classification
breast
constituent
Prior art date
Application number
PCT/US2012/024824
Other languages
French (fr)
Other versions
WO2012109643A3 (en)
Inventor
Baowei Fei
Xiaofeng Yang
Ioannis SECHOPOULOS
Original Assignee
Emory University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Emory University filed Critical Emory University
Publication of WO2012109643A2 publication Critical patent/WO2012109643A2/en
Publication of WO2012109643A3 publication Critical patent/WO2012109643A3/en

Links

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00: Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/02: Devices for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03: Computerised tomographs
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00: Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/50: Clinical applications
    • A61B6/502: Clinical applications involving diagnosis of breast, i.e. mammography
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00: Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/52: Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5205: Devices using data or image processing specially adapted for radiation diagnosis involving processing of raw data to produce diagnostic data
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; Edge detection
    • G06T7/11: Region-based segmentation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10072: Tomographic images
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20024: Filtering details
    • G06T2207/20028: Bilateral filtering
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30004: Biomedical image processing
    • G06T2207/30068: Mammography; Breast

Definitions

  • mammography when the size of the lesion approaches 1 cm. See, e.g., Tabar et al., J Natl. Cancer Inst. Monogr., 1997, (22):43-47. Early identification of breast cancer is the most important consideration determining prognosis. Currently, the most common test for early detection of breast cancer is X-ray mammography. However, one of the most significant limitations of mammography is the superposition of overlying glandular structures, which can obscure visualization of a breast tumor and reduce sensitivity in women with dense breasts. Mammography also does not readily permit tissue classification due to the significant superposition problem inherent in a projection imaging technique.
  • Dedicated X-ray breast computed tomography (BCT)
  • breast CT provides three-dimensional images of the breast compared with just two-dimensional images for mammography, eliminates compression of the breast between two plates, and eliminates image artifacts (suspicious areas that result from normal breast structures overlaying each other when the breast compresses).
  • It provides high-quality volume data that enhances visualization of breast glandular tissue and architecture compared to other breast imaging methods, providing imaging technologists with a clear view of breast architecture and glandular tissues. See, e.g., Boone et al., Med Phys., 2005, 32(12):3767-3776, and Boone et al., Radiology, 2001, 221(3):657-667.
  • Breast tissue classification can be used for breast cancer detection and diagnosis and identification of women at high-risk.
  • various approaches have been tried to develop computer algorithms to classify breast tissues, accurate information regarding breast tissue composition has been difficult to obtain. See, e.g., Anderson et al., J Pathol., 1997, 181(4):374-380; Chang et al., Acad. Radiol., 2002, 9(8):899-905; Samani et al., IEEE Trans Med Imaging., 2001, 20(9):877-885; and Stines, J. and Tristant, H., Eur. J Radiol., 2004, 54(l):26-36.
  • the disclosure relates to systems, methods, and computer-readable mediums
  • a method for processing at least one image of a breast to determine a plurality of tissue constituents of the breast may include: processing the image to determine at least a first classification of the tissue constituents represented by the image and a second classification of the tissue constituents represented by the image, the second tissue classification being based, at least in part, on the first classification; and generating at least one classified image of the breast based on the processing.
  • the first classification may be based on intensity information and the second classification is based on position information.
  • the method may further include correcting bias of the image; and filtering the corrected image with a multi-scale filter.
  • the processing may process a corrected and filtered image.
  • the tissue constituents may include at least a first constituent, a second constituent and a third constituent. In some embodiments, the tissue constituents may include a fourth constituent. In further embodiments, the tissue constituents may include a fifth constituent.
  • the processing may include first classifying the image into the first classification, the first classification including at least two classes, a first class including the first constituent and the second constituent, and the second class including the third constituent.
  • the first classification may include additional classes.
  • the first classification may include a fourth class.
  • the fourth class may include the fourth constituent.
  • the tissue constituents may include, but are not limited to, glandular tissue, skin tissue, fat tissue, calcifications, and masses.
  • the first constituent may correspond to glandular tissue
  • the second constituent may correspond to skin tissue
  • the third constituent may correspond to fat tissue.
  • the fourth constituent and/or fifth constituent may correspond to calcification and/or mass.
  • the first classifying may include applying a fuzzy C-means classifier to the image.
  • the first classifying may include classifying the image into classes based on voxel intensity values.
  • the first classifying may include classifying the skin and glandular tissue into one class, and the fat tissue in another class.
  • the processing may include second classifying the image into the second classification.
  • the second classifying may include separating the first class into the first and second constituents.
  • the second classifying may include classifying the first class by identifying the skin and glandular tissue constituents.
  • the second classifying may include morphological processing of a portion of the first classification.
  • the second classifying may include applying erosion operations to at least a portion of the first classification
  • the second classifying may include obtaining a first mask, the first mask being of a whole breast.
  • the first mask may be generated during a preprocessing step.
  • the second classifying may include obtaining a second mask, the second mask being of skin.
  • the second classifying may include applying the second mask to the first classification to classify the skin and glandular tissue.
  • the method may include generating the classified image.
  • the method may further include quantifying the classified image.
  • The quantifying may include determining quantitative measurements for at least one of breast tissue composition, tissue density or distribution with respect to age.
  • the method may include outputting the classified image.
  • the outputting may include at least one of displaying the classified image, printing the classified image, or storing the classified image.
  • the classified image may include the quantitative measurements.
  • the method may further include detecting breast cancer based on the classified image.
  • the method may be an automatic method. In some embodiments, the method may be performed by a computer having a memory and a processor.
  • a computer-readable storage medium may store instructions for processing at least one image of a breast to determine a plurality of tissue constituents of the breast, the at least one image including image data.
  • the instructions may include: processing the image to determine at least a first classification of the tissue constituents represented by the image and a second classification of the tissue constituents represented by the image, the second tissue classification being based, at least in part, on the first classification; and generating at least one classified image of the breast based on the processing, wherein the first classification is based on intensity information and the second classification is based on position information.
  • the instructions may include: correcting bias of the image; and filtering the corrected image with a multi-scale filter.
  • the processing may process a corrected and filtered image
  • a system may be configured to process at least one image of a breast to determine a plurality of tissue constituents of the breast, the at least one image including image data.
  • the system may include an image processor.
  • the image processor may be configured to: process the image to determine at least a first classification of the tissue constituents represented by the image and a second classification of the tissue constituents represented by the image, the second tissue classification being based, at least in part, on the first classification; and generate at least one classified image of the breast based on the first and second classifications.
  • the first classification may be based on intensity information and the second classification may be based on position information.
  • the processor may include a first classifier.
  • the first classifier may be configured to determine a first classification.
  • the first classifier may be configured to apply a fuzzy C- means classifier to the image.
  • the first classifier may be configured to classify the image into classes based on voxel intensity values.
  • the first classifier may be configured to classify the skin and glandular tissue into one class, and the fat tissue in another class.
  • the processor may include a second classifier.
  • the second classifier may be configured to classify the image into the second classification based on a portion of the first classification.
  • the second classifier may be configured to separate a first class into the first and second constituents.
  • the second classifier may be configured to classify the first class by identifying the skin and glandular tissue constituents.
  • the second classifier may be configured to morphologically process at least a portion of the first classification.
  • the second classifier may be configured to apply erosion operations to at least a portion of the first classification.
  • the image processor may include an image quantifier.
  • the image quantifier may be configured to determine quantitative measurements for at least one of breast tissue composition, tissue density or distribution with respect to age.
  • Figure 1 shows a method to generate a classified image of a breast according to embodiments
  • Figure 2 illustrates an example of an original image
  • Figure 3 shows a method of preprocessing an image according to embodiments
  • Figure 4 illustrates an example of a quantified comparison of original
  • Figure 5 shows an example of a comparison of a corrected and original BCT images
  • Figure 6 illustrates an example of a comparison of original and corrected BCT images
  • Figure 7 illustrates an example of quantified and qualitative comparison of two
  • Figure 8 shows a method of processing an image to generate a classified image according to embodiments
  • Figure 9 illustrates an example of a binary classified result of two classes of images
  • Figure 10 shows a method of processing an image to determine a second
  • Figure 11 illustrates an example of the processed images of a breast
  • Figure 12 illustrates an example of a quantified composition of a breast
  • Figure 13 shows a system configured to process an image of a breast to classify the tissue constituents.
  • Breast tissue classification may be used for breast cancer detection and diagnosis and identification of women at high-risk. Tissue classification makes possible a range of quantitative measurements regarding breast composition, density and tissue distribution with age. In addition, quantitative tissue classification can be valuable as input to finite-element analysis algorithms simulating breast compression for comparison to mammography. See, e.g., Kellner et al., Biomedical Engineering, IEEE Transactions, 2007, 54(10): 1885-1891. Classified breast data also may be of use in dose estimation and computer-aided diagnosis. See, e.g., Hoeschen et al., Radiat. Prot. Dosimetry, 2005, 114(l-3):406- 409; and Wu et al., Med Phys., 1992, 19(3):555-560.
  • One method of classification is intensity or CT number thresholding.
  • classification of BCT images may be challenging because BCT images are affected by multiple factors, such as noise and intensity inhomogeneity.
  • skin and glandular tissue have similar CT values in BCT images. See, e.g., Nelson et al., Med. Phys., 2008, 35(3): 1078-1086.
  • the methods, systems, and computer-readable mediums according to embodiments overcome these deficiencies.
  • the methods, systems, and computer-readable mediums according to embodiments are directed to automatically classifying high-resolution BCT images into a plurality of tissue constituents based on intensity and position information.
  • the methods of the disclosure are described (in the figures) with respect to three tissue constituents, fat, glandular, and skin tissue. It should be understood, however, that there is no intent to limit the disclosure to the particular constituents disclosed and shown, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the disclosure.
  • the methods may include classifying the tissue into additional constituents and may include additional classifying steps.
  • the methods of the disclosure are not limited to the steps described herein.
  • the steps may be individually modified or omitted, as well as additional steps may be added.
  • all of the steps of the method may be performed automatically.
  • some steps of the method may be performed manually.
  • Embodiments of the methods described herein may be implemented using computer software. If written in a programming language conforming to a recognized standard, sequences of instructions designed to implement the methods may be compiled for execution on a variety of hardware platforms and for interface to a variety of operating systems. In addition, embodiments are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement embodiments of the disclosure.
  • Figure 1 illustrates a method according to embodiments to process CT image(s) to classify the breast into different tissue constituents.
  • a processing method 100 may include a receiving step 110 of receiving an image of a breast of a subject.
  • One image or more than one image of the breast may be received.
  • the image may include image data for one image or more than one image of the breast.
  • the image may be "raw" or pre-processed image data directly or may be processed image data that has been readied for display or printing.
  • the image may be CT or breast CT image(s).
  • the image may be a processed or preprocessed image of the breast.
  • the method may further include a step of separating or slicing the image into a plurality of (2D) image slices, each image being in a different direction or plane, obtained by slicing or cutting the image along that plane. For example, if a 3D image is received, the method may include slicing the image into an image of the breast in the coronal direction or plane, an image of the breast in the transverse direction or plane, and an image of the breast in the sagittal direction or plane.
  • the image may include a plurality of image slices of the breast in different directions.
  • the image may include three images of the breast in three different directions.
  • An example of a sliced (original) CT image 200 in three directions of a breast of a patient is shown in Figure 2.
  • the image(s) 200 may include a CT image 210 of the breast in the coronal direction or plane, a CT image 220 of the breast in the transverse direction or plane, and a CT image 230 of the breast in the sagittal direction or plane.
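For illustration, the slicing step can be sketched with NumPy as below. The (z, y, x) axis ordering, the mid-volume slice positions, and the volume shape are assumptions of the example, not details taken from the disclosure.

```python
# Minimal sketch (assumed axis convention): extract the three orthogonal
# mid-slices of a 3D breast CT volume.
import numpy as np

volume = np.zeros((256, 256, 256))  # placeholder (z, y, x) BCT volume

z, y, x = (s // 2 for s in volume.shape)
transverse = volume[z, :, :]  # transverse (axial) plane
coronal = volume[:, y, :]     # coronal plane
sagittal = volume[:, :, x]    # sagittal plane
```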
  • the image may be obtained by a CT system.
  • the CT system may be a dedicated breast CT system (also referred to a "BCT system").
  • Examples of a BCT system may include but are not limited to systems being developed by a team of researchers led by John M. Boone, PhD, at the University of California, Davis (as described in Boone et al., Med Phys., 2005, 32(12):3767-3776; Boone et al., Radiology, 2001, 221(3):657-667; and Boone et al., Med Phys., 2004, 31(2):226-235); systems being developed by Koning Corporation; and systems being developed by Zumatek Corporation.
  • the image may be received from the medical image device, such as a CT or BCT system.
  • the image may be received from a medical image database that stores image data of the medical images generated by the medical image diagnostic device.
  • the method 100 may further include a preprocessing step 120 for processing the image(s).
  • the preprocessing step 300 may include a filtering step 310 of filtering the image.
  • the filtering step 310 may filter the image to remove the background noise and smooth the image while preserving the edge information.
  • the filtering step 310 may be performed by a multiscale bilateral filter.
  • the multiscale bilateral filter may be any known filter.
  • the preprocessing step 300 may include a step 320 of generating a (first) mask of the breast.
  • the mask may be of the whole breast.
  • the mask may be generated by using a threshold.
  • the threshold may depend on parameters of the image and/or patient, for example, the Hounsfield scale.
  • the step 320 may be performed before or after the filtering step 310.
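A minimal sketch of the mask-generation step 320 is given below, assuming a simple global threshold; the -500 HU value and the SciPy-based cleanup are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical whole-breast mask by thresholding; keeps the largest
# connected component and fills internal holes.
import numpy as np
from scipy import ndimage

def whole_breast_mask(volume_hu, threshold=-500.0):
    mask = volume_hu > threshold
    labels, n = ndimage.label(mask)              # connected components
    if n > 1:
        sizes = ndimage.sum(mask, labels, range(1, n + 1))
        mask = labels == (np.argmax(sizes) + 1)  # largest component
    return ndimage.binary_fill_holes(mask)
```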
  • the method 100 may further include a correcting step 130 for correcting the bias of the image, e.g., the cupping artifacts.
  • the correcting step 130 may be based on a nonparametric coarse-to-fine approach that allows the bias fields to be modeled with different frequency ranges.
  • the correcting step 130 may be any known entropy-based method for bias field correction, wherein the bias field is assumed to be a multiplicative field (not an additive field).
  • the entropy-based method may use an entropy-related cost function based on the combination of intensity and gradient image features. An entropy-related cost function is further described in, for example, Manjon et al., Med Image Anal., 2007, 11(4):336-345, which is incorporated by reference in its entirety.
  • the value of the voxel in the bias field may depend on the position and tissue distribution and the value of voxel in center of a real breast CT image may not be the smallest because gland tissue and fat have different attenuation coefficients.
  • the correcting step 130 may reduce or prevent misclassifications in the classifying step.
  • the bias (cupping artifacts) can cause serious misclassifications when intensity-based segmentation algorithms are used. Li et al., Med Image Comput. Assist. Interv., 2008, 11(2):1083-1091.
  • the misclassification may be due to an overlap of the intensity range of different tissues introduced by the bias field, so that the voxels in different tissues are not separable based on their intensities.
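The entropy-based, coarse-to-fine correction referenced above is involved; as a rough stand-in only (not the method of the disclosure), a multiplicative bias field can be approximated by heavy smoothing inside the breast mask and divided out. The sigma value and the masked-smoothing trick are assumptions of this sketch.

```python
# Simplified multiplicative bias correction (NOT the entropy-based
# method described in the text): estimate a smooth bias field by
# masked Gaussian smoothing, normalize it, and divide it out.
import numpy as np
from scipy import ndimage

def correct_bias(volume, mask, sigma=30.0):
    eps = 1e-6
    num = ndimage.gaussian_filter(volume * mask, sigma)
    den = ndimage.gaussian_filter(mask.astype(float), sigma) + eps
    bias = num / den                 # smooth field inside the breast
    bias /= bias[mask].mean() + eps  # normalize mean bias to 1
    corrected = np.where(mask, volume / np.maximum(bias, eps), volume)
    return corrected, bias
```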
  • Figure 4 illustrates an example of a quantified comparison 400 of original and corrected BCT images.
  • Rows 410, 420, and 430 correspond to BCT images for three patients.
  • Images 414, 424, and 434 show original BCT images for the three patients.
  • Images 416, 426, and 436 show the corresponding corrected BCT images.
  • Profiles 418, 428, and 438 show the corresponding profiles between the original and corrected BCT images through the dotted line 412, 422, and 432, respectively.
  • Figure 5 illustrates an example of a visual comparison 500 showing how the bias correction corrects the cupping artifacts in the original BCT images shown in Figure 2.
  • Rows 510, 520, and 530 show the BCT images in the coronal direction or plane, transverse direction or plane, and the sagittal direction or plane, respectively.
  • Images 512, 522, 532 show the original images in the respective planes.
  • Images 514, 524, and 534 show the corresponding corrected images.
  • Images 516, 526, and 536 show fusion images between the corresponding original and corrected images.
  • Images 518, 528, and 538 show the corresponding multiplied bias field in the respective three directions.
  • Figure 6 illustrates an improvement in signal uniformity for corrected BCT images.
  • Figure 6 illustrates a comparison 600 of four histograms 610, 620, 630, and 640, for original and corrected BCT images for four patients with different glandular tissue ratios, respectively.
  • In histograms 610, 620, 630, and 640, there is an improvement in signal uniformity.
  • the corrected images (shown in solid lines) have narrower peaks and better separation of the fat, gland, and skin tissue signals as compared to the original images (shown in dotted lines).
  • the method may further include a filtering step 140 for generating a series of images at different scales.
  • the filtering may be a multi-scale bi-lateral filtering.
  • the filtering step 140 may be repeated until the number of images at the desired or predetermined scale is generated.
  • the filtering step 140 may filter each image from a coarse to a fine scale.
  • the number of scales may be any number. In some embodiments, the number of scales may be three. In other embodiments, the number of scales may be less than three, for example, two, or more than three, for example, four, five, or six.
  • Bilateral filtering is a non-linear filtering technique. Such technique is further detailed in Tomasi and Manduchi, 1998, Computer Vision, 1998, Sixth International Conference, 839-846, which is incorporated by reference in its entirety.
  • the filter may be a weighted average of the local neighborhood samples, where the weights are computed based on the temporal (or spatial, in the case of images) and radiometric distance between the center sample and the neighboring samples.
  • the bilateral filtering may smooth the images while preserving edges, by means of a nonlinear combination of nearby image values. Bilateral filtering may be described as: h(x) = k^{-1}(x) ∫ I(ξ) W_d(ξ - x) W_r(I(ξ) - I(x)) dξ (1)
  • I(x) and h(x) denote the input and output images, respectively, and k(x) is a normalization factor.
  • W_d measures the geometric closeness between the neighborhood center x and a nearby point ξ, and W_r measures the photometric similarity between the pixel at the neighborhood center x and that of a nearby point ξ.
  • the similarity function W_r may operate in the range of the image function I.
  • the closeness function W_d operates in the domain of I.
  • the bilateral filtering may be performed by many different kernels.
  • the bilateral filtering may be performed by a Gaussian filtering.
  • the Gaussian filtering may be shift-invariant Gaussian filtering, in which both the spatial function W_d and the range function W_r are Gaussian functions of the Euclidean distance between their arguments. More specifically, W_d may be described as: W_d(ξ - x) = exp(-||ξ - x||^2 / (2σ_d^2)).
  • for the range function, the distance may be simply the absolute difference of the pixel values.
  • the Gaussian range filter may also be insensitive to overall additive changes of image intensity.
  • the range filter may also be shift-invariant.
  • the filtering step 140 may smooth the images (suppress noise) while preserving the edges because the filtering generates a nonlinear combination of nearby image values.
  • the range Gaussian may be an edge-stopping function. At each scale, the width of the range Gaussian may be reduced and the width of the spatial Gaussian may be increased to filter the breast images.
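A brute-force 2D sketch of Eq. (1) with Gaussian W_d and W_r kernels is shown below; the sigma values, the window radius, and the 2D restriction are illustrative assumptions. For the multi-scale variant described above, one would apply this repeatedly, reducing sigma_r and increasing sigma_d at each scale.

```python
# Minimal bilateral filter: Gaussian domain kernel W_d weights spatial
# closeness, Gaussian range kernel W_r weights intensity similarity.
import numpy as np

def bilateral_filter(image, sigma_d=2.0, sigma_r=0.1, radius=3):
    h = np.zeros_like(image, dtype=float)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    w_d = np.exp(-(xs**2 + ys**2) / (2 * sigma_d**2))  # domain kernel
    padded = np.pad(image, radius, mode='reflect')
    rows, cols = image.shape
    for i in range(rows):
        for j in range(cols):
            patch = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            w_r = np.exp(-(patch - image[i, j]) ** 2 / (2 * sigma_r**2))
            w = w_d * w_r
            h[i, j] = (w * patch).sum() / w.sum()      # k^{-1}(x) term
    return h
```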
  • Figure 7 shows an example 700 of quantified and qualitative comparison of two BCT images (in one direction) for two patients 710 and 720 after the filtering step.
  • Images 712 and 722 show the original BCT images;
  • Images 714 and 724 show filtered images after a first scale;
  • Images 716 and 726 show filtered images after a third scale.
  • Profiles 718 and 728 show a quantified profile through dotted lines 711 and 721, respectively, through the original and different scale filtered images.
  • the method 100 may further include a processing step 150.
  • the processing step 150 may process the image(s) to classify the image(s) into different tissue constituents.
  • the processing step 150 may include analyzing different parameter(s) of the image data to classify or identify the tissue constituents.
  • There may be any number of tissue constituents. In some embodiments, the number of different tissue constituents may be three. In other embodiments, the number of different tissue constituents may be more than three, for example, four, five, or six.
  • the tissue constituents may include, but are not limited to, skin tissue, glandular tissue, fat tissue, calcifications, or masses (e.g., tumors).
  • the processing step 150 may include a plurality of steps.
  • the processing step 800 may include a step 810 of processing the data to first classify the image(s) into a first classification (e.g., first classified results).
  • the step 810 of first classifying may be based on intensity information.
  • the first classification may include a plurality of classes. In some embodiments, the first classifying step may classify the breast into at least two classes. In some embodiments, the first classifying step may classify the breast tissue into three classes or more than three classes.
  • the classes may correspond to different tissue constituents.
  • one of the classes may include a combination of two different tissue constituents, for example, skin and glandular tissue.
  • the other classes may each correspond to a specific tissue constituent or a combination of tissue constituents.
  • the first classifying step may classify the tissue into a first class and a second class.
  • One of the classes may include fat tissue, which may correspond to a first tissue constituent.
  • the other one of the classes may include skin and glandular tissue, which may correspond to second and third tissue constituents, respectively.
  • the first classifying step may classify the tissue into additional classes.
  • the first classifying step may classify the tissue into a third class, e.g., calcifications, which may correspond to a fourth constituent.
  • the first classifying step may employ a Fuzzy C-mean Classification technique.
  • the first classifying step may employ any known fuzzy c-mean classification algorithm.
  • a Fuzzy C-mean classification technique is described in Wang, H. and Fei, B., Med. Image Anal., 2009, 13(2):193-202, which is hereby incorporated by reference in its entirety.
  • Fuzzy C-means (FCM) classification techniques may employ fuzzy partitioning to allow one pixel to belong to tissue types with different memberships graded between 0 and 1.
  • FCM is an unsupervised algorithm and allows soft classification of each pixel which possibly consists of several different tissue types. See, e.g., Pham, D.L. and Prince, J.L., IEEE Trans Med Imaging, 1999, 18(9):737- 52.
  • standard FCM methods may be sensitive to noise and cannot effectively compensate for intensity inhomogeneities.
  • the correction and filtering steps, which generate corrected and filtered original CT images before the classifying step, address these deficiencies.
  • the FCM may be described as minimizing the objective function: J = Σ_i Σ_{k=1}^{c} (u_ik)^p ||y_i - v_k||^2, subject to Σ_{k=1}^{c} u_ik = 1 for each voxel i, where y_i is the intensity of voxel i.
  • {v_k}, k = 1, ..., c, are the characterized intensity centers of the classes, and c is the number of underlying tissue types in the images, which is given before classification.
  • u_ik represents the membership of voxel i in class k.
  • the parameter p is a weighting exponent on each fuzzy membership and is set as 2.
  • each voxel may be assigned a high membership to a class whose center is close to the intensity of the voxel, and a low membership is given when the voxel intensity is far from the class centroid.
  • a hard or crisp classification is reached by assigning each voxel solely to the class that has the highest membership value for the voxel.
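A minimal NumPy sketch of intensity-based FCM with p = 2 follows; the random initialization, fixed iteration count, and flattening of the volume to a 1D intensity vector are assumptions of the sketch (a practical implementation would add convergence checks and subsample large volumes).

```python
# Fuzzy C-means on voxel intensities, followed by a hard (argmax)
# classification, as described in the text.
import numpy as np

def fcm_classify(intensities, c=2, p=2, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    y = intensities.ravel().astype(float)
    v = rng.choice(y, size=c, replace=False)        # class centers v_k
    for _ in range(n_iter):
        d = np.abs(y[:, None] - v[None, :]) + 1e-9  # |y_i - v_k|
        inv = d ** (-2.0 / (p - 1))                 # membership update
        u = inv / inv.sum(axis=1, keepdims=True)    # u_ik, rows sum to 1
        up = u ** p                                 # center update
        v = (up * y[:, None]).sum(axis=0) / up.sum(axis=0)
    hard = u.argmax(axis=1).reshape(intensities.shape)
    return hard, v
```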
  • Figure 9 illustrates an example 900 of a binary classified result with two classes of images in the three directions.
  • Rows 910, 920, and 930 show the BCT images in the coronal direction or plane, transverse direction or plane, and the sagittal direction or plane, respectively.
  • Images 912, 922, 932 (in the first column) show the original images in the respective planes.
  • Images 916, 926, and 936 (in the third column) show the corresponding classified images.
  • Images 914, 924, and 934 show fusion images between the corresponding original and classified images with only one class.
  • the fusion images demonstrate how the first classifying step may separate the skin and glandular tissue from the fat tissue.
  • the classified results also maintain many small details after the previous steps of bias correction and multi-scale filtering.
  • the processing step 800 may include a step 820 of further or second classifying the first classification.
  • the step 820 of second classifying may include processing a portion of the first classification.
  • the second classifying step may further classify at least one of the classes.
  • the second classifying step may be based on position information.
  • the second classifying step may classify one of the classes that includes a combination of tissue constituents into its respective tissue constituent.
  • the class that includes the skin and glandular tissues may be classified into respective tissue constituents.
  • the second classifying step may also classify other combination classes.
  • the second classifying step may employ morphologic processing techniques.
  • the morphologic processing techniques may include erosion operations.
  • FIG 10 illustrates an example of a step of second classifying according to some embodiments.
  • second classifying step 1000 may include a step 1010 of obtaining a first breast mask.
  • the first breast mask may be the whole breast mask.
  • the first breast mask may be generated by applying a threshold to the image.
  • the first breast mask may be generated during the preprocessing step. In other embodiments, the first breast mask may be generated during other steps.
  • the classifying step 1000 may further include a step 1020 of obtaining a second breast mask.
  • the second breast mask may be a mask of the skin.
  • the mask of the skin may be generated by performing erosion operations.
  • the skin thickness is 1.45 ± 0.30 mm. See, e.g., Huang et al., Med Phys., 2008, 35(4):1199-1206; and Willson et al., Clinical Radiology, 2001, 33(6):691-693.
  • the voxel for the skin may be determined based on the resolution of the breast CT images. In some embodiments, for example, the breast CT voxel size may be 0.273 × 0.273 × 0.273 mm³.
  • the skin thickness may be restrained within a predetermined number of voxels. The predetermined number of voxels may be any number, for example, between and including three voxels and thirteen voxels.
  • the predetermined number may be seven voxels.
  • a 9×9×9 box may be used to perform erosion operations to generate the mask for the skin.
  • another size box may be used.
  • a 7×7×7 box may be used to generate the mask for the skin.
  • the step 1000 of second classifying may include a step 1030 of applying the skin mask to the first classification (the FCM classification). This step may classify the skin and glandular tissue constituents by separating skin and glandular tissue.
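The position-based second classification can be sketched as below. The class label values (1 for fat, 2 for the combined skin-and-gland FCM class), the output label scheme, and the single-pass box erosion are assumptions of the sketch; the text describes 9×9×9 (or 7×7×7) boxes used in erosion operations.

```python
# Peel an outer skin shell off the whole-breast mask by erosion, then
# split the FCM skin+gland class into skin (shell) and gland (interior).
import numpy as np
from scipy import ndimage

def second_classify(fcm_labels, breast_mask, skin_voxels=7):
    box = np.ones((2 * skin_voxels + 1,) * 3, dtype=bool)
    interior = ndimage.binary_erosion(breast_mask, structure=box)
    skin_shell = breast_mask & ~interior
    out = np.zeros_like(fcm_labels)     # 0 = background
    out[fcm_labels == 1] = 1            # fat (assumed class id)
    gland_or_skin = fcm_labels == 2     # assumed skin+gland class
    out[gland_or_skin & interior] = 2   # gland
    out[gland_or_skin & skin_shell] = 3 # skin
    return out
```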
  • the method may further include a step 830 of generating a classified breast image(s).
  • the classified image may vary the appearance of the different tissue constituents. This may include but is not limited to different contrast, brightness, and color.
  • the classified image may be generated after any of the processing steps, e.g., after the first classification and/or the second classification.
  • Figure 11 shows an example of the processed images of a breast in the three directions.
  • Rows 1110, 1120, and 1130 show the BCT images in the coronal direction or plane, transverse direction or plane, and the sagittal direction or plane, respectively.
  • Images 1112, 1122, and 1132 (in the first column) show the original images in the respective planes.
  • Images 1114, 1124, and 1134 (in the second column) show the generated, corresponding skin mask after the morphologic operations, for example, after step 1020.
  • Images 1116, 1126, and 1136 (in the third column) show the generated classified image illustrating three tissue constituents, skin, glandular, and fat tissue.
  • the image 1116 includes glandular tissue 1146, fat tissue 1144, and skin tissue 1142.
  • the tissue constituents (fat, gland, and skin) have different values (1, 2, and 3, respectively) as compared to the background, which has a value of 0.
  • the method 100 may further include a step 160 of quantifying the classified breast tissue image.
  • the quantifying step may include but is not limited to determining quantitative measurements for breast tissue composition, tissue density, and distribution with respect to age. For example, the glandular tissue fraction may be determined.
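A minimal sketch of such quantification over a labeled volume follows; the label scheme (0 background, 1 fat, 2 gland, 3 skin), the 0.273 mm isotropic voxel size, and the slice axis are assumptions carried over from the examples above.

```python
# Volume fractions per tissue class and a per-slice glandular fraction
# profile (e.g., nipple to base along axis 0).
import numpy as np

def tissue_fractions(labels, voxel_mm=0.273):
    breast = labels > 0
    total = breast.sum()
    fractions = {name: (labels == k).sum() / total
                 for k, name in [(1, 'fat'), (2, 'gland'), (3, 'skin')]}
    volume_cc = total * voxel_mm ** 3 / 1000.0  # mm^3 -> cc
    per_slice = np.array([(s == 2).sum() / max((s > 0).sum(), 1)
                          for s in labels])
    return volume_cc, fractions, per_slice
```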
  • the method may further include analyzing the quantitative measurements to detect, as well as diagnose, breast cancer.
  • the analyzing may include determining the type of breast cancer.
  • the analyzing may include analyzing the density and/or Hounsfield number. There have been reports of differences in the density and Hounsfield number associated with different types of cancers compared to normal breast tissues. See, e.g., Johns et al., Phys. Med. Biol., 1987, 32(6):675.
  • the method may further include processing the quantitative measurements to generate a simulated breast compression for comparison to mammography.
  • the processing may be performed using finite-element analysis algorithms simulating classified breast data. While the present work focuses on classification of breast tissue, with improvements in image signal-to-noise performance, it may be possible to differentiate between breast cancers and normal glandular tissue using the same strategies.
  • Figure 12 shows an example of a fractional composition of a fatty replacement breast.
  • the fractional composition plotted is determined by slice, whereas the percentage for each tissue type represents the total tissue composition.
  • the fractional composition plot 1200 includes one midbreast slice of the breast CT 1210.
  • the breast CT 1210 includes the original image 1212, the corresponding segment image 1216, and the fusion image 1214.
  • the composition analysis 1220 illustrated on the bottom is through all slices (from nipple to base). For this example, the total breast volume was 886 cc with a volume fractional composition of 6.8% skin, 84.7% fat, and 6.8% gland.
  • the method 100 may further include a step 170 of outputting the classified images.
  • the image may be outputted after either the processing step 150 or the quantifying step 160.
  • the outputting may include but is not limited to displaying the classified image(s), printing the classified image(s), and storing the classified image(s) remotely or locally.
  • the classified image may also be forwarded for further processing.
  • the classified image(s) may include a plurality of (sliced) images in different directions.
  • the classified image may include a 3D image of the breast.
  • the classified image may include quantitative measurements.
  • the measurements may be presented in a graph or plot.
  • Figure 13 shows an example of a system 1300 configured to process an image of a breast to classify the tissue constituents.
  • the system for carrying out the embodiments of the methods disclosed herein is not limited to the system shown in Figure 13. Other systems may be used.
  • the system 1300 may include an image acquisition device 1310 configured to acquire the image data of a patient.
  • the image acquisition system or device 1310 may be any device configured to acquire images from a CT scan.
  • the image acquisition system device 1310 may be a dedicated breast-computed tomography (BCT) system or device. Examples of a BCT system may include but are not limited to systems being developed by a team of researchers led by John M. Boone, PhD, at the University of California, Davis (as described in, for example, Boone et al., Med Phys., 2005, 32(12):3767-3776; Boone et al., Radiology, 2001, 221(3):657-667; and Boone et al., Med Phys., 2004, 31(2):226-235); systems being developed by Koning Corporation; and systems being developed by Zumatek Corporation.
  • the system 1300 may further include a computer system 1320 to carry out the classifying of the tissue and generating a classified image.
  • the computer system 1320 may further be used to control the operation of the system or a computer separate system may be included.
  • the computer system 1320 may also be communicably connected to another computer system as well as a wired or wireless network.
  • the computer system 1320 may receive or obtain the image data from the image acquisition device 1310 or from another module provided on the network, for example, a medical image storage device 1312, such as a picture archiving and communication system (PACS) image storage.
  • the computer system 1320 may include a number of modules that communicate with each other through electrical and/or data connections (not shown). Data connections may be direct wired links or may be fiber optic connections or wireless communications links or the like. The computer system 1320 may also be connected to permanent or back-up memory storage, a network, or may communicate with a separate system control through a link (not shown).
  • the modules may include a CPU 1322, a memory 1324, an image processor 1330, an input device 1326, a display 1328, and a printer interface 1329.
  • the CPU 1322 may any known central processing unit, a processor, or a microprocessor.
  • the CPU 1322 may be coupled directly or indirectly to memory elements.
  • the memory 1324 may include random access memory (RAM), read only memory (ROM), disk drive, tape drive, etc., or a combination thereof.
  • the memory may also include a frame buffer for storing image data arrays.
  • the present disclosure may be implemented as a routine that is stored in memory 1324 and executed by the CPU 1322.
  • the computer system 1320 may be a general purpose computer system that becomes a specific purpose computer system when executing the routine of the disclosure.
  • the computer system 1320 may also include an operating system and micro instruction code.
  • the various processes and functions described herein may either be part of the micro instruction code or part of the application program or routine (or combination thereof) that is executed via the operating system.
  • various other peripheral devices may be connected to the computer platform such as an additional data storage device, a printing device, and I/O devices.
  • the input device 1326 may include a mouse, joystick, keyboard, track ball, touch activated screen, light wand, voice control, or any similar or equivalent input device, and may be used for interactive geometry prescription.
  • the input device 1326 may control the production, display of images on the display 1328, and printing of the images by the printer interface 1329.
  • the display 1328 may be any known display screen, and the printer interface 1329 may interface to any known printer, either locally or network connected.
  • the image processor 1330 may be any known central processing unit, a processor, or a microprocessor. In some embodiments, the image processor 1330 may process and classify the image to generate classified images. The image processor 1330 may also quantify the classified images. In other embodiments, the image processor 1330 may be replaced by image processing functionality on the CPU 1322.
  • the image processor 1330 may be configured to process and classify the images (data) from the image acquisition device 1310 and/or the medical image storage device 1312. In some embodiments, the image processor 1330 may be configured to process and classify the tissue of the image (data) to generate a classified image. In some embodiments, the image processor 1330 may be configured to quantify the classified tissue of the image, for example, determine the breast tissue composition, tissue density and distribution with respect to age. In some embodiments, the image processor 1330 may also carry out finite-element analysis algorithms to simulate breast compression for comparing the classified BCT image to mammography.
  • the image processor 1330 may include a first classifier 1332.
  • the first classifier 1332 may be configured to carry out the FCM classification to classify a BCT image(s) to determine a (first) classification.
  • the (first) classification may include a plurality of tissue classes, one of which includes skin and glandular tissue.
  • the image processor 1330 may include a second classifier 1334.
  • the second classifier 1332 may be configured to carry out morphological processing, including erosions operations, to classify the (first) classification (results) from the first classifier 1332 to determine a (second) classification.
  • the second classifier 1334 may be configured to further classify the class containing skin and glandular tissue into its separate constituents.
  • the image processor 1330 may include a tissue quantifier 1336.
  • the tissue quantifier 1336 may be configured to carry out quantification of the classified tissue.
  • the tissue quantifier 1336 may be configured to determine parameters of the classified tissue. The parameters may include but are not limited to the breast tissue composition, tissue density and distribution with respect to age.
  • the classified and/or quantified image may be stored in the memory 1324.
  • another computer system may assume the image reconstruction or other functions of the image processor 1330.
  • the image data stored in the memory 1324 may be archived in long term storage or may be further processed by the image processor 1330 and presented on the display 1328.
  • the disclosure may be implemented in software as an application program tangibly embodied on a computer readable program storage device.
  • the application program may be uploaded to, and executed by, a machine comprising any suitable architecture.
  • the system and method of the present disclosure may be implemented in the form of a software application running on a computer system, for example, a mainframe, personal computer (PC), handheld computer, server, etc.
  • the software application may be stored on a recording medium locally accessible by the computer system and accessible via a hard wired or wireless connection to a network, for example, a local area network, or the Internet.

Abstract

Systems, methods and computer-readable storage mediums relate to processing BCT images to classify tissue constituents. The processing may be based on position and intensity information.

Description

SYSTEMS, METHODS AND COMPUTER READABLE STORAGE MEDIUMS STORING INSTRUCTIONS FOR CLASSIFYING BREAST CT IMAGES
ACKNOWLEDGEMENTS
[0001] This invention was made with government support under Grants R01CA156775, P50CA12830, and UL1 RR0258008 awarded by the National Institutes of Health. The government has certain rights in the invention.
CROSS REFERENCE TO RELATED APPLICATION
[0002] This application claims priority to United States Provisional Application Serial
Number 61/441,826 filed February 11, 2011, which is hereby incorporated by this reference in its entirety.
BACKGROUND
[0003] In the United States, the lifetime risk for women of developing breast cancer is one in eight. Breast cancer arises in the glandular and ductal tissues and generally becomes detectable by
mammography when the size of the lesion approaches 1 cm. See, e.g., Tabar et al., J Natl. Cancer Inst. Monogr., 1997, (22):43-47. Early identification of breast cancer is the most important consideration determining prognosis. Currently, the most common test for early detection of breast cancer is X-ray mammography. However, one of the most significant limitations of mammography is the superposition of overlying glandular structures, which can obscure visualization of a breast tumor and reduce sensitivity in women with dense breasts. Mammography also does not readily permit tissue classification due to the significant superposition problem inherent in a projection imaging technique.
[0004] Dedicated X-ray breast computed tomography (BCT) can eliminate the influence of overlapping tissues. It offers women pain-free screening as well as more precise diagnosis and treatment options. Using X-rays taken from many different angles, breast CT provides three-dimensional images of the breast compared with just two-dimensional images for mammography, eliminates compression of the breast between two plates, and eliminates image artifacts (suspicious areas that result from normal breast structures overlaying each other when the breast compresses). It provides high-quality volume data that enhances visualization of breast glandular tissue and architecture compared to other breast imaging methods providing imaging technologists with a clear view of breast architecture and glandular tissues. See, e.g., Boone et al., Med Phys., 2005, 32(12):3767-3776, and Boone et al., Radiology, 2001,
221(3):657-667.
[0005] Breast tissue classification can be used for breast cancer detection and diagnosis and identification of women at high-risk. However, although various approaches have been tried to develop computer algorithms to classify breast tissues, accurate information regarding breast tissue composition has been difficult to obtain. See, e.g., Anderson et al., J Pathol., 1997, 181(4):374-380; Chang et al., Acad. Radiol., 2002, 9(8):899-905; Samani et al., IEEE Trans Med Imaging., 2001, 20(9):877-885; and Stines, J. and Tristant, H., Eur. J Radiol., 2004, 54(l):26-36.
[0006] Thus, there is a need for an imaging processing technique that accurately classifies breast tissue composition.
SUMMARY
[0007] The disclosure relates to systems, methods, and computer-readable mediums
storing instructions for classifying breast CT images based on intensity and position information, for example, an automatic classification method that classifies high-resolution breast CT images into skin, fat, and glandular tissue.
[0008] In some embodiments, a method for processing at least one image of a breast to determine a plurality of tissue constituents of the breast, the at least one image including image data, may include: processing the image to determine at least a first classification of the tissue constituents represented by the image and a second classification of the tissue constituents represented by the image, the second tissue classification being based, at least in part, on the first classification; and generating at least one classified image of the breast based on the processing. The first classification may be based on intensity information and the second classification may be based on position information.
[0009] In some embodiments, the method may further include correcting bias of the image; and filtering the corrected image with a multi-scale filter. The processing may process a corrected and filtered image.
[0010] In some embodiments, the tissue constituents may include at least a first constituent, a second constituent and a third constituent. In some embodiments, the tissue constituents may include a fourth constituent. In further embodiments, the tissue constituents may include a fifth constituent.
[0011] In some embodiments, the processing may include first classifying the image into the first classification, the first classification including at least two classes, a first class including the first constituent and the second constituent, and the second class including the third constituent. In some embodiments, the first classification may include additional classes. The first classification may include a fourth class. The fourth class may include the fourth constituent.
[0012] In some embodiments, the tissue constituents may include but is not limited to glandular tissue, skin tissue, fat tissue, calcifications, and masses. In some embodiments, the first constituent may correspond to glandular tissue, the second constituent may correspond to skin tissue, and the third constituent may correspond to fat tissue. In some embodiments, the fourth constituent and/or fifth constituent may correspond to calcification and/or mass.
[0013] In some embodiments, the first classifying may include applying a fuzzy C-means classifier to the image. The first classifying may include classifying the image into classes based on voxel intensity values. The first classifying may include classifying the skin and glandular tissue into one class, and the fat tissue in another class.
[0014] In some embodiments, the processing may include second classifying the image into the second classification. The second classifying may include separating the first class into the first and second constituents. The second classifying may include classifying the first class by identifying the skin and glandular tissue constituents.
[0015] In some embodiments, the second classifying may include morphological processing of a portion of the first classification. The second classifying may include applying erosion operations to at least a portion of the first classification.
[0016] In some embodiments, the second classifying may include obtaining a first mask, the first mask being of a whole breast. The first mask may be generated during a preprocessing step.
[0017] In some embodiments, the second classifying may include obtaining a second mask, the second mask being of skin. The second classifying may include applying the second mask to the first classification to classify the skin and glandular tissue.
[0018] In some embodiments, the method may include generating the classified image.
[0019] In some embodiments, the method may further include quantifying the classified image. The quantifying may include determining quantitative measurements for at least one of breast tissue composition, tissue density or distribution with respect to age.
[0020] In some embodiments, the method may include outputting the classified image. The outputting may include at least one of displaying the classified image, printing the classified image, or storing the classified image. In some embodiments, the classified image may include the quantitative measurements.
[0021] In some embodiments, the method may further include detecting breast cancer based on the classified image.
[0022] In some embodiments, the method may be an automatic method. In some embodiments, the method may be performed by a computer having a memory and a processor.
[0023] In some embodiments, a computer-readable storage medium may store instructions for processing at least one image of a breast to determine a plurality of tissue constituents of the breast, the at least one image including image data. The instructions may include: processing the image to determine at least a first classification of the tissue constituents represented by the image and a second classification of the tissue constituents represented by the image, the second tissue classification being based, at least in part, on the first classification; and generating at least one classified image of the breast based on the processing, wherein the first classification is based on intensity information and the second classification is based on position information.
[0024] In some embodiments, the instructions may include: correcting bias of the image; and filtering the corrected image with a multi-scale filter. The processing may process a corrected and filtered image.
[0025] In some embodiments, a system may be configured to process at least one image of a breast to determine a plurality of tissue constituents of the breast, the at least one image including image data. The system may include an image processor. The image processor may be configured to: process the image to determine at least a first classification of the tissue constituents represented by the image and a second classification of the tissue constituents represented by the image, the second tissue classification being based, at least in part, on the first classification; and generate at least one classified image of the breast based on the first and second classifications. The first classification may be based on intensity information and the second classification may be based on position information.
[0026] In some embodiments, the processor may include a first classifier. The first classifier may be configured to determine a first classification. The first classifier may be configured to apply a fuzzy C- means classifier to the image. The first classifier may be configured to classify the image into classes based on voxel intensity values. The first classifier may be configured to classify the skin and glandular tissue into one class, and the fat tissue in another class.
[0027] In some embodiments, the processor may include a second classifier. The second classifier may be configured to classify the image into the second classification based on a portion of the first classification. The second classifier may be configured to separate a first class into the first and second constituents. The second classifier may be configured to classify the first class by identifying the skin and glandular tissue constituents.
[0028] In some embodiments, the second classifier may be configured to morphologically process at least a portion of the first classification. The second classifier may be configured to apply erosion operations to at least a portion of the first classification.
[0029] In some embodiments, the image processor may include an image quantifier. The image quantifier may be configured to determine quantitative measurements for at least one of breast tissue composition, tissue density or distribution with respect to age.
BRIEF DESCRIPTION OF THE DRAWINGS
[0030] The disclosure can be better understood with reference to the following drawings and description. The components in the figures are not necessarily to scale, emphasis being placed upon illustrating the principles of the disclosure.
[0031] Figure 1 shows a method to generate a classified image of a breast according to embodiments;
[0032] Figure 2 illustrates an example of an original image;
[0033] Figure 3 shows a method of preprocessing an image according to embodiments;
[0034] Figure 4 illustrates an example of a quantified comparison of original and corrected BCT images;
[0035] Figure 5 shows an example of a comparison of corrected and original BCT images;
[0036] Figure 6 illustrates an example of a comparison of original and corrected BCT images;
[0037] Figure 7 illustrates an example of quantified and qualitative comparison of two BCT images for two patients;
[0038] Figure 8 shows a method of processing an image to generate a classified image according to embodiments;
[0039] Figure 9 illustrates an example of a binary classified result of two classes of images;
[0040] Figure 10 shows a method of processing an image to determine a second classification according to embodiments;
[0041] Figure 11 illustrates an example of the processed images of a breast;
[0042] Figure 12 illustrates an example of a quantified composition of a breast; and
[0043] Figure 13 shows a system configured to process an image of a breast to classify the tissue constituents.
DETAILED DESCRIPTION OF THE EMBODIMENTS
[0044] In the following description, numerous specific details are set forth, such as examples of specific components, devices, methods, etc., in order to provide a thorough understanding of embodiments of the disclosure. It will be apparent, however, to one skilled in the art that these specific details need not be employed to practice embodiments of the disclosure. In other instances, well-known materials or methods have not been described in detail in order to avoid unnecessarily obscuring embodiments of the disclosure. While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that there is no intent to limit the disclosure to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the disclosure.
[0045] Breast tissue classification may be used for breast cancer detection and diagnosis and the identification of women at high risk. Tissue classification makes possible a range of quantitative measurements regarding breast composition, density and tissue distribution with age. In addition, quantitative tissue classification can be valuable as input to finite-element analysis algorithms simulating breast compression for comparison to mammography. See, e.g., Kellner et al., Biomedical Engineering, IEEE Transactions, 2007, 54(10):1885-1891. Classified breast data also may be of use in dose estimation and computer-aided diagnosis. See, e.g., Hoeschen et al., Radiat. Prot. Dosimetry, 2005, 114(1-3):406-409; and Wu et al., Med Phys., 1992, 19(3):555-560.
[0046] One method of classification is intensity or CT number thresholding. However, classification of BCT images may be challenging because BCT images are affected by multiple factors, such as noise and intensity inhomogeneity. At the same time, skin and glandular tissue have similar CT values in BCT images. See, e.g., Nelson et al., Med. Phys., 2008, 35(3):1078-1086.
[0047] The methods, systems, and computer-readable mediums according to embodiments overcome these deficiencies. The methods, systems, and computer-readable mediums according to embodiments are directed to automatically classifying high-resolution BCT images into a plurality of tissue constituents based on intensity and position information.
[0048] The examples of the methods of the disclosure are described (in the figures) with respect to three tissue constituents: fat, glandular, and skin tissue. It should be understood, however, that there is no intent to limit the disclosure to the particular constituents disclosed and shown, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the disclosure. For example, the methods may include classifying the tissue into additional constituents and may include additional classifying steps.
CLASSIFICATION METHOD
[0049] The methods of the disclosure are not limited to the steps described herein. The steps may be individually modified or omitted, and additional steps may be added. In some embodiments, all of the steps of the method may be performed automatically. In other embodiments, some steps of the method may be performed manually.

[0050] Unless stated otherwise as apparent from the following discussion, it will be appreciated that terms such as "analyzing," "decomposing," "receiving," "classifying," "preprocessing," "correcting," "slicing," "separating," "displaying," "storing," "printing," "quantifying," "filtering," "combining," "reconstructing," "segmenting," "generating," "registering," "determining," "obtaining," "processing," "computing," "selecting," "estimating," "detecting," "tracking," "applying," "outputting" or the like may refer to the actions and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (e.g., electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices. Embodiments of the methods described herein may be implemented using computer software. If written in a programming language conforming to a recognized standard, sequences of instructions designed to implement the methods may be compiled for execution on a variety of hardware platforms and for interface to a variety of operating systems. In addition, embodiments are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement embodiments of the disclosure.
[0051] Figure 1 illustrates a method according to embodiments to process CT image(s) to classify the breast into different tissue constituents.
[0052] In some embodiments, a processing method 100 may include a receiving step 110 of receiving an image of a breast of a subject. One image or more than one image of the breast may be received. The image may include image data for one image or more than one image of the breast. The image may be received directly as "raw" or pre-processed image data, or may be processed image data that has been readied for display or printing. The image may be a CT or breast CT image(s).
[0053] In some embodiments, the image may be a processed or preprocessed image of the breast. In some embodiments, the method may further include a step of separating or slicing the image into a plurality of (2D) image slices, each image being in a different direction or plane by slicing or cutting the image along that plane. For example, if a 3D image is received, the method may include slicing the image into an image of the breast in the coronal direction or plane, an image of the breast in the transverse direction or plane, and an image of the breast in the sagittal direction or plane.
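By way of illustration only, slicing a reconstructed 3D volume into the three orthogonal planes may be sketched as follows in Python; the (z, y, x) axis ordering, the NumPy representation, and the function name are assumptions for the example, not features of the disclosed method.

```python
import numpy as np

def orthogonal_slices(volume, z, y, x):
    """Extract transverse, coronal, and sagittal slices from a 3D volume.

    Assumes the volume is indexed as (z, y, x); the mapping of array
    axes to anatomical planes depends on how the scanner stores the
    reconstruction and is an assumption here.
    """
    transverse = volume[z, :, :]   # fixed z index: transverse plane
    coronal = volume[:, y, :]      # fixed y index: coronal plane
    sagittal = volume[:, :, x]     # fixed x index: sagittal plane
    return transverse, coronal, sagittal

# Example: mid-volume slices of a synthetic 256-cubed volume
vol = np.zeros((256, 256, 256), dtype=np.float32)
t_slice, c_slice, s_slice = orthogonal_slices(vol, 128, 128, 128)
```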
[0054] In other embodiments, the image may include a plurality of image slices of the breast in different directions. In some embodiments, the image may include three images of the breast in three different directions. An example of a sliced (original) CT image 200 in three directions of a breast of a patient is shown in Figure 2. The image(s) 200 may include a CT image 210 of the breast in the coronal direction or plane, a CT image 220 of the breast in the transverse direction or plane, and a CT image 230 of the breast in the sagittal direction or plane.

[0055] The image may be obtained by a CT system. The CT system may be a dedicated breast CT system (also referred to as a "BCT system"). Examples of a BCT system may include but are not limited to systems being developed by a team of researchers led by John M. Boone, PhD, at the University of California, Davis (as described in Boone et al., Med Phys., 2005, 32(12):3767-3776; Boone et al., Radiology, 2001, 221(3):657-667; and Boone et al., Med Phys., 2004, 31(2):226-235); systems being developed by Koning Corporation; and systems being developed by Zumatek Corporation.
[0056] The image may be received from the medical image device, such as a CT or BCT system. In some embodiments, the image may be received from a medical image database that stores image data of the medical images generated by the medical image diagnostic device.
Preprocessing Step
[0057] In some embodiments, the method 100 may further include a preprocessing step 120 for processing the image(s). As shown in Figure 3, the preprocessing step 300 may include a filtering step 310 of filtering the image. The filtering step 310 may filter the image to remove the background noise and smooth the image while preserving the edge information. The filtering step 310 may be performed by a multiscale bilateral filter. The multiscale bilateral filter may be any known filter.
[0058] In some embodiments, the preprocessing step 300 may include a step 320 of generating a (first) mask of the breast. The mask may be of the whole breast. The mask may be generated by using a threshold. The threshold may depend on parameters of the image and/or patient, for example, the Hounsfield scale. The step 320 may be performed before or after the filtering step 310.
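A minimal sketch of such a threshold-based whole-breast mask follows; the -500 HU cutoff and the connected-component cleanup are illustrative assumptions, since, as noted above, the appropriate threshold depends on the image and patient parameters.

```python
import numpy as np
from scipy import ndimage

def whole_breast_mask(volume_hu, threshold=-500.0):
    """Generate a whole-breast mask by thresholding.

    `threshold` is a hypothetical Hounsfield-unit cutoff separating
    breast tissue from the air background.
    """
    mask = volume_hu > threshold
    # Keep only the largest connected component to discard noise voxels
    labeled, n_components = ndimage.label(mask)
    if n_components > 1:
        sizes = ndimage.sum(mask, labeled, range(1, n_components + 1))
        mask = labeled == (np.argmax(sizes) + 1)
    # Fill internal holes so the mask covers the entire breast volume
    return ndimage.binary_fill_holes(mask)
```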
Correcting Step
[0059] In some embodiments, the method 100 may further include a correcting step 130 for correcting the bias of the image, e.g., the cupping artifacts. The correcting step 130 may be based on a nonparametric coarse-to-fine approach that allows the bias fields to be modeled with different frequency ranges.
[0060] The correcting step 130 may be any known entropy-based method for bias field correction, wherein the bias field is assumed to be a multiplicative field (not an additive field). The entropy-based method may use an entropy-related cost function based on the combination of intensity and gradient image features. An entropy-related cost function is further described in, for example, Manjon et al., Med Image Anal., 2007, 11(4):336-345, which is incorporated by reference in its entirety. The value of a voxel in the bias field may depend on the position and tissue distribution, and the value of the voxel in the center of a real breast CT image may not be the smallest because gland tissue and fat have different attenuation coefficients.
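The entropy-related cost function of Manjon et al. is beyond the scope of a short example, but the multiplicative model itself can be illustrated with a simplified stand-in that estimates a smooth field and divides it out. In the sketch below, the Gaussian-smoothing estimate of the bias field, the smoothing width, and the function name are assumptions for illustration, not the entropy-based method of the embodiment.

```python
import numpy as np
from scipy import ndimage

def correct_multiplicative_bias(image, mask, sigma=30.0):
    """Illustrative correction under the multiplicative model
    observed = true * bias.  Here the bias field is approximated by a
    normalized, heavily smoothed copy of the image inside the breast
    mask; the embodiment instead estimates the field by minimizing an
    entropy-related cost function over several frequency ranges.
    """
    eps = 1e-6
    image = np.asarray(image, dtype=np.float64)
    smoothed = ndimage.gaussian_filter(image * mask, sigma)
    weight = ndimage.gaussian_filter(mask.astype(np.float64), sigma)
    bias = smoothed / np.maximum(weight, eps)   # smoothing restricted to the mask
    bias /= bias[mask].mean() + eps             # normalize to a unit-mean field
    corrected = np.where(mask, image / np.maximum(bias, eps), image)
    return corrected, bias
```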
[0061] The correcting step 130 may reduce or prevent misclassifications in the classifying step. The bias (cupping artifacts) can cause serious misclassifications when intensity-based segmentation algorithms are used. Li et al., Med Image Comput. Assist. Interv., 2008, 11(2):1083-1091. Essentially, the misclassification may be due to an overlap of the intensity range of different tissues introduced by the bias field, so that the voxels in different tissues are not separable based on their intensities.
[0062] Figure 4 illustrates an example of a quantified comparison 400 of original and corrected BCT images. Rows 410, 420, and 430 correspond to BCT images for three patients. Images 414, 424, and 434 show original BCT images for the three patients. Images 416, 426, and 436 show the corresponding corrected BCT images. Profiles 418, 428, and 438 show the corresponding profiles between the original and corrected BCT images through the dotted lines 412, 422, and 432, respectively.
[0063] Figure 5 illustrates an example of a visual comparison 500 showing how the bias correction corrects cupping artifacts in the original BCT images shown in Figure 2. Rows 510, 520, and 530 show the BCT images in the coronal direction or plane, transverse direction or plane, and the sagittal direction or plane, respectively. Images 512, 522, 532 show the original images in the respective planes. Images 514, 524, and 534 show the corresponding corrected images. Images 516, 526, and 536 show fusion images between the corresponding original and corrected images. Images 518, 528, and 538 show the corresponding multiplied bias field in the respective three directions.
[0064] Figure 6 illustrates an improvement in signal uniformity for corrected BCT images. Figure 6 illustrates a comparison 600 of four histograms 610, 620, 630, and 640 for original and corrected BCT images for four patients with different glandular tissue ratios, respectively. As shown in histograms 610, 620, 630, and 640, there is an improvement in signal uniformity. The corrected images (shown in solid lines) have narrower peaks and better separation of the fat, gland, and skin tissue signals as compared to the original images (shown in dotted lines).
Filtering Step
[0065] In some embodiments, the method may further include a filtering step 140 for generating a series of images at different scales. The filtering may be a multi-scale bilateral filtering. The filtering step 140 may be repeated until the number of images at the desired or predetermined scale is generated.
[0066] The filtering step 140 may filter each image from a coarse to a fine scale. The number of scales may be any number. In some embodiments, the number of scales may be three. In other embodiments, the number of scales may be less than three, for example, two, or more than three, for example, four, five, six, etc.
[0067] Bilateral filtering is a non-linear filtering technique. Such technique is further detailed in Tomasi and Manduchi, Computer Vision, 1998, Sixth International Conference, 839-846, which is incorporated by reference in its entirety. In some embodiments, the filter may be a weighted average of the local neighborhood samples, where the weights are computed based on the temporal (or spatial, in the case of images) and radiometric distance between the center sample and the neighboring samples. The bilateral filtering may smooth the images while preserving edges, by means of a nonlinear combination of nearby image values. Bilateral filtering may be described as follows:
$$h(x) = k^{-1}(x) \iint I(\xi)\, W_{\sigma_d}(\xi - x)\, W_{\sigma_r}(I(\xi) - I(x))\, d\xi \qquad (1)$$

[0068] with the normalization $k(x)$ that ensures that the weights for all the pixels add up to one:

$$k(x) = \iint W_{\sigma_d}(\xi - x)\, W_{\sigma_r}(I(\xi) - I(x))\, d\xi \qquad (2)$$
[0069] where $I(x)$ and $h(x)$ denote the input and output images, respectively, $W_{\sigma_d}$ measures the geometric closeness between the neighborhood center $x$ and a nearby point $\xi$, and $W_{\sigma_r}$ measures the photometric similarity between the pixel at the neighborhood center $x$ and that of a nearby point $\xi$. Thus, the similarity function $W_{\sigma_r}$ operates in the range of the image function $I$, while the closeness function $W_{\sigma_d}$ operates in the domain of $I$.
[0070] The bilateral filtering may be performed by many different kernels. In some embodiments, the bilateral filtering may be performed by Gaussian filtering. The Gaussian filtering may be shift-invariant Gaussian filtering, in which both the spatial function $W_{\sigma_d}$ and the range function $W_{\sigma_r}$ are Gaussian functions of the Euclidean distance between their arguments. More specifically, $W_{\sigma_d}$ may be described as:

$$W_{\sigma_d}(\xi - x) = e^{-\frac{1}{2}\left(\frac{d(\xi, x)}{\sigma_d}\right)^2} \qquad (3)$$

[0071] where $d(\xi, x) = \|\xi - x\|$ is the Euclidean distance. The range function $W_{\sigma_r}$ may be perfectly analogous to $W_{\sigma_d}$:

$$W_{\sigma_r}(I(\xi) - I(x)) = e^{-\frac{1}{2}\left(\frac{\delta(I(\xi), I(x))}{\sigma_r}\right)^2} \qquad (4)$$

[0072] where $\delta(I(\xi), I(x)) = |I(\xi) - I(x)|$ is a suitable measure of distance in intensity space.
[0073] In the scalar case, this may simply be the absolute difference of the pixel intensities. The Gaussian range filter may also be insensitive to overall additive changes of image intensity. Thus, the range filter may also be shift-invariant.
[0074] The filtering step 140 may smooth the images (suppress noise) while preserving the edges because the filtering generates a nonlinear combination of nearby image values. The range Gaussian may be an edge-stopping function. At each scale, the width of the range Gaussian may be reduced and the width of the spatial Gaussian may be increased to filter the breast images.
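A direct, unoptimized Python sketch of Equations (1)-(4) and of the multi-scale schedule just described follows; the starting widths of the spatial and range Gaussians, the factor-of-two schedule between scales, and the kernel radius are illustrative assumptions.

```python
import numpy as np

def bilateral_filter(img, sigma_d, sigma_r, radius):
    """Shift-invariant Gaussian bilateral filter per Eqs. (1)-(4),
    applied to a 2D slice.  O(N * radius^2), adequate for a sketch."""
    pad = np.pad(img, radius, mode="reflect")
    h, w = img.shape
    num = np.zeros_like(img, dtype=np.float64)
    den = np.zeros_like(img, dtype=np.float64)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            shifted = pad[radius + dy:radius + dy + h, radius + dx:radius + dx + w]
            weight = (np.exp(-0.5 * (dy ** 2 + dx ** 2) / sigma_d ** 2)   # closeness W_sigma_d
                      * np.exp(-0.5 * ((shifted - img) / sigma_r) ** 2))  # similarity W_sigma_r
            num += weight * shifted
            den += weight
    return num / den  # division by den is the normalization of Eq. (2)

def multiscale_bilateral(img, scales=3, sigma_d0=1.0, sigma_r0=100.0):
    """At each scale, widen the spatial Gaussian and narrow the range
    Gaussian, as described above; the starting widths are assumptions."""
    results, current = [], np.asarray(img, dtype=np.float64)
    for s in range(scales):
        sigma_d = sigma_d0 * (2 ** s)
        sigma_r = sigma_r0 / (2 ** s)
        current = bilateral_filter(current, sigma_d, sigma_r,
                                   radius=int(3 * sigma_d) + 1)
        results.append(current)
    return results
```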
[0075] Figure 7 shows an example 700 of quantified and qualitative comparison of two BCT images (in one direction) for two patients 710 and 720 after the filtering step. Images 712 and 722 show the original BCT images; Images 714 and 724 show filtered images after a first scale; and Images 716 and 726 show filtered images after a third scale. Profiles 718 and 728 show a quantified profile through dotted lines 711 and 721, respectively, through the original and different scale filtered images. These examples demonstrate how the filtering step may suppress noise while preserving the edges.
Processing Step
[0076] In some embodiments, the method 100 may further include a processing step 150. The processing step 150 may process the image(s) to classify the image(s) into different tissue constituents. The processing step 150 may include analyzing different parameter(s) of the image data to classify or identify the tissue constituents.
[0077] There may be any number of tissue constituents. In some embodiments, the number of different tissue constituents may be three. In other embodiments, the number of different tissue constituents may be more than three, for example, four, five, six, etc. The tissue constituents may include but are not limited to skin tissue, glandular tissue, fat tissue, calcifications, or masses (e.g., tumors).
[0078] In some embodiments, the processing step 150 may include a plurality of steps. As shown in Figure 8, the processing step 800 may include a step 810 of processing the data to first classify the image(s) into a first classification (e.g., first classified results). The step 810 of first classifying may be based on intensity information.
[0079] In some embodiments, the first classification may include a plurality of classes. In some embodiments, the first classifying step may classify the breast into at least two classes. In some embodiments, the first classifying step may classify the breast tissue into three classes or more than three classes.
[0080] The classes may correspond to different tissue constituents. In some embodiments, one of the classes may include a combination of two different tissue constituents, for example, skin and glandular tissue. The other classes may each correspond to a specific tissue constituent or a combination of tissue constituents.
[0081] For example, in some embodiments, the first classifying step may classify the tissue into a first class and a second class. One of the classes may include fat tissue, which may correspond to a first tissue constituent. The other one of the classes may include skin and glandular tissue, which may correspond to second and third tissue constituents, respectively. In some embodiments, the first classifying step may classify the tissue into additional classes. For example, the first classifying step may classify the tissue into a third class, e.g., calcifications, which may correspond to a fourth constituent.
[0082] The first classifying step may employ a fuzzy C-means classification technique. The first classifying step may employ any known fuzzy C-means classification algorithm. A fuzzy C-means classification technique is described in Wang, H. and Fei, B., Med. Image Anal., 2009, 13(2):193-202, which is hereby incorporated by reference in its entirety.
[0083] Fuzzy C-means (FCM) classification techniques may employ fuzzy partitioning to allow one pixel to belong to tissue types with different memberships graded between 0 and 1. FCM is an unsupervised algorithm and allows soft classification of each pixel, which possibly consists of several different tissue types. See, e.g., Pham, D.L. and Prince, J.L., IEEE Trans Med Imaging, 1999, 18(9):737-752. Generally, standard FCM methods may be sensitive to noise and cannot effectively compensate for intensity inhomogeneities. However, the correcting and filtering steps applied to the original CT images before the classifying step address these deficiencies.
[0084] The fuzzy C-means (FCM) algorithm may be an iterative method that produces an optimal $c$-partition for the image $\{x_j\}_{j=1}^{N}$ by minimizing the weighted inter-group sum of squared error objective function $J_{FCM}$. The FCM may be described as:

$$J_{FCM} = \sum_{i=1}^{N} \sum_{k=1}^{c} u_{ik}^{p} \left\| x_i - v_k \right\|^2 \qquad (5)$$

[0086] where $\{v_k\}_{k=1}^{c}$ are the characterized intensity centers of the $c$ classes, and $c$ is the number of underlying tissue types in the images, which is given before classification. The $u_{ik}$ represents the possibility of the voxel $i$ belonging to the class $k$ and requires $u_{ik} \in [0, 1]$ and $\sum_{k=1}^{c} u_{ik} = 1$ for any voxel $i$. The parameter $p$ is a weighting exponent on each fuzzy membership and is set as 2.
[0087] As the FCM objective function is minimized, each voxel may be assigned a high membership to a class whose center is close to the intensity of the voxel, and a low membership is given when the voxel intensity is far from the class centroid. Generally, a hard or crisp classification is reached by assigning each voxel solely to the class that has the highest membership value for the voxel. See, e.g., Wang et al., Medical Imaging, Proc. SPIE, 2007, 6512; and Wang et al., Med. Image Anal., 2009, 13(2):193-202.
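As a non-limiting sketch, the iteration that minimizes Equation (5) may be written as follows; the random initialization of the class centers, the fixed iteration count, and the function name are illustrative assumptions rather than the embodiment's implementation.

```python
import numpy as np

def fuzzy_c_means(intensities, c=2, p=2.0, n_iter=50, seed=0):
    """Minimal FCM over a flat array of voxel intensities (Eq. 5).
    Returns memberships u (N x c) and class centers v; p = 2 as above."""
    rng = np.random.default_rng(seed)
    x = np.asarray(intensities, dtype=np.float64).reshape(-1, 1)   # N x 1
    v = rng.choice(x.ravel(), size=c, replace=False).reshape(1, c) # initial centers
    for _ in range(n_iter):
        d2 = (x - v) ** 2 + 1e-12                                  # N x c squared distances
        # u_ik = 1 / sum_j (d_ik^2 / d_ij^2)^(1/(p-1)); rows sum to one
        u = 1.0 / np.sum((d2[:, :, None] / d2[:, None, :]) ** (1.0 / (p - 1.0)), axis=2)
        # v_k = sum_i u_ik^p x_i / sum_i u_ik^p  (membership-weighted centers)
        v = ((u ** p).T @ x / np.sum(u ** p, axis=0)[:, None]).reshape(1, c)
    return u, v.ravel()

# Hard (crisp) classification: assign each voxel to its highest membership
# labels = np.argmax(u, axis=1)
```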
[0088] Figure 9 illustrates an example 900 of a binary classified result with two classes of images in the three directions. Rows 910, 920, and 930 show the BCT images in the coronal direction or plane, transverse direction or plane, and the sagittal direction or plane, respectively. Images 912, 922, 932 (in the first column) show the original images in the respective planes. Images 916, 926, and 936 (in the third column) show the corresponding classified images. Images 914, 924, and 934 (in the second column) show fusion images between the corresponding original and classified images with only one class. The fusion images demonstrate how the first classifying step may separate the skin and glandular tissue from the fat tissue. The classified results also maintain many small details after the previous steps of bias correction and multi-scale filtering.
[0089] In some embodiments, the processing step 800 may include a step 820 of further or second classifying the first classification. The step 820 of second classifying may include processing a portion of the first classification. In some embodiments, the second classifying step may further classify at least one of the classes. The second classifying step may be based on position information.
[0090] In some embodiments, the second classifying step may classify one of the classes that includes a combination of tissue constituents into its respective tissue constituent. In some embodiments, the class that includes the skin and glandular tissues may be classified into respective tissue constituents. The second classifying step may also classify other combination classes.
[0091] In some embodiments, the second classifying step may employ morphologic processing techniques. In some embodiments, the morphologic processing techniques may include erosion operations.
[0092] Figure 10 illustrates an example of a step of second classifying according to some embodiments. As shown in Figure 10, the second classifying step 1000 may include a step 1010 of obtaining a first breast mask. The first breast mask may be the whole breast mask. The first breast mask may be generated by applying a threshold to the image. In some embodiments, the first breast mask may be generated during the preprocessing step. In other embodiments, the first breast mask may be generated during other steps.
[0093] The classifying step 1000 may further include a step 1020 of obtaining a second breast mask. The second breast mask may be a mask of the skin. The mask of the skin may be generated by performing erosion operations.
[0094] It has been reported that the skin thickness is 1.45±0.30 mm. See, e.g., Huang et al., Med Phys., 2008, 35(4):1199-1206; and Willson et al., Clinical Radiology, 2001, 33(6):691-693. The number of voxels spanning the skin may be determined based on the resolution of the breast CT images. In some embodiments, for example, the breast CT voxel size may be 0.273×0.273×0.273 mm³. The skin thickness may be restrained within a predetermined number of voxels. The predetermined number of voxels may be any number, for example, between and including three voxels and thirteen voxels. In some embodiments, the predetermined number may be seven voxels.

[0095] In some embodiments, a 9×9×9 box may be used to perform erosion operations to generate the mask for the skin. In other embodiments, another size box may be used. For example, a 7×7×7 box may be used to generate the mask for the skin.
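A sketch of this erosion-based separation, assuming the 9×9×9 box and SciPy's binary morphology routines, might read as follows; the function and variable names are hypothetical.

```python
import numpy as np
from scipy import ndimage

def separate_skin_and_gland(breast_mask, skin_gland_class, box=9):
    """Split the skin+glandular FCM class using position information.

    `breast_mask` is the whole-breast mask and `skin_gland_class` is the
    binary map of the combined skin/glandular class from the first
    classification.  Eroding the breast mask with a box-shaped
    structuring element strips a rim whose width roughly matches the
    expected skin thickness; class voxels on the rim are labeled skin,
    and class voxels in the interior are labeled gland.
    """
    structure = np.ones((box, box, box), dtype=bool)
    interior = ndimage.binary_erosion(breast_mask, structure=structure)
    skin_rim = breast_mask & ~interior
    skin = skin_gland_class & skin_rim
    gland = skin_gland_class & interior
    return skin, gland
```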
[0096] After the skin mask is generated, the step 1000 of second classifying may include a step 1030 of applying the skin mask to the first classification (the FCM classification). This step may classify the skin and glandular tissue constituents by separating skin and glandular tissue.
[0097] In breast CT images, skin and glandular tissue have similar CT values. See, e.g., Nelson et al., Med. Phys., 2008, 35(3):1078-1086. It can thus be difficult to differentiate them based on the intensity information alone. The method overcomes this difficulty by classifying the skin and glandular tissue based on position information.
[0098] The method may further include a step 830 of generating a classified breast image(s). In some embodiments, the classified image may vary the appearance of the different tissue constituents. This may include but is not limited to different contrast, brightness, and color. The classified image may be generated after any of the processing steps, e.g., after the first classification and/or the second classification.
[0099] Figure 11 shows an example of the processed images of a breast in the three directions. Rows 1110, 1120, and 1130 show the BCT images in the coronal direction or plane, transverse direction or plane, and the sagittal direction or plane, respectively. Images 1112, 1122, and 1132 (in the first column) show the original images in the respective planes. Images 1114, 1124, and 1134 (in the second column) show the generated, corresponding skin mask after the morphologic operations, for example, after step 1020. Images 1116, 1126, and 1136 (in the third column) show the generated classified image illustrating three tissue constituents: skin, glandular, and fat tissue. For example, the image 1116 includes glandular tissue 1146, fat tissue 1144, and skin tissue 1142. The tissue constituents (fat, gland, and skin) have different values (1, 2, and 3, respectively) as compared to the background, which has a value of 0.
Quantifying Step
[00100] In some embodiments, the method 100 may further include a step 160 of quantifying the classified breast tissue image. The quantifying step may include but is not limited to determining quantitative measurements for breast tissue composition, tissue density and distribution with respect to age. For example, glandular tissue fraction may be determined.
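For illustration, the per-slice fractional composition of a labeled volume (using the fat = 1, gland = 2, skin = 3 convention of Figure 11) might be computed as in the sketch below; the slice axis and the label values are assumptions carried over from that example.

```python
import numpy as np

def composition_by_slice(labels, fat=1, gland=2, skin=3):
    """Per-slice fractional composition of a labeled breast volume.
    Axis 0 is taken as the nipple-to-base slice direction."""
    fractions = []
    for slice_labels in labels:
        breast_voxels = np.count_nonzero(slice_labels)  # nonzero = inside breast
        if breast_voxels == 0:
            fractions.append((0.0, 0.0, 0.0))
            continue
        fractions.append(tuple(
            np.count_nonzero(slice_labels == tissue) / breast_voxels
            for tissue in (fat, gland, skin)))
    return np.array(fractions)  # columns: fat, gland, skin fractions

# Total glandular tissue fraction over the whole volume:
# gland_fraction = np.count_nonzero(labels == 2) / np.count_nonzero(labels)
```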
[00101] In some embodiments, the method may further include analyzing the quantitative measurements to detect, as well as diagnose, breast cancer. In some embodiments, the analyzing may include determining the type of breast cancer. In some embodiments, the analyzing may include analyzing the density and/or Hounsfield number. There have been reports of differences in the density and Hounsfield number associated with different types of cancers compared to normal breast tissues. See, e.g., Johns et al., Phys. Med. Biol., 1987, 32(6):675.
[00102] In some embodiments, the method may further include processing the quantitative measurements to generate a simulated breast compression for comparison to mammography. The processing may be performed using finite-element analysis algorithms simulating classified breast data. While the present work focuses on classification of breast tissue, with improvements in image signal-to-noise performance, it may be possible to differentiate between breast cancers and normal glandular tissue using the same strategies.
[00103] Figure 12 shows an example of a fractional composition of a fatty replacement breast. The fractional composition plotted is determined by slice, whereas the percentage given for each tissue type represents the total tissue composition. The fractional composition plot 1200 includes one midbreast slice of the breast CT 1210. The breast CT 1210 includes the original image 1212, the corresponding segmented image 1216, and the fusion image 1214. The composition analysis 1220 illustrated on the bottom is through all slices (from nipple to base). For this example, the total breast volume was 886 cc with a volume fractional composition of 6.8% skin, 84.7% fat, and 6.8% gland.
Outputting Step
[00104] The method 100 may further include a step 170 of outputting the classified images. The image may be outputted after the processing step 150 or the quantifying step 160. In some embodiments, the outputting may include but is not limited to displaying the classified image(s), printing the classified image(s), and storing the classified image(s) remotely or locally. In other embodiments, the classified image may be forwarded for further processing.
[00105] The classified image(s) may include a plurality of (sliced) images in different directions. The classified image may include a 3D image of the breast. In some embodiments, the classified image may include quantitative measurements. In some embodiments, the measurements may be presented in a graph or plot.
SYSTEM IMPLEMENTATION
[00106] Figure 13 shows an example of a system 1300 configured to process an image of a breast to classify the tissue constituents. The system for carrying out the embodiments of the methods disclosed herein is not limited to the system shown in Figure 13. Other systems may be used.
[00107] In some embodiments, the system 1300 may include an image acquisition device 1310 configured to acquire the image data of a patient. The image acquisition system or device 1310 may be any device configured to acquire images from a CT scan. The image acquisition device 1310 may be a dedicated breast-computed tomography (BCT) system or device. Examples of a BCT system may include but are not limited to systems being developed by a team of researchers led by John M. Boone, PhD, at the University of California, Davis (as described in, for example, Boone et al., Med Phys., 2005, 32(12):3767-3776; Boone et al., Radiology, 2001, 221(3):657-667; and Boone et al., Med Phys., 2004, 31(2):226-235); systems being developed by Koning Corporation; and systems being developed by Zumatek Corporation.
[00108] The system 1300 may further include a computer system 1320 to carry out the classifying of the tissue and the generating of a classified image. The computer system 1320 may further be used to control the operation of the system, or a separate computer system may be included.
[00109] The computer system 1320 may also be communicably connected to another computer system as well as a wired or wireless network. The computer system 1320 may receive or obtain the image data from the image acquisition device 1310 or from another module provided on the network, for example, a medical image storage device 1312, such as a picture archiving and communication system (PACS) image storage.
[00110] The computer system 1320 may include a number of modules that communicate with each other through electrical and/or data connections (not shown). Data connections may be direct wired links or may be fiber optic connections or wireless communications links or the like. The computer system 1320 may also be connected to permanent or back-up memory storage, a network, or may communicate with a separate system control through a link (not shown). The modules may include a CPU 1322, a memory 1324, an image processor 1330, an input device 1326, a display 1328, and a printer interface 1329.
[00111] The CPU 1322 may be any known central processing unit, a processor, or a microprocessor. The CPU 1322 may be coupled directly or indirectly to memory elements. The memory 1324 may include random access memory (RAM), read only memory (ROM), disk drive, tape drive, etc., or combinations thereof. The memory may also include a frame buffer for storing image data arrays.
[00112] The present disclosure may be implemented as a routine that is stored in memory 1324 and executed by the CPU 1322. As such, the computer system 1320 may be a general purpose computer system that becomes a specific purpose computer system when executing the routine of the disclosure.
[00113] The computer system 1320 may also include an operating system and micro instruction code. The various processes and functions described herein may either be part of the micro instruction code or part of the application program or routine (or combination thereof) that is executed via the operating system. In addition, various other peripheral devices may be connected to the computer platform such as an additional data storage device, a printing device, and I/O devices.
[00114] The input device 1326 may include a mouse, joystick, keyboard, track ball, touch activated screen, light wand, voice control, or any similar or equivalent input device, and may be used for interactive geometry prescription. The input device 1326 may control the production and display of images on the display 1328, and the printing of the images by the printer interface 1329. The display 1328 may be any known display screen, and the printer interface 1329 may connect to any known printer, either locally or network connected.
[00115] The image processor 1330 may be any known central processing unit, a processor, or a microprocessor. In some embodiments, the image processor 1330 may process and classify the image to generate classified images. The image processor 1330 may also quantify the classified images. In other embodiments, the image processor 1330 may be replaced by image processing functionality on the CPU 1322.
[00116] In some embodiments, the image processor 1330 may be configured to process and classify the images (data) from the image acquisition device 1310 and/or the medical image storage device 1312. In some embodiments, the image processor 1330 may be configured to process and classify the tissue of the image (data) to generate a classified image. In some embodiments, the image processor 1330 may be configured to quantify the classified tissue of the image, for example, determine the breast tissue composition, tissue density and distribution with respect to age. In some embodiments, the image processor 1330 may also carry out finite-element analysis algorithms to simulate breast compression for comparing the classified BCT image to mammography.
[00117] In some embodiments, the image processor 1330 may include a first classifier 1332. The first classifier 1332 may be configured to carry out the FCM classification to classify a BCT image(s) to determine a (first) classification. The (first) classification may include a plurality of tissue classes, one of which includes skin and glandular tissue.
[00118] In some embodiments, the image processor 1330 may include a second classifier 1334. The second classifier 1334 may be configured to carry out morphological processing, including erosion operations, to classify the (first) classification (results) from the first classifier 1332 to determine a (second) classification. The second classifier 1334 may be configured to further classify the one of the classes into skin and glandular tissue.
[00119] In some embodiments, the image processor 1330 may include a tissue quantifier 1336. The tissue quantifier 1336 may be configured to carry out quantification of the classified tissue. The tissue quantifier 1336 may be configured to determine parameters of the classified tissue. The parameters may include but are not limited to the breast tissue composition, tissue density and distribution with respect to age.
[00120] In some embodiments, the classified and/or quantified image (data) may be stored in the memory 1324. In other embodiments, another computer system may assume the image reconstruction or other functions of the image processor 1330. In response to commands received from the input device 1326, the image data stored in the memory 1324 may be archived in long term storage or may be further processed by the image processor 1330 and presented on the display 1328.
[00121] It is to be understood that the embodiments of the disclosure may be implemented in various forms of hardware, software, firmware, special purpose processes, or a combination thereof. In one embodiment, the disclosure may be implemented in software as an application program tangibly embodied on a computer readable program storage device. The application program may be uploaded to, and executed by, a machine comprising any suitable architecture. The system and method of the present disclosure may be implemented in the form of a software application running on a computer system, for example, a mainframe, personal computer (PC), handheld computer, server, etc. The software application may be stored on recording media locally accessible by the computer system and accessible via a hard wired or wireless connection to a network, for example, a local area network, or the Internet.
[00122] It is to be further understood that, because some of the constituent system components and method steps depicted in the accompanying figures can be implemented in software, the actual connections between the systems components (or the process steps) may differ depending upon the manner in which the disclosure is programmed. Given the teachings of the disclosure provided herein, one of ordinary skill in the related art will be able to contemplate these and similar implementations or configurations of the disclosure.
[00123] All references cited herein are hereby incorporated by reference in their entirety.
[00124] While various embodiments of the disclosure have been described, the description is intended to be exemplary rather than limiting, and it will be apparent to those of ordinary skill in the art that many more embodiments and implementations are possible that are within the scope of the disclosure.

CLAIMS

What is claimed:
1. A method for processing at least one image of a breast to determine a plurality of tissue constituents of the breast, the at least one image including image data, comprising:
processing the image to determine at least a first classification of the tissue constituents represented by the image and a second classification of the tissue constituents represented by the image, the second tissue classification being based, at least in part, on the first classification; and generating at least one classified image of the breast based on the processing,
wherein the first classification is based on intensity information and the second classification is based on position information.
2. The method according to claim 1, further comprising:
correcting bias of the image; and
filtering the corrected image with a multi-scale filter,
wherein the processing includes processing a corrected and filtered image.
3. The method of claim 1, wherein
the tissue constituents include at least a first constituent, a second constituent and a third constituent;
the processing includes first classifying the image into the first classification, the first classification including at least two classes, a first class including the first constituent and the second constituent, and the second class including the third constituent.
4. The method of claim 3, wherein the first constituent corresponds to glandular tissue, the second constituent corresponds to skin tissue, and the third constituent corresponds to fat tissue.
5. The method of claim 3, wherein the first classifying includes applying a fuzzy C-means classifier to the image.
6. The method of claim 3, wherein the processing includes second classifying the image into the second classification, wherein the second classifying includes separating the first class into the first and second constituents.
7. The method of claim 6, wherein the second classifying includes applying erosion operations to at least a portion of the first classification.
8. The method of claim 6, the method further comprising:
obtaining a first mask, the first mask being of a whole breast;
obtaining a second mask, the second mask being of skin; and
applying the second mask to the first classification to classify the skin and glandular tissue.
9. The method of claim 1, further comprising:
quantifying the classified image, wherein the quantifying includes determining quantitative measurements for at least one of breast tissue composition, tissue density or distribution with respect to age.
10. The method of claim 1, further comprising:
outputting the classified image.
11. A computer-readable storage medium storing instructions for processing at least one image of a breast to determine a plurality of tissue constituents of the breast, the at least one image including image data, the instructions comprising:
processing the image to determine at least a first classification of the tissue constituents represented by the image and a second classification of the tissue constituents represented by the image, the second tissue classification being based, at least in part, on the first classification; and generating at least one classified image of the breast based on the first and second classifications,
wherein the first classification is based on intensity information and the second classification is based on position information.
12. The medium according to claim 11, further comprising instructions for:
correcting bias of the image; and
filtering the corrected image with a multi-scale filter,
wherein the processing processes a corrected and filtered image.
13. The medium according to claim 11, wherein:
the tissue constituents include at least a first constituent, a second constituent and a third constituent;
the processing includes first classifying the image into the first classification, the first classification including at least two classes, a first class including a first constituent and a second constituent, and the second class including a third constituent.
14. The medium according to claim 13, wherein the first constituent corresponds to glandular tissue, the second constituent corresponds to skin tissue, and the third constituent corresponds to fat tissue.
15. The medium according to claim 11, wherein the processing includes second classifying the image into the second classification, wherein the second classifying includes separating the first class into the first and second constituents.
16. The medium according to claim 15, wherein:
the first classifying includes applying a fuzzy C-means classifier to the image; and the second classifying includes applying erosion operations to at least a portion of the first classification.
17. The medium according to claim 11, further comprising instructions for:
quantifying the classified image, wherein the quantifying includes determining quantitative measurements for at least one of breast tissue composition, tissue density or distribution with respect to age.
18. The medium according to claim 11, further comprising instructions for:
outputting the classified image.
19. A system for processing at least one image of a breast to determine a plurality of tissue constituents of the breast, the at least one image including image data, comprising:
an image processor, the image processor being configured to:
process the image to determine at least a first classification of the tissue constituents represented by the image and a second classification of the tissue constituents represented by the image, the second tissue classification being based, at least in part, on the first classification; and generate at least one classified image of the breast based on the processing, wherein the first classification is based on intensity information and the second classification is based on position information.
20. The system according to claim 19, wherein the tissue constituents include glandular tissue, skin tissue, and fat tissue.
PCT/US2012/024824 2011-02-11 2012-02-13 Systems, methods and computer readable storage mediums storing instructions for classifying breast ct images WO2012109643A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161441826P 2011-02-11 2011-02-11
US61/441,826 2011-02-11

Publications (2)

Publication Number Publication Date
WO2012109643A2 true WO2012109643A2 (en) 2012-08-16
WO2012109643A3 WO2012109643A3 (en) 2012-10-18

Family ID: 46639240

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2012/024824 WO2012109643A2 (en) 2011-02-11 2012-02-13 Systems, methods and computer readable storage mediums storing instructions for classifying breast ct images

Country Status (1)

Country Link
WO (1) WO2012109643A2 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5452367A (en) * 1993-11-29 1995-09-19 Arch Development Corporation Automated method and system for the segmentation of medical images
JPH1011604A (en) * 1996-06-19 1998-01-16 Hitachi Medical Corp Shading method by volume rendering method and device therefor
US20080187095A1 (en) * 2005-05-03 2008-08-07 The Regents Of The University Of California Biopsy Systems For Breast Computed Tomography


Also Published As

Publication number Publication date
WO2012109643A3 (en) 2012-10-18


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12744308

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12744308

Country of ref document: EP

Kind code of ref document: A2