System and method for targeted object enhancement to generate synthetic breast tissue images

Information

  • Patent Grant
  • Patent Number
    11,445,993
  • Date Filed
    Wednesday, March 28, 2018
  • Date Issued
    Tuesday, September 20, 2022
Abstract
A method for processing breast tissue image data includes obtaining image data of a patient's breast tissue; processing the image data to generate a set of image slices, the image slices collectively depicting the patient's breast tissue; feeding image slices of the set through each of a plurality of object-recognizing modules, each of the object-recognizing modules being configured to recognize a respective type of object that may be present in the image slices; combining objects recognized by the respective object-recognizing modules to generate a synthesized image of the patient's breast tissue; and displaying the synthesized image.
Description
FIELD

The presently disclosed inventions relate generally to breast imaging techniques such as tomosynthesis, and more specifically to systems and methods for obtaining, processing, synthesizing, storing and displaying a breast imaging data set or a subset thereof. In particular, the present disclosure relates to implementing one or more target object recognition/synthesis modules to identify respective objects in a tomosynthesis stack, and to combine results from the one or more target object recognition/synthesis modules to generate objects to display in one or more synthesized images.


BACKGROUND

Mammography has long been used to screen for breast cancer and other abnormalities. Traditionally, mammograms have been formed on x-ray film. More recently, flat panel digital imagers have been introduced that acquire a mammogram in digital form, thereby facilitating analysis and storage of the acquired image data and providing other benefits as well. Further, substantial attention and technological development have been dedicated to obtaining three-dimensional images of the breast using methods such as breast tomosynthesis. In contrast to the 2D images generated by legacy mammography systems, breast tomosynthesis systems construct a 3D image volume from a series of 2D projection images, each projection image obtained at a different angular displacement of an x-ray source relative to the image detector as the x-ray source is scanned over the detector. The constructed 3D image volume is typically presented as a plurality of slices of image data, the slices being mathematically reconstructed on planes typically parallel to the imaging detector. The reconstructed tomosynthesis slices reduce or eliminate the problems caused by tissue overlap and structure noise present in single-slice, two-dimensional mammography imaging, by permitting a user (e.g., a radiologist or other medical professional) to scroll through the image slices to view only the structures present in a given slice.


Imaging systems such as tomosynthesis systems have recently been developed for breast cancer screening and diagnosis. In particular, Hologic, Inc. (www.hologic.com) has developed a fused, multimode mammography/tomosynthesis system that acquires one or both types of mammogram and tomosynthesis images, either while the breast remains immobilized or in different compressions of the breast. Other companies have introduced systems that include tomosynthesis imaging, e.g., systems that do not include the ability to also acquire a mammogram in the same compression.


Examples of systems and methods that leverage existing medical expertise to facilitate the transition to tomosynthesis technology are described in U.S. Pat. No. 7,760,924, which is hereby incorporated by reference in its entirety. In particular, U.S. Pat. No. 7,760,924 describes a method of generating a synthesized 2D image, which may optionally be displayed along with tomosynthesis projection or reconstructed images, in order to assist in screening and diagnosis.


The 2D synthesized image is designed to provide a concise representation of the 3D reconstruction slices, including any clinically important and meaningful information, such as abnormal lesions and normal breast structures, while resembling a traditional 2D image in relevant respects. There are many different types of lesions and breast structures, which may be defined as different types of image objects having different characteristics. For any given image object visible in the 3D volume data, it is important to preserve and enhance its image characteristics (e.g., micro-calcifications, architectural distortions, etc.) as much as possible in the 2D synthesized image. To achieve the enhancement of the targeted image object, it is critical to accurately identify and represent the image object present in the 3D tomosynthesis data.


SUMMARY

In one embodiment of the disclosed inventions, a method for processing breast tissue image data includes obtaining image data of a patient's breast tissue; processing the image data to generate a set of image slices, the image slices collectively depicting the patient's breast tissue; feeding image slices of the set through each of a plurality of object-recognizing modules, each of the object-recognizing modules being configured to recognize a respective type of object that may be present in the image slices; combining objects recognized by the respective object-recognizing modules to generate a synthesized image of the patient's breast tissue; and displaying the synthesized image.


These and other aspects and embodiments of the disclosed inventions are described in more detail below, in conjunction with the accompanying figures.





BRIEF DESCRIPTION OF THE FIGURES

The drawings illustrate the design and utility of embodiments of the disclosed inventions, in which similar elements are referred to by common reference numerals. These drawings are not necessarily drawn to scale. In order to better appreciate how the above-recited and other advantages and objects are obtained, a more particular description of the embodiments will be rendered, which are illustrated in the accompanying drawings. These drawings depict only typical embodiments of the disclosed inventions and are therefore not to be considered limiting of their scope.



FIG. 1 is a block diagram illustrating the flow of data through an exemplary breast image acquisition and processing system in accordance with embodiments of the disclosed inventions;



FIG. 2 is a block diagram illustrating the flow of data through a 2D synthesizer that utilizes multiple target object recognition/enhancement modules to identify respective objects in an image stack in accordance with embodiments of the disclosed inventions;



FIG. 3 illustrates one embodiment of applying target object recognition/enhancement modules on an image stack to recognize respective objects and reduce the objects onto the 2D synthesized image;



FIG. 4 illustrates a flow of data when applying a single target object recognition/enhancement module on an image stack;



FIG. 5 illustrates a flow of data when applying multiple target object recognition/enhancement modules on an image stack;



FIGS. 6A and 6B illustrate a sequential combination technique of combining data from multiple target object synthesis modules;



FIGS. 7A and 7B illustrate a parallel combination technique of combining data from multiple target object synthesis modules; and



FIGS. 8A and 8B illustrate two example flow diagrams of generating 2D synthesized images using the sequential combination and parallel combination techniques respectively.





DETAILED DESCRIPTION OF THE ILLUSTRATED EMBODIMENTS

All numeric values are herein assumed to be modified by the terms “about” or “approximately,” whether or not explicitly indicated, wherein the terms “about” and “approximately” generally refer to a range of numbers that one of skill in the art would consider equivalent to the recited value (i.e., having the same function or result). In some instances, the terms “about” and “approximately” may include numbers that are rounded to the nearest significant figure. The recitation of numerical ranges by endpoints includes all numbers within that range (e.g., 1 to 5 includes 1, 1.5, 2, 2.75, 3, 3.80, 4, and 5).


As used in this specification and the appended claims, the singular forms “a”, “an”, and “the” include plural referents unless the content clearly dictates otherwise. As used in this specification and the appended claims, the term “or” is generally employed in its sense including “and/or” unless the content clearly dictates otherwise. In describing the depicted embodiments of the disclosed inventions illustrated in the accompanying figures, specific terminology is employed for the sake of clarity and ease of description. However, the disclosure of this patent specification is not intended to be limited to the specific terminology so selected, and it is to be understood that each specific element includes all technical equivalents that operate in a similar manner. It is to be further understood that the various elements and/or features of different illustrative embodiments may be combined with each other and/or substituted for each other wherever possible within the scope of this disclosure and the appended claims.


Various embodiments of the disclosed inventions are described hereinafter with reference to the figures. It should be noted that the figures are not drawn to scale and that elements of similar structures or functions are represented by like reference numerals throughout the figures. It should also be noted that the figures are only intended to facilitate the description of the embodiments. They are not intended as an exhaustive description of the invention or as a limitation on the scope of the disclosed inventions, which is defined only by the appended claims and their equivalents. In addition, an illustrated embodiment of the disclosed inventions need not have all the aspects or advantages shown. For example, an aspect or an advantage described in conjunction with a particular embodiment of the disclosed inventions is not necessarily limited to that embodiment and can be practiced in any other embodiments even if not so illustrated.


For the following defined terms and abbreviations, these definitions shall be applied throughout this patent specification and the accompanying claims, unless a different definition is given in the claims or elsewhere in this specification:


An “acquired image” refers to an image generated while visualizing a patient's tissue. Acquired images can be generated by radiation from a radiation source impinging on a radiation detector, with the source and detector disposed on opposite sides of the patient's tissue, as in a conventional mammogram.


A “reconstructed image” refers to an image generated from data derived from a plurality of acquired images. A reconstructed image simulates an acquired image not included in the plurality of acquired images.


A “synthesized image” refers to an artificial image generated from data derived from a plurality of acquired and/or reconstructed images. A synthesized image includes elements (e.g., objects and regions) from the acquired and/or reconstructed images, but does not necessarily correspond to an image that can be acquired during visualization. Synthesized images are constructed analysis tools.


An “Mp” image is a conventional mammogram or contrast enhanced mammogram, which is a two-dimensional (2D) projection image of a breast, and encompasses both a digital image as acquired by a flat panel detector or another imaging device, and the image after conventional processing to prepare it for display (e.g., to a health professional), storage (e.g., in the PACS system of a hospital), and/or other use.


A “Tp” image is an image that is similarly two-dimensional (2D), but is acquired at a respective tomosynthesis angle between the breast and the origin of the imaging x rays (typically the focal spot of an x-ray tube), and encompasses the image as acquired, as well as the image data after being processed for display, storage, and/or other use.


A “Tr” image is a type (or subset) of a reconstructed image that is reconstructed from tomosynthesis projection images Tp, for example, in the manner described in one or more of U.S. Pat. Nos. 7,577,282, 7,606,801, 7,760,924, and 8,571,289, the disclosures of which are fully incorporated by reference herein in their entirety, wherein a Tr image represents a slice of the breast as it would appear in a projection x ray image of that slice at any desired angle, not only at an angle used for acquiring Tp or Mp images.


An “Ms” image is a type (or subset) of a synthesized image, in particular, a synthesized 2D projection image that simulates mammography images, such as craniocaudal (CC) or mediolateral oblique (MLO) images, and is constructed using tomosynthesis projection images Tp, tomosynthesis reconstructed images Tr, or a combination thereof. Ms images may be provided for display to a health professional or for storage in the PACS system of a hospital or another institution. Examples of methods that may be used to generate Ms images are described in the above-incorporated U.S. Pat. Nos. 7,760,924 and 8,571,289.


It should be appreciated that Tp, Tr, Ms and Mp image data encompasses information, in whatever form, that is sufficient to describe the respective image for display, further processing, or storage. The respective Mp, Ms, Tp and Tr images are typically provided in digital form prior to being displayed, with each image being defined by information that identifies the properties of each pixel in a two-dimensional array of pixels. The pixel values typically relate to respective measured, estimated, or computed responses to X-rays of corresponding volumes in the breast, i.e., voxels or columns of tissue. In a preferred embodiment, the geometry of the tomosynthesis images (Tr and Tp) and mammography images (Ms and Mp) is matched to a common coordinate system, as described in U.S. Pat. No. 7,702,142. Unless otherwise specified, such coordinate system matching is assumed to be implemented with respect to the embodiments described in the ensuing detailed description of this patent specification.


The terms “generating an image” and “transmitting an image” respectively refer to generating and transmitting information that is sufficient to describe the image for display. The generated and transmitted information is typically digital information.


In order to ensure that a synthesized 2D image displayed to an end-user (e.g., an Ms image) includes the most clinically relevant information, it is necessary to detect and identify three-dimensional (3D) objects, such as malignant breast masses, tumors, etc., within the breast tissue. Towards this end, in accordance with embodiments of the presently disclosed inventions, 3D objects may be identified using multiple target object recognition/synthesis modules, wherein each target recognition/synthesis module may be configured to identify and reconstruct a particular type of object. These multiple target synthesis modules may work together in combining information pertaining to respective objects during the reconstruction process of generating one or more synthesized 2D images, ensuring that each object is represented accurately, and preserving clinically significant information on the 2D synthesized images that are then displayed to the end-user.



FIG. 1 illustrates the flow of data in an exemplary image generation and display system 100, which incorporates each of synthesized image generation, object identification, and display technology. It should be understood that, while FIG. 1 illustrates a particular embodiment of a flow diagram with certain processes taking place in a particular serial order or in parallel, the claims and various other embodiments described herein are not limited to the performance of the image processing steps in any particular order, unless so specified.


More particularly, the image generation and display system 100 includes an image acquisition system 101 that acquires tomosynthesis image data for generating Tp images of a patient's breasts, using the respective three-dimensional and/or tomosynthesis acquisition methods of any of the currently available systems. If the acquisition system is a combined tomosynthesis/mammography system, Mp images may also be generated. Some dedicated tomosynthesis systems or combined tomosynthesis/mammography systems may be adapted to accept and store legacy mammogram images (indicated by a dashed line and legend “Mplegacy” in FIG. 1) in a storage device 102, which is preferably a DICOM-compliant Picture Archiving and Communication System (PACS) storage device. Following acquisition, the tomosynthesis projection images Tp may also be transmitted to the storage device 102 (as shown in FIG. 1). The storage device 102 may further store a library of known 3D objects that may be used to identify significant 3D image patterns to the end-user. In other embodiments, a separate dedicated storage device (not shown) may be used to store the library of known 3D objects with which to identify 3D image patterns or objects.


The Tp images are transmitted from either the acquisition system 101, or from the storage device 102, or both, to a computer system configured as a reconstruction engine 103 that reconstructs the Tp images into reconstructed image “slices” Tr, representing breast slices of selected thickness and at selected orientations, as disclosed in the above-incorporated patents and applications.


Mode filters 107 are disposed between image acquisition and image display. The filters 107 may additionally include customized filters for each type of image (i.e., Tp, Mp, and Tr images) arranged to identify and highlight certain aspects of the respective image types. In this manner, each imaging mode can be tuned or configured in an optimal way for a specific purpose. For example, filters programmed for recognizing objects across various 2D image slices may be applied in order to detect image patterns that may belong to a particular high-dimensional object. The tuning or configuration may be automatic, based on the type of the image, or may be defined by manual input, for example through a user interface coupled to a display. In the illustrated embodiment of FIG. 1, the mode filters 107 are selected to highlight particular characteristics of the images that are best displayed in respective imaging modes, for example, geared towards identifying objects, highlighting masses or calcifications, identifying certain image patterns that may be constructed into a 3D object, or for creating 2D synthesized images (described below). Although FIG. 1 illustrates only one mode filter 107, it should be appreciated that any number of mode filters may be utilized in order to identify structures of interest in the breast tissue.
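By way of illustration only, the following Python sketch shows one way such a per-mode filter registry might be organized. The filter functions, their operations, and the image-type keys are hypothetical placeholders introduced here for clarity; they are not the filters actually employed by the described system.

```python
import numpy as np

# Hypothetical per-mode filters; the functions and keys below are illustrative
# placeholders, not the actual filters used by the described system.
def highlight_fine_detail(slice_2d: np.ndarray) -> np.ndarray:
    # Crude background removal to emphasize small, bright structures.
    return np.clip(slice_2d - slice_2d.mean(), 0, None)

def normalize_contrast(slice_2d: np.ndarray) -> np.ndarray:
    # Rescale intensities to [0, 1] to emphasize larger, low-contrast structures.
    return (slice_2d - slice_2d.min()) / (np.ptp(slice_2d) + 1e-9)

MODE_FILTERS = {
    "Tp": [highlight_fine_detail],
    "Tr": [highlight_fine_detail, normalize_contrast],
    "Mp": [normalize_contrast],
}

def apply_mode_filters(slice_2d: np.ndarray, image_type: str) -> np.ndarray:
    """Apply the filters registered for a given image type (e.g., Tp, Tr, Mp)."""
    out = slice_2d.astype(float)
    for f in MODE_FILTERS.get(image_type, []):
        out = f(out)
    return out
```

A registry of this kind would allow each imaging mode to be tuned independently, whether automatically based on image type or in response to manual input.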


The imaging and display system 100 further includes a 2D image synthesizer 104 that operates substantially in parallel with the reconstruction engine 103 for generating 2D synthesized images using a combination of one or more Tp, Mp, and/or Tr images. The 2D image synthesizer 104 consumes a set of input images (e.g., Mp, Tr and/or Tp images), determines a set of most relevant features from each of the input images, and outputs one or more synthesized 2D images. The synthesized 2D image is a consolidated image that condenses significant portions of various slices onto one image. This provides an end-user (e.g., medical personnel, radiologist, etc.) with the most clinically-relevant image data in an efficient manner, and reduces time spent on other images that may not have significant data.


One type of relevant image data to highlight in the synthesized 2D images would be relevant objects found across one or more Mp, Tr and/or Tp images. Rather than simply assessing image patterns of interest in each of the 2D image slices, it may be helpful to determine whether any of the 2D image patterns of interest belong to a larger high-dimensional structure, and if so, to combine the identified 2D image patterns into a higher-dimensional structure. This approach has several advantages, but in particular, by identifying high-dimensional structures across various slices/depths of the breast tissue, the end-user may be better informed as to the presence of a potentially significant structure that may not be easily visible in various 2D slices of the breast.


Further, instead of identifying similar image patterns in two 2D slices (that are perhaps adjacent to each other), and determining whether or not to highlight image data from one or both of the 2D slices, identifying both image patterns as belonging to the same high-dimensional structure may allow the system to make a more accurate assessment pertaining to the nature of the structure, and consequently provide significantly more valuable information to the end-user. Also, by identifying the high-dimensional structure, the structure can be more accurately depicted on the synthesized 2D image. Yet another advantage of identifying high-dimensional structures within the various captured 2D slices of the breast tissue relates to identifying a possible size/scope of the identified higher-dimensional structure. For example, once a structure has been identified, previously unremarkable image patterns that are somewhat proximate to the high-dimensional structure may now be identified as belonging to the same structure. This may provide the end-user with an indication that the high-dimensional structure is increasing in size/scope.


To this end, the 2D image synthesizer 104 employs a plurality of target object recognition/enhancement modules (also referred to as target object synthesis modules) that are configured to identify and reconstruct different types of objects. Each target image recognition/synthesis module may be applied (or “run”) on a stack (e.g., a tomosynthesis image stack) of 2D image slices of a patient's breast tissue, and work to identify particular types of objects that may be in the breast tissue, and ensure that such object(s) are represented in a clinically-significant manner in the resulting 2D synthesized image presented to the end-user. For example, a first target image synthesis module may be configured to identify calcifications in the breast tissue. Another target image synthesis module may be configured to identify and reconstruct spiculated lesions in the breast tissue. Yet another target image synthesis module may be configured to identify and reconstruct spherical masses in the breast tissue. In one or more embodiments, the multiple target image synthesis modules process the image slice data and populate respective objects in a high-dimensional grid (e.g., 3D grid) comprising respective high-dimensional structures (e.g., 3D objects) present in the breast tissue. This high-dimensional grid may then be utilized to accurately depict the various structures in the 2D synthesized image.


A high-dimensional object may refer to any object that comprises at least three or more dimensions, e.g., 3D or higher object, or a 3D or higher object and time dimension, etc. Examples of such objects or structures include, without limitation, calcifications, spiculated lesions, benign tumors, irregular masses, dense objects, etc. An image object may be defined as a certain type of image pattern that exists in the image data. The object may be a simple round object in a 3D space, and a corresponding flat round object in a 2D space. It can be an object with complex patterns and complex shapes, and it can be of any size or dimension. The concept of an object may extend past a locally bound geometrical object. Rather, the image object may refer to an abstract pattern or structure that can exist in any dimensional shape. It should be appreciated that the inventions disclosed herein are not limited to 3D objects and/or structures, and may include higher-dimensional structures. It should be appreciated that each of the target image synthesis modules is configured for identifying and reconstructing respective types of objects. These “objects” may refer to 2D shapes, 2D image patterns, 3D objects, or any other high-dimensional object, but in any event will all be referred to as “objects” or “3D objects” herein for simplicity, but this illustrative use should not be otherwise read as limiting the scope of the claims.


In the illustrated embodiment, the 2D synthesizer 104 comprises a plurality of target object recognition/enhancement modules (e.g., 110a, 110b . . . 110n), each configured for recognizing and enhancing a particular type of object. Each of the target object recognition/enhancement modules 110 may be run on a 2D image stack (e.g., Tr image stack), and is configured to identify the respective object (if any is/are present) therein. By identifying the assigned object in the 2D image stack, each target object recognition/enhancement module 110 works to ensure that the respective object is preserved and depicted accurately in the resulting 2D synthesized image presented to the end-user.
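As a purely illustrative aid, the following sketch outlines one possible software interface for such target object recognition/enhancement modules, assuming a simple array-based representation of the image stack. The class names, fields, and the max-blend enhancement rule are hypothetical and are not taken from the disclosed system.

```python
from dataclasses import dataclass
from typing import List, Tuple
import numpy as np

@dataclass
class RecognizedObject:
    """A structure recognized across one or more slices of the image stack."""
    object_type: str            # e.g., "calcification", "spiculated_lesion" (illustrative)
    z_range: Tuple[int, int]    # first and last slice indices containing the object
    footprint: np.ndarray       # 2D mask used when reducing the object onto the synthesized image
    weight: float = 1.0         # priority assigned to the recognizing module

class TargetObjectModule:
    """Illustrative base interface for a target object recognition/enhancement module."""
    object_type = "generic"
    weight = 1.0

    def recognize(self, stack: np.ndarray) -> List[RecognizedObject]:
        """Scan a (num_slices, height, width) stack and return recognized objects, if any."""
        raise NotImplementedError

    def enhance(self, obj: RecognizedObject, synthesized: np.ndarray) -> np.ndarray:
        """Reduce the recognized object onto the 2D synthesized image (simple max blend)."""
        return np.maximum(synthesized, obj.footprint.astype(float) * obj.weight)
```

Under this reading, each concrete module (e.g., one for calcifications, one for spiculated lesions) would subclass the interface and supply its own recognition logic, while the synthesizer iterates over the modules and blends their outputs.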


In some embodiments, a hierarchical model may be utilized in determining which objects to emphasize or de-emphasize in the 2D synthesized image based on a weight or priority assigned to the target object recognition/enhancement module. In other embodiments, all objects may be treated equally, and different objects may be fused together if there is an overlap in the z direction, as will be discussed in further detail below. These reconstruction techniques allow for creation of 2D synthesized images that comprise clinically-significant information, while eliminating or reducing unnecessary or visually confusing information.


The synthesized 2D images may be viewed at a display system 105. The reconstruction engine 103 and 2D image synthesizer 104 are preferably connected to a display system 105 via a fast transmission link. The display system 105 may be part of a standard acquisition workstation (e.g., of acquisition system 101), or of a standard (multi-display) review station (not shown) that is physically remote from the acquisition system 101. In some embodiments, a display connected via a communication network may be used, for example, a display of a personal computer or of a so-called tablet, smart phone or other hand-held device. In any event, the display 105 of the system is preferably able to display respective Ms, Mp, Tr, and/or Tp images concurrently, e.g., in separate side-by-side monitors of a review workstation, although the invention may still be implemented with a single display monitor, by toggling between images.


Thus, the imaging and display system 100, which is described for purposes of illustration and not limitation, is capable of receiving and selectively displaying tomosynthesis projection images Tp, tomosynthesis reconstruction images Tr, synthesized mammogram images Ms, and/or mammogram (including contrast mammogram) images Mp, or any one or sub combination of these image types. The system 100 employs software to convert (i.e., reconstruct) tomosynthesis images Tp into images Tr, software for synthesizing mammogram images Ms, software for decomposing 3D objects, and software for creating feature maps and object maps. An object of interest or feature in a source image may be considered a ‘most relevant’ feature for inclusion in a 2D synthesized image based upon the application of the object maps along with one or more algorithms and/or heuristics, wherein the algorithms assign numerical values, weights or thresholds, to pixels or regions of the respective source images based upon identified/detected objects and features of interest within the respective region or between features. The objects and features of interest may include, for example, spiculated lesions, calcifications, and the like.
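The following sketch illustrates, under simplifying assumptions, how object maps and numerical weights might be used to select 'most relevant' pixels from the source images when forming a synthesized 2D image. It is a generic maximum-score reduction written for illustration; the array shapes, scoring rule, and function name are assumptions, not the specific algorithms or heuristics referenced above.

```python
import numpy as np

def synthesize_2d(stack: np.ndarray,
                  object_masks: np.ndarray,
                  object_weights: np.ndarray) -> np.ndarray:
    """Illustrative reduction of a (N, H, W) stack into a single 2D image.

    object_masks:   (N, H, W) boolean array marking pixels that belong to
                    identified objects in each slice (a simple 'object map').
    object_weights: (N, H, W) per-pixel weights reflecting the significance
                    assigned to those objects by the recognition processes.
    """
    # Score each pixel of each slice: base intensity plus a bonus for
    # belonging to an identified object, scaled by the assigned weight.
    scores = stack + object_masks * object_weights

    # At every (row, column) location, take the intensity from the slice
    # whose score is highest, so identified objects win representation.
    best_slice = np.argmax(scores, axis=0)              # shape (H, W)
    rows, cols = np.indices(best_slice.shape)
    return stack[best_slice, rows, cols]
```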



FIG. 2 illustrates the 2D image synthesizer 104 in further detail. As discussed above, various image slices 218 of a tomosynthesis data set (or “stack”) 202 (e.g., filtered and/or unfiltered Mp, Tr and/or Tp images of a patient's breast tissue) are input into the 2D image synthesizer 104, and then processed to determine portions of the images to highlight in a synthesized 2D image that will be displayed on the display 105. The image slices 218 may be consecutively-captured cross-sections of a patient's breast tissue. Alternatively, the image slices 218 may be cross-sectional images of the patient's breast tissue captured at known intervals. The tomosynthesis image stack 202 comprising the image slices 218 may be forwarded to the 2D image synthesizer 104, which evaluates each of the source images in order to (1) identify various types of objects for possible inclusion in one or more 2D synthesized images, and/or (2) identify respective pixel regions in the images that contain the identified objects.


As shown in the illustrated embodiment, the tomosynthesis stack 202 comprises a plurality of images 218 taken at various depths/cross-sections of the patient's breast tissue. Some of the images 218 in the tomosynthesis stack 202 comprise 2D image patterns. Thus, the tomosynthesis stack 202 comprises a large number of input images containing various image patterns within the images of the stack.


More particularly, as shown in FIG. 2, three target object recognition/enhancement modules 210a, 210b and 210c are configured to run on the tomosynthesis stack 202, wherein each of the target object recognition and enhancement modules 210 corresponds to a respective set of programs/rules and parameters that define a particular object, and how to identify that particular object amongst other objects that may exist in the breast tissue depicted by the tomosynthesis stack 202. For example, filtering/image recognition techniques and various algorithms/heuristics may be run on the tomosynthesis stack 202 in order to identify the object assigned to the particular target object recognition/enhancement module 210. It will be appreciated that there are many ways to recognize objects using a combination of image manipulation/filtration techniques.


For the purposes of illustration, it will be assumed that each of the target object recognition/enhancement modules 210 identifies at least one respective object, but it should be appreciated that in many cases no objects will be identified. However, even healthy breast tissue may have one or more suspicious objects or structures, and the target object recognition/enhancement modules may inadvertently identify a breast background object. For example, all breast linear tissue and density tissue structures can be displayed as the breast background object. In other embodiments, “healthy” objects such as spherical shapes, oval shapes, etc., may simply be identified by one or more of the target object recognition/enhancement modules 210. The identified 3D objects may then be displayed on the 2D synthesized image 206; of course, out of all identified objects, more clinically-significant objects may be prioritized/enhanced when displaying the respective objects on the 2D synthesized image, as will be discussed in further detail below.


In the illustrated embodiment, a first target object recognition/enhancement module 210a is configured to recognize circular and/or spherical shapes in the images 218 of the tomosynthesis stack 202 (e.g., Tr, Tp, Mp, etc.). A second target object synthesis module 210b is configured to recognize lobulated shapes. A third target object synthesis module 210c is configured to recognize calcification patterns. In particular, each of the target object synthesis modules 210a, 210b and 210c is run on the Tr image stack 202, wherein a set of features/objects are recognized by the respective target object synthesis modules.


For example, target object recognition/enhancement module 210a may recognize one or more circular shapes and store these as “recognized objects” 220a. It will be appreciated that multiple image slices 218 of the stack 202 may contain circular shapes, and that these shapes may be associated with the same spherical object, or may belong to different spherical objects. In the illustrated embodiment, at least two distinct circular objects are recognized by the target object recognition/enhancement module 210a.


Similarly, target object recognition/enhancement module 210b may recognize one or more lobulated shapes and store these as recognized objects 220b. In the illustrated embodiment, one lobulated object has been recognized in the tomosynthesis stack 202 by the target object recognition/enhancement module 210b. As can be seen, two different image slices 218 in the tomosynthesis stack 202 depict portions of the lobulated object, but the respective portions are recognized as belonging to a single lobulated object by the recognition/enhancement module 210b, and stored as a single recognized object 220b.


Finally, target object recognition/enhancement module 210c may recognize one or more calcification shapes and store these as recognized objects 220c. In the illustrated embodiment, a (single) calcification cluster has been recognized by the target object recognition/enhancement module 210c and stored as a recognized object 220c. The recognized objects 220a, 220b and 220c may be stored at storage facilities corresponding to the respective target object recognition/enhancement modules 210a, 210b and 210c, or alternatively at a separate (i.e., single) storage facility that may be accessed by each of the target object recognition/enhancement modules.
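A minimal sketch of this recognition-and-storage flow is shown below, assuming hypothetical per-slice detector callables and a simplified grouping rule that merges detections appearing on consecutive slices into a single recognized object. A real implementation would also verify in-plane overlap before associating cross-sections with one object; the function and key names are illustrative only.

```python
import numpy as np
from typing import Callable, Dict, List, Optional, Tuple

def group_across_slices(detections: List[Tuple[int, np.ndarray]]) -> List[dict]:
    """Merge detections found on consecutive slices into single recognized objects,
    mirroring how circular cross-sections on adjacent slices may belong to one
    spherical object. (A fuller implementation would also check in-plane overlap.)"""
    objects: List[dict] = []
    current: Optional[dict] = None
    for z, mask in sorted(detections, key=lambda d: d[0]):
        if current is not None and z == current["z_last"] + 1:
            current["z_last"] = z
            current["footprint"] = np.logical_or(current["footprint"], mask)
        else:
            if current is not None:
                objects.append(current)
            current = {"z_first": z, "z_last": z, "footprint": mask.copy()}
    if current is not None:
        objects.append(current)
    return objects

def run_modules(stack: np.ndarray,
                modules: Dict[str, Callable[[np.ndarray], Optional[np.ndarray]]]) -> Dict[str, List[dict]]:
    """Run each per-slice detector over every slice and store its recognized
    objects separately (analogous to stores 220a, 220b and 220c)."""
    recognized: Dict[str, List[dict]] = {}
    for name, detect_in_slice in modules.items():
        detections = []
        for z, slice_2d in enumerate(stack):
            mask = detect_in_slice(slice_2d)        # hypothetical per-slice detector
            if mask is not None and mask.any():
                detections.append((z, mask))
        recognized[name] = group_across_slices(detections)
    return recognized
```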


Referring now to FIG. 3, each of the target object recognition/enhancement modules 210 may be configured to identify and synthesize (e.g., to reduce to 2D) a respective 3D object to be displayed on the one or more 2D synthesized images. In other words, once the 3D objects are recognized by the respective target object recognition/enhancement module 210a, 210b or 210c, the target object recognition/enhancement module thereafter converts the recognized 3D object into a 2D format so that the recognized object may be displayed on the 2D synthesized image. In the illustrated embodiment, the target object recognition/enhancement modules 210a, 210b and 210c recognize respective objects, and convert the recognized objects into respective 2D formats. As part of the conversion process, certain of the recognized objects may be enhanced to a greater or lesser degree for the displayed image, as will be discussed in further detail below. Assuming all three target object recognition/enhancement modules 210a, 210b and 210c are considered equally important to the 2D image synthesizer 104, the respective 2D formats of all recognized objects (e.g., two spherical objects, one lobulated object, and one calcification cluster) are depicted on the 2D synthesized image 302.



FIG. 4 illustrates how a single target object recognition/enhancement module 210 may be run on a tomosynthesis stack to generate a portion of the 2D synthesized image. In the illustrated embodiment, image slices 402 are fed through a single target object recognition/enhancement module 404, which is configured to recognize star shaped objects in the stack of images 402. As a result, the single target object synthesis module reduces information pertaining to the recognized star shape gained from various depths of the image slices onto a single 2D synthesized image 406.



FIG. 5 illustrates an exemplary embodiment for having multiple target object recognition/enhancement modules work together to produce the 2D synthesized image. In the illustrated embodiment, image slices 502 (of a respective stack) are fed through a first target object recognition/enhancement module 504a configured to recognize and reconstruct circular and/or spherical shapes, a second target object recognition/enhancement module 504b configured to recognize and reconstruct star-like shapes, and a third target object recognition/enhancement module 504c configured to recognize and reconstruct calcification structures. It should be appreciated that any number of target object recognition/enhancement modules may be programmed for any number of object types.


Each of the target object recognition/enhancement modules 504a, 504b and 504c corresponds to respective algorithms that are configured with various predetermined rules and attributes that enable these programs to successfully recognize respective objects, and reduce the recognized objects to a 2D format. By applying all three target object recognition/synthesis modules 504a, 504b and 504c to the image slices 502, a 2D synthesized image 506 is generated. In particular, rather than simply displaying a single type of object, the 2D synthesized image 506 comprises all three object types that are recognized and synthesized by the three target object recognition/enhancement modules 504a, 504b and 504c, with each of the recognized objects being equally emphasized. While this may be desirable if all the object types are of equal significance, it may be helpful to enhance/emphasize different object types to varying degrees based on their weight/priority. This technique may be more effective in alerting the end-user to a potentially important object, while de-emphasizing objects of lesser importance.


Referring now to FIG. 6A, a hierarchical sequential approach to combine data from the multiple target object recognition/enhancement modules is illustrated. In particular, a sequential combination technique may be applied if the various object types have a clearly defined hierarchy associated with them. For example, one type of object (e.g., spiculated lesions) may be deemed to be more clinically significant than another type of object (e.g., a spherical mass in breast tissue). This type of object (and the corresponding target object module) may be assigned a particular high weight/priority. In such a case, if two objects are competing for space on the 2D synthesized image, the object type associated with the higher priority may be emphasized/displayed on the 2D synthesized image, and the other object type may be de-emphasized, or not displayed at all. Similarly, in such an approach, each of the target object recognition/enhancement modules may be assigned respective weights based on respective significance.


In the illustrated embodiment, the image slices 602 are sequentially fed through three different target object recognition/enhancement modules (604, 606 and 608) to generate the 2D synthesized image 610, wherein each of the target object synthesis modules is configured to recognize and reconstruct a particular type of object. The first target object recognition/enhancement module 604 (associated with a square-shaped object) is run first on the reconstruction image slices 602, followed by the second target object recognition/enhancement module 606 (associated with a diamond-shaped object), and then followed by the third target object recognition/enhancement module 608 (associated with a circular-shaped object). It should be appreciated that since the target object recognition/enhancement modules are applied (or “run”) sequentially, the second target object recognition/enhancement module 606 may be considered as having a higher priority as compared with the first target object recognition/enhancement module 604, and the third target object recognition/enhancement module 608 may be considered as having a higher priority as compared to the second target object recognition/enhancement module 606. Thus, the third object type may override (or be emphasized over) the second object type, and the second object type may override (or be emphasized over) the first object type.



FIG. 6B illustrates this hierarchical approach to combining various object types sequentially. In particular, the tomosynthesis image stack 652 includes objects 656, 658 and 660 that can be recognized in various image slices. As illustrated, objects 658 and 660 somewhat overlap in the z direction, which means that they are likely to compete for representation in the 2D synthesized image 654. When using the sequential approach of FIG. 6A to combine data from the multiple target object recognition/enhancement modules 604, 606 and 608, the programmed hierarchy is preserved. Thus, since target object recognition/enhancement module 608, configured to recognize and reconstruct circular-shaped objects, has higher priority as compared to target object recognition/enhancement module 604, configured to recognize and reconstruct square-shaped objects, in a case of overlap between the two objects (as is the case in FIG. 6B), circular-shaped object 658 overrides square-shaped object 660 in the 2D synthesized image 654. Of course, it should be appreciated that since diamond-shaped object 656 does not overlap in the z direction with the other two objects, diamond-shaped object 656 is also displayed in the 2D synthesized image 654. In other embodiments, instead of completely overriding the lower-priority object, the higher-priority object may be emphasized relative to the lower-priority object (rather than the lower-priority object being omitted from display).
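The following sketch illustrates the sequential, hierarchical combination described above, assuming recognized objects are represented as simple dictionaries recording their first and last slice indices. The override-on-overlap rule mirrors FIGS. 6A-6B; the de-emphasis variant mentioned above is noted in a comment. The representation and function names are assumptions made for illustration.

```python
from typing import List

def z_overlap(a: dict, b: dict) -> bool:
    """True if two recognized objects overlap in the z (slice) direction."""
    return a["z_first"] <= b["z_last"] and b["z_first"] <= a["z_last"]

def combine_sequentially(recognized_by_priority: List[List[dict]]) -> List[dict]:
    """Combine objects from modules ordered lowest priority first; an object from
    a later (higher-priority) module displaces any earlier object it overlaps in z,
    mirroring FIGS. 6A-6B. A variant could keep, but de-emphasize, the displaced object."""
    kept: List[dict] = []
    for objects in recognized_by_priority:      # e.g., module 604, then 606, then 608
        for obj in objects:
            kept = [k for k in kept if not z_overlap(k, obj)]
            kept.append(obj)
    return kept
```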


Another approach to running multiple target object synthesis modules on a set of image slices is illustrated in FIG. 7A. As can be seen, rather than running the multiple target object recognition/enhancement modules sequentially with the last-run target object synthesis module having the highest priority, all the target object recognition/enhancement modules may be applied in parallel. In particular, one or more enhancement or fusion modules 712 may be utilized to ensure that the various objects are combined appropriately on the 2D synthesized image. This approach may not follow a hierarchical approach, and all of the objects may be given equal weight.


The image slices 702 are fed through three different target object recognition/enhancement modules, 704, 706 and 708, in parallel. The first target object recognition/enhancement module 704 (associated with a square-shaped object), the second target object recognition/enhancement module 706 (associated with a diamond-shaped object), and the third target object recognition/enhancement module 708 (associated with a circular-shaped object) are all run in parallel on the image slices 702. In some embodiments, an enhancement and fusion module 712 may be utilized to ensure that the different objects are fused together appropriately in case of overlap between multiple objects. The target object recognition/enhancement modules 704, 706 and 708, run in parallel, generate the 2D synthesized image 710.


This approach to combining various object types in parallel is illustrated in FIG. 7B. In particular, the tomosynthesis stack 752 depicts the same objects as FIG. 6B (e.g., objects 756, 758 and 760) at various image slices. As illustrated, objects 758 and 760 somewhat overlap in the z direction, which means that they are likely to compete for representation and/or overlap in the 2D synthesized image 754. Here, because the multiple target object recognition/enhancement modules are run in parallel, rather than one object type overriding another object type, as was the case in FIG. 6B, both the square-shaped object 760 and the circular-shaped object 758 are fused together in the 2D synthesized image 754. Thus, this approach does not assume an innate priority/hierarchy between objects, and all objects may be fused together appropriately in the 2D synthesized image 754.
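A corresponding sketch of the parallel combination is shown below, again assuming dictionary-based objects with slice-index ranges and 2D footprints. Overlapping objects are fused with a simple logical union of their footprints, which is only one of many possible fusion rules; the names and data layout are illustrative assumptions.

```python
import numpy as np
from typing import List

def z_overlap(a: dict, b: dict) -> bool:
    """True if two recognized objects overlap in the z (slice) direction."""
    return a["z_first"] <= b["z_last"] and b["z_first"] <= a["z_last"]

def fuse(a: dict, b: dict) -> dict:
    """Merge two overlapping objects so that at least a portion of each
    remains represented in the synthesized image."""
    return {
        "z_first": min(a["z_first"], b["z_first"]),
        "z_last": max(a["z_last"], b["z_last"]),
        "footprint": np.logical_or(a["footprint"], b["footprint"]),
    }

def combine_in_parallel(recognized_per_module: List[List[dict]]) -> List[dict]:
    """Give all modules' objects equal weight; objects that overlap in z are
    fused rather than one overriding the other, mirroring FIGS. 7A-7B."""
    combined: List[dict] = []
    for objects in recognized_per_module:       # modules 704, 706 and 708 run in parallel
        for obj in objects:
            for existing in [c for c in combined if z_overlap(c, obj)]:
                combined.remove(existing)
                obj = fuse(existing, obj)
            combined.append(obj)
    return combined
```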



FIG. 8A depicts a flow diagram 800 that illustrates exemplary steps that may be performed in an image merge process carried out in accordance with the sequential combination approach outlined above in conjunction with FIGS. 6A and 6B. At step 802, an image data set is acquired. The image data set may be acquired by a tomosynthesis acquisition system, a combination tomosynthesis/mammography system, or by retrieving pre-existing image data from a storage device, whether locally or remotely located relative to an image display device. At steps 804 and 806, for a range of 2D images (e.g., Tr stack), a first target object recognition/enhancement module is run in order to recognize a first object associated with the first target object recognition/enhancement module. Any recognized objects may be stored in a storage module associated with the first target object recognition/enhancement module. At step 808, a second target object recognition/enhancement module is run in order to recognize a second object associated with the second target object recognition/enhancement module. At step 810, it may be determined whether the first recognized object and the second recognized object overlap each other in the z direction. If it is determined that the two objects overlap, only the second object may be displayed (or otherwise emphasized over the first object) on the 2D synthesized image at step 812. If, on the other hand, it is determined that the two objects do not overlap, both objects are displayed on the 2D synthesized image at step 814.



FIG. 8B depicts a flow diagram 850 that illustrates exemplary steps that may be performed in an image synthesis process carried out in accordance with the parallel combination approach outlined above in conjunction with FIGS. 7A and 7B. At step 852, an image data set is acquired. The image data set may be acquired by a tomosynthesis acquisition system, a combination tomosynthesis/mammography system, or by retrieving pre-existing image data from a storage device, whether locally or remotely located relative to an image display device. At steps 854 and 856, for a range of 2D images (e.g., Tr stack), all the programmed target object recognition/enhancement modules are run to recognize respective objects in the Tr image stack. At step 858, one or more enhancement modules may also be run to determine whether a fusion process needs to occur. At step 860, it may be determined whether any recognized objects overlap in the z direction. If it is determined that any two (or more) objects overlap, the overlapping objects may be fused together at step 862. If, on the other hand, it is determined that no objects overlap, all the objects are displayed as-is on the 2D synthesized image at step 864.


Having described exemplary embodiments, it can be appreciated that the examples described above and depicted in the accompanying figures are only illustrative, and that other embodiments and examples also are encompassed within the scope of the appended claims. For example, while the flow diagrams provided in the accompanying figures are illustrative of exemplary steps, the overall image merge process may be achieved in a variety of manners using other data merge methods known in the art. The system block diagrams are similarly representative only, illustrating functional delineations that are not to be viewed as limiting requirements of the disclosed inventions. It will also be apparent to those skilled in the art that various changes and modifications may be made to the depicted and/or described embodiments (e.g., the dimensions of various parts), without departing from the scope of the disclosed inventions, which is to be defined only by the following claims and their equivalents. The specification and drawings are, accordingly, to be regarded in an illustrative rather than restrictive sense.

Claims
  • 1. A method for processing breast tissue image data, comprising: processing image data of a patient's breast tissue to generate a set of image slices that collectively depict the patient's breast tissue, wherein at least two image slices in the set of image slices comprise at least a first object of a first object type and a second object of a second object type; performing a plurality of object-recognition processes on the set of image slices, wherein each of the plurality of object-recognition processes is configured to recognize a respective type of object that may be present in the set of image slices; based at least in part on the performance of the plurality of object-recognition processes, recognizing the first object of the first object type and the second object of the second object type in the at least two image slices in the set of image slices; based on a determination that the first object of the first object type and the second object of the second object type are likely to overlap in a synthesized image, generating the synthesized image based at least on the at least two image slices in the set of image slices, wherein generating the synthesized image includes fusing the first object and the second object such that at least a portion of each of the first and the second objects are included in the synthesized image; and causing the synthesized image to be displayed.
  • 2. The method of claim 1, wherein the plurality of object-recognition processes are performed on the image slices in a sequence.
  • 3. The method of claim 2, further comprising assigning a respective weight to each of the plurality of object-recognition processes, wherein: the assigned weight corresponds to a significance of the type of object recognized by a particular one of the plurality of object-recognition processes; and the respective weights assigned to the object-recognition processes determine an order of image slices in the set of image slices processed by the plurality of object-recognition processes.
  • 4. The method of claim 3, wherein: the plurality of object-recognition processes comprise a first object-recognition process having a first weight and a second object-recognition process having a second weight that is higher than the first weight; the second object of the second object type is recognized by the second object-recognition process; and the method further comprises: performing the first object-recognition process on the set of image slices prior to generating the synthesized image; and recognizing a third object of a third object type based on performing the first object-recognition process on the set of image slices.
  • 5. The method of claim 4, further comprising determining whether the third object of the third object type is likely to overlap the second object of the second object type in the synthesized image.
  • 6. The method of claim 5, further comprising based on a determination that the third object of the third object type and the second object of the second object type are likely to overlap, including only the at least the portion of the second object of the second object type fused with the at least the portion of the first object of the first object type in the synthesized image.
  • 7. The method of claim 5, further comprising based on a determination that the third object of the third object type and the second object of the second object type are likely to overlap, emphasizing the at least the portion of the second object of the second object type relative to the third object of the third object type in the synthesized image.
  • 8. The method of claim 1, wherein the plurality of object-recognition processes are performed in parallel on the image slices.
  • 9. The method of claim 8, wherein fusing the first object with the second object such that at least the portion of each of the first and the second objects are included in the synthesized image comprises fusing the first object with the second object such that at least the portion of each of the first and the second objects are enhanced and included in the synthesized image.
  • 10. The method of claim 9, wherein the first object of the first object type is fused with the second object of the second object type using a linear combination technique.
  • 11. The method of claim 9, wherein the first object of the first object type is fused with the second object of the second object type using a non-linear combination technique.
  • 12. The method of claim 1, wherein a first subset of object-recognition processes in the plurality of object-recognition processes are performed sequentially on the set of image slices to recognize a first subset of object types, and a second subset of object-recognition processes in the plurality of object recognition processes are performed in parallel on the set of image slices to recognize a second subset of object types.
  • 13. The method of claim 12, wherein the first subset of object types includes abnormal breast tissue malignancies, and the second subset of object types include normal breast tissue structures or predetermined image patterns.
  • 14. The method of claim 1, further comprising displaying target object types associated with the plurality of object-recognition processes in a graphical user interface.
  • 15. The method of claim 14, wherein the graphical user interface provides options for an end user to select one or more target object types to be recognized and included in the synthesized image.
  • 16. The method of claim 15, wherein the graphical user interface provides options for allowing an end user to input an order of importance for displaying selected target object types in the synthesized image.
  • 17. The method of claim 15, wherein: the graphical user interface provides options for allowing an end user to input a weight factor for each of one or more target object types; and user input weight factors are considered when generating and displaying user-selected target object types in the synthesized image.
  • 18. The method of claim 17, wherein the options for allowing an end user to input a weight factor for each of one or more target object types are based at least in part on one or more of age, gender, ethnicity, race or genetic characteristics of the patient.
  • 19. An image processing system configured to perform the method of claim 1, wherein the system is configured to allow for adding further object-recognition processes to the plurality of the object-recognition processes in order to recognize and display further types of objects.
  • 20. The method of claim 1, further comprising recognizing, based at least in part on the performance of the plurality of object-recognition processes, a third object of a third object type in at least a third image slice in the set of image slices, wherein generating the synthesized image based at least on the at least two image slices in the set of image slices comprises generating the synthesized image based at least on the at least two image slices and the at least third image slice in the set of image slices.
RELATED APPLICATIONS DATA

The present application is a National Phase entry under 35 U.S.C. § 371 of International Patent Application No. PCT/US2018/024913, having an international filing date of Mar. 28, 2018, which claims the benefit under 35 U.S.C. § 119 of U.S. Provisional Patent Application Ser. No. 62/479,036, filed Mar. 30, 2017, which is incorporated by reference in its entirety into the present application.

PCT Information
Filing Document Filing Date Country Kind
PCT/US2018/024913 3/28/2018 WO
Publishing Document Publishing Date Country Kind
WO2018/183550 10/4/2018 WO A
US Referenced Citations (460)
Number Name Date Kind
3502878 Stewart Mar 1970 A
3863073 Wagner Jan 1975 A
3971950 Evans et al. Jul 1976 A
4160906 Daniels Jul 1979 A
4310766 Finkenzeller et al. Jan 1982 A
4496557 Malen et al. Jan 1985 A
4559641 Caugant et al. Dec 1985 A
4706269 Reina et al. Nov 1987 A
4744099 Huettenrauch May 1988 A
4773086 Fujita Sep 1988 A
4773087 Plewes Sep 1988 A
4819258 Kleinman et al. Apr 1989 A
4821727 Levene et al. Apr 1989 A
4907156 Doi et al. Jun 1990 A
4969174 Schied Nov 1990 A
4989227 Tirelli et al. Jan 1991 A
5018176 Romeas et al. May 1991 A
RE33634 Yanaki Jul 1991 E
5029193 Saffer Jul 1991 A
5051904 Griffith Sep 1991 A
5078142 Siczek et al. Jan 1992 A
5099846 Hardy Mar 1992 A
5129911 Siczek et al. Jul 1992 A
5133020 Giger et al. Jul 1992 A
5163075 Lubinsky Nov 1992 A
5164976 Scheid et al. Nov 1992 A
5199056 Darrah Mar 1993 A
5219351 Teubner Jun 1993 A
5240011 Assa Aug 1993 A
5279309 Taylor et al. Jan 1994 A
5280427 Magnusson Jan 1994 A
5289520 Pellegrino et al. Feb 1994 A
5343390 Doi et al. Aug 1994 A
5359637 Webbe Oct 1994 A
5365562 Toker Nov 1994 A
5386447 Siczek Jan 1995 A
5415169 Siczek et al. May 1995 A
5426685 Pellegrino et al. Jun 1995 A
5452367 Bick Sep 1995 A
5491627 Zhang et al. Feb 1996 A
5499097 Ortyn et al. Mar 1996 A
5506877 Niklason et al. Apr 1996 A
5526394 Siczek Jun 1996 A
5539797 Heidsieck et al. Jul 1996 A
5553111 Moore Sep 1996 A
5592562 Rooks Jan 1997 A
5594769 Pellegrino et al. Jan 1997 A
5596200 Sharma Jan 1997 A
5598454 Franetzki Jan 1997 A
5609152 Pellegrino et al. Mar 1997 A
5627869 Andrew et al. May 1997 A
5642433 Lee et al. Jun 1997 A
5642441 Riley et al. Jun 1997 A
5647025 Frost et al. Jul 1997 A
5657362 Giger et al. Aug 1997 A
5668889 Hara Sep 1997 A
5671288 Wilhelm et al. Sep 1997 A
5712890 Spivey Jan 1998 A
5719952 Rooks Feb 1998 A
5735264 Siczek et al. Apr 1998 A
5763871 Ortyn et al. Jun 1998 A
5769086 Ritchart et al. Jun 1998 A
5773832 Sayed et al. Jun 1998 A
5803912 Siczek et al. Sep 1998 A
5818898 Tsukamoto et al. Oct 1998 A
5828722 Ploetz Oct 1998 A
5835079 Shieh Nov 1998 A
5841124 Ortyn et al. Nov 1998 A
5872828 Niklason et al. Feb 1999 A
5875258 Ortyn et al. Feb 1999 A
5878104 Ploetz Mar 1999 A
5878746 Lemelson et al. Mar 1999 A
5896437 Ploetz Apr 1999 A
5941832 Tumey Aug 1999 A
5954650 Saito Sep 1999 A
5986662 Argiro Nov 1999 A
6005907 Ploetz Dec 1999 A
6022325 Siczek et al. Feb 2000 A
6067079 Shieh May 2000 A
6075879 Roehrig et al. Jun 2000 A
6091841 Rogers Jul 2000 A
6101236 Wang et al. Aug 2000 A
6102866 Nields et al. Aug 2000 A
6137527 Abdel-Malek Oct 2000 A
6141398 He Oct 2000 A
6149301 Kautzer et al. Nov 2000 A
6175117 Komardin Jan 2001 B1
6196715 Nambu Mar 2001 B1
6215892 Douglass et al. Apr 2001 B1
6216540 Nelson Apr 2001 B1
6219059 Argiro Apr 2001 B1
6256370 Yavus Apr 2001 B1
6233473 Sheperd May 2001 B1
6243441 Zur Jun 2001 B1
6245028 Furst et al. Jun 2001 B1
6272207 Tang Aug 2001 B1
6289235 Webber et al. Sep 2001 B1
6292530 Yavus Sep 2001 B1
6293282 Lemelson Sep 2001 B1
6327336 Gingold et al. Dec 2001 B1
6327377 Rutenberg et al. Dec 2001 B1
6341156 Baetz Jan 2002 B1
6375352 Hewes Apr 2002 B1
6389104 Bani-Hashemi et al. May 2002 B1
6411836 Patel Jun 2002 B1
6415015 Nicolas Jul 2002 B2
6424332 Powell Jul 2002 B1
6442288 Haerer Aug 2002 B1
6459925 Nields et al. Oct 2002 B1
6463181 Duarte Oct 2002 B2
6468226 McIntyre, IV Oct 2002 B1
6480565 Ning Nov 2002 B1
6501819 Unger et al. Dec 2002 B2
6556655 Chichereau Apr 2003 B1
6574304 Hsieh Jun 2003 B1
6597762 Ferrant Jul 2003 B1
6611575 Alyassin et al. Aug 2003 B1
6620111 Stephens et al. Sep 2003 B2
6626849 Huitema et al. Sep 2003 B2
6633674 Barnes Oct 2003 B1
6638235 Miller et al. Oct 2003 B2
6647092 Eberhard Nov 2003 B2
6650928 Gailly Nov 2003 B1
6683934 Zhao Jan 2004 B1
6744848 Stanton Jun 2004 B2
6748044 Sabol et al. Jun 2004 B2
6751285 Eberhard Jun 2004 B2
6758824 Miller et al. Jul 2004 B1
6813334 Koppe Nov 2004 B2
6882700 Wang Apr 2005 B2
6885724 Li Apr 2005 B2
6901156 Giger et al. May 2005 B2
6912319 Barnes May 2005 B1
6940943 Claus Sep 2005 B2
6978040 Berestov Dec 2005 B2
6987331 Koeppe Jan 2006 B2
6999554 Mertelmeier Feb 2006 B2
7022075 Grunwald et al. Apr 2006 B2
7025725 Dione et al. Apr 2006 B2
7030861 Westerman Apr 2006 B1
7110490 Eberhard Sep 2006 B2
7110502 Tsuji Sep 2006 B2
7117098 Dunlay et al. Oct 2006 B1
7123684 Jing et al. Oct 2006 B2
7127091 OpDeBeek Oct 2006 B2
7142633 Eberhard Nov 2006 B2
7218766 Eberhard May 2007 B2
7245694 Jing et al. Jul 2007 B2
7289825 Fors et al. Oct 2007 B2
7298881 Giger et al. Nov 2007 B2
7315607 Ramsauer Jan 2008 B2
7319735 Defreitas et al. Jan 2008 B2
7323692 Rowlands Jan 2008 B2
7346381 Okerlund et al. Mar 2008 B2
7406150 Minyard et al. Jul 2008 B2
7430272 Jing et al. Sep 2008 B2
7443949 Defreitas et al. Oct 2008 B2
7466795 Eberhard et al. Dec 2008 B2
7577282 Gkanatsios et al. Aug 2009 B2
7606801 Faitelson Oct 2009 B2
7616801 Gkanatsios et al. Nov 2009 B2
7630533 Ruth et al. Dec 2009 B2
7634050 Muller et al. Dec 2009 B2
7640051 Krishnan Dec 2009 B2
7697660 Ning Apr 2010 B2
7702142 Ren Apr 2010 B2
7705830 Westerman et al. Apr 2010 B2
7760924 Ruth Jul 2010 B2
7769219 Zahniser Aug 2010 B2
7787936 Kressy Aug 2010 B2
7809175 Roehrig et al. Oct 2010 B2
7828733 Zhang et al. Nov 2010 B2
7831296 DeFreitas et al. Nov 2010 B2
7869563 DeFreitas Jan 2011 B2
7974924 Holla et al. Jul 2011 B2
7991106 Ren et al. Aug 2011 B2
8044972 Hall et al. Oct 2011 B2
8051386 Rosander et al. Nov 2011 B2
8126226 Bernard et al. Feb 2012 B2
8155421 Ren et al. Apr 2012 B2
8165365 Bernard et al. Apr 2012 B2
8532745 DeFreitas et al. Sep 2013 B2
8571289 Ruth et al. Oct 2013 B2
8594274 Hoernig et al. Nov 2013 B2
8677282 Cragun et al. Mar 2014 B2
8712127 Ren et al. Apr 2014 B2
8897535 Ruth et al. Nov 2014 B2
8983156 Periaswamy et al. Mar 2015 B2
9020579 Smith Apr 2015 B2
9075903 Marshall Jul 2015 B2
9084579 Ren et al. Jul 2015 B2
9119599 Itai Sep 2015 B2
9129362 Jerebko Sep 2015 B2
9289183 Karssemeijer Mar 2016 B2
9451924 Bernard Sep 2016 B2
9456797 Ruth et al. Oct 2016 B2
9478028 Parthasarathy Oct 2016 B2
9589374 Gao Mar 2017 B1
9592019 Sugiyama Mar 2017 B2
9805507 Chen Oct 2017 B2
9808215 Ruth et al. Nov 2017 B2
9811758 Ren et al. Nov 2017 B2
9901309 DeFreitas et al. Feb 2018 B2
10008184 Kreeger et al. Jun 2018 B2
10010302 Ruth et al. Jul 2018 B2
10092358 DeFreitas Oct 2018 B2
10111631 Gkanatsios Oct 2018 B2
10242490 Karssemeijer Mar 2019 B2
10335094 DeFreitas Jul 2019 B2
10357211 Smith Jul 2019 B2
10410417 Chen et al. Sep 2019 B2
10413263 Ruth et al. Sep 2019 B2
10444960 Marshall Oct 2019 B2
10456213 DeFreitas Oct 2019 B2
10573276 Kreeger et al. Feb 2020 B2
10575807 Gkanatsios Mar 2020 B2
10595954 DeFreitas Mar 2020 B2
10624598 Chen Apr 2020 B2
10977863 Chen Apr 2021 B2
10978026 Kreeger Apr 2021 B2
20010038681 Stanton et al. Nov 2001 A1
20010038861 Hsu et al. Nov 2001 A1
20020012450 Tsuji Jan 2002 A1
20020050986 Inoue May 2002 A1
20020075997 Unger et al. Jun 2002 A1
20020113681 Byram Aug 2002 A1
20020122533 Marie et al. Sep 2002 A1
20020188466 Barrette et al. Dec 2002 A1
20020193676 Bodicker Dec 2002 A1
20030007598 Wang Jan 2003 A1
20030018272 Treado et al. Jan 2003 A1
20030026386 Tang Feb 2003 A1
20030048260 Matusis Mar 2003 A1
20030073895 Nields et al. Apr 2003 A1
20030095624 Eberhard et al. May 2003 A1
20030097055 Yanof May 2003 A1
20030128893 Castorina Jul 2003 A1
20030135115 Burdette et al. Jul 2003 A1
20030169847 Karellas Sep 2003 A1
20030194050 Eberhard Oct 2003 A1
20030194121 Eberhard et al. Oct 2003 A1
20030195433 Turovskiy Oct 2003 A1
20030210254 Doan Nov 2003 A1
20030212327 Wang Nov 2003 A1
20030215120 Uppaluri Nov 2003 A1
20040008809 Webber Jan 2004 A1
20040008900 Jabri et al. Jan 2004 A1
20040008901 Avinash Jan 2004 A1
20040036680 Davis Feb 2004 A1
20040047518 Tiana Mar 2004 A1
20040052328 Saboi Mar 2004 A1
20040064037 Smith Apr 2004 A1
20040066884 Claus Apr 2004 A1
20040066904 Eberhard et al. Apr 2004 A1
20040070582 Smith et al. Apr 2004 A1
20040077938 Mark et al. Apr 2004 A1
20040081273 Ning Apr 2004 A1
20040094167 Brady May 2004 A1
20040101095 Jing et al. May 2004 A1
20040109028 Stern et al. Jun 2004 A1
20040109529 Eberhard et al. Jun 2004 A1
20040127789 Ogawa Jul 2004 A1
20040138569 Grunwald Jul 2004 A1
20040171933 Stoller et al. Sep 2004 A1
20040171986 Tremaglio, Jr. et al. Sep 2004 A1
20040267157 Miller et al. Dec 2004 A1
20050047636 Gines et al. Mar 2005 A1
20050049521 Miller et al. Mar 2005 A1
20050063509 Defreitas et al. Mar 2005 A1
20050078797 Danielsson et al. Apr 2005 A1
20050084060 Seppi et al. Apr 2005 A1
20050089205 Kapur Apr 2005 A1
20050105679 Wu et al. May 2005 A1
20050107689 Sasano May 2005 A1
20050111718 MacMahon May 2005 A1
20050113681 DeFreitas et al. May 2005 A1
20050113715 Schwindt et al. May 2005 A1
20050124845 Thomadsen et al. Jun 2005 A1
20050135555 Claus Jun 2005 A1
20050135664 Kaufhold Jun 2005 A1
20050226375 Eberhard Oct 2005 A1
20060009693 Hanover et al. Jan 2006 A1
20060018526 Avinash Jan 2006 A1
20060025680 Jeune-Lomme Feb 2006 A1
20060030784 Miller et al. Feb 2006 A1
20060074288 Kelly et al. Apr 2006 A1
20060098855 Gkanatsios et al. May 2006 A1
20060129062 Nicoson et al. Jun 2006 A1
20060132508 Sadikali Jun 2006 A1
20060147099 Marshall et al. Jul 2006 A1
20060155209 Miller et al. Jul 2006 A1
20060197753 Hotelling Sep 2006 A1
20060210131 Wheeler Sep 2006 A1
20060228012 Masuzawa Oct 2006 A1
20060238546 Handley Oct 2006 A1
20060257009 Wang Nov 2006 A1
20060269040 Mertelmeier Nov 2006 A1
20060291618 Eberhard et al. Dec 2006 A1
20070019846 Bullitt et al. Jan 2007 A1
20070030949 Jing et al. Feb 2007 A1
20070036265 Jing et al. Feb 2007 A1
20070046649 Reiner Mar 2007 A1
20070052700 Wheeler et al. Mar 2007 A1
20070076844 Defreitas et al. Apr 2007 A1
20070114424 Danielsson et al. May 2007 A1
20070118400 Morita et al. May 2007 A1
20070156451 Gering Jul 2007 A1
20070223651 Wagenaar et al. Sep 2007 A1
20070225600 Weibrecht et al. Sep 2007 A1
20070236490 Casteele Oct 2007 A1
20070242800 Jing et al. Oct 2007 A1
20070263765 Wu Nov 2007 A1
20070274585 Zhang et al. Nov 2007 A1
20080019581 Gkanatsios et al. Jan 2008 A1
20080043905 Hassanpourgol Feb 2008 A1
20080045833 DeFreitas et al. Feb 2008 A1
20080101537 Sendai May 2008 A1
20080114614 Mahesh et al. May 2008 A1
20080125643 Huisman May 2008 A1
20080130979 Ren Jun 2008 A1
20080139896 Baumgart Jun 2008 A1
20080152086 Hall Jun 2008 A1
20080165136 Christie et al. Jul 2008 A1
20080187095 Boone et al. Aug 2008 A1
20080198966 Hjarn Aug 2008 A1
20080221479 Ritchie Sep 2008 A1
20080229256 Shibaike Sep 2008 A1
20080240533 Piron et al. Oct 2008 A1
20080297482 Weiss Dec 2008 A1
20090003519 DeFreitas Jan 2009 A1
20090005668 West et al. Jan 2009 A1
20090010384 Jing et al. Jan 2009 A1
20090034684 Bernard Feb 2009 A1
20090037821 O'Neal et al. Feb 2009 A1
20090079705 Sizelove et al. Mar 2009 A1
20090080594 Brooks et al. Mar 2009 A1
20090080602 Brooks et al. Mar 2009 A1
20090080604 Shores et al. Mar 2009 A1
20090080752 Ruth Mar 2009 A1
20090080765 Bernard et al. Mar 2009 A1
20090087067 Khorasani Apr 2009 A1
20090123052 Ruth May 2009 A1
20090129644 Daw et al. May 2009 A1
20090135997 Defreitas et al. May 2009 A1
20090138280 Morita et al. May 2009 A1
20090143674 Nields Jun 2009 A1
20090167702 Nurmi Jul 2009 A1
20090171244 Ning Jul 2009 A1
20090238424 Arakita Sep 2009 A1
20090259958 Ban Oct 2009 A1
20090268865 Ren et al. Oct 2009 A1
20090278812 Yasutake Nov 2009 A1
20090296882 Gkanatsios et al. Dec 2009 A1
20090304147 Jing et al. Dec 2009 A1
20100034348 Yu Feb 2010 A1
20100049046 Peiffer Feb 2010 A1
20100054400 Ren et al. Mar 2010 A1
20100079405 Bernstein Apr 2010 A1
20100086188 Ruth et al. Apr 2010 A1
20100088346 Urness et al. Apr 2010 A1
20100098214 Star-Lack et al. Apr 2010 A1
20100105879 Katayose et al. Apr 2010 A1
20100121178 Krishnan May 2010 A1
20100131294 Venon May 2010 A1
20100131482 Linthicum et al. May 2010 A1
20100135558 Ruth et al. Jun 2010 A1
20100152570 Navab Jun 2010 A1
20100166267 Zhang Jul 2010 A1
20100195882 Ren et al. Aug 2010 A1
20100208037 Sendai Aug 2010 A1
20100231522 Li Sep 2010 A1
20100246909 Blum Sep 2010 A1
20100259561 Forutanpour et al. Oct 2010 A1
20100259645 Kaplan Oct 2010 A1
20100260316 Stein et al. Oct 2010 A1
20100280375 Zhang Nov 2010 A1
20100293500 Cragun Nov 2010 A1
20110018817 Kryze Jan 2011 A1
20110019891 Puong Jan 2011 A1
20110054944 Sandberg et al. Mar 2011 A1
20110069808 Defreitas et al. Mar 2011 A1
20110069906 Park Mar 2011 A1
20110087132 DeFreitas et al. Apr 2011 A1
20110105879 Masumoto May 2011 A1
20110109650 Kreeger May 2011 A1
20110110576 Kreeger et al. May 2011 A1
20110150447 Li Jun 2011 A1
20110163939 Tam et al. Jul 2011 A1
20110178389 Kumar et al. Jul 2011 A1
20110182402 Partain Jul 2011 A1
20110234630 Batman et al. Sep 2011 A1
20110237927 Brooks et al. Sep 2011 A1
20110242092 Kashiwagi Oct 2011 A1
20110310126 Georgiev et al. Dec 2011 A1
20120014504 Jang Jan 2012 A1
20120014578 Karssemeijer Jan 2012 A1
20120069951 Toba Mar 2012 A1
20120131488 Karlsson et al. May 2012 A1
20120133600 Marshall May 2012 A1
20120133601 Marshall May 2012 A1
20120134464 Hoernig et al. May 2012 A1
20120148151 Hamada Jun 2012 A1
20120189092 Jerebko Jul 2012 A1
20120194425 Buelow Aug 2012 A1
20120238870 Smith et al. Sep 2012 A1
20120293511 Mertelmeier Nov 2012 A1
20130022165 Jang Jan 2013 A1
20130044861 Muller Feb 2013 A1
20130059758 Haick et al. Mar 2013 A1
20130108138 Nakayama May 2013 A1
20130121569 Yadav May 2013 A1
20130121618 Yadav May 2013 A1
20130202168 Jerebko Aug 2013 A1
20130259193 Packard Oct 2013 A1
20140033126 Kreeger Jan 2014 A1
20140035811 Guehring Feb 2014 A1
20140064444 Oh Mar 2014 A1
20140073913 DeFreitas et al. Mar 2014 A1
20140219534 Wiemker et al. Aug 2014 A1
20140219548 Wels Aug 2014 A1
20140327702 Kreeger et al. Nov 2014 A1
20140328517 Gluncic Nov 2014 A1
20150052471 Chen Feb 2015 A1
20150061582 Smith Apr 2015 A1
20150238148 Georgescu Aug 2015 A1
20150302146 Marshall Oct 2015 A1
20150309712 Marshall Oct 2015 A1
20150317538 Ren et al. Nov 2015 A1
20150331995 Zhao Nov 2015 A1
20160000399 Halmann et al. Jan 2016 A1
20160022364 DeFreitas et al. Jan 2016 A1
20160051215 Chen Feb 2016 A1
20160078645 Abdurahman Mar 2016 A1
20160140749 Erhard May 2016 A1
20160228034 Gluncic Aug 2016 A1
20160235380 Smith Aug 2016 A1
20160367210 Gkanatsios et al. Dec 2016 A1
20170071562 Suzuki Mar 2017 A1
20170262737 Rabinovich Sep 2017 A1
20180047211 Chen et al. Feb 2018 A1
20180137385 Ren May 2018 A1
20180144244 Masoud May 2018 A1
20180256118 DeFreitas Sep 2018 A1
20190015173 DeFreitas Jan 2019 A1
20190043456 Kreeger Feb 2019 A1
20190290221 Smith Sep 2019 A1
20200046303 DeFreitas Feb 2020 A1
20200093562 DeFreitas Mar 2020 A1
20200184262 Chui Jun 2020 A1
20200205928 DeFreitas Jul 2020 A1
20200253573 Gkanatsios Aug 2020 A1
20200345320 Chen Nov 2020 A1
20200390404 DeFreitas Dec 2020 A1
20210000553 St. Pierre Jan 2021 A1
20210100626 St. Pierre Apr 2021 A1
20210113167 Chui Apr 2021 A1
20210118199 Chui Apr 2021 A1
20220005277 Chen Jan 2022 A1
20220013089 Kreeger Jan 2022 A1
20220192615 Chui Jun 2022 A1
Foreign Referenced Citations (97)
Number Date Country
2014339982 May 2016 AU
1846622 Oct 2006 CN
202161328 Mar 2012 CN
102429678 May 2012 CN
107440730 Dec 2017 CN
102010009295 Aug 2011 DE
102011087127 May 2013 DE
775467 May 1997 EP
982001 Mar 2000 EP
1428473 Jun 2004 EP
2236085 Jun 2010 EP
2215600 Aug 2010 EP
2301432 Mar 2011 EP
2491863 Aug 2012 EP
1986548 Jan 2013 EP
2656789 Oct 2013 EP
2823464 Jan 2015 EP
2823765 Jan 2015 EP
3060132 Apr 2019 EP
H09-198490 Jul 1997 JP
H09-238934 Sep 1997 JP
10-33523 Feb 1998 JP
H10-33523 Feb 1998 JP
2000-200340 Jul 2000 JP
2002-282248 Oct 2002 JP
2003-189179 Jul 2003 JP
2003-199737 Jul 2003 JP
2003-531516 Oct 2003 JP
2006-519634 Aug 2006 JP
2006-312026 Nov 2006 JP
2007-130487 May 2007 JP
2007-330334 Dec 2007 JP
2007-536968 Dec 2007 JP
2008-068032 Mar 2008 JP
2009-034503 Feb 2009 JP
2009-522005 Jun 2009 JP
2009-526618 Jul 2009 JP
2009-207545 Sep 2009 JP
2010-137004 Jun 2010 JP
2011-110175 Jun 2011 JP
2012-501750 Jan 2012 JP
2012011255 Jan 2012 JP
2012-061196 Mar 2012 JP
2013-244211 Dec 2013 JP
2014-507250 Mar 2014 JP
2014-534042 Dec 2014 JP
2015-506794 Mar 2015 JP
2015-144632 Aug 2015 JP
2016-198197 Dec 2015 JP
10-2015-0010515 Jan 2015 KR
10-2017-0062839 Jun 2017 KR
9005485 May 1990 WO
9317620 Sep 1993 WO
9406352 Mar 1994 WO
199700649 Jan 1997 WO
199816903 Apr 1998 WO
0051484 Sep 2000 WO
2003020114 Mar 2003 WO
2005051197 Jun 2005 WO
2005110230 Nov 2005 WO
2005110230 Nov 2005 WO
2005112767 Dec 2005 WO
2005112767 Dec 2005 WO
2006055830 May 2006 WO
2006058160 Jun 2006 WO
2007095330 Aug 2007 WO
08014670 Feb 2008 WO
2008047270 Apr 2008 WO
2008054436 May 2008 WO
2009026587 Feb 2009 WO
2010028208 Mar 2010 WO
2010059920 May 2010 WO
2011008239 Jan 2011 WO
2011043838 Apr 2011 WO
2011065950 Jun 2011 WO
2011073864 Jun 2011 WO
2011091300 Jul 2011 WO
2012001572 Jan 2012 WO
2012068373 May 2012 WO
2012063653 May 2012 WO
2012112627 Aug 2012 WO
2012122399 Sep 2012 WO
2013001439 Jan 2013 WO
2013035026 Mar 2013 WO
2013078476 May 2013 WO
2013123091 Aug 2013 WO
2014149554 Sep 2014 WO
2014207080 Dec 2014 WO
2015061582 Apr 2015 WO
2015066650 May 2015 WO
2015130916 Sep 2015 WO
2016103094 Jun 2016 WO
2016184746 Nov 2016 WO
2018183548 Oct 2018 WO
2018183549 Oct 2018 WO
2018183550 Oct 2018 WO
2018236565 Dec 2018 WO
Non-Patent Literature Citations (74)
International Search Report and Written Opinion dated Jun. 26, 2018 for PCT application No. PCT/US2018/024913, applicant Hologic, Inc., 10 pages.
Non-Final Office Action for U.S. Appl. No. 16/497,766 dated Feb. 3, 2021.
M. Ertas, A. Akan, I. Yildirim, A. Dinler and M. Kamasak, "2D versus 3D total variation minimization in digital breast tomosynthesis," 2015 IEEE International Conference on Imaging Systems and Techniques (IST), Macau, 2015, pp. 1-4, doi: 10.1109/IST.2015.7294553. (Year: 2015).
B. E. Caroline and N. Vaijayanthi, "Computer aided detection of masses in digital breast tomosynthesis: A review," 2012 International Conference on Emerging Trends in Science, Engineering and Technology (INCOSET), Tiruchirappalli, 2012, pp. 186-191, doi: 10.1109/INCOSET.2012.6513903. (Year: 2012).
Giger et al. "Development of a smart workstation for use in mammography", in Proceedings of SPIE, vol. 1445 (1991), pp. 101-103; 4 pages.
Giger et al., “An Intelligent Workstation for Computer-aided Diagnosis”, in RadioGraphics, May 1993, 13:3 pp. 647-656; 10 pages.
eFilm Mobile HD by Merge Healthcare, web site: http://itunes.apple.com/bw/app/efilm-mobile-hd/id405261243?mt=8, accessed on Nov. 3, 2011 (2 pages).
eFilm Solutions, eFilm Workstation (tm) 3.4, website: http://estore.merge.com/na/estore/content.aspx?productID=405, accessed on Nov. 3, 2011 (2 pages).
Wodajo, Felasfa, MD, “Now Playing: Radiology Images from Your Hospital PACS on your iPad,” Mar. 17, 2010; web site: http://www.imedicalapps.com/2010/03/now-playing-radiology-images-from-your-hospital-pacs-on-your-ipad/, accessed on Nov. 3, 2011 (3 pages).
Lewin, JM, et al., Dual-energy contrast-enhanced digital subtraction mammography: feasibility. Radiology 2003; 229:261-268.
Berg, WA et al., “Combined screening with ultrasound and mammography vs mammography alone in women at elevated risk of breast cancer”, JAMA 299:2151-2163, 2008.
Carton, AK, et al., “Dual-energy contrast-enhanced digital breast tomosynthesis—a feasibility study”, Br J Radiol. Apr. 2010;83 (988):344-50.
Chen, SC, et al., “Initial clinical experience with contrast-enhanced digital breast tomosynthesis”, Acad Radio. Feb. 2007 14(2):229-38.
Diekmann, F., et al., “Digital mammography using iodine-based contrast media: initial clinical experience with dynamic contrast medium enhancement”, Invest Radiol 2005; 40:397-404.
Dromain C., et al., “Contrast enhanced spectral mammography: a multi-reader study”, RSNA 2010, 96th Scientific Assembly and Scientific Meeting.
Dromain, C., et al., “Contrast-enhanced digital mammography”, Eur J Radiol. 2009; 69:34-42.
Freiherr, G., “Breast tomosynthesis trials show promise”, Diagnostic Imaging—San Francisco 2005, V27; N4:42-48.
ICRP Publication 60: 1990 Recommendations of the International Commission on Radiological Protection, 12 pages.
Jochelson, M., et al., “Bilateral Dual Energy contrast-enhanced digital mammography: Initial Experience”, RSNA 2010, 96th Scientific Assembly and Scientific Meeting, 1 page.
Jong, RA, et al., Contrast-enhanced digital mammography: initial clinical experience. Radiology 2003; 228:842-850.
Kopans, et al. Will tomosynthesis replace conventional mammography? Plenary Session SFN08: RSNA 2005.
Lehman, CD, et al. MRI evaluation of the contralateral breast in women with recently diagnosed breast cancer. N Engl J Med 2007; 356:1295-1303.
Lindfors, KK, et al., Dedicated breast CT: initial clinical experience. Radiology 2008; 246(3): 725-733.
Niklason, L., et al., Digital tomosynthesis in breast imaging. Radiology. Nov. 1997; 205(2):399-406.
Poplack, SP, et al., Digital breast tomosynthesis: initial experience in 98 women with abnormal digital screening mammography. AJR Am J Roentgenology Sep. 2007 189(3):616-23.
Prionas, ND, et al., Contrast-enhanced dedicated breast CT: initial clinical experience. Radiology. Sep. 2010 256(3):714-723.
Rafferty, E. et al., “Assessing Radiologist Performance Using Combined Full-Field Digital Mammography and Breast Tomosynthesis Versus Full-Field Digital Mammography Alone: Results”. . . presented at 2007 Radiological Society of North America meeting, Chicago IL.
Smith, A., “Full field breast tomosynthesis”, Radiol Manage. Sep.-Oct. 2005; 27(5):25-31.
Weidner N, et al., “Tumor angiogenesis and metastasis: correlation in invasive breast carcinoma”, New England Journal of Medicine 1991; 324:1-8.
Weidner, N, “The importance of tumor angiogenesis: the evidence continues to grow”, Am J Clin Pathol. Nov. 2004 122(5):696-703.
Hologic, Inc., 510(k) Summary, prepared Nov. 28, 2010, for Affirm Breast Biopsy Guidance System Special 510(k) Premarket Notification, 5 pages.
Hologic, Inc., 510(k) Summary, prepared Aug. 14, 2012, for Affirm Breast Biopsy Guidance System Special 510(k) Premarket Notification, 5 pages.
“Filtered Back Projection”, (NYGREN), published May 8, 2007, URL: http://web.archive.org/web/19991010131715/http://www.owlnet.rice.edu/˜elec539/Projects97/cult/node 2.html, 2 pgs.
Hologic, “Lorad StereoLoc II” Operator's Manual 9-500-0261, Rev. 005, 2004, 78 pgs.
Schrading, Simone et al., "Digital Breast Tomosynthesis-guided Vacuum-assisted Breast Biopsy: Initial Experiences and Comparison with Prone Stereotactic Vacuum-assisted Biopsy", the Department of Diagnostic and Interventional Radiology, Univ. of Aachen, Germany, published Nov. 12, 2014, 10 pgs.
“Supersonic to feature Aixplorer Ultimate at ECR”, AuntiMinnie.com, 3 pages (Feb. 2018).
Bushberg, Jerrold et al., “The Essential Physics of Medical Imaging”, 3rd ed., In: “The Essential Physics of Medical Imaging, Third Edition”, Dec. 28, 2011, Lippincott & Wilkins, Philadelphia, PA, USA, XP05579051, pp. 270-272.
Dromain, Clarisse et al., “Dual-energy contrast-enhanced digital mammography: initial clinical results”, European Radiology, Sep. 14, 2010, vol. 21, pp. 565-574.
Reynolds, April, “Stereotactic Breast Biopsy: A Review”, Radiologic Technology, vol. 80, No. 5, Jun. 1, 2009, pp. 447M-464M, XP055790574.
E. Shaw de Paredes et al., “Interventional Breast Procedure”, published Sep./Oct. 1998 in Curr Probl Diagn Radiol, pp. 138-184.
Burbank, Fred, “Stereotactic Breast Biopsy: Its History, Its Present, and Its Future”, published in 1996 at the Southeastern Surgical Congress, 24 pages.
Georgian-Smith, Dianne, et al., “Stereotactic Biopsy of the Breast Using an Upright Unit, a Vacuum-Suction Needle, and a Lateral Arm-Support System”, 2001, at the American Roentgen Ray Society meeting, 8 pages.
Fischer Imaging Corp, Mammotest Plus manual on minimally invasive breast biopsy system, 2002, 8 pages.
Fischer Imaging Corporation, Installation Manual, MammoTest Family of Breast Biopsy Systems, 86683G, 86684G, P-55957-IM, Issue 1, Revision 3, Jul. 2005, 98 pages.
Fischer Imaging Corporation, Operator Manual, MammoTest Family of Breast Biopsy Systems, 86683G, 86684G, P-55956-OM, Issue 1, Revision 6, Sep. 2005, 258 pages.
Koechli, Ossi R., "Available Stereotactic Systems for Breast Biopsy", Renzo Brun del Re (Ed.), Minimally Invasive Breast Biopsies, Recent Results in Cancer Research 173:105-113; Springer-Verlag, 2009.
Al Sallab et al., “Self Learning Machines Using Deep Networks”, Soft Computing and Pattern Recognition (SoCPaR), 2011 Int'l. Conference of IEEE, Oct. 14, 2011, pp. 21-26.
Chan, Heang-Ping et al., “ROC Study of the effect of stereoscopic imaging on assessment of breast lesions,” Medical Physics, vol. 32, No. 4, Apr. 2005, 1001-1009.
Ghiassi, M. et al., “A Dynamic Architecture for Artificial Networks”, Neurocomputing, vol. 63, Aug. 20, 2004, pp. 397-413.
Lilja, Mikko, "Fast and accurate voxel projection technique in free-form cone-beam geometry with application to algebraic reconstruction," Applied Sciences on Biomedical and Communication Technologies, 2008, ISABEL '08, First International Symposium on, IEEE, Piscataway, NJ, Oct. 25, 2008.
Pathmanathan et al., “Predicting tumour location by simulating large deformations of the breast using a 3D finite element model and nonlinear elasticity”, Medical Image Computing and Computer-Assisted Intervention, pp. 217-224, vol. 3217 (2004).
Pediconi, “Color-coded automated signal intensity-curve for detection and characterization of breast lesions: Preliminary evaluation of new software for MR-based breast imaging,” International Congress Series 1281 (2005) 1081-1086.
Bakic et al., "Mammogram synthesis using a 3D simulation. I. Breast tissue model and image acquisition simulation", Medical Physics, 29, pp. 2131-2139 (2002).
Samani, A. et al., “Biomechanical 3-D Finite Element Modeling of the Human Breast Using MRI Data”, 2001, IEEE Transactions on Medical Imaging, vol. 20, No. 4, pp. 271-279.
Yin, H.M., et al., “Image Parser: a tool for finite element generation from three-dimensional medical images”, BioMedical Engineering Online. 3:31, pp. 1-9, Oct. 1, 2004.
Van Schie, Guido, et al., “Mass detection in reconstructed digital breast tomosynthesis volumes with a computer-aided detection system trained on 2D mammograms”, Med. Phys. 40(4), Apr. 2013, 41902-1-41902-11.
Van Schie, Guido, et al., “Generating Synthetic Mammograms from Reconstructed Tomosynthesis Volumes”, IEEE Transactions on Medical Imaging, vol. 32, No. 12, Dec. 2013, 2322-2331.
PCT International Preliminary Report on Patentability in Application PCT/US2018/024913, dated Oct. 10, 2019, 8 pages.
U.S. Appl. No. 16/497,766, Final Office Action dated Jul. 27, 2021, 21 pages.
Diekmann, Felix et al., “Thick Slices from Tomosynthesis Data Sets: Phantom Study for the Evaluation of Different Algorithms”, Journal of Digital Imaging, Springer, vol. 22, No. 5, Oct. 23, 2007, pp. 519-526.
Conner, Peter, "Breast Response to Menopausal Hormone Therapy—Aspects on Proliferation, Apoptosis and Mammographic Density", 2007, Annals of Medicine, 39:1, 28-41.
Glick, Stephen J., “Breast CT”, Annual Rev. Biomed. Eng., 2007, 9;501-26.
Metheany, Kathrine G et al., “Characterizing anatomical variability in breast CT images”, Oct. 2008, Med. Phys. 35 (10); 4685-4694.
Dromain, Clarisse, et al., “Evaluation of tumor angiogenesis of breast carcinoma using contrast-enhanced digital mammography”, AJR: 187, Nov. 2006, 16 pages.
Zhao, Bo, et al., “Imaging performance of an amorphous selenium digital mammography detector in a breast tomosynthesis system”, May 2008, Med. Phys 35(5); 1978-1987.
Mahesh, Mahadevappa, “AAPM/RSNA Physics Tutorial for Residents—Digital Mammography: An Overview”, Nov.-Dec. 2004, vol. 24, No. 6, 1747-1760.
Zhang, Yiheng et al., "A comparative study of limited-angle cone-beam reconstruction methods for breast tomosynthesis", Med Phys., Oct. 2006, 33(10): 3781-3795.
Sechopoulos, et al., “Glandular radiation dose in tomosynthesis of the breast using tungsten targets”, Journal of Applied Clinical Medical Physics, vol. 8, No. 4, Fall 2008, 161-171.
Wen, Junhai et al., “A study on truncated cone-beam sampling strategies for 3D mammography”, 2004, IEEE, 3200-3204.
Ijaz, Umer Zeeshan, et al., "Mammography phantom studies using 3D electrical impedance tomography with numerical forward solver", Frontiers in the Convergence of Bioscience and Information Technologies 2007, 379-383.
Kao, Tzu-Jen et al., “Regional admittivity spectra with tomosynthesis images for breast cancer detection”, Proc. of the 29th Annual Int'l. Conf. of the IEEE EMBS, Aug. 23-26, 2007, 4142-4145.
Varjonen, Mari, "Three-Dimensional Digital Breast Tomosynthesis in the Early Diagnosis and Detection of Breast Cancer", IWDM 2006, LNCS 4046, 152-159.
Taghibakhsh, F. et al., "High dynamic range 2-TFT amplified pixel sensor architecture for digital mammography tomosynthesis", IET Circuits Devices Syst., 2007, 1(1), pp. 87-92.
Chan, Heang-Ping et al., “Computer-aided detection system for breast masses on digital tomosynthesis mammograms: Preliminary Experience”, Radiology, Dec. 2005, 1075-1080.
Related Publications (1)
Number Date Country
20210100518 A1 Apr 2021 US
Provisional Applications (1)
Number Date Country
62479036 Mar 2017 US