Medical imaging has become a widely used tool for identifying and diagnosing abnormalities, such as cancers or other conditions, within the human body. Medical imaging processes such as mammography and tomography are particularly useful tools for imaging breasts to screen for, or diagnose, cancer or other lesions within the breast. Tomosynthesis systems are mammography systems that allow high resolution breast imaging based on limited angle tomosynthesis. Tomosynthesis, generally, produces a plurality of x-ray images, each of discrete layers or slices of the breast, through the entire thickness thereof. In contrast to typical two-dimensional (2D) mammography systems, a tomosynthesis system acquires a series of x-ray projection images, each projection image obtained at a different angular displacement as the x-ray source moves along a path, such as a circular arc, over the breast. In contrast to conventional computed tomography (CT), tomosynthesis is typically based on projection images obtained at limited angular displacements of the x-ray source around the breast. Tomosynthesis reduces or eliminates the problems caused by tissue overlap and structure noise present in 2D mammography imaging. Acquiring each projection image, however, increases the total amount of time required to complete the imaging process.
It is with respect to these and other general considerations that the aspects disclosed herein have been made. Also, although relatively specific problems may be discussed, it should be understood that the examples should not be limited to solving the specific problems identified in the background or elsewhere in this disclosure.
Examples of the present disclosure describe systems and methods for medical imaging through the use of synthesized virtual projections generated from real projections. In an aspect, the technology relates to a system for generating images of a breast. The system includes an x-ray source, an x-ray detector, at least one processor operatively connected to the x-ray detector, and memory operatively connected to the at least one processor, the memory storing instructions that, when executed by the at least one processor, cause the system to perform a set of operations. The operations include emitting, from the x-ray source, a first x-ray emission at a first angular location relative to the x-ray detector; detecting, by the x-ray detector, the first x-ray emission after passing through the breast; generating first x-ray imaging data from the detected first x-ray emission; emitting, from the x-ray source, a second x-ray emission at a second angular location relative to the breast; detecting, by the x-ray detector, the second x-ray emission after passing through the breast; generating second x-ray imaging data from the detected second x-ray emission; synthesizing, based on at least the first x-ray imaging data and the second x-ray imaging data, third x-ray imaging data for a third angular location relative to the breast, wherein the third angular location is different from the first angular location and the second angular location, thereby eliminating the need for an x-ray emission at the third angular location; and generating and displaying an image of the breast from the third x-ray imaging data.
In an example, the first x-ray imaging data is a first real projection for the first angular location, the second x-ray imaging data is a second real projection for the second angular location, and the third x-ray imaging data is a virtual projection for the third angular location. In another example, synthesizing the third x-ray imaging data includes fusing the first x-ray imaging data and the second x-ray imaging data in at least one of a spatial domain or a frequency domain. In yet another example, synthesizing the third x-ray imaging data further includes generating reconstruction data from the first x-ray imaging data and the second x-ray imaging data, and synthesizing the third x-ray imaging data is further based on the generated reconstruction data. In still another example, synthesizing the third x-ray imaging data further includes providing the first x-ray imaging data and the second x-ray imaging data into a trained deep-learning neural network and executing the trained deep-learning neural network based on the first x-ray imaging data and the second x-ray imaging data to generate the third x-ray imaging data. In still yet another example, the operations further comprise training a deep-learning neural network to generate the trained deep-learning neural network. Training the deep-learning neural network includes obtaining a set of real prior x-ray imaging data used for imaging a breast at multiple angular locations; dividing the set of real prior x-ray imaging data into a plurality of datasets comprising a training real data set for a first plurality of the angular locations and a training virtual data set for a second plurality of the angular locations, the second plurality of angular locations being different from the first plurality of angular locations; providing the training real data set as inputs into the deep-learning neural network; and providing the training virtual data set as a ground truth for the deep-learning neural network. 
In another example, the operations are performed as part of digital breast tomosynthesis or multi-modality imaging.
In another aspect, the technology relates to a computer-implemented method, executed by at least one processor, for generating images of a breast. The method includes receiving first real projection data for an x-ray emission from a first angular location relative to the breast; receiving second real projection data for an x-ray emission emitted from a second angular location relative to the breast; receiving third real projection data for an x-ray emission from a third angular location relative to the breast; and executing a synthesization process. The synthesization process is executed to generate, based on the first real projection data and the second real projection data, first virtual projection data for an x-ray emission from a fourth angular location relative to the breast, wherein the fourth angular location is different from the first angular location and the third angular location; and generate, based on the second real projection data and the third real projection data, second virtual projection data for an x-ray emission from a fifth angular location relative to the breast, wherein the fifth angular location is different from the second angular location and the fourth angular location.
The method further includes determining that at least one of the first virtual projection data or the second virtual projection data has a quality outside of a predetermined tolerance; based on the determination that the at least one of the first virtual projection or the second virtual projection has a quality outside of a predetermined tolerance, modifying the synthesization process to create a modified synthesization process; executing the modified synthesization process to generate a modified first virtual projection and a modified second virtual projection; generating a reconstruction model from the first real projection data, the second real projection data, the third real projection data, the modified first virtual projection data, and the modified second virtual projection data; and displaying at least one of a slice of the breast from the generated reconstruction model, the first real projection data, the second real projection data, the third real projection data, the first virtual projection data, or the second virtual projection data.
In an example, determining that at least one of the first virtual projection data or the second virtual projection data has a quality outside of a predetermined tolerance further includes: identifying a landmark in one of the first real projection data or the second real projection data; identifying the landmark in the first virtual projection data; comparing the location of the landmark in the first virtual projection data to the location of the landmark in at least one of the first real projection data or the second real projection data; and based on the comparison, determining whether the location of the landmark in the first virtual projection data is within the predetermined tolerance. In another example, the synthesization process includes: providing the first real projection data, the second real projection data, and the third real projection data into a trained deep-learning neural network; and executing the trained deep-learning neural network based on the first real projection data, the second real projection data, and the third real projection data to generate the first virtual projection data and the second virtual projection data. In yet another example, modifying the synthesization process includes modifying coefficients of the trained deep-learning neural network. In still another example, the method further includes determining that the slice has a quality outside a reconstruction quality tolerance; and based on the determination that the slice has a quality outside a reconstruction quality tolerance, further modifying the modified synthesization process to create a further modified synthesization process.
In still yet another example, the method further includes: executing the further modified synthesization process to generate a further modified first virtual projection data and a further modified second virtual projection data; generating a modified reconstruction model from the first real projection data, the second real projection data, the third real projection data, the further modified first virtual projection data, and the further modified second virtual projection data; and displaying at least one of a slice of the breast from the modified reconstruction model, the further modified first virtual projection, or the further modified second virtual projection. In another example, the method is performed as part of digital breast tomosynthesis or multi-modality imaging.
In another aspect, the technology relates to another computer-implemented method, executed by at least one processor, for generating images of a breast. The method includes receiving first real projection data for an x-ray emission from a first angular location relative to the breast; receiving second real projection data for an x-ray emission emitted from a second angular location relative to the breast; providing the first real projection data and the second real projection data into a trained deep-learning neural network; executing the trained deep-learning neural network based on the first real projection data and the second real projection data to generate first virtual projection data for a third angular location relative to the breast; generating a reconstruction model from the first real projection data, the second real projection data, and the first virtual projection data; and displaying at least one of a slice of the breast from the generated reconstruction model, the first real projection data, the second real projection data, or the first virtual projection data.
In an example, the method further includes determining that the slice has a quality outside a reconstruction quality tolerance; and based on the determination that the slice has a quality outside a reconstruction quality tolerance, modifying the trained deep-learning neural network to create a modified deep-learning neural network. In another example, the method further includes: executing the modified deep-learning neural network based on the first real projection data and the second real projection data to generate a modified first virtual projection; generating a modified reconstruction model from the first real projection data, the second real projection data, and the modified first virtual projection data; and displaying at least one of a slice of the breast from the modified reconstruction model or the modified first virtual projection. In yet another example, the determination that the slice has a quality outside a reconstruction quality tolerance is based on at least one of image artifacts or image quality measurements. In still another example, the difference between the first angular location and the second angular location is less than or equal to three degrees. In still yet another example, the method is performed as part of digital breast tomosynthesis or multi-modality imaging.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Additional aspects, features, and/or advantages of examples will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the disclosure.
Non-limiting and non-exhaustive examples are described with reference to the following figures.
As discussed above, a tomosynthesis system acquires a series of x-ray projection images, each projection image obtained at a different angular displacement as the x-ray source moves along a path, such as a circular arc, over the breast. More specifically, the technology typically involves taking two-dimensional (2D) real projection images of the immobilized breast at each of a number of angles of the x-ray beam relative to the breast. The resulting x-ray measurements are computer-processed to reconstruct images of breast slices that typically are in planes transverse to the x-ray beam axis, such as parallel to the image plane of a mammogram of the same breast, but can be at any other orientation and can represent breast slices of selected thicknesses. Acquiring each real projection image introduces additional radiation to the patient and increases the total amount of time required to complete the imaging process. The use of fewer real projection images, however, leads to worse image quality for the reconstructed images.
The present technology contemplates systems and methods that allow fewer real projection images to be acquired, while still preserving suitable image quality of reconstructed images of breast slices. The present technology allows for virtual projection images to be generated from real projection images. The virtual projection images may then be used, along with the real projection images, to generate the reconstruction model for the breast. Through the use of the virtual projection images, radiation exposure at some of the angular locations where radiation exposure was traditionally necessary can be eliminated—thus reducing the total radiation dose received by the patient and reducing the time required to complete the tomosynthesis procedure. Reducing the amount of time the patient is imaged also improves image quality by reducing patient motion during the imaging procedure.
In some examples, the total imaging time and dosage may remain the same as prior imaging procedures, such as tomosynthesis imaging procedures. In such examples, the virtual projection images may be generated to expand the angular range that is imaged or to provide additional information for the real projection images. Thus, imaging artifacts in the reconstruction images, such as overlay structures or other imaging artifacts that are inherent in limited-angle imaging modalities, may be reduced or removed due to the additional virtual projections.
The virtual projection images may be generated from machine-learning techniques, such as deep-learning neural networks. The virtual projection images may also be generated by fusing multiple real projection images. In addition, generating the virtual projection images may be based on reconstruction data generated from the real projection images. The generation of the virtual projections and reconstruction data may also be an iterative process. For example, image quality of a reconstructed breast slice may be assessed, and if the quality is poor, modified virtual projections can be generated to improve the image quality of the image of the breast slice until desired performance criteria are achieved. As used herein, a real projection refers to a projection obtained by emitting radiation through the breast for a respective angular location. In contrast, a virtual projection refers to a projection obtained without emitting radiation through the breast.
In describing examples and embodiments illustrated in the drawings, specific terminology is employed for the sake of clarity. However, the disclosure of this patent specification is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that operate in a similar manner.
When operating in a CT mode, the system of
A unique challenge arises because of the upright position of the patient and the rotation of x-ray tube 108 and receptor housing 110 through a large angle in the CT mode of operation. As is known, CT scanning typically involves a rotation of the source and receptor through an angle of 180° plus the angle subtended by the imaging x-ray beam, and preferably a rotation through a greater angle, e.g., 360°. However, if the rotation includes the 0° position of x-ray source 108 as seen in
An example of such a shield is illustrated in
Use of the system in a tomosynthesis mode is illustrated in
In one example of tomosynthesis mode operation, x-ray tube 108 rotates through an arc of about ±15° while the imaging receptor rotates or pivots through about ±5° about the horizontal axis that bisects its imaging surface. During this motion, plural projection images RP are taken, such as 20 or 21 images, at regular increments of rotation angle. The central angle of the ±15° arc of x-ray source 108 rotation can be the 0° angle, i.e., the position of the x-ray source 108 seen in
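The angular sampling described above can be illustrated with a short sketch. This is a minimal illustration only; `source_angles` is a hypothetical helper, and the ±15° arc and 21-projection count are the example values from this paragraph, not fixed system parameters.

```python
# Illustrative sketch: evenly spaced x-ray source angles over a +/-15 degree arc.
# The arc limits and projection count are example values, not fixed parameters.
def source_angles(arc_half_angle=15.0, num_projections=21):
    """Return the angular locations (in degrees) for each real projection,
    spaced at regular increments from -arc_half_angle to +arc_half_angle."""
    if num_projections < 2:
        return [0.0]
    step = 2 * arc_half_angle / (num_projections - 1)
    return [-arc_half_angle + i * step for i in range(num_projections)]

angles = source_angles()  # 21 angles spaced 1.5 degrees apart, centered on 0
```

With 21 projections the central angle falls exactly on the 0° position of the source, matching the geometry described above.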
When operating in a tomosynthesis mode, the system of
In other examples, angular locations for real projections and virtual projections need not alternate as shown in
Fusing the real projections (RP) to synthesize or generate virtual projections (VP) may be performed by a variety of image analysis and combination techniques, including superposition, interpolation, and extrapolation, among other potential techniques. Interpolation or extrapolation techniques may be performed based on the angular locations of the real projections (RP) as compared to the corresponding angular location of the virtual projection (VP). For instance, where the angular location of the virtual projection (VP) is between the angular locations of the real projections (RP) used to generate the virtual projection (VP), interpolation techniques may be used. Where the angular location of the virtual projection (VP) is outside the angular locations of the real projections (RP) used to generate the virtual projection (VP), extrapolation techniques may be used. The techniques for fusing the real projections (RP) to generate virtual projections (VP) may also be performed in the spatial, transform, or frequency domains. For example, image fusion techniques in the spatial domain generally operate based on the pixel values in the real projections (RP). Image fusion techniques within the transform or frequency domains generally operate based on mathematical transforms, such as a Fourier or Laplace transform, of the pixel data from the real projections (RP). For instance, in the frequency domain, the image fusion techniques may be based on a rate of change of pixel values within the spatial domain.
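As a minimal sketch of spatial-domain fusion by interpolation, the example below weights two real projections by their angular distance from the target angle. `fuse_projections` is a hypothetical helper, and per-pixel linear weighting is only the simplest possible instance of the interpolation techniques described above; production systems would use more sophisticated fusion.

```python
import numpy as np

def fuse_projections(rp_a, rp_b, angle_a, angle_b, target_angle):
    """Synthesize a virtual projection at target_angle from two real
    projections by linear weighting in the spatial domain. When target_angle
    lies between angle_a and angle_b this interpolates; outside that range,
    the same formula extrapolates."""
    t = (target_angle - angle_a) / (angle_b - angle_a)
    return (1.0 - t) * rp_a + t * rp_b

rp1 = np.full((4, 4), 100.0)  # toy real projection at -1.5 degrees
rp2 = np.full((4, 4), 200.0)  # toy real projection at +1.5 degrees
vp = fuse_projections(rp1, rp2, -1.5, 1.5, 0.0)  # virtual projection at 0 degrees
```

For a target angle inside the pair, `t` lies in [0, 1] and the result is an interpolation; for a target angle outside the pair, `t` falls outside [0, 1] and the same expression extrapolates, mirroring the two cases in the paragraph above.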
In the system 810, the virtual projection synthesizer 804 may also use reconstruction images (TR) to generate the virtual projections (VP). In one example, the virtual projection synthesizer 804 receives reconstruction images (TR) from the reconstruction engine 806. In such an example, the reconstruction images (TR) may be based on the real projections (RP) received by the reconstruction engine 806 from the acquisition system 802. In other examples, the process of generating the reconstruction images (TR) and/or the virtual projections (VP) may be an iterative process. For instance, the reconstruction engine 806 may receive the real projections (RP) from the acquisition system 802 and the virtual projections (VP) from the virtual projection synthesizer 804 generated from the real projections (RP). The reconstruction engine 806 then generates the reconstruction images (TR) from the real projections (RP) and the virtual projections (VP). Those reconstruction images (TR) may be provided back to the virtual projection synthesizer 804 to update the virtual projections (VP) based on the reconstruction images (TR). The updated or modified virtual projections (VP) may then be provided back to the reconstruction engine 806 to generate an updated or modified reconstruction model and updated reconstruction images (TR). This iterative updating or modification process may continue until performance criteria for the virtual projections, performance criteria for the reconstruction images, or both are achieved.
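The iterative exchange between the synthesizer and the reconstruction engine can be sketched as a generic loop. This is a hypothetical skeleton: `synthesize`, `reconstruct`, and `quality` stand in for the virtual projection synthesizer 804, the reconstruction engine 806, and the performance criteria, none of which are specified in code form in the source.

```python
def iterate_until_acceptable(real_projections, synthesize, reconstruct,
                             quality, max_iters=10):
    """Generic sketch of the iterative loop: synthesize virtual projections
    (optionally informed by the previous reconstruction), reconstruct from
    the real plus virtual projections, assess quality, and repeat until the
    performance criteria are met or max_iters is reached."""
    reconstruction, virtual = None, []
    for _ in range(max_iters):
        # On the first pass there is no reconstruction yet; later passes feed
        # the previous reconstruction back into the synthesizer.
        virtual = synthesize(real_projections, reconstruction)
        reconstruction = reconstruct(real_projections + virtual)
        if quality(reconstruction):
            break
    return reconstruction, virtual
```

The `max_iters` bound is an assumption added here so the loop always terminates even if the criteria are never met.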
Prior to receiving the real projections (RP), the deep-learning neural network 808 has been trained to generate the virtual projections (VP). For instance, the deep-learning neural network 808 may be trained with a known set of real projection data. The real projection data may be separated into a set of training real projection data and training virtual projection data. As an example, real projection data may be received for angular locations L1, L2, and L3. The real projection data for angular location L2 may be segregated into a data set of training virtual projection data. The real projection data for angular location L2 is effectively the desired, or ideal, virtual projection data for angular location L2. As such, the deep-learning neural network 808 can be trained to produce virtual projection data based on neighboring real projection data. The real projection data for angular locations L1 and L3 is used as input during training, and the known virtual projection data for the angular location L2 is used as a ground truth during training. Training the deep-learning neural network 808 may be performed using multiple different techniques. As one example, the coefficients of the deep-learning neural network 808 may be adjusted to minimize a pre-defined cost function that evaluates the difference between the known virtual projection data and the output of the deep-learning neural network 808 during training. Multiple sets of real projection data may be used to train the deep-learning neural network 808 until a desired performance of the deep-learning neural network 808 is achieved.
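The training pattern (inputs at L1 and L3, held-out real data at L2 as ground truth, coefficients adjusted to minimize a cost function) can be sketched with a toy linear model standing in for the deep network. All data here is synthetic and the model is deliberately trivial; the point is only the structure of the training loop, not a real network architecture.

```python
import numpy as np

# Toy stand-in for the deep-learning network: a linear model that predicts
# the virtual projection value for L2 from the real projection values at
# L1 and L3. Synthetic data; a real system trains a deep network on actual
# projection images, but the inputs / ground-truth / cost-minimization
# pattern is the same.
rng = np.random.default_rng(0)
rp_l1 = rng.random((100, 1))                 # training inputs: data at L1
rp_l3 = rng.random((100, 1))                 # training inputs: data at L3
ground_truth_l2 = 0.5 * rp_l1 + 0.5 * rp_l3  # held-out real data at L2

x = np.hstack([rp_l1, rp_l3])
w = np.zeros((2, 1))  # model coefficients, adjusted during training

for _ in range(500):
    pred = x @ w                                        # network output
    grad = 2 * x.T @ (pred - ground_truth_l2) / len(x)  # gradient of MSE cost
    w -= 0.5 * grad                                     # coefficient update
# w converges toward the interpolation weights (0.5, 0.5)
```

Because the synthetic ground truth is an exact average of the two inputs, gradient descent on the mean-squared-error cost recovers the averaging weights, illustrating how minimizing the cost function drives the coefficients toward reproducing the held-out L2 data.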
In other examples, the deep-learning neural network 808 may be used to generate a reconstruction model or reconstruction images (TR) without an intermediate operation of generating virtual projections (VP). In such an example, the deep-learning neural network 808 may be trained with a set of real projection data and a corresponding reconstruction model or reconstruction images. The reconstruction images can be used as the ground truth during training and the real projection data may be used as the input to the deep-learning neural network 808 during training. Training of the deep-learning neural network 808 may then be similar to the training discussed above.
While a deep-learning neural network 808 has been used in the example system 820 depicted in
At operation 914, third x-ray imaging data for a third angular location is synthesized based on at least the first x-ray imaging data and the second x-ray imaging data. The third angular location is different from the first and second angular locations. In the example where the first x-ray imaging data is a real projection for angular location L1 and the second x-ray imaging data is a real projection for angular location L3, the third x-ray imaging data may be a virtual projection for the angular location L2. As such, by generating the third x-ray imaging data for the third angular location without emitting x-ray radiation at the third angular location, the need for an x-ray emission at the third angular location is eliminated. By eliminating the need for the x-ray emission, the overall radiation dose delivered to the patient is reduced and the time required to complete the imaging procedure is reduced.
In another example of method 900, the first x-ray imaging data is a real projection for angular location L5 and the second x-ray imaging data is a real projection for angular location L6. In that example, the third x-ray imaging data generated at operation 914 may be a virtual projection for an angular location wider than angular locations L5 and L6, such as angular location L7. As such, by generating the third x-ray imaging data for the third angular location to expand the overall angular range from the angular location of L6 to that of L7, the overall image quality may be improved due to the additional information from the wider angular location of angular location L7. For instance, image artifacts such as overlap structures may be reduced or removed.
At operation 928, the first x-ray imaging data and second x-ray imaging data are provided as inputs into a trained deep-learning neural network, and the trained deep-learning neural network is executed based on the first x-ray imaging data and second x-ray imaging data to generate the third x-ray imaging data. The trained deep-learning neural network may have been trained based on a set of real projection data, as discussed above and discussed below in further detail with respect to
At operation 930, a reconstruction model and/or reconstruction images are generated based on the first x-ray imaging data, the second x-ray imaging data, and the third x-ray imaging data. For example, the first x-ray imaging data and the second x-ray imaging data may be provided to a reconstruction engine. The third x-ray imaging data generated at operation 926 and/or operation 928 may also be provided to the reconstruction engine. The reconstruction engine then generates the reconstruction model and/or reconstruction images based on the first x-ray imaging data, the second x-ray imaging data, and the third x-ray imaging data. In some examples, reconstruction data from the reconstruction model can be provided to a virtual projection synthesizer to be used in operation 926 and/or operation 928 to generate the third x-ray imaging data. In such examples, the reconstruction data is generated at operation 930 based on the first x-ray imaging data and the second x-ray imaging data prior to the generation of the third x-ray imaging data at operation 926 and/or operation 928. In other examples, the process may be iterative, and operation 926 and/or operation 928 may repeat upon receiving the reconstruction data to generate modified third x-ray imaging data. The modified third x-ray imaging data may then be used to generate a modified or updated reconstruction model and/or reconstruction images. At operation 932, one or more reconstruction slices of the breast are displayed based on the reconstruction model and/or reconstruction images generated at operation 930. In other examples, the one or more reconstruction slices of the breast may be displayed concurrently or sequentially with one or more of the acquired real projection images and/or one or more of the generated virtual projection images.
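For context, the classic tomosynthesis reconstruction is shift-and-add backprojection; the toy sketch below is not the reconstruction engine described here (which may use filtered backprojection or iterative methods), only an indication of how real and virtual projections combine into a slice. `shift_and_add` and its per-projection shifts are assumptions for illustration.

```python
import numpy as np

def shift_and_add(projections, shifts):
    """Toy shift-and-add backprojection: shift each projection by the amount
    that brings a chosen slice height into registration, then average. Real
    reconstruction engines typically use filtered backprojection or
    iterative methods instead of this simple average."""
    acc = np.zeros_like(projections[0], dtype=float)
    for proj, shift in zip(projections, shifts):
        acc += np.roll(proj, shift)
    return acc / len(projections)

# Two 1-D toy projections of a single point; shifting the second by one
# pixel brings the point into registration for this slice height.
p1 = np.array([0.0, 0.0, 1.0, 0.0])
p2 = np.array([0.0, 1.0, 0.0, 0.0])
slice_row = shift_and_add([p1, p2], [0, 1])
```

Structures at the chosen slice height reinforce after shifting, while structures at other heights blur out, which is why adding virtual projections can sharpen the reconstructed slice.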
While the x-ray imaging data in the methods discussed herein are discussed as being first, second, third, etc., such designations are merely for clarity and do not necessarily denote any particular order or sequence. In addition, it should be appreciated that additional real x-ray imaging data may be used and more virtual imaging data may also be generated than what is discussed by example in the methods described herein.
At operation 962, the deep-learning neural network is tested. The deep-learning neural network may be tested with other sets of real prior x-ray imaging data to determine the performance and accuracy of the deep-learning neural network. At operation 964, based on the testing performed in operation 962, a determination is made as to whether the performance of the deep-learning neural network is acceptable. The determination may be made based on differences between the output of the deep-learning neural network and the known test data. If the performance is acceptable or within a predetermined tolerance, the trained deep-learning neural network is stored for later use with live real x-ray imaging data. If the performance is not acceptable or outside a predetermined tolerance, the method 950 flows back to operation 952, where an additional set of real prior x-ray imaging data is obtained and used to continue training the deep-learning neural network. The method 950 continues and repeats until the deep-learning neural network generates acceptable results that are within the predetermined tolerance.
At operation 1006, a determination is made as to whether the generated virtual projection data is acceptable. For instance, a determination may be made as to whether the image quality of the virtual projection data is within a predetermined tolerance. Continuing with the example above, a determination may be made as to whether at least one of the first virtual projection data or the second virtual projection data has a quality outside of a predetermined tolerance. In one example, determining whether the generated virtual projection data is acceptable is based on the identification of landmarks in the real projection data and the generated virtual projection data. Continuing with the example above, a landmark may be identified in the first real projection data and/or the second real projection data. The landmark may then be identified in the first virtual projection data. The location of the landmark in the first virtual projection data is then compared to the location of the landmark in the first real projection data and/or the second real projection data. Based on the comparison, a determination is made as to whether the location of the landmark in the first virtual projection data is within the predetermined tolerance.
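The landmark comparison can be sketched as a simple displacement test. This is a minimal illustration: `landmark_within_tolerance` and the 2-pixel tolerance are assumptions, and a real system would typically compare many landmarks and may use additional quality metrics.

```python
import math

def landmark_within_tolerance(landmark_real, landmark_virtual, tolerance_px=2.0):
    """Compare a landmark's (x, y) location in a real projection with its
    location in the synthesized virtual projection; return True when the
    displacement falls within the predetermined tolerance (a pixel distance
    here, purely for illustration)."""
    dx = landmark_virtual[0] - landmark_real[0]
    dy = landmark_virtual[1] - landmark_real[1]
    return math.hypot(dx, dy) <= tolerance_px

# Landmark at (120, 85) in the real projection and (121, 86) in the virtual
# projection: a displacement of about 1.41 pixels, inside a 2-pixel tolerance.
ok = landmark_within_tolerance((120, 85), (121, 86))
```

A landmark displaced beyond the tolerance would mark the virtual projection as unacceptable, triggering the modification of the synthesization process described below at operation 1008.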
If the virtual projections are determined to be acceptable or within the predetermined tolerance at operation 1006, the method 1000 flows to operation 1012 where a reconstruction model and/or reconstruction images are generated from the real projection data received at operation 1002 and the virtual projection data generated at operation 1004. If, however, the virtual projection data is determined to not be acceptable or outside the predetermined tolerance, the method 1000 flows to operation 1008 where the synthesization process is modified to create a modified synthesization process. Modifying the synthesization process may include altering the image combination techniques, such as modifying weighting or other parameters, used to combine the real projection data. In examples where the synthesization process includes executing a deep-learning neural network, modifying the synthesization process may include modifying the deep-learning neural network to create a modified deep-learning neural network. Modifying the deep-learning neural network may include adjusting the coefficients of the deep-learning neural network such that the modified deep-learning neural network produces virtual projection data that will fall within the predetermined tolerance.
The modified synthesization process is then executed in operation 1010 to generate modified virtual projection data. Continuing with the example above, the modified synthesization process may be executed to generate a modified first virtual projection and a modified second virtual projection. At operation 1012, a modified reconstruction model and/or modified reconstruction images are generated from the real projection data received at operation 1002 and the modified virtual projection data generated at operation 1010. Continuing with the example above, generating the modified reconstruction model and/or modified reconstruction images may be based on the first real projection data, the second real projection data, the third real projection data, the modified first virtual projection data, and the modified second virtual projection data.
At operation 1014, a determination is made as to whether the reconstruction model and/or reconstruction images generated at operation 1012 are acceptable, or within a reconstruction quality tolerance. For example, it may be determined that a particular slice has a quality outside the reconstruction quality tolerance. That determination may be based on image artifacts within the slice and/or other image quality measurements, such as the sharpness of objects in the slice, contrast-to-noise ratios, spatial resolutions, z-axis resolution, or an artifact spread function (e.g., artifact spreading among the slices along the z-direction). If the reconstruction model and/or reconstruction images are determined to be acceptable at operation 1014, the method 1000 flows to operation 1016, where a slice from the reconstruction model and/or reconstruction images is displayed. In some examples of operation 1016, one or more reconstruction slices of the breast, one or more of the acquired real projection images, and/or one or more of the generated virtual projection images may be displayed, either concurrently or sequentially. If the reconstruction model and/or reconstruction images are determined to not be acceptable at operation 1014, the method 1000 flows back to operation 1008, where the synthesization process may be further modified to create a further modified synthesization process. The further modified synthesization process is then executed at operation 1010 to generate further modified virtual projection data, which may then be used to create a further modified reconstruction model and/or further modified reconstruction images.
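The check-modify-regenerate flow of method 1000 can be sketched in pseudocode-like Python. Every callable here is a hypothetical placeholder: `projection_ok` and `reconstruction_ok` stand in for the predetermined tolerance and reconstruction quality tolerance checks, `modify` returns a modified synthesization process, and the iteration cap is an assumption not stated in the disclosure:

```python
def method_1000(real, synthesize, reconstruct,
                projection_ok, reconstruction_ok, modify, max_iterations=10):
    """Sketch of operations 1002-1016: synthesize virtual projections,
    check them, reconstruct, check the reconstruction, and modify the
    synthesization process until both tolerances are met."""
    virtual = synthesize(real)                 # operation 1004
    if not projection_ok(virtual):             # operation 1006
        synthesize = modify(synthesize)        # operation 1008
        virtual = synthesize(real)             # operation 1010
    for _ in range(max_iterations):
        slices = reconstruct(real, virtual)    # operation 1012
        if reconstruction_ok(slices):          # operation 1014
            return slices                      # operation 1016: display a slice
        synthesize = modify(synthesize)        # back to operation 1008
        virtual = synthesize(real)             # operation 1010
    raise RuntimeError("tolerances not met within the iteration budget")
```

Note that a failed projection check feeds forward into reconstruction after one modification, whereas a failed reconstruction check loops back for further modification, mirroring the two decision points at operations 1006 and 1014.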
In its most basic configuration, operating environment 1100 typically includes at least one processing unit 1102 and memory 1104. Depending on the exact configuration and type of computing device, memory 1104 (storing, among other things, instructions to perform the image acquisition and processing methods disclosed herein) can be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.), or some combination of the two. This most basic configuration is illustrated in
Operating environment 1100 typically includes at least some form of computer readable media. Computer readable media can be any available media that can be accessed by processing unit 1102 or other devices comprising the operating environment. By way of example, and not limitation, computer readable media can comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, solid state storage, or any other tangible medium which can be used to store the desired information. Communication media embodies computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media. A computer-readable device is a hardware device incorporating computer storage media.
The operating environment 1100 can be a single computer operating in a networked environment using logical connections to one or more remote computers. The remote computer can be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above, as well as others not mentioned. The logical connections can include any method supported by available communications media. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
In some embodiments, the components described herein comprise such modules or instructions executable by computer system 1100 that can be stored on computer storage media and other tangible media and transmitted in communication media. Computer storage media includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules, or other data. Combinations of any of the above should also be included within the scope of computer readable media. In some embodiments, computer system 1100 is part of a network that stores data in remote storage media for use by the computer system 1100.
In embodiments, the various systems and methods disclosed herein may be performed by one or more server devices. For example, in one embodiment, a single server, such as server 1204, may be employed to perform the systems and methods disclosed herein, such as the imaging methods discussed herein. Client device 1202 may interact with server 1204 via network 1208. In further embodiments, the client device 1202 may also perform functionality disclosed herein, such as scanning and image processing, the results of which can then be provided to servers 1204 and/or 1206.
In alternate embodiments, the methods and systems disclosed herein may be performed using a distributed computing network, or a cloud network. In such embodiments, the methods and systems disclosed herein may be performed by two or more servers, such as servers 1204 and 1206. Although a particular network embodiment is disclosed herein, one of skill in the art will appreciate that the systems and methods disclosed herein may be performed using other types of networks and/or network configurations.
In light of the foregoing, it should be appreciated that the present technology is able to reduce the overall radiation dose to the patient during an imaging process by acquiring real projection data at fewer angular locations than were previously used in tomosynthesis procedures. Virtual projections may then be used in place of additional real projections. The combination of the real and virtual projections thus can provide a reconstruction substantially equivalent to that of a former full-dose projection acquisition process. In addition, the present technology can improve the image quality of the reconstructed images, without increasing radiation dosage, by generating virtual projections at angular locations that provide additional information to reduce image artifacts that would otherwise appear in tomosynthesis. Further, the total time required to complete the imaging process is reduced by the present technology. Reducing the time the patient is imaged also reduces the impact of patient movement on the image quality of the reconstructed data.
The embodiments described herein may be employed using software, hardware, or a combination of software and hardware to implement and perform the systems and methods disclosed herein. Although specific devices have been recited throughout the disclosure as performing specific functions, one of skill in the art will appreciate that these devices are provided for illustrative purposes, and other devices may be employed to perform the functionality disclosed herein without departing from the scope of the disclosure.
This disclosure describes some embodiments of the present technology with reference to the accompanying drawings, in which only some of the possible embodiments are shown. Other aspects may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure is thorough and complete and fully conveys the scope of the possible embodiments to those skilled in the art. Further, as used herein and in the claims, the phrase “at least one of element A, element B, or element C” is intended to convey any of: element A, element B, element C, elements A and B, elements A and C, elements B and C, and elements A, B, and C.
Although specific embodiments are described herein, the scope of the technology is not limited to those specific embodiments. One skilled in the art will recognize other embodiments or improvements that are within the scope and spirit of the present technology. Therefore, the specific structure, acts, or media are disclosed only as illustrative embodiments. The scope of the technology is defined by the following claims and any equivalents thereof.
This application claims the benefit of U.S. Provisional Application No. 62/730,818 filed on Sep. 13, 2018, which is hereby incorporated by reference in its entirety.
Published as U.S. Patent Application Publication No. 2020/0085393 A1, Mar. 2020.