The disclosure generally relates to quality assurance of patient imaging, and more particularly to improving detection of movement and correction of motion artifacts as they relate to mammography or tomosynthesis image acquisition.
Preventing movement of subject tissue, and in particular breast tissue, is important when performing radiation-based imaging of a patient for a variety of reasons. First, some imaging procedures last for a non-trivial period of time, and movement during a portion of the procedure may negatively impact image quality. Specifically, patient motion may cause anatomical distortions or artifacts, which can be exaggerated during longer exposure times. Second, it is desirable to minimize a patient's total exposure to radiation during a procedure and, thus, subsequent imaging to obtain proper image quality is not ideal. Third, due to regulations in many jurisdictions, subsequent imaging used solely to correct image quality may be counted against a practitioner or organization, and frequent re-imaging may result in revocation of a license and/or accreditation. Fourth, poor quality images due to excess movement may require a patient to make subsequent visits to an imaging center, placing additional burden on the patient and the healthcare system itself, including the imaging center and payer.
The following presents a simplified summary in order to provide a basic understanding of some novel embodiments described herein. This summary is not an extensive overview, and it is not intended to identify key/critical elements or to delineate the scope thereof. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
Techniques for detecting patient motion, notifying a patient of detected motion, and modifying the imaging protocol during breast imaging are described. As described above, preventing movement of breast tissue is important when performing radiation-based imaging of a patient for a variety of reasons, including improving image quality, improving patient experience, reducing radiation exposure, and avoiding repeat imaging. For at least these reasons, there is a need for improved techniques, which may be automated or semi-automated, for detection of movement during an imaging procedure, for corrective actions during and after the procedure when movement has been detected, and for minimizing the amount of radiation exposure to patients in a workflow-efficient manner.
An imaging system as described herein may include an imaging detector to capture an image of human tissue, such as breast tissue, and a compression paddle situated apart from the imaging detector to compress the human tissue between the compression paddle and the imaging detector. One or more sensors may be included; in one embodiment, a force sensor may generate a force signal indicating a measure of force applied to the human tissue. A movement detection circuit may filter a movement signal from the force signal indicating a measure of movement of the compressed human tissue. A movement analysis module may determine that the movement signal is beyond a movement threshold. An image correction module may perform a corrective action based upon the determination that the movement signal is beyond the movement threshold. Other embodiments are described and claimed.
The force sensor described herein is typical of most modern mammography systems, which incorporate measurement of breast compression force. The force sensor helps to prevent excessive compression of the patient's breast, which can cause pain and other undesirable effects. The embodiments as described and claimed relate to the output of the force sensor, representative of a force level, which may be filtered or converted by one or more circuits or modules described herein into a value that indicates movement. This movement signal, when compared to other measurements over time, may indicate movement of the patient undergoing an imaging procedure.
In addition or in the alternative, other sensors may be used. For example, one or more ultrasound sensors, optical sensors, and/or infrared sensors may be used. In some examples, the sensors may be arranged in a grid on the compression paddle. In other examples, the sensors may be located on the periphery of the paddle. The sensors may capture spatial information from the compression of the breast. The spatial information may be used to create motion maps and/or contact maps. The motion map information can be used to create a correction map. The correction map information may be used as input to the image correction algorithm, which corrects the tomosynthesis images. In the examples where a contact map is created based on the spatial information, the contact map can be used to create compression contours, which can be used as an input to the compression adequacy analysis and to recommend a corrective action.
Some software-based techniques for detecting motion during an imaging procedure have been previously described. For example, one method of detecting patient motion includes detecting, from a series of images, displacement of an edge line such as the skin line of the breast, an implant edge, or some other internal edge. This skin line detection process is disclosed in U.S. Pat. No. 9,498,180, titled "System and Method for Detecting Patient Motion During Tomosynthesis Scans," which is incorporated by reference herein (hereafter the '180 patent).
However, unlike software-based and image-artifact-based motion detection, detection of motion based on hardware sensors provides an objective measure of patient motion to add to the overall motion assessment. The independent, hardware-based detection using the information from one or more sensors allows for greater accuracy. In addition, because the mammography system already includes the force sensor, this method of patient motion detection is more cost-effective than the alternative image-based detection when force sensor detection is used. In addition, different types of motion may be detected and different compensation actions may be taken. For example, if motion with a regular interval is detected, such as breathing or heartbeat, image capture may be synchronized with the motion. In a different example, if irregular movement is detected, such as a patient adjusting position, the image capture may be delayed. Such nuanced and continued detection may not be possible if the detection is based on image processing alone.
In an aspect, the technology relates to a method for improving medical imaging procedures. The method includes receiving, by a central computer system from a first imaging facility, a first set of quality metrics for a plurality of medical images acquired at the first imaging facility; receiving, by the central computer system from a second imaging facility, a second set of quality metrics for a second plurality of medical images acquired at the second imaging facility; comparing, by the central computer system, the first set of quality metrics to the second set of quality metrics; based on the comparison of the first set of quality metrics to the second set of quality metrics, generating, by the central computer system, a benchmark for at least one metric in the first set of quality metrics and the second set of quality metrics; generating, by the central computer system, facility data based on the generated benchmark and the first set of quality metrics; and sending, by the central computer system, the facility data to the first imaging facility.
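By way of illustration and not limitation, the following Python sketch shows one simple way the benchmark generation and facility data described above could be computed. The function names, the choice of a pooled mean as the benchmark, and the example scores are assumptions made for illustration only, not features of the claimed method.

```python
from statistics import mean

def generate_benchmark(first_metrics, second_metrics):
    """Derive a per-metric benchmark by pooling two facilities' metrics.

    Each argument maps a metric name (e.g., "positioning") to a list of
    per-image scores; a pooled mean is one simple choice of benchmark.
    """
    benchmark = {}
    for name in set(first_metrics) & set(second_metrics):
        benchmark[name] = mean(first_metrics[name] + second_metrics[name])
    return benchmark

def generate_facility_data(facility_metrics, benchmark):
    """Compare a facility's average for each metric against the benchmark."""
    return {
        name: {
            "facility_mean": mean(scores),
            "benchmark": benchmark[name],
            "delta": mean(scores) - benchmark[name],
        }
        for name, scores in facility_metrics.items()
        if name in benchmark
    }

first = {"positioning": [0.82, 0.79, 0.91], "compression": [0.70, 0.75]}
second = {"positioning": [0.88, 0.93], "compression": [0.81, 0.84, 0.80]}
facility_data = generate_facility_data(first, generate_benchmark(first, second))
print(facility_data)  # the facility data sent back to the first facility
```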
In an example, the method further includes generating a training recommendation based on the generated benchmark and the first set of quality metrics; receiving, from the first imaging facility, a subsequent set of quality metrics for a plurality of medical images acquired at the first facility after the sending of the generated training recommendation; comparing the subsequent set of quality metrics to the first set of quality metrics; and based on the comparison of the subsequent set of quality metrics to the first set of quality metrics, generating an effectiveness rating for the generated training recommendation. In another example, the method further includes receiving, from the first imaging facility, a subsequent set of quality metrics for a plurality of medical images acquired at the first facility after the sending of the generated training recommendation; comparing the subsequent set of quality metrics to the first set of quality metrics to determine a trend for at least one quality metric; and based on the determined trend for the at least one quality metric, generating a trend warning. In yet another example, the quality metrics are based on positioning metrics generated from the plurality of medical images. In still another example, the method further includes providing the first set of quality metrics and the second set of quality metrics as inputs to an unsupervised machine learning algorithm to identify additional patterns within the sets of quality metrics. In a further example, the first set of quality metrics are received via a web application managed by the central computer system and the training recommendation is sent via the web application. In still yet another example, the quality metrics are based on patient movement. In another example, at least one quality metric is based on a movement signal that is generated by the following operations: generating, by a force sensor, a force signal indicating a measure of force applied superior to human tissue being compressed between a compression paddle and an imaging detector to capture an image of the human tissue; and filtering, by a movement detection circuit, a movement signal from the force signal indicating a measure of movement of the compressed human tissue.
In another aspect, the technology relates to a central computer system that includes at least one processing unit; and memory operatively in communication with the at least one processing unit, the memory storing instructions that, when executed by the at least one processing unit, are configured to cause the system to perform a set of operations. The operations include receiving, from a first imaging facility, a first set of quality metrics for a plurality of medical images acquired at the first imaging facility; receiving, from a second imaging facility, a second set of quality metrics for a second plurality of medical images acquired at the second imaging facility; comparing, by the central computer system, the first set of quality metrics to the second set of quality metrics; based on the comparison of the first set of quality metrics to the second set of quality metrics, generating a benchmark for at least one metric in the first set of quality metrics and the second set of quality metrics; generating a training recommendation based on the generated benchmark and the first set of quality metrics; and sending the generated training recommendation to the first facility.
In an example, the operations further comprise receiving, from the first imaging facility, a subsequent set of quality metrics for a plurality of medical images acquired at the first facility after the sending of the generated training recommendation; comparing the subsequent set of quality metrics to the first set of quality metrics; and based on the comparison of the subsequent set of quality metrics to the first set of quality metrics, generating an effectiveness rating for the generated training recommendation. In another example, the operations further comprise receiving, from the first imaging facility, a subsequent set of quality metrics for a plurality of medical images acquired at the first facility after the sending of the generated training recommendation; comparing the subsequent set of quality metrics to the first set of quality metrics to determine a trend for at least one quality metric; and based on the determined trend for the at least one quality metric, generating a trend warning. In yet another example, the trend warning is based on a rate of the determined trend. In still yet another example, the operations further comprise providing the first set of quality metrics and the second set of quality metrics as inputs to an unsupervised machine learning algorithm to identify additional patterns within the sets of quality metrics. In another example, the first set of quality metrics are received via a web application managed by the central computer system and the training recommendation is sent via the web application.
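As a non-limiting sketch of the rate-based trend warning above, the following Python fragment derives a crude trend from two reporting periods. The rate threshold and the returned field names are assumed values for illustration; the disclosure does not prescribe a particular trend formula.

```python
def detect_trend(previous_scores, subsequent_scores, rate_threshold=-0.05):
    """Generate a trend warning when a quality metric declines too quickly.

    The trend is approximated as the change in mean score between two
    reporting periods; a warning issues when the rate of decline exceeds
    the (assumed) rate_threshold, mirroring the rate-based example above.
    """
    rate = (sum(subsequent_scores) / len(subsequent_scores)
            - sum(previous_scores) / len(previous_scores))
    return {"trend_warning": rate < rate_threshold, "rate": round(rate, 3)}

print(detect_trend([0.85, 0.86, 0.84], [0.78, 0.75, 0.76]))
# -> {'trend_warning': True, 'rate': -0.087}
```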
In a further example, the set of operations further comprise providing a dashboard via a web application to the first facility and the second facility. In still another example, the dashboard displays quality metrics received from the first facility compared to the benchmark. In still yet another example, receiving the first set of quality metrics for a plurality of medical images includes receiving identification information for the plurality of medical images.
In another aspect, the technology relates to a computer-implemented method for improving medical imaging procedures. The method includes receiving patient positioning scores from a technician at a first facility; receiving patient positioning scores from a technician at a second facility; comparing a threshold to the received patient positioning scores for the technician at the first facility and the technician at the second facility; based on the comparison, determining that the patient positioning scores for the technician at the first facility are below the threshold; and based on the patient positioning scores for the technician at the first facility being below the threshold, generating a recommendation for a corrective action for the technician.
In an example, the method further includes, based on the comparison, determining that the patient positioning scores for the technician at the second facility are above the threshold; and based on the patient positioning scores for the technician at the second facility being above the threshold, generating a report of compliance with federal regulations. In another example, the method includes comparing the patient positioning scores from the technician at the first facility to the patient positioning scores from the technician at the second facility to score the technicians relative to each other.
To the accomplishment of the foregoing and related ends, certain illustrative aspects are described herein in connection with the following description and the annexed drawings. These aspects are indicative of the various ways in which the principles disclosed herein can be practiced and all aspects and equivalents thereof are intended to be within the scope of the claimed subject matter. Other advantages and novel features will become apparent from the following detailed description when considered in conjunction with the drawings.
Techniques for breast imaging patient motion compensation, compression evaluation, and positioning evaluation are described. An imaging system may include an imaging detector to capture an image of human tissue, such as breast tissue or other soft tissue, and a compression paddle situated apart from the imaging detector to compress the human tissue between the compression paddle and the imaging detector. In one embodiment, a force sensor may generate a force signal indicating a measure of force applied to the human tissue. A movement detection circuit may filter a movement signal from the force signal indicating a measure of movement of the compressed human tissue. A movement analysis module may determine that the movement signal is beyond a movement threshold. An image correction module may perform a corrective action based upon the determination that the movement signal is beyond the movement threshold. In another embodiment, other types of sensors may be used, which may be disposed in a grid or around the periphery of the compression paddle.
As used herein, corrective actions may include actions to correct an image, generate an image while minimizing motion artifacts, generate an audio or visual indication that motion has been detected, and/or other actions described below in response to detection of motion during a procedure. By way of example and not limitation, corrective actions may include the determination and display of a movement score on a display device, display of an alert on a display device indicating that a movement threshold has been exceeded, triggering a visual indicator of the imaging system, terminating or modifying an imaging sequence or imaging protocol or image acquisition, delaying capture of the image of human tissue until the movement threshold is no longer exceeded, and/or synchronizing an image capture with repetitive movement. Movement scores for all images taken by a particular technologist may be combined to create a positioning score for the technologist. The movement scores may be compared to those of other technologists in the same facility or in other facilities. The technologist score may be compared to a threshold to determine compliance. A facility score may be compared to other facilities and to a threshold score to determine compliance. A report may be generated showing positioning scores for the technologist, the facility, and compliance over time. A retrospective and prospective approach allows the facility to identify the root cause of positioning, noise, artifact, compression, and similar issues at the individual level. A particular technician can be identified with this approach so that his or her technique can be understood and imaging performance improved. Other embodiments are described and claimed.
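The following Python sketch illustrates, under stated assumptions, how per-image movement scores might be combined into a per-technologist positioning score and checked against a compliance threshold. The averaging formula and the 0.8 threshold are hypothetical; the disclosure does not mandate either.

```python
def positioning_score(movement_scores):
    """Combine per-image movement scores into a single positioning score.

    A simple mean is used here; any other aggregation could be substituted.
    """
    return sum(movement_scores) / len(movement_scores)

def compliance_report(scores_by_technologist, threshold=0.8):
    """Rank technologists and flag those falling below a compliance threshold."""
    rows = [(tech, positioning_score(scores), positioning_score(scores) >= threshold)
            for tech, scores in scores_by_technologist.items()]
    return sorted(rows, key=lambda row: row[1], reverse=True)

facility = {"tech_a": [0.90, 0.85, 0.88], "tech_b": [0.70, 0.65, 0.72]}
for tech, score, compliant in compliance_report(facility):
    print(tech, round(score, 2), "compliant" if compliant else "needs review")
```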
With general reference to notations and nomenclature used herein, the detailed descriptions which follow may be presented in terms of program procedures executed on a computer or network of computers. These procedural descriptions and representations are used by those skilled in the art to most effectively convey the substance of their work to others skilled in the art.
A procedure is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. These operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic or optical signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It proves convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. It should be noted, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to those quantities.
Further, the manipulations performed are often referred to in terms, such as adding or comparing, which are commonly associated with mental operations performed by a human operator. No such capability of a human operator is necessary, or desirable in most cases, in any of the operations described herein which form part of one or more embodiments. Rather, the operations are machine operations. Useful machines for performing operations of various embodiments include general purpose digital computers or similar devices.
Various embodiments also relate to apparatus or systems for performing these operations. This apparatus may be specially constructed for the required purpose or it may comprise a general purpose computer as selectively activated or reconfigured by a computer program stored in the computer. The procedures presented herein are not inherently related to a particular computer or other apparatus. Various general purpose machines may be used with programs written in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these machines will appear from the description given.
In an embodiment, each module of imaging system 100 may comprise without limitation an imaging system, mobile computing device, a smart phone, or a desktop computer, or other devices described herein. In various embodiments, imaging system 100 may comprise or implement multiple components or modules. As used herein the terms “component” and “module” are intended to refer to computer-related entities, comprising either hardware, a combination of hardware and software, software, or software in execution. For example, a component and/or module can be implemented as a process running on a processor, such as CPU 120, a hard disk drive, multiple storage drives (of optical and/or magnetic storage medium), an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a component and/or module. One or more components and/or modules can reside within a process and/or thread of execution, and a component and/or module can be localized on one computer and/or distributed between two or more computers as desired for a given implementation. The embodiments are not limited in this context.
The various devices within system 100, and components and/or modules within a device of system 100, may be communicatively coupled via various types of communications media as indicated by various lines or arrows. In various embodiments, the various modules and storages of system 100 may be organized as a distributed system. A distributed system typically comprises multiple autonomous computers that communicate through a computer network. It is worthy to note that although some embodiments may utilize a distributed system when describing various enhanced techniques for data retrieval, it may be appreciated that the enhanced techniques for data retrieval may be implemented by a single computing device as well. The embodiments are not limited in this context.
In an embodiment, imaging module 102 may include an imaging source 106 and a detector 108, which may be used to perform breast imaging (2D, tomosynthesis, computed tomography, ultrasound or any combination thereof), and may be an x-ray source and detector in some examples. In other examples, imaging source 106 and detector 108 may be other types of imaging sources and sensors, respectively. For example, in some embodiments imaging module 102 may be configured to perform breast imaging, such as x-ray mammography, tomosynthesis, computed tomography, and/or ultrasound. Tomosynthesis is a method for performing high-resolution limited-angle tomography at radiographic dose levels. While mammography is used as an exemplary embodiment through the description, it can be appreciated that the techniques described herein may be applicable to other procedures in which imaging of human tissue susceptible to movement may occur.
Imaging source 106 may be configured to expose human tissue, such as breast tissue, to x-rays, which may be detected by detector 108. Detector 108 may be configured to respond to the influence of incident x-rays over a wide range. Detector 108 may be configured to absorb x-rays, produce an electronic signal, digitize the signal, and store the results in one of storage 104 and/or database 122. The output image may be saved as a two-dimensional matrix, where each element represents the x-ray transmission corresponding to a path through the breast tissue. Three-dimensional images and matrices may be generated in some embodiments, depending on the imaging modality, such as tomosynthesis, computed tomography, and the like. The image may be digitally processed such that when it is displayed on a display device or printed on laser film, it will illustrate the key features required for diagnosis. Such diagnostic images may be stored in storage 104 so that they may be viewed on a user interface of display 124.
In an embodiment, images may also be archived in image database 122. In this manner, patient records may be maintained and past images may be used to evaluate detected movement when compared to new images. In an exemplary embodiment, an image correction module, described herein, may refer to archived images containing common elements (e.g., still calcifications for the same tissue of the same patient) and compare them to a current image (which may include blurry calcifications for the same tissue of the same patient). Such an analysis, combined with the techniques described herein, may be used to detect and/or correct motion artifacts within an image.
Imaging system 100 may include a force sensor 110, which may be contained within a compression paddle of imaging system 100 (not shown in
Imaging system 100 may include a movement detection circuit 112, configured to receive an electronic force signal from force sensor 110 and filter a movement signal from the received force signal. In some embodiments, the received force signal may include a low frequency compression force signal (e.g., 0 (DC) to <5 Hz), which may be tapped and processed in parallel using movement detection circuit 112. Movement detection circuit 112 may include one or more components to process and filter the force signal, including a DC signal block, such as a blocking capacitor to remove the DC and low frequency components of the force signal, leaving a higher frequency (AC) component, referred to herein as a movement signal. One or more analog circuits may filter and apply gain to the higher frequency (AC) signal components to improve signal-to-noise ratio, if needed. The resulting movement signal may include motion artifacts from the original force signal. As described later, one or more modules, such as movement analysis module 114, may include a digital processing unit and corresponding software to analyze the output from movement detection circuit 112.
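The analog DC-blocking and gain stages described above have a straightforward digital counterpart. The following sketch, offered as an illustration only, high-pass filters a digitized force signal to isolate the higher-frequency movement component; the sampling rate, cutoff frequency, and simulated signal are assumed values.

```python
import numpy as np
from scipy import signal

fs = 100.0    # assumed sampling rate of the digitized force signal, Hz
cutoff = 5.0  # assumed cutoff, matching the <5 Hz compression band above

# A second-order-sections high-pass filter plays the role of the DC block.
sos = signal.butter(2, cutoff, btype="highpass", fs=fs, output="sos")

t = np.arange(0.0, 10.0, 1.0 / fs)
compression = 80.0 + 0.5 * t                # slowly varying compression force (near DC)
motion = 0.8 * np.sin(2 * np.pi * 8.0 * t)  # simulated 8 Hz motion artifact
force_signal = compression + motion

movement_signal = signal.sosfilt(sos, force_signal)  # DC removed, motion retained
```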
In an embodiment, a movement analysis module 114 may include one or more analog circuits, such as a tuned differentiator, to detect movement of human tissue compressed within imaging system 100 using a received movement signal from movement detection circuit 112. In some embodiments, movement analysis module 114 may include hardware and/or software modules configured to accept the movement signal from movement detection circuit 112, and detect tissue movement caused by the patient. An exemplary logic flow illustrating movement detection by movement analysis module 114 is set forth within
Image correction module 116 may be configured to receive a determination from movement analysis module 114 that movement has been detected. The determination may include data indicating a movement time and movement level in some embodiments, and the determination may be used to determine a corrective action to be taken. Techniques described herein strive to improve image quality, even in situations where movement is detected, reduce patient radiation exposure when possible, and reduce the time required for patients to undergo imaging procedures. Exemplary corrective actions are described herein with respect to
A database of movement criteria 118 may be used by image correction module 116 to determine the proper corrective action based upon various determinations by movement analysis module 114. For example, criteria within movement criteria database 118 may include movement thresholds, time thresholds for delay, image quality criteria, thresholds indicating the maximum number of images that can be deleted from an image sequence due to detected movement, and other criteria necessary to determine and take corrective actions. In an example, image correction module 116 may include hardware and/or software configured consistent with the techniques described herein to take one or more corrective actions when movement exceeding a threshold has been detected. As described further with respect to
In some embodiments, artifact-based image detection of patient motion, as described in the '180 patent, may be combined with the information from the force sensor 110 and the movement detection circuit 112 in the movement analysis module 114. In one example, the movement analysis module 114 may correlate the information received from the movement detection circuit with the artifact-based image detection.
In an embodiment, display device 121 may include a user interface configured to receive and display an image along with information with respect to detected movement and any corrective actions taken in response. In an embodiment, display 124 may be configured to display an alert or movement score (
Compression paddle 206 may be mounted to an arm, itself connected to a frame connected to a body of the imaging system 200. Compression paddle 206 may be lowered onto human tissue during an imaging procedure. Certain imaging procedures, such as mammography, may require compression of human tissue between compression paddle 206 and another surface, such as the surface of detector 214, which may be consistent with detector 108, described above with respect to
Force sensor module 208 may be contained within compression paddle 206, and may detect force 212 imparted on breast 210, which is placed between compression paddle 206 and imaging detector 214. The detected force may represent a measurement of force applied superior to the breast via the compression paddle 206 and/or via the imaging detector 214 "top" surface. Additionally or separately, a force sensor module may be incorporated into the imaging detector 214 component. In this configuration, the force sensor module incorporated into the imaging detector 214 may operate in the same manner as the force sensor module 208 and may measure the DC and AC compression signals applied by the compression paddle 206 upon the human tissue (breast 210) that is placed between the compression paddle 206 and the surface of the imaging detector 214. As set forth above, force sensor 208, or the optional force sensor incorporated into the imaging detector 214, may include a strain gauge, piezoelectric sensor, load cell, or other sensor capable of measuring the force applied to human tissue compressed between a compression paddle and an opposite detector plane. In some embodiments, force sensor 208, or the optional force sensor incorporated into the imaging detector 214, may include an analog filter, gain circuits for signal conditioning, and/or an analog-to-digital converter for signal capture. The output of force sensor 208, or the optional force sensor incorporated into the imaging detector 214, may be an electrical signal representative of a force level, which may be filtered or converted by one or more circuits or modules described herein into a value that indicates movement. This movement signal, when compared to other measurements over time, may indicate movement of the patient undergoing an imaging procedure.
In an embodiment, the described force sensor modules may include one or more circuitry components comprising a movement detection circuit, such as movement detection circuit 112. In an embodiment, movement detection circuit 216 may be implemented separate from force sensor 208, and may receive a signal therefrom. As described with respect to
Movement analysis module 218, which may be implemented in hardware and/or software, may be configured to determine whether a received movement signal has exceeded a movement threshold. In some embodiments, the movement analysis module 218 may be separate from force sensor 208, and may be within the optional force sensor incorporated into the imaging detector 214, the compression paddle 206, or another portion of imaging system 200, as illustrated. If a movement threshold has been exceeded, movement analysis module may communicate that determination to image correction module 220, which may be configured to take corrective action, as described herein with respect to
As discussed above, patient motion during a breast imaging procedure can adversely affect imaging quality and therefore the diagnostic value of the resultant images. Detecting and/or measuring motion and correction, however, is difficult due at least in part to the fact that the breast is a non-rigid object. Accordingly, motion patterns of the breast during the imaging procedure may be complex in both time and space. For instance, some portions of the breast may move differently from other portions. As a result, image quality may change for different regions of a breast image. For a modality such as tomosynthesis, the motion or movement may occur between acquiring projections and/or during exposure of one or more of the projections.
Proper compression and positioning of the breast during the imaging procedure also affects image quality. Inadequate compression of the breast may increase the likelihood of unwanted results. For example, inadequate compression may increase the likelihood of motion, which reduces image quality. As another example, inadequate compression may increase the likelihood of overlapping tissue which may make it more difficult to detect cancerous lesions in a resultant image. Thus, there is a need to more accurately detect motion and compression in space and time, which can prove to be useful input data to help correct and enhance image quality during breast imaging procedures.
One or more sensors 1408 are disposed on or within the compression paddle 1406 and the detector 1414. The one or more sensors 1408 may comprise or communicate with a sensor module which may detect motion of the breast and may also be used to detect or analyze compression and positioning of the breast. In one example, the sensors 1408 may include one or more photo sensors, infrared sensors, and/or ultrasound or ultrasonic sensors. The motion detected by the sensors 1408 may be based on reflected sonic signals and/or reflected light signals depending on the types of sensors 1408 implemented. For example, the photo sensors may include cameras to capture optical images of the breast when it is in a compressed and/or uncompressed state. Similarly, the infrared sensors may be utilized to produce a three-dimensional image or depth map of the breast that may be used to determine the three-dimensional location of the exterior of the breast at different points in time. The ultrasound or ultrasonic sensors may also be used to detect the three-dimensional location of the exterior of the breast. In some examples, the ultrasound or ultrasonic sensors may also be utilized to image the interior of the breast. With the interior of the breast imaged, landmarks within the breast may be identified and the locations of those landmarks may be tracked in three-dimensional space at different points in time.
In some embodiments, the sensors 1408 may be placed in a grid pattern on or within the compression paddle 1406 and the detector 1414. In other examples, the sensors 1408 may be disposed around the periphery of the compression paddle 1406 and the detector 1414. The location and pattern of the sensors may be based on the types of sensors and the physical properties of the compression paddle 1406 and/or the detector 1414. For example, if the compression paddle 1406 is optically opaque, the photo sensors may be placed in a position where they have a line of sight to the exterior of the breast that is not blocked by the compression paddle 1406. Similarly, for some ultrasound or ultrasonic sensors, an air gap between the sensor and the breast may be undesirable. As such, the ultrasonic sensors may be placed in a location where there is no air gap between the ultrasonic sensor and the breast. Other solid surfaces, such as a portion of the compression paddle 1406 and/or detector 1414, may still be located between the ultrasonic sensor and the compressed breast.
By disposing multiple sensors in a pattern, a more detailed understanding of motion of the breast may be obtained. It is appreciated that movement of the breast may not be uniform. For example, some areas of the breast may move more than others. Use of multiple sensors allows the imaging system 1400 to create a motion map that may be capable of visually showing the location of movement throughout the surface of the breast. In other examples, the motion map may not be a visual representation but rather a set of data indicating the locations of the breast that moved as well as the magnitude and direction of the breast movement at each location. For instance, the motion map may be a set of motion vectors for different positions in three-dimensional space. By having a more complete understanding of the location of motion of the breast, the imaging system can determine whether the motion may have had a negative effect on the image obtained. In addition, having additional sensors allows the imaging system to obtain other information such as the amount of contact with the breast, as further discussed below, to determine breast positioning and compression information.
The sensors 1408 that may be incorporated into the imaging detector 1414 and/or the compression paddle 1406 may include an analog filter, gain circuits for signal conditioning, and/or an analog-to-digital converter for signal capture. The output of sensors 1408 may be electrical signals representative of motion and/or spatial data representative of the location of the breast, which may be filtered or converted by one or more circuits or modules described herein into a plurality of spatial information or data 1416. The spatial information may be combined to create a motion map 1418a. The motion map 1418a takes spatial information from each of the sensors 1408 to create a relative representation of motion. The motion map 1418a may show that some areas of the breast include more motion than others. The motion map 1418a may be a visual representation of the spatial information in which some colors (e.g., red) represent a higher amount of motion and other colors represent a moderate (e.g., yellow) or low (e.g., green) amount of motion. The relative representation of motion may be determined based on comparison of the spatial information to a threshold or a lookup table representing various levels of motion. In other examples, the motion map 1418a may not include a visual representation but rather a set of data indicating the locations of the breast that moved as well as the magnitude and direction of the breast movement at each location. For instance, the motion map 1418a may be a set of motion vectors for different positions in three-dimensional space.
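As a minimal sketch of how a motion map might be assembled from the spatial information 1416, the following Python fragment computes per-sensor motion vectors and color codes their magnitudes. The grid of sample points and the green/yellow/red break points are assumptions made for illustration.

```python
import numpy as np

def motion_map(spatial_t0, spatial_t1):
    """Build a simple motion map from two grids of 3-D surface samples.

    Each input is an (N, 3) array of x/y/z locations reported by the
    sensor grid at one point in time; rows correspond to the same sensor.
    Returns per-location motion vectors and their magnitudes.
    """
    vectors = spatial_t1 - spatial_t0             # displacement per sensor
    magnitudes = np.linalg.norm(vectors, axis=1)  # amount of motion per location
    return vectors, magnitudes

def color_code(magnitudes, low=0.5, high=2.0):
    """Map magnitudes to the green/yellow/red levels described above.

    The low/high break points (in mm) are assumed for illustration.
    """
    return ["green" if m < low else "yellow" if m < high else "red"
            for m in magnitudes]

t0 = np.array([[0.0, 0.0, 40.0], [10.0, 0.0, 41.0], [20.0, 0.0, 42.0]])
t1 = np.array([[0.1, 0.0, 40.0], [10.0, 1.5, 41.0], [22.5, 0.0, 42.0]])
vectors, magnitudes = motion_map(t0, t1)
print(list(zip(magnitudes.round(2), color_code(magnitudes))))
```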
In addition, the motion map 1418a may be created for each of the tomosynthesis projections or slices created. For example,
The information or data from the motion map 1418a may be provided as input into an image correction module or algorithm 1420a. In some examples, the motion map 1418a may be utilized to generate a correction map. That correction map may effectively be an inverse of the motion map 1418a. For instance, the correction map may indicate how pixels in the image should be adjusted based on the detected motion. The image correction module or algorithm 1420a may be similar to the image correction modules 116 and 220 described above with respect to
The spatial information 1416 may also be used to create a contact map 1418b. An example of a contact map 1418b is depicted in
The contact map 1418b shows or indicates the level of contact with the breast. The contact map 1418b can be used to determine or define a roll-off region 1434, which is the region where the breast is uncompressed. The roll-off region 1434 may be the area between the uncompressed tissue line 1432 and the skin line 1430 of the breast. The size of the roll-off region 1434 may be represented by the area between the uncompressed tissue line 1432 and the skin line 1430. The size of the roll-off region 1434 may also be represented by a distance between the uncompressed tissue line 1432 and the skin line 1430. The distance may be the maximum, minimum, and/or average distance between the uncompressed tissue line 1432 and the skin line 1430. The location of the uncompressed tissue line 1432 and the size of the roll-off region 1434 may also be useful in special image processing techniques for uncompressed versus compressed breast areas. The location of the uncompressed tissue line 1432 with respect to the breast profile 1430 may also indicate how adequate the compression is, which may in turn be used in determining the adequacy of positioning of the breast.
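A hedged sketch of one way the size of the roll-off region could be quantified from the two contour lines is shown below. Matching the lines point-by-point and reporting in millimeters are simplifying assumptions, not requirements of the disclosure.

```python
import numpy as np

def roll_off_size(skin_line, uncompressed_line):
    """Characterize the roll-off region between the two contour lines.

    Both inputs are (N, 2) arrays of matched sample points along the skin
    line and the uncompressed tissue line (matching by row is an assumption
    made to keep the sketch short). Returns max/min/mean separation.
    """
    distances = np.linalg.norm(skin_line - uncompressed_line, axis=1)
    return {
        "max_mm": float(distances.max()),
        "min_mm": float(distances.min()),
        "mean_mm": float(distances.mean()),
    }

skin = np.array([[0.0, 0.0], [5.0, 1.0], [10.0, 1.5]])
uncompressed = np.array([[0.0, 4.0], [5.0, 6.5], [10.0, 5.5]])
size = roll_off_size(skin, uncompressed)  # feeds the compression adequacy analysis
print(size)
```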
Additional positioning information of the breast may also be determined from the data produced by the sensors 1408. For example, in a mediolateral oblique (MLO) compression, the sensors may be used to determine whether the pectoral muscle is properly positioned such that it will be imaged during the imaging procedure. The spatial data produced by the sensors 1408 may also be used to assess the alignment of the nipple, such as by determining the posterior nipple line (PNL). Other positioning criteria may also be determined from the spatial data produced by the sensors 1408.
Returning to
Included herein is a set of flow charts representative of exemplary methodologies for performing novel aspects of the disclosed architecture. While, for purposes of simplicity of explanation, the one or more methodologies shown herein, for example, in the form of a flow chart or flow diagram, are shown and described as a series of acts, it is to be understood and appreciated that the methodologies are not limited by the order of acts, as some acts may, in accordance therewith, occur in a different order and/or concurrently with other acts from that shown and described herein. For example, those skilled in the art will understand and appreciate that a methodology could alternatively be represented as a series of interrelated states or events, such as in a state diagram. Moreover, not all acts illustrated in a methodology may be required for a novel implementation.
At operation 1506, motion data may be generated based on the first set of spatial data and the second set of spatial data. For example, by comparing the first spatial data to the second spatial data, a difference in location of the breast or portions of the breast may be determined. That change in location corresponds to motion. The generated motion data may indicate an amount of motion that occurred for the breast or a portion of the breast. Generating motion data may also include generating a motion map at operation 1508. The motion map, for example, may be a visual representation of the spatial information in which some colors represent a higher amount of motion and other colors represent a moderate or low amount of motion. At operation 1510, a correction map may be generated from or based on the motion map. That correction map may effectively be an inverse of the motion map. For instance, the correction map may indicate how pixels in the image should be adjusted based on the detected motion.
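Purely for illustration, the following sketch generates motion data from two sets of spatial data and derives a correction map as its inverse, in the sense of operations 1506 through 1510. The per-pixel application of the correction is simplified to a single global shift, and all names and values are hypothetical.

```python
import numpy as np
from scipy import ndimage

def motion_data(spatial_t0, spatial_t1):
    """Operation 1506: the difference of two spatial data sets is the motion."""
    return spatial_t1 - spatial_t0  # (N, 2) displacement vectors

def correction_map(motion_vectors):
    """Operation 1510: the correction map as effectively the inverse of the motion map."""
    return -motion_vectors

def apply_correction(image, correction_vector):
    """Shift a projection by one global correction vector (a simplification;
    a real correction map would adjust pixels locally). order=1 is bilinear."""
    return ndimage.shift(image, shift=tuple(correction_vector), order=1, mode="nearest")

t0 = np.array([[12.0, 30.0], [40.0, 31.0]])  # breast landmark positions, frame 1
t1 = np.array([[13.5, 29.5], [41.5, 30.5]])  # same landmarks, frame 2
motion = motion_data(t0, t1)
mean_correction = correction_map(motion).mean(axis=0)
projection = np.zeros((64, 64))              # stand-in for an acquired projection
corrected = apply_correction(projection, mean_correction)
```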
The motion data generated in operation 1506 may also be utilized for motion correction or tomosynthesis reconstruction, as shown in
The motion data for each of the projections or medical images may also be compared to a predetermined motion threshold at operation 1522. For example, an amount of motion that occurred between a first projection and a second projection may be compared to a motion threshold. If the amount of motion that occurred between the projections is greater than a threshold, the projection or medical image may be discarded at operation 1526. For example, the second projection may be discarded if the amount of motion that occurred between the first and second projection is greater than the predetermined motion threshold. The thresholds described herein may be dynamic or predetermined. For example, the thresholds may be dynamically determined by an imaging system during the image capture process based, at least in part, on a detected image quality assessment taken in near real-time. In other embodiments, a movement threshold may be predetermined and stored within an imaging system. The predetermined thresholds discussed herein may be a setting or a value that is stored or accessed by the medical imaging system or a portion thereof. For instance, the predetermined thresholds may be set by a medical professional, be provided with the imaging system, or accessed from a remote source. The thresholds may be based on values, percentages, ratios, or other types of thresholds.
If, however, the motion amount is not greater than the predetermined threshold, the projection or image is retained at operation 1524. The images or projections that are retained may then be used to generate a tomosynthesis reconstruction in operation 1528. In some examples, images that are discarded in operation 1526 may be regenerated through a synthesis or interpolation of other acquired projections. Those synthesized projections may then be used in generating the tomosynthesis reconstruction in operation 1528.
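By way of example only, the logic of operations 1522 through 1528 might look like the following sketch, where the motion threshold and projection identifiers are hypothetical.

```python
def triage_projections(projections, motion_amounts, motion_threshold=1.0):
    """Retain or discard tomosynthesis projections based on inter-projection motion.

    motion_amounts[i] is the motion measured between projection i and its
    predecessor; motion_threshold is an assumed illustrative value.
    """
    retained, discarded = [], []
    for proj, motion in zip(projections, motion_amounts):
        (discarded if motion > motion_threshold else retained).append(proj)
    return retained, discarded

projections = ["p1", "p2", "p3", "p4"]
motion = [0.2, 1.8, 0.4, 0.3]  # p2 moved too much between exposures
retained, discarded = triage_projections(projections, motion)
# Discarded projections could be re-synthesized from neighboring projections
# before the tomosynthesis reconstruction of operation 1528, as described above.
```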
Returning to
As shown in
If the compression values and/or the positioning values do exceed the thresholds (or are outside of the tolerance thresholds), then a notification or alert may be generated in operation 1534. The notification may be a visual notification, such as a notification displayed on a screen or indicated by illumination of a light. The notification may also be an audible notification played through a speaker or other sound-making device. The notification may indicate that the compression is inadequate or improper and/or that the breast is improperly positioned. The notification may further indicate a reason as to why the compression was inadequate, such as too large of a roll-off area, and/or why the breast was improperly positioned, such as an improper PNL value. In addition, the notification may provide guidance to the medical professional or technician as to how the breast should be repositioned. At operation 1536, the breast may be repositioned or recompressed. Once the breast is repositioned or recompressed, the imaging procedure continues and medical images are acquired at operation 1532. In some examples, upon repositioning and/or recompressing the breast at operation 1536, method 1500 flows back to the start where spatial data is recaptured at operation 1502. The operations determining whether the positioning and/or compression is proper may then be repeated for the repositioned and/or recompressed breast until the breast is determined to be in a proper position and properly compressed.
At 402, a movement analysis module may receive a movement signal from a force sensor and/or movement detection circuit. The movement signal may include motion artifacts indicating that human tissue, currently under compression during an imaging procedure, has moved. Using hardware and/or software components, the received movement signal may be evaluated to isolate data indicating movement and a value may be assigned indicating a movement level. In an embodiment, a baseline movement signal may be first evaluated, indicating a baseline movement value, or a baseline movement value may be stored within an imaging system. Subsequent movement signals may be received and compared to the baseline movement value to identify motion artifacts within the subsequent movement signals.
At 404, the movement analysis module may compare subsequently received movement signals, and any motion artifacts identified therein, to a movement threshold, which may be predetermined and stored within a non-transitory computer-readable storage medium. In some embodiments, thresholds may be dynamically determined by an imaging system during the image capture process based, at least in part, on a detected image quality assessment taken in near real-time. In other embodiments, a movement threshold may be predetermined and stored within an imaging system.
At 406, the movement analysis module may determine whether the received movement signal has exceeded the movement threshold and, at 408, the movement analysis module may communicate the determination to an image correction module, which is discussed in more detail below. The determination, in some embodiments, may include an indication that movement has been detected, a movement value, a timestamp, a frame identifier, or other information that may be necessary for an image correction module to take appropriate corrective measures based upon the detected movement.
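A minimal sketch of the determination communicated at 408 follows. The baseline comparison and the field names mirror the description above, but the specific values and structure are assumptions for illustration.

```python
import time

def analyze_movement(movement_value, baseline, threshold, frame_id):
    """Compare a movement value against a baseline and a movement threshold.

    Returns the determination communicated to the image correction module;
    the dictionary keys are illustrative, not taken from the claims.
    """
    deviation = abs(movement_value - baseline)
    return {
        "movement_detected": deviation > threshold,
        "movement_value": deviation,
        "timestamp": time.time(),
        "frame_id": frame_id,
    }

determination = analyze_movement(movement_value=0.9, baseline=0.2,
                                 threshold=0.5, frame_id=12)
print(determination)
```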
At 502, an image correction module may receive a determination that movement has been detected. In some embodiments, any movement may be communicated to the image correction module. In other embodiments, only movement that exceeds a threshold, as described herein, may be communicated to the image correction module. The determination, in some embodiments, may include an indication that movement has been detected, a movement value, a timestamp, a frame identifier, or other information that may be necessary for an image correction module to take appropriate corrective measures based upon the detected movement.
At 504, the image correction module may determine a type of movement based upon one or more received movement determinations. For example, a movement may be categorized as a regular movement when it is repetitive and generally within a regular time interval. This type of movement may indicate a patient is breathing, or moving in a regular fashion. In another example, movement may be categorized as irregular. A single irregular movement may indicate a patient has shifted positions, or sneezed, for example. In yet another example, movement may be categorized as continuously irregular. A determination of movement type may be based, in part, on a movement value and/or timestamp, for example. In at least one example, a determination of the movement may be that the movement is localized to one or more tomosynthesis slices.
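One plausible way to distinguish the regular and irregular movement types described above is to examine the spread of intervals between detected movement events, as in the following sketch; the tolerance value and event timestamps are assumed.

```python
import numpy as np

def classify_movement(event_times, tolerance=0.15):
    """Classify movement events as 'regular' or 'irregular'.

    event_times: timestamps (seconds) at which movement peaks were detected.
    If the intervals between events are nearly constant (relative spread
    below the assumed `tolerance`), the movement is treated as repetitive,
    e.g., breathing or heartbeat; otherwise it is treated as irregular.
    """
    if len(event_times) < 3:
        return "irregular"  # too few events to establish a rhythm
    intervals = np.diff(np.asarray(event_times, dtype=float))
    spread = intervals.std() / intervals.mean()
    return "regular" if spread < tolerance else "irregular"

print(classify_movement([0.0, 1.0, 2.05, 3.0, 4.02]))  # regular (breathing-like)
print(classify_movement([0.0, 0.4, 2.9, 3.1]))         # irregular (repositioning)
```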
At 506, when a regular movement that is repetitive and generally within a regular time interval is detected, the image correction module may configure the image capture to be synchronized with the regular movement. In this manner, image capture may be performed during a time period in which movement is not detected, and skipped during a time period in which movement is detected. The synchronized image sequence may be generated at 512, and may include only images in which movement has not been detected, or in which detected movement is below a threshold amount.
At 508, when irregular movement is detected, the image correction module may delay image capture for a period of time, allowing the movement to stop so an image is not negatively impacted. As described herein, some embodiments may flag images captured during a movement, and those images may be removed from an imaging sequence used to generate an image.
At 510, if a determination is made that movement is localized to one or more tomosynthesis slices, the slices associated with movement above a threshold may be removed or cancelled from the tomosynthesis stack.
At 514, if irregular movements continue during the delay period, the delay may be extended until movement stops. However, since in some cases the patient may be exposed to x-ray radiation during the delay, a time period threshold may be set for which the image capture may be canceled if the delay lasts beyond the threshold. Thus, an image may be generated at 512 if the delay is within the time threshold, and the image capture may be canceled at 516 if the delay period extends beyond the time threshold. In this manner, an imaging system may be able to compensate for some movement, and generate higher quality images by delaying capture until movement is no longer detected, while at the same time canceling an image and limiting patient radiation exposure when a satisfactory image cannot be obtained due to excessive irregular movement.
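The delay-with-timeout behavior of 508, 514, and 516 could be sketched as follows; the timeout, poll interval, and callable interfaces are hypothetical simplifications rather than features of the disclosure.

```python
import time

def delayed_capture(movement_detected, capture, timeout_s=10.0, poll_s=0.25):
    """Delay image capture until movement stops, cancelling on timeout.

    movement_detected: callable returning True while motion persists;
    capture: callable performing the acquisition. The timeout limits
    patient exposure when a satisfactory image cannot be obtained.
    """
    deadline = time.monotonic() + timeout_s
    while movement_detected():
        if time.monotonic() >= deadline:
            return "cancelled"  # 516: delay exceeded the time threshold
        time.sleep(poll_s)
    capture()
    return "captured"           # 512: movement stopped within the threshold

still = iter([True, True, False])  # simulated motion that settles
result = delayed_capture(lambda: next(still, False), lambda: print("exposing"))
print(result)
```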
During image generation at 512, certain embodiments may correlate images with detected movement and flag images in which movement was detected. In this manner, images flagged with movement may be removed from a resulting imaging sequence, thus, improving overall image quality despite detecting motion within the imaging procedure. In some cases, many images may be flagged as occurring during movement and the entire imaging sequence may need to be canceled. Based upon a particular procedure, for example, a threshold may be set such that an image correction module may determine whether the process of deleting images may result in a usable image sequence, or if the imaging sequence needs to be canceled due to excessive movement during the imaging procedure.
At 604, a movement detection circuit may filter the received force signal and isolate a movement signal therefrom. The movement signal may indicate a level of force, and in some cases may indicate that a patient has moved during image capture in a manner that is detrimental to the quality of a resulting image. As set forth above, the movement detection circuit may be configured to receive an electronic force signal from a force sensor and filter a movement signal from the received force signal. In some embodiments, the received force signal may include a low frequency compression force signal (e.g., 0 (DC) to <5 Hz), which may be tapped and processed in parallel using the movement detection circuit. Further, the movement detection circuit may include one or more components to process the force signal, including a DC signal block, such as a blocking capacitor to remove the DC and low frequency components of the force signal, leaving a higher frequency (AC) component, referred to herein as a movement signal. One or more analog circuits may filter and apply gain to the higher frequency (AC) signal components to improve signal-to-noise ratio, if needed. The resulting movement signal may include motion artifacts from the original force signal.
At 606, a movement analysis module may determine whether a detected movement is beyond a movement threshold. The movement analysis module may include one or more analog circuits, such as a tuned differentiator, to detect movement of human tissue compressed within an imaging system using a received movement signal from the movement detection circuit. In some embodiments, the movement analysis module may include hardware and/or software modules configured to accept the movement signal from the movement detection circuit, and detect tissue movement caused by the patient. An exemplary logic flow illustrating movement detection by a movement analysis module is set forth within
At 608, when movement is beyond a threshold, an image correction module may perform a corrective action, which may include one or more of a variety of actions that improve image quality and reduce patient exposure to radiation. An image correction module may be configured to receive a determination from movement analysis module that movement has been detected. The determination may include data indicating a movement time and movement level in some embodiments, and the determination may be used to determine a corrective action to be taken, some of which are described with respect to
Additional examples of use of the positioning information may include compliance with Federal Regulations, such as the Mammography Quality Standards Act (MQSA) and the Enhancing Quality Using the Inspection Program or EQUIP initiative. The MQSA requires that the images taken at a facility must comply with certain quality standards. Poor positioning is a factor in most deficiencies and failures of clinical images to meet quality standards. EQUIP requires regular review of images, corrective procedures when clinical images are of poor quality, including a mechanism for providing ongoing feedback to technologists or other designated personnel, and oversight of quality control records. The analytics described above can be used to generate reports of compliance with federal regulations. For example, the report may be automatically generated on a periodic basis that includes information such as the score information for that facility, the number of times corrective procedures were taken, the number of times that corrective measures such as education and quality control measures were recommended and were taken. Such reports can be stored and provided if needed to federal regulators to ensure compliance. Complying with EQUIP, however, requires only annual self-reporting from facilities, which has generally resulted in only a yearly review of imaging quality by facilities. Such a lag in reporting may cause downward trends to go unnoticed and potentially poor imaging procedures to occur. In addition, the metrics may be stored in various formats across a plurality of devices and imaging systems, making it even more difficult for the metrics to be monitored. The present technology resolves these problems among others by being able to continuously aggregate quality metrics across a plurality of facilities and provide access and additional insights to those quality metrics in substantially real time as the quality metrics are aggregated. The quality metrics may also be provided to the central computing system in a standardized format via the web application to help ensure that the same types of metrics are properly aggregated and correlated together. Further, due in part to the real time tracking of metrics, warnings and trainings may be provided based on downward trends in image quality before poor imaging procedures may be implemented by technicians of a facility. Such warnings and trainings may provide for an overall improved imaging process for facilities and lead to better detection of abnormalities such as cancers.
At operation 820, a random sampling of the acquired medical images is taken. The random sampling of images helps reduce potential bias from facilities or technicians selecting images that they believe to be of better quality to artificially inflate their scores or metrics. The random sampling may be drawn from all medical images acquired or from only those medical images that were actually reviewed by radiologists or other medical professionals. The random sampling may be performed automatically through an interface or software provided by the central computing system to the facility, such as through a web application provided by the central computing system. Random sampling can also be done in real time through an algorithm that picks, for example, every third, fourth, or fifth patient based on prior volume and seasonality calculations.
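As a non-limiting illustration, the sketch below shows both sampling strategies described above, assuming image identifiers and a patient stream are available as simple Python sequences; the function names and the sampling fraction are illustrative.

```python
import random

def sample_images(image_ids, fraction=0.05, seed=None):
    """Uniformly sample a fraction of acquired images without replacement."""
    rng = random.Random(seed)
    k = max(1, round(len(image_ids) * fraction))
    return rng.sample(image_ids, k)

def every_nth_patient(patient_stream, interval):
    """Systematic real-time sampling: keep every Nth patient, where the
    interval (e.g., 3, 4, or 5) would come from prior volume/seasonality
    calculations."""
    return [p for i, p in enumerate(patient_stream, start=1) if i % interval == 0]

print(sample_images([f"img{i}" for i in range(100)], fraction=0.05, seed=42))
print(every_nth_patient(["p1", "p2", "p3", "p4", "p5", "p6"], interval=3))  # ['p3', 'p6']
```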
At operation 822, quality metrics are then generated from and/or for the randomly sampled images. The metrics may be automatically generated through patient positioning and/or motion detection algorithms, such as the ones discussed herein and in International Publication No. WO2018/170265, titled “Techniques For Patient Positioning Quality Assurance Prior to Mammographic Image Acquisition,” which is incorporated by reference herein in its entirety. The metrics may also be generated by an interpreting physician (IP). For instance, the interpreting physician may review the medical images that have been randomly selected and provide metrics for those medical images. The metrics from the IP may be input into a web application, interface, or other software provided by the central computing system. The metrics that are provided may include positioning metrics, compression metrics, exposure level metrics, contrast metrics, sharpness metrics, noise metrics, and/or artifact metrics, among other metrics. Each metric may also include additional sub-metrics or scores. For example, the positioning metrics may include sub-metrics such as nipple location, nipple angle, pectoral muscle coverage, inframammary fold visibility, pectoral-nipple line distance, and symmetry between image views. The positioning metrics generally relate to whether sufficient breast tissue is imaged to ensure that cancers or anomalies are not likely to be missed because of inadequate positioning. The compression metrics generally relate to whether compression has been applied in a manner that minimizes the potential obscuring effect of overlying breast tissue and motion artifacts. The exposure level metrics generally relate to whether the exposure level was adequate to visualize breast structures and whether the images were underexposed or overexposed. The contrast metrics generally relate to whether the image contrast permitted differentiation of subtle tissue density differences. The sharpness metrics generally relate to whether the margins of normal breast structures were distinct and not blurred. The noise metrics generally relate to whether noise in the image obscured breast structures or suggested the appearance of structures not actually present. The artifact metrics generally relate to whether artifacts due to lint, processing, scratches, and other factors external to the breast obscured breast structures or suggested the appearance of structures not actually present. The quality metrics may further include metrics or scores based on motion or movement that occurred during imaging of the patient. Such motion or movement metrics may be generated using the sensors and techniques discussed herein. The metrics generated at operation 822 may be stored with the medical image(s) for which the metrics were generated.
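One possible way to record these metrics and sub-metrics alongside an image is sketched below; the field names and score types are illustrative assumptions, not a schema defined by this disclosure.

```python
from dataclasses import dataclass

@dataclass
class PositioningSubMetrics:
    # Illustrative sub-metric scores corresponding to the positioning
    # sub-metrics listed above.
    nipple_location: float
    nipple_angle: float
    pectoral_muscle_coverage: float
    inframammary_fold_visibility: float
    pectoral_nipple_line_distance: float
    view_symmetry: float

@dataclass
class ImageQualityMetrics:
    # Illustrative per-image record stored with the medical image.
    image_id: str
    positioning: PositioningSubMetrics
    compression: float
    exposure_level: float
    contrast: float
    sharpness: float
    noise: float
    artifact: float
    motion: float
    source: str = "IP"  # e.g., "algorithm", "IP", or "LIP"
```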
At operation 824, a lead interpreting physician (LIP) may generate additional metrics for the medical images for which metrics were generated in operation 822. The LIP may generate metrics for all of those images or a subset of those images. The subset may itself be a randomized subset of the images. The LIP may in some cases modify or confirm the metrics generated in operation 822. In other examples, the LIP may generate additional metrics for the medical images. Any metrics generated by the LIP may also be stored with the medical images. At operation 826, the metrics for the medical images are sent from the facility to the central computing system. The medical images may also be sent with the metrics in some examples. In other examples, the metrics and at least a portion of the identification information may be sent correlated to one another, such as in the same report or otherwise linked in an exported database.
At operation 834, the first set of quality metrics is compared to the second set of quality metrics. For example, once the first and second sets of quality metrics are received, the central computing system may cause the two sets of metrics to be compared to one another. In other examples where additional metrics are received from additional facilities, those metrics may also be compared to one another. In some examples, individual metrics from the first set may be compared to the corresponding individual metrics of the other set. For instance, positioning metrics for the first facility may be compared to positioning metrics of the second facility. At operation 836, based on the comparison of the first set of quality metrics to the second set of quality metrics, a benchmark is set for at least one metric in the first set of quality metrics and the second set of quality metrics. The benchmark may be a standard or point of reference against which the quality metrics may be assessed. For example, based on the comparison of the aggregated quality metrics from the plurality of facilities, the average patient positioning score may be determined. That average patient positioning score may then be used as a benchmark for an individual facility to determine how it is performing relative to that benchmark. Benchmarks other than average scores may also be determined, including benchmarks based on different statistical analyses such as percentiles. Benchmarks may also be set by government entities, such as the FDA. The benchmarks may also be based on the type of facility. For example, benchmarks may be created for facilities that are similarly situated based on factors such as location, number of technicians, number of medical images acquired, number of imaging systems on site, number of interpreting physicians, or other characteristics of the facilities. Accordingly, facilities may be able to compare their own quality metrics against benchmarks derived from like facilities.
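A minimal sketch of benchmark generation follows, assuming each facility's scores for a single metric arrive as a mapping from facility to a list of scores; the average and percentile choices mirror the statistical analyses described above, and the input layout is an assumption.

```python
from statistics import mean, quantiles

def benchmarks(metric_by_facility):
    """Aggregate one metric across facilities and derive an average
    benchmark plus percentile benchmarks."""
    all_scores = [s for scores in metric_by_facility.values() for s in scores]
    q = quantiles(all_scores, n=100)  # cut points for percentiles 1..99
    return {
        "average": mean(all_scores),
        "p25": q[24],
        "p75": q[74],
    }

scores = {"facility_a": [0.82, 0.91, 0.78], "facility_b": [0.88, 0.70, 0.95]}
print(benchmarks(scores))
```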
At operation 838, a dashboard may be provided by the central computing system to one or more of the facilities from which quality metrics are received. The dashboard may be provided to the facility through the web application. Facility data for a facility or a particular technologist may be generated based on the benchmark generated in operation 836 and the first set of quality metrics and/or the second set of quality metrics. The facility data may include the quality metrics from one or more facilities compared to the benchmark(s) generated in operation 836. The facility data may be presented in the dashboard. For instance, the facility data and/or quality metrics for the facility accessing the dashboard may be viewed through the dashboard. For example, an LIP or other member of the facility may access the dashboard to see how the facility's quality metrics compare to the benchmark quality metrics and/or the quality metrics of other facilities. The quality metrics may also be searched or refined through the dashboard. The LIP may refine the quality metrics by a period of time or for a certain technician or technicians. For instance, if the LIP wanted to see how a particular technician performed over a certain month, the LIP could refine the results in the dashboard to see such information. Reports regarding the quality metrics of the facility may also be generated through the dashboard. The dashboard may also be used to track how the facility's quality metrics compare to federal regulations and to provide warnings if the metrics fall below federal requirements for any time period. Reports may also be generated that indicate how the quality metrics of the facility compare to federal regulations or guidelines. In addition, a dashboard may also be provided to a government agency or review board to show how an individual facility or a group of facilities is performing. The dashboard also provides useful insights into the quality metrics that were previously unavailable. As the metrics are received by the central computing system, the dashboard representation for the facility may be updated almost immediately. Accordingly, the imaging quality of a facility can be tracked over time in a real-time or live manner that was not previously possible.
The dashboard or the reports may also provide additional insights beyond the metrics that are reported. For instance, based on the aggregation of the quality metrics, a large enough sample of metrics across different facilities may allow correlations to be drawn between different types of quality metrics and the identifying data. Such correlations and insights may be generated through machine learning techniques. Unsupervised machine learning techniques may be particularly useful in identifying correlations and insights that may have been previously unknown. Clustering-based, association-based, and anomaly-based unsupervised learning algorithms, among others, may all be applied to the data. Clustering algorithms are generally directed to problems where the goal is to discover inherent clusters or groupings of data, and association algorithms are generally directed to problems where the goal is to discover rules that describe large portions of the data. Anomaly detection algorithms are generally directed to discovering unusual or outlier metrics within the set of quality metrics. As an example, the aggregated metrics and identification data may be provided as an input to the unsupervised machine learning algorithms to output previously unknown structures, patterns, and associations within the aggregated metrics and identification data.
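The sketch below illustrates two of these analyses using scikit-learn (a library choice that is an assumption, not part of this disclosure): k-means clustering to group facilities with similar metric profiles, and an isolation forest to flag outlier facilities.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import IsolationForest

# Rows: facilities; columns: aggregated quality metrics
# (e.g., positioning, compression, exposure scores). Values are illustrative.
X = np.array([
    [0.91, 0.88, 0.90],
    [0.89, 0.85, 0.92],
    [0.55, 0.60, 0.58],  # an unusual facility
    [0.90, 0.87, 0.91],
])

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
outliers = IsolationForest(random_state=0).fit_predict(X)  # -1 marks anomalies

print(clusters)  # e.g., [0 0 1 0]
print(outliers)  # e.g., [ 1  1 -1  1]
```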
At operation 840, a training recommendation for a facility or a particular technologist may be generated based on the benchmark generated in operation 836. The training recommendation may be generated by the central computing system. For example, based on a comparison of the first set of quality metrics to the generated benchmark, it may be determined that the first set of quality metrics (or a subset thereof) falls short of the benchmark. Based on that determination, a training recommendation may be provided. As an example, the positioning metrics for a particular technician may be below a benchmark or a federal regulation or guideline. A training regarding positioning may then be generated for that technician at operation 840. The training may also be tailored to the specific positioning metrics that are problematic for the specific technician. The trainings may be generated from training sets available from different training organizations or from custom trainings provided by the company hosting or operating the central computing system. For instance, the trainings may be a set of videos for the technician to watch to better understand how to properly position the patient. The video trainings may also include interactive elements that further engage the technician during the training. Such interactive trainings may also include an assessment, during or following the training, of how well the technician understands the information provided by the training. The generated training may also include recommendations for programs that are available based on the particular problematic metrics.
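A minimal sketch of such a recommendation step follows; the training catalog, module names, and benchmark layout are illustrative assumptions.

```python
# Illustrative catalog mapping a deficient metric to a training module.
TRAINING_CATALOG = {
    "positioning": "patient-positioning-video-series",
    "compression": "compression-technique-module",
    "exposure": "exposure-settings-module",
}

def recommend_trainings(metrics, benchmarks):
    """Return a training module for each metric below its benchmark."""
    return [
        TRAINING_CATALOG[name]
        for name, score in metrics.items()
        if name in TRAINING_CATALOG and score < benchmarks.get(name, 0.0)
    ]

print(recommend_trainings(
    {"positioning": 0.72, "compression": 0.90},
    {"positioning": 0.85, "compression": 0.85},
))  # ['patient-positioning-video-series']
```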
At operation 842, the training generated in operation 840 is sent to the facility. For example, when the training has been generated for a technician at the first facility, the generated training is sent to the first facility. The training may be sent to the first facility by the central computing system via the web application or other form of communication. The training may be in the form of a series of videos that are accessible via the web application by the technician. Similarly, interactive trainings may also be provided via the web application. During the interactive trainings, an assessment score may be generated that indicates how well the technician performed during the training. That assessment score may be stored and associated with that technician for the particular training that was generated at operation 840.
At operation 844 (shown in
At operation 846, the subsequent set of quality metrics is compared to quality metrics previously received from one or more of the imaging facilities. For example, when the subsequent quality metrics are received from the first imaging facility, those subsequent quality metrics are compared to the first set of quality metrics received in operation 830. At operation 848, based on the comparison of the subsequent quality metrics to the prior quality metrics, such as the first set of quality metrics, an effectiveness rating for a training is generated. The effectiveness rating may be generated for the training that was generated in operation 840 and sent to the facility in operation 842. The effectiveness rating indicates how effective the provided training was for the facility and/or the technician.
As an example, the training is provided to the first facility after the first set of quality metrics is received, and the subsequent set of quality metrics is received after the training has been completed. Because the training has been completed, the quality metrics for the facility are expected to improve. A comparison between the subsequent set of quality metrics and the first set of quality metrics can either confirm or refute that expectation. For instance, if the training was for patient positioning, the patient positioning metrics from the first set of quality metrics are compared to the patient positioning metrics from the subsequent set of quality metrics. If the subsequent positioning metric improved, a positive effectiveness rating is generated to indicate that the training is effective. If the subsequent positioning metric remained the same, a neutral effectiveness rating is generated to indicate that the training is ineffective. If the subsequent positioning metric worsened, a negative effectiveness rating is generated to indicate that the training is counterproductive. The effectiveness rating may then be used as feedback to the central computing system in generating future trainings. For example, trainings that have received a negative effectiveness rating may no longer be generated and sent to facilities, whereas trainings that have received a positive effectiveness rating may be weighted more heavily in generating future trainings.
In some examples, generating the effectiveness rating of the training may account for the assessment scores of the technicians that received the training. As discussed above, during the interactive trainings, an assessment score may be stored that indicates how well the technician performed during the training. If the assessment score is low for the training and the subsequent quality metric did not improve, the training itself may not be ineffective; rather, the low-scoring technician may need additional, more in-depth training and assistance. In such examples where the assessment score is low, a lower weight may be assigned to the effectiveness rating of the training.
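A minimal sketch combining operations 846-848 with the assessment-score weighting described above follows; the thresholds and the discount rule are illustrative assumptions.

```python
def effectiveness_rating(before, after, assessment_score=None, tol=0.01):
    """Compare a metric before and after a training.

    Returns a (rating, weight) pair: +1 positive, 0 neutral, -1 negative.
    A low assessment score lowers the weight, since poor trainee
    performance rather than the training itself may explain a lack of
    improvement.
    """
    delta = after - before
    if delta > tol:
        rating = 1      # training appears effective
    elif delta < -tol:
        rating = -1     # training appears counterproductive
    else:
        rating = 0      # training appears ineffective

    weight = 1.0
    if assessment_score is not None and assessment_score < 0.5 and rating <= 0:
        weight = 0.25   # discount: the trainee, not the training, may be at fault
    return rating, weight

print(effectiveness_rating(before=0.78, after=0.85, assessment_score=0.9))  # (1, 1.0)
```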
At operation 850, a trend is determined for at least one quality metric based on the comparison of the subsequent set of quality metrics and the prior set of quality metrics performed in operation 846. In some examples, the determination of the trend may be performed as part of operation 848. For instance, as discussed above, particular metrics may be compared to one another to determine whether the metric has improved or worsened over time. That trend and the rate of the trend are determined in operation 850. The rate of the trend may be based on the total change in the particular metric over a designated period of time, such as days, weeks, months, quarters, or years.
At operation 852, a warning or notification may be generated based on the trend determined in operation 850. The generation of the warning may also be based on the rate of the trend. For instance, if the trend is negative and the rate of the trend is above a predetermined threshold, a warning may be generated and sent to the facility that has the rapidly worsening metric. Such a warning may prevent poor imaging procedures before they occur because the facility can implement corrections upon receiving the warning. The warning may also be based on a negative trend combined with a metric that is approaching a benchmark and/or federal guideline or regulation. For example, where a quality metric is within a predetermined threshold of a benchmark and/or federal guideline or regulation and the trend is negative, a warning may be generated and provided to the facility. By providing such a warning, the facility is able to implement corrections before falling below the benchmark or being out of compliance with the issued guideline or regulation.
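A minimal sketch of operations 850 and 852 follows, using a least-squares slope over dated scores as the rate of the trend; the threshold values are illustrative assumptions.

```python
def trend_rate(points):
    """Least-squares slope of (day, score) points, in score units per day."""
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in points)
    den = sum((x - mean_x) ** 2 for x, _ in points)
    return num / den

def should_warn(points, benchmark, rate_limit=-0.002, margin=0.05):
    """Warn when the trend is sharply negative, or negative and the
    latest score is within a margin of the benchmark."""
    rate = trend_rate(points)
    latest = points[-1][1]
    worsening_fast = rate < rate_limit
    near_benchmark = rate < 0 and latest - benchmark < margin
    return worsening_fast or near_benchmark

history = [(0, 0.90), (30, 0.88), (60, 0.85), (90, 0.83)]
print(should_warn(history, benchmark=0.80))  # True: negative trend near benchmark
```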
The device 1010 may comprise any electronic device capable of receiving, processing, and sending information for the web services system 1020. Examples of an electronic device may include, without limitation, an imaging system, a client device, a mobile computing device, a computer, a server, a distributed computing system, a multiprocessor system, or a combination thereof. The embodiments are not limited in this context.
The device 1010 may execute processing operations or logic for the web services system 1020 using a processing component 1030. The processing component 1030 may comprise various hardware elements, software elements, or a combination of both. Examples of hardware elements may include devices, logic devices, microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, and so forth. Examples of software elements may include software programs, machine programs, operating system software, middleware, firmware, functions, methods, procedures, software interfaces, application program interfaces (API), words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints, as desired for a given implementation.
The device 1010 may execute communications operations or logic for the web services system 1020 using communications component 1040. The communications component 1040 may implement any well-known communications techniques and protocols, such as techniques suitable for use with packet-switched networks (e.g., public networks such as the Internet, private networks such as an enterprise intranet, and so forth), circuit-switched networks (e.g., the public switched telephone network), or a combination of packet-switched networks and circuit-switched networks (with suitable gateways and translators). The communications component 1040 may include various types of standard communication elements, such as one or more communications interfaces, network interfaces, wireless transmitters/receivers (transceivers), wired and/or wireless communication media, physical connectors, and so forth. By way of example, and not limitation, communication media 1009, 1049 include wired communications media and wireless communications media.
The device 1010 may communicate with other devices 1005, 1045 over a communications media 1009, 1049, respectively, using communications signals 1007, 1047, respectively, via the communications component 1040. The devices 1005, 1045, may be internal or external to the device 1010 as desired for a given implementation.
For example, device 1005 may correspond to a client device such as a phone used by a user. Signals 1007 sent over media 1009 may therefore comprise communication between the phone and the web services system 1020 in which the phone transmits a request and receives a web page or other data in response.
The distributed system 1100 may comprise a client device 1110 and a server device 1140. In general, the client device 1110 and the server device 1140 may be the same as or similar to the device 1010 as described with reference to
The client device 1110 may comprise or employ one or more client programs that operate to perform various methodologies in accordance with the described embodiments. In one embodiment, for example, the client device 1110 may implement some steps described with respect to
The server device 1140 may comprise or employ one or more server programs that operate to perform various methodologies in accordance with the described embodiments. In one embodiment, for example, the server device 1140 may implement some steps described with respect to
As used in this application, the terms “system” and “component” are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution, examples of which are provided by the exemplary computing architecture 1200. For example, a component can be, but is not limited to being, a process running on a processor, a processor, a hard disk drive, multiple storage drives (of optical and/or magnetic storage medium), an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a component. One or more components can reside within a process and/or thread of execution, and a component can be localized on one computer and/or distributed between two or more computers. Further, components may be communicatively coupled to each other by various types of communications media to coordinate operations. The coordination may involve the unidirectional or bi-directional exchange of information. For instance, the components may communicate information in the form of signals communicated over the communications media. The information can be implemented as signals allocated to various signal lines. In such allocations, each message is a signal. Further embodiments, however, may alternatively employ data messages. Such data messages may be sent across various connections. Exemplary connections include parallel interfaces, serial interfaces, and bus interfaces.
The computing architecture 1200 includes various common computing elements, such as one or more processors, multi-core processors, co-processors, memory units, chipsets, controllers, peripherals, interfaces, oscillators, timing devices, video cards, audio cards, multimedia input/output (I/O) components, power supplies, and so forth. The embodiments, however, are not limited to implementation by the computing architecture 1200.
As shown in
The system bus 1208 provides an interface for system components including, but not limited to, the system memory 1206 to the processing unit 1204. The system bus 1208 can be any of several types of bus structure that may further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures. Interface adapters may connect to the system bus 1208 via a slot architecture, for example.
The computing architecture 1200 may comprise or implement various articles of manufacture. An article of manufacture may comprise a computer-readable storage medium to store logic, as described above with respect to
The system memory 1206 may include various types of computer-readable storage media in the form of one or more higher speed memory units, such as read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), solid state memory devices (e.g., USB memory and solid state drives (SSD)), and any other type of storage media suitable for storing information. In the illustrated embodiment shown in
The computer 1202 may include various types of computer-readable storage media in the form of one or more lower speed memory units, including an internal (or external) hard disk drive (HDD) 1214, a magnetic floppy disk drive (FDD) 1216 to read from or write to a removable magnetic disk 1218, and an optical disk drive 1220 to read from or write to a removable optical disk 1222 (e.g., a CD-ROM, DVD, or Blu-ray). The HDD 1214, FDD 1216 and optical disk drive 1220 can be connected to the system bus 1208 by a HDD interface 1224, an FDD interface 1226 and an optical drive interface 1228, respectively. The HDD interface 1224 for external drive implementations can include at least one or both of Universal Serial Bus (USB) and IEEE 1394 interface technologies.
The drives and associated computer-readable media provide volatile and/or nonvolatile storage of data, data structures, computer-executable instructions, and so forth. For example, a number of program modules can be stored in the drives and memory units 1210, 1213, including an operating system 1230, one or more application programs 1232, other program modules 1234, and program data 1236. In one embodiment, the one or more application programs 1232, other program modules 1234, and program data 1236 can include, for example, the various applications and/or components to implement the disclosed embodiments.
A user can enter commands and information into the computer 1202 through one or more wire/wireless input devices, for example, a keyboard 1238 and a pointing device, such as a mouse 1240. Other input devices may include microphones, infra-red (IR) remote controls, radio-frequency (RF) remote controls, game pads, stylus pens, card readers, dongles, finger print readers, gloves, graphics tablets, joysticks, keyboards, retina readers, touch screens (e.g., capacitive, resistive, etc.), trackballs, trackpads, sensors, styluses, and the like. These and other input devices are often connected to the processing unit 1204 through an input device interface 1242 that is coupled to the system bus 1208, but can be connected by other interfaces such as a parallel port, IEEE 1394 serial port, a game port, a USB port, an IR interface, and so forth.
A display 1244 is also connected to the system bus 1208 via an interface, such as a video adaptor 1246. The display 1244 may be internal or external to the computer 1202. In addition to the display 1244, a computer typically includes other peripheral output devices, such as speakers, printers, and so forth.
The computer 1202 may operate in a networked environment using logical connections via wire and/or wireless communications to one or more remote computers, such as a remote computer 1248. The remote computer 1248 can be a workstation, a server computer, a router, a personal computer, a portable computer, a microprocessor-based entertainment appliance, a peer device, or another common network node, and typically includes many or all of the elements described relative to the computer 1202, although, for purposes of brevity, only a memory/storage device 1250 is illustrated. The logical connections depicted include wire/wireless connectivity to a local area network (LAN) 1252 and/or larger networks, for example, a wide area network (WAN) 1254. Such LAN and WAN networking environments are commonplace in offices and companies, and facilitate enterprise-wide computer networks, such as intranets, all of which may connect to a global communications network, for example, the Internet.
When used in a LAN networking environment, the computer 1202 is connected to the LAN 1252 through a wire and/or wireless communication network interface or adaptor 1256. The adaptor 1256 can facilitate wire and/or wireless communications to the LAN 1252, which may also include a wireless access point disposed thereon for communicating with the wireless functionality of the adaptor 1256.
When used in a WAN networking environment, the computer 1202 can include a modem 1258, or is connected to a communications server on the WAN 1254, or has other means for establishing communications over the WAN 1254, such as by way of the Internet. The modem 1258, which can be internal or external and a wire and/or wireless device, connects to the system bus 1208 via the input device interface 1242. In a networked environment, program modules depicted relative to the computer 1202, or portions thereof, can be stored in the remote memory/storage device 1250. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers can be used.
The computer 1202 is operable to communicate with wire and wireless devices or entities using the IEEE 802 family of standards, such as wireless devices operatively disposed in wireless communication (e.g., IEEE 802.11 over-the-air modulation techniques). This includes at least Wi-Fi (or Wireless Fidelity), WiMax, and Bluetooth™ wireless technologies, among others.
As shown in
The clients 1310 and the servers 1340 may communicate information between each other using a communication framework 1330. The communications framework 1330 may implement any well-known communications techniques and protocols. The communications framework 1330 may be implemented as a packet-switched network (e.g., public networks such as the Internet, private networks such as an enterprise intranet, and so forth), a circuit-switched network (e.g., the public switched telephone network), or a combination of a packet-switched network and a circuit-switched network (with suitable gateways and translators).
The communications framework 1330 may implement various network interfaces arranged to accept, communicate, and connect to a communications network. A network interface may be regarded as a specialized form of an input/output interface. Network interfaces may employ connection protocols including, without limitation, direct connect, Ethernet, wireless network interfaces, cellular network interfaces, and the like.
Some embodiments may be described using the expression “one embodiment” or “an embodiment” along with their derivatives. These terms mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment. Further, some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. These terms are not necessarily intended as synonyms for each other. For example, some embodiments may be described using the terms “connected” and/or “coupled” to indicate that two or more elements are in direct physical or electrical contact with each other. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
A procedure is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. These operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic or optical signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It proves convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. It should be noted, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to those quantities.
Further, the manipulations performed are often referred to in terms, such as adding or comparing, which are commonly associated with mental operations performed by a human operator. No such capability of a human operator is necessary, or desirable in most cases, in any of the operations described herein which form part of one or more embodiments. Rather, the operations are machine operations. Useful machines for performing operations of various embodiments include general purpose digital computers or similar devices.
Various embodiments also relate to apparatus or systems for performing these operations. This apparatus may be specially constructed for the required purpose or it may comprise a general purpose computer as selectively activated or reconfigured by a computer program stored in the computer. The procedures presented herein are not inherently related to a particular computer or other apparatus. Various general purpose machines may be used with programs written in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these machines will appear from the description given.
In the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the terms “comprising” and “wherein,” respectively. Moreover, the terms “first,” “second,” “third,” and so forth are used merely as labels, and are not intended to impose numerical requirements on their objects.
What has been described above includes examples of the disclosed architecture. It is, of course, not possible to describe every conceivable combination of components and/or methodologies, but one of ordinary skill in the art may recognize that many further combinations and permutations are possible.
This application is a divisional application of U.S. patent application Ser. No. 16/779,153, now U.S. Pat. No. 11,419,569, filed on Jan. 31, 2020, which is a continuation-in-part of International Application PCT/IB2018/056208, with an international filing date of Aug. 16, 2018, which claims priority to U.S. Provisional Patent Application Ser. No. 62/546,167, titled “Techniques for Breast Imaging Patient Motion Artifact Compensation” and filed on Aug. 16, 2017. The contents of the aforementioned applications are incorporated herein by reference in their entireties and, to the extent appropriate, priority is claimed to the aforementioned applications.
Number | Name | Date | Kind |
---|---|---|---|
3365575 | Strax | Jan 1968 | A |
3502878 | Stewart | Mar 1970 | A |
3863073 | Wagner | Jan 1975 | A |
3971950 | Evans et al. | Jul 1976 | A |
4160906 | Daniels et al. | Jul 1979 | A |
4212306 | Mahmud | Jul 1980 | A |
4310766 | Finkenzeller et al. | Jan 1982 | A |
4380086 | Vagi | Apr 1983 | A |
4496557 | Malen et al. | Jan 1985 | A |
4513433 | Weiss et al. | Apr 1985 | A |
4542521 | Hahn et al. | Sep 1985 | A |
4559641 | Caugant et al. | Dec 1985 | A |
4658409 | Summ | Apr 1987 | A |
4662379 | Macovski | May 1987 | A |
4706269 | Reina et al. | Nov 1987 | A |
4721856 | Saotome et al. | Jan 1988 | A |
4744099 | Huettenrauch et al. | May 1988 | A |
4752948 | MacMahon | Jun 1988 | A |
4760589 | Siczek | Jul 1988 | A |
4763343 | Yanaki | Aug 1988 | A |
4773086 | Fujita et al. | Sep 1988 | A |
4773087 | Plewes | Sep 1988 | A |
4819258 | Kleinman et al. | Apr 1989 | A |
4821727 | Levene et al. | Apr 1989 | A |
4969174 | Scheid et al. | Nov 1990 | A |
4989227 | Tirelli et al. | Jan 1991 | A |
5018176 | Romeas et al. | May 1991 | A |
RE33634 | Yanaki | Jul 1991 | E |
5029193 | Saffer | Jul 1991 | A |
5051904 | Griffith | Sep 1991 | A |
5078142 | Siczek et al. | Jan 1992 | A |
5163075 | Lubinsky et al. | Nov 1992 | A |
5164976 | Scheid et al. | Nov 1992 | A |
5199056 | Darrah | Mar 1993 | A |
5212637 | Saxena | May 1993 | A |
5240011 | Assa | Aug 1993 | A |
5256370 | Slattery | Oct 1993 | A |
5274690 | Burke | Dec 1993 | A |
5289520 | Pellegrino et al. | Feb 1994 | A |
5291539 | Thumann et al. | Mar 1994 | A |
5313510 | Ebersberger | May 1994 | A |
5359637 | Webber | Oct 1994 | A |
5365562 | Toker | Nov 1994 | A |
5415169 | Siczek et al. | May 1995 | A |
5426685 | Pellegrino et al. | Jun 1995 | A |
5452367 | Bick et al. | Sep 1995 | A |
5469429 | Yamazaki et al. | Nov 1995 | A |
5506877 | Niklason et al. | Apr 1996 | A |
5526394 | Siczek et al. | Jun 1996 | A |
5528658 | Hell | Jun 1996 | A |
5539797 | Heidsieck et al. | Jul 1996 | A |
5553111 | Moore et al. | Sep 1996 | A |
5592562 | Rooks | Jan 1997 | A |
5594769 | Pellegrino et al. | Jan 1997 | A |
5596200 | Sharma et al. | Jan 1997 | A |
5598454 | Franetzke et al. | Jan 1997 | A |
5609152 | Pellegrino et al. | Mar 1997 | A |
5627869 | Andrew et al. | May 1997 | A |
5657362 | Giger et al. | Aug 1997 | A |
5668844 | Webber | Sep 1997 | A |
5668889 | Hara | Sep 1997 | A |
5706327 | Adamkowski et al. | Jan 1998 | A |
5719952 | Rooks | Feb 1998 | A |
5735264 | Siczek et al. | Apr 1998 | A |
5769086 | Ritchart et al. | Jun 1998 | A |
5803912 | Siczek et al. | Sep 1998 | A |
5818898 | Tsukamoto et al. | Oct 1998 | A |
5828722 | Ploetz et al. | Oct 1998 | A |
5841829 | Dolazza | Nov 1998 | A |
5844965 | Galkin | Dec 1998 | A |
5864146 | Karellas | Jan 1999 | A |
5872828 | Niklason et al. | Feb 1999 | A |
5878104 | Ploetz | Mar 1999 | A |
5896437 | Ploetz | Apr 1999 | A |
5941832 | Tumey et al. | Aug 1999 | A |
5970118 | Sokolov | Oct 1999 | A |
5983123 | Shmulewitz | Nov 1999 | A |
5986662 | Argiro et al. | Nov 1999 | A |
5999836 | Nelson et al. | Dec 1999 | A |
6005907 | Ploetz | Dec 1999 | A |
6022325 | Siczek et al. | Feb 2000 | A |
6075879 | Roehrig et al. | Jun 2000 | A |
6091841 | Rogers et al. | Jul 2000 | A |
6137527 | Abdel-Malek et al. | Oct 2000 | A |
6141398 | He et al. | Oct 2000 | A |
6149301 | Kautzer et al. | Nov 2000 | A |
6167115 | Inoue | Dec 2000 | A |
6175117 | Komardin et al. | Jan 2001 | B1 |
6196715 | Nambu et al. | Mar 2001 | B1 |
6207958 | Giakos | Mar 2001 | B1 |
6216540 | Nelson et al. | Apr 2001 | B1 |
6219059 | Argiro | Apr 2001 | B1 |
6233473 | Shepherd et al. | May 2001 | B1 |
6243441 | Zur | Jun 2001 | B1 |
6244507 | Garland | Jun 2001 | B1 |
6256369 | Lai | Jul 2001 | B1 |
6256370 | Yavuz | Jul 2001 | B1 |
6272207 | Tang | Aug 2001 | B1 |
6289235 | Webber et al. | Sep 2001 | B1 |
6292530 | Yavus et al. | Sep 2001 | B1 |
6293282 | Lemelson | Sep 2001 | B1 |
6327336 | Gingold et al. | Dec 2001 | B1 |
6341156 | Baetz et al. | Jan 2002 | B1 |
6345194 | Nelson et al. | Feb 2002 | B1 |
6375352 | Hewes et al. | Apr 2002 | B1 |
6411836 | Patel et al. | Jun 2002 | B1 |
6415015 | Nicolas et al. | Jul 2002 | B2 |
6418189 | Schafer | Jul 2002 | B1 |
6442288 | Haerer et al. | Aug 2002 | B1 |
6459925 | Nields et al. | Oct 2002 | B1 |
6480565 | Ning | Nov 2002 | B1 |
6490476 | Townsend et al. | Dec 2002 | B1 |
6501819 | Unger et al. | Dec 2002 | B2 |
6542575 | Schubert | Apr 2003 | B1 |
6553096 | Zhou et al. | Apr 2003 | B1 |
6556655 | Chichereau et al. | Apr 2003 | B1 |
6574304 | Hsieh et al. | Jun 2003 | B1 |
6574629 | Cooke, Jr. et al. | Jun 2003 | B1 |
6597762 | Ferrant et al. | Jul 2003 | B1 |
6620111 | Stephens et al. | Sep 2003 | B2 |
6626849 | Huitema et al. | Sep 2003 | B2 |
6633674 | Barnes et al. | Oct 2003 | B1 |
6638235 | Miller et al. | Oct 2003 | B2 |
6647092 | Eberhard et al. | Nov 2003 | B2 |
6744848 | Stanton et al. | Jun 2004 | B2 |
6748044 | Sabol et al. | Jun 2004 | B2 |
6751285 | Eberhard et al. | Jun 2004 | B2 |
6758824 | Miller et al. | Jul 2004 | B1 |
6813334 | Koppe et al. | Nov 2004 | B2 |
6882700 | Wang et al. | Apr 2005 | B2 |
6885724 | Li et al. | Apr 2005 | B2 |
6895076 | Halsmer | May 2005 | B2 |
6909790 | Tumey et al. | Jun 2005 | B2 |
6909792 | Carrott et al. | Jun 2005 | B1 |
6912319 | Barnes et al. | Jun 2005 | B1 |
6940943 | Claus et al. | Sep 2005 | B2 |
6950493 | Besson | Sep 2005 | B2 |
6957099 | Arnone et al. | Oct 2005 | B1 |
6970531 | Eberhard et al. | Nov 2005 | B2 |
6978040 | Berestov | Dec 2005 | B2 |
6987831 | Ning | Jan 2006 | B2 |
6999554 | Mertelmeier | Feb 2006 | B2 |
7001071 | Deuringer | Feb 2006 | B2 |
7016461 | Rotondo | Mar 2006 | B2 |
7110490 | Eberhard et al. | Sep 2006 | B2 |
7110502 | Tsuji | Sep 2006 | B2 |
7116749 | Besson | Oct 2006 | B2 |
7123684 | Jing et al. | Oct 2006 | B2 |
7127091 | Op De Beek et al. | Oct 2006 | B2 |
7142633 | Eberhard et al. | Nov 2006 | B2 |
7190758 | Hagiwara | Mar 2007 | B2 |
7206462 | Betke | Apr 2007 | B1 |
7244063 | Eberhard | Jul 2007 | B2 |
7245694 | Jing et al. | Jul 2007 | B2 |
7286645 | Freudenberger | Oct 2007 | B2 |
7302031 | Hjarn et al. | Nov 2007 | B2 |
7315607 | Ramsauer | Jan 2008 | B2 |
7319735 | Defreitas et al. | Jan 2008 | B2 |
7319736 | Rotondo | Jan 2008 | B2 |
7323692 | Rowlands et al. | Jan 2008 | B2 |
7331264 | Ozawa | Feb 2008 | B2 |
7430272 | Jing et al. | Sep 2008 | B2 |
7443949 | Defreitas et al. | Oct 2008 | B2 |
7577282 | Gkanatsios et al. | Aug 2009 | B2 |
7583786 | Jing et al. | Sep 2009 | B2 |
7609806 | Defreitas et al. | Oct 2009 | B2 |
7616731 | Pack | Nov 2009 | B2 |
7616801 | Gkanatsios et al. | Nov 2009 | B2 |
7630531 | Chui | Dec 2009 | B2 |
7630533 | Ruth et al. | Dec 2009 | B2 |
7688940 | Defreitas et al. | Mar 2010 | B2 |
7697660 | Ning | Apr 2010 | B2 |
7702142 | Ren et al. | Apr 2010 | B2 |
7760853 | Jing et al. | Jul 2010 | B2 |
7760924 | Ruth et al. | Jul 2010 | B2 |
7792245 | Hitzke et al. | Sep 2010 | B2 |
7831296 | Defreitas et al. | Nov 2010 | B2 |
7839979 | Hauttmann | Nov 2010 | B2 |
7869563 | Defreitas et al. | Jan 2011 | B2 |
7881428 | Jing et al. | Feb 2011 | B2 |
7885384 | Mannar | Feb 2011 | B2 |
7894646 | Shirahata et al. | Feb 2011 | B2 |
7916915 | Gkanatsios et al. | Mar 2011 | B2 |
7949091 | Jing et al. | May 2011 | B2 |
7986765 | Defreitas et al. | Jul 2011 | B2 |
7991106 | Ren et al. | Aug 2011 | B2 |
8031834 | Ludwig | Oct 2011 | B2 |
8131049 | Ruth et al. | Mar 2012 | B2 |
8155421 | Ren et al. | Apr 2012 | B2 |
8170320 | Smith et al. | May 2012 | B2 |
8175219 | Defreitas et al. | May 2012 | B2 |
8285020 | Gkanatsios et al. | Oct 2012 | B2 |
8416915 | Jing et al. | Apr 2013 | B2 |
8452379 | DeFreitas et al. | May 2013 | B2 |
8457282 | Baorui et al. | Jun 2013 | B2 |
8515005 | Ren et al. | Aug 2013 | B2 |
8559595 | Defreitas et al. | Oct 2013 | B2 |
8565372 | Stein et al. | Oct 2013 | B2 |
8565374 | DeFreitas et al. | Oct 2013 | B2 |
8565860 | Kimchy | Oct 2013 | B2 |
8571289 | Ruth et al. | Oct 2013 | B2 |
8712127 | Ren et al. | Apr 2014 | B2 |
8767911 | Ren et al. | Jul 2014 | B2 |
8787522 | Smith et al. | Jul 2014 | B2 |
8824752 | Fonte | Sep 2014 | B1 |
8831171 | Jing et al. | Sep 2014 | B2 |
8853635 | O'Connor | Oct 2014 | B2 |
8873716 | Ren et al. | Oct 2014 | B2 |
9020579 | Smith | Apr 2015 | B2 |
9042612 | Gkanatsios et al. | May 2015 | B2 |
9066706 | Defreitas et al. | Jun 2015 | B2 |
9226721 | Ren et al. | Jan 2016 | B2 |
9460508 | Gkanatsios et al. | Oct 2016 | B2 |
9498175 | Stein et al. | Nov 2016 | B2 |
9498180 | Ren | Nov 2016 | B2 |
9502148 | Ren | Nov 2016 | B2 |
9549709 | DeFreitas et al. | Jan 2017 | B2 |
9851888 | Gkanatsios et al. | Dec 2017 | B2 |
9895115 | Ren | Feb 2018 | B2 |
10108329 | Gkanatsios et al. | Oct 2018 | B2 |
10194875 | DeFreitas et al. | Feb 2019 | B2 |
10296199 | Gkanatsios | May 2019 | B2 |
10413255 | Stein | Sep 2019 | B2 |
10452252 | Gkanatsios et al. | Oct 2019 | B2 |
10638994 | DeFreitas | May 2020 | B2 |
10719223 | Gkanatsios | Jul 2020 | B2 |
10881359 | Williams | Jan 2021 | B2 |
10905385 | DeFreitas | Feb 2021 | B2 |
10959694 | Jing et al. | Mar 2021 | B2 |
11076820 | Smith | Aug 2021 | B2 |
11090017 | Zhang | Aug 2021 | B2 |
11096644 | DeFreitas | Aug 2021 | B2 |
11419569 | Wong | Aug 2022 | B2 |
20010038681 | Stanton et al. | Nov 2001 | A1 |
20020012450 | Tsujii | Jan 2002 | A1 |
20020048343 | Launay et al. | Apr 2002 | A1 |
20020050986 | Inoue et al. | May 2002 | A1 |
20020070970 | Wood et al. | Jun 2002 | A1 |
20020075997 | Unger et al. | Jun 2002 | A1 |
20020090055 | Zur et al. | Jul 2002 | A1 |
20020094062 | Dolazza | Jul 2002 | A1 |
20020122533 | Marie et al. | Sep 2002 | A1 |
20020126798 | Harris | Sep 2002 | A1 |
20030007598 | Wang et al. | Jan 2003 | A1 |
20030010923 | Zur | Jan 2003 | A1 |
20030018272 | Treado et al. | Jan 2003 | A1 |
20030026386 | Tang et al. | Feb 2003 | A1 |
20030058989 | Rotondo | Mar 2003 | A1 |
20030072409 | Kaufhold et al. | Apr 2003 | A1 |
20030072417 | Kaufhold et al. | Apr 2003 | A1 |
20030073895 | Nields et al. | Apr 2003 | A1 |
20030095624 | Eberhard et al. | May 2003 | A1 |
20030097055 | Yanof et al. | May 2003 | A1 |
20030149364 | Kapur | Aug 2003 | A1 |
20030169847 | Karellas et al. | Sep 2003 | A1 |
20030194050 | Eberhard | Oct 2003 | A1 |
20030194051 | Wang et al. | Oct 2003 | A1 |
20030194121 | Eberhard et al. | Oct 2003 | A1 |
20030210254 | Doan et al. | Nov 2003 | A1 |
20030212327 | Wang et al. | Nov 2003 | A1 |
20030215120 | Uppaluri et al. | Nov 2003 | A1 |
20040008809 | Webber | Jan 2004 | A1 |
20040066882 | Eberhard et al. | Apr 2004 | A1 |
20040066884 | Hermann Claus | Apr 2004 | A1 |
20040066904 | Eberhard et al. | Apr 2004 | A1 |
20040070582 | Smith et al. | Apr 2004 | A1 |
20040094167 | Brady et al. | May 2004 | A1 |
20040101095 | Jing et al. | May 2004 | A1 |
20040109529 | Eberhard et al. | Jun 2004 | A1 |
20040146221 | Siegel et al. | Jul 2004 | A1 |
20040170561 | Salb et al. | Sep 2004 | A1 |
20040171986 | Tremaglio, Jr. et al. | Sep 2004 | A1 |
20040190682 | Deuringer | Sep 2004 | A1 |
20040213378 | Zhou et al. | Oct 2004 | A1 |
20040247081 | Halsmer | Dec 2004 | A1 |
20040264627 | Besson | Dec 2004 | A1 |
20040267157 | Miller et al. | Dec 2004 | A1 |
20050025278 | Hagiwara | Feb 2005 | A1 |
20050049521 | Miller et al. | Mar 2005 | A1 |
20050063509 | DeFreitas et al. | Mar 2005 | A1 |
20050078797 | Danielsson et al. | Apr 2005 | A1 |
20050089205 | Kapur | Apr 2005 | A1 |
20050105679 | Wu et al. | May 2005 | A1 |
20050113681 | DeFreitas et al. | May 2005 | A1 |
20050113715 | Schwindt et al. | May 2005 | A1 |
20050117694 | Francke | Jun 2005 | A1 |
20050129172 | Mertelmeier | Jun 2005 | A1 |
20050133706 | Eberhard | Jun 2005 | A1 |
20050135555 | Claus et al. | Jun 2005 | A1 |
20050135664 | Kaufhold et al. | Jun 2005 | A1 |
20050226375 | Eberhard et al. | Oct 2005 | A1 |
20050248347 | Damadian | Nov 2005 | A1 |
20060030784 | Miller et al. | Feb 2006 | A1 |
20060034426 | Freudenberger | Feb 2006 | A1 |
20060074288 | Kelly | Apr 2006 | A1 |
20060098855 | Gkanatsios et al. | May 2006 | A1 |
20060109951 | Popescu | May 2006 | A1 |
20060126780 | Rotondo | Jun 2006 | A1 |
20060129062 | Nicoson et al. | Jun 2006 | A1 |
20060155209 | Miller et al. | Jul 2006 | A1 |
20060210016 | Francke | Sep 2006 | A1 |
20060262898 | Partain | Nov 2006 | A1 |
20060269041 | Mertelmeier | Nov 2006 | A1 |
20060291618 | Eberhard et al. | Dec 2006 | A1 |
20070030949 | Jing et al. | Feb 2007 | A1 |
20070036265 | Jing et al. | Feb 2007 | A1 |
20070076844 | Defreitas et al. | Apr 2007 | A1 |
20070078335 | Horn | Apr 2007 | A1 |
20070140419 | Souchay | Jun 2007 | A1 |
20070217573 | Berhardt | Sep 2007 | A1 |
20070223651 | Wagenaar et al. | Sep 2007 | A1 |
20070225600 | Weibrecht et al. | Sep 2007 | A1 |
20070242800 | Jing et al. | Oct 2007 | A1 |
20070280412 | DeFreitas | Dec 2007 | A1 |
20080019581 | Gkanatsios et al. | Jan 2008 | A1 |
20080045833 | Defreitas et al. | Feb 2008 | A1 |
20080056436 | Pack | Mar 2008 | A1 |
20080101537 | Sendai | May 2008 | A1 |
20080112534 | Defreitas | May 2008 | A1 |
20080118023 | Besson | May 2008 | A1 |
20080130979 | Ren | Jun 2008 | A1 |
20080212861 | Durgan et al. | Sep 2008 | A1 |
20080285712 | Kopans | Nov 2008 | A1 |
20080317196 | Imai | Dec 2008 | A1 |
20090003519 | Defreitas et al. | Jan 2009 | A1 |
20090010384 | Jing et al. | Jan 2009 | A1 |
20090080594 | Brooks et al. | Mar 2009 | A1 |
20090080602 | Brooks et al. | Mar 2009 | A1 |
20090135997 | Defreitas et al. | May 2009 | A1 |
20090141859 | Gkanatsios et al. | Jun 2009 | A1 |
20090175408 | Goodsitt et al. | Jul 2009 | A1 |
20090213987 | Stein et al. | Aug 2009 | A1 |
20090237924 | Ladewig | Sep 2009 | A1 |
20090238424 | Arakita et al. | Sep 2009 | A1 |
20090262887 | Iordache | Oct 2009 | A1 |
20090268865 | Ren et al. | Oct 2009 | A1 |
20090296882 | Gkanatsios | Dec 2009 | A1 |
20090304147 | Jing et al. | Dec 2009 | A1 |
20100020937 | Hautmann | Jan 2010 | A1 |
20100020938 | Koch | Jan 2010 | A1 |
20100034450 | Mertelmeier | Feb 2010 | A1 |
20100054400 | Ren | Mar 2010 | A1 |
20100086188 | Ruth | Apr 2010 | A1 |
20100091940 | Ludwig et al. | Apr 2010 | A1 |
20100150306 | Defreitas et al. | Jun 2010 | A1 |
20100189227 | Mannar | Jul 2010 | A1 |
20100195882 | Ren | Aug 2010 | A1 |
20100226475 | Smith | Sep 2010 | A1 |
20100290585 | Eliasson | Nov 2010 | A1 |
20100303202 | Ren | Dec 2010 | A1 |
20100313196 | De Atley | Dec 2010 | A1 |
20110026667 | Poorter | Feb 2011 | A1 |
20110069809 | Defreitas et al. | Mar 2011 | A1 |
20110178389 | Kumar et al. | Jul 2011 | A1 |
20110188624 | Ren | Aug 2011 | A1 |
20110234630 | Batman et al. | Sep 2011 | A1 |
20110257919 | Reiner | Oct 2011 | A1 |
20110268246 | Dafni | Nov 2011 | A1 |
20120033868 | Ren | Feb 2012 | A1 |
20120051502 | Ohta et al. | Mar 2012 | A1 |
20120236987 | Ruimi | Sep 2012 | A1 |
20120238870 | Smith et al. | Sep 2012 | A1 |
20130028374 | Gkanatsios et al. | Jan 2013 | A1 |
20130211261 | Wang | Aug 2013 | A1 |
20130272493 | Otokuni | Oct 2013 | A1 |
20130272494 | DeFreitas et al. | Oct 2013 | A1 |
20140044230 | Stein et al. | Feb 2014 | A1 |
20140044231 | Defreitas et al. | Feb 2014 | A1 |
20140086471 | Ruth et al. | Mar 2014 | A1 |
20140098935 | Defreitas et al. | Apr 2014 | A1 |
20140232752 | Ren et al. | Aug 2014 | A1 |
20140314198 | Ren et al. | Oct 2014 | A1 |
20140321607 | Smith | Oct 2014 | A1 |
20140328458 | Erhard | Nov 2014 | A1 |
20140376690 | Jing et al. | Dec 2014 | A1 |
20150049859 | DeFreitas et al. | Feb 2015 | A1 |
20150160848 | Gkanatsios et al. | Jun 2015 | A1 |
20150310611 | Gkanatsios et al. | Oct 2015 | A1 |
20160066875 | Jacob et al. | Mar 2016 | A1 |
20160106383 | Ren et al. | Apr 2016 | A1 |
20160189376 | Bernard | Jun 2016 | A1 |
20160209995 | Jeon | Jul 2016 | A1 |
20160220207 | Jouhikainen | Aug 2016 | A1 |
20160256125 | Smith | Sep 2016 | A1 |
20160270742 | Stein et al. | Sep 2016 | A9 |
20160302746 | Erhard | Oct 2016 | A1 |
20160331339 | Guo | Nov 2016 | A1 |
20170024113 | Gkanatsios et al. | Jan 2017 | A1 |
20170032546 | Westerhoff | Feb 2017 | A1 |
20170071562 | Suzuki | Mar 2017 | A1 |
20170128028 | DeFreitas et al. | May 2017 | A1 |
20170135650 | Stein et al. | May 2017 | A1 |
20170316588 | Homann | Nov 2017 | A1 |
20170319167 | Goto | Nov 2017 | A1 |
20170367674 | Arai et al. | Dec 2017 | A1 |
20180130201 | Bernard | May 2018 | A1 |
20180177476 | Jing et al. | Jun 2018 | A1 |
20180188937 | Gkanatsios et al. | Jul 2018 | A1 |
20180289347 | DeFreitas et al. | Oct 2018 | A1 |
20180344276 | DeFreitas et al. | Dec 2018 | A1 |
20190059830 | Williams | Feb 2019 | A1 |
20190095087 | Gkanatsios et al. | Mar 2019 | A1 |
20190200942 | DeFreitas | Jul 2019 | A1 |
20190336794 | Li | Nov 2019 | A1 |
20190388051 | Morita | Dec 2019 | A1 |
20200012417 | Gkanatsios | Jan 2020 | A1 |
20200029927 | Wilson | Jan 2020 | A1 |
20200085393 | Zhang | Mar 2020 | A1 |
20200222023 | Wong | Jul 2020 | A1 |
20200348835 | Gkanatsios | Nov 2020 | A1 |
20200352531 | Smith | Nov 2020 | A1 |
20210128087 | DeFreitas | May 2021 | A1 |
20220071582 | DeFreitas | Mar 2022 | A1 |
20220361831 | Hartig | Nov 2022 | A1 |
20220378389 | Wong | Dec 2022 | A1 |
Number | Date | Country |
---|---|---|
102222594 | Oct 2011 | CN |
105286904 | Feb 2016 | CN |
102004051401 | May 2006 | DE |
102004051820 | May 2006 | DE |
102010027871 | Oct 2011 | DE |
0775467 | May 1997 | EP |
0982001 | Mar 2000 | EP |
1028451 | Aug 2000 | EP |
1428473 | Jun 2004 | EP |
1759637 | Mar 2007 | EP |
1569556 | Apr 2012 | EP |
2732764 | May 2014 | EP |
2602743 | Nov 2014 | EP |
2819145 | Dec 2014 | EP |
3143935 | Mar 2017 | EP |
53151381 | Nov 1978 | JP |
2001-346786 | Dec 2001 | JP |
2002219124 | Aug 2002 | JP |
2006-231054 | Sep 2006 | JP |
2007-50264 | Mar 2007 | JP |
2007-521911 | Aug 2007 | JP |
2007229269 | Sep 2007 | JP |
2008-67933 | Mar 2008 | JP |
2008086471 | Apr 2008 | JP |
2009500048 | Jan 2009 | JP |
2011-072667 | Apr 2011 | JP |
2011-250842 | Dec 2011 | JP |
2012-509714 | Apr 2012 | JP |
2012-511988 | May 2012 | JP |
2015-530706 | Oct 2015 | JP |
WO 9005485 | May 1990 | WO |
WO 9803115 | Jan 1998 | WO |
WO 9816903 | Apr 1998 | WO |
WO 0051484 | Sep 2000 | WO |
WO 03020114 | Mar 2003 | WO |
WO 03037046 | May 2003 | WO |
WO 2003057564 | Jul 2003 | WO |
WO 2004043535 | May 2004 | WO |
WO 2005051197 | Jun 2005 | WO |
WO 2005110230 | Nov 2005 | WO |
WO 2005112767 | Dec 2005 | WO |
WO 2006055830 | May 2006 | WO |
WO 2006058160 | Jun 2006 | WO |
WO 2007129244 | Nov 2007 | WO |
WO 2008072144 | Jun 2008 | WO |
WO 2009104156 | Aug 2009 | WO |
WO 2009122328 | Oct 2009 | WO |
WO 2009136349 | Nov 2009 | WO |
WO 2010070554 | Jun 2010 | WO |
WO 2011058730 | May 2011 | WO |
WO 2013184213 | Dec 2013 | WO |
WO 2014176445 | Oct 2014 | WO |
WO 2018170265 | Sep 2018 | WO |
Entry |
---|
PCT International Preliminary Report on Patentability in Application PCT/IB2018/056208, dated Feb. 18, 2020, 9 pages. |
PCT International Search Report and Written Opinion in Application PCT/IB2018/056208, dated Nov. 13, 2018, 12 pages. |
European Communication and Search Report in Application 18847121.3, dated Apr. 8, 2021, 5 pages. |
European Extended Search Report in Application 22183201.7, dated Dec. 5, 2022, 7 pages. |
“Essentials for life: Senographe Essential Full-Field Digital Mammography system”, GE Health-care Brochure, MM-0132-05.06-EN-US, 2006, 12 pgs. |
“Filtered Back Projection,” (Nygren) published May 8, 2007; URL: http://web.archive.org/web/19991010131715/http://www.owlnet.rice.edu/~elec539/Projects97/cult/node2.html, 2 pgs. |
“Lorad Selenia” Document B-BI-SEO US/Intl (May 2006) copyright Hologic 2006, 12 pgs. |
Acrin website, located at https://www.acrin.org/PATIENTS/ABOUTIMAGINGEXAMSANDAGENTS/ABOUTMAMMOGRAPHYANDTOMOSYNTHESIS.aspx, “About Mammography and Tomosynthesis”, obtained online on Dec. 8, 2015, 5 pgs. |
American College of Radiology website, located at http://www.acr.org/FAQs/DBT-FAQ, “Digital Breast Tomosynthesis FAQ for Insurers”, obtained online on Dec. 8, 2015, 2 pages. |
Aslund, Magnus, “Digital Mammography with a Photon Counting Detector in a Scanned Multislit Geometry”, Doctoral Thesis, Dept of Physics, Royal Institute of Technology, Stockholm, Sweden, Apr. 2007, 51 pages. |
Chan, Heang-Ping et al., “ROC study of the effect of stereoscopic imaging on assessment of breast lesions”, Medical Physics, vol. 32, No. 4, Apr. 2005, 7 pgs. |
Cole, Elodia, et al., “The Effects of Gray Scale Image Processing on Digital Mammography Interpretation Performance”, Academic Radiology, vol. 12, No. 5, pp. 585-595, May 2005. |
Digital Clinical Reports, Tomosynthesis, GE Brochure 98-5493, Nov. 1998, 8 pgs. |
Dobbins, James T., “Digital x-ray tomosynthesis: current state of the art and clinical potential,” Physics in Medicine and Biology, Taylor and Francis LTD, London GB, vol. 48, No. 19, Oct. 7, 2003, 42 pages. |
Grant, David G., “Tomosynthesis: a three-dimensional imaging technique”, IEEE Trans. Biomed. Engineering, vol. BME-19, #1, Jan. 1972, pp. 20-28. |
Japanese Notice of Rejection in Application 2018-554775, dated Feb. 22, 2021, 10 pages. |
Japanese Office Action mailed in Application 2016-087710, dated Mar. 1, 2017, 5 pages. |
Japanese Office Action mailed in Application 2017-001579, dated Mar. 29, 2017, 1 page. (No English Translation.). |
Kachelriess, Marc et al., “Flying Focal Spot (FFS) in Cone-Beam CT”, 2004 IEEE Nuclear Science Symposium Conference Record, Oct. 16-22, 2004, Rome Italy, vol. 6, pp. 3759-3763. |
Kita et al., “Correspondence between different view breast X-rays using simulation of breast deformation”, Proceedings 1998 IEE Computer Society Conference on Computer Vision and Pattern Recognition, Santa Barbara, CA, Jun. 23-25, 1998, pp. 700-707. |
Kopans, Daniel B., “Breast Imaging”, Chapter 26: Future Advances in Breast Imaging, 2nd Edition, Lippincott-Raven Publishers, Philadelphia, 1998, 37 pages. |
Mammographic Accreditation Phantom, http://www.cirsinc.com/pdfs/015cp.pdf. (2006), 2 pgs. |
Niklason et al., “Digital breast tomosynthesis: potentially a new method for breast cancer screening”, In Digital Mammography, 1998, 6 pages. |
Niklason, Loren T. et al., “Digital Tomosynthesis in Breast Imaging”, Radiology, Nov. 1997, vol. 205, No. 2, pp. 399-406. |
Pediconi, Federica et al., “Color-coded automated signal intensity-curve for detection and characterization of breast lesions: Preliminary evaluation of a new software for MR-based breast imaging”, International Congress Series 1281 (2005) 1081-1086. |
Pisano, Etta D., “Digital Mammography”, Radiology, vol. 234, No. 2, Feb. 2005, pp. 353-362. |
Senographe 700 & 800T (GE); 2-page download on Jun. 22, 2006, from www.gehealthcare.com/inen/rad/whe/products/mswh800t.html.; Figures 1-7 on 4 sheets re lateral shift compression paddle, 2 pgs. |
Smith, A., “Fundamentals of Breast Tomosynthesis”, White Paper, Hologic Inc., WP-00007, Jun. 2008, 8 pgs. |
Smith, Andrew, PhD, “Full Field Breast Tomosynthesis”, Hologic White Paper, Oct. 2004, 6 pgs. |
Thurfjell, “Mammography screening: one versus two views and independent double reading”, Acta Radiologica 35, No. 4, 1994, pp. 345-350. |
Wheeler F. W., et al. “Micro-Calcification Detection in Digital Tomosynthesis Mammography”, Proceedings of SPIE, Conf-Physics of Semiconductor Devices, Dec. 11, 2001 to Dec. 15, 2001, Delhi, SPIE, US, vol. 6144, Feb. 13, 2006, 12 pgs. |
Wu, Tao, et al. “Tomographic Mammography Using a Limited Number of Low-Dose Cone-Beam Projection Images” Medical Physics, AIP, Melville, NY, vol. 30, No. 3, Mar. 1, 2003, p. 365-380. |
Number | Date | Country | |
---|---|---|---|
20230000455 A1 | Jan 2023 | US |
Number | Date | Country | |
---|---|---|---|
62546167 | Aug 2017 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16779153 | Jan 2020 | US |
Child | 17850529 | US |
Number | Date | Country | |
---|---|---|---|
Parent | PCT/IB2018/056208 | Aug 2018 | WO |
Child | 16779153 | US |