Interactive model interface for image selection in medical imaging systems

Information

  • Patent Grant
  • Patent Number
    12,186,119
  • Date Filed
    Tuesday, September 6, 2022
  • Date Issued
    Tuesday, January 7, 2025
  • Inventors
    • Saba; Nickolas (San Jose, CA, US)
  • Examiners
    • Artman; Thomas R
  • Agents
    • Merchant & Gould P.C.
Abstract
Systems and methods for determining an image type are disclosed. A user interface includes a visual representation of a breast and a visual representation of a medical imaging system. The visual representation of the medical imaging system includes a visual representation of a detector and either a source or a compression paddle. The visual representation of the medical imaging system may be rotatable and positionable relative to the visual representation of the breast. Based on the relative position and orientation of the visual representation of the medical imaging system relative to the visual representation of the breast, an image type is determined. The image type may be displayed at the user interface.
Description
INTRODUCTION

Medical imaging is used for visualizing the inner structures and conditions of the human body. In the context of breast imaging, medical imaging is used to detect cancerous cells in breast tissue. A plurality of different imaging processes, image acquisition parameters, and image processing techniques are used to enhance images for better detection of abnormal tissue. A significant number of different images may be taken of a single breast or of a single patient, and a technologist must work efficiently and accurately to acquire them. Imaging errors may necessitate follow-up imaging, thereby exposing the patient to excess radiation.


SUMMARY

Examples of the present disclosure describe systems and methods relating to a method for determining an imaging type, the method including: displaying a visual representation of a breast and a visual representation of a medical imaging system capable of imaging the breast, wherein the visual representation of the medical imaging system includes a visual representation of a compression mechanism and a visual representation of an x-ray detector; receiving an indication to move the visual representation of the medical imaging system to a position and an orientation relative to the visual representation of the breast; displaying the visual representation of the medical imaging system in the position and the orientation relative to the visual representation of the breast; based on the position and the orientation of the visual representation of the medical imaging system relative to the visual representation of the breast, determining an image type of the medical imaging system; displaying the image type; and automatically adjusting at least one component of the medical imaging system based at least in part on the determined image type. In an example, the method further includes acquiring, while the breast is compressed by the imaging system, an x-ray image of the breast according to the determined image type. In another example, the method further includes receiving an indication to rotate the visual representation of the breast. In yet another example, the indication to rotate the visual representation of the breast is associated with rotation relative to a vertical axis along a length of the visual representation of the breast. In still another example, the orientation of the visual representation of the medical imaging system is tilted relative to the visual representation of the breast.


In another example of the above aspect, the visual representation of the medical imaging system includes two parallel lines. In an example, the indication to move the visual representation of the medical imaging system includes an indication to rotate the visual representation of the medical imaging system in one of a clockwise or counter-clockwise direction. In another example, the indication to move the visual representation of the medical imaging system is a click and drag. In yet another example, the position and the orientation of the visual representation of the medical imaging system aligns with a predetermined position and a predetermined orientation.


In another aspect, the technology relates to a user interface for breast imaging, the user interface including: a visual representation of a breast; a visual representation of a medical imaging system capable of imaging the breast, wherein the visual representation of the medical imaging system includes: a visual representation of a compression paddle of the imaging system; and a visual representation of an x-ray detector of the imaging system; wherein the visual representation of the compression paddle and the visual representation of the x-ray detector are rotatable and positionable relative to the visual representation of the breast; and wherein an image type is displayed, based on a position and an orientation of the visual representation of the medical imaging system relative to the visual representation of the breast. In an example, the visual representation of the breast is three-dimensional. In another example, the visual representation of the breast is rotatable. In yet another example, the visual representation of the breast deforms based on the position and the orientation of the visual representation of a compression paddle and the visual representation of the platform relative to the visual representation of the breast. In yet another example, the visual representation of the breast includes a visual representation of two breasts. In still another example, the visual representation of the medical imaging system includes two parallel lines.


In another example of the above aspect, the two parallel lines are fixed relative to each other. In an example, the visual representation of the compression paddle and the visual representation of the x-ray detector are rotatable and positionable relative to a predetermined position and a predetermined orientation. In another example, the predetermined position and the predetermined orientation are associated with a specific image type. In yet another example, the image type is selected from a group consisting of: craniocaudal (CC); mediolateral oblique (MLO); mediolateral (ML); exaggerated craniocaudal lateral (XCCL); exaggerated craniocaudal medial (XCCM); cleavage view (CV); lateromedial (LM); tangential (TAN); caudocranial from below (FB); axillary tail (AT); lateromedial oblique (LMO); superoinferior oblique (SIO); and inferomedial superolateral oblique (ISO).


In another aspect, the technology relates to an apparatus for breast imaging, the apparatus including: an x-ray source capable of selectively moving relative to a breast; an x-ray detector; a compression system for compressing the breast, the compression system disposed between the x-ray source and the x-ray detector; a display; a processor; and memory storing instructions that, when executed by the processor, cause the apparatus to perform a set of operations including: displaying, at the display, a visual representation of the breast, a visual representation of the compression system, and a visual representation of the x-ray detector; receiving an indication to move the visual representation of the compression system and the visual representation of the x-ray detector relative to the visual representation of the breast; displaying, at the display, the visual representation of the compression system and the visual representation of the x-ray detector in a position and an orientation relative to the visual representation of the breast, based on the indication; based on the position and the orientation of the visual representation of the compression system and the visual representation of the x-ray detector relative to the visual representation of the breast, determining an image type; and displaying, at the display, the image type.


This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Additional aspects, features, and/or advantages of examples will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the disclosure.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying figures illustrate one or more aspects of the disclosed methods and systems. In the appended figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label. Non-limiting and non-exhaustive examples are described with reference to the following figures:



FIG. 1A depicts a schematic view of an example imaging system.



FIG. 1B depicts a perspective view of the example imaging system of FIG. 1A.



FIG. 2 depicts a prior art configuration of a user interface for selecting an image type for a medical imaging system.



FIG. 3 depicts an example user interface for a medical imaging system.



FIGS. 4A-4D depict different configurations of the example user interface of FIG. 3.



FIG. 5 depicts an example method for determining an image type for a medical imaging system.



FIG. 6 illustrates an example operating environment for a medical imaging system.





While examples of the disclosure are amenable to various modifications and alternate forms, specific examples have been shown by way of example in the drawings and are described in detail below. The intention is not to limit the scope of the disclosure to the particular examples described. On the contrary, the disclosure is intended to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure and the appended claims.


DETAILED DESCRIPTION

Various aspects of the disclosure are described more fully below, with reference to the accompanying drawings, which show specific example aspects. However, different aspects of the disclosure may be implemented in many different forms and should not be construed as limited to the aspects described herein; rather, these aspects are provided so that this disclosure will be thorough and complete and will fully convey the scope of the aspects to those skilled in the art. Aspects may be practiced as methods, systems, or devices. The following detailed description is, therefore, not to be interpreted in a limiting sense.


Breast cancer is one of the leading causes of cancer-related mortality among women. Abnormalities may be identified in the breast tissue by implementing one or more imaging techniques such as breast CT, breast tomosynthesis, and digital mammography. These imaging techniques operate by emitting x-ray radiation from a source and detecting the radiation at a detector. A breast that is being imaged is placed between the source and the detector, and is typically compressed or immobilized therebetween. Placement and orientation of the source and detector relative to the breast may be adjusted. For example, different relative angles of radiation emitted through the breast may provide different view angles to view and/or assess an abnormality in the breast. The image types (otherwise referred to as image views) are based on the relative position and orientation of the source and detector relative to the breast. Examples of image types include craniocaudal (CC), mediolateral oblique (MLO), mediolateral (ML), exaggerated craniocaudal lateral (XCCL), exaggerated craniocaudal medial (XCCM), cleavage view or “valley view” (CV), lateromedial (LM), tangential (TAN), caudocranial from below (FB), axillary tail (AT), lateromedial oblique (LMO), superoinferior oblique (SIO), inferomedial superolateral oblique (ISO), and specimen, among others. Image types may be associated with one or more imaging modalities, such as conventional mammography, tomosynthesis, HD tomosynthesis, mammography/synthesis combination, HD combination, 2D contrast-enhanced, 2D contrast-enhanced combination, biopsy, quality control, etc.


Desired imaging type(s) are selected by a medical professional such as a technologist prior to imaging a breast with a medical imaging system. Traditionally, imaging type(s) are selected from a list of all available imaging types for a specific imaging modality. The list of image types may be provided as a list of images showing a breast relative to components of an imaging system (e.g., source and/or detector). This approach, however, requires the medical professional to scan through a list of options and navigate through different tabs on an interface. Additionally, by looking through a list of options, the medical professional needs to compare each list option with the configuration of the medical imaging system. Thus, the list format burdens the medical professional.


Accordingly, the present disclosure provides systems and methods for displaying, identifying, and selecting an image type for a medical imaging system. In an example, the present technology provides a user interface. The user interface includes a visual representation of a breast and a visual representation of a medical imaging system. The visual representation of the medical imaging system includes a visual representation of one or more of a detector, a source, and a compression paddle. The visual representation of the medical imaging system may be rotatable and positionable relative to the visual representation of the breast. Based on the relative position and orientation of the visual representation of the medical imaging system relative to the visual representation of the breast, an image type is determined. The image type may be displayed at the user interface. While the present disclosure is directed to breast imaging, the concepts and functions described herein are also applicable to imaging of other body parts (e.g., the abdomen, appendages such as arms and feet, etc.). Certain body parts, such as limbs, are more amenable to rotation thereof relative to a static imaging system. The systems and methods described herein that depict rotation of an imaging system relative to a static breast are equally applicable to rotation of a body part relative to a static imaging system; necessary or desirable modifications thereof would be apparent to a person of skill in the art upon reading this disclosure.



FIGS. 1A-1B show different views of an example imaging system 100. FIG. 1A depicts a schematic view of the example imaging system 100 and FIG. 1B depicts a perspective view of the example imaging system 100. The descriptions provided herein may be applied to either an upright (shown) or prone (not shown) imaging system 100. For simplicity, the following discussion includes examples for use with an upright breast tomosynthesis imaging system (such as the Dimensions Breast Tomosynthesis imaging system provided by Hologic, Inc.).


The imaging system 100 immobilizes a patient's breast 102 for x-ray imaging (either or both of mammography and tomosynthesis) via a breast compression immobilizer unit 104 that includes a static breast support platform 106 and a foam compressive element 108. Different paddles, each having different purposes, are known in the art. Certain example paddles are also described herein for context. The breast support platform 106 and the foam compressive element 108 each have a compression surface 110 and 112, respectively, that move towards each other to compress, immobilize, stabilize, or otherwise hold and secure the breast 102 during imaging procedures. In known systems, the compression surfaces 110, 112 are exposed so as to directly contact the breast 102. Compression surface 110 may be a rigid plastic, a flexible plastic, a resilient foam, a mesh or screen, and so on. Compression surface 112 is a lower surface of the foam compressive element 108. The platform 106 also houses an image receptor 116 and, optionally, a tilting mechanism 118, and optionally an anti-scatter grid (not depicted, but disposed above the image receptor 116). The immobilizer unit 104 (otherwise referred to herein as the compression system 104) is in a path of an imaging beam 120 emanating from x-ray source 122, such that the imaging beam 120 impinges on the image receptor 116.


The immobilizer unit 104 is supported on a first support arm 124 via a compression arm 134, which is configured to be raised and lowered along the support arm 124. The x-ray source 122 is supported on a second support arm, also referred to as a tube head 126. For mammography, support arms 124, 126 can rotate as a unit about an axis 128 between different imaging orientations such as craniocaudal (CC) and mediolateral oblique (MLO), so that the imaging system 100 can take a mammogram projection image at each orientation. (The terms front, lower, and upper pertain to using a CC imaging orientation, with the patient facing the front of the imaging system, although it should be understood that other imaging orientations, including MLO, are used with the same equipment.) In operation, the image receptor 116 remains in place relative to the platform 106 while an image is taken. The immobilizer unit 104 releases the breast 102 for movement of arms 124, 126 to a different imaging orientation. For tomosynthesis, the support arm 124 stays in place, with the breast 102 immobilized and remaining in place, while at least the second support arm 126 rotates the x-ray source 122 relative to the immobilizer unit 104 and the compressed breast 102 about the axis 128. The imaging system 100 takes plural tomosynthesis projection images of the breast 102 at respective angles of the imaging beam 120 relative to the breast 102.


Concurrently and optionally, the image receptor 116 may be tilted relative to the breast support platform 106 and in sync with the rotation of the second support arm 126. The tilting can be through the same angle as the rotation of the x-ray source 122 but may also be through a different angle selected such that the imaging beam 120 remains substantially in the same position on the image receptor 116 for each of the plural images. The tilting can be about an axis 130, which can but need not be in the image plane of the image receptor 116. The tilting mechanism 118 that is coupled to the image receptor 116 can drive the image receptor 116 in a tilting motion. For tomosynthesis imaging and/or CT imaging, the breast support platform 106 can be horizontal or can be at an angle to the horizontal, e.g., at an orientation similar to that for conventional MLO imaging in mammography. The imaging system 100 can be solely a mammography system, solely a CT system, solely a tomosynthesis system, or a “combo” system that can perform multiple forms of imaging. An example of such a combo system has been offered by the assignee hereof under the trade name Selenia Dimensions.


When the system is operated, the image receptor 116 produces imaging information in response to illumination by the imaging beam 120 and supplies it to an image processor 132 for processing and generating breast x-ray images. A system control and workstation unit 138 including software controls the operation of the system and interacts with the operator to receive commands and deliver information including processed x-ray images.


The imaging system 100 includes a floor mount or base 140 for supporting the imaging system 100 on a floor. A gantry 142 extends upwards from the base 140 and rotatably supports both the tube head 126 and the support arm 124. The tube head 126 and support arm 124 are configured to rotate discretely from each other and may also be raised and lowered along a face 144 of the gantry 142 so as to accommodate patients of different heights. The x-ray source 122 is disposed within the tube head 126. Together, the tube head 126 and support arm 124 may be referred to as a C-arm 124.


A number of interfaces and display screens are disposed on the imaging system 100. Additionally or alternatively, a number of interfaces and display screens are disposed at the workstation unit 138, which may be located outside of the room containing the other components of the imaging system 100. These include a foot display screen 146, a gantry interface 148, a support arm interface 150, and a compression arm interface 152. In general, the various interfaces 148, 150, and 152 may include one or more tactile buttons, knobs, switches, as well as one or more display screens, including capacitive touch screens with graphical user interfaces (GUIs) so as to enable user interaction with and control of the imaging system 100. In general, the foot display screen 146 is primarily a display screen, though a capacitive touch screen might be utilized if required or desired.



FIG. 2 depicts a configuration of a user interface 200 for selecting an image type for a medical imaging system. The user interface 200 includes a set of image types 202, a set of image modalities 204, and imaging plan elements 206. The set of image types 202 includes a list of selectable image types on each tab for the set of image modalities 204. As shown, the “Conventional” modality tab is selected and shows a set of image types 202 organized by left and right breast imaging. Each tab for the set of image modalities 204 may include a different set of image types 202 associated with the selected tab. The imaging plan elements 206 include an “Add” element and a “Clear” element to add one or more image types to a list or to remove them. In an example, a medical professional using the user interface 200 may add one or more image types into a plan prior to imaging a breast. The image types may all be added to the plan before positioning the breast or after positioning the breast, or one or more image types may be added after positioning the breast for each image type.



FIG. 3 depicts an example user interface 300 for a medical imaging system. The user interface 300 includes a visual representation of a breast 302 and a visual representation of a medical imaging system 304. The visual representation of the breast 302 may be two-dimensional or three-dimensional. Additionally, the visual representation of the breast 302 may rotate about an axis disposed substantially parallel to the user interface 300 (or to the screen on which the user interface is displayed). For example, the visual representation of the breast 302 may swivel or pivot about an axis representing a position of a spine of a patient. The visual representation of the breast 302 may deform based on the relative position and orientation of the visual representation of the medical imaging system 304. For example, the visual representation of the breast 302 may deform to mimic deformation of breast tissue under compression by the imaging system. The visual representation of the breast 302 may include visual representations of one or two breasts. An example that might be more intuitive for a technologist, however, includes a visual representation of both breasts 302 of the patient, as depicted in FIG. 3. This representation of both breasts 302 may remain fixed in a position that is consistent with the patient standing or sitting at an imaging system, e.g., the representation of both breasts 302 does not rotate. This fixed, non-rotating representation provides the technologist with a consistent frame of reference.


The visual representation of the medical imaging system 304 includes a visual representation of a detector 308 and a visual representation of one or more of a source 306 or a compression paddle 306 of the imaging system. In FIG. 3, a paddle 306 is depicted. In this regard, the representation of both the detector 308 and the paddle 306 is again intuitive for a technologist, since the breast contacts the paddle and detector (or support platform under which the detector is located) during imaging. Regardless, as further described herein, a specific source 306 may additionally or alternatively be visually represented. As further described herein, the visual representation of the medical imaging system 304 may be rotatable and positionable relative to the visual representation of the breast 302. This may be the most intuitive configuration for a technologist, since during most breast imaging procedures, the patient remains upright while the detector, source, and compression system of the imaging system rotate. The visual representation of a detector 308 and the visual representation of the source or paddle 306 may be fixed relative to each other or, alternatively, may be independently moved (e.g., towards or away from each other) or rotated. To rotate the visual representation of the medical imaging system 304, a rotation element 310 may be selected. Although one rotation element 310 is shown, multiple rotation elements are contemplated (e.g., one for the visual representation of a detector 308 and one for the visual representation of the source or paddle 306 and/or one for the visual representation of the breast 302). Interaction with the rotation element 310 at the user interface 300 (e.g., a click, a click and drag, etc.) may result in rotation of the associated visually represented element. Rotation may be stepwise or may be based on an amount of drag or time with which the rotation element 310 is interacted.
Rotation of a visually represented element may be about a center of the element or about a predefined point on the element. Rotation may also be about a predefined point approximately midway between the detector 308 and the source or paddle 306. Rotation may be clockwise or counterclockwise about an axis of rotation.
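By way of a non-authoritative illustration, rotation of the two-line representation about the point midway between the detector line and the paddle line might be sketched as follows; the function names and coordinates are hypothetical and are not taken from the disclosure:

```python
import math

def rotate_point(x, y, cx, cy, angle_deg):
    """Rotate point (x, y) counterclockwise about pivot (cx, cy) by angle_deg."""
    a = math.radians(angle_deg)
    dx, dy = x - cx, y - cy
    return (cx + dx * math.cos(a) - dy * math.sin(a),
            cy + dx * math.sin(a) + dy * math.cos(a))

def rotate_imaging_system(detector, paddle, angle_deg):
    """Rotate the two parallel lines (each a pair of (x, y) endpoints) together
    about the point midway between them, preserving their fixed spacing."""
    pts = detector + paddle
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    rotated = [rotate_point(x, y, cx, cy, angle_deg) for (x, y) in pts]
    return rotated[:len(detector)], rotated[len(detector):]
```

Because both lines rotate about the same pivot, their parallelism and relative spacing are preserved, consistent with the fixed-relative-position behavior described herein.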


The visual representation of the medical imaging system 304 shown in FIG. 3 includes two two-dimensional parallel lines representing the detector 308 and the source and/or paddle 306. The distance between the parallel lines may be fixed or variable and may be representative of the relative position of the detector to the compression paddle. Additionally, the visual representation of the medical imaging system 304 may be two-dimensional or three-dimensional. The dimensions of the visual representation of the medical imaging system 304 may be the same as or different than the visual representation of the breast 302. In an example, the visual representation of the breast 302 is three-dimensional and the visual representation of the medical imaging system 304 is two-dimensional. In another example, both the visual representation of the breast 302 and the visual representation of the medical imaging system 304 are three-dimensional. In a further example, both the visual representation of the breast 302 and the visual representation of the medical imaging system 304 are two-dimensional.


Based on the relative position and orientation of the visual representation of the medical imaging system 304 relative to the visual representation of the breast 302, an image type is determined. The image type may be determined based on predetermined zones or regions containing the visual representation of the medical imaging system 304. Alternatively, the position and orientation of the visual representation of the medical imaging system 304 may be compared to predetermined positions/orientations, with each position/orientation associated with a specific imaging type. In an example, the actual position/orientation of the visual representation of the medical imaging system 304 relative to the visual representation of the breast 302 is compared with the predetermined positions/orientations to determine which predetermined position/orientation is closest. The visual representation of the medical imaging system 304 may be moved or rotated based on a click and drag action. In an example, the visual representation of the medical imaging system 304 may snap to the nearest predetermined position/orientation upon release of the click and drag action. The image type may be displayed at the user interface. Additionally, based on the determined image type, an image of a real breast is captured while the breast is compressed or immobilized by the imaging system.
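A minimal sketch of the nearest-configuration comparison and snapping described above might look like the following; the view names, poses, and weights are illustrative assumptions, not values from the disclosure:

```python
import math

# Hypothetical predetermined poses: (x, y, angle in degrees) of the
# imaging-system representation relative to the breast representation,
# keyed by image type. Values are illustrative only.
PREDETERMINED_VIEWS = {
    "LCC":  (0.0, 0.0,  0.0),   # horizontal
    "LMLO": (0.0, 0.0, 45.0),   # tilted
    "LML":  (0.0, 0.0, 90.0),   # vertical
}

def angle_difference(a, b):
    """Smallest absolute difference between two angles, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def nearest_view(x, y, angle, position_weight=1.0, angle_weight=0.05):
    """Return the image type whose predetermined pose is closest to the
    dragged representation, along with the pose to snap to on release."""
    def score(view):
        vx, vy, va = PREDETERMINED_VIEWS[view]
        return (position_weight * math.hypot(x - vx, y - vy)
                + angle_weight * angle_difference(angle, va))
    best = min(PREDETERMINED_VIEWS, key=score)
    return best, PREDETERMINED_VIEWS[best]
```

On release of a click-and-drag, the representation would be redrawn at the returned pose and the returned image type displayed at the user interface.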


Elements on the user interface 300 may include controls, graphics, charts, tool bars, input fields, icons, etc. Alternatively, other suitable means for providing input may be provided at the medical imaging system or at the workstation, for instance by a wheel, dial, knob, keyboard, mouse, bezel key, or other suitable interactive device. Thus, commands associated with the user interface 300 may be accepted through a display as touch input or through other input devices. Inputs may be received by the medical imaging system from a medical professional. A variety of gestures may be supported by the user interface 300, including a swipe, double-tap, drag, touch and hold, drag and drop, etc. A drag gesture may include an uninterrupted selection and movement. For example, a drag may include movement of a touch interaction across a touch surface without losing contact with the touch surface. The drag gesture may be similar to a swipe gesture over a longer time and/or at a slower speed. As further described herein, a drag gesture at the user interface 300 may visually change a portion of the user interface 300, such as by rotating or moving the visual representation of the breast 302 or the visual representation of the medical imaging system 304.
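The drag-versus-swipe distinction above (the same uninterrupted motion, differing in duration and speed) could be classified with a simple rule; the thresholds here are illustrative assumptions only, not values from the disclosure:

```python
# Illustrative thresholds, not values from the disclosure.
SWIPE_MAX_DURATION_S = 0.3
SWIPE_MIN_SPEED_PX_S = 500.0

def classify_gesture(distance_px, duration_s):
    """Label an uninterrupted touch movement as a swipe (short and fast)
    or a drag (longer in time and/or slower in speed)."""
    if duration_s <= 0.0:
        return "tap"
    speed = distance_px / duration_s
    if duration_s <= SWIPE_MAX_DURATION_S and speed >= SWIPE_MIN_SPEED_PX_S:
        return "swipe"
    return "drag"
```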



FIGS. 4A-4D depict different configurations of the example user interface of FIG. 3. For example, a first configuration 400A shows the visual representation of the medical imaging system 304 in a horizontal position/orientation relative to the visual representation of the breast 302 that is associated with an LCV (or Left Cleavage View) image type. In another example, a second configuration 400B shows the visual representation of the medical imaging system 304 in a horizontal position/orientation relative to the visual representation of the breast 302 that is associated with an LCC (or Left Craniocaudal) image type. In another example, a third configuration 400C shows the visual representation of the medical imaging system 304 in a tilted position/orientation relative to the visual representation of the breast 302 that is associated with an LMLO (or Left Mediolateral Oblique) image type. In a further example, a fourth configuration 400D shows the visual representation of the medical imaging system 304 in a vertical position/orientation relative to the visual representation of the breast 302 that is associated with an LML (or Left Mediolateral) image type. Although only four configurations are shown, other configurations associated with other image types are contemplated. For example, FIG. 2 depicts a number of breast image types, as known in the conventional mammography modality. A representation of any known or developed image type in any known or developed imaging modality may be utilized.


In an example application, a doctor or other specialist may, prior to an imaging procedure, select one or more image types to be later obtained by a technologist who is working with the patient and the imaging system. The specialist may select one or more image types depending on a previously-identified region of interest that requires further investigation, a prior diagnosis, or to investigate a new suspect location. In most examples, the specialist may select the representation from a predefined list of representations of image types (such as the image types depicted in FIG. 2 or as otherwise known in the art). These particular representations may be saved to the patient's record, then accessed by a technologist during the imaging procedure. The technologist may select the image on a workstation or on an imaging system itself. The imaging system may then automatically adjust the position of any number of components to place those components in the general position required to take the selected image type. For example, for a right mediolateral oblique (RMLO) image type, the breast support platform and detector may tilt to an angle consistent with such an image type, as would the compression paddle. A tube head containing the x-ray source may move to a position out of the way of the technologist as she assists in placing and compressing the breast. Adjustments to the position of the detector and/or compression paddle may be possible to fine tune positioning as required or desired to accommodate patient comfort, anatomy, height, etc. Thereafter, the patient's breast may be compressed and the imaging procedure may be performed. Subsequent to imaging and release of the breast from the imaging system, the technologist may select the next required representation and the imaging system may automatically position its components as required in preparation therefor.
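The automatic pre-positioning described above amounts to a lookup from the selected image type to target component positions; the presets, command names, and angles below are hypothetical illustrations, not values from the disclosure:

```python
# Hypothetical presets mapping an image type to target component positions.
# Angles are illustrative only.
VIEW_PRESETS = {
    "RCC":  {"c_arm_angle": 0.0},
    "RMLO": {"c_arm_angle": 45.0},
    "RML":  {"c_arm_angle": 90.0},
}

def positioning_commands(image_type):
    """Return the adjustments to perform before the breast is positioned:
    tilt the support platform/detector and paddle to the preset angle, and
    park the tube head clear of the technologist."""
    preset = VIEW_PRESETS[image_type]
    return [
        ("rotate_c_arm", preset["c_arm_angle"]),
        ("park_tube_head", "clear_of_technologist"),
    ]
```

Fine adjustment of the detector and/or paddle for patient comfort, anatomy, and height would then be made manually, as described above.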


Each image type in a predefined list of image types includes a representation of the breasts, a representation of the detector (relative to the breast), and a representation of the paddle and/or source (relative to the breast and the detector), along with an indicator of image type (e.g., RCC, LXCCL, RCV, etc.). The technologies described herein, however, contemplate a user interface that may reflect non-standard relationships between a representation of the breast and the representation of the detector and paddle and/or source. In such a system, the representation of the detector and paddle and/or source may be movable relative to the representation of the breast. As such, based on the user control described above, the user may move the representation of the detector and the paddle and/or source along an x-axis (e.g., between the right breast representation and the left breast representation), along a y-axis (e.g., higher or lower along the breast representation), and about a z-axis (e.g., a rotation position reflected in the angle of the detector representation relative to the breast representation). Information may be provided on the interface (such as detector angle) for future reference by the technologist when preparing the imaging system. The ability to move the representations along the x- and y-axes and about the z-axis allows the user interface to present easy-to-understand configurations, so a technologist may readily ascertain the type of image required or desired.


Although the user interface technologies described herein are described primarily in the context of breast imaging, imaging of other types of body parts is also contemplated. Breast imaging is a particular type of imaging that allows for movement of the imaging system (e.g., detector and source) to accommodate an otherwise static body part (e.g., the breast is not easy or comfortable to manipulate). As such, adjustment of the representations of the detector and paddle and/or source is generally about three axes. In imaging procedures for other body parts that are more easily manipulated relative to the imaging system, the user interface may instead reflect the position and rotation of the body part relative to the detector. For example, if the detector on an imaging system is fixed (e.g., such as with a table-type imaging system), the representation of the body part itself may be manipulated within the user interface, e.g., with regard to both position and degree of rotation, while a representation of the detector remains fixed. Imaging information such as specific pre-defined image type, angle of rotation of the body part, or other information, may be provided on the user interface.


In one example, based on a particular imaging procedure, a series of image types may be preloaded as a way to walk the specialist through an imaging procedure. For example, for a routine screening exam, it is typical to perform bilateral craniocaudal (CC) and mediolateral oblique (MLO) views. For patients undergoing a diagnostic exam, additional views may be indicated. The series of image types may be preloaded onto the workstation and displayed to the specialist. The series may be based on the type of screening exam, the patient's electronic medical record, the practices at the imaging facility, the geographic region of the imaging facility, and indications prescribed by a radiologist.
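Such a preloaded series might be sketched as a lookup keyed on exam type. The exam-type names and the diagnostic view list below are illustrative assumptions; the routine screening series of bilateral CC and MLO views follows the paragraph above.

```python
# Bilateral CC and MLO views, typical for a routine screening exam.
ROUTINE_SCREENING = ["RCC", "LCC", "RMLO", "LMLO"]
# Additional views for a diagnostic exam; this particular set is an assumption.
DIAGNOSTIC_EXTRAS = ["RML", "LML", "RXCCL", "LXCCL"]

def preload_series(exam_type: str) -> list:
    """Return the series of image types to preload for a given exam type."""
    series = list(ROUTINE_SCREENING)
    if exam_type == "diagnostic":
        series += DIAGNOSTIC_EXTRAS
    return series
```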



FIG. 5 depicts an example method 500 for determining an image type for a medical imaging system. The method 500 may assist a user of an imaging system to determine an appropriate image type to image a breast, based on a configuration of the breast relative to the imaging system. The method 500 may be performed using systems described herein (e.g., imaging system 100 and user interface 300).


The method 500 begins at operation 502, where a visual representation of a breast and a visual representation of a medical imaging system are displayed. The display may be coupled to a medical imaging system or may be remote from a medical imaging system, such as at a remote workstation. The display shows a graphical user interface (GUI) that may have one or more features of the user interface 300 described with respect to FIG. 3. The visual representation of the medical imaging system may include visual representations of one or more components of the medical imaging system (e.g., the components shown in FIGS. 1A-1B of imaging system 100).


At operation 504, an indication to move the visual representation of the medical imaging system to a position and an orientation relative to the visual representation of the breast is received. The indication may be received at the GUI (e.g., via touch input) or may be received at a peripheral device connected to the display showing the GUI. In an example, movement and/or rotation of a visually represented element on the GUI may be a click and drag of the element. Visually represented elements being moved on the GUI may move relative to any received user input in real-time (e.g., an element moves according to a drag movement input).


At operation 506, the visual representation of the medical imaging system in the position and the orientation relative to the visual representation of the breast is displayed. The visual representation of the medical imaging system may snap to a closest position of a set of predetermined positions/orientations. For example, upon release of a click and drag input associated with the visual representation of the medical imaging system, the visual representation of the medical imaging system may be automatically re-positioned or re-oriented to align with a predetermined position/orientation. Each position/orientation of the set of predetermined positions may be associated with a unique image type.
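The snap-to-nearest behavior described at operation 506 can be sketched as a nearest-pose search over the set of predetermined positions/orientations. The poses, coordinates, and distance weighting below are illustrative assumptions, not values from the disclosure.

```python
import math

# Predetermined poses of the imaging-system representation, expressed as
# (x, y, angle in degrees) mapped to image types. Values are assumptions.
PRESET_POSES = {
    (0.0, 0.0, 0.0): "LCC",
    (0.0, 0.0, 45.0): "LMLO",
    (0.0, 0.0, 90.0): "LML",
}

def snap(pose):
    """Return the preset pose closest to the pose where the drag was released."""
    x, y, angle = pose

    def distance(preset):
        px, py, pa = preset
        # Combine translational offset with a normalized rotational offset.
        return math.hypot(px - x, py - y) + abs(pa - angle) / 90.0

    return min(PRESET_POSES, key=distance)
```

Upon release of the drag input, the representation would be redrawn at `snap(current_pose)`, and the associated image type looked up from the same table.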


At operation 508, an image type is determined based on the relative position and orientation of the visual representations. As further described herein, the image type may be based on which predetermined position/orientation is closest to the current position/orientation of the visual representation of the imaging system. Alternatively, the image type may be based on the current position/orientation of the visual representation of the medical imaging system being contained substantially within a predefined region or threshold. At operation 510, the image type is displayed. At operation 512, the imaging system may then automatically adjust the position of any number of components to place those components in the general position required to take the selected image type.
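The threshold-based alternative at operation 508 can be sketched as checking which predefined region contains the current orientation. The region boundaries below are illustrative assumptions:

```python
# Angular regions (low, high, image type); the boundaries are assumptions.
ANGLE_REGIONS = [
    (-15.0, 15.0, "LCC"),   # roughly horizontal
    (30.0, 60.0, "LMLO"),   # tilted
    (75.0, 105.0, "LML"),   # roughly vertical
]

def image_type_for_angle(angle: float):
    """Return the image type whose region contains the angle, else None."""
    for low, high, image_type in ANGLE_REGIONS:
        if low <= angle <= high:
            return image_type
    return None
```

A `None` result would indicate that the representation is not substantially within any predefined region, in which case no image type is displayed.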


Operations 504-512 may repeat as required or desired. For example, if a visually represented element on the GUI is moved or reoriented relative to another visually represented element, the image type may be re-evaluated.



FIG. 6 illustrates an example operating environment 600 suitable for a medical imaging system, as described herein. In its most basic configuration, operating environment 600 typically includes at least one processing unit (or processor) 602 and memory 604. Depending on the exact configuration and type of computing device, memory 604 (storing instructions to perform the operations described herein) may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.), or some combination of the two. This most basic configuration is illustrated in FIG. 6 by dashed line 606. Further, environment 600 may also include storage devices (removable, 608, and/or non-removable, 610) including, but not limited to, magnetic or optical disks or tape. Similarly, environment 600 may also have input device(s) 614 such as a keyboard, mouse, pen, voice input, etc., and/or output device(s) 616 such as a display, speakers, printer, etc. Also included in the environment may be one or more communication connections 612, such as LAN, WAN, point to point, etc. In embodiments, the connections may be operable to facilitate point-to-point communications, connection-oriented communications, connectionless communications, etc.


Operating environment 600 typically includes at least some form of computer readable media. Computer readable media can be any available media that can be accessed by one or more processing units (or processors) 602 or other devices comprising the operating environment. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transitory medium which can be used to store the desired information. Computer storage media does not include communication media.


Communication media embodies computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, microwave, and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.


The operating environment 600 may be a single computer operating in a networked environment using logical connections to one or more remote computers. The remote computer may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above as well as others not so mentioned. As an example, the operating environment 600 may be shared between one or more imaging systems, such as imaging system 100. The logical connections may include any method supported by available communications media. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet.


Although aspects of the present disclosure are described with respect to image analysis of living breast tissue, it should be appreciated that the present disclosure may also be useful in a variety of other applications, such as imaging excised breast tissue, other tissue, bone, living organisms, body parts, or any other object, living or dead.


As should be appreciated, while the above methods have been described in a particular order, no such order is inherently necessary for each operation identified in the methods. For instance, the operations identified in the methods may be performed concurrently with other operations or in different orders. In addition, the methods described above may be performed by the systems described herein. For example, a system may have at least one processor and memory storing instructions that, when executed by the at least one processor, cause the system to perform the methods described herein.


The embodiments described herein may be employed using software, hardware, or a combination of software and hardware to implement and perform the systems and methods disclosed herein. Although specific devices have been recited throughout the disclosure as performing specific functions, one of skill in the art will appreciate that these devices are provided for illustrative purposes, and other devices may be employed to perform the functionality disclosed herein without departing from the scope of the disclosure.


This disclosure describes some embodiments of the present technology with reference to the accompanying drawings, in which only some of the possible embodiments are shown. Other aspects may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure is thorough and complete and fully conveys the scope of the possible embodiments to those skilled in the art. Further, as used herein and in the claims, the phrase “at least one of element A, element B, or element C” is intended to convey any of: element A, element B, element C, elements A and B, elements A and C, elements B and C, and elements A, B, and C.


Although specific embodiments are described herein, the scope of the technology is not limited to those specific embodiments. One skilled in the art will recognize other embodiments or improvements that are within the scope and spirit of the present technology. Therefore, the specific structure, acts, or media are disclosed only as illustrative embodiments. The scope of the technology is defined by the following claims and any equivalents therein.

Claims
  • 1. A method for determining an imaging type, the method comprising: displaying a visual representation of a breast and a visual representation of a medical imaging system capable of imaging the breast, wherein the visual representation of the medical imaging system includes a visual representation of a compression mechanism and a visual representation of an x-ray detector;receiving an indication to move the visual representation of the medical imaging system to a position and an orientation relative to the visual representation of the breast;displaying the visual representation of the medical imaging system in the position and the orientation relative to the visual representation of the breast;based on the position and the orientation of the visual representation of the medical imaging system relative to the visual representation of the breast, determining an image type of the medical imaging system;displaying the image type; andautomatically adjusting at least one component of the medical imaging system based at least in part on the determined image type.
  • 2. The method of claim 1, the method further comprising: acquiring, while the breast is compressed by the imaging system, an x-ray image of the breast according to the determined image type.
  • 3. The method of claim 1, the method further comprising: receiving an indication to rotate the visual representation of the breast.
  • 4. The method of claim 3, wherein the indication to rotate the visual representation of the breast is associated with rotation relative to a vertical axis along a length of the visual representation of the breast.
  • 5. The method of claim 1, wherein the orientation of the visual representation of the medical imaging system is tilted relative to the visual representation of the breast.
  • 6. The method of claim 1, wherein the visual representation of the medical imaging system includes two parallel lines.
  • 7. The method of claim 1, wherein the indication to move the visual representation of the medical imaging system includes an indication to rotate the visual representation of the medical imaging system in one of a clockwise or counter-clockwise direction.
  • 8. The method of claim 1, wherein the indication to move the visual representation of the medical imaging system is a click and drag.
  • 9. The method of claim 1, wherein the position and the orientation of the visual representation of the medical imaging system aligns with a predetermined position and a predetermined orientation.
  • 10. A user interface for breast imaging, the user interface comprising: a visual representation of a breast;a visual representation of a medical imaging system capable of imaging the breast, wherein the visual representation of the medical imaging system includes: a visual representation of a compression paddle of the imaging system; anda visual representation of an x-ray detector of the imaging system;wherein the visual representation of the compression paddle and the visual representation of the x-ray detector are rotatable and positionable relative to the visual representation of the breast; andwherein an image type is displayed, based on a position and an orientation of the visual representation of the medical imaging system relative to the visual representation of the breast.
  • 11. The user interface of claim 10, wherein the visual representation of the breast is three-dimensional.
  • 12. The user interface of claim 11, wherein the visual representation of the breast is rotatable.
  • 13. The user interface of claim 10, wherein the visual representation of the breast deforms based on the position and the orientation of the visual representation of a compression paddle and the visual representation of the x-ray detector relative to the visual representation of the breast.
  • 14. The user interface of claim 10, wherein the visual representation of the breast includes a visual representation of two breasts.
  • 15. The user interface of claim 10, wherein the visual representation of the medical imaging system includes two parallel lines.
  • 16. The user interface of claim 15, wherein the two parallel lines are fixed relative to each other.
  • 17. The user interface of claim 15, wherein the visual representation of the compression paddle and the visual representation of the x-ray detector are rotatable and positionable relative to a predetermined position and a predetermined orientation.
  • 18. The user interface of claim 17, wherein the predetermined position and the predetermined orientation are associated with a specific image type.
  • 19. The user interface of claim 10, wherein the image type is selected from a group consisting of: craniocaudal (CC);mediolateral oblique (MLO);mediolateral (ML);exaggerated craniocaudal lateral (XCCL);exaggerated craniocaudal medial (XCCM);cleavage view (CV);lateromedial (LM);tangential (TAN);caudocranial from below (FB);axillary tail (AT);lateromedial oblique (LMO);superoinferior oblique (SIO); andinferomedial superolateral oblique (ISO).
  • 20. An apparatus for breast imaging, the apparatus comprising: an x-ray source capable of selectively moving relative to a breast;an x-ray detector;a compression system for compressing the breast, the compression system disposed between the x-ray source and the x-ray detector;a display;a processor; andmemory storing instructions that, when executed by the processor, cause the apparatus to perform a set of operations comprising: displaying, at the display, a visual representation of the breast, a visual representation of the compression system, and a visual representation of the x-ray detector;receiving an indication to move the visual representation of the compression system and the visual representation of the x-ray detector relative to the visual representation of the breast;displaying, at the display, the visual representation of the compression system and the visual representation of the x-ray detector in a position and an orientation relative to the visual representation of the breast, based on the indication;based on the position and the orientation of the visual compression system and the visual representation of the x-ray detector relative to the visual representation of the breast, determining an image type; anddisplaying, at the display, the image type.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of priority to U.S. Provisional Application No. 63/252,320 filed Oct. 5, 2021, which application is hereby incorporated in its entirety by reference.

US Referenced Citations (255)
Number Name Date Kind
3502878 Stewart Mar 1970 A
3863073 Wagner Jan 1975 A
3971950 Evans et al. Jul 1976 A
4160906 Daniels et al. Jul 1979 A
4310766 Finkenzeller et al. Jan 1982 A
4496557 Malen et al. Jan 1985 A
4559641 Caugant et al. Dec 1985 A
4706269 Reina et al. Nov 1987 A
4744099 Huettenrauch et al. May 1988 A
4773086 Fujita et al. Sep 1988 A
4773087 Plewes Sep 1988 A
4819258 Kleinman et al. Apr 1989 A
4821727 Levene et al. Apr 1989 A
4969174 Scheid et al. Nov 1990 A
4989227 Tirelli et al. Jan 1991 A
5018176 Romeas et al. May 1991 A
RE33634 Yanaki Jul 1991 E
5029193 Saffer Jul 1991 A
5051904 Griffith Sep 1991 A
5078142 Siczek et al. Jan 1992 A
5163075 Lubinsky et al. Nov 1992 A
5164976 Scheid et al. Nov 1992 A
5199056 Darrah Mar 1993 A
5240011 Assa Aug 1993 A
5289520 Pellegrino et al. Feb 1994 A
5359637 Webber Oct 1994 A
5365562 Toker Nov 1994 A
5404152 Nagai Apr 1995 A
5415169 Siczek et al. May 1995 A
5426685 Pellegrino et al. Jun 1995 A
5452367 Bick et al. Sep 1995 A
5506877 Niklason et al. Apr 1996 A
5526394 Siczek et al. Jun 1996 A
5539797 Heidsieck et al. Jul 1996 A
5553111 Moore et al. Sep 1996 A
5592562 Rooks Jan 1997 A
5594769 Pellegrino et al. Jan 1997 A
5596200 Sharma et al. Jan 1997 A
5598454 Franetzki et al. Jan 1997 A
5609152 Pellegrino et al. Mar 1997 A
5627869 Andrew et al. May 1997 A
5657362 Giger et al. Aug 1997 A
5668889 Hara Sep 1997 A
5719952 Rooks Feb 1998 A
5735264 Siczek et al. Apr 1998 A
5769086 Ritchart et al. Jun 1998 A
5803912 Siczek et al. Sep 1998 A
5818898 Tsukamoto et al. Oct 1998 A
5828722 Ploetz et al. Oct 1998 A
5872828 Niklason et al. Feb 1999 A
5878104 Ploetz Mar 1999 A
5896437 Ploetz Apr 1999 A
5986662 Argiro et al. Nov 1999 A
6005907 Ploetz Dec 1999 A
6022325 Siczek et al. Feb 2000 A
6075879 Roehrig et al. Jun 2000 A
6091841 Rogers et al. Jul 2000 A
6137527 Abdel-Malek et al. Oct 2000 A
6141398 He et al. Oct 2000 A
6149301 Kautzer et al. Nov 2000 A
6175117 Komardin et al. Jan 2001 B1
6196715 Nambu et al. Mar 2001 B1
6216540 Nelson et al. Apr 2001 B1
6219059 Argiro Apr 2001 B1
6233473 Shepherd et al. May 2001 B1
6243441 Zur Jun 2001 B1
6256370 Yavuz Jul 2001 B1
6272207 Tang Aug 2001 B1
6289235 Webber et al. Sep 2001 B1
6292530 Yavus et al. Sep 2001 B1
6327336 Gingold et al. Dec 2001 B1
6341156 Baetz et al. Jan 2002 B1
6375352 Hewes et al. Apr 2002 B1
6411836 Patel et al. Jun 2002 B1
6415015 Nicolas et al. Jul 2002 B2
6442288 Haerer et al. Aug 2002 B1
6459925 Nields et al. Oct 2002 B1
6501819 Unger et al. Dec 2002 B2
6515685 Halverson Feb 2003 B1
6525713 Soeta et al. Feb 2003 B1
6556655 Chichereau et al. Apr 2003 B1
6597762 Ferrant et al. Jul 2003 B1
6611575 Alyassin et al. Aug 2003 B1
6620111 Stephens et al. Sep 2003 B2
6626849 Huitema et al. Sep 2003 B2
6633674 Barnes et al. Oct 2003 B1
6638235 Miller et al. Oct 2003 B2
6647092 Eberhard et al. Nov 2003 B2
6744848 Stanton et al. Jun 2004 B2
6748044 Sabol et al. Jun 2004 B2
6751285 Eberhard et al. Jun 2004 B2
6751780 Neff et al. Jun 2004 B1
6758824 Miller et al. Jul 2004 B1
6813334 Koppe et al. Nov 2004 B2
6882700 Wang et al. Apr 2005 B2
6885724 Li et al. Apr 2005 B2
6912319 Barnes et al. Jun 2005 B1
6940943 Claus et al. Sep 2005 B2
6978040 Berestov Dec 2005 B2
6999554 Mertelmeier Feb 2006 B2
7025725 Dione et al. Apr 2006 B2
7110490 Eberhard Sep 2006 B2
7110502 Tsuji Sep 2006 B2
7123684 Jing et al. Oct 2006 B2
7127091 Op De Beek et al. Oct 2006 B2
7142633 Eberhard et al. Nov 2006 B2
7245694 Jing et al. Jul 2007 B2
7315607 Ramsauer Jan 2008 B2
7319735 Defreitas et al. Jan 2008 B2
7323692 Rowlands et al. Jan 2008 B2
7430272 Jing et al. Sep 2008 B2
7443949 Defreitas et al. Oct 2008 B2
7577282 Gkanatsios et al. Aug 2009 B2
7606801 Faitelson et al. Oct 2009 B2
7630533 Ruth et al. Dec 2009 B2
7702142 Ren et al. Apr 2010 B2
7760924 Ruth et al. Jul 2010 B2
7840905 Weber Nov 2010 B1
8239784 Hotelling Aug 2012 B2
8571289 Ruth et al. Oct 2013 B2
8712127 Ren et al. Apr 2014 B2
8799013 Gustafson Aug 2014 B2
8842806 Packard Sep 2014 B2
9084579 Ren Jul 2015 B2
9795357 Carelsen Oct 2017 B2
9811758 Ren Nov 2017 B2
9962138 Schweizer May 2018 B2
10076295 Gemmel Sep 2018 B2
10111631 Gkanatsios et al. Oct 2018 B2
10206644 Kim Feb 2019 B2
10248882 Ren Apr 2019 B2
10679095 Ren Jun 2020 B2
10922897 Maeda Feb 2021 B2
11650672 Mellett May 2023 B2
11857358 Liu Jan 2024 B2
12011305 Cowles Jun 2024 B2
20010038681 Stanton et al. Nov 2001 A1
20010038861 Hsu et al. Nov 2001 A1
20020012450 Tsujii Jan 2002 A1
20020050986 Inoue et al. May 2002 A1
20020075997 Unger et al. Jun 2002 A1
20030018272 Treado et al. Jan 2003 A1
20030073895 Nields et al. Apr 2003 A1
20030095624 Eberhard et al. May 2003 A1
20030149364 Kapur Aug 2003 A1
20030194050 Eberhard et al. Oct 2003 A1
20030194051 Wang et al. Oct 2003 A1
20030194115 Kaufhold et al. Oct 2003 A1
20030194121 Eberhard et al. Oct 2003 A1
20030210254 Doan et al. Nov 2003 A1
20030215120 Uppaluri et al. Nov 2003 A1
20040001094 Unnewehr Jan 2004 A1
20040008809 Webber Jan 2004 A1
20040066882 Eberhard et al. Apr 2004 A1
20040066884 Hermann Claus et al. Apr 2004 A1
20040066904 Eberhard et al. Apr 2004 A1
20040094167 Brady et al. May 2004 A1
20040101095 Jing et al. May 2004 A1
20040109529 Eberhard et al. Jun 2004 A1
20040171986 Tremaglio, Jr. et al. Sep 2004 A1
20040267157 Miller et al. Dec 2004 A1
20050049521 Miller et al. Mar 2005 A1
20050063509 Defreitas et al. Mar 2005 A1
20050078797 Danielsson et al. Apr 2005 A1
20050089205 Kapur Apr 2005 A1
20050105679 Wu et al. May 2005 A1
20050113681 DeFreitas May 2005 A1
20050113715 Schwindt et al. May 2005 A1
20050129172 Mertelmeier Jun 2005 A1
20050135555 Claus et al. Jun 2005 A1
20050135664 Kaufhold et al. Jun 2005 A1
20050140656 McLoone Jun 2005 A1
20050226375 Eberhard et al. Oct 2005 A1
20060026535 Hotelling Feb 2006 A1
20060030784 Miller et al. Feb 2006 A1
20060074288 Kelly et al. Apr 2006 A1
20060098855 Gkanatsios et al. May 2006 A1
20060129062 Nicoson et al. Jun 2006 A1
20060155209 Miller et al. Jul 2006 A1
20060291618 Eberhard et al. Dec 2006 A1
20070030949 Jing et al. Feb 2007 A1
20070036265 Jing et al. Feb 2007 A1
20070076844 Defreitas et al. Apr 2007 A1
20070223651 Wagenaar et al. Sep 2007 A1
20070225600 Weibrecht et al. Sep 2007 A1
20070242800 Jing et al. Oct 2007 A1
20080019581 Gkanatsios et al. Jan 2008 A1
20080021877 Saito Jan 2008 A1
20080045833 Defreitas et al. Feb 2008 A1
20080109740 Prinsen et al. May 2008 A1
20080130979 Ren et al. Jun 2008 A1
20080187095 Boone Aug 2008 A1
20080262874 Toshimutsu Oct 2008 A1
20080267467 Sokulin et al. Oct 2008 A1
20090003519 Defreitas et al. Jan 2009 A1
20090010384 Jing et al. Jan 2009 A1
20090033522 Skillman Feb 2009 A1
20090080594 Brooks et al. Mar 2009 A1
20090080602 Brooks et al. Mar 2009 A1
20090135997 Defreitas et al. May 2009 A1
20090174663 Rudd Jul 2009 A1
20090213034 Wu et al. Aug 2009 A1
20090268865 Ren et al. Oct 2009 A1
20090296882 Gkanatsios et al. Dec 2009 A1
20090304147 Jing et al. Dec 2009 A1
20100054400 Ren et al. Mar 2010 A1
20100083154 Takeshita Apr 2010 A1
20100086188 Ruth et al. Apr 2010 A1
20100135558 Ruth et al. Jun 2010 A1
20100194682 Orr Aug 2010 A1
20100195882 Ren et al. Aug 2010 A1
20100226475 Smith et al. Sep 2010 A1
20100325088 Hsieh et al. Dec 2010 A1
20110137132 Gustafson Jun 2011 A1
20110270358 Davis Nov 2011 A1
20110282686 Venon Nov 2011 A1
20110314405 Turner Dec 2011 A1
20120131498 Gross et al. May 2012 A1
20120133600 Marshall et al. May 2012 A1
20120154431 Fram Jun 2012 A1
20120275656 Boese et al. Nov 2012 A1
20130239063 Ubillos Sep 2013 A1
20130259193 Packard Oct 2013 A1
20140013280 Yoshioka et al. Jan 2014 A1
20140033126 Kreeger Jan 2014 A1
20140123183 Fujimoto May 2014 A1
20140140604 Carton et al. May 2014 A1
20140143710 Zhao May 2014 A1
20140282216 Baker Sep 2014 A1
20140314205 Carelsen Oct 2014 A1
20150094581 Butler Apr 2015 A1
20150260816 Liang Sep 2015 A1
20150309712 Marshall et al. Oct 2015 A1
20150317434 Kondo Nov 2015 A1
20150374325 Shimizu Dec 2015 A1
20160162163 Park et al. Jun 2016 A1
20160166222 Kim Jun 2016 A1
20160235386 Schweizer Aug 2016 A1
20160296185 Gemmel Oct 2016 A1
20160364122 Shimomura Dec 2016 A1
20160367120 Dupont et al. Dec 2016 A1
20170038914 Kawagishi Feb 2017 A1
20170065238 Smith et al. Mar 2017 A1
20180137385 Ren May 2018 A1
20180211421 Wicklein Jul 2018 A1
20190196662 Mitchell Jun 2019 A1
20190221046 Maeda Jul 2019 A1
20190325255 Ren Oct 2019 A1
20200363877 Mellett Nov 2020 A1
20200373013 Cao Nov 2020 A1
20220015731 Liu Jan 2022 A1
20220020475 Chen et al. Jan 2022 A1
20220031262 Cowles Feb 2022 A1
20220172824 Solis Jun 2022 A1
20230107616 Saba Apr 2023 A1
Foreign Referenced Citations (39)
Number Date Country
108135580 Jun 2018 CN
775467 May 1997 EP
982001 Mar 2000 EP
1004957 May 2000 EP
1428473 Jun 2004 EP
2783632 Oct 2014 EP
2913769 Sep 2015 EP
2952376 Dec 2015 EP
2000-322198 Nov 2000 JP
2004-038947 Feb 2004 JP
2004-357789 Dec 2004 JP
2007-029260 Feb 2007 JP
2007-282656 Nov 2007 JP
2007-330374 Dec 2007 JP
2008-503253 Feb 2008 JP
2008-073436 Apr 2008 JP
2008-199293 Aug 2008 JP
2010-086149 Apr 2010 JP
2014-068874 Apr 2014 JP
2014-104099 Jun 2014 JP
2017-000664 Jan 2017 JP
199005485 May 1990 WO
199816903 Apr 1998 WO
0051484 Sep 2000 WO
03020114 Mar 2003 WO
2005051197 Jun 2005 WO
2005110230 Nov 2005 WO
2005112767 Dec 2005 WO
2006055830 May 2006 WO
2006058160 Jun 2006 WO
2011044295 Apr 2011 WO
2011066486 Jun 2011 WO
2012071429 May 2012 WO
2014183183 Nov 2014 WO
2018183548 Oct 2018 WO
2018183549 Oct 2018 WO
2018183550 Oct 2018 WO
2019032558 Feb 2019 WO
WO-2020068845 Apr 2020 WO
Non-Patent Literature Citations (14)
Entry
“Layer Basics—Adobe Press”, published Apr. 6, 2017; https://www.adobepress.com/articles/article.asp?p=2756476&seqNum=4.
Cole, Elodia, et al., “The Effects of Gray Scale Image Processing on Digital Mammography Interpretation Performance”, Academic Radiology, vol. 12, No. 5, pp. 585-595, May 2005.
Digital Clinical Reports, Tomosynthesis, GE Brochure 98/5493, Nov. 1998.
Dobbins JT et al. “Digital x-ray tomosynthesis: current state of the art and clinical potential” Physics in Medicine and Biology vol. 48, No. 19, pp. 65-81 (2003).
Essentials for life: Senographe Essential Full-Field Digital Mammography System, GE Health-care Brochure, MM-0132-05.06-ENUS, 2006.
Filtered Back Projection, (NYGREN) published May 8, 2007; URL: http://web.archive.org/web/1999101013 I 715/http://www.owlnet.rice.edu/-elec539/Projects97/cult/node2.html.
Grant, DG, “Tomosynthesis, a three dimensional imaging technique”, IEEE Trans. Biomed Engineering, vol. BME-19, #1, Jan. 1972, pp. 20-28.
Heang-Ping Chan et al., “ROC study of the effect of stereoscopic imaging on assessment of breast lesions”, Medical Physics, vol. 32, No. 4, Apr. 2005.
Kita et al., “Correspondence between different view breast X-rays using simulation of breast deformation”, Proceedings 1998 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Santa Barbara, CA, Jun. 23-25, 1998, pp. 700-707.
Lorad Selenia Document B-BI-SEO US/Intl (May 2006) copyright Hologic 2006.
Mammographic Accreditation Phantom, http://www.cirsinc.com/pdfs/015cp.pdf.
Pediconi, Federica et al., “Color-coded automated signal intensity curve for detection and characterization of breast lesions: Preliminary evaluation of a new software for MR-based breast imaging”, International Congress Series 1281 (2005) 1081-1086.
Senographe 700 & 8OOT (GE); 2-page download on Jun. 22, 2006 from www.gehealthcare.com/inen/rad/whe/products/mswh800t.html.; Figures 1-7 on 4 sheets relateral shift compression paddle.
Smith, A. , “Fundamentals of Breast Tomosynthesis”, White Paper, Hologic Inc., WP-00007, Jun. 2008.
Related Publications (1)
Number Date Country
20230107616 A1 Apr 2023 US
Provisional Applications (1)
Number Date Country
63252320 Oct 2021 US