Today's healthcare environments, such as radiology departments, face an ongoing tension between providing high-quality image review and maintaining adequate patient throughput to keep costs under control. Despite ongoing advances in imaging technology and related data processing systems, it is the radiologist who continues to bear the burden of the cost-quality tradeoff. As used herein, radiologist generically refers to a medical professional who analyzes medical images and makes clinical determinations therefrom.
Radiologists have expressed a clinical need for an automated solution to correlate corresponding regions of interest (ROIs) in medical images acquired from various imaging modalities. ROIs may be associated with breast abnormalities such as masses or microcalcifications. ROIs may also be associated with specific focus areas of the breast that radiologists are interested in reviewing in more detail. For any given patient, a radiologist may review images from mammography, ultrasound, and MRI of that same patient. To determine the location of an ROI in each image, the radiologist makes a visual comparison, sometimes aided by a separate ruler or simply using the radiologist's hand or fingers. If an area of interest appears in multiple images, it may lead to the conclusion that the region of interest is indeed a mass or microcalcification. If, on the other hand, there is no distinct region of interest in the second image at the appropriate location, it may lead to the conclusion that the region of interest in the first image is not a mass or microcalcification. Automated solutions have previously been proposed to provide correlation between images of different modalities. However, such solutions have proved unsatisfactory due to the vast amount of training data required to train machine learning (ML) mechanisms and the inaccuracy of the results those ML mechanisms provide. In addition, ML mechanisms consume substantial processing resources, and the automated correlation has resulted in delays in presenting images to radiologists. As a result, healthcare professionals have been forced to perform ROI correlations manually with the aid of image manipulation tools, such as pan and zoom tools. This manual ROI correlation results in excessive mouse clicks and movements that impede image review workflow, add review time and associated costs, and fatigue the radiologist.
It is with respect to these and other general considerations that the aspects disclosed herein have been made. Also, although relatively specific problems may be discussed, it should be understood that the examples should not be limited to solving the specific problems identified in the background or elsewhere in the present disclosure.
Examples of the present disclosure describe systems and methods for an auto-focus tool for multimodality image review. In aspects, an image review system may provide for the display of a set of medical images representing one or more imaging modalities. The system may also provide an auto-focus tool that may be used during the review of the set of medical images. After receiving an instruction to activate the auto-focus tool, the system may receive a selection of an ROI in at least one of the images in the set of medical images. The auto-focus tool may identify the location of the ROI within the image and use the identified ROI location to identify a corresponding area or ROI in the remaining set of medical images. The auto-focus tool may orient (e.g., pan, zoom, flip, rotate, align, center) and display the remaining set of medical images such that the identified area or ROI is prominently displayed in the set of medical images.
In one aspect, examples provided in the present disclosure relate to a system comprising: a processor; and memory coupled to the processor, the memory comprising computer executable instructions that, when executed, perform a method. The method comprises receiving a selection of a region of interest (ROI) in a first image of a plurality of images; identifying, using an auto-focus tool, a location of the ROI within the first image; identifying, using the auto-focus tool, an area corresponding to the location of the ROI in at least a second image of the plurality of images, wherein the first image and the second image are different imaging modality types and an automated determination mechanism is used to identify the area corresponding to the location of the ROI; and causing the auto-focus tool to automatically: focus a first field of view on the ROI in the first image; and focus a second field of view on the area corresponding to the location of the ROI in the second image.
In a first alternative aspect, the method comprises receiving a selection of a bounding box in a mammography image of a plurality of images, the bounding box identifying an ROI; identifying, using an auto-focus tool, a location of the ROI within the mammography image; identifying, using the auto-focus tool, an area corresponding to the location of the ROI in at least a tomography slice image of the plurality of images and an MRI image of the plurality of images; and causing the auto-focus tool to automatically: magnify the ROI in a field of view of the tomography slice image; and pan to at least one of: a breast comprising the ROI or an image plane identifying the ROI in a field of view of the MRI image.
In a second alternative aspect, the method comprises receiving a selection of a bounding box in a mammography image of a plurality of images, the bounding box identifying an ROI; identifying, using an auto-focus tool, a location of the ROI within the mammography image; identifying, using the auto-focus tool, an area corresponding to the location of the ROI in at least a tomography slice image of the plurality of images, an ultrasound image of the plurality of images, and an MRI image of the plurality of images; and causing the auto-focus tool to automatically: magnify the ROI in a field of view of the tomography slice image; pan to a breast comprising the ROI in a field of view of the ultrasound image; and pan to at least one of: a breast comprising the ROI or an image plane identifying the ROI in a field of view of the MRI image.
In an example, the system is an electronic image review system for reviewing medical images within a healthcare environment. In another example, focusing the first field of view on the ROI in the first image comprises centering the ROI within the first field of view and scaling the ROI to increase or decrease a size of the ROI within the first field of view. In another example, the imaging modality types include at least two of: mammography, MRI, or ultrasound. In another example, the first image is generated during a current patient visit for a patient and the second image was generated during a previous patient visit for the patient. In another example, the plurality of images is arranged for viewing based on an image viewing layout specified by a user, the image viewing layout enabling the plurality of images to be concurrently presented to the user.
In another example, receiving the selection of the ROI comprises: receiving a selection of a point in the first image; and defining an area surrounding the point as the ROI, wherein in response to receiving the selection of the point in the first image, a bounding box comprising at least a portion of the ROI is automatically applied to the image such that at least a first object in the first image is delineated from at least a second object in the first image. In another example, receiving the selection of the ROI comprises: receiving a selection of a plurality of points in the first image; determining a centroid of the plurality of points; and defining an area surrounding the centroid as the ROI. In another example, the automated determination mechanism is at least one of: a rule set, a mapping algorithm, or an image or object classifier. In another example, a process for identifying the location of the ROI within the first image is based on the imaging modality type of the first image and the process includes the use of at least one of: image view information, spatial coordinate information, image header information, or image orientation data.
In another example, when the first image and a third image of a plurality of images are a same imaging modality type, a mapping function is used to map first ROI identification information of the first image to second ROI identification information of the third image such that the first ROI identification information and the second ROI identification information are a same type. In another example, a mapping function is used to convert first ROI identification information of the first image to second ROI identification information of the second image such that the first ROI identification information and the second ROI identification information are a different type. In another example, focusing the first field of view on the ROI in the first image comprises orienting the first image such that the ROI is at least one of horizontally or vertically centered in a viewport comprising the first image. In another example, focusing the second field of view on the area corresponding to the location of the ROI in the second image comprises applying a scaling factor to the area corresponding to the location of the ROI.
In another aspect, examples provided in the present disclosure relate to a method comprising: receiving, at an image review device, a selection of a region of interest (ROI) in a first image of a plurality of images; identifying, using an auto-focus tool, a location of the ROI within the first image; identifying, using the auto-focus tool, an area corresponding to the location of the ROI in at least a second image of the plurality of images, wherein the first image and the second image are different imaging modality types and an automated determination mechanism is used to identify the area corresponding to the location of the ROI; and causing the auto-focus tool to automatically: focus a first field of view on the ROI in the first image; and focus a second field of view on the area corresponding to the location of the ROI in the second image. In an example, the plurality of images comprises at least a mammography image, an MRI image, and an ultrasound image.
In yet another aspect, examples provided in the present disclosure relate to a computing device comprising: a processor; and an image auto-focus tool configured to: receive a selection of a region of interest (ROI) in a first image of a plurality of images; identify a location of the ROI within the first image; identify an area corresponding to the location of the ROI in at least a second image of the plurality of images, wherein the first image and the second image are different imaging modality types and an automated determination mechanism is used to identify the area corresponding to the location of the ROI; focus a first field of view on the ROI in the first image; and focus a second field of view on the area corresponding to the location of the ROI in the second image, wherein focusing the second field of view comprises at least one of scaling the second image or panning the second image.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Additional aspects, features, and/or advantages of examples will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the disclosure.
Non-limiting and non-exhaustive examples are described with reference to the following figures.
Medical imaging has become a widely used tool for identifying and diagnosing ROIs and abnormalities, such as cancers or other conditions, within the human body. Medical imaging processes such as mammography and tomosynthesis are particularly useful tools for imaging breasts to screen for, or diagnose, cancer or other lesions within the breasts. Tomosynthesis systems are mammography systems that allow high resolution breast imaging based on limited angle tomosynthesis. Tomosynthesis, generally, produces a plurality of X-ray images, each of discrete layers or slices of the breast, through the entire thickness thereof. In contrast to conventional two-dimensional (2D) mammography systems, a tomosynthesis system acquires a series of X-ray projection images, each projection image obtained at a different angular displacement as the X-ray source moves along a path, such as a circular arc, over the breast. In contrast to conventional computed tomography (CT), tomosynthesis is typically based on projection images obtained at limited angular displacements of the X-ray source around the breast. Tomosynthesis reduces or eliminates the problems caused by tissue overlap and structure noise present in 2D mammography imaging.
In recent times, healthcare professionals have expressed a clinical need for an automated solution to correlate ROIs in medical images acquired from various imaging modalities, such as mammography, synthesized mammography, tomosynthesis, wide angle tomosynthesis, ultrasound, computed tomography (CT), and magnetic resonance imaging (MRI). Proposed solutions have typically involved the use of various ML approaches, which require a vast amount of diverse training data that is generally not readily available for clinical use. Acquiring and using the training data to train an ML model or algorithm, thus, requires a substantial resource investment. This resource investment is further exacerbated by the substantial demand for computing resources (e.g., central processing unit (CPU), memory, and file storage resources) required to operate the trained ML model or algorithm. In many cases, the analysis performed by a trained ML model or algorithm is slow (due to the computing resource demand) and the results of the trained ML model or algorithm are inaccurate or imprecise.
For the above reasons, the ROI correlation is still primarily performed manually by healthcare professionals. For example, healthcare professionals use image manipulation tools, such as pan and zoom tools, to focus on an ROI or narrow a field of view in an image. However, the use of such image manipulation tools often results in excessive mouse clicks and movements that impede image review workflow and fatigue healthcare professionals. For instance, when using such image manipulation tools, a user must select multiple tools to accomplish a specific task. Each selected tool must be applied to each image or viewport in a set of medical images to enable the user to manually pan and/or zoom the image/viewport. Each instance of manual panning/zooming may result in multiple mouse clicks and movements while the user attempts to achieve an optimal or acceptable view of the image/viewport. Moreover, in some cases, the image manipulation tools are specific to a particular imaging modality or imaging system (e.g., the image manipulation tools are not multimodal). For instance, a first set of image manipulation tools of a first image review system may be used to view mammography images and a second set of image manipulation tools of a second image review system may be used to view MRI images. To compare the mammography images to the MRI images, a healthcare professional may display the mammography and MRI images on separate display screens and manually orient the respective images on each display screen using the respective image manipulation tools. The use of different sets of image manipulation tools is cumbersome, complicated, and requires healthcare professionals to be proficient in using multiple sets of image manipulation tools.
To address such issues with traditional methods for ROI correlation, the present disclosure describes systems and methods for an auto-focus tool for multimodality image review. In aspects, an image review system may provide for the display of a set of medical images representing one or more imaging modalities, such as mammography, tomosynthesis, MRI, and ultrasound, among others. As a specific example, the set of medical images may include a current mammography image of a patient's breast (collected during a current patient visit) and one or more prior mammography images of the patient's breast (collected during one or more previous patient visits). Displaying the set of medical images may include the use of one or more hanging protocols. A hanging protocol, as used herein, may describe what images to display (e.g., image attributes and conditions, including modality, anatomy, laterality, procedure, and reason) and how to display the images (e.g., viewport height or width; image zoom, pan, or rotation; image order; tiling). The hanging protocols may enable the simultaneous or concurrent display of multiple images within respective viewports of a display screen. A viewport, as used herein, may refer to a frame, a sub window, or a similar viewing area within a display area of a device. As used herein, a mammography image may refer to an image acquired on a conventional two-dimensional (2D) mammography system or a synthesized 2D image that is created from combining information from a tomosynthesis data set. For ease of reading, both can be referred to as a mammogram or a mammography image.
The system may also provide an auto-focus tool that may be used during the review of the set of medical images. The auto-focus tool may provide for automatically orienting (e.g., panning, zooming, centering, aligning) images in the set of medical images in accordance with a selected portion of an image in the set of medical images. Upon activation of the auto-focus tool, the system may enable a user, such as a healthcare professional (e.g., a radiologist, a surgeon or other physician, a technician, a practitioner, or someone acting at the behest thereof) to select an ROI in a displayed image. Alternatively, the selection of the ROI in a displayed image may cause the auto-focus tool to be activated or be part of the process for activating the auto-focus tool. The auto-focus tool may identify the location of the selected ROI within the image based on image attributes, such as view position information (e.g., bilateral craniocaudal (CC) view, mediolateral oblique (MLO) view, true lateral view), laterality (e.g., left, right), image coordinates and direction information (e.g., image position and image orientation with respect to the patient), and other information embedded in the DICOM header (e.g., pixel spacing, slice thickness, slice location) or information burned in the pixel data of the image. Additionally, information within an image, such as segmentation bounding box information (e.g., object-background differentiation data, skin-tissue demarcation data, and similar object classification data) may be used for identification.
The auto-focus tool may use the identified ROI location to identify a corresponding area or ROI in each of the other images in the set of medical images. The areas or ROIs in the other images may be identified using the image characteristics described above. As one example, a bounding box area of an identified ROI in a first mammography image may be used to identify the same bounding box area in a second mammography image of the same view position. As another example, a bounding box area of an identified ROI in a first mammography image may be used to set the viewing plane, set the display field of view, or set the size of an MRI image; the location may correspond to the selected ROI in the first mammography image and may be oriented and scaled according to the position, size, and location of the selected ROI.
After identifying the corresponding ROIs in the other images, the auto-focus tool may orient the set of medical images such that the identified ROI is prominently displayed in the set of medical images. For example, in each image in the set of medical images, the auto-focus tool may pan to, zoom in on, and/or center (within the respective viewport for the image) an area of the image corresponding to the identified ROI. As such, the auto-focus tool serves as a single tool that replaces (or minimizes) the need for several other tools (e.g., pan tool, zoom tool, centering/alignment tools, scroll tool, orientation tool) and enables healthcare professionals to quickly and efficiently focus on, for example, a patient's breast or a region within the patient's breast. These capabilities of the auto-focus tool improve image review workflow and decrease the fatigue of healthcare professionals (due to decreased mouse clicks and movements) during image review.
Accordingly, the present disclosure provides a plurality of technical benefits including but not limited to: automating ROI correlation in images having the same and/or different imaging modalities, consolidating multiple image manipulation tools into a single tool, enabling multimodal image review in a single system, improving image review workflow, and decreasing healthcare professional fatigue during image review, among others.
In examples, input processing system 100 may represent a content review and manipulation system, such as a medical image review system. Input processing system 100 may be implemented in a secure computing environment comprising sensitive or private information, such as a healthcare facility (e.g., a hospital, an imaging and radiology center, an urgent care facility, a medical clinic or medical offices, an outpatient surgical facility, or a physical rehabilitation center). Alternatively, one or more components of input processing system 100 may be implemented in a computing environment external to the secure computing environment.
Content selection component 102 may be configured to enable content to be selected from one or more data sources. For example, content selection component 102 may have access to multiple data stores comprising image data of a medical imaging technology, such as a picture archiving and communication system (PACS) or a radiology information system (RIS). A user, such as a healthcare professional, may use content selection component 102 to select images (and associated image content) of one or more imaging modalities, such as mammography, ultrasound, and MRI. The images may be selected using a user interface provided by content selection component 102. The user interface may enable the user to select images by various criteria, such as patient name/identifier, imaging modality type, image creation/modification date, image collection/generation location, etc.
Content presentation component 104 may be configured to present selected content to a user. For example, content presentation component 104 may enable a user to select or define a content presentation style or layout, such as a hanging protocol, using the user interface (or a separate user interface). Alternatively, a default content presentation style or layout may be applied to the selected content. Based on the selected or applied content presentation style or layout, content presentation component 104 may arrange the selected content into one or more viewports. For instance, a patient's current mammography image may be arranged into a leftmost viewport, the patient's mammography image from a patient visit one year ago may be arranged into a center viewport, and the patient's mammography image from a patient visit two years ago may be arranged into a rightmost viewport. In examples, the images in each viewport may be manipulated independently from the other presented viewports. That is, a user may manipulate (e.g., orient, pan, scroll, apply window level, annotate, remove, or otherwise modify) an image in a first presented viewport without affecting images in other presented viewports.
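For illustration only, the following sketch models such a hanging-protocol-style layout as simple data in Python. The Viewport class, arrange helper, and study labels are hypothetical names chosen for this example, not part of the disclosed system:

```python
# Minimal sketch of a hanging-protocol-style layout (all names hypothetical).
from dataclasses import dataclass

@dataclass
class Viewport:
    position: str     # e.g., "left", "center", "right"
    study_label: str  # which study hangs in this viewport

# One possible layout: current study on the left, priors toward the right.
HANGING_PROTOCOL = [
    Viewport(position="left", study_label="current"),
    Viewport(position="center", study_label="prior_1_year"),
    Viewport(position="right", study_label="prior_2_years"),
]

def arrange(images_by_study: dict) -> dict:
    """Map each viewport position to the image its protocol entry selects."""
    return {vp.position: images_by_study[vp.study_label] for vp in HANGING_PROTOCOL}
```

In a full system, each entry would also carry the display parameters noted earlier (zoom, pan, rotation, tiling).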
ROI selection component 106 may be configured to enable a user to select an area or point in the presented content. For example, the user interface may comprise one or more area selection mechanisms for identifying an ROI within an image presented in a viewport. In some aspects, an area selection mechanism may be automated to automatically identify an ROI within an image. For instance, ROI selection component 106 may implement an image recognition algorithm or model. The algorithm/model may enable processing an image (and other content) to identify and analyze objects and attributes of the image. Examples of the algorithm/model may include convolutional neural networks (CNNs), bag-of-words, logistic regression, support vector machines (SVMs), and k-nearest-neighbors (KNN). In examples, the algorithm/model may automatically overlay a segmentation bounding box (or a similar area selection utility) on an image. The bounding box may identify and/or delineate one or more objects in the image. As a specific example, in a mammography image, the bounding box may encompass a patient's breast such that the breast is delineated from the background of the image or from other content (e.g., annotations or embedded data) within the image. In other aspects, an area selection mechanism may be used by a user to manually select an ROI within an image. For instance, ROI selection component 106 may provide an input tool, such as a cursor or pointer object, an enclosure tool (e.g., elliptical ROI, rectangular ROI, freehand ROI), or a highlighting tool. A user may use the input tool to specify a point or region of the image.
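As a rough, hypothetical illustration of the object-background differentiation described above, a segmentation bounding box could be derived by simple intensity thresholding. The sketch below is a naive stand-in for, rather than a reproduction of, the classifier-based approaches listed above; the function name and threshold fraction are assumptions:

```python
import numpy as np

def breast_bounding_box(image: np.ndarray, threshold_fraction: float = 0.05):
    """Return (row_min, row_max, col_min, col_max) around above-background pixels.

    Pixels brighter than a fraction of the image maximum are treated as
    tissue; everything else is treated as background.
    """
    mask = image > threshold_fraction * image.max()
    rows = np.any(mask, axis=1)          # rows containing any tissue pixel
    cols = np.any(mask, axis=0)          # columns containing any tissue pixel
    row_min, row_max = np.where(rows)[0][[0, -1]]
    col_min, col_max = np.where(cols)[0][[0, -1]]
    return row_min, row_max, col_min, col_max
```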
Auto-focus tool 108 may be configured to enable multimodality image review of the presented content. For example, the user interface may comprise auto-focus tool 108 or may enable a means for activating auto-focus tool 108 (e.g., a command button, a menu item, a keyboard sequence, a voice command, an eye-gaze command). A user may activate auto-focus tool 108 after selection of an ROI in presented content. Alternatively, the user may activate auto-focus tool 108 prior to selection of the ROI. For instance, activation of auto-focus tool 108 may cause the activation of ROI selection component 106.
Upon selection of auto-focus tool 108 and/or the ROI, auto-focus tool 108 may identify the location of the ROI within the content from which the ROI was selected ("source content"). The process for identifying the location of the ROI may differ based on the type of imaging modality for the source content. As one example, for a mammography image, auto-focus tool 108 may identify the location of an ROI based on image view information, such as laterality (e.g., right breast or left breast) and view position (e.g., MLO, CC). For instance, an area of the mammography image corresponding to the upper region of an MLO view of a right breast may be selected as the ROI.
As another example, for an MRI image, auto-focus tool 108 may identify the location of an ROI based on DICOM header information, which may include image position and slice information, for the MRI image. For instance, the header information for the MRI image may contain image orientation and image position information that can be used to determine spatial coordinates of objects, boundaries, and/or landmarks within the image using the Reference Coordinate System (RCS). Using the information embedded in the header, spatial coordinates corresponding to the ROI may be identified. In some instances, the spatial coordinates for the ROI may be converted to or defined in terms of coordinates relative to the patient, such as right/left, anterior/posterior, and feet/head positions.
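As a concrete illustration of this header-based mapping, standard DICOM geometry (PS3.3 C.7.6.2.1.1) converts a pixel index in a slice to patient (LPS) coordinates. The sketch below assumes ds is a pydicom Dataset for the MRI slice; the function name is hypothetical:

```python
import numpy as np

def pixel_to_patient(ds, row: int, col: int) -> np.ndarray:
    """Map a pixel index in an MRI slice to DICOM patient (LPS) coordinates.

    ImagePositionPatient is the center of the first transmitted pixel;
    ImageOrientationPatient holds the row- and column-direction cosines;
    PixelSpacing is (row spacing, column spacing).
    """
    origin = np.array(ds.ImagePositionPatient, dtype=float)
    orientation = np.array(ds.ImageOrientationPatient, dtype=float)
    row_dir, col_dir = orientation[:3], orientation[3:]
    row_spacing, col_spacing = (float(v) for v in ds.PixelSpacing)
    # Row-direction cosines advance with the column index; column-direction
    # cosines advance with the row index.
    return origin + col * col_spacing * row_dir + row * row_spacing * col_dir
```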
As yet another example, for an ultrasound image, auto-focus tool 108 may identify the location of an ROI based on DICOM header information and/or pixel data embedded in the image. For instance, the header information for the ultrasound image may indicate image laterality (e.g., right or left) and/or the header information of objects associated with the ultrasound image, such as Grayscale Softcopy Presentation State (GSPS) objects, may contain information embedded in the annotation that captures the location. Alternatively, the annotations may be embedded in the pixel data of the ultrasound image and describe the location of the ROI. In at least one example, auto-focus tool 108 may also use movement data for the ultrasound device to identify the location of an ROI. For instance, position and movement data for an ultrasound transducer or probe may be recorded during the ultrasound imaging.
Auto-focus tool 108 may use the identified location of the ROI in the source content to identify a corresponding area in the other presented content. The method for identifying the corresponding areas may differ based on the type of imaging modality of the other presented content. In some aspects, when the source content and the other presented content are the same imaging modality type, auto-focus tool 108 may map the image view information, spatial coordinate information, header information, and/or pixel data of the ROI to the corresponding area in the other presented content. As one example, in a first mammography image (source content), an identified ROI may be in the upper inner quadrant of a CC view of a patient's right breast. Accordingly, in a second mammography image (other presented content), auto-focus tool 108 may identify the upper inner quadrant of a CC view of the patient's right breast. As another example, in a first MRI image for a patient (source content), the slice information (e.g., middle slice) and spatial coordinate information of an identified ROI may be mapped to the same image slice and coordinates in a second MRI image for the patient. As yet another example, in a first ultrasound image for a patient (source content), the laterality information associated with an identified ROI may be used to identify a second ultrasound image of the same laterality.
In other aspects, when the source content and the other presented content are different imaging modality types, auto-focus tool 108 may convert the header information of the image, such as image view information or spatial information, or information contained in objects associated with the image or embedded in the pixel data, into information corresponding to the other presented content type. As one example, auto-focus tool 108 may convert the image view position or laterality information associated with an ROI in a mammography image (source content) into a slice location or set of spatial coordinates approximating the corresponding location of the ROI in an MRI image (other presented content). For instance, a mammography image of the CC view position and right laterality may correspond to the middle portion of the right breast from a bilateral breast MRI image. As another example, the location of the ROI in the mammography image or information contained in associated objects (e.g., GSPS, CAD SR) may be mapped to a set of MRI coordinates corresponding to the location of the ROI. Determining the spatial coordinates of the ROI, or determining the laterality, quadrant, or region of the breast where the ROI resides, may include the use of one or more determination mechanisms, such as a rule set, decision logic, an ML component (e.g., algorithm/model), a mapping algorithm, etc.
As another example, auto-focus tool 108 may convert the image view information of an ROI in a mammography image (source content) into view laterality information identifying an ultrasound image (other presented content). As neither the mammography image nor the ultrasound image may comprise spatial coordinate information, the laterality information in the image view information may be used to identify a corresponding ultrasound image (e.g., an ultrasound image of similar laterality). Identifying the corresponding ultrasound image may include the use of at least one of the determination mechanisms.
As yet another example, auto-focus tool 108 may convert the spatial coordinates of an ROI in an MRI image (source content) into laterality information identifying an ultrasound image (other presented content). For instance, the sagittal midline of a patient's body may represent a patient's origin according to the Reference Coordinate System such that positive values in the X direction, or values to the left of the midline, indicate areas in or around the patient's left breast and negative values in the X direction, or values to the right of the midline, indicate areas in or around the patient's right breast. Accordingly, a determination mechanism, such as a coordinate mapping algorithm, may be used to map/convert the spatial coordinates into a laterality determination.
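A minimal sketch of that coordinate-to-laterality mapping, assuming DICOM LPS patient coordinates in which +X points toward the patient's left (the function name is hypothetical):

```python
def laterality_from_rcs_x(x_mm: float) -> str:
    """Infer breast laterality from the patient-relative X coordinate (LPS).

    Positive X values fall to the patient's left of the sagittal midline,
    negative values to the patient's right.
    """
    return "L" if x_mm > 0 else "R"
```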
After identifying the corresponding area in each of the other presented content, auto-focus tool 108 may focus the field of view of each viewport such that the identified ROI and the corresponding areas are prominently displayed. For example, in the viewport comprising the source content, auto-focus tool 108 may orient the identified ROI such that the ROI is horizontally and/or vertically centered in the viewport. The orienting may occur automatically and in real-time in response to the selection of the auto-focus tool 108 and/or the ROI. Additionally, auto-focus tool 108 may magnify the ROI to further focus the attention of a user on a particular region of the breast (e.g., upper inner quadrant, lower outer quadrant). In the viewports of the other presented content, auto-focus tool 108 may similarly orient the areas corresponding to the ROI in the source content. As one example, an MRI image in a viewport may be oriented such that the center slice from the right side or left side of the patient (right breast or left breast) is displayed regardless of whether the ROI in the source content is located more internal (medial) or more external (lateral) in one breast. In such an example, the center slice may serve as a general or starting focus area for the user. As another example, an MRI image in a viewport may have its focus set on the upper region of the breast based on the location of the ROI in the source content.
Having described a system and process flow that may employ the techniques disclosed herein, the present disclosure will now describe one or more methods that may be performed by various aspects of the present disclosure. In aspects, method 200 may be executed by a system, such as input processing system 100 described above.
Example method 200 begins at operation 202, where an image focus tool is selected. In aspects, the image review system may comprise or provide access to an image focus tool, such as auto-focus tool 108. For example, the image review system may provide a user interface component (e.g., graphical user interface (GUI), command line, microphone, haptic mechanism, camera) for selecting and/or activating the image focus tool. A user using the image review system to view one or more images may use the user interface component to select and activate the image focus tool. Alternatively, the user may select and activate the image focus tool prior to accessing, retrieving, or viewing the images.
At operation 204, an ROI in a first image may be selected. In aspects, a user may use an input tool (e.g., cursor, stylus, enclosure tool, highlighting tool, voice-based tool, eye-gaze tool) provided by the image focus tool or the image review system to select one or more points or portions of an image. For instance, the user may select a point in a first image of an image viewing layout comprising four images. The selected points or portions may define an ROI. As one example, when a single point in an image is selected, an area surrounding the single point may be defined as the ROI. As another example, when multiple points in an image are selected, the image focus tool may determine a centroid (or approximate center point) of the multiple points. An area surrounding the centroid may be defined as the ROI. The amount and/or shape (e.g., ellipse, rectangle, freeform) of the area used to define the ROI may be determined automatically by the image focus tool or defined manually by the user. For instance, the image focus tool may automatically overlay the image with a bounding box that encompasses the selected/determined point. The bounding box may delineate one or more objects in the image from other objects or the background of the image.
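For illustration, a hedged sketch of the centroid-based ROI definition just described; the default extent is an assumed placeholder, not a value from the present disclosure:

```python
def roi_from_points(points, half_size: float = 50.0):
    """Define a square ROI centered on the centroid of the selected points.

    `points` is a sequence of (x, y) selections; `half_size` (in pixels) is
    an assumed default extent for the automatically defined area.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)  # centroid of selections
    return (cx - half_size, cy - half_size, cx + half_size, cy + half_size)
```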
At operation 206, the location of the ROI within the first image may be identified. The process for identifying the location of the ROI may include the use of one or more determination mechanisms (e.g., a rule set, decision logic, an ML component, a mapping algorithm, an image or object classifier) and may differ based on the imaging modality type of the first image. As one example, the image focus tool may use a set of data extraction rules to identify the ROI in a mammography image using image view information of the mammography image, such as laterality (e.g., right or left) and view position (e.g., MLO, CC). For instance, the data extraction rules may label or otherwise designate the ROI as "CC View, Right Breast" based on the information in a Digital Imaging and Communications in Medicine (DICOM) header of the mammography image. Alternatively, the ROI may be further labeled/designated using additional area information in the image, such as "Right Breast, Upper Outer Quadrant."
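One possible form of such a data extraction rule, sketched with pydicom. ImageLaterality (0020,0062) and ViewPosition (0018,5101) are standard mammography header attributes, while the function name and label format are assumptions for this example:

```python
import pydicom

def label_roi(path: str) -> str:
    """Build a coarse ROI label (e.g., "CC View, Right Breast") from DICOM header fields."""
    ds = pydicom.dcmread(path, stop_before_pixels=True)  # header only, skip pixel data
    laterality = {"R": "Right", "L": "Left"}.get(ds.get("ImageLaterality", ""), "Unknown")
    view = ds.get("ViewPosition", "Unknown")
    return f"{view} View, {laterality} Breast"
```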
As another example, the image focus tool may use a spatial mapping function to identify the ROI in an MRI image using information available in the DICOM header of an MRI image and/or associated objects, such as Image Position (Patient), Image Orientation (Patient), Pixel Spacing, Slice Thickness, and Slice Location. The spatial mapping function may define the ROI using a 3D reference coordinate system (e.g., x, y, and z coordinates) in which the boundary of the ROI is defined by multiple sets of coordinate values or defined in terms of right/left, anterior/posterior, and feet/head coordinate values that are relative to the patient (e.g., L:52.2, A:5.5, H:10.6). Alternatively, the ROI may be defined by a single set of coordinate values (e.g., voxel (50, 100, 55)) representing the center (or centroid) of the ROI.
As another example, the image focus tool may use a text recognition algorithm to identify the ROI in an ultrasound image using image header information, pixel data, orientation data, and/or imaging device data, such as laterality information (e.g., right or left), embedded image information (e.g., annotations and notes), and 2D/3D transducer/probe movement data. For instance, the text recognition algorithm may generally label or otherwise designate the ROI as "Right Breast" based on text-based laterality information extracted from the DICOM header of the ultrasound image. Alternatively, the ROI may be labeled/designated based on embedded annotations (e.g., handwritten notes, burned-in text) and/or an orientation map in the image data of the ultrasound image.
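Where laterality must be recovered from annotation text (e.g., GSPS text objects or recognized burned-in notes), one simple rule could be pattern matching. This regex-based sketch is an assumed illustration, not the disclosed recognition algorithm:

```python
import re

def laterality_from_annotation(text: str):
    """Extract breast laterality from annotation text such as "RT BREAST 10:00"."""
    upper = text.upper()
    if re.search(r"\b(RT|RIGHT)\b", upper):
        return "Right Breast"
    if re.search(r"\b(LT|LEFT)\b", upper):
        return "Left Breast"
    return None  # laterality not stated in the annotation
```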
At operation 208, areas corresponding to the ROI may be identified in other images. In aspects, the image focus tool may use the identified location of the ROI within the first image to identify corresponding areas in the other images presented in the image viewing layout. The process for identifying the corresponding areas in the other images may include the use of one or more of the determination mechanisms described above (and/or additional determination mechanisms) and may differ based on the imaging modality type of the other images. As one example, when the imaging modality type of the first image matches the imaging modality type of a second image, a mapping function may be used to map the ROI identification information of the first image (e.g., image view information, spatial coordinate information, header information) to the same (or similar) ROI identification information of the second image. For instance, the ROI label/designation "Right Breast, Upper Outer Quadrant" for a first mammography image may be used by the mapping function to map the same laterality (e.g., right breast) and region (e.g., Upper Outer Quadrant) in a second mammography image based on the image view information for the second mammography image.
As another example, when the imaging modality type of the first image does not match the imaging modality type of a second image, a mapping function may be used to map the ROI identification information of the first image (e.g., image view information, spatial coordinate information, header information) to different, but corresponding ROI identification information of the second image. For instance, the ROI label/designation “Right Breast, Axillary Region” for a mammography image may be converted to a set of MRI spatial coordinates. The set of MRI spatial coordinates may be predefined for one or more areas in each type of mammography image. As a specific example, spatial coordinates may be predefined for the various quadrants of the breast (e.g., Upper Outer, Upper Inner, Lower Outer, Lower Inner), regions (e.g., Central, Retroareolar, Axillary), and/or laterality (e.g., right, left) of the mammography image. Accordingly, the ROI label/designation for a mammography image may be used to select the corresponding MRI spatial coordinates in an MRI image.
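A hedged sketch of such a predefined mapping; every coordinate value below is an illustrative placeholder rather than a clinically meaningful location:

```python
# Hypothetical lookup from mammography ROI labels to approximate MRI
# patient-space coordinates (x, y, z in mm; values are placeholders).
PREDEFINED_MRI_COORDS = {
    ("Right", "Upper Outer Quadrant"): (-80.0, -20.0, 40.0),
    ("Right", "Axillary Region"): (-110.0, -40.0, 70.0),
    ("Left", "Upper Outer Quadrant"): (80.0, -20.0, 40.0),
}

def mammo_label_to_mri(laterality: str, region: str):
    """Select the predefined MRI coordinates for a mammography ROI label."""
    return PREDEFINED_MRI_COORDS.get((laterality, region))
```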
At operation 210, the images may be automatically focused on the ROI and corresponding areas. In aspects, the image focus tool may focus the field of view of each image in the image viewing layout such that the ROI and corresponding areas are prominently displayed. For example, after (or prior to) identifying the areas corresponding to the ROI, the image focus tool may orient the identified ROI in the first image such that the ROI is horizontally and/or vertically centered in the viewport comprising the first image. The image focus tool may also (simultaneously or subsequently) orient the other images such that the areas corresponding to the ROI are horizontally and/or vertically centered in their respective viewports and may also display the areas in an orientation that is different from the original image acquisition plane, for example displaying in one or more MRI images (presented content) the axial, sagittal, and/or coronal plane to provide different perspectives of the ROI. In some examples, the image focus tool may apply some degree of scaling, magnification, and/or filtering to one or more of the images. For instance, the image focus tool may apply a 2× scaling factor to a second image and a 4× scaling factor to a third image. As should be appreciated, the automatic orientation and scaling operations of the image focus tool reduce the number of image manipulation tools, input device clicks (and other types of selections), and input device movements required to review images. The image focus tool also enables users to review images of differing imaging modality types using the same system and image review tool. Accordingly, the image focus tool improves the image review workflow, reduces the fatigue experienced by healthcare professionals during image review, and may reduce the need to learn and operate multiple image review systems and image review tools.
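For example, the pan offset needed to center an ROI at a given zoom can be computed directly. This sketch assumes pixel units and a hypothetical function name, with the default factor mirroring the 2× scaling example above:

```python
def focus_viewport(viewport_w: float, viewport_h: float,
                   roi_cx: float, roi_cy: float, scale: float = 2.0):
    """Compute the (x, y) pan applied to a scaled image so the ROI center
    lands at the viewport center."""
    pan_x = viewport_w / 2 - roi_cx * scale
    pan_y = viewport_h / 2 - roi_cy * scale
    return pan_x, pan_y
```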
Operating environment 400 typically includes at least some form of computer readable media. Computer readable media can be any available media that can be accessed by processing unit 402 or other devices comprising the operating environment. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transitory medium which can be used to store the desired information. Computer storage media does not include communication media.
Communication media embodies computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, microwave, and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
The operating environment 400 may be a single computer operating in a networked environment using logical connections to one or more remote computers. The remote computer may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above as well as others not so mentioned. The logical connections may include any method supported by available communications media. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
In aspects, system 500 may represent a computing environment comprising sensitive or private information associated with, for example, a healthcare facility, healthcare patients, and/or healthcare personnel. Although specific reference to a healthcare environment is described herein, it is contemplated that the techniques of the present disclosure may be practiced in other environments. For example, system 500 may represent a software development environment or an alternative environment that does not comprise sensitive or private medical information.
Computing device(s) 502 may be configured to collect, manipulate, and/or display input data from one or more users or devices. For example, computing device(s) 502 may collect input data from a healthcare professional, medical equipment (e.g., imaging devices, treatment devices, monitoring devices), medical workstations, data storage locations, etc. The input data may correspond to user interaction with one or more software applications or services implemented by, or accessible to, computing device(s) 502. The input data may include, for example, voice input, touch input, text-based input, gesture input, video input, and/or image input. The input data may be detected/collected using one or more sensor components of computing device(s) 502. Examples of sensors include microphones, touch-based sensors, geolocation sensors, accelerometers, optical/magnetic sensors, gyroscopes, keyboards, and pointing/selection tools. Examples of computing device(s) 502 may include, but are not limited to, personal computers (PCs), medical workstations, server devices, cloud-based devices, mobile devices (e.g., smartphones, tablets, laptops, personal digital assistants (PDAs)), and wearable devices (e.g., smart watches, smart eyewear, fitness trackers, smart clothing, body-mounted devices, head-mounted displays).
Computing device(s) 502 may comprise or otherwise have access to application(s) 504. Application(s) 504 may enable users to access and/or interact with one or more types of content, such as text, audio, images, video, and animation. As one example, application(s) 504 may represent a multimodality image processing and/or review service that enables healthcare professionals to review medical images. Although specific reference is made to an image processing application/service, alternative implementations are contemplated. For example, application(s) 504 may represent word processing applications, spreadsheet applications, presentation applications, document-reader software, social media software/platforms, search engines, media software/platforms, multimedia player software, content design software/tools, and database applications.
Application(s) 504 may comprise or have access to one or more data stores, such as data store(s) 506. Data store(s) 506 may comprise a corpus of content of various types (e.g., images, videos, documents, files, records). For example, data store(s) 506 may include image types, such as mammography images, MRI images, ultrasound images, etc. Data store(s) 506 may be stored and accessed locally on computing device(s) 502 or stored and accessed remotely via network 508. Examples of network 508 include, but are not limited to, personal area networks (PANs), local area networks (LANs), metropolitan area networks (MANs), and wide area networks (WANs). Application(s) 504 may retrieve content from data store(s) 506. Application(s) 504 may present the content using one or more display devices or components of computing device(s) 502. In a particular example, application(s) 504 may present the content according to a hanging protocol or a similar content display format. The hanging protocol may provide for displaying a sequence of content items, such as images, in respective viewports of the display device or component.
In some examples, application(s) 504 implements or has access to an auto-focus tool (not pictured) for reviewing presented content. The auto-focus tool may provide for automatically orienting the presented content in accordance with a user-selected area within the content. For example, a healthcare professional may select an ROI in one image of a set of presented images that includes mammography images, MRI images, and ultrasound images. The auto-focus tool may orient each of the presented images such that the identified ROI is prominently displayed in each of the images. Accordingly, the auto-focus tool may enable the healthcare professional to automatically pan to, zoom in on, set the orientation, and/or center and align (within the respective viewport for the content) an area of the content corresponding to the identified ROI in various content items.
The embodiments described herein may be employed using software, hardware, or a combination of software and hardware to implement and perform the systems and methods disclosed herein. Although specific devices have been recited throughout the disclosure as performing specific functions, one of skill in the art will appreciate that these devices are provided for illustrative purposes, and other devices may be employed to perform the functionality disclosed herein without departing from the scope of the disclosure.
This disclosure describes some embodiments of the present technology with reference to the accompanying drawings, in which only some of the possible embodiments are shown. Other aspects may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure is thorough and complete and fully conveys the scope of the possible embodiments to those skilled in the art.
Although specific embodiments are described herein, the scope of the technology is not limited to those specific embodiments. One skilled in the art will recognize other embodiments or improvements that are within the scope and spirit of the present technology. Therefore, the specific structure, acts, or media are disclosed only as illustrative embodiments. The scope of the technology is defined by the following claims and any equivalents therein.
This application claims the benefit of priority to U.S. Provisional Application No. 63/271,339, filed Oct. 25, 2021, which application is hereby incorporated in its entirety by reference.
10242490 | Karssemeijer | Mar 2019 | B2 |
10276265 | Reicher et al. | Apr 2019 | B2 |
10282840 | Moehrle et al. | May 2019 | B2 |
10335094 | DeFreitas | Jul 2019 | B2 |
10357211 | Smith | Jul 2019 | B2 |
10410417 | Chen et al. | Sep 2019 | B2 |
10413263 | Ruth et al. | Sep 2019 | B2 |
10444960 | Marshall | Oct 2019 | B2 |
10456213 | DeFreitas | Oct 2019 | B2 |
10573276 | Kreeger et al. | Feb 2020 | B2 |
10575807 | Gkanatsios | Mar 2020 | B2 |
10595954 | DeFreitas | Mar 2020 | B2 |
10624598 | Chen | Apr 2020 | B2 |
10977863 | Chen | Apr 2021 | B2 |
10978026 | Kreeger | Apr 2021 | B2 |
11419565 | Gkanatsios | Aug 2022 | B2 |
11508340 | Kreeger | Nov 2022 | B2 |
11701199 | DeFreitas | Jul 2023 | B2 |
20010038681 | Stanton et al. | Nov 2001 | A1 |
20010038861 | Hsu et al. | Nov 2001 | A1 |
20020012450 | Tsuji | Jan 2002 | A1 |
20020050986 | Inoue | May 2002 | A1 |
20020075997 | Unger et al. | Jun 2002 | A1 |
20020113681 | Byram | Aug 2002 | A1 |
20020122533 | Marie et al. | Sep 2002 | A1 |
20020188466 | Barrette et al. | Dec 2002 | A1 |
20020193676 | Bodicker | Dec 2002 | A1 |
20030007598 | Wang | Jan 2003 | A1 |
20030018272 | Treado et al. | Jan 2003 | A1 |
20030026386 | Tang | Feb 2003 | A1 |
20030048260 | Matusis | Mar 2003 | A1 |
20030073895 | Nields et al. | Apr 2003 | A1 |
20030095624 | Eberhard et al. | May 2003 | A1 |
20030097055 | Yanof | May 2003 | A1 |
20030128893 | Castorina | Jul 2003 | A1 |
20030135115 | Burdette et al. | Jul 2003 | A1 |
20030169847 | Karellas | Sep 2003 | A1 |
20030194050 | Eberhard | Oct 2003 | A1 |
20030194121 | Eberhard et al. | Oct 2003 | A1 |
20030194124 | Suzuki et al. | Oct 2003 | A1 |
20030195433 | Turovskiy | Oct 2003 | A1 |
20030210254 | Doan | Nov 2003 | A1 |
20030212327 | Wang | Nov 2003 | A1 |
20030215120 | Uppaluri | Nov 2003 | A1 |
20040008809 | Webber | Jan 2004 | A1 |
20040008900 | Jabri et al. | Jan 2004 | A1 |
20040008901 | Avinash | Jan 2004 | A1 |
20040036680 | Davis | Feb 2004 | A1 |
20040047518 | Tiana | Mar 2004 | A1 |
20040052328 | Saboi | Mar 2004 | A1 |
20040064037 | Smith | Apr 2004 | A1 |
20040066884 | Claus | Apr 2004 | A1 |
20040066904 | Eberhard et al. | Apr 2004 | A1 |
20040070582 | Smith et al. | Apr 2004 | A1 |
20040077938 | Mark et al. | Apr 2004 | A1 |
20040081273 | Ning | Apr 2004 | A1 |
20040094167 | Brady | May 2004 | A1 |
20040101095 | Jing et al. | May 2004 | A1 |
20040109028 | Stern et al. | Jun 2004 | A1 |
20040109529 | Eberhard et al. | Jun 2004 | A1 |
20040127789 | Ogawa | Jul 2004 | A1 |
20040138569 | Grunwald | Jul 2004 | A1 |
20040171933 | Stoller et al. | Sep 2004 | A1 |
20040171986 | Tremaglio, Jr. et al. | Sep 2004 | A1 |
20040267157 | Miller et al. | Dec 2004 | A1 |
20050047636 | Gines | Mar 2005 | A1 |
20050049521 | Miller et al. | Mar 2005 | A1 |
20050063509 | Defreitas et al. | Mar 2005 | A1 |
20050078797 | Danielsson et al. | Apr 2005 | A1 |
20050084060 | Seppi et al. | Apr 2005 | A1 |
20050089205 | Kapur | Apr 2005 | A1 |
20050105679 | Wu et al. | May 2005 | A1 |
20050107689 | Sasano | May 2005 | A1 |
20050111718 | MacMahon | May 2005 | A1 |
20050113680 | Ikeda et al. | May 2005 | A1 |
20050113681 | DeFreitas et al. | May 2005 | A1 |
20050113715 | Schwindt et al. | May 2005 | A1 |
20050124845 | Thomadsen et al. | Jun 2005 | A1 |
20050135555 | Claus | Jun 2005 | A1 |
20050135664 | Kaufhold | Jun 2005 | A1 |
20050226375 | Eberhard | Oct 2005 | A1 |
20060004278 | Giger et al. | Jan 2006 | A1 |
20060009693 | Hanover et al. | Jan 2006 | A1 |
20060018526 | Avinash | Jan 2006 | A1 |
20060025680 | Jeune-Iomme | Feb 2006 | A1 |
20060030784 | Miller et al. | Feb 2006 | A1 |
20060074288 | Kelly et al. | Apr 2006 | A1 |
20060098855 | Gkanatsios et al. | May 2006 | A1 |
20060129062 | Nicoson et al. | Jun 2006 | A1 |
20060132508 | Sadikali | Jun 2006 | A1 |
20060147099 | Marshall et al. | Jul 2006 | A1 |
20060154267 | Ma et al. | Jul 2006 | A1 |
20060155209 | Miller et al. | Jul 2006 | A1 |
20060197753 | Hotelling | Sep 2006 | A1 |
20060210131 | Wheeler | Sep 2006 | A1 |
20060228012 | Masuzawa | Oct 2006 | A1 |
20060238546 | Handley | Oct 2006 | A1 |
20060257009 | Wang | Nov 2006 | A1 |
20060269040 | Mertelmeier | Nov 2006 | A1 |
20060274928 | Collins et al. | Dec 2006 | A1 |
20060291618 | Eberhard et al. | Dec 2006 | A1 |
20070014468 | Gines et al. | Jan 2007 | A1 |
20070019846 | Bullitt et al. | Jan 2007 | A1 |
20070030949 | Jing et al. | Feb 2007 | A1 |
20070036265 | Jing et al. | Feb 2007 | A1 |
20070046649 | Reiner | Mar 2007 | A1 |
20070047793 | Wu et al. | Mar 2007 | A1 |
20070052700 | Wheeler et al. | Mar 2007 | A1 |
20070076844 | Defreitas et al. | Apr 2007 | A1 |
20070114424 | Danielsson et al. | May 2007 | A1 |
20070118400 | Morita et al. | May 2007 | A1 |
20070156451 | Gering | Jul 2007 | A1 |
20070223651 | Wagenaar et al. | Sep 2007 | A1 |
20070225600 | Weibrecht et al. | Sep 2007 | A1 |
20070236490 | Casteele | Oct 2007 | A1 |
20070242800 | Jing et al. | Oct 2007 | A1 |
20070263765 | Wu | Nov 2007 | A1 |
20070274585 | Zhang et al. | Nov 2007 | A1 |
20080019581 | Gkanatsios et al. | Jan 2008 | A1 |
20080043905 | Hassanpourgol | Feb 2008 | A1 |
20080045833 | DeFreitas et al. | Feb 2008 | A1 |
20080101537 | Sendai | May 2008 | A1 |
20080114614 | Mahesh et al. | May 2008 | A1 |
20080125643 | Huisman | May 2008 | A1 |
20080130979 | Ren | Jun 2008 | A1 |
20080139896 | Baumgart | Jun 2008 | A1 |
20080152086 | Hall | Jun 2008 | A1 |
20080165136 | Christie et al. | Jul 2008 | A1 |
20080187095 | Boone et al. | Aug 2008 | A1 |
20080198966 | Hjarn | Aug 2008 | A1 |
20080221479 | Ritchie | Sep 2008 | A1 |
20080229256 | Shibaike | Sep 2008 | A1 |
20080240533 | Piron et al. | Oct 2008 | A1 |
20080297482 | Weiss | Dec 2008 | A1 |
20090003519 | DeFreitas et al. | Jan 2009 | A1 |
20090005668 | West et al. | Jan 2009 | A1 |
20090005693 | Brauner | Jan 2009 | A1 |
20090010384 | Jing et al. | Jan 2009 | A1 |
20090034684 | Bernard | Feb 2009 | A1 |
20090037821 | O'Neal et al. | Feb 2009 | A1 |
20090063118 | Dachille | Mar 2009 | A1 |
20090079705 | Sizelove et al. | Mar 2009 | A1 |
20090080594 | Brooks et al. | Mar 2009 | A1 |
20090080602 | Brooks et al. | Mar 2009 | A1 |
20090080604 | Shores et al. | Mar 2009 | A1 |
20090080752 | Ruth | Mar 2009 | A1 |
20090080765 | Bernard et al. | Mar 2009 | A1 |
20090087067 | Khorasani | Apr 2009 | A1 |
20090123052 | Ruth | May 2009 | A1 |
20090129644 | Daw et al. | May 2009 | A1 |
20090135997 | Defreitas et al. | May 2009 | A1 |
20090138280 | Morita et al. | May 2009 | A1 |
20090143674 | Nields | Jun 2009 | A1 |
20090167702 | Nurmi | Jul 2009 | A1 |
20090171244 | Ning | Jul 2009 | A1 |
20090238424 | Arakita | Sep 2009 | A1 |
20090259958 | Ban | Oct 2009 | A1 |
20090268865 | Ren et al. | Oct 2009 | A1 |
20090278812 | Yasutake | Nov 2009 | A1 |
20090296882 | Gkanatsios et al. | Dec 2009 | A1 |
20090304147 | Jing et al. | Dec 2009 | A1 |
20100034348 | Yu | Feb 2010 | A1 |
20100049046 | Peiffer | Feb 2010 | A1 |
20100054400 | Ren et al. | Mar 2010 | A1 |
20100067648 | Kojima | Mar 2010 | A1 |
20100079405 | Bernstein | Apr 2010 | A1 |
20100086188 | Ruth et al. | Apr 2010 | A1 |
20100088346 | Urness et al. | Apr 2010 | A1 |
20100098214 | Star-Lack et al. | Apr 2010 | A1 |
20100105879 | Katayose et al. | Apr 2010 | A1 |
20100121178 | Krishnan | May 2010 | A1 |
20100131294 | Venon | May 2010 | A1 |
20100131482 | Linthicum et al. | May 2010 | A1 |
20100135558 | Ruth et al. | Jun 2010 | A1 |
20100152570 | Navab | Jun 2010 | A1 |
20100166147 | Abenaim | Jul 2010 | A1 |
20100166267 | Zhang | Jul 2010 | A1 |
20100171764 | Feng | Jul 2010 | A1 |
20100189322 | Sakagawa | Jul 2010 | A1 |
20100195882 | Ren et al. | Aug 2010 | A1 |
20100208037 | Sendai | Aug 2010 | A1 |
20100231522 | Li | Sep 2010 | A1 |
20100246884 | Chen et al. | Sep 2010 | A1 |
20100246909 | Blum | Sep 2010 | A1 |
20100259561 | Forutanpour et al. | Oct 2010 | A1 |
20100259645 | Kaplan | Oct 2010 | A1 |
20100260316 | Stein et al. | Oct 2010 | A1 |
20100280375 | Zhang | Nov 2010 | A1 |
20100293500 | Cragun | Nov 2010 | A1 |
20110018817 | Kryze | Jan 2011 | A1 |
20110019891 | Puong | Jan 2011 | A1 |
20110054944 | Sandberg et al. | Mar 2011 | A1 |
20110069808 | Defreitas et al. | Mar 2011 | A1 |
20110069906 | Park | Mar 2011 | A1 |
20110087132 | DeFreitas et al. | Apr 2011 | A1 |
20110105879 | Masumoto | May 2011 | A1 |
20110109650 | Kreeger | May 2011 | A1 |
20110110570 | Bar-Shalev | May 2011 | A1 |
20110110576 | Kreeger | May 2011 | A1 |
20110123073 | Gustafson | May 2011 | A1 |
20110125526 | Gustafson | May 2011 | A1 |
20110134113 | Ma et al. | Jun 2011 | A1 |
20110150447 | Li | Jun 2011 | A1 |
20110157154 | Bernard | Jun 2011 | A1 |
20110163939 | Tam et al. | Jul 2011 | A1 |
20110178389 | Kumar et al. | Jul 2011 | A1 |
20110182402 | Partain | Jul 2011 | A1 |
20110234630 | Batman et al. | Sep 2011 | A1 |
20110237927 | Brooks et al. | Sep 2011 | A1 |
20110242092 | Kashiwagi | Oct 2011 | A1 |
20110310126 | Georgiev et al. | Dec 2011 | A1 |
20120014501 | Pelc | Jan 2012 | A1 |
20120014504 | Jang | Jan 2012 | A1 |
20120014578 | Karssemeijer | Jan 2012 | A1 |
20120069951 | Toba | Mar 2012 | A1 |
20120106698 | Karim | May 2012 | A1 |
20120127297 | Baxi | May 2012 | A1 |
20120131488 | Karlsson et al. | May 2012 | A1 |
20120133600 | Marshall | May 2012 | A1 |
20120133601 | Marshall | May 2012 | A1 |
20120134464 | Hoernig et al. | May 2012 | A1 |
20120148151 | Hamada | Jun 2012 | A1 |
20120150034 | DeFreitas et al. | Jun 2012 | A1 |
20120189092 | Jerebko | Jul 2012 | A1 |
20120194425 | Buelow | Aug 2012 | A1 |
20120238870 | Smith et al. | Sep 2012 | A1 |
20120277625 | Nakayama | Nov 2012 | A1 |
20120293511 | Mertelmeier | Nov 2012 | A1 |
20130016255 | Bhatt | Jan 2013 | A1 |
20130022165 | Jang | Jan 2013 | A1 |
20130044861 | Muller | Feb 2013 | A1 |
20130059758 | Haick | Mar 2013 | A1 |
20130108138 | Nakayama | May 2013 | A1 |
20130121569 | Yadav | May 2013 | A1 |
20130121618 | Yadav | May 2013 | A1 |
20130202168 | Jerebko | Aug 2013 | A1 |
20130259193 | Packard | Oct 2013 | A1 |
20130272494 | DeFreitas | Oct 2013 | A1 |
20140033126 | Kreeger | Jan 2014 | A1 |
20140035811 | Guehring | Feb 2014 | A1 |
20140064444 | Oh | Mar 2014 | A1 |
20140073913 | DeFreitas et al. | Mar 2014 | A1 |
20140082542 | Zhang | Mar 2014 | A1 |
20140200433 | Choi | Jul 2014 | A1 |
20140219534 | Wiemker et al. | Aug 2014 | A1 |
20140219548 | Wels | Aug 2014 | A1 |
20140276061 | Lee et al. | Sep 2014 | A1 |
20140327702 | Kreeger et al. | Nov 2014 | A1 |
20140328517 | Gluncic | Nov 2014 | A1 |
20150004558 | Inglese | Jan 2015 | A1 |
20150052471 | Chen | Feb 2015 | A1 |
20150061582 | Smith | Apr 2015 | A1 |
20150238148 | Georgescu | Aug 2015 | A1 |
20150258271 | Love | Sep 2015 | A1 |
20150302146 | Marshall | Oct 2015 | A1 |
20150309712 | Marshall | Oct 2015 | A1 |
20150317538 | Ren et al. | Nov 2015 | A1 |
20150331995 | Zhao | Nov 2015 | A1 |
20160000399 | Halmann et al. | Jan 2016 | A1 |
20160022364 | DeFreitas et al. | Jan 2016 | A1 |
20160051215 | Chen | Feb 2016 | A1 |
20160078645 | Abdurahman | Mar 2016 | A1 |
20160140749 | Erhard | May 2016 | A1 |
20160210774 | Wiskin et al. | Jul 2016 | A1 |
20160228034 | Gluncic | Aug 2016 | A1 |
20160235380 | Smith | Aug 2016 | A1 |
20160350933 | Schieke | Dec 2016 | A1 |
20160364526 | Reicher et al. | Dec 2016 | A1 |
20160367210 | Gkanatsios | Dec 2016 | A1 |
20170071562 | Suzuki | Mar 2017 | A1 |
20170132792 | Jerebko et al. | May 2017 | A1 |
20170202453 | Sekiguchi | Jul 2017 | A1 |
20170262737 | Rabinovich | Sep 2017 | A1 |
20180008220 | Boone et al. | Jan 2018 | A1 |
20180008236 | Venkataraman et al. | Jan 2018 | A1 |
20180047211 | Chen et al. | Feb 2018 | A1 |
20180109698 | Ramsay et al. | Apr 2018 | A1 |
20180132722 | Eggers et al. | May 2018 | A1 |
20180137385 | Ren | May 2018 | A1 |
20180144244 | Masoud | May 2018 | A1 |
20180256118 | DeFreitas | Sep 2018 | A1 |
20190000318 | Caluser | Jan 2019 | A1 |
20190015173 | DeFreitas | Jan 2019 | A1 |
20190037173 | Lee | Jan 2019 | A1 |
20190043456 | Kreeger | Feb 2019 | A1 |
20190057778 | Porter et al. | Feb 2019 | A1 |
20190287241 | Hill et al. | Sep 2019 | A1 |
20190290221 | Smith | Sep 2019 | A1 |
20190325573 | Bernard | Oct 2019 | A1 |
20200046303 | DeFreitas | Feb 2020 | A1 |
20200054300 | Kreeger | Feb 2020 | A1 |
20200093562 | DeFreitas | Mar 2020 | A1 |
20200184262 | Chui | Jun 2020 | A1 |
20200205928 | DeFreitas | Jul 2020 | A1 |
20200253573 | Gkanatsios | Aug 2020 | A1 |
20200345320 | Chen | Nov 2020 | A1 |
20200390404 | DeFreitas | Dec 2020 | A1 |
20210000553 | St. Pierre | Jan 2021 | A1 |
20210100518 | Chui | Apr 2021 | A1 |
20210100626 | St. Pierre | Apr 2021 | A1 |
20210113167 | Chui | Apr 2021 | A1 |
20210118199 | Chui | Apr 2021 | A1 |
20210174504 | Madabhushi | Jun 2021 | A1 |
20210212665 | Tsymbalenko | Jul 2021 | A1 |
20220005277 | Chen | Jan 2022 | A1 |
20220013089 | Kreeger | Jan 2022 | A1 |
20220036545 | St. Pierre | Feb 2022 | A1 |
20220192615 | Chui | Jun 2022 | A1 |
20220254023 | McKinney | Aug 2022 | A1 |
20220386969 | Smith | Dec 2022 | A1 |
20230000467 | Shi | Jan 2023 | A1 |
20230033601 | Chui | Feb 2023 | A1 |
20230038498 | Xu | Feb 2023 | A1 |
20230053489 | Kreeger | Feb 2023 | A1 |
20230054121 | Chui | Feb 2023 | A1 |
20230056692 | Gkanatsios | Feb 2023 | A1 |
20230082494 | Chui | Mar 2023 | A1 |
20230098305 | St. Pierre | Mar 2023 | A1 |
20230103969 | St. Pierre | Apr 2023 | A1 |
20230124481 | St. Pierre | Apr 2023 | A1 |
20230225821 | DeFreitas | Jul 2023 | A1 |
20230344453 | Yang | Oct 2023 | A1 |
20240169958 | Kreeger | May 2024 | A1 |
20240315654 | Chui | Sep 2024 | A1 |
20240320827 | Chui | Sep 2024 | A1 |
Number | Date | Country |
---|---|---|
2014339982 | Apr 2015 | AU |
1802121 | Jul 2006 | CN |
1846662 | Oct 2006 | CN |
101066212 | Nov 2007 | CN |
102169530 | Aug 2011 | CN |
202161328 | Mar 2012 | CN |
102429678 | May 2012 | CN |
102473300 | May 2012 | CN |
105193447 | Dec 2015 | CN |
106659468 | May 2017 | CN |
107440730 | Dec 2017 | CN |
112561908 | Mar 2021 | CN |
102010009295 | Aug 2011 | DE |
102011087127 | May 2013 | DE |
775467 | May 1997 | EP |
928001 | Mar 2000 | EP |
1428473 | Jun 2004 | EP |
2236085 | Jun 2010 | EP |
2215600 | Aug 2010 | EP |
2301432 | Mar 2011 | EP |
2491863 | Aug 2012 | EP |
1986548 | Jan 2013 | EP |
2656789 | Oct 2013 | EP |
2823464 | Jan 2015 | EP |
2823765 | Jan 2015 | EP |
2889743 | Jul 2015 | EP |
3060132 | Apr 2019 | EP |
H09-35043 | Feb 1997 | JP |
H09-198490 | Jul 1997 | JP |
H09-238934 | Sep 1997 | JP |
H10-33523 | Feb 1998 | JP |
2000-200340 | Jul 2000 | JP |
2002-109510 | Apr 2002 | JP |
2002-282248 | Oct 2002 | JP |
2003-126073 | May 2003 | JP |
2003-189179 | Jul 2003 | JP |
2003-199737 | Jul 2003 | JP |
2003-531516 | Oct 2003 | JP |
2004-254742 | Sep 2004 | JP |
2005-110843 | Apr 2005 | JP |
2005-522305 | Jul 2005 | JP |
2005-227350 | Aug 2005 | JP |
2005-322257 | Nov 2005 | JP |
2006-519634 | Aug 2006 | JP |
2006-312026 | Nov 2006 | JP |
2007-130487 | May 2007 | JP |
2007-216022 | Aug 2007 | JP |
2007-325928 | Dec 2007 | JP |
2007-330334 | Dec 2007 | JP |
2007-536968 | Dec 2007 | JP |
2008-068032 | Mar 2008 | JP |
2008-518684 | Jun 2008 | JP |
2008-253401 | Oct 2008 | JP |
2009-034503 | Feb 2009 | JP |
2009-522005 | Jun 2009 | JP |
2009-526618 | Jul 2009 | JP |
2009-207545 | Sep 2009 | JP |
2010-137004 | Jun 2010 | JP |
2011-110175 | Jun 2011 | JP |
2012-011255 | Jan 2012 | JP |
2012-501750 | Jan 2012 | JP |
2012-061196 | Mar 2012 | JP |
2013-530768 | Aug 2013 | JP |
2013-244211 | Dec 2013 | JP |
2014-507250 | Mar 2014 | JP |
2014-534042 | Dec 2014 | JP |
2015-506794 | Mar 2015 | JP |
2015-144632 | Aug 2015 | JP |
2016-198197 | Dec 2016 | JP |
2016-059743 | Apr 2016 | JP |
2017-000364 | Jan 2017 | JP |
2017-056358 | Mar 2017 | JP |
10-2015-0010515 | Jan 2015 | KR |
10-2017-0062839 | Jun 2017 | KR |
9005485 | May 1990 | WO |
9317620 | Sep 1993 | WO |
9406352 | Mar 1994 | WO |
199700649 | Jan 1997 | WO |
199816903 | Apr 1998 | WO |
0051484 | Sep 2000 | WO |
2003020114 | Mar 2003 | WO |
03077202 | Sep 2003 | WO |
2005051197 | Jun 2005 | WO |
2005110230 | Nov 2005 | WO |
2005112767 | Dec 2005 | WO |
2006055830 | May 2006 | WO |
2006058160 | Jun 2006 | WO |
2007095330 | Aug 2007 | WO |
08014670 | Feb 2008 | WO |
2008047270 | Apr 2008 | WO |
2008050823 | May 2008 | WO |
2008054436 | May 2008 | WO |
2009026587 | Feb 2009 | WO |
2010028208 | Mar 2010 | WO |
2010059920 | May 2010 | WO |
2011008239 | Jan 2011 | WO |
2011043838 | Apr 2011 | WO |
2011065950 | Jun 2011 | WO |
2011073864 | Jun 2011 | WO |
2011091300 | Jul 2011 | WO |
2012001572 | Jan 2012 | WO |
2012068373 | May 2012 | WO |
2012063653 | May 2012 | WO |
2012112627 | Aug 2012 | WO |
2012122399 | Sep 2012 | WO |
2013001439 | Jan 2013 | WO |
2013035026 | Mar 2013 | WO |
2013078476 | May 2013 | WO |
2013123091 | Aug 2013 | WO |
2013136222 | Sep 2013 | WO |
2014080215 | May 2014 | WO |
2014149554 | Sep 2014 | WO |
2014207080 | Dec 2014 | WO |
2015061582 | Apr 2015 | WO |
2015066650 | May 2015 | WO |
2015130916 | Sep 2015 | WO |
2016103094 | Jun 2016 | WO |
2016184746 | Nov 2016 | WO |
2016206942 | Dec 2016 | WO |
2018183548 | Oct 2018 | WO |
2018183549 | Oct 2018 | WO |
2018183550 | Oct 2018 | WO |
2018236565 | Dec 2018 | WO |
2019032558 | Feb 2019 | WO |
2019091807 | May 2019 | WO |
2021021329 | Feb 2021 | WO |
2021168281 | Aug 2021 | WO |
2021195084 | Sep 2021 | WO |
Entry |
---|
European Extended Search Report in European Application 22203472.0, mailed Mar. 23, 2023, 8 pages. |
Duan, Xiaoman et al., “Matching corresponding regions of interest on cranio-caudal and medio-lateral oblique view mammograms”, IEEE Access, vol. 7, Mar. 25, 2019, pp. 31586-31597, XP011715754, DOI: 10.1109/Access.2019.2902854, retrieved on Mar. 20, 2019, abstract. |
Samulski, Maurice et al., “Optimizing case-based detection performance in a multiview CAD system for mammography”, IEEE Transactions on Medical Imaging, vol. 30, No. 4, Apr. 1, 2011, pp. 1001-1009, XP011352387, ISSN: 0278-0062, DOI: 10.1109/TMI.2011.2105886, abstract. |
Oza, Nikunj C., et al., Dietterich, T.G., Ed., “Ensemble methods in machine learning”, Jan. 1, 2005, Multiple Classifier Systems, Lecture Notes in Computer Science; LNCS, Springer-Verlag Berlin/Heidelberg, pp. 1-15, abstract. |
“Filtered Back Projection”, (NYGREN), published May 8, 2007, URL: http://web.archive.org/web/19991010131715/http://www.owlnet.rice.edu/~elec539/Projects97/cult/node2.html, 2 pgs. |
“Supersonic to feature Aixplorer Ultimate at ECR”, AuntMinnie.com, 3 pages (Feb. 2018). |
Al Sallab et al., “Self Learning Machines Using Deep Networks”, Soft Computing and Pattern Recognition (SoCPaR), 2011 Int'l. Conference of IEEE, Oct. 14, 2011, pp. 21-26. |
Berg, WA et al., “Combined screening with ultrasound and mammography vs mammography alone in women at elevated risk of breast cancer”, JAMA 299:2151-2163, 2008. |
Burbank, Fred, “Stereotactic Breast Biopsy: Its History, Its Present, and Its Future”, published in 1996 at the Southeastern Surgical Congress, 24 pages. |
Bushberg, Jerrold et al., “The Essential Physics of Medical Imaging”, 3rd ed., In: “The Essential Physics of Medical Imaging, Third Edition”, Dec. 28, 2011, Lippincott Williams & Wilkins, Philadelphia, PA, USA, XP05579051, pp. 270-272. |
Caroline, B.E. et al., “Computer aided detection of masses in digital breast tomosynthesis: A review”, 2012 International Conference on Emerging Trends in Science, Engineering and Technology (INCOSET), Tiruchirappalli, 2012, pp. 186-191. |
Carton, AK, et al., “Dual-energy contrast-enhanced digital breast tomosynthesis—a feasibility study”, BR J Radiol. Apr. 2010;83 (988):344-50. |
Chan, Heang-Ping et al., “Computer-aided detection system for breast masses on digital tomosynthesis mammograms: Preliminary Experience”, Radiology, Dec. 2005, 1075-1080. |
Chan, Heang-Ping et al., “ROC Study of the effect of stereoscopic imaging on assessment of breast lesions,” Medical Physics, vol. 32, No. 4, Apr. 2005, 1001-1009. |
Chen, SC, et al., “Initial clinical experience with contrast-enhanced digital breast tomosynthesis”, Acad Radiol. Feb. 2007 14(2):229-38. |
Conner, Peter, “Breast Response to Menopausal Hormone Therapy—Aspects on Proliferation, apoptosis and Mammographic Density”, 2007 Annals of Medicine, 39:1, 28-41. |
Diekmann, Felix et al., “Thick Slices from Tomosynthesis Data Sets: Phantom Study for the Evaluation of Different Algorithms”, Journal of Digital Imaging, Springer, vol. 22, No. 5, Oct. 23, 2007, pp. 519-526. |
Diekmann, Felix., et al., “Digital mammography using iodine-based contrast media: initial clinical experience with dynamic contrast medium enhancement”, Invest Radiol 2005; 40:397-404. |
Dromain C., et al., “Contrast enhanced spectral mammography: a multi-reader study”, RSNA 2010, 96th Scientific Assembly and Scientific Meeting. |
Dromain, C., et al., “Contrast-enhanced digital mammography”, Eur J Radiol. 2009; 69:34-42. |
Dromain, Clarisse et al., “Dual-energy contrast-enhanced digital mammography: initial clinical results”, European Radiology, Sep. 14, 2010, vol. 21, pp. 565-574. |
Dromain, Clarisse, et al., “Evaluation of tumor angiogenesis of breast carcinoma using contrast-enhanced digital mammography”, AJR: 187, Nov. 2006, 16 pages. |
E. Shaw de Paredes et al., “Interventional Breast Procedure”, published Sep./Oct. 1998 in Curr Probl Diagn Radiol, pp. 138-184. |
EFilm Mobile HD by Merge Healthcare, web site: http://itunes.apple.com/bw/app/efilm-mobile-hd/id405261243?mt=8, accessed on Nov. 3, 2011 (2 pages). |
EFilm Solutions, eFilm Workstation (tm) 3.4, website: http://estore.merge.com/na/estore/content.aspx?productID=405, accessed on Nov. 3, 2011 (2 pages). |
Elbakri, Idris A. et al., “Automatic exposure control for a slot scanning full field digital mammography system”, Med. Phys. 2005; Sep. 32(9):2763-2770, Abstract only. |
Ertas, M. et al., “2D versus 3D total variation minimization in digital breast tomosynthesis”, 2015 IEEE International Conference on Imaging Systems and Techniques (IST), Macau, 2015, pp. 1-4. |
Feng, Steve Si Jia, et al., “Clinical digital breast tomosynthesis system: Dosimetric Characterization”, Radiology, Apr. 2012, 263(1); pp. 35-42. |
Fischer Imaging Corp, Mammotest Plus manual on minimally invasive breast biopsy system, 2002, 8 pages. |
Fischer Imaging Corporation, Installation Manual, MammoTest Family of Breast Biopsy Systems, 86683G, 86684G, P-55957-IM, Issue 1, Revision 3, Jul. 2005, 98 pages. |
Fischer Imaging Corporation, Operator Manual, MammoTest Family of Breast Biopsy Systems, 86683G, 86684G, P-55956-OM, Issue 1, Revision 6, Sep. 2005, 258 pages. |
Freiherr, G., “Breast tomosynthesis trials show promise”, Diagnostic Imaging—San Francisco 2005, V27; N4:42-48. |
Georgian-Smith, Dianne, et al., “Stereotactic Biopsy of the Breast Using an Upright Unit, a Vacuum-Suction Needle, and a Lateral Arm-Support System”, 2001, at the American Roentgen Ray Society meeting, 8 pages. |
Ghiassi, M. et al., “A Dynamic Architecture for Artificial Neural Networks”, Neurocomputing, vol. 63, Aug. 20, 2004, pp. 397-413. |
Giger et al. “Development of a smart workstation for use in mammography”, in Proceedings of SPIE, vol. 1445 (1991), pp. 101-103; 4 pages. |
Giger et al., “An Intelligent Workstation for Computer-aided Diagnosis”, in RadioGraphics, May 1993, 13:3 pp. 647-656; 10 pages. |
Glick, Stephen J., “Breast CT”, Annual Rev. Biomed. Eng., Sep. 2007;501-26. |
Hologic, “Lorad StereoLoc II” Operator's Manual 9-500-0261, Rev. 005, 2004, 78 pgs. |
Hologic, Inc., 510(k) Summary, prepared Nov. 28, 2010, for Affirm Breast Biopsy Guidance System Special 510(k) Premarket Notification, 5 pages. |
Hologic, Inc., 510(k) Summary, prepared Aug. 14, 2012, for Affirm Breast Biopsy Guidance System Special 510(k) Premarket Notification, 5 pages. |
ICRP Publication 60: 1990 Recommendations of the International Commission on Radiological Protection, 12 pages. |
Ijaz, Umer Zeeshan, et al., “Mammography phantom studies using 3D electrical impedance tomography with numerical forward solver”, Frontiers in the Convergence of Bioscience and Information Technologies 2007, 379-383. |
Jochelson, M., et al., “Bilateral Dual Energy contrast-enhanced digital mammography: Initial Experience”, RSNA 2010, 96th Scientific Assembly and Scientific Meeting. 1 page. |
Jong, RA, et al., Contrast-enhanced digital mammography: initial clinical experience. Radiology 2003; 228:842-850. |
Kao, Tzu-Jen et al., “Regional admittivity spectra with tomosynthesis images for breast cancer detection”, Proc. Of the 29th Annual Int'l. Conf. of the IEEE EMBS, Aug. 23-26, 2007, 4142-4145. |
Koechli, Ossi R., “Available Stereotactic Systems for Breast Biopsy”, Renzo Brun del Re (Ed.), Minimally Invasive Breast Biopsies, Recent Results in Cancer Research 173:105-113; Springer-Verlag, 2009. |
Kopans, Daniel B., “Breast Imaging”, 3rd Edition, Lippincott Williams and Wilkins, published Nov. 2, 2006, pp. 960-967. |
Kopans, et al. Will tomosynthesis replace conventional mammography? Plenary Session SFN08: RSNA 2005. |
Lehman, CD, et al. MRI evaluation of the contralateral breast in women with recently diagnosed breast cancer. N Engl J Med 2007; 356:1295-1303. |
Lewin, JM, et al., Dual-energy contrast-enhanced digital subtraction mammography: feasibility. Radiology 2003; 229:261-268. |
Lilja, Mikko, “Fast and accurate voxel projection technique in free-form cone-beam geometry with application to algebraic reconstruction,” Applied Sciences on Biomedical and Communication Technologies, 2008 (ISABEL '08), First International Symposium on, IEEE, Piscataway, NJ, Oct. 25, 2008. |
Lindfors, KK, et al., Dedicated breast CT: initial clinical experience. Radiology 2008; 246(3): 725-733. |
Mahesh, Mahadevappa, “AAPM/RSNA Physics Tutorial for Residents—Digital Mammography: An Overview”, RadioGraphics, Nov.-Dec. 2004, vol. 24, No. 6, 1747-1760. |
Metheany, Kathrine G. et al., “Characterizing anatomical variability in breast CT images”, Oct. 2008, Med. Phys. 35 (10); 4685-4694. |
Niklason, L., et al., Digital tomosynthesis in breast imaging. Radiology. Nov. 1997; 205(2):399-406. |
Pathmanathan et al., “Predicting tumour location by simulating large deformations of the breast using a 3D finite element model and nonlinear elasticity”, Medical Image Computing and Computer-Assisted Intervention, pp. 217-224, vol. 3217 (2004). |
Pediconi, “Color-coded automated signal intensity-curve for detection and characterization of breast lesions: Preliminary evaluation of new software for MR-based breast imaging,” International Congress Series 1281 (2005) 1081-1086. |
Poplack, SP, et al., Digital breast tomosynthesis: initial experience in 98 women with abnormal digital screening mammography. AJR Am J Roentgenology Sep. 2007 189(3):616-23. |
Prionas, ND, et al., Contrast-enhanced dedicated breast CT: initial clinical experience. Radiology. Sep. 2010 256(3):714-723. |
Rafferty, E. et al., “Assessing Radiologist Performance Using Combined Full-Field Digital Mammography and Breast Tomosynthesis Versus Full-Field Digital Mammography Alone: Results” . . . presented at 2007 Radiological Society of North America meeting, Chicago IL. |
Reynolds, April, “Stereotactic Breast Biopsy: A Review”, Radiologic Technology, vol. 80, No. 5, Jun. 1, 2009, pp. 447M-464M, XP055790574. |
Bakic et al., “Mammogram synthesis using a 3D simulation. I. breast tissue model and image acquisition simulation” Medical Physics. 29, pp. 2131-2139 (2002). |
Samani, A. et al., “Biomechanical 3-D Finite Element Modeling of the Human Breast Using MRI Data”, 2001, IEEE Transactions on Medical Imaging, vol. 20, No. 4, pp. 271-279. |
Sechopoulos, et al., “Glandular radiation dose in tomosynthesis of the breast using tungsten targets”, Journal of Applied Clinical Medical Physics, vol. 8, No. 4, Fall 2008, 161-171. |
Schrading, Simone et al., “Digital Breast Tomosynthesis-guided Vacuum-assisted Breast Biopsy: Initial Experiences and Comparison with Prone Stereotactic Vacuum-assisted Biopsy”, the Department of Diagnostic and Interventional Radiology, Univ. of Aachen, Germany, published Nov. 12, 2014, 10 pgs. |
Smith, A., “Full field breast tomosynthesis”, Radiol Manage. Sep.-Oct. 2005; 27(5):25-31. |
Taghibakhsh, F. et al., “High dynamic range 2-TFT amplified pixel sensor architecture for digital mammography tomosynthesis”, IET Circuits Devices Syst., 2007, 1(1), pp. 87-92. |
Van Schie, Guido, et al., “Generating Synthetic Mammograms from Reconstructed Tomosynthesis Volumes”, IEEE Transactions on Medical Imaging, vol. 32, No. 12, Dec. 2013, 2322-2331. |
Van Schie, Guido, et al., “Mass detection in reconstructed digital breast tomosynthesis volumes with a computer-aided detection system trained on 2D mammograms”, Med. Phys. 40(4), Apr. 2013, 41902-1-41902-11. |
Varjonen, Mari, “Three-Dimensional Digital Breast Tomosynthesis in the Early Diagnosis and Detection of Breast Cancer”, IWDM 2006, LNCS 4046, 152-159. |
Weidner N, et al., “Tumor angiogenesis and metastasis: correlation in invasive breast carcinoma”, New England Journal of Medicine 1991; 324:1-8. |
Weidner, N, “The importance of tumor angiogenesis: the evidence continues to grow”, AM J Clin Pathol. Nov. 2004 122(5):696-703. |
Wen, Junhai et al., “A study on truncated cone-beam sampling strategies for 3D mammography”, 2004, IEEE, 3200-3204. |
Williams, Mark B. et al., “Optimization of exposure parameters in full field digital mammography”, Medical Physics 35, 2414 (May 20, 2008); doi: 10.1118/1.2912177, pp. 2414-2423. |
Wodajo, Felasfa, MD, “Now Playing: Radiology Images from Your Hospital PACS on your iPad,” Mar. 17, 2010; web site: http://www.imedicalapps.com/2010/03/now-playing-radiology-images-from-your-hospital-pacs-on-your-ipad/, accessed on Nov. 3, 2011 (3 pages). |
Yin, H.M., et al., “Image Parser: a tool for finite element generation from three-dimensional medical images”, BioMedical Engineering Online. 3:31, pp. 1-9, Oct. 1, 2004. |
Zhang, Yiheng et al., “A comparative study of limited-angle cone-beam reconstruction methods for breast tomosynthesis”, Med Phys., Oct. 2006, 33(10): 3781-3795. |
Zhao, Bo, et al., “Imaging performance of an amorphous selenium digital mammography detector in a breast tomosynthesis system”, May 2008, Med. Phys 35(5); 1978-1987. |
Cho, N. et al., “Distinguishing Benign from Malignant Masses at Breast US: Combined US Elastography and Color Doppler US-Influence on Radiologist Accuracy”, Radiology, 262(1): 80-90 (Jan. 2012). |
Green, C. et al., “Deformable mapping using biochemical models to relate corresponding lesions in digital breast tomosynthesis and automated breast ultrasound images”, Medical Image Analysis, 60: 1-18 (Nov. 2019). |
Kim, Eun Sil, et al., “Significance of microvascular evaluation of ductal lesions on breast ultrasonography: Influence on diagnostic performance”, Clinical Imaging, Elsevier, NY, vol. 51, Jun. 6, 2018, pp. 252-259. |
Lee, E. et al., “Combination of Quantitative Parameters of Shear Wave Elastography and Superb Microvascular Imaging to Evaluate Breast Masses”, Korean Journal of Radiology: Official Journal of the Korean Radiological Society, 21(9): 1045-1054 (Jan. 2020). |
Love, Susan M., et al. “Anatomy of the nipple and breast ducts revisited”, Cancer, American Cancer Society, Philadelphia, PA, vol. 101, No. 9, Sep. 20, 2004, pp. 1947-1957. |
Number | Date | Country | |
---|---|---|---|
20230125385 A1 | Apr 2023 | US |
Number | Date | Country | |
---|---|---|---|
63271339 | Oct 2021 | US |