This is a non-provisional application of provisional application Ser. No. 61/266,526, filed Dec. 4, 2009, by Markus Lendl.
This invention concerns a medical image data processing system for automatically selecting images showing an anatomically invasive instrument having a pair of instrument identification marker objects.
It is desirable to have precise and clear visibility of a stent in an angiographic image for evaluation of stent placement. A stent is used as an example of an object used invasively, such as during a PTCA (Percutaneous Transluminal Coronary Angioplasty) procedure. The location and inflation status of a stent are of particular interest. A stent comprises a mesh of fine wires (struts), and an X-ray based angiographic system is typically used for visualization of a stent during placement. Displaying stent struts is particularly challenging when a patient is large or X-ray beams are applied at steep angles. In order to improve image quality for stent imaging, multiple images may be registered (aligned) based on the location of balloon marker balls on a stent and subsequently averaged. Correctly performed, this procedure increases the CNR (Contrast to Noise Ratio) significantly and improves visibility of stent struts, or at least the limits of the stent. A pre-condition for a reasonable outcome of this image processing procedure is reliable selection of “consistent” and “sharp” image frames for further post-processing, such as registration and averaging. In this context, “consistent” means that the stent needs to have the same shape in images used for post-processing. Image frames that include a stent with different curvature typically result in sub-optimal post-processing results. A “sharp” frame can be defined in terms of visibility of the marker ball borders and, of course, stent struts. Sharpness is degraded by motion blur, and a blurred image decreases the quality of image post-processing results.
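The gain from registering and averaging can be illustrated numerically: for uncorrelated noise, averaging N aligned frames reduces the noise standard deviation by roughly a factor of √N, raising the CNR accordingly. The following minimal simulation uses illustrative values and is not part of the disclosure:

```python
import random
import statistics

def average_frames(frames):
    """Pixel-wise average of equally sized 1-D 'frames' (lists of floats)."""
    n = len(frames)
    return [sum(col) / n for col in zip(*frames)]

# Simulate 16 noisy observations of the same underlying signal value.
random.seed(0)
true_value = 100.0
noise_sigma = 10.0
frames = [[true_value + random.gauss(0.0, noise_sigma) for _ in range(1000)]
          for _ in range(16)]

single_frame_sigma = statistics.pstdev(frames[0])
averaged = average_frames(frames)
averaged_sigma = statistics.pstdev(averaged)

# Averaging 16 frames should shrink the noise by roughly sqrt(16) = 4.
print(round(single_frame_sigma / averaged_sigma, 1))
```

This is why frame selection matters: averaging misaligned or blurred frames forfeits this noise reduction and instead smears the stent struts.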
A system provides robust automated selection of specific medical image frames for further post-processing from an angiographic multi-frame image sequence that contains balloon markers, using statistical analysis and application of multiple different criteria (e.g., marker velocity). A medical image data processing system automatically selects images showing an anatomically invasive instrument having a pair of instrument identification marker objects, for use in Angiography or another medical procedure. An image data processor automatically identifies one or more candidate image objects potentially representing invasive instrument marker objects in multiple images in a sequence of acquired images in response to predetermined size and shape data of marker objects. The image data processor identifies pairs of the identified candidate image objects potentially indicating an invasive instrument, in response to predetermined data associated with relative location of individual marker objects of a pair of identification marker objects of an invasive instrument. The image data processor selects in the multiple images, at least one of the identified pairs of identified candidate image objects in response to predetermined criteria and selects images of the multiple images associated with a selected pair of identified candidate image objects.
A system according to invention principles selects “consistent” and “sharp” images showing an anatomically invasive instrument having a pair of instrument identification marker objects. The system selects images for further post-processing (like registration and averaging) from a sequence of images by identifying “consistent” and “sharp” image frames. In the “consistent” and “sharp” image frames stents have substantially the same shape and marker balls and stent struts are substantially not degraded by motion blur. The system employs statistical marker pair selection based on multiple predetermined criteria concerning pre-classified marker-like objects in images. A marker sphere as used herein comprises a sphere or another radio-opaque object used to mark position or boundaries of a stent or invasive instrument.
Server 20 includes image data processor 29 and system and imaging controller 34. User interface 31 generates data representing display images comprising a Graphical User Interface (GUI) for presentation on display 19 of processing device 12. Imaging controller 34 controls operation of imaging device 25 in response to user commands entered via data entry device 26. In alternative arrangements, one or more of the units in server 20 may be located in device 12 or in another device connected to network 21.
Image data processor 29 automatically identifies one or more candidate image objects potentially representing invasive instrument marker objects in multiple images in a sequence of acquired images in response to predetermined size and shape data of marker objects. Processor 29 identifies pairs of the identified candidate image objects potentially indicating an invasive instrument, in response to predetermined data associated with relative location of individual marker objects of a pair of identification marker objects of an invasive instrument. Processor 29 further selects in the multiple images, at least one of the identified pairs of identified candidate image objects in response to predetermined criteria and selects images of said plurality of images associated with a selected pair of identified candidate image objects.
In step 309 of
Processor 29, in step 312, identifies an image object pair as candidate stent balloon marker objects based on predetermined identification criteria stored in repository 17 and by considering object clusters. The identification criteria include: (a) an object pair occurs in multiple frames; (b) the distance between objects does not change substantially between successive images, e.g., objects are separated by a length within a predetermined range (e.g., +/−20 pixels); (c) balloon orientation, as determined by a line connecting an object pair, does not change substantially between successive images, e.g., variation of the direction of a line connecting an object pair is within a predetermined range (e.g., +/−10°); and (d) movement of the object pair location, as determined by a midpoint between the object pair, is limited between successive images, e.g., the object pair midpoint remains within a predetermined range (e.g., +/−50 pixels).
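The frame-to-frame consistency tests (b) through (d) above can be sketched as a simple predicate. The threshold defaults mirror the example ranges given in the text; the function names and pair representation are illustrative assumptions, not part of the disclosure:

```python
import math

def pair_is_consistent(pair_prev, pair_curr,
                       max_distance_change=20.0,   # pixels, per criterion (b)
                       max_angle_change=10.0,      # degrees, per criterion (c)
                       max_midpoint_move=50.0):    # pixels, per criterion (d)
    """Check whether a candidate marker pair behaves consistently between
    two successive frames. Each pair is ((x1, y1), (x2, y2))."""
    (a1, b1), (a2, b2) = pair_prev, pair_curr

    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def angle(p, q):
        return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))

    def midpoint(p, q):
        return ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)

    # (b) separation between the two markers stays within the allowed change
    if abs(dist(a1, b1) - dist(a2, b2)) > max_distance_change:
        return False
    # (c) orientation of the connecting line changes only within the range
    delta = abs(angle(a1, b1) - angle(a2, b2)) % 360.0
    if min(delta, 360.0 - delta) > max_angle_change:
        return False
    # (d) the pair midpoint moves no more than the allowed distance
    if dist(midpoint(a1, b1), midpoint(a2, b2)) > max_midpoint_move:
        return False
    return True

# A pair that barely moves passes; one whose midpoint jumps far away fails.
steady = pair_is_consistent(((10, 10), (60, 10)), ((12, 11), (61, 12)))
jumpy = pair_is_consistent(((10, 10), (60, 10)), ((200, 300), (260, 310)))
print(steady, jumpy)  # True False
```

Criterion (a), occurrence in multiple frames, is naturally checked at the cluster level rather than per frame pair.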
Processor 29, in step 315, selects image object pairs from the candidate pairs identified in step 312 by selecting a winning group (cluster) of pairs associated with different image frames, namely the cluster having the highest number of pairs. In selecting pairs, the system recognizes that only a single object pair can be the correct marker pair in a particular image. If there is more than one pair associated with the same image, the pair with the higher contrast (defined as the grey level difference between the object area and its background) is chosen. If multiple object pair groups have the same number of members, the system uses an average contrast value as the criterion to decide which group wins, i.e., the group having the highest average contrast value is selected. Processor 29 in step 315 further selects images associated with a selected winning object pair in a selected winning group, so that a single catheter and a single marker object pair present in a single image are selected. Thereby, if there is more than one marker object pair in a sequence of images, only one pair wins.
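The winning-group selection, with cluster size as the primary criterion and average contrast as the tie-breaker, might be sketched as follows. The data layout, group identifiers, and contrast values are assumptions chosen for illustration:

```python
def select_winning_group(groups):
    """Pick the winning cluster of candidate marker pairs.

    'groups' maps a group id to a list of pairs, where each pair is a dict
    carrying a 'contrast' value (grey-level difference between the object
    area and its background). The group with the most member pairs wins;
    ties are broken by the highest average contrast."""
    def score(item):
        group_id, pairs = item
        avg_contrast = sum(p["contrast"] for p in pairs) / len(pairs)
        return (len(pairs), avg_contrast)
    winner_id, _winner_pairs = max(groups.items(), key=score)
    return winner_id

groups = {
    "A": [{"contrast": 30}, {"contrast": 28}],
    "B": [{"contrast": 45}, {"contrast": 50}],   # ties with A on size
    "C": [{"contrast": 12}],
}
print(select_winning_group(groups))  # B: same size as A, higher contrast
```

Because the key is the tuple (cluster size, average contrast), Python's lexicographic tuple comparison applies the size criterion first and the contrast tie-breaker second, matching the order described above.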
In step 317, processor 29 discards fast moving object pairs, comprising image object pairs that move substantially between successive images in an image sequence, and registers and averages multiple images in order to improve image quality for stent imaging. The multiple images are registered (aligned) based on the location of the identified balloon marker object pairs of a stent and the images are subsequently averaged. This procedure increases the CNR (Contrast to Noise Ratio) significantly and improves visibility of stent struts and the limits of the stent. Processor 29 discards fast moving object pairs that are associated with transitional heart phases (contraction, expansion) to avoid using blurred object pairs in aligning different images, which would result in degraded image alignment. This improves image alignment for patients undergoing a PTCA (Percutaneous Transluminal Coronary Angioplasty) procedure, who tend to exhibit arrhythmic heart beat cycles.
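A minimal sketch of discarding fast-moving pairs and then averaging the surviving, aligned frames is given below. For brevity it uses translation-only alignment on 1-D pixel rows; the velocity threshold and helper names are illustrative assumptions, not values from the disclosure:

```python
import math

def keep_slow_frames(midpoints, velocity_threshold=15.0):
    """Return indices of frames whose marker-pair midpoint moved less than
    'velocity_threshold' pixels since the previous frame (the first frame
    is always kept). The threshold value is illustrative."""
    kept = [0]
    for i in range(1, len(midpoints)):
        (x0, y0), (x1, y1) = midpoints[i - 1], midpoints[i]
        if math.hypot(x1 - x0, y1 - y0) < velocity_threshold:
            kept.append(i)
    return kept

def align_and_average(rows, offsets):
    """Average 1-D pixel rows after shifting each by its integer offset
    (a stand-in for full 2-D registration on the marker locations)."""
    width = len(rows[0])
    out = []
    for x in range(width):
        samples = [row[(x + off) % width] for row, off in zip(rows, offsets)]
        out.append(sum(samples) / len(samples))
    return out

# Frames 2 and 3 move far relative to their predecessors and are dropped.
midpoints = [(100, 100), (103, 101), (160, 150), (105, 99)]
print(keep_slow_frames(midpoints))                       # [0, 1]
print(align_and_average([[1, 2, 3], [2, 3, 1]], [0, 1]))  # [2.0, 1.5, 2.5]
```

In practice the registration step would warp full 2-D frames so the two balloon marker locations coincide before averaging; the 1-D version above only conveys the shift-then-average structure.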
System 10 (
In step 623, image data processor 29 automatically identifies pairs of the identified candidate image objects potentially indicating an invasive instrument, in response to predetermined data associated with relative location of individual marker objects of a pair of identification marker objects of an invasive instrument in the multiple images. The image data processor excludes pairs of identified candidate image objects, from the identified pairs of identified candidate image objects, having a distance between the identified image objects outside of a predetermined range. Image data processor 29 identifies the pairs of the identified candidate image objects, in response to predetermined criteria and determining at least one of, (a) a distance between identified candidate image objects does not change substantially over the multiple images, (b) identified candidate image object orientation indicated by a projected line between a candidate pair of the identified candidate image objects does not change substantially over the multiple images and (c) movement of a candidate pair of the identified candidate image objects determined using at least a portion of the projected line does not change substantially over the multiple images. Image data processor 29 in step 626 automatically selects in the multiple images, at least one of the identified pairs of identified candidate image objects in response to predetermined criteria and/or as a pair with the highest contrast between object area and the object background.
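The distance-range exclusion described above can be sketched as a simple filter over candidate pairs. The range bounds below are illustrative placeholders, not values from the disclosure:

```python
import math

def filter_pairs_by_distance(pairs, min_dist=10.0, max_dist=120.0):
    """Keep only candidate pairs whose marker separation lies inside a
    predetermined range (bounds here are illustrative placeholders)."""
    kept = []
    for (p, q) in pairs:
        d = math.hypot(p[0] - q[0], p[1] - q[1])
        if min_dist <= d <= max_dist:
            kept.append((p, q))
    return kept

pairs = [((0, 0), (50, 0)),    # 50 px apart: plausible balloon markers
         ((0, 0), (3, 0)),     # 3 px apart: too close, likely noise
         ((0, 0), (400, 0))]   # 400 px apart: too far for one balloon
print(len(filter_pairs_by_distance(pairs)))  # 1
```

The remaining criteria in step 623 (stable separation, stable orientation of the projected line, and limited movement) are per-sequence checks applied across the multiple images rather than within a single frame.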
In step 628, image data processor 29 automatically selects images of the multiple images associated with a selected pair of identified candidate image objects. Image data processor 29 excludes from use in image selection identified pairs of identified candidate image objects having a movement velocity between image frames exceeding a predetermined threshold velocity value. The image data processor also excludes from use in image selection, images having less than two identified candidate image objects. Image data processor 29 determines a movement velocity of an identified pair of identified candidate image objects between image frames by determining movement distance of substantially a mid-point of the pair of identified candidate image objects occurring between a successive pair of image frames.
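The midpoint-based velocity measure used for the exclusion in step 628 can be sketched as follows; the function name and frame-interval handling are assumptions for illustration:

```python
import math

def midpoint_velocity(pair_a, pair_b, frame_interval=1.0):
    """Velocity of a marker pair between two frames, measured as the
    displacement of the pair midpoint divided by the frame interval.
    Units are pixels per frame when frame_interval is 1.0."""
    (p1, p2), (q1, q2) = pair_a, pair_b
    mid_a = ((p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0)
    mid_b = ((q1[0] + q2[0]) / 2.0, (q1[1] + q2[1]) / 2.0)
    return math.hypot(mid_b[0] - mid_a[0],
                      mid_b[1] - mid_a[1]) / frame_interval

v = midpoint_velocity(((10, 10), (60, 10)), ((16, 10), (66, 10)))
print(v)  # midpoint moved from (35, 10) to (41, 10): 6.0 pixels per frame
```

A pair would then be excluded whenever this value exceeds the predetermined threshold velocity, and any frame with fewer than two candidate objects is excluded outright since no midpoint can be formed.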
Image data processor 29 identifies in the multiple images, at least one group of one or more of the identified pairs of identified candidate image objects in response to predetermined criteria; and selects images of the multiple images associated with an identified pair of identified candidate image objects in the at least one group. The image data processor identifies the group in response to the predetermined criteria indicating at least one of, (a) identified corresponding candidate image objects in the multiple images are within a predetermined threshold distance of each other, (b) the direction of a projected line joining an identified pair of identified candidate image objects in the multiple images is within a predetermined threshold angular range over the multiple images and (c) the median point of identified pairs of corresponding identified candidate image objects in the multiple images is within a predetermined threshold distance over the multiple images. Based on the velocity information, image data processor 29 in step 629 excludes images containing fast moving candidate image objects that may degrade the final resulting image. Image data processor 29 in step 630 aligns and averages the selected images of the multiple images based on the location of the selected identified pair of identified candidate image objects, to improve stent visibility. The process of
A processor as used herein is a computer, processing device, logic array or other device for executing machine-readable instructions stored on a computer readable medium, for performing tasks and may comprise any one or combination of, hardware and firmware. A processor may also comprise memory storing machine-readable instructions executable for performing tasks. A processor acts upon information by manipulating, analyzing, modifying, converting or transmitting information for use by an executable procedure or an information device, and/or by routing the information to an output device. A processor may use or comprise the capabilities of a controller or microprocessor, for example, and is conditioned using executable instructions to perform special purpose functions not performed by a general purpose computer. A processor may be coupled (electrically and/or as comprising executable components) with any other processor enabling interaction and/or communication there-between. A display processor or generator is a known element comprising electronic circuitry or software or a combination of both for generating display images or portions thereof.
An executable application, as used herein, comprises code or machine readable instructions for conditioning the processor to implement predetermined functions, such as those of an operating system, a context data acquisition system or other information processing system, for example, in response to user command or input. An executable procedure is a segment of code or machine readable instruction, sub-routine, or other distinct section of code or portion of an executable application for performing one or more particular processes. These processes may include receiving input data and/or parameters, performing operations on received input data and/or performing functions in response to received input parameters, and providing resulting output data and/or parameters. A user interface (UI), as used herein, comprises one or more display images, generated by a display processor and enabling user interaction with a processor or other device and associated data acquisition and processing functions.
The UI also includes an executable procedure or executable application. The executable procedure or executable application conditions the display processor to generate signals representing the UI display images. These signals are supplied to a display device which displays the image for viewing by the user. The executable procedure or executable application further receives signals from user input devices, such as a keyboard, mouse, light pen, touch screen or any other means allowing a user to provide data to a processor. The processor, under control of an executable procedure or executable application, manipulates the UI display images in response to signals received from the input devices. In this way, the user interacts with the display image using the input devices, enabling user interaction with the processor or other device. The functions and process steps herein may be performed automatically or wholly or partially in response to user command. An activity (including a step) performed automatically is performed in response to executable instruction or device operation without user direct initiation of the activity.
The system and processes of
Number | Date | Country
---|---|---
61266526 | Dec 2009 | US