This invention concerns a system for erasing features in a displayable three dimensional (3D) medical image volume.
Blood vessel analysis performed on a 3D image volume is often affected by noise and unwanted touching vessels present in the region of interest of the 3D image volume. In one known system, a volume punching function is used to remove unwanted parts of a 3D volume. However, the volume punching process is tedious and burdensome, requiring multiple repetitive erasure actions to erase an unwanted volume portion. A system according to invention principles addresses this deficiency and related problems.
A system provides 3D Eraser functions for erasing unwanted touching vessels, for example, from a 3D volume to increase user viewability of a vessel display image for analysis. A system erases features in a displayable three dimensional (3D) medical image volume comprising multiple image slices. An erasure cursor coordinate detector detects two dimensional (2D) location coordinates identifying the location of a movable erasure cursor in a displayed image within the medical image volume. A data processor translates the detected erasure cursor 2D location coordinates to corresponding 2D pixel coordinates within an image slice, having an identifier, in the 3D medical image volume. The data processor sets luminance values of pixels corresponding to the 2D pixel coordinates of the image slice to a background luminance value of the image slice, providing erased pixels corresponding to erasure cursor locations. A display processor generates data representing a display image showing the image slice with the set background luminance values of the erased pixels.
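As a non-authoritative illustration of this coordinate translation and erasure step, the sketch below assumes a NumPy volume indexed as volume[z, y, x], a simple proportional mapping from display coordinates to slice pixel coordinates, and a circular erasure footprint; the function names, the radius parameter and the scaling are assumptions rather than details taken from the source.

```python
import numpy as np

def display_to_slice(cursor_xy, display_shape, slice_shape):
    """Translate 2D cursor display coordinates to 2D slice pixel coordinates
    (assumed simple proportional scaling)."""
    cx, cy = cursor_xy
    x = int(round(cx * slice_shape[1] / display_shape[1]))
    y = int(round(cy * slice_shape[0] / display_shape[0]))
    return x, y

def erase_at_cursor(volume, slice_index, cursor_xy, display_shape,
                    radius=3, background=0):
    """Set pixels under the erasure cursor in one image slice to the
    background luminance value, yielding the erased pixels."""
    h, w = volume.shape[1], volume.shape[2]
    x, y = display_to_slice(cursor_xy, display_shape, (h, w))
    yy, xx = np.ogrid[:h, :w]                     # per-pixel coordinate grids
    footprint = (xx - x) ** 2 + (yy - y) ** 2 <= radius ** 2
    volume[slice_index][footprint] = background   # erased pixels take background luminance
    return volume
```

The modified slice can then be re-rendered so the display shows the erased pixels at the background luminance value.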
A 3D erasure system is usable to clean a 3D image volume and eliminate unwanted vessel segments within the 3D volume. The system advantageously facilitates and expedites a 3D image volume cleanup process that removes unwanted parts of a 3D volume, in comparison with known systems, by erasing unwanted touching vessels, for example, to increase user viewability of a vessel display image for analysis. A user is not limited by having to select a region of interest of an imaging volume followed by selection of an inside or outside cleaning option for cropping each individual region. Rather, a user selects an Erasure cursor in a user interface (UI) and is able to move the Erasure cursor around an image volume to erase unwanted features until an erasing task is complete.
A system 3D Eraser function is used to eliminate noise and unwanted touching vessels from a 3D image volume to increase clarity of desired vessels for analysis. A user is provided with an option to erase unwanted vessel features and noise from an image volume region of interest before initiating a vessel analysis process. The cleaning of the unwanted vessel parts and features comprises selecting a 3D erasure cursor from multiple different cursor sizes available via a UI and moving it around the vessel region. A user selects an erasure cursor of a particular size, from the multiple different sizes, based on the size of the area to be cleaned.
In step 213 processor 15 processes original image mask 203 by detecting, in step 216, image areas that match mouse drag position information provided in response to a mouse drag event. In step 219, processor 15 removes image areas that fall under a mouse drag position in 3D space and in step 222 returns an updated image mask reflecting the removed image areas.
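The following sketch, offered only as an assumed illustration of steps 213 to 222, treats the image mask as a boolean NumPy volume and the mouse drag information as a list of (x, y, z) positions in volume coordinates; the circular removal footprint and the function name update_mask_from_drag are hypothetical.

```python
import numpy as np

def update_mask_from_drag(mask, drag_positions, radius=3):
    """Remove image areas that fall under the mouse drag positions and
    return the updated mask (cf. steps 216, 219 and 222)."""
    depth, h, w = mask.shape
    yy, xx = np.ogrid[:h, :w]
    for x, y, z in drag_positions:
        z = int(round(z))
        if 0 <= z < depth:                         # ignore positions outside the volume
            footprint = (xx - x) ** 2 + (yy - y) ** 2 <= radius ** 2
            mask[z][footprint] = False             # clear the areas removed by the drag
    return mask                                    # updated mask with the areas removed
```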
In response to generation of contour 335, processor 15 maps 3D coordinates generated during a cursor drag event to 2D slice coordinates. A 3D coordinate provided by the cursor drag event is in (x, y, z) format, where z corresponds to a 2D slice number. The 3D coordinate is mapped to a 2D coordinate in the format (x, y) for the affected 2D slice. Sets of 2D coordinates are processed for affected slices in steps 309, 311 and 313. Affected rows in each of the affected 2D slices are processed by setting pixel color luminance values corresponding to the 2D coordinates concerned to a background luminance value, hence providing the erase operation. An original 3D mask image volume is modified by altering the affected 2D slices, and a new modified mask is provided and displayed. In steps 309, 311 and 313 the 2D image slices 350, 352 and 354 comprise slices n-4, n-3 and n-2 respectively through volume 330 that are processed by processor 15 to provide 2D coordinates for contour 335 in the 2D slices in areas 340, 342 and 344 respectively. In steps 315, 317 and 319, processor 15 erases selected pixels in 2D slices 350, 352 and 354 by setting pixels in the slice area corresponding to area 333 to the background color, as shown by portions 360, 362 and 364, respectively. A cursor erase operation is repeated for each erasure cursor drag event. The system processes the affected 2D slices, and not the entire 3D volume, in order to accelerate processing for real-time update and image data display on monitor 33.
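A minimal sketch of this per-slice update is shown below, assuming the drag event supplies (x, y, z) coordinates with z as the slice number and that a circular area around each coordinate is erased; grouping coordinates by slice so that only affected slices are touched is the point being illustrated, while the names and radius are assumptions.

```python
from collections import defaultdict
import numpy as np

def erase_affected_slices(volume, drag_coords_3d, radius=3, background=0):
    """Erase only the 2D slices affected by the cursor drag rather than the
    whole 3D volume, keeping updates fast enough for real-time display."""
    by_slice = defaultdict(list)                   # slice number z -> list of (x, y)
    for x, y, z in drag_coords_3d:
        by_slice[int(round(z))].append((x, y))

    h, w = volume.shape[1], volume.shape[2]
    yy, xx = np.ogrid[:h, :w]
    for z, points in by_slice.items():             # process affected slices only
        footprint = np.zeros((h, w), dtype=bool)
        for x, y in points:
            footprint |= (xx - x) ** 2 + (yy - y) ** 2 <= radius ** 2
        volume[z][footprint] = background          # erased pixels take background luminance
    return volume
```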
Processor 15 in step 619 sets luminance values of pixels corresponding to the 2D pixel coordinates of the image slice (having an identifier), and of pixels within the 3D region within the volume, to a background luminance value of the image slice to provide erased pixels corresponding to erasure cursor locations. Processor 15, in one embodiment, also sets luminance values of pixels corresponding to the 2D pixel coordinates of the first and second different image slices to a background luminance value of the corresponding image slice. Display processor 12 in step 621 generates data representing a display image showing the image slice with the set background luminance values of the erased pixels.
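To illustrate erasure of pixels within a 3D region spanning the identified slice and different image slices, the sketch below assumes a spherical region centred on the cursor position; the spherical shape, function name and radius are assumptions rather than details from the source.

```python
import numpy as np

def erase_3d_region(volume, center_xyz, radius=3, background=0):
    """Set pixels within a 3D region around the cursor to the background
    luminance, affecting the identified slice and neighbouring slices."""
    cx, cy, cz = center_xyz
    depth, h, w = volume.shape
    zz, yy, xx = np.ogrid[:depth, :h, :w]          # per-voxel coordinate grids
    region = (xx - cx) ** 2 + (yy - cy) ** 2 + (zz - cz) ** 2 <= radius ** 2
    volume[region] = background                    # erase across all affected slices
    return volume
```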
A processor as used herein is a device for executing machine-readable instructions stored on a computer readable medium, for performing tasks, and may comprise any one of, or a combination of, hardware and firmware. A processor may also comprise memory storing machine-readable instructions executable for performing tasks. A processor acts upon information by manipulating, analyzing, modifying, converting or transmitting information for use by an executable procedure or an information device, and/or by routing the information to an output device. A processor may use or comprise the capabilities of a computer, controller or microprocessor, for example, and is conditioned using executable instructions to perform special purpose functions not performed by a general purpose computer. A processor may be coupled (electrically and/or as comprising executable components) with any other processor enabling interaction and/or communication therebetween. Computer program instructions may be loaded onto a computer, including without limitation a general purpose computer or special purpose computer, or other programmable processing apparatus to produce a machine, such that the computer program instructions which execute on the computer or other programmable processing apparatus create means for implementing the functions specified in the block(s) of the flowchart(s). A user interface processor or generator is a known element comprising electronic circuitry or software or a combination of both for generating display elements or portions thereof. A user interface comprises one or more display elements enabling user interaction with a processor or other device.
An executable application, as used herein, comprises code or machine readable instructions for conditioning the processor to implement predetermined functions, such as those of an operating system, a context data acquisition system or other information processing system, for example, in response to user command or input. An executable procedure is a segment of code or machine readable instruction, sub-routine, or other distinct section of code or portion of an executable application for performing one or more particular processes. These processes may include receiving input data and/or parameters, performing operations on received input data and/or performing functions in response to received input parameters, and providing resulting output data and/or parameters. A graphical user interface (GUI), as used herein, comprises one or more display elements, generated by a display processor and enabling user interaction with a processor or other device and associated data acquisition and processing functions.
The UI also includes an executable procedure or executable application. The executable procedure or executable application conditions the display processor to generate signals representing the UI display images. These signals are supplied to a display device which displays the elements for viewing by the user. The executable procedure or executable application further receives signals from user input devices, such as a keyboard, mouse, light pen, touch screen or any other means allowing a user to provide data to a processor. The processor, under control of an executable procedure or executable application, manipulates the UI display elements in response to signals received from the input devices. In this way, the user interacts with the display elements using the input devices, enabling user interaction with the processor or other device. The functions and process steps herein may be performed automatically or wholly or partially in response to user command. An activity (including a step) performed automatically is performed in response to executable instruction or device operation without user direct initiation of the activity. A histogram of an image is a graph that plots the number of pixels (on the y-axis herein) in the image having a specific intensity value (on the x-axis herein) against the range of available intensity values. The resultant curve is useful in evaluating image content and can be used to process the image for improved display (e.g. enhancing contrast).
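As a simple illustration of the histogram described above, the sketch below counts pixels per intensity value using NumPy, assuming 8-bit intensities; the function name and bin layout are assumptions.

```python
import numpy as np

def image_histogram(image, num_levels=256):
    """Return intensity values (x-axis) and pixel counts (y-axis) for an image."""
    counts, edges = np.histogram(image, bins=num_levels, range=(0, num_levels))
    return edges[:-1], counts        # one count per available intensity value
```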
This is a non-provisional application of provisional application Ser. No. 61/651,069 filed May 24, 2012, by K. Dutta.