This invention concerns an image data processing system for automatically detecting a boundary of an object in 3D (three dimensional) medical image data.
It is often desired to draw a contour or outline of an object in three dimensional (3D) medical image data representing an anatomical volume. For example, it may be necessary to visualize a vessel outline through which a catheter is to be guided to a detected tumor or lesion for use in applying a surgical procedure to the tumor. In known systems an image showing a 3D contour of an Aorta, for example, is presented on a monitor in order to aid a physician in placing an artificial aortic valve on top of a malfunctioning valve. One known system generates a 3D outline contour for an object of interest by displaying an Aorta surface in a 3D image view on a monitor, capturing the displayed image data and using a known boundary tracing method to generate the outline contour. In this known system, the generated outline is not smooth, the method is typically computation intensive and slow, and the 3D image view often does not match a user's interpretation.
Another known system involves generating and using an Aorta mesh outline.
A system generates an outline that looks smooth in real time with a 3D look and feel and identifies hidden lines whilst remaining insensitive to the rendering order of objects. An image data processing system automatically detects a boundary of an object in 3D (three dimensional) medical image data using a repository and an image data processor. The repository includes a 3D (three dimensional) image dataset comprising data representing a 3D mesh of individual points of an anatomical volume of interest. The image data processor processes the 3D mesh data retrieved from the repository to identify an object boundary by identifying, for a first line segment between first and second points of the mesh, third and fourth points lying on either side of the line segment, the first, second and third points comprising a first triangle and the first, second and fourth points comprising a second triangle. The image data processor determines a first normal vector for the first triangle and a second normal vector for the second triangle, determines a third normal vector perpendicular to a display screen, determines a first product of the first and third vectors and a second product of the second and third vectors, and identifies the first line segment as a potential segment of the object boundary that is viewable by a user on the display screen in response to the signs of the first and second products.
Server 20 includes image data processor 15. In alternative arrangements, image data processor 15 may be located in device 12 or in another device connected to network 21. Repository 17 includes a 3D (three dimensional) image dataset representing an anatomical volume of interest. Image data processor 15 processes the 3D mesh data retrieved from repository 17 to identify an object boundary by identifying, for a first line segment between first and second points of the mesh, third and fourth points lying on either side of the line segment, the first, second and third points comprising a first triangle and the first, second and fourth points comprising a second triangle. Processor 15 determines a first normal vector for the first triangle and a second normal vector for the second triangle and determines a third normal vector perpendicular to a display screen. Processor 15 determines a first product of the first and third vectors and a second product of the second and third vectors and identifies the first line segment as a potential segment of the object boundary that is viewable by a user on the screen of display 19 in response to the signs of the first and second products. In addition, processor 15 employs a hidden point detection function to determine if any of the ending points of the line segment are visible. Display processor 36 initiates generation of a display image including the object and displays the line segment as a portion of the object boundary in response to the line segment ending points being visible.
Processor 15 in step 609 identifies and selects points on the structure boundaries and generates 3D object surface mesh structure data using the identified points. Processor 15 generates a 3D mesh surface structure by applying a marching cube function to the binary mask and searches edges on the object mesh to find the edges that form the outline of the object. A marching cube function is a known function used for extracting a polygonal mesh of an isosurface from a three-dimensional scalar field (sometimes called voxels) by taking eight neighbor locations at a time (thus forming an imaginary cube), determining the polygon(s) needed to represent the part of the isosurface that passes through this cube, and combining the polygons to form a desired surface (William E. Lorensen, Harvey E. Cline: Marching Cubes: A high resolution 3D surface construction algorithm. In: Computer Graphics, Vol. 21, Nr. 4, July 1987).
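For illustration only, the cube classification step of the marching cube function described above can be sketched as follows. This sketch is not the system's actual implementation; the function name, corner ordering and data layout are assumptions. It computes, for each imaginary cube of eight neighboring voxels in a binary mask, the 8-bit case index that a full marching cubes implementation would use to select polygons from a lookup table (the lookup table itself is omitted for brevity).

```python
import numpy as np

def cube_case_indices(mask):
    """For each 2x2x2 cube of voxels in a binary mask, compute the 8-bit
    marching-cubes case index: one bit per cube corner that lies inside
    the object. Index 0 (all corners outside) and 255 (all inside)
    produce no polygons; the remaining cases select triangles from a
    lookup table. Illustrative sketch only."""
    m = np.asarray(mask, dtype=np.uint8)
    # One cube per 2x2x2 neighborhood, so one fewer cube than voxels per axis.
    case = np.zeros((m.shape[0] - 1, m.shape[1] - 1, m.shape[2] - 1),
                    dtype=np.uint8)
    # Assumed corner ordering: bit k is set if corner k is inside the object.
    corners = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
               (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]
    for bit, (dx, dy, dz) in enumerate(corners):
        case |= (m[dx:dx + case.shape[0],
                   dy:dy + case.shape[1],
                   dz:dz + case.shape[2]] << bit)
    return case
```

For example, a mask containing a single inside voxel yields eight cubes, each with exactly one corner bit set.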
Processor 15 identifies the line segment AC as a potential segment of an object boundary that is viewable by a user on the display screen in response to the signs of the first and second products. Processor 15 computes a surface normal for a triangle by taking the vector cross product of two edges of that triangle. The order of the vertices used in the calculation affects the direction of the normal (into or out of the surface). For a triangle A, B, C, if an edge vector U=B−A and an edge vector V=C−A, then the normal N=U×V is calculated by:
Nx=UyVz−UzVy
Ny=UzVx−UxVz
Nz=UxVy−UyVx
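The normal calculation above can be sketched as follows (an illustrative sketch; the function name is an assumption and is not part of the described system). The three components mirror the Nx, Ny and Nz formulas term by term.

```python
import numpy as np

def triangle_normal(a, b, c):
    """Surface normal of triangle A, B, C via the cross product of its
    edge vectors U = B - A and V = C - A. The vertex order determines
    whether the normal points into or out of the surface."""
    u = np.asarray(b, dtype=float) - np.asarray(a, dtype=float)
    v = np.asarray(c, dtype=float) - np.asarray(a, dtype=float)
    # Component-wise cross product N = U x V, matching the formulas above.
    return np.array([u[1] * v[2] - u[2] * v[1],   # Nx = UyVz - UzVy
                     u[2] * v[0] - u[0] * v[2],   # Ny = UzVx - UxVz
                     u[0] * v[1] - u[1] * v[0]])  # Nz = UxVy - UyVx
```

For a triangle lying in the x-y plane the normal points along the z axis, and swapping two vertices flips its sign, illustrating the order dependence noted above.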
For each edge (e.g. AC) on a surface, a map is generated by mapping edge AC to point X and point Y (point Y may be currently unknown). Triangle ACB contains edge AC with vertex B, for example, and the mesh structure is updated as edge AC is mapped to point X (i.e. B) and point Y (currently unknown). Triangle ACD contains edge AC with vertex D, and the mesh structure is updated by mapping edge AC to point X (i.e. B) and point Y (i.e. D). Given edge AC and point B, the corresponding vertex on the other side is determined to be D.
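The edge map described above can be sketched as follows. This is an illustrative assumption about the data layout, not the system's actual structure: each undirected edge is mapped to the opposite vertices of the triangles containing it, so that given edge AC and vertex B, the vertex D on the other side can be looked up directly.

```python
def build_edge_map(triangles):
    """Map each undirected edge of a triangle mesh to the list of
    opposite vertices (point X and, when known, point Y) of the
    triangles that contain that edge. Illustrative sketch."""
    edge_map = {}
    for a, b, c in triangles:
        for edge, opposite in (((a, b), c), ((b, c), a), ((a, c), b)):
            key = tuple(sorted(edge))  # canonical undirected edge
            edge_map.setdefault(key, []).append(opposite)
    return edge_map

def opposite_vertex(edge_map, edge, known):
    """Given an edge and one opposite vertex, return the vertex on the
    other side, or None if the edge belongs to only one triangle."""
    others = [p for p in edge_map[tuple(sorted(edge))] if p != known]
    return others[0] if others else None
```

With triangles ACB and ACD, looking up edge AC with known vertex B yields D, matching the example above.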
Processor 15 in step 627 applies hidden point detection function 629 to determine if any of the ending points of line segment AC are visible. The hidden point detection function is described in Published U.S. Patent Application 2011/0072397 by S. Baker et al. If any of the ending points of the edge are visible, the edge is displayed as part of the final outline for the 3D object in step 631.
In step 919 processor 15 determines a first normal vector for the first triangle and a second normal vector for the second triangle and in step 923 determines a third normal vector perpendicular to a display screen. In step 926 processor 15 determines a first product of the first and third vectors and a second product of the second and third vectors and in step 929 identifies the first line segment as a potential segment of the object boundary that is viewable by a user on the screen of display 19 in response to the signs of the first and second products being different. In one embodiment the first and second products are dot products. Processor 15 in step 931 employs a hidden point detection function to automatically detect if the line segment is obscured by another object and is not viewable by the user on the display screen. The hidden point detection function also determines if any of the ending points of the line segment are viewable by the user on the display screen. Further, in step 933 display processor 36 initiates generation of a display image excluding the line segment in response to the line segment being obscured and, in another embodiment, including the object and the line segment as a portion of the object boundary in response to the ending points (the first and second points) being visible. The process of
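The viewability test of steps 919 through 929 can be sketched as follows. This is an illustrative sketch, not the system's code; the function name and argument layout are assumptions. An edge lies on the visible outline when the dot products of its two adjacent triangle normals with the screen normal have different signs, meaning one triangle faces the viewer and the other faces away.

```python
import numpy as np

def is_outline_edge(n1, n2, screen_normal):
    """Return True when the signs of the two dot products differ,
    identifying the shared edge of the two triangles as a potential
    segment of the viewable object boundary. Illustrative sketch."""
    d1 = float(np.dot(n1, screen_normal))  # first product
    d2 = float(np.dot(n2, screen_normal))  # second product
    return d1 * d2 < 0  # opposite signs: silhouette edge candidate
```

For example, triangles facing toward and away from the screen share an outline edge, while two triangles both facing the screen do not.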
A processor as used herein is a device for executing machine-readable instructions stored on a computer readable medium, for performing tasks, and may comprise any one or combination of hardware and firmware. A processor may also comprise memory storing machine-readable instructions executable for performing tasks. A processor acts upon information by manipulating, analyzing, modifying, converting or transmitting information for use by an executable procedure or an information device, and/or by routing the information to an output device. A processor may use or comprise the capabilities of a controller, computer or microprocessor, for example, and is conditioned using executable instructions to perform special purpose functions not performed by a general purpose computer. A processor may be coupled (electrically and/or as comprising executable components) with any other processor enabling interaction and/or communication therebetween. A user interface processor or generator is a known element comprising electronic circuitry or software or a combination of both for generating display images or portions thereof. A user interface comprises one or more display images enabling user interaction with a processor or other device.
An executable application, as used herein, comprises code or machine readable instructions for conditioning the processor to implement predetermined functions, such as those of an operating system, a context data acquisition system or other information processing system, for example, in response to user command or input. An executable procedure is a segment of code or machine readable instruction, sub-routine, or other distinct section of code or portion of an executable application for performing one or more particular processes. These processes may include receiving input data and/or parameters, performing operations on received input data and/or performing functions in response to received input parameters, and providing resulting output data and/or parameters. A graphical user interface (GUI), as used herein, comprises one or more display images, generated by a display processor and enabling user interaction with a processor or other device and associated data acquisition and processing functions.
The UI also includes an executable procedure or executable application. The executable procedure or executable application conditions the display processor to generate signals representing the UI display images. These signals are supplied to a display device which displays the image for viewing by the user. The executable procedure or executable application further receives signals from user input devices, such as a keyboard, mouse, light pen, touch screen or any other means allowing a user to provide data to a processor. The processor, under control of an executable procedure or executable application, manipulates the UI display images in response to signals received from the input devices. In this way, the user interacts with the display image using the input devices, enabling user interaction with the processor or other device. The functions and process steps (e.g., of
The system and processes of