The present invention relates generally to image processing. More particularly, this invention relates to volume based contour generation using a graphics processing unit.
Contour generation has many useful applications, such as showing a tumor contour on top of a three-dimensional (3D) image or a digitally reconstructed radiograph (DRR). Contour generation is also widely used by the computer game and animation film industries. Some conventional approaches to generating contours are presented below.
According to one conventional approach, two-dimensional (2D) edge detection is used to generate 2D contours of an object in some applications. Specifically, the edge detection may be implemented by contour tracking. One fundamental step of image analysis is segmentation, which partitions an image into individual objects. One existing approach to segmentation is gray level edge detection. The outputs of edge detectors are usually linked together to form continuous boundaries for further processing, such as shape analysis. Hence, besides edge location, the output may also include other features, such as thinness and continuity of edge segments. According to one conventional approach, the edge direction is used to trace an edge segment. Further, different edge operators have been developed for edge detection, such as the Sobel operator, a three-level template matching operator, and the Frei-Chen operator.
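The gray level edge detection mentioned above can be illustrated with a minimal sketch of the Sobel operator. This is not code from the conventional systems described; the test image and threshold are illustrative choices, and the per-pixel loop stands in for what is normally a vectorized convolution.

```python
# Illustrative sketch: gradient-magnitude edge detection with the
# Sobel operator, one of the edge operators named above.
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
SOBEL_Y = SOBEL_X.T

def sobel_edges(image, threshold=1.0):
    """Return a boolean edge map for a 2D grayscale image."""
    h, w = image.shape
    edges = np.zeros((h, w), dtype=bool)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = image[i - 1:i + 2, j - 1:j + 2]
            gx = np.sum(SOBEL_X * patch)   # horizontal gradient
            gy = np.sum(SOBEL_Y * patch)   # vertical gradient
            edges[i, j] = np.hypot(gx, gy) >= threshold
    return edges

# A bright square on a dark background: edges appear at the boundary.
img = np.zeros((5, 5))
img[1:4, 1:4] = 1.0
edge_map = sobel_edges(img)
```

As noted in the text, such raw edge pixels are then usually linked into continuous boundaries before shape analysis.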
In addition to 2D contour generation, conventional software has been developed to render 3D contour. Currently, the discontinuities in a z-buffer derivative are highlighted to render the contour according to one conventional approach. Another current approach renders the outlines of 3D objects by applying edge detection filters to specially prepared depth and normal maps, and then compositing the results with the rest of the image showing the object. Another current approach to render contour of a 3D object is model based. Specifically, the technique uses image processing and a stochastic, physically based particle system to draw the visible contour of a 3D model of the object. To detect the contour, a depth map of the model and a few simple parameters set by a user are used.
However, one common drawback of the above conventional techniques is that they are computationally expensive, since they are implemented using complex software executed by conventional hardware, such as general-purpose processors. Because the software is usually computationally intensive, it may take a long time to generate contours by running the software on conventional hardware, and thus it is not practical to apply the above conventional techniques in real-time applications.
The present invention is illustrated by way of example and not limitation in the figures of the accompanying drawings in which like references indicate similar elements.
Volume based contour generation using a graphics processing unit is described herein. In the following description, numerous details are set forth to provide a more thorough explanation of embodiments of the present invention. It will be apparent, however, to one skilled in the art, that embodiments of the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring embodiments of the present invention.
Reference in the specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of the phrase “in one embodiment” in various places in the specification do not necessarily all refer to the same embodiment.
According to one embodiment, contour generation may start with a volume dataset representing a volume of interest (VOI) in a three-dimensional (3D) space. A contour of the VOI may be generated from the volume dataset representing the VOI, wherein at least a portion of the generation is performed using a graphics processing unit. In some embodiments, the contour is a two-dimensional (2D) contour of a projection of the VOI.
In the following discussion, a contour generally refers to an outline of an object at a particular viewpoint or viewing angle. Thus, the contour may be viewpoint dependent. A related concept is a silhouette, which typically refers to an outline of an object and a featureless interior within the outline. Thus, the contour generation techniques may be used to generate a silhouette.
The graphics processing unit described herein refers to hardware specialized in or dedicated to image processing, which is logically and/or physically separated from a general-purpose processing unit in a computing system (e.g., a central processing unit of a personal computer). For example, the graphics processing unit may include a graphics accelerator, which is a computer microelectronics component to which a computer program may offload certain image processing tasks, such as the sending and refreshing of images to the display monitor and the computation of special effects common to 2D and 3D images. Graphics accelerators may speed up the displaying of images on the monitor, making it easier and faster to achieve certain graphic effects, such as, for example, the presentation of very large images and/or of interactive games in which images need to change quickly. The graphics processing unit may be implemented as hardware or a combination of hardware and software. One example of the graphics processing unit is a graphics card manufactured by a variety of vendors, such as, for example, NVIDIA Corporation® of Santa Clara, Calif. or ATI Technologies, Inc.® of Ontario, Canada, etc. To render images in real time, the graphics processing unit may operate at a rate of at least thirty (30) frames/second in some embodiments.
Referring to
In some embodiments, processing logic renders the volume dataset to generate a 2D image of the VOI using a graphics processing unit (processing block 120). Processing logic may load a direct volume rendering program into the graphics processing unit and may cause the graphics processing unit to execute the direct volume rendering program to generate the 2D image of the VOI from the volume dataset. The 2D image of the VOI may be temporarily stored in a frame buffer.
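The direct volume rendering step can be illustrated with a much-simplified CPU stand-in: a maximum-intensity projection that collapses the volume dataset into a binary 2D mask of the VOI. A GPU direct volume rendering program would instead use ray casting or texture slicing; the volume, threshold, and projection axis below are illustrative assumptions.

```python
# Illustrative sketch only: reduce a 3D volume dataset to a 2D
# projection mask via a maximum-intensity projection (MIP).
import numpy as np

def render_mask(volume, threshold=0.5, axis=2):
    """Project a 3D volume to a binary 2D mask along one axis."""
    mip = volume.max(axis=axis)          # maximum intensity along rays
    return (mip >= threshold).astype(np.uint8)

# A small volume with a bright 2x2x2 region standing in for the VOI.
vol = np.zeros((4, 4, 4))
vol[1:3, 1:3, 1:3] = 1.0
mask = render_mask(vol)                  # 4x4 mask, ones where the VOI projects
```

In the embodiments above, the analogous output of the direct volume rendering program is held in the frame buffer for the subsequent contour pass.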
In some embodiments, processing logic then generates a contour of the 2D image in the frame buffer (processing block 130). In one embodiment, processing logic generates the contour by using a hardware accelerated fragment shader program. In one embodiment, a fragment shader program is a computer program used in 3D graphics to determine one or more surface properties of an object or an image. One embodiment of the pseudo-code of an exemplary fragment shader program is shown in
Referring back to
In some embodiments, processing logic saves the set of points to be transferred to another system later (processing block 150). Depending on the application involved, the set of points may be transferred to different systems later, such as a radiosurgical treatment delivery system, a video game system, etc.
Finally, processing logic may render the contour on top of the 3D image or a 2D image generated from the 3D image (processing block 160). For example, the 3D image may be a CT image and the 2D image generated from the CT image may be a DRR.
Some portions of the contour generation process, such as volume rendering and fragment shader program execution, may be computationally intensive. By offloading at least part of the contour generation process to the graphics processing unit, the general-purpose processor of the computing system may be relieved of the computationally intensive tasks. Furthermore, the graphics processing unit may be specialized in executing some predetermined graphics processes and/or the graphics processing unit may operate at a higher rate than many general-purpose processors in some embodiments. As a result, it may take less time to generate contours using the approach described herein. Because of this faster speed, the contour generation approach described herein may be suitable for applications demanding high speed processing, such as real time graphics applications (e.g., lung tumor tracking in radiosurgery, video games, etc.).
Referring to
In some embodiments, the volume rendering module 220 loads a direct volume rendering program and the volume dataset 203 into the graphics processing unit 260 and causes the graphics processing unit 260 to execute the direct volume rendering program on the volume dataset 203. The direct volume rendering program may output a 2D projection image (a.k.a. a 2D mask) 205 of the VOI. The graphics processing unit 260 may return the 2D mask 205 to the volume rendering module 220, which may forward the 2D mask 205 to the frame buffer 230.
To generate the contour of the VOI, a fragment shader program may be loaded into the graphics processing unit 260 to process the 2D mask 205. Details of some embodiments of the fragment shader program have been discussed above. In one embodiment, the fragment shader program is loaded into the storage device 262 of the graphics processing unit 260. The graphics processor 264 may retrieve instructions of the fragment shader program from the storage device 262 for execution. In response to the instructions, the graphics processor 264 may retrieve frames of the 2D mask from the frame buffer 230. By executing the fragment shader program on the frames retrieved, the graphics processor 264 may generate a contour 207 of the VOI from the frames of the 2D mask of the VOI. The graphics processor 264 may return the contour generated to the frame buffer 230.
In some embodiments, the contour converter 240 retrieves the contour 207 from the frame buffer 230 to convert the contour into a series of points 209. The series of points 209 represent the contour in a 2D space. Further, the series of points 209 may be arranged in different formats, such as crack code, chain code, run code, etc. The series of points 209 may be stored in the data storage device 250 for later use by other systems, such as a treatment delivery system in radiosurgery.
As discussed above, the contour generation technique described herein is useful in many different applications. For instance, the technique may be applied to rendering a contour of a tumor in medical imaging in order to provide a better view of the tumor during treatment delivery in radiosurgery. Alternatively, the technique may be applied to rendering a shadow of an object in a display of a video game. Other exemplary applications may include industrial imaging and non-destructive testing of materials (e.g., motor blocks in the automotive industry, airframes in the aviation industry, welds in the construction industry and drill cores in the petroleum industry), seismic surveying, etc. One exemplary application in radiosurgery is described in detail below for illustrative purposes. However, it should be appreciated that application of the volume based contour generation technique described herein is not limited to the following example.
In some embodiments, the contour rendering module 313 receives a series of points 301 representing a contour of the VOI. The series of points 301 have been generated using some embodiments of the contour generation described above. Based on the series of points 301, the contour rendering module 313 renders a contour 309 of the VOI over a 2D image of the VOI in the DRR 307. With the contour 309 outlining the VOI in the DRR 307, the VOI in the DRR 307 is more visible, making it easier to compare the DRR 307 with other images of the VOI. As a result, tracking and locating of the VOI using the DRR 307 with the contour 309 during radiosurgery may be more accurate. Moreover, the technique described herein also improves the speed of contour generation significantly, making it practical to apply some embodiments of the contour generation described herein to time-sensitive applications, such as real-time tumor tracking in a treatment delivery stage of radiosurgery.
Diagnostic imaging system 2000 is representative of a system capable of producing medical diagnostic images of a VOI that may be used for subsequent diagnosis, treatment planning, and/or treatment delivery. For example, diagnostic imaging system 2000 may be a computed tomography (CT) system, a magnetic resonance imaging (MRI) system, a positron emission tomography (PET) system, an ultrasound system, or the like. For ease of discussion, diagnostic imaging system 2000 is discussed at times in relation to a CT x-ray imaging modality. However, other imaging modalities such as those above may also be used.
Diagnostic imaging system 2000 includes an imaging source 2010 to generate an imaging beam (e.g., x-rays, ultrasonic waves, radio frequency waves, etc.) and an imaging detector 2020 to detect and receive the beam generated by imaging source 2010, or a secondary beam or emission stimulated by the beam from the imaging source (e.g., in an MRI or PET scan). In one embodiment, imaging system 2000 represents a CT scanner. In one embodiment, diagnostic imaging system 2000 may include two or more diagnostic X-ray sources and two or more corresponding imaging detectors. For example, two x-ray sources may be disposed around a patient to be imaged, fixed at an angular separation from each other (e.g., 90 degrees, 45 degrees, etc.) and aimed through the patient toward (an) imaging detector(s) which may be diametrically opposed to the x-ray sources. A single large imaging detector, or multiple imaging detectors, may also be used that would be illuminated by each x-ray imaging source. Alternatively, other numbers and configurations of imaging sources and imaging detectors may be used.
The imaging source 2010 and the imaging detector 2020 are coupled to a digital processing system 2030 to control the imaging operation and process image data. Diagnostic imaging system 2000 includes a bus or other means 2035 for transferring data and commands among digital processing system 2030, imaging source 2010 and imaging detector 2020. Digital processing system 2030 may include one or more general-purpose processors (e.g., a microprocessor), special purpose processor such as a digital signal processor (DSP) or other type of device such as a controller or field programmable gate array (FPGA). Digital processing system 2030 may also include other components (not shown) such as memory, storage devices, network adapters and the like. Digital processing system 2030 may be configured to generate scan data of digital diagnostic images in a standard format, such as the DICOM (Digital Imaging and Communications in Medicine) format, for example. In other embodiments, digital processing system 2030 may generate other standard or non-standard digital image formats. Digital processing system 2030 may transmit diagnostic image files (e.g., the aforementioned DICOM formatted files) to treatment planning system 3000 over a data link 1500, which may be, for example, a direct link, a local area network (LAN) link or a wide area network (WAN) link such as the Internet. In addition, the information transferred between systems may either be pulled or pushed across the communication medium connecting the systems, such as in a remote diagnosis or treatment planning configuration. In remote diagnosis or treatment planning, a user may utilize embodiments of the present invention to diagnose or treatment plan despite the existence of a physical separation between the system user and the patient.
Treatment planning system 3000 includes a processing device 3010 to receive and process image data such as the 4D CT data discussed above. Processing device 3010 may represent one or more general-purpose processors (e.g., a microprocessor), special purpose processor such as a digital signal processor (DSP) or other type of device such as a controller or field programmable gate array (FPGA). Processing device 3010 may be configured to execute instructions for performing the operations of the methods discussed herein that, for example, may be loaded in processing device 3010 from storage 3030 and/or system memory 3020.
Treatment planning system 3000 may also include system memory 3020 that may include a random access memory (RAM), or other dynamic storage devices, coupled to processing device 3010 by bus 3055, for storing information and instructions to be executed by processing device 3010. System memory 3020 also may be used for storing temporary variables or other intermediate information during execution of instructions by processing device 3010. System memory 3020 may also include a read only memory (ROM) and/or other static storage device coupled to bus 3055 for storing static information and instructions for processing device 3010.
Treatment planning system 3000 may also include storage device 3030, representing one or more storage devices (e.g., a magnetic disk drive or optical disk drive) coupled to bus 3055 for storing information and data, for example, the CT data discussed above. Storage device 3030 may also be used for storing instructions for performing the treatment planning methods discussed herein. In some embodiments, storage device 3030 stores instructions for DRR generation. Processing device 3010 may retrieve the instructions and may execute the instructions to implement a DRR generator. Details of some embodiments of a DRR generator have been described above. Likewise, storage device 3030 may store instructions for a volume-based contour generator. In some embodiments, processing device 3010 retrieves the instructions and executes the instructions to implement a volume-based contour generator. Details of some embodiments of a volume-based contour generator have been described above.
Processing device 3010 may also be coupled to a display device 3040, such as a cathode ray tube (CRT) or liquid crystal display (LCD), for displaying information (e.g., a two-dimensional or three-dimensional representation of the VOI) to the user. An input device 3050, such as a keyboard, may be coupled to processing device 3010 for communicating information and/or command selections to processing device 3010. One or more other user input devices (e.g., a mouse, a trackball or cursor direction keys) may also be used to communicate directional information, to select commands for processing device 3010 and to control cursor movements on display 3040.
It will be appreciated that treatment planning system 3000 represents only one example of a treatment planning system, which may have many different configurations and architectures, which may include more components or fewer components than treatment planning system 3000 and which may be employed with the present invention. For example, some systems often have multiple buses, such as a peripheral bus, a dedicated cache bus, etc. The treatment planning system 3000 may also include MIRIT (Medical Image Review and Import Tool) to support DICOM import (so images can be fused and target regions delineated on different systems and then imported into the treatment planning system for planning and dose calculations), as well as expanded image fusion capabilities that allow the user to plan treatments and view dose distributions on any one of various imaging modalities (e.g., MRI, CT, PET, etc.). Treatment planning systems are known in the art; accordingly, a more detailed discussion is not provided.
Treatment planning system 3000 may share its database (e.g., data stored in storage device 3030) with a treatment delivery system, such as treatment delivery system 4000, so that it may not be necessary to export from the treatment planning system prior to treatment delivery. Treatment planning system 3000 may be linked to treatment delivery system 4000 via a data link 2500, which may be a direct link, a LAN link or a WAN link as discussed above with respect to data link 1500. It should be noted that when data links 1500 and 2500 are implemented as LAN or WAN connections, any of diagnostic imaging system 2000, treatment planning system 3000 and/or treatment delivery system 4000 may be in decentralized locations such that the systems may be physically remote from each other. Alternatively, any of diagnostic imaging system 2000, treatment planning system 3000 and/or treatment delivery system 4000 may be integrated with each other in one or more systems.
Treatment delivery system 4000 includes a therapeutic and/or surgical radiation source 4010 to administer a prescribed radiation dose to a target volume in conformance with a treatment plan. Treatment delivery system 4000 may also include an imaging device 4020 to capture intra-treatment or intra-operative images of a patient volume (including the target volume) for registration and/or correlation with the diagnostic images described above in order to position the patient with respect to the radiation source. The intra-operative imaging device 4020 may include a pair of x-ray imaging modules. Treatment delivery system 4000 may also include a digital processing system 4030 to control radiation source 4010, intra-operative imaging device 4020, and a patient support device such as a treatment couch 4040. Digital processing system 4030 may include one or more general-purpose processors (e.g., a microprocessor), special purpose processor such as a digital signal processor (DSP) or other type of device such as a controller or field programmable gate array (FPGA). Digital processing system 4030 may also include other components (not shown) such as memory, storage devices, network adapters and the like. Digital processing system 4030 may be coupled to radiation source 4010, intra-operative imaging device 4020 and treatment couch 4040 by a bus 4045 or other type of control and communication interface.
Furthermore, treatment delivery system 4000 may include a tumor tracking module 4045. In some embodiments, the intra-operative imaging device 4020 generates intra-operative images of the VOI in the patient during treatment delivery. The intra-operative images are provided to the tumor tracking module 4045, which also receives DRRs from the treatment planning system 3000. By comparing the DRRs and the intra-operative images, the tumor tracking module 4045 determines an intra-operative location of the VOI in the patient's body. Since a contour of the VOI has been rendered on each of the DRRs as described above, the image of the VOI may be more visible on the DRRs. As a result, the intra-operative location of the VOI may be determined more readily and more accurately.
It should be noted that the described treatment system 1700 is only representative of an exemplary system. Other embodiments of the system 1700 may have many different configurations and architectures and may include fewer or more components.
In one embodiment, as illustrated in
In
Digital processing system 4030 may implement algorithms to register images obtained from the intra-operative imaging device 4020 with pre-operative treatment planning images in order to align the patient on the treatment couch 4050 within the treatment delivery system 4000, and to precisely position the radiation source with respect to the target volume.
The treatment couch 4050 may be coupled to another robotic arm (not illustrated) having multiple degrees of freedom. The couch arm may be vertically mounted to a column or wall, or horizontally mounted to pedestal, floor, or ceiling. Alternatively, the treatment couch 4050 may be a component of another mechanical mechanism, such as the Axum® treatment couch developed by Accuray Inc. of California, or be another type of conventional treatment table known to those of ordinary skill in the art.
Alternatively, treatment delivery system 4000 may be another type of treatment delivery system, for example, a gantry based (isocentric) intensity modulated radiotherapy (IMRT) system. In a gantry based system, a radiation source (e.g., a LINAC) is mounted on the gantry in such a way that it rotates in a plane corresponding to an axial slice of the patient. Radiation is then delivered from several positions on the circular plane of rotation. In IMRT, the shape of the radiation beam is defined by a multi-leaf collimator that allows portions of the beam to be blocked, so that the remaining beam incident on the patient has a pre-defined shape. The resulting system generates arbitrarily shaped radiation beams that intersect each other at the isocenter to deliver a dose distribution to the target region. In IMRT planning, the optimization algorithm selects subsets of the main beam and determines the amount of time that the patient should be exposed to each subset, so that the prescribed dose constraints are best met. In one particular embodiment, the gantry based system may have a gimbaled radiation source head assembly.
In addition, specialized graphics program 402, which is a customized routine, such as, for example, a program related to DRR generation and/or a DRR enhancement routine, may be implemented to communicate with the application 401. In one embodiment, the specialized graphics program 402 may be loaded into the graphics processing unit 405, which may be implemented as part of a video adapter or a separate device, via the graphics device access API 403 and the graphics device driver 404. The graphics device access API 403 may be compatible with OpenGL® or DirectX®.
The graphics device driver 404 may be provided by a vendor of the graphics processing unit 405. Alternatively, the graphics device driver 404 may be provided by a vendor of an operating system (OS) running within the system 400. Some examples of the OS (not shown) may include a Windows® OS from Microsoft Corporation® of Washington or a Mac® OS from Apple Computer® of California. Alternatively, the OS may be UNIX, LINUX, etc.
As described above, the contour generated may be represented or encoded by a series of points in the 2D space. Some exemplary formats of the encoding are discussed in detail below with reference to
In one embodiment, the contour is represented by chain code. The contour is traced in a clockwise manner over the digitized image 622 and the directions of the tracing are recorded as the tracing moves from one contour pixel to the next. In one embodiment, a contour pixel is an object pixel that has a non-object background pixel as one or more of its 4-connected neighbors. After tracing, a contour 624 as shown in
The chain codes may be associated with eight possible directions. For instance, with x as the current contour pixel position, the chain codes are generally defined as follows:
Even codes {0,2,4,6} correspond to horizontal and/or vertical directions, while odd codes {1,3,5,7} correspond to the diagonal directions. Each code may be considered as the angular direction, in multiples of about forty-five (45) degrees. The absolute coordinates [m,n] of the first contour pixel (e.g., top, leftmost pixel) together with the chain codes of the contour may represent a complete description of a discrete region contour. Note that a change between two consecutive chain codes may indicate that the contour has changed direction. Thus, this point is defined as a corner in some embodiments.
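Since the figure defining the eight directions is not reproduced here, the encoding can be sketched as follows. The direction table is one common convention (code 0 = step right, codes increasing counter-clockwise in 45-degree increments); the patent figure may assign the codes differently.

```python
# Illustrative 8-way chain code encoder. Each code k corresponds to
# a step of k * 45 degrees; even codes are horizontal/vertical moves,
# odd codes are diagonal moves, matching the description in the text.
# Coordinates are (x, y) with y growing downward (image convention).
DIRECTIONS = {
    (1, 0): 0, (1, -1): 1, (0, -1): 2, (-1, -1): 3,
    (-1, 0): 4, (-1, 1): 5, (0, 1): 6, (1, 1): 7,
}

def chain_code(points):
    """Chain codes between consecutive contour pixels."""
    return [DIRECTIONS[(x1 - x0, y1 - y0)]
            for (x0, y0), (x1, y1) in zip(points, points[1:])]

# Two steps right, then one diagonal step down-right.
codes = chain_code([(0, 0), (1, 0), (2, 0), (3, 1)])
```

Consistent with the text, the change from code 0 to code 7 at the last step signals a change of direction, i.e., a corner; the absolute coordinates of the first pixel plus these codes fully describe the traced contour.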
An alternative to the chain codes for contour representation or encoding is to use neither the contour pixels associated with the object nor the contour pixels associated with background, but rather the line, i.e., the crack, in between. This is illustrated in
The crack code may be viewed as a chain code with four possible directions instead of eight. For example, the chain code for the enlarged section 640 in
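A hedged sketch of such a four-direction crack code is shown below. It encodes a walk along the lattice corner points between pixels; the code assignment (0 = right, 1 = up, 2 = left, 3 = down) is an illustrative convention, not necessarily the one used in the referenced figure.

```python
# Illustrative crack code encoder: like a chain code, but the walk
# follows the "cracks" between pixels, so only four directions occur.
# Coordinates are lattice corners (x, y) with y growing downward.
CRACK_DIRS = {(1, 0): 0, (0, -1): 1, (-1, 0): 2, (0, 1): 3}

def crack_code(corners):
    """Crack codes between consecutive lattice corner points."""
    return [CRACK_DIRS[(x1 - x0, y1 - y0)]
            for (x0, y0), (x1, y1) in zip(corners, corners[1:])]

# Walking clockwise around the boundary of a single pixel at (0, 0).
square = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
codes = crack_code(square)
```

Because the crack lies between object and background, this representation sidesteps the choice of attributing contour pixels to either region, as the text notes.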
Alternatively, a third representation is based on coding the consecutive pixels along a row, also referred to as a run, which belongs to an object by giving the starting position of the run and the ending position of the run. One embodiment of the runs 626 is illustrated in
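The run-based representation described above can be sketched as follows; the dictionary-of-rows output format is an illustrative choice.

```python
# Illustrative run code encoder: for each row, record the starting
# and ending column of every run of consecutive object pixels.
def run_code(mask):
    """Return {row: [(start_col, end_col), ...]} for a binary mask."""
    runs = {}
    for y, row in enumerate(mask):
        row_runs, start = [], None
        for x, v in enumerate(row):
            if v and start is None:
                start = x                      # run begins
            elif not v and start is not None:
                row_runs.append((start, x - 1))  # run ends
                start = None
        if start is not None:                  # run reaches row end
            row_runs.append((start, len(row) - 1))
        if row_runs:
            runs[y] = row_runs
    return runs

mask = [
    [0, 1, 1, 0, 1],
    [0, 0, 0, 0, 0],
    [1, 1, 1, 1, 1],
]
runs = run_code(mask)
```

For dense objects, storing one (start, end) pair per run can be far more compact than listing every boundary pixel.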
Thus, some embodiments of volume based contour generation using a graphics processing unit have been described. Some portions of the preceding detailed descriptions have been presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the ways used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. The operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the above discussion, it is appreciated that throughout the description, discussions utilizing terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Embodiments of the present invention also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a machine readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), erasable programmable ROMs (EPROMs), electrically erasable programmable ROMs (EEPROMs), magnetic or optical cards, or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
The operations and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method operations. The required structure for a variety of these systems will appear from the description below. In addition, embodiments of the present invention are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of embodiments of the invention as described herein.
In the foregoing specification, embodiments of the invention have been described with reference to specific exemplary embodiments thereof. It will be evident that various modifications may be made thereto without departing from the broader spirit and scope of the invention as set forth in the following claims. The specification and drawings are, accordingly, to be regarded in an illustrative sense rather than a restrictive sense.