This disclosure relates to depiction of images of objects using computer enabled imaging, and especially to the lighting aspect of computer enabled imaging.
Visualization of volumetric objects represented by 3-D (three dimensional) scalar fields is one of the most complete, realistic, and accurate ways to represent the internal and external structures of real 3-D objects.
As an example, Computed Tomography (CT) digitizes images of real 3-D objects and represents them as a discrete 3-D scalar field. MRI (Magnetic Resonance Imaging) is another system to scan and depict the internal structure of real 3-D objects.
As another example, the oil industry uses seismic imaging techniques to generate a 3-D image volume of a 3-D region in the earth. Some important geological structures, such as faults or salt domes, may be embedded within the region and are not necessarily on the surface of the region.
Direct volume rendering is a computer enabled technique developed for visualizing the interior of a solid region represented by such a 3-D image volume on a 2-D image plane, e.g., displayed on a computer monitor. Hence a typical 3-D dataset is a group of 2-D image “slices” of a real object generated by the CT or MRI machine or by seismic imaging. Typically the scalar attribute or voxel (volume element) at any point within the image volume is associated with a plurality of classification properties, such as color (red, green, blue) and opacity, which can be defined by a set of lookup tables. A plurality of rays is cast from the 2-D image plane into the volume, and the rays are attenuated or reflected by the volume. The amount of attenuated or reflected ray energy of each ray is indicative of the 3-D characteristics of the objects embedded within the image volume, e.g., their shapes and orientations, and further determines a pixel value on the 2-D image plane in accordance with the opacity and color mapping of the volume along the corresponding ray path. The pixel values associated with the plurality of ray origins on the 2-D image plane form an image that can be rendered by computer software on a computer monitor. Direct volume rendering is described in more detail in “Computer Graphics: Principles and Practice” by Foley, Van Dam, Feiner and Hughes, 2nd Edition, Addison-Wesley Publishing Company (1996), pp. 1134-1139.
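For illustration only (this is a minimal sketch, not the specific implementation of the disclosure), the following Python code shows front-to-back compositing of color and opacity samples along a single ray cast into the image volume. The function names, the nearest-voxel sampling, and the step-size parameters are assumptions made for the example.

```python
import numpy as np

def composite_ray(volume, origin, direction, transfer_function,
                  step=0.5, max_steps=512):
    """Front-to-back compositing of one ray through a 3-D scalar volume.

    volume            -- 3-D numpy array of scalar values (the image volume)
    origin, direction -- ray start point and unit direction in voxel coordinates
    transfer_function -- callable mapping a scalar value to (r, g, b, opacity)
    """
    color = np.zeros(3)          # accumulated color for this pixel
    transparency = 1.0           # remaining transparency along the ray
    position = np.asarray(origin, dtype=float)
    step_vector = step * np.asarray(direction, dtype=float)

    for _ in range(max_steps):
        idx = tuple(np.round(position).astype(int))
        if any(i < 0 or i >= n for i, n in zip(idx, volume.shape)):
            break                # the ray has left the image volume
        r, g, b, opacity = transfer_function(volume[idx])
        # classic front-to-back "over" compositing
        color += transparency * opacity * np.array([r, g, b])
        transparency *= (1.0 - opacity)
        if transparency < 1e-3:  # early ray termination
            break
        position += step_vector

    return color                 # the pixel value on the 2-D image plane
```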
In the CT example discussed above, even though a doctor using CT or MRI equipment and conventional methods can arbitrarily generate a 2-D image slice of, e.g., a heart by intercepting the image volume in any direction, no single image slice is able to visualize the whole surface of the heart. In contrast, a 2-D image generated through direct volume rendering of the CT image volume can easily reveal on a computer monitor the 3-D characteristics of the heart, which is very important in many types of cardiovascular disease diagnosis. Similarly, in the field of oil exploration, direct volume rendering of 3-D seismic data has proved to be a powerful tool that can help petroleum engineers determine more accurately the 3-D characteristics of geological structures embedded in a region that are potential oil reservoirs and thereby increase oil production significantly.
One of the most common and basic structures used to control volume rendering is the transfer function. In the context of volume rendering, a transfer function defines the classification/translation of the original elements of the volumetric data (voxels) to their representation on the computer monitor screen; the commonly used transfer function representation is the classification of color (red, green, blue) and opacity. Hence each voxel has a color and an opacity value defined using a transfer function. The transfer function itself may be expressed mathematically as, e.g., a simple ramp, a piecewise linear function, or a lookup table. Computer enabled volume rendering as described here may use conventional volume ray tracing, volume ray casting, splatting, shear warping, or texture mapping. More generally, transfer functions in this context assign renderable (by volume rendering) optical properties to the numerical values (voxels) of the dataset. The opacity function determines the contribution of each voxel to the final (rendered) image.
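As one example of such a lookup-table transfer function, the sketch below maps a scalar voxel value to (red, green, blue, opacity). The 256-entry table, the normalization range, and the simple ramp classification are assumptions of the example, not requirements of the disclosure.

```python
import numpy as np

# A hypothetical 256-entry lookup table: each row holds (red, green, blue, opacity)
# for one scalar value.  Here it is a simple ramp that fades from transparent
# black to opaque white as the voxel value increases.
LUT = np.linspace([0.0, 0.0, 0.0, 0.0], [1.0, 1.0, 1.0, 1.0], 256)

def transfer_function(voxel_value, vmin=0.0, vmax=255.0):
    """Classify a voxel: map its scalar value to (r, g, b, opacity)."""
    t = (voxel_value - vmin) / (vmax - vmin)            # normalize to [0, 1]
    index = int(np.clip(t, 0.0, 1.0) * (len(LUT) - 1))  # nearest LUT entry
    return tuple(LUT[index])
```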
Even though direct volume rendering plays a key role in many important fields, several challenges need to be overcome to assure the most informative 2-D representation of volumetric 3-D objects. First, volumetric data may have a variety of properties, some of which may not be favorable for particular lighting techniques, so the flexibility to control the type of applied lighting may be a valuable tool to ensure the most informative representation of volumetric objects. (In this context, “lighting” refers to computer enabled imagery and how it is depicted by a computer system, not to actual lighting.)
Therefore, the present inventor has determined it would be desirable to increase the flexibility to control the type of lighting for direct volume rendering, which may increase rendering efficiency and provide a more readable 2-D representation of 3-D volumetric objects.
The present disclosure relates generally to the field of computer enabled volume data rendering, and more particularly, to a method and system for rendering a volume dataset using a transfer function representation having explicit control of the type of lighting per particular range of the scalar field of the volumetric data. One embodiment is a method and system for rendering a volume dataset using an extended transfer function representation for explicit control of the type of lighting per particular range of the scalar field of the volumetric data. (“Lighting” here is used in the computer imaging sense, not referring to actual physical light.) One exemplary way to control the lighting property in accordance with the invention is to specify explicitly whether or not the gradient of the scalar field is to be used for the computation of lighting. The approach is not limited to such gradient lighting control via an extension of the transfer function; this is merely one example of such lighting control. Another example of the present lighting control is selection of which type of gradient lighting to apply, such as selecting the Phong or Blinn-Phong shading model, both well known in the field. Also, each particular type of lighting is associated with a set of parameters which may be uniquely specified for a particular data range, i.e., a scalar field range along the X-axis of the transfer function. For example, the Phong shading model is associated with these four parameters: ks, the specular reflection constant; kd, the diffuse reflection constant; ka, the ambient reflection constant; and α, the shininess constant of the material.
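For reference, the well known Phong model mentioned above computes the lit color of a sample from those four constants roughly as in the following sketch. The single white light source, the default constant values, and the vector names are assumptions of the example.

```python
import numpy as np

def phong_shade(base_color, normal, light_dir, view_dir,
                ka=0.1, kd=0.7, ks=0.2, alpha=10.0):
    """Phong lighting at one sample point.

    base_color -- (r, g, b) from the transfer function
    normal     -- unit normal (e.g., the normalized scalar-field gradient)
    light_dir  -- unit vector from the sample toward the light
    view_dir   -- unit vector from the sample toward the viewer
    ka, kd, ks -- ambient, diffuse and specular reflection constants
    alpha      -- shininess constant of the material
    """
    base_color = np.asarray(base_color, dtype=float)
    normal = np.asarray(normal, dtype=float)
    light_dir = np.asarray(light_dir, dtype=float)
    diffuse = max(np.dot(normal, light_dir), 0.0)
    # reflection of the light direction about the normal
    reflect = 2.0 * np.dot(normal, light_dir) * normal - light_dir
    specular = max(np.dot(reflect, np.asarray(view_dir, dtype=float)), 0.0) ** alpha
    # ambient and diffuse terms tint the base color; specular adds a white highlight
    return ka * base_color + kd * diffuse * base_color + ks * specular * np.ones(3)
```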
The present method and system add an additional user operated control parameter to an otherwise conventional transfer function, where the parameter specifies the type of lighting to be applied for the corresponding values of the scalar field. For example, scalar field ranges which do not have steady gradients would likely appear as noise if gradient lighting were applied to that data range, so non-gradient based lighting is selected for that data range (scalar field range). A steady gradient here means that the directions of neighboring lighting gradients are coherent, i.e., they tend to have similar directions or the change of direction is smooth, at least up to the scale or size of the depicted structures. As described below, in one example the term “Lighting OFF” represents the case when gradient lighting is not used and the term “Lighting ON” represents the case when gradient lighting is used.
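A minimal sketch of such an extended transfer function follows. The control-point structure, the piecewise-constant lookup, and all field names and values are hypothetical; they only illustrate the added per-range lighting flag.

```python
from bisect import bisect_right

# Hypothetical control points of an extended transfer function.  Each entry:
# (scalar value on the X-axis, (r, g, b), opacity, gradient_lighting flag).
# The flag applies to the data range from that control point to the next one.
CONTROL_POINTS = [
    (0,   (0.0, 0.0, 0.0), 0.00, False),  # noisy low values: "Lighting OFF"
    (80,  (0.8, 0.5, 0.3), 0.05, False),
    (120, (1.0, 0.9, 0.8), 0.40, True),   # steady gradients: "Lighting ON"
    (255, (1.0, 1.0, 1.0), 0.90, True),
]

def classify(voxel_value):
    """Return (color, opacity, use_gradient_lighting) for a scalar value."""
    xs = [x for x, _, _, _ in CONTROL_POINTS]
    i = max(bisect_right(xs, voxel_value) - 1, 0)
    _, color, opacity, use_gradient_lighting = CONTROL_POINTS[i]
    return color, opacity, use_gradient_lighting
```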
The method and system then apply the particular type of lighting for each sample event (a sample along a ray for ray casting, or the equivalent) according to the present lighting control parameter added to the transfer function. Note that if the lighting for the current data range is gradient lighting, then the gradient associated with the sampled point is also sampled or assessed.
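Continuing the sketches above (and reusing the hypothetical classify and phong_shade functions), the per-sample logic could look roughly as follows. The central-difference gradient, the interior-voxel assumption, and the unlit fallback branch are illustrative assumptions, not the required implementation.

```python
import numpy as np

def gradient(volume, i, j, k):
    """Central-difference gradient of the scalar field at interior voxel (i, j, k)."""
    g = 0.5 * np.array([volume[i + 1, j, k] - volume[i - 1, j, k],
                        volume[i, j + 1, k] - volume[i, j - 1, k],
                        volume[i, j, k + 1] - volume[i, j, k - 1]], dtype=float)
    n = np.linalg.norm(g)
    return g / n if n > 0 else g

def shade_sample(volume, i, j, k, light_dir, view_dir):
    """Apply the lighting type selected by the extended transfer function."""
    color, opacity, use_gradient_lighting = classify(volume[i, j, k])
    if use_gradient_lighting:
        # "Lighting ON": the gradient at the sampled point is also computed
        normal = gradient(volume, i, j, k)
        lit = phong_shade(color, normal, light_dir, view_dir)
    else:
        # "Lighting OFF": the classified color is used directly, no gradient needed
        lit = np.asarray(color, dtype=float)
    return lit, opacity
```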
The aforementioned features and advantages of the invention as well as additional features and advantages thereof will be more clearly understood hereinafter as a result of a detailed description of embodiments of the invention when taken in conjunction with the drawings.
The user, by manipulating the control points on his computer screen by means of, e.g., a computer input device such as a mouse, can thereby turn the gradient lighting in this example on or off at each control point individually to optimize his view of the image. The gradient lighting in this example is turned on/off only for the data range associated with the portion of the transfer function extending from one user control point on the transfer function to the next control point along the X-axis of the transfer function. This X-axis defines the data values and scalar field values. The user thereby determines what sort of lighting to use based on, e.g., properties of the lighting gradients as he views the image.
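Using the hypothetical control-point structure sketched earlier, the per-control-point on/off toggle driven by such user input could be as simple as the following illustrative helper.

```python
def toggle_gradient_lighting(control_points, index):
    """Flip "Lighting ON"/"Lighting OFF" for the data range that starts at the
    control point selected by the user (e.g., with a mouse click)."""
    x, color, opacity, use_gradient_lighting = control_points[index]
    control_points[index] = (x, color, opacity, not use_gradient_lighting)

# Example: turn gradient lighting off for the range starting at the third
# control point, then re-render the volume with the updated transfer function.
toggle_gradient_lighting(CONTROL_POINTS, 2)
```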
In one embodiment the present method and apparatus to control the type of lighting therefore are embodied in computer software (code or a program) to be executed on a programmed computer or computing device 20. This code may be a separate application program and/or embedded in the transfer function representation. The input dataset (e.g. the CT data) may be provided live (in real time from a CT or MRI scanner or other source) or from storage as in
The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.
This application claims priority to commonly invented U.S. provisional application No. 61/059,635, filed Jun. 6, 2008, incorporated herein by reference in its entirety.