The present invention relates to the field of digital image processing. In particular the present invention discloses a graphical user interface and methods for performing color correction and color keying.
Recent advances in video processing technologies have led to a surge in popularity of new video processing applications that make powerful video editing capabilities available to a wide base of users. Typically, video processing applications allow users to download and upload video and audio segments, as well as edit and manipulate these segments for producing a cohesive video or movie.
In performing such tasks, video processing applications often use a video capture card to capture and store video segments onto the hard drive of a computer system. Video capture cards typically employ a coder/decoder (also called a “CODEC”) to compress the video with a compression standard such as Motion-JPEG, DV, MPEG-1, MPEG-2, etc. Many digital video storage formats store pixel data in a luminance and chrominance colorspace often referred to as Y/Cr/Cb (also referred to as YUV). In a luminance and chrominance colorspace three components are stored for each pixel: one for luminance (Y) and two for color information (Cr and Cb). Most computer display systems store pixel information in an RGB format that also contains three components per pixel, one each for the Red (R), Green (G), and Blue (B) portions of the color. Pixel information stored in either YUV or RGB format can be converted to the other format using straightforward matrix mathematics.
In the DV (digital video) storage format, storage is typically accomplished with an 8-bit luminance (Y) value for each pixel. 8 bits allows luminance (Y) values ranging from 0 through 255. In 8-bit digital video, black is typically encoded at Y=16 and white is encoded at Y=235. The luminance values from 1 to 15, referred to as footroom, and 236 to 254, referred to as headroom, are used to accommodate ringing and overshoot in a signal. Industry standard equations (such as those specified by Rec. ITU-R BT.601) can convert 8-bit RGB encoded images with RGB values ranging from 0 to 255 into YUV encoded images with luminance (Y) values ranging from 16 to 235. Most software DV CODECs follow this mapping so that a user may translate, say, naturalistic computer pictures into quality video.
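As an illustration of this mapping, the following sketch converts one full-range 8-bit RGB pixel into 8-bit Y/Cb/Cr with luminance in the nominal 16-235 range. It assumes the Rec. ITU-R BT.601 luma weights and studio-swing scaling; the exact coefficients and rounding used by any particular CODEC may differ.

```python
def rgb_to_ycbcr_601(r, g, b):
    """Map one full-range 8-bit RGB pixel to 8-bit Y/Cb/Cr with Y in 16..235.

    Luma weights follow Rec. ITU-R BT.601; studio-swing scaling (219 for luma,
    224 for chroma) is assumed here for illustration only.
    """
    y  = 16 + (219.0 / 255.0) * (0.299 * r + 0.587 * g + 0.114 * b)
    cb = 128 + (224.0 / 255.0) * (-0.168736 * r - 0.331264 * g + 0.5 * b)
    cr = 128 + (224.0 / 255.0) * (0.5 * r - 0.418688 * g - 0.081312 * b)
    return round(y), round(cb), round(cr)

# Pure white (255, 255, 255) maps to approximately Y=235, Cb=Cr=128.
print(rgb_to_ycbcr_601(255, 255, 255))
```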
However, several phenomena may degrade or compromise the dynamic range of given colors in a resulting video segment. For example, difficulties often arise because cameras can capture values that are superwhite (values resulting from specular reflections, the sun, bright lights, clouds, or white walls). Superwhite values may exceed the nominal white value of 235 as registered on a waveform monitor, where whites may peak at 100 IRE (NTSC), the brightest value allowable on a broadcast RF modulator. In the YUV space, these superwhite Y values range from 235 to 254; but on a waveform monitor, the whites can be seen to range from 100 IRE to almost 110 IRE. These represent illegal values (e.g., Y values above 254) and are accordingly clipped to 254 by a CODEC when converting to RGB, thereby compromising the dynamic range of at least the white value in a given digital image.
Typically, users may attempt to mitigate such value degradation by employing color correction in the RGB space. A problem arises, however, because such color correction still yields only a limited amount of headroom for colors (such as superwhite), and it also requires extensive operation cycles, execution time, and memory accesses in computer systems that support digital video processing applications. Inherent in such a problem is the need for rendering, in which edits are translated and stored on the hard drive of a computer system supporting a given video image processing application. Even recently developed “real time” systems still need to fall back to rendering: if a user simultaneously color corrects, adds filters and effects, and superimposes graphics, even a high-end real-time system will be overwhelmed and its real-time performance will be compromised. As such, there are still deficiencies not addressed by recent advances in video processing, in particular the limited headroom for color values and the need for a less computationally expensive approach to color correction.
A graphical user interface for performing color correction and methods for implementing the color correction are disclosed. The graphical user interface allows a user to adjust the colorspace of the pixels in the image. In one embodiment, a color adjustment pad allows the user to push the pixels from a particular luminance level a desired magnitude towards a desired hue. Pixels from other luminance levels are affected proportionally. The graphical user interface further allows a user to adjust the luminance of the pixels in the image. A luminance adjustment slider allows the user to adjust the luminance of pixels from a selected luminance level by a relative amount. Pixels from other luminance levels have their luminance affected in a manner proportional to the difference between the selected luminance level value and the luminance value of the other pixel.
Other objects, features, and advantages of the present invention will be apparent from the accompanying drawings and from the following detailed description.
The objects, features, and advantages of the present invention will be apparent to one skilled in the art, in view of the following detailed description in which:
a) illustrates a conceptual diagram of a three-dimensional colorspace defined in the luminance (Y) and chrominance (UV) format known as a YUV format.
b) illustrates a conceptual diagram of a three-dimensional colorspace of
a) illustrates a compliant video frame in the canvas window.
b) illustrates a subsequent video frame in the canvas window with all the pixels close to an allowable maximum highlighted with zebra striping.
c) illustrates a subsequent video frame in the canvas window with all the pixels close to an allowable maximum highlighted with a first zebra striping and all the pixels exceeding the allowable maximum highlighted with a second zebra striping.
a) illustrates the effect of moving the white luminance plane of the cylindrical colorspace representation to the right.
b) illustrates the effect of moving the black luminance plane of the cylindrical colorspace representation to the right.
a) to 14(g) illustrate graphical representations of the various input versus output relationships within a look-up table for luminance correction within the inventive system, as well as the control points of the Bezier curve which describes the M control modifications within the look-up table.
A method and user interface for performing color correction is disclosed. In the following description, for purposes of explanation, specific nomenclature is set forth to provide a thorough understanding of the present invention. However, it will be apparent to one skilled in the art that these specific details are not required in order to practice the present invention. For example, the present invention has been described with reference to Bezier curves. However, the same techniques can easily be applied with other types of curved functions.
Digital video pixel information is commonly stored in a three component luminance and chrominance format. For example, both the MPEG-1 and MPEG-2 digital video standards encode pixel data with three component luminance and chrominance format.
The luminance specifies the brightness of a pixel. Luminance ranges from pure black to pure white. Luminance is typically expressed as a single component Y value. The chrominance specifies the color of a pixel and is stored as two component values Cr and Cb (also referred to as U and V values). One reason that digital video is stored in a luminance and chrominance format is that human vision is much more sensitive to luminance information than to chrominance information, such that the separate chrominance information can be more heavily compressed without significantly detectable image degradation.
Computer display systems generally represent pixel information in a three component RGB format wherein the three components specify the amount of Red (R), Green (G), and Blue (B) required to represent a pixel. Pixel information stored in either RGB format or YUV format can easily be converted to the other format using simple matrix math.
a) illustrates a conceptual diagram of a three-dimensional colorspace defined in the luminance (Y) and chrominance (UV) YUV format. The two-part UV chrominance value is not intuitive to humans. Thus, the colorspace is often defined in an alternative hue (H), saturation (S), and luminance (Y) format.
b) illustrates an alternate interpretation of the three-dimensional colorspace in
The luminance (Y) value continues to represent the brightness of a pixel. Luminance (Y) is represented as a vertical coordinate along the vertical axis of the cylinder of the cylindrical colorspace of
The hue (H) represents a particular color value. Hue is represented as an angle from the center of the cylinder of the cylindrical colorspace of
Saturation (S) refers to the depth or intensity of a pixel's color (hue); in other words, how strong the color is relative to an unsaturated white/gray/black of the same luminance level. For example, a deep red has a high level of saturation (mostly pure red with little white) and a pinkish red is only lightly saturated (less pure red and a higher level of white). Saturation is represented in the colorspace of
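As a sketch of this cylindrical interpretation, a pixel's hue angle and saturation can be derived from its two chrominance components by treating Cb and Cr as offsets from the 128 center value used in 8-bit video; the scaling here is illustrative only.

```python
import math

def chroma_to_hue_saturation(cb, cr):
    """Interpret 8-bit Cb/Cr values as a hue angle (degrees) and a saturation magnitude."""
    u = cb - 128.0                         # chrominance offset from the gray (unsaturated) axis
    v = cr - 128.0
    hue = math.degrees(math.atan2(v, u))   # angle around the cylinder
    saturation = math.hypot(u, v)          # distance from the unsaturated center
    return hue, saturation

# A fully unsaturated (gray) pixel sits on the central axis of the cylinder.
print(chroma_to_hue_saturation(128, 128))  # (0.0, 0.0)
```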
Film and television feature productions generally go through three phases: pre-production, production, and post-production. The pre-production phase is when the script is created, the actors are cast, and the sets are built. During the production phase, the actual film is shot or the videotape is recorded using the actors and sets. Finally, during post-production, the film or video tape is edited together, the sound is edited, music is added, special effects are added, and the film or video tape images are adjusted.
As part of the post-production process, the images from many films and television shows go through a process of “color correction.” The process of color correction is used to fix color problems that occur during the filming or videotape production.
Professional colorists refer to primary, secondary, and tertiary color correction. Primary color correction involves fixing the basic color balance of an image to make it look correct with true color representation. Primary color correction problems include incorrect color casting from bad lighting, color problems caused by color filters, and improper exposure. Secondary color correction problems are caused by mismatched colors from related scenes shot at different times or with different lighting.
Tertiary color correction is actually color enhancement such as making a scene darker to enhance a particular mood. Tertiary color correction is also used to perform certain special effects. For example, the director or editor may wish to change the color of specific objects in a scene. For example, the director or editor may change the color of clothing worn by a particular character. The director may change the eye color of characters such as the distinctive blue eyes in the feature film “Dune.”
The present invention introduces a new color correction system that allows post-production specialists to have a wide latitude of control over the colors in a series of images. The color correction system uses a highly intuitive graphical user interface such that post-production specialists can quickly learn the color correction system and professionally apply its features.
Image Canvas Window
A first window 210 is a “canvas” window that displays the video or other images that are being adjusted with the color correction system. In a preferred embodiment, the canvas window 210 includes novel feedback systems that provide more information than just the image.
Images may contain pixels that exceed a maximum luminance value allowed for broadcast material. Thus, it would be desirable to be able to identify and locate pixels that exceed this maximum allowed luminance value. In one embodiment, the canvas window 210 can be placed into a luminance test mode. When in the luminance test mode, the color correction system tests all the pixels in an image to determine if the luminance of each pixel is close to or above an acceptable luminance threshold value. An example of the luminance test mode will be described with reference to
a) illustrates a canvas window video frame 310 that is tracking a ski jumper who will soon pass by the sun 320, which is currently not in the video frame. The sun emits so much light that it can easily cause image pixels to have luminance values that are out of range. Referring to
Referring to
Finally, referring to
In alternate embodiments, the luminance test mode may display pixels that are “close” to the maximum luminance value (within 20%) with green zebra striping, pixels that are “very close” (within 10%) with yellow zebra striping, and pixels exceeding the maximum luminance value with red zebra striping. Furthermore, in one embodiment the canvas window video frame implements a saturation test mode. When in the saturation test mode, the pixels that exceed the maximum allowed color saturation are highlighted with zebra striping. In a preferred embodiment, the user may activate the luminance test mode and the saturation test mode simultaneously such that the user can locate oversaturated pixels and pixels that exceed the maximum allowed luminance value at the same time.
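A minimal sketch of the per-pixel test such a mode might perform is shown below, assuming 8-bit luminance, a nominal maximum of 235, and the “close” (within 20%) and “very close” (within 10%) thresholds mentioned above; the classification names are placeholders rather than values taken from the interface.

```python
def classify_luminance(y, y_max=235):
    """Classify an 8-bit luminance value for zebra-stripe highlighting (illustrative thresholds)."""
    if y > y_max:
        return "exceeds"      # e.g. highlighted with red zebra striping
    if y >= 0.9 * y_max:
        return "very close"   # within 10% of the maximum: e.g. yellow zebra striping
    if y >= 0.8 * y_max:
        return "close"        # within 20% of the maximum: e.g. green zebra striping
    return "ok"               # no highlighting

print(classify_luminance(230))  # "very close"
print(classify_luminance(240))  # "exceeds"
```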
Workbench Scopes Window
A second window 220 is a “workbench” window that displays a number of commonly used scopes for video editing. The workbench window 220 may include scopes such as a waveform monitor, a vectorscope, and a luminosity histogram, as is well known in the art. Other useful scopes may be created or added at a later date. The combination of the canvas window 210 and the workbench window 220 provides feedback to the user.
Color Correction Tools Window
The third window 230 is a color correction interface window that contains tools for performing color correction. The color correction interface window 230 comprises a graphical user interface containing several different color correction tools for performing color correction on the images in the canvas window 210. This graphical user interface, and the methods that implement its color correction tools, are among the primary focuses of this document.
Color Plane Adjustment Pads
In the color correction window 230 embodiment of
The color adjustment pads 430, 420, and 410 allow the user to adjust the meaning of "white", "gray", and "black", respectively, by shifting the center toward a particular hue. Specifically, a user may use a cursor control device to draw a vector from the center (431, 421, or 411) of a color adjustment pad toward a particular hue (angle) for a specified distance (magnitude). The color correction system then adjusts all the colors accordingly. In a preferred embodiment, the controls produce relative adjustments.
The three different color adjustment pads 410, 420, and 430 correspond to constant luminance planes in the three-dimensional colorspace illustrated in
Each color adjustment pad 410, 420, and 430 includes an associated reset button 417, 427, and 437, respectively. The reset button resets the color balance to the default (no adjustment) state.
Each color adjustment pad 410, 420, and 430 also includes an eyedropper button 415, 425, and 435, respectively. The eyedropper button allows the user to select a color from the image and define that selected color as the center (unsaturated) value of the associated luminance plane. In this manner, a user may perform post-production white balancing by selecting a colored pixel from an object known to be white in an image.
When an adjustment is made for a particular luminance level, other luminance levels are affected as well. When the whites color adjustment pad 430 or the blacks color adjustment pad 410 is used to adjust the whites luminance plane 530 or blacks luminance plane 510, respectively, all the other luminance planes are adjusted proportionally according to their distances from the adjusted luminance plane. In one embodiment, an adjustment to the whites luminance plane 530 or blacks luminance plane 510 causes the other luminance planes to be adjusted depending on how close those planes are to the adjusted luminance plane. When the mids color adjustment pad 420 is used to adjust the mids luminance plane 520, then the other luminance planes are also adjusted. However, in one embodiment, an adjustment to the mids luminance plane 520 causes the other luminance planes to be adjusted according to a curve depending on how close those planes are to the adjusted mids luminance plane 520.
Color Plane Adjustment Examples
Referring to
Similarly, if the user moves the center of the blacks color adjustment pad 410 to the right, the pixels of the blacks luminance level plane are adjusted toward the blue (B) hue and higher luminance level planes are adjusted proportionally. Specifically,
If the user moves the mids (middle) color adjustment pad 420, the system moves colors of the mids luminance plane 520 accordingly. The other luminance planes are also moved, but by an amount specified by a curve. Specifically,
The movements of the three different color adjustment pads 430, 420, and 410 may be combined to create different colorspace adjustments. For example, the user may first adjust the whites luminance level plane 530 to the right by ΔX using the whites color adjustment pad 430 to produce the adjusted colorspace illustrated in
Luminance Adjustment Sliders
Referring back to
As with the color adjustment pads, the luminance sliders adjust the luminance on their respective luminance planes. The whites luminance slider 439 adjusts the luminance for the pixels on the whites luminance plane 530 and the remaining luminance planes as a function of how close those other planes are to the whites luminance plane as illustrated by the input (x)/output (y) graphs in
The blacks luminance slider 419 adjusts the luminance for the pixels on the blacks luminance plane 510 and the remaining luminance planes proportionally. Specifically, adjusting the blacks luminance slider 419 to the left decreases the luminance as illustrated by the input(x)/output(y) graphs of
The mids luminance slider 429 adjusts the luminance of pixels on the mids luminance plane 520. In one embodiment, the adjustment to the mids luminance slider adjusts the other luminance planes in a curved manner according to a curve as illustrated in
Auto-Contrast Buttons
In the center of the graphical user interface of
Referring to the auto-contrast buttons 460 in
These auto-contrast buttons 460 are highly desirable since one of the tasks that professional colorists often perform is to adjust the black and white levels to see what contrast the original image has. After seeing the contrast in the original image, the colorist may increase (or decrease) the image's contrast as appropriate.
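One plausible realization of such an auto-contrast adjustment is sketched below, under the assumption that the black and white levels are stretched so that the darkest and brightest luminance values in the image map to nominal black (16) and white (235); the patent text does not specify the exact algorithm, so this is illustrative only.

```python
def auto_contrast_levels(luminances, black=16, white=235):
    """Return a function that stretches the darkest/brightest luminance in the image
    to nominal black and white levels (hypothetical auto-contrast behavior)."""
    lo, hi = min(luminances), max(luminances)
    if hi == lo:
        return lambda y: y                       # flat image: nothing to stretch
    scale = (white - black) / float(hi - lo)
    return lambda y: black + (y - lo) * scale

remap = auto_contrast_levels([40, 90, 180, 210])
print(remap(40), remap(210))                     # darkest -> 16.0, brightest -> 235.0
```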
Saturation Adjustment Slider
Referring again to
Adjusting the saturation with a saturation multiplier value of zero would change all the pixels to zero saturation, resulting in a black-and-white image. Adjusting the saturation with a saturation multiplier value of two would double the saturation of each pixel. Leaving the saturation adjustment slider 450 in the center position multiplies the saturation by a value of one and thus does not change the saturation of the pixels.
Limit Effect Panel
To limit the effects of color correction, the user may define a limited three-dimensional space within the three-dimensional colorspace that should be adjusted. Referring to
Referring back to the colorspace illustration of
Referring back to the limit effect panel 460 of
The angle determined by the defining hue markers 471 and 472 is used to allow the user to generate a smooth falloff for the color effect. For colors completely inside the defining hue markers 471 and 472, the effect is applied 100%. For colors completely outside the defining hue markers 471 and 472, the effect is not applied. For colors that fall into the “falloff area” that the user specifies by the angle of the defining hue markers 471 and 472, the resulting pixel is calculated by blending the original, unmodified pixel with the pixel after the effect has been applied. The blend is proportional to where the original pixel color falls in the falloff area. Thus, the angled defining hue markers 471 and 472 allow for a smooth gradation between pixels subject to the effect and pixels not subject to the effect.
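A sketch of how such a falloff blend might be computed for the hue limit is shown below, assuming a center hue, a width within which the effect is fully applied, and a softness band over which the effect ramps to zero; the parameter names and the linear ramp are assumptions for illustration, not taken from the interface.

```python
def hue_effect_weight(pixel_hue, center_hue, width, softness):
    """Return a 0..1 blend weight: 1 inside the hue range, ramping to 0 across the softness band."""
    # Smallest angular distance (degrees) between the pixel's hue and the range center.
    d = abs((pixel_hue - center_hue + 180.0) % 360.0 - 180.0)
    half = width / 2.0
    if d <= half:
        return 1.0                        # completely inside: effect applied 100%
    if d >= half + softness:
        return 0.0                        # completely outside: effect not applied
    return 1.0 - (d - half) / softness    # falloff area: proportional blend

def blend_pixel(original, corrected, weight):
    """Blend the original and color-corrected component values by the falloff weight."""
    return tuple(o * (1.0 - weight) + c * weight for o, c in zip(original, corrected))

w = hue_effect_weight(pixel_hue=-100.0, center_hue=-110.0, width=15.0, softness=10.0)
print(w, blend_pixel((90, 120, 140), (90, 130, 150), w))
```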
The user may press hue angle reset button 477 to reset the defined hue angle to a default hue angle. In one embodiment, the default hue angle is −110° with a default width of 15° and a default softness of 10. The user may center the hue spectrum 470 on a particular hue by selecting a hue from the image using eyedropper button 474.
A user may specify a Δs saturation range by marking the saturation range checkbox 488 and defining a saturation range on the saturation scale 480. The Δs saturation range is specified along the saturation scale 480 using a pair of defining saturation markers 481 and 482. As set forth with reference to the defining hue markers 471 and 472, the angle determined by the defining saturation markers 481 and 482 is used to allow the user to generate a smooth falloff for the color correction. The user may press saturation range reset button 487 to reset the defined saturation range to a default saturation range. In one embodiment, the default saturation range starts at 35% with a width of 40 and a softness of 20. The user may center the saturation markers 481 and 482 around a particular pixel's saturation value by selecting a pixel from the image using eyedropper button 474.
Finally, a user may specify a Δy luminance range by marking the luminance range checkbox 498 and defining a luminance range on the luminance scale 490. The Δy luminance range is specified along the luminance scale 490 using a pair of defining luminance markers 491 and 492. As set forth with reference to the defining hue markers 471 and 472, the angle determined by the defining luminance markers 491 and 492 is used to allow the user to generate a smooth falloff for the color correction. The user may press luminance range reset button 497 to reset the defined luminance range to a default luminance range. In one embodiment, the default luminance range starts at 0% with a width of 40 and a softness of 20. The user may center the luminance markers 491 and 492 around a particular pixel's luminance value by selecting a pixel from the image using eyedropper button 474. Furthermore, if the user activates the eyedropper while holding the shift key down then the hue, saturation, or luminance on which the user clicks will be added to the keying selection.
The graphical user interface for the color correction window may be implemented in many different manners. The different color correction graphical user interface embodiments may include new features and omit other features.
Hue Phase Shift Adjustment
The color correction embodiment of
The present invention provides for color correction in a novel manner by performing the color correction and keying in the YUV colorspace in order to reduce processing cycles and to save time through the avoidance of rendering. This is in marked contrast to prior art techniques that perform color correction in the RGB space and, thus, tend to involve greater amounts of rendering. By reducing or removing the time-consuming rendering steps, the system of the present invention is more efficient than prior art systems.
As will be described more fully in the sections hereafter, the inventive techniques utilize look-up tables (LUTs) that convert the YUV or YCrCb values of pixels in selected images to output values reflecting the user's color corrections. Although YUV and YCrCb denote different values mathematically, for purposes of the present invention, YUV and YCrCb are deemed interchangeable as used herein. The look-up tables (LUTs) that provide for the remapping of pixels according to the user's desired color correction are used to provide high performance pixel remapping. However, the color correction pixel remapping performed by the present invention may be implemented in other manners.
To fully describe the implementation of all the different features, the implementation will be described with reference to the color correction graphical user interface of
Once in Cartesian coordinate form, a recomputation of the applicable look-up tables (LUTs) is performed at 1130, the details of which will be discussed below. Note that in certain applications, however, it may not be necessary to recompute one or more LUTs, given that a user may modify only a parameter, such as a saturation adjustment, that does not require a look-up table recomputation. Finally, the color information of designated pixels is remapped at step 1140. In one embodiment, the YCrCb values that define the designated pixels are remapped with the aid of the recomputed look-up tables (LUTs).
Based on user input from the previously described luminance adjustment sliders, an adjustment to the pixel's luminance is made at step 1220 based on the luminance look-up tables (LUTs), as described in greater detail hereafter. Specifically, the luminance (Y) of a pixel is set with the following equation:
Y=yLUT(Y)
where Y is the pixel's luminance value
As such, the previously described luminance adjustment sliders are able to provide user control of the luminance (Y) value of a YCrCb defined pixel.
Next, an adjustment to the chrominance is made at 1230 based on the chrominance look-up tables (LUTs) that have been modified based on user input from the color adjustment pads as previously disclosed. Note that the chrominance adjustment is relative to the initial chrominance value. The chrominance values are adjusted with the following equations:
Cb=Cb+signedCbLUT(Y); and
Cr=Cr+signedCrLUT(Y)
where Y is the pixel's luminance value
As such, the previously described color adjustment pads are able to provide user control of the chrominance (Cr and Cb) values of a YCrCb defined pixel.
Also as previously described, the color correction user interface may include a hue adjustment wheel that may rotate the hues of pixels. The hue rotation may be expressed as a phase shift h. In accordance with one embodiment, a final assignment of chrominance is made, based on any existing phase shift h, at step 1240. The phase shifted chrominance values are determined from the previous chrominance values and the phase shift h. The general equations for this determination are:
Cb=Cb*cos(h)−Cr*sin(h); and
Cr=Cr*cos(h)+Cb*sin(h)
where the original (pre-rotation) Cb and Cr values are used on the right-hand side of both equations.
In systems without a hue adjustment wheel for phase modification h, there would be no need for step 1240 of
A final adjustment is made to a pixel's Cr and Cb values at step 1250, based on a saturation adjustment s. The saturation adjustment s (or satadjust) may increase or decrease the amount of color depending on user input using the saturation adjustment slider. This increase or decrease is accomplished by setting the saturation adjustment s to a value between 0 and 2 inclusive and then scaling the chrominance components of the above color correction results about the unsaturated center value by the factor s to get the final adjusted color correction. Specifically, the following equations may be used to implement saturation adjustments to the pixels:
Cb=128+(satadjust*(Cb−128)); and
Cr=128+(satadjust*(Cr−128))
Although it is possible to use other s values beyond the range of 0 and 2, such values tend to have little utility, given that the extremes of 0 to 2 represent the mainstream uses of color, where 0 represents no saturation, and 2 represents double the normal saturation for any given color.
Steps 1210 through 1250 are then repeated until the last pixel from the designated image(s) has been remapped, as determined at 1260.
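The per-pixel remapping of steps 1210 through 1250 might be restated as the following sketch, assuming three 256-entry tables (yLUT, signedCbLUT, and signedCrLUT) have already been recomputed, that the phase shift h is expressed in radians, and that satadjust lies in the 0 to 2 range; this is an illustrative restatement of the equations above rather than the patented implementation itself.

```python
import math

def remap_pixel(y, cb, cr, yLUT, signedCbLUT, signedCrLUT, h=0.0, satadjust=1.0):
    """Apply the luminance LUT, relative chrominance LUTs, hue phase shift, and saturation scaling."""
    y_out = yLUT[y]                               # step 1220: luminance remap
    cb_out = cb + signedCbLUT[y]                  # step 1230: relative chrominance offsets,
    cr_out = cr + signedCrLUT[y]                  #            indexed by the original luminance
    if h != 0.0:                                  # step 1240: optional hue phase shift,
        cb_rot = cb_out * math.cos(h) - cr_out * math.sin(h)   # a rotation in the Cb/Cr plane
        cr_rot = cr_out * math.cos(h) + cb_out * math.sin(h)
        cb_out, cr_out = cb_rot, cr_rot
    cb_out = 128 + satadjust * (cb_out - 128)     # step 1250: saturation adjustment about
    cr_out = 128 + satadjust * (cr_out - 128)     #            the unsaturated center value 128
    return y_out, cb_out, cr_out

# Identity tables leave the pixel unchanged when h=0 and satadjust=1.
identity = list(range(256))
zeros = [0] * 256
print(remap_pixel(120, 100, 140, identity, zeros, zeros))   # (120, 100, 140)
```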
Luminance Correction: LUT Recomputations
As set forth in the flow diagram of
The changes in the luminance control values W, M, and B adjust the input/output graph as set forth in
a) to 14(g) illustrate more specific examples that result from movement of individual variables in accordance with the previously described user interface. Specifically:
Although there are other techniques that would yield similar curves using different mathematical approaches, in one embodiment, a Bezier function is utilized. Accordingly,
In one embodiment, the curve for the M control is calculated using Bezier control points k0 1410, k1 1460, k2 1480, and k3 1440. The endpoints k0 1410 and k3 1440 for the Bezier curve are calculated the same way as they would be for a linear case, as shown in
First, a calculation is made for the midpoint between k0 1410 and k3 1440. Specifically:

MidPt.h=(k0.h+k3.h)/2
A similar expression may likewise be employed for MidPt.v. Next, a choice is made designating either the upper left corner or the lower right corner of the bounding box formed by k0 1410 and k3 1440 as Corner 1450, depending on the direction M is being pushed (the upper left corner when M is pushed above 100, and the lower right corner when M is pushed below 100).
Next, a calculation is made as to how far out to go on the line between MidPt 1430 and Corner 1450. Specifically, CtlPt 1470 is interpolated by using a weighting derived from weight=abs((mids−100)/100.0), where the larger the M (“mids”) value, the more influence that Corner 1450 will have. Conversely, the smaller the value, the less influence Corner 1450 will have, e.g.:
CtlPt.h=weight*Corner.h+(1.0−weight)*MidPt.h
A similar expression may be employed to obtain CtlPt.v.
Next, a calculation is made to determine k1 1460, by interpolating between k0 1410 and CtlPt 1470, again using the above-described weighting factor:

k1.h=weight*CtlPt.h+(1.0−weight)*k0.h
A similar expression may be employed to obtain k1.v.
Likewise, a calculation is made to determine k2 1480:

k2.h=weight*CtlPt.h+(1.0−weight)*k3.h
A similar expression may be employed to obtain k2.v.
g) illustrates where these points would lie relative to k0 1410 and k3 1440 for M>100. As M (and therefore the above described “weight”) is changed, CtlPt will be interpolated to different locations along the line between Corner 1450 and MidPt 1430. Furthermore, k1 1460 and k2 1480 will be interpolated to different positions on the lines between k0 1410 and CtlPt 1470 and between k3 1440 and CtlPt 1470. The arrows at k1 1460, CtlPt 1470, and k2 1480 suggest how these points move as the mids (M) luminance slider is adjusted (in the range 100-200).
Note that if M<100, then the lower right corner of the bounding box formed by (k0, k3) would be used as Corner, and the curve would then inflect in the other direction, but otherwise, the relative positioning of the points (based on “weight”) is the same. If, however, M was set to 100, the graph would instead be a straight line as shown in
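Under the assumptions that M ranges from 0 to 200 with 100 as neutral and that the endpoints k0 and k3 have already been determined for the linear case, the mids curve and the resulting luminance look-up table could be computed roughly as in the following sketch; the inner control-point formulas mirror the CtlPt.h expression above and are a reconstruction rather than verbatim source material.

```python
def bezier_point(k0, k1, k2, k3, t):
    """Evaluate a cubic Bezier curve at parameter t in 0..1."""
    s = 1.0 - t
    return tuple(s**3 * a + 3 * s**2 * t * b + 3 * s * t**2 * c + t**3 * d
                 for a, b, c, d in zip(k0, k1, k2, k3))

def mids_curve_points(k0, k3, mids):
    """Derive the two inner Bezier control points from the mids (M) control value."""
    mid_pt = ((k0[0] + k3[0]) / 2.0, (k0[1] + k3[1]) / 2.0)
    # Upper-left corner of the (k0, k3) bounding box for M > 100, lower-right for M < 100.
    corner = (k0[0], k3[1]) if mids > 100 else (k3[0], k0[1])
    weight = abs((mids - 100) / 100.0)
    ctl_pt = tuple(weight * c + (1.0 - weight) * m for c, m in zip(corner, mid_pt))
    k1 = tuple(weight * c + (1.0 - weight) * e for c, e in zip(ctl_pt, k0))
    k2 = tuple(weight * c + (1.0 - weight) * e for c, e in zip(ctl_pt, k3))
    return k1, k2

def build_y_lut(k0=(16, 16), k3=(235, 235), mids=150, samples=1024):
    """Fill a 256-entry luminance LUT by densely sampling the Bezier curve."""
    k1, k2 = mids_curve_points(k0, k3, mids)
    lut = list(range(256))                        # identity outside the curve's span
    for i in range(samples + 1):
        x, y = bezier_point(k0, k1, k2, k3, i / samples)
        lut[min(255, max(0, round(x)))] = min(255, max(0, round(y)))
    return lut

lut = build_y_lut(mids=150)
print(lut[16], lut[128], lut[235])                # endpoints stay near 16/235; midtones are lifted
```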
Chrominance (U and V) Correction: LUT Recomputations
As discussed with reference to
When a user makes adjustments to a color adjustment pad, the user's input may be received as an angle and magnitude defined vector. The angle and magnitude defined vector is then translated into a relative U,V vector for that luminance plane. For example, referring to
To convert an angle-and-magnitude vector push defined by magnitude m and angle α, the Cartesian vector components U and V can be found according to the generalized equations:

U=m*cos(α); and

V=m*sin(α)
Referring back to
Referring back to
where UM and VM represent the user's directly specified offset for the mids luminance plane.
Accordingly, to generate the needed Cb look-up table (CbLUT) of
for i<50% White (CCIR luminance value of 128) then
CbLUT[i]=UB*(1.0−gMidRamp[i])+U′M*(gMidRamp[i])
for i≧50% White (CCIR luminance value of 128) then
CbLUT[i]=UW*(1.0−gMidRamp[i])+U′M*(gMidRamp[i])
Thus,
In the modified cylindrical colorspace representation of
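The CbLUT equations above could be realized roughly as in the following sketch, where UB, UW, and U′M are the Cb offsets for the blacks, whites, and mids luminance planes, and gMidRamp is a 256-entry table assumed to rise from 0 at the blacks and whites planes to 1 at the mids plane; the triangular ramp shown here is an illustrative placeholder for the actual curve, and an analogous table would be built for CrLUT from VB, VW, and V′M.

```python
def build_signed_cb_lut(UB, UW, UM_prime, g_mid_ramp):
    """Blend the blacks/whites and mids Cb offsets per luminance level, per the equations above."""
    lut = []
    for i in range(256):
        base = UB if i < 128 else UW              # below / at-or-above 50% white (CCIR value 128)
        lut.append(base * (1.0 - g_mid_ramp[i]) + UM_prime * g_mid_ramp[i])
    return lut

# Illustrative ramp: 0 at the blacks and whites planes, 1 at the mids plane.
g_mid_ramp = [1.0 - abs(i - 128) / 128.0 for i in range(256)]

cb_lut = build_signed_cb_lut(UB=-4.0, UW=6.0, UM_prime=2.0, g_mid_ramp=g_mid_ramp)
print(cb_lut[0], cb_lut[128], cb_lut[255])        # blacks offset, mids offset, ~whites offset
```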
Keying to Limit Effects
In one embodiment, keying may be employed to limit the effect of color correction.
The foregoing has described a method and user interface for performing color correction. It is contemplated that changes and modifications may be made by one of ordinary skill in the art, to the materials and arrangements of elements of the present invention without departing from the scope of the invention.
This application is a divisional application of U.S. patent application Ser. No. 10/005,383, filed Dec. 3, 2001, now U.S. Pat. No. 7,215,813 entitled “Method and Apparatus for Color Correction.” This application is incorporated herein by reference.