Liquid crystal display with adaptive color

Information

  • Patent Grant
  • Patent Number
    7,623,105
  • Date Filed
    Friday, November 19, 2004
  • Date Issued
    Tuesday, November 24, 2009
Abstract
A system for modifying images to be shown on displays that have display characteristics dependent on the angle at which a displayed image is viewed. An image may be modified by detecting the position of a viewer relative to a display and, based on the detected position, correcting the image.
Description
BACKGROUND OF THE INVENTION

This application relates to displays with adaptive color.


Liquid crystal displays tend to exhibit a color dependency based upon the viewing angle between the viewer and the display. Liquid crystal displays are typically designed to exhibit the desired colors when viewed at a normal viewing angle, i.e., directly in front of the display. When a viewer views an image on a display at a significant off-normal viewing angle, the colors tend to shift from those observed at a normal viewing direction, the contrast of the image tends to reverse, and the gamma characteristics degrade.


The primary techniques employed to improve non-normal viewing angle characteristics may be categorized into two major classes. The first class includes techniques that focus on physical modification of the liquid crystal display and modification of the processes used to manufacture it. The second class includes techniques that pre-process the image in a particular manner such that the signals provided to the liquid crystal display are modified so that, when the image is viewed from an off-axis angle, it is shifted toward what would be expected at a normal viewing angle. The pre-processing may include modifying the pixel image data or modifying the display driver data.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates a display and a viewer.



FIG. 2 illustrates an image modification system.



FIG. 3 illustrates color and white point shifts.



FIG. 4 illustrates a display and viewer position.



FIG. 5 illustrates a display with multiple regions.



FIG. 6 illustrates a viewer and a camera lens.



FIG. 7 illustrates image modification.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

The present inventors considered existing image processing techniques and determined that the results of such techniques could be significantly improved if the system had information regarding the actual location of the viewer in relation to the display. Referring to FIG. 1, the location information should indicate the angular relationship between the direction from which the viewer views the image on the display and the normal to the surface of the display. The angular relationship may be determined by processing data from a camera or other imaging device associated with the display. The image data is then modified in accordance with the angular relationship information.


Referring to FIG. 2, another embodiment includes a liquid crystal display 30 that presents an image thereon. A viewer 32 views the image on the display 30, and an imaging device 34 affixed to the display 30 captures an image or video that includes the viewer 32. Typically, the angular relationship between the imaging device and the display is known. The captured image or video 36 obtained from the imaging device 34 is provided to a detection and/or tracking module 38. The module 38 detects the facial region of the viewer 32, detects the eyes of the viewer 32, and/or obtains gaze information indicating where the viewer 32 is looking. Based upon the information from the module 38, the viewing angle between the viewer 32 and a portion of the display 30 is determined at module 40. Based upon the viewing angle of the viewer 32, compensation parameters are determined at module 42 to adjust the color of the image to be viewed by the viewer 32. An image is normally formatted for the display in the form of pixel data from a source 46, such as a computer, a television signal, a digital video disc, or a video cassette recorder. The pixel data from the source 46 is modified by a color correction module 44, which also receives the compensation parameters from module 42, in order to modify the image data to be observed by the viewer 32. The image data is modified so that, on average, the viewer observes an image that is closer to presenting the colors that would be viewed at a normal viewing angle than what would otherwise have been viewed at the off-axis viewing angle.
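
By way of illustration, the per-frame data flow of FIG. 2 can be sketched in a few functions. The following Python sketch is one plausible arrangement of modules 38, 40, 42 and 44; every function name, camera parameter, and gamma value in it is an assumption made for illustration, not a detail taken from the patent.

```python
import numpy as np

def detect_eye_midpoint(frame):
    """Module 38 stand-in: (row, col) midpoint between the viewer's eyes in
    the captured frame, or None if no face is found. A real system would run
    a face/eye detector; this stub returns a fixed point for illustration."""
    return (200, 410)

def viewing_angle_deg(eye_rc, image_center=(240, 320), focal_px=1000.0):
    """Module 40: angle between the camera's optical axis and the viewing
    direction, theta = arctan(d / f), with d and f both in pixel units."""
    d = np.hypot(eye_rc[0] - image_center[0], eye_rc[1] - image_center[1])
    return np.degrees(np.arctan(d / focal_px))

def compensation_exponent(angle_deg):
    """Module 42: interpolate a measured gamma-versus-angle curve (numbers
    loosely modeled on FIG. 3A) and return the exponent that restores an
    effective gamma of 2.2 at the given angle."""
    angles = [0.0, 15.0, 30.0, 45.0, 60.0]
    gammas = [2.2, 2.0, 1.6, 1.2, 0.82]
    return 2.2 / np.interp(angle_deg, angles, gammas)

def color_correct(rgb, frame):
    """Module 44: modify normalized [0, 1] pixel data for the detected viewer."""
    eye = detect_eye_midpoint(frame)
    if eye is None:
        return rgb  # no viewer detected: leave the image unmodified
    return rgb ** compensation_exponent(viewing_angle_deg(eye))
```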


FIG. 3 illustrates how changes in the viewing angle result in tone-scale variations. In FIG. 3, a single gamma parameter is used to approximate the resulting variations for a given viewing angle. FIG. 3A illustrates measured tone-scale variations and computed gamma values at five different viewing angles, ranging from 0 degrees (normal to the display) to 60 degrees off normal. It may be observed that the viewing-angle-induced tone-scale variations in the image are dramatic. Specifically, the gamma of the displayed image is subject to dramatic change (from 2.2 to 0.82 in FIG. 3A) when the viewing angle changes from 0 degrees to 60 degrees. In fact, as the viewing angle becomes larger, a single gamma characteristic is not especially suitable to characterize the color distortion of the viewed image, and thus a multi-segment gamma or a look-up table should be used. Referring to FIG. 3B, it was also determined that the white point tends to shift toward yellow as the viewing angle increases. Based upon these observations, the image processing technique, and hence the modification of the pixel values, may take into account the effects resulting from tone-scale variations, white-point shifts, or both. In some cases, a single value may be used to characterize the entire curve; otherwise, multiple values may be used.
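
The trade-off between a single gamma and a look-up table can be made concrete with a small experiment. In the sketch below, the measured tone curve is an invented stand-in (the real curves would come from display measurements such as FIG. 3A); a single exponent is fitted in log space, and the full curve is kept as a table when the fit is poor.

```python
import numpy as np

# Invented stand-in for a tone curve measured at a large viewing angle
# (normalized digital count x -> normalized luminance); illustrative only.
x = np.linspace(0.05, 1.0, 20)
y_measured = x ** 0.82 * (1.0 - 0.1 * np.sin(np.pi * x))  # not a pure power law

# Single-gamma approximation: fit y = x**g by linear regression in log space.
g = np.polyfit(np.log(x), np.log(y_measured), 1)[0]
max_err = np.max(np.abs(x ** g - y_measured))
print(f"best-fit single gamma = {g:.2f}, max residual = {max_err:.3f}")

# When the residual is too large (typical at large angles), keep the whole
# measured curve as a look-up table instead of a single exponent.
lut = np.interp(np.linspace(0.0, 1.0, 256), x, y_measured)
```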


Preferably, the imaging device is mounted to the display with a fixed or known orientation, with the imaging device constantly capturing the environment in front of the display, including any viewer(s). Although the projection of a physical point onto the image may vary if the camera moves, since the camera is maintained in a fixed orientation with respect to the display, every physical point in the field of view of the camera should correspond to a fixed location in the image at any given time. Therefore, the system may detect in the acquired image the locations of the viewer's eyes, and then effectively estimate the viewing angle given the characteristics of the image capture device. Such characteristics include the focal length, image resolution, and pixel size.


The system may determine the viewer's gaze direction in order to more accurately determine the angle between the eye and the portion of the display being viewed. In some cases, the viewer may be viewing the left side of the display, which may present a different viewing angle than the right-hand side of the display. In such a case, the pixel data may be modified in relation to the region of the display being viewed.


The system may determine the location of a portion of the viewer, such as the viewer's eyes, and determine the viewing angle with respect to one or more regions of the display. The regions of the display may be any suitable portion of the display, multiple portions of the display, or otherwise the central portion of the display.


The system may detect the facial region of the viewer and use the facial region of the viewer to determine a viewing angle with respect to a portion of the display. The portion of the display may be any suitable portion(s) of the display, or otherwise the central portion of the display. In some cases, the system may detect the facial region and then estimate the portion of the image where the eye(s) should be located, and then use the eye region.


The system may use any suitable imaging mechanism, such as a gaze detector, an eye detector, a face detector, or an eye estimator. Referring to FIG. 4, when the dimensions of a display are large compared with the distance between the viewer and the display (which is typically the case for a computer monitor), a single viewing angle does not tend to accurately characterize the viewer's viewing direction for all areas of the display. This is primarily because different regions of the display have significantly different viewing angles even when the viewer's position is fixed. Thus, the top part of the display, the central part of the display, the right-hand part of the display, the left-hand part of the display, and the bottom part of the display correspond to different viewing angles. In addition, there are changes due to viewing angle variations in diagonal directions.


On the other hand, when the dimensions of a display are relatively small compared with the distance between the viewer and the display (which is typically the case when a liquid crystal display is used as a television, or for a handheld device that includes a small liquid crystal display), a single viewing angle may be adequate to characterize the viewing direction for all regions of the display, since the viewing angle varies only slightly for different areas of the display.


In the case that the viewer is sufficiently far from the display in relation to its size, a suitable measure for the viewing angle is from the face or eyes to the center of the display. In contrast, if the display is not sufficiently far from the viewer in relation to its size, then a plurality of different measures of the viewing angle should be used. For example, nine different gammas for nine different sub-regions may be used, as illustrated in FIG. 5. The gamma curves may be dynamically determined based upon the viewer's current location with respect to the display. In addition, the different regions of the display may be computed based upon an estimation of the normal size of the face, and hence the spacing of the eyes, or otherwise based upon a determination of the location of the eyes. In this manner, the distances from the right eye and/or left eye may be taken into account to determine the viewing angles.
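
A rough sketch of the multi-region geometry: given an assumed 3-D eye position, each cell of a 3×3 grid gets its own viewing angle (and hence its own gamma curve). The display dimensions and coordinate conventions below are assumptions for illustration.

```python
import numpy as np

def region_viewing_angles(eye_xyz, display_w=0.50, display_h=0.30, grid=(3, 3)):
    """Viewing angle (degrees from the display normal) for each sub-region.
    The display is assumed to lie in the z = 0 plane, centered at the origin,
    with its normal along +z; eye_xyz is the eye position in meters."""
    rows, cols = grid
    cy = (np.arange(rows) + 0.5) / rows * display_h - display_h / 2
    cx = (np.arange(cols) + 0.5) / cols * display_w - display_w / 2
    angles = np.empty(grid)
    for i, y in enumerate(cy):
        for j, x in enumerate(cx):
            v = np.array(eye_xyz) - np.array([x, y, 0.0])  # region center -> eye
            angles[i, j] = np.degrees(np.arccos(v[2] / np.linalg.norm(v)))
    return angles

# A viewer 0.5 m away and slightly left of center: nine distinct angles.
print(region_viewing_angles((-0.10, 0.0, 0.5)))
```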


While the color correction modification may be based upon a single viewer, in some cases multiple viewers will be viewing the same display. Multiple viewers may be accommodated in several ways, for example one or more of the following (a selection policy is sketched in code after this list):


First, if more than two viewers are detected, the color correction may be done according to the primary viewer's viewing angle where the primary viewer may be determined, for example, by the largest size of the detected face/eyes. This determination is based upon the size normally corresponding to the distance of the viewer from the display.


Second, if two viewers are detected and they are approximately symmetric with respect to the center of the display, then the viewing angle may be set to that of either of the viewers and the other viewer will perceive the same compensation due to the symmetry in the viewing angle characteristics of the display.


Third, if two viewers are detected and they are not approximately symmetric with respect to the center of the display, then the viewing angle may be set to a statistical measure of the two viewing angles so that the two viewers will view similar quality images. Averaging is the preferred statistical measure since from FIG. 3 one may observe that, for example, the gamma changes monotonically with the viewing angle.


Techniques two and three assume that the two viewers are at about the same distance from the display and that both are relatively far from the display; otherwise, it is preferable to use the first technique. In addition, the system may include a mode that permits the color correction module to be disabled under some circumstances, such as when more than two viewers are detected but no primary viewer can be determined, so that the display may use its default setting.
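
A minimal sketch of a selection policy combining the three rules; the input format, the symmetry tolerance, and the 1.2 size margin for declaring a primary viewer are illustrative assumptions, not values from the patent.

```python
def choose_viewing_angle(viewers, symmetric_tol_deg=5.0):
    """Pick one compensation angle from the detected viewers, or None to
    disable correction. Each viewer is a (signed_angle_deg, face_size_px)
    pair, with angles signed about the display center."""
    if not viewers:
        return None
    if len(viewers) == 1:
        return viewers[0][0]
    if len(viewers) == 2:
        a, b = viewers[0][0], viewers[1][0]
        if abs(a + b) <= symmetric_tol_deg:
            return a                    # rule 2: symmetric pair, either works
        return (a + b) / 2.0            # rule 3: statistical measure (average)
    # Rule 1: more than two viewers -> primary viewer = largest face,
    # which normally corresponds to the viewer closest to the display.
    sizes = sorted(size for _, size in viewers)
    if sizes[-1] > 1.2 * sizes[-2]:     # illustrative margin for a clear primary
        return max(viewers, key=lambda v: v[1])[0]
    return None                         # no primary viewer: use default setting
```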


In some embodiments, the system may use multiple cameras, such as a pair of stereo cameras, so that the distance between the display and the viewer may be readily calculated. For example, when the display size is relatively large in comparison with the distance between a viewer and the display, the viewing angle with respect to each sub-region of the display can be accurately determined based on the viewer-to-display distance computed from the images of the pair of stereo cameras.
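
With two cameras, the distance follows from ordinary stereo triangulation, Z = f·B/disparity. A sketch with an assumed baseline and focal length (in pixel units), neither of which is specified in the patent:

```python
def stereo_distance_m(col_left_px, col_right_px, baseline_m=0.10, focal_px=1000.0):
    """Viewer-to-display distance from the horizontal disparity of the eye
    midpoint between two rectified camera images: Z = f * B / disparity."""
    disparity = abs(col_left_px - col_right_px)
    if disparity == 0:
        return float("inf")  # no disparity: viewer effectively at infinity
    return focal_px * baseline_m / disparity

# Eye midpoint at column 420 in the left image, 380 in the right image:
print(f"distance = {stereo_distance_m(420, 380):.2f} m")  # -> 2.50 m
```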


Any suitable face, eye, or gaze detection technique (or otherwise) may be used. For simplicity, it may be assumed that the camera is located in the center of an LCD display (in practice, since the camera cannot be placed in the center of a display, one may compensate for the camera position when computing the viewing angle from the captured image or video) with a simple pin-hole camera model (which may characterize most inexpensive consumer cameras). FIG. 6 illustrates the relationship between a face image and the angle between the optical axis of the camera lens and the viewer's viewing direction (this is also the viewing angle with respect to the center of the display under the above assumption of the camera location). When the camera is located elsewhere, after the angle is computed, the system can determine the actual viewing angle with respect to the display.


For simplicity, it may again be assumed that the camera is located in the center of the LCD display. As illustrated in FIG. 6, since the face image is located on the right-hand side of the image plane, it is determined that the user's face (and thus the eyes) is located on the left-hand side of the scene. In this particular example, since it was assumed that the camera is located in the center of the LCD, the system may determine that the viewer is watching the LCD from the left-hand side with a viewing angle (with respect to the center of the display)

θ = arctan(d/f)

where f is the focal length and d is the distance between the image center and the center of the eyes (d may be computed from the image resolution and pixel size of the camera). Notice that although FIG. 6 illustrates the geometry for the horizontal viewing angle, the same technique likewise applies to the vertical viewing angle.
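
The same computation in code, with d recovered from pixel coordinates via the pixel pitch; the resolution, pixel size, and focal length here are illustrative assumptions:

```python
import math

def viewing_angle_deg(eye_px, width_px=640, height_px=480,
                      pixel_size_mm=0.003, focal_mm=4.0):
    """theta = arctan(d / f): d is the on-sensor distance between the image
    center and the detected eye midpoint, converted from pixels to mm."""
    cx, cy = width_px / 2.0, height_px / 2.0
    d_mm = math.hypot(eye_px[0] - cx, eye_px[1] - cy) * pixel_size_mm
    return math.degrees(math.atan2(d_mm, focal_mm))

# Eyes detected 150 px to the right of the image center:
print(f"{viewing_angle_deg((470, 240)):.1f} degrees")  # -> ~6.4 degrees
```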


With the viewing angle estimated, and with the measured LCD viewing-angle characteristics (such as those shown in FIG. 3), color correction is achieved by preprocessing the pixel data before it is displayed.


With a look-up table computed based on the measured tone-scale variations, color correction can be performed by transforming each pixel of an image using the table. In a simple case, gamma compensation is done to bring the gamma of the viewed image at a certain angle to the normal range (e.g., ~2.2). For example, in the case of FIG. 3(a), when the system detects that the user is viewing the LCD from a 45° angle, the pixel data is compensated by a gamma of 1.83 (=2.2/1.2) so that the ultimate image exhibits a normal gamma of 2.2 to the viewer at that angle.
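
One way to realize this, assuming 8-bit pixel data and the gamma values quoted above, is to precompute the corrective exponent into a 256-entry table:

```python
import numpy as np

def gamma_compensation_lut(gamma_at_angle, gamma_target=2.2, levels=256):
    """Look-up table raising normalized counts to gamma_target/gamma_at_angle,
    so the display's effective response matches gamma_target at that angle."""
    x = np.arange(levels) / (levels - 1)
    y = x ** (gamma_target / gamma_at_angle)
    return np.round(y * (levels - 1)).astype(np.uint8)

lut = gamma_compensation_lut(1.2)  # the 45-degree case of FIG. 3(a)
# corrected = lut[image8]         # applied per pixel, per channel
```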


As discussed earlier, instead of using a single gamma, the system may use the measured curves in FIG. 3(a) directly in compensating the tone-scale variation. FIG. 7 illustrates the process for tone-scale correction using the measured tone curves. The input digital counts are converted to output luminance using the display tone curve at the normal viewing condition (0 degrees), as shown in the upper-left subplot. Since the overall luminance tends to fall off as the viewing angle increases, the maximum input luminance (0 degrees) is normalized to the maximum output luminance at the target viewing angle (x). The same normalization should be done for the minimum luminance to ensure the target luminance is within the output range of the display at the target angle. The upper-right plot shows the luminance normalization. The lower-left curve shows the tone response of the display at the target angle. The desired digital counts may be looked up from the three curves.


The arrows in FIG. 7 show how the digital counts (DC1 and DC2) are corrected via three look-up tables so that they can be displayed correctly at the target viewing angle (x). In some implementations, these three tables can be collapsed into a single table to reduce the computation.
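
A sketch of that collapse, with invented power-law curves standing in for the measured tone curves of FIG. 7:

```python
import numpy as np

def tone_correction_lut(curve_0, curve_x, levels=256):
    """Collapse the three lookups of FIG. 7 into a single table. curve_0 and
    curve_x map normalized digital counts to luminance at 0 degrees and at
    the target angle x; both must be monotonically increasing."""
    dc = np.arange(levels) / (levels - 1)
    lum = curve_0(dc)                                   # 0-degree tone curve
    # Normalize the 0-degree luminance range into the (smaller) range the
    # display can actually produce at the target angle.
    lo, hi = curve_x(0.0), curve_x(1.0)
    lum = lo + (lum - curve_0(0.0)) / (curve_0(1.0) - curve_0(0.0)) * (hi - lo)
    # Invert the target-angle curve to obtain the corrected digital counts.
    grid = np.linspace(0.0, 1.0, 4096)
    dc_corrected = np.interp(lum, curve_x(grid), grid)
    return np.round(dc_corrected * (levels - 1)).astype(np.uint8)

lut = tone_correction_lut(lambda v: v ** 2.2,              # 0-degree response
                          lambda v: 0.02 + 0.5 * v ** 1.2) # dimmer at angle x
```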


Another type of correction is the compensation of the white-point shift caused by the shift in the primaries (or color shift in general). The color primaries are measured as shown in FIG. 3(b) for both the preferred viewing angle (0 degrees) and the currently detected viewing angle (x). Two color conversion matrices can be derived to convert the RGB signal to normalized XYZ:







$$
\begin{bmatrix} \bar{X}_0 \\ \bar{Y}_0 \\ \bar{Z}_0 \end{bmatrix}
= \frac{1}{Y_{r0} + Y_{g0} + Y_{b0}}
\begin{bmatrix} X_{r0} & X_{g0} & X_{b0} \\ Y_{r0} & Y_{g0} & Y_{b0} \\ Z_{r0} & Z_{g0} & Z_{b0} \end{bmatrix}
\begin{bmatrix} R_0 \\ G_0 \\ B_0 \end{bmatrix}
$$

$$
\begin{bmatrix} \bar{X}_x \\ \bar{Y}_x \\ \bar{Z}_x \end{bmatrix}
= \frac{1}{Y_{rx} + Y_{gx} + Y_{bx}}
\begin{bmatrix} X_{rx} & X_{gx} & X_{bx} \\ Y_{rx} & Y_{gx} & Y_{bx} \\ Z_{rx} & Z_{gx} & Z_{bx} \end{bmatrix}
\begin{bmatrix} R_x \\ G_x \\ B_x \end{bmatrix}
$$









In order to have the same color at other viewing angles, the normalized XYZ values should be equal, resulting in a single 3×3 matrix to convert the RGB value of the preferred viewing angle (0) to the detected viewing angle (x) as:







$$
\begin{bmatrix} R_x \\ G_x \\ B_x \end{bmatrix}
= \frac{Y_{rx} + Y_{gx} + Y_{bx}}{Y_{r0} + Y_{g0} + Y_{b0}}
\begin{bmatrix} X_{rx} & X_{gx} & X_{bx} \\ Y_{rx} & Y_{gx} & Y_{bx} \\ Z_{rx} & Z_{gx} & Z_{bx} \end{bmatrix}^{-1}
\begin{bmatrix} X_{r0} & X_{g0} & X_{b0} \\ Y_{r0} & Y_{g0} & Y_{b0} \\ Z_{r0} & Z_{g0} & Z_{b0} \end{bmatrix}
\begin{bmatrix} R_0 \\ G_0 \\ B_0 \end{bmatrix}
$$








This conversion may be done in the gamma-corrected RGB domain. The color shift correction may be combined with the tone-scale correction as shown in FIG. 7. The 3×3 correction may be applied either before or after the normalization. In some cases, multiple imaging devices may be used.
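
The combined conversion can be precomputed once per detected angle. In the sketch below the primary matrices are invented placeholders for the measured values of FIG. 3(b); columns hold the XYZ coordinates of the R, G and B primaries.

```python
import numpy as np

def color_shift_matrix(M0, Mx):
    """Single 3x3 matrix of the equation above:
    RGB_x = s * inv(Mx) @ M0 @ RGB_0, with s the ratio of the summed Y rows,
    (Yrx + Ygx + Ybx) / (Yr0 + Yg0 + Yb0), preserving the normalization."""
    s = Mx[1].sum() / M0[1].sum()
    return s * np.linalg.inv(Mx) @ M0

# Placeholder primaries: at the off-axis angle the blue primary is assumed
# slightly weakened, consistent with the yellow white-point shift of FIG. 3B.
M0 = np.array([[0.41, 0.36, 0.18],
               [0.21, 0.72, 0.07],
               [0.02, 0.12, 0.95]])
Mx = np.array([[0.40, 0.36, 0.17],
               [0.22, 0.70, 0.07],
               [0.03, 0.12, 0.80]])
rgb_corrected = color_shift_matrix(M0, Mx) @ np.array([0.5, 0.5, 0.5])
```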


In some cases, multiple displays may be used, and thus the viewing angle constantly changes for each display as the viewer looks back and forth between the individual monitors.


In yet another embodiment, ambient light characteristics can be estimated from the image/video captured by the camera, so that color correction can be performed for ambient light in addition to the viewing angle.

Claims
  • 1. A method of modifying an image to be displayed on a display that has viewing angle dependent image characteristics comprising: (a) receiving said image; (b) determining automatically, without user interaction, the respective locations of a viewer, each relative to one of an automatically determined number of regions into which said display is subdivided, each said location including both a distance from the respectively associated said region of said display and an angle of incidence with respect to the respectively associated said regions of said display, where automated determination of said number of regions is based on criteria selected as a function of the determined said distance relative to the size of said display; (c) modifying said image based upon said respective locations in such a manner that at least one of the gamma and the white point viewed by said viewer are on average shifted toward at least one of the gamma and the white point said viewer would observe at a normal viewing angle of said display.
  • 2. The method of claim 1 wherein said modification is based upon said gamma.
  • 3. The method of claim 1 wherein said modification is based upon said white point.
  • 4. The method of claim 1 wherein said location of said viewer is an angular relationship between said display and the location of said viewer.
  • 5. The method of claim 1 wherein said display is a liquid crystal display.
  • 6. The method of claim 1 wherein said determining said location is based upon an image received by an imaging device.
  • 7. The method of claim 6 wherein said imaging device is a camera.
  • 8. The method of claim 6 wherein the angular relationship between said imaging device and said display is known.
  • 9. The method of claim 8 wherein said angular relationship is normal.
  • 10. The method of claim 6 wherein said location is based upon face detection.
  • 11. The method of claim 6 wherein said location is based upon eye detection.
  • 12. The method of claim 6 wherein said location is based upon gaze detection.
  • 13. The method of claim 6 wherein said imaging device and said display do not freely move relative to one another.
  • 14. The method of claim 1 wherein said modifying is based upon a single parameter.
  • 15. The method of claim 1 wherein said modifying is based upon a plurality of parameters.
  • 16. The method of claim 1 wherein said modifying is different for different pixels of said image.
  • 17. The method of claim 1 wherein said modifying is different for different regions of said display.
  • 18. The method of claim 1 wherein said modifying is based upon sensing a plurality of viewers.
  • 19. The method of claim 1 wherein said modifying is based upon the anticipated distance between said viewer and said display.
  • 20. The method of claim 1 wherein said determining is based upon multiple imaging devices.
Parent Case Info

This application claims the benefit of U.S. Patent Application Ser. No. 60/524,321 filed Nov. 21, 2003 entitled METHOD AND SYSTEM FOR ADAPTIVE DISPLAY COLOR CORRECTION BASED ON AUTOMATIC VIEWING ANGLE ESTIMATION IN REAL TIME.

Related Publications (1)
Number Date Country
20050117186 A1 Jun 2005 US
Provisional Applications (1)
Number Date Country
60524321 Nov 2003 US