The present disclosure relates generally to the field of digital image processing and, more specifically, to the field of high dynamic range (HDR) image processing.
Real-world scenes often exhibit a wider range of brightness than most camera systems can capture at a single exposure level. A captured image, e.g., from a digital camera, may include both very bright regions and very dark regions. Ideally, a photograph of a scene with a large brightness range would preserve the details in both the bright regions and the dark regions. However, due to the dynamic range limits of the software and hardware of the imaging device, e.g., a digital still camera, an image capturing a high contrast scene is usually incapable of preserving the high dynamic range (the ratio between dark and bright regions) of the scene as it originally appears. Typically, at a single exposure setting, the very bright regions tend to become saturated and the very dark regions tend to become under-exposed, and thus can hardly be reproduced in one captured frame without distortion.
One technique used to obtain an image with a high dynamic range is to capture multiple still images of the same resolution at different exposure levels and then combine the images into a single output image having increased dynamic range. Another method is the simultaneous capture of multiple images having different exposure levels, which are subsequently combined into a single output image having increased dynamic range. This capture process can be achieved through the use of multiple imaging paths and sensors.
A recently developed technology, e.g., the so-called “Always-on High Dynamic Range (AOHDR),” takes a different approach in which the image sensor of an imaging device is programmed to capture a short exposure image and a long exposure image simultaneously at different pixel locations. The short and long exposure images are interleaved in the raw mosaiced output of the sensor, typically in a Bayer array pattern resulting from the use of a Bayer filter on the image sensor.
The process of deinterleaving the interleaved Bayer array can be complicated. Taking
Therefore, it would be advantageous to provide a mechanism to generate high dynamic range images from interleaved source image data with high spatial resolution and reduced sampling artifacts. Accordingly, embodiments of the present disclosure employ a computer implemented method of demosaicing the source image data into components of an intermediate color space (e.g., YUV) before deinterleaving (and interpolation). In this way, the source image can be sampled at a significantly higher frequency, which advantageously leads to a substantial improvement in image quality.
More specifically, the source image data may be arranged in a Bayer array comprising alternate row pairs of a long exposure image and a short exposure image, and the intermediate color space may be a YUV color space. The intensity (Y) component and the chroma (UV) components can be derived (extracted) from the Bayer array data through demosaic convolution processes. A respective convolution is performed between a convolution kernel and a set of adjacent pixels of the Bayer array that are in the same color channel. A convolution kernel is selected based on the mosaic pattern of the Bayer array and the color channels of the set of adjacent pixels. The intensity component and the chroma components can then be deinterleaved and interpolated into short exposure and long exposure frames in the intermediate color space. The short exposure and long exposure frames are then blended to create the high dynamic range image, which may be converted back to an RGB frame representing a high dynamic range image.
In accordance with an embodiment of the present disclosure, a method of processing digital image data comprises: (1) accessing first image data representing a captured image that comprises a plurality of pixels represented by a first color space and arranged in a Bayer array, wherein the plurality of pixels comprise: first exposure pixels associated with a first exposure duration of the captured image; and second exposure pixels associated with a second exposure duration of the captured image, and wherein the first exposure pixels and the second exposure pixels are interleaved in the Bayer array; (2) determining first luminance data and first chrominance data from the first exposure pixels of the first image data, wherein the first luminance data and the first chrominance data are represented by a second color space; (3) determining second luminance data and second chrominance data from the second exposure pixels of the first image data, wherein the second luminance data and the second chrominance data are represented by the second color space; (4) interpolating the first luminance data and the first chrominance data to produce a first exposure image; (5) interpolating the second luminance data and the second chrominance data to produce a second exposure image; and (6) generating a blended image of the second color space by blending the first exposure image and the second exposure image.
The first color space may be an RGB color space, and the second color space may be a YUV color space. The first image data may be converted to interleaved luminance data and interleaved chrominance data. Then, the interleaved luminance data may be deinterleaved into the first luminance data and the second luminance data; and the interleaved chrominance data may be deinterleaved into the first chrominance data and the second chrominance data. The first image data may be converted by: determining a Y value for each pixel of the plurality of pixels by performing a convolution between a Y demosaicing kernel and the first image data of each pixel and neighbor pixels of each pixel; determining a U value for each horizontal pair of pixels of the plurality of pixels by performing a convolution between a U demosaicing kernel and the first image data for each horizontal pair and neighbor pixels of each horizontal pair; and determining a V value for each horizontal pair by performing a convolution between a V demosaicing kernel and the first image data for each horizontal pair and neighbor pixels of each horizontal pair. The neighbor pixels of each pixel may be associated with the same exposure duration as each pixel. The neighbor pixels of each horizontal pair may be associated with the same exposure duration as each horizontal pair. The luminance data and chrominance data may be interpolated in accordance with a minimum delta diagonal method.
In another embodiment of the present disclosure, a computer implemented method of generating a high dynamic range (HDR) image comprises: (1) accessing interleaved image data representing a captured frame that comprises a plurality of pixels, wherein the plurality of pixels comprise first exposure pixels associated with a first exposure duration and second exposure pixels associated with a second exposure duration, wherein the interleaved image data are represented by a first color space and arranged in a mosaic pattern, and wherein the first exposure pixels and the second exposure pixels are interleaved in a predetermined pattern; (2) generating first exposure data in a second color space from the first exposure pixels of the captured frame; (3) generating second exposure data in the second color space from the second exposure pixels of the captured frame; and (4) generating a resultant image representing the captured frame in the first color space based on blending the first exposure data with the second exposure data.
In another embodiment of the present disclosure, a system comprises: a processor; and a memory coupled to the processor and storing an image processing program. The image processing program comprises instructions that cause the processor to perform a method of generating a high dynamic range image. The method comprises: (1) accessing interleaved image data representing a captured frame that comprises a plurality of pixels, wherein the plurality of pixels comprise first exposure pixels associated with a first exposure duration and second exposure pixels associated with a second exposure duration, wherein the interleaved image data are arranged in a Bayer pattern, and wherein the first exposure pixels and the second exposure pixels are interleaved by alternating scanline pairs; (2) deriving first exposure data in a YUV color space from the first exposure pixels of the interleaved image data, wherein the first exposure data represent the captured frame; (3) deriving second exposure data in the YUV color space from the second exposure pixels of the interleaved image data, wherein the second exposure data represent the captured frame; and (4) generating a high dynamic range image represented by the first color space based on the first exposure data and the second exposure data.
This summary contains, by necessity, simplifications, generalizations and omissions of detail; consequently, those skilled in the art will appreciate that the summary is illustrative only and is not intended to be in any way limiting. Other aspects, inventive features, and advantages of the present invention, as defined solely by the claims, will become apparent in the non-limiting detailed description set forth below.
Embodiments of the present invention will be better understood from a reading of the following detailed description, taken in conjunction with the accompanying drawing figures in which like reference characters designate like elements and in which:
Reference will now be made in detail to the preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings. While the invention will be described in conjunction with the preferred embodiments, it will be understood that they are not intended to limit the invention to these embodiments. On the contrary, the invention is intended to cover alternatives, modifications and equivalents, which may be included within the spirit and scope of the invention as defined by the appended claims. Furthermore, in the following detailed description of embodiments of the present invention, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be recognized by one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments of the present invention. The drawings showing embodiments of the invention are semi-diagrammatic and not to scale and, particularly, some of the dimensions are for the clarity of presentation and are shown exaggerated in the drawing Figures. Similarly, although the views in the drawings for the ease of description generally show similar orientations, this depiction in the Figures is arbitrary for the most part. Generally, the invention can be operated in any orientation.
Notation and Nomenclature:
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the present invention, discussions utilizing terms such as “processing” or “accessing” or “executing” or “storing” or “rendering” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories and other computer readable media into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices. When a component appears in several embodiments, the use of the same reference numeral signifies that the component is the same component as illustrated in the original embodiment.
Deinterleaving Interleaved High Dynamic Range Image by Using YUV Interpolation
At 201, a set of RGB Bayer array source data representing a captured frame including interleaved long exposure pixels and short exposure pixels are accessed. For instance, the source data may correspond to raw data of a still image or a frame of a video captured by a digital camera equipped with a Bayer filter. Alternatively, the source data may correspond to computer simulated data representing an interleaved HDR image.
The Bayer array source data can be represented by
At 202, the RGB source frame is demosaiced into intensity data of the captured frame represented in a selected intermediate color space, for example Y image data in the YUV space. As will be described in greater detail below, a convolution method can be used for the demosaic process in some embodiments. Thus, the resultant Y image data comprise interleaved long exposure pixels and short exposure pixels and can have the same resolution as the source data.
At 203 of
At 204, the interleaved YUV data are interpolated into full frame long and short exposure YUV data. At 205, the interpolated long and short exposure YUV data may be converted to long and short exposure RGB data respectively, which are eventually blended to generate the HDR image at 206. Thus, by demosaicing the source image data into YUV space data before deinterleaving, the source image can be sampled at a higher frequency, e.g., at full resolution, than in the conventional art, resulting in significantly improved HDR image quality. Alternatively, the short and long exposure YUV images can be blended directly to create a high dynamic range resultant image in YUV space, which can then be translated to RGB space.
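As a non-limiting illustration, the following C++ sketch shows one possible arrangement of the data flow of steps 201-206. The container types and function names (e.g., demosaicToYUV, interpolateHalfFrame, blendExposures) are hypothetical placeholders rather than elements of the disclosure, and the step bodies are stubbed out.

```cpp
#include <vector>
#include <cstdint>

// Hypothetical container types used only for this sketch; the disclosure
// does not prescribe a particular data layout.
struct BayerRaw { int width = 0, height = 0; std::vector<uint16_t> data; };
struct YUVImage { int width = 0, height = 0; std::vector<float> y, u, v; };
struct RGBImage { int width = 0, height = 0; std::vector<float> r, g, b; };

// Placeholder step implementations; the actual operations are described in
// the corresponding passages of this disclosure.
static YUVImage demosaicToYUV(const BayerRaw&) { return {}; }            // steps 202-203
static void deinterleave(const YUVImage&, YUVImage&, YUVImage&) {}       // split by exposure rows
static YUVImage interpolateHalfFrame(const YUVImage& h) { return h; }    // step 204
static YUVImage blendExposures(const YUVImage& l, const YUVImage&, float) { return l; } // step 206
static RGBImage yuvToRGB(const YUVImage&) { return {}; }                 // step 205 / final conversion

// One possible ordering: demosaic first, deinterleave, interpolate each half
// frame to full resolution, blend in YUV, then convert the result to RGB.
RGBImage generateHDR(const BayerRaw& raw, float longShortRatio) {
    YUVImage interleaved = demosaicToYUV(raw);
    YUVImage longHalf, shortHalf;
    deinterleave(interleaved, longHalf, shortHalf);
    YUVImage longFull  = interpolateHalfFrame(longHalf);
    YUVImage shortFull = interpolateHalfFrame(shortHalf);
    YUVImage blended   = blendExposures(longFull, shortFull, longShortRatio);
    return yuvToRGB(blended);
}
```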
As will be appreciated by those skilled in the art, the present disclosure is not limited to any specific pattern of interleaving the long exposure data pixels and short exposure pixels. The source data comprising alternating pairs of scanlines as shown in
Although embodiments of the present disclosure are described with reference to the YUV space as the intermediate color space in the demosaicing processes, it will be appreciated that other intermediate color spaces may be used as well and could lead to improved image quality with efficient implementation in accordance with the present disclosure. More specifically, an RGB source image may be demosaiced into respective color channel data of any suitable intermediate color space before deinterleaving, such as a YIQ space, a YCbCr space, an HSL space, and so on.
The intensity and chroma components of the interleaved source data can be extracted by convolution processes. With respect to a respective sampled pixel of the interleaved source data, a corresponding convolution kernel is applied to the sampled pixel and its neighbor pixels to perform the convolution. The convolution kernels can be designed such that opposite-exposure pixels are not mixed with current-exposure pixels, since their response curves differ even after exposure compensation. Accordingly, a convolution involves only the neighbor pixels associated with the same exposure setting as the sampled pixel, e.g., long exposure pixels only or short exposure pixels only.
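As a minimal sketch of this same-exposure constraint, the following C++ fragment applies a convolution kernel at a sampled pixel while skipping neighbor rows whose exposure setting differs from that of the sampled row. The row-pair pattern assumed by isLongRow is one example; the kernels described later achieve the same effect by placing zero weights on the excluded rows and redistributing them.

```cpp
#include <vector>
#include <algorithm>

// Exposure of a row when long and short rows alternate in pairs
// (rows 0,1 long; 2,3 short; 4,5 long; ...). Other interleaving patterns
// are possible.
static bool isLongRow(int row) { return ((row / 2) % 2) == 0; }

// Convolve a single sample position (x, y) of a raw mosaic image with a
// (2*radius+1) x (2*radius+1) kernel, skipping any neighbor row whose
// exposure differs from that of the sampled row. Out-of-bounds neighbors
// are clamped to the image border.
float convolveSameExposure(const std::vector<float>& img, int width, int height,
                           int x, int y, const std::vector<float>& kernel, int radius) {
    const bool centerLong = isLongRow(y);
    float acc = 0.0f;
    for (int dy = -radius; dy <= radius; ++dy) {
        const int yy = std::min(std::max(y + dy, 0), height - 1);
        if (isLongRow(yy) != centerLong)
            continue;                       // exclude opposite-exposure rows
        for (int dx = -radius; dx <= radius; ++dx) {
            const int xx = std::min(std::max(x + dx, 0), width - 1);
            const float w = kernel[(dy + radius) * (2 * radius + 1) + (dx + radius)];
            acc += w * img[yy * width + xx];
        }
    }
    return acc;
}
```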
The present disclosure is not limited to any specific order of executing steps 201-206 of method 200. In some embodiments, method 200 can be implemented as a three-pass computation process, in which the first pass includes step 202, the second pass includes step 203, and the third pass includes steps 205-206. In some other embodiments, method 200 can be implemented as a two-pass computation process, in which Y is demosaiced and RGB data are recreated at the minimum diagonal endpoints on the fly. In still some other embodiments, method 200 can be implemented as a four-pass computation process, in which full resolution Y, U and V images are demosaiced and used for 205 and 206.
Each of the interleaved Y frame 252 and the interleaved UV frame 253 is deinterleaved into a long exposure frame (254 and 255) and a short exposure frame (256 and 257). The long exposure frames (254 and 255) include only data of pixels captured in the long exposure setting, and the short exposure frames (256 and 257) include only data of pixels captured in the short exposure setting. Each of 254, 255, 256, and 257 contains half frame information, which is converted to a full frame (258, 259, 260 and 261) by interpolation. The full frame long/short exposure Y/UV data 258, 259, 260 and 261 are blended into a YUV frame 262, which is eventually converted to an RGB frame for display.
The convolution kernels shown in
In some embodiments, during the convolution process, the currently sampled pixel can be normalized; the black bias can be subtracted; and white balancing can be performed on each source pixel, which is clamped to the [0-1] range before convolution. The resulting Y value can be clamped to a minimum of zero. If the currently sampled pixel is on a short exposure row, the Y value can be multiplied by the long/short exposure ratio. The resultant values can be stored as a normalized FP 16 number, FP 20 number, or any other suitable format. For example, the bits of an FP 20 image component can be encoded into an 8-bit MRT image.
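The per-sample handling described above might be sketched as follows; the parameter names (blackBias, whiteBalanceGain, longShortRatio) are illustrative assumptions, and the convolution itself and the FP16/FP20 packing are performed elsewhere.

```cpp
#include <algorithm>
#include <cstdint>

// Prepare one raw source sample before the Y convolution: normalize the raw
// code value, subtract the black bias, apply white balancing, and clamp to
// the [0, 1] range.
float prepareSourceSample(uint16_t raw, float normalizer,
                          float blackBias, float whiteBalanceGain) {
    float v = raw * normalizer;                 // normalize the raw code value
    v = (v - blackBias) * whiteBalanceGain;     // remove black bias, white balance
    return std::min(std::max(v, 0.0f), 1.0f);   // clamp to [0, 1] before convolution
}

// Adjust the convolved Y result: clamp to a minimum of zero and, for short
// exposure rows, scale by the long/short exposure ratio.
float adjustYResult(float convolvedY, bool shortExposureRow, float longShortRatio) {
    float y = std::max(convolvedY, 0.0f);
    if (shortExposureRow)
        y *= longShortRatio;                    // bring short rows to the long-exposure scale
    return y;                                   // stored as FP16/FP20 or another format elsewhere
}
```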
The Y convolution kernels presented in
Y=0.299R+0.587G+0.114B
can yield the 13-tap Y kernels 421-424 which should then be adapted to the interleaving pattern of the HDR source image. For instance, raw AOHDR source images have interleaved short and long exposure rows which should be kept separated in individual convolution processes. Therefore, pixels can be synthesized for the opposite-exposure row assuming a simple linear interpolation between rows of the same exposure as the current row. The interpolated neighborhood is shown in the array 431.
Applying the Y demosaicing kernels 421-424 to the neighborhood produces the final Y demosaicing kernels shown in
The kernels are essentially redistributed based on the following algebra for a single column kernel [K0, K1, K2, K3, K4] applied to an image column [I2, I3, I4, I5, I6]:
C=I2K0+I3K1+I4K2+I5K3+I6K4 (eq. 1)
For an even row, I2=(I0+I4)/2, I3=(I1+I5)/2, and I6=(I4+I8)/2, and the equation for a column is represented as:
C=(I0+I4)K0/2+(I1+I5)K1/2+I4K2+I5K3+(I4+I8)K4/2 (eq. 2)
This redistributes to the following equation:
C=I0(K0/2)+I1K1/2+I4((K0+K4)/2+K2)+I5(K1/2+K3)+I8K4/2 (eq. 3)
This is equivalent to convolution of the image column: [I0, I1, I2, I3, I4, I5, I6, I7, I8] with the single column kernel
[K0/2, K1/2, 0, 0, (K0+K4)/2+K2, K1/2+K3, 0, 0, K4/2]. (eq. 4)
This translates directly from the kernels 421-424 to those shown in
Transforming according to the single column kernel, the new center column is represented as:
This corresponds to the center column of the “Y at R” kernel 310 in
For odd rows, the kernels are similar, but the items are permuted so the single-column kernel becomes:
[K0/2, 0, 0, K3/2+K1, (K0+K4)/2+K2, 0, 0, K3/2, K4/2] (eq. 5)
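The redistribution of (eq. 4) and (eq. 5) can be expressed directly in code. The following sketch builds the 9-tap even-row and odd-row single-column kernels from an arbitrary 5-tap column kernel [K0..K4]; the function names are illustrative.

```cpp
#include <array>

// Redistribute a 5-tap single-column kernel [K0..K4] into the 9-tap kernel
// of (eq. 4), which skips the opposite-exposure rows while preserving the
// effect of the linear row interpolation described above.
std::array<float, 9> evenRowColumnKernel(const std::array<float, 5>& k) {
    return { k[0] / 2, k[1] / 2, 0.0f, 0.0f,
             (k[0] + k[4]) / 2 + k[2],
             k[1] / 2 + k[3], 0.0f, 0.0f, k[4] / 2 };     // (eq. 4)
}

// The corresponding 9-tap kernel for odd rows, per (eq. 5).
std::array<float, 9> oddRowColumnKernel(const std::array<float, 5>& k) {
    return { k[0] / 2, 0.0f, 0.0f,
             k[3] / 2 + k[1],
             (k[0] + k[4]) / 2 + k[2],
             0.0f, 0.0f, k[3] / 2, k[4] / 2 };            // (eq. 5)
}
```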
In some embodiments, the U data and the V data can each be extracted at the full resolution of the source image. However, in some other embodiments, the U data and the V data can be extracted at a lower resolution than the source image without causing significant negative visual impact, as human eyes have less acuity for color differences than for luminance.
For example, the UV data can be extracted at half the horizontal resolution, resulting in an allocated size for the combined UV image that is identical to that of the source image.
For this Bayer array pattern, two U convolution kernels, 510 and 520, and two V convolution kernels, 530 and 540, can be used. Each of the kernels 510-540 is configured as a 9×3 grid that has 18 taps. Taking the kernel 510 as an example, the center pair of taps (with values −0.041884 and −0.122113) corresponds to the currently sampled horizontal pair of pixels. The empty rows 512-515 correspond to the neighbor pixels associated with the opposite exposure setting which are excluded from the present convolution computation. In addition to computing U and V values at those positions, these kernels effectively perform a 2× horizontal box filter to average pairs of U and V values horizontally.
In some embodiments, the source data can be processed again before UV convolutions. During the UV convolution process, the currently sampled pixels can be normalized; the black bias can be subtracted; and white balancing can be performed on each source pixel, which is clamped to the [0-1] range before convolution. The redundant math is performed because it reduces both memory consumption and memory bandwidth requirements, at the expense of increased arithmetic.
Unlike the Y demosaicing process, the U and V values are not clamped to zero, since negative numbers are legitimate chroma values. If the current pixel is on a short exposure row, both U and V values can be multiplied by the long/short exposure ratio. The resulting UV pairs can be stored as normalized FP 16 numbers, FP 20 numbers, or any other suitable format.
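A minimal sketch of this chroma post-processing, assuming the same illustrative long/short exposure ratio parameter as in the Y case:

```cpp
// Adjust a convolved chroma (U or V) value. Unlike Y, the value is not
// clamped to zero because negative chroma values are legitimate; values on
// short exposure rows are scaled by the long/short exposure ratio before
// being packed into FP16/FP20 storage elsewhere.
float adjustChromaResult(float convolvedChroma, bool shortExposureRow, float longShortRatio) {
    return shortExposureRow ? convolvedChroma * longShortRatio : convolvedChroma;
}
```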
The convolution kernels shown in
The U/V convolution kernels presented in
U=−0.14713R−0.28886G+0.436B
V=0.615R−0.51499G−0.10001B
The final desired value for U on RG rows is the average of U at R and U at G1; for U on GB rows it is the average of U at G2 and U at B; for V on RG rows it is the average of V at R and V at G1; and for V on GB rows it is the average of V at G2 and V at B. Rather than performing two 13-tap convolutions and averaging the results, the kernels themselves are combined to produce an 18-tap convolution as shown in
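The combination relies on the linearity of convolution: averaging the outputs of two kernels equals convolving once with a kernel formed by averaging their suitably aligned coefficients. The following sketch illustrates the idea for two rectangular per-pixel kernels combined into one pair kernel; the actual 13-tap kernels are not rectangular, so this is a simplified illustration rather than the disclosure's exact construction.

```cpp
#include <vector>

// Combine the kernel centered on the left pixel of a horizontal pair with
// the kernel centered on the right pixel into a single pair kernel: shift
// the right kernel one column to the right, then average coefficient by
// coefficient. By linearity, one convolution with the combined kernel equals
// the average of the two separate convolutions.
std::vector<float> combinePairKernels(const std::vector<float>& leftKernel,
                                      const std::vector<float>& rightKernel,
                                      int rows, int cols) {
    std::vector<float> combined(rows * (cols + 1), 0.0f);  // one column wider
    for (int r = 0; r < rows; ++r) {
        for (int c = 0; c < cols; ++c) {
            combined[r * (cols + 1) + c]     += 0.5f * leftKernel[r * cols + c];
            combined[r * (cols + 1) + c + 1] += 0.5f * rightKernel[r * cols + c];
        }
    }
    return combined;
}
```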
The intensity and chroma components of the interleaved source data can also be extracted by any other suitable mechanism that is well known in the art.
The interleaved Y image data and the UV image data derived from the demosaicing process described above can be deinterleaved, based on the exposure settings of the pixels, into half frame image data, and then interpolated to produce full frame short exposure YUV data and full frame long exposure YUV data.
For example, assume diagrams 710, 720 and 730 represent fragments of long exposure YUV data derived from the demosaicing convolution process. Rows 701 and 704 encompass the long exposure Y data derived from the Y convolution process, while Y values of pixels of rows 702 and 703 are to be filled by interpolating the pixels of rows 701 and 704.
In this example, the interpolation is performed in accordance with a minimum delta diagonal method. With respect to generating a Y value for a center pixel (marked with “x”) located at the upper row 702 of a row pair 702 and 703, the diagram 710 illustrates the location of the 8 local neighbor pixels, y00, y01, y02, y10, y11, y12, y13 and y14, from which five diagonals can be constructed:
D1: diagonal from y00 to y14;
D1_5: diagonal from (y00+y01)/2 to y13
D2: diagonal from y01 to y12;
D2_5: diagonal from (y01+y02)/2 to y11; and
D3: diagonal from y02 to y10.
From these 8 samples, 5 diagonals are constructed. The absolute Y deltas of these 5 diagonals are compared, and the diagonal with the minimum delta is selected for interpolation. The U and V values corresponding to the minimum delta diagonal are fetched. The Y, U, and V values from the endpoints are interpolated ⅓ of the way from the endpoint on row 701 toward the endpoint on row 704 (i.e., at the position of row 702) and used as the interpolated value, as illustrated in diagram 720.
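A minimal C++ sketch of the minimum delta diagonal selection and interpolation for one missing pixel follows. It operates on the Y samples only; in practice the U and V endpoint values fetched along the selected diagonal are interpolated in the same way. The array layout and the fraction parameter are illustrative.

```cpp
#include <array>
#include <cmath>
#include <cstddef>

// Minimum delta diagonal interpolation for one missing pixel. 'upper' holds
// the three same-exposure samples above the pixel (y00, y01, y02) and
// 'lower' holds the five samples below (y10..y14). 't' is the fractional
// distance from the upper same-exposure row toward the lower one
// (1/3 for the upper missing row, 2/3 for the lower one).
float minDeltaDiagonalInterpolate(const std::array<float, 3>& upper,
                                  const std::array<float, 5>& lower,
                                  float t = 1.0f / 3.0f) {
    struct Diagonal { float top, bottom; };
    const std::array<Diagonal, 5> diagonals = {{
        { upper[0],                     lower[4] },   // D1:   y00 to y14
        { (upper[0] + upper[1]) / 2.0f, lower[3] },   // D1_5: (y00+y01)/2 to y13
        { upper[1],                     lower[2] },   // D2:   y01 to y12
        { (upper[1] + upper[2]) / 2.0f, lower[1] },   // D2_5: (y01+y02)/2 to y11
        { upper[2],                     lower[0] },   // D3:   y02 to y10
    }};
    // Select the diagonal with the smallest absolute delta between endpoints.
    std::size_t best = 0;
    float bestDelta = std::fabs(diagonals[0].top - diagonals[0].bottom);
    for (std::size_t i = 1; i < diagonals.size(); ++i) {
        const float d = std::fabs(diagonals[i].top - diagonals[i].bottom);
        if (d < bestDelta) { bestDelta = d; best = i; }
    }
    // Interpolate a fraction t of the way along the selected diagonal.
    return diagonals[best].top + t * (diagonals[best].bottom - diagonals[best].top);
}
```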
Similarly, with respect to generating a Y value for a center pixel (marked with “x”) located at the lower row 703 of the row pair 702 and 703, the diagram 730 illustrates the locations of the 8 local neighbor pixels from which another five diagonals can be constructed.
The current YUV data and the interpolated YUV data are sorted into “shortYUV” and “longYUV” data, depending on whether the current pixel is on a short exposure or long exposure row. The shortYUV and longYUV values each represent full image data. Subsequently, a blend factor is selected to blend the shortYUV and longYUV values for a particular pixel to derive the HDR pixel.
As will be appreciated by those skilled in the art, the present disclosure is not limited to any specific blending factors or any specific form of a blending factor curve. In some embodiments, two blending curves, one for long rows and one for short rows, may be implemented. Additionally, the “plateau” at 0.50 is actually configurable. In some application programs, these parameters can be tuned based on lighting conditions.
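Purely as an illustration of what such a tunable curve might look like, the following sketch maps a normalized luminance value to a blend factor through a ramp, a configurable plateau, and a final rise; none of the breakpoint values are prescribed by the disclosure, and separate curves may be used for long rows and short rows.

```cpp
#include <algorithm>

// One possible blend-factor curve: 0 (all long exposure) for dark pixels,
// a ramp up to a configurable plateau, then a rise toward 1 (all short
// exposure) for the brightest pixels. All breakpoints are illustrative
// tuning parameters, e.g., adjusted for lighting conditions.
float blendFactorFromLuma(float y, float rampStart = 0.25f, float rampEnd = 0.5f,
                          float plateau = 0.5f, float highStart = 0.85f) {
    if (y <= rampStart) return 0.0f;                      // dark: long exposure only
    if (y < rampEnd)                                      // ramp up to the plateau
        return plateau * (y - rampStart) / (rampEnd - rampStart);
    if (y < highStart) return plateau;                    // plateau region
    const float t = std::min((y - highStart) / (1.0f - highStart), 1.0f);
    return plateau + (1.0f - plateau) * t;                // bright: toward short exposure
}
```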
Once the blend factor is selected, the longYUV and shortYUV can be converted to a long and short red, green, or blue sample depending on the current Bayer phase. The industry-standard inverse transformation from YUV to RGB can be used:
R=Y+1.3983*V
G=Y−0.39465*U−0.58060*V
B=Y+2.03211*U
A final R, G, or B value is interpolated between the short[RGB] and long[RGB] based on the blend factor. A blend factor of 0.0 implies that only the long pixel contributes; a factor of 1.0 implies that only the short pixel contributes.
In some embodiments, this final sample value is clamped to a minimum of zero, and then raised to a power of less than 1 (e.g., 0.436295) to “compand” the value—boosting mid-tones and darkening highlights, and moving more values into the [0-1] range. Finally, the sample is clamped to a maximum of 1, multiplied by a “denormalization factor” and stored as a denormalized FP16 value, which allows subsequent fixed-point processing to proceed correctly.
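The final per-pixel steps might be sketched as follows, using the inverse YUV-to-RGB transformation given above; the companding exponent matches the example value in the text, while the denormalization factor shown is only an illustrative assumption.

```cpp
#include <algorithm>
#include <cmath>

struct RGB { float r, g, b; };

// Convert a YUV sample to RGB using the inverse transformation given above.
RGB yuvToRgb(float y, float u, float v) {
    return { y + 1.3983f * v,
             y - 0.39465f * u - 0.58060f * v,
             y + 2.03211f * u };
}

// Blend the long and short exposure samples for one color channel and apply
// the final companding/denormalization steps. A blend factor of 0.0 keeps
// only the long exposure pixel; 1.0 keeps only the short exposure pixel.
// The denormalization factor is an illustrative value.
float blendChannel(float longValue, float shortValue, float blendFactor,
                   float gamma = 0.436295f, float denormFactor = 65504.0f) {
    float v = longValue + blendFactor * (shortValue - longValue); // interpolate long/short
    v = std::max(v, 0.0f);                                        // clamp to a minimum of zero
    v = std::pow(v, gamma);                                       // compand: boost mid-tones
    v = std::min(v, 1.0f);                                        // clamp to a maximum of one
    return v * denormFactor;                                      // store as a denormalized FP16 value
}
```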
An image processing method according to the present disclosure, e.g., as described with reference to
The computing system comprises a CPU 1001, a system memory 1002, a GPU 1003, I/O interfaces 1004 and other components 1005, an operating system 1006 and application software 1007 including an HDR image generator 1010 stored in the memory 1002. When incorporating the user's configuration input and executed by the CPU 1001 or the GPU 1003, the HDR image generator 1010 can process interleaved image data and generate HDR images in accordance with an embodiment of the present disclosure. An HDR image generator 1010 may include various components or modules to perform functions of YUV data extraction, deinterleaving, interpolation and blending, etc. The user configuration input may include, for example, source image data, a long/short exposure ratio, a user selection of convolution kernels, a user selection of an intermediate color space, and a blending curve. An HDR image generator 1010 may be an integral part of a graphic simulation tool, a processing library, or a computer game that is written in Fortran, C, C++, or any other programming language known to those skilled in the art. For example, the program 1010 can be implemented as a multi-pass OpenGL ES shader on mobile processors.
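For illustration, the user configuration inputs listed above might be grouped into a structure such as the following; the field names, types, and default values are hypothetical.

```cpp
#include <string>
#include <vector>
#include <cstdint>

// A possible grouping of the user configuration inputs mentioned above; the
// members and defaults are illustrative only.
struct HDRGeneratorConfig {
    std::vector<uint16_t> sourceBayerData;      // interleaved raw source image
    int width  = 0;
    int height = 0;
    float longShortExposureRatio = 8.0f;        // example ratio; set per capture
    int convolutionKernelSet = 0;               // user-selected kernel variant
    std::string intermediateColorSpace = "YUV"; // e.g., "YUV", "YCbCr", "YIQ", "HSL"
    std::vector<float> blendCurve;              // sampled blend-factor curve
};
```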
Although certain preferred embodiments and methods have been disclosed herein, it will be apparent from the foregoing disclosure to those skilled in the art that variations and modifications of such embodiments and methods may be made without departing from the spirit and scope of the invention. It is intended that the invention shall be limited only to the extent required by the appended claims and the rules and principles of applicable law.