Systems and methods for temporal subpixel rendering of image data

Information

  • Patent Grant
  • Patent Number
    8,378,947
  • Date Filed
    Monday, August 7, 2006
  • Date Issued
    Tuesday, February 19, 2013
Abstract
Methods are disclosed to render image data over time. In one embodiment, a mapping from image data values to first and second sets of subpixels in a plurality of output frames uses brightness versus viewing angle performance measures to reduce color error when the image is viewed on the display panel at an off-normal viewing angle. In another embodiment, temporal subpixel rendering is used to improve the viewing angle in LCD displays or to improve subpixel rendering in other display technologies.
Description
BACKGROUND

The present application is related to display systems, and more particularly, to systems and methods for subpixel rendering source image data over time. Temporal subpixel rendering may be used to improve viewing angle in LCD displays or to improve subpixel rendering in other display technologies.


In commonly owned U.S. Patent Applications: (1) U.S. patent application Ser. No. 09/916,232 entitled “ARRANGEMENT OF COLOR PIXELS FOR FULL COLOR IMAGING DEVICES WITH SIMPLIFIED ADDRESSING” filed Jul. 25, 2001, and published as U.S. Patent Application Publication No. 2002/0015110 (“the '110 application”); (2) U.S. patent application Ser. No. 10/278,353, entitled “IMPROVEMENTS TO COLOR FLAT PANEL DISPLAY SUB-PIXEL ARRANGEMENTS AND LAYOUTS FOR SUB-PIXEL RENDERING WITH INCREASED MODULATION TRANSFER FUNCTION RESPONSE,” filed Oct. 22, 2002, and published as U.S. Patent Application Publication No. 2003/0128225 (“the '225 application”); (3) U.S. patent application Ser. No. 10/278,352, entitled “IMPROVEMENTS TO COLOR FLAT PANEL DISPLAY SUB-PIXEL ARRANGEMENTS AND LAYOUTS FOR SUB-PIXEL RENDERING WITH SPLIT BLUE SUBPIXELS,” filed Oct. 22, 2002, and published as U.S. Patent Application Publication No. 2003/0128179 (“the '179 application”); (4) U.S. patent application Ser. No. 10/243,094, entitled “IMPROVED FOUR COLOR ARRANGEMENTS AND EMITTERS FOR SUBPIXEL RENDERING,” filed Sep. 13, 2002, and published as U.S. Patent Application Publication No. 2004/0051724 (“the '724 application”); (5) U.S. patent application Ser. No. 10/278,328, entitled “IMPROVEMENTS TO COLOR FLAT PANEL DISPLAY SUB-PIXEL ARRANGEMENTS AND LAYOUTS WITH REDUCED BLUE LUMINANCE WELL VISIBILITY,” filed Oct. 22, 2002, and published as U.S. Patent Application Publication No. 2003/0117423 (“the '423 application”); (6) U.S. patent application Ser. No. 10/278,393, entitled “COLOR DISPLAY HAVING HORIZONTAL SUB-PIXEL ARRANGEMENTS AND LAYOUTS,” filed Oct. 22, 2002, and published as U.S. Patent Application Publication No. 2003/0090581 (“the '581 application”); and (7) U.S. patent application Ser. No. 10/347,001, entitled “SUB-PIXEL ARRANGEMENTS FOR STRIPED DISPLAYS AND METHODS AND SYSTEMS FOR SUB-PIXEL RENDERING SAME,” filed Jan. 16, 2003, and published as U.S. Patent Application Publication No. 2004/0080479 (“the '479 application”), novel subpixel arrangements are therein disclosed for improving the cost/performance curves for image display devices. The '110, '225, '179, '724, '423, '581 and '479 applications are all incorporated by reference herein.


These improvements are particularly pronounced when coupled with subpixel rendering (SPR) systems and methods further disclosed in those applications and in commonly owned U.S. Patent Applications: (1) U.S. patent application Ser. No. 10/051,612, entitled “CONVERSION OF RGB PIXEL FORMAT DATA TO PENTILE MATRIX SUB-PIXEL DATA FORMAT,” filed Jan. 16, 2002, and published as U.S. Patent Application Publication No. 2003/0034992 (“the '992 application”); (2) U.S. patent application Ser. No. 10/150,355, entitled “METHODS AND SYSTEMS FOR SUB-PIXEL RENDERING WITH GAMMA ADJUSTMENT,” filed May 17, 2002, and published as U.S. Patent Application Publication No. 2003/0103058 (“the '058 application”); (3) U.S. patent application Ser. No. 10/215,843, entitled “METHODS AND SYSTEMS FOR SUB-PIXEL RENDERING WITH ADAPTIVE FILTERING,” filed Aug. 8, 2002, and published as U.S. Patent Application Publication No. 2003/0085906 (“the '906 application”). The '992, '058, and '906 applications are hereby incorporated by reference herein.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in, and constitute a part of, this specification, illustrate exemplary implementations and embodiments of the invention and, together with the description, serve to explain principles of the invention.



FIG. 1 depicts an observer viewing a display panel and the cones of acceptable viewing angle off the normal axis to the display.



FIG. 2 shows one embodiment of a graphics subsystem driving a panel with subpixel rendering and timing signals.



FIG. 3 depicts an observer viewing a display panel and the possible color errors that might be introduced as the observer views subpixel rendered text off normal axis to the panel.



FIG. 4 depicts a display panel and a possible cone of acceptable viewing angles for subpixel rendered text once techniques of the present application are applied.



FIGS. 5 through 8 show several embodiments of performing temporal subpixel rendering over two frames.



FIG. 9 shows two curves of brightness (100% and 50%) versus viewing angle on an LCD display.



FIGS. 10A-10E show a series of curves depicting the performance of brightness versus time when the response curve of a typical liquid crystal is modulated by various pulse trains.



FIGS. 11A-11D show another series of curves of brightness versus time with different types of pulse trains.



FIGS. 12 and 13 depict several embodiments of implementing temporal subpixel rendering.





DETAILED DESCRIPTION

Reference will now be made in detail to implementations and embodiments, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.



FIG. 1 shows a display panel 10 capable of displaying an image upon its surface. An observer 12 is viewing the image on the display at an appropriate distance for this particular display. It is known that, depending upon the technology of the display device (liquid crystal display (LCD), organic light emitting diode (OLED), EL, and the like), the quality of the displayed image falls off as a function of the viewing angle, particularly so for LCDs. The outer cone 14 depicts an acceptable cone of viewing angles for the observer 12 with a typical RGB striped system that is not performing sub-pixel rendering (SPR) on the displayed image data.


A further reduction in acceptable viewing angle (i.e. inner cone 16) may occur when the image data itself is sub-pixel rendered in accordance with any of the SPR algorithms and systems disclosed in the incorporated applications, or with any known SPR system and methods. One embodiment of such a system is shown in FIG. 2, wherein source image data 26 is passed through a driver 20, which might include SPR subsystem 22 and timing controller 24, to supply display image data and control signals to panel 10. The SPR subsystem could reside in a number of embodiments. For example, it could reside entirely in software, on a video graphics adaptor, on a scaler adaptor, in the TCon, or on the glass itself implemented with low temperature polysilicon TFTs.


This reduction in acceptable viewing angle is primarily caused by color artifacts that may appear when viewing a subpixel rendered image, because high spatial frequency edges have different values for red, green, and blue subpixels. For example, black text on a white background which uses SPR on a design similar to FIG. 5 will result in the green subpixels switching between 100% and 0% while the red and blue subpixels switch between 100% and 50%.



FIG. 3 depicts the situation as might apply to subpixel rendered black text 30 on a white background. As shown, observer 12 experiences no color artifact when viewing the text substantially on the normal axis to the panel 10. However, when the observer “looks down or up” at the screen, the displayed data may show a colored hue on a liquid crystal display (LCD), owing to the anisotropic viewing angle behavior of some LCDs at different gray levels, especially for vertical angles (up/down). Thus, it would be desirable to perform corrections to the SPR data in order to increase the acceptable viewing angle 40 of SPR data, as depicted in FIG. 4.


Currently, red and blue image data are averaged via an SPR process to create the proper value on the red and blue subpixels of a display. This averaging causes viewing angle problems for some liquid crystal displays because the viewing angle characteristics are a function of the voltage setting on the pixel. To smooth out the visual effects, several embodiments disclosed herein describe a temporal method of creating the average value such that the viewing angle is not affected by subpixel rendering. As will be discussed further below in connection with FIG. 12, one embodiment takes the image data from two adjacent source pixels and uses them sequentially, frame by frame. Since the data from pixel to pixel does not change dramatically, there should be no flicker observed. For sharp transitions, adaptive filtering takes over and this temporal averaging can be turned off.


As an example, FIG. 5 shows how a “white” line can be rendered on a panel having a subpixel repeat grouping, such as grouping 50, which comprises red subpixels 52, green subpixels 54, and blue subpixels 56. It will be appreciated that this choice of subpixel repeat grouping is merely for illustrative purposes and that other subpixel repeat groupings would suffice for purposes of the present invention. Such other subpixel repeat groupings are further described in the patent applications incorporated by reference above.



FIGS. 5-8 depict various embodiments of temporally subpixel rendering a single vertical white line in order to reduce the amount of off-normal axis color error. In Frame 1 of FIG. 5, the first three columns of colored subpixels are fully illuminated (as indicated by the heavy hatching lines); whereas in Frame 2 of FIG. 5, only the middle column of green subpixels 54 is fully illuminated and the rest are off. If the two frames are switched sufficiently fast, the visual effect remains a “white” line while, as will be explained below, the amount of off-normal axis color error is reduced.



FIG. 6 shows Frame 1 with the top row (first three subpixels) and only the bottom middle column green subpixel as fully illuminated. Frame 2 has the bottom row (first three subpixels) and top middle column green subpixel as fully illuminated.



FIG. 7 shows Frame 1 with the upper left and lower right red subpixels and the two middle green subpixels fully illuminated. Frame 2 has the lower left and upper right blue subpixels and the two green subpixels fully illuminated.



FIG. 8 shows Frame 1 with the first two columns fully illuminated; while Frame 2 shows the second and third columns fully illuminated. All four FIGS. 5-8 depict embodiments of performing subpixel rendering in time that produce for the human viewer the proper color on normal axis viewing, while reducing the color error introduced on off-normal axis viewing, particularly for LCD displays. These combinations of ON and OFF pixels can be varied in a prescribed time sequence to minimize flicker; for example, the sequence of FIGS. 5 through 8 could be repeated over 8 frames of data.
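To make the time-averaging concrete, the following minimal sketch (illustrative only, assuming the two-frame pattern of FIG. 5 and percent drive levels; it is not the patent's driver logic) verifies that the temporal sequence reproduces the same average subpixel values as a static subpixel rendering:

```python
# Hedged sketch: time-averaging the two-frame pattern of FIG. 5 for a
# one-subpixel-wide "white" line.  Names and drive levels are illustrative.
frame1 = {"R": 100, "G": 100, "B": 100}   # Frame 1: all three line columns fully on
frame2 = {"R":   0, "G": 100, "B":   0}   # Frame 2: only the middle green column on

# Perceived (time-averaged) drive level over the two-frame sequence.
average = {c: (frame1[c] + frame2[c]) / 2 for c in "RGB"}
print(average)   # {'R': 50.0, 'G': 100.0, 'B': 50.0}
```

The averages match the static rendering (green at 100%, red and blue at 50%), but the red and blue subpixels now spend their time at the 100% and 0% states, whose viewing angle behavior better matches that of the green subpixel, as explained in connection with FIG. 9 below.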


For illustrative purposes, FIG. 9 depicts why these color artifacts arise. When a single “white” line is drawn as in Frame 1 of FIG. 5 and held over time (which is typical for SPR that does not vary over time), it is centered on the middle column of green subpixels. As measured on the normal axis, the middle column of green subpixels is fully illuminated at the 100% brightness level; the blue and the red subpixels are illuminated at 50% brightness. Put another way, the green subpixel is operating with a filter kernel of [255] (i.e. the “unity” filter, with ‘255’ being 100% on a digital scale); while the blue and red subpixels have a filter kernel of [128 128] (i.e. a “box” filter, with ‘128’ being 50% on a digital scale). At zero viewing angle (i.e. normal to the display), a “white” line is shown because the red and blue subpixels are double the width of the green subpixels. So with G˜100, R˜50, B˜50, a chroma-balanced white is produced: green at 100 is balanced against 2×(50) of red and 2×(50) of blue. The multiplicative factor of “2” for red and blue comes from the fact that the red and blue subpixels are twice the width of the green subpixels.
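As a hedged illustration of these kernels (a sketch with assumed source values; which pair of source pixels the box filter spans depends on the layout, so the left-neighbor pairing here is an assumption, not the patent's filter implementation), applying the unity and box filters to a one-pixel-wide white line on black reproduces the normal-axis subpixel values described above:

```python
# Hedged sketch: unity filter for green, box filter for red/blue, on an 8-bit scale.
line = [0.0, 255.0, 0.0]                      # source pixel values across the white line

def box_filter(values, i):
    """[128 128] box filter: average of the pixel and its left neighbor (~50% each)."""
    left = values[max(i - 1, 0)]
    return (128 * left + 128 * values[i]) / 255.0

green = [v * 255 / 255 for v in line]              # [255] unity filter: value passes through
red_blue = [box_filter(line, i) for i in range(len(line))]

print(green)      # [0.0, 255.0, 0.0]   : the green line column drives to 100%
print(red_blue)   # [0.0, 128.0, 128.0] : red/blue columns at the line drive to ~50%
```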


As the viewing angle increases to angle ΘUP, the observer would see a fall-off of ΔG in the green subpixel brightness, while seeing a ΔR,B fall-off 902 in the red and blue subpixel brightness. Thus, at ΘUP, there is G′˜80, R′˜20, B′˜20, which results in the image of the white line assuming a more greenish hue, since green at 80 now outweighs 2×(20) of red and 2×(20) of blue. For angle ΘDOWN, the green subpixels will again fall off by an amount ΔG, while the red and blue subpixels will actually rise by an amount ΔR,B 904. In this case, the white line will assume a magenta hue.


So, to correct for this color artifact, it might be desirable to effectively drive the red and blue subpixels on a different curve so that the fall-off in the green versus the red/blue subpixels better matches as a relative percentage of each total curve. An intermediate curve, which is the average of the 100% and 0% curves, is shown in FIG. 9. This intermediate curve depicts the time-averaged curve that occurs if the red and blue subpixels are driven in Frame 1 to 100% luminance and in Frame 2 to 0% luminance. As may be seen, at the same off-normal axis angle as in FIG. 9, the fall-offs of the green and the red/blue subpixels are better matched.


Other embodiments and refinements of the above temporal subpixel rendering are possible. FIGS. 10A, 10B, and 10C are a series of three graphs. FIG. 10A shows a typical brightness response curve of a liquid crystal over time. FIG. 10B shows a series of pulse trains, each having a width equal to one frame and representing the voltage applied to the red and blue subpixels (e.g. for the white line example above). Thus, the red and blue subpixels are driven to 100% luminance for odd frames and 0% for even frames.


As may be seen, the response time for liquid crystals (as shown in FIG. 10A) is longer than the frame time, as shown in FIG. 10B. Thus, FIG. 10C shows the resulting brightness response of the red and blue subpixels on the display. As in the above example, the green subpixels are driven at 100% luminance. The average response for the red and blue subpixels in FIG. 10C is around 20%, which does not give a chroma-balanced white, but rather a greenish hue.
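The following toy model (a sketch only; the first-order response, time constants, and step size are invented for illustration and are not measurements of any particular LC) shows why an alternating 100%/0% drive averages well below 50% when the rise time exceeds the frame time:

```python
# Hedged sketch: a first-order LC transmission model with a slower rise than fall
# (as suggested by FIG. 10A), driven by the alternating pulse train of FIG. 10B.
def lc_average(pulses, frame_ms=16.7, tau_rise_ms=30.0, tau_fall_ms=8.0, steps=100):
    transmission, samples = 0.0, []
    dt = frame_ms / steps
    for v in pulses:                              # v is the target level for one frame (0.0 or 1.0)
        for _ in range(steps):
            tau = tau_rise_ms if v > transmission else tau_fall_ms
            transmission += (v - transmission) * dt / tau
            samples.append(transmission)
    return sum(samples) / len(samples)

print(lc_average([1.0, 0.0] * 30))   # well below 0.5 with these invented constants,
                                     # illustrating the greenish-hue case of FIG. 10C
```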


To correct this color imbalance, FIG. 10D depicts one embodiment of drive voltages that achieves approximately 50% average brightness of the red and blue subpixels. The effect of driving the red and blue subpixels with the pulse train depicted in FIG. 10D—that is, having two voltages that straddle the 50% luminance point of the red and blue subpixels—is shown in FIG. 10E. It will be appreciated that any suitable pairs of voltage values that substantially give the resulting luminance curve of FIG. 10E would suffice—so the present invention is not limited to the two voltages depicted in FIG. 10D.


An alternate embodiment that achieves a 50% average brightness but reaches near 100% and 0% peak luminances would improve the overall viewing angle performance, because the liquid crystal has its best viewing angles at these two extreme luminance values. If the LC does not fully switch, then the brightness of the red and blue pixels will be wrong and color fringing will be seen. In this case, a “gain” or offset can be applied to the pixel values so as to achieve the desired brightness. For example, if the pixel cannot fully switch in a frame time (˜15 ms), then the average brightness (transmission) of the LCD will be less than the average of the two pixel values. If a black to white edge is desired, then the two values are 100% and 0% for an average of 50%. If, for example, the LC only switches to 50% and then goes back to 0%, it will be necessary to multiply the two pixel values by 0.5 and then add 0.25 (i.e. 25%). Then the two states will switch between 100%×0.5+25%=75% and 0%×0.5+25%=25%, for an average of the desired 50%. These gain and offset values can be adjusted empirically or calculated; once determined, they will be the same for all panels unless the LC material or cell gap is changed. The color stability will not be as good as with faster responding LC material, but will be an improvement over non-temporal filtering. One may also adjust only the lower value, leaving the higher value constant; this may improve the viewing angle.
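A worked sketch of the gain/offset compensation just described (the function and parameter names are illustrative; the 0.5 gain and 25% offset are the values from the example above, which assume the LC reaches only 50% of a full swing within a frame):

```python
# Hedged sketch: gain/offset remapping so the temporal average still lands at 50%.
def compensate(value_pct, gain=0.5, offset_pct=25.0):
    """Remap a nominal drive value (in percent) per the gain/offset example above."""
    return value_pct * gain + offset_pct

nominal = [100.0, 0.0]                        # black-to-white edge: 100% one frame, 0% the next
driven = [compensate(v) for v in nominal]     # -> [75.0, 25.0]
print(driven, sum(driven) / len(driven))      # average 50.0, the desired value
```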


Temporal Patterns With Arbitrary Numbers of Frames


An alternative embodiment is now described that uses multiple numbers of frames to achieve the desired temporal averaging. FIGS. 11A and 11B depict a pulse train optimized for a certain liquid crystal performance, such as depicted in FIG. 10A (e.g. a slower rise time than fall time). FIGS. 11C and 11D depict a pulse train optimized for a liquid crystal having a performance curve in which the rise time and fall times are more equal.



FIG. 11A shows a pulse train in which the voltage applied to the red and blue subpixels is 100% for two frames and 0% for one frame. FIG. 11B is the resulting brightness. FIG. 11C shows a pulse train in which the voltage applied to the red and blue subpixels is 100% for three frames and 0% for three frames. FIG. 11D is the resulting brightness. As can be seen in both FIGS. 11B and 11D, the liquid crystal spends most of its time at either 100% or at 0%, with an average of about 50%.


With either FIG. 11B or 11D, however, there is a potential for flicker in the red and blue subpixels. This potential flicker can be reduced by varying the pulse train temporally or spatially. For example, the red and blue subpixels that are near each other on the panel can be driven with the same pulse train but at different phases from each other. Thus, the red and blue subpixels are effectively interlaced to reduce the temporal flicker effect. The same phased pulse trains could be applied to neighboring red subpixels themselves or blue subpixels themselves to achieve the same result. Additionally, the pulse trains could be designed to minimize observable flicker in other ways: (1) by keeping the flicker frequency as high as possible; and/or (2) by designing the pattern to have less energy in lower frequency flicker components and more energy in higher frequency components.
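The sketch below (illustrative assumptions only; the pattern, phase offsets, and frame counts are not prescribed by the patent) shows how neighboring red and blue subpixels might be driven from the same pulse train at different phases so that their flicker components tend to cancel:

```python
# Hedged sketch: phasing the same on/off pattern across neighboring subpixels.
pattern = [1, 1, 1, 0, 0, 0]            # e.g. 100% for three frames, 0% for three (as in FIG. 11C)

def phased(pattern, phase, n_frames):
    """Drive sequence for a subpixel whose copy of the pattern is shifted by 'phase' frames."""
    return [pattern[(f + phase) % len(pattern)] for f in range(n_frames)]

red  = phased(pattern, 0, 12)           # a red subpixel
blue = phased(pattern, 3, 12)           # a nearby blue subpixel, half a period out of phase

# Each subpixel still toggles (average 50%), but the pair's combined light output
# per frame is constant, so the low-frequency flicker component is reduced.
print([r + b for r, b in zip(red, blue)])   # -> [1, 1, 1, 1, ...]
```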


Other embodiments of suitable pulse trains to achieve substantially the same result can be designed to match any given liquid crystal performance curve. For example, if the liquid crystal has a fast rise time and a slow fall time, then an appropriate pulse train may be 0% for frame 1, 100% for frames 2 and 3, and then repeat.


In general, by using an arbitrary number of frames in an on/off pattern period, one can design pulse trains or patterns of ON's and OFF's that ultimately give the correct average pixel luminance. As discussed, separate patterns can be applied to each color. This technique may have lower temporal resolution, but, judiciously applied to static images, it can realize the correct amount of emitted light from a particular pixel. The technique may also be applied in the case of scrolling text. Since the operator in general is not attempting to read the text while it is moving, any temporal distortion of the text due to the applied pattern will not negatively impact the user's experience. The patterns can be designed to provide color correction to scrolling text.


This embodiment avoids the necessity of employing a voltage offset from the zero value as used in FIG. 10D to realize arbitrary values of subpixel luminance, thereby avoiding viewing angle and color error problems introduced with non-zero values. By using only full ON and full OFF values, the performance should be similar to RGB stripe panel performance.


Another example of a suitable pulse train is as follows: consider a four frame pattern 1,1,1,0 (or some other arbitrary pattern) that is applied to the red and blue subpixels such that the flicker from each cancels the other, i.e. the red and blue subpixels are out of luminance phase. Green remains unmodulated in this example. Theoretically, the output luminance will be 75% of maximum for the red and blue subpixels. However, given the asymmetry of the ON and OFF response times, the response will be less than 75%, approaching 50% depending on the specific LC response time. The flicker frequency will be 15 Hz assuming a 60 Hz refresh rate, but the variations can be minimized by phasing the red and blue to cancel each other. The remaining flicker will be a fraction of the total light due to the proximity of a 100% green pixel, so the flicker effect will be attenuated.
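A quick check of the numbers in this example (a sketch; the attenuation toward 50% caused by finite LC response time is not modeled here):

```python
# Hedged sketch: nominal average luminance and flicker frequency of the
# four-frame 1,1,1,0 pattern at a 60 Hz refresh rate, ignoring LC response time.
pattern = [1, 1, 1, 0]
refresh_hz = 60.0

nominal_average = sum(pattern) / len(pattern)    # 0.75 -> 75% of maximum
flicker_hz = refresh_hz / len(pattern)           # 60 / 4 = 15 Hz pattern repetition rate
print(nominal_average, flicker_hz)
```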


Inversion Schemes For Effecting Temporal SPR


For LCDs which are polarity inverted to achieve zero DC voltage across the cell, there is an extra requirement when using temporal filtering. Usually the polarity is inverted every frame time, either row by row (row inversion), column by column (column inversion) or pixel by pixel (dot inversion). In the case of dot inversion, the polarity of the inversion either varies every row (1:1) or every two rows (1:2). The choice of inverting the polarity every frame is somewhat for convenience of the circuitry; polarity can be inverted every two frames without degrading the LC material. It may be desirable to invert every two frames when temporal dithering is employed so as not to apply extra DC to the cell along edges. This could occur in the case of inversion every frame because some pixels may be switching 1 0 1 0 . . . ; if the polarity is switching every frame, then the “1” state will always be driven at the same polarity.
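As a quick illustration of the DC concern described above (a minimal sketch under an idealized signed-drive model; it is not the patent's inversion circuitry):

```python
# Hedged sketch: net DC seen by a pixel whose data toggles 1,0,1,0,... under two
# inversion cadences.  Signed drive = data value x polarity; a nonzero mean
# indicates residual DC across the cell.
data = [1, 0] * 8                                          # temporally dithered pixel

pol_per_frame   = [(-1) ** f for f in range(16)]           # invert every frame: +,-,+,-,...
pol_per_2frames = [(-1) ** (f // 2) for f in range(16)]    # invert every two frames: +,+,-,-,...

dc_1 = sum(d * p for d, p in zip(data, pol_per_frame)) / len(data)
dc_2 = sum(d * p for d, p in zip(data, pol_per_2frames)) / len(data)

print(dc_1)   # 0.5: every "1" lands on the same polarity, leaving a DC component
print(dc_2)   # 0.0: two-frame inversion alternates the polarity of the "1" states
```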


Various Implementation Embodiments


One further embodiment for implementing a temporal SPR system is shown in FIG. 12. This embodiment assumes a panel comprising a subpixel repeat grouping as found in FIG. 5; however, it should be appreciated that suitable changes can be made to the present embodiment to accommodate other subpixel repeat groupings. FIG. 12 shows only the red image data; blue data would be treated similarly. As green data in the repeat grouping of FIG. 5 is mapped 1:1 from source image data, there is no need to temporally process the green data. Of course, with other subpixel repeat groupings, green data may be temporally processed as well.



FIG. 12 shows how the red data is mapped from a source image data plane 1202 to the panel data planes over frames 1204 and 1206, wherein the panel has the layout as described above. For example, RS11 maps to RP11 in Frame 1 (1204) whereas RS12 maps to RP11 in Frame 2 (1206). This mapping effectively averages the values of RS11 and RS12 (creating the equivalent of a spatial “box” filter) and outputs the result to RP11. Similarly, RS22 will be output to RP21 in Frame 1 and RS23 will be output to RP21 in Frame 2.
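A minimal sketch of this FIG. 12 style temporal mapping (which two source pixels feed a given panel subpixel depends on the layout; the pairing and helper name here are illustrative assumptions):

```python
# Hedged sketch: FIG. 12-style mapping.  A red panel subpixel that would
# conventionally receive the average of two neighboring red source pixels
# receives them alternately, frame by frame (e.g. RP11 <- RS11 in Frame 1 and
# RS12 in Frame 2), so the temporal average equals the spatial "box" average.
def temporal_red_value(rs_left, rs_right, frame):
    """Alternate two neighboring source values; their temporal average is (rs_left + rs_right) / 2."""
    return rs_left if frame % 2 == 0 else rs_right

rs11, rs12 = 200, 100                                            # hypothetical source values
values = [temporal_red_value(rs11, rs12, f) for f in range(2)]   # [200, 100]
print(values, sum(values) / 2)                                   # average 150.0, the box-filter result
```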


As may be seen, red source image data 1202 may be stored in the system or otherwise input into the system. This temporal averaging for red and blue data will result in the same visual appearance as an RGB stripe system; viewing angle and response time effects will be the same. It may also simplify the data processing for pictorial applications such as camera or TV applications. This one embodiment for remapping may work well for rendering text, but might lead to some inaccuracies in gray levels which can affect picture quality. Thus, yet another embodiment for remapping images, as shown in FIG. 13, is to first average the source pixels and then output the result to the panel. For example, RS11 and RS12 are averaged via function 1308 and outputted to RP11 in frame 1 (1304). Then RS12 and RS13 are averaged by function 1308 and outputted to RP11 in frame 2 (1306). It will be understood that function 1308 could be more than just the averaging of two pixels and could include a more complex subpixel rendering process of two or more input pixels. It will also be understood that the techniques described in FIGS. 12 and 13 apply equally to all display technologies, such as LCD, OLED, plasma, EL and other pixelated color displays. For OLED and plasma in particular, viewing angle and response time are not the issues that they are with LCD. Therefore, the primary purpose of using temporal SPR for these technologies is to simplify the SPR processing; e.g. gamma adjustment is not required.
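For contrast with the previous sketch, the following illustrates the FIG. 13 style mapping (again a sketch; function 1308 is shown here as a plain two-pixel average, although the description notes it could be a more complex subpixel rendering of two or more input pixels):

```python
# Hedged sketch: FIG. 13-style mapping.  The source pair is averaged first
# (function 1308), and the pair slides by one source pixel between frames,
# e.g. (RS11, RS12) -> RP11 in Frame 1 and (RS12, RS13) -> RP11 in Frame 2.
def averaged_red_value(rs_row, start_index, frame):
    """Average the source pair feeding a panel subpixel; the pair shifts right by one on alternate frames (assumption)."""
    i = start_index + (frame % 2)
    return (rs_row[i] + rs_row[i + 1]) / 2

rs_row = [200, 100, 50]                    # hypothetical RS11, RS12, RS13
print(averaged_red_value(rs_row, 0, 0))    # Frame 1: (RS11 + RS12) / 2 = 150.0
print(averaged_red_value(rs_row, 0, 1))    # Frame 2: (RS12 + RS13) / 2 = 75.0
```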


Use of Adaptive Filtering


Adaptive filtering can be applied to decide when to use the values directly or to average them. For edges, the R and B values are temporally averaged frame by frame, preserving the viewing angle. For non-edges, the adjacent values are first averaged and then outputted to the output subpixels. Averaging adjacent image data values for edges is not necessarily desirable because averaging would tend to blur the edge—thus making the transition less sharp. So, it may be desirable to detect where and when an edge is occurring in the image.


The averaging will make pictures slightly more accurate. Note that the averaging goes to the left pixel on odd frames and the right pixel on even frames. A typical algorithm is as follows (shown for red), with a code sketch following the listing:

    • Odd Field: IF ABS(RSn−RSn−1)>max THEN RPn=RSn−1 ELSE RPn=(RSn+RSn−1)/2
    • Even Field: IF ABS(RSn−RSn−1)>max THEN RPn=RSn ELSE RPn=(RSn+RSn+1)/2
    • where RS is a source pixel (e.g. RED), RP is a panel pixel, and “max” is chosen such that, with a good degree of probability, an edge is occurring at this point in the image.
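A minimal, hedged sketch of this rule in code (boundary handling and the threshold value are additions for the sketch, not specified by the patent; the even-field edge test follows the listing above as written):

```python
# Hedged sketch of the adaptive temporal filter above, for one row of red source
# pixels RS producing panel pixels RP.
def render_red_row(rs, odd_field, max_diff=64):
    rp = []
    for n in range(len(rs)):
        left = rs[max(n - 1, 0)]                 # RSn-1, clamped at the row start
        right = rs[min(n + 1, len(rs) - 1)]      # RSn+1, clamped at the row end
        edge = abs(rs[n] - left) > max_diff      # the ABS(RSn - RSn-1) > max test
        if odd_field:
            rp.append(left if edge else (rs[n] + left) / 2)
        else:
            rp.append(rs[n] if edge else (rs[n] + right) / 2)
    return rp

row = [0, 0, 255, 255, 128, 120]                 # hypothetical red source values
print(render_red_row(row, odd_field=True))
print(render_red_row(row, odd_field=False))
```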

Claims
  • 1. A method for rendering image data on a display panel comprising at least a first set of subpixels of a first color and a second set of subpixels of a second color, the first color having a first brightness versus viewing angle performance measure and the second color having a second brightness versus viewing angle performance measure so as to introduce color error when an image is viewed on the display panel at an off-normal viewing angle, the method comprising: receiving image data indicating an image and comprising image data values in the first and second colors; determining a mapping from the image data values in the first and second colors in a single image data plane to the first set of subpixels and the second set of subpixels in a plurality of frames to be displayed on the display panel such that one of the first set of subpixels is substantially equal to an average of two neighboring image data values and one of the second set of subpixels is substantially equal to an average of two neighboring image data values and such that an average of a first image data value and a second image data value adjacent to the first image data value maps to a first subpixel in a first frame and an average of the second image data value and a third image data value adjacent to the second image data value maps to the first subpixel in a second frame; and outputting to the display a pulse train of on/off patterns for the first set of subpixels and the second set of subpixels over a predetermined number of frames to produce a desired average subpixel luminance.
  • 2. The method of claim 1 wherein determining a mapping further comprises: determining a first-color viewing angle fall-off value at each of a plurality of off-normal viewing angles using the first brightness versus viewing angle performance measure of the first color; determining a second-color viewing angle fall-off value at each of a plurality of off-normal viewing angles using the second brightness versus viewing angle performance measure of the second color; and determining the mapping of the image data values for the first set of subpixels and the second set of subpixels such that the first-color viewing angle fall-off values and the second-color viewing angle fall-off values are proportionally matched between two preselected viewing angles.
  • 3. The method of claim 1 wherein the display panel further comprises a third set of subpixels of a third color.
  • 4. The method of claim 1 wherein the display panel further comprises a fourth set of subpixels of a fourth color.
  • 5. The method of claim 1 wherein the display panel is a liquid crystal display panel.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a divisional of U.S. Nonprovisional application Ser. No. 10/379,767 entitled SYSTEMS AND METHODS FOR TEMPORAL SUBPIXEL RENDERING OF IMAGE DATA, filed on Mar. 4, 2003, which is incorporated by reference herein. The present application is related to commonly owned U.S. patent applications: (1) U.S. patent application Ser. No. 10/379,766 entitled “SUB-PIXEL RENDERING SYSTEM AND METHOD FOR IMPROVED DISPLAY VIEWING ANGLES,” published as U.S. Patent Application Publication 2004/0174375; and (2) U.S. patent application Ser. No. 10/379,765, entitled “SYSTEMS AND METHODS FOR MOTION ADAPTIVE FILTERING,” published as U.S. Patent Application Publication 2004/0174380. U.S. Patent Application Publications 2004/0174375 and 2004/0174380 are hereby incorporated by reference herein.

US Referenced Citations (197)
Number Name Date Kind
3971065 Bayer Jul 1976 A
4353062 Lorteije et al. Oct 1982 A
4439759 Fleming et al. Mar 1984 A
4593978 Mourey et al. Jun 1986 A
4642619 Togashi Feb 1987 A
4651148 Takeda et al. Mar 1987 A
4751535 Myers Jun 1988 A
4773737 Yokono et al. Sep 1988 A
4786964 Plummer et al. Nov 1988 A
4792728 Chang et al. Dec 1988 A
4800375 Silverstein et al. Jan 1989 A
4853592 Stratham Aug 1989 A
4874986 Menn et al. Oct 1989 A
4886343 Johnson Dec 1989 A
4908609 Stroomer Mar 1990 A
4920409 Yamagishi Apr 1990 A
4946259 Matino et al. Aug 1990 A
4965565 Noguchi Oct 1990 A
4966441 Conner Oct 1990 A
4967264 Parulski et al. Oct 1990 A
5006840 Hamada et al. Apr 1991 A
5010413 Bahr Apr 1991 A
5052785 Takimoto et al. Oct 1991 A
5062057 Blacken et al. Oct 1991 A
5113274 Takahashi et al. May 1992 A
5132674 Bottorf Jul 1992 A
5144288 Hamada et al. Sep 1992 A
5184114 Brown Feb 1993 A
5189404 Masimo et al. Feb 1993 A
5196924 Lumelsky et al. Mar 1993 A
5233385 Sampsell Aug 1993 A
5311337 McCartney, Jr. May 1994 A
5315418 Sprague et al. May 1994 A
5334996 Tanigaki et al. Aug 1994 A
5341153 Benzschawel et al. Aug 1994 A
5398066 Martinez-Uriegas et al. Mar 1995 A
5416890 Beretta May 1995 A
5436747 Suzuki Jul 1995 A
5438649 Ruetz Aug 1995 A
5448652 Vaidyanathan et al. Sep 1995 A
5450216 Kasson Sep 1995 A
5461503 Deffontaines et al. Oct 1995 A
5477240 Huebner et al. Dec 1995 A
5485293 Robinder Jan 1996 A
5535028 Bae et al. Jul 1996 A
5541653 Peters et al. Jul 1996 A
5561460 Katoh et al. Oct 1996 A
5563621 Silsby Oct 1996 A
5579027 Sakurai et al. Nov 1996 A
5638128 Hoogenboom et al. Jun 1997 A
5642176 Abukawa et al. Jun 1997 A
5646702 Akinwande et al. Jul 1997 A
5648793 Chen Jul 1997 A
5694186 Yanagawa et al. Dec 1997 A
5719639 Imamura Feb 1998 A
5724442 Ogatsu et al. Mar 1998 A
5731818 Wan et al. Mar 1998 A
5739802 Mosier Apr 1998 A
5748828 Steiner et al. May 1998 A
5754163 Kwon May 1998 A
5754226 Yamada et al. May 1998 A
5792579 Phillips Aug 1998 A
5815101 Fonte Sep 1998 A
5821913 Mamiya Oct 1998 A
5899550 Masaki May 1999 A
5917556 Katayama Jun 1999 A
5929843 Tanioka Jul 1999 A
5933253 Ito et al. Aug 1999 A
5949496 Kim Sep 1999 A
5973664 Badger Oct 1999 A
5990997 Jones et al. Nov 1999 A
6002446 Eglit Dec 1999 A
6008868 Silverbrook Dec 1999 A
6034666 Kanai et al. Mar 2000 A
6038031 Murphy Mar 2000 A
6049626 Kim Apr 2000 A
6054832 Kunzman et al. Apr 2000 A
6061533 Kajiwara May 2000 A
6064363 Kwon May 2000 A
6069670 Borer May 2000 A
6097367 Kuriwaki et al. Aug 2000 A
6100872 Aratani et al. Aug 2000 A
6108122 Ulrich et al. Aug 2000 A
6144352 Matsuda et al. Nov 2000 A
6151001 Anderson et al. Nov 2000 A
6160535 Park Dec 2000 A
6184903 Omori Feb 2001 B1
6188385 Hill et al. Feb 2001 B1
6198507 Ishigami Mar 2001 B1
6219025 Hill et al. Apr 2001 B1
6225967 Hebiguchi May 2001 B1
6225973 Hill et al. May 2001 B1
6236390 Hitchcock May 2001 B1
6239783 Hill et al. May 2001 B1
6243055 Fergason Jun 2001 B1
6243070 Hill et al. Jun 2001 B1
6256425 Kunzman Jul 2001 B1
6262710 Smith Jul 2001 B1
6271891 Ogawa et al. Aug 2001 B1
6278434 Hill et al. Aug 2001 B1
6297826 Semba et al. Oct 2001 B1
6299329 Mui et al. Oct 2001 B1
6326981 Mori et al. Dec 2001 B1
6327008 Fujiyoshi Dec 2001 B1
6335719 An et al. Jan 2002 B1
6346972 Kim Feb 2002 B1
6348929 Acharya et al. Feb 2002 B1
6360008 Suzuki et al. Mar 2002 B1
6360023 Betrisey et al. Mar 2002 B1
6377262 Hitchcock et al. Apr 2002 B1
6384836 Naylor, Jr. et al. May 2002 B1
6388644 De Zwart et al. May 2002 B1
6392717 Kunzman May 2002 B1
6393145 Betrisey et al. May 2002 B2
6396505 Lui et al. May 2002 B1
6414719 Parikh Jul 2002 B1
6417867 Hallberg Jul 2002 B1
6429867 Deering Aug 2002 B1
6441867 Daly Aug 2002 B1
6453067 Morgan et al. Sep 2002 B1
6466618 Messing et al. Oct 2002 B1
6483518 Perry et al. Nov 2002 B1
6545653 Takahara et al. Apr 2003 B1
6545740 Werner Apr 2003 B2
6570584 Cok et al. May 2003 B1
6600495 Boland et al. Jul 2003 B1
6624828 Dresevic et al. Sep 2003 B1
6661429 Phan Dec 2003 B1
6674436 Dresevic et al. Jan 2004 B1
6714206 Martin et al. Mar 2004 B1
6714212 Tsuboyama et al. Mar 2004 B1
6738526 Betrisey et al. May 2004 B1
6750875 Keely, Jr. et al. Jun 2004 B1
6781626 Wang Aug 2004 B1
6801220 Greier et al. Oct 2004 B2
6804407 Weldy Oct 2004 B2
6836300 Choo et al. Dec 2004 B2
6850294 Roh et al. Feb 2005 B2
6856704 Gallagher et al. Feb 2005 B1
6867549 Cok et al. Mar 2005 B2
6885380 Primerano et al. Apr 2005 B1
6888604 Rho et al. May 2005 B2
6897876 Murdoch et al. May 2005 B2
6903378 Cok Jun 2005 B2
6917368 Credelle et al. Jul 2005 B2
20010003446 Takafuji Jun 2001 A1
20010017515 Kusunoki et al. Aug 2001 A1
20010040645 Yamazaki Nov 2001 A1
20010048764 Betrisey et al. Dec 2001 A1
20020012071 Sun Jan 2002 A1
20020017645 Yamazaki et al. Feb 2002 A1
20020093476 Hill et al. Jul 2002 A1
20020122160 Kunzman Sep 2002 A1
20020140831 Hayashi Oct 2002 A1
20020149598 Greier et al. Oct 2002 A1
20020190648 Bechtel et al. Dec 2002 A1
20030011603 Koyama et al. Jan 2003 A1
20030034992 Brown Elliott et al. Feb 2003 A1
20030071775 Ohashi et al. Apr 2003 A1
20030071943 Choo et al. Apr 2003 A1
20030072374 Sohm Apr 2003 A1
20030103058 Elliott et al. Jun 2003 A1
20030146893 Sawabe Aug 2003 A1
20030218618 Phan Nov 2003 A1
20040008208 Dresevic et al. Jan 2004 A1
20040021804 Hong et al. Feb 2004 A1
20040036704 Han et al. Feb 2004 A1
20040075764 Law et al. Apr 2004 A1
20040085495 Roh et al. May 2004 A1
20040095521 Song et al. May 2004 A1
20040108818 Cok et al. Jun 2004 A1
20040114046 Lee et al. Jun 2004 A1
20040150651 Phan Aug 2004 A1
20040155895 Lai Aug 2004 A1
20040169807 Rho et al. Sep 2004 A1
20040174375 Credelle et al. Sep 2004 A1
20040174380 Credelle et al. Sep 2004 A1
20040179160 Rhee et al. Sep 2004 A1
20040189662 Frisken et al. Sep 2004 A1
20040189664 Frisken et al. Sep 2004 A1
20040233339 Elliott Nov 2004 A1
20040239813 Klompenhouwer Dec 2004 A1
20040239837 Hong et al. Dec 2004 A1
20040263528 Murdoch et al. Dec 2004 A1
20050007539 Taguchi et al. Jan 2005 A1
20050024380 Lin et al. Feb 2005 A1
20050031199 Ben-Chorin et al. Feb 2005 A1
20050040760 Taguchi et al. Feb 2005 A1
20050068477 Shin et al. Mar 2005 A1
20050083356 Roh et al. Apr 2005 A1
20050088385 Elliott et al. Apr 2005 A1
20050094871 Berns et al. May 2005 A1
20050099426 Primerano et al. May 2005 A1
20050134600 Credelle et al. Jun 2005 A1
20050140634 Takatori Jun 2005 A1
20050162600 Rho et al. Jul 2005 A1
20050169551 Messing et al. Aug 2005 A1
Foreign Referenced Citations (43)
Number Date Country
299 09 537 Oct 1999 DE
199 23 527 Nov 2000 DE
201 09 354 Sep 2001 DE
0 158 366 Oct 1985 EP
0 203 005 Nov 1986 EP
0 322 106 Jun 1989 EP
0 671 650 Sep 1995 EP
0 793 214 Feb 1996 EP
0 812 114 Dec 1997 EP
0 878 969 Nov 1998 EP
0 899 604 Mar 1999 EP
1 083 539 Mar 2001 EP
1 261 014 Nov 2002 EP
1 381 020 Jan 2004 EP
2 133 912 Aug 1984 GB
2 146 478 Apr 1985 GB
60-107022 Jun 1985 JP
02-000826 Jan 1990 JP
02-983027 Apr 1991 JP
03-78390 Apr 1991 JP
03-036239 May 1991 JP
06-102503 Apr 1994 JP
06-214250 Aug 1994 JP
2001203919 Jul 2001 JP
2002215082 Jul 2002 JP
2004-004822 Jan 2004 JP
WO 9723860 Jul 1997 WO
WO 0021067 Apr 2000 WO
WO 0042564 Jul 2000 WO
WO 0042762 Jul 2000 WO
WO 0045365 Aug 2000 WO
WO 0065432 Nov 2000 WO
WO 0067196 Nov 2000 WO
WO 0110112 Feb 2001 WO
WO 0129817 Apr 2001 WO
WO 0152546 Jul 2001 WO
WO 02059685 Aug 2002 WO
WO 03014819 Feb 2003 WO
WO 2004021323 Mar 2004 WO
WO 2004027503 Apr 2004 WO
WO 2004040548 May 2004 WO
WO 2004086128 Oct 2004 WO
WO 2005050296 Jun 2005 WO
Non-Patent Literature Citations (41)
Entry
Adobe Systems, Inc. website http://www.adobe.com/products/acrobat/cooltype.html.
Betrisey, C., et al., Displaced Filtering for Patterned Displays, SID Symp. Digest, 296-299, 1999.
Brown Elliott, C., “Active Matrix Display . . . ”, IDMC 2000, 185-189, Aug. 2000.
Brown Elliott, C., “Color Subpixel Rendering Projectors and Flat Panel Displays,” SMPTE, Feb. 27-Mar. 1, 2003, Seattle, WA pp. 1-4.
Brown Elliott, C, “Co-Optimization of Color AMLCD Subpixel Architecture and Rendering Algorithms,” SID 2002 Proceedings Paper, May 30, 2002 pp. 172-175.
Brown Elliott, C, “Development of the PenTile Matrix™ Color AMLCD Subpixel Architecture and Rendering Algorithms”, SID 2003, Journal Article.
Brown Elliott, C, “New Pixel Layout for PenTile Matrix™ Architecture”, IDMC 2002, pp. 115-117.
Brown Elliott, C, "PenTile Matrix™ Displays and Drivers", ADEAC Proceedings Paper, Portland OR., Oct. 2005.
Brown Elliott, C, “Reducing Pixel Count Without Reducing Image Quality”, Information Display Dec. 1999, vol. 1, pp. 22-25.
Carvajal, D., “Big Publishers Looking Into Digital Books,” The NY Times, Apr. 3, 2000, Business/ Financial Desk.
“ClearType magnified”, Wired Magazine, Nov. 8, 1999, Microsoft Typography, article posted Nov. 8, 1999, last updated Jan. 27, 1999 1 page.
Credelle, Thomas, “P-00: MTF of High-Resolution PenTile Matrix Displays”, Eurodisplay 02 Digest, 2002 pp. 1-4.
Daly, Scott, "Analysis of Subtriad Addressing Algorithms by Visual System Models", SID Symp. Digest, Jun. 2001, pp. 1200-1203.
Feigenblatt, R.I., Full-color imaging on amplitude-quantized color mosaic displays, SPIE, 1989, pp. 199-204.
Feigenblatt, Ron, “Remarks on Microsoft ClearType™”, http://www.geocities.com/SiliconValley/Ridge/6664/ClearType.html, Dec. 5, 1998, Dec. 7, 1998, Dec. 12, 1999, Dec. 26, 1999, Dec. 30, 1999 and Jun. 19, 2000, 30 pages.
Gibson, S., “Sub-Pixel Rendering; How it works,” Gibson Research Corp., http://www.grc.com/ctwhat.html.
Johnston, Stuart, “An Easy Read: Microsoft's ClearType,” InformationWeek Online, Redmond WA, Nov. 23, 1998. 3 pages.
Johnston, Stuart, “Clarifying ClearType,” InformationWeek Online, Redmond WA, Jan. 4, 1999, 4 pages.
Just Outta Beta, Wired Magazine, Dec. 1999 Issue 7-12, 3 pages.
Klompenhouwer, Michiel, Subpixel Image Scaling for Color Matrix Displays, SID Symp. Digest, May 2002, pp. 176-179.
Krantz, John et al., Color Matrix Display Image Quality: The Effects of Luminance . . . SID 90 Digest, pp. 29-32.
Lee, Baek-woon et al., 40.5L: Late-News Paper: TFT-LCD with RGBW Color system, SID 03 Digest, 2003, pp. 1212-1215.
Markoff, John, Microsoft's Cleartype Sets Off Debate on Originality, NY Times, Dec. 7, 1998, 5 pages.
Martin, R., et al., “Detectability of Reduced Blue-Pixel Count in Projection Displays,” SID Symp. Digest, May 1993, pp. 606-609.
Messing, Dean et al., Improved Display Resolution of Subsampled Colour Images Using Subpixel Addressing, IEEE ICIP 2002, vol. 1, pp. 625-628.
Messing, Dean et al., Subpixel Rendering on Non-Striped Colour Matrix Displays, 2003 International Conf on Image Processing, Sep. 2003, Barcelona, Spain, 4 pages.
“Microsoft ClearType,” website, Mar. 26, 2003, 4 pages.
Microsoft Corp. website, http://www.microsoft.com/typography/cleartype, 7 pages.
Microsoft press release, Microsoft Research Announces Screen Display Breakthrough at COMDEX/Fall '98; . . . Nov. 15, 1998.
Murch, M., “Visual Perception Basics,” SID Seminar, 1987, Tektronix Inc, Beaverton Oregon.
Okumura et al., "A New Flicker-Reduction Drive Method for High Resolution LCTVs", SID Digest, pp. 551-554, 2001.
Platt, John, Optimal Filtering for Patterned Displays, IEEE Signal Processing Letters, 2000, 4 pages.
Wandell, Brian A., Stanford University, “Fundamentals of Vision: Behavior . . . ,” Jun. 12, 1994, Society for Information Display (SID) Short Course S-2, Fairmont Hotel, San Jose, California.
USPTO, Notice of Allowance, dated Dec. 15, 2004 in US Patent 6,917,368 (U.S. Appl. No. 10/379,766).
USPTO, Non-Final Office Action, dated Aug. 1, 2006 in US Patent Publication No. 2005/0134600 (U.S. Appl. No. 11/048,498).
USPTO, Non-Final Office Action, dated Oct. 26, 2004 in US Patent Publication No. 2004/0174380 (U.S. Appl. No. 10/379,765).
Clairvoyante Inc, Response to Non-Final Office Action, dated Jan. 24, 2005 in US Patent Publication No. 2004/0174380 (U.S. Appl. No. 10/379,765).
USPTO, Final Office Action, dated Jun. 2, 2005 in US Patent Publication No. 2004/0174380 (U.S. Appl. No. 10/379,765).
USPTO, Non-Final Office Action, dated Nov. 2, 2005 in US Patent Publication No. 2004/0174380 (U.S. Appl. No. 10/379,765).
Clairvoyante Inc, Response to Non-Final Office Action, dated Apr. 10, 2006 in US Patent Publication No. 2004/0174380 (U.S. Appl. No. 10/379,765).
USPTO, Notice of Allowance, dated Jul. 26, 2006 in US Patent Publication No. 2004/0174380 (U.S. Appl. No. 10/379,765).
Related Publications (1)
Number Date Country
20070052721 A1 Mar 2007 US
Continuations (1)
Number Date Country
Parent 10379767 Mar 2003 US
Child 11462979 US