Subpixel rendering for cathode ray tube devices

Information

  • Patent Application
  • Publication Number
    20040232844
  • Date Filed
    May 20, 2003
  • Date Published
    November 25, 2004
Abstract
A cathode ray tube (CRT) device is disclosed that increases image resolution. The CRT device includes a plurality of electron guns to produce a plurality of electron beams. A plurality of separate phosphor dots corresponding to separate colors are produced when impacted by the electron beams. The CRT device also includes steering electronics to guide the electron beams to the plurality of separate phosphor dots that form part of separate and shifted color planes.
Description


BACKGROUND

[0001] In commonly owned United States Patent Applications: (1) Ser. No. 09/916,232 (“the '232 application”), entitled “ARRANGEMENT OF COLOR PIXELS FOR FULL COLOR IMAGING DEVICES WITH SIMPLIFIED ADDRESSING,” filed Jul. 25, 2001; (2) Ser. No. 10/278,353 (“the '353 application”), entitled “IMPROVEMENTS TO COLOR FLAT PANEL DISPLAY SUB-PIXEL ARRANGEMENTS AND LAYOUTS FOR SUB-PIXEL RENDERING WITH INCREASED MODULATION TRANSFER FUNCTION RESPONSE,” filed Oct. 22, 2002; (3) Ser. No. 10/278,352 (“the '352 application”) entitled “IMPROVEMENTS TO COLOR FLAT PANEL DISPLAY SUB-PIXEL ARRANGEMENTS AND LAYOUTS FOR SUB-PIXEL RENDERING WITH SPLIT BLUE SUBPIXELS,” filed Oct. 22, 2002; (4) Ser. No. 10/243,094 (“the '094 application”), entitled “IMPROVED FOUR COLOR ARRANGEMENTS AND EMITTERS FOR SUBPIXEL RENDERING,” filed Sep. 13, 2002; (5) Ser. No. 10/278,328 (“the '328 application”), entitled “IMPROVEMENTS TO COLOR FLAT PANEL DISPLAY SUB-PIXEL ARRANGEMENTS AND LAYOUTS WITH REDUCED BLUE LUMINANCE WELL VISIBILITY,” filed Oct. 22, 2002; and (6) Ser. No. 10/278,393 (“the '393 application”), entitled “COLOR DISPLAY HAVING HORIZONTAL SUB-PIXEL ARRANGEMENTS AND LAYOUTS,” filed Oct. 22, 2002, novel subpixel arrangements are therein disclosed for improving the cost/performance curves for image display devices, and are herein incorporated by reference.


[0002] These improvements are particularly pronounced when coupled with subpixel rendering (SPR) systems and methods further disclosed in those applications and in commonly owned United States Patent Applications: (1) Ser. No. 10/051,612 (“the '612 application”), entitled “CONVERSION OF RGB PIXEL FORMAT DATA TO PENTILE MATRIX SUB-PIXEL DATA FORMAT,” filed Jan. 16, 2002; (2) Ser. No. 10/150,355 (“the '355 application”), entitled “METHODS AND SYSTEMS FOR SUB-PIXEL RENDERING WITH GAMMA ADJUSTMENT,” filed May 17, 2002; (3) Ser. No. 10/215,843 (“the '843 application”), entitled “METHODS AND SYSTEMS FOR SUB-PIXEL RENDERING WITH ADAPTIVE FILTERING,” filed May 17, 2002; and (4) Ser. No. ______ (“the ______ application”) entitled “IMAGE DATA SET WITH EMBEDDED PRE-SUBPIXEL RENDERED IMAGE”, filed Apr. 7, 2003.


[0003] Additionally, the present application is also related to commonly owned: (1) Ser. No. 10/047,995 (“the '995 application”) entitled “COLOR DISPLAY PIXEL ARRANGEMENTS AND ADDRESSING MEANS” filed Jan. 14, 2002; (2) Ser. No. ______ (“the ______ application”) entitled “IMPROVED PROJECTOR SYSTEMS” filed May 20, 2003; (3) Ser. No. ______ (“the ______ application”) entitled “IMPROVED IMAGE CAPTURE DEVICE AND CAMERA” filed May 20, 2003; and (4) Ser. No. ______ (“the ______ application”) entitled “IMPROVED PROJECTOR SYSTEMS WITH REDUCED FLICKER” filed May 20, 2003.


[0004] The above-referenced and commonly owned applications are hereby incorporated herein by reference.







BRIEF DESCRIPTION OF THE DRAWINGS

[0005] The accompanying drawings, which are incorporated in, and constitute a part of the specification, illustrate exemplary implementations and embodiments of the invention, and, together with the detailed description, serve to explain principles of the invention.


[0006]
FIG. 1 illustrates a side view of a prior art projector projecting images, in a frontal view, to a central point on an imaging screen.


[0007]
FIG. 2 illustrates a side view of a projector, projecting images, in a frontal view, to a central point on an imaging screen in which the three colors are offset from each other.


[0008]
FIG. 3A illustrates a side view of a prior art CRT projecting images to a central point on an imaging screen.


[0009]
FIG. 3B illustrates a portion of the phosphor screen of the prior art CRT illustrated in FIG. 3A, focusing Gaussian spots to a single point on an imaging screen.


[0010]
FIG. 4A illustrates a side view of a CRT projecting images to an imaging screen in which the three colors are offset from each other.


[0011]
FIG. 4B illustrates a portion of the CRT illustrated in FIG. 4A focusing Gaussian spots onto a phosphor screen in which the three color spots are offset by one-third pixel in the horizontal direction.


[0012]
FIG. 4C illustrates a portion of the CRT illustrated in FIG. 4A focusing Gaussian spots onto a phosphor screen in which the green color spots are offset in the diagonal direction.


[0013]
FIG. 5 illustrates a prior art arrangement of pixels for electronic information display projectors.


[0014]
FIGS. 6, 7, and 8 illustrate an arrangement of pixels for each of the colors green, red, and blue, respectively.


[0015]
FIG. 9 illustrates the arrangements of FIGS. 6, 7, and 8 overlaid on one another to show how a full color image is constructed.


[0016]
FIG. 10 illustrates the overlaid image of FIG. 9 with one full color logical pixel.


[0017]
FIGS. 11 and 12 illustrate the green and red image planes, respectively, with a single column of logical pixels turned on.


[0018]
FIG. 13 illustrates the red and green image planes of FIGS. 11 and 12 overlaid.


[0019]
FIGS. 14A-14B and 15A-15B illustrate the green and red image planes, respectively, with two columns of logical pixels turned on.


[0020]
FIGS. 16A and 16B illustrate the green and red image planes of FIGS. 14A-14B and 15A-15B overlaid, respectively.


[0021]
FIG. 17 illustrates two images of the pixel arrangement of FIG. 6 overlaid, offset by one-half pixel, to demonstrate how a single imaging plane can build up a higher resolution image using field sequential color, or how two imaging planes of a multi-panel display may be offset to build up a higher resolution image.


[0022]
FIG. 18 illustrates splitting of an image path into two different paths for different colors through an inclined plate made of a chromodispersive material.


[0023]
FIG. 19 illustrates a prior art arrangement of pixels.


[0024]
FIG. 20 illustrates an overlay of the arrangement of prior art FIG. 19 in which the two colors are offset by one-half pixel in the diagonal direction.


[0025]
FIG. 21 illustrates the overlaid arrangement of FIG. 20 with two color logical pixels at different addressable points.


[0026]
FIG. 22 illustrates the overlaid arrangement of FIG. 20 with an alternative color logical pixel and a column line of logical pixels.


[0027]
FIG. 23 illustrates an overlay of FIG. 8 for three colors in which the colors are offset by one-third pixel each, with one full color logical pixel turned on.


[0028]
FIG. 24A is a chart showing the chromaticity coordinates of the emitters of a prior art three color display.


[0029]
FIG. 24B is a chart showing the chromaticity coordinates of the emitters of an improved three color display, compared to the chromaticity coordinates of FIG. 24A.


[0030]
FIG. 25 is a chart showing the chromaticity coordinates of the emitters of a novel four color display.


[0031]
FIG. 26 is a chart showing the chromaticity coordinates of the emitters of a novel five color display.


[0032]
FIG. 27 illustrates the reconstruction points of the prior art display of FIG. 19 overlaid on the appearance of the display.


[0033]
FIG. 28 illustrates the reconstruction points of the novel display shown in FIG. 20.


[0034]
FIG. 29 illustrates the arrangement of emitters and reconstruction points of a novel twinned projector arrangement with coincident color planes.


[0035]
FIG. 30 illustrates the arrangement of emitters and reconstruction points of another novel twinned projector arrangement with displaced color planes.


[0036]
FIGS. 31A and 31B illustrate a prior art arrangement of a multi-sensor chip camera in which all of the color plane sample areas are coincident, sampling an image and the resulting data set respectively.


[0037]
FIGS. 32A and 32B illustrate a novel arrangement of a multi-sensor chip camera in which two of the color plane sample areas are displaced, sampling an image and the resulting data set respectively.


[0038]
FIGS. 33A and 33B illustrate a novel arrangement of a multi-sensor chip camera in which three of the color plane sample areas are displaced, sampling an image and the resulting data set respectively.


[0039]
FIG. 33C illustrates displaying the processed image of FIG. 33B on a higher resolution, conventional prior art display.


[0040]
FIGS. 34A and 34B illustrate a novel color filter array arrangement for a two-chip color camera: one chip with a red/green checkerboard and the other a lower resolution sensor for imaging the blue image component, respectively.


[0041]
FIG. 35A illustrates a novel display arrangement of the color planes on a display.


[0042]
FIGS. 35B, 35C, and 35D illustrate the color planes overlaid on each other to create a full color image as shown in FIG. 35A.


[0043]
FIGS. 36A and 36B illustrate how moiré distortion is eliminated by the arrangement of FIG. 35A.


[0044]
FIG. 37 illustrates a prior art color wheel filter of three colors.


[0045]
FIGS. 38A, 38B, 38C, and 38D illustrate novel color wheel filters of three colors.


[0046]
FIGS. 39A and 39B illustrate novel color wheel filters of three colors and black.


[0047]
FIG. 40 illustrates a novel color wheel filter of four colors, one of which is white.


[0048]
FIGS. 41, 42A, 42B, 42C, and 42D illustrate a spatial light modulator and a method of reducing data bandwidth and image size while maintaining image quality using spatio-temporally displaced filtering and reconstruction.







DETAILED DESCRIPTION

[0049] Reference will now be made in detail to exemplary implementations and embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. Furthermore, the following description is illustrative only and not in any way intended to be limiting.


[0050] Prior art projectors typically overlap the three color images (e.g., RGB) exactly coincidentally, with the same spatial resolution. As taught in the '995 application, the color imaging planes are instead overlaid upon each other with an offset of about one-half pixel. By offsetting the color imaging planes, the addressability of the system is increased, creating an electronic image capture, processing, and display system that produces higher resolution images.



Cathode Ray Tube Displays, Projector Displays, and Subtractive Flat Panel Displays

[0051]
FIG. 1 is a schematic of a prior art projector 100 having a light beam 102 that projects red (R), blue (B), and green (G) images 106 onto an imaging (or projection) screen 104. Prior art practices converge the red, the blue, and the green images to a point 110 on the projection screen 104. In contrast, FIG. 2 illustrates a schematic of a projector 200 having a light beam 202 that projects red 206, blue 208, and green 210 images through an optical element (or lens) 204 onto an imaging (or projection) screen 212. As illustrated in this example, such a projector will separate and differentially shift the red, green, and blue images. Thus, the image is again formed, but the image is shifted optically to separate the red, blue, and green color planes by about one-half pixel.


[0052] A similar procedure is used with a Cathode Ray Tube (CRT) video display, as illustrated in prior art FIG. 3A. An electron gun 300 projects an electron beam 302 inside the CRT 304 onto a phosphor surface 306 with an array of color primary emitting phosphor dots. Prior art practices converge the red, the blue, and the green image pixels to a circular Gaussian spot 308 on the phosphor surface 306. The CRT 304 can direct the electron beam 302 towards the phosphor surface 306 electrostatically or magnetically. FIG. 3B illustrates a portion of the phosphor screen 306 in which the CRT focuses the Gaussian spot 308 to a single point on the phosphor screen 306.


[0053] In contrast, FIG. 4A shows a diagrammatic illustration of a CRT video display having electron guns 400 that project electron beams 402 inside the CRT 404 onto a phosphor surface 406. As illustrated in this example, CRT 404 will separate and differentially shift the red 416, green 420, and blue 418 images 408. This can be accomplished by misconverging the electron beams with steering electronics, such as yoke coils or electrostatic deflection plates, or by appropriately displacing the electron guns. Thus, the image is again formed, but the image is shifted to separate the red 416, blue 418, and green 420 color planes by about one-third pixel, or by shifting just the green 420 color plane by one-half pixel. FIG. 4B illustrates the portion of the phosphor screen in which the CRT focuses so that the pixel color spot 408 consists of red 416, green 420, and blue 418 spots that are offset by one-third pixel in the horizontal direction. This modification allows CRTs so adjusted to use the very same subpixel rendering techniques utilized in the art on conventional RGB stripe architecture liquid crystal display (LCD) panels. FIG. 4C illustrates a portion of the phosphor screen 406 in which the CRT focuses the Gaussian spots so that the green 420 color spot is offset by one-half pixel in the diagonal direction from the converged red 416 and blue 418 spots.


[0054] Subpixel rendering can also be supported on conventional CRTs without major modification to the CRT. Instead, the timing of the data going to the CRT is modified. This could be accomplished by a modification of the video graphics card on a computer.


[0055] In one embodiment, one dimensional subpixel rendering could be supported. For example, the red data would lead, the green would be delayed by one third (⅓) of a pixel clock, and the blue by two thirds (⅔) of a pixel clock. This can be accomplished by using a “subpixel clock” (shown schematically as element 422 in FIG. 4A) at three times the usual pixel clock for the data D/A converters. The result will be that a single “pixel” will paint displaced red 416, green 420, and blue 418 spots as shown in FIG. 4B. This modification could be made to the video graphics card and would make a CRT look like an RGB stripe LCD, compatible with commercial subpixel rendered text such as that disclosed in Hill, et al., U.S. Pat. No. 6,188,385. It might be advantageous to use a system to turn the new mode on and off, either globally or locally, by detecting the presence of the subpixel rendered text using a suitable method, as disclosed in the ______ application noted above.
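
The following sketch (hypothetical Python, not part of the patent; the function name and data layout are illustrative assumptions) shows one way the staggered D/A timing described above might be modeled: each channel's value is held for a full pixel period, but the red, green, and blue streams start on successive ticks of the 3x subpixel clock, so a single logical pixel paints spots displaced by one-third pixel.

```python
"""Minimal sketch (hypothetical helper, not from the patent) of the 1-D timing
scheme of paragraph [0055]: a "subpixel clock" runs at three times the pixel
clock, the red data leads, green is delayed by 1/3 of a pixel clock, and blue
by 2/3 of a pixel clock, so one logical pixel paints displaced R, G, B spots."""

def delayed_dac_streams(red, green, blue, ticks_per_pixel=3):
    # Each channel's value is held for one full pixel (ticks_per_pixel ticks),
    # but the channels start on successive subpixel-clock ticks.
    n_ticks = len(red) * ticks_per_pixel + (ticks_per_pixel - 1)
    streams = {"R": [0.0] * n_ticks, "G": [0.0] * n_ticks, "B": [0.0] * n_ticks}
    delays = {"R": 0, "G": 1, "B": 2}          # 0, 1/3, and 2/3 of a pixel clock
    for name, data in (("R", red), ("G", green), ("B", blue)):
        for i, value in enumerate(data):
            start = i * ticks_per_pixel + delays[name]
            for t in range(start, start + ticks_per_pixel):
                streams[name][t] = value       # value held for a whole pixel
    return streams

# A single white pixel in a black field: the R, G, and B drive pulses come out
# staggered by one subpixel tick each, i.e. displaced by 1/3 pixel on screen.
print(delayed_dac_streams([0, 1, 0], [0, 1, 0], [0, 1, 0]))
```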


[0056] It is also possible to simulate a two-dimensionally subpixelated flat panel display. For example, the timing of the color data could be switched every row. The odd rows will have the red data lead with the green data delayed by one half (½) of a pixel clock. On the even rows, the green data will lead while the red data is delayed by one half (½) pixel clock. The blue data is always delayed by one third (⅓) of a pixel clock. The pixel clock is half the frequency of a “normal” pixel clock.
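
As a compact illustration of this row-alternating scheme, the sketch below (assumed helper name; whether row numbering starts at zero or one is an arbitrary choice here) returns the per-channel delays, in fractions of a pixel clock, for a given row.

```python
"""Sketch (assumed naming) of the row-alternating timing of paragraph [0056]:
odd rows lead with red and delay green by 1/2 pixel clock, even rows lead with
green and delay red by 1/2 pixel clock; blue is always delayed by 1/3 pixel
clock. Delays are returned in fractions of a pixel clock."""

def channel_delays(row_index):
    # Per-channel delays, in pixel-clock periods, for the given row.
    blue_delay = 1.0 / 3.0                        # blue always lags by 1/3
    if row_index % 2 == 1:                        # odd rows: red leads
        return {"R": 0.0, "G": 0.5, "B": blue_delay}
    return {"R": 0.5, "G": 0.0, "B": blue_delay}  # even rows: green leads

for row in range(4):
    print(row, channel_delays(row))
```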


[0057] The above system will allow presubpixel rendered images to be displayed on the CRT with minimal processing. Further, the CRT can support higher resolution than ordinarily possible by doubling the number of rows and doubling the horizontal frequency, while using the same bandwidth amplifiers, cables, and memory.


[0058] Contrary to prior art projectors, subtractive flat panels, or CRT displays, which are not subpixelated, the projectors, subtractive flat panel displays, and CRT displays discussed herein are subpixelated and may thus take advantage of subpixel rendering techniques.


[0059] Multi-image plane color projectors often use a single white light source that is broken into narrower spectral regions and separate beam paths through the use of dichroic beam splitting filters. The separated colors illuminate separate spatial light modulators. The modulated light is brought back together and focused onto an imaging screen to be viewed as a full color image.


[0060]
FIG. 5 illustrates a prior art arrangement 510 of square pixels 512, in this example forming an array of 12×8 pixels. For prior art projection or subtraction displays, three planes of 12×8 pixels would be overlaid to create a set of 12×8 logical pixels. This is a total of ninety-six (96) pixels comprising two-hundred-eighty-eight (288) color elements.


[0061]
FIGS. 6, 7, and 8 illustrate an arrangement of pixel images for each of the colors green, red, and blue, respectively, for projectors. The same FIGS. 6, 7, and 8 are also illustrations of an arrangement of subpixels for each of the colors magenta, cyan, and yellow, respectively, for subtractive color flat panel displays. Magenta is equivalent to subtracting green from white, cyan to subtracting red from white, and yellow to subtracting blue from white. For example, a multispectral light source is illuminated, illuminating panels of magenta, cyan, and yellow that are offset from one another in x and y by substantially less than 100%. In the following discussions regarding the theory of operation of the arrangement of subpixel elements, the additive color projector is used as an example. However, for a subtractive flat panel display, the same theory of operation applies if one applies the additive-to-subtractive color transforms well known in the art.
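
For reference, the additive-to-subtractive transform invoked above is, in its simplest idealized form, just the complement of each channel; the snippet below is a minimal sketch of that idea and ignores the gamma and crosstalk corrections a real panel would need.

```python
"""Minimal sketch of the additive-to-subtractive color transform mentioned in
paragraph [0061], using the simple complement model (values normalized 0..1)."""

def rgb_to_cmy(r, g, b):
    # Cyan passes everything but red, magenta everything but green,
    # yellow everything but blue.
    return 1.0 - r, 1.0 - g, 1.0 - b

print(rgb_to_cmy(1.0, 0.0, 0.0))   # pure red -> (0.0, 1.0, 1.0): no cyan, full magenta and yellow
```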


[0062]
FIG. 9 illustrates the resulting multipixel image 20 of overlaying the images 14, 16, and 18 of FIGS. 6, 7, and 8, respectively, for a three-color plane projector or subtractive flat panel display. The resulting multipixel image 20 of FIG. 9 has the same number of logical pixels 24 as illustrated in FIG. 10 and the same addressability and MTF as the image formed by the arrangement of prior art FIG. 5. However, the same image quality is achieved with only one-hundred-twenty-three (123) color elements, less than half of the number required by the prior art arrangement illustrated in FIG. 5. As the costs increase with the number of elements, the reduction in the number of elements offers the same image quality at a significantly lower cost, significantly higher image quality at the same cost, or a higher image quality at lower cost, when compared to the prior art arrangement illustrated in FIG. 5.


[0063] In each of the imaging devices discussed above, the beams (or panels) are convergent by substantially less than about 100%, with less than about 75% preferred, and with about 50% more preferred.


[0064] One advantage of the three-color plane array disclosed here is improved resolution of color displays. This occurs since only the red and green pixels (or emitters) contribute significantly to the perception of high resolution in the luminance channel. Offsetting the pixels allows higher perceived resolution in the luminance channel. The blue pixel can be reduced without affecting the perceived resolution. Thus, reducing the number of blue pixels reduces costs by more closely matching human vision.


[0065] The multipixel image 22 of FIG. 10 illustrates a logical pixel 24 with a central pixel 26 set at 50% of the input value associated with that logical pixel 24. Surrounding and overlapping this central pixel 26 are four pixels 28 of the opposite color of the red/green opposition channel (in this case red), each set at 12.5% of the input value associated with that logical pixel 24. Partially overlapping and offset is a blue pixel 30, which is set at about 25% of the input value associated with that logical pixel 24.


[0066] The logical pixel 24 of FIG. 10 illustrates that the central area defined by the central pixel 26 is the brightest area, at 31.25%, while the surrounding area, defined by the surrounding pixels 28 of the “opposite” color (not overlapping with the central pixel 26), remains at 6.25% brightness. This approximates a Gaussian spot, similar to those formed by the electron gun spot of a CRT.


[0067] Images 52 and 68 are built up by overlapping logical pixels as shown in FIGS. 13 and 16, respectively. For ease of illustration, the blue plane in each figure has not been shown. The arrangement of the pixels of each color plane 14, 16, and 18 illustrated in FIGS. 6, 7, and 8, respectively, is essentially identical to some of the effective sample area arrangements found in many of the above-referenced applications that are incorporated by reference. Further, the arrangement of the present embodiment uses the same reconstruction points as the pixel arrangements disclosed in the above-referenced applications.


[0068] For projected image or subtractive color flat panel displays, the present application discloses using the same subpixel rendering techniques and human vision optimized image reconstruction layout. However, a smoother image construction is created in the present application due to the overlapping nature of the pixels. For an example of a multipixel image 52 having the smoother image construction, FIG. 13 illustrates a vertical line 54 comprising the green component image 40 and the red component image 50 of FIGS. 11 and 12, respectively. As illustrated in the multipixel image 40 in FIG. 11, a vertical line 41 comprises central green pixels 42 and outer green pixels 44. As illustrated in the multipixel image 50 in FIG. 12, a vertical line 51 comprises central red pixels 46 and outer red pixels 48. For clarity, the blue color plane is not shown in FIG. 13. This example assumes that the vertical line 54 is displayed at about 100% of the input value and is surrounded on both sides by a field at 0% of the input value.


[0069]
FIG. 13 illustrates that the central red pixels 46 of the vertical line are offset from the central green pixels 42 when superimposed onto each other. These central pixels 42 and 46 are each set at 75%. The outer pixels 44 and 48 are each set at 12.5%. The areas of overlap of the central pixels 42 and 46 form a central series of smaller diamonds 56 that are at 75% brightness. The overlap of pixels 44 and pixels 46, and the overlap of pixels 48 and pixels 42, respectively, form two series, just outside of said central series, of smaller diamonds 58 that are at 43.75% brightness. The overlap of the outer pixels 44 and 48 forms two series of smaller diamonds 60 that are at 12.5% brightness. The areas of the outer pixels 44 and 48 that do not overlap form an outermost series of smaller diamonds 62 that are at 6.25% brightness. This series of brightness levels, 6.25%, 12.5%, 43.75%, 75%, 43.75%, 12.5%, and 6.25%, approximates a Gaussian distribution. Further, if one were to imagine an infinitely narrow vertical line segment, at least several pixels long, moving across the displayed vertical line 54 and integrating the brightness, the resulting function would be a series of smooth segments joining the brightness levels, from zero to 75% and back to zero. Thus, the resulting cross-sectional brightness function, integrated over several pixels tall, along the displayed line, closely approximates a smooth Gaussian curve. This displayed vertical line can be moved over by about one-half pixel, such that the addressability would be about one-half pixel.
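
The brightness series above can be reproduced with a simple model in which each region's brightness is the average of the red and green values covering it, i.e. each of the two planes contributes half of the luminance; the following sketch is an illustration of that reading of the figures, not code from the patent.

```python
"""Sketch of the overlap-brightness model implied by paragraph [0069]: the red
and green planes each contribute half of the brightness of a region, so a
region's brightness is the average of the red and green values covering it
(a region covered by only one plane gets half of that plane's value)."""

def region_brightness(green_value, red_value):
    return (green_value + red_value) / 2.0

# Cross-section of the single displayed line, from one edge to the other.
cross_section = [
    region_brightness(12.5, 0.0),    # outermost diamonds 62 ->  6.25%
    region_brightness(12.5, 12.5),   # outer overlap 60      -> 12.5%
    region_brightness(12.5, 75.0),   # diamonds 58           -> 43.75%
    region_brightness(75.0, 75.0),   # central diamonds 56   -> 75%
    region_brightness(75.0, 12.5),   # diamonds 58           -> 43.75%
    region_brightness(12.5, 12.5),   # outer overlap 60      -> 12.5%
    region_brightness(0.0, 12.5),    # outermost diamonds 62 ->  6.25%
]
print(cross_section)   # [6.25, 12.5, 43.75, 75.0, 43.75, 12.5, 6.25]
```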


[0070] In moving the vertical line, the amount of improvement is proportional to the amount that the red and green planes are out of phase. Having the image planes out of phase by substantially less than about 100% is preferred, with less than about 75% more preferred, and with the images being exactly out of phase by about one-half pixel, or about 50%, being ideal.


[0071]
FIGS. 16A and 16B illustrate two multipixel images 68 and 68b of two vertical lines 69 and 69b, respectively, displayed to demonstrate that the MTF is about one-half of the addressability, which is the theoretical limit for subpixelated displays. FIG. 16A illustrates the two vertical lines 69 comprising the green component image 64 and the red component image 66 of FIGS. 14A and 15A, respectively. As illustrated in the multipixel image 64 in FIG. 14A, the central green pixels 70 and outer green pixels 72 comprise two vertical lines 65. As illustrated in the multipixel image 66 in FIG. 15A, the central red pixels 76 and outer red pixels 78 comprise two vertical lines 67. For clarity, the blue color plane is not shown in FIG. 16A. This example assumes that the vertical lines 69 are displayed at about 100% of the input value and are surrounded on both sides by a field at 0% of the input value.


[0072] The central red pixels 76 of the two vertical lines 69 are offset from the central green pixels 70 when superimposed as in FIG. 16A. These central line pixels 70 and 76 are each set at 75%. The outer pixels 72 and 78 are each set at 12.5%. The pixels 74 and 80 between the two central lines of pixels 76 and 70 are set at 25%.


[0073] The outer edges, those not adjoining the other line, have the same sequence of brightness levels as described for the case of FIG. 13. That is, the areas of the outer pixels 72 and 78 that do not overlap form an outermost series of smaller diamonds 88 at 6.25% brightness. The overlap of the outer pixels 72 and 78 forms two series of smaller diamonds 84 that are at 12.5% brightness. The overlap of pixels 72 and pixels 76, and the overlap of pixels 78 and pixels 70, respectively, form two series, just outside of the central line series 86, of smaller diamonds 82 that are at 43.75% brightness. The areas of overlap of the central line pixels 70 and 76 form a central series of smaller diamonds 92 that are at 75% brightness.


[0074] The space between the two central vertical lines 69 has three series of smaller diamonds 90 and 94. The overlap of red central line pixels 76 and green interstitial pixels 74, and the overlap of green central line pixels 70 and red interstitial pixels 80, respectively, form a series of smaller diamonds 90 at 50% brightness. The overlap of interstitial pixels 74 and 80 form a series of smaller diamonds 94 at 25% brightness. Theoretically, this represents samples of a sine wave at the Nyquist limit, exactly in phase with the samples. However, when integrating over an imaginary vertical line segment as it moves across from peak to trough to peak, the function is that of a triangle wave. Yet, with the MTF of the projection lens limiting the bandpass of the projected image, the function is that of a smooth sine wave. The display effectively removes all Fourier wave components above the reconstruction point Nyquist limit. Here, the modulation depth is 50%. As long as this is within the human viewer's Contrast Sensitivity Function (CSF) for a given display's contrast and resolution, this modulation depth is visible.
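
As a quick check of the 50% figure, the short sketch below assumes that "modulation depth" here means the Michelson contrast of the 75% peaks and 25% troughs described above.

```python
"""Check of the 50% modulation depth cited in paragraph [0074], assuming
modulation depth is the Michelson contrast of the 75% peaks and 25% troughs."""

peak, trough = 75.0, 25.0
modulation_depth = (peak - trough) / (peak + trough)
print(modulation_depth)   # 0.5, i.e. 50%
```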


[0075]
FIG. 16B illustrates the two vertical lines 69b comprising the green component image 64b and the red component image 66b of FIGS. 14B and 15B, respectively. These images are designed to be ‘sharper’ than those of FIGS. 16A, 14A, and 15A. As illustrated in the multipixel image 64b in FIG. 14B, the green pixels 70b comprise two vertical lines 65b. As illustrated in the multipixel image 66b in FIG. 15B, the red pixels 76b comprise two vertical lines 67b. For clarity, the blue color plane is not shown in FIG. 16B. This example assumes that the vertical lines 69b are displayed at about 100% of the input value and are surrounded on both sides by a field at 0% of the input value. Here, the values of both the red pixels 76b and the green pixels 70b are set at 100% output value, while the pixels 74b and 80b, between the double lines 67b and 65b, are set at 0% output value. Likewise, the pixels 72b and 78b, outside the double lines 67b and 65b, are set at 0% output value. These values are generated by using sharpening coefficients in the filter matrix used in the subpixel rendering operation.
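
The patent does not give the sharpening coefficients, but the sketch below shows one illustrative choice: a kernel of the form [-0.25, 1.5, -0.25] applied along the cross-section and clipped to the displayable range happens to turn the 12.5%/75%/25% box-filter outputs of FIGS. 14A/15A into the 100%/0% values described for FIGS. 14B/15B.

```python
"""Illustrative sketch for paragraph [0075]: these sharpening coefficients are
an assumption, chosen because they reproduce the described result, not values
taken from the patent."""

def sharpen(values, kernel=(-0.25, 1.5, -0.25)):
    out = []
    for i, v in enumerate(values):
        left = values[i - 1] if i > 0 else 0.0
        right = values[i + 1] if i + 1 < len(values) else 0.0
        s = kernel[0] * left + kernel[1] * v + kernel[2] * right
        out.append(min(100.0, max(0.0, s)))    # clip to the displayable range
    return out

# Box-filtered cross-section of one color plane for the double line of FIG. 16A:
# outer 12.5%, line 75%, interstitial 25%, line 75%, outer 12.5%.
print(sharpen([0.0, 12.5, 75.0, 25.0, 75.0, 12.5, 0.0]))
# -> [0.0, 0.0, 100.0, 0.0, 100.0, 0.0, 0.0]
```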


[0076]
FIG. 17 illustrates an overlay 96 of the image 14 of FIG. 6 offset 50% with itself. This represents an alternative embodiment of a single panel projector, using field or frame sequential color that is well known in the art. In this embodiment, the array is again formed from diamonds, but the image 14 is shifted optically to separate the red and green color planes by about one-half pixel. This color shift may be accomplished as shown in FIG. 18 by an inclined plane lens 98 of a suitable chromodispersive transparent material. Such an arrangement will separate and differentially shift the red, green, and blue images due to the different index of refraction for each wavelength. This lens element may be a separate flat plane lens, or may be an inclined curved element that is an integral part of the projection lens assembly. Such modifications to the lens assembly may be designed using techniques well known in the art.


[0077] These optical and mechanical means for shifting the color image planes can be used to improve display systems that use prior art arrangements 100 of pixels as illustrated in FIG. 19. The green image 102 may be shifted from the red image 104 by about one-half pixel in the diagonal direction as illustrated in the arrangement 106 in FIG. 20. This allows subpixel rendering to be applied to the resulting system. FIG. 21 illustrates two logical pixels centered on a square grid that lies on corner interstitial 108 and edge interstitial 110 points in the arrangement 106 of FIG. 20. FIG. 22 illustrates arrangement 106 with a logical pixel and a column line 112 of overlapping logical pixels centered on pixel quadrants defined by the pixel overlaps.


[0078] In examining the example of a logical pixel 114, 116, and 118 shown in FIG. 22, the output value of each pixel is determined by a simple displaced box filter in which four input pixels are averaged for each output pixel. Each input pixel uniquely maps to one red output pixel 114 and one green output pixel 118 that overlap by one quadrant 116. Thus, the addressability of the display has been increased four fold, twice in each axis. With one input pixel at about 100% value surrounded by a field at 0% value, the red output pixel 114 and the green output pixel 118 are set at 25% output. The area of overlap 116 is at 25% brightness while the areas of the output pixels 114 and 118 not overlapping are at 12.5% brightness. Thus, the peak brightness is in the overlapping quadrant.
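
A minimal sketch of the displaced box filter described above follows (function and variable names are assumptions): each output pixel averages a 2x2 block of input pixels, and the green output grid is displaced by one input pixel, i.e. half an output pixel, in each axis relative to the red grid.

```python
"""Sketch of the displaced box filter of paragraph [0078] (assumed naming):
each output pixel is the average of a 2x2 block of input pixels; the (dx, dy)
offset models the half-output-pixel displacement between the color planes."""

def box_filter_2x2(image, dx=0, dy=0):
    h, w = len(image), len(image[0])
    out = []
    for y in range(dy, h - 1, 2):
        row = []
        for x in range(dx, w - 1, 2):
            block = (image[y][x] + image[y][x + 1]
                     + image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4.0)
        out.append(row)
    return out

# One input pixel at 100% in a 0% field: both the red (undisplaced) and the
# green (displaced) output pixels covering it come out at 25%, as in [0078].
img = [[0.0] * 4 for _ in range(4)]
img[2][2] = 100.0
red_plane = box_filter_2x2(img, dx=0, dy=0)
green_plane = box_filter_2x2(img, dx=1, dy=1)
print(red_plane, green_plane)
```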


[0079] The vertical line 112 displayed in FIG. 22 represents a line at about 100% input value surrounded on both sides by a field at 0% input value. The overlapping logical pixels are additive. Thus, the red output pixels 120 and the green output pixels 124 are set at 50%. The area of overlap 122 is at 50% brightness while the areas of the output pixels 120 and 124 that are not overlapping are at 25% brightness. Thus, the area of peak brightness corresponds with the location of the displayed line 112.


[0080] In examining and evaluating the display system, it can be noted that while the addressability of the display has been doubled in each axis, the MTF has been increased by a lesser degree. The highest spatial frequency that may be displayed on the modified system is about one-half octave higher than on the prior art system. Thus, the system may display 2.25 times more information on four times as many addressable points.


[0081] In the above systems the blue information has been ignored for clarity. This is possible due to the poor blue resolving power of human vision. However, in so far as the blue filter or other blue illumination system is less than perfect and allows green light that will be sensed by the green sensing cones of human vision, the blue image will be sensed by the green cones and add to the perception of brightness in the luminance channel. This may be used as an advantage by keeping the blue pixels in registration with the red pixels to add to the red brightness and to offset the slight brightness advantage that green light has in the luminance channel. Thus, the red output pixels may be, in fact, a magenta color to achieve this balance of brightness.


[0082] If a system were designed in which the “blue” image has significant leakage of green, and possibly yellow or even red, the “blue” image may be used to further increase the effective resolution of a display. The “blue” color may be closer to a pale pastel blue, a cyan, a purple, or even a magenta color. An example of such a display 126 is illustrated in FIG. 23. FIG. 23 illustrates three images of the array of pixels shown in FIG. 8 overlaid with a shift of one third of a pixel each. A logical pixel 128 is illustrated on the resulting image 126 in FIG. 23. The red pixel 130, green pixel 132, and “blue” pixel 134 overlap to form a smaller triangular area 136 that is at the center of the logical pixel. This overlap area is brightest, followed by the three areas where only two pixels overlap, while the areas with no overlap have the lowest brightness. The values of the pixels are calculated in a similar manner as outlined above.


[0083] Another embodiment of the present invention is shown in FIG. 35A, in which the red 3504, blue 3502, and green 3506 color planes shown in FIGS. 35B, 35C, and 35D, respectively, are overlaid on one another to form the full color arrangement 3510. The color planes are overlaid on each other such that they are substantially “out of alignment” as shown in FIG. 35A.


[0084] This arrangement is characterized by having a green plane 3506 that is of higher resolution than both the red 3504 and blue 3502 planes. In the present arrangement, the red 3504 and blue 3502 planes have the same resolution, but this need not be the case. It is contemplated that all three of the color planes might have different resolutions. For example, one might use the high resolution green color plane 3506 of FIG. 35D with the red color plane 3504 of FIG. 35B and the blue color plane 18 of FIG. 8, overlaying them such that they are all substantially “out of alignment”. Alternatively, the red color plane may be the highest resolution of the three planes. However, in practice, given the luminances found in most projector systems, the green color plane will be found to be the best choice for the highest resolution.


[0085] More particularly, if the green luminance is approximately half the total luminance, as is commonly found in projectors, there may be an advantage to the particular arrangement shown in FIG. 35A, in which the resolution of the green color plane 3506 is twice that of the red 3504 and blue 3502 color planes. This is not to say that the resolution ratio is determined by the luminance ratio; rather, it is the fact that one can achieve the same resolution from the offset red 3504 and blue 3502 color planes as from the green color plane 3506 alone. These are then set to be substantially offset from one another, the green 3506 from the virtual magenta (the combined red 3504 and blue 3502). The advantage found in this arrangement is that moiré distortion when reconstructing a high resolution image may be significantly reduced with a minimal number of color reconstruction points.


[0086] Moiré distortion occurs when the desired signal is 90° out of phase with the reconstruction points of the display. For example, if one is attempting to display a single pixel wide line halfway between two pixels, the two pixels would be set to 50%. One could still see that the total signal strength and position is present, but the image is not as sharp. If two single-pixel-wide lines were to be displayed with only a single pixel between them, but offset by half a pixel, then the two grey lines would be smeared together and would no longer be distinguishable from a single wide grey line. FIGS. 36A and 36B illustrate how this moiré distortion is eliminated by the arrangement of FIG. 35A. When narrow lines 3515 are in phase with the pixels of the green color plane 3506, the lines are out of phase by 90° for both the red 3504 and blue 3502 color planes as shown in FIG. 36A. When the narrow lines 3515b are out of phase with the pixels of the green color plane 3506, the lines are in phase with the red 3504 and blue 3502 color planes as shown in FIG. 36B.
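
The phase argument of the first example above can be made concrete with the small sketch below (illustrative only): a one-pixel-wide line box-sampled by a grid it straddles is split 50/50, while a second grid of the same pitch, displaced by half a pixel, catches it fully in phase.

```python
"""Sketch of the phase argument in paragraph [0086]: box-sample a 1-pixel-wide
line against two pixel grids of the same pitch, one of which is displaced by
half a pixel relative to the other."""

def box_sample(line_start, line_width, pixel_edges):
    # Fraction of each pixel covered by the line [line_start, line_start + width).
    coverage = []
    for left, right in zip(pixel_edges[:-1], pixel_edges[1:]):
        overlap = max(0.0, min(right, line_start + line_width) - max(left, line_start))
        coverage.append(overlap / (right - left))
    return coverage

grid_a = [0.0, 1.0, 2.0, 3.0, 4.0]    # reconstruction points of one color plane
grid_b = [0.5, 1.5, 2.5, 3.5, 4.5]    # same pitch, displaced by half a pixel

print(box_sample(1.5, 1.0, grid_a))   # [0.0, 0.5, 0.5, 0.0]  (line out of phase, smeared)
print(box_sample(1.5, 1.0, grid_b))   # [0.0, 1.0, 0.0, 0.0]  (line in phase, sharp)
```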



Twinned Projectors

[0087] In the prior art, when brightness is required that is beyond the capability of a single projector to supply, two projectors may be used. The images are conventionally converged 100%, as if the twinned units were in fact one unit. The combined image might be like that shown in FIG. 27, which shows the fully converged pixels 2705 and the associated reconstruction points 2701. The image may have twice the brightness of that from a single projector, but has the same resolution.


[0088] One improvement of this system may be to displace the full color pixel images from one of the projectors by one-half pixel in the diagonal direction as shown in FIG. 29. This gives similar, and in some aspects superior, performance improvements to those of the displaced color planes of FIGS. 20, 21, 22, and 28. FIG. 28 shows the displaced color arrangement of FIGS. 20, 21, and 22, and the associated color plane reconstruction points 2801 and 2803. Comparing FIGS. 28 and 29 illustrates the differences. FIG. 29 has full color reconstruction points 2901 at each position where FIG. 28 has either a first color (e.g. red) 2801 or second color (e.g. green) 2803 reconstruction point. Thus, for monochrome images, the twinned projector arrangement of FIG. 29 is similar to the single projector arrangement of FIG. 28. However, for highly saturated color images, the increased addressability of the twinned projector arrangement of FIG. 29 allows a single color to have twice as many reconstruction points.


[0089] A further improvement for twinned projectors is to displace the color planes of both projectors. One of the projectors has the arrangement shown in FIG. 28, while the other has the mirror arrangement, resulting in the overlapped and fully displaced four image planes of FIG. 30. This arrangement has the same saturated color image quality as that of FIG. 29, but has additional monochrome addressability, resulting in significantly improved overall image quality when suitably subpixel rendered.


[0090] Any system that traditionally uses converged, overlapped color and/or white pixels can take advantage of the concepts taught herein. For example, a color CRT display used for computer monitor, video, or television display may be improved by shifting the color components and applying appropriate subpixel rendering algorithms and filters. A simple and effective change for computer monitors is to shift the green electron spot as described above for FIG. 4B and FIG. 22. This deliberate misconvergence will seem counter-intuitive to those most knowledgeable in the CRT art, but the resulting improvement will be as described above. The displacement of the multi-color display imaging planes by a percentage of a pixel creates a display of higher resolution images by increasing the addressability of the system. Additionally, the MTF is increased to better match the design to human vision. A projector system using three separate panels can be optimized to better match the human vision system with respect to each of the primary colors. These results can be achieved in a single panel, field sequential color projector using an inclined plane chromodispersive lens element.



Film Scanners, Cameras, and Film Printers

[0091] The improvements and arrangements described herein may also help image capture and printer devices.


[0092] One embodiment may be an improved video or still camera. Some prior art cameras use multi-chip sensors, along with color filters or dichroic beam splitters. These may be considered to be the inverse operations of the projectors described herein, and may benefit from the same or similar arrangements of pixels. For example, FIG. 27 may represent the arrangement of fully converged color planes of a prior art multi-chip color camera. FIG. 28 may represent the offset color plane arrangement of a multi-chip color camera. Such an arrangement may be formed by offsetting one or more of the sensor chips such that the image formed upon it is displaced by substantially one-half pixel. This would create a camera that directly and automatically captures and delivers a subpixel rendered data set. If the data set were delivered for display to a projector with the same resolution and arrangement, then the image data set would need no further processing, and yet would provide a superior image to that of a conventional, fully converged camera, image data set, and projector arrangement. Thus, the entire system, from image capture to display, is a matched, improved system. Such a system performs as though it were a higher resolution system with perceptually encoded “lossless” compression.


[0093]
FIG. 31A shows a prior art arrangement of fully converged sensor elements sampling an exemplary image, in this case a “w” character, giving rise to the resulting image data set shown in FIG. 31B. It is to be understood that any natural image will behave in like manner. When the same exemplary image “w” is sampled by a novel sensor arrangement (such as that shown in FIG. 28), the resulting image data set is illustrated in FIG. 32B. FIGS. 31B and 32B may also be seen as representing the resulting images when the respective data sets are displayed on matching projector systems, a prior art projector in the case of FIG. 31B and the novel projector of one embodiment of the present invention in the case of FIG. 32B. Comparing the resulting image quality, the novel system represented by FIG. 32B would be an improvement over that of the prior art. If the system analysis is extended to three offset image capture planes and projector planes, as shown in FIGS. 33A and 33B respectively, the image quality continues to increase.


[0094] Similarly, the pixel arrangements of FIGS. 6, 7, and 8 may be used to capture images on a sampling plane that appears as that shown in FIG. 9. Again, when the resulting captured image data set is directly displayed on a matching projector or flat panel display, the image quality will be superior to that of the prior art systems.


[0095] With multi-chip image sensors, each having independent electronic shutter control, creating the image data set to be displayed on matching, or at least compatible, display means, another improvement is possible: reduced jutter. Jutter occurs when objects that move across a scene are displayed in a series of still frames at a moderately low rate, such as the twenty-four (24) frames per second for film, or the twenty-five (25) to thirty (30) frames per second for most television type video systems. The image appears to jump from frame image to frame image and to smear in the direction of motion: as the eye smoothly tracks the average position of the moving image, the image formed on the retina lags, then leads, the average position for half of the frame period each. With the ability to stagger the shutter timing such that each color plane captures and represents a different point in time during the frame, i.e. represents subframes or fields, the jutter will be reduced because, on average, more of the reconstructed image energy will be closer to the average position of the ideal smoothly moving image. The display means is similarly timed such that each color field is updated with the same relative timing as the original electronic shutters. This aspect of the present invention, of displaced timing for the color planes, may be combined with the spatial displacement of the sample and reconstruction points, or it may be used in conventional fully converged systems to equal advantage.
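
A minimal sketch of the staggered shutter timing follows; spreading the color-plane exposures evenly across the frame period is an assumption made here for illustration, since the text only requires that each plane represent a different point in time during the frame.

```python
"""Sketch (hypothetical helper) of the staggered shutter timing of paragraph
[0095]: each color plane is exposed at a different point in the frame period,
so the color fields represent subframes in time; even spacing is assumed."""

def shutter_times(frame_period, planes=("R", "G", "B")):
    step = frame_period / len(planes)
    return {plane: i * step for i, plane in enumerate(planes)}

# For a 1/24 s film frame: R at 0 s, G at ~0.0139 s, B at ~0.0278 s into the frame.
print(shutter_times(1.0 / 24.0))
```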


[0096] Note that, though the above examples used identical resolution camera sensors and projectors, such need not be the case for the total system to gain improved performance. Images captured directly in a subpixel rendered format may be scaled up or down, to be shown on either subpixelated or fully converged displays, and potentially retain the performance benefit of the displaced image capture. For example, using the data set of FIG. 33B, the image may be processed, converted, and shown on a higher resolution, conventional, fully converged projector or other display as shown in FIG. 33C. Note that the image quality is higher than the image that would have been possible using the fully converged camera sensor arrangement of FIG. 31A.


[0097] An alternative multi-chip image sensor may have one or more of the sensors include a color filter array. One such example is shown in FIGS. 34A and 34B. FIG. 34A shows an arrangement of square sensors with red 3404 and green 3406 color filters affixed thereupon. FIG. 34B illustrates the lower resolution blue sensor plane. This blue sensor may or may not have a blue filter depending upon whether the image beam splitter in the camera assembly is a dichroic filter. If a dichroic filter that splits off the red and green colors from the blue is used, then the blue plane may not need an additional filter.


[0098] Other sensors with color filter arrays may be used to advantage to create subpixel rendered images that are directly displayed on suitable subpixelated display means. For example, the conventional prior art Bayer pattern, and its improved variants, may be used with minimal processing. Said processing comprises the interpolation of surrounding red samples to fill in the missing red samples where the blue samples interrupt the red sample grid.
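
A minimal sketch of that interpolation step is given below (illustrative code, not from the patent): on a Bayer mosaic, the red value missing at each blue site is estimated as the average of the diagonally adjacent red samples.

```python
"""Sketch of the minimal processing in paragraph [0098]: fill in red samples
at the blue sites of a Bayer mosaic by averaging the diagonal red neighbors,
giving a full red checkerboard suitable for direct subpixel rendered display."""

def fill_red_at_blue_sites(raw, pattern):
    # raw: 2-D list of sensor values; pattern: same shape, entries 'R','G','B'.
    h, w = len(raw), len(raw[0])
    red = [[raw[y][x] if pattern[y][x] == "R" else None for x in range(w)]
           for y in range(h)]
    for y in range(h):
        for x in range(w):
            if pattern[y][x] == "B":
                neighbors = [raw[y + dy][x + dx]
                             for dy in (-1, 1) for dx in (-1, 1)
                             if 0 <= y + dy < h and 0 <= x + dx < w
                             and pattern[y + dy][x + dx] == "R"]
                if neighbors:
                    red[y][x] = sum(neighbors) / len(neighbors)
    return red

pattern = [["R", "G", "R", "G"],
           ["G", "B", "G", "B"],
           ["R", "G", "R", "G"],
           ["G", "B", "G", "B"]]
raw = [[10, 0, 20, 0],
       [0, 0, 0, 0],
       [30, 0, 40, 0],
       [0, 0, 0, 0]]
print(fill_red_at_blue_sites(raw, pattern))   # blue site (1,1) filled with 25.0
```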


[0099] Scanners, devices that are used to convert still images or movie film frames to a digital or analog video format, will also benefit from the teaching herein. Offset scanning, either mechanical or electronic, may provide a direct subpixel rendered image data set, similar to those described above, which may be used in like manner to improve total system image quality.


[0100] Another embodiment would be to offset the raster scan of a multi-tube video camera electronically, physically, magnetically, and/or electrostatically. Likewise, if the resulting direct subpixel rendered data set were delivered to a suitably matched display, such as a CRT or subpixelated flat panel display, the image quality would be increased.


[0101] Conversely, color image printers, either photographic (film printer: CRT or laser scanning, spatial light modulator, etc.), xerographic (laser printer), or mechanical (ink jet, dye sublimation, dye transfer, etc.) may also benefit from the teaching herein, in which subpixel rendering of conventional high resolution image data sets or direct printing of previously subpixel rendered image data sets is used on a printer system with matching displaced color image planes.


[0102] One complete system that uses the teaching contained herein may comprise original image capture using conventional color film photography and color film print presentation, with subpixel rendered film digitization, editing and manipulation, followed by subpixel rendered film printing. Such a system potentially would use modified equipment and processes presently used in film production, have the same size image data files, etc., but due to the benefits of subpixel rendering techniques taught herein, exhibit significantly better image quality in the final product. The process may have the additional benefit that the digitized image is in a subpixel rendered format that may be used in matching electronic cinema projectors with minimal or no further processing, again exhibiting improved image quality.



Additional Color Planes

[0103] Most conventional projector displays utilize three emitter colors, providing a color gamut that includes the inside of a triangle when charted on the 1931 CIE Color Chart, an example of which is shown in FIG. 24A. These colors are typically substantially red 2404, green 2406, and blue 2402. The luminances of these color emitters are typically unequal. For several reasons, some projector displays are constructed with a fourth color emitter. Prior art four color displays usually use white as the fourth color. This is typically done to increase the brightness of the display, as the colors are usually created using dichroic filters. The white is created by removing a color filter; the light of the lamp, being white 2408 already, is allowed to pass to the spatial light modulator unobstructed and unmodified. The four colors collectively are grouped into a pixel that may show any color within the triangle defined by the saturated colors, with the added ability to show lower saturation colors at a higher brightness by the addition of the appropriate amount of white.


[0104] For displays that are to be driven using subpixel rendering, the choice of a non-filtered white color plane or field creates a serious problem. Subpixel rendering depends on the ability to shift the apparent center of luminance by varying the brightness of the subpixels. This works best when each of the colors has the same perceptual brightness. Blue subpixels are perceived as substantially darker than the red and green, and thus do not significantly contribute to the perception of increased resolution with subpixel rendering, leaving that task to the red and green subpixels. With the addition of an unfiltered white color plane or field, which is significantly brighter than both the red and green subpixels, the red and green lose much of their effectiveness in subpixel rendering.


[0105] In an ideal display, the luminance of each of the subpixels would be equal, such that for low saturation image rendering, each subpixel has the same luminance weight. However, the human eye does not see each wavelength of light as equally bright. The ends of the spectrum are seen as darker than the middle. That is to say that a given energy intensity of a green wavelength is perceived to be brighter than that same energy intensity of either red or blue. Further, due to the fact that the short wavelength sensitive cones, the “S-cones”, those giving rise to the sensation of ‘blue’, do not feed the Human Vision System's luminance channel, blue colors appear even darker.


[0106] In most prior art projector systems, the splitting of the white spectrum is usually done so that the red 2404 and the blue 2402 color points have the greatest color saturation possible, while the green 2406 point is formed from the middle of the spectrum, having both more energy and more brightness than the red 2404 and blue 2402 combined.


[0107] One embodiment for a three color system shown in FIG. 24B entails using wider bands for red 2404 and blue 2402, pushing them up the chart towards the apex slightly to create new red 2404b and blue 2402b color points, while the green 2406b, being narrower, also is pushed toward the apex. This increases the energy of the red 2404b and blue 2402b, while reducing the energy of the green 2406b. The white point 2408 remains in the same place. This remapping of the spectrum to the color triangle improves the subpixel rendering performance, but shifts the color gamut. For many applications, this improvement may be quite satisfactory and economical.


[0108] One embodiment that reduces the above problem adds a fourth color that substantially takes its energy from the shorter wavelength green part of the spectrum. In a system of dichroic beam splitters or a regenerating color wheel assembly, this will reduce the energy being used on the “green” color plane, splitting it between a yellowish green 2506 and a cyan 2508 color as shown in FIG. 25. The total brightness and light efficiency remain the same, but the red 2504, yellowish-green 2506, and cyan 2508 beams have substantially the same brightness. A further advantage is that the color gamut thus formed from the four color system is wider than that of the prior art three color system. Yet a further advantage of this invention is that the additional color beam may be independently modulated as a displaced subpixelated image, thus increasing the image quality of the resulting subpixel rendered image, with three color planes with near equal perceived brightness.


[0109] With three planes of near equal perceived brightness, the arrangement of subpixelated color planes of FIG. 23 may be used to full benefit. FIG. 23 illustrates three images of the array of pixels shown in FIG. 8 overlaid with a shift of one third of a pixel each. A logical pixel 128 is illustrated on the resulting image 126 in FIG. 23. The red pixel 130, green pixel 132, and “blue” (now possibly cyan) pixel 134 overlap to form a smaller triangular area 136 that is at the center of the logical pixel. This overlap area is brightest, followed by the three areas where there are only two pixels overlapping, while the areas with no overlap have the lowest brightness.


[0110] This process of increasing the number of color points and displaced color plane images can be performed again to yield a five color system as shown in FIG. 26. Here, the red 2604 and blue 2602 may be further pushed into their respective ‘corners’ by restricting their bandpasses at the edges of the visible spectrum, increasing the color gamut. The mid-spectrum is divided into three perceptually equally bright color points: greenish-yellow 2605, deep-green 2606, and deep-cyan 2608. These, along with the red 2604, give four planes of effective subpixel rendered image. For good measure, the blue plane may be made coincident, fully converged, with the red to add to its brightness, giving a magenta color plane. These four colors may be used to advantage with the arrangement of pixels of FIG. 30.


[0111] In yet another embodiment, there is a possibility for integrating a “front-to-back” system (i.e. from image capture and/or generation to image rendering) using five colors. Each of the colors is subpixel rendered, from the camera to the projector. The color points are chosen carefully to both cover a wide gamut and be approximately the same luminance. Each color comes from a narrow spectral band defined by dichroic filter beam splitters. When the projector recombines the light, save for random loss, all of the light from the lamp is used to recreate the same white light.


[0112] Several color arrangements are possible; for example, two such arrangements use the colors R=red, Y=yellow, C=cyan, G=green, and B=blue, laid out in either a diamond or a square matrix.


[0113] Of course, other matrices are possible—with other colors also selected. It should also be possible to use the blue plane at a lower resolution.


[0114] In addition to separating the sample points of each color in space by subpixel rendering, the color plane samples are displaced in time as well. Not only will this reduce temporal aliasing of moving objects, but it will significantly reduce jutter. The four longer wavelength colors are shuttered on a rotating basis, 90 degrees from the preceding and following color plane. That means there is also a color shuttered at 180 degrees from each color. The blue plane may be shuttered at any point since it will not greatly add to brightness. But if one of the other colors is the dimmest, the blue may be shuttered with it to keep its transition roughly the same amplitude as the others to eliminate flicker. With four major colors to work with, the addressability is increased by a factor of four and the MTF is doubled in each axis.
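
The schedule described above might be modeled as in the following sketch (assumed naming; which of the four colors is dimmest, and therefore carries the blue flash, is an assumption for illustration only).

```python
"""Sketch of the temporal schedule in paragraph [0114]: the four longer
wavelength color planes are shuttered 90 degrees apart on a rotating basis,
and the blue plane is shuttered together with the dimmest of the four."""

def field_phases(colors=("R", "Y", "C", "G"), dimmest="C"):
    phases = {color: 90 * i for i, color in enumerate(colors)}   # 0, 90, 180, 270
    phases["B"] = phases[dimmest]    # blue rides along with the dimmest color
    return phases

print(field_phases())   # {'R': 0, 'Y': 90, 'C': 180, 'G': 270, 'B': 180}
```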


[0115] This process of breaking up the spectrum and increasing the number of subpixel rendering planes may be performed up to any arbitrary number, N.



Flicker Reduction in Field Sequential Color Systems

[0116] The perception of flicker in Field Sequential Color (FSC) systems is primarily caused by the unequal luminances of the color components that are time sequentially flashed onto the screen or into the viewer's eyes. The largest luminance difference in prior art three color systems is between the green color and the blue color, the blue color having comparatively little or no perceived luminance. Prior art methods of reducing the perception of flicker have included increasing the temporal frequency at which the three or more color fields are presented. However, for some spatial light modulators, this is impractical, either because the bandwidth limits are less than that required to transfer the image of each field or because the time required for the spatial light modulator to present a high contrast image of the field (e.g. Liquid Crystal response time) is too long for the desired field rate.


[0117] A novel method of reducing the perception of flicker comprises reducing the total time that the dark, low luminance color, such as blue, is presented to the viewer. Another novel method is to increase the frequency of only the dark, low luminance color. Additionally, the two methods listed above may be combined to advantage.


[0118] For direct view applications, Light Emitting Diodes (LEDs) may be used as the illuminants. In this case, the practice is to use very brief flashes of monochromatic light for each color field; thus, the set-up time for the spatial light modulator is often the limiting factor for the field and frame rates. As described above, one method to reduce flicker perception is to increase the blue flash rate. In this case, instead of the prior art order of color flashes, which is typically something like . . . red, green, blue, red, green, blue, red, green, blue . . . , the following order of color flashes may be substituted: . . . red, blue, green, blue, red, blue, green, blue . . . , etc. Note that this will slow the frame rate if the field rate is kept constant. It will, however, increase the frequency of the blue flashes, interleaved with the higher luminance flashes (namely red and green in the above example), reducing the perception of flicker. If the time for setting up the blue field image on the spatial light modulator can be reduced by a suitable method, the time between the red or green fields and the blue field flash may be reduced to maintain the same frame rate as the prior art field order. In each of the above, the total illumination intensity of each color component, averaged over the frame, is adjusted to maintain the desired white point; specifically, the intensity of each of the doubled blue flashes may be reduced by half, or one flash may be one fourth (¼) and the other three fourths (¾) of the single flash intensity.
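
A minimal sketch of the modified flash ordering and the white-point rebalancing described above follows; the function name, the unit red/green intensities, and the choice between halved versus ¼/¾ blue intensities are assumptions for illustration.

```python
# Sketch: build one frame's LED flash list for the modified field order
# ... red, blue, green, blue ... and rebalance the doubled blue flashes so
# the frame-averaged blue energy (and hence the white point) is unchanged.
def frame_flashes(split_quarters: bool = False) -> list:
    """Return (color, relative intensity) pairs for one frame.
    Red and green keep unit intensity; the two blue flashes either share the
    original blue energy equally (0.5 + 0.5) or split it 1/4 + 3/4."""
    blue_pair = (0.25, 0.75) if split_quarters else (0.5, 0.5)
    return [
        ("red",   1.0),
        ("blue",  blue_pair[0]),
        ("green", 1.0),
        ("blue",  blue_pair[1]),
    ]

if __name__ == "__main__":
    for split in (False, True):
        flashes = frame_flashes(split_quarters=split)
        blue_total = sum(i for c, i in flashes if c == "blue")
        print(flashes, "-> frame-averaged blue energy:", blue_total)
```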


[0119] For projectors that use color filter wheels, the color wheel may be modified to provide the same or a similar novel arrangement of color flashes as above. In FIG. 37, a prior art color wheel arrangement 3700 is illustrated. In this color wheel 3700 there are three color filter regions: blue 3702, red 3704, and green 3706. The color wheel spins at the same rate as the frame rate of the display system, illuminating the spatial light modulator: . . . red, green, blue, red, green, blue, red, green, blue . . . , etc. FIGS. 38A, 38B, 38C, and 38D illustrate novel color wheels with various combinations of reduced low luminance color component (e.g. blue) time, doubled low luminance color component frequency, or both.


[0120] FIG. 38A illustrates a novel color filter wheel 3800 that reduces the size of the low luminance color (e.g. blue) filter region 3802. Reducing the time during which the blue illumination is the only light being viewed reduces the Fourier signal energy of the luminance variation, which in turn reduces the visibility of the perceived flicker.


[0121] FIG. 38D illustrates a novel color filter wheel 3830 that has four color regions, two of which are the low luminance (e.g. blue) color 3832, while the other two are higher luminance colors; these may be red 3834 and green 3836. This may provide the following color field sequence: . . . red, blue, green, blue, red, blue, green, blue . . . , etc. Note that this will slow the frame rate if the field rate is kept constant. It will, however, increase the frequency of the blue flashes, interleaved with the higher luminance flashes (namely red and green in the above example), reducing the perception of flicker.


[0122] If the time for setting up the blue field image on the spatial light modulator can be reduced by a suitable method, the field time may be reduced to maintain the same frame rate as the prior art field order. FIGS. 38B and 38C illustrate examples where the blue time period is reduced and its frequency increased. The color filter wheel 3810 of FIG. 38B has the property that the combined angular area and/or angular distance of the two blue regions 3812 is the same as that of either of the other two colors, red 3814 or green 3816. This gives the advantage that the illumination balance is identical to that of the prior art color filter wheel 3700 of FIG. 37. FIG. 38C illustrates a color filter wheel 3820 that has both doubled and reduced angular area and/or angular distance blue filter regions 3822. This doubles the frequency of the blue dark interval and reduces the total time at that lower luminance, reducing the perception of flicker.
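
The sketch below lays out filter segments, in degrees, for a wheel in the style of FIG. 37 and for a split-blue wheel in the spirit of FIG. 38B, and verifies that the split-blue layout keeps the same per-color angular total; the exact angles are my assumption for illustration, not measurements from the figures.

```python
# Sketch: angular layout (in degrees) of color filter wheel segments.
# Angles are illustrative assumptions, not taken from the figures.
PRIOR_ART_WHEEL = [("red", 120), ("green", 120), ("blue", 120)]          # FIG. 37 style

# In the spirit of FIG. 38B: two blue segments placed opposite each other whose
# combined angle (60 + 60) equals that of red or green, so the frame-averaged
# illumination balance matches the prior art wheel while the low luminance
# (blue) frequency is doubled.
SPLIT_BLUE_WHEEL = [("red", 120), ("blue", 60), ("green", 120), ("blue", 60)]

def check_wheel(segments: list) -> None:
    """Verify full coverage and report the total angle devoted to each color."""
    assert sum(a for _, a in segments) == 360, "segments must cover the full wheel"
    per_color = {}
    for color, angle in segments:
        per_color[color] = per_color.get(color, 0) + angle
    print(segments)
    print("  total angle per color:", per_color)

if __name__ == "__main__":
    check_wheel(PRIOR_ART_WHEEL)
    check_wheel(SPLIT_BLUE_WHEEL)
```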


[0123] FIG. 39A illustrates a novel color filter wheel 3900 that places a very low luminance and low radiance (e.g. black) filter region 3912 opposite the low luminance color (e.g. blue) filter region 3902. The opposition of black and blue doubles the temporal frequency of the low luminance interval, reducing the perception of flicker. FIG. 39B illustrates the same color filter wheel 3900 with the addition of two further very low luminance (e.g. black) filter regions 3912b that break up the red filter region 3904 and green filter region 3906 of FIG. 39A into two red filter regions 3904b and two green filter regions 3906b. The spatial light modulator may continue displaying the same red or green color field information during the black time intervals created by the superimposed black filter regions 3912b. The presence of the two additional black filter regions 3912b further increases the temporal frequency of the low luminance signal, reducing the perception of flicker.


[0124] FIG. 40 illustrates a novel color filter wheel 4000 of four colors. The fourth color may comprise highly transmissive, and therefore high luminance (e.g. white or clear), regions 4008. These clear regions 4008 may be placed in opposition, such that their higher luminance temporal frequency is doubled, reducing the perception of flicker. The low luminance color (e.g. blue) regions 4002 may likewise be placed in opposition, such that their lower luminance temporal frequency is doubled, reducing the perception of flicker. Further, the high luminance and low luminance regions may be placed next to each other such that one leads or follows the other. This juxtaposition creates higher temporal frequency Fourier signal components than if they were not so juxtaposed, reducing the perception of flicker.
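
To connect the wheel layouts to the Fourier-energy argument used above, here is a minimal sketch that builds one revolution of the luminance waveform produced by a given segment layout and inspects its low-order Fourier components; the piecewise-constant waveform model, the per-color luminance values, and the segment angles are assumptions for illustration.

```python
# Sketch: compare flicker-related Fourier components of two wheel layouts by
# sampling the luminance seen over one revolution and taking its FFT.
# Per-color luminance values and the waveform model are illustrative assumptions.
import numpy as np

LUMA = {"red": 0.21, "green": 0.72, "blue": 0.07, "black": 0.0, "white": 1.0}

def luminance_waveform(segments, samples=3600):
    """Piecewise-constant luminance over one revolution (one sample per 0.1 degree)."""
    wave = np.zeros(samples)
    start = 0
    for color, angle in segments:
        n = int(round(samples * angle / 360.0))
        wave[start:start + n] = LUMA[color]
        start += n
    return wave

def harmonic_magnitudes(segments, count=3):
    """Magnitudes of the first few harmonics of the revolution (frame) rate."""
    spectrum = np.abs(np.fft.rfft(luminance_waveform(segments)))
    return spectrum[1:1 + count]

if __name__ == "__main__":
    prior_art  = [("red", 120), ("green", 120), ("blue", 120)]                 # FIG. 37 style
    split_blue = [("red", 120), ("blue", 60), ("green", 120), ("blue", 60)]    # FIG. 38B spirit
    for name, wheel in (("prior art", prior_art), ("split blue", split_blue)):
        h1, h2, h3 = harmonic_magnitudes(wheel)
        print(f"{name:10s}  |H1|={h1:8.1f}  |H2|={h2:8.1f}  |H3|={h3:8.1f}")
    # Splitting the low luminance region shifts energy from the fundamental
    # (the most visible flicker frequency) toward the second harmonic, to which
    # the eye is less sensitive.
```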


[0125] In addition to adjusting the timing of Light Emitting Diodes and the transmission sequence of color filter wheels, other color timing methods may be similarly modified. For example, Liquid Crystal based PI cell color modulators, colored fluorescent backlights, or electrically controlled, color selecting holograms may be modified such that their timing follows the above examples.



Bandwidth Reduction

[0126] Bandwidth reduction, to allow faster transfer of data to the spatial light modulator or to allow greater image compression for transmission or storage, may be facilitated by another embodiment. This bandwidth reduction may enable reduced time to form the image on the spatial light modulator, which in turn may enable the reduced-time and/or divided low luminance color field display disclosed above. The bandwidth reduction may be implemented with spatio-temporally displaced filtering and reconstruction so as to maintain addressability and Modulation Transfer Function, maintaining image quality.


[0127] FIGS. 41, 42A, 42B, 42C, and 42D illustrate a data set that is spatio-temporally displaced filtered and reconstructed. FIG. 41 illustrates the original data set 4100. It may also represent a matching prior art spatial light modulator 4100 that is to be used to reconstruct the spatio-temporally displaced filtered data set. Examining FIG. 42A, data points 4205 are grouped together into larger data points 4215 by applying a suitable filter, perhaps a simple box filter, to the original data points 4205. This creates a lower resolution image data set with fewer data points, thus reducing the bandwidth required to transmit the image. Turning to FIG. 42B, data points 4205 are again grouped together into larger data points 4225. Note that these larger data points 4225 comprise a different grouping of the original data points 4205 than do the first larger data points 4215; the groupings 4215 and 4225 are displaced diagonally by one half. This is functionally similar to the displaced filtered and reconstructed image of FIG. 20. When these two data sets are sequentially displayed, one after another, each time that color field is displayed, the temporal integration of the human eye composites the two lower resolution images into a higher resolution image, in a manner very similar to that described above for images that are simultaneously presented. Examining FIGS. 42C and 42D, note that each groups the original data points 4205 into larger data points 4235 and 4245, respectively. Again, the larger groupings are displaced from one another, and from both of the previously discussed groupings 4215 and 4225. When all four are presented sequentially, each in its turn for that color field, the temporal integration of the human eye composites the four lower resolution images into a higher resolution image, in a manner very similar to that described above for images that are simultaneously presented. This is functionally similar to the displaced filtered and reconstructed image of FIG. 30.
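
A minimal sketch of this spatio-temporally displaced filtering and reconstruction follows, assuming a simple 2×2 box filter, four diagonally and orthogonally displaced resampling grids, and a plain frame average as a stand-in for the eye's temporal integration; the specific offsets and helper names are illustrative assumptions.

```python
# Sketch: 2x2 box-filter an image on four spatially displaced grids, then
# average the four upsampled low-resolution fields to mimic the eye's
# temporal integration. Offsets and helper names are illustrative assumptions.
import numpy as np

OFFSETS = [(0, 0), (1, 1), (1, 0), (0, 1)]   # grid displacements in original data points

def box_filter_field(image: np.ndarray, dy: int, dx: int) -> np.ndarray:
    """Group original data points into 2x2 blocks on a grid shifted by (dy, dx)."""
    shifted = np.roll(image, shift=(-dy, -dx), axis=(0, 1))
    h, w = shifted.shape
    blocks = shifted[: h - h % 2, : w - w % 2].reshape(h // 2, 2, w // 2, 2)
    return blocks.mean(axis=(1, 3))           # one larger data point per 2x2 group

def reconstruct(fields, offsets) -> np.ndarray:
    """Upsample each low-resolution field back onto the original grid at its
    offset and average them, standing in for temporal integration by the eye."""
    acc = None
    for field, (dy, dx) in zip(fields, offsets):
        up = np.kron(field, np.ones((2, 2)))              # nearest-neighbour upsample
        up = np.roll(up, shift=(dy, dx), axis=(0, 1))     # undo the grid shift
        acc = up if acc is None else acc + up
    return acc / len(fields)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    original = rng.random((8, 8))                          # stand-in for one color field
    fields = [box_filter_field(original, dy, dx) for dy, dx in OFFSETS]
    composite = reconstruct(fields, OFFSETS)
    print("data points per transmitted field:", fields[0].size, "vs original:", original.size)
    print("mean absolute error of the temporal composite:",
          float(np.abs(composite - original).mean()))
```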


[0128] While the above example used square-grid data samples and box filters, with two by two original data points 4205 going to each output data resample 4215, 4225, 4235, and 4245, it will be appreciated that other combinations of input samples (e.g. 3×3, 4×5, etc.), filters (e.g. tent, Gaussian, Difference-of-Gaussians, etc.), and output resample grids (e.g. FIGS. 6, 7, and 8, etc.) will also function in a similar manner. All such variations are contemplated.


[0129] While the invention has been described with reference to exemplary implementations and embodiments, it will be understood that various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings without departing from the essential scope thereof. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed as the best mode contemplated for carrying out this invention, but that the invention will include all embodiments falling within the scope of the appended claims.


Claims
  • 1. A cathode ray tube (CRT) device comprising: a plurality of electron guns to produce a plurality of electron beams; a plurality of separate phosphor dots corresponding to separate colors produced when impacted by the electron beams; and steering electronics to guide the electron beams to the plurality of separate phosphor dots that form part of separate and shifted color planes.
  • 2. The CRT device of claim 1, wherein the phosphor dots include red, blue, and green phosphor dots.
  • 3. The CRT device of claim 1, further comprising: a subpixel clock coupled to the steering electronics to effect subpixel rendering.
  • 4. The CRT device of claim 3, wherein the subpixel clock effects a timing such that a plurality of colored image data is offset to each other by a spatial and temporal amount.
  • 5. The CRT device of claim 4, wherein the colored image data includes red color, blue color, and green color image data.
  • 6. The CRT device of claim 5, wherein the red color image data leads the green color image data, and wherein the green color image data leads the blue color image data.
  • 7. The CRT device of claim 5, wherein the red color image data leads the green color image data by one third of a pixel clock, and the green color image data leads the blue color image data by two thirds of a pixel clock.
  • 8. The CRT device of claim 5, wherein timing of the color data is switched in an odd and an even row, and wherein during the odd row, the red color image data leads the green color image data, and during the even row, the green color image data leads the red color image data.
  • 9. The CRT device of claim 4, wherein the offset is substantially effected in one dimension.
  • 10. The CRT device of claim 4, wherein the offset is substantially effected in two dimensions.
  • 11. A method for a cathode ray tube (CRT) device, the method comprising: producing a plurality of electron beams; and steering the electron beams to a plurality of separate phosphor dots that form part of separate and shifted color planes.
  • 12. The method of claim 11, wherein the phosphor dots include red, blue, and green phosphor dots.
  • 13. The method of claim 11, further comprising: effecting subpixel rendering.
  • 14. The method of claim 13, further comprising: effecting a timing such that a plurality of colored image data is offset to each other by a spatial and temporal amount.
  • 15. The method of claim 14, wherein the colored image data includes red color, blue color, and green color image data.
  • 16. The method of claim 15, further comprising: leading the green color image data by the red color image data; and leading the blue color image data by the green color image data.
  • 17. The method of claim 15, further comprising: leading the green color image data by the red color image data by one third of a pixel clock; leading the blue color image data by the green color image data by two thirds of a pixel clock.
  • 18. The method of claim 15, switching timing of the color data in an odd and an even row, wherein during the odd row, the red color image data leads the green color image data, and during the even row, the green color image data leads the red color image data.
  • 19. The method of claim 14, wherein the offset is substantially effected in one dimension.
  • 20. The method of claim 14, wherein the offset is substantially effected in two dimensions.
  • 21. A cathode ray tube (CRT) device comprising: a plurality of electron guns to produce a plurality of electron beams; a plurality of separate phosphor dots corresponding to separate colors produced when impacted by the electron beams; steering electronics to guide the electron beams to the plurality of separate phosphor dots that form part of separate and shifted color planes; and a subpixel clock coupled to the steering electronics to effect subpixel rendering.
  • 22. The CRT device of claim 21, wherein the subpixel clock effects a timing such that a plurality of colored image data is offset to each other by a spatial and temporal amount.