The invention will be described with reference to the accompanying drawings, wherein like numbers reference like elements.
Hereinafter, embodiments of the invention will be described with reference to the accompanying drawings. In the following descriptions, the column direction of the screen (the arrangement direction of data lines) will be referred to as the “vertical direction,” and the row direction of the screen (the arrangement direction of scanning lines) will be referred to as the “horizontal direction.” The same components as those of the known display device shown in
The multiple image data D′ includes the right-eye image data DR′ and the left-eye image data DL′. The right-eye image data DR′ and the left-eye image data DL′ each include image data which corresponds to one screen. As shown in
The image data combining circuit 2 includes the read-in control circuit 21, which compresses the multiple image data D′ and sequentially stores the compressed image data in the memory 22, and the read-out control circuit 23, which reads out the image data stored in the memory 22 in accordance with a predetermined rule and outputs it as the image data D corresponding to one screen. The image data combining circuit 2 uses the read-in control circuit 21 to filter out portions of the right-eye image data DR′ and the left-eye image data DL′ included in the multiple image data D′, and alternately rearranges the remaining image data using the memory 22, thereby combining them into new image data D.
The timing control circuit 8 is provided with a timing signal generating unit (not shown) that generates dot clocks for scanning pixels of the liquid crystal panel 3. Based on the dot clocks generated by the timing signal generating unit, the timing control circuit 8 generates a Y clock signal CLY, an inverted Y clock signal CLYinv, an X clock signal CLX, an inverted X clock signal CLXinv, a Y start pulse DY, and an X start pulse DX, which are supplied to the image data supply circuit 25 and the liquid crystal panel 3.
The image data supply circuit 25 includes an S/P conversion circuit 20, a read-in control circuit 21, a memory 22, and a read-out control circuit 23. The S/P conversion circuit 20 divides a chain of multiple image data D′ serially supplied from an external source into image data components DR′r, DR′g, and DR′b for the right-eye image and image data components DL′r, DL′g, and DL′b for the left-eye image, and outputs them as six-phase-developed image data. The read-in control circuit 21 filters out portions of the six image data components DR′r, DR′g, DR′b, DL′r, DL′g, and DL′b that were phase-developed by the S/P conversion circuit 20 in order to produce six new image data components DRr, DRg, DRb, DLr, DLg, and DLb, which are supplied to the memory 22. The read-out control circuit 23 rearranges the image data components DRr, DRg, DRb, DLr, DLg, and DLb stored in the memory 22, and outputs image data components Dr, Dg, and Db for the combined image. Image data components designated by “r,” “g,” and “b” are image data components of red, green, and blue, respectively. The image data components Dr, Dg, and Db are the red, green, and blue image data components, respectively, for the combined image of the right-eye image and the left-eye image.
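To make the six-phase development concrete, the following is a minimal software sketch of the serial-to-parallel split performed by the S/P conversion circuit 20. The Python representation, the function name, and the assumed serial ordering of D′ (right-eye r, g, b followed by left-eye r, g, b for each pixel) are illustrative assumptions, not details taken from the specification.

```python
# Hypothetical sketch of the six-phase serial-to-parallel development performed
# by the S/P conversion circuit 20. The assumed serial ordering of D'
# (right-eye r, g, b, then left-eye r, g, b, per pixel) is illustrative only.

def serial_to_six_phase(d_prime):
    """Split a flat serial stream into the six component streams
    DR'r, DR'g, DR'b, DL'r, DL'g, DL'b."""
    names = ("DRr", "DRg", "DRb", "DLr", "DLg", "DLb")
    phases = {name: [] for name in names}
    for i, value in enumerate(d_prime):
        phases[names[i % 6]].append(value)
    return phases


# Two pixels' worth of serial data yield six streams of two values each.
stream = [10, 20, 30, 11, 21, 31, 12, 22, 32, 13, 23, 33]
six = serial_to_six_phase(stream)
print(six["DRr"], six["DLb"])  # [10, 12] [31, 33]
```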
Each formation area of the pixel electrodes 33 constitutes a subpixel. Each subpixel corresponds to a color element, such as red, green, or blue. The whole image display area W is formed by arranging the subpixels in the horizontal and vertical directions. Although not shown in the drawings, a plurality of color filters are arranged as stripes on the image display area W. Each color filter has a red, green, or blue color and corresponds to a column of subpixels arranged in the vertical direction. The red, green, and blue color filters are arranged alternately so as to correspond to the alternating columns of subpixels. One pixel (panel pixel) comprises three subpixels, which correspond to the three color filters of red, green, and blue.
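As a small illustration of the stripe arrangement described above, the filter color of a subpixel follows from its column index. The helper below is a hypothetical sketch; the 0-indexed column convention is an assumption made for illustration.

```python
# Hypothetical helper: with vertical color-filter stripes repeating
# red, green, blue across the screen, a subpixel's filter color follows
# directly from its 0-indexed column.

def subpixel_color(column):
    return ("red", "green", "blue")[column % 3]


print([subpixel_color(c) for c in range(6)])
# ['red', 'green', 'blue', 'red', 'green', 'blue']
```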
In the peripheral portion of the image display area W, a peripheral driving circuit is provided, which includes a scanning line driving circuit 31, a data line driving circuit 32, and a sampling circuit 38. These circuits may be integrally formed on the substrate of the pixel electrodes 33 or may be separately provided as driving ICs.
Between the data line driving circuit 32 and the sampling circuit 38, three image signal lines 37 are provided for supplying the image data components Dr, Dg, and Db. Each of the three image signal lines 37 corresponds to one of the three-phase-developed red, green, and blue image data components Dr, Dg, and Db.
One end of each data line 35 is electrically connected to a corresponding sampling switch 36. Each sampling switch 36 is electrically connected to one of the three image signal lines 37 that supply the three-phase image data components Dr, Dg, and Db. The sampling switches 36 are arranged in the horizontal direction and constitute the sampling circuit 38.
The scanning line driving circuit 31 receives the Y clock signal CLY, the inverted Y clock signal CLYinv and the Y start pulse DY from the timing control circuit 8 shown in
The data line driving circuit 32 receives the X clock signal CLX, the inverted X clock signal CLXinv and the X start pulse DX from the timing control circuit 8 shown in
The sampling signals are supplied on a pixel (panel pixel) basis, each pixel being a set of three (red, green, and blue) subpixels arranged adjacently in the horizontal direction. The data line driving circuit 32 sequentially supplies the sampling signals S1, S2, . . . , Sn to the sampling switches 36 on a pixel-by-pixel basis. The sampling switches 36 are sequentially turned ON in accordance with the sampling signals. The image data components Dr, Dg, and Db are sequentially supplied to the data lines 35 on a pixel-by-pixel basis via the turned-ON sampling switches 36.
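The pixel-by-pixel sampling can be pictured with the following sketch, which models one horizontal scan in software. The list-based representation of the data lines and the function name are assumptions made for illustration, not part of the specification.

```python
# Illustrative model (not from the specification) of one horizontal scan:
# each sampling signal S1..Sn turns one switch ON, driving that pixel's three
# data lines from the image signal lines carrying Dr, Dg, and Db.

def sample_one_row(dr, dg, db):
    """Return the values placed on the 3*n data lines for one scanning line."""
    n = len(dr)                       # number of panel pixels in a row
    data_lines = [None] * (3 * n)
    for i in range(n):                # sampling signal S(i+1) turns switch i ON
        data_lines[3 * i + 0] = dr[i]     # red subpixel data line
        data_lines[3 * i + 1] = dg[i]     # green subpixel data line
        data_lines[3 * i + 2] = db[i]     # blue subpixel data line
    return data_lines


print(sample_one_row([1, 4], [2, 5], [3, 6]))  # [1, 2, 3, 4, 5, 6]
```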
Next, an image processing method of the image data combining circuit 2 will be described with reference to
The multiple image data D′ includes right-eye image data DR′ corresponding to one screen, which is represented as R(1, 1), R(1, 2), and left-eye image data DL′ corresponding to one screen, which is represented by L(1, 1), L(1, 2). In
In the arrangement diagrams 5 and 7, each rectangular area represents image data of a subpixel. The characters on the upper two lines in each rectangular area represent the type (right-eye or left-eye image) of the image data along with the coordinates on the screen W of the pixel which includes the subpixel. For example, if the upper portion of the rectangular area of the image data is indicated by “R(n, k)” (wherein, n and k are integers), the image data is the right-eye image data of a pixel arranged at the n-th row and k-th column on the screen W. The character on the bottom line in each rectangular area represents the color information of the subpixel. The characters “r,” “g,” and “b” represent color information of red, green, and blue, respectively. For example, if the bottom line in the rectangular area of the image data is indicated by “m” (wherein, m is r, g or b), the image data is the image data of a subpixel corresponding to a color filter of m color. Hereinafter, the image data of the subpixel will be simply denoted as “R(n, k)m” (wherein, n and k are integers; and m is r, g or b).
First, the read-in control circuit 21 stores in the memory 22 the image data of the right-eye image data DR′ that corresponds to the first row, beginning with the coordinates (1, 1), for each filter color. Here, the image data R(1, 1)r, R(1, 1)g, R(1, 1)b, . . . , R(1, 4)r, R(1, 4)g, and R(1, 4)b are stored in the memory 22. In this configuration, the image data R(2, 1)r, R(2, 1)g, R(2, 1)b, . . . , R(2, 4)r, R(2, 4)g, and R(2, 4)b are filtered out and thus are not stored in the memory 22, while the image data R(3, 1)r, R(3, 1)g, R(3, 1)b, . . . , R(3, 4)r, R(3, 4)g, and R(3, 4)b are stored in the memory 22.
In this configuration, the image data corresponding to odd-numbered rows are stored in the memory 22, and the image data corresponding to even-numbered rows are filtered out without being stored in the memory 22. As a result, new right-eye image data DR having half the amount of information of the original right-eye image data DR′ are stored in the memory 22. The new right-eye image data DR are obtained by storing a portion of the original right-eye image data DR′ while filtering out the remaining portion of the original right-eye image data DR′. Thus, the information amount of the image data DR is reduced by 50 percent relative to the image data DR′.
After completing the image processing operation for the right-eye image data DR′, the image processing operation for the left-eye image data DL′ is performed. Using a process similar to the one described above, the read-in control circuit 21 stores the image data of the left-eye image data DL′ that corresponds to the first row, beginning with the coordinates (1, 1). Thus, the image data L(1, 1)r, L(1, 1)g, L(1, 1)b, . . . , L(1, 4)r, L(1, 4)g, and L(1, 4)b are stored in the memory 22. The image data of the next row, beginning with the coordinates (2, 1), are then filtered out. Thus, the image data L(2, 1)r, L(2, 1)g, L(2, 1)b, . . . , L(2, 4)r, L(2, 4)g, and L(2, 4)b are not stored in the memory 22, while the image data of the next row, beginning with the coordinates (3, 1), i.e., the image data L(3, 1)r, L(3, 1)g, L(3, 1)b, . . . , L(3, 4)r, L(3, 4)g, and L(3, 4)b, are stored in the memory 22.
In this way, the image data corresponding to odd-numbered rows are stored in the memory 22, and the image data corresponding to even-numbered rows are filtered out without being stored in the memory 22. As a result, new left-eye image data DL having half the amount of information of the original left-eye image data DL′ are stored in the memory 22. The new left-eye image data DL are obtained by selecting a portion of the original left-eye image data DL′ in predetermined rows while filtering out the remaining portion of the original left-eye image data DL′. Thus, the information amount of the image data DL is reduced by 50 percent relative to the image data DL′.
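The read-in filtering described above for both DR′ and DL′ amounts to keeping the odd-numbered rows and discarding the even-numbered ones. The sketch below illustrates this under the assumption that an image is represented as a list of rows; the names are hypothetical.

```python
# Minimal sketch of the read-in filtering: only the odd-numbered rows of each
# eye's image data are stored, halving the amount of information. The
# list-of-rows image representation is an assumption made for illustration.

def keep_odd_rows(image):
    """Keep rows 1, 3, 5, ... (1-indexed) of an image given as a list of rows."""
    return [row for index, row in enumerate(image, start=1) if index % 2 == 1]


# Right-eye data DR' with four rows becomes DR with rows 1 and 3 only;
# the left-eye data DL' is processed the same way.
dr_prime = [["R(1,1)r", "R(1,1)g"], ["R(2,1)r", "R(2,1)g"],
            ["R(3,1)r", "R(3,1)g"], ["R(4,1)r", "R(4,1)g"]]
print(keep_odd_rows(dr_prime))
# [['R(1,1)r', 'R(1,1)g'], ['R(3,1)r', 'R(3,1)g']]
```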
After the image processing operation, the read-out control circuit 23 reads out the right-eye image data DR and the left-eye image data DL from the memory 22 according to a predetermined rule. The red, green, and blue subpixels included in the pixel at coordinates (1, 1) are supplied with the image data R(1, 1)r, L(1, 1)g, and R(1, 1)b, respectively. The red, green, and blue subpixels included in the pixel at coordinates (1, 2) are supplied with the image data L(1, 2)r, R(1, 2)g, and L(1, 2)b, respectively. The red, green, and blue subpixels included in the pixel at coordinates (1, 3) are supplied with the image data R(1, 3)r, L(1, 3)g, and R(1, 3)b, respectively. The red, green, and blue subpixels included in the pixel at coordinates (1, 4) are supplied with the image data L(1, 4)r, R(1, 4)g, and L(1, 4)b, respectively.
Similarly, the red, green, and blue subpixels included in the pixel at coordinates (2, 1) are supplied with the image data L(2, 1)r, R(2, 1)g, and L(2, 1)b, respectively, and the red, green, and blue subpixels included in the pixel at coordinates (2, 2) are supplied with the image data R(2, 2)r, L(2, 2)g, and R(2, 2)b, respectively. The red, green, and blue subpixels included in the pixel at coordinates (2, 3) are supplied with the image data L(2, 3)r, R(2, 3)g, and L(2, 3)b, respectively, and the red, green, and blue subpixels included in the pixel at coordinates (2, 4) are supplied with the image data R(2, 4)r, L(2, 4)g, and R(2, 4)b, respectively.
In this configuration, the right-eye image data corresponding to the first and second rows (the image data R(1, 1)r, R(1, 1)g, R(1, 1)b, . . . , R(2, 4)r, R(2, 4)g, and R(2, 4)b) and the left-eye image data corresponding to the first and second rows (the image data L(1, 1)r, L(1, 1)g, L(1, 1)b, . . . , L(2, 4)r, L(2, 4)g, and L(2, 4)b) are alternately arranged at the subpixel level in both the horizontal and vertical directions. The right-eye image data and the left-eye image data corresponding to the first row are combined into image data which corresponds to two rows (the first and second rows) on the screen W. Using the same technique, the image data corresponding to the third and fourth rows on the screen W are created.
In this way, the right-eye image data corresponding to the odd-numbered rows (the p-th rows) and the left-eye image data corresponding to the odd-numbered rows (the p-th rows) are alternately arranged at the subpixel level in both the horizontal and vertical directions. Similarly, the right-eye image data corresponding to the p-th row and the left-eye image data corresponding to the p-th row are combined into image data corresponding to two rows (the p-th and (p+1)-th rows) on the screen W. As a result, new image data (combined data) D is produced by combining the right-eye image data DR and the left-eye image data DL.
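The read-out rule described above alternates right-eye and left-eye data at the subpixel level in both directions, which corresponds to a checkerboard pattern: in the first combined row the red subpixel of pixel (1, 1) takes right-eye data, the next subpixel takes left-eye data, and so on, with the starting eye flipped on the next row. The sketch below illustrates only this checkerboard selection; it assumes that the right-eye and left-eye inputs have already been laid out at the panel's full subpixel resolution and omits the bookkeeping that maps each stored p-th row onto the p-th and (p+1)-th panel rows. All names are illustrative.

```python
# Sketch of the subpixel-level checkerboard selection described above: a
# subpixel at (row, col) takes the right-eye value when row + col is even and
# the left-eye value otherwise. Inputs are assumed to be equal-sized 2-D lists
# of subpixel values already laid out at the panel's resolution.

def combine_checkerboard(right, left):
    combined = []
    for i, (r_row, l_row) in enumerate(zip(right, left)):
        combined.append([r if (i + j) % 2 == 0 else l
                         for j, (r, l) in enumerate(zip(r_row, l_row))])
    return combined


right = [["R11r", "R11g", "R11b"], ["R21r", "R21g", "R21b"]]
left = [["L11r", "L11g", "L11b"], ["L21r", "L21g", "L21b"]]
print(combine_checkerboard(right, left))
# [['R11r', 'L11g', 'R11b'], ['L21r', 'R21g', 'L21b']]
```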
As shown in
In
In
The size in the horizontal direction of the individual display pixels PR1 and PR2 is equal to that of the pixels (panel pixels) required for displaying the two-dimensional image. Thus, the resolution is not reduced, even when displaying a fine image. Accordingly, the right-eye image R formed by the display pixels PR1 and PR2 has a high horizontal resolution and a smooth border.
The above statements can be similarly applied to the left-eye image L shown in
As described above, according to the display device 1 of the present embodiment, since the size in the horizontal direction of the display pixels PR1, PR2, PL1, and PL2 for displaying a combined image is equal to that of the pixels (panel pixels) required for displaying the two-dimensional image, it is possible to display a clear image with a high resolution. Although the resulting image may become coarse in the vertical direction, this can be compensated for by inputting, as the multiple image data D′, image data whose density is doubled in the vertical direction.
Another embodiment of the invention includes a stereoscopic display device for displaying a stereoscopic image; however, the invention may be applied to any number of multi-viewpoint display devices for presenting a multi-viewpoint image to a plurality of observers. In the case of the stereoscopic display device, the right-eye image data DR′ and the left-eye image data DL′ are prepared as the image data for display, and the right-eye image R and the left-eye image L are spatially separated using the image separating unit (a parallax barrier, for example). In a multi-viewpoint display device, the multi-viewpoint image data is prepared as the image data for display, and images of different viewpoints are spatially separated using the image separating unit (a parallax barrier, for example). For example, in the case of a display device for an on-vehicle navigation system, an image of a first viewpoint (the driver's-seat side) and an image of a second viewpoint (the passenger's-seat side) may be prepared as a navigation image and a television image, respectively, and presented to the corresponding observers (the driver and the passenger) using the image separating unit. In the case of the stereoscopic display device, the arrangement relationship between the panel pixels and the apertures of the parallax barrier is based on the positions of the right and left eyes, while in a multi-viewpoint display device, the arrangement relationship between the panel pixels and the apertures of the parallax barrier is based on the positions of the observers.
In the present embodiment, a liquid crystal panel is used as the display unit 3. Instead of the liquid crystal panel, other display panels may be used as the display unit. For example, a non-emission-type panel, such as a liquid crystal panel or an electrophoresis panel, or a self-emission-type panel, such as an electroluminescence (EL) panel, may be used as the display panel. Moreover, image separating units other than the parallax barrier B, such as lenticular lenses, may be used.
Although the exemplary embodiments of the invention have been described with reference to the accompanying drawings, it should be understood that the invention is not limited to such embodiments. Various shapes or combinations of respective constituent elements illustrated in the above-described embodiments are merely examples, and various changes may be made depending on design requirements or the like without departing from the spirit or scope of the invention.
Number | Date | Country | Kind
---|---|---|---
2006-268985 | Sep 2006 | JP | national
2007-003236 | Jan 2007 | JP | national