This application claims the benefit of priority under 35 U.S.C. §119 to Japanese Patent Application No. 2003-387039, filed on Nov. 17, 2003, the entire contents of which are incorporated herein by reference.
1. Field of the Invention
The present invention relates to a display device having an image capture function, and an imaging method.
2. Related Art
A liquid crystal display device has an array substrate on which signal lines, scanning lines, and pixel TFTs (thin film transistors) are disposed, and drive circuits that drive the signal lines and the scanning lines. With recent advances in integrated circuit technology, a process technique of forming a part of the drive circuits on the array substrate has come into practical use. With this technique, the entire liquid crystal display device can be made lighter, thinner, and more compact, enabling it to be widely used in various portable devices such as portable telephones and notebook computers.
A display device having an image capture function, in which photoelectric conversion elements for capturing images are disposed on the array substrate, has been proposed (see, for example, Japanese Patent Application Nos. 2001-292276 and 2001-339640).
In the conventional display device having the image capture function, the charge of a capacitor connected to each photoelectric conversion element changes according to the amount of light received by the photoelectric conversion element, and the voltage across the capacitor is detected, thereby capturing an image.
Recently, a technique of forming the pixel TFTs and the drive circuits on the same glass substrate by a polycrystalline silicon (i.e., polysilicon) process has been developed. The polysilicon process also makes it easy to form a photoelectric conversion element in each pixel.
In order to detect gradations of a picked-up image in a display device having an image capture function, one available method picks up images while changing the pickup time in stages and combines the picked-up images to finally obtain one image.
According to this method, however, the final picked-up image is not correctly gamma-adjusted and therefore has poor display quality. There is also a risk that the obtained image contains considerable noise, depending on the characteristics of the photoelectric conversion elements, and a risk that the setting of the image capture conditions becomes complex.
The present invention has been achieved in light of the above problems, and has an object of providing a display device having an image capture function with excellent display quality and little noise. It is another object of the present invention to provide an imaging method capable of picking up images based on a simple condition setting.
A display device according to one embodiment of the present invention comprises:
a pixel array part having display elements provided inside pixels formed in the vicinity of cross points of signal lines and scanning lines aligned in matrix form;
image capture circuits provided corresponding to said display elements, each including an optical sensor which picks up an image of a prescribed range of an imaging subject; and
an image capture processing unit which generates final image capture data based on the results of picking up images a plurality of times for each of M kinds of display colors while changing an image capture condition other than the display color.
Furthermore, a display device according to one embodiment of the present invention comprises:
a pixel array part having display elements provided inside pixels formed in the vicinity of cross points of signal lines and scanning lines aligned in matrix form;
image capture circuits provided corresponding to said display elements, each picking up an image of a prescribed range of an image capture subject;
an image capture condition switching unit which switches an exposure time of said image capture circuits in stages; and
an irregularity processing unit which performs irregularity processing on a picked-up image, taking into consideration the polarity of display data written into the pixels at the time of image capture.
Furthermore, a display device according to one embodiment of the present invention comprises:
a pixel array part having display elements provided inside pixels formed in the vicinity of cross points of signal lines and scanning lines aligned in matrix form;
image capture circuits provided corresponding to said display elements, each including an optical sensor which picks up an image of a prescribed range of an imaging subject; and
an image capture condition switching unit which switches an exposure time of said image capture circuits in stages,
wherein said image capture condition switching unit sets the exposure time T of said image capture circuits so that (1/T − 1/T0)^(1/γ) changes at substantially a constant interval, where T is an exposure time of said image capture circuits, T0 is an exposure time for reading out ideal black, and γ is a gamma value of said pixel array part.
Furthermore, an image capture method according to one embodiment of the present invention is for a display device comprising:
image capture circuits provided corresponding to display elements of a pixel array part, each including an optical sensor which picks up an image of a prescribed range of an imaging subject,
wherein a display face of said pixel array part is arranged close to a white face of a lid, and said image capture circuits pick up images in a state in which the display of said pixel array part is set to halftone, to acquire an irregularity image for irregularity subtraction.
Furthermore, an image capture method according to one embodiment of the present invention is for a display device comprising:
a pixel array part having display elements provided inside pixels formed in the vicinity of cross points of signal lines and scanning lines aligned in matrix form; and
image capture circuits provided corresponding to said display elements, each including an optical sensor which picks up an image of a prescribed range of an imaging subject,
wherein said image capture circuits pick up images while a plurality of colors for picking up a color image are displayed at the same time, and an image capture condition for each color is decided based on the image capture result.
Hereafter, a display device and an imaging method according to the present invention will be described more specifically with reference to the drawings.
The image processing IC 2 can be mounted on the array substrate 1, or on another substrate separate from the array substrate 1. The image processing IC 2 can take any form, such as an ASIC (an LSI for a specific application) or an FPGA (a programmable LSI). The image processing IC 2 has a memory 6 and a processing circuit 7. For the memory 6, an SRAM (static random access memory) or a DRAM (dynamic random access memory) can be used.
The host PC 3 sends image data for display and video/setting rewrite commands to the image processing IC 2. The display data from the host PC 3 is stored in the memory 6, and a video/setting rewrite command is stored in the processing circuit 7. The video data stored in the memory 6 is sent to the array substrate 1 via the interface 5. The processing circuit 7 sends a display/imaging control signal to the array substrate 1 via the interface 5. The image data obtained on the array substrate 1 is sent to the memory 6 via the interface 5. The processing circuit 7 executes image processing such as multiple gradation and rearranging on the video data and image data stored in the memory 6.
Multiple gradation refers to adding plural pieces of (binary) image data obtained by picking up images under plural image capture conditions and averaging the sum over the number of conditions. Rearranging refers to reordering the image data output from the array substrate 1 (whose order is determined by the configuration of the circuit that outputs data from the array substrate 1) into the order following the sensor disposition. The processed image data (hereinafter, gradation data) is sent from the memory 6 to the host PC 3 via the interface 4.
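As a rough illustration of these two operations, the following Python sketch averages N binary captures into gradation data and reorders the output stream to follow the sensor disposition. The function names and the form of `output_order` are hypothetical; the actual circuit-dependent output order is an assumption here.

```python
import numpy as np

def multiple_gradation(binary_frames):
    """Average N binary captures (0/1 arrays picked up under N image
    capture conditions) into one gradation image in [0, 1]."""
    frames = np.stack(binary_frames).astype(np.float32)
    return frames.sum(axis=0) / len(binary_frames)

def rearrange(raw, output_order):
    """Reorder the data stream output by the array substrate so that it
    follows the sensor disposition. output_order[i] gives the sensor
    index of the i-th output sample (circuit-dependent; assumed known)."""
    rearranged = np.empty_like(raw)
    rearranged[output_order] = raw
    return rearranged
```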
The image data contains noise, and therefore, requires image processing. The processing circuit 7 carries out a part of the image processing, and the host PC 3 carries out the rest of the image processing by software. The display device sends a large amount of image data to the image processing IC 2. The image processing IC 2 sends only the image-processed data to the host PC 3. The amount of the gradation data is smaller than the amount of all image data.
As is clear from
Concerning the imaging operation, only the collection of the image-processed data and the sending of the video/setting rewrite commands are carried out via the CPU bus. Therefore, the CPU bus does not require high-speed transmission. Each time one image is picked up, the image processing, including the multiple gradation and the rearranging, is carried out inside the image processing IC 2. Therefore, the image processing time can be reduced substantially compared with processing all of the multiple gradation and rearranging on the host PC side. Because the transmission speed of the CPU bus can be low, the cost of the total system can be reduced.
The pixel array 11 has plural pixel circuits 15 disposed vertically and horizontally.
The photodiode PD can be formed using a polysilicon TFT, or can be a diode formed by injecting an impurity into polysilicon.
The display device according to the present embodiment can carry out both normal display and image capture like a scanner. To carry out normal display, the transistor Q3 is set to the off state, and no valid data is stored in the buffer 13. A pixel voltage is supplied to the signal line from the signal line drive circuit 12, and a display corresponding to this pixel voltage is carried out.
Light from the backlight 27 passes through the array substrate 1 and the opposite substrate 26 and irradiates a subject 28 to be imaged. The photodiode PD on the array substrate 1 receives the light reflected from the subject 28, thereby capturing an image of the subject.
The sensor capacity C1 stores the picked-up image data as shown in
When the light shielding layer 24 is provided in the array substrate 1 as shown in
The processing circuit 7 shown in
An output from the image capture sensor 7 is binary data. Gradation data of red is generated based on the results of the N imaging operations (step S3).
Similarly, in a state in which the whole array substrate 1 displays green, the image capture sensor 7 images the subject N times while changing an image capture condition in stages (step S4). Gradation data of green is generated based on the results of the N imaging operations (step S5).
Similarly, in a state in which the whole array substrate 1 displays blue, the image capture sensor 7 images the subject N times while changing an image capture condition in stages (step S6). Gradation data of blue is generated based on the results of the N imaging operations (step S7).
By picking up images in the color order of red, green, and blue, an imaging result with the least influence of noise is obtained. Because red is most affected by temperature changes, it is preferable to obtain the red image first.
The diode, or the polysilicon TFT formed by injecting an impurity into polysilicon, has S/N ratios of the optical leak current that are good in the order of blue, green, and red; the S/N ratio of red is the worst. During imaging, heat is transferred from the paper to the sensor. For example, the array substrate or the opposite substrate including the sensor has a temperature of about 32° C. due to the backlight. The paper does not necessarily have the same temperature as the array substrate, depending on its state, and may have a lower temperature of, for example, 25° C. The imaging takes from a few seconds to several tens of seconds, and during the imaging the sensor is cooled by the paper. Because the temperature is not uniform over the surface, a large variation in thermocurrent occurs within the chip.
For the above reasons, the image picked up in the red display has a large variation and large noise. When the optical sensor is formed using a different material, it is preferable to check the S/N ratio of the optical leak current for each color and to pick up the image of the color having the worst S/N ratio first.
Next, the gradation data of the red, green, and blue colors obtained at steps S3, S5, and S7 are simply combined together (step S8). In this state, an image having irregular colors is obtained.
A color correction linear conversion is carried out to correct the gradation data of the red, green, and blue colors (step S9). Specifically, a color correction is carried out using a 3×3 matrix as shown in the following expression (1).
|R′|   |M11 M12 M13| |R|
|G′| = |M21 M22 M23| |G|   (1)
|B′|   |M31 M32 M33| |B|
Each coefficient of this matrix is obtained in advance such that the difference between a reproduced color and the original color of a color chart is as small as possible. The coefficients are subject to the constraints shown in the following expressions (2) to (4). Therefore, it is possible to improve color reproduction without changing the white balance.
M11+M12+M13=1 (2)
M21+M22+M23=1 (3)
M31+M32+M33=1 (4)
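A minimal sketch of this color correction is shown below, assuming the nine coefficients have already been fitted to a color chart. The sample matrix is illustrative only, but each of its rows sums to 1 as required by expressions (2) to (4).

```python
import numpy as np

# Illustrative coefficients only; the real values are fitted in advance so
# that the reproduced colors of a color chart match the original colors.
# Each row sums to 1, satisfying constraints (2) to (4).
M = np.array([
    [ 1.2, -0.1, -0.1],
    [-0.1,  1.3, -0.2],
    [-0.2, -0.1,  1.3],
])

def color_correct(rgb):
    """Apply the 3x3 color correction of expression (1) to an (H, W, 3)
    gradation image normalized to [0, 1]."""
    assert np.allclose(M.sum(axis=1), 1.0)
    return np.clip(rgb @ M.T, 0.0, 1.0)
```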
In the flowchart shown in
After the color correction linear conversion is carried out at step S9 shown in
The display device can have a lid 31 as shown in
Next, a lateral irregularity processing is carried out to remove lateral lines on the screen (step S11). At the same time, a skeleton processing is carried out (step S12).
The outline of the lateral irregularity processing is explained.
When carrying out a lateral irregularity processing, a proximity filter consisting of a 3×3 matrix as shown in
G(x,y)=[F(x,y−1)+2F(x,y)+F(x,y+1)]/4 (5)
When this lateral irregularity processing is carried out, a picked-up image having no lateral lines is obtained as shown in
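A sketch of expression (5) follows, assuming the picked-up image is a 2-D array indexed as F[y, x]; the top and bottom edge rows are simply clamped, which is an assumption not stated in the text.

```python
import numpy as np

def lateral_irregularity_filter(F):
    """Vertical 1-2-1 smoothing of expression (5):
    G(x, y) = [F(x, y-1) + 2*F(x, y) + F(x, y+1)] / 4."""
    up = np.roll(F, 1, axis=0)     # row y-1
    down = np.roll(F, -1, axis=0)  # row y+1
    up[0, :] = F[0, :]             # clamp top edge (assumption)
    down[-1, :] = F[-1, :]         # clamp bottom edge (assumption)
    return (up + 2.0 * F + down) / 4.0
```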
As another method, two picked-up images are acquired while the pixel voltage is written into each pixel under the same image capture condition, and the lines of the same polarity are extracted from these two images and combined to generate one image.
More specifically, when the pixel voltage is written into each pixel under the same image capture condition, the polarities of the first picked-up image alternate as +/−/+/−/ . . . for each line, and the polarities of the second picked-up image alternate as −/+/−/+/ . . . for each line. The image capture data of the odd lines is selected from the first picked-up image, and the image capture data of the even lines is selected from the second picked-up image. By combining these data, one picked-up image free of lateral irregularity can be obtained. A sketch of this line-wise combination is given below.
Even when the polarity inversion occurs for each column, or for each line and column, vertical irregularity and checkered irregularity can be eliminated by extracting and combining the components of the same polarity from two picked-up images.
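The sketch below assumes the odd display lines correspond to the even 0-based row indices of the array (an indexing assumption). For column-wise or checkered polarity inversion, the same extraction would use column slices or a checkerboard mask instead.

```python
import numpy as np

def combine_same_polarity(frame1, frame2):
    """Build one image free of lateral (line) irregularity from two captures
    taken under the same image capture condition but with opposite line
    polarities: odd display lines come from frame1, even lines from frame2."""
    out = np.empty_like(frame1)
    out[0::2, :] = frame1[0::2, :]  # odd display lines (rows 0, 2, 4, ...)
    out[1::2, :] = frame2[1::2, :]  # even display lines (rows 1, 3, 5, ...)
    return out
```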
The image obtained in the blue display has the least blur because the image capture sensor 7 using the diode or the polysilicon TFT formed by injecting an impurity into polysilicon has the highest sensitivity to the blue color among three colors of blue, green, and red.
When the above lateral irregularity processing is carried out on an image obtained by combining the picked-up images shown in
When the simple-combined image data obtained by simply combining the outputs of the image capture sensor 7 is expressed as (R1, G1, B1), the image data before the lateral irregularity processing as (R2, G2, B2), and a coefficient as U, the gradation data (R3, G3, B3) after the averaging is expressed by the following expressions (6) to (8).
R3=(R1+U×B2)/(1+U) (6)
G3=(G1+U×B2)/(1+U) (7)
B3=(B1+U×B2)/(1+U) (8)
In expressions (6) to (8), the coefficient U denotes the ratio of blue mixed into the original image, and is a non-negative real number. U is set to 0.5 to 1 when high resolution (i.e., a sharp outline) is required, as for a name card, and to 0 to 0.5 when reproduction of white is important, as for a natural image. In other words, the skeleton processing is not applied strongly to a natural image. When the optical sensor is formed using a different material, the color dependency of the S/N ratio is checked, and the image obtained with the color having the best S/N ratio is used for the skeleton processing.
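A sketch of expressions (6) to (8) is shown below, treating the simple-combined data (R1, G1, B1) and the blue channel B2 of (R2, G2, B2) as given arrays; the default value of U is only an example within the ranges described above.

```python
def skeleton_mix(R1, G1, B1, B2, U=0.75):
    """Skeleton processing of expressions (6)-(8): mix the sharp,
    high-sensitivity blue component B2 into every channel.
    U around 0.5-1 favors resolution (name cards);
    0-0.5 favors white reproduction (natural images)."""
    R3 = (R1 + U * B2) / (1.0 + U)
    G3 = (G1 + U * B2) / (1.0 + U)
    B3 = (B1 + U * B2) / (1.0 + U)
    return R3, G3, B3
```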
After ending the lateral irregularity processing at step S11 and the skeleton processing at step S12 shown in
A coefficient aij is expressed in a 3×3 matrix as shown in
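The coefficients aij themselves appear only in a figure that is not reproduced here, so the following sketch uses a common sharpening kernel purely as a placeholder to show the form of the 3×3 edge emphasis; the real values are those shown in the figure.

```python
import numpy as np

# Placeholder coefficients a_ij; the real values are given in the figure.
A = np.array([
    [ 0.0, -1.0,  0.0],
    [-1.0,  5.0, -1.0],
    [ 0.0, -1.0,  0.0],
])

def edge_emphasis(F):
    """3x3 convolution: G(x, y) = sum_{i,j} a_ij * F(x+j-1, y+i-1),
    with edge pixels clamped (an assumption)."""
    H, W = F.shape
    padded = np.pad(F, 1, mode="edge")
    G = np.zeros_like(F, dtype=np.float64)
    for i in range(3):
        for j in range(3):
            G += A[i, j] * padded[i:i + H, j:j + W]
    return G
```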
After the edge emphasizing at step S14 shown in
Specifically, when the gradation values before the gamma adjustment are expressed as (R0, G0, B0) and the gradation values after the gamma adjustment are expressed as (R, G, B), the following expression (10) is calculated.
R = R0^γ, G = G0^γ, B = B0^γ (10)
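A sketch of expression (10) follows, assuming the gradation values are normalized to [0, 1] before the exponentiation; the default gamma value is only a typical placeholder, not a value given in the text.

```python
import numpy as np

def gamma_adjust(R0, G0, B0, gamma=2.2):
    """Expression (10): R = R0^gamma, G = G0^gamma, B = B0^gamma,
    applied element-wise to gradation values normalized to [0, 1]."""
    return np.power(R0, gamma), np.power(G0, gamma), np.power(B0, gamma)
```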
From the above processing, a color image of a line image of a name card or the like is obtained. For a monochromatic image of a subject, a similar processing is carried out.
At steps S2 to S7 shown in
In general, a reproduction brightness Y of the display device and the gradation value are often set to satisfy a relationship of the expression (11).
Y = (gradation value)^γ (11)
The expression (11) is modified to obtain the expression (12).
Gradation value = Y^(1/γ) (12)
An exposure time T of the image capture sensor 7 necessary to read an image of an n-th gradation satisfies a relationship of the expression (13).
T∝1/(leak current in the n-th gradation gray scale)=1/(thermocurrent+photocurrent in the n-th gradation gray scale) (13)
From the expression (13), the following expression (14) is obtained.
1/T=a constant×(thermocurrent+photocurrent in the n-th gradation gray scale) (14)
An exposure time T0 to read ideal black (i.e., a status in which a photocurrent does not occur) is expressed by the following expression (15).
1/T0=constant×thermocurrent (15)
When the expression (15) is subtracted from the expression (14), the following expression (16) is obtained.
1/T − 1/T0 = constant × (photocurrent in the n-th gradation gray scale) ∝ (brightness in the n-th gradation) (16)
From the expression (16) and the expression (12), a relationship of the following expression (17) is obtained.
(1/T − 1/T0)^(1/γ) ∝ Y^(1/γ) = gradation of the imaged subject (17)
Therefore, in order to image the subject while changing the gradation of the subject at a constant interval, it is preferable to switch the exposure time T in stages such that (1/T − 1/T0)^(1/γ) changes at substantially a constant interval. When this exposure time T is calculated in advance and is stored in the memory 6 shown in
Here, "substantially a constant interval" refers to a deviation within a range that does not exceed the neighboring Ti when the exposure times Ti (T1, T2, . . . ) are determined by the above division method.
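A sketch of this division is shown below, assuming T0, the panel gamma, and the smallest exposure time T_min (for the brightest white) are known; it spaces (1/T − 1/T0)^(1/γ) uniformly, per expression (17). The example numbers are illustrative only.

```python
import numpy as np

def exposure_schedule(T0, T_min, gamma, N):
    """Return N exposure times T_1..T_N such that (1/T - 1/T0)**(1/gamma)
    changes by a substantially constant step.
    T0: exposure time that reads ideal black; T_min: smallest exposure time
    (brightest white); gamma: panel gamma; N: number of capture conditions."""
    s_max = (1.0 / T_min - 1.0 / T0) ** (1.0 / gamma)
    steps = np.linspace(s_max / N, s_max, N)   # uniform steps in s
    return 1.0 / (1.0 / T0 + steps ** gamma)   # invert s = (1/T - 1/T0)**(1/gamma)

# Example (illustrative numbers only): such a table would be computed in
# advance and stored in the memory 6.
times = exposure_schedule(T0=50e-3, T_min=1e-3, gamma=2.2, N=8)
```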
Specifically, T0 is obtained as follows. White paper (i.e., photographic printing paper or quality paper for photographs) is placed in close contact with the display surface of the display device, and the backlight of the display device is turned off. In this state, imaging is carried out while the imaging time is changed. The time at which the ratio of white to black in the image data output from the image capture sensor 7 becomes 1:1 is set as T0.
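One way to automate this T0 search is sketched below; `capture_binary(T)` stands for the backlight-off capture at exposure time T and is a hypothetical function, as is the list of candidate times.

```python
import numpy as np

def find_T0(capture_binary, candidates):
    """Scan candidate exposure times with white paper on the panel and the
    backlight off, and return the time at which the binary output from the
    image capture sensor is closest to a 1:1 white-to-black ratio."""
    best_T, best_err = None, float("inf")
    for T in candidates:
        frame = capture_binary(T)            # hypothetical: 0/1 array at exposure T
        white_ratio = float(np.mean(frame))  # fraction of white pixels
        err = abs(white_ratio - 0.5)
        if err < best_err:
            best_T, best_err = T, err
    return best_T
```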
The largest T is adjusted taking into consideration how bright a white the user wants as the largest gradation. When the largest T is set to a large value, a print having low reflectivity, such as a newspaper, can be read out with the background color of the newspaper reproduced as a white of high reflectivity.
A smallest T is obtained as follows. A white color surface of white paper (i.e., photographic printing paper or quality paper for a photograph, or the inner white color side of the lid 31 shown in
A processing procedure when the subject is a natural image is explained next.
A contrast-up processing is carried out next (step S32). In the contrast-up processing, the gradation data after the lateral irregularity processing is converted so that the range from its minimum gradation value to its maximum gradation value is stretched over the full gradation range.
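A sketch of the contrast-up step is shown below, assuming the gradation data is a float array and that the full range after stretching is [0, 1] (the output range is an assumption).

```python
import numpy as np

def contrast_up(gradation):
    """Stretch the gradation data so that its minimum maps to 0 and its
    maximum maps to 1."""
    lo, hi = gradation.min(), gradation.max()
    if hi == lo:                      # flat image: nothing to stretch
        return np.zeros_like(gradation, dtype=np.float64)
    return (gradation - lo) / (hi - lo)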
A median filtering is carried out next (step S33). The median filtering is a kind of the edge emphasizing processing at step S14 shown in
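The text does not give the window size of the median filter, so the common 3×3 window is sketched below as an assumption, with edge pixels clamped.

```python
import numpy as np

def median_filter3(F):
    """Replace each pixel by the median of its 3x3 neighborhood."""
    H, W = F.shape
    padded = np.pad(F, 1, mode="edge")
    windows = np.stack([padded[i:i + H, j:j + W]
                        for i in range(3) for j in range(3)], axis=0)
    return np.median(windows, axis=0)
```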
After ending the median filtering, a gamma processing similar to that at step S15 shown in
As explained above, according to the present embodiment, the subject is imaged N times while the image capture condition is changed in stages in a state in which the whole array substrate 1 displays red, then N times in a state in which it displays green, and then N times in a state in which it displays blue. Accordingly, an image with little influence of noise can be picked up.
In the color correction linear conversion, the diagonal elements and the off-diagonal elements of the matrix have opposite signs. Therefore, the influence of unnecessary color components that leak in from oblique directions can be removed.
In reading a line drawing such as a name card, the skeleton processing is carried out, thereby mixing in the highly sensitive blue component. As a result, a sharply defined image can be obtained.
In the above-mentioned embodiments, a scanner technique for reading out information from documents and pictures placed on the display device has been described. The individual techniques can also be applied to a touch panel technique, which picks up and analyzes the state in which a finger touches the surface of the display device in order to detect and calculate input coordinates and touch operations, to improve the accuracy of the operation.
They can likewise be applied to a pen input technique, in which a pointing member (light pen) having a light source is brought into contact with the display face and the state in which light from the light pen is radiated is picked up and analyzed, to improve the accuracy of the operation.
For example, the technique for removing the influence of the polarity inversion of the pixels described in
Foreign Patent Documents
JP 11-298660 (Oct. 1999)
JP 2001-292276 (Oct. 2001)