This application is based upon and claims the benefit of priority from the prior Japanese Patent Applications No.2010-140661, filed on Jun. 21, 2010 and Japanese Patent Application No. 2010-234760, filed on Oct. 19, 2010, the entire contents of which are incorporated herein by reference.
1. Field of the Invention
The present invention relates to an image pickup apparatus for capturing the image of a projection plane and a projection type image display apparatus carrying said image pickup apparatus.
2. Description of the Related Art
Recent years have seen a broadening practical use of a projection type image display apparatus carrying a camera (hereinafter referred to as “projector” as appropriate). The camera mounted thereon can be used for start-up settings such as trapezoidal distortion correction and autofocus adjustment. That is, the camera takes an image of a test pattern projected on a screen at the initial setting, and the projector itself carries out trapezoidal distortion correction and autofocus adjustment using the image picked up by the camera.
Also, as a recent trend, presentation or study guidance using the projector employs systems of directly writing letters or drawing diagrams on the screen by the use of a pointing device (e.g., a pointing pen using a laser). Such a system often uses infrared rays, instead of visible light, cast from the pointing device in order to minimize the effects of extraneous light and to avoid accidental exposure of human eyes to laser light.
The above-mentioned test pattern at initial setting is normally projected with visible light. Accordingly, the camera to capture the test pattern is generally one equipped with an infrared cut filter. On the other hand, the camera to detect the coordinate position cast by the pointing device using infrared rays is generally an infrared camera equipped with a visible light cut filter. As a result, a projector carrying both of the systems must be equipped with two cameras.
An image pickup apparatus according to one embodiment of the present invention includes: a visible light cut filter configured to transmit an infrared component and configured to block a visible light component; and a plurality of image pickup devices configured to receive light transmitted through the visible light cut filter such that a plurality of color components are received separately from each other. The visible light cut filter allows part of the visible light components to pass through such that the visible light component enters at least one of the plurality of image pickup devices.
Another embodiment of the present invention relates also to an image pickup apparatus. This apparatus includes a visible light cut filter configured to transmit an infrared component and configured to block a visible light component; and a plurality of image pickup devices configured to receive light transmitted through the visible light cut filter such that a plurality of color components are received separately from each other. The visible light cut filter is set such that the transmittance in the visible light wavelength region is greater than zero and less than the transmittance in the near-infrared wavelength region.
Still another embodiment of the present invention relates to a projection type image display apparatus. This apparatus includes: a projection unit configured to project an image onto a projection plane; and the above-described image pickup apparatus configured to pick up an image of the projection plane. The image pickup apparatus is used to both pick up a pattern image in start-up setting of the projection type image display apparatus and pick up a pointing image projected onto the projection plane by a pointing device emitting infrared rays.
Still another embodiment of the present invention relates to a projection type image display apparatus. This apparatus includes: a projection unit configured to project an image onto a projection plane; an image pickup apparatus configured to pick up an image of the projection plane; and an image analysis unit configured to identify pointing coordinates specified by a red component signal as the pointing coordinates projected onto the projection plane by a pointing device emitting red light, configured to identify pointing coordinates specified by a green component signal as the pointing coordinates projected onto the projection plane by a pointing device emitting green light and configured to identify pointing coordinates specified by a near-infrared component signal as the pointing coordinates projected onto the projection plane by a pointing device emitting infrared rays.
Embodiments will now be described by way of examples only, with reference to the accompanying drawings which are meant to be exemplary, not limiting, and wherein like elements are numbered alike in several Figures in which:
The invention will now be described by reference to the preferred embodiments. This does not intend to limit the scope of the present invention, but to exemplify the invention.
The projection unit 50 projects an image onto the projection plane 300 such as a screen. The projection unit 50 includes a light source 51 and an optical modulator 52. Light sources that can be employed as the light source 51 include a halogen lamp having a filament-type electrode structure, a metal halide lamp having an electrode structure capable of producing arc discharge, a xenon short arc lamp, a high-pressure mercury lamp, and an LED lamp.
The optical modulator 52 modulates the light incident from the light source 51 according to an image signal set by the control unit 60 (more precisely by an image signal setting unit 65 to be discussed later). For example, a digital micromirror device (DMD) can be employed as the optical modulator 52. The DMD, which is equipped with a plurality of micromirrors corresponding to the number of pixels, can produce desired image light as the orientation of each micromirror is controlled according to each pixel signal.
The image pickup apparatus 100 is installed in a fixed manner at a predetermined position on a casing of the projection type image display apparatus 200 or at a predetermined position away from the casing thereof in order to capture the image of the projection plane 300. The image pickup apparatus 100 includes a visible light cut filter 10, a lens 20, image pickup devices 30, and a signal processing circuit 40.
The visible light cut filter 10 transmits the infrared components of light reflected from the projection plane 300 and basically blocks the visible light components thereof. A plurality of image pickup devices 30 receive, via the lens 20, the light passed through the visible light cut filter 10 such that a plurality of color components are received separately from each other.
The image pickup devices 30 that can be used are CMOS (Complementary Metal Oxide Semiconductor) image sensors or CCD (Charge-Coupled Devices) image sensors, for instance. The plurality of image pickup devices 30 are arranged in a matrix, and a color filter is provided for each of the image pickup devices 30. Hereinbelow in this specification, a description will be given of an example in which the incident light to the image pickup devices 30 is received by decomposing it by the use of an RGB primary color filter (red transmission filter, green transmission filter, and blue transmission filter) into a red component, a green component, and a blue component thereof.
The incident light may instead be decomposed by the use of a complementary color filter into yellow, cyan, and magenta components, into yellow, cyan, and green components, or into yellow, cyan, magenta, and green components.
When the RGB primary color filter is to be installed, the plurality of image pickup devices 30, which include those to receive the red component (hereinafter referred to as “red image pickup device”), those to receive the green component (hereinafter referred to as “green image pickup device”), and those to receive the blue component (hereinafter referred to as “blue image pickup device”), are configured in a Bayer arrangement.
The visible light cut filter 10 allows part of the visible light components to pass through such that the visible light component enters at least one of the plurality of image pickup devices 30. In the first embodiment, at least part of the red component is transmitted, so that at least the part of the red component enters the red image pickup device.
The signal processing circuit 40 performs various kinds of signal processing, such as analog-to-digital (A/D) conversion and conversion from RGB format to YUV format, on the signals outputted from the image pickup devices 30, thereby outputting the processed signals to the control unit 60.
The image pickup apparatus 100 is used for both the capturing of a pattern image in start-up setting of the projection type image display apparatus 200 and the capturing of a pointing image projected onto the projection plane 300 by the pointing device 400 emitting infrared rays. Note also that the image pickup apparatus 100 is used for capturing images in calibration processing for the pointing device 400. These kinds of processing will be described in detail later.
The pointing device 400, which is provided with an infrared LED, for instance, is a device capable of emitting infrared rays. Also, the pointing device 400 may be provided with a visible-light LED, such as a red LED, so that it can also emit visible light. This function of emitting visible light can be utilized in calibration.
The control unit 60 performs an overall control of the projection type image display apparatus 200. The control unit 60 includes an image analysis unit 61, a setting unit 62, a synthesis unit 63, an image memory 64, and an image signal setting unit 65.
The structure of the control unit 60 may be achieved hardwarewise by elements such as a given processor, memory and other LSIs, and softwarewise by memory-loaded programs or the like. Depicted herein are functional blocks implemented by cooperation of hardware and software. Therefore, it will be obvious to those skilled in the art that the functional blocks may be implemented by a variety of manners including hardware only, software only or a combination of both.
The image memory 64 stores image signals to be projected onto the projection plane 300. The image memory 64 also stores test pattern images to be used in setting the projection type image display apparatus 200 (e.g., a pattern image for detection of trapezoidal distortion and a pattern image for focus adjustment). Such test pattern images may be a stripe pattern, a checkered flag pattern, and so forth. Also, the image memory 64 stores test pattern images for calibration of the pointing device 400. Such a test pattern image may be a pattern with a plurality of points spaced apart from each other at arbitrary intervals, for instance.
The image signal setting unit 65 sets each of the required test pattern images for the setting of the projection type image display apparatus 200 in the optical modulator 52, thereby projecting it onto the projection plane 300. The image pickup apparatus 100 captures the test pattern image projected onto the projection plane 300. As already described, although the image pickup apparatus 100 is provided with the visible light cut filter 10, the test pattern image can be captured because part of the visible light components (the red component in the first embodiment) is allowed to pass through the visible light cut filter 10.
It is to be noted that the settings of the projection type image display apparatus 200 basically concern the positional relationship between the projection type image display apparatus 200 and the projection plane 300 and/or focusing, and therefore there is no need to use a test pattern image having color variations. Hence, receiving only the red component achieves recognition of the test pattern without any problem.
The image analysis unit 61 analyzes the test pattern image captured by the image pickup apparatus 100 and supplies the results of the analysis to the setting unit 62. In the trapezoidal distortion correction, the setting unit 62 sets in the image signal setting unit 65 a correction value for correcting the image signal in response to the identified vertical or horizontal distortion in the test pattern image, in such a manner as to cancel out the distortion. In the focus adjustment, the setting unit 62 sets the position of the focusing lens (not shown) to an optimum position using a contrast detection technique.
The image signal setting unit 65 sets a test pattern image required for the calibration of the pointing device 400 in the optical modulator 52, thereby projecting the test pattern image onto the projection plane 300. A user of the pointing device 400 directs a visible light (red light in the first embodiment) at a point within the test pattern image.
The image analysis unit 61 analyzes the positional relationship between the test pattern image captured by the image pickup apparatus 100 and the pointing image cast by the pointing device 400 as pointed by the user, and supplies the results of the analysis to the setting unit 62. If there is any positional mismatch, the setting unit 62 sets in the image signal setting unit 65 a correction value for correcting the coordinate values in such a manner as to effect the calibration for the positional mismatch.
During normal projection, the synthesis unit 63 identifies the coordinate values pointed by the pointing device 400 and makes the setting in the image signal setting unit 65 so that predetermined color data is rendered at the coordinate values. Note that although the image pickup apparatus 100 does not include image pickup devices for detecting infrared rays, the infrared rays emitted by the pointing device 400 can be captured because the ordinary color filter passes infrared rays.
Therefore, for the setting of the projection type image display apparatus 200 and for the calibration of the pointing device 400, the red image pickup device is used as shown in
On the other hand, for the pointing during normal projection, the green and blue image pickup devices are used as shown in
According to the first embodiment as described above, the provision of a visible light cut filter makes it possible to accomplish both the start-up settings of the projector and the drawing system using the pointing device with a single camera. Also, if a pointing device capable of emitting visible light is used, its calibration can be realized by the same camera. Furthermore, ordinary solid-state image sensing devices configured in a Bayer arrangement may be employed without using dedicated image pickup devices for infrared rays.
Also, according to the first embodiment, the visible light cut filter provided on the image sensor equipped with the RGB primary color filter allows an image sensor capable of separately acquiring visible light signals and infrared light signals to be structured at low cost. An image sensor having regularly arranged color filters for transmitting visible-light components and color filters for transmitting infrared-light components has only limited uses, so that such an image sensor may not be mass-produced and thus tends to be expensive. By contrast, the image sensor equipped with the RGB primary color filter has applications in digital still cameras, mobile phones and the like, so that it is mass-produced and less expensive. Thus, the present embodiment, employing a less expensive image sensor, enables production of the projection type image display apparatus 200 at low cost.
A description is now given of a second embodiment. The second embodiment is an example where the cutoff wavelength of the visible light cut filter 10 is set at a shorter-wavelength side. The structures and operations of an image pickup apparatus 100 and a projection type image display apparatus 200 according to the second embodiment are basically the same as those of the first embodiment, and therefore the description thereof is omitted here.
Therefore, for the setting of the projection type image display apparatus 200 and for the calibration of the pointing device 400, the red image pickup device and the green image pickup devices are used as shown in
On the other hand, for the pointing during normal projection, the blue image pickup device is used as shown in
According to the second embodiment as described above, the cutoff wavelength of the visible light cut filter is set at a shorter-wavelength side. Also, the red transmission filter, the green transmission filter, and the blue transmission filter are each configured by a band-pass filter. Hence, the roles of the respective image pickup devices can be varied. More specifically, in the first embodiment, the resolution is low for the setting of the projector and for the calibration of the pointing device, whereas the resolution is high for the pointing during normal projection. In contrast, in the second embodiment, the resolution is high for the setting of the projector and for the calibration of the pointing device, whereas the resolution is low for the pointing during normal projection.
A third embodiment will now be described. The third embodiment is an example where the visible light cut filter 10 is set to have a low but nonzero transmittance in the visible light region, instead of cutting it off completely. The structures and operations of an image pickup apparatus 100 and a projection type image display apparatus 200 according to the third embodiment are basically the same as those of the first embodiment and therefore the description thereof is omitted here.
Since the transmittance of the visible light cut filter 10 in the visible light region is low, for the setting of the projection type image display apparatus 200 and for the calibration of the pointing device 400 the control unit 60 sets the exposure time of the image pickup devices 30 to a longer duration or amplifies the pixel values. For example, where the transmittance is 10%, the exposure time is set ten times as long. Since the test pattern image projected onto the projection plane 300 is a still image, no problem arises even if the exposure time is set longer.
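The exposure compensation just described can be sketched as follows. This is a minimal illustration, not part of the specification: the function name and the millisecond units are assumptions.

```python
def compensate_exposure(base_exposure_ms, filter_transmittance):
    """Scale the exposure time to offset a low filter transmittance.

    At 10% visible-light transmittance, the exposure must be roughly
    1 / 0.10 = 10 times as long to collect the same amount of light.
    """
    if not 0.0 < filter_transmittance <= 1.0:
        raise ValueError("transmittance must be in (0, 1]")
    return base_exposure_ms / filter_transmittance

# With 10% transmittance, a 20 ms base exposure becomes ten times as long.
print(compensate_exposure(20.0, 0.10))
```

Equivalently, the pixel values could be amplified by the reciprocal of the transmittance instead of lengthening the exposure.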
The transmittance of the visible light cut filter 10 in the near-infrared region is high. For the pointing during normal projection, the image projected by the projection type image display apparatus 200 (namely, visible light wavelengths) is dimmed by the visible light cut filter 10. Thus, the image analysis unit 61 can easily separate out the pointing image cast by the infrared rays.
According to the third embodiment as described above, the exposure time required for the pointing during normal projection is short. Thus, the capability of following (tracking) a pointing image that moves fast is high.
The present invention has been described based on the first to the third embodiments. These embodiments are intended to be illustrative only, and it is understood by those skilled in the art that various modifications to constituting elements and processes as well as arbitrary combinations thereof could be developed and that such modifications and combinations are also within the scope of the present invention.
The transmittances of the infrared wavelength regions in the red image pickup device, the green image pickup device and the blue image pickup device are approximately equal to each other. The image analysis unit 61 references the pixel values of adjacent red image pickup devices, green image pickup devices and blue image pickup devices and determines the coordinates, where these pixel values are aligned within a certain range, to be bright points caused by the infrared light emitted from the pointing device 400. Bright points due to extraneous light (visible light) not cast by the pointing device 400 differ in the pixel values among the adjacent red image pickup devices, green image pickup devices and blue image pickup devices. If it is determined that the bright points are produced by the infrared rays emitted from the pointing device 400, all pixel values of the red image pickup device, the green image pickup device and the blue image pickup device can be utilized for the pointing.
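The bright-point test described above can be sketched as follows. This is a hypothetical helper assuming 8-bit pixel values; the threshold values are illustrative choices, not values from the specification.

```python
def is_infrared_bright_point(r, g, b, brightness_min=200, spread_max=10):
    """Decide whether adjacent R, G, and B pixel values look like an
    infrared bright point.

    Infrared light passes the three color filters almost equally, so a
    bright point cast by the infrared pointing device shows nearly
    equal R, G, and B values, whereas a visible-light bright point
    shows unequal values.
    """
    values = (r, g, b)
    bright = min(values) >= brightness_min          # all channels bright
    aligned = max(values) - min(values) <= spread_max  # values aligned
    return bright and aligned

print(is_infrared_bright_point(230, 228, 233))  # near-equal and bright: True
print(is_infrared_bright_point(240, 60, 50))    # red-dominant visible light: False
```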
In the above-described first to third embodiments, descriptions have been given of a system where the image pickup apparatus 100 is used to both capture a pattern image in start-up setting of the projection type image display apparatus 200 and capture a pointing image projected onto the projection plane 300 by the pointing device 400 emitting infrared rays. Also, a description has been given of an example where the calibration is realized by using the pointing device 400 capable of emitting visible light in setting the projection type image display apparatus 200. It is assumed in the above-described first to third embodiments, however, that a single pointing device 400 for emitting infrared rays is used during normal projection.
In the following embodiments, a description will be given of examples where a plurality of pointing devices are used simultaneously during normal projection. Assume in the fourth embodiment that two kinds of pointing devices are used, which are (1) a first pointing device emitting red light (e.g., a laser-pointer type electronic pen emitting red light) and (2) a second pointing device emitting infrared rays (e.g., an electronic pen using an infrared LED). For example, assume a situation where a teacher and a student, or a presenter and a questioner, simultaneously draw symbols, characters and the like on the projection plane 300.
The image pickup apparatus 100 according to the fourth embodiment is the same as the image pickup apparatus 100 according to the first embodiment. Thus, as shown in
Comparing the projection type image display apparatus 200 according to the fourth embodiment with the projection type image display apparatus 200 according to the first embodiment, the image analysis unit 61 in the control unit 60 differs. The image analysis unit 61 according to the fourth embodiment identifies first pointing coordinates, specified by the output signal of the red image pickup device, as a pointing image projected onto the projection plane 300 by the first pointing device emitting red light. Also, the image analysis unit 61 according to the fourth embodiment identifies second pointing coordinates, specified by at least one of the output signals of the green image pickup device and the blue image pickup device, as a pointing image projected onto the projection plane 300 by the second pointing device emitting infrared rays.
If the first pointing coordinates and the second pointing coordinates practically agree with each other, the image analysis unit 61 will determine the pointing image projected on the pointing coordinates to be a pointing image projected by the second pointing device.
Here, the first pointing coordinates and the second pointing coordinates agree with each other when both sets of coordinates are detected and the two sets of coordinates fall on the two kinds of pixels within the same Bayer arrangement unit. The first pointing coordinates and the second pointing coordinates practically agree with each other when both sets of coordinates are detected and the two sets of coordinates fall on adjacent pixels in adjacent Bayer arrangement units. The range within which pixels are regarded as adjacent to each other may be adjusted optionally by the designer.
The signal processing circuit 40 performs such processing as A/D conversion and noise removal. It is to be appreciated that the fourth embodiment does not involve conversion from RGB format to YUV format.
The first pointing coordinate identifying unit 611 compares the output signals Rout of the red image pickup device of each pixel against a first preset value. If a pixel that has outputted a signal exceeding the first preset value is detected, the first pointing coordinate identifying unit 611 will convert the address of said pixel into a coordinate system and then output the coordinates to the coordinate comparison unit 613. The first preset value is a value used to detect the reflected light caused by the irradiation light of the first pointing device, and is set to a value by which the reflected light caused by the light emitted from the first pointing device, and the reflected light from the image projected onto the projection plane 300, environmental light, and so on can be distinguished from each other.
The second pointing coordinate identifying unit 612 compares the output signals Gout of the green image pickup device of each pixel against a second preset value. If a pixel that has outputted a signal exceeding the second preset value is detected in the output signals Gout of the green image pickup device, the second pointing coordinate identifying unit 612 will convert the address of said pixel into a coordinate system and then output the coordinates to the coordinate comparison unit 613. Similarly, the second pointing coordinate identifying unit 612 compares the output signals Bout of the blue image pickup device of each pixel against a third preset value. If a pixel that has outputted a signal exceeding the third preset value is detected in the output signals Bout of the blue image pickup device, the second pointing coordinate identifying unit 612 will convert the address of said pixel into a coordinate system and then output the coordinates to the coordinate comparison unit 613.
The second preset value and the third preset value are also set under the same design concept as the first preset value. Each preset value is actually determined based on the intensities of light emitted from the first pointing device and the second pointing device, the sensitivity of each pickup device where the red transmission filter, the green transmission filter and the blue transmission filter are installed, and so forth.
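The threshold comparison performed by the pointing coordinate identifying units can be sketched as follows. This is a minimal illustration, not the actual circuit: the function name, the list-of-lists frame representation, and the sample preset value are all assumptions.

```python
def identify_pointing_coordinates(frame, preset_value):
    """Return the (row, column) coordinates of every pixel whose output
    signal exceeds the preset value, mirroring the comparison performed
    by the first and second pointing coordinate identifying units."""
    coordinates = []
    for row, line in enumerate(frame):
        for column, value in enumerate(line):
            if value > preset_value:
                coordinates.append((row, column))
    return coordinates

# A tiny single-color frame: only the pixel at (1, 1) exceeds the preset value.
frame = [
    [12, 30, 15],
    [10, 250, 22],
    [18, 25, 190],
]
print(identify_pointing_coordinates(frame, 200))
```

The same routine would run once per channel (Rout against the first preset value, Gout against the second, Bout against the third), with the resulting coordinates handed to the coordinate comparison unit.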
The configuration may be such that the output signal Gout of the green image pickup device of each pixel, added with the output signal Bout of the blue image pickup device corresponding thereto, is inputted to the second pointing coordinate identifying unit 612. In this case, the added signal and a fourth preset value are compared with each other. The fourth preset value is also set under the same design concept as the first preset value.
The coordinate comparison unit 613 outputs the coordinates detected by the first pointing coordinate identifying unit 611, to the synthesis unit 63 as the coordinates that indicate the pointing image by the first pointing device emitting the red light. The synthesis unit 63 makes settings in the image signal setting unit 65 such that data of a predetermined color (e.g., red) is rendered at the first pointing coordinates.
The coordinate comparison unit 613 outputs the coordinates detected by the second pointing coordinate identifying unit 612, to the synthesis unit 63 as the coordinates that indicate the pointing image by the second pointing device emitting the infrared rays. The synthesis unit 63 makes settings in the image signal setting unit 65 such that data of a predetermined color (e.g., black) is rendered at the second pointing coordinates.
If the first pointing coordinates and the second pointing coordinates detected by the first pointing coordinate identifying unit 611 and the second pointing coordinate identifying unit 612, respectively, practically agree with each other, the coordinate comparison unit 613 will output the coordinates to the synthesis unit 63 as the coordinates indicating the pointing image by the second pointing device emitting the infrared rays but not as the coordinates indicating the pointing image by the first pointing device emitting the red light. The synthesis unit 63 makes settings in the image signal setting unit 65 such that data of a predetermined color (e.g., black) is rendered at the coordinates.
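The priority rule applied by the coordinate comparison unit 613 can be sketched as follows. This is a minimal illustration under assumed conventions: integer (row, column) coordinates, a designer-chosen adjacency range, and hypothetical function names.

```python
def resolve_pointing_coordinates(first_coords, second_coords, adjacency=1):
    """Attribute pointing coordinates to the two pointing devices.

    First-device (red light) coordinates that practically agree with
    second-device (infrared) coordinates are handed to the second
    device, because infrared rays also excite the red image pickup
    device and would otherwise be mistaken for red light.
    """
    def practically_agree(a, b):
        return (abs(a[0] - b[0]) <= adjacency and
                abs(a[1] - b[1]) <= adjacency)

    red_only = [c for c in first_coords
                if not any(practically_agree(c, s) for s in second_coords)]
    return red_only, list(second_coords)

# (10, 10) practically agrees with (10, 11), so it is treated as infrared;
# (40, 40) remains attributed to the red-light pointing device.
red, infrared = resolve_pointing_coordinates([(10, 10), (40, 40)], [(10, 11)])
print(red, infrared)
```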
According to the fourth embodiment as described above, in addition to the advantageous effects achieved by the projection type image display apparatus 200 according to the first embodiment, two pointing devices can be used simultaneously and with accuracy during normal projection. In particular, where the two pointing images by the two pointing devices agree with each other or are located close to each other on the projection plane, which pointing device produces a given pointing image can be determined with accuracy.
In the first to fourth embodiments thus far explained, descriptions have been given of examples which use the RGB primary color filter having ideal spectral sensitivity characteristics as shown in
Similar to the fourth embodiment, the visible light cut filter 10 cuts the green component, the blue component, and part of the shorter-wavelength side of the red component and transmits part of the longer-wavelength side of the red component and the infrared light component. Hereinbelow in the fifth embodiment, this visible light cut filter 10 will be referred to as the GB cut filter.
The demosaicing processing unit 45 classifies the signals outputted from the image pickup devices 30 into output signals R from the red image pickup devices 32, output signals G from the green image pickup devices 34, and output signals B from the blue image pickup devices 36 and outputs the classified signals to an arithmetic processing unit 48.
The arithmetic processing unit 48 corrects the output signals R, G, and B from the red image pickup devices 32, the green image pickup devices 34, and the blue image pickup devices 36, respectively, in order to lower, in a pseudo manner, the sensitivity in the near-infrared wavelength region of the red image pickup devices 32 and the sensitivity in the red wavelength region of the green image pickup devices 34 and the blue image pickup devices 36. The arithmetic processing unit 48 may perform the computation by the use of such hardware as a multiplier or an adder or through software processing by a processor.
As indicated in Equation (1) below, the arithmetic processing unit 48 obtains an output signal G1 after the correction of the green image pickup device 34 by subtracting a signal, which is the output signal R of the red image pickup device 32 multiplied by a first coefficient c1 (positive value), from the output signal G of the green image pickup device 34. Also, as indicated in Equation (2) below, the arithmetic processing unit 48 obtains an output signal B1 after the correction of the blue image pickup device 36 by subtracting a signal, which is the output signal R of the red image pickup device 32 multiplied by the first coefficient c1 (positive value), from the output signal B of the blue image pickup device 36. Also, as indicated in Equation (3) below, the arithmetic processing unit 48 obtains an output signal R1 after the correction of the red image pickup device 32 by subtracting a signal, which is the output signal B of the blue image pickup device 36 or the output signal G of the green image pickup device 34 multiplied by a second coefficient c2 (positive value), from the output signal R of the red image pickup device 32. The arithmetic processing unit 48 outputs the respective output signals R1, G1, and B1 after the correction to the control unit 60.
G1 = G − c1*R    Equation (1)
B1 = B − c1*R    Equation (2)
R1 = R − c2*B    Equation (3)
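Equations (1) to (3) can be expressed as a short numeric sketch. The function name and the sample signal and coefficient values are purely illustrative; actual values of c1 and c2 are derived from the spectral sensitivity characteristics as the specification describes.

```python
def correct_rgb_outputs(r, g, b, c1, c2):
    """Apply Equations (1)-(3) of the fifth embodiment in a pseudo manner.

    c1 lowers the sensitivity of the green and blue outputs to leakage
    from the red channel; c2 lowers the near-infrared sensitivity of
    the red output by subtracting the (mostly near-infrared) blue output.
    """
    g1 = g - c1 * r   # Equation (1)
    b1 = b - c1 * r   # Equation (2)
    r1 = r - c2 * b   # Equation (3): B may be replaced by G per the text
    return r1, g1, b1

# Illustrative signal levels only: R=100, G=80, B=60 with c1=0.2, c2=0.3.
print(correct_rgb_outputs(100, 80, 60, 0.2, 0.3))
```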
The image analysis unit 61 according to the fifth embodiment identifies the pointing coordinates specified by the output signal R1 after the correction of the red image pickup device 32 as the pointing coordinates projected onto the projection plane 300 by the first pointing device emitting red light. Also, the image analysis unit 61 according to the fifth embodiment identifies the pointing coordinates specified by at least one of the output signals G1 and B1 of the green image pickup device 34 and the blue image pickup device 36 as the pointing coordinates projected onto the projection plane 300 by the second pointing device emitting infrared rays.
The synthesis unit 63 makes settings in the image signal setting unit 65 such that data of mutually different colors are rendered at the pointing coordinates specified by the output signal R1 after the correction of the red image pickup device 32 and the pointing coordinates specified by at least one of the output signals G1 and B1 of the green image pickup device 34 and the blue image pickup device 36.
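How the pointing coordinates are extracted from a corrected channel image is not specified at this level of detail; one simple possibility is thresholded peak detection, sketched below (the threshold parameter and the single-peak assumption are ours):

```python
import numpy as np

def pointing_coordinates(channel, threshold):
    """Return the (row, col) of the brightest pixel of a corrected
    channel image, or None if no pixel reaches the threshold."""
    idx = np.unravel_index(np.argmax(channel), channel.shape)
    return idx if channel[idx] >= threshold else None
```

Applied to the corrected signal R1, such a routine would locate the red spot of the first pointing device; applied to G1 or B1, the infrared spot of the second pointing device.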
In the fifth embodiment as described above, the red light from the first pointing device is recognized by the red image pickup device 32, and the infrared rays from the second pointing device are recognized by the green image pickup device 34 or the blue image pickup device 36. In this case, it is preferable that the green image pickup device 34 and the blue image pickup device 36 have sensitivity only in a neighborhood of the wavelengths of the infrared rays emitted by the second pointing device. On the other hand, it is preferable that the red image pickup device 32 has sensitivity only in the neighborhood of the wavelengths of the red light emitted by the first pointing device.
Also, the red transmission filter has a transmittance of over 60% in the wavelength region of about 780 to 900 nm (see W2 in the figure).
For the image pickup apparatus 100 having such spectral sensitivity characteristics, the arithmetic processing unit 48 applies the corrections of Equation (1) to Equation (3) above.
In this manner, the spectral sensitivity characteristics are converted, in a pseudo manner, into characteristics in which each image pickup device is sensitive mainly in its intended wavelength region.
Also, the red transmission filter has a transmittance of 10% or below in the wavelength region of about 780 to 900 nm within the near-infrared wavelength region (see W2 in the figure).
Each of the values to be subtracted from the output signal G of the green image pickup device 34, the output signal B of the blue image pickup device 36, and the output signal R of the red image pickup device 32, respectively, is set to a value calculated by the designer based on the spectral sensitivity characteristics of the image pickup devices 30 and the characteristic Fcgb of the GB cut filter.
The first coefficient c1, used to determine the values to be subtracted from the output signal G of the green image pickup device 34 and the output signal B of the blue image pickup device 36, respectively, is determined as follows, for instance. That is, the first coefficient c1 is set to a value such that the sensitivity in a wavelength region between the half-value wavelength of the GB cut filter (i.e., the wavelength at which the transmittance becomes 50%) and a wavelength on the boundary of a lower-wavelength side of the near-infrared wavelength region is lowered in a pseudo manner. Ideally, the value is preferably set such that the sensitivity of the green image pickup device 34 and the blue image pickup device 36 in said wavelength region is practically zero.
The second coefficient c2, used to determine the value to be subtracted from the output signal R of the red image pickup device 32, is set such that the sensitivity in the near-infrared wavelength region is lowered in a pseudo manner. Ideally, the value is preferably set such that the sensitivity of the red image pickup device 32 in the near-infrared wavelength region is practically zero. In the examples described above, the first coefficient c1 is set to 0.4 and the second coefficient c2 is set to 0.9, so that Equation (1) to Equation (3) become Equation (4) to Equation (6) below.
G1=G−(0.4*R) Equation (4)
B1=B−(0.4*R) Equation (5)
R1=R−(0.9*B) Equation (6)
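A worked numeric example of Equation (4) to Equation (6); the coefficients 0.4 and 0.9 are from the text, while the raw signal values are hypothetical:

```python
# Hypothetical raw output signals of the three image pickup devices.
R, G, B = 200.0, 120.0, 90.0
G1 = G - 0.4 * R  # Equation (4): 120 - 80 = 40
B1 = B - 0.4 * R  # Equation (5): 90 - 80 = 10
R1 = R - 0.9 * B  # Equation (6): 200 - 81 = 119
print(R1, G1, B1)  # 119.0 40.0 10.0
```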
According to the fifth embodiment as described above, the arithmetic processing unit 48, which is provided in the image pickup apparatus 100, corrects the output signal of each image pickup device, and therefore the robustness of the pointing detection is improved. Hence, the same advantageous effects as those of the fourth embodiment can be achieved even though an RGB primary color filter having low spectral sensitivity characteristics is used.
The visible light cut filter 10 transmits the infrared component, the red component and the green component and blocks the blue component. Hereinbelow in the sixth embodiment, this visible light cut filter 10 will be referred to as the B cut filter.
The demosaicing processing unit 45 classifies the signals outputted from the image pickup devices 30 into the output signals R from the red image pickup devices 32, the output signals G from the green image pickup devices 34, and the output signals B from the blue image pickup devices 36 and outputs the classified signals to the arithmetic processing unit 48.
The arithmetic processing unit 48 generates three kinds of signals, which are a red component signal R2, a green component signal G2, and a near-infrared component signal B2, by subjecting the output signals R, G, and B from the red image pickup devices 32, the green image pickup devices 34, and the blue image pickup devices 36, respectively, to mutually different predetermined arithmetic operations.
As indicated in Equation (7) below, the arithmetic processing unit 48 generates the red component signal R2, in which the sensitivity in the near-infrared wavelength region is lowered in a pseudo manner, by subtracting (i) a signal which is the output signal G of the green image pickup device 34 multiplied by a third coefficient c3 (positive value) and (ii) a signal which is the output signal B of the blue image pickup device 36 multiplied by a fourth coefficient c4 (positive value) from the output signal R of the red image pickup device 32. Also, as indicated in Equation (8) below, the arithmetic processing unit 48 generates the green component signal G2, in which the sensitivity in the near-infrared wavelength region is lowered in a pseudo manner, by subtracting (iii) a signal which is the output signal R of the red image pickup device 32 multiplied by a fifth coefficient c5 (positive value) and (iv) a signal which is the output signal B of the blue image pickup device 36 multiplied by a sixth coefficient c6 (positive value) from the output signal G of the green image pickup device 34. Also, as indicated in Equation (9) below, the arithmetic processing unit 48 generates the near-infrared component signal B2, in which the sensitivity in the visible light region is lowered in a pseudo manner, by subtracting a signal, which is the output signal R of the red image pickup device 32 multiplied by a seventh coefficient c7 (positive value), from the output signal B of the blue image pickup device 36. The arithmetic processing unit 48 outputs the generated red component signal R2, green component signal G2, and near-infrared component signal B2 to the control unit 60.
R2=R−c3*G−c4*B Equation (7)
G2=G−c5*R−c6*B Equation (8)
B2=B−c7*R Equation (9)
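As an illustrative sketch only, Equation (7) to Equation (9) amount to the following per-pixel arithmetic (the array representation is an assumption):

```python
import numpy as np

def separate_components(R, G, B, c3, c4, c5, c6, c7):
    """Apply Equations (7)-(9): derive a red component signal R2, a green
    component signal G2, and a near-infrared component signal B2 from the
    raw channel outputs. R, G, B are equally shaped per-pixel arrays."""
    R2 = R - c3 * G - c4 * B  # Equation (7)
    G2 = G - c5 * R - c6 * B  # Equation (8)
    B2 = B - c7 * R           # Equation (9)
    return R2, G2, B2
```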
The blue component of light incident to the RGB primary color filter is cut by the B cut filter. Note that the characteristic Fr of the red transmission filter, the characteristic Fg of the green transmission filter, and the characteristic Fb of the blue transmission filter before the installation of the B cut filter are the same as those described above.
Similar to the fifth embodiment, each of the values of the third coefficient c3 to the seventh coefficient c7 used in the above Equation (7) to Equation (9) is set to a value calculated by the designer based on the spectral sensitivity characteristics of the image pickup devices 30 and the characteristic Fcb of the B cut filter. In the examples described above, c3 is set to 0.5, c4 to 0.7, c5 to 0.3, c6 to 0.7, and c7 to 0.25, so that Equation (7) to Equation (9) become Equation (10) to Equation (12) below.
R2=R−0.5*G−0.7*B Equation (10)
G2=G−0.3*R−0.7*B Equation (11)
B2=B−0.25*R Equation (12)
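To illustrate how Equation (10) to Equation (12) separate the components, suppose, purely as a simplifying assumption for illustration, that pure near-infrared light produced equal raw outputs in all three channels:

```python
# Simplifying assumption for illustration only: pure near-infrared light
# yields the same raw output, 80.0, in every channel.
R = G = B = 80.0
R2 = R - 0.5 * G - 0.7 * B  # Equation (10): 80 - 40 - 56 = -16 (suppressed)
G2 = G - 0.3 * R - 0.7 * B  # Equation (11): 80 - 24 - 56 = 0 (suppressed)
B2 = B - 0.25 * R           # Equation (12): 80 - 20 = 60 (retained)
# Only B2 retains a large positive value, so the near-infrared
# component ends up isolated in the B2 signal.
```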
The image analysis unit 61 according to the sixth embodiment identifies the pointing coordinates specified by the red component signal R2 as the pointing coordinates projected onto the projection plane 300 by a pointing device emitting red light. Also, the image analysis unit 61 according to the sixth embodiment identifies the pointing coordinates specified by the green component signal G2 as the pointing coordinates projected onto the projection plane 300 by a pointing device emitting green light. Also, the image analysis unit 61 according to the sixth embodiment identifies the pointing coordinates specified by the near-infrared component signal B2 as the pointing coordinates projected onto the projection plane 300 by a pointing device emitting infrared rays.
The synthesis unit 63 makes settings in the image signal setting unit 65 such that data of mutually different colors are rendered at the pointing coordinates specified by the red component signal R2, the pointing coordinates specified by the green component signal G2, and the pointing coordinates specified by the near-infrared component signal B2, respectively. In this manner, the three pointing devices can be simultaneously used with accuracy in the sixth embodiment.
Also, according to the image pickup apparatus 100 of the sixth embodiment, the three components, which are the green component, the red component, and the near-infrared component, can be used in the start-up setting of the projection type image display apparatus 200. Some projection type image display apparatuses 200 are provided with a wall color correction function, which adjusts the projected image toward white when the color of the wall used as the projection plane is other than white. According to the sixth embodiment, the green component, the red component, and the near-infrared component are detected, so that the color of the projection plane can be easily estimated. Then the color balance is adjusted based on the estimation result, and thereby the color of the projection plane can be brought as close to white as possible. Also, in the above-described focus adjustment, the two components, which are the green component and the red component, are used, and thereby the adjustment can be made more accurately than when the red component alone is used.
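The wall color correction idea could be sketched, in highly simplified form, as computing per-channel gains that equalize the measured averages of the projection plane; the gain formula and the restriction to the red and green averages are our own assumptions, not taken from the text:

```python
def white_balance_gains(r_avg, g_avg):
    """Return (red_gain, green_gain) that equalize the measured red and
    green averages of the projection plane, as a simplified stand-in for
    the wall color correction described in the text."""
    ref = max(r_avg, g_avg)
    return ref / r_avg, ref / g_avg
```

For a reddish wall measured at (r_avg, g_avg) = (100, 50), for instance, the green channel of the projected image would be boosted twofold relative to red.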
According to the sixth embodiment as described above, the B cut filter and the arithmetic processing unit are added to a commonly-used image pickup apparatus that uses an RGB primary color filter having low spectral sensitivity characteristics. As a result, an image pickup apparatus is realized that is capable of shifting the wavelength region to a longer-wavelength side and of identifying, with a high degree of accuracy, the three kinds of components, which are the green component, the red component, and the near-infrared component. The image pickup apparatus according to each of the above-described embodiments is applicable not only to the projection type image display apparatus but also to a monitoring camera, where the detection of infrared rays is effective, and to a variety of other uses.
The description of the present invention given above is based upon illustrative embodiments. These embodiments are intended to be illustrative only and it will be obvious to those skilled in the art that various modifications to constituting elements and processes could be further developed and that such additional modifications are also within the scope of the present invention.
Number | Date | Country | Kind
---|---|---|---
2010-140661 | Jun. 21, 2010 | JP | national
2010-234760 | Oct. 19, 2010 | JP | national