1. Field of the Invention
The present invention relates to an image projection device.
2. Description of the Related Art
Projectors project enlarged images, such as letters or graphs, and thus are widely used in presentations for large audiences and the like. In a presentation, the presenter may point at the image that is projected on the screen with, for example, a pointer in order to make the explanation easier to understand. However, directly pointing at a projected image with a laser pointer has a problem in that a desired part cannot be pointed at accurately due to trembling of the hand. Accordingly, a technique according to Japanese Laid-open Patent Publication No. H11-271675 is already known in which a built-in CCD (Charge Coupled Device) camera of a projector detects the spot that is illuminated by a user with a laser pointer and a pointer image is displayed at the same spot as the illuminated spot.
However, detecting a spot that is illuminated with a laser pointer from a video image captured by a camera in this conventional manner has a problem in that, when the color of the laser pointer is similar to the color or luminance of the projected video image, the laser pointer may not be detected, depending on the content of the projection.
In view of the above, there is a need to provide an image projection device that can accurately detect a spot that is illuminated by an illuminating device, such as a laser pointer.
It is an object of the present invention to at least partially solve the problems in the conventional technology.
An image projection device includes: a drive control unit that generates colors sequentially; a setting unit that sets a color of an illumination image; an image capturing control unit that receives, from the drive control unit, a synchronization signal that specifies a timing at which light of a color closest to the set color of the illumination image is not projected and causes an image of a projection surface to be captured in accordance with the timing that is specified by the synchronization signal; an illumination image detection unit that detects, from captured image data, an illumination image that is produced by illuminating the projection surface by an illuminating device; and an illumination image generation unit that generates projection image data obtained by combining given image data at a position at which the illumination image is detected.
An image projection device includes: a drive control unit that generates colors sequentially; a setting unit that sets a color of an illumination image; an image capturing control unit that receives, from the drive control unit, a synchronization signal that specifies a timing at which light of a color closest to the set color of the illumination image is not projected and causes an image of a projection surface to be captured in accordance with the timing that is specified by the synchronization signal; a light spot device that produces a light spot on or around the projection surface; a light spot detection unit that detects, from the captured image data, the light spot that is produced by the light spot device; and a light spot image generation unit that generates projection image data obtained by combining given image data at a position at which the light spot is detected.
The above and other objects, features, advantages and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.
An image projection device according to an embodiment of the invention will be described below with reference to the accompanying drawings.
The discoid color wheel 5 converts, in time units, white light from a light source 4 into light of the colors R, G, and B and emits the converted light toward the light tunnel 6. In the embodiment, a configuration for detecting a laser pointer in the image projection device 10 using the color wheel 5 will be described; a detailed configuration of the color wheel 5 is given below. The light tunnel 6 is formed by bonding plate glasses into a cylindrical shape and guides the light emitted from the color wheel 5 to the relay lens 7. The relay lens 7 is constructed by combining two lenses and focuses the light emitted from the light tunnel 6 while correcting the chromatic aberration of the light on the optical axis. The plane mirror 8 and the concave mirror 9 reflect the light emitted by the relay lens 7 and guide it to the image forming unit 2, where it is focused. The image forming unit 2 includes a DMD that has a rectangular mirror surface consisting of multiple micromirrors; time-division driving is performed on the micromirrors on the basis of video image or image data so as to process and reflect the projected light such that given image data is formed.
For the light source 4, for example, a high pressure mercury lamp is used. The light source 4 emits white light toward the optical system 3a. In the optical system 3a, the white light emitted from the light source 4 is divided into light of R, G and B and the light is guided to the image forming unit 2. The image forming unit 2 forms an image according to modulation signals. The projection system 3b projects an enlarged image of the formed image.
An OFF light plate that receives unnecessary light, which is not used as projection light, from among the light incident on the image forming unit 2 is provided above the image forming unit 2, as illustrated in the accompanying drawings.
First, digital signals according to, for example, HDMI (trademark), and analog signals according to, for example, VGA or component signals, are input to the video image signal input unit 13 of the image projection device 10. The video image signal input unit 13 converts the video image into RGB or YPbPr signals, etc. according to the input signals. If the input video image signal is a digital signal, the video image signal input unit 13 converts the digital signal into a bit format that is defined by the video image processing unit 12 according to the number of bits of the input signal. If the input signal is an analog signal, the video image signal input unit 13 performs A/D conversion processing for digitally sampling the analog signal, etc. and inputs the resulting RGB or YPbPr format signals to the video image processing unit 12. Image data of a projected image that is captured by the camera unit 22 is also input to the video image signal input unit 13.
The video image processing unit 12 performs digital image processing, etc. according to the input signal. Specifically, appropriate image processing is performed according to contrast, brightness, chroma, hue, RGB gain, sharpness, a scaler function for scaling up or down, the characteristics of the drive control unit 14, and/or the like. The input signal on which the digital image processing has been performed is passed to the drive control unit 14. The video image processing unit 12 can also generate image signals of an arbitrarily specified or registered layout.
The drive control unit 14 determines driving conditions for the color wheel 5, which colors the white light according to the input signals; for the image forming unit 2, which chooses whether to emit or discard light; and for a lamp power supply 17, which controls the current for driving the lamp. The drive control unit 14 then issues drive instructions to the color wheel 5, the image forming unit 2, and the lamp power supply 17, and controls the color wheel 5 so as to sequentially generate colors.
The processing performed by the drive control unit 14 includes two flows: one for capturing an image of a projection pattern that is projected by the image projection device 10 and calculating a projection conversion coefficient from the difference between the coordinate positions in the captured image and the coordinate positions in the image data, and one for detecting an illuminated spot. First, the flow of processing for capturing an image of the projected projection pattern will be described.
The drive control unit 14 issues an image capturing instruction to the camera unit 22 in accordance with a synchronization signal that corresponds to the timing of image projection. In other words, the drive control unit 14 is configured to serve also as an image capturing control unit according to the embodiment, although the drive control unit 14 and an image capturing control unit may be provided separately. When the image that is captured by the camera unit 22 is input to the video image signal input unit 13, the video image signal input unit 13 performs shading correction, Bayer conversion, color correction, and/or the like on the captured camera image to generate RGB signals. The video image processing unit 12 generates the projection pattern illustrated in the accompanying drawings.
The camera unit 22 captures an image of the scene resulting from projection of the projection pattern. The image capturing systems that may be employed by the camera unit 22 include a global shutter system and a rolling shutter system. The global shutter system exposes all pixels simultaneously and requires, for each pixel, a circuit more complicated than that for the rolling shutter system; however, it has the advantage that an image can be captured by exposing all the pixels at once. In contrast, the rolling shutter system performs sequential scanning exposure and can be implemented with a simple circuit; however, the image capturing timings differ between pixels and thus, when an image of a fast-moving object is captured, distortion and/or the like may occur. It is preferable that the shutter speed of the camera unit 22 be controlled by the drive control unit 14. The drive control unit 14 determines and controls the shutter speed required to capture an image during the period of the timing that is specified by the synchronization signal.
On the basis of the captured image, the light spot detection unit 24 acquires the coordinates of each lattice point on the projection surface on which the image is currently projected. The conversion coefficient arithmetic logic unit 26 calculates a projection conversion coefficient H that associates the coordinates (x, y) in the video image signal of the projection pattern with the coordinates (x′, y′) in the captured image, and sets the parameter H in the conversion processing unit 25. The light spot detection unit 24 extracts the coordinates at which the projection surface is illuminated. The conversion processing unit 25 performs projection conversion, using the lattice-point parameter H, on the detected illuminated spot coordinates (x′, y′) so that they are converted into the coordinates (x, y) of the illuminated spot in the video image signal. The pointer generation unit 27 (illumination image generation unit) generates illuminated spot image data at the coordinates (x, y). For the illuminated spot image data, for example, a circle having a diameter of z pixels centered on the coordinates (x, y), or a previously registered pointer image, may be generated by an arbitrary generating method. The pointer generation unit 27 performs the calculations for the image generation, generates projection image signals, and transmits the projection image signals to a video image signal transmission unit 28. The video image processing unit 12 of the image projection device 10 superimposes the video image signal generated by the pointer generation unit 27 on the original video image signal, performs arbitrary image processing, and then outputs a control signal to the drive control unit 14. Then, projection image data obtained by superimposing the pointer image is projected from the image projection device 10.
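As a concrete illustration, the projection conversion coefficient H can be computed as a planar homography from the lattice-point correspondences. The following Python sketch uses the direct linear transform (DLT); the function names and the use of NumPy are illustrative assumptions, not part of the embodiment, and a practical implementation would typically add coordinate normalization and outlier handling.

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Estimate the 3x3 projection conversion coefficient H mapping
    src_pts to dst_pts via the direct linear transform.
    src_pts, dst_pts: arrays of shape (N, 2) with N >= 4 lattice points."""
    rows = []
    for (x, y), (xp, yp) in zip(src_pts, dst_pts):
        rows.append([-x, -y, -1, 0, 0, 0, x * xp, y * xp, xp])
        rows.append([0, 0, 0, -x, -y, -1, x * yp, y * yp, yp])
    # The flattened H is the right singular vector with the smallest
    # singular value, i.e. the best null vector of the system.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, point):
    """Map one point through H using homogeneous coordinates."""
    v = H @ np.array([point[0], point[1], 1.0])
    return v[0] / v[2], v[1] / v[2]

# Estimate the captured-image -> video-signal mapping from detected
# lattice points, then convert a detected spot (x', y') to (x, y):
# H = estimate_homography(captured_lattice, signal_lattice)
# x, y = apply_homography(H, (xp, yp))
```

Estimating the mapping directly from captured-image coordinates to video-signal coordinates, as in the commented usage, avoids a separate matrix inversion when converting detected spot coordinates.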
The flow of processing performed by the light spot detection unit 24 to detect the coordinates of a pointer that is caused to illuminate the projection surface will be described below.
Accordingly, in the embodiment, image capturing and detection of laser pointers of R, G and B that are caused to illuminate the projection pattern are performed in synchronization with such timing.
In the interval of Timing A, no video image signal of a Red plane is emitted, which makes it easy to detect an illuminated spot of a red laser pointer. For this reason, when illuminating light of a red pointer is detected, the accuracy of detection improves if image capturing is performed by the camera unit 22 in synchronization with Timing A. Similarly, in the interval of Timing B, no video image signal of a Green plane is emitted, which improves the accuracy of detection of a green laser pointer. Furthermore, in the interval of Timing C, no video image signal of a Blue plane is emitted, which makes it easy to detect an illuminated spot of a blue laser pointer. As described above, each color becomes easy to detect at its respective timing and, by effectively utilizing each timing, multiple illuminated spots can be detected simultaneously, as sketched below.
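As a rough sketch of this per-timing detection, the following Python fragment cycles through the three timings and looks for the brightest pixel in the color plane that is not being projected at each timing. The helper `capture_at`, the timing labels, and the fixed brightness threshold are assumptions for illustration; the embodiment does not prescribe a particular detection algorithm.

```python
import numpy as np

# Assumed mapping from pointer color to the timing at which that
# color's plane is not projected, and to the RGB channel index.
TIMINGS = {"red": "A", "green": "B", "blue": "C"}
CHANNEL = {"red": 0, "green": 1, "blue": 2}

def detect_spots(capture_at, threshold=200):
    """Cycle over the timings and look for one illuminated spot per color.
    capture_at(timing) is assumed to return an RGB frame of shape
    (height, width, 3) exposed only during that timing window."""
    spots = {}
    for color, timing in TIMINGS.items():
        frame = capture_at(timing)
        plane = frame[:, :, CHANNEL[color]]
        # The brightest pixel in this plane; with the matching color
        # plane not being projected, it is likely the laser pointer.
        y, x = np.unravel_index(np.argmax(plane), plane.shape)
        if plane[y, x] >= threshold:
            spots[color] = (x, y)  # captured-image coordinates (x', y')
    return spots
```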
A method of synchronizing image capturing by the camera unit 22 with the driving of the color wheel 5 will be described here. The color wheel 5 is provided with a black seal serving as an index for detecting the rotation, and the holder of the color wheel 5 is provided with a sensor for detecting the black seal. By acquiring from the sensor the timing at which the black seal is detected, the drive control unit 14 issues an instruction for generating a synchronization signal. The synchronization signal is generated so as to be in synchronization with the period in which the color closest to the color set for the spot illuminated by the laser pointer is not projected. Accordingly, the camera unit 22 can control its shutter timing in accordance with the synchronization signal of the color wheel 5.
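By way of illustration, the shutter delay and exposure relative to the index (black seal) pulse can be derived from the rotation period and the segment layout of the color wheel 5. The sketch below assumes contiguous segments, a fixed offset from the pulse to the first segment, and that the avoided color occupies a single segment; all names and the example layout are hypothetical.

```python
def shutter_window(period_s, segments, avoid_color, index_offset_s=0.0):
    """Return (delay_s, exposure_s) relative to each index (black seal)
    pulse so that the exposure covers the part of one revolution in
    which avoid_color is not projected.
    segments: ordered (color, fraction_of_revolution) pairs, assumed to
    start index_offset_s after the pulse."""
    t = index_offset_s
    for color, fraction in segments:
        duration = fraction * period_s
        if color == avoid_color:
            # Open the shutter when this segment ends and keep it open
            # for the remainder of the revolution.
            return t + duration, period_s - duration
        t += duration
    raise ValueError("avoid_color not found on the wheel")

# Example: a wheel turning at 60 revolutions per second with equal
# R, G, B segments, set up to detect a red laser pointer.
delay, exposure = shutter_window(
    1 / 60, [("red", 1 / 3), ("green", 1 / 3), ("blue", 1 / 3)], "red")
```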
Normally, light is emitted to the screen 30 when the image forming unit 2 is on, and no light is emitted when the image forming unit 2 is off. However, in the above-described example, by performing image capturing in synchronization with the corresponding timing even when the image forming unit 2 is on, an illuminated spot can be detected also from the color plane data of R, G, or B. In other words, a spot illuminated by the laser pointer 40 can be detected regardless of the content of the video image.
An example has been described in which, when laser pointers of multiple colors are detected, the multiple colors are detected using one camera. Alternatively, a separate camera may be mounted for each color to be detected and detection control may be performed for each color. In this case, no camera's image capturing is ever occupied by other colors, and thus the detection interval for each color can be shortened. In such a case, it is preferable to set in advance which timing corresponds to which camera unit. For example, if three camera units are provided for the three colors R, G, and B, the illuminated spot of the laser pointer of each color can be detected in every cycle.
Alternatively, when it is desired to detect an intermediate color, the intervals of Timing A, Timing B, and Timing C may be further subdivided by color. In such a case, by performing image capturing at the timing corresponding to the intermediate color to be detected, detection can be performed for each segment of the color wheel 5.
The flow of processing for calculating a projection conversion coefficient and for detecting the laser pointer will be described with reference to the accompanying drawings.
In contrast, when determining that it is in the illuminated spot detection mode (YES at step S102), the drive control unit 14 determines whether it is in an initial setting mode (step S103). The initial setting mode is a mode for calculating a projection conversion coefficient when the projection environment changes. When the illuminated spot detection mode is first started, the device is in the initial setting mode. The determination may be made depending on, for example, whether a projection conversion coefficient has been set or whether a given time has elapsed since a projection conversion coefficient was set. The projection conversion coefficient is a coefficient for correcting the difference between the coordinates in an image signal before projection and the coordinates in the projected image pattern.
When determining that it is in the initial setting mode (YES at step S103), the drive control unit 14 drives the image forming unit 2 and so on to project a projection conversion image pattern (see the accompanying drawings).
In contrast, when determining that it is not in the initial setting mode (NO at step S103), the drive control unit 14 captures an image of the projected pattern according to the image capturing timing that is specified by the synchronization signal (step S107). The image capturing timing is determined according to the color emitted by the laser pointer. Thus, the light spot detection unit 24 can detect the spot illuminated by the laser pointer from the image data of the captured image (step S108). The coordinates of the detected illuminated spot are input to the conversion processing unit 25, which performs projection conversion with the projection conversion coefficient calculated by the conversion coefficient arithmetic logic unit 26, converting the coordinates of the illuminated spot into coordinates in the image data (step S109). The data of the coordinates obtained by the projection conversion is transmitted to the image projection device 10, and the video image processing unit 12 generates, for the original video image signal to be emitted, image data of the pointer to be combined at the received coordinates (step S110) and combines the video image signal with the image data of the pointer (step S111). In other words, projection image data is generated that is obtained by adding a given image according to the position of the detected illuminated spot.
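The sequence of steps S107 to S111 can be summarized in a short sketch. The callables passed in (`capture`, `detect`, `make_pointer`, `compose`) stand in for the camera unit 22, the light spot detection unit 24, the pointer generation unit 27, and the video image processing unit 12, respectively; they are placeholders for illustration, not part of the embodiment.

```python
import numpy as np

def illuminated_spot_cycle(capture, detect, H, make_pointer, compose, frame_in):
    """One pass over steps S107 to S111, with collaborators passed in.
    capture(): image taken at the timing given by the synchronization
    signal; detect(img): spot (x', y') in the captured image, or None;
    H: projection conversion coefficient from the initial setting mode;
    make_pointer(x, y): pointer image data centered at (x, y);
    compose(frame, pointer): combined projection image data."""
    img = capture()                          # step S107
    spot = detect(img)                       # step S108
    if spot is None:
        return frame_in                      # no pointer to superimpose
    v = H @ np.array([spot[0], spot[1], 1])  # step S109: projection
    x, y = v[0] / v[2], v[1] / v[2]          # conversion to (x, y)
    pointer = make_pointer(x, y)             # step S110
    return compose(frame_in, pointer)        # step S111
```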
Regarding the illuminated spot image data of the pointer to be combined, to increase visibility, an image that is enlarged from the original size of the spot of the laser pointer 40 may be projected around the calculated illuminated spot, as illustrated in the accompanying drawings.
For the projection pattern that is projected to calculate a projection conversion coefficient, patterns other than the pattern illustrated in the accompanying drawings may also be used.
The external PC 20, which is an information processing device, is locally connected to the image projection device 10. Arithmetic operations and synchronization of image capturing may instead be performed by an information processing device that is connected via a network. For example, an arithmetic operation processing server with advanced features may be used to perform the initial projection conversion matrix operation, to download content to be superimposed, to perform image processing, and/or the like.
Modification
In the above-described embodiment, an image of an illuminated spot, at which illuminating light from the illuminating device is incident on the projection surface, is captured, and image processing is performed according to the position of the spot. Alternatively, a light spot at which a light emitting substance emits light may be detected.
For example, a substance (a stress illuminant) that emits light when a pushing force (stress) is applied is known. By applying such a substance onto a screen, a screen that emits light in response to stress (an exemplary light spot device that produces a light spot) can be made.
In the modification, the light spot detection unit 24 detects, instead of the above-described illuminated spot, a light emitting spot (light spot) on the screen, as illustrated in the accompanying drawings.
Instead of using the light emitting screen, a tool with an LED (Light Emitting Diode) (an exemplary light spot device that produces a light spot), such as a ballpoint pen with a built-in LED, may be used. For example, the light spot detection unit 24 detects, instead of the above-described illuminated spot, light emission from a ballpoint pen with a built-in LED that emits light when pushed against the screen. Accordingly, the same processing as that of the embodiment can be implemented.
The embodiment of the present invention has been described above. The above-described embodiment is presented as an example only and is not intended to limit the scope of the invention. The invention is not limited to the above-described embodiment, and the components can be modified and embodied within the scope of the invention when the invention is carried out. By properly combining the components disclosed in the above-described embodiment, various inventions can be formed. For example, some components may be omitted from the whole of the components shown in the embodiment.
A program that is executed by the image projection device according to the embodiment is provided by being incorporated in advance in, for example, a ROM. The program executed by the image projection device according to the embodiment may instead be recorded in a computer-readable recording medium, such as a CD-ROM, a flexible disk (FD), a CD-R, or a DVD (Digital Versatile Disk), as an installable or executable file, and be provided.
Alternatively, the program that is executed by the image projection device according to the embodiment may be stored in a computer that is connected to a network, such as the Internet, and downloaded via the network so as to be provided. Alternatively, the program that is executed by the image projection device according to the embodiment may be provided or distributed via a network, such as the Internet.
The program that is executed by the image projection device according to the embodiment has a module configuration including the above-described units. As for the actual hardware, a CPU (processor) reads the program from the ROM and executes it, whereby the above-described units are loaded and generated in the main storage device. Alternatively, the units of an image projection device may be implemented in software by a program, or in hardware by a given combination of electronic circuits.
The image projection device according to the embodiment can detect a pointer accurately without being affected by video image signals.
An image projection device includes: a drive control unit that, when image data is projected onto a projection surface, causes the light colors of the image data to be produced by performing control to transmit light, in time units, to respective areas of multiple colors that are determined by the light colors of the image data; a setting unit that sets a light color of an illuminating device that illuminates an illuminated spot on the projection surface; an image capturing control unit that receives a synchronization signal specifying a timing at which, among the areas of the multiple colors, light of the color closest to the set light color of the illuminating device is not projected, and that controls exposure and image capturing of an image capturing unit in accordance with the timing specified by the synchronization signal to capture an image of the projected image data; an illuminated spot detection unit that detects, from the captured image data, the spot on the projection surface illuminated by the illuminating device; a conversion processing unit that converts coordinates of the detected illuminated spot into coordinates of the illuminated spot in the image data by using a projection conversion coefficient that is calculated from a difference between the projected image data on the projection surface and the image data before being projected; and an illuminated spot generation unit that generates illuminated spot image data obtained by combining an illuminated spot image at the converted coordinates of the illuminated spot.
Although the invention has been described with respect to specific embodiments for a complete and clear disclosure, the appended claims are not to be thus limited but are to be construed as embodying all modifications and alternative constructions that may occur to one skilled in the art that fairly fall within the basic teaching herein set forth.
The present application claims priority to and incorporates by reference the entire contents of Japanese Patent Application No. 2013-163593 filed in Japan on Aug. 6, 2013 and Japanese Patent Application No. 2014-146793 filed in Japan on Jul. 17, 2014.