This nonprovisional application is based on Japanese Patent Application No. 2011-130248 filed on Jun. 10, 2011 with the Japan Patent Office, the entire contents of which are hereby incorporated by reference.
1. Field of the Invention
The present invention relates to projection type image display devices, and in particular, to a projection type image display device configured to be capable of changing a projection image by a user's interactive operation.
2. Description of the Background Art
Examples of projection type image display devices (hereinafter referred to as projectors) of this type include a projector that projects a projection screen on a wall, as well as a VUI (Virtual User Interface) screen on a desk on which the projector is placed. According to such a projector, when an object (a stylus pen or a user's finger) is placed on the VUI screen, a laser beam directed toward the VUI screen from the projector is scattered by the object. When a light receiving element detects the light scattered by the object, an operational command for switching the projection screen is generated based on the detection signal.
The above projector is a so-called laser projector, which displays an image on a projection screen by irradiating the projection screen with a laser beam: it irradiates the projection screen with a part of the laser beam scanned by a scanning unit, and irradiates a projection plane of the VUI screen with the rest of the laser beam. Then, when the light receiving element detects the light scattered by the object on the desk, a position of the object is calculated based on a scanning position of the laser beam at a timing at which the light receiving element detects the light.
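In such a scanned-laser VUI, the object position follows directly from where the beam was pointing at the detection timestamp. The following sketch illustrates that timing-to-position mapping only; the scan parameters and function name are hypothetical, and a simple linear raster sweep is assumed for clarity (a resonant MEMS mirror actually sweeps sinusoidally).

```python
# Hypothetical sketch: recovering an object position from the photodetector
# timestamp in a raster-scanned laser VUI. All scan parameters are assumptions.

LINE_PERIOD_S = 1.0 / 30000.0   # one horizontal sweep (assumed)
LINES_PER_FRAME = 600           # vertical scan resolution (assumed)
FRAME_PERIOD_S = LINE_PERIOD_S * LINES_PER_FRAME

def beam_position_at(t_detect, frame_start):
    """Map a detection timestamp to the (x, y) scan position, each in [0, 1)."""
    t = (t_detect - frame_start) % FRAME_PERIOD_S
    line = int(t // LINE_PERIOD_S)              # which scan line the beam is on
    x = (t % LINE_PERIOD_S) / LINE_PERIOD_S     # fraction along that line
    y = line / LINES_PER_FRAME
    return x, y

# Example: a detection about halfway through the frame period resolves to a
# point roughly halfway down the scanned area.
print(beam_position_at(0.0101, 0.0))
```

As the description notes, this scheme only works if the detector timestamps are tightly synchronized with the mirror drive, which is what makes the light receiving element expensive.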
In this manner, according to this projector, it is possible to determine the position of the object on the VUI screen by irradiating the desk with a part of the laser beam applied to the projection screen. However, when the projector is displaying a black screen, a light intensity of the laser beam applied to the desk as the projection plane of the VUI screen is reduced, and therefore it becomes difficult for the light receiving element to accurately detect the scattering light. Further, when an image is projected in a room illuminated by a fluorescent lamp, the light scattered by the object may not be accurately detected due to the reflection of light from the fluorescent lamp on the VUI screen on the desk.
Moreover, according to this projector, in order to calculate the position of the object, it is necessary to synchronize a timing at which a resonant MEMS mirror scans the laser beam with a timing at which the light receiving element detects the scattering light. This requires a light receiving element capable of operating at a high speed, and makes the projector more complicated and expensive.
Additionally, as an object placed outside the VUI screen does not generate scattering light, the light receiving element cannot determine the position of this object. Accordingly, a range in which the user operation is possible is limited to the VUI screen, and it has been difficult to respond to a demand for various interactive features.
A projection type image display device according to one aspect of the present invention is provided with: a projecting unit which projects image light based on an inputted image signal to a projection surface; an invisible light emitter configured to emit invisible light to the projection surface, and provided such that an irradiation area of the invisible light includes a predetermined region; an image pickup unit configured to receive at least one of a plurality of color lights constituting the image light as well as the invisible light, and which picks up an image of the projection surface to generate a pickup image; an object detector which detects at least one of a position and a behavior of an object that has come into the predetermined region based on the pickup image; and a display controller which changes the projected image based on at least one of the position and the behavior of the object detected by the object detector.
The foregoing and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
Hereinafter, an embodiment according to the present invention will be described in detail with reference to the drawings. Like or corresponding components in the drawings are denoted by like reference numerals, and the description for such components will not be repeated.
Referring to
Main body cabinet 200 is provided with a projection hole 211 for transmitting image light. The image light directed forward and downward through projection hole 211 is enlarged and projected on a projection surface provided in front of projector 100. In this embodiment, the projection surface is provided on a floor on which projector 100 is placed. In a case in which projector 100 is placed such that a back surface of main body cabinet 200 is in contact with the floor, the projection surface can be provided on a wall surface.
Infrared light emitter 300 is provided on a side of a lower surface of main body cabinet 200. Infrared light emitter 300 emits infrared light (hereinafter also referred to as “IR light”) directed to the front side of projector 100. As illustrated in
Infrared light emitter 300 constitutes an “invisible light emitter” according to the present invention. Invisible light refers to light having a wavelength outside a range of visible light, and specific examples include infrared light, far-infrared light, and ultraviolet light. In this embodiment, infrared light emitter 300 is a representative example of the “invisible light emitter”. In other words, instead of infrared light emitter 300, it is possible to use a light emitter emitting invisible light other than infrared light. As illustrated in
Image pickup unit 500 is provided on a side of an upper surface of main body cabinet 200. Image pickup unit 500 is provided with an image pickup device, for example a CCD (Charge Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor, and an optical lens provided in front of the image pickup device. Image pickup unit 500 picks up an image of an area including at least the projection region. Image pickup unit 500 generates image data indicating the pickup image (hereinafter referred to as “pickup data”). As illustrated in
Projector 100 according to this embodiment is configured to be operable selectively in a “normal mode” corresponding to a normal image display operation and an “interactive mode” in which an image projected on the projection surface can be changed based on an interactive operation by the user. The user can select between the normal mode and the interactive mode by operating an operation panel provided for main body cabinet 200 or a remote controller.
When projector 100 is set in the interactive mode, the user can change a display mode of the projection image interactively by performing an operation on the projection image using a finger or a pointer.
Referring to
Light source device 10 includes a plurality (three, for example) of light sources 10R, 10G, and 10B. Light source 10R is a red light source emitting a light in a red wavelength band (hereinafter referred to as “R light”), and configured by a red LED (Light Emitting Diode) or a red LD (Laser Diode), for example. Light source 10R is provided with a cooler 400R (not depicted) configured by a heatsink and a heat pipe for releasing heat generated in light source 10R.
Light source 10G is a green light source emitting a light in a green wavelength band (hereinafter referred to as “G light”), and configured by a green LED or a green LD, for example. Light source 10G is provided with a cooler 400G configured by a heatsink 410G and a heat pipe 420G for releasing heat generated in light source 10G.
Light source 10B is a blue light source emitting a light in a blue wavelength band (hereinafter referred to as “B light”), and configured by a blue LED or a blue LD, for example. Light source 10B is provided with a cooler 400B configured by a heatsink 410B and a heat pipe 420B for releasing heat generated in light source 10B.
Of the light incident from light source device 10, cross dichroic mirror 20 transmits only the B light, and reflects the R light and the G light. The B light transmitted through cross dichroic mirror 20, as well as the R light and the G light reflected on cross dichroic mirror 20, are directed to and reflected on folding mirror 30, and enter DMD 40.
DMD 40 includes a plurality of micromirrors disposed in a matrix. A single micromirror constitutes a single pixel. The micromirrors are driven between ON and OFF states at a high speed based on DMD driving signals corresponding to the incident R light, G light, and B light.
The light from each light source (the R light, the G light, and the B light) is modulated by changing an inclination angle of the micromirrors. Specifically, when a micromirror corresponding to one pixel is in an OFF state, the light reflected on this micromirror does not enter projection unit 50. By contrast, when the micromirror is in an ON state, the light reflected on this micromirror enters projection unit 50. By adjusting a ratio of a time period in which each micromirror is in the ON state to a light emitting period of each light source, gradation is adjusted pixel by pixel.
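The gradation scheme described above amounts to pulse-width modulation of each micromirror. The following is a minimal sketch under assumed parameters (8-bit gradation and a hypothetical sub-frame length); it illustrates the principle only, not the device's actual drive circuitry.

```python
# Minimal PWM sketch of the DMD gradation scheme described above: a pixel's
# gray level sets the fraction of the light emitting period during which its
# micromirror stays in the ON state. 8-bit depth and period are assumptions.

def on_time_us(gray_level, emit_period_us=8333, bit_depth=8):
    """Return how long the micromirror stays ON during one color sub-frame."""
    duty = gray_level / (2 ** bit_depth - 1)    # 0.0 (black) .. 1.0 (full)
    return emit_period_us * duty

# Example: gray level 128 of 255 keeps the mirror ON for about half of the
# sub-frame, so the eye integrates roughly half brightness for that pixel.
print(on_time_us(128))   # ~4183 us of an 8333 us sub-frame
```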
DMD 40 drives each micromirror in synchronization with a timing at which light source device 10 time-divisionally emits the R light, the G light, and the B light. Projection unit 50 includes a projection lens unit 51 and a reflection mirror 52. The light reflected on DMD 40 (image light) is transmitted through projection lens unit 51 and is incident on reflection mirror 52. The image light is reflected on reflection mirror 52, and exits outside through projection hole 211 provided for main body cabinet 200. Images respectively in the R, G, and B color lights are sequentially projected on the projection surface. To human eyes, the images in the color lights sequentially projected on the projection surface are recognized as a single color image generated by overlapping the images in the color lights.
Referring to
Image pickup unit 500 picks up an image of an area including at least the projection region on the projection surface (floor), and generates pickup data. Specifically, image pickup unit 500 takes as its pickup area a predetermined area that sufficiently contains the projection region, that is, the projection region and its surrounding area.
An optical image in a field of image pickup unit 500 is applied through optical lens 502 to a light receiving surface of image pickup device 508, that is, the pickup plane. On the pickup plane, a charge corresponding to the optical image in the field (pickup data) is generated by photoelectric conversion. GB cut filter 504 is provided between optical lens 502 and image pickup device 508. GB cut filter 504 shields the G light and the B light in the light incident on image pickup device 508, and transmits the R light and the invisible light.
In response to a switching command outputted from main body cabinet 200, filter switching controller 506 selectively switches between a condition in which GB cut filter 504 is inserted into an optical path of image pickup device 508 and a condition in which GB cut filter 504 is removed from this optical path.
In this embodiment, when projector 100 is set in the interactive mode, filter switching controller 506 drives GB cut filter 504 into the condition of being inserted into the optical path of image pickup device 508. By contrast, when projector 100 is set in the normal mode, filter switching controller 506 drives GB cut filter 504 into the condition of being removed from this optical path. With the above configuration, during the interactive mode, it is possible to determine a content of a user's instruction based on the pickup data generated by image pickup unit 500. By contrast, during the normal mode, it is possible to generate image data representing a content displayed on the projection surface based on the pickup data. For example, by picking up an image in a condition in which, for instance, a letter has been drawn on the projection image, it is possible to store this condition as a single piece of image data.
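As a rough illustration, this mode-dependent filter control can be modeled as a small state handler. The class and method names below are hypothetical; the actual filter actuator is hardware-specific.

```python
# Hypothetical sketch of the mode-dependent filter control described above:
# the interactive mode inserts the GB cut filter (the camera then sees only
# R light and infrared), while the normal mode removes it so that full-color
# pickup data can be generated and stored.

from enum import Enum

class Mode(Enum):
    NORMAL = "normal"
    INTERACTIVE = "interactive"

class FilterSwitchingController:
    def __init__(self):
        self.gb_cut_inserted = False

    def apply_mode(self, mode: Mode) -> bool:
        """Insert the filter in interactive mode, remove it otherwise."""
        self.gb_cut_inserted = (mode is Mode.INTERACTIVE)
        # A real controller would drive the filter actuator here.
        return self.gb_cut_inserted

ctrl = FilterSwitchingController()
assert ctrl.apply_mode(Mode.INTERACTIVE) is True   # filter in the optical path
assert ctrl.apply_mode(Mode.NORMAL) is False       # filter removed
```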
Referring to
Optical axes of light emitting elements 311 to 315 lie in parallel with the projection surface (in the Y direction). It is preferable that an interval between the optical axes of light emitting elements 311 to 315 and the projection surface be as short as structurally possible. This is because as a distance of the infrared light from the projection surface increases, inconsistency is likely to occur between a position at which a touch operation is actually performed by the user and a position of the operation detected by scattering light position detector 204, and the interactive feature may not function normally.
Housing 310 is provided with a slit 322 in a side surface on the front side of projector 100. After passing through slit 322, the infrared light emitted from light emitting elements 311 to 315 travels substantially in parallel with the projection surface.
Here, light emission source 320 is turned on (ON) when projector 100 is set in the interactive mode, and turned off (OFF) when projector 100 is set in the normal mode. With the above configuration, during the interactive mode, it is possible to change the display mode of the projection image according to the touched position (position of operation) by the user performing a touch operation using a finger or a pointer on the projection surface within the irradiation area of the infrared light. Further, during the normal mode, it is possible to save power consumption by preventing light emission source 320 from unnecessarily emitting light.
Referring to
Operation accepting unit 216 receives a remote controller signal transmitted from the remote controller operated by the user. Operation accepting unit 216 also accepts a signal from the operation panel provided for main body cabinet 200, in addition to the reception of the remote controller signal. When an operation is made to the remote controller or the operation panel, operation accepting unit 216 accepts the operation, and transmits a command signal triggering various operations to element controller 208.
Image signal processor 214 receives an image signal supplied from an input unit that is not depicted. Image signal processor 214 processes the received image signal into a signal for display and outputs the processed signal. Specifically, image signal processor 214 writes the received image signal into a frame memory (not depicted) for each frame (each screen), and reads the image written into the frame memory. Then, by performing various image processing in the course of the writing and reading processing, image signal processor 214 converts the received image signal to generate a display image signal for the projection image.
Here, examples of the image processing performed by image signal processor 214 include image distortion correction processing for correcting an image distortion due to a relative inclination between an optical axis of projection light from projector 100 and the projection surface, and image size adjustment processing for enlarging or reducing a display size of the projection image displayed on the projection surface.
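One common way to implement such distortion (keystone) correction is a perspective pre-warp of each frame. The following OpenCV sketch is an assumption for illustration, not the device's actual algorithm, and the corner coordinates are placeholders rather than measured values.

```python
# Hedged sketch of keystone-style distortion correction: pre-warp each frame
# with the inverse of the trapezoidal distortion caused by the tilted optical
# axis, so the projected result appears rectangular on the surface.

import numpy as np
import cv2

def keystone_prewarp(frame, distorted_corners):
    """Warp `frame` so it lands rectangular on the tilted projection surface."""
    h, w = frame.shape[:2]
    rect = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    # m models the physical distortion: rectangle -> observed trapezoid.
    m = cv2.getPerspectiveTransform(rect, np.float32(distorted_corners))
    # Applying the inverse pre-distorts the frame so projection cancels it.
    return cv2.warpPerspective(frame, np.linalg.inv(m), (w, h))

frame = np.zeros((480, 640, 3), np.uint8)
# Example trapezoid: the top edge appears 10% narrower than the bottom edge.
corners = [[64, 0], [576, 0], [640, 480], [0, 480]]
corrected = keystone_prewarp(frame, corners)
```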
Element controller 208 generates control signals for controlling display operation of the image according to the display image signal outputted from image signal processor 214. The generated control signals are respectively outputted to light source device 10 and DMD 40.
Specifically, element controller 208 controls to drive DMD 40 according to the display image signal. Element controller 208 drives the plurality of micromirrors constituting DMD 40 in synchronization with the timings at which light source device 10 sequentially emits the R light, the G light, and the B light. With this, DMD 40 modulates the R light, the G light, and the B light based on the display image signal to output the images respectively in the R, G, and B color lights.
Further, element controller 208 generates a control signal for controlling intensity of the light emitted from light source device 10 and outputs the generated signal to light source device 10. The control of the intensity of the light emitted from light source device 10 is performed when the user has instructed to adjust brightness of the projection image by operating the remote controller, the operation panel, or a menu screen. Alternatively, it is possible to employ a configuration in which display controller 208 automatically adjusts the brightness of the projection image according to brightness of the display image signal supplied from image signal processor 214.
Projector 100 according to this embodiment executes a calibration process as preprocessing for performing the display operation of the image. The calibration process is for converting an image coordinate system of image pickup unit 500 into an image coordinate system of projector 100. By executing the calibration process, it is possible to execute the interactive mode described above. Here, prior to the execution of the calibration process, a distortion state of the projection image is automatically adjusted.
The calibration process is included in initialization of projector 100 that is performed when the power is activated or the projection is instructed. Other than being performed as part of the initialization, the calibration process can also be performed when, for example, an inclination sensor that is not depicted detects that the location of projector 100 has changed.
Upon instruction of the execution of the calibration process via operation accepting unit 216, display controller 208 causes a predetermined calibration pattern to be projected on the projection surface. The calibration pattern is previously recorded in calibration pattern storage unit 212. In this embodiment, the calibration pattern is an all-white image with which an entire screen is displayed monochromatically in white.
Image pickup unit 500 picks up an image of the projection surface according to an instruction from display controller 208. Image pickup unit 500 generates image data (pickup data) representing the pickup image of the projection surface, and outputs the pickup data to projection region detector 202.
Projection region detector 202 acquires the pickup data from image pickup unit 500, and the image data generated based on the calibration pattern (all-white image, for example) from image signal processor 214. Projection region detector 202 compares the pickup data with the image data, thereby detecting positional information of the projection region in the image coordinate system of image pickup unit 500.
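One plausible way to implement this comparison is to threshold the pickup image of the all-white pattern and take the bounding box of the bright area. The following sketch assumes a grayscale pickup image and a hypothetical threshold value; the real detector may use a more robust comparison against the source image data.

```python
# Hypothetical sketch of the projection-region detection described above:
# with the all-white calibration pattern projected, the projection region is
# the bright area of the pickup image, found here via threshold + bounding box.

import numpy as np

def detect_projection_region(pickup_gray, threshold=128):
    """Return (x_min, y_min, x_max, y_max) of the bright region, or None."""
    ys, xs = np.nonzero(pickup_gray >= threshold)
    if len(xs) == 0:
        return None   # calibration pattern not visible in the pickup image
    return xs.min(), ys.min(), xs.max(), ys.max()

# Example: a synthetic pickup image with a bright 300x200 patch at (40, 60).
img = np.zeros((480, 640), np.uint8)
img[60:260, 40:340] = 255
assert detect_projection_region(img) == (40, 60, 339, 259)
```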
In this case, as illustrated in
Referring to
In Step S02, image pickup unit 500 picks up the image of the projection surface to acquire the pickup image (pickup data).
In Step S03, upon acquisition of the pickup image (
Hereinafter, an operation of projector 100 when set in the interactive mode will be described with reference to the drawings.
Referring to
When an object (the user's finger tip or the pointer, for example) 600 comes in the irradiation area of the infrared light, the infrared light is scattered by object 600 to become diverging light. Image pickup unit 500 receives the scattered infrared light. Image pickup unit 500 generates the pickup data based on the image light (the R light) that has transmitted through GB cut filter 504 (
Upon acquisition of the pickup data from image pickup unit 500, scattering light position detector 204 detects a position of the scattered infrared light based on the positional information detected by projection region detector 202 and indicating the position of the projection region in the recorded pickup image. Specifically, scattering light position detector 204 detects the position of the scattered infrared light in the projection region based on positional information of a point having brightness equal to or greater than a predetermined threshold value in the pickup image and the positional information of the projection region. With this, scattering light position detector 204 detects whether or not object 600 has come in the projection region, as well as a position of object 600 that has come in the projection region.
At this time, scattering light position detector 204 detects whether the scattered infrared light is positioned within the projection region or within a predetermined region outside the projection region based on the positional information of the projection region. Then, if the scattered infrared light is positioned within the projection region, scattering light position detector 204 detects the positional information indicating XY coordinates of the scattered infrared light within the projection region. The position of the detected scattered infrared light corresponds to the position of object 600 illustrated in
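A minimal sketch of this detection, assuming a grayscale pickup image and the bounding box found during calibration, could look like the following; the threshold value is a placeholder, and a real detector might use a centroid of the bright blob rather than a single brightest pixel.

```python
# Hedged sketch of the scattering-light position detection described above:
# the object position is taken as the brightest pickup pixel above a
# threshold, then classified as inside or outside the projection region.

import numpy as np

def detect_scatter(pickup_gray, region, threshold=200):
    """region = (x_min, y_min, x_max, y_max) from the calibration step."""
    if pickup_gray.max() < threshold:
        return None                      # no object in the IR irradiation area
    y, x = np.unravel_index(np.argmax(pickup_gray), pickup_gray.shape)
    x_min, y_min, x_max, y_max = region
    inside = x_min <= x <= x_max and y_min <= y <= y_max
    return {"x": int(x), "y": int(y), "inside_projection_region": inside}
```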
In the following, a method of detecting the position of the scattered infrared light by scattering light position detector 204 illustrated in
As illustrated in
After recognizing the position of point P1 within the projection region as described above, scattering light position detector 204 calculates XY coordinates (Xa, Ya) corresponding to point P1 within the image display region illustrated in
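Assuming a simple linear mapping between the detected projection region and the image display region, the conversion of point P1 into (Xa, Ya) could be sketched as follows; an actual device might use a full homography and account for lens distortion, and the display resolution here is a placeholder.

```python
# Minimal sketch (linear mapping assumed) of converting the camera coordinates
# of point P1 into XY coordinates (Xa, Ya) in the image display region.

def camera_to_display(x_cam, y_cam, region, display_w=1280, display_h=800):
    x_min, y_min, x_max, y_max = region          # projection region, camera px
    xa = (x_cam - x_min) / (x_max - x_min) * display_w
    ya = (y_cam - y_min) / (y_max - y_min) * display_h
    return xa, ya

# Example: a touch at the center of the projection region maps to the center
# of a 1280x800 display image.
print(camera_to_display(320, 240, (40, 60, 600, 420)))   # (640.0, 400.0)
```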
Referring back to
Specifically, when the projection surface is touched and operated, object operation determining unit 206 recognizes the operated position based on the positional information of object 600 supplied from scattering light position detector 204. Then, object operation determining unit 206 determines the content of the user's instruction corresponding to the position of operation.
Alternatively, when the user moves the position of operation while touching and operating the projection surface, object operation determining unit 206 recognizes a movement state of the position of operation (such as a movement direction and movement amount) based on the positional information of object 600 supplied from scattering light position detector 204. Then, object operation determining unit 206 determines the content of the user's instruction corresponding to the movement state of the position of operation.
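The distinction between a stationary touch and a moved touch could be drawn as in the following sketch; the dead-zone threshold and the direction labels are assumptions, not values from this description.

```python
# Hypothetical sketch of the instruction determination described above: a
# nearly stationary touch is treated as a tap at a position of operation,
# while a displaced touch is treated as a move with a direction and amount.

import math

MOVE_THRESHOLD_PX = 15   # assumed dead zone separating taps from drags

def classify_operation(start, end):
    dx, dy = end[0] - start[0], end[1] - start[1]
    dist = math.hypot(dx, dy)
    if dist < MOVE_THRESHOLD_PX:
        return {"kind": "tap", "pos": start}
    direction = "right" if abs(dx) >= abs(dy) and dx > 0 else \
                "left" if abs(dx) >= abs(dy) else \
                "down" if dy > 0 else "up"
    return {"kind": "move", "direction": direction, "amount": dist}

print(classify_operation((100, 100), (104, 98)))    # tap
print(classify_operation((100, 100), (220, 110)))   # move to the right
```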
Image signal processor 214 performs predetermined image processing on the display image signal based on the content of the user's instruction transmitted from object operation determining unit 206 via display controller 208. Element controller 208 controls to drive light source device 10 and DMD 40 according to the processed display image signal. As a result, the display mode of the projection image changes according to the content of the user's instruction.
As illustrated in
Further,
In the example shown in
In this manner, according to projector 100 of this embodiment, it is possible to change the projection image by the user operating on the projection image. Further, projector 100 according to this embodiment is able to change the projection image by the user's operation within the region outside the projection region illustrated in
Referring to
In this embodiment, a mode of change of the projection image varies depending on which of the three regions A to C the user has performed a touch operation in.
Further, when the object is located in region B, “page forward” for switching the projection image is set. The “page forward” refers to an operation of switching an image of a frame that is currently displayed on the projection surface to an image of a next frame. On the other hand, when the object is located in region C, “page backward” for switching the projection image is set. The “page backward” refers to an operation of switching an image of a frame that is currently displayed on the projection surface to an image of a previous frame.
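The region-to-command dispatch could be expressed as a simple lookup, sketched below. Region A's command is not specified in this excerpt, so it is deliberately left open rather than invented.

```python
# Sketch of the region-to-command dispatch described above. The command names
# are illustrative; region A's assignment is not given in this excerpt.

REGION_COMMANDS = {
    "B": "page_forward",    # switch to the next frame
    "C": "page_backward",   # switch to the previous frame
    # "A": ...              # not specified here
}

def command_for_region(region):
    return REGION_COMMANDS.get(region)   # None -> no operation

assert command_for_region("B") == "page_forward"
assert command_for_region("C") == "page_backward"
```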
Referring to
In Step S12, infrared light emitter 300 emits the infrared light substantially in parallel with the projection surface. At this time, all of the plurality of light emitting elements 311 to 315 that constitute light emission source 320 are driven to the ON state.
After reading the pickup data recorded in memory 510 of image pickup unit 500 in Step S13, scattering light position detector 204 determines in Step S14 whether or not the object has come in the irradiation area of the infrared light based on the pickup data. Specifically, scattering light position detector 204 determines whether or not there is a point having brightness equal to or greater than the predetermined threshold value within the pickup image.
When it is determined that the object has not come in the irradiation area of the infrared light (when determined to be NO in Step S14), scattering light position detector 204 determines that the user has not instructed to change the projection image, and terminates the series of processing.
On the other hand, when it is determined that the object has come in the irradiation area of the infrared light (when determined to be YES in Step S14), in Step S15, scattering light position detector 204 detects the position of the scattered infrared light (the position of the object) based on the positional information of the projection region.
In Step S16, scattering light position detector 204 determines whether or not the scattered infrared light is within the projection region, that is, whether or not the object that has come in is within the projection region, based on the positional information of the projection region. When it is determined that the object is within the projection region (when determined to be YES in Step S16), scattering light position detector 204 detects the positional information indicating XY coordinates of the scattered infrared light within the projection region. The detected positional information of scattering light is outputted to object operation determining unit 206.
In Step S17, object operation determining unit 206 determines whether or not the object has moved based on the positional information of the scattering light. When it is determined that the object has moved (when determined to be YES in Step S17), in Step S18, object operation determining unit 206 determines the content of the user's instruction based on the movement state of the object (the movement direction, the movement amount, and such). By contrast, when it is determined that the object has not moved (when determined to be NO in Step S17), in Step S19, object operation determining unit 206 determines the content of the user's instruction based on the position of the object.
On the other hand, when it is determined that the object is within the predetermined region outside the projection region in Step S16 (when determined to be NO in Step S16), in Step S20, scattering light position detector 204 determines the region in which the object is located, out of the plurality of regions A to C outside the projection region. In Step S21, object operation determining unit 206 determines the content of the user's instruction based on the region in which the object is located.
Upon determination of the content of the user's instruction through Steps S18, S19, and S21, image signal processor 214 performs the predetermined image processing on the display image signal based on the content of the user's instruction transmitted from object operation determining unit 206 via display controller 208. Element controller 208 controls to drive light source device 10 and DMD 40 according to the processed display image signal. As a result, the display mode of the projection image changes according to the content of the user's instruction.
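Tying the above steps together, a condensed and hypothetical sketch of the decision flow in Steps S14 to S21 might look like this; the input dictionary is a simplified stand-in for the detector outputs sketched earlier, and the returned values are illustrative instruction contents.

```python
# Condensed, hypothetical sketch of the interactive-mode decision flow
# (Steps S14 to S21) described above.

def interactive_step(scatter, prev_pos):
    """scatter: None, or a dict with 'pos', 'inside_projection_region',
    and (for out-of-region hits) 'region'; prev_pos: last known position."""
    if scatter is None:                           # S14: no object has come in
        return None
    pos = scatter["pos"]
    if not scatter["inside_projection_region"]:   # S16: NO -> S20, S21
        return {"kind": "region_command", "region": scatter.get("region")}
    if prev_pos is not None and prev_pos != pos:  # S17: YES -> S18
        return {"kind": "move", "from": prev_pos, "to": pos}
    return {"kind": "tap", "pos": pos}            # S17: NO -> S19

print(interactive_step({"pos": (10, 20), "inside_projection_region": True},
                       None))                     # -> a tap instruction
```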
Referring to
When it is determined that the operation by the user on the projection screen (or in the region outside the projection region) has been completed based on the content of the user's instruction transmitted from object operation determining unit 206, light emission controller 210 starts measuring a time elapsed without any subsequent operation, and reduces an emitted light intensity of the infrared light when the time elapsed reaches a predetermined time Tth.
Specifically, light emission controller 210 switches a part of the plurality of light emitting elements 311 to 315 that constitute light emission source 320 from the ON state to the OFF state. At this time, as illustrated in
In this manner, by reducing the emitted light intensity of the infrared light when the time elapsed without any operation on the projection screen by the user reaches predetermined time Tth, it is possible to reduce electricity consumed by infrared light emitter 300 in a standby state in which an operation by the user is awaited. As a result, it is possible to reduce power consumption of projector 100.
In contrast, as only light emitting element 313, which is a part of the plurality of light emitting elements 311 to 315, is turned to the ON state as described above, the irradiation area of the infrared light becomes smaller than in a normal operation as illustrated in
In order to avoid such trouble, in the standby state in which an operation by the user is awaited, as illustrated in
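The standby dimming and restore behavior described above can be sketched as a small timer-driven controller. The value of Tth and the class structure below are assumptions; only the element numbering follows this description, and the restore-on-detection step stands in for the avoidance strategy illustrated in the figure.

```python
# Hypothetical sketch of the standby power-saving behavior described above:
# after Tth seconds without a user operation, only the center emitter stays
# on; a detected operation immediately restores full infrared emission.

import time

T_TH_S = 30.0                      # assumed value of predetermined time Tth
ALL_EMITTERS = [311, 312, 313, 314, 315]
STANDBY_EMITTERS = [313]           # center element kept ON while waiting

class EmissionController:
    def __init__(self):
        self.active = list(ALL_EMITTERS)
        self.last_operation = time.monotonic()

    def on_user_operation(self):
        """Any detected operation restores the full irradiation area."""
        self.last_operation = time.monotonic()
        self.active = list(ALL_EMITTERS)

    def tick(self):
        """Called periodically; dims the emitters once Tth has elapsed."""
        if time.monotonic() - self.last_operation >= T_TH_S:
            self.active = list(STANDBY_EMITTERS)
```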
As described above, the plurality of light emitting elements 311 to 315 that constitute light emission source 320 are arranged in the X direction.
The arrangement illustrated in
The plurality of light emitting elements 311 to 315 are arranged such that their optical axes are in parallel with the projection surface and such that the directions in which the infrared light is emitted are in parallel with each other.
On the other hand, in the arrangement illustrated in
Further, as illustrated in
In a case in which the irradiation area of the infrared light is spread in the Z direction, the infrared light is scattered by the user's finger and turned into the diverging light at a timing before the user's finger touches the projection surface. Therefore, inconsistency is likely to occur between the position at which the touch operation is actually performed by the user and the position of the operation detected by scattering light position detector 204, and the interactive feature may not function normally. On the other hand, according to the configuration illustrated in
In the configuration illustrated in
Further, while the above embodiment describes the DMD as an example of light modulation devices, it is possible to employ a liquid crystal panel of a reflective type or of a transmissive type.
Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the scope of the present invention being interpreted by the terms of the appended claims.