The present invention relates to an imaging apparatus including a first imaging unit for detecting a user movement and a second imaging unit for capturing an object body.
A user interface system that recognizes a gesture of a user on a projector-projected video to allow the user to perform an intuitive operation is used. A system like this recognizes a user's gesture on a projected video using a touch panel or a video recognition technology.
US2014/0292647 discusses an interactive projector that projects a video from a projection unit onto an object to be projected, such as a table, captures and analyzes a user's hand movement over the projected image with a first camera, and projects an image corresponding to the hand movement from the projection unit onto the projection surface. To record character information placed on the projection surface, a second camera captures the character information and records it as an image.
A depth of field is used as an index for determining whether a camera can read an object body correctly. The greater the depth of field, the wider the range over which the object body is in focus. One way to increase the depth of field of a camera is to extend the optical path length of the camera. When the image to be captured by a camera is a document, a greater depth of field is required so that characters included in the whole area of the captured image can be read correctly. This requirement becomes stricter when optical character recognition (OCR) processing is performed on the captured image.
According to an aspect of the present invention, an imaging apparatus includes a first imaging unit configured to include a first imaging element, and to detect a movement of a detection target object near an imaging surface, and a second imaging unit configured to include a second imaging element, and to capture an object body placed on the imaging surface, wherein a relation, “an optical path length from the second imaging element to the imaging surface>an optical path length from the first imaging element to the imaging surface” is satisfied.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
A first exemplary embodiment of the present invention is described in detail below with reference to the attached drawings. The components described in the exemplary embodiment below are only exemplary, and the scope of the present invention is not limited to those components.
<Usage State of Information Processing Apparatus 109>
The information processing apparatus 109 includes a projector 106 serving as a projection unit, a gesture sensor 107 serving as a first imaging unit, a camera 105 serving as a second imaging unit, and a lens barrel 132 (see the attached drawings).
The projector 106 projects an image 111 onto a projection surface 110. (An imaging surface 301, described below, is equivalent to the projection surface 110, so only the projection surface 110 is referred to in the following description.)
A user performs operations on this image 111. The projected image 111 includes a menu button 122 via which the user selects, with a finger, a power ON/OFF operation or other operations. The user's operation selection, which is detected by the gesture sensor 107, functions as an interface.
When the user wants to capture a document using the information processing apparatus 109, an object body (a document) to be captured is placed on the projection surface 110 to allow the camera 105 to capture the document as an image.
In the apparatus itself, the side on which the image is projected is the front side, and the opposite side is the back side. The right and left sides of the apparatus are defined as viewed from the front side.
<Description of Information Processing Apparatus 109>
A recognition unit 203, which is composed of a CPU and other components, tracks a user's hand/finger detected by the gesture sensor 107 and the detection unit 202 to recognize a gesture operation performed by the user. An identification unit 204, which is composed of a CPU and other components, identifies which user's finger was used to perform the gesture operation recognized by the recognition unit 203. A holding unit 205, which is composed of a CPU and other components, holds, in a storage area provided in the RAM 103, object information included in the projected electronic data and specified by a user via a gesture operation in association with the finger used for the gesture operation. An acceptance unit 206, which is composed of a CPU and other components, accepts an editing operation specified for electronic data via the gesture operation recognized by the recognition unit 203, and, as necessary, updates electronic data stored in the storage device 104. The storage device 104 stores electronic data that is to be processed via an editing operation. The CPU 101 refers to the information held in the holding unit 205 according to the gesture recognized by the recognition unit 203, and generates a projection image to be projected on the work space. The projector 106 projects a projection video generated by the CPU 101 onto the work space that includes the projection surface 110 and the user's hand near the projection surface 110.
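The components described above form a recognition-and-editing flow: detect a hand, recognize the gesture, associate the selected object with the finger that selected it, and then accept an editing operation on the electronic data. The sketch below is only a hypothetical Python illustration of that flow; the class names, method names, and data structures are invented for illustration and are not part of the embodiment.

```python
# Hypothetical sketch of the gesture-handling flow described above.
# All names and data structures are illustrative, not part of the embodiment.

class GestureRecognizer:
    """Tracks the hand/finger reported by the gesture sensor and names the gesture."""
    def recognize(self, finger_track):
        # e.g. classify the tracked positions as "select" or "tap"
        return "select" if len(finger_track) > 1 else "tap"

class HoldingUnit:
    """Holds object information in association with the finger used for the gesture."""
    def __init__(self):
        self._held = {}
    def hold(self, finger_id, object_info):
        self._held[finger_id] = object_info
    def lookup(self, finger_id):
        return self._held.get(finger_id)

class AcceptanceUnit:
    """Accepts an editing operation and updates the stored electronic data as necessary."""
    def __init__(self, storage):
        self.storage = storage
    def accept(self, document_id, edit):
        self.storage[document_id] = edit  # simplified update of the electronic data

# Usage: a tracked finger selects an object, which is later applied as an edit.
recognizer, holder = GestureRecognizer(), HoldingUnit()
storage = {"doc1": "original text"}
acceptor = AcceptanceUnit(storage)

gesture = recognizer.recognize([(10, 20), (12, 22)])
if gesture == "select":
    holder.hold(finger_id=1, object_info="paragraph A")
acceptor.accept("doc1", holder.lookup(1))
```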
<Block Diagram of Projector 106>
The projector 106 includes a liquid crystal control unit 150, liquid crystal elements 151R, 151G, and 151B, a light source control unit 160, a light source 161, a color separation unit 162, a color combination unit 163, an optical system control unit 170, and a projection optical system 171.
The liquid crystal control unit 150 controls a voltage applied to liquid crystals of pixels of the liquid crystal elements 151R, 151G, and 151B based on an image signal, which has been processed by an image processing unit 140, to adjust transmittance of the liquid crystal elements 151R, 151G, and 151B.
The liquid crystal control unit 150 includes a microprocessor for a control operation.
While an image signal is input to the image processing unit 140, each time one frame image is received from the image processing unit 140, the liquid crystal control unit 150 controls the liquid crystal elements 151R, 151G, and 151B so that their transmittance corresponds to the image.
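As a rough illustration of this per-channel control, the following sketch maps an 8-bit RGB frame to normalized transmittance targets for the three liquid crystal elements. The linear pixel-to-transmittance mapping is an assumption made for illustration only; the embodiment does not specify the actual drive-voltage/transmittance characteristic.

```python
# Illustrative only: derive per-channel transmittance targets from an 8-bit RGB frame.
# The linear mapping is an assumption; the real panel drive characteristic is not
# described in the embodiment.

def frame_to_transmittance(frame_rgb):
    """frame_rgb: list of rows of (R, G, B) tuples with 8-bit values."""
    channels = {"R": [], "G": [], "B": []}
    for row in frame_rgb:
        channels["R"].append([px[0] / 255.0 for px in row])
        channels["G"].append([px[1] / 255.0 for px in row])
        channels["B"].append([px[2] / 255.0 for px in row])
    return channels  # one transmittance map per liquid crystal element

maps = frame_to_transmittance([[(255, 128, 0), (0, 64, 255)]])
print(maps["R"][0])  # [1.0, 0.0] -> transmittance targets for element 151R
```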
The liquid crystal element 151R, which corresponds to red, adjusts the transmittance of the red (R) component of the light that is output from the light source 161 and separated into red (R), green (G), and blue (B) by the color separation unit 162.
The liquid crystal element 151G, which corresponds to green, adjusts the transmittance of the green (G) component of that separated light.
The liquid crystal element 151B, which corresponds to blue, adjusts the transmittance of the blue (B) component of that separated light.
The light source control unit 160, which controls an ON/OFF state of the light source 161 and controls the amount of light, includes a microprocessor for a control operation.
The light source 161 outputs light for projecting an image onto the projection surface. For example, a halogen lamp is used as the light source 161.
The color separation unit 162 separates the light output from the light source 161 into red (R), green (G), and blue (B). For example, a dichroic mirror is used as the color separation unit 162.
When a light emitting diode (LED) corresponding to each of the colors is used as the light source 161, the color separation unit 162 is not necessary.
The color combination unit 163 combines the red (R), green (G), and blue (B) light components respectively transmitted through the liquid crystal elements 151R, 151G, and 151B. For example, a dichroic mirror is used as the color combination unit 163.
The light generated by combining the components of red (R), green (G), and blue (B) by the color combination unit 163 is sent to the projection optical system 171.
The liquid crystal elements 151R, 151G, and 151B are controlled by the liquid crystal control unit 150 so that the transmittance of each element corresponds to the image input from the image processing unit 140.
When the light combined by the color combination unit 163 is projected onto the projection surface by the projection optical system 171, an image corresponding to the image input from the image processing unit 140 is displayed on the projection surface.
The optical system control unit 170, which controls the projection optical system 171, includes a microprocessor for a control operation.
The projection optical system 171 projects the combined light, which is output from the color combination unit 163, onto the projection surface. The projection optical system 171 includes a plurality of lenses.
A light source unit 119 includes the light source 161, the color separation unit 162, the liquid crystal elements 151R, 151G, and 151B, and the color combination unit 163.
<Configuration of Projector 106>
The projector 106 includes the light source unit 119 and a lens barrel unit 115 in which the projection optical system 171 is stored.
The light source unit 119 and the lens barrel unit 115 are connected via a bending portion 135. The light source unit 119 is arranged on the back side of the bending portion 135. A reflection mirror 136 (see the attached drawings) is arranged in the bending portion 135.
Another reflection mirror 134 is arranged on the upper front side of the lens barrel unit 115. The reflection mirror 134 reflects light toward the projection surface 110 to project an image on the projection surface 110. The reflection mirror 136 arranged in the bending portion 135 reflects light output from the light source unit 119 toward the reflection mirror 134.
A cooling mechanism 137 is provided next to the light source unit 119 to radiate heat generated by the light source unit 119.
The configuration of the camera 105 and other components is described below with reference to the attached drawings.
The camera 105 includes a CCD sensor 114 and a plurality of lenses 207 housed in the lens barrel 132 (see the attached drawings).
When an object body placed on the projection surface 110 is read by the camera 105, an image of the object body is reflected by the imaging mirror 117 into the lens barrel 132, and the reflected image, which passes through the lenses 207 inside the lens barrel 132, is read by the CCD sensor 114 (see the attached drawings).
The CCD sensor 114 is installed approximately parallel to the projection surface 110. The lenses 207 are installed with their optical axis approximately perpendicular to the projection surface 110.
An object image of the object body placed on the imaging surface 301, which is the same surface as the projection surface 110, is reflected by the imaging mirror 117, passes through the plurality of lenses 207, and is then formed on the light receiving surface of the CCD sensor 114.
The camera 105 uses a shift optical system in which the image plane IMG formed on the light receiving surface of the CCD sensor 114 is shifted toward the right side in the figure with respect to the optical axis of the plurality of lenses 207.
Let Ia be a light beam directed toward the center of the imaging surface 301 when the imaging surface 301 is captured. Let Ib and Ic be light beams directed toward the left and right ends of the imaging surface 301, respectively, when the imaging surface 301 is captured.
The optical path length of the camera 105 is defined by an optical path length of the light beam Ia.
The optical path length of the light beam Ia is the sum of the distance from point Pa1, where the beam intersects the imaging surface 301, to point Pa2, where it intersects the reflection surface of the imaging mirror 117, and the distance from point Pa2 to point Pa3, where the image is formed on the image plane IMG.
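Expressed numerically, the folded optical path length is simply the sum of the two straight segments Pa1→Pa2 and Pa2→Pa3. The coordinates in the following sketch are hypothetical values chosen only to show the calculation; they are not dimensions of the embodiment.

```python
import math

def dist(p, q):
    """Euclidean distance between two 3-D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Hypothetical coordinates (mm): Pa1 on the imaging surface 301, Pa2 on the
# reflection surface of the imaging mirror 117, Pa3 on the image plane IMG.
Pa1 = (0.0, 0.0, 0.0)
Pa2 = (0.0, 150.0, 320.0)
Pa3 = (40.0, 150.0, 80.0)

optical_path_Ia = dist(Pa1, Pa2) + dist(Pa2, Pa3)  # path folded by the imaging mirror 117
print(round(optical_path_Ia, 1))
```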
The configuration of the gesture sensor 107 is described below with reference to the attached drawings.
The gesture sensor 107 is attached to the main frame 113. The gesture sensor 107 includes a CCD sensor 107a serving as a first imaging element and at least one lens 107b (a first imaging optical system) made of resin. The gesture sensor 107 is attached to the leading edge of the imaging mirror 117.
In order for the gesture sensor 107 to detect a movement of a user's hand/finger extended above the projection surface 110, it is necessary to reserve a detection area such that an area A having a height of 100 mm above the projection surface 110 can be detected. The gesture sensor 107 recognizes a movement of a user's hand/finger with a viewing angle of 60 degrees in front and back directions, and 90 degrees in right and left directions, with respect to the optical axis. The gesture sensor 107 is arranged in an area where it does not interfere with the light beams of the camera 105 and of the projector 106.
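As a rough geometric check of this detection area, the width covered at a given distance from the sensor follows from the viewing angle as approximately 2·d·tan(θ/2). In the sketch below, the sensor mounting height is a hypothetical value and the stated angles are interpreted as full viewing angles; neither assumption comes from the embodiment.

```python
import math

def coverage_width(distance_mm, full_angle_deg):
    """Width covered at a given distance by a sensor with the given full viewing angle."""
    return 2.0 * distance_mm * math.tan(math.radians(full_angle_deg) / 2.0)

sensor_height = 400.0                   # hypothetical mounting height above the projection surface (mm)
detection_top = sensor_height - 100.0   # distance to the top of the 100 mm detection area A

# Coverage at the projection surface and at the top of area A
for d in (sensor_height, detection_top):
    fb = coverage_width(d, 60.0)   # front/back direction
    lr = coverage_width(d, 90.0)   # right/left direction
    print(f"distance {d:.0f} mm: front-back {fb:.0f} mm, right-left {lr:.0f} mm")
```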
Let Sa be a light beam that passes through the optical axis of the lens 107b in the gesture sensor 107.
The optical path length of the gesture sensor 107 is defined by an optical path length of the light beam Sa. The optical path length of the light beam Sa is the distance from point Qa1, where the beam intersects the imaging surface 301, to point Qa2, where the image is formed on the CCD sensor 107a.
The optical path lengths of the components such as the camera 105 are described below with reference to the attached drawings.
The relation among the optical path length of the light beam Ia of the camera 105, the optical path length of the light beam Sa of the gesture sensor 107, and the optical path length of the light beam Ja of the projector 106 is as follows: the optical path length of the camera 105 > the optical path length of the projector 106 > the optical path length of the gesture sensor 107.
The reason for the relation, “the optical path length of the camera 105 > the optical path length of the gesture sensor 107”, is as follows. The camera 105 sometimes reads a document image to be processed via optical character recognition (OCR). For this reason, the readable range of a read image (the range over which an object is in focus) is increased by extending the optical path length of the light beam Ia using the imaging mirror 117, thereby increasing the depth of field. On the other hand, the gesture sensor 107 is required only to detect a user's hand/finger and is not required to have a reading ability with accuracy as high as that of the camera 105. Accordingly, the optical path length of the light beam Sa of the gesture sensor 107 is made shorter than the optical path length of the light beam Ia of the camera 105 (Ia > Sa).
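The depth-of-field argument can be made concrete with the common thin-lens approximation, in which, for a fixed focal length f, f-number N, and permissible circle of confusion c, the depth of field grows roughly with the square of the focusing distance s (valid well below the hyperfocal distance). The lens parameters below are hypothetical and are not taken from the embodiment.

```python
def depth_of_field(s_mm, f_mm, N, c_mm):
    """Approximate depth of field (mm) for focus distance s, focal length f,
    f-number N, and circle of confusion c; valid well below the hyperfocal distance."""
    return 2.0 * N * c_mm * s_mm ** 2 / (f_mm ** 2)

f, N, c = 8.0, 4.0, 0.005    # hypothetical lens parameters
for s in (250.0, 500.0):     # shorter vs. longer optical path length (mm)
    print(f"focus distance {s:.0f} mm -> depth of field ~ {depth_of_field(s, f, N, c):.1f} mm")
```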
To make the optical path length of the light beam Sa of the gesture sensor 107 the same as the optical path length of the light beam Ia of the camera 105, a mirror for reflecting the light beam Sa of the gesture sensor 107 would have to be added. Adding such a mirror makes the information processing apparatus 109 larger. Therefore, satisfying the relation Ia > Sa makes the apparatus more compact. The mirror mentioned here for reflecting the light beam Sa refers not to an optical system included in the gesture sensor 107, but to a mirror provided externally to the gesture sensor 107.
The relation among the optical path length of the projector 106, the optical path length of the camera 105, and the optical path length of the gesture sensor 107 is described below. The projector 106 does not need an optical path length as long as that of the camera 105, because it is not required to have reading performance equivalent to that of the camera 105. On the other hand, it is preferable for the projector 106 to have an optical path length longer than that of the gesture sensor 107, because the projector 106 is required to project the image 111 onto the projection surface 110. Thus, the relation, “the optical path length of the camera 105 > the optical path length of the projector 106 > the optical path length of the gesture sensor 107”, is required.
<Relation of Viewing Angles>
The relation between a viewing angle of the camera 105 and a viewing angle of the gesture sensor 107 is described below with reference to the attached drawings.
The viewing angle DS of the lens 107b of the gesture sensor 107 is set wider than the viewing angle DI of the lenses 207 of the camera 105. Setting the viewing angles in this manner allows the readable area of the gesture sensor 107 to be set approximately in the same range as the readable area of the camera 105 while satisfying the relation, “the optical path length of Ia > the optical path length of Sa”.
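A short numerical sketch (with hypothetical path lengths and angles, not values from the embodiment) shows why a wider viewing angle can compensate for the shorter optical path: the covered width is roughly 2·L·tan(θ/2), so decreasing L while increasing θ keeps the readable area approximately unchanged.

```python
import math

def covered_width(path_length_mm, viewing_angle_deg):
    """Approximate width covered on the imaging surface for a given optical
    path length and full viewing angle (simple pinhole-style estimate)."""
    return 2.0 * path_length_mm * math.tan(math.radians(viewing_angle_deg) / 2.0)

# Hypothetical values: the camera 105 has the longer path Ia and narrower angle DI,
# the gesture sensor 107 has the shorter path Sa and wider angle DS.
camera_width = covered_width(path_length_mm=600.0, viewing_angle_deg=40.0)
sensor_width = covered_width(path_length_mm=330.0, viewing_angle_deg=67.0)
print(round(camera_width), round(sensor_width))  # roughly comparable coverage
```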
<Arrangement Configuration of Imaging Mirror 117 and Gesture Sensor 107>
A second exemplary embodiment of the present invention is described in detail below with reference to the attached drawings. Only a part different from the first exemplary embodiment is described, and the description of a part similar to that in the first exemplary embodiment is omitted.
<Relation Between Optical Path Length of Camera 105 and Optical Path Length of Gesture Sensor 120>
The optical path lengths of the components, such as the camera 105, are described below with reference to the attached drawings.
The optical path length of the gesture sensor 120 is defined by an optical path length of the light beam Sa. The optical path length of the light beam Sa is the distance from point Qa1, where the beam intersects the imaging surface 301, to point Qa2, where the image is formed on the light receiving element 120b1.
At this time, the relation among the optical path lengths of the components, such as the camera 105, is as follows: the optical path length of the camera 105 > the optical path length of the projector 106 > the optical path length of the gesture sensor 120.
The reason for the relation, “the optical path length of the camera 105 > the optical path length of the gesture sensor 120”, is as follows. The camera 105 sometimes reads a document image to be processed via OCR. For this reason, the readable range of a read image (the range over which an object is in focus) is increased by extending the optical path length of the light beam Ia using the imaging mirror 117, thereby increasing the depth of field. On the other hand, the gesture sensor 120 is required only to detect a user's hand/finger and is not required to have a reading ability with accuracy as high as that of the camera 105. In addition, when an infrared camera is used for the gesture sensor 120, attenuation of the infrared light used by the gesture sensor 120 increases as the optical path length between the light receiving element 120b1 and the projection surface 110 becomes longer. Therefore, it is desirable that the optical path length between the light receiving element 120b1 and the projection surface 110 be short. For this reason, the optical path length of the light beam Sa of the gesture sensor 120 is made shorter than the optical path length of the light beam Ia of the camera 105 (Ia > Sa).
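The attenuation argument can be illustrated with a deliberately simplified inverse-square falloff model; the embodiment does not specify the actual attenuation characteristic of the infrared light, so the model and the path lengths below are assumptions for illustration only.

```python
def relative_irradiance(path_length_mm, reference_mm=300.0):
    """Relative received IR signal assuming a simple inverse-square falloff
    with respect to a reference optical path length (illustrative model only)."""
    return (reference_mm / path_length_mm) ** 2

for L in (300.0, 450.0, 600.0):
    print(f"path length {L:.0f} mm -> relative signal {relative_irradiance(L):.2f}")
```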
The relation among the optical path length of the projector 106, the optical path length of the camera 105, and the optical path length of the gesture sensor 120 is the same as that in the first exemplary embodiment.
When an infrared camera is used for the gesture sensor 120 as in the second exemplary embodiment, the information processing apparatus 109 can be made compact while maintaining the imaging performance of the camera 105, as in the first exemplary embodiment.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2015-169628, filed Aug. 28, 2015, which is hereby incorporated by reference herein in its entirety.
Number | Date | Country | Kind
---|---|---|---
2015-169628 | Aug 2015 | JP | national
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2016/003704 | 8/10/2016 | WO | 00