This application claims priority to Japanese Patent Application No. 2013-077909 filed on Apr. 3, 2013. The entire disclosure of Japanese Patent Application No. 2013-077909 is hereby incorporated herein by reference.
1. Field of the Invention
The present invention generally relates to an input device and an input method.
2. Background Information
An input device that makes use of a VUI (virtual user interface) is well known (see Japanese Unexamined Patent Application Publication No. 2009-258569 (Patent Literature 1), for example). This VUI is a virtual input interface that allows the user to perform input operations on a projected image.
For example, Patent Literature 1 discloses an electronic device having a photodiode and a projector module that projects an image onto an installation surface of the electronic device. A scanned laser beam is projected from the projector module onto the installation surface. This scanned laser beam is reflected by an object, such as the user's finger, located above the installation surface. The reflected light is detected by the photodiode, which allows the position of the object with respect to the projected image to be detected.
It has been discovered that with a conventional input device utilizing a VUI, the inclination of the object with respect to the projection surface of the projected image cannot be detected. For example, the above-mentioned Patent Literature 1 is silent about a method for detecting the inclination of an object.
One aspect is to provide an input device and an input method with which a virtual input interface can be realized, allowing an inclination of an object located above a projection region in a line scanning direction of a laser beam to be detected.
In view of the state of the known technology, an input device is provided that includes a light source, a laser beam scanner, a photodetector, and an inclination determination component. The light source is configured to emit a laser beam. The laser beam scanner is configured to scan a plurality of lines of the laser beam in a line scanning direction. The lines of the laser beam are projected in a projection region of a projection surface. The photodetector is configured to detect a reflected light of the laser beam reflected by an object located above the projection region. The inclination determination component is configured to determine an inclination of the object with respect to the projection surface in the line scanning direction based on a change amount in a timing at which the photodetector detects the reflected light.
Also, other objects, features, aspects, and advantages of the present disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses one embodiment of the input device and the input method.
Referring now to the attached drawings which form a part of this original disclosure:
A selected embodiment will now be explained with reference to the drawings. It will be apparent to those skilled in the art from this disclosure that the following descriptions of the embodiment are provided for illustration only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
Referring initially to
As shown in
If there is an object O, such as a touch pen or the user's finger, above the projection region A, then the scanned laser beam R is reflected by this object O. A reflected light r is incident on a light incidence face 10b of the projector 1. The projector 1 detects the relative position of the object O with respect to the projection region A based on the detection result for the reflected light r of the scanned laser beam R reflected by the object O located above the projection region A.
Here, if the object O is inclined in the line scanning direction (the x direction) by θx (0°≤θx<90°) with respect to the projection surface F, then the timing at which the scanned laser beam R is reflected by the object O varies according to which line the scanned laser beam R is on. For example, as will be discussed below, the timing of reflection by the object O differs between the scanned laser beam Rn of the n-th line (n is a positive integer) and the scanned laser beam Rn+1 of the (n+1)-th line. Therefore, the timing at which the reflected light r from the object O is incident on the light incidence face 10b of the projector 1, and the detection timing at which the reflected light r is detected, both vary according to which line of the scanned laser beam R produced the reflected light r. The projector 1 detects the inclination θx of the object O with respect to the projection surface F in the line scanning direction based on the deviation (e.g., the amount of change) in the detection timing of the reflected light r from the object O. The method for determining the inclination θx of the object O in the line scanning direction will be discussed in detail below.
Next, the configuration of the projector 1 will be described.
The memory component 11 is a nonvolatile memory medium, and holds programs and control information used by the CPU 12 and so forth. The memory component 11 also holds image data about images projected in the projection region A, for example. In
The CPU 12 is a controller that uses the programs, control information, and so on contained in the memory component 11 to control the various components of the projector 1. This CPU 12 has a position determination component 121 and an inclination determination component 122.
The position determination component 121 determines the relative position of the object O with respect to the projection region A based on the detection result of the photodetector 22. This relative position is calculated based on the detection result obtained by the photodetector 22 for the reflected light r of the scanned laser beam R reflected by the object O, for example.
The inclination determination component 122 determines the inclination θx (see
The image data processor 13 converts the image data outputted from the CPU 12 into data for three colors, namely, red (R), green (G), and blue (B). The converted data for three colors is outputted to the laser beam driver 14.
The laser beam driver 14 is a control circuit that performs drive control of the LD 15. The red laser beam driver 14a performs drive control for the emission, light output, and so forth of the red LD 15a. The green laser beam driver 14b performs drive control for the emission, light output, and so forth of the green LD 15b. The blue laser beam driver 14c performs drive control for the emission, light output, and so forth of the blue LD 15c.
The LD 15 is a light source that emits a laser beam with a wavelength in the visible light band. The red LD 15a is a light emitting element that emits a red laser beam. The green LD 15b is a light emitting element that emits a green laser beam. The blue LD 15c is a light emitting element that emits a blue laser beam.
The MEMS mirror 18 is an optical reflection element that reflects the laser beams emitted from the LD 15 and incident via the collimator lenses 16a to 16c and the beam splitters 17a to 17c. The actuator 19 drives the MEMS mirror 18 and varies the reflection direction of the laser beams in biaxial directions. The mirror servo 20 is a drive controller that controls the drive of the MEMS mirror 18 by the actuator 19 based on a control signal inputted from the CPU 12. The MEMS mirror 18, the actuator 19, and the mirror servo 20 are examples of a laser beam scanner that scans a plurality of lines of the scanned laser beam R projected onto the projection surface F.
The reflecting mirror 21 is a laser beam reflector provided on the outside of the housing 10. The reflecting mirror 21 reflects the scanned laser beam R that goes through the light emission face 10a formed on the housing 10 and is emitted outside of the housing 10, and guides this light to the projection surface F. The reflecting mirror 21 is movably attached to the housing 10, and can also be removed from the optical path of the scanned laser beam R.
As shown in
The photodetector 22 includes a light receiving element or the like, and detects light that is incident after passing through the light incidence face 10b formed on the housing 10. The photodetector 22 detects, for example, the reflected light r of the scanned laser beam R reflected by the object O located above the projection region A. The photodetector 22 outputs a light detection signal to the CPU 12 based on this detection result.
The interface component 23 is a communication interface for wired or wireless communication with an external device. The input component 24 is an input unit that accepts user operation input.
Next, the scanning state of the scanned laser beam R projected in the projection region A will be described.
Next, the method by which the inclination determination component 122 determines the inclination θx in the line scanning direction of the object O located above the projection region A will be described.
Referring now to
As shown in
Therefore, as shown in
Specifically, the photodetector 22 detects the reflected light rn+1 for the (n+1)-th line reflected by the object O at a timing that is later by (tb−ta) than the reflected light rn of the n-th line. Also, the photodetector 22 detects the reflected light rn+2 for the (n+2)-th line reflected by the object O at a timing that is later by (tc−tb) than the reflected light rn+1 of the (n+1)-th line.
The inclination determination component 122 determines the inclination θx in the line scanning direction (the x direction) of the object O by using the following Equation 1, based on the deviation (tb−ta) of the detection timing of the light detection signal for the reflected light rn+1 of the (n+1)-th line, for example.
θx=tan−1 [m/{(Ln+1/T)×(tb−ta)}] (Equation 1)
In Equation 1, m is the spacing of the reflection starting points Pn and Pn+1 (the shortest distance in the z direction). Ln+1 is the line scanning distance of the scanned laser beam Rn+1 for the (n+1)-th line (that is, the projection distance in the x direction). T is the line scanning time for one line of the scanned laser beam R (see
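As an illustrative sketch (not part of the original disclosure), Equation 1 can be expressed in code as follows. The function name, parameter names, and units are hypothetical; the only assumption is that m and the line scanning distance share a length unit and the times share a time unit.

```python
import math

def inclination_from_timing(m, line_distance, line_time, dt):
    """Compute the inclination theta_x per Equation 1.

    m             -- spacing of the reflection starting points Pn and Pn+1
                     (shortest distance in the z direction)
    line_distance -- line scanning distance L_{n+1} in the x direction
    line_time     -- line scanning time T for one line
    dt            -- deviation (tb - ta) of the detection timings
    """
    if dt == 0:
        # Zero deviation means the object stands perpendicular to the
        # projection surface, i.e. theta_x = 90 degrees (see the vertical case).
        return 90.0
    scan_speed = line_distance / line_time   # x distance scanned per unit time
    dx = scan_speed * dt                     # x offset between reflection points
    return math.degrees(math.atan(m / dx))   # theta_x = tan^-1(m / dx)
```

For example, when the x offset between reflection points equals the z spacing m, the function returns 45 degrees.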
Referring now to
As shown in
Thus, as shown in
In this case, since the deviation (the amount of change) in the detection timing between the various light detection signals is zero, the inclination determination component 122 determines the inclination θx of the object O in the line scanning direction to be 90° based on the above Equation 1.
With the above method for determining the inclination θx, the inclination θx is calculated based on a single deviation (amount of change) between detection timings. However, the inclination θx can also be calculated based on a plurality of deviations between a plurality of detection timings, which allows the inclination θx of the object O in the line scanning direction to be calculated more accurately. More specifically, in the illustrated embodiment, the inclination θx is calculated based on the single deviation (tb−ta) of the detection timings ta and tb. However, the inclination θx can be calculated based on the deviations (e.g., (tb−ta) and (tc−tb)) of the detection timings ta, tb, and tc. In this case, the inclination θx can be calculated as an average value of the inclinations θx calculated based on the respective deviations (e.g., (tb−ta) and (tc−tb)) using Equation 1. Of course, when the inclination θx is calculated based on the deviation (tc−tb) using Equation 1, m is the spacing of the reflection starting points Pn+1 and Pn+2 (the shortest distance in the z direction), and the line scanning distance Ln+2 of the scanned laser beam Rn+2 for the (n+2)-th line (that is, the projection distance in the x direction) is used instead of Ln+1.
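The averaging just described can be sketched as follows; this is an illustrative example only, with hypothetical names, assuming one detection timing per consecutive line and one line scanning distance per deviation.

```python
import math

def average_inclination(m, line_distances, line_time, timings):
    """Average theta_x over consecutive timing deviations, e.g.
    (tb - ta) and (tc - tb).

    timings          -- detection timings ta, tb, tc, ... for consecutive lines
    line_distances[i]-- scanning distance of the line producing timings[i+1]
                        (L_{n+1}, L_{n+2}, ...)
    m                -- per-line spacing of reflection points in z
    """
    angles = []
    for i in range(len(timings) - 1):
        dt = timings[i + 1] - timings[i]
        if dt == 0:
            angles.append(90.0)          # perpendicular case
            continue
        dx = (line_distances[i] / line_time) * dt
        angles.append(math.degrees(math.atan(m / dx)))
    return sum(angles) / len(angles)     # mean of the per-pair inclinations
```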
Also, with the above method for determining the inclination θx, the detection timing is found at the points ta to tc (see
Next, an application example of this embodiment will be described.
In this state, if the user moves the object O over the projected key to be inputted (such as the projected key K2), then the character indicated by that projected key (such as the character “I”) is selected and inputted, for example. Alternatively, in a state in which the projected keys K2 to K5 have been newly displayed, the projected key can be selected according to the inclination direction of the object O or the magnitude of the inclination θx. If the object O, such as the touch pen, is removed from the projection region A without being moved, the projected key K1 indicating the character “A” is selected and inputted.
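Selecting a projected key by the magnitude of the inclination θx could be sketched as below. This is purely illustrative: the embodiment does not specify how θx ranges map to keys, so the even four-way split over 0–90° is a hypothetical assumption.

```python
def select_key_by_inclination(theta_x, keys=("K2", "K3", "K4", "K5")):
    """Map the magnitude of theta_x (0 to 90 degrees) to one of the
    newly displayed projected keys K2 to K5.

    The equal-width bands are hypothetical; the embodiment only states
    that selection may depend on the direction or magnitude of theta_x.
    """
    band = 90.0 / len(keys)                      # width of each angle band
    index = min(int(theta_x // band), len(keys) - 1)
    return keys[index]
```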
Another application example of this embodiment will be described.
The projector 1 pertaining to the above aspect of this embodiment includes the LD 15 (e.g., the light source), a laser beam scanner (such as the MEMS mirror 18, the actuator 19, and the mirror servo 20), the photodetector 22, and the inclination determination component 122. The LD 15 emits a laser beam. The laser beam scanner scans a plurality of lines of the laser beam in the line scanning direction (the x direction). The lines of the laser beam are projected into the projection region A on the projection surface F. The photodetector 22 detects the reflected light r of the scanned laser beam R reflected by the object O located above the projection region A. The inclination determination component 122 determines the inclination θx of the object O with respect to the projection surface F in the line scanning direction (the x direction) of the scanned laser beam R based on the amount of change (e.g., change amount) in the timing (such as ta, tb, tc, tr, etc.) at which the photodetector 22 detects the reflected light r.
With this projector 1, the laser beam that has been scanned for a plurality of lines by the laser beam scanner (such as the MEMS mirror 18, the actuator 19, and the mirror servo 20) is projected into the projection region A on the projection surface F, and is reflected by the object O located above the projection region A. The inclination determination component 122 determines the inclination θx of the object O with respect to the projection surface F in the line scanning direction (the x direction) of the scanned laser beam R based on the amount of change in the timing (such as ta, tb, tc, tr, etc.) at which the reflected light r is detected by the photodetector 22. Therefore, a virtual input interface can be realized with which the inclination θx of the object O located above the projection region A in the line scanning direction (the x direction) of the scanned laser beam R can be detected.
Also, with the projector 1 pertaining to an aspect of this embodiment, the inclination determination component 122 determines the inclination θx of the object O with respect to the projection surface F in the line scanning direction (the x direction) based on the amount of change (tb−ta) between the timing ta (e.g., the first timing) and the timing tb (e.g., the second timing), for example. What is detected at the timing ta is the reflected light rn (e.g., the first reflected light) obtained in response to the scanned laser beam Rn (e.g., the first laser beam) of the n-th line scanned by the laser beam scanner (such as the MEMS mirror 18, the actuator 19, and the mirror servo 20) being reflected by the object O. What is detected at the timing tb is the reflected light rn+1 (e.g., the second reflected light) obtained in response to the scanned laser beam Rn+1 (e.g., the second laser beam) of the (n+1)-th line scanned after the scanned laser beam Rn of the n-th line being reflected by the object O.
This allows the inclination θx of the object O with respect to the projection surface F in the line scanning direction (the x direction) to be detected based on the amount of change (tb−ta) in the timings ta and tb. The reflected light rn and rn+1 of the scanned laser beams Rn and Rn+1 of the n-th and the (n+1)-th lines reflected by the object O are detected at the timings ta and tb.
With the projector 1, the reflected lights rn and rn+1 (e.g., the first and second reflected lights) are obtained in response to the scanned laser beams Rn and Rn+1 (e.g., the first and second laser beams) being reflected on different points Pn and Pn+1 of the object O, respectively.
With the projector 1, the amount of change (tb−ta) (e.g., the change amount) between the timings ta and tb (e.g., the first and second timings) is calculated by subtracting the timing ta (e.g., the first timing) from the timing tb (e.g., the second timing).
With the projector 1, the inclination determination component 122 further calculates the timings ta and tb (e.g., the first and second timings) relative to the timings t0 (e.g., the line-scan start timings) of the scanned laser beams Rn and Rn+1 (e.g., the first and second laser beams), respectively.
Also, with the projector 1 pertaining to an aspect of this embodiment, the inclination determination component 122 determines a plurality of amounts of change (e.g., change amounts) (such as (tb−ta), (tc−tb), etc.) for the timing (such as ta, tb, tc, tr, etc.) at which the reflected light r from the object O is detected. The inclination determination component 122 further determines the inclination θx of the object O with respect to the projection surface F in the line scanning direction (the x direction) based on the plurality of amounts of change.
This allows the inclination θx of the object O with respect to the projection surface F in the line scanning direction (the x direction) to be calculated more accurately.
Also, in an aspect of this embodiment, the projector 1 is an input device having a VUI function.
This allows the projector 1 having a VUI (virtual user interface) function to be used as an input device.
Also, the method for inputting the inclination θx of the object O in an aspect of this embodiment includes the following steps. First, a laser beam is emitted, and a plurality of lines of the laser beam projected in the projection region A on the projection surface F are scanned in the line scanning direction (the x direction). The reflected light r of the scanned laser beam R reflected by the object O located above the projection region A is detected. The inclination θx of the object O with respect to the projection surface F in the line scanning direction (the x direction) of the scanned laser beam R is determined based on the amount of change in the timing (such as ta, tb, tc, etc.) at which the reflected light r is detected.
With this input method, the laser beam that is scanned for a plurality of lines is projected into the projection region A on the projection surface F, and reflected by the object O located above the projection region A. The inclination θx of the object O with respect to the projection surface F in the line scanning direction (the x direction) of the scanned laser beam R is determined based on the amount of change in the timing (such as ta, tb, tc, tr, etc.) at which each reflected light r is detected. Therefore, a virtual input interface can be obtained with which the inclination θx of the object O located above the projection region A in the line scanning direction (the x direction) of the scanned laser beam R can be detected.
An embodiment of the present invention is described above. The above embodiment is merely an example; various modifications in the combination of the constituent elements and processing steps are possible, and it will be understood by a person skilled in the art that such modifications lie within the scope of the present invention.
For example, in the above embodiment, the position determination component 121 and the inclination determination component 122 are realized as functional components of the CPU 12. However, the present invention is not limited to or by this example. The position determination component 121 and the inclination determination component 122 can each be realized by an electronic circuit component that is separate from the CPU 12.
Also, in the above embodiment, the projector 1 includes the reflecting mirror 21. However, the present invention is not limited to or by this example. The projector 1 need not include the reflecting mirror 21. In this case, the scanned laser beam R emitted from the light emission face 10a will be projected directly onto the projection surface F.
Also, in the above embodiment, the inclination θx of the object O is determined based on the reflected light r (such as the reflected light rn and rn+1) of two scanned laser beams R line-scanned consecutively. However, the present invention is not limited to or by this example. The inclination θx can be determined based on the reflected light r of the scanned laser beam R for a given line, and the reflected light r of the scanned laser beam R a plurality of lines later. For instance, the inclination θx of the object O in the line scanning direction (the x direction) can be determined by using the above-mentioned Equation 1, based on the deviation (tc−ta) in the detection timing in light detection signals for the reflected light rn and rn+2 of the n-th and the (n+2)-th lines. In this case, it should go without saying that the spacing 2m of the reflection starting points Pn and Pn+2 and the line scanning distance Ln+2 of the scanned laser beam Rn+2 of the (n+2)-th line are used in Equation 1. This allows the inclination θx of the object O with respect to the projection surface F in the line scanning direction (the x direction) to be calculated more accurately by using Equation 1.
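The modification above, using lines that are a plurality of lines apart, amounts to scaling the reflection-point spacing in Equation 1 by the line separation (2m for the n-th and (n+2)-th lines). A sketch under that assumption, with hypothetical names:

```python
import math

def inclination_k_lines_apart(m, k, line_distance, line_time, dt):
    """theta_x from the reflected light of the n-th and (n+k)-th lines.

    The z-direction spacing of the reflection starting points becomes
    k * m (e.g., 2m for Pn and Pn+2); line_distance is L_{n+k}; dt is
    the timing deviation (e.g., tc - ta).
    """
    if dt == 0:
        return 90.0                              # perpendicular case
    dx = (line_distance / line_time) * dt        # x offset between points
    return math.degrees(math.atan((k * m) / dx))
```

Using a larger line separation spreads the same inclination over a larger timing deviation, which can reduce the relative effect of timing jitter.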
Also, in the above embodiment, the LD 15 emits a laser beam with a wavelength in the visible light band. However, the present invention is not limited to or by this example. The LD 15 can instead emit a laser beam with a wavelength outside the visible light band (such as infrared light or ultraviolet light).
In understanding the scope of the present invention, the term “comprising” and its derivatives, as used herein, are intended to be open ended terms that specify the presence of the stated features, elements, components, groups, integers, and/or steps, but do not exclude the presence of other unstated features, elements, components, groups, integers and/or steps. The foregoing also applies to words having similar meanings such as the terms, “including”, “having” and their derivatives. Also, the terms “part,” “section,” “portion,” “member” or “element” when used in the singular can have the dual meaning of a single part or a plurality of parts unless otherwise stated.
As used herein, the following directional terms “forward”, “rearward”, “front”, “rear”, “up”, “down”, “above”, “below”, “upward”, “downward”, “top”, “bottom”, “side”, “vertical”, “horizontal”, “perpendicular” and “transverse” as well as any other similar directional terms refer to those directions of an input device or projector in an upright position. Accordingly, these directional terms, as utilized to describe the input device should be interpreted relative to an input device in an upright position on a horizontal surface.
Also it will be understood that although the terms “first” and “second” may be used herein to describe various components these components should not be limited by these terms. These terms are only used to distinguish one component from another. Thus, for example, a first component discussed above could be termed a second component and vice-a-versa without departing from the teachings of the present invention. The term “attached” or “attaching”, as used herein, encompasses configurations in which an element is directly secured to another element by affixing the element directly to the other element; configurations in which the element is indirectly secured to the other element by affixing the element to the intermediate member(s) which in turn are affixed to the other element; and configurations in which one element is integral with another element, i.e. one element is essentially part of the other element. This definition also applies to words of similar meaning, for example, “joined”, “connected”, “coupled”, “mounted”, “bonded”, “fixed” and their derivatives. Finally, terms of degree such as “substantially”, “about” and “approximately” as used herein mean an amount of deviation of the modified term such that the end result is not significantly changed.
While only a selected embodiment has been chosen to illustrate the present invention, it will be apparent to those skilled in the art from this disclosure that various changes and modifications can be made herein without departing from the scope of the invention as defined in the appended claims. For example, unless specifically stated otherwise, the size, shape, location or orientation of the various components can be changed as needed and/or desired so long as the changes do not substantially affect their intended function. Unless specifically stated otherwise, components that are shown directly connected or contacting each other can have intermediate structures disposed between them so long as the changes do not substantially affect their intended function. The functions of one element can be performed by two, and vice versa unless specifically stated otherwise. The structures and functions of one embodiment can be adopted in another embodiment. It is not necessary for all advantages to be present in a particular embodiment at the same time. Every feature which is unique from the prior art, alone or in combination with other features, also should be considered a separate description of further inventions by the applicant, including the structural and/or functional concepts embodied by such feature(s). Thus, the foregoing descriptions of the embodiment according to the present invention are provided for illustration only, and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.