This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2015-121028, filed on Jun. 16, 2015, the entire contents of which are incorporated herein by reference.
The embodiment discussed herein is related to information input control.
In a portable terminal such as a smartphone, a display and a touch panel are often mounted on top of each other. A user may operate such a terminal by touching, with his/her finger, regions where icons, a menu, and the like are displayed. In this event, the touch panel detects the region where input processing is performed by the user, the kind of input operation, and the like. Then, the portable terminal performs processing associated with the operation detected by the touch panel.
A wearable terminal such as a head mounted display (HMD) and a watch-type device has recently attracted more and more attention. However, the wearable terminal is not equipped with a touch panel of a size easy for a user to use for input operations. This may complicate input processing. To address this, there has been proposed, as a related technique, a symbol generation device configured to generate a signal corresponding to a movement trajectory pattern of a mobile object such as a finger of a user. The symbol generation device uses a detection signal to specify a movement trajectory pattern of the mobile object, and specifies a signal corresponding to the specified movement trajectory pattern (for example, Japanese Laid-open Patent Publication No. 11-31047). Moreover, there has also been proposed an information input method in which ultrasonic speakers are mounted at both ends of the wrists while ultrasonic microphones are worn on the respective finger tips, and the two-dimensional position of each finger tip is obtained from the time difference between when an ultrasonic wave is outputted from the corresponding ultrasonic speaker and when the ultrasonic wave is detected by the corresponding ultrasonic microphone (for example, Japanese Laid-open Patent Publication No. 2005-316763). In the information input method, acceleration sensors having sensitivity in a direction perpendicular to the palm are further mounted on the finger tips of the respective fingers, and movements of the fingers such as punching on a keyboard are recognized by detecting changes in the accelerations outputted from the acceleration sensors. Furthermore, there has also been proposed an input device including a camera attached to a wrist so as to capture images of the user's fingers, a sensor configured to recognize the tilt of the user's arm, and an information processing device (for example, Japanese Laid-open Patent Publication No. 2005-301583). The input device detects a punching operation based on the position of the user's hand detected by the sensor and the positions of the user's finger tips detected from an image captured by the camera.
According to an aspect of the invention, an input device that inputs a command to an information processing device includes a member with which the input device is attached to a hand of a user, a light source configured to project light onto a finger of the user, an image capture device configured to capture an image of reflected light from the finger onto which the light is projected, and a processor configured to: acquire a plurality of images from the image capture device, specify a positional change in the finger based on a change in patterns of the reflected light specified from the respective images, generate the command corresponding to a movement of the finger, based on the positional change, and input the generated command to the information processing device.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention, as claimed.
As described in the background, several techniques have been studied on input devices used to perform input to another device. However, the method using the ultrasonic speakers mounted on the wrists, the ultrasonic microphones worn on the finger tips, and the acceleration sensors also mounted on the finger tips is difficult for the user to use because of the many parts to be mounted. Moreover, in the case of generating a signal from a trajectory pattern of a mobile object or in the case of using the input device that detects the punching operation, it is difficult to distinguish among the many kinds of operations used for input to a touch panel. Therefore, an input result desired by the user may not be obtained.
It is an object of the technique disclosed in the embodiment to provide a highly convenient input device.
In Step S1, the terminal 10 displays an image on a screen. The user of the terminal 10 wears the input device 20 such that the infrared light pattern outputted from the input device 20 is projected onto his/her finger. The input device 20 uses an infrared camera or the like to periodically monitor a pattern obtained from the reflection by the user's finger when the infrared light pattern is projected onto the finger. Furthermore, the input device 20 uses the acceleration sensor to also periodically monitor changes in vibration (Step S2).
The input device 20 determines whether or not a change in the infrared light pattern is detected (Step S3). The input device 20 repeats the processing of Steps S2 and S3 until a change in the infrared light pattern is detected (No in Step S3).
Meanwhile, for example, it is assumed that the user of the terminal 10 visually confirms the display on the screen of the terminal 10 and moves his/her finger to perform processing. In this case, the infrared light pattern projected onto the user's finger changes with a change in a distance between an infrared light irradiation position on the input device 20 and the user's finger tip. In such a case, since a change in the infrared light pattern is detected, the input device 20 specifies processing performed by the user from the change in the infrared light pattern and the vibration (Yes in Step S3, Step S4). Then, the input device 20 notifies the terminal 10 of contents specified as the processing performed by the user (Step S5). Upon receipt of the notification of the processing by the user from the input device 20, the terminal 10 changes the display on the screen according to the processing by the user (Step S6).
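The flow of Steps S1 to S6 on the input-device side amounts to a simple polling loop: capture the reflected pattern and the vibration, check for a change in the pattern, classify the operation, and notify the terminal. The following is a minimal sketch of such a loop in Python; the helper callables (capture_pattern, read_acceleration, classify_operation, notify_terminal), the polling period, and the iteration limit are illustrative assumptions and are not part of the embodiment.

```python
import time
from typing import Callable, List, Optional

POLL_INTERVAL_S = 0.02  # assumed polling period; the embodiment only says "periodically"

def monitoring_loop(
    capture_pattern: Callable[[], List[float]],      # Step S2: reflected infrared pattern
    read_acceleration: Callable[[], float],          # Step S2: vibration from the acceleration sensor
    classify_operation: Callable[[List[float], List[float], float], Optional[str]],  # Step S4
    notify_terminal: Callable[[str], None],          # Step S5
    max_iterations: int = 1000,
) -> None:
    """Sketch of Steps S2 to S5 on the input-device side."""
    previous = capture_pattern()
    for _ in range(max_iterations):
        time.sleep(POLL_INTERVAL_S)
        current = capture_pattern()
        vibration = read_acceleration()
        if current == previous:                      # Step S3: no change in the pattern
            continue
        operation = classify_operation(previous, current, vibration)  # Step S4
        if operation is not None:
            notify_terminal(operation)               # Step S5; the terminal then updates its screen (Step S6)
        previous = current
```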
As described above, the input device 20 that projects the infrared light pattern usable to obtain a change in distance is used at the user's finger tip. Then, the trajectory of the user's finger is obtained without attaching many parts to the user's hand, fingers, and the like. In other words, the method according to the embodiment enables the processing by the user to be specified just by installing one input device 20 at a position where the infrared light may be projected onto the user's finger, the input device 20 including the parts used for irradiation of the infrared light pattern, the infrared camera, and the acceleration sensor. Therefore, the input device 20 is easy for the user to wear and is highly convenient. Furthermore, since the input device 20 specifies the movement of the user's finger itself, the user can perform operations more easily than in the case where the user performs input processing by wearing a separate mobile object used for the processing. Moreover, there is also an advantage for a vendor in that the input device 20 may be manufactured at a lower cost than a device used in a method requiring a large number of parts.
<Device Configuration>
The projector unit 25 projects an infrared light pattern. Note that the infrared light pattern to be projected is preferably one in which a change in the size and interval of the pattern, which depends on the distance to the target onto which the infrared light is projected, is easily recognizable in an image captured by the image capture unit 26. The image capture unit 26 captures images of the pattern of the infrared light reflected from a target object irradiated with the infrared light pattern. The following description is given assuming that the infrared light pattern is projected onto the user's finger.
The measurement unit 31 measures the magnitude of vibration of the user's finger. The generation unit 32 uses a change in the infrared light pattern captured by the image capture unit 26 to specify a positional change in the user's finger, and generates an input signal using the trajectory of the position of the user's finger. Moreover, the generation unit 32 uses the magnitude of the vibration obtained by the measurement unit 31 to specify the type of processing performed by the user. The storage unit 40 stores information to be used for processing by the control unit 30 and information obtained by the processing by the control unit 30. The transmission unit 22 transmits data such as the input signal generated by the generation unit 32 to the terminal 10. The reception unit 23 receives data from the terminal 10 as appropriate.
The processor 101 is an arbitrary processing circuit including a central processing unit (CPU). The processor 101 operates as the generation unit 32 by reading and executing a program stored in the ROM 103. The measurement unit 31 is realized by the processor 101 and the acceleration sensor 104. The lens 111 and the infrared LED 112 operate as the projector unit 25. The infrared camera 113 operates as the image capture unit 26. The communication interface 105 realizes the transmission and reception unit 21. The storage unit 40 is realized by the RAM 102 and the ROM 103.
<Method for Specifying Input Signal>
A method for specifying an input signal is described below by dividing the method into an example of a method for specifying a finger position by using an infrared light pattern and for specifying a positional change in the user's finger, and an example of determination processing using the vibration obtained by the measurement unit 31. Note that the example of the specification method and the example of the determination processing described below are examples for helping the understanding, and may be modified according to the implementation.
(A) Specification of Finger Position and Trajectory Using Infrared Light Pattern.
Pattern P1 and Pattern P2 illustrated in the drawings are examples of the infrared light pattern projected by the projector unit 25.
It is assumed that the interval of the irradiation pattern reflected on the finger when the finger is at a position Po1 is D1 and the interval of the irradiation pattern at a position Po2 is D2. Moreover, it is assumed that the distance between the position Po1 and the image capture unit 26 is L1 and the distance between the position Po1 and the projector unit 25 is L2. Furthermore, it is assumed that the distance between the positions Po1 and Po2 is ΔL. Then, the relationship represented by Equation (1) is established.
D2/D1=(L2+ΔL)/L2 (1)
Next, it is assumed that the interval of the pattern image-captured by the image capture unit 26 when the finger is at the position Po1 is d1 and the interval of the pattern image-captured by the image capture unit 26 at the position Po2 is d2. Then, as for d1 and d2, Equation (2) is established.
d2/d1=L1/(L1+ΔL)×(L2+ΔL)/L2 (2)
Equation (3) is obtained when Equation (2) is solved in terms of ΔL.
ΔL=(d1−d2)/{(d2/L1)−(d1/L2)} (3)
Here, L1 and L2 are obtained from measurement when the user's finger is located at Po1. Moreover, d1 and d2 are obtained by analysis processing on an image obtained by the image capture unit 26. Thus, the generation unit 32 may obtain ΔL by using Equation (3).
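Equation (3) may be transcribed directly into code. The sketch below assumes that d1 and d2 are the pattern intervals measured in the captured images (in any consistent unit) and that L1 and L2 are the distances obtained by measurement while the finger is at the position Po1; the function name and the example values are illustrative only.

```python
def displacement_from_pattern_interval(d1: float, d2: float, L1: float, L2: float) -> float:
    """Equation (3): displacement ΔL of the finger from the position Po1 toward Po2.

    d1, d2: pattern intervals observed by the image capture unit 26 at Po1 and Po2.
    L1: distance between the position Po1 and the image capture unit 26.
    L2: distance between the position Po1 and the projector unit 25.
    """
    return (d1 - d2) / ((d2 / L1) - (d1 / L2))

# Example with assumed values: L1 = 60 mm, L2 = 50 mm, and the captured interval
# growing from d1 = 10.0 to d2 ≈ 10.286 yields ΔL ≈ +10 mm (the finger moved away).
print(displacement_from_pattern_interval(10.0, 10.2857, 60.0, 50.0))
```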
On the other hand, in the case of use of infrared light with large diffusion, the position of each finger is specified by using the fact that the size of the dot obtained by diffusion increases in proportion to the distance from the input device 20. Any known calculation method may be used for this calculation.
G4 illustrates an example of an image on the XZ plane captured by the image capture unit 26 when the user extends his/her index finger f1 to the left. The data obtained by the image capture unit 26 is obtained as Y-coordinate values at each coordinate within the XZ plane.
In the example of G4, the size of the dots of infrared light projected onto the index finger f1 and the interval between the dots are almost the same as those of the other fingers. Thus, the generation unit 32 determines that, even though the leftmost finger (index finger f1) among the four fingers onto which the infrared light is projected is at almost the same distance as the other fingers, the position thereof is shifted to the left. Therefore, using the image illustrated in G4, the generation unit 32 determines that the tips of the user's four fingers are positioned as illustrated in G5 when represented on the XY plane. In G5, while the tip of the index finger f1 is positioned to the left of the home position, the middle finger f2, the fourth finger f3, and the fifth finger f4 are at the home positions.
G6 illustrates an analysis example of an image on the XZ plane captured by the image capture unit 26 when the user extends his/her index finger f1 from the home position in a direction away from the input device 20. In G6, again, to facilitate visualization, the shapes of the fingers specified by the generation unit 32 and how the infrared light pattern is projected are illustrated as in the case of G4. In the example of G6, the size of the dots of infrared light projected onto the index finger f1 is larger than that of the dots projected onto the other fingers. Therefore, the generation unit 32 determines that the leftmost finger (index finger f1) among the four fingers onto which the infrared light is projected is located at a position farther away from the input device 20 than the other fingers. Moreover, as illustrated in G6, the positions of the fingers in the X-axis direction (horizontal direction) of the shapes of the fingers obtained on the XZ plane are not very different from the home positions. Therefore, the generation unit 32 determines, based on the information indicated in G6, that the leftmost finger is extended in a direction opposite to the input device 20, and the other three fingers are at the home positions. Moreover, the generation unit 32 calculates the distance of each of the fingers from the image capture unit 26 by using the size of the dots on each finger, and thus specifies the position of each finger on the XY plane. As a result of the processing by the generation unit 32, it is determined that the tips of the user's four fingers are positioned as illustrated in G7 when represented on the XY plane. More specifically, it is determined that, while the tip of the index finger f1 is at the position farther away from the input device 20 than the home position and not different from the home position in the horizontal direction, the middle finger f2, the fourth finger f3, and the fifth finger f4 are at the home positions. Note that, in G7, it is assumed that the larger the Y-axis value, the closer to the input device 20.
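Under the large-diffusion method described above, the position of each finger on the XY plane may be estimated from the mean X coordinate of the dots observed on the finger and from the dot size, using the stated proportionality between dot size and distance. The following is a rough sketch assuming a single calibration pair (dot diameter and distance at the home position); the linear scaling and all names are assumptions.

```python
from typing import List, Tuple

def estimate_finger_positions(
    dot_x_means: List[float],     # mean X coordinate of the dots observed on each finger
    dot_diameters: List[float],   # mean diameter of the dots observed on each finger
    diameter_at_home: float,      # dot diameter observed when a finger is at the home position
    distance_at_home: float,      # finger-to-device distance at the home position (calibration)
) -> List[Tuple[float, float]]:
    """Estimate (X, distance-from-device) for each finger tip.

    The description states that, with large diffusion, the dot size increases in
    proportion to the distance from the input device 20, so the distance is scaled
    by the ratio of the observed diameter to the calibrated home diameter.
    (In the drawings the Y coordinate is defined so that a larger Y means a shorter
    distance to the input device 20; here the raw distance is returned instead.)
    """
    positions = []
    for x, diameter in zip(dot_x_means, dot_diameters):
        distance = distance_at_home * (diameter / diameter_at_home)
        positions.append((x, distance))
    return positions
```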
By the processing described above, the generation unit 32 specifies the position of each of the user's fingers on the XY plane, and obtains the trajectory of each finger from changes in the specified positions.
(B) Determination Processing Using Result of Vibration Measurement by Measurement Unit 31
Hereinafter, description is given of examples of input processing from the input device 20 and operations of the terminal 10, which are performed using the method for specifying an input signal according to the embodiment. Note that, in the following examples, description is given of the case, as an example, where an infrared light pattern using infrared light with relatively large diffusion is projected onto the user's finger. However, infrared light with small diffusion may be projected.
In a first operation example, description is given of an example of processing when each of selectable regions is associated with the finger detected by the input device 20 on a screen displaying an image including the selectable regions.
It is assumed that the terminal 10 displays an image illustrated in G12. In the image of G12, Time button, Weather button, GPS button, and NEXT button are displayed as selectable regions, and are associated with the fingers f1 to f4 detected by the input device 20, respectively.
G13 illustrates the infrared light pattern reflected on the user's fingers when the user performs a tap operation with his/her index finger. The generation unit 32 obtains the trajectory of each finger from changes in the infrared light pattern. When the amount of movement of the fingers in the obtained trajectory does not exceed a predetermined threshold Th1, the generation unit 32 determines that no input operation is performed.
On the other hand, when the amount of movement of the fingers in the obtained trajectory exceeds the predetermined threshold Th1, the generation unit 32 determines whether or not the absolute value of the acceleration measured by the measurement unit 31 at the time when the finger f1 is moved in the Z-axis direction exceeds a threshold Th2. When the absolute value of the acceleration measured by the measurement unit 31 at the time when the finger f1 is moved in the Z-axis direction exceeds the threshold Th2, the generation unit 32 determines that the tap operation is performed with the finger f1.
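The tap decision described here combines the two thresholds: the movement along the trajectory must exceed Th1 and the absolute acceleration must exceed Th2. A minimal sketch; the concrete threshold values are assumptions.

```python
TH1_MIN_MOVEMENT = 8.0   # assumed value of the threshold Th1 for the finger's movement
TH2_MIN_ACCEL = 1.5      # assumed value of the threshold Th2 for the absolute acceleration

def is_tap(trajectory_length: float, peak_abs_acceleration: float) -> bool:
    """Return True only when the movement exceeds Th1 and the vibration exceeds Th2.

    When the movement does not exceed Th1, the positional change is treated as
    unintended; when the vibration does not exceed Th2, no tap is reported in this
    operation example.
    """
    return trajectory_length > TH1_MIN_MOVEMENT and peak_abs_acceleration > TH2_MIN_ACCEL
```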
Upon detection of the tap operation, the generation unit 32 transmits a packet for notifying the identifier of the finger performing the tap operation to the terminal 10. In the example of G13, the finger (index finger) having the smallest X value on the XZ plane among the observed fingers is tapped, and the identifier f1 is associated with the finger. Therefore, the generation unit 32 transmits a packet notifying that the tap operation is performed with the finger f1 to the terminal 10 through the transmission unit 22.
Upon receipt of the packet notifying that the tap operation is performed with the finger f1 from the input device 20, the terminal 10 determines that Time button is selected, since Time button is associated with the finger f1. Therefore, the terminal 10 performs the same processing as that when Time button is selected from the input device included in the terminal 10 itself.
When a tap operation is performed with another finger, again, the input device 20 notifies the terminal 10 of the identifier of the finger whose tap operation is detected. Thus, the terminal 10 performs the same processing as that when a button associated with the finger whose tap operation is performed is selected. For example, in the case of G15, the generation unit 32 specifies the execution of a tap operation with the finger f2, and notifies the terminal 10 of the execution of the tap operation with the finger f2. The terminal 10 performs processing when Weather button is selected, since Weather button is associated with the finger f2 (G16). Moreover, in the case of G17, the generation unit 32 specifies the execution of a tap operation with the finger f3, and notifies the terminal 10 of the execution of the tap operation with the finger f3. The terminal 10 performs processing when GPS button is selected, since GPS button is associated with the finger f3 (G18). Furthermore, in the case of G19, the generation unit 32 specifies the execution of a tap operation with the finger f4, and notifies the terminal 10 of the execution of the tap operation with the finger f4. The terminal 10 performs processing when NEXT button is selected, since NEXT button is associated with the finger f4 (G20).
The projector unit 25 projects an infrared light pattern onto the finger of the user wearing the input device 20 (Step S12). The image capture unit 26 periodically captures the image of the infrared light pattern reflected on the finger, and outputs the obtained image data to the generation unit 32 (Step S13). The generation unit 32 specifies the position of each finger at the XYZ coordinates by analyzing the image data (Step S14). Note that the "position of each finger at the XYZ coordinates" is a combination of the position of each finger on the XZ plane and the distance in the depth direction. The generation unit 32 calculates the amount of positional change in each finger from a difference between the previous position of the finger at the XYZ coordinates and the position obtained by the analysis (Step S15). The measurement unit 31 measures the amount of vibration (Step S16). The generation unit 32 determines whether or not there is a finger whose positional change amount exceeds the threshold Th1 (Step S17). When there is no finger whose positional change amount exceeds the threshold Th1, the processing returns to Step S14 (No in Step S17). On the other hand, when the positional change amount exceeds the threshold Th1, the generation unit 32 determines whether or not the amplitude of vibration exceeds a threshold Th2 (Yes in Step S17, Step S18). When the amplitude of vibration does not exceed the threshold Th2, the processing returns to Step S14 (No in Step S18).
On the other hand, when the amplitude of vibration exceeds the threshold Th2, the generation unit 32 determines that a tap operation is executed by the finger whose positional change exceeds the threshold Th1, and transmits a packet notifying the finger whose tap operation is executed to the terminal 10 (Yes in Step S18, Step S19). Upon receipt of the packet from the input device 20, the terminal 10 determines that the region associated with the finger whose tap operation is executed is selected, and performs processing corresponding to the determined region (Step S20).
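On the terminal 10 side, the first operation example reduces to a lookup from the notified finger identifier to the region associated with it. The sketch below assumes the association described for G12 (f1: Time button, f2: Weather button, f3: GPS button, f4: NEXT button); the handler functions are placeholders for the terminal's actual processing.

```python
from typing import Callable, Dict

def handle_time() -> None:
    print("Time button selected")       # placeholder for the terminal's Time processing

def handle_weather() -> None:
    print("Weather button selected")

def handle_gps() -> None:
    print("GPS button selected")

def handle_next() -> None:
    print("NEXT button selected")

# Association between the fingers detected by the input device 20 and the selectable regions.
FINGER_TO_REGION: Dict[str, Callable[[], None]] = {
    "f1": handle_time,
    "f2": handle_weather,
    "f3": handle_gps,
    "f4": handle_next,
}

def on_tap_packet(finger_id: str) -> None:
    """Step S20: treat the region associated with the tapped finger as selected."""
    handler = FINGER_TO_REGION.get(finger_id)
    if handler is not None:
        handler()

on_tap_packet("f2")  # prints "Weather button selected", as in G15 and G16
```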
As described above, according to the first operation example, the finger detected by the input device 20 is associated with the region displayed on the display of the terminal 10. Moreover, when notified of the finger whose tap operation is detected, the terminal 10 performs the same processing as that when the region associated with the finger whose tap operation is detected is selected. Thus, the user may easily perform input processing to the terminal 10.
Furthermore, when the length of the trajectory obtained by the movement of the fingers does not exceed the predetermined threshold Th1, the input device 20 determines that the position of the user's finger has changed without the user intending an operation. Thus, the input device 20 may reduce input processing that is not intended by the user, for example, when the user is using the input device 20 in a crowded place and something hits the user's hand, or when the user starts to perform an operation but then stops.
Moreover, when the input device 20 is used, the user only has to wear one device, as described above.
In a second operation example, description is given of an example of processing when each of selectable regions is not associated with the finger detected by the input device 20 on the screen displaying an image including the selectable regions.
Thereafter, it is assumed that, as illustrated in G34, the user performs a tap operation with his/her index finger. Then, the execution of the tap operation with the user's finger f1 is detected, using the same method as that described in the first operation example, by the processing by the projector unit 25, the image capture unit 26, and the generation unit 32. G35 illustrates the positions of the user's fingers f1 to f4 on the XY plane when the tap operation is executed. In the infrared light pattern illustrated in G34, the size of the dots does not significantly vary with the finger. Thus, the generation unit 32 determines, as illustrated in G35, that the distances of the fingers f1 to f4 from the image capture unit 26 are almost the same. The generation unit 32 transmits the execution of the tap operation with the finger f1, the identifier of the finger whose tap operation is executed, and the coordinates of the execution of the tap operation to the terminal 10.
When notified of the execution of the tap operation with one finger from the input device 20, the terminal 10 displays a pointer α at a predetermined position on the screen.
G37 is an example of the positions of the user's fingers on the XZ plane and the infrared light pattern reflected on the user's fingers when the user moves his/her index finger to the right after the processing of G34. G38 illustrates the positions of the user's fingers f1 to f4 on the XY plane when the user moves his/her index finger to the right. When the infrared light pattern illustrated in G37 is obtained, it is assumed that the measurement unit 31 does not observe vibration in which the absolute value of acceleration exceeds the threshold Th2. Then, the generation unit 32 recognizes that the finger f1 slides as illustrated in G35 to G38, and notifies the terminal 10 of the coordinates at which the finger f1 is positioned after the movement. Note that the generation unit 32 may periodically notify the terminal 10 of the coordinates of the finger f1 during the movement of the finger f1. Hereinafter, the operation in which the finger slides as illustrated in G37 may be described as a swipe operation. It may be said, from the characteristics of its trajectory, that the swipe operation is input involving a positional change on the image displayed on the screen of the terminal 10.
Upon receipt of a change in the coordinates of the finger f1 from the input device 20, the terminal 10 obtains the trajectory of the positional change in the finger f1. Furthermore, the terminal 10 moves the pointer α based on the trajectory obtained by the input device 20. Here, the trajectory of the pointer α is calculated as the trajectory at the coordinates used for display on the screen of the terminal 10 by multiplying the trajectory of the finger f1 notified from the input device 20 by a preset multiplying factor. G39 illustrates an example of the movement of the pointer α.
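The pointer movement described here is simply the finger trajectory scaled by a preset multiplying factor. A minimal sketch; the factor value and the function names are assumptions.

```python
from typing import List, Tuple

POINTER_GAIN = 3.0   # assumed preset multiplying factor between finger movement and pointer movement

def pointer_trajectory(
    finger_points: List[Tuple[float, float]],   # successive coordinates of the finger f1 notified by the input device 20
    pointer_start: Tuple[float, float],          # current position of the pointer on the screen
) -> List[Tuple[float, float]]:
    """Convert the notified finger trajectory into a pointer trajectory on the screen."""
    if not finger_points:
        return [pointer_start]
    x0, y0 = finger_points[0]
    px, py = pointer_start
    return [(px + POINTER_GAIN * (x - x0), py + POINTER_GAIN * (y - y0)) for x, y in finger_points]
```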
G41 illustrates the positions of the user's fingers on the XZ plane and the infrared light pattern reflected on the user's fingers when the user further moves his/her index finger while the pointer α is displayed. The generation unit 32 recognizes the movement of the finger f1 from the change in the infrared light pattern, and notifies the terminal 10 of the coordinates of the tip of the finger f1.
Upon receipt of a change in the coordinates of the tip of the finger f1 from the input device 20, the terminal 10 obtains the trajectory of the pointer α from the trajectory of the change in the coordinates of the finger f1 by performing the same processing as that described taking G39 as an example. The terminal 10 changes the position of the pointer α along the obtained trajectory. Thus, as illustrated in G43, the pointer α moves on the screen.
Thereafter, it is assumed that the user performs a tap operation with his/her index finger. As described with reference to G34, the execution of the tap operation with the user's finger f1 is detected. Moreover, in this case, the positions of the respective fingers when the tap operation is performed are not changed from those described with reference to G41. Thus, the infrared light pattern projected onto the user's fingers is as illustrated in G44. G45 illustrates the positions of the user's fingers f1 to f4 on the XY plane when the tap operation is executed. The generation unit 32 transmits the execution of the tap operation with the finger f1, the identifier of the finger whose tap operation is executed, and the coordinates of the execution of the tap operation to the terminal 10.
Upon receipt of the execution of the tap operation with the finger f1 from the input device 20, the terminal 10 determines whether or not the position of the pointer α overlaps with a selectable region on the screen. When the position of the pointer α overlaps with a selectable region, the terminal 10 determines that the region is selected and performs processing associated with the region.
The terminal 10 displays a screen including selectable regions such as icons on the display (Step S31). The processing of Steps S32 to S36 is the same as that of Steps S12 to S16 described with reference to the first operation example. When the positional change in a finger exceeds the threshold Th1 (Step S37) and the amplitude of vibration exceeds the threshold Th2 (Step S38), the generation unit 32 determines that a tap operation is executed and transmits a packet notifying the execution of the tap operation to the terminal 10 (Step S39).
Upon receipt of a packet notifying the execution of the tap operation from the input device 20, the terminal 10 determines whether or not a pointer is being displayed on the screen (Step S40). When the pointer is not displayed on the screen, the terminal 10 displays the pointer on the screen (No in Step S40, Step S41). Thereafter, the terminal 10 determines whether or not the position of the pointer overlaps with the selectable region (Step S42). When the position of the pointer does not overlap with the selectable region, the processing of Step S34 and thereafter is repeated (No in Step S42). When the position of the pointer overlaps with the selectable region, the terminal 10 determines that the region displayed at the position overlapping with the pointer is selected, and performs processing associated with the region determined to be selected (Step S43).
When the amplitude of vibration does not exceed the threshold Th2 in Step S38, the generation unit 32 determines that a swipe operation is performed (No in Step S38, Step S44). The generation unit 32 notifies the terminal 10 of the execution of the swipe operation together with the position of the finger where the operation is observed. The terminal 10 determines whether or not a pointer is being displayed (Step S45). When the pointer is not displayed, the terminal 10 determines that no operation to the terminal 10 is started by the user, and the processing of Step S34 and thereafter is repeated (No in Step S45). When the pointer is displayed on the screen, the terminal 10 moves the coordinates at which the pointer is displayed on the screen, according to the positional change in the finger (Yes in Step S45, Step S46).
As described above, according to the second operation example, the user may select the selectable region on the screen even though the finger detected by the input device 20 is not associated with the region displayed on the screen of the terminal 10. Moreover, the user may easily perform input processing to the terminal 10. Moreover, in the second operation example, again, it is determined that no input processing is performed when the trajectory obtained by the movement of the user's finger does not exceed the predetermined threshold Th1, as in the case of the first operation example. Thus, input processing not intended by the user attributable to a positional shift in the user's finger and the like may be reduced.
In a third operation example, description is given of an example of processing when the display of the screen is increased or reduced (pinched out or pinched in) on the screen displaying an image. In the third operation example, in order to reduce erroneous operations, the increase or reduction processing is performed only when an operation is performed with a finger stored by the input device 20 as the finger to be used by the user to increase or reduce the display of the screen. Note that, in the third operation example, again, the identifier of each finger observed by the input device 20 is represented as a combination of the letter f and the number of the observed finger. Moreover, the number included in the identifier of the finger is generated such that a finger having a smaller average of X coordinates among the observed fingers is associated with a smaller value.
In the third operation example, it is assumed that an image is displayed on the screen of the terminal 10, and that the input device 20 stores, in the storage unit 40, that the processing of increasing or reducing the display of the screen is performed with the fingers f1 and f2.
Thereafter, it is assumed that the user performs a tap operation with his/her index finger and middle finger, as illustrated in G54. Then, it is specified that a positional change in the Z-axis direction exceeds the threshold Th1 almost simultaneously at the user's fingers f1 and f2, using the same method as that described in the first operation example, by the processing by the projector unit 25, the image capture unit 26, and the generation unit 32. Furthermore, it is assumed that an acceleration whose absolute value exceeds the threshold Th2 is observed by the measurement unit 31 at the time when the positional changes in the user's fingers f1 and f2 are observed. Then, the generation unit 32 determines that the tap operations are performed with the user's fingers f1 and f2. Note that, in the infrared light pattern illustrated in G54, the size of the dots does not significantly vary with the finger. Thus, the generation unit 32 determines that the distances of the fingers f1 to f4 from the image capture unit 26 are almost the same. G55 illustrates a specific example of the positions of the user's fingers f1 to f4 on the XY plane when the tap operations are executed. The generation unit 32 transmits the execution of the tap operations, the identifiers of the fingers whose tap operations are executed, and the coordinates of the execution of the tap operations to the terminal 10. Thus, in the example of G54 and G55, the information of the tap operations executed with the fingers f1 and f2 is notified to the terminal 10.
When notified of the execution of the tap operations with the two fingers from the input device 20, the terminal 10 displays a pointer α and a pointer β on the screen as illustrated in G56, and sets the terminal 10 in an edit mode.
G57 illustrates the positions of the user's fingers on the XZ plane and the infrared light pattern reflected on the user's fingers when the user moves his/her index finger and middle finger away from each other after the processing of G54, and G58 illustrates the positions of the user's fingers f1 to f4 on the XY plane at this time. The generation unit 32 recognizes the movements of the fingers f1 and f2 from the change in the infrared light pattern, and notifies the terminal 10 of the coordinates of the fingers f1 and f2.
Upon receipt of changes in the coordinates of the fingers f1 and f2 from the input device 20, the terminal 10 uses the coordinates of the fingers f1 and f2 newly notified from the input device 20 to calculate the distance between the pointers α and β. Then, the terminal 10 changes the distance between the pointers α and β in response to the obtained calculation result, and at the same time, changes the display magnification of the image that is being displayed, according to a change in the distance between the pointers. For example, when notified of the fact that the fingers f1 and f2 are moving away from each other as illustrated in G55 to G58, the terminal 10 increases the display magnification of the image that is being displayed on the screen, with the display positions of the pointers α and β as the center as illustrated in G59.
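One possible implementation of the magnification change is to scale the current magnification by the ratio of the new pointer distance to the previous pointer distance. The exact rule is not specified in the embodiment, so the sketch below is an assumption.

```python
import math
from typing import Tuple

def updated_magnification(
    magnification: float,
    prev_alpha: Tuple[float, float], prev_beta: Tuple[float, float],
    new_alpha: Tuple[float, float], new_beta: Tuple[float, float],
) -> float:
    """Scale the display magnification by the change in the distance between the pointers α and β.

    Moving the pointers apart (pinch out) increases the magnification; moving them
    closer together (pinch in) reduces it.
    """
    prev_dist = math.dist(prev_alpha, prev_beta)
    new_dist = math.dist(new_alpha, new_beta)
    if prev_dist == 0:
        return magnification
    return magnification * (new_dist / prev_dist)

# Example: the pointers move from 40 px apart to 60 px apart, so the magnification becomes 1.5x.
print(updated_magnification(1.0, (100, 100), (140, 100), (90, 100), (150, 100)))
```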
Thereafter, it is assumed that the user performs tap operations with both of the index finger and the middle finger. The execution of the tap operations with the user's fingers f1 and f2 is detected as described with reference to G51. Moreover, in this case, the positions of the respective fingers when the tap operations are performed are not changed from those described with reference to G57 and G58. Thus, the infrared light pattern projected onto the user's fingers is as illustrated in G60. G61 illustrates the positions of the user's fingers f1 to f4 on the XY plane when the tap operations are executed. The generation unit 32 transmits the execution of the tap operations, the identifiers of the fingers whose tap operations are executed, and the coordinates of the execution of the tap operations to the terminal 10.
When notified of the execution of the tap operations with the fingers f1 and f2 from the input device 20, the terminal 10 determines that the processing of changing the magnification of the displayed image with the pointers α and β is completed. Therefore, the terminal 10 sets the size of the image to that currently displayed. When determining that the size of the display is set, the terminal 10 finishes the edit mode. Furthermore, the terminal 10 finishes the display of the pointers α and β used to change the size of the display. Thus, an image illustrated in G62 is displayed on the screen of the terminal 10.
Upon receipt of a packet notifying the execution of the tap operations from the input device 20, the terminal 10 determines whether or not the edit mode is set (Step S60). When notified of the tap operations with the fingers f1 and f2 from the input device 20 while being set in the edit mode, the terminal 10 finishes the edit mode and finishes the display of the pointers on the screen before returning to Step S54 (Yes in Step S60, Step S61). On the other hand, when notified of the tap operations with the fingers f1 and f2 from the input device 20 while not being set in the edit mode, the terminal 10 enters the edit mode and displays the pointers corresponding to the fingers f1 and f2 on the screen (No in Step S60, Step S62). Thereafter, the processing of Step S54 and thereafter is repeated.
When the input device 20 determines in Step S58 that the amplitude of vibration does not exceed the threshold Th2, the generation unit 32 notifies the terminal 10 of the coordinates of the fingers f1 and f2 (No in Step S58). The terminal 10 uses the packet received from the input device 20 to acquire the coordinates of the fingers f1 and f2. When notified of the coordinates from the input device 20, the terminal 10 in the edit mode moves the pointers according to the notified coordinates, changes the display magnification of the image that is being displayed according to the distance between the pointers, and then returns to Step S54 (No in Step S58, Step S64).
When it is determined in Step S57 that the positional changes at the fingers f1 and f2 do not exceed the threshold Th1, the generation unit 32 determines whether or not the positional change at either of the fingers f1 and f2 exceeds a threshold Th3 and the amplitude of vibration also exceeds the threshold Th2 (Step S63). Here, the threshold Th3 is the magnitude of a positional change obtained by a tap operation with one finger. When the positional change at either of the fingers f1 and f2 exceeds the threshold Th3 and the amplitude of vibration also exceeds the threshold Th2, the input device 20 notifies the terminal 10 of the end of the processing of changing the magnification, and the terminal 10 finishes the edit processing (Yes in Step S63). Note that the fact that the positional change at either of the fingers f1 and f2 exceeds the threshold Th3 and the amplitude of vibration also exceeds the threshold Th2 means that a tap operation is performed with either of the fingers f1 and f2. Therefore, in this flowchart, the edit processing is also finished when a tap operation is performed with only one of the fingers f1 and f2.
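Putting the branches described above together, the terminal-side handling of the edit mode in the third operation example can be sketched as a small state machine. The class layout and method names are illustrative; only the mode transitions and the magnification update follow the described flow.

```python
class EditModeController:
    """Sketch of the edit-mode handling in the third operation example."""

    def __init__(self) -> None:
        self.edit_mode = False
        self.magnification = 1.0

    def on_two_finger_tap(self) -> None:
        """A tap with both f1 and f2 toggles the edit mode (Steps S60 to S62)."""
        self.edit_mode = not self.edit_mode

    def on_one_finger_tap(self) -> None:
        """A tap with only one of f1 and f2 finishes the edit processing (Step S63)."""
        self.edit_mode = False

    def on_two_finger_move(self, distance_ratio: float) -> None:
        """While in the edit mode, a movement of f1 and f2 changes the magnification (Step S64)."""
        if self.edit_mode:
            self.magnification *= distance_ratio

ctrl = EditModeController()
ctrl.on_two_finger_tap()       # enter the edit mode; the pointers α and β are displayed
ctrl.on_two_finger_move(1.5)   # the fingers move apart, so the image is enlarged
ctrl.on_two_finger_tap()       # a second two-finger tap fixes the size and finishes the edit mode
print(ctrl.edit_mode, ctrl.magnification)   # False 1.5
```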
As described above, according to the third operation example, the image displayed on the screen of the terminal 10 is increased or reduced through input from the input device 20. Furthermore, since the finger used to increase or reduce the image is the finger f1 (index finger) or f2 (middle finger), the input device 20 generates no input signal to the terminal 10 even when an operation is performed with any of the other fingers. Therefore, when the processing of the terminal 10 is performed using the input device 20, the display magnification of the image does not change in the terminal 10 even if an operation is performed with the finger other than the index finger and the middle finger. Therefore, the use of the third operation example is likely to reduce erroneous operations. For example, in such a case as where input processing is performed using the input device 20 in a crowded transportation facility, the user's finger may be moved against the user's intention. In such a case, again, operations not intended by the user may be reduced if the finger other than the finger used to increase or reduce the image is moving.
Furthermore, the method for detecting the finger with the input device 20 may be changed depending on whether the hand wearing the input device 20 is the right hand or the left hand.
In a fourth operation example, description is given of an example of processing when an image displayed on the screen is rotated. In the fourth operation example, description is given of the case where, in order to reduce erroneous operations, rotation processing is performed when an operation is performed with a finger stored in the input device 20 as the finger to be used by the user to rotate the image displayed on the screen. Note that, in the fourth operation example, again, the same identifiers as those in the third operation example are used as the identifiers of the respective fingers. In the following example, it is assumed that the input device 20 stores, in the storage unit 40, that the processing of rotating the image displayed on the screen is performed with the fingers f1 and f2.
Thereafter, it is assumed that the user performs tap operations with his/her index finger and middle finger, as illustrated in G74. G75 illustrates the positions of the tips of the fingers on the XY plane when the user performs the tap operations as illustrated in G74. The detection of the tap operations by the input device 20 and the notification from the input device 20 to the terminal 10 are the same as the processing described with reference to G54 to G57. When notified of the execution of the tap operations with the fingers f1 and f2, the terminal 10 displays the pointers α and β on the screen as illustrated in G76 and sets the terminal 10 in the edit mode.
It is assumed that, after the processing of G74, the user moves his/her index finger and middle finger such that a line connecting the index finger and the middle finger before movement intersects with a line connecting the index finger and the middle finger after movement. G77 illustrates the positions of the user's fingers on the XZ plane and the infrared light pattern reflected on the user's fingers after this movement, and G78 illustrates the positions of the user's fingers f1 to f4 on the XY plane. The generation unit 32 notifies the terminal 10 of the coordinates of the fingers f1 and f2 obtained from the change in the infrared light pattern.
Upon receipt of changes in the coordinates of the fingers f1 and f2 from the input device 20, the terminal 10 uses the coordinates of the fingers f1 and f2 newly notified from the input device 20 to change the positions of the pointers α and β. Thus, the positions of the pointers α and β change as illustrated in G79. Furthermore, when changing the positions of the pointers α and β, the terminal 10 rotates the displayed image by the angle θ formed by the line connecting the pointers after movement and the line connecting the pointers before movement. The image displayed on the screen at the point of G79 is rotated by θ from the image illustrated in G76.
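The rotation angle θ is the signed angle between the line connecting the pointers before the movement and the line connecting them after the movement. A minimal sketch using atan2; the function name is an assumption.

```python
import math
from typing import Tuple

def rotation_angle(
    alpha_before: Tuple[float, float], beta_before: Tuple[float, float],
    alpha_after: Tuple[float, float], beta_after: Tuple[float, float],
) -> float:
    """Signed angle (radians) between the pointer line before and after the movement."""
    bx, by = beta_before[0] - alpha_before[0], beta_before[1] - alpha_before[1]
    ax, ay = beta_after[0] - alpha_after[0], beta_after[1] - alpha_after[1]
    angle = math.atan2(ay, ax) - math.atan2(by, bx)
    # Normalize into (-pi, pi] so the image is rotated the shorter way around.
    return math.atan2(math.sin(angle), math.cos(angle))

# Example: the line from the pointer α to the pointer β turns from horizontal to 45 degrees.
theta = rotation_angle((0, 0), (10, 0), (0, 0), (10, 10))
print(math.degrees(theta))   # 45.0
```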
Thereafter, it is assumed that the user performs tap operations with both of his/her index finger and middle finger. The execution of the tap operations with the user's fingers f1 and f2 is detected as described with reference to G74. Moreover, in this case, it is assumed that the positions of the respective fingers when the tap operations are performed are not changed from those described with reference to G77 and G78. In this case, the infrared light pattern projected onto the user's fingers is as illustrated in G80. G81 illustrates the positions of the user's fingers f1 to f4 on the XY plane when the tap operation is executed. The generation unit 32 transmits the execution of the tap operations, the identifiers of the fingers whose tap operations are executed, and the coordinates of the execution of the tap operations to the terminal 10.
When notified of the execution of the tap operations with the fingers f1 and f2 from the input device 20, the terminal 10 determines that the processing of rotating the displayed image with the pointers α and β is completed. Therefore, the terminal 10 sets the display angle of the image to the current display angle. As it is determined that the display angle is set, the terminal 10 ends the edit mode. Furthermore, the terminal 10 ends the display of the pointers α and β used to change the display angle. As a result, an image illustrated in G82 is displayed on the screen of the terminal 10.
As described above, according to the fourth operation example, the image displayed on the screen of the terminal 10 is rotated through input from the input device 20. Furthermore, since the fingers used to perform the rotation are the fingers f1 (index finger) and f2 (middle finger), the input device 20 transmits no input signal to the terminal 10 even when an operation is performed with the other fingers, as in the case of the third operation example. Therefore, the display angle of the image is not changed by the movement of the fingers other than the fingers f1 and f2. Thus, erroneous operations are likely to be reduced. Furthermore, as in the case of the third operation example, the method for detecting the finger with the input device 20 may be changed depending on whether the hand wearing the input device 20 is the right hand or left hand.
In a fifth operation example, description is given of an example of processing when the display position of the image displayed on the screen is changed.
Thereafter, it is assumed that the user performs a tap operation with his/her index finger, as illustrated in G94. Then, the execution of the tap operation with the user's finger f1 is detected, using the same method as that described in the first operation example, by the processing by the projector unit 25, the image capture unit 26, and the generation unit 32. G95 illustrates the positions of the user's fingers f1 to f4 on the XY plane when the tap operation is executed. The generation unit 32 transmits the execution of the tap operation with the finger f1, the identifier of the finger whose tap operation is executed, and the coordinates of the execution of the tap operation to the terminal 10. When notified of the execution of the tap operation with one finger from the input device, the terminal 10 displays a pointer α at a preset position on the screen as illustrated in G96, and shifts to an image scroll mode. Hereinafter, it is assumed that the image scroll mode is a mode for changing the display position of the image. The terminal 10 stores the coordinates notified from the input device 20.
G97 is an example of the positions of the user's fingers on the XZ plane and the infrared light pattern reflected on the user's fingers when the user moves his/her index finger to the right after the processing of G94. G98 illustrates the positions of the user's fingers f1 to f4 on the XY plane when the user moves his/her index finger to the right. Note that it is assumed that, when the infrared light pattern illustrated in G97 is obtained, the measurement unit 31 does not observe vibration in which the absolute value of acceleration exceeds the threshold Th2. Then, the generation unit 32 recognizes that the finger f1 slides as illustrated in G97, and notifies the terminal 10 of the coordinates at which the finger f1 is positioned.
Upon receipt of a change in the coordinates of the finger f1 from the input device 20, the terminal 10 obtains the trajectory of the positional change in the finger f1. Moreover, the terminal 10 moves the pointer α based on the obtained trajectory. A method for obtaining the trajectory of the pointer α is the same as that in the second operation example. Furthermore, the terminal 10 also moves the displayed image itself according to the change in the display position of the pointer α. Thus, by the processing illustrated in G97 and G98, the display position of the image itself is moved to the right together with the pointer, and the display of the screen changes from G96 to G99.
G111 illustrates the positions of the user's fingers on the XZ plane and the infrared light pattern reflected on the user's fingers when the user further moves his/her index finger while the terminal 10 is in the image scroll mode. The generation unit 32 notifies the terminal 10 of the coordinates at which the finger f1 is positioned.
Upon receipt of a change in the coordinates of the finger f1 from the input device 20, the terminal 10 obtains the trajectory of the positional change in the finger f1. Moreover, the terminal 10 moves the pointer α and the display position of the image based on the trajectory obtained by the input device 20. The processing in this event is the same as that described with reference to G97 to G99. Thus, a screen after the pointer and the image are moved is as illustrated in G113.
Thereafter, it is assumed that the user performs a tap operation with his/her index finger. In this case, the execution of the tap operation with the user's finger f1 is detected, as described with reference to G94. When notified of the execution of the tap operation while being set in the image scroll mode, the terminal 10 determines that the processing of changing the display position of the image is completed, sets the display position of the image to the current position, finishes the image scroll mode, and finishes the display of the pointer α.
When the amplitude of vibration does not exceed the threshold Th2 in Step S98, the generation unit 32 determines that a swipe operation is performed (Step S103). The generation unit 32 notifies the terminal 10 of the position of the finger where the operation is observed and the execution of the swipe operation. The terminal 10 determines whether or not the terminal is in the image scroll mode (Step S104). When not in the image scroll mode, the terminal 10 determines that the operation is not started, and returns to Step S94 (No in Step S104). When set in the image scroll mode, the terminal 10 moves the display position of the image and the display coordinates of the pointer on the screen according to the positional change in the finger (Yes in Step S104, Step S105).
As described above, according to the fifth operation example, the display position of the image displayed on the screen of the terminal 10 may be easily changed using the input device 20.
In a sixth operation example, description is given of an example where the generation unit 32 reduces erroneous operations by using the fact that the pattern of change in the detection position of the finger in the Z-axis direction varies with the shape of the finger in operation.
On the other hand, when the position of the finger is in the state indicated by P2, the distance between the position where the finger is observed and the input device 20 gradually increases from the joint of the finger toward the finger tip. Therefore, the smaller the height of the finger (Z-axis value) to be observed, the smaller the Y-coordinate value at the detection position of the finger. Thus, when changes in the detection position of the finger are plotted with the Y-axis as the vertical axis and the Z-axis as the horizontal axis, the Y-coordinate value in the region where the finger is observed is directly proportional to the Z-axis value, as illustrated in G121.
Next, with reference to AC2, description is given of the shape of the finger when a tap operation is performed and a change in Y-coordinate value at the detection position of the finger. When the tap operation is performed, the finger is lifted from a position indicated by P4 to a position indicated by P3, and then returned to the position of P4 again. Thus, from the beginning to the end of the tap operation, there is no significant difference in distance of the position where the finger is observed from the input device 20 between the joint of the finger and the finger tip, as in the case where the position of the finger is at P1. Moreover, with the shape of the finger when the tap operation is performed, the Y-coordinate value at the detection position of the finger does not change significantly even though the height of the spot of the finger to be observed changes. Thus, when changes in the detection position of the finger are plotted with the Y-axis as the vertical axis and the Z-axis as the horizontal axis, G122 is obtained.
Therefore, when the Y-coordinate values at the detection position of the finger are plotted as a function of the height and a period in which the Y-axis value is proportional to the Z-axis value, as illustrated in G121, appears during the operation, the generation unit 32 may determine that the user slides his/her finger.
On the other hand, it is assumed that, when the Y-coordinate values at the detection position of the finger are plotted as a function of the height, the period in which the Y-axis value is proportional to the Z-axis value as illustrated in G121 does not appear during the operation, and only the plot illustrated in G122 is obtained. In this case, the generation unit 32 may determine that the user is performing a tap operation.
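One way to realize this determination is to fit the observed Y values against the detection heights (Z values) for each captured frame: a frame in which Y grows roughly in proportion to Z corresponds to the tilted, sliding shape of G121, whereas a flat relationship corresponds to the tap shape of G122. The least-squares slope and the slope threshold below are an interpretation introduced for illustration, not part of the embodiment.

```python
from typing import List, Sequence

SLOPE_THRESHOLD = 0.3   # assumed minimum Y-versus-Z slope regarded as the tilted (sliding) shape

def _slope(z_values: Sequence[float], y_values: Sequence[float]) -> float:
    """Least-squares slope of the Y values against the Z values for one captured frame."""
    n = len(z_values)
    mean_z = sum(z_values) / n
    mean_y = sum(y_values) / n
    num = sum((z - mean_z) * (y - mean_y) for z, y in zip(z_values, y_values))
    den = sum((z - mean_z) ** 2 for z in z_values)
    return num / den if den else 0.0

def looks_like_slide(frames: List[dict]) -> bool:
    """Return True if any frame during the operation shows the proportional shape of G121.

    Each frame is a dict with "z" and "y" lists: the detection heights and the
    Y-coordinate values observed for the finger at those heights.
    """
    return any(_slope(frame["z"], frame["y"]) > SLOPE_THRESHOLD for frame in frames)
```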
The determination processing of the sixth operation example may also be used together with any of the first to fifth operation examples described above. The probability of erroneous operations may be further reduced by combining the processing of determining the type of user operation based on a change in the shape of the user's finger.
For example, suppose that the generation unit 32 determines that a tap operation is performed because the length of the trajectory obtained by the movement of a certain finger exceeds the threshold Th1 and a vibration having an amplitude of a predetermined value or more is generated. In this event, the generation unit 32 further determines whether or not the relationship between the Y-coordinate value and the distance from the finger tip for the finger determined to be performing the tap operation is as illustrated in G122. When the relationship between the Y-coordinate value and the distance in the height direction from the finger tip for that finger is as illustrated in G122, the generation unit 32 confirms that the tap operation is performed, and generates an input signal corresponding to the tap operation.
On the other hand, when the relationship between the Y-coordinate value and the distance in the height direction from the finger tip for the finger determined to be performing the tap operation takes a shape different from that illustrated in G122, the generation unit 32 determines that there is a possibility that the user operation is erroneously recognized, and terminates the processing without generating any input signal.
Likewise, when determining whether or not a swipe operation is performed, the generation unit 32 may use the relationship between the Y-coordinate value and the distance in the height direction from the finger tip. For example, suppose that the generation unit 32 determines that a swipe operation is performed because the length of the trajectory obtained by the movement of a certain finger exceeds the threshold Th1 but a vibration having an amplitude of the predetermined value or more is not generated. In this event, the generation unit 32 determines whether or not the relationship between the Y-coordinate value and the distance in the height direction from the finger tip changes in a gradation pattern according to the distance from the finger tip during the user operation, as illustrated in G121. When the Y-coordinate value changes in the gradation pattern, the generation unit 32 confirms that the swipe operation is performed, and generates an input signal corresponding to the swipe operation.
On the other hand, when the Y-coordinate value is not changed in the gradation pattern, the state where the finger is tilted to the input device 20 is not generated during the operation. Therefore, no swipe operation is performed. Thus, the generation unit 32 determines that there is a possibility that a user operation is erroneously recognized, and thus terminates the processing without generating any input signal.
<Others>
The embodiment is not limited to the above; various modifications may be made. Some examples are described below.
For example, when the processor 101 included in the input device 20 has limited performance, the input device 20 may periodically notify the terminal 10 of the result of observation by the measurement unit 31 and the changes in coordinates obtained by the generation unit 32, without specifying the type of processing by the user. In this case, the terminal 10 specifies the type of the processing performed by the user, based on the information notified from the input device 20.
The terminal 10 configured to acquire an input signal from the input device 20 is not limited to a device of a shape that makes input difficult to perform, such as a wearable terminal and a watch-type terminal. For example, the terminal 10 may be a cell-phone terminal including a smartphone, a tablet, a computer, or the like. When the cell-phone terminal including the smartphone, the tablet, the computer, or the like is used as the terminal 10, there is an advantage that the user may operate the terminal 10 without smearing the surface thereof with skin oil on the finger tip or the like.
All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although the embodiment of the present invention has been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.