This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2011-213998, filed on Sep. 29, 2011; the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to a command issuing device, a method therefor and a computer program product.
User interfaces that realize user operation by combining picture projection techniques, imaging techniques, object recognition techniques and the like are conventionally known. For example, some of such user interfaces realize user operation by projecting a picture of a graphical user interface (GUI) on the palm of one hand of an operator, capturing a moving image of the palm, and recognizing that the operator is touching the GUI projected on the palm with a finger of the other hand on the basis of the captured moving image.
Such techniques as described above, however, require the operator to use both hands.
According to an embodiment, a command issuing device includes an acquiring unit configured to acquire a moving image by capturing a hand of an operator; a projection area recognizer configured to recognize a projection area of a projection finger in the moving image, the projection finger being one of the fingers onto which one of the pictures of a graphical user interface (GUI) is projected; a projector configured to project one of the pictures of the GUI onto the projection area; an operation area recognizer configured to recognize an operation area of an operation finger in the moving image, the operation finger being one of the fingers that is assigned to operate the GUI; a selection determining unit configured to measure an overlapping degree of the projection area and the operation area to determine whether or not the GUI is selected; and an issuing unit configured to issue a command associated with the GUI when it is determined that the GUI is selected.
Note that a thumb 23 of the hand 22 is an operation finger used for operation and an index finger 24-1, a middle finger 24-2, a third finger 24-3 and a little finger 24-4 of the hand 22 are projection fingers used for projection of GUI pictures in this embodiment, but the allocation to fingers is not limited thereto. In the example illustrated in
The acquiring unit 101 acquires a moving image obtained by capturing a hand of an operator. The acquiring unit 101 may be any device capable of capturing a moving image. Although it is assumed in this embodiment that the acquiring unit 101 is realized by a video camera, the acquiring unit 101 is not limited thereto. In this embodiment, the acquiring unit 101 can capture moving images in color at 60 frames per second, and has a sufficient angle of view for capturing the hand 22, a mechanism capable of automatically adjusting the focal length, and a distortion correction function for correcting distortion caused in a captured image by the lens.
It is assumed in this embodiment that color markers are attached on a tip portion of the thumb 23 of the operator 21 and on entire portions (portions from bases to tips) of the fingers 24 other than the thumb of the operator 21. It is assumed that the color markers are made of diffuse reflective materials that are not specular, each having a single color that can be distinguished from the surface of the hand. The color markers of the respective fingers have different colors from one another.
The extracting unit 102 performs image processing on the moving image acquired by the acquiring unit 101 to extract the projection fingers and the operation finger. Specifically, the extracting unit 102 applies a color filter on the moving image acquired by the acquiring unit 101 to assign nonzero values only to pixels with colors within a specific hue range and “0” to pixels with other colors and thereby extracts the projection fingers and the operation finger. It is assumed here that the colors of the color markers attached on the respective fingers of the operator 21 are known and that the extracting unit 102 holds in advance luminance distribution of the color markers in the moving image acquired by the acquiring unit 101.
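As a purely illustrative sketch (not the embodiment's actual implementation), such a hue-based filter might look as follows with OpenCV; the marker hue ranges are hypothetical placeholders.

    import cv2
    import numpy as np

    # Hypothetical hue ranges (OpenCV hue runs from 0 to 179) for the color markers.
    MARKER_HUE_RANGES = {
        "thumb": (100, 110),   # e.g. a blue marker on the operation finger
        "index": (40, 50),     # e.g. a green marker on a projection finger
    }

    def extract_marker(frame_bgr, hue_range, min_sat=80, min_val=60):
        """Return a mask that is nonzero only for pixels whose hue lies in hue_range."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        lower = np.array([hue_range[0], min_sat, min_val], dtype=np.uint8)
        upper = np.array([hue_range[1], 255, 255], dtype=np.uint8)
        mask = cv2.inRange(hsv, lower, upper)   # 255 inside the hue range, 0 elsewhere
        return cv2.medianBlur(mask, 5)          # suppress isolated noise pixels

    # Usage: masks = {name: extract_marker(frame, r) for name, r in MARKER_HUE_RANGES.items()}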
As described above, the technique of extracting the operation finger and the projection fingers by attaching the color markers on the respective fingers of the operator 21 is described in this embodiment, but the technique for extracting the operation finger and the projection fingers is not limited thereto. For example, the operation finger and the projection fingers may be extracted by measuring distance distribution from the command issuing device 1 to the hand 22 of the operator 21 using a range finder or the like employing a laser ranging system and applying known shape information of the hand such as the length and the thickness of the fingers. For measuring the distance distribution from the command issuing device 1 to the hand 22 of the operator 21, a technique such as stereo matching using a plurality of cameras can be used. Moreover, if finger areas are detected by using a detector for image recognition based on Haar-Like features to extract the operation finger and the projection fingers, for example, the color markers need not be attached on the respective fingers of the operator 21.
The projection area recognizer 103 recognizes projection areas of the projection fingers from the moving image acquired by the acquiring unit 101. Specifically, the projection area recognizer 103 extracts shape feature values from the bases to the tips of the projection fingers from the image of the projection fingers extracted by the extracting unit 102, and recognizes areas represented by the extracted shape feature values as projection areas. Details of the technique for recognizing the projection areas will be described later.
The GUI information storage unit 104 stores therein information on the GUIs projected on the projection areas of the projection fingers.
The GUI information is in the form of a table associating a finger ID, a display form, displayed information, a display attribute and a command ID. The “finger ID” is an index for identifying the projection fingers. For example, a finger ID “1” represents the index finger 24-1, a finger ID “2” represents the middle finger 24-2, a finger ID “3” represents the third finger 24-3, and a finger ID “4” represents the little finger 24-4. Note that information in the case where the finger ID is “2” to “4” is omitted in the examples illustrated in
Although an example in which the GUI information associates a projection finger uniquely with a GUI element is described in this embodiment, the association is not limited thereto. For example, the number of GUI elements may be smaller than the number of the projection fingers depending on the information projected by the command issuing device 1 or the device to be controlled by the command issuing device 1. In addition, even if the number of GUI elements is equal to the number of the projection fingers, only some of the projection fingers may be recognized because fingers may be obscured by one another. The GUI information may therefore not associate a projection finger uniquely with a GUI element.
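For illustration, the GUI information can be pictured as a small record per finger ID; the field values below are hypothetical placeholders, not the entries of the embodiment's tables.

    from dataclasses import dataclass

    @dataclass
    class GuiEntry:
        finger_id: int          # 1: index, 2: middle, 3: third, 4: little finger
        display_form: str       # e.g. "text" or "image"
        displayed_info: str     # character string or image reference to be drawn
        display_attribute: str  # e.g. font or color attributes
        command_id: int         # command issued when this GUI is selected

    # Hypothetical entries; a real table is held in the GUI information storage
    # unit 104 and may leave some projection fingers without a GUI element.
    gui_information = {
        1: GuiEntry(1, "text", "ILLUMINATION ON", "white-on-black", 101),
        2: GuiEntry(2, "text", "ILLUMINATION OFF", "white-on-black", 102),
    }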
The managing unit 105 manages the GUI projected on projection areas and commands issued when the GUI is selected. The managing unit 105 includes an issuing unit 105A. Details of the issuing unit 105A will be described later.
The position association table storage unit 106 stores therein a position association table associating a coordinate position on an imaging plane of the moving image acquired by the acquiring unit 101 with a coordinate position on a projection plane of the projector 108. When the capturing angle of view and the optical axis of the acquiring unit 101 are not coincident with the projection angle of view and the optical axis of the projector 108, the capturing range of the acquiring unit 101 and the projecting range of the projector 108 are not coincident with each other. Accordingly, the position association table storage unit 106 holds association of positions between the imaging plane of the acquiring unit 101 and the projection plane of the projector 108 as described above.
In this embodiment, the projector 108 projects a pattern to a predetermined position expressed by two-dimensional coordinates on the projection plane, the acquiring unit 101 images the pattern, and the position on the projection plane and the position on the imaging plane are associated to obtain the position association table.
In this embodiment, the position association table is used for the process of transforming the shape of a projected picture performed by the projected picture generating unit 107. When the projected picture generating unit 107 uses perspective projection-based transformation for the process of transforming the shape of a projected picture, the position association table need only hold at least four associations, each pairing a position on the imaging plane with a position on the projection plane. Details of these processes will not be described because techniques known in the field of computer vision can be used therefor; these processes can be performed using instructions called cvGetPerspectiveTransform and cvWarpPerspective included in the commonly-available software library OpenCV, for example.
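A minimal sketch of such a correspondence-based transform with the modern OpenCV Python bindings is shown below (cv2.getPerspectiveTransform and cv2.warpPerspective correspond to the instructions mentioned above); the four point pairs are placeholders.

    import cv2
    import numpy as np

    # Four associations between positions on the imaging plane and positions on the
    # projection plane, e.g. obtained by projecting a known pattern and detecting it
    # in the captured image. The coordinate values below are placeholders.
    imaging_pts = np.float32([[102, 80], [530, 95], [515, 410], [90, 400]])
    projection_pts = np.float32([[0, 0], [639, 0], [639, 479], [0, 479]])

    # Homography mapping imaging-plane coordinates to projection-plane coordinates.
    H = cv2.getPerspectiveTransform(imaging_pts, projection_pts)

    def to_projection_plane(points_xy):
        """Transform Nx2 imaging-plane points to projection-plane coordinates."""
        pts = np.float32(points_xy).reshape(-1, 1, 2)
        return cv2.perspectiveTransform(pts, H).reshape(-1, 2)

    # A picture drawn in imaging-plane coordinates can likewise be warped into the
    # projector frame with cv2.warpPerspective(picture, H, (640, 480)).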
The projected picture generating unit 107 generates a projected picture to be projected onto a projection area recognized by the projection area recognizer 103 according to the GUI information set by the managing unit 105. Specifically, the projected picture generating unit 107 generates a picture according to the GUI information set by the managing unit 105 for each projection finger and transforms the generated picture according to the position association table to generate a projected picture conforming to the projection area recognized by the projection area recognizer 103. Accordingly, the projector 108 can project a projected picture conforming to the projection area. The projected picture generating unit 107 can be realized by a graphics processor.
The projector 108 projects a GUI picture onto the projection area recognized by the projection area recognizer 103. Specifically, the projector 108 projects projected GUI pictures 51-1 to 51-4 generated by the projected picture generating unit 107 onto projection areas 41-1 to 41-4, respectively, of projection fingers recognized by the projection area recognizer 103 as illustrated in
The operation area recognizer 109 recognizes an operation area of the operation finger from a moving image acquired by the acquiring unit 101. Specifically, the operation area recognizer 109 extracts the shape feature value of the tip of the operation finger from the image of the operation finger extracted by the extracting unit 102, and recognizes an area represented by the extracted feature value as the operation area.
An index for the operation area recognized by the operation area recognizer 109 may be obtained, in addition to the method of approximating the area as a circle as described above, by a method of approximating the tip area of the thumb 23 as an ellipse defined by its center position, the direction and length of one diameter, and the length of the diameter crossing it at a right angle, by a method of approximating the area as a rectangle, or by a method of approximating the area as an area within a contour including a plurality of connected lines.
The selection determining unit 110 measures the overlapping degree of the projection area recognized by the projection area recognizer 103 and the operation area recognized by the operation area recognizer 109, and determines whether or not the GUI projected on the projection area is selected.
Here, the issuing unit 105A will be described. When it is determined by the selection determining unit 110 that the GUI is selected, the issuing unit 105A issues a command associated with the GUI.
The communication unit 111 transmits the command issued by the issuing unit 105A to an external device to be controlled. Upon receiving, as a result of transmitting a command, a notification of a change in the GUI information from the external device to be controlled, the communication unit 111 informs the managing unit 105 accordingly. The managing unit 105 then switches the GUI information to be used among the GUI information stored in the GUI information storage unit 104. For example, when the GUI information illustrated in
First, the acquiring unit 101 performs an acquisition process of acquiring a moving image obtained by capturing a hand of an operator (step S101).
Subsequently, the extracting unit 102 performs an extraction process of performing image processing on the moving image acquired by the acquiring unit 101 to extract projection fingers and an operation finger (step S102).
Subsequently, the projection area recognizer 103 performs a projection area recognition process of recognizing projection areas of the projection fingers from the image of the projection fingers extracted by the extracting unit 102 (step S103).
Details of the projection area recognition process will be described here.
First, in this embodiment, since a GUI is selected by bringing the operation area of the operation finger over a GUI picture projected on a projection area of a projection finger, the color marker attached to the projection finger may be hidden by the operation finger in the picture of the projection finger extracted by the extracting unit 102 as illustrated in
To cope with such hidden areas, the projection area recognizer 103 determines prior feature values of the projection fingers in advance, as follows.
The projection area recognizer 103 determines the prior feature values of the projection fingers by using positions of tip points and base points of the color markers attached to the projection fingers. The projection area recognizer 103 sets IDs of the projection fingers to n (n=1 to 4). In this embodiment, the ID of the index finger 24-1 is n=1, the ID of the middle finger 24-2 is n=2, the ID of the third finger 24-3 is n=3, and the ID of the little finger 24-4 is n=4. The projection area recognizer 103 also sets the base points of the projection fingers to Pn and the tip points of the projection fingers to P′n. Specifically, the projection area recognizer 103 sets the coordinates of a pixel with the largest x-coordinate value to P′n and the coordinates of a pixel with the smallest x-coordinate value to Pn for each ID of the projection fingers.
The projection area recognizer 103 then obtains a median point G of P1 to P4 and an average directional vector V of the directional vectors P1P′1 to P4P′4. For each ID of the projection fingers, the projection area recognizer 103 further searches for the pixels that are farthest from the line PnP′n in the direction perpendicular to the directional vector PnP′n, and stores the distance from the line PnP′n to the farthest pixel on the counterclockwise side as the feature value en and the distance to the farthest pixel on the clockwise side as the feature value fn.
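The prior feature value calculation for one projection finger might be sketched as follows, assuming a binary mask containing that finger's color marker and the x axis running from base to tip as described above; the sign convention used for the counterclockwise side is one possible choice.

    import numpy as np

    def prior_finger_features(marker_mask):
        """Compute base point Pn, tip point P'n and thickness features en, fn
        from a binary mask of one projection finger's color marker."""
        ys, xs = np.nonzero(marker_mask)
        pts = np.stack([xs, ys], axis=1).astype(np.float64)
        p_base = pts[np.argmin(pts[:, 0])]          # Pn : pixel with the smallest x
        p_tip = pts[np.argmax(pts[:, 0])]           # P'n: pixel with the largest x
        direction = (p_tip - p_base) / np.linalg.norm(p_tip - p_base)
        # Signed distance of every marker pixel from the line Pn-P'n, measured
        # perpendicular to the finger direction (sign depends on the image axes).
        rel = pts - p_base
        signed_dist = rel[:, 0] * (-direction[1]) + rel[:, 1] * direction[0]
        e_n = signed_dist.max()                     # farthest pixel on one side
        f_n = -signed_dist.min()                    # farthest pixel on the other side
        return p_base, p_tip, e_n, f_n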
First, the projection area recognizer 103 extracts feature points Rn, R′n of the projection fingers (step S201). Specifically, the projection area recognizer 103 extracts base points Rn and tip points R′n of the areas 32-n of the color markers attached to the projection fingers as illustrated in
Subsequently, the projection area recognizer 103 calculates estimated base points R″n taking hidden areas into consideration on the basis of the prior feature points Pn, P′n (step S202). Specifically, the projection area recognizer 103 sets, as R″n, the points obtained by extending the lines PnP′n from the reference points R′n in the direction from R′n to Rn, as illustrated in
Subsequently, the projection area recognizer 103 calculates corrected points Sn of the estimated base points R″n (step S203). Specifically, the projection area recognizer 103 first obtains a median point G″ of R″1 to R″4 as illustrated in
Subsequently, the projection area recognizer 103 obtains points Fni (i=0 to 3) away from the end points Sn and R′n of the line SnR′n in the direction perpendicular thereto by the amounts of the feature values en and fn representing the thickness of the fingers calculated in advance for the respective IDs of the projection fingers as illustrated in
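As an illustrative sketch, the rectangle Fn0Fn1Fn2Fn3 can be built from Sn, R′n and the thickness features en, fn as follows; the corner ordering is one plausible convention, not necessarily that of the embodiment.

    import numpy as np

    def projection_rectangle(s_n, r_tip_n, e_n, f_n):
        """Corners Fn0..Fn3 of a projection area: the line Sn-R'n offset by the
        finger-thickness features en on one side and fn on the other."""
        s_n = np.asarray(s_n, dtype=np.float64)
        r_tip_n = np.asarray(r_tip_n, dtype=np.float64)
        direction = (r_tip_n - s_n) / np.linalg.norm(r_tip_n - s_n)
        normal = np.array([-direction[1], direction[0]])  # perpendicular to Sn-R'n
        f0 = s_n + normal * e_n
        f1 = r_tip_n + normal * e_n
        f2 = r_tip_n - normal * f_n
        f3 = s_n - normal * f_n
        return np.array([f0, f1, f2, f3])

    # Example with placeholder coordinates:
    # corners = projection_rectangle((120, 200), (320, 180), 14.0, 12.0)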
An index for a projection area may be obtained, in addition to the method of approximating the projection area as a rectangle as described above, by a method of approximating the projection area as an ellipse defined by its center position, the direction and length of one diameter, and the length of the diameter crossing it at a right angle, or by a method of approximating the projection area as an area within a contour including a plurality of connected lines.
The description now refers back to the flowchart of the overall process.
Subsequently, the projector 108 performs a projection process of projecting projected pictures generated by the projected picture generating unit 107 onto projection areas recognized by the projection area recognizer 103 (step S105).
Details of the projection process will be described here.
First, the projector 108 reserves, in a frame memory, areas in which the projected pictures it projects are to be stored, and initializes the areas (step S301). For example, the projector 108 initializes the entire areas to black because no picture is projected onto areas displayed in black.
Subsequently, the projected picture generating unit 107 obtains information on the projection areas of the respective projection fingers input from the projection area recognizer 103, defines a polygon using coordinates of vertexes of the rectangle Fn0Fn1Fn2Fn3 that is a projection area for ID=n, and assigns texture coordinates (u, v) to the vertexes of the polygon (see
Subsequently, the projected picture generating unit 107 performs a perspective projection-based transformation process using the information in the position association table storage unit 106 to transform the vertexes Fn0Fn1Fn2Fn3 of the polygon to F′n0F′n1F′n2F′n3 (step S303). This process can be realized by a vertex shader function of a graphics processor.
Subsequently, the projected picture generating unit 107 generates a texture image (GUI image) for the projection area of each projection finger (step S304). For example, a character string or an image may be drawn close to the tip end of the projection finger, taking into account the physical characteristic that the base of a projection finger is more easily hidden by the operation finger than the tip. For example, the character string or the image may be drawn on the right side when the left hand is used for operation.
Subsequently, the projected picture generating unit 107 maps the texture image to the polygon area (step S305). This process can be realized by a texture mapping function of a graphics processor.
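Steps S302 to S305 might be sketched with OpenCV warping in place of a GPU vertex shader and texture mapping, as below; the corner coordinates, texture size and character string are placeholders.

    import cv2
    import numpy as np

    def render_gui_to_frame(frame, texture, corners_proj):
        """Map a GUI texture onto the quadrilateral F'n0..F'n3 in the projector
        frame buffer, which is assumed to be initialized to black (step S301)."""
        h, w = texture.shape[:2]
        src = np.float32([[0, 0], [w - 1, 0], [w - 1, h - 1], [0, h - 1]])
        dst = np.float32(corners_proj)                       # F'n0..F'n3
        M = cv2.getPerspectiveTransform(src, dst)
        warped = cv2.warpPerspective(texture, M, (frame.shape[1], frame.shape[0]))
        mask = cv2.warpPerspective(np.full((h, w), 255, np.uint8), M,
                                   (frame.shape[1], frame.shape[0]))
        frame[mask > 0] = warped[mask > 0]
        return frame

    # Hypothetical usage: a text GUI drawn as the texture image (step S304).
    texture = np.zeros((48, 160, 3), np.uint8)
    cv2.putText(texture, "ILLUMINATION ON", (4, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 0.5, (255, 255, 255), 1)
    frame = np.zeros((480, 640, 3), np.uint8)                # step S301: all black
    frame = render_gui_to_frame(frame, texture,
                                [[200, 100], [380, 110], [375, 150], [195, 140]])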
The description now refers back to the flowchart of the overall process.
Subsequently, the selection determining unit 110 performs a selection determination process of measuring the overlapping degree of the projection area recognized by the projection area recognizer 103 and the operation area recognized by the operation area recognizer 109, and determining whether or not the GUI projected on the projection area is selected (step S107).
Details of the selection determination process will be described here.
First, the selection determining unit 110 obtains an area R (ID) of an overlapping region of the operation area of the operation finger and the projection area of the projection finger on the imaging plane for each ID of the projection fingers (step S401).
Subsequently, the selection determining unit 110 obtains the ID of a projection finger with the largest overlapping area, sets the value thereof to CID, and sets the overlapping area of the projection finger with this ID to R (step S402).
Subsequently, the selection determining unit 110 compares R with a threshold RT (step S403).
If R≧RT is satisfied (Yes in step S403), the selection determining unit 110 determines whether or not CID and SID are equal (step S404).
If CID=SID is satisfied (Yes in step S404), the selection determining unit 110 adds, to STime, the time elapsed from the previous determination to the current time (step S405).
Subsequently, the selection determining unit 110 compares STime with a threshold STimeT of the selection time (step S406).
If STime≧STimeT is satisfied (Yes in step S406), the selection determining unit 110 determines that the operator 21 has selected the GUI of SID, outputs SID to the issuing unit 105A (step S407), and terminates the process.
If R≧RT is not satisfied (No in step S403) or CID=SID is not satisfied (No in step S404), on the other hand, the selection determining unit 110 determines that the operator 21 has not selected the GUI, initializes SID to −1 and STime to 0 (step S408), and terminates the process. If STime≧STimeT is not satisfied (No in step S406), the selection determining unit 110 terminates the process.
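One possible sketch of this dwell-time determination follows. The steps above clear SID whenever the candidate changes; so that dwell time can ever accumulate, the sketch adds the assumption, not stated explicitly above, that a new candidate with sufficient overlap becomes the tracked SID.

    RT = 400.0      # overlap-area threshold in pixels^2; placeholder value
    STIME_T = 0.8   # required dwell time in seconds; placeholder value

    class SelectionDeterminer:
        def __init__(self):
            self.sid = -1      # currently tracked GUI / projection-finger ID
            self.stime = 0.0   # dwell time accumulated for self.sid

        def update(self, overlap_by_id, dt):
            """overlap_by_id: {finger ID: overlapping area of the operation area and
            that finger's projection area}; dt: time since the previous determination.
            Returns the selected finger ID, or None if no GUI is selected."""
            cid = max(overlap_by_id, key=overlap_by_id.get)   # S402: largest overlap
            r = overlap_by_id[cid]
            if r >= RT and cid == self.sid:                   # S403 and S404 satisfied
                self.stime += dt                              # S405
                if self.stime >= STIME_T:                     # S406
                    return self.sid                           # S407: GUI selected
            else:                                             # S408 (see assumption above)
                self.sid = cid if r >= RT else -1
                self.stime = 0.0
            return None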
As described above, according to the first embodiment, operation of an external device to be controlled can be completed with one hand, by viewing a GUI picture projected onto a projection finger and touching the GUI with the operation finger. Moreover, since the state of an external device to be controlled can also be displayed as a GUI picture, it is possible to check the external device to be controlled in the picture projected onto a projection finger. Although a case where the operation finger and the projection fingers all belong to one hand is described in the first embodiment, even in a case where both hands are used and a finger of the hand opposite to the hand with the projection fingers serves as the operation finger, the position of the operation area can be detected by attaching a color marker to that operation finger, and this can be implemented by the above-described technique. Alternatively, an operation finger and projection fingers may be assigned to one hand while a further operation finger, further projection fingers, or both are assigned to the opposite hand.
While an example in which the operator 21 selects a GUI by laying the operation finger over the GUI picture on a projection finger is described in the first embodiment above, an example in which the operator 21 selects a GUI by bending the projection finger onto which the GUI picture to be selected is projected, instead of using the operation finger, will be described in the modified example 1. In the following, the difference from the first embodiment will be mainly described and components having similar functions as in the first embodiment will be designated by the same names and reference numerals as in the first embodiment, and the description thereof will not be repeated.
The selection determining unit 112 measures the bending degree of a projection finger and determines whether or not the GUI is selected. Specifically, the selection determining unit 112 assigns the largest x-coordinate among the points of the rectangle Fn0Fn1Fn2Fn3 (projection area) to bn for each of the projection fingers. Here, n=1 to 4. Next, the selection determining unit 112 obtains an average value bAvg of bn and regards bn−bAvg as the bending amount. The selection determining unit 112 then compares the bending amount bn−bAvg with a predetermined threshold bT; if bn−bAvg≦bT is satisfied, the selection determining unit 112 determines that the projection finger with an ID n is bent, that is, that the GUI projected onto the projection area of the projection finger with the ID n is selected, and outputs the determination result to the managing unit 105. If bn−bAvg>bT is satisfied for all the fingers, on the other hand, the selection determining unit 112 determines that none of the projection fingers is bent, that is, none of the GUIs is selected, and outputs the determination result to the managing unit 105.
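This bending test might be sketched as follows, assuming each projection area is given as the four corner points of its rectangle; the threshold value is a placeholder.

    import numpy as np

    B_T = -25.0   # bending threshold in pixels; placeholder (negative: tip pulled back)

    def bent_finger_ids(projection_rects):
        """projection_rects: {finger ID n: 4x2 array of corners Fn0..Fn3}.
        Returns the IDs whose bending amount bn - bAvg is at or below bT."""
        b = {n: float(np.max(np.asarray(rect)[:, 0])) for n, rect in projection_rects.items()}
        b_avg = float(np.mean(list(b.values())))
        return [n for n, bn in b.items() if bn - b_avg <= B_T]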
Accordingly, the operator 21 can select a GUI by bending a projection finger onto which a picture of a GUI to be selected is projected.
More than one GUI may be selectable; for example, the selection determining unit 112 may prioritize the projection fingers in descending order of the bending amount and select, at the same time, the GUIs projected onto the projection areas of two or more projection fingers in descending order of the priority.
Alternatively, the first embodiment and the modified example 1 may be combined, so that a GUI can be selected either by laying the operation finger over the GUI picture on a projection finger or by the operator 21 bending the projection finger onto which the picture of the GUI to be selected is projected.
While an example in which one GUI is projected onto one projection finger is described in the first embodiment, an example in which a plurality of GUIs are projected onto one projection finger will be described in the modified example 2. In the following, the difference from the first embodiment will be mainly described and components having similar functions as in the first embodiment will be designated by the same names and reference numerals as in the first embodiment, and the description thereof will not be repeated.
The dividing unit 113 divides a projection area recognized by the projection area recognizer 103 into a plurality of divided projection areas.
The selection determining unit 110 determines whether or not a GUI is selected by measuring the overlapping degree of the operation area and a divided projection area.
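For illustration, a projection area can be split into divided projection areas along the finger direction as follows; which corners lie on the base side and which on the tip side is an assumption of the sketch.

    import numpy as np

    def divide_projection_area(corners, k):
        """Split the rectangle Fn0 Fn1 Fn2 Fn3 into k divided projection areas from
        base to tip, assuming Fn0/Fn3 lie at the base side and Fn1/Fn2 at the tip side."""
        f0, f1, f2, f3 = [np.asarray(c, dtype=np.float64) for c in corners]
        parts = []
        for i in range(k):
            t0, t1 = i / k, (i + 1) / k
            parts.append(np.array([
                f0 + (f1 - f0) * t0, f0 + (f1 - f0) * t1,
                f3 + (f2 - f3) * t1, f3 + (f2 - f3) * t0,
            ]))
        return parts

    # e.g. three numerical keys per projection finger:
    # sub_areas = divide_projection_area(corners, 3)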
As a result, it is possible to project numerical keys or the like onto the projection fingers and thus to project a wider variety of menus.
In a case where the operator 21 selects a GUI by bending a projection finger onto which a picture of the GUI to be selected is projected as in the modified example 1, a silhouette image acquired by an infrared sensor may be used as a feature value.
In this case, the acquiring unit 101 irradiates the hand 22 with infrared light from an infrared light source and, instead of using a visible light camera, captures the infrared light diffusely reflected by the surface of the hand 22 with an infrared camera equipped with a filter that transmits only infrared rays.
In addition, since the reflected infrared light is attenuated in the background area other than the hand 22, the extracting unit 102 can separate the area of the hand 22 from the background area other than the hand 22 by extracting only an area where the infrared rays are reflected at an intensity equal to or higher than a certain threshold. In this manner, the extracting unit 102 extracts a silhouette image of the hand 22.
In addition, the projection area recognizer 103 can recognize a projection area of a projection finger from the silhouette image of the hand 22 by tracing the outline of the silhouette of the hand 22 and extracting an inflection point of the outline.
Accordingly, the operator 21 can select a GUI by bending a projection finger without attaching a color marker to the projection finger.
While the positions of the operation finger and the projection fingers on the imaging plane are recognized in the first embodiment by recognizing, with a visible light camera, the colors of the color markers attached to the fingers of the operator 21, an example in which the positions of the operation finger and the projection fingers on the imaging plane are recognized by using a distance sensor instead of the visible light camera will be described in the modified example 4.
A distance sensor is a sensor that obtains a distance from a camera to an object as an image. For example, there is a method of irradiating the hand 22 by using an infrared light source installed near an infrared camera and obtaining the intensity of the reflected light as a distance by utilizing the property that reflected light is attenuated as the distance is longer. There is also a method of projecting a specific pattern by using a laser light source or the like and obtaining a distance by utilizing the property that the reflection pattern on an object surface changes depending on the distance. In addition, there is also a method of obtaining a distance by image processing utilizing the property that the parallax between images captured by two visible light cameras installed with a distance therebetween is larger as the distance to the object is shorter. The acquiring unit 101 may obtain the distance by any method in the modified example 4.
As described above, the acquiring unit 101 acquires an image (hereinafter referred to as a distance image) expressing the distance from the command issuing device 1 to the hand 22 or the background as a luminance in the modified example 4.
Note that the distance d from the command issuing device 1 to the hand 22 or the background is stored as a numerical value in the pixels of the distance image. The value of d is smaller as the distance is shorter and larger as the distance is longer.
The extracting unit 102 divides the distance image into an area of the bent operation finger, an area of the projection fingers and the palm, and a background area on the basis of the distribution of the distance d by using thresholds dTs and dTp. The extracting unit 102 then extracts an area where d<dTs is satisfied as the area of the bent operation finger, determines an inflection point of the outline of the silhouette of this area, and outputs a tip area of the operation finger to the operation area recognizer 109. The extracting unit 102 also extracts an area where dTs≦d<dTp is satisfied as the area of the projection fingers and the palm, determines an inflection point of the outline of the silhouette of this area, and outputs an area from the bases to the tips of the projection fingers to the projection area recognizer 103.
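A minimal sketch of this distance-based segmentation, with placeholder threshold values, follows.

    import numpy as np

    D_TS = 600   # threshold below which a pixel belongs to the bent operation finger
    D_TP = 900   # threshold below which a pixel belongs to the projection fingers and palm

    def segment_distance_image(d):
        """Split a distance image d (smaller value = closer) into the bent
        operation-finger area, the projection-finger/palm area and the background."""
        operation_area = d < D_TS
        hand_area = (d >= D_TS) & (d < D_TP)
        background = d >= D_TP
        return operation_area, hand_area, background

    # Usage: op_mask, hand_mask, bg_mask = segment_distance_image(distance_image)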
In this manner, it is possible to recognize the positions of the operation finger and the projection fingers on the imaging plane without using color markers.
In the second embodiment, an example in which pictures of GUIs projected onto projection fingers are switched according to the posture of a hand will be described. In the following, the difference from the first embodiment will be mainly described and components having similar functions as in the first embodiment will be designated by the same names and reference numerals as in the first embodiment, and the description thereof will not be repeated.
The palm area recognizer 114 recognizes a palm area by using a projection area recognized by the projection area recognizer 103. Specifically, the palm area recognizer 114 extracts a shape feature value of the palm by using the projection area recognized by the projection area recognizer 103 and recognizes the area represented by the extracted feature value as the palm area.
The switching determining unit 115 measures the overlapping degree of the palm area recognized by the palm area recognizer 114 and the operation area recognized by the operation area recognizer 109, and determines whether to switch the GUIs to be projected onto the projection areas. The switching determining unit 115 makes determination by a technique similar to that for the selection determining unit 110. If the overlapping area of the palm area and the operation area is equal to or larger than HT for a predetermined time HTimeT or longer, the switching determining unit 115 determines that the GUIs to be projected onto the projection areas are to be switched to GUIs of a first state. On the other hand, if the overlapping area of the palm area and the operation area is not equal to or larger than HT for the predetermined time HTimeT or longer, the switching determining unit 115 determines that the GUIs to be projected onto the projection areas are to be switched to GUIs of a second state.
If the switching determining unit 115 determines that the GUI pictures are to be switched to the GUIs of the first state, the switch 116 switches the GUIs to be projected onto the projection areas to the GUIs of the first state, and if the switching determining unit 115 determines that the GUI pictures are to be switched to the GUIs of the second state, the switch 116 switches the GUIs to be projected onto the projection areas to the GUIs of the second state.
For example, the GUI information storage unit 104 stores therein GUI information illustrated in
In this case, if the operator 21 lays the thumb 23 over the palm area 36 and the switching determining unit 115 determines to switch to the GUIs of the first state, the switch 116 sets the GUI information illustrated in
On the other hand, if the operator 21 does not lay the thumb 23 over the palm area 36 and the switching determining unit 115 determines to switch to the GUIs of the second state, the switch 116 sets the GUI information illustrated in
If the operator 21 lays the operation finger over the palm and then moves the operation finger over a projection finger, the switching determining unit 115 may determine that the previous state, in which the operation finger was laid over the palm, is maintained even though the operation finger is no longer over the palm.
As a result, it is possible to switch the display of the GUIs according to whether or not the operator 21 takes the posture of laying the operation finger over the palm. Alternatively, the projection may be performed such that the GUIs are not projected onto the projection fingers when the operator 21 does not lay the operation finger over the palm and GUIs are projected onto the projection fingers when the operator 21 lays the operation finger over the palm.
According to the second embodiment, even when there are a number of GUI elements to be projected and the number of GUI elements is not fixed, it is possible to display a plurality of GUI elements by switching the displayed information.
While an example in which the operator 21 switches the GUIs by laying the operation finger over the palm is described in the second embodiment above, an example in which the operator 21 switches the GUIs by opening and closing the projection fingers will be described in the modified example 5. In the following, the difference from the second embodiment will be mainly described and components having similar functions as in the second embodiment will be designated by the same names and reference numerals as in the second embodiment, and the description thereof will not be repeated.
The switching determining unit 117 measures the opening degrees of the projection fingers to determine whether or not to switch the GUIs to be projected onto the projection areas. If the projection fingers are open, the switching determining unit 117 determines to switch the GUIs to be projected onto the projection areas to the GUIs of the first state. If the projection fingers are not open, on the other hand, the switching determining unit 117 determines to switch the GUIs to be projected onto the projection areas to the GUIs of the second state.
Specifically, the switching determining unit 117 obtains the scalar products dn of a directional vector Vs, which is the sum of the directional vectors SnR′n from the estimated base positions Sn of the projection fingers to the tip points R′n, and the directional vectors SnR′n of the respective projection fingers, and compares the sum dSum of the absolute values of dn with a threshold dT. If dSum≦dT is satisfied, the switching determining unit 117 determines that the fingers are open and that the GUIs to be projected onto the projection areas are to be switched to the GUIs of the first state. If dSum>dT is satisfied, on the other hand, the switching determining unit 117 determines that the fingers are closed and that the GUIs to be projected onto the projection areas are to be switched to the GUIs of the second state.
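This openness test might be sketched as follows; whether the directional vectors SnR′n are normalized is not specified above, so the sketch normalizes them, and the threshold is a placeholder.

    import numpy as np

    D_T = 14.0   # openness threshold; placeholder (depends on the normalization chosen)

    def fingers_open(bases, tips):
        """bases, tips: {finger ID: (x, y)} estimated base points Sn and tip points R'n.
        Returns True if the projection fingers are judged to be open."""
        vecs = {n: np.asarray(tips[n], float) - np.asarray(bases[n], float) for n in bases}
        vecs = {n: v / np.linalg.norm(v) for n, v in vecs.items()}
        vs = sum(vecs.values())                               # directional vector Vs
        d_sum = sum(abs(np.dot(vs, v)) for v in vecs.values())
        return d_sum <= D_T   # small dSum: the finger directions diverge (fingers open)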
For example, if the operator 21 opens the fingers and the switching determining unit 117 determines to switch to the GUIs of the first state, the switch 116 sets the GUI information illustrated in
On the other hand, if the operator 21 closes the fingers and the switching determining unit 117 determines to switch to the GUIs of the second state, the switch 116 sets the GUI information illustrated in
As a result, it is possible to switch the display of the GUIs according to whether or not the operator 21 takes the hand posture of opening the projection fingers. Alternatively, the projection may be performed such that the GUIs are not projected onto the projection fingers when the operator 21 closes the projection fingers and the GUIs are projected onto the projection fingers when the operator 21 opens the projection fingers.
While an example in which the operator 21 switches the GUIs by laying the operation finger over the palm is described in the second embodiment above, an example in which the operator 21 switches the GUIs by changing the direction of the projection fingers will be described in the modified example 6. In the following, the difference from the second embodiment will be mainly described and components having similar functions as in the second embodiment will be designated by the same names and reference numerals as in the second embodiment, and the description thereof will not be repeated.
The switching determining unit 118 measures the direction of the projection fingers to determine whether or not to switch the GUIs to be projected onto the projection areas. If the projection fingers are oriented in the vertical direction, the switching determining unit 118 determines to switch the GUIs to be projected onto the projection areas to the GUIs of the first state. If the projection fingers are oriented in the horizontal direction, on the other hand, the switching determining unit 118 determines to switch the GUIs to be projected onto the projection areas to the GUIs of the second state.
Specifically, the switching determining unit 118 obtains an angle aS of the directional vector Vs that is a sum of directional vectors SnR′n of the base positions Sn of estimated projection fingers with respect to the horizontal direction of the imaging plane, and compares aS with a threshold aST. If aS≧aST is satisfied, the switching determining unit 118 determines that the projection fingers are oriented in the vertical direction and to switch the GUIs to be projected onto the projection areas to the GUIs of the first state. If aS<aST is satisfied, on the other hand, the switching determining unit 118 determines that the hand is oriented in the horizontal direction and to switch the GUIs to be projected onto the projection areas to the GUIs of the second state.
For example, if the operator 21 orients the projection fingers in the vertical direction and the switching determining unit 118 determines to switch to the GUIs of the first state, the switch 116 sets the GUI information illustrated in
On the other hand, if the operator 21 orients the projection fingers in the horizontal direction and the switching determining unit 118 determines to switch to the GUIs of the second state, the switch 116 sets the GUI information illustrated in
As a result, it is possible to switch the display of the GUIs according to the direction in which the operator 21 orients the projection fingers (the posture of the hand). Alternatively, the projection may be performed such that the GUIs are not projected onto the projection fingers when the operator 21 orients the projection fingers in the horizontal direction and the GUIs are projected onto the projection fingers when the operator 21 orients the projection fingers in the vertical direction.
While an example in which the operator 21 switches the GUIs by laying the operation finger over the palm is described in the second embodiment above, an example in which the operator 21 switches the GUIs by changing the position of the operation finger relative to the projection fingers will be described in the modified example 7. In the following, the difference from the second embodiment will be mainly described and components having similar functions as in the second embodiment will be designated by the same names and reference numerals as in the second embodiment, and the description thereof will not be repeated.
The switching determining unit 119 measures the relative positions of the operation area and the projection areas to determine whether or not to switch the GUIs to be projected onto the projection areas. If the distance between the operation area and the projection areas is equal to or longer than a threshold, the switching determining unit 119 determines that the operation finger and the projection fingers are apart from each other and to switch the GUIs to be projected onto the projection areas to the GUIs of the first state. If the distance between the operation area and the projection areas is shorter than the threshold, on the other hand, the switching determining unit 119 determines that the operation finger and the projection fingers are not apart from each other and to switch the GUIs to be projected onto the projection areas to the GUIs of the second state.
For example, if the operator 21 separates the operation finger and the projection fingers and the switching determining unit 119 determines to switch to the GUIs of the first state, the switch 116 sets the GUI information illustrated in
On the other hand, if the operator 21 brings the operation finger and the projection fingers together and the switching determining unit 119 determines to switch to the GUIs of the second state, the switch 116 sets the GUI information illustrated in
As a result, it is possible to switch the display of the GUIs according to the relative positions of the operation finger and the projection fingers (the posture of the hand). Alternatively, the projection may be performed such that the GUIs are not projected onto the projection fingers when the operator 21 brings the operation finger and the projection fingers together and the GUIs are projected onto the projection fingers when the operator 21 separates the operation finger and the projection fingers.
Alternatively, the moving direction of the operation finger may be detected when the operator 21 makes an operation of tracing the projection fingers with the operation finger, and the GUIs may be switched depending on whether the operation finger has traced the projection fingers from the bases to the tips or from the tips to the bases.
In the third embodiment, an example in which a feedback picture allowing the operator to view a current operation is projected will be described. In the following, the difference from the first embodiment will be mainly described and components having similar functions as in the first embodiment will be designated by the same names and reference numerals as in the first embodiment, and the description thereof will not be repeated.
The feedback picture generating unit 120 generates a feedback picture for a GUI picture that is projected onto a projection area with which the operation area is determined to overlap by the selection determining unit 110. Herein, a feedback picture is a projected picture allowing the operator 21 to view the current operation.
When the operator 21 lays the operation finger over a GUI picture on a projection finger, the projected picture is projected across the operation finger and the projection finger, and the visibility of the GUI picture on the projection finger is lowered. The feedback picture generating unit 120 therefore generates a picture so that the GUI picture of the projection finger that the operator 21 is about to select is projected onto an area other than that projection finger, or shortens the GUI picture so that the shortened picture is projected onto a region of the projection area of the projection finger that the operation finger does not overlap.
While the selection determining unit 110 determines that the GUI is selected when the operation area and the projection area overlap with each other for a predetermined time or longer, the feedback picture generating unit 120 generates a feedback picture as long as the operation area and the projection area overlap with each other.
In other words, the feedback picture generating unit 120 only makes determination on the conditions of steps S403 and S404 in the flowchart illustrated in
For example, the projector 108 projects a feedback picture 51-5 for the GUI 51-1 that the operator 21 is about to select onto the base of the operation finger as illustrated in
Alternatively, for example, the projector 108 may project a feedback picture 51-6 for the GUI 51-1 that the operator 21 is about to select onto the palm as illustrated in
Still alternatively, for example, the projector 108 may project a shortened or zoomed out version of the GUI 51-1 that the operator 21 is about to select as illustrated in
In this case, shortened texts or zoomed out pictures corresponding to the respective pieces of the GUI information are stored in the GUI information storage unit 104. For example, when the GUI information storage unit 104 stores therein a shortened text “N ON” for “ILLUMINATION ON” and the operator 21 is about to select the GUI 51-1, the projector 108 projects the shortened text “N ON” for “ILLUMINATION ON”. A region 61-1 of the projection finger without the operation finger laying thereover can be obtained as a rectangle QaQbFj2Fj3 by obtaining an intersection Q of a directional vector R′jSj for a projection finger with an ID=j (j=1 to 4) that is being selected and a circle resulting from approximating the operation finger, and obtaining points Qa, Qb away from the endpoint Q of the line R′jQ by the feature values ej, fj representing the thickness of the fingers calculated in advance in the direction perpendicular to R′jQ as illustrated in
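As an illustrative sketch, the region 61-1 can be computed as follows, assuming the operation finger is approximated by a circle with a known center and radius and that the quadrilateral is built between the intersection point Q and the fingertip; the corner ordering is an assumption.

    import numpy as np

    def unobscured_region(r_tip, s_base, e_j, f_j, circle_center, circle_radius):
        """Intersection Q of the ray from the tip R'j toward the base Sj with the
        circle approximating the operation finger, and the quadrilateral between
        Q and the fingertip offset by the thickness features ej and fj."""
        r_tip = np.asarray(r_tip, dtype=np.float64)
        s_base = np.asarray(s_base, dtype=np.float64)
        c = np.asarray(circle_center, dtype=np.float64)
        d = (s_base - r_tip) / np.linalg.norm(s_base - r_tip)
        # Solve |r_tip + t*d - c|^2 = radius^2 for the nearest intersection t >= 0.
        oc = r_tip - c
        b = float(np.dot(d, oc))
        disc = b * b - (float(np.dot(oc, oc)) - circle_radius ** 2)
        if disc < 0:
            return None                        # the operation finger does not cross the ray
        t = max(-b - np.sqrt(disc), 0.0)
        q = r_tip + t * d
        normal = np.array([-d[1], d[0]])       # perpendicular to the line R'j-Q
        qa, qb = q + normal * e_j, q - normal * f_j
        tip_a, tip_b = r_tip + normal * e_j, r_tip - normal * f_j
        return np.array([qa, qb, tip_b, tip_a])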
The superimposing unit 121 superimposes the projected picture generated by the projected picture generating unit 107 and the feedback picture generated by the feedback picture generating unit 120 to generate a composite picture.
According to the third embodiment, the operator 21 can more easily view the content of the GUI element that the operator 21 has selected, and the possibility of performing erroneous operation can be reduced.
In the fourth embodiment, an example in which GUI information associated with the projection fingers is assigned will be described. In the following, the difference from the first embodiment will be mainly described and components having similar functions as in the first embodiment will be designated by the same names and reference numerals as in the first embodiment, and the description thereof will not be repeated.
The assignment determining unit 123 determines assignment of GUIs to respective projection areas of the projection fingers on the basis of the GUI information stored in the GUI information storage unit 104. As a result, the GUI information assigned to the projection fingers can be changed on the basis of differences in displayable area due to differences in the size of the projection fingers, and of differences in the easiness of selecting operation depending on the relative positions of the projection fingers.
For example, in the case where the thumb 23 is the operation finger and the fingers 24 other than the thumb are the projection fingers, the assignment determining unit 123 holds in advance, for each projection finger, a value representing the easiness of selecting operation for that finger, and also holds in advance the operation frequency of each GUI obtained from information recording the operation history of each GUI. The assignment determining unit 123 then assigns the GUIs in descending order of operation frequency to the projection fingers in descending order of easiness of selecting operation; it is thus possible to more easily operate a GUI that is more frequently subjected to selecting operation and to reduce operation errors.
In addition, for example, when text character strings are to be displayed as GUI pictures, the assignment determining unit 123 counts in advance the number of characters in the text character string of each GUI. The assignment determining unit 123 then assigns the GUIs in descending order of the number of characters in the text character strings to the projection fingers, obtained from the projection area recognizer 103, in descending order of the number of pixels in the horizontal direction of their projection areas. In other words, the assignment determining unit 123 assigns GUI elements with larger numbers of characters to longer fingers such as the index finger 24-1 and the middle finger 24-2, and GUI elements with smaller numbers of characters to shorter fingers such as the little finger 24-4; it is thus possible to improve the visibility of the GUI character strings and reduce operation errors.
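A sketch of this sort-and-pair assignment follows; the same pattern applies to pairing operation frequency with easiness of selecting operation, and the example values are hypothetical.

    def assign_guis_to_fingers(gui_lengths, finger_widths):
        """gui_lengths: {GUI ID: number of characters};
        finger_widths: {finger ID: horizontal pixel count of its projection area}.
        GUIs with more characters are paired with wider (longer) projection areas;
        if the counts differ, the surplus entries are simply left unassigned."""
        guis = sorted(gui_lengths, key=gui_lengths.get, reverse=True)
        fingers = sorted(finger_widths, key=finger_widths.get, reverse=True)
        return dict(zip(fingers, guis))        # {finger ID: GUI ID}

    # Hypothetical usage:
    # assign_guis_to_fingers({"volume_up": 9, "mute": 4}, {1: 220, 4: 150})
    # -> {1: "volume_up", 4: "mute"}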
In addition, for example, for projecting a document including a plurality of lines of text or the like onto the projection fingers, the assignment determining unit 123 assigns the document so that one line of the text is projected onto one finger. In this case, the assignment determining unit 123 inserts line breaks in the middle of the text depending on the length of the projection fingers by using the projection areas of the projection fingers obtained from the projection area recognizer 103 to divide the text into a plurality of lines, and then assigns each line to each projection finger. As a result, it is possible to project and view a long sentence or the like on the projection fingers.
In this manner, it is possible to assign various GUIs that are not limited to the number of projection fingers onto the fingers according to the fourth embodiment.
In the fifth embodiment, an example in which a head-mounted display (HMD) is used will be described. In the following, the difference from the first embodiment will be mainly described and components having similar functions as in the first embodiment will be designated by the same names and reference numerals as in the first embodiment, and the description thereof will not be repeated.
The acquiring unit 124 is worn on the head of the operator 21 and captures the hand 22 of the operator 21 as a moving image.
The presenting unit 125 presents a picture generated by the projected picture generating unit 107 onto the eyewear type display device that the operator 21 wears on his/her head.
The position determining unit 126 determines the presenting position of the picture to be presented on a presentation finger recognized by the projection area recognizer 103.
In the fifth embodiment, since the presenting unit 125 is an eyewear type display that the operator 21 wears on his/her head, the presentation area of pictures is not limited to the surface of an object such as the hand 22 that actually exists. The presenting unit 125 can therefore separate a region for presenting a GUI picture from a region for determining the overlap with the operation finger, and present the GUI picture at a position beside the tip of the presentation finger as illustrated in
Although the command issuing device 10 worn by the operator 21 on his/her head is described in the fifth embodiment, a command issuing device can also be realized by a personal digital assistant, by displaying a moving image captured by the acquiring unit 124 on a display serving as the presenting unit 125 and superimposing a GUI on the moving image at the position of the hand captured by the acquiring unit 124.
The command issuing devices according to the embodiments and the modified examples described above each include a controller such as a central processing unit (CPU), a storage unit such as a ROM and a RAM, an external storage device such as a HDD and a SSD, a display device such as a display, an input device such as a mouse and a keyboard, and a communication device such as a communication interface, which is a hardware configuration utilizing a common computer system.
Programs to be executed by the command issuing devices according to the embodiments and the modified examples described above are recorded on a computer readable recording medium such as a CD-ROM, a CD-R, a memory card, a DVD and a flexible disk (FD) in the form of a file that can be installed or executed, and provided therefrom.
Alternatively, the programs to be executed by the command issuing devices according to the embodiments and the modified examples described above may be stored on a computer system connected to a network such as the Internet, and provided by being downloaded via the network. Alternatively, the programs to be executed by the command issuing devices according to the embodiments and the modified examples described above may be provided or distributed through a network such as the Internet. Still alternatively, the programs to be executed by the command issuing devices according to the embodiments and the modified examples may be embedded on a ROM or the like in advance and provided therefrom.
The programs to be executed by the command issuing devices according to the embodiments and the modified examples described above have modular structures for implementing the units described above on a computer system. In an actual hardware configuration, a CPU reads the programs from a HDD and executes the programs, for example, whereby the respective units described above are implemented on a computer system.
As described above, according to the embodiments and the modified examples described above, it is possible to complete the user operation by using one hand.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.