1. Field of the Invention
The disclosed embodiments of the present invention relate to a non-contact control mechanism, and more particularly, to a method for controlling an electronic apparatus according to a position, in space, of an object that does not touch the electronic apparatus.
2. Description of the Prior Art
A touch-based electronic apparatus provides a user with intuitive and user-friendly interaction. However, it is inconvenient for the user to control the electronic apparatus when the user is holding other objects (e.g. documents or drinks) or the user's hands are oily. For example, while eating french fries and reading an electronic book displayed on a screen of a tablet computer, the user would prefer to turn pages of the electronic book without touching the screen with oily fingers.
Thus, a novel non-contact control mechanism is needed to solve the aforementioned problems.
It is therefore one objective of the present invention to provide a method for controlling an electronic apparatus according to a position, in space, of an object that does not touch the electronic apparatus.
According to an embodiment of the present invention, an exemplary control method of an electronic apparatus is disclosed. The exemplary control method comprises the following steps: generating a plurality of detection signals to a non-contact object around the electronic apparatus; receiving a plurality of reflected signals reflected from the non-contact object in response to the detection signals, and accordingly generating a plurality of detection results; performing arithmetic operations upon the detection results to calculate motion information of the non-contact object around the electronic apparatus; recognizing a non-contact gesture corresponding to the non-contact object according to the motion information; and enabling the electronic apparatus to perform a specific function according to the non-contact gesture.
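The five steps of the exemplary control method may be sketched as one iteration of a control loop. The following is a minimal, illustrative Python skeleton only; every callable name here is a hypothetical placeholder, not part of the disclosed apparatus:

```python
def control_loop(emit, sense, compute_motion, recognize, dispatch):
    """One iteration of the exemplary control method, with each claimed
    step supplied as a pluggable callable (all names illustrative)."""
    emit()                               # step 1: generate detection signals
    detections = sense()                 # step 2: receive reflected signals,
                                         #         generate detection results
    motion = compute_motion(detections)  # step 3: arithmetic operations yield
                                         #         motion information
    gesture = recognize(motion)          # step 4: recognize non-contact gesture
    if gesture is not None:
        dispatch(gesture)                # step 5: perform the specific function
    return gesture
```

In practice each callable would wrap the IR LED drivers, IR sensor readout, coordinate arithmetic, gesture classifier, and application dispatcher described in the detailed description below.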
The proposed control method of an electronic apparatus provides non-contact human-computer interaction to facilitate control of the electronic apparatus. The proposed non-contact control mechanism and touch control may be employed together to realize flexible and intuitive control.
These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
In order to provide non-contact human-computer interaction, an electronic apparatus capable of determining motion information of a floating/hovering object (e.g. position and time information associated with the hovering object), and a hovering gesture (or an air gesture, i.e. a non-contact gesture which does not touch the electronic apparatus) defined according to the motion information of the hovering object, are utilized to realize a non-contact and intuitive control mechanism. In the following, the proposed non-contact and intuitive control mechanism is described with reference to an electronic apparatus which obtains motion information according to reflected signals reflected from a hovering object. However, this is for illustrative purposes only. Any electronic apparatus capable of determining motion information of a hovering object may be used to realize the proposed non-contact and intuitive control mechanism.
Please refer to
In steps 210 and 220, each IR LED may be used to generate a detection signal (i.e. emitting an infrared (IR) light signal) to a non-contact object around the electronic apparatus 100. In this embodiment, the non-contact object may be represented by a user's finger OB. Each IR sensor may be used to receive a reflected signal reflected from the user's finger OB in response to the detection signal, and accordingly generate a detection result to the processing unit in order to obtain motion information of the user's finger OB.
To illustrate how the motion information of the user's finger OB is obtained, the electronic apparatus 100 may define a predefined space coordinate system in the surroundings thereof in this embodiment, wherein the predefined space coordinate system may include a reference surface (i.e. the display screen 102) defined by the IR sensors IS1-IS3. In addition, the IR LEDs IL1-IL3 may be disposed adjacent to the IR sensors IS1-IS3, respectively. Hence, the IR LED IL1 and the IR sensor IS1 may be regarded as being located at the same position P1 (0, 0, 0), the IR LED IL2 and the IR sensor IS2 may be regarded as being located at the same position P2 (X0, 0, 0), and the IR LED IL3 and the IR sensor IS3 may be regarded as being located at the same position P3 (0, Y0, 0).
The IR LEDs IL1-IL3 may be activated alternately, wherein only one IR LED is activated in a given period of time. While one IR LED is activated, the corresponding adjacent IR sensor may be activated to receive a reflected signal reflected from the user's finger OB; when the IR LED is deactivated, the corresponding adjacent IR sensor may be deactivated, thus ensuring that the reflected signal received by the corresponding adjacent IR sensor corresponds to that IR LED. For example, the IR LED IL1 may emit an IR light signal I1 to the finger OB, and the IR sensor IS1 may receive a reflected signal R1 reflected from the finger OB in response to the IR light signal I1, and accordingly generate a first detection result (e.g. a current signal). Similarly, the IR sensor IS2 may receive a reflected signal R2 reflected from the finger OB in response to an IR light signal I2, and accordingly generate a second detection result (e.g. a current signal), and the IR sensor IS3 may receive a reflected signal R3 reflected from the finger OB in response to an IR light signal I3, and accordingly generate a third detection result (e.g. a current signal).
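The alternating activation may be sketched as a scan cycle that lights one LED/sensor pair per time slot, so every reading is attributable to a single LED. This is an illustrative sketch only; the `measure` hook stands in for a hypothetical driver routine and is not part of the disclosed apparatus:

```python
def scan_cycle(leds, sensors, measure):
    """Activate each IR LED and its paired IR sensor alternately, one
    pair at a time, collecting one detection result per pair.

    measure(led, sensor): hypothetical driver hook that returns the
    sensor's detection result while only that LED is emitting.
    """
    results = []
    for led, sensor in zip(leds, sensors):
        # Only this LED/sensor pair is active during this time slot,
        # so the reflected signal is unambiguously attributed to it.
        results.append(measure(led, sensor))
    return results
```

With three pairs, one cycle yields the first, second, and third detection results used in step 230.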
In step 230, the processing unit of the electronic apparatus 100 may perform arithmetic operations upon the first, second and third detection results to calculate motion information of the finger OB around the electronic apparatus 100. For example, the processing unit may obtain respective reflected energies of the reflected signals R1-R3 according to the first, second and third detection results, and obtain distances between each IR sensor and the finger OB according to a relationship between a reflected energy and an energy transmission distance. Next, the processing unit may perform the arithmetic operations upon the obtained detection results according to a position of each IR sensor (or a position of each IR LED) and the distances between each IR sensor and the finger OB, thereby calculating a plurality of coordinates (e.g. a position PH (XH, YH, ZH) shown in
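The arithmetic operations of step 230 can be illustrated with standard trilateration over the three sensor positions P1 (0, 0, 0), P2 (X0, 0, 0) and P3 (0, Y0, 0). This is a sketch under the assumption that the three distances have already been derived from the reflected energies; the function name is hypothetical:

```python
import math

def locate_finger(d1, d2, d3, x0, y0):
    """Trilaterate the hovering object's position (x, y, z) from its
    distances d1, d2, d3 to sensors at (0,0,0), (x0,0,0), (0,y0,0).

    Subtracting the sphere equations pairwise eliminates z (and y or x),
    giving closed-form expressions for x and y; z then follows from d1.
    """
    x = (x0**2 + d1**2 - d2**2) / (2 * x0)
    y = (y0**2 + d1**2 - d3**2) / (2 * y0)
    z_sq = d1**2 - x**2 - y**2
    z = math.sqrt(max(z_sq, 0.0))  # clamp small negatives caused by noise
    return (x, y, z)
```

Repeating this per scan cycle yields the sequence of coordinates from which the motion information is derived.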
As the electronic apparatus 100 is capable of determining a position of the finger OB, the electronic apparatus 100 may track the motion of the finger OB over time. In step 240, the processing unit may recognize a corresponding non-contact gesture according to the motion information of the finger OB. In this embodiment, the processing unit may determine at least one of a direction of the movement and a distance of the movement of the finger OB in the predefined space coordinate system according to a relationship between the coordinates/positions of the finger OB and time.
For example, when determining that the finger OB moves from the position PH (XH, YH, ZH) to a position PH′ (XH+k, YH, ZH) at two adjacent points in time, the processing unit may recognize the non-contact gesture corresponding to the motion information of the finger OB as a shift gesture. To put it another way, as a motion vector of the finger OB in the predefined coordinate system equals (k, 0, 0) (which is parallel to the display screen 102), the motion information of the finger OB may be recognized as a left-to-right shift/swipe gesture. In one implementation, the horizontal shift movement of the finger OB may include a slight vertical shift, resulting in a motion vector (k, 0, Δz) of the finger OB. It should be noted that, as long as the value Δz is smaller than a predetermined offset (i.e. the direction of the movement of the finger OB is substantially parallel to the display screen 102), the processing unit may still recognize the motion information of the finger OB as a shift gesture. In another implementation, the processing unit may recognize the motion information of the finger OB as a shift gesture only if a horizontal shift distance (e.g. the aforementioned shift distance k) of the finger OB is greater than a predetermined distance.
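The shift-gesture rules above (vertical offset below a tolerance, horizontal travel above a minimum) can be sketched as a small classifier. The threshold values and return labels are illustrative assumptions, not values from the disclosure:

```python
def recognize_shift(p_prev, p_curr, max_z_offset=0.5, min_shift=1.0):
    """Classify movement between two adjacent samples as a shift/swipe.

    The movement counts as a shift only when it is substantially
    parallel to the screen (|dz| < max_z_offset) and its horizontal
    extent exceeds min_shift. Thresholds are illustrative placeholders.
    """
    dx = p_curr[0] - p_prev[0]
    dy = p_curr[1] - p_prev[1]
    dz = p_curr[2] - p_prev[2]
    if abs(dz) >= max_z_offset:
        return None  # not parallel enough to the display screen
    if (dx**2 + dy**2) ** 0.5 <= min_shift:
        return None  # horizontal travel below the predetermined distance
    if abs(dx) >= abs(dy):
        return "right-shift" if dx > 0 else "left-shift"
    return "down-shift" if dy > 0 else "up-shift"
```

A motion vector (k, 0, Δz) with small Δz thus maps to a left-to-right shift gesture, matching the example in the text.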
In step 250, the processing unit may enable the electronic apparatus 100 to perform a specific function according to the recognized non-contact gesture. In this embodiment, an application shortcut icon AP of a picture taking function is displayed on the display screen 102, wherein a position of the application shortcut icon AP on the display screen 102 may be denoted by PA (XA, YA, 0). When the finger OB enters a sensing area of the electronic apparatus 100 (e.g. the finger OB enters a space above the display screen 102), the processing unit may start to track the motion of the finger OB (e.g. a cursor corresponding to the motion of the finger OB may be displayed on the display screen 102). When the processing unit detects that the finger OB stays above the application shortcut icon AP (e.g. a position (XA, YA, ZH)) over a predetermined period of time (i.e. the finger OB is stationary over the predetermined period of time, or a period of time in which the distance of the movement of the finger OB is substantially zero is greater than the predetermined period of time), the processing unit may recognize a hold gesture and enable the electronic apparatus 100 to activate the picture taking function. In other words, the user may enable the electronic apparatus 100 to activate the picture taking function without touching the application shortcut icon AP displayed on the display screen 102.
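Detecting the hold gesture amounts to a dwell test on the tracked positions. The sketch below assumes timestamped samples and illustrative threshold values; all names are placeholders rather than the disclosed implementation:

```python
def detect_hold(samples, icon_xy, radius=0.8, hold_time=1.0):
    """Detect a hold gesture above an icon from timestamped positions.

    samples: list of (t, x, y, z) tuples. The gesture fires when
    consecutive samples stay within `radius` of the icon position (in
    the screen plane) for at least `hold_time` seconds. Thresholds are
    illustrative assumptions.
    """
    start = None
    for t, x, y, _z in samples:
        near = (x - icon_xy[0])**2 + (y - icon_xy[1])**2 <= radius**2
        if near:
            if start is None:
                start = t  # dwell begins
            if t - start >= hold_time:
                return True  # stationary over the predetermined period
        else:
            start = None  # finger moved away; reset the dwell timer
    return False
```

When `detect_hold` fires over the shortcut icon AP, the processing unit would activate the corresponding function, here the picture taking function.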
Please note that the above is for illustrative purposes only, and is not meant to be a limitation of the present invention. In one implementation, the processing unit may obtain a motion vector directly according to the relationship between the coordinates of the finger OB and time, thereby recognizing a corresponding non-contact gesture. In another implementation, the processing unit may generate an image of the motion of the finger OB according to the motion information, thereby recognizing a corresponding non-contact gesture according to the image. In yet another implementation, the display screen 102 may be a touch panel. Hence, the user may control the electronic apparatus 100 in a touch manner together with a non-contact manner. Additionally, an IR LED and a corresponding IR sensor may not be adjacent to each other. The number of the IR LEDs and the number of the corresponding IR sensors may not be the same. For example, as long as the IR LEDs IL1-IL3 are still activated alternately to emit signals, it is feasible to install only one IR sensor in the electronic apparatus 100. Further, the non-contact motion information of the finger OB may trigger a variety of specific functions.
Please refer to
Additionally, the electronic apparatus 100 may perform a non-contact picture taking function. For example, after using a hold gesture to enable the electronic apparatus 100 to perform the non-contact focus function, the user may tap the display screen 102 once to enable the electronic apparatus 100 to perform a picture taking function. Please refer to
In one implementation, the processing unit may determine whether a displacement of the finger OB is greater than the predetermined distance dS/dR according to a reflected energy of a reflected signal. Assume that a reflected energy corresponding to the finger OB at a first position (e.g. the position PH (XH, YH, ZH)) is a first sensing count. When the finger OB moves toward the display screen 102 to a second position (e.g. the position PH″ (XH, YH, ZH-d1)) such that a difference between the first sensing count and a reflected energy corresponding to the finger OB at the second position (e.g. a second sensing count) is greater than a predetermined ratio of the first sensing count, the processing unit may determine that the displacement of the finger OB is greater than a predetermined distance (e.g. the predetermined distance dS). Similarly, when the finger OB moves away from the display screen 102 (e.g. moving from the position PH″ (XH, YH, ZH-d1) to the position PK (XH, YH, ZH-d2)), the processing unit may determine whether the finger OB moves over another predetermined distance according to another predetermined ratio. For example, when the processing unit determines that a difference between a sensing count corresponding to the finger OB at the position PH″ (XH, YH, ZH-d1) and a sensing count corresponding to the finger OB at the position PK (XH, YH, ZH-d2) is greater than the other predetermined ratio of the sensing count at the position PH″ (XH, YH, ZH-d1), the processing unit may determine that the finger OB moves over the predetermined distance dR. It should be noted that the two predetermined ratios may be set based on actual designs/requirements, and may be the same or different.
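The sensing-count comparison above reduces to a relative-change test: the finger is deemed to have moved over the predetermined distance when the sensing count changes by more than a fixed fraction of its previous value. A minimal sketch, with an assumed ratio value:

```python
def moved_over_distance(count_before, count_after, ratio=0.3):
    """Decide whether the finger moved over a predetermined distance by
    comparing reflected-energy sensing counts, rather than computing an
    absolute displacement.

    ratio: stand-in for the predetermined ratio; 0.3 is an illustrative
    assumption, not a disclosed value.
    """
    # Movement toward the screen raises the count; movement away lowers
    # it. Either way, a large enough relative change implies the finger
    # travelled over the corresponding predetermined distance.
    return abs(count_after - count_before) > ratio * count_before
```

Separate ratio values could be passed for the approach (dS) and recede (dR) checks, matching the possibly different predetermined ratios in the text.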
In view of the above, the user may activate both the focus function (according to different environments) and the picture taking function without touching the electronic apparatus 100, thus avoiding hand shaking and avoiding touching the display screen 102 with oily hands.
Please refer to
In a case where the electronic apparatus 100 has the ability to detect a distant object, everyone (including a photographer) may be photographed based on the aforementioned non-contact picture taking mechanism. In one implementation, reflected energies of the IR LEDs IL1-IL3 shown in
Please refer to
When the user wants to search another area, the user's finger OB may move away from the display screen 102 first (or move away from the display screen 102 over a predetermined distance) to enable the display screen 102 to display the previous image. In other words, when the processing unit determines that the finger OB moves over the predetermined distance in a vertical direction away from the display screen 102 (i.e. the finger OB moves away from the display screen 102), the processing unit may recognize the motion information of the finger OB as a receding gesture, thereby enabling the electronic apparatus 100 to zoom out an image currently displayed on the display screen 102.
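The approaching/receding behavior can be sketched as a simple mapping from vertical finger motion to zoom commands. The threshold and labels are illustrative assumptions:

```python
def zoom_action(z_prev, z_curr, min_dz=0.6):
    """Map vertical finger motion to zoom commands: moving toward the
    display screen (z decreasing) over min_dz is an approaching gesture
    that zooms in; moving away over min_dz is a receding gesture that
    zooms out. min_dz is an assumed predetermined distance.
    """
    dz = z_curr - z_prev
    if dz <= -min_dz:
        return "zoom-in"   # approaching gesture
    if dz >= min_dz:
        return "zoom-out"  # receding gesture
    return None            # vertical travel too small to trigger a zoom
```

Applied per tracking interval, this lets the user zoom in on an area of interest and back out again without touching the screen.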
In this implementation, the user's hand or finger need not touch the display screen 102. Hence, the image displayed on the display screen 102 will not be hidden by the hand(s) during a zoom-in/zoom-out operation. Additionally, when the user wants to tap a landmark M in the area of interest A to obtain related information, the user may unintentionally tap icons near the landmark M if an image of the landmark M is too small. The user may use the aforementioned approaching gesture to zoom in on the area of interest A so that the landmark M may be selected (e.g. using a single tap gesture) correctly.
Please refer to
Further, when the user wants to stop turning pages, the user may tap the display screen 102 twice to enable the electronic apparatus 100 to perform a page-turning stopping function. Please refer to
Please note that the aforementioned correspondence between the non-contact gesture and the specific function performed by the electronic apparatus is for illustrative purposes only, and is not meant to be a limitation of the present invention. For example, the focus function shown in
Based on the motion information of the non-contact object, a combination gesture formed by a plurality of non-contact gestures may be recognized to enable the electronic apparatus to perform a specific function. Please refer to
As shown in
In this implementation, a start point of the specific range S is the position D1, which corresponds to the position PD (Xc, Yc, Zc+r) (where the finger OB moves toward and stays), and an end point of the specific range S is the position D2, which corresponds to the position PE (Xc+p, Yc+q, Zc+r) (where the finger OB stays finally), wherein the specific range S may be defined as a range corresponding to a line connecting the start point and the end point mentioned above. For example, the specific range S may be defined as a rectangular range corresponding to a diagonal line connecting the start point and the end point. In an alternative design, the position D1 and the position D2 may be located at the same row or the same column. Hence, the selected specific range may be a line connecting the start point and the end point.
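The range construction described above can be sketched as follows, using screen-plane coordinates for the start point D1 and end point D2; the function name and tuple encoding are illustrative:

```python
def select_range(start, end):
    """Build the selected range from drag start point D1 and hold end
    point D2: a rectangle whose diagonal connects the two points,
    degenerating to a line when they share a row or a column.
    """
    (x1, y1), (x2, y2) = start, end
    left, right = min(x1, x2), max(x1, x2)
    top, bottom = min(y1, y2), max(y1, y2)
    if left == right or top == bottom:
        # D1 and D2 on the same row/column: the range is a line segment.
        return ("line", (left, top), (right, bottom))
    return ("rect", (left, top), (right, bottom))
```

The returned corners correspond to the diagonal between D1 and D2, i.e. the offsets (p, q) in PE relative to PD.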
In an alternative design, the processing unit of the electronic apparatus 100 may complete a range selection operation and a copy operation. In other words, the processing unit may select and copy the specific range S according to the combination gesture formed by the drag gesture and the hold gesture. In another alternative design, the combination gesture formed by the drag gesture and the hold gesture may be replaced by a combination gesture formed by two consecutive non-contact gestures, wherein a time interval between the two consecutive non-contact gestures may be shorter than a predetermined period of time. For example, the combination gesture formed by the drag gesture and the hold gesture may be replaced by a combination gesture formed by a drag gesture and a receding gesture, a combination gesture formed by a drag gesture and a single tap gesture, or a combination gesture formed by a single tap gesture and a shift gesture.
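Recognizing a combination gesture amounts to scanning the stream of recognized gestures for two consecutive entries within the predetermined period. A minimal sketch with an assumed gap value:

```python
def match_combination(events, first, second, max_gap=0.8):
    """Scan timestamped gesture events for a combination gesture:
    `first` immediately followed by `second`, with a time interval
    shorter than max_gap seconds (the predetermined period; the value
    0.8 is an illustrative assumption).

    events: list of (t, gesture_name) tuples in time order.
    """
    for (t1, g1), (t2, g2) in zip(events, events[1:]):
        if g1 == first and g2 == second and (t2 - t1) < max_gap:
            return True
    return False
```

For example, a drag gesture followed closely by a hold gesture would trigger the range selection (or range selection and copy) operation, while the same pair separated by a long pause would not.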
In another alternative design, the user may use a combination gesture formed by a single tap, a shift gesture and a specific gesture to activate the aforementioned range selection function and/or range selection and copy operation, wherein a time interval between the single tap gesture and the shift gesture may be shorter than a predetermined period of time, and a time interval between the shift gesture and the specific gesture may be shorter than another predetermined period of time. In other words, the combination of the single tap gesture and the shift gesture may replace the drag gesture shown in
Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.
Number | Date | Country | Kind
---|---|---|---
103100268 | Jan 2014 | TW | national
This application claims the benefit of U.S. provisional application No. 61/749,398, filed on Jan. 7, 2013, the contents of which are incorporated herein by reference.
Number | Date | Country
---|---|---
61749398 | Jan 2013 | US