The present application claims priority under 35 U.S.C. § 119(a) to Russian Patent Application No. 2017106851, filed Mar. 2, 2017, and Korean Patent Application No. 10-2018-0009832, filed Jan. 26, 2018, the entire disclosure of each of which is hereby incorporated by reference.
The present disclosure relates generally to gesture recognition. More particularly, the present disclosure relates to a device and a method for recognizing a gesture using a Radio Frequency (RF) sensor.
With the development of electronic devices, many types of electronic devices may be controlled by gestures of a user. In order to generate a control command for an electronic device, gesture recognition is needed. Various technologies for recognizing gestures are used in electronic devices.
Systems employing gesture control in electronic devices have been described, for example, in the following publications:
International Patent Application Publication WO 2016/053459 A1, published on Apr. 7, 2016 and titled “TENDON BASED MOTION AND GESTURE INPUT FROM A WEARABLE DEVICE,” provides a device that detects a user's motion and gesture input by a piezo pressure sensor array or a light sensor array. However, the sensor arrays require being tightly affixed to a user's wrist to provide commands to the device or to other devices.
Patent Application Publication US2015/0370326 A1, published on Dec. 24, 2015 and titled “SYSTEMS, ARTICLES, AND METHODS FOR WEARABLE HUMAN-ELECTRONICS INTERFACE DEVICES,” provides electronic bands that employ multiple microelectromechanical systems (“MEMS”) microphones to detect and distinguish between different types of tapping gestures. The MEMS vibration (sound) sensors must be distributed around the wrist in a wristband. However, placement of the sensors only on the wrist may limit the information available for detection.
Patent Application Publication CN105022471 A, published on Nov. 4, 2015 and titled “DEVICE AND METHOD FOR CARRYING OUT GESTURE RECOGNITION BASED ON PRESSURE SENSOR ARRAY,” provides a device for carrying out gesture recognition based on a pressure sensor array, which must be placed around the wrist in a wristband. However, placement of the sensors only on the wrist may limit the information available for detection.
Patent Application Publication US2016/041617, published on Feb. 11, 2016 and titled “RADAR-BASED GESTURE RECOGNITION,” describes techniques and devices for radar-based gesture recognition. These techniques and devices may recognize gestures made in three dimensions, such as in-the-air gestures. These in-the-air gestures can be made from varying distances, such as from a person sitting on a couch to control a television, a person standing in a kitchen to control an oven or refrigerator, or at several millimeters from a desktop computer's display. The techniques and devices are capable of providing a radar field that can sense gestures. However, since the radar sensor is placed at some distance from the user, the user must take care to keep his or her hands within the radar field. If the sensor is placed on one hand, then the other hand is occupied with gesture control.
Furthermore, there appear to be other problems and disadvantages in applying the systems and methods of the above-described publications such as:
A signal received in detecting a gesture may be unstable due to blind periods when the detecting device lacks permanent contact with the user's body.
The user may be required to pay attention to a device's screen.
If a user wears a device on one hand, the other hand may be needed to control it, so both hands are occupied with device control.
Piezo-based pressure sensors can require the device to be tightly affixed to the wearer.
Installation of external sensors outside the devices may be required.
Permanent contact with the skin may be required.
Embodiments as described herein have been made to address at least one of the problems and the disadvantages described above, and to provide at least one of the advantages described below.
To address the above-discussed problems, it is a primary aspect of the present disclosure to provide methods and devices for recognizing a gesture using Radio Frequency (RF) sensors.
Certain embodiments of this disclosure provide a method and an electronic device for recognizing a gesture by switching an RF frequency band according to a required resolution or a user's movement.
Certain embodiments of this disclosure provide a method and an electronic device for generating reference data for an RF signal per gesture, recognizing a gesture according to changes of the RF signal, and controlling a function of the electronic device based on the recognized gesture.
Accordingly, certain embodiments of this disclosure provide a method for recognizing gestures using a radio-frequency (RF) sensor, comprising steps of: successively generating sets of RF signals by at least one transmitter and successively emitting the sets of RF signals into tissues of a user body part via at least one antenna; receiving the sets of RF signals reflected from and distorted by the tissues of the user body part by at least one receiver via the at least one antenna; separating each received RF signal in each set of RF signals into a first RF signal and a second RF signal by the at least one receiver, wherein the first RF signal represents amplitude and the second RF signal represents phase shift; converting each of the first RF signals and the second RF signals in each set of RF signals into digital signals by at least one analog-to-digital converter (ADC) to obtain sets of digital signals, wherein each set of digital signals is obtained from a corresponding set of RF signals; and processing the sets of digital signals in a central processing unit (CPU) by an artificial neural network (“ANN”) using reference data sets for recognizing gestures, wherein each reference data set is associated with a particular gesture and obtained by a learning of the ANN.
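The claimed sequence of steps can be sketched as a single measurement loop. This is an illustrative sketch only, and every helper name (`emit_set`, `receive_set`, `separate`, `digitize`, `ann`) is a hypothetical stand-in for the transmitter, receiver, ADC, and ANN described above.

```python
def recognize(emit_set, receive_set, separate, digitize, ann):
    """One "single measurement" cycle of the claimed method (sketch only)."""
    emit_set()                         # transmitter emits a set of RF signals
    reflected = receive_set()          # receiver captures the reflected, distorted set
    digital_set = []
    for rf in reflected:
        amplitude, phase = separate(rf)              # first / second RF signal
        digital_set.append((digitize(amplitude),     # ADC conversion of each
                            digitize(phase)))        # separated component
    return ann(digital_set)            # ANN compares against reference data sets
```

Any concrete transmitter, receiver, and network can be slotted into the same loop; the structure only reflects the ordering of the claimed steps.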
According to certain embodiments, the learning of the ANN is performed for each gesture of a plurality of gestures and may comprise steps of: generating the set of RF signals by the at least one transmitter and emitting the set of RF signals into the tissues of the user body part via the at least one antenna, when the user body part performs a gesture; receiving the set of RF signals reflected from and distorted by the tissues of the user body part by the at least one receiver via the at least one antenna; separating each received RF signal into the first RF signal and the second RF signal by the at least one receiver, wherein the first RF signal represents amplitude and the second RF signal represents phase shift; converting each of the first RF signals and the second RF signals into the digital signals by the at least one ADC to obtain sets of digital signals; processing the sets of digital signals by the ANN to obtain the reference data set associated with the gesture; and storing the reference data set in a memory comprised in the CPU.
The step of generating the sets of RF signals may, in some embodiments of this disclosure, comprise generating the RF signals having different frequencies in the set.
The step of generating the sets of RF signals may, in certain non-limiting examples provided herein, comprise generating the sets of RF signals within a low frequency band and a high frequency band.
According to certain embodiments of this disclosure, the low frequency band occupies frequencies of about 1-3 GHz, and the high frequency band occupies frequencies of about 3-10 GHz.
The method according to certain embodiments may further comprise steps of: when the digital signals are obtained from the sets of RF signals generated within the low frequency band, the CPU processes the sets of digital signals by using the ANN and the reference data sets for recognizing a gesture, and the ANN outputs a non-zero value before the gesture is completely recognized: determining that the user body part performs the gesture; and switching the at least one transmitter to generate the RF signals within the high frequency band.
The method according to certain embodiments may further comprise steps of: measuring, by a movement detector, a vibration level of said user body part; comparing, by the CPU using the ANN, the vibration level with a threshold value, wherein the threshold value is obtained by the learning of the ANN; if the vibration level exceeds the threshold value: when the sets of RF signals are generated within the high frequency band, switching the at least one transmitter to generate the RF signals within the low frequency band, or when the sets of RF signals are generated within the low frequency band and it is determined that the user body part performs a gesture, continuing generation of the RF signals within the low frequency band.
The learning of the ANN may, according to certain embodiments, further comprise steps of: measuring, by the movement detector, vibration levels of said user body part, when said user body part performs the gestures; selecting a maximum vibration level among the measured vibration levels; assigning the maximum vibration level as the threshold value; and storing the threshold value in a memory comprised in the CPU.
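The threshold learning above, together with the vibration-based band-selection rule of the preceding paragraph, can be sketched as follows. This is an illustrative sketch, not the claimed implementation; the band labels are hypothetical stand-ins for the bands of about 1-3 GHz and about 3-10 GHz.

```python
LOW, HIGH = "low", "high"   # stand-ins for the about 1-3 GHz and 3-10 GHz bands

def learn_vibration_threshold(measured_levels):
    """Learning step: the threshold is the maximum vibration level
    observed while the user body part performs the gestures."""
    return max(measured_levels)

def select_band(current_band, vibration, threshold, gesture_in_progress):
    """Band selection under vibration: if the vibration level exceeds the
    threshold, fall back to (or continue in) the low frequency band."""
    if vibration <= threshold:
        return current_band      # vibration acceptable: keep the current band
    if current_band == HIGH:
        return LOW               # excessive vibration: switch to the low band
    # already in the low band: continue generating low-band RF signals
    # (the disclosure specifies this case when a gesture is in progress)
    return LOW
```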
The method according to certain embodiments may further comprise steps of: if one antenna is used for a pair of the transmitter and the receiver: switching the antenna to the transmitter, when the RF signal is emitted; and switching the antenna to the receiver, when the RF signal reflected from and distorted by the tissues of the user body part is received.
Accordingly, certain embodiments of this disclosure provide a device for recognizing gestures using a radio-frequency (RF) sensor, comprising: at least one antenna; at least one transmitter configured to successively generate sets of RF signals and emit the sets of RF signals into tissues of a user body part via the at least one antenna; at least one receiver configured to receive the sets of RF signals reflected from and distorted by the tissues of the user body part via the at least one antenna, and to separate each received RF signal in each set of RF signals into a first RF signal and a second RF signal, wherein the first RF signal represents amplitude and the second RF signal represents phase shift; at least one analog-to-digital converter (ADC) configured to convert each of the first RF signals and the second RF signals in each set of RF signals into digital signals to obtain sets of digital signals, wherein each set of digital signals is obtained from a corresponding set of RF signals; and a central processing unit (CPU) comprising a memory, the CPU being configured to process the sets of digital signals by using an artificial neural network (“ANN”) and reference data sets for recognizing gestures, wherein each reference data set is associated with a particular gesture and obtained by a learning of the ANN.
The device may, according to certain embodiments, be further configured to perform the learning of the ANN for each gesture of a plurality of gestures, wherein: the at least one transmitter generates the set of RF signals and emits the set of RF signals into the tissues of the user body part via the at least one antenna, when the user body part performs a gesture; the at least one receiver receives the set of RF signals reflected from and distorted by the tissues of the user body part via the at least one antenna and separates each received RF signal into the first RF signal and the second RF signal, wherein the first RF signal represents amplitude and the second RF signal represents phase shift; the at least one ADC converts each of the first RF signals and the second RF signals into the digital signals to obtain sets of digital signals; and the CPU is configured to: process the sets of digital signals by the ANN to obtain the reference data set associated with the gesture; and store the reference data set in the memory.
The at least one transmitter according to certain embodiments may be configured to generate the RF signals having different frequencies in the set.
The at least one transmitter may, according to certain embodiments, be configured to generate the sets of RF signals within a low frequency band and a high frequency band.
According to certain non-limiting examples provided herein, a low frequency band occupies frequencies of about 1-3 GHz, and a high frequency band occupies frequencies of about 3-10 GHz.
According to certain embodiments, when the digital signals are obtained from the sets of RF signals generated within the low frequency band and the CPU processes the sets of digital signals by using the ANN and the reference data sets for recognizing a gesture, if the ANN outputs a non-zero value before the gesture is completely recognized, the CPU is configured to determine that the user body part performs the gesture, and to switch the at least one transmitter to generate the RF signals within the high frequency band.
The device according to certain embodiments may further comprise a movement detector configured to measure a vibration level of said user body part, wherein the CPU may be configured to: by using the ANN, compare the vibration level with a threshold value stored in the memory, wherein the threshold value is obtained by the learning of the ANN; if the vibration level exceeds the threshold value: when the at least one transmitter generates the sets of RF signals within the high frequency band, switch the at least one transmitter to generate the RF signals within the low frequency band, or when the at least one transmitter generates the sets of RF signals within the low frequency band and the CPU determines that the user body part performs a gesture, continue generation of the RF signals within the low frequency band.
According to certain embodiments of this disclosure, as part of the ANN's learning process, the movement detector measures vibration levels of said user body part while said user body part performs the gestures; the CPU selects a maximum vibration level among the measured vibration levels, assigns the maximum vibration level as the threshold value, and stores the threshold value in the memory.
The device may, according to certain embodiments, further comprise, if one antenna is used for a pair of the transmitter and the receiver, a switch configured to: switch the antenna to the transmitter, when the RF signal is emitted, and switch the antenna to the receiver, when the RF signal reflected from and distorted by the tissues of the user body part is received, wherein the CPU is configured to control the switch.
In the non-limiting examples described herein, the movement detector is at least one of an accelerometer, a magnetic sensor, a barometer, or a 3D positioner.
The device for recognizing gestures using a radio-frequency (RF) sensor may, in some embodiments, be embedded into a wearable device.
The CPU may be a CPU of the wearable device into which the device for recognizing gestures using a radio-frequency (RF) sensor is embedded.
The movement detector may be a movement detector of the wearable device into which the device for recognizing gestures using a radio-frequency (RF) sensor is embedded.
The wearable device into which said device is embedded is, in certain embodiments, at least one of a smartwatch or a headphone.
The device for recognizing gestures using a radio-frequency (RF) sensor may further comprise a reflector arranged on a side of the user body part opposite to a side of the user body part into which the RF signals are emitted, and configured to reflect the RF signals emitted into the tissues of the user body part.
According to certain embodiments of the present disclosure, a method for operating an electronic device includes detecting a change of a Radio Frequency (RF) signal emitted into a user body using an RF sensor, determining a gesture corresponding to the RF signal based on reference data per gesture, and executing a function of the electronic device corresponding to the determined gesture.
According to some embodiments of the present disclosure, an electronic device includes an RF sensor and at least one processor functionally coupled with the RF sensor. The at least one processor detects a change of RF signals emitted into a user body using the RF sensor, determines a gesture corresponding to the RF signals based on reference data per gesture, and executes a function of the electronic device corresponding to the determined gesture.
Other aspects, advantages, and salient features of the present disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of this disclosure.
Before undertaking the DETAILED DESCRIPTION below, it may be advantageous to set forth definitions of certain words and phrases used throughout this patent document: the terms “include” and “comprise,” as well as derivatives thereof, mean inclusion without limitation; the term “or,” is inclusive, meaning and/or; the phrases “associated with” and “associated therewith,” as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, or the like; and the term “controller” means any device, system or part thereof that controls at least one operation, such a device may be implemented in hardware, firmware or software, or some combination of at least two of the same. It should be noted that the functionality associated with any particular controller may be centralized or distributed, whether locally or remotely.
Moreover, various functions described below can be implemented or supported by one or more computer programs, each of which is formed from computer readable program code and embodied in a computer readable medium. The terms “application” and “program” refer to one or more computer programs, software components, sets of instructions, procedures, functions, objects, classes, instances, related data, or a portion thereof adapted for implementation in a suitable computer readable program code. The phrase “computer readable program code” includes any type of computer code, including source code, object code, and executable code. The phrase “computer readable medium” includes any type of medium capable of being accessed by a computer, such as read only memory (ROM), random access memory (RAM), a hard disk drive, a compact disc (CD), a digital video disc (DVD), or any other type of memory. A “non-transitory” computer readable medium excludes wired, wireless, optical, or other communication links that transport transitory electrical or other signals. A non-transitory computer readable medium includes media where data can be permanently stored and media where data can be stored and later overwritten, such as a rewritable optical disc or an erasable memory device.
Definitions for certain words and phrases are provided throughout this patent document, and those of ordinary skill in the art should understand that in many, if not most, instances such definitions apply to prior, as well as future, uses of such defined words and phrases.
The above and other aspects, features, and advantages of certain exemplary embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
Throughout the drawings, like reference numerals will be understood to refer to like parts, components and structures.
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the present disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein can be made without departing from the scope of the present disclosure. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the present disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure is provided for illustration purpose only.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise.
It will be further understood that the terms “comprises,” “comprising,” “includes,” and/or “including,” when used herein, specify the presence of stated features, integers, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, operations, elements, components, and/or groups thereof.
The terms used herein are merely for the purpose of describing particular embodiments and are not intended to limit the scope of other embodiments. As used herein, singular forms may include plural forms as well unless the context clearly indicates otherwise. Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meaning as those commonly understood by a person skilled in the art to which the present disclosure pertains. Such terms as those defined in a generally used dictionary may be interpreted to have the meanings equal to the contextual meanings in the relevant field of art, and are not to be interpreted to have ideal or excessively formal meanings unless clearly defined in the present disclosure. In some cases, even the term defined in the present disclosure should not be interpreted to exclude embodiments of the present disclosure.
In various embodiments of the present disclosure to be described below, a hardware approach will be described as an example. However, since the various embodiments of the present disclosure include a technology using both hardware and software, the present disclosure does not exclude a software-based approach.
Hereinafter, various embodiments of the present disclosure are explained in detail by referring to the attached drawings. The present disclosure relates to a method and an electronic device for recognizing a user gesture in the electronic device. Specifically, the present disclosure provides a technique for recognizing a gesture using a Radio Frequency (RF) sensor and controlling a function of the electronic device based on the recognized gesture in order to improve user convenience in the electronic device.
In the following, terms indicating control information and terms indicating components (e.g., the RF sensor) of the device are mentioned for the sake of explanations. Accordingly, the present disclosure is not limited to the terms to be described, and can use other terms having technically identical meaning.
Also, various embodiments of the present disclosure can be easily modified and applied to any electronic device of various types including an RF sensor.
Certain embodiments of the systems and methods disclosed herein recognize gestures and generate control commands based on the recognized gestures for controlling various devices. Embodiments can be implemented as a separate wearable device that recognizes gestures as described in the present disclosure and sends control commands via wireless communication to another device. The wireless communication can be realized according to any standard, for example, Bluetooth, Wi-Fi, GSM, ZigBee, ISM, etc. Furthermore, some embodiments can be embedded into various wearable devices to control the wearable devices themselves. The wearable devices may be smartwatches, headphones, and other wearable devices which are to be controlled and/or control the devices coupled with the wearable devices.
An electronic device according to various embodiments of the present disclosure can include, for example, a wearable device. The wearable device can include at least one of an accessory type (e.g., a watch, a ring, a bracelet, an ankle bracelet, a necklace, glasses, a contact lens, or a Head-Mounted-Device (HMD)), a fabric or clothing embedded type (e.g., electronic garments), a body attachable type (e.g., a skin pad or a tattoo), and an implantable circuit.
Referring to the non-limiting example of
In some embodiments, a transmitter (Tx) 122-1 generates RF signals and emits the generated RF signals, via the antenna 126-1, into the user's body part on which the gesture recognizing device is positioned. In
According to certain embodiments of the present disclosure, the gesture recognizing device can optionally include a movement (vibration) detector (not shown in
Referring to the non-limiting example of
Embodiments of the present disclosure may be applied to auxiliary car function control (opening the boot, locking/unlocking, etc.), smart home control, smart illumination control, identification (by means of a gesture sequence), finger/palm gesture control, tracking a musician's playing tempo, keyboard pressure rate, neck movement, palm positioning in 3D, and other gesture recognition tasks. Furthermore, embodiments according to the instant disclosure may present the following advantages. For example, when the device is placed on the back side of a palm, a user can hold real objects in his or her hands. Accordingly, users of certain embodiments of this disclosure may be able to exercise control with “dirty” hands (while cooking, etc.). Additionally, embodiments according to this disclosure may obviate the need for complex calculations, so powerful computers may not be needed.
Hereinafter, embodiments of the present disclosure for the gesture recognizing device using the RF sensor can be provided with reference to the non-limiting examples of
Referring to the non-limiting examples of
In certain embodiments, the memory 112 is included in the CPU 110. However, the memory 112 can be implemented as a separate unit in the gesture recognizing device. The memory 112 can also include any type of a computer-readable storage device and/or a storage disk. The CPU 110 including the memory 112 can configure a controller. The controller can control operations of the gesture recognizing device.
In some embodiments, antennas 126 are connected to the transmitters 122 and the receivers 124, and transmit and receive RF signals. As shown in
In at least one embodiment, in which the pair of the transmitter 122 and the receiver 124 is connected to one antenna 126, the gesture recognizing device can include switches 440. The number of the switches 440 may correspond to the number of the pairs of the transmitters 122 and the receivers 124. Each switch 440 switches the antenna 126 connected thereto. Each switch 440 is controlled by the CPU 110. To emit the RF signals, the switch 440 switches the antenna 126 to the transmitter 122. To receive the RF signals, the switch 440 switches the antenna 126 to the receiver 124.
In some embodiments, the transmitter 122 successively generates the RF signals and emits the RF signals into tissues of a user body part via the antenna 126. The transmitters 122 are configured to operate in a low frequency band of about 1-3 GHz, or according to any other wireless communication standard in the mentioned band. The transmitters 122 are also configured to operate in a high frequency band of about 3-10 GHz, or according to any other wireless communication standard in the mentioned band. Hence, each transmitter 122 emits a set of single RF signals (frequency pulses) in the above-stated low or high frequency band. The transmitters 122 are configured to generate the RF signals having different frequencies in the set. All the RF signals having different frequencies in the set are generated so as to increase the frequency in a stepped manner. In other embodiments, the single RF signals can be generated in descending order or in any other order. Each of the sets of RF signals is processed as a “single measurement” because the sets of RF signals are processed in a shorter time than a typical duration of a gesture. In some embodiments, each set of RF signals is emitted periodically, and the period is long enough to process each set in the CPU 110. The CPU 110 controls the transmitters 122 to generate the RF signals within the low frequency band and the high frequency band, and the frequencies of the RF signals are controlled by the CPU 110 via frequency control lines (Freq. control) as shown in
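The stepped set of probe frequencies described above can be sketched as follows; the band edges and the five-frequency step count are illustrative assumptions, not values from the disclosure.

```python
def stepped_frequencies(start_hz, stop_hz, steps):
    """Return `steps` frequencies rising from start_hz to stop_hz in equal steps."""
    step = (stop_hz - start_hz) / (steps - 1)
    return [start_hz + i * step for i in range(steps)]

# a hypothetical 5-frequency set in the about 1-3 GHz low band;
# one "single measurement" emits each frequency in this set in turn
low_band_set = stepped_frequencies(1e9, 3e9, 5)
```

Generating the set in descending or arbitrary order, as the other embodiments permit, would only reorder this list.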
According to certain embodiments, the CPU 110 can control the operating frequency band based on at least one of a battery status of the gesture recognizing device, a running application, and content. For example, when the battery status of the gesture recognizing device falls below a certain level or the application or the content causing considerable battery consumption is running, the gesture recognizing device can switch the operating frequency band to the low frequency band. Also, when the battery status of the gesture recognizing device exceeds a certain level or accurate gesture recognition is required, the gesture recognizing device can switch the operating frequency band to the high frequency band.
The emitted RF signals are, in some embodiments, reflected from the tissues of the user body part. At the same time, the tissues of the user body part distort the RF signals. The distortion of a received RF signal manifests as attenuation (amplitude change) and phase shift of the RF signal. Each of the receivers 124 receives the RF signals reflected from and distorted by the tissues of the user body part. In addition, the receivers 124 separate each received RF signal in each set of the RF signals into a first RF signal and a second RF signal. The first RF signal represents the amplitude and the second RF signal represents the phase shift.
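The separation step can be sketched by modeling each received RF signal as a complex baseband sample; this modeling is an assumption for illustration, since the disclosure performs the separation in the analog receiver 124.

```python
import cmath

def separate(sample):
    """Split a received RF sample (modeled as a complex baseband value)
    into its amplitude (first RF signal) and phase shift (second RF signal)."""
    return abs(sample), cmath.phase(sample)
```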
The ADCs 128 are, in certain embodiments, connected to the receivers 124, respectively. The ADCs 128 convert the first RF signals and the second RF signals in each set of the RF signals into digital signals so as to acquire sets of digital signals fed to the CPU 110. Each set of the digital signals is obtained from a corresponding set of the RF signals.
According to certain embodiments, the CPU 110 controls the whole measurement process. The CPU 110 sends a command to generate the sets of the RF signals to the transmitters 122 and reads measurement results as the digital signals from the receivers 124. The CPU 110 implements an ANN stored in the memory 112 of the CPU 110. The CPU 110 is configured to process the sets of the digital signals using the ANN and reference data sets for the gesture recognition. The reference data sets are parameters of the ANN obtained during its learning for the gesture recognition. Each reference data set is associated with a particular gesture and is formed while the ANN learns to recognize the particular gesture. Further, the CPU 110 can determine that the user body part performs a gesture when the ANN outputs a non-zero value before the gesture is completely recognized. Such operations of the ANN as learning and testing are well known in the related art, and accordingly detailed descriptions thereof are omitted.
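The early-detection rule just described can be sketched as a small decision function; the band labels are hypothetical stand-ins for the low and high frequency bands controlled by the CPU 110.

```python
def next_band(current_band, ann_output):
    """Early-detection rule: a non-zero ANN output during low-band probing
    is treated as the start of a gesture, triggering the high band."""
    if current_band == "low" and ann_output != 0:
        return "high"            # gesture detected: switch for finer resolution
    return current_band          # otherwise keep probing in the current band
```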
According to other embodiments, the gesture recognizing device can optionally include a movement detector 430 as shown in
While using the gesture recognizing device, the user can perform certain actions such as walking, running, and driving a vehicle. An excessive vibration level of the user's body can affect the accuracy of the gesture recognition. In certain embodiments, such as shown in the non-limiting example of
The gesture recognizing device can, depending on embodiments, be a separate device or a device embedded into a wearable device such as a smartwatch, a headphone, or another wearable device, which is to be controlled or controls the device coupled with the wearable device. When the wearable device including the embedded gesture recognizing device includes a CPU and/or a movement detector, the CPU and/or the movement detector of the wearable device can be used as the CPU 110 and/or the movement detector 430 of the gesture recognizing device.
According to certain embodiments, to obtain the reference data sets associated with particular gestures, the learning of the ANN stored in the memory 112 of the CPU 110 is performed. When the user body part performs a particular gesture, in the ANN learning, each of the transmitters 122 generates and emits the set of the RF signals into the tissues of the user body part via the at least one antenna 126. In the non-limiting example of
According to certain embodiments, to acquire the threshold for the vibration level, the learning of the ANN is carried out. When the user body part conducts the gesture, the vibration level of the user body part is measured by the movement detector 430. The measured vibration level is sent to the CPU 110. The CPU 110 selects a maximum vibration level among the measured vibration levels, assigns the maximum vibration level as the threshold, and stores the threshold in the memory 112.
The gesture recognizing device can, in some embodiments, further include a reflector arranged on a side of the user body part opposite to a side of the user body part into which the RF signals are emitted. The reflector reflects the RF signals emitted into the tissues of the user body part and passing through the user body part, to increase intensity of the reflected RF signal received at the receiver 124. The reflector can be formed of a metal plate. The reflector is attached to a fixing means of the gesture recognizing device, which fixes the gesture recognizing device on the user body part. The fixing means can be any means adapted to fix the gesture recognizing device onto the user body part.
According to some embodiments, the gesture recognizing device can include the RF sensor including the at least one antenna 126, the at least one transmitter 122, the at least one receiver 124, and the at least one ADC 128, and the controller including the CPU 110. Although not depicted in
Referring to the non-limiting example of
In the non-limiting example of
According to some embodiments, at operation 505, the electronic device executes its function corresponding to the gesture. For example, the electronic device can execute a predefined function based on the recognized gesture. Herein, according to at least one embodiment, the electronic device can be the same wearable device as the gesture recognizing device. According to another embodiment, the electronic device can be a device separate from the gesture recognizing device, and can receive information about the recognized gesture over wired/wireless communication from the gesture recognizing device and control its function based on the received gesture information.
Referring to the non-limiting example of
According to certain embodiments, at operation 603, the at least one receiver 124 receives the sets of the RF signals reflected from and distorted by the tissues of the user body part via the at least one antenna 126.
According to certain embodiments, at operation 605, the at least one receiver 124 separates each received RF signal into a first RF signal and a second RF signal. In so doing, the first RF signal represents amplitude and the second RF signal represents phase shift.
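The amplitude/phase separation at operation 605 can be sketched as follows. The document does not state how the receiver performs the split, so the quadrature (I/Q) representation and the `separate` helper below are assumptions; the output pair matches the described first RF signal (amplitude) and second RF signal (phase shift).

```python
import cmath

def separate(received_iq, reference_phase=0.0):
    """Split one received RF sample into the two signals of operation 605.

    received_iq: complex I/Q sample after downconversion (an assumption --
    the document only states that the receiver outputs an amplitude signal
    and a phase-shift signal).
    reference_phase: phase of the emitted carrier, so the second output is
    a phase *shift* rather than an absolute phase.
    """
    first_rf = abs(received_iq)                             # amplitude
    second_rf = cmath.phase(received_iq) - reference_phase  # phase shift
    return first_rf, second_rf
```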
According to certain embodiments, at operation 607, the at least one ADC 128 converts each of the first RF signals and the second RF signals in each set of the RF signals into digital signals, in order to obtain sets of digital signals.
According to certain embodiments, at operation 609, the CPU 110 processes the sets of the digital signals using the ANN and reference data sets for gesture recognition. Each reference data set is associated with a particular gesture and obtained by learning of the ANN.
The method according to certain embodiments includes obtaining the digital signals from the sets of the RF signals generated in the low frequency band, processing, at the CPU, the digital signal sets using the ANN and the reference data sets for the gesture recognition, determining that the user's body part conducts the gesture when the ANN outputs a non-zero value before the gesture is completely recognized, and switching at least one transmitter to generate RF signals in the high frequency band.
According to certain embodiments, when the gesture recognizing device further includes the movement detector 430, the method can further include the following operations. In one additional operation, the movement detector 430 measures a vibration level of the user body part. The controller of the gesture recognizing device can control the movement detector 430 to measure the vibration level of the user body part. In another additional operation, the CPU 110 compares the vibration level with a threshold obtained through the learning of the ANN. In another additional operation, when the vibration level exceeds the threshold while the sets of the RF signals are generated within the high frequency band, the CPU 110 switches the at least one transmitter 122 to generate RF signals within the low frequency band. In another additional operation, when the vibration level exceeds the threshold, the sets of the RF signals are already generated within the low frequency band, and the user body part is determined to perform a gesture, the CPU 110 controls the at least one transmitter 122 to keep generating the RF signals in the low frequency band.
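The band-selection rules above, together with the low-to-high switch on early gesture detection, can be collected into one decision function. This is a sketch under stated assumptions: the document describes the individual switching rules but not a single combined policy, and the "stay in the current band" default is an inference.

```python
LOW_BAND, HIGH_BAND = "low", "high"

def next_band(current_band, vibration_level, threshold, gesture_detected):
    """Decide which frequency band the at least one transmitter 122 uses next.

    - Excessive vibration (level above the learned threshold): use the low
      band, whether that means switching down from the high band or keeping
      the low band while a gesture continues.
    - Gesture detected during low-band sensing, vibration acceptable: switch
      to the high band for the finer measurement.
    - Otherwise (assumed default): keep the current band.
    """
    if vibration_level > threshold:
        return LOW_BAND
    if current_band == LOW_BAND and gesture_detected:
        return HIGH_BAND
    return current_band
```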
According to certain embodiments, when the gesture recognizing device includes a single antenna 126 shared by the pair of the transmitter 122 and the receiver 124, and the switch 440 for switching the antenna 126 between the transmitter 122 and the receiver 124, the method can further include the following operations. In one additional operation, when the RF signal is emitted, the switch 440 switches the antenna 126 to the transmitter 122. In another additional operation, when the RF signal reflected from and distorted by the tissues of the user body part is received, the switch 440 switches the antenna 126 to the receiver 124.
Referring to the non-limiting example of
According to certain embodiments, at operation 703, the at least one receiver 124 receives the set of the RF signals reflected from and distorted by the tissues of the user body part via the at least one antenna 126.
According to certain embodiments, at operation 705, the at least one receiver 124 separates each received RF signal into a first RF signal and a second RF signal. The first RF signal represents amplitude and the second RF signal represents phase shift.
According to certain embodiments, at operation 707, to obtain a set of digital signals, the at least one ADC converts each of the first RF signals and the second RF signals into digital signals. The set of the digital signals is obtained from the set of the RF signals.
According to certain embodiments, at operation 709, the CPU 110 processes the set of the digital signals by the ANN to obtain a reference data set associated with the gesture. For example, the CPU 110 can obtain a reference data set for a particular gesture by processing the set of the digital signals acquired in operation 707 using the ANN.
According to certain embodiments, at operation 711, the CPU 110 stores the reference data set in the memory. That is, the CPU 110 stores the reference data set acquired in operation 709 in the memory. Next, based on the stored reference data set, the CPU 110 can determine the gesture for the reflected signal pattern.
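The learning flow of operations 701-711 — one reference data set per gesture, stored in the memory — can be sketched as a loop. The `train_fn` callable is a placeholder assumption, since the document deliberately leaves the ANN training procedure to the related art; the dictionary stands in for the memory 112.

```python
def learn_reference_sets(labeled_signal_sets, train_fn):
    """Build and 'store' one reference data set per gesture.

    labeled_signal_sets: {gesture_name: [digital_signal_set, ...]} --
        digital signal sets acquired while the user performs each gesture.
    train_fn: any routine that turns those sets into learned ANN parameters
        (the actual training procedure is not specified by the document).
    """
    memory = {}  # stands in for memory 112
    for gesture, signal_sets in labeled_signal_sets.items():
        # operations 709-711, repeated for each particular gesture
        memory[gesture] = train_fn(signal_sets)
    return memory
```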
Referring to the non-limiting example of
According to certain embodiments, at operation 803, the CPU 110 selects a maximum vibration level among the measured vibration levels. That is, the CPU 110 can determine the greatest vibration level from the measured vibration levels.
According to certain embodiments, at operation 805, the CPU 110 assigns the maximum vibration level as the threshold. The CPU 110 can assign the maximum vibration level selected in operation 803, as the threshold for determining to switch the frequency band.
According to certain embodiments, at operation 807, the CPU 110 stores the threshold in the memory. That is, the CPU 110 stores the threshold assigned in operation 805 in the memory. Next, based on the stored threshold, the CPU 110 can determine whether to switch the frequency band to the low frequency band when the vibration level exceeds a certain level.
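Operations 801-807 reduce to taking the maximum of the measured vibration levels and keeping it as the threshold. A minimal sketch (the guard against an empty measurement list is an added assumption; the document does not address that case):

```python
def learn_vibration_threshold(measured_levels):
    """Operations 801-807: select the maximum vibration level observed
    while the user performed gestures, and assign it as the threshold
    used to decide when to switch to the low frequency band."""
    if not measured_levels:
        raise ValueError("no vibration measurements recorded")
    return max(measured_levels)  # operations 803 (select) and 805 (assign)
```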
According to certain embodiments, the learning of the ANN for obtaining the reference data sets associated with gestures and the learning of the ANN for obtaining the threshold value may be performed separately or as a single learning process.
The foregoing descriptions of the embodiments are illustrative, and modifications in configuration and implementation are within the scope of the current description. For instance, while the embodiments are generally described with relation to
The method and the electronic device according to various embodiments of the present disclosure can determine the user's gesture using the RF sensor and control the function of the electronic device according to the gesture, and thus provide a quicker and easier function control method for the electronic device. Additionally, the electronic device according to certain embodiments of this disclosure can achieve continuous monitoring (gesture processing while the user is moving), can be embedded into various wearable devices (e.g., a watch, a headphone, etc.), does not need direct (electrical) contact with the skin, and can determine the gesture through clothes (e.g., gloves, costume, shirt, trousers, etc.). Further, embodiments of the gesture recognizing device and the methods for operating the same according to this disclosure need not be tightly affixed to the user's body, because the RF signals have wavelengths greater than the possible displacement of the device on the user's body part. Additionally, embodiments according to this disclosure may be able to ignore movements of other body parts (e.g., hands, neck, etc.) not related to the user's body part whose movements are detected. Additionally, embodiments according to this disclosure can use only one of the user's body parts for the device control, can easily control the device by gesture, do not need to place active components (e.g., sensors, antennas) inside a strap, may require a small number of sensors (antennas), and may provide low power consumption and harmlessness to the user, because the power of the emitted RF signals is low and the RF signals have low attenuation inside the body, bones, and so on.
The methods according to the embodiments described in the claims or the specification of the present disclosure can be implemented in software, hardware, or a combination of hardware and software.
As for the software, according to certain embodiments, a computer-readable storage medium storing one or more programs (software modules) can be provided. One or more programs stored in the computer-readable storage medium can be configured for execution by one or more processors of the electronic device. One or more programs can include instructions for controlling the electronic device to execute the methods according to the embodiments described in the claims or the specification of the present disclosure.
Such a program (software module, software) can be stored in a random access memory, a non-volatile memory including a flash memory, a Read Only Memory (ROM), an Electrically Erasable Programmable ROM (EEPROM), a magnetic disc storage device, a Compact Disc (CD)-ROM, Digital Versatile Discs (DVDs) or other optical storage devices, or a magnetic cassette. Alternatively, the program can be stored in a memory combining part or all of those recording media. A plurality of such memories may be included.
In some embodiments, the program can be stored in an attachable storage device accessible via a communication network such as the Internet, an Intranet, a Local Area Network (LAN), a Wide LAN (WLAN), or a Storage Area Network (SAN), or a combination of these networks. The storage device can access the electronic device through an external port. A separate storage device may also access the electronic device over the communication network.
The elements identified in the disclosure as components or operations of embodiments according to this disclosure are expressed in a singular or plural form. However, the singular or plural expression is appropriately selected according to a proposed situation for the convenience of explanation and the present disclosure is not limited to a single element or a plurality of elements. The elements expressed in the plural form may be configured as a single element, and the elements expressed in the singular form may be configured as a plurality of elements.
While this disclosure has been described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present disclosure.
Although the present disclosure has been described with various embodiments, various changes and modifications may be suggested to one skilled in the art. It is intended that the present disclosure encompass such changes and modifications as fall within the scope of the appended claims.
Number | Date | Country | Kind |
---|---|---|---|
2017106851 | Mar 2017 | RU | national |
10-2018-0009832 | Jan 2018 | KR | national |
Number | Name | Date | Kind |
---|---|---|---|
5953541 | King et al. | Sep 1999 | A |
6965842 | Rekimoto | Nov 2005 | B2 |
7170496 | Middleton | Jan 2007 | B2 |
8489065 | Green et al. | Jul 2013 | B2 |
8572764 | Thellmann | Nov 2013 | B2 |
8624836 | Miller et al. | Jan 2014 | B1 |
8660959 | Callahan | Feb 2014 | B2 |
8742885 | Brodersen et al. | Jun 2014 | B2 |
8851372 | Zhou et al. | Oct 2014 | B2 |
8925006 | Callahan | Dec 2014 | B2 |
8965824 | Chun et al. | Feb 2015 | B2 |
8983539 | Kim et al. | Mar 2015 | B1 |
8985442 | Zhou et al. | Mar 2015 | B1 |
9002420 | Pattikonda et al. | Apr 2015 | B2 |
9042596 | Connor | May 2015 | B2 |
9044150 | Brumback et al. | Jun 2015 | B2 |
9049998 | Brumback et al. | Jun 2015 | B2 |
9098190 | Zhou et al. | Aug 2015 | B2 |
9100493 | Zhou et al. | Aug 2015 | B1 |
9104537 | Penilla et al. | Aug 2015 | B1 |
9153074 | Zhou et al. | Oct 2015 | B2 |
9176668 | Eleftheriou et al. | Nov 2015 | B2 |
9214043 | Beaurepaire | Dec 2015 | B2 |
9215274 | Penilla et al. | Dec 2015 | B2 |
9253264 | Robinson et al. | Feb 2016 | B2 |
9288270 | Penilla et al. | Mar 2016 | B1 |
9292097 | Miller et al. | Mar 2016 | B1 |
9341844 | Orhand et al. | May 2016 | B2 |
9342829 | Zhou et al. | May 2016 | B2 |
20040243342 | Rekimoto | Dec 2004 | A1 |
20060092177 | Blasko | May 2006 | A1 |
20090157206 | Weinberg et al. | Jun 2009 | A1 |
20100103106 | Chui | Apr 2010 | A1 |
20100220064 | Griffin | Sep 2010 | A1 |
20110054360 | Son et al. | Mar 2011 | A1 |
20130057640 | Callahan | Mar 2013 | A1 |
20130060914 | Callahan | Mar 2013 | A1 |
20130066937 | Callahan | Mar 2013 | A1 |
20130082824 | Colley | Apr 2013 | A1 |
20130132883 | Vayrynen | May 2013 | A1 |
20130144727 | Morot-Gaudry et al. | Jun 2013 | A1 |
20130219345 | Saukko | Aug 2013 | A1 |
20130237272 | Prasad | Sep 2013 | A1 |
20130242262 | Lewis | Sep 2013 | A1 |
20130261771 | Ten Kate | Oct 2013 | A1 |
20140045547 | Singamsetty et al. | Feb 2014 | A1 |
20140089399 | Chun et al. | Mar 2014 | A1 |
20140095420 | Chun et al. | Apr 2014 | A1 |
20140128032 | Muthukumar | May 2014 | A1 |
20140168060 | Liao et al. | Jun 2014 | A1 |
20140178029 | Raheman et al. | Jun 2014 | A1 |
20140349256 | Connor | Nov 2014 | A1 |
20140350353 | Connor | Nov 2014 | A1 |
20140365979 | Yoon et al. | Dec 2014 | A1 |
20150026708 | Ahmed et al. | Jan 2015 | A1 |
20150054639 | Rosen | Feb 2015 | A1 |
20150073907 | Purves et al. | Mar 2015 | A1 |
20150074797 | Choi et al. | Mar 2015 | A1 |
20150092520 | Robison et al. | Apr 2015 | A1 |
20150102208 | Appelboom et al. | Apr 2015 | A1 |
20150105125 | Min et al. | Apr 2015 | A1 |
20150109723 | Holtzman | Apr 2015 | A1 |
20150111558 | Yang | Apr 2015 | A1 |
20150120151 | Akay et al. | Apr 2015 | A1 |
20150121287 | Fermon | Apr 2015 | A1 |
20150124566 | Lake et al. | May 2015 | A1 |
20150145653 | Katingari et al. | May 2015 | A1 |
20150185764 | Magi | Jul 2015 | A1 |
20150205994 | Yoo et al. | Jul 2015 | A1 |
20150242120 | Rodriguez | Aug 2015 | A1 |
20150253885 | Kagan et al. | Sep 2015 | A1 |
20150259110 | Blackburn | Sep 2015 | A1 |
20150279131 | Nespolo | Oct 2015 | A1 |
20150286285 | Pantelopoulos et al. | Oct 2015 | A1 |
20150338917 | Steiner et al. | Nov 2015 | A1 |
20150339696 | Zhou et al. | Nov 2015 | A1 |
20150355805 | Chandler et al. | Dec 2015 | A1 |
20150370320 | Connor | Dec 2015 | A1 |
20150370326 | Chapeskie et al. | Dec 2015 | A1 |
20160004323 | Pantelopoulos et al. | Jan 2016 | A1 |
20160018948 | Parvarandeh et al. | Jan 2016 | A1 |
20160035135 | Park et al. | Feb 2016 | A1 |
20160041617 | Poupyrev | Feb 2016 | A1 |
20160058133 | Fournier | Mar 2016 | A1 |
20160062320 | Chung | Mar 2016 | A1 |
20160062321 | Lee et al. | Mar 2016 | A1 |
20160077587 | Kienzle et al. | Mar 2016 | A1 |
20160085266 | Lee et al. | Mar 2016 | A1 |
20160091867 | Mansour et al. | Mar 2016 | A1 |
20160124500 | Lee et al. | May 2016 | A1 |
20160147401 | Cha | May 2016 | A1 |
20160154489 | Collins et al. | Jun 2016 | A1 |
20160162873 | Zhou et al. | Jun 2016 | A1 |
20160287092 | Zhang | Oct 2016 | A1 |
20160313801 | Wagner | Oct 2016 | A1 |
20160379475 | Zack et al. | Dec 2016 | A1 |
20190011989 | Schwesig | Jan 2019 | A1 |
Number | Date | Country |
---|---|---|
1531676 | Sep 2004 | CN |
102915111 | Feb 2013 | CN |
103197529 | Jul 2013 | CN |
103472914 | Dec 2013 | CN |
104199546 | Dec 2014 | CN |
104238344 | Dec 2014 | CN |
104698831 | Jun 2015 | CN |
104915008 | Sep 2015 | CN |
204695504 | Oct 2015 | CN |
105022471 | Nov 2015 | CN |
204965089 | Jan 2016 | CN |
1010057 | Oct 2002 | EP |
1256871 | Nov 2002 | EP |
1394665 | Mar 2004 | EP |
2813921 | Dec 2014 | EP |
2843507 | Mar 2015 | EP |
2846224 | Mar 2015 | EP |
2863276 | Apr 2015 | EP |
2876907 | May 2015 | EP |
2960754 | Dec 2015 | EP |
2999208 | Mar 2016 | EP |
3016081 | May 2016 | EP |
3023861 | May 2016 | EP |
H07248873 | Sep 1995 | JP |
2002358149 | Dec 2002 | JP |
2584459 | May 2016 | RU |
2609566 | Feb 2017 | RU |
02099614 | Dec 2002 | WO |
2009093027 | Jul 2009 | WO |
2012143603 | Oct 2012 | WO |
2014046424 | Mar 2014 | WO |
2015102713 | Jul 2015 | WO |
2016053459 | Apr 2016 | WO |
2016170011 | Oct 2016 | WO |
Entry |
---|
International Search Report and Written Opinion regarding Application No. PCT/KR2018/001786, dated May 28, 2018, 11 pages. |
Communication from a foreign patent office in a counterpart foreign application, “A device and method for gesture recognition with a RF sensor,” Russian Application No. 2017106851/08, dated Mar. 2, 2017, 10 pages. |
European Patent Office, “Supplementary European Search Report,” Application No. EP 18760487.1, dated Nov. 15, 2019, 12 pages. |
Bainbridge, Rachel, “Wireless Hand Gesture Capture Through Wearable Passive Tag Sensing,” 2011 International Conference on Body Sensor Networks, IEEE, 2011, 5 pages. |
Number | Date | Country | |
---|---|---|---|
20180253151 A1 | Sep 2018 | US |