The following description relates to recognizing gestures (e.g., human gestures) based on wireless signals.
Motion detection systems have been used to detect movement, for example, of objects in a room or an outdoor area. In some example motion detection systems, infrared or optical sensors are used to detect movement of objects in the sensor's field of view. Motion detection systems have been used in security systems, automated control systems and other types of systems.
In some aspects of what is described here, a motion detection system detects gestures (e.g., human gestures) and initiates actions in response to the gestures. For example, the time and frequency information that a sequence of gesture events imprints on the wireless channel spectrum can be leveraged to initiate actions. In some cases, a state machine may be used to detect a sequence of gesture events that have disturbed the wireless channel in a specific way. These disturbances of the wireless channel may become manifest in channel information collected from wireless signals, which can be analyzed in the time and/or frequency domains to distinguish between and recognize different gestures. In some implementations, at the end of a sequence of gestures, a state machine triggers an action command to a connected device (e.g., an IoT device) to perform a specified action.
In some instances, aspects of the systems and techniques described here provide technical improvements and advantages over existing approaches. For example, the systems and techniques described here may provide a gesture-based interface with IoT devices or other network-connected devices (e.g., through touch-free interaction) to enable or disable services at any location where there is wireless coverage. This may provide an alternative or an improvement over technologies (e.g., voice assistants) that leverage audio signals and require audible proximity to an audio sensor (e.g., a microphone). Because radio-frequency and other wireless signals can propagate through walls and over larger distances, the time and frequency signature imprinted by a gesture in a wide range of locations can be obtained and analyzed by the device collecting the channel information. Accordingly, the systems and techniques described here may provide improved user interaction with network-connected devices and other types of network-accessed services.
In some instances, wireless signals received at each of the wireless communication devices in a wireless communication network may be analyzed to determine channel information. The channel information may be representative of a physical medium that applies a transfer function to wireless signals that traverse a space. In some instances, the channel information includes a channel response. Channel responses can characterize a physical communication path, representing the combined effect of, for example, scattering, fading, and power decay within the space between the transmitter and receiver. In some instances, the channel information includes beamforming state information (e.g., a feedback matrix, a steering matrix, channel state information (CSI), etc.) provided by a beamforming system. Beamforming is a signal processing technique often used in multi-antenna (multiple-input/multiple-output (MIMO)) radio systems for directional signal transmission or reception. Beamforming can be achieved by operating elements in an antenna array in such a way that signals at particular angles experience constructive interference while others experience destructive interference.
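For intuition, a per-subcarrier channel response can be estimated by comparing a known transmitted reference with what was received. The following sketch is a minimal illustration under that assumption; the function and variable names are ours, for illustration only, and are not part of the system described here.

```python
import numpy as np

def estimate_channel_response(tx_freq, rx_freq):
    """Estimate a per-subcarrier channel response H_n = Y_n / X_n.

    tx_freq: complex frequency-domain samples of a known transmitted reference
    rx_freq: complex frequency-domain samples of the received signal
    Both are 1-D arrays with one entry per subcarrier.
    """
    tx_freq = np.asarray(tx_freq, dtype=complex)
    rx_freq = np.asarray(rx_freq, dtype=complex)
    eps = 1e-12  # avoid dividing by (near-)zero reference values
    return rx_freq / (tx_freq + eps)

# Example: a flat transmitted reference and a received signal whose
# subcarriers were attenuated and phase-shifted by the channel.
tx = np.ones(64, dtype=complex)
h_true = 0.8 * np.exp(1j * np.linspace(0, np.pi, 64))
rx = h_true * tx
h_est = estimate_channel_response(tx, rx)
print(np.allclose(h_est, h_true, atol=1e-6))  # True
```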
The channel information for each of the communication links may be analyzed (e.g., by a hub device or other device in a wireless communication network, or a remote device communicably coupled to the network) to detect whether motion (e.g., gestures or another type of motion) has occurred in the space, to determine a relative location of the detected motion, or both. In some aspects, the channel information for each of the communication links may be analyzed to detect a gesture or a gesture sequence, and a secondary action can be initiated based on the detected gesture or gesture sequence.
Example motion detection systems and localization processes that can be used to detect motion based on wireless signals include the techniques described in U.S. Pat. No. 9,523,760 entitled “Detecting Motion Based on Repeated Wireless Transmissions,” U.S. Pat. No. 9,584,974 entitled “Detecting Motion Based on Reference Signal Transmissions,” U.S. Pat. No. 10,051,414 entitled “Detecting Motion Based On Decompositions Of Channel Response Variations,” U.S. Pat. No. 10,048,350 entitled “Motion Detection Based on Groupings of Statistical Parameters of Wireless Signals,” U.S. Pat. No. 10,108,903 entitled “Motion Detection Based on Machine Learning of Wireless Signal Properties,” U.S. Pat. No. 10,109,167 entitled “Motion Localization in a Wireless Mesh Network Based on Motion Indicator Values,” U.S. Pat. No. 10,109,168 entitled “Motion Localization Based on Channel Response Characteristics,” and other techniques.
In some cases, the wireless communication system 100 can be deployed in a physical environment such as a home, an office or another type of space, and one or more components of the wireless communication system 100 may operate in coordination with or as a component of a motion detection system. For instance, software associated with a motion detection system may be installed and executed on one or more of the wireless communication devices 102A, 102B, 102C, or on another computer device in the physical environment, on a remote server, on a cloud-based computer system, etc.
In some instances, the motion detection system performs gesture recognition and initiates predetermined actions in response to detecting specified human gestures or human gesture sequences. Accordingly, the motion detection system may provide Wi-Fi-based gesture recognition within a home or office space, which may enable users to activate or deactivate any type of event wirelessly through pre-programmed or trained gestures. The gestures can be single-gesture or multi-gesture events. A single gesture can be, for example, a single continuous motion, whereas a multi-gesture event can be, for example, more than one gesture (of similar or different type) performed in sequence. The gestures in a multi-gesture event can be separated by a variable pause, for example, to form gesture sequences that are distinct. As an example, the sequence wave, long pause (e.g., 2 seconds), wave could be a distinct gesture from the sequence wave, short pause (e.g., 1 second), wave. Other types of gesture events may be detected by the motion detection system.
In some implementations, gestures can be coupled with localization information (e.g., from any source) to perform a different action depending on the location of the user. As an example, a user who performs the single gesture of an open palm rising vertically in the living room could trigger a volume increase on the living room television, and a horizontal swiping motion in the living room could trigger a channel change on the living room television; whereas the same gestures in the kitchen may trigger similar adjustments on the kitchen television. As another example, a user who performs the multi-gesture of two hand waves in sequence within a bedroom may dismiss an alarm sounding on the bedroom alarm clock, for example, and three hand waves in sequence can toggle the bedroom lights on or off; whereas the same gestures in another bedroom may trigger the same or different actions within that bedroom.
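One simple way to realize this location-coupled behavior is a lookup table keyed by (gesture, location) pairs. The sketch below is hypothetical; the gesture names, locations, devices, and actions are illustrative placeholders, not identifiers from the system described here.

```python
# Hypothetical dispatch table pairing a recognized gesture with the
# location where it was performed.
ACTION_TABLE = {
    ("palm_rise", "living_room"): ("living_room_tv", "volume_up"),
    ("swipe_horizontal", "living_room"): ("living_room_tv", "channel_next"),
    ("palm_rise", "kitchen"): ("kitchen_tv", "volume_up"),
    ("wave_x2", "bedroom"): ("bedroom_alarm", "dismiss"),
    ("wave_x3", "bedroom"): ("bedroom_lights", "toggle"),
}

def resolve_action(gesture: str, location: str):
    """Return a (device, action) pair for a gesture at a location, or None."""
    return ACTION_TABLE.get((gesture, location))

print(resolve_action("wave_x2", "bedroom"))  # ('bedroom_alarm', 'dismiss')
```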
In some implementations, a gesture recognition engine receives channel information from one or more of the wireless communication devices 102A, 102B, 102C, which collect the channel information based on wireless signals transmitted through the physical environment of the wireless communication system 100. The gesture recognition engine performs a deep inspection of the frequency content of the channel information over time. In some cases, when a gesture is recognized by the gesture recognition engine, a state machine can be invoked. After the completion of a gesture or gesture sequence has been detected, an action can be initiated (e.g., by sending a command to one of the IoT devices 120A, 120B) depending on the end state of the state machine.
The example wireless communication devices 102A, 102B, 102C and the IoT devices 120A, 120B can operate in a wireless network, for example, according to a wireless network standard or another type of wireless communication protocol. For example, the wireless network may be configured to operate as a Wireless Local Area Network (WLAN), a Personal Area Network (PAN), a metropolitan area network (MAN), or another type of wireless network. Examples of WLANs include networks configured to operate according to one or more of the 802.11 family of standards developed by IEEE (e.g., Wi-Fi networks), and others. Examples of PANs include networks that operate according to short-range communication standards (e.g., BLUETOOTH®, Near Field Communication (NFC), ZigBee), millimeter wave communications, and others.
In some implementations, the wireless communication devices 102A, 102B, 102C may be configured to communicate in a cellular network, for example, according to a cellular network standard. Examples of cellular networks include networks configured according to 2G standards such as Global System for Mobile (GSM) and Enhanced Data rates for GSM Evolution (EDGE) or EGPRS; 3G standards such as Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Universal Mobile Telecommunications System (UMTS), and Time Division Synchronous Code Division Multiple Access (TD-SCDMA); 4G standards such as Long-Term Evolution (LTE) and LTE-Advanced (LTE-A); 5G standards, and others.
In some cases, one or more of the wireless communication devices 102 is a Wi-Fi access point or another type of wireless access point (WAP). In some cases, one or more of the wireless communication devices 102 is an access point of a wireless mesh network, such as, for example, a commercially-available mesh network system (e.g., GOOGLE Wi-Fi, EERO mesh, etc.). In some cases, one or more of the wireless communication devices 102 is a mobile device (e.g., a smartphone, a smart watch, a tablet, a laptop computer, etc.), an IoT device (e.g., a Wi-Fi enabled thermostat, a Wi-Fi enabled lighting control, a Wi-Fi enabled camera, a smart TV, a Wi-Fi enabled doorbell), or another type of device that communicates in a wireless network.
The IoT devices 120A, 120B are examples of network-connected devices that can communicate with one or more of the wireless communication devices 102. The IoT devices 120A, 120B may include, for example, a network-connected thermostat, a network-connected lighting control, a network-connected camera, a network-connected TV, a network-connected doorbell, etc. Generally, a network-connected device may communicate with other devices over a communication network using a wired connection (e.g., Ethernet cable), a wireless connection (e.g., Local Area Network connection) or both.
In some examples, the wireless signals may propagate through a structure (e.g., a wall) before or after interacting with a moving object, which may allow the moving object's movement to be detected without an optical line-of-sight between the moving object and the transmission or receiving hardware. In some instances, the motion detection system may communicate the motion or gesture detection event to another device or system, such as a security system or a control center.
In some cases, the wireless communication devices 102 themselves are configured to perform one or more operations of the motion detection system, for example, by executing computer-readable instructions (e.g., software or firmware) on the wireless communication devices. For example, each device may process received wireless signals to detect motion based on changes detected in the communication channel. In some cases, another device (e.g., a remote server, a network-attached device, etc.) is configured to perform one or more operations of the motion detection system. For example, each wireless communication device 102 may send channel information to a central device or system that performs operations of the motion detection system.
In an example aspect of operation, the wireless communication devices 102A, 102B may broadcast wireless signals or address wireless signals to the other wireless communication device 102C, and the wireless communication device 102C (and potentially other devices) receives the wireless signals transmitted by the wireless communication devices 102A, 102B. The wireless communication device 102C (or another system or device) then processes the received wireless signals to detect motion of an object, human gestures, or human gesture sequences in a space accessed by the wireless signals (e.g., in the zones 110A, 110B). In some instances, the wireless communication device 102C (or another system or device) may perform one or more operations of the example processes 300, 400 described below.
In some aspects of operation, channel information is obtained based on wireless signals transmitted through a space (e.g., through all or part of a home, office, outdoor area, etc.) by one or more of the wireless communication devices 102A, 102B, 102C. A gesture recognition engine analyzes the channel information to detect a gesture in the space. The gesture can include, for example, a hand wave, a hand swipe, arm movements, leg movements, head movements, or other types of human gestures. In some cases, the gesture recognition engine detects a sequence of such gestures. For example, a state machine can be used to detect a sequence of gestures as described below.
In some aspects of operation, an action to be initiated in response to the detected gesture or gesture sequence is identified (e.g., based on the type of gesture or gesture sequence, based on a state of a state machine, or otherwise). The action can be, for example, turning lights on or off, turning a television or other device on or off, adjusting the volume of a speaker or other device, adjusting a thermostat setting, etc. An instruction (e.g., a command) to perform the action may then be sent to a network-connected device (e.g., one or both of the IoT devices 120A, 120B) that will perform the action. As an example, the IoT device 120A may be a network-connected TV that receives a channel change command, a network-connected thermostat that receives a temperature adjustment command, a network-connected speaker that receives a volume adjustment command, a network-connected lighting system that receives a light toggle command, a network-connected device that receives a command to arm or disarm a security system, etc. The network-connected device may then perform the corresponding action in response to receiving the instruction.
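The transport used to deliver such an instruction is implementation-specific. As a hedged illustration, the sketch below posts a JSON command to an assumed HTTP endpoint; the URL and payload schema are hypothetical, and a real deployment would use whatever API the network-connected device actually exposes (HTTP, MQTT, a vendor SDK, etc.).

```python
import json
import urllib.request

def send_action_command(device_url: str, action: str, value=None):
    """POST an action command to a network-connected device.

    The endpoint and JSON schema here are assumptions for illustration
    only; they are not defined by the system described above.
    """
    payload = json.dumps({"action": action, "value": value}).encode("utf-8")
    request = urllib.request.Request(
        device_url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=5) as response:
        return response.status

# Example (hypothetical endpoint and command):
# send_action_command("http://192.168.1.40/api/command", "volume_adjust", +2)
```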
In some cases, a location of the gesture may be detected, and the action to be initiated can be determined based on the location of the gesture. For instance, the location of the gesture (e.g., a specific room or zone of a home or office environment) may be associated with a type of action to be performed (e.g., arm/disarm security device), a location of the action to be performed (e.g., a room in which to turn lights on/off), or a device to perform the action (e.g., a specific TV). The location of the gesture may be detected by the motion detection system (e.g., based on the channel information) or in another manner. For example, another type of sensor may be used to detect the location of a user who made the gesture.
In some cases, the gesture is detected by using a time-frequency filter to detect a time-frequency signature of the gesture. For example, the channel information may include a time series of channel responses, and the time-frequency filter may apply weighting coefficients to frequency components (subcarriers) of the channel responses. The time-frequency filter may include an adaptive time-frequency filter that tunes the weighting coefficients (e.g., according to an optimization algorithm or otherwise) to detect time-frequency signatures of multiple gesture types. For instance, the adaptive time-frequency filter may tune the weighting coefficients to detect gestures that modulate the channel responses at a frequency range corresponding to human gestures (e.g., 0 to 4 Hertz, 0.25 to 0.75 Hertz, or another frequency range). An example of an adaptive time-frequency filter is described below.
The example wireless communication devices 204A, 204B, 204C can transmit wireless signals through the space 200. The example space 200 may be completely or partially enclosed or open at one or more boundaries of the space 200. The space 200 may be or may include an interior of a room, multiple rooms, a building, an indoor area, an outdoor area, or the like. A first wall 202A, a second wall 202B, and a third wall 202C at least partially enclose the space 200 in the example shown.
As shown, a person makes a first gesture 214A at an initial time (t0), and makes a second gesture 214B at a subsequent time (t1).
One or more of the wireless communication devices 204A, 204B, 204C can be part of, or may be used by, a motion detection system.
Mathematically, a transmitted signal f(t) transmitted from the first wireless communication device 204A may be described according to Equation (1):

$$f(t) = \sum_{n=-\infty}^{\infty} c_n e^{j\omega_n t} \qquad (1)$$

where ωn represents the frequency of the nth frequency component of the transmitted signal, cn represents the complex coefficient of the nth frequency component, and t represents time. With the transmitted signal f(t) being transmitted from the first wireless communication device 204A, an output signal rk(t) from a path k may be described according to Equation (2):

$$r_k(t) = \sum_{n=-\infty}^{\infty} \alpha_{n,k}\, c_n\, e^{j(\omega_n t + \phi_{n,k})} \qquad (2)$$

where αn,k represents an attenuation factor (or channel response; e.g., due to scattering, reflection, and path losses) for the nth frequency component along path k, and ϕn,k represents the phase of the signal for the nth frequency component along path k. Then, the received signal R at a wireless communication device can be described as the summation of all output signals rk(t) from all paths to the wireless communication device, which is shown in Equation (3):

$$R = \sum_{k} r_k(t) \qquad (3)$$

Substituting Equation (2) into Equation (3) renders the following Equation (4):

$$R = \sum_{k} \sum_{n=-\infty}^{\infty} \left( \alpha_{n,k}\, e^{j\phi_{n,k}} \right) c_n\, e^{j\omega_n t} \qquad (4)$$

The received signal R at a wireless communication device can then be analyzed, for example, to detect motion or to recognize gestures as described below. The received signal R at a wireless communication device can be transformed to the frequency domain, for example, using a Fast Fourier Transform (FFT) or another type of algorithm. The transformed signal can represent the received signal R as a series of n complex values, one for each of the respective frequency components (at the n frequencies ωn). For a frequency component at frequency ωn, a complex value Yn may be represented as follows in Equation (5):

$$Y_n = \sum_{k} c_n\, \alpha_{n,k}\, e^{j\phi_{n,k}} \qquad (5)$$
The complex value Yn for a given frequency component ωn indicates a relative magnitude and phase offset of the received signal at that frequency component ωn. When an object moves in the space, the complex value Yn changes due to the channel response αn,k of the space changing. Accordingly, a change detected in the channel response (and thus, the complex value Yn) can be indicative of movement of an object within the communication channel. Conversely, a stable channel response may indicate lack of movement. Thus, in some implementations, the complex values Yn for each of multiple devices in a wireless network can be processed to detect whether motion has occurred in a space traversed by the transmitted signals f(t).
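As a rough illustration of this analysis (a minimal sketch, not the patented implementation), the following code transforms consecutive snapshots of a received signal to the frequency domain and flags motion when the complex values Yn change appreciably; the normalized-change metric and the threshold are assumptions for illustration.

```python
import numpy as np

def frequency_components(received_snapshot):
    """Transform a time-domain snapshot of the received signal R into
    complex frequency components Y_n via an FFT (cf. Equation (5))."""
    return np.fft.fft(received_snapshot)

def motion_detected(prev_snapshot, curr_snapshot, threshold=0.1):
    """Flag motion when the Y_n values change appreciably between
    consecutive snapshots; the threshold is an uncalibrated placeholder."""
    y_prev = frequency_components(prev_snapshot)
    y_curr = frequency_components(curr_snapshot)
    change = np.linalg.norm(y_curr - y_prev) / (np.linalg.norm(y_prev) + 1e-12)
    return change > threshold

# Example: an unchanged channel vs. one whose path delays have shifted.
rng = np.random.default_rng(0)
snapshot = rng.standard_normal(64)
print(motion_detected(snapshot, snapshot))               # False
print(motion_detected(snapshot, np.roll(snapshot, 3)))   # True
```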
In some implementations, for example, a steering matrix may be generated at a transmitter device (beamformer) based on a feedback matrix provided by a receiver device (beamformee) based on channel sounding. Because the steering and feedback matrices are related to propagation characteristics of the channel, these matrices change as objects move within the channel. Changes in the channel characteristics are accordingly reflected in these matrices, and by analyzing the matrices, motion can be detected, and different characteristics of the detected motion can be determined. In some implementations, a spatial map may be generated based on one or more beamforming matrices. The spatial map may indicate a general direction of an object in a space relative to a wireless communication device. In some cases, “modes” of a beamforming matrix (e.g., a feedback matrix or steering matrix) can be used to generate the spatial map. The spatial map may be used to detect the presence of motion in the space or to detect a location of the detected motion.
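As one assumption-laden illustration of how such matrices might be analyzed, the sketch below treats the dominant right-singular vectors of a beamforming matrix as its "modes" and measures how much the dominant mode rotates between channel soundings. The SVD interpretation and the function names are ours, for illustration; the source does not define the modes this way.

```python
import numpy as np

def beamforming_modes(feedback_matrix, num_modes=2):
    """Extract dominant spatial 'modes' of a beamforming feedback matrix
    via a singular value decomposition (one plausible interpretation)."""
    u, s, vh = np.linalg.svd(feedback_matrix)
    return vh[:num_modes]  # dominant right-singular vectors

def mode_change(prev_matrix, curr_matrix):
    """Scalar measure of how much the dominant mode rotated between two
    soundings; larger values suggest motion in the channel."""
    prev_mode = beamforming_modes(prev_matrix, 1)[0]
    curr_mode = beamforming_modes(curr_matrix, 1)[0]
    alignment = np.abs(np.vdot(prev_mode, curr_mode))
    return 1.0 - alignment  # 0 when the mode is unchanged

rng = np.random.default_rng(1)
m0 = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
print(mode_change(m0, m0))  # ~0.0 (no change between soundings)
```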
In some aspects of operation, a motion detection system may detect certain gestures (e.g., the first and second gestures 214A, 214B) based on the channel information and, in response, initiate an action at a network-connected device 220.
In some cases, a gesture or series of gestures may be associated with an action to be taken by the device 220. For example, the network-connected device 220 may be controlled by the gestures 214A, 214B. As an example, the device 220 can be a Wi-Fi enabled alarm clock or another type of Wi-Fi device (e.g., a smartphone running an alarm clock application). In this example, the series of gestures 214A, 214B (e.g., waving an arm a certain number of times) can be associated with deactivating the alarm on the Wi-Fi device. In another example, the gestures of a particular breathing rate and/or heart rate may indicate that a person is awake or no longer sleeping, and those gestures may be associated with deactivating the alarm. Accordingly, a gesture or series of gestures may be associated with any action of the network-connected device 220, which may be controlled via the wireless communication network. Some examples include turning lights on and off, activating and deactivating a home security system, etc. In some examples, a user application may be provided with, or on, the Wi-Fi connected device that provides an interface allowing the user to select gestures and associate the gestures with actions for controlling the device. In other cases, gestures may be selected and managed by the motion detection system, another device, etc.
The example process 300 may include additional or different operations, and the operations may be performed in the order shown or in another order. In some cases, one or more of the operations are combined, iterated, or otherwise repeated or performed in another manner.
At a high level, the process 300 proceeds as follows.
At 310, channel inspection is performed (e.g., by a gesture recognition engine or another component of a motion detection system). The channel inspection process analyzes channel information to determine whether a gesture occurred. For example, the channel inspection process may include the example process 400 described below.
If a gesture is detected at 310, the state machine is initialized to “State 1” and the gesture timeout counter is initialized. The gesture timeout counter can be initialized to a timeout value representing a maximum amount of time that the state machine will remain in “State 1” before a gesture timeout occurs. In an example, the state machine may process 10 channel responses per second, and the gesture timeout counter can be initialized to 10 for a gesture timeout period of 1 second, to 20 for a gesture timeout period of 2 seconds, etc.
After initializing the gesture timeout counter at 310, the process 300 proceeds to 320, and channel inspection is performed based on new channel data. If a gesture is not detected based on the channel inspection of the new channel data at 320, then the gesture timeout counter is decremented, and the process 300 returns to 320. If the gesture timeout counter reaches zero, then a gesture timeout is detected at 320 and “Action 1” is initiated.
Thus, in some instances, the state machine determines that a second gesture was not detected within the gesture timeout period of a first gesture detected by the channel inspection at 310, and the state machine initiates “Action 1” in response to detecting the gesture timeout at 320.
If a gesture is detected based on the channel inspection of the new channel data at 320, then the state machine is incremented to “State N−1” and the gesture timeout counter is reinitialized (e.g., to the same value that it was initialized to at 310 or another value).
After reinitializing the gesture timeout counter at 320, the process 300 proceeds to 330, and channel inspection is performed based on new channel data. If a gesture is not detected based on the channel inspection of the new channel data at 330, then the gesture timeout counter is decremented, and the process 300 returns to 330. If the gesture timeout counter reaches zero, then a gesture timeout is detected at 330 and “Action N−1” is initiated.
Thus, in some instances, the state machine determines that a sequence of gestures was detected by the channel inspections at 310 and 320, and that a gesture timeout occurred after reinitializing the gesture timeout counter at 320, and the state machine may then initiate “Action N−1” in response to detecting the gesture timeout at 330.
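A minimal sketch of this kind of state machine is shown below, assuming 10 channel responses per second and a 2-second gesture timeout as in the example above; the class structure and action names are illustrative, not part of the described system.

```python
RESPONSES_PER_SECOND = 10  # assumed channel inspection rate
TIMEOUT_SECONDS = 2        # assumed gesture timeout period

class GestureStateMachine:
    def __init__(self, actions):
        # actions[k] is initiated when a timeout occurs after k gestures.
        self.actions = actions
        self.state = 0    # number of gestures detected so far
        self.counter = 0  # gesture timeout counter

    def process(self, gesture_detected: bool):
        """Consume one channel-inspection result; return an action or None."""
        if gesture_detected:
            self.state = min(self.state + 1, len(self.actions))
            self.counter = RESPONSES_PER_SECOND * TIMEOUT_SECONDS
            return None
        if self.state == 0:
            return None  # idle: nothing to time out
        self.counter -= 1
        if self.counter > 0:
            return None
        action = self.actions.get(self.state)
        self.state, self.counter = 0, 0  # reset after the sequence ends
        return action

sm = GestureStateMachine({1: "Action 1", 2: "Action 2"})
results = [sm.process(g) for g in [True] + [False] * 25]
print([r for r in results if r])  # ['Action 1'] once the timeout elapses
```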
The example process 400 may include additional or different operations, and the operations may be performed in the order shown or in another order. In some cases, one or more of the operations are combined, iterated, or otherwise repeated or performed in another manner.
At a high level, the process 400 proceeds as follows. At 404, weighting coefficients 412 are applied to channel responses 402. At 406, a gesture frequency bandpass filter is applied to the weighted channel response data. At 410, the filter output 408 produced by the gesture frequency bandpass filter 406 is used to tune the weighting coefficients 412. The modified weighting coefficients 412 are then reapplied to the channel responses at 404. The process may continue to adjust the weighting coefficients 412 until an optimization condition is reached. Generally, the weighting coefficients 412 can be positive or negative values. At 414, gesture frequency detection is applied to the weighted channel response data, e.g., based on the tuned weighting coefficients 412. The gesture frequency detection 414 can analyze the weighted channel response data to detect gestures that occurred in the space. When a gesture is detected, the gesture frequency detection process may generate gesture data 416 indicating that a gesture was detected. In some cases, the gesture data 416 indicates a type of gesture, a location of the gesture, or other information about the gesture.
A gesture in the space traversed by the wireless signals will modulate the intensity of the wireless signals at the receiver. Accordingly, the process 400 analyzes a time series of frequency-domain channel responses 402 (derived from the wireless signals) for a pattern in this intensity change. For instance, a quick wave of the hand two times may appear as a sinusoid with a frequency of approximately 0.5 Hertz. This pattern can be detected with a frequency-selective filter (the gesture frequency bandpass filter 406) acting on the time series of the frequency-domain channel data. The intensity can be discriminative across the frequency bins of the channel response because a gesture may, in some instances, only affect one particular path of the signal (e.g., one ray). Modulating one particular path of a multipath signal can push some frequencies up and others down, setting up a negative correlation between different frequency bins over time. Thus, different frequency components of the wireless signal are affected differently based on where in space the gesture is happening. Accordingly, the example process 400 may perform gesture recognition by examining all the frequencies of the channel responses 402 over time.
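A hedged sketch of this kind of detection follows: it forms a weighted combination of subcarrier magnitudes, bandpass-filters it in an assumed gesture band (0.25 to 0.75 Hertz, one of the ranges mentioned above), and reports the in-band energy. The sounding rate, filter order, and synthetic data are assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 10.0  # assumed channel responses per second

def gesture_band_energy(channel_mags, weights, low=0.25, high=0.75):
    """Bandpass-filter a weighted combination of subcarrier magnitudes
    and return the in-band energy.

    channel_mags: array of shape (time, subcarriers) holding |H_n| over time
    weights: per-subcarrier weighting coefficients (may be negative)
    """
    combined = channel_mags @ weights   # linear combination over subcarriers
    combined -= combined.mean()         # remove the static (DC) component
    b, a = butter(4, [low / (FS / 2), high / (FS / 2)], btype="band")
    filtered = filtfilt(b, a, combined)
    return float(np.sum(filtered**2))

# Example: a ~0.5 Hz modulation (two quick hand waves) on one subcarrier.
t = np.arange(0, 10, 1 / FS)
mags = 0.01 * np.random.randn(len(t), 8)
mags[:, 3] += 0.5 * np.sin(2 * np.pi * 0.5 * t)
print(gesture_band_energy(mags, np.ones(8) / 8))
```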
In some implementations, a time-frequency filter (or another type of gesture discriminating filter) is adaptively tuned by the motion detection system during its operation, so that it can pick up a gesture happening anywhere in the space. The ability to adaptively tune the time-frequency filter (e.g., by tuning the weighting coefficients 412 at 410) can be important, for example, due to variability of the time-frequency signature with different environments, and different locations where a gesture can be performed. To incorporate variations among different people and different environments, a bank of such filters can be designed with slightly different time-frequency footprints. Hence, a linear combination of channel response frequency components (corresponding to different frequency bins) can be formed and fed to a line spectrum estimation block (e.g., at 406) which looks for the characteristic frequencies associated with human gestures. Once the process 400 detects that signature, other members of the sequence (that forms a complete gesture) can be detected. When no further gestures are detected, the gesture sequence can be interpreted.
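As one assumption-laden illustration of such tuning, the sketch below adjusts the subcarrier weights by simple random search, keeping perturbations that increase the fraction of energy in the gesture band; a real system might instead use gradient methods or a designed filter bank, as suggested above. The rate, band, and search parameters are placeholders.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 10.0  # assumed channel-sounding rate, as in the sketch above

def in_band_ratio(channel_mags, weights, low=0.25, high=0.75):
    """Fraction of signal energy inside the assumed gesture frequency band."""
    combined = channel_mags @ weights
    combined -= combined.mean()
    b, a = butter(4, [low / (FS / 2), high / (FS / 2)], btype="band")
    filtered = filtfilt(b, a, combined)
    total = np.sum(combined**2) + 1e-12
    return float(np.sum(filtered**2) / total)

def tune_weights(channel_mags, iters=200, step=0.1, seed=0):
    """Tune subcarrier weights by random search, keeping perturbations
    that increase in-band energy (an illustrative optimization scheme)."""
    rng = np.random.default_rng(seed)
    weights = np.ones(channel_mags.shape[1]) / channel_mags.shape[1]
    best = in_band_ratio(channel_mags, weights)
    for _ in range(iters):
        candidate = weights + step * rng.standard_normal(weights.shape)
        candidate /= np.linalg.norm(candidate) + 1e-12  # keep the scale fixed
        score = in_band_ratio(channel_mags, candidate)
        if score > best:
            weights, best = candidate, score
    return weights, best

t = np.arange(0, 10, 1 / FS)
mags = 0.01 * np.random.randn(len(t), 8)
mags[:, 2] += 0.4 * np.sin(2 * np.pi * 0.5 * t)  # gesture on one subcarrier
w, ratio = tune_weights(mags)
print(ratio)  # in-band energy ratio after tuning
```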
Accordingly, the process 400 can adaptively detect the time-frequency signatures of gestures performed at a range of locations in the space.
The example interface 530 can communicate (receive, transmit, or both) wireless signals. For example, the interface 530 may be configured to communicate radio frequency (RF) signals formatted according to a wireless communication standard (e.g., Wi-Fi, 4G, 5G, Bluetooth, etc.). In some implementations, the example interface 530 includes a radio subsystem and a baseband subsystem. The radio subsystem may include, for example, one or more antennas and radio frequency circuitry. The radio subsystem can be configured to communicate radio frequency wireless signals on the wireless communication channels. As an example, the radio subsystem may include a radio chip, an RF front end, and one or more antennas. The baseband subsystem may include, for example, digital electronics configured to process digital baseband data. In some cases, the baseband subsystem may include a digital signal processor (DSP) device or another type of processor device. In some cases, the baseband system includes digital processing logic to operate the radio subsystem, to communicate wireless network traffic through the radio subsystem or to perform other types of processes.
The example processor 510 can execute instructions, for example, to generate output data based on data inputs. The instructions can include programs, codes, scripts, modules, or other types of data stored in memory 520. Additionally or alternatively, the instructions can be encoded as pre-programmed or re-programmable logic circuits, logic gates, or other types of hardware or firmware components or modules. The processor 510 may be or include a general-purpose microprocessor, a specialized co-processor, or another type of data processing apparatus. In some cases, the processor 510 performs high-level operations of the wireless communication device 500. For example, the processor 510 may be configured to execute or interpret software, scripts, programs, functions, executables, or other instructions stored in the memory 520. In some implementations, the processor 510 may be included in the interface 530 or another component of the wireless communication device 500.
The example memory 520 may include computer-readable storage media, for example, a volatile memory device, a non-volatile memory device, or both. The memory 520 may include one or more read-only memory devices, random-access memory devices, buffer memory devices, or a combination of these and other types of memory devices. In some instances, one or more components of the memory can be integrated or otherwise associated with another component of the wireless communication device 500. The memory 520 may store instructions that are executable by the processor 510. For example, the instructions may include instructions to perform one or more of the operations in the example processes 300, 400 described above.
In the example shown, the memory 520 stores a motion detection system 550 that includes a gesture recognition engine 552, a state machine 554, and a gesture database 556.
The example gesture recognition engine 552 includes instructions that, when executed by the processor 510, can detect gestures (e.g., human gestures) based on channel information obtained from wireless signals. For example, the gesture recognition engine 552 may perform one or more operations of the example process 400 described above.
The example state machine 554 includes instructions that, when executed by the processor 510, can initiate an action associated with a detected gesture or sequence of gestures. For example, the state machine 554 may perform one or more operations of the example process 300 described above.
The example gesture database 556 includes data that associates gestures (e.g., individual gestures, gesture sequences, etc.) with respective actions to be initiated by the motion detection system 550 in response to the gestures. In some cases, the gesture database 556 includes data entries that directly associate specific gestures or gesture sequences with respective actions to be initiated. In some cases, the gesture database 556 includes data entries that directly associate specific states of the state machine 554 with the respective actions to be initiated by the motion detection system 550. The gesture database 556 may be configured in another manner.
The example power unit 540 provides power to the other components of the wireless communication device 500. For example, the other components may operate based on electrical power provided by the power unit 540 through a voltage bus or other connection. In some implementations, the power unit 540 includes a battery or a battery system, for example, a rechargeable battery. In some implementations, the power unit 540 includes an adapter (e.g., an AC adapter) that receives an external power signal (from an external source) and converts the external power signal to an internal power signal conditioned for a component of the wireless communication device 500. The power unit 540 may include other components or operate in another manner.
Some of the subject matter and operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Some of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on a computer storage medium for execution by, or to control the operation of, data-processing apparatus. A computer storage medium can be, or can be included in, a computer-readable storage device, a computer-readable storage substrate, a random or serial access memory array or device, or a combination of one or more of them. Moreover, while a computer storage medium is not a propagated signal, a computer storage medium can be a source or destination of computer program instructions encoded in an artificially generated propagated signal. The computer storage medium can also be, or be included in, one or more separate physical components or media (e.g., multiple CDs, disks, or other storage devices).
Some of the operations described in this specification can be implemented as operations performed by a data processing apparatus on data stored on one or more computer-readable storage devices or received from other sources.
The term “data-processing apparatus” encompasses all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations, of the foregoing. The apparatus can include special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit). The apparatus can also include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them.
A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
Some of the processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform actions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
To provide for interaction with a user, operations can be implemented on a computer having a display device (e.g., a monitor, or another type of display device) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse, a trackball, a tablet, a touch sensitive screen, or another type of pointing device) by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.
In a general aspect, a motion detection system detects gestures (e.g., human gestures) and initiates actions in response to the detected gestures.
In a first example, channel information is obtained based on wireless signals transmitted through a space by one or more wireless communication devices. A gesture recognition engine analyzes the channel information to detect a gesture (e.g., a predetermined gesture or a predetermined gesture sequence) in the space. An action to be initiated in response to the detected gesture is identified. An instruction to perform the action is sent to a network-connected device associated with the space.
Implementations of the first example may include one or more of the following features. A location of the gesture may be detected, and the action to be initiated (e.g., a type of action, a location of the action, or a device to perform the action) can be determined based on the location of the gesture. Detecting the gesture may include detecting a sequence of gestures. Detecting the sequence of gestures may include determining that a first gesture and a second gesture occurred in the space within a gesture timeout period. Detecting the sequence of gestures may include: in response to detecting the first gesture, initiating a state of a state machine and initiating a gesture timeout counter; in response to detecting the second gesture within the gesture timeout period, progressing the state of the state machine and reinitiating the gesture timeout counter; after reinitiating the gesture timeout counter, detecting a gesture timeout based on the gesture timeout counter; and identifying the action based on the state of the state machine at the gesture timeout.
Implementations of the first example may include one or more of the following features. Detecting the gesture may include using a time-frequency filter to detect a time-frequency signature of the gesture. The channel information may include a time series of channel responses, and using the time-frequency filter may include applying weighting coefficients to frequency components of the channel responses. The time-frequency filter may include an adaptive time-frequency filter that tunes the weighting coefficients to detect time-frequency signatures of gestures. The adaptive time-frequency filter may tune the weighting coefficients to detect gestures that modulate an intensity of the channel responses in a frequency range corresponding to human gestures (e.g., 0 to 4 Hertz, 0.25 to 0.75 Hertz, or another frequency range).
In a second example, a non-transitory computer-readable medium stores instructions that are operable when executed by data processing apparatus to perform one or more operations of the first example. In a third example, a system includes wireless communication devices, a network-connected device, and a computer device configured to perform one or more operations of the first example.
Implementations of the third example may include one or more of the following features. One of the wireless communication devices can be or include the computer device. One of the wireless communication devices can be or include the network-connected device. The computer device can be located remote from the wireless communication devices and/or the network-connected device.
While this specification contains many details, these should not be understood as limitations on the scope of what may be claimed, but rather as descriptions of features specific to particular examples. Certain features that are described in this specification or shown in the drawings in the context of separate implementations can also be combined. Conversely, various features that are described or shown in the context of a single implementation can also be implemented in multiple embodiments separately or in any suitable subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single product or packaged into multiple products.
A number of embodiments have been described. Nevertheless, it will be understood that various modifications can be made. Accordingly, other embodiments are within the scope of the following claims.
This application is a continuation of U.S. patent application Ser. No. 16/425,310, filed May 29, 2019, which claims priority to U.S. Provisional Application No. 62/686,446 entitled “Motion Detection Based on Beamforming Dynamic Information” and filed Jun. 18, 2018. The priority applications are hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
4054879 | Wright et al. | Oct 1977 | A |
4649388 | Atlas | Mar 1987 | A |
4740045 | Goodson et al. | Apr 1988 | A |
5270720 | Stove | Dec 1993 | A |
5613039 | Wang et al. | Mar 1997 | A |
5696514 | Nathanson et al. | Dec 1997 | A |
6075797 | Thomas | Jun 2000 | A |
6380882 | Hegnauer | Apr 2002 | B1 |
6573861 | Hommel et al. | Jun 2003 | B1 |
6636763 | Junker | Oct 2003 | B1 |
6914854 | Heberley et al. | Jul 2005 | B1 |
7652617 | Kurtz et al. | Jan 2010 | B2 |
8463191 | Farajidana et al. | Jun 2013 | B2 |
8660578 | Yang et al. | Feb 2014 | B1 |
8671069 | Chang et al. | Mar 2014 | B2 |
8710984 | Wilson et al. | Apr 2014 | B2 |
8812654 | Gelvin et al. | Aug 2014 | B2 |
8832244 | Gelvin et al. | Sep 2014 | B2 |
8836344 | Habib et al. | Sep 2014 | B2 |
8836503 | Gelvin et al. | Sep 2014 | B2 |
9030321 | Breed | May 2015 | B2 |
9253592 | Moscovich et al. | Feb 2016 | B1 |
9329701 | Lautner | May 2016 | B2 |
9523760 | Kravets et al. | Dec 2016 | B1 |
9524628 | Omer et al. | Dec 2016 | B1 |
9551784 | Katuri et al. | Jan 2017 | B2 |
9584974 | Omer et al. | Feb 2017 | B1 |
9609468 | Moscovich et al. | Mar 2017 | B1 |
9628365 | Gelvin et al. | Apr 2017 | B2 |
9692459 | Maltsev et al. | Jun 2017 | B2 |
9743294 | Omer et al. | Aug 2017 | B1 |
9869759 | Furuskog et al. | Jan 2018 | B2 |
9927519 | Omer et al. | Mar 2018 | B1 |
9933517 | Olekas et al. | Apr 2018 | B1 |
9946351 | Sakaguchi | Apr 2018 | B2 |
9989622 | Griesdorf et al. | Jun 2018 | B1 |
10004076 | Griesdorf et al. | Jun 2018 | B1 |
10048350 | Piao et al. | Aug 2018 | B1 |
10051414 | Omer et al. | Aug 2018 | B1 |
10077204 | Maschmeyer et al. | Sep 2018 | B2 |
10108903 | Piao et al. | Oct 2018 | B1 |
10109167 | Olekas et al. | Oct 2018 | B1 |
10109168 | Devison et al. | Oct 2018 | B1 |
10111228 | Griesdorf et al. | Oct 2018 | B2 |
10129853 | Manku et al. | Nov 2018 | B2 |
20020080014 | McCarthy | Jun 2002 | A1 |
20030108119 | Mohebbi et al. | Jun 2003 | A1 |
20060152404 | Fullerton et al. | Jul 2006 | A1 |
20060284757 | Zemany | Dec 2006 | A1 |
20070036353 | Reznik | Feb 2007 | A1 |
20070296571 | Kolen | Dec 2007 | A1 |
20080119130 | Sinha | May 2008 | A1 |
20080240008 | Backes et al. | Oct 2008 | A1 |
20080258907 | Kalpaxis | Oct 2008 | A1 |
20080300055 | Lutnick | Dec 2008 | A1 |
20080303655 | Johnson | Dec 2008 | A1 |
20090062696 | Nathan et al. | Mar 2009 | A1 |
20090180444 | McManus et al. | Jul 2009 | A1 |
20100073686 | Medeiros et al. | Mar 2010 | A1 |
20100127853 | Hanson et al. | May 2010 | A1 |
20100130229 | Sridhara et al. | May 2010 | A1 |
20100306320 | Leppanen et al. | Dec 2010 | A1 |
20100315284 | Trizna et al. | Dec 2010 | A1 |
20110019587 | Wang | Jan 2011 | A1 |
20110035491 | Gelvin et al. | Feb 2011 | A1 |
20110263946 | el Kaliouby | Oct 2011 | A1 |
20120115512 | Grainger et al. | May 2012 | A1 |
20120146788 | Wilson et al. | Jun 2012 | A1 |
20120283896 | Persaud | Nov 2012 | A1 |
20130017836 | Chang et al. | Jan 2013 | A1 |
20130090151 | Ngai et al. | Apr 2013 | A1 |
20130094538 | Wang | Apr 2013 | A1 |
20130113647 | Sentelle et al. | May 2013 | A1 |
20130162459 | Aharony et al. | Jun 2013 | A1 |
20130178231 | Morgan | Jul 2013 | A1 |
20130283256 | Proud | Oct 2013 | A1 |
20140028539 | Newham et al. | Jan 2014 | A1 |
20140135042 | Buchheim et al. | May 2014 | A1 |
20140148195 | Bassan-Eskenazi et al. | May 2014 | A1 |
20140247179 | Furuskog | Sep 2014 | A1 |
20140266669 | Fadell et al. | Sep 2014 | A1 |
20140274218 | Kadiwala et al. | Sep 2014 | A1 |
20140286380 | Prager et al. | Sep 2014 | A1 |
20140329540 | Duggan et al. | Nov 2014 | A1 |
20140355713 | Bao et al. | Dec 2014 | A1 |
20140361920 | Katuri et al. | Dec 2014 | A1 |
20150043377 | Cholas et al. | Feb 2015 | A1 |
20150063323 | Sadek et al. | Mar 2015 | A1 |
20150078295 | Mandyam et al. | Mar 2015 | A1 |
20150098377 | Amini et al. | Apr 2015 | A1 |
20150159100 | Shi et al. | Jun 2015 | A1 |
20150181388 | Smith | Jun 2015 | A1 |
20150195100 | Imes et al. | Jul 2015 | A1 |
20150212205 | Shpater | Jul 2015 | A1 |
20150245164 | Merrill | Aug 2015 | A1 |
20150288745 | Moghaddam et al. | Oct 2015 | A1 |
20150304886 | Liu et al. | Oct 2015 | A1 |
20150309166 | Sentelle et al. | Oct 2015 | A1 |
20150312877 | Bhanage | Oct 2015 | A1 |
20150338507 | Oh et al. | Nov 2015 | A1 |
20150350849 | Huang et al. | Dec 2015 | A1 |
20160018508 | Chen et al. | Jan 2016 | A1 |
20160054804 | Gollakota et al. | Feb 2016 | A1 |
20160088438 | O'Keeffe | Mar 2016 | A1 |
20160088631 | Hedayat et al. | Mar 2016 | A1 |
20160135205 | Barbu et al. | May 2016 | A1 |
20160150418 | Kang et al. | May 2016 | A1 |
20160183059 | Nagy et al. | Jun 2016 | A1 |
20160187475 | Horng et al. | Jun 2016 | A1 |
20160210838 | Yan et al. | Jul 2016 | A1 |
20160259421 | Gollakota et al. | Sep 2016 | A1 |
20160262355 | Swan | Sep 2016 | A1 |
20160363663 | Mindell | Dec 2016 | A1 |
20170042488 | Muhsin | Feb 2017 | A1 |
20170052247 | Kong et al. | Feb 2017 | A1 |
20170055126 | O'Keeffe | Feb 2017 | A1 |
20170055131 | Kong et al. | Feb 2017 | A1 |
20170059190 | Stefanski et al. | Mar 2017 | A1 |
20170086281 | Avrahamy | Mar 2017 | A1 |
20170090026 | Joshi et al. | Mar 2017 | A1 |
20170111852 | Selen et al. | Apr 2017 | A1 |
20170123528 | Hu | May 2017 | A1 |
20170126488 | Cordeiro et al. | May 2017 | A1 |
20170146656 | Belsley et al. | May 2017 | A1 |
20170155439 | Chang et al. | Jun 2017 | A1 |
20170195893 | Lee et al. | Jul 2017 | A1 |
20170223628 | Snyder et al. | Aug 2017 | A1 |
20170278374 | Skaaksrud | Sep 2017 | A1 |
20170280351 | Skaaksrud | Sep 2017 | A1 |
20170311279 | Allegue et al. | Oct 2017 | A1 |
20170311574 | Swan | Nov 2017 | A1 |
20170343658 | Ramirez et al. | Nov 2017 | A1 |
20180027389 | Shirakata et al. | Jan 2018 | A1 |
20180086264 | Pedersen | Mar 2018 | A1 |
20180106885 | Blayvas | Apr 2018 | A1 |
20180120420 | McMahon | May 2018 | A1 |
20180157336 | Harris et al. | Jun 2018 | A1 |
20180180706 | Li et al. | Jun 2018 | A1 |
20180288587 | Allegue Martinez et al. | Oct 2018 | A1 |
20180330293 | Kulkarni et al. | Nov 2018 | A1 |
20190272718 | Hurtig | Sep 2019 | A1 |
20190384409 | Omer et al. | Dec 2019 | A1 |
Number | Date | Country |
---|---|---|
2834522 | May 2014 | CA |
2945702 | Aug 2015 | CA |
104615244 | May 2015 | CN |
105807935 | Jul 2016 | CN |
1997-507298 | Jul 1997 | JP |
2004286567 | Oct 2004 | JP |
2013072865 | Apr 2013 | JP |
2014021574 | Feb 2014 | WO |
2014201574 | Dec 2014 | WO |
2015168700 | Nov 2015 | WO |
2016005977 | Jan 2016 | WO |
2016066822 | May 2016 | WO |
2016110844 | Jul 2016 | WO |
2016170011 | Oct 2016 | WO |
2017106976 | Jun 2017 | WO |
2017132765 | Aug 2017 | WO |
2017177303 | Oct 2017 | WO |
2017193200 | Nov 2017 | WO |
2017210770 | Dec 2017 | WO |
2018094502 | May 2018 | WO |
2019041019 | Mar 2019 | WO |
2019241877 | Dec 2019 | WO |
Entry |
---|
EPO, communication pursuant to Article 94(3) EPC mailed Feb. 14, 2023, in EP 19823575.6, 5 pgs. |
JPO, Office Action issued in Application No. 2020-569199 on Jun. 12, 2023, 2 pages. |
EPO, Extended European Search Report mailed Jul. 27, 2021, in EP 19823575.6, 11 pgs. |
USPTO, Final Office Action mailed Oct. 26, 2020, in U.S. Appl. No. 16/425,310, 15 pgs. |
USPTO, Non-Final Office Action mailed May 27, 2020, in U.S. Appl. No. 16/425,310, 15 pgs. |
USPTO, Final Office Action mailed Jan. 17, 2020, in U.S. Appl. No. 16/425,310, 18 pgs. |
USPTO, Notice of Allowance mailed Oct. 19, 2022, in U.S. Appl. No. 16/425,310, 25 pgs. |
USPTO, Non-Final Office Action mailed Jul. 12, 2019, in U.S. Appl. No. 16/425,310, 29 pgs. |
USPTO, Advisory Action mailed Apr. 7, 2020, in U.S. Appl. No. 16/425,310, 3 pgs. |
WIPO, International Search Report and Written Opinion mailed Aug. 16, 2019, in PCT/CA2019/050843, 9 pgs. |
Abdelnasser, et al., “WiGest: A Ubiquitous WiFi-based Gesture Recognition System”, IEEE Conf. on Computer Communications, 2015, 9 pgs. |
Dekker, et al., “Gesture Recognition with a Low Power FMCW Radar and a Deep Convolutional Neural Network”, Proceedings of the 14th European Radar Conference, Nuremberg, Germany, Oct. 11-13, 2017, 4 pgs. |
Domenico, et al., “Exploring Training Options for RF Sensing Using CSI”, IEEE Communications Magazine, 2018, vol. 56, Issue 5, pp. 116-123, 8 pgs. |
Iqbal, et al., “Indoor Motion Classification Using Passive RF Sensing Incorporating Deep Learning”, ISSN: 2577-2465, Electronic IEEE, Jun. 3, 2018, 5 pgs. |
Kosba, et al., “Robust WLAN Device-free Passive Motion Detection”, IEEE Wireless Communications and Networking Conference, Apr. 2012, 6 pgs. |
Wang, et al., “Wi-Fi CSI-Based Behavior Recognition: From Signals and Actions to Activities”, IEEE Communications Magazine, May 1, 2018, 7 pgs. |
Youssef, Moustafa, et al., “Challenges: Device-free Passive Localization for Wireless Environments”, Mobicom '07 Proceedings of the 13th Annual ACM International Conference on Mobile Computing and Networking, Sep. 2007, 11 pgs. |
EPO, Communication pursuant to Article 94(3) issued in Application No. 19823575.6 on Mar. 12, 2024, 8 pages. |
Li, Hong , et al., “WiFinger: Talk to Your Smart Devices with Finger-grained Gesture”, UBICOMP '16, Session: Interacting Using the Hands and Eyes, Sep. 12-16, 2016, pp. 250-261, 12 pages. |
KIPO, Office Action issued in Application No. 2020-7036155 on Aug. 26, 2024, 19 pages. |
Number | Date | Country | |
---|---|---|---|
20230125109 A1 | Apr 2023 | US |
Number | Date | Country | |
---|---|---|---|
62686446 | Jun 2018 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16425310 | May 2019 | US |
Child | 18145941 | US |