The following description relates to using multiple-input/multiple-output (MIMO) training fields for motion detection.
Motion detection systems have been used to detect movement, for example, of objects in a room or an outdoor area. In some example motion detection systems, infrared or optical sensors are used to detect movement of objects in the sensor's field of view. Motion detection systems have been used in security systems, automated control systems and other types of systems.
In some aspects of what is described here, a wireless sensing system can process wireless signals (e.g., radio frequency signals) transmitted through a space between wireless communication devices for wireless sensing applications. Example wireless sensing applications include detecting motion, which can include one or more of the following: detecting motion of objects in the space, motion tracking, localization of motion in a space, breathing detection, breathing monitoring, presence detection, gesture detection, gesture recognition, human detection (e.g., moving and stationary human detection), human tracking, fall detection, speed estimation, intrusion detection, walking detection, step counting, respiration rate detection, sleep pattern detection, apnea estimation, posture change detection, activity recognition, gait rate classification, gesture decoding, sign language recognition, hand tracking, heart rate estimation, breathing rate estimation, room occupancy detection, human dynamics monitoring, and other types of motion detection applications. Other examples of wireless sensing applications include object recognition, speaking recognition, keystroke detection and recognition, tamper detection, touch detection, attack detection, user authentication, driver fatigue detection, traffic monitoring, smoking detection, school violence detection, human counting, metal detection, human recognition, bike localization, human queue estimation, Wi-Fi imaging, and other types of wireless sensing applications. For instance, the wireless sensing system may operate as a motion detection system to detect the existence and location of motion based on Wi-Fi signals or other types of wireless signals.
The examples described herein may be useful for home monitoring. In some instances, home monitoring using the wireless sensing systems described herein may provide several advantages, including full home coverage through walls and in darkness, discreet detection without cameras, higher accuracy and reduced false alerts (e.g., in comparison with sensors that do not use Wi-Fi signals to sense their environments), and adjustable sensitivity. By layering Wi-Fi motion detection capabilities into routers and gateways, a robust motion detection system may be provided.
The examples described herein may also be useful for wellness monitoring. Caregivers want to know their loved ones are safe, while seniors and people with special needs want to maintain their independence at home with dignity. In some instances, wellness monitoring using the wireless sensing systems described herein may provide a solution that uses wireless signals to detect motion without using cameras or infringing on privacy, generates alerts when unusual activity is detected, tracks sleep patterns, and generates preventative health data. For example, caregivers can monitor motion, visits from health care professionals, and unusual behavior such as staying in bed longer than normal. Furthermore, motion is monitored unobtrusively without the need for wearable devices, and the wireless sensing systems described herein offer a more affordable and convenient alternative to assisted living facilities and other security and health monitoring tools.
The examples described herein may also be useful for setting up a smart home. In some examples, the wireless sensing systems described herein use predictive analytics and artificial intelligence (AI) to learn motion patterns and trigger smart home functions accordingly. Examples of smart home functions that may be triggered include adjusting the thermostat when a person walks through the front door, turning other smart devices on or off based on preferences, automatically adjusting lighting, adjusting HVAC systems based on present occupants, etc.
In some aspects of what is described here, a multiple-input-multiple-output (MIMO) training field included in a wireless signal is used for motion detection. For instance, an HE-LTF field in a PHY frame of a wireless transmission according to the Wi-Fi 6 standard (IEEE 802.11ax) may be used for motion detection. The wireless signals may be transmitted through a space over a time period, for example, from one wireless communication device to another. A high-efficiency long training field (HE-LTF) or another type of MIMO training field may be identified in the PHY frame of each wireless signal. A Legacy PHY field may also be identified in the PHY frame of each wireless signal. Example Legacy PHY fields include L-LTF and L-STF. In some cases, channel information is generated based on the respective MIMO training fields and the respective Legacy PHY fields. The channel information obtained from the Legacy PHY fields can be used to make a macro-level determination of whether motion has occurred in the space during the time period. The channel information obtained from the MIMO training fields can be used to detect fine-grained motion attributes, for example, the location or direction of motion in the space during the time period.
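For purposes of illustration, the two-tier use of channel information described above (a macro-level determination from the Legacy PHY fields, and finer-grained attributes from the MIMO training fields) may be sketched as follows. This is a minimal, assumption-laden illustration in Python with NumPy: the threshold, array sizes, and function names are illustrative, and in practice the channel estimates would come from the receiver's L-LTF and HE-LTF processing.

```python
import numpy as np

def macro_motion(legacy_estimates, threshold=0.1):
    """Coarse determination of whether motion occurred, from packet-to-packet
    change in the lower-resolution Legacy (L-LTF) channel estimates."""
    stack = np.stack(legacy_estimates)
    change = np.mean(np.abs(np.diff(stack, axis=0)))
    return change > threshold

def fine_grained_variability(he_estimates):
    """Per-subcarrier variability of the higher-resolution HE-LTF channel
    estimates; the shape of this profile can feed finer motion attributes
    (e.g., location or direction features) downstream."""
    stack = np.stack(he_estimates)
    return np.std(np.abs(stack), axis=0)

rng = np.random.default_rng(7)
base = rng.standard_normal(52) + 1j * rng.standard_normal(52)
static_series = [base.copy() for _ in range(8)]              # unchanged channel
moving_series = [base + 0.5 * (rng.standard_normal(52)
                               + 1j * rng.standard_normal(52))
                 for _ in range(8)]                          # perturbed channel

he_base = rng.standard_normal(241) + 1j * rng.standard_normal(241)
he_series = [he_base + 0.1 * (rng.standard_normal(241)
                              + 1j * rng.standard_normal(241))
             for _ in range(8)]
profile = fine_grained_variability(he_series)
```

The 52- and 241-point array sizes mirror the number of valid subcarriers in the Legacy and HE training fields, respectively, discussed later in this description.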
In some instances, aspects of the systems and techniques described here provide technical improvements and advantages over existing approaches. For example, the MIMO training field may provide signals having higher frequency resolution, a greater number of subcarrier frequencies, and a higher frequency bandwidth (or a combination of these features) compared to signals provided by the Legacy PHY fields, which may provide more accurate and fine-grained motion detection capabilities. In some cases, motion detection can be performed with higher spatial and temporal resolution, precision and accuracy. The technical improvements and advantages achieved in examples where the wireless sensing system is used for motion detection may also be achieved in examples where the wireless sensing system is used for other wireless sensing applications.
In some instances, a wireless sensing system can be implemented using a wireless communication network. Wireless signals received at one or more wireless communication devices in the wireless communication network may be analyzed to determine channel information for the different communication links (between respective pairs of wireless communication devices) in the network. The channel information may be representative of a physical medium that applies a transfer function to wireless signals that traverse a space. In some instances, the channel information includes a channel response. Channel responses can characterize a physical communication path, representing the combined effect of, for example, scattering, fading, and power decay within the space between the transmitter and receiver. In some instances, the channel information includes beamforming state information (e.g., a feedback matrix, a steering matrix, channel state information (CSI), etc.) provided by a beamforming system. Beamforming is a signal processing technique often used in multi-antenna (multiple-input/multiple-output (MIMO)) radio systems for directional signal transmission or reception. Beamforming can be achieved by operating elements in an antenna array in such a way that signals at particular angles experience constructive interference while others experience destructive interference.
The channel information for each of the communication links may be analyzed by one or more motion detection algorithms (e.g., running on a hub device, a client device, or other device in the wireless communication network, or on a remote device communicably coupled to the network) to detect, for example, whether motion has occurred in the space, to determine a relative location of the detected motion, or both. In some aspects, the channel information for each of the communication links may be analyzed to detect whether an object is present or absent, e.g., when no motion is detected in the space.
In some instances, a motion detection system returns motion data. In some implementations, motion data is a result that is indicative of a degree of motion in the space, the location of motion in the space, the direction of motion in the space, a time at which the motion occurred, or a combination thereof. In some instances, the motion data can include a motion score, which may include, or may be, one or more of the following: a scalar quantity indicative of a level of signal perturbation in the environment accessed by the wireless signals; an indication of whether there is motion; an indication of whether there is an object present; or an indication or classification of a gesture performed in the environment accessed by the wireless signals.
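As one concrete (hypothetical) way to hold such a result, the motion data described above could be structured as a small record. The field names below are illustrative assumptions, not taken from any particular implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MotionData:
    """Illustrative container for a motion detection result."""
    motion_score: float                # scalar level of signal perturbation
    motion_detected: bool              # indication of whether there is motion
    location: Optional[str] = None     # e.g., a zone label for localized motion
    direction: Optional[str] = None    # direction of motion, if determined
    timestamp: Optional[float] = None  # time at which the motion occurred

# A hypothetical result reported by a motion detection system
result = MotionData(motion_score=0.82, motion_detected=True,
                    location="kitchen", timestamp=1700000000.0)
```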
In some implementations, the motion detection system can be implemented using one or more motion detection algorithms. Example motion detection algorithms that can be used to detect motion based on wireless signals include the techniques described in U.S. Pat. No. 9,523,760 entitled “Detecting Motion Based on Repeated Wireless Transmissions,” U.S. Pat. No. 9,584,974 entitled “Detecting Motion Based on Reference Signal Transmissions,” U.S. Pat. No. 10,051,414 entitled “Detecting Motion Based On Decompositions Of Channel Response Variations,” U.S. Pat. No. 10,048,350 entitled “Motion Detection Based on Groupings of Statistical Parameters of Wireless Signals,” U.S. Pat. No. 10,108,903 entitled “Motion Detection Based on Machine Learning of Wireless Signal Properties,” U.S. Pat. No. 10,109,167 entitled “Motion Localization in a Wireless Mesh Network Based on Motion Indicator Values,” U.S. Pat. No. 10,109,168 entitled “Motion Localization Based on Channel Response Characteristics,” U.S. Pat. No. 10,743,143 entitled “Determining a Motion Zone for a Location of Motion Detected by Wireless Signals,” U.S. Pat. No. 10,605,908 entitled “Motion Detection Based on Beamforming Dynamic Information from Wireless Standard Client Devices,” U.S. Pat. No. 10,605,907 entitled “Motion Detection by a Central Controller Using Beamforming Dynamic Information,” U.S. Pat. No. 10,600,314 entitled “Modifying Sensitivity Settings in a Motion Detection System,” U.S. Pat. No. 10,567,914 entitled “Initializing Probability Vectors for Determining a Location of Motion Detected from Wireless Signals,” U.S. Pat. No. 10,565,860 entitled “Offline Tuning System for Detecting New Motion Zones in a Motion Detection System,” U.S. Pat. No. 10,506,384 entitled “Determining a Location of Motion Detected from Wireless Signals Based on Prior Probability,” U.S. Pat. No. 10,499,364 entitled “Identifying Static Leaf Nodes in a Motion Detection System,” U.S. Pat. No.
10,498,467 entitled “Classifying Static Leaf Nodes in a Motion Detection System,” U.S. Pat. No. 10,460,581 entitled “Determining a Confidence for a Motion Zone Identified as a Location of Motion for Motion Detected by Wireless Signals,” U.S. Pat. No. 10,459,076 entitled “Motion Detection based on Beamforming Dynamic Information,” U.S. Pat. No. 10,459,074 entitled “Determining a Location of Motion Detected from Wireless Signals Based on Wireless Link Counting,” U.S. Pat. No. 10,438,468 entitled “Motion Localization in a Wireless Mesh Network Based on Motion Indicator Values,” U.S. Pat. No. 10,404,387 entitled “Determining Motion Zones in a Space Traversed by Wireless Signals,” U.S. Pat. No. 10,393,866 entitled “Detecting Presence Based on Wireless Signal Analysis,” U.S. Pat. No. 10,380,856 entitled “Motion Localization Based on Channel Response Characteristics,” U.S. Pat. No. 10,318,890 entitled “Training Data for a Motion Detection System using Data from a Sensor Device,” U.S. Pat. No. 10,264,405 entitled “Motion Detection in Mesh Networks,” U.S. Pat. No. 10,228,439 entitled “Motion Detection Based on Filtered Statistical Parameters of Wireless Signals,” U.S. Pat. No. 10,129,853 entitled “Operating a Motion Detection Channel in a Wireless Communication Network,” U.S. Pat. No. 10,111,228 entitled “Selecting Wireless Communication Channels Based on Signal Quality Metrics,” and other techniques.
The example wireless communication system 100 includes three wireless communication devices 102A, 102B, 102C. The example wireless communication system 100 may include additional wireless communication devices 102 and/or other components (e.g., one or more network servers, network routers, network switches, cables, or other communication links, etc.).
The example wireless communication devices 102A, 102B, 102C can operate in a wireless network, for example, according to a wireless network standard or another type of wireless communication protocol. For example, the wireless network may be configured to operate as a Wireless Local Area Network (WLAN), a Personal Area Network (PAN), a metropolitan area network (MAN), or another type of wireless network. Examples of WLANs include networks configured to operate according to one or more of the 802.11 family of standards developed by IEEE (e.g., Wi-Fi networks), and others. Examples of PANs include networks that operate according to short-range communication standards (e.g., BLUETOOTH®, Near Field Communication (NFC), ZigBee), millimeter wave communications, and others.
In some implementations, the wireless communication devices 102A, 102B, 102C may be configured to communicate in a cellular network, for example, according to a cellular network standard. Examples of cellular networks include networks configured according to 2G standards such as Global System for Mobile (GSM) and Enhanced Data rates for GSM Evolution (EDGE) or EGPRS; 3G standards such as Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Universal Mobile Telecommunications System (UMTS), and Time Division Synchronous Code Division Multiple Access (TD-SCDMA); 4G standards such as Long-Term Evolution (LTE) and LTE-Advanced (LTE-A); 5G standards, and others.
In some cases, one or more of the wireless communication devices 102 is a Wi-Fi access point or another type of wireless access point (WAP). In some cases, one or more of the wireless communication devices 102 is an access point of a wireless mesh network, such as, for example, a commercially-available mesh network system (e.g., GOOGLE Wi-Fi, EERO mesh, etc.). In some instances, one or more of the wireless communication devices 102 can be implemented as wireless access points (APs) in a mesh network, while the other wireless communication device(s) 102 are implemented as leaf devices (e.g., mobile devices, smart devices, etc.) that access the mesh network through one of the APs. In some cases, one or more of the wireless communication devices 102 is a mobile device (e.g., a smartphone, a smart watch, a tablet, a laptop computer, etc.), a wireless-enabled device (e.g., a smart thermostat, a Wi-Fi enabled camera, a smart TV), or another type of device that communicates in a wireless network.
In the example shown in
In the example shown in
In the example shown in
In some examples, the wireless signals may propagate through a structure (e.g., a wall) before or after interacting with a moving object, which may allow the object's motion to be detected without an optical line-of-sight between the moving object and the transmission or receiving hardware. In some instances, the motion detection system may communicate the motion detection event to another device or system, such as a security system or a control center.
In some cases, the wireless communication devices 102 themselves are configured to perform one or more operations of the motion detection system, for example, by executing computer-readable instructions (e.g., software or firmware) on the wireless communication devices. For example, each device may process received wireless signals to detect motion based on changes in the communication channel. In some cases, another device (e.g., a remote server, a cloud-based computer system, a network-attached device, etc.) is configured to perform one or more operations of the motion detection system. For example, each wireless communication device 102 may send channel information to a specified device, system or service that performs operations of the motion detection system.
In an example aspect of operation, wireless communication devices 102A, 102B may broadcast wireless signals or address wireless signals to the other wireless communication device 102C, and the wireless communication device 102C (and potentially other devices) receives the wireless signals transmitted by the wireless communication devices 102A, 102B. The wireless communication device 102C (or another system or device) then processes the received wireless signals to detect motion of an object in a space accessed by the wireless signals (e.g., in the zones 110A, 110B). In some instances, the wireless communication device 102C (or another system or device) may perform one or more operations of a motion detection system.
In some cases, a combination of one or more of the wireless communication devices 204A, 204B, 204C can be part of, or may be used by, a motion detection system. The example wireless communication devices 204A, 204B, 204C can transmit wireless signals through a space 200. The example space 200 may be completely or partially enclosed or open at one or more boundaries of the space 200. The space 200 may be or may include an interior of a room, multiple rooms, a building, an indoor area, outdoor area, or the like. A first wall 202A, a second wall 202B, and a third wall 202C at least partially enclose the space 200 in the example shown.
In the example shown in
As shown, an object is in a first position 214A at an initial time (t0) in
As shown in
In
The example wireless signals shown in
The transmitted signal may have a number of frequency components in a frequency bandwidth, and the transmitted signal may include one or more bands within the frequency bandwidth. The transmitted signal may be transmitted from the first wireless communication device 204A in an omnidirectional manner, in a directional manner or otherwise. In the example shown, the wireless signals traverse multiple respective paths in the space 200, and the signal along each path may become attenuated due to path losses, scattering, reflection, or the like and may have a phase or frequency offset.
As shown in
In the example shown in
In the example shown in
In the example shown in
When the client devices 232 seek to connect to and associate with their respective APs 226, 228, the client devices 232 may go through an authentication and association phase with their respective APs 226, 228. Among other things, the association phase assigns address information (e.g., an association ID or another type of unique identifier) to each of the client devices 232. For example, within the IEEE 802.11 family of standards for Wi-Fi, each of the client devices 232 may identify itself using a unique address (e.g., a 48-bit address, an example being the MAC address), although the client devices 232 may be identified using other types of identifiers embedded within one or more fields of a message. The address information (e.g., MAC address or another type of unique identifier) can be either hardcoded and fixed, or randomly generated according to the network address rules at the start of the association process. Once the client devices 232 have associated with their respective APs 226, 228, their respective address information may remain fixed. Subsequently, a transmission by the APs 226, 228 or the client devices 232 typically includes the address information (e.g., MAC address) of the transmitting wireless device and the address information (e.g., MAC address) of the receiving device.
In the example shown in
In the example shown in
The motion detection system, which may include one or more motion detection or localization processes running on the one or more of the client devices 232 or on one or more of the APs 226, 228, may collect and process data (e.g., channel information) corresponding to local links that are participating in the operation of the wireless sensing system. The motion detection system may be installed as a software or firmware application on the client devices 232 or on the APs 226, 228, or may be part of the operating systems of the client devices 232 or the APs 226, 228.
In some implementations, the APs 226, 228 do not contain motion detection software and are not otherwise configured to perform motion detection in the space 201. Instead, in such implementations, the operations of the motion detection system are executed on one or more of the client devices 232. In some implementations, the channel information may be obtained by the client devices 232 by receiving wireless signals from the APs 226, 228 (or possibly from other client devices 232) and processing the wireless signals to obtain the channel information. For example, the motion detection system running on the client devices 232 may have access to channel information provided by the client device's radio firmware (e.g., Wi-Fi radio firmware) so that channel information may be collected and processed.
In some implementations, the client devices 232 send a request to their corresponding AP 226, 228 to transmit wireless signals that can be used by the client device as motion probes to detect motion of objects in the space 201. The request sent to the corresponding AP 226, 228 may be a null data packet frame, a beamforming request, a ping, standard data traffic, or a combination thereof. In some implementations, the client devices 232 are stationary while performing motion detection in the space 201. In other examples, one or more of the client devices 232 may be mobile and may move within the space 201 while performing motion detection.
Mathematically, a signal f(t) transmitted from a wireless communication device (e.g., the wireless communication device 204A in
f(t) = Σn cn exp(jωn t)   (1)

where ωn represents the frequency of the nth frequency component of the transmitted signal, cn represents the complex coefficient of the nth frequency component, and t represents time. With the transmitted signal f(t) being transmitted, an output signal rk(t) from a path k may be described according to Equation (2):

rk(t) = Σn αn,k cn exp(j(ωn t + ϕn,k))   (2)
where αn,k represents an attenuation factor (or channel response; e.g., due to scattering, reflection, and path losses) for the nth frequency component along path k, and ϕn,k represents the phase of the signal for the nth frequency component along path k. Then, the received signal R at a wireless communication device can be described as the summation of all output signals rk(t) from all paths to the wireless communication device, which is shown in Equation (3):

R = Σk rk(t)   (3)
Substituting Equation (2) into Equation (3) renders the following Equation (4):

R = Σk Σn αn,k cn exp(j(ωn t + ϕn,k)) = Σn Yn exp(jωn t), where Yn = Σk cn αn,k exp(jϕn,k)   (4)
The received signal R at a wireless communication device (e.g., the wireless communication devices 204B, 204C in
The complex value Yn for a given frequency component ωn indicates a relative magnitude and phase offset of the received signal at that frequency component ωn. The signal f(t) may be repeatedly transmitted within a time period, and the complex value Yn can be obtained for each transmitted signal f(t). When an object moves in the space, the complex value Yn changes over the time period due to the channel response αn,k of the space changing. Accordingly, a change detected in the channel response (and thus, the complex value Yn) can be indicative of motion of an object within the communication channel. Conversely, a stable channel response may indicate lack of motion. Thus, in some implementations, the complex values Yn for each of multiple devices in a wireless network can be processed to detect whether motion has occurred in a space traversed by the transmitted signals f(t). The channel response can be expressed in either the time-domain or frequency-domain, and the Fourier-Transform or Inverse-Fourier-Transform can be used to switch between the time-domain expression of the channel response and the frequency-domain expression of the channel response.
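The relationship above can be illustrated numerically. The sketch below is an assumption-laden toy model (two paths with hand-picked attenuation factors, not the system's actual processing): it builds the complex values Yn for a two-path channel from the path sums in Equation (4), and shows that Yn is unchanged when the paths are static but shifts when a path's phase changes, as when an object moves along the reflected path.

```python
import numpy as np

def received_Yn(cn, alpha, phi):
    """Yn = sum over paths k of cn * alpha_{n,k} * exp(j * phi_{n,k}),
    i.e., the per-frequency complex values implied by the path sums above."""
    return cn * np.sum(alpha * np.exp(1j * phi), axis=1)

n = 52                  # number of frequency components
cn = np.ones(n)         # transmitted complex coefficients (unit, for simplicity)
rng = np.random.default_rng(2)

# Two paths: a direct path and a weaker reflection with random per-frequency phases.
alpha = np.column_stack([np.full(n, 1.0), np.full(n, 0.3)])
phi = np.column_stack([np.zeros(n), rng.uniform(0, 2 * np.pi, n)])

Y_static = received_Yn(cn, alpha, phi)
Y_repeat = received_Yn(cn, alpha, phi)   # channel unchanged: identical Yn

# Motion on the reflected path changes that path's phase (and, in general,
# its attenuation), which perturbs Yn.
phi_moved = phi.copy()
phi_moved[:, 1] += 0.5
Y_moved = received_Yn(cn, alpha, phi_moved)

perturbation = np.mean(np.abs(Y_moved - Y_static))   # nonzero when motion occurs
```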
In another aspect of
In some implementations, for example, a steering matrix may be generated at a transmitter device (beamformer) based on a feedback matrix provided by a receiver device (beamformee) based on channel sounding. Because the steering and feedback matrices are related to propagation characteristics of the channel, these beamforming matrices change as objects move within the channel. Changes in the channel characteristics are accordingly reflected in these matrices, and by analyzing the matrices, motion can be detected, and different characteristics of the detected motion can be determined. In some implementations, a spatial map may be generated based on one or more beamforming matrices. The spatial map may indicate a general direction of an object in a space relative to a wireless communication device. In some cases, “modes” of a beamforming matrix (e.g., a feedback matrix or steering matrix) can be used to generate the spatial map. The spatial map may be used to detect the presence of motion in the space or to detect a location of the detected motion.
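As an illustration of using the “modes” of a beamforming matrix, the sketch below takes the dominant singular vector of a stand-in feedback-style matrix and correlates it with steering vectors over a grid of candidate angles to form a coarse spatial map. The array geometry (a three-element uniform linear array with half-wavelength spacing) and the matrix values are assumptions made for this example only.

```python
import numpy as np

rng = np.random.default_rng(3)

n_ant, n_streams = 3, 2
# Stand-in feedback matrix (rows: antennas, columns: spatial streams).
V = (rng.standard_normal((n_ant, n_streams))
     + 1j * rng.standard_normal((n_ant, n_streams)))

# "Modes" of the matrix: singular vectors, ordered by singular value (energy).
U, s, _ = np.linalg.svd(V, full_matrices=False)
dominant_mode = U[:, 0]

# Coarse spatial map: correlate the dominant mode with steering vectors for
# candidate angles (uniform linear array, half-wavelength element spacing).
angles = np.linspace(-90.0, 90.0, 181)
phase_step = np.pi * np.sin(np.deg2rad(angles))                 # per-element phase
steering = np.exp(1j * np.outer(np.arange(n_ant), phase_step))  # n_ant x 181
spatial_map = np.abs(steering.conj().T @ dominant_mode)
estimated_angle = angles[int(np.argmax(spatial_map))]
```

The peak of `spatial_map` indicates the general direction most strongly represented in the dominant mode; tracking how that peak moves across packets is one way such a map could support motion localization.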
In some implementations, the output of the motion detection system may be provided as a notification for graphical display on a user interface on a user device. In some implementations, the user device is the device used to detect motion, a user device of a caregiver or emergency contact designated to an individual in the space 200, 201, or any other user device that is communicatively coupled to the motion detection system to receive notifications from the motion detection system.
In some instances, the graphical display includes a plot of motion data indicating a degree of motion detected by the motion detection system for each time point in a series of time points. The graphical display can display the relative degree of motion detected by each node of the motion detection system. The graphical display can help the user determine an appropriate action to take in response to the motion detection event, correlate the motion detection event with the user's observation or knowledge, determine whether the motion detection event was true or false, etc.
In some implementations, the output of the motion detection system may be provided in real-time (e.g., to an end user). Additionally or alternatively, the output of the motion detection system may be stored (e.g., locally on the wireless communication devices 204, client devices 232, the APs 226, 228, or on a cloud-based storage service) and analyzed to reveal statistical information over a time frame (e.g., hours, days, or months). An example where the output of the motion detection system may be stored and analyzed to reveal statistical information over a time frame is in health monitoring, vital sign monitoring, sleep monitoring, etc. In some implementations, an alert (e.g., a notification, an audio alert, or a video alert) may be provided based on the output of the motion detection system. For example, a motion detection event may be communicated to another device or system (e.g., a security system or a control center), a designated caregiver, or a designated emergency contact based on the output of the motion detection system.
In some implementations, a wireless motion detection system can detect motion by analyzing components of wireless signals that are specified by a wireless communication standard. For example, a motion detection system may analyze standard headers of wireless signals exchanged in a wireless communication network. One such example is the IEEE 802.11ax standard, which is also known as “Wi-Fi 6.” A draft of the IEEE 802.11ax standard is published in a document entitled “P802.11ax/D4.0, IEEE Draft Standard for Information Technology—Telecommunications and Information Exchange Between Systems Local and Metropolitan Area Networks—Specific Requirements Part 11: Wireless LAN Medium Access Control (MAC) and Physical Layer (PHY) Specifications Amendment Enhancements for High Efficiency WLAN,” March 2019, which is accessible at https://ieeexplore.ieee.org/document/8672643 and hereby incorporated by reference in its entirety. Standard headers specified by other types of wireless communication standards may be used for motion detection in some cases.
In some implementations, a motion detection algorithm used by a wireless motion detection system utilizes a channel response (an output of a channel estimation process) computed by a wireless receiver (e.g., a Wi-Fi receiver). For example, the channel responses computed by a channel estimation process according to a Wi-Fi 6 standard may be received as inputs to the motion detection algorithm. The channel estimation in the Wi-Fi 6 standard occurs at the PHY layer, using the PHY Frame (the PHY Frame is also called a PPDU) of the received wireless signal.
In some examples, a motion detection algorithm employed by a wireless motion detection system uses channel responses computed from orthogonal frequency-division multiplexing (OFDM)-based PHY frames (including those produced by the Wi-Fi 6 standard). The OFDM-based PHY frames can, in some instances, be frequency-domain signals having multiple fields, each having a corresponding frequency-domain signal. With this class of OFDM-based PHY frames, there are typically two types of PPDU fields that allow the Wi-Fi receiver to estimate the channel. The first is the Legacy-Training-Field, and the second is the set of MIMO-Training-Fields. Either or both fields may be used for motion detection. An example of a MIMO-Training-Field that may be used is the so-called “High-Efficiency Long Training Field,” known as the HE-LTF (e.g., in the Wi-Fi 6 standard, according to the IEEE 802.11ax standard).
In the example shown in
In some IEEE 802.11 standards, the PHY layer is broken into two sub-layers: the PLCP Sublayer (Physical Layer Convergence Procedure), and the PMD Sublayer (PHY Medium Dependent). The PLCP Sublayer (Physical Layer Convergence Procedure) takes data from the MAC layer and translates it into a PHY frame format. The format of the PHY frame is also referred to as a PPDU (PLCP Protocol Data Unit). A PPDU may include fields that are used for channel estimation. The PMD Sublayer (PHY Medium Dependent) provides a modulation scheme for the PHY layer. Many different IEEE 802.11-based PHY frame formats are defined. In some examples, a wireless motion detection system uses information derived from OFDM based PHY frames, such as, for example, those described in the following standard documents: IEEE 802.11a-1999: Legacy OFDM PHY; IEEE 802.11n-2009: HT PHY (High-Throughput); IEEE 802.11ac-2013: VHT PHY (Very-High-Throughput); IEEE 802.11ax (Draft 4.0, March 2019): HE PHY (High-Efficiency).
Other types of PHY layer data may be used, and each PHY layer specification may provide its own PPDU format. For instance, the PPDU format for a PHY layer specification may be found in some IEEE 802.11 standards under the section heading “<XXX> PHY Specification”==>“<XXX> PHY”==>“<XXX> PPDU Format”. The example PHY frame 300 shown in
In some IEEE 802.11 standards (e.g., IEEE 802.11a-1999), the OFDM PHY divides a 20 MHz channel into 64 frequency bins. Modulation and demodulation are done using a 64-point complex inverse Fast Fourier Transform (IFFT) and Fast Fourier Transform (FFT). In an example modulation process: data bits are grouped (e.g., depending on the QAM constellation); each group of bits is assigned to one of the subcarriers (or frequency bins); depending on the QAM constellation, each group of bits is mapped to a complex number for its subcarrier; and a 64-point IFFT is performed to generate the complex time-domain I and Q waveforms for transmission. In an example demodulation process: the complex I and Q time-domain signals are received; a 64-point FFT is performed to compute a complex number for each subcarrier; depending on the QAM constellation, each subcarrier's complex number is mapped to bits; and the bits from each subcarrier are re-assembled into data. In a typical modulation or demodulation process, not all 64 subcarriers are used; for example, only 52 of the subcarriers may be considered valid for data and pilot, and the rest of the subcarriers may be considered NULLED. The PHY layer specifications in more recently-developed IEEE 802.11 standards utilize larger channel bandwidths (e.g., 40 MHz, 80 MHz, and 160 MHz).
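The modulation and demodulation steps above can be sketched as a round trip. QPSK is chosen here for concreteness, and the set of valid subcarrier indices is the 52-bin legacy layout with DC and the band edges nulled; the bit-to-symbol mapping is a simplified stand-in rather than the exact mapping from any standard.

```python
import numpy as np

N_FFT = 64
# Valid subcarrier bins in FFT order: +1..+26 and -26..-1 (bins 38..63);
# DC (bin 0) and the band edges are NULLED, giving 52 usable bins.
valid = np.r_[1:27, 38:64]

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=2 * len(valid))   # QPSK: 2 bits per subcarrier

# Modulation: map bit pairs to QPSK constellation points, place them on the
# valid subcarriers, and run a 64-point IFFT to get the time-domain waveform.
symbols = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)
freq = np.zeros(N_FFT, dtype=complex)
freq[valid] = symbols
iq = np.fft.ifft(freq)            # complex time-domain I/Q samples

# Demodulation: 64-point FFT, read the subcarriers back, slice to bits.
rx = np.fft.fft(iq)
rx_syms = rx[valid]
rx_bits = np.empty_like(bits)
rx_bits[0::2] = (rx_syms.real < 0).astype(int)
rx_bits[1::2] = (rx_syms.imag < 0).astype(int)
```

With an ideal (noiseless, channel-free) link as here, the demodulated bits match the transmitted bits exactly.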
As shown in
In some implementations, a wireless communication device computes a channel response, for example, by performing a channel estimation process based on a PHY frame. For instance, a wireless communication device may perform channel estimation based on the example PHY frame 300 shown in
In some instances, the channel information used for motion detection may include a channel response generated by channel estimation based on the L-LTF in the PHY frame. The L-LTF in the IEEE 802.11ax standard can be equivalent to the LTF in the IEEE 802.11a-1999 standard. The L-LTF may be provided in the frequency domain as an input to a 64-point IFFT. Typically, only 52 of the 64 points are considered valid points for channel estimation; the remaining points (subcarriers [−32, −27] and [27, 31], as well as the DC subcarrier) are zero. As described in the IEEE 802.11a-1999 standard, the L-LTF may be a long OFDM training symbol including 53 subcarriers (including a zero value at DC), which are modulated by the elements of a sequence L given by the following:
The example “L” vector shown above represents the complex frequency-domain representation of the field at baseband (centered around DC) and is described on page 13 of a draft of the IEEE 802.11a-1999 standard. The draft of the IEEE 802.11a-1999 standard is published in a document entitled “802.11a-1999-IEEE Standard for Telecommunications and Information Exchange Between Systems—LAN/MAN Specific Requirements—Part 11: Wireless Medium Access Control (MAC) and physical layer (PHY) specifications: High Speed Physical Layer in the 5 GHz band” and accessible at https://ieeexplore.ieee.org/document/815305. The example “L” vector is considered “Legacy,” as it was part of the original OFDM PHY specification, and is considered part of the legacy preamble. Hence, in later specification versions, it is referred to as the L-LTF (for Legacy Long Training Field).
In some instances, the channel information used for motion detection may include a channel response generated by channel estimation based on one or more of the MIMO training fields in the PHY frame (e.g., the HE-LTF, HT-LTF, or VHT-LTF fields). The HE-LTF may be provided in the frequency domain as an input to a 256-point IFFT. With a typical HE-LTF, there are 241 valid points (e.g., instead of 52 as in the legacy case). Each point in the HE-LTF represents a frequency range of 78.125 kHz, whereas each Legacy point represents a larger frequency range of 312.5 kHz. Therefore, the HE-LTF may provide higher frequency resolution, more frequency domain data points, and a larger frequency bandwidth, which can provide more accurate and higher-resolution (e.g., higher temporal and spatial resolution) motion detection. An example HE-LTF is described on page 561 of the draft of the IEEE 802.11ax standard as follows:
In a 20 MHz transmission, the 4x HE-LTF sequence transmitted on subcarriers [−122, 122] is given by Equation (27-42).
In some instances, a channel response can be estimated on a receiver device by performing an FFT of the received time-domain sequence (e.g., the example L-LTF and HE-LTF sequences shown above), and dividing by the expected result [CH(N) = RX(N)/L(N)]. The 64-Point FFT Bin 600 in the top portion of
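The per-subcarrier division [CH(N) = RX(N)/L(N)] can be illustrated with a toy example. The short BPSK sequence below is a hypothetical stand-in for the 52-element L-LTF sequence, and noise is omitted, so the estimate is exact:

```python
import numpy as np

rng = np.random.default_rng(1)
# Known training sequence on the valid subcarriers (BPSK-modulated,
# like the L-LTF; a short stand-in for the real 52-element sequence L)
L = rng.choice([-1.0, 1.0], size=8).astype(complex)

# True channel at each subcarrier (unknown to the receiver)
CH_true = rng.normal(size=8) + 1j * rng.normal(size=8)

# After the receiver's FFT, each subcarrier carries RX(N) = CH(N) * L(N);
# dividing by the known training value recovers the channel estimate
RX = CH_true * L
CH_est = RX / L
```

In practice the received `RX` also contains noise, so `CH_est` is an estimate rather than the exact channel; with BPSK training values of ±1, the division amounts to a sign flip per subcarrier.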
In the example shown in
In the example shown in
The propagation environment represented by the signal paths shown in
Here, the integer k indexes the three signal paths, and the coefficients αk are complex phasors that represent the magnitude and phase of the scattering along each signal path. The values of the coefficients αk are determined by physical characteristics of the environment, for example, free space propagation and the type of scattering objects present. In some examples, increasing attenuation along a signal path (e.g., by an absorbing medium like a human body or otherwise) may generally decrease the magnitude of the corresponding coefficient αk. Similarly, a human body or another medium acting as a scatterer can change the magnitude and phase of the coefficients αk.
A time-domain representation of a filter may have additional or different pulses or other features. The number of pulses, as well as their respective locations on the time axis and their respective magnitudes, may vary according to the scattering profile of the environment. For example, if an object were to appear towards the end of the coverage area (e.g., at scatterer 710B), this may cause the third pulse (at time τ3) to move towards the left or the right. The first pulse (at time τ1) typically represents the earliest arrival, or the direct line of sight, in most systems; accordingly, if an object were to come into the line of sight between the transmitter and receiver, this pulse would be affected. In some instances, the distance and direction of motion (relative to the transmitter and receiver) in the propagation environment can be inferred by observing the behavior of these pulses over time. As an example, in some instances, an object moving towards the line of sight may affect the third, second and first pulses in that order, while an object moving away from the line of sight may affect the pulses in the opposite order.
Taking the Fourier transform of the filter h(t) from Equation (6) provides a frequency representation of the filter:
In the frequency representation shown in Equation (7), each impulse from Equation (6) has been converted to a complex exponential (a sine and cosine wave). Each complex exponential rotates across frequency at a rate given by its associated pulse time τk, with a magnitude and phase given by the corresponding coefficient αk.
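The relationship between Equations (6) and (7) can be sketched numerically. The coefficients and pulse times below are illustrative values for a three-path channel, not measurements:

```python
import numpy as np

# Three-path channel: complex coefficients alpha_k and pulse times tau_k
# (illustrative values; in practice these are set by the environment)
alpha = np.array([1.0, 0.5 - 0.2j, 0.25 + 0.1j])
tau = np.array([50e-9, 120e-9, 300e-9])   # seconds

def H(f):
    """Frequency response per Equation (7): each time-domain impulse
    alpha_k * delta(t - tau_k) becomes a complex exponential whose
    rotation rate across frequency is set by tau_k."""
    f = np.atleast_1d(f)
    return (alpha * np.exp(-2j * np.pi * np.outer(f, tau))).sum(axis=1)

# Sample the response at the legacy subcarrier spacing of 312.5 kHz
freqs = np.arange(-26, 27) * 312.5e3
resp = H(freqs)
# A single path alone would give |H(f)| constant; the sum of three
# rotating exponentials makes the magnitude ripple across frequency.
```

At f = 0 every exponential equals one, so H(0) is simply the sum of the coefficients, which is a quick consistency check on the model.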
In some implementations, when a wireless communication device (e.g., a WiFi transceiver) receives a wireless signal, the wireless communication device obtains a frequency-domain representation from the PHY frames of the wireless signals, which may be expressed in the form of Equation (7) or otherwise. In some instances, a motion detection system can convert the frequency-domain representation to a time-domain representation, which may be expressed in the form of Equation (6) or otherwise. The motion detection system may then make inferences regarding motion in the propagation environment (e.g., near/far, line of sight/non line of sight motion) based on the time-domain representation.
In some implementations, the motion detection system uses channel responses estimated based on a Legacy PHY field (e.g., L-STF, L-LTF) and channel responses estimated based on a MIMO training field (e.g., HE-LTF, VHT-LTF, HT-LTF) to make inferences regarding motion in the propagation environment. In some instances, the differences in the continuous frequency bandwidths and frequency resolutions of the MIMO training field and the Legacy PHY field can be used to detect motion in a space with varying granularity. For example, the time-domain channel responses estimated based on the Legacy PHY field (e.g., referred to as Legacy PHY-based channel responses) may be used to make a macro-level determination of whether motion has occurred in the propagation environment, while the time-domain channel responses estimated based on the MIMO training field (e.g., referred to as MIMO field-based channel responses) may be used to make a finer-grained determination of motion. As an example, MIMO field-based channel responses can be used to make inferences regarding the location of the motion in the propagation environment, the direction of the motion in the propagation environment, or both. The MIMO field-based channel responses may be referred to as HE-LTF-based channel responses, HT-LTF-based channel responses, or VHT-LTF-based channel responses depending on which MIMO training field is used to estimate the channel response.
As an illustration,
Since the duration of the time window 804 is greater than the duration of the time window 802, the MIMO field-based channel response is able to detect the third pulse (at time τ3) without aliasing artifacts, thus accurately revealing the existence of indirect signal path 704C and scatterer 710B in the propagation environment. Furthermore, since the continuous frequency bandwidth of the MIMO training fields is greater than the continuous frequency bandwidth of the Legacy PHY fields, the MIMO field-based channel response has a finer (e.g., higher) temporal resolution than the Legacy PHY-based channel response. Since the MIMO field-based channel response has a finer (e.g., higher) temporal resolution, shifts of any of the channel response's pulses towards the left or the right (e.g. caused by motion along the direct path 704A or indirect paths 704B, 704C) can be detected without aliasing artifacts. For example, motion at the scatterer 710B (e.g., caused by motion of the scatterer 710B or motion of an object near the scatterer 710B) may cause the third pulse (at time τ3) to move towards the left or the right. The finer (e.g., higher) temporal resolution of the MIMO field-based channel response can detect the shifts in the third pulse (at time τ3) without aliasing artifacts, thus allowing an inference that motion has occurred at the location of the scatterer 710B.
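Using the figures quoted earlier (52 legacy points at 312.5 kHz spacing versus 241 HE-LTF points at 78.125 kHz spacing), a back-of-the-envelope comparison makes the window and resolution differences concrete. The rules of thumb applied here (unambiguous time window ≈ 1/subcarrier spacing, temporal resolution ≈ 1/sampled bandwidth) are standard DFT relationships, not values taken from the standards documents:

```python
# Unambiguous time window = 1 / subcarrier spacing;
# temporal resolution ~ 1 / total sampled bandwidth.
legacy_spacing = 312.5e3        # Hz per legacy point
he_spacing = 78.125e3           # Hz per HE-LTF point
legacy_points, he_points = 52, 241

legacy_window = 1 / legacy_spacing                 # 3.2 microseconds
he_window = 1 / he_spacing                         # 12.8 microseconds
legacy_resolution = 1 / (legacy_points * legacy_spacing)   # ~61.5 ns
he_resolution = 1 / (he_points * he_spacing)               # ~53.1 ns
```

The 4x finer HE-LTF subcarrier spacing yields a 4x longer unambiguous time window (so later pulses such as the one at time τ3 fall inside the window instead of aliasing), and the somewhat larger sampled bandwidth yields a finer temporal resolution.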
Furthermore, the direction of motion (relative to the transmitter and receiver) in the propagation environment can, in some instances, be inferred by determining a change in the channel response's pulses over time. For example,
Although the Legacy PHY-based channel responses have smaller time windows and coarser (e.g., lower) temporal resolution than the MIMO field-based channel responses, the Legacy PHY-based channel responses can be used to make macro-level determination of whether motion has occurred in the propagation environment. For example, motion may be detected by identifying substantial changes over time in the coefficients and pulse times of the Legacy PHY-based channel responses.
The OFDM signal 1002 shown in
The transmitted OFDM signal 1002 passes through a propagation environment (e.g., from the wireless communication device 702A to the wireless communication device 702B in
The propagation environment transforms the transmitted OFDM signal 1002 and its components (e.g., MIMO training fields and other components) to form the received OFDM signal 1022. The effect of the propagation environment on the wireless signal may be represented as the channel multiplied by the signal (both in the frequency domain), which produces the received OFDM signal at the receiver. The third plot 1000C represents the received OFDM signal 1022 in the frequency domain. Thus, the received OFDM signal 1022 represents the transmitted OFDM signal 1002 as modified by the channel 1012.
As shown in
In some cases, a time-domain representation of the channel 1102 sampled by the bands 1104A, 1104B of the received wireless signal can be constructed (e.g., by applying a Fast Fourier Transform to the frequency-domain representation). The time-domain representation may include a number of pulses at times τk, for example, in the format represented in Equation (6) and
In some implementations, an optimization process may be used to take any number of sampled frequency bands and convert them to a pulse-based model (e.g., the time-domain representation shown in Equation (6)). The process may be formulated as the following optimization problem:
The minimization problem in Equation (8) seeks to identify K paths through the channel, in which τk is the delay of path k, such that the response of the resulting time-domain pulses matches the observed frequency response of the channel. The minimization operator seeks to minimize the difference between (1) the frequency response computed using a candidate set of τk values and (2) the measured frequency response. In general, any suitable optimization methodology can be used to minimize the difference over the set of τk values. Once the optimization is complete, the output is a set of τk values. These values are the delays of pulses that allow the time-domain response to best match the observed frequency-domain response.
Accordingly, solving the optimization problem in Equation (8) corresponds to finding the pulse times τk that minimize the residual of this set of equations, for all frequencies over which the channel response has been sampled. This is a nonlinear optimization problem because these equations are a nonlinear function of the pulse times τk, despite being a linear function of the coefficients αk. In some cases, this optimization problem is solved by an iterative greedy process, such as, for example, stage-wise least squares. For example, a matrix equation may be formulated as follows:
H(fl) = [e^(−j2π fl τk)] [αk]
Here, the matrix is created by sweeping the values of the pulse times over the rows and sweeping the values of the frequencies over the columns. The value fl in this case represents a vector of all frequencies over which the channel response has been observed for a given signal. A column from the matrix that is maximally correlated with the output H(fl) may be selected, and the coefficient αk corresponding to that column can be found. The result can then be subtracted from the output H(fl), and the process can be repeated to provide K columns (each corresponding to a pulse at time τk) and K coefficients. In some cases, the value of K can be estimated a priori based on the dynamic range of the radio receiver, and hence the amount of noise in the CSI estimate. In some cases, the value of K can be estimated based on studies of indoor environments, which restrict the number of distinct pulses that can be observed in a typical environment. For instance, the free space propagation loss, combined with limited radio dynamic range, may restrict the number of pulses observable through a radio to less than ten in some environments. In such environments, the stage-wise least squares operation can be iterated until a predetermined, small integer number of values for the pulse times τk and coefficients αk have been extracted. In some cases, some of the values could be zero or negligibly small.
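The iterative greedy process described above can be sketched as a matching-pursuit loop over a grid of candidate pulse times. This is an illustrative sketch with a synthetic two-pulse channel, and the dictionary here is laid out with frequencies over the rows and candidate delays over the columns (the transpose of the row/column convention described above):

```python
import numpy as np

def extract_pulses(H_obs, freqs, tau_grid, K):
    """Greedy stage-wise least squares: at each stage, pick the candidate
    delay whose column e^(-j2*pi*f*tau) is maximally correlated with the
    residual, fit its coefficient alpha, subtract, and repeat K times."""
    A = np.exp(-2j * np.pi * np.outer(freqs, tau_grid))   # dictionary
    residual = np.asarray(H_obs, dtype=complex).copy()
    taus, alphas = [], []
    for _ in range(K):
        k = np.argmax(np.abs(A.conj().T @ residual))      # best column
        col = A[:, k]
        alpha = (col.conj() @ residual) / (col.conj() @ col)
        residual = residual - alpha * col                 # peel it off
        taus.append(tau_grid[k])
        alphas.append(alpha)
    return np.array(taus), np.array(alphas)

# Synthetic two-pulse channel sampled at the legacy subcarrier frequencies
freqs = np.arange(-26, 27) * 312.5e3
true_tau = np.array([100e-9, 400e-9])
true_alpha = np.array([1.0, 0.5j])
H_obs = (true_alpha * np.exp(-2j * np.pi * np.outer(freqs, true_tau))).sum(axis=1)

tau_grid = np.arange(0, 1e-6, 50e-9)   # candidate pulse times
taus, alphas = extract_pulses(H_obs, freqs, tau_grid, K=2)
```

Because the true delays here happen to lie on the candidate grid and there is no noise, the loop recovers both pulse times and approximately recovers the coefficients; with off-grid delays or noise, the recovered values are approximations and the choice of K matters, as discussed above.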
As shown in
Each of the frequency responses H1(f) and H2(f) is provided to the h(t) estimator block 1202, which generates respective time-domain channel estimates h1(t) and h2(t). In some implementations, the h(t) estimator block 1202 generates the time-domain channel estimates h1(t) and h2(t) based on the optimization expressed in Equation (8). Each of the time-domain channel estimates h1(t) and h2(t) can be represented as pulses having coefficients αk at pulse times τk (e.g., as seen in plot 800 in
The Fourier transform block 1206 then transforms the estimated time-domain representations h1(t) and h2(t) to respective estimated frequency-domain representations Ĥ1(f) and Ĥ2(f) by applying a Fourier transform. An error value is then computed based on a difference between each estimated frequency-domain representation and the respective received frequency response. For example, the error value between the received frequency response H1(f) and the estimated frequency-domain representation Ĥ1(f) can be computed based on the difference between Ĥ1(f) and H1(f). Similarly, the error value between the received frequency response H2(f) and the estimated frequency-domain representation Ĥ2(f) can be computed based on the difference between Ĥ2(f) and H2(f). This process loop can be iterated until the error value has been reduced to a sufficiently low value.
The error value is provided to the model-based threshold block 1208. The threshold block 1208 determines (e.g., based on the radio dynamic range, the free space propagation loss, and potentially other factors) an appropriate threshold for the system to have converged to a baseline model for the pulse times τk. Once converged, the threshold detector closes the latch 1210 (e.g., by outputting a certain value that causes the latch 1210 to close), which allows the detected coefficients αk and associated pulse times τk to move to the coefficient tracking block 1212. The coefficient tracking block 1212 takes the estimated frequency-domain representations H1(f) and H2(f) at each time step, and recomputes the coefficients αk to ensure that the channel model is tracked sufficiently closely. In some implementations, when large-scale changes happen in the propagation environment, the error loop is triggered again for a fresh computation of the coefficients αk and pulse times τk, which are then propagated to the coefficient tracker 1212. The output of the tracker 1212, representing the reflected pulses and their corresponding complex multiplier coefficients, is given to the motion inference engine 1214 to detect motion characteristics. For example, the motion inference engine 1214 may identify motion of an object in a space by analyzing changes in the coefficients αk and pulse times τk over time. As discussed above, changes in the Legacy PHY-based channel responses may be used to make a macro-level determination of whether motion has occurred, while changes in the MIMO field-based channel responses may be used to make a finer-grained determination of motion (e.g., the location of motion, the direction of motion, or both).
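The coefficient tracking step can be sketched as follows, assuming the pulse times τk have already converged and been latched. Because the τk are held fixed, refitting the coefficients αk to each new frequency response is an ordinary linear least-squares problem:

```python
import numpy as np

def track_coefficients(H_t, freqs, taus):
    """Coefficient tracking: with the pulse times tau_k latched, only
    the complex coefficients alpha_k are refit to the newly received
    frequency response. Fixing tau_k makes this a linear problem."""
    A = np.exp(-2j * np.pi * np.outer(freqs, taus))
    alphas, *_ = np.linalg.lstsq(A, H_t, rcond=None)
    return alphas

# Usage: at the next time step a new frequency response arrives with
# slightly changed coefficients but unchanged pulse times
freqs = np.arange(-26, 27) * 312.5e3
taus = np.array([100e-9, 400e-9])
alpha_new = np.array([0.95 + 0.05j, 0.4j])
H_t = np.exp(-2j * np.pi * np.outer(freqs, taus)) @ alpha_new

alpha_tracked = track_coefficients(H_t, freqs, taus)
```

In this noiseless sketch the refit coefficients match `alpha_new` exactly; a full implementation would also monitor the residual, so that a large-scale environmental change re-triggers the error loop for fresh pulse times, as described above.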
As shown in
At 1302, wireless signals that are transmitted through a space (e.g., the spaces 200, 201) over a time period are received. The wireless signals may be transmitted between the example wireless communication devices 102A, 102B, 102C shown in
Each of the wireless signals may be formatted according to a wireless communication standard. In some instances, the wireless signals may be formatted according to the IEEE 802.11 standard and may include a PHY frame, examples being the example PHY frame 300 shown in
At 1304, a Legacy PHY field (e.g., L-STF and L-LTF) and a MIMO training field (e.g., HE-LTF, HT-LTF, or VHT-LTF) are identified in the PHY frame of each of the wireless signals. A first frequency-domain signal (e.g., H1(f)) may be included in the Legacy PHY field, and a second frequency-domain signal (e.g., H2(f)) may be included in the MIMO training field. At 1306, a first time-domain channel estimate (e.g., h1(t)) is generated based on the first frequency-domain signal, and at 1308, a second time-domain channel estimate (e.g., h2(t)) is generated based on the second frequency-domain signal. Since the continuous frequency bandwidth of the MIMO training field is greater than the continuous frequency bandwidth of the Legacy PHY field, the temporal resolution of the first time-domain channel estimate is coarser than the temporal resolution of the second time-domain channel estimate. Stated differently, the temporal resolution of the first time-domain channel estimate is lower than the temporal resolution of the second time-domain channel estimate.
At 1310, a determination is made whether motion has occurred in the space based on the first time-domain channel estimate (e.g., h1(t)). In some instances, the determination at 1310 is a macro-level indication of whether motion has occurred in the space. At 1312, a location of motion within the space is determined based on the second time-domain channel estimate (e.g., h2(t)). In some instances, the determination at 1312 is finer-grained motion data that allows a motion detection system to localize motion within the space and to, in some instances, determine a direction of motion within the space (e.g., as illustrated in
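One way to sketch the two determinations at 1310 and 1312 is shown below. The function and its inputs (arrays of pulse coefficients from the coarse and fine channel estimates) are hypothetical simplifications of what a motion detection system might implement, not the claimed method itself:

```python
import numpy as np

def detect_and_locate(h1_prev, h1_curr, h2_prev, h2_curr, thresh):
    """Hypothetical two-stage decision: the coarse (Legacy PHY-based)
    estimate h1 gates a macro-level motion flag; the fine (MIMO
    field-based) estimate h2 indicates which pulse, and hence which
    signal path, changed the most."""
    # Macro-level determination of whether motion occurred
    moved = np.linalg.norm(np.asarray(h1_curr) - np.asarray(h1_prev)) > thresh
    if not moved:
        return False, None
    # Finer-grained determination: index of the most-changed pulse
    changed = int(np.argmax(np.abs(np.asarray(h2_curr) - np.asarray(h2_prev))))
    return True, changed

# Example: the pulse for the second indirect path changes the most
flag, path = detect_and_locate([1.0, 0.5], [1.0, 0.8],
                               [1.0, 0.5, 0.25], [1.0, 0.5, 0.6],
                               thresh=0.1)
```

A real system would map the changed pulse index back to a delay τk, and hence to a region of the space, rather than returning a bare index.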
The example interface 1430 can communicate (receive, transmit, or both) wireless signals. For example, the interface 1430 may be configured to communicate radio frequency (RF) signals formatted according to a wireless communication standard (e.g., Wi-Fi, 4G, 5G, Bluetooth, etc.). In some implementations, the example interface 1430 includes a radio subsystem and a baseband subsystem. The radio subsystem may include, for example, one or more antennas and radio frequency circuitry. The radio subsystem can be configured to communicate radio frequency wireless signals on the wireless communication channels. As an example, the radio subsystem may include a radio chip, an RF front end, and one or more antennas. The baseband subsystem may include, for example, digital electronics configured to process digital baseband data. In some cases, the baseband subsystem may include a digital signal processor (DSP) device or another type of processor device. In some cases, the baseband system includes digital processing logic to operate the radio subsystem, to communicate wireless network traffic through the radio subsystem or to perform other types of processes.
The example processor 1410 can execute instructions, for example, to generate output data based on data inputs. The instructions can include programs, codes, scripts, modules, or other types of data stored in memory 1420. Additionally or alternatively, the instructions can be encoded as pre-programmed or re-programmable logic circuits, logic gates, or other types of hardware or firmware components or modules. The processor 1410 may be or include a general-purpose microprocessor, a specialized co-processor, or another type of data processing apparatus. In some cases, the processor 1410 performs high-level operation of the wireless communication device 1400. For example, the processor 1410 may be configured to execute or interpret software, scripts, programs, functions, executables, or other instructions stored in the memory 1420. In some implementations, the processor 1410 may be included in the interface 1430 or another component of the wireless communication device 1400.
The example memory 1420 may include computer-readable storage media, for example, a volatile memory device, a non-volatile memory device, or both. The memory 1420 may include one or more read-only memory devices, random-access memory devices, buffer memory devices, or a combination of these and other types of memory devices. In some instances, one or more components of the memory can be integrated or otherwise associated with another component of the wireless communication device 1400. The memory 1420 may store instructions that are executable by the processor 1410. For example, the instructions may include instructions to perform one or more of the operations described above.
The example power unit 1440 provides power to the other components of the wireless communication device 1400. For example, the other components may operate based on electrical power provided by the power unit 1440 through a voltage bus or other connection. In some implementations, the power unit 1440 includes a battery or a battery system, for example, a rechargeable battery. In some implementations, the power unit 1440 includes an adapter (e.g., an AC adapter) that receives an external power signal (from an external source) and converts the external power signal to an internal power signal conditioned for a component of the wireless communication device 1400. The power unit 1440 may include other components or operate in another manner.
Some of the subject matter and operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Some of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on a computer storage medium for execution by, or to control the operation of, data-processing apparatus. A computer storage medium can be, or can be included in, a computer-readable storage device, a computer-readable storage substrate, a random or serial access memory array or device, or a combination of one or more of them. Moreover, while a computer storage medium is not a propagated signal, a computer storage medium can be a source or destination of computer program instructions encoded in an artificially generated propagated signal. The computer storage medium can also be, or be included in, one or more separate physical components or media (e.g., multiple CDs, disks, or other storage devices).
Some of the operations described in this specification can be implemented as operations performed by a data processing apparatus on data stored on one or more computer-readable storage devices or received from other sources.
The term “data-processing apparatus” encompasses all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations, of the foregoing. The apparatus can include special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit). The apparatus can also include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them.
A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
Some of the processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform actions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
To provide for interaction with a user, operations can be implemented on a computer having a display device (e.g., a monitor, or another type of display device) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse, a trackball, a tablet, a touch sensitive screen, or another type of pointing device) by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.
In a general aspect, one or more fields in a PHY frame are used for motion detection.
In a general example, wireless signals are transmitted over a time period from a first wireless communication device, through a space, to a second wireless communication device. The wireless signals are formatted according to a wireless communication standard, and each wireless signal includes a PHY frame according to the standard. A MIMO training field (e.g., HE-LTF, HT-LTF, or VHT-LTF) is identified in the PHY frame of each wireless signal. Channel responses are generated based on the respective MIMO training fields. The channel responses are used to detect motion (e.g., motion of an object) that occurred in the space during the time period.
Implementations of the general example may include one or more of the following features. The wireless communication standard is a standard for multiple-input-multiple-output (MIMO) radio communications, a MIMO training field is identified in the PHY frame of each wireless signal, and the channel responses are generated based on the respective MIMO training fields. The wireless communication standard is the IEEE 802.11ax standard. The channel responses are used to detect the location of the motion that occurred in the space during the time period. The channel responses may be analyzed in a time-domain representation, for example, to detect motion or the location of a moving object.
In a first example, wireless signals that are transmitted through a space over a time period are received. The wireless signals may be transmitted between wireless communication devices in a wireless communication network and can be formatted according to a wireless communication standard. A first training field and a second, different training field are identified in an orthogonal frequency-division multiplexing (OFDM)-based PHY frame of each wireless signal. A first time-domain channel estimate and a second time-domain channel estimate are generated for each wireless signal. The first time-domain channel estimate may be based on a first frequency-domain signal included in the first training field of the wireless signal, while the second time-domain channel estimate may be based on a second frequency-domain signal included in the second training field of the wireless signal. In some instances, the temporal resolution of the first time-domain channel estimate is coarser (e.g., lower) than a temporal resolution of the second time-domain channel estimate. A determination is made whether motion has occurred in the space during the time period based on the first time-domain channel estimates, and a location of the motion within the space is determined based on the second time-domain channel estimates.
Implementations of the first example may include one or more of the following features. Determining the location of the motion within the space can include determining a direction of the motion within the space based on the second time-domain channel estimates. A frequency resolution of the first frequency-domain signal may be coarser (e.g., lower) than a frequency resolution of the second frequency-domain signal. The first training field can include a legacy training field of the OFDM-based PHY frame, and the second training field can include a multiple-input-multiple-output (MIMO) training field of the OFDM-based PHY frame. The MIMO training field can include a high-efficiency long training field (HE-LTF). The MIMO training field can include a very high throughput long training field (VHT-LTF). The MIMO training field can include a high throughput long training field (HT-LTF). The wireless communication standard can be the IEEE 802.11 standard. The wireless communication network can be a wireless local area network (WLAN).
In a second example, a non-transitory computer-readable medium stores instructions that are operable when executed by data processing apparatus to perform one or more operations of the first example. In a third example, a system includes a plurality of wireless communication devices, and a computer device configured to perform one or more operations of the first example.
Implementations of the third example may include one or more of the following features. One of the wireless communication devices can be or include the computer device. The computer device can be located remote from the wireless communication devices.
While this specification contains many details, these should not be understood as limitations on the scope of what may be claimed, but rather as descriptions of features specific to particular examples. Certain features that are described in this specification or shown in the drawings in the context of separate implementations can also be combined. Conversely, various features that are described or shown in the context of a single implementation can also be implemented in multiple embodiments separately or in any suitable subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single product or packaged into multiple products.
A number of embodiments have been described. Nevertheless, it will be understood that various modifications can be made. Accordingly, other embodiments are within the scope of the description above.
This application is a continuation of U.S. patent application Ser. No. 17/082,456, filed Oct. 28, 2020, entitled “Using MIMO Training Fields for Motion Detection,” which claims priority to U.S. Provisional App. No. 62/928,684, filed Oct. 31, 2019, entitled “Using MIMO Training Fields for Motion Detection,” the contents of which are hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
4054879 | Wright et al. | Oct 1977 | A |
4197537 | Follen et al. | Apr 1980 | A |
4417157 | Gershberg et al. | Nov 1983 | A |
4636774 | Galvin et al. | Jan 1987 | A |
4649388 | Atlas | Mar 1987 | A |
4740045 | Goodson et al. | Apr 1988 | A |
5270720 | Stove | Dec 1993 | A |
5613039 | Wang et al. | Mar 1997 | A |
5696514 | Nathanson et al. | Dec 1997 | A |
6075797 | Thomas | Jun 2000 | A |
6380882 | Hegnauer | Apr 2002 | B1 |
6573861 | Hommel et al. | Jun 2003 | B1 |
6636763 | Junker et al. | Oct 2003 | B1 |
6914854 | Heberley et al. | Jul 2005 | B1 |
7652617 | Kurtz et al. | Jan 2010 | B2 |
7742753 | Carrero et al. | Jun 2010 | B2 |
8463191 | Farajidana et al. | Jun 2013 | B2 |
8660578 | Yang et al. | Feb 2014 | B1 |
8671069 | Chang et al. | Mar 2014 | B2 |
8710984 | Wilson et al. | Apr 2014 | B2 |
8760392 | Lloyd et al. | Jun 2014 | B2 |
8812654 | Gelvin et al. | Aug 2014 | B2 |
8832244 | Gelvin et al. | Sep 2014 | B2 |
8836344 | Habib et al. | Sep 2014 | B2 |
8836503 | Gelvin et al. | Sep 2014 | B2 |
8934884 | Gustafsson et al. | Jan 2015 | B2 |
9019148 | Bikhazi et al. | Apr 2015 | B1 |
9030321 | Breed | May 2015 | B2 |
9161172 | Poduri et al. | Oct 2015 | B2 |
9253592 | Moscovich et al. | Feb 2016 | B1 |
9329701 | Lautner | May 2016 | B2 |
9389085 | Khorashadi et al. | Jul 2016 | B2 |
9523760 | Kravets et al. | Dec 2016 | B1 |
9524628 | Omer et al. | Dec 2016 | B1 |
9551784 | Katuri et al. | Jan 2017 | B2 |
9584974 | Omer et al. | Feb 2017 | B1 |
9609468 | Moscovich et al. | Mar 2017 | B1 |
9628365 | Gelvin et al. | Apr 2017 | B2 |
9661508 | Siomina | May 2017 | B2 |
9692459 | Maltsev et al. | Jun 2017 | B2 |
9743294 | Omer et al. | Aug 2017 | B1 |
9866308 | Bultan et al. | Jan 2018 | B1 |
9869759 | Furuskog et al. | Jan 2018 | B2 |
9927519 | Omer et al. | Mar 2018 | B1 |
9933517 | Olekas et al. | Apr 2018 | B1 |
9946351 | Sakaguchi et al. | Apr 2018 | B2 |
9989622 | Griesdorf et al. | Jun 2018 | B1 |
10004076 | Griesdorf et al. | Jun 2018 | B1 |
10048350 | Piao et al. | Aug 2018 | B1 |
10051414 | Omer et al. | Aug 2018 | B1 |
10077204 | Maschmeyer et al. | Sep 2018 | B2 |
10108903 | Piao et al. | Oct 2018 | B1 |
10109167 | Olekas et al. | Oct 2018 | B1 |
10109168 | Devison et al. | Oct 2018 | B1 |
10111228 | Griesdorf et al. | Oct 2018 | B2 |
10129853 | Manku et al. | Nov 2018 | B2 |
10228439 | Olekas et al. | Mar 2019 | B1 |
10264405 | Manku et al. | Apr 2019 | B1 |
10318890 | Kravets et al. | Jun 2019 | B1 |
10349216 | Tran et al. | Jul 2019 | B1 |
10380856 | Devison et al. | Aug 2019 | B2 |
10393866 | Kravets et al. | Aug 2019 | B1 |
10404387 | Devison et al. | Sep 2019 | B1 |
10438468 | Olekas et al. | Oct 2019 | B2 |
10459074 | Omer et al. | Oct 2019 | B1 |
10459076 | Kravets et al. | Oct 2019 | B2 |
10460581 | Devison et al. | Oct 2019 | B1 |
10498467 | Ravkine | Dec 2019 | B1 |
10499364 | Ravkine | Dec 2019 | B1 |
10506384 | Omer et al. | Dec 2019 | B1 |
10565860 | Omer et al. | Feb 2020 | B1 |
10567914 | Omer et al. | Feb 2020 | B1 |
10600314 | Manku et al. | Mar 2020 | B1 |
10605907 | Kravets et al. | Mar 2020 | B2 |
10605908 | Kravets et al. | Mar 2020 | B2 |
10743143 | Devison et al. | Aug 2020 | B1 |
10805022 | Shan | Oct 2020 | B2 |
10849006 | Beg et al. | Nov 2020 | B1 |
11012122 | Beg | May 2021 | B1 |
11018734 | Beg | May 2021 | B1 |
11184063 | Beg | Nov 2021 | B2 |
11304254 | Omer | Apr 2022 | B2 |
11570712 | Beg | Jan 2023 | B2 |
20020080014 | McCarthy et al. | Jun 2002 | A1 |
20030108119 | Mohebbi et al. | Jun 2003 | A1 |
20050055568 | Agrawala et al. | Mar 2005 | A1 |
20050128067 | Zakrewski | Jun 2005 | A1 |
20060152404 | Fullerton et al. | Jul 2006 | A1 |
20060217132 | Drummond-Murray et al. | Sep 2006 | A1 |
20060284757 | Zemany | Dec 2006 | A1 |
20070036353 | Reznik et al. | Feb 2007 | A1 |
20070296571 | Kolen | Dec 2007 | A1 |
20080007404 | Albert et al. | Jan 2008 | A1 |
20080057978 | Karaoguz et al. | Mar 2008 | A1 |
20080119130 | Sinha | May 2008 | A1 |
20080240008 | Backes et al. | Oct 2008 | A1 |
20080258907 | Kalpaxis | Oct 2008 | A1 |
20080300055 | Lutnick et al. | Dec 2008 | A1 |
20080303655 | Johnson | Dec 2008 | A1 |
20090062696 | Nathan et al. | Mar 2009 | A1 |
20090122882 | Mujtaba | May 2009 | A1 |
20090180444 | McManus et al. | Jul 2009 | A1 |
20100073686 | Medeiros et al. | Mar 2010 | A1 |
20100127853 | Hanson et al. | May 2010 | A1 |
20100130229 | Sridhara et al. | May 2010 | A1 |
20100207804 | Hayward et al. | Aug 2010 | A1 |
20100234045 | Karr et al. | Sep 2010 | A1 |
20100306320 | Leppanen et al. | Dec 2010 | A1 |
20100315284 | Trizna et al. | Dec 2010 | A1 |
20110019587 | Wang | Jan 2011 | A1 |
20110035491 | Gelvin et al. | Feb 2011 | A1 |
20110090081 | Khorashadi et al. | Apr 2011 | A1 |
20110148689 | Filippi | Jun 2011 | A1 |
20110260856 | Rossmann et al. | Oct 2011 | A1 |
20110260871 | Karkowski | Oct 2011 | A1 |
20110263946 | El Kaliouby et al. | Oct 2011 | A1 |
20120115512 | Grainger et al. | May 2012 | A1 |
20120146788 | Wilson et al. | Jun 2012 | A1 |
20120182429 | Forutanpour et al. | Jul 2012 | A1 |
20120184296 | Milosiu | Jul 2012 | A1 |
20120283896 | Persaud | Nov 2012 | A1 |
20130017836 | Chang et al. | Jan 2013 | A1 |
20130045759 | Smith | Feb 2013 | A1 |
20130090151 | Ngai et al. | Apr 2013 | A1 |
20130094538 | Wang | Apr 2013 | A1 |
20130113647 | Sentelle et al. | May 2013 | A1 |
20130162459 | Aharony et al. | Jun 2013 | A1 |
20130178231 | Morgan | Jul 2013 | A1 |
20130208715 | Roh | Aug 2013 | A1 |
20130283256 | Proud | Oct 2013 | A1 |
20130301451 | Siomina | Nov 2013 | A1 |
20140015706 | Ishihara et al. | Jan 2014 | A1 |
20140073346 | Yang et al. | Mar 2014 | A1 |
20140126323 | Li et al. | May 2014 | A1 |
20140135042 | Buchheim et al. | May 2014 | A1 |
20140148195 | Bassan-Eskenazi et al. | May 2014 | A1 |
20140213284 | Yang et al. | Jul 2014 | A1 |
20140247179 | Furuskog | Sep 2014 | A1 |
20140266669 | Fadell et al. | Sep 2014 | A1 |
20140274218 | Kadiwala et al. | Sep 2014 | A1 |
20140286380 | Prager et al. | Sep 2014 | A1 |
20140288876 | Donaldson | Sep 2014 | A1 |
20140329540 | Duggan et al. | Nov 2014 | A1 |
20140355713 | Bao et al. | Dec 2014 | A1 |
20140358473 | Goel et al. | Dec 2014 | A1 |
20140361920 | Katuri et al. | Dec 2014 | A1 |
20150043377 | Cholas et al. | Feb 2015 | A1 |
20150049701 | Tian et al. | Feb 2015 | A1 |
20150063323 | Sadek et al. | Mar 2015 | A1 |
20150078295 | Mandyam et al. | Mar 2015 | A1 |
20150098377 | Amini et al. | Apr 2015 | A1 |
20150159100 | Shi et al. | Jun 2015 | A1 |
20150181388 | Smith | Jun 2015 | A1 |
20150195100 | Imes et al. | Jul 2015 | A1 |
20150212205 | Shpater | Jul 2015 | A1 |
20150245164 | Merrill | Aug 2015 | A1 |
20150269825 | Tran | Sep 2015 | A1 |
20150288745 | Moghaddam et al. | Oct 2015 | A1 |
20150304886 | Liu et al. | Oct 2015 | A1 |
20150309166 | Sentelle et al. | Oct 2015 | A1 |
20150312877 | Bhanage | Oct 2015 | A1 |
20150338507 | Oh et al. | Nov 2015 | A1 |
20150350849 | Huang et al. | Dec 2015 | A1 |
20150350976 | Kodali et al. | Dec 2015 | A1 |
20150366542 | Brown et al. | Dec 2015 | A1 |
20160014554 | Sen et al. | Jan 2016 | A1 |
20160018508 | Chen et al. | Jan 2016 | A1 |
20160073060 | Renkis | Mar 2016 | A1 |
20160080908 | Julian et al. | Mar 2016 | A1 |
20160088438 | O'Keeffe | Mar 2016 | A1 |
20160088631 | Hedayat et al. | Mar 2016 | A1 |
20160135205 | Barbu et al. | May 2016 | A1 |
20160150418 | Kang et al. | May 2016 | A1 |
20160178741 | Ludlow et al. | Jun 2016 | A1 |
20160183059 | Nagy et al. | Jun 2016 | A1 |
20160187475 | Horng et al. | Jun 2016 | A1 |
20160203689 | Hintz et al. | Jul 2016 | A1 |
20160210838 | Yan et al. | Jul 2016 | A1 |
20160217683 | Li | Jul 2016 | A1 |
20160261452 | Porat et al. | Sep 2016 | A1 |
20160262355 | Swan | Sep 2016 | A1 |
20160363663 | Mindell et al. | Dec 2016 | A1 |
20170042488 | Muhsin | Feb 2017 | A1 |
20170052247 | Kong et al. | Feb 2017 | A1 |
20170055126 | O'Keeffe | Feb 2017 | A1 |
20170055131 | Kong et al. | Feb 2017 | A1 |
20170059190 | Stefanski et al. | Mar 2017 | A1 |
20170086202 | Chen | Mar 2017 | A1 |
20170086281 | Avrahamy | Mar 2017 | A1 |
20170090026 | Joshi et al. | Mar 2017 | A1 |
20170111852 | Selen et al. | Apr 2017 | A1 |
20170123528 | Hu et al. | May 2017 | A1 |
20170126488 | Cordeiro et al. | May 2017 | A1 |
20170146656 | Belsley et al. | May 2017 | A1 |
20170150255 | Wang et al. | May 2017 | A1 |
20170155439 | Chang et al. | Jun 2017 | A1 |
20170177618 | Hu et al. | Jun 2017 | A1 |
20170180882 | Lunner et al. | Jun 2017 | A1 |
20170195016 | Alexander et al. | Jul 2017 | A1 |
20170195893 | Lee et al. | Jul 2017 | A1 |
20170223628 | Snyder et al. | Aug 2017 | A1 |
20170251392 | Nabetani | Aug 2017 | A1 |
20170257862 | Xue | Sep 2017 | A1 |
20170278374 | Skaaksrud | Sep 2017 | A1 |
20170280351 | Skaaksrud | Sep 2017 | A1 |
20170311279 | Allegue et al. | Oct 2017 | A1 |
20170311574 | Swan | Nov 2017 | A1 |
20170325117 | Li et al. | Nov 2017 | A1 |
20170343658 | Ramirez et al. | Nov 2017 | A1 |
20170359804 | Manku et al. | Dec 2017 | A1 |
20180027389 | Shirakata et al. | Jan 2018 | A1 |
20180086264 | Pedersen | Mar 2018 | A1 |
20180106885 | Blayvas | Apr 2018 | A1 |
20180120420 | McMahon et al. | May 2018 | A1 |
20180168552 | Shi et al. | Jun 2018 | A1 |
20180180706 | Li et al. | Jun 2018 | A1 |
20180184907 | Tran | Jul 2018 | A1 |
20180270821 | Griesdorf et al. | Sep 2018 | A1 |
20180288587 | Allegue Martinez et al. | Oct 2018 | A1 |
20180323835 | Wang et al. | Nov 2018 | A1 |
20180324815 | Nammi et al. | Nov 2018 | A1 |
20180330293 | Kulkarni et al. | Nov 2018 | A1 |
20190033446 | Bultan et al. | Jan 2019 | A1 |
20190074876 | Kakishima et al. | Mar 2019 | A1 |
20190122514 | Olekas et al. | Apr 2019 | A1 |
20190146075 | Kravets et al. | May 2019 | A1 |
20190146076 | Kravets et al. | May 2019 | A1 |
20190146077 | Kravets et al. | May 2019 | A1 |
20190147713 | Devison et al. | May 2019 | A1 |
20190156943 | Kocherscheidt et al. | May 2019 | A1 |
20190170869 | Kravets et al. | Jun 2019 | A1 |
20190222330 | Shan | Jul 2019 | A1 |
20190271775 | Zhang et al. | Sep 2019 | A1 |
20190272718 | Hurtig et al. | Sep 2019 | A1 |
20190294833 | Lu et al. | Sep 2019 | A1 |
20190327124 | Lai et al. | Oct 2019 | A1 |
20200112939 | Scharf | Apr 2020 | A1 |
20200150263 | Eitan | May 2020 | A1 |
20200175405 | Omer et al. | Jun 2020 | A1 |
20200178033 | Omer et al. | Jun 2020 | A1 |
20200204222 | Lou et al. | Jun 2020 | A1 |
20200209378 | Yokev et al. | Jul 2020 | A1 |
20200264292 | Kravets et al. | Aug 2020 | A1 |
20200271747 | Wu | Aug 2020 | A1 |
20200305231 | Sadeghi et al. | Sep 2020 | A1 |
20200351576 | Beg et al. | Nov 2020 | A1 |
20200351692 | Beg et al. | Nov 2020 | A1 |
20200367021 | Devison et al. | Nov 2020 | A1 |
20200371222 | Kravets et al. | Nov 2020 | A1 |
20210099835 | Omer | Apr 2021 | A1 |
20210099836 | Omer | Apr 2021 | A1 |
20210099970 | Omer | Apr 2021 | A1 |
20210103045 | Kravets et al. | Apr 2021 | A1 |
20210135711 | Beg et al. | May 2021 | A1 |
20210135718 | Beg | May 2021 | A1 |
20210329554 | Beg | Oct 2021 | A1 |
20210409132 | Bouttier | Dec 2021 | A1 |
20220070954 | Omer | Mar 2022 | A1 |
20220417706 | Gummadi | Dec 2022 | A1 |
Number | Date | Country |
---|---|---|
2834522 | May 2014 | CA |
2945702 | Aug 2015 | CA |
102037667 | Jan 2015 | CN |
111373281 | Jul 2020 | CN |
112955780 | Jun 2021 | CN |
116058071 | May 2023 | CN |
2733684 | May 2014 | EP |
2850751 | Apr 2017 | EP |
3739356 | Nov 2020 | EP |
1997-507298 | Jul 1997 | JP |
2004286567 | Oct 2004 | JP |
2013072865 | Apr 2013 | JP |
6324935 | May 2018 | JP |
6776374 | Oct 2020 | JP |
2619074 | May 2017 | RU |
WO-2013171138 | Nov 2013 | WO |
2014021574 | Feb 2014 | WO |
2014201574 | Dec 2014 | WO |
2015168700 | Nov 2015 | WO |
2016005977 | Jan 2016 | WO |
2016066822 | May 2016 | WO |
2016110844 | Jul 2016 | WO |
2017106976 | Jun 2017 | WO |
2017132765 | Aug 2017 | WO |
WO-2017155587 | Sep 2017 | WO |
2017177303 | Oct 2017 | WO |
2017210770 | Dec 2017 | WO |
2018071456 | Apr 2018 | WO |
2018094502 | May 2018 | WO |
2019041019 | Mar 2019 | WO |
2019075551 | Apr 2019 | WO |
WO-2020096960 | May 2020 | WO |
2020150806 | Jul 2020 | WO |
2020150807 | Jul 2020 | WO |
2020227804 | Nov 2020 | WO |
2020227805 | Nov 2020 | WO |
2020227806 | Nov 2020 | WO |
2021081635 | May 2021 | WO |
2021081637 | May 2021 | WO |
WO-2022040817 | Mar 2022 | WO |
Entry |
---|
USPTO, Notice of Allowance mailed Jul. 28, 2021, in U.S. Appl. No. 17/318,211, 41 pgs. |
USPTO, Notice of Allowance mailed Jan. 22, 2021 in U.S. Appl. No. 17/082,456, 35 pgs. |
USPTO, Notice of Allowance mailed Jan. 21, 2021, in U.S. Appl. No. 17/082,411, 39 pgs. |
WIPO, International Search Report and Written Opinion mailed Dec. 30, 2020, in PCT/CA2020/051442, 8 pgs. |
WIPO, International Search Report and Written Opinion mailed Jan. 11, 2021, in PCT/CA2020/051440, 8 pgs. |
Networking Layers—Apple Developer; https://developer.apple.com/library/archive/documentation/NetworkingInternet/Conceptual/NetworkingConcepts/NetworkingLayers/NetworkingLayers.html, Jul. 19, 2012, 4 pgs. |
Banerjee, et al., “Through Wall People Localization Exploiting Radio Windows”, arXiv:1307.7233v1, Jul. 27, 2013, 14 pgs. |
Bejarano, et al., “IEEE 802.11ac: From Channelization to Multi-User MIMO”, IEEE Communications Magazine, Oct. 1, 2013, 7 pgs. |
Brauers, et al., “Augmenting OFDM Wireless Local Networks with Motion detection Capability”, 2016 IEEE 6th International Conference on Consumer Electronics—Berlin (ICCE-Berlin); pp. 253-257, 2016, 5 pgs. |
Cai, et al., “Human Movement Detection in Wi-Fi Mesh Networks”, Technical Disclosure Commons, Dec. 17, 2017, 8 pgs. |
Dekker, et al., “Gesture Recognition with a Low Power FMCW Radar and a Deep Convolutional Neural Network”, Proceedings of the 14th European Radar Conference, Nuremberg, Germany, Oct. 11-13, 2017, 4 pgs. |
Domenico, et al., “Exploring Training Options for RF Sensing Using CSI”, IEEE Communications Magazine, 2018, vol. 56, Issue 5, pp. 116-123, 8 pgs. |
Ganesan, et al., “Robust Channel Estimation for 802.11n (MIMO-OFDM) Systems”, 2014 International Conference on Signal Processing and Communications (SPCOM), Bangalore; Jul. 22-25, 2014, 5 pgs. |
Iqbal, et al., “Indoor Motion Classification Using Passive RF Sensing Incorporating Deep Learning”, ISSN: 2577-2465, Electronic IEEE, Jun. 3, 2018, 5 pgs. |
Keerativoranan, et al., “Mitigation of CSI Temporal Phase Rotation with B2B Calibration Method for Fine-Grained Motion Detection Analysis on Commodity Wi-Fi Devices”, Sensors 2018, Nov. 6, 2018, 18 pgs. |
Kosba, et al., “Robust WLAN Device-free Passive Motion Detection”, IEEE Wireless Communications and Networking Conference, Apr. 2012, 6 pgs. |
Krumm, et al., “Locadio: Inferring Motion and Location from Wi-Fi Signal Strengths”, First Annual Int'l Conference on Mobile and Ubiquitous Systems: Networking and Services, Boston, MA, Aug. 22, 2004, 10 pgs. |
Lai, “This mesh WiFi router can track motion to protect your family”, https://www.engadget.com/2018-06-06-origin-wireless-wifi-mesh-motion-fall-sleep-detection.htm., Jun. 16, 2018, 6 pgs. |
Youssef, Moustafa, et al., “Challenges: Device-free Passive Localization for Wireless Environments”, Mobicom 07 Proceedings of the 13th Annual ACM International Conference on Mobile Computing and Networking, Sep. 2007, 11 pgs. |
Youssef, et al., “Challenges: Device-free Passive Localization for Wireless Environments”, Proceedings of the 13th Annual ACM Int'l Conference on Mobile Computing and Networking, Montreal, Canada, Sep. 9, 2007, 8 pgs. |
WIPO, International Search Report and Written Opinion mailed Aug. 15, 2022, in PCT/CA2022/050749, 10 pgs. |
USPTO, Notice of Allowance mailed Aug. 25, 2022, in U.S. Appl. No. 17/328,479, 47 pgs. |
EPO, Extended European Search Report mailed Nov. 11, 2022, in EP 20881889.8, 8 pgs. |
EPO, Extended European Search Report mailed Nov. 4, 2022, in EP 20881664.5, 8 pgs. |
CIPO, Office Action issued in Application No. 3,152,900 on Jan. 18, 2024, 3 pages. |
CIPO, Office Action issued in Application No. 3,152,905 on Jan. 16, 2024, 4 pages. |
CNIPA, Office Action issued in Application No. 202080074360.X on Mar. 8, 2024, 29 pages. |
EPO, Communication pursuant to Article 94(3) issued in Application No. 20881889.8 on May 22, 2024, 6 pages. |
Number | Date | Country | |
---|---|---|---|
20210273685 A1 | Sep 2021 | US |
Number | Date | Country | |
---|---|---|---|
62928684 | Oct 2019 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17082456 | Oct 2020 | US |
Child | 17321839 | US |