This disclosure relates in general to a system and method for gesture detection, and, in particular embodiments, to a system and method of gesture detection for a remote device.
The future Internet of Things (IoT) will feature the internetworking of physical devices, vehicles, buildings and other “things” embedded with electronics, software, sensors, actuators, and/or network connectivity that enable these objects to collect and exchange data. The IoT will allow objects to be sensed and/or controlled remotely across existing network infrastructure, creating opportunities for more direct integration of the physical world into computer-based systems, and resulting in improved efficiency, accuracy and economic benefit.
Each connected thing in the future IoT may be uniquely identifiable through its embedded computing system but still able to interoperate within the existing Internet infrastructure. Experts estimate that the IoT will consist of almost 50 billion objects by 2020.
This burgeoning IoT will feature an ever-widening focus on machine-to-machine (M2M) communication. In such a world, where physical objects are more networked than ever and will hold their own conversations around us, questions remain about what the future holds for human-to-machine (H2M) communication. Human participants may feel increasingly disembodied as they stare at diminutive displays, manipulate their fingers across glass surfaces with unnatural swiping, spreading, and pinching motions, and read automated social media messages created by software applications.
In accordance with a first example embodiment of the present invention, a method for operating a mobile device is provided. The method includes detecting a gesture by the mobile device. Detecting the gesture includes receiving a reflected millimeter wave signal by the mobile device, generating a first message in accordance with the detected gesture, and transmitting the first message from the mobile device to an external remote device. The detected gesture is associated with an operation of the remote device.
In accordance with a second example embodiment of the present invention, a method for operating a first device is provided. The method includes receiving, by the first device from an external mobile device, a first message generated using millimeter wave radar signaling in a field of view of the mobile device. The method also includes processing, by the first device, the first message to detect a gesture associated with an operation of the first device, and performing, by the first device, an operation in accordance with the detected gesture.
In accordance with a third example embodiment of the present invention, a control circuit for a first device is provided. The circuit includes a receiver configured to receive, from an external mobile device, a first message including radar data generated using millimeter wave radar signaling in a field of view of the mobile device. The circuit also includes a processor and a non-transitory computer readable medium storing programming for execution by the processor. The programming includes instructions to process the first message to detect a gesture associated with an operation of the first device, and to perform the operation of the first device in accordance with the detected gesture.
For a more complete understanding of the disclosure, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
The making and using of various embodiments are discussed in detail below. It should be appreciated, however, that the disclosure provides many applicable concepts that can be embodied in a wide variety of specific contexts. The specific embodiments discussed are merely illustrative of specific ways to make and use embodiments, and do not limit the scope of the invention.
In various embodiments, a radar-based gesture detection system is used to control a remote device such as, e.g., a vehicle, a building, a home appliance, etc. For example, when the remote device is a car, an embodiment gesture detection system allows a human participant to control various operations of the car from outside the car while approaching it.
In various embodiments, the gesture detection system includes a gesture sensor built into a mobile device such as, for example, a car key, a wristwatch, a smart phone, etc. In such embodiments, the mobile device may relay either raw radar data or processed control messages to the remote device so that the remote device will perform a desired operation. For example, a thumbs-up hand gesture may be detected by a smart key and relayed to a car, causing the trunk to open. Such a smart key gesture sensor could have a much smaller radar detection range than a hypothetical gesture sensor located in the car, and this smaller detection range would help prevent interfering radar reflections, including interfering gestures, from affecting the proper behavior of the system. As an example, the radar detection range may be reduced to 30 cm or less so that the mobile device may interact only with the intended operator, who could be, e.g., a person wearing the mobile device, holding the mobile device, carrying it in a pants pocket, etc.
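For illustration only (not part of the claimed subject matter), the reduced detection range described above can be sketched as range gating on an FMCW radar's beat frequencies. The bandwidth and chirp-duration values below are assumptions chosen for the example, not parameters from the disclosure:

```python
# Hypothetical sketch of limiting a gesture radar's detection range.
C = 3.0e8            # speed of light, m/s
BANDWIDTH = 4.0e9    # chirp bandwidth, Hz (assumed value)
CHIRP_TIME = 1.0e-3  # chirp duration, s (assumed value)
MAX_RANGE_M = 0.30   # 30 cm detection limit, per the text above

def beat_to_range(f_beat_hz: float) -> float:
    """Convert an FMCW beat frequency to target range in meters."""
    return C * f_beat_hz * CHIRP_TIME / (2.0 * BANDWIDTH)

def gate_targets(beat_freqs_hz):
    """Keep only reflections whose computed range is within the limit,
    discarding interfering reflections from farther away."""
    return [f for f in beat_freqs_hz if beat_to_range(f) <= MAX_RANGE_M]
```

Under these assumed chirp parameters, a beat frequency of 8 kHz corresponds to a 30 cm target, and any higher beat frequency is gated out before gesture classification.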
In various embodiments, the mobile device could be paired with the remote device via a shared security key, so that data is encrypted for transmission from the mobile device and decrypted at the remote device, and vice versa. The data transfer between the mobile device and remote device could include any of a wide variety of communications technologies, including, e.g., Bluetooth, V2X, etc.
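As a minimal sketch of how a shared security key could protect messages between the paired devices, the following uses an HMAC tag so the remote device can verify that a message came from the paired mobile device. This illustrates authentication only; a deployed pairing scheme would also encrypt the payload. The key and message values are assumptions:

```python
import hashlib
import hmac

SHARED_KEY = b"example-paired-device-key"  # assumed pairing key

def protect(message: bytes) -> bytes:
    """Append an HMAC-SHA256 tag so the receiver can verify the sender."""
    tag = hmac.new(SHARED_KEY, message, hashlib.sha256).digest()
    return message + tag

def verify(packet: bytes) -> bytes:
    """Check the 32-byte tag; return the message or raise on tampering."""
    message, tag = packet[:-32], packet[-32:]
    expected = hmac.new(SHARED_KEY, message, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")
    return message
```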
In various embodiments, the gesture detection system may be programmed via a user interface at the remote device, so that an end user is able to assign a remote device operation to each gesture. For example, an alphabet of 6 gestures may be loaded by default in the system, and the end user may be able to select which function to associate to each gesture. Emergency contact information could also be entered and an emergency call function associated with a gesture.
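The user-programmable assignment described above can be sketched as a mapping from a default gesture alphabet to operation names. The gesture and operation names here are illustrative assumptions, not identifiers from the disclosure:

```python
# Sketch of end-user gesture-to-operation assignment.
DEFAULT_GESTURES = ["thumbs_up", "swipe_left", "swipe_right",
                    "circle", "fist", "open_palm"]  # alphabet of 6

assignments = {}

def assign(gesture: str, operation: str) -> None:
    """Associate a remote-device operation with a default gesture."""
    if gesture not in DEFAULT_GESTURES:
        raise ValueError(f"unknown gesture: {gesture}")
    assignments[gesture] = operation

# Example end-user configuration at the remote device's interface:
assign("thumbs_up", "open_trunk")
assign("circle", "emergency_call")
```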
In various embodiments, the gesture recognition may cause sounds to be emitted by the mobile device based on feedback from the remote device. The feedback may indicate, for example, that a hand-shake with the remote device was successful or that a gesture-indicated operation was successfully performed by the remote device. For example, the volume of a sound generated by a smart key may increase as the temperature setting of a car's air conditioning system increases in response to user gestures detected by the smart key.
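For illustration, the volume-versus-temperature feedback described above can be sketched as a simple monotone mapping. The temperature and volume ranges are assumptions chosen for the example:

```python
# Sketch: map a temperature setting to a feedback-tone volume.
TEMP_MIN_C, TEMP_MAX_C = 16.0, 30.0   # assumed settable range
VOL_MIN, VOL_MAX = 0.1, 1.0           # assumed volume scale

def feedback_volume(temp_c: float) -> float:
    """Volume rises linearly as the temperature setting rises."""
    t = (temp_c - TEMP_MIN_C) / (TEMP_MAX_C - TEMP_MIN_C)
    t = min(max(t, 0.0), 1.0)  # clamp to the settable range
    return VOL_MIN + t * (VOL_MAX - VOL_MIN)
```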
In an embodiment, the mobile device's radar data may include radar timing and/or frequency information. In an embodiment, the mobile device may process its radar data to detect a gesture. In another embodiment, however, the mobile device may transfer raw radar data over, for example, a large bandwidth connection (e.g., WiFi, WiGig, etc.) so that a processor embedded in the remote device may process the radar data and classify it as a particular gesture.
In various embodiments where the remote device is an automobile, the mobile device could be a smart key such as, e.g., an ignition or vehicle access key that includes a radar-based gesture sensor. The main drawback of such an implementation is that the smart key's battery may have a short life. To mitigate this, in some embodiments a wireless charging system or other charging system may be built into the cockpit of the vehicle remote device.
Referring again to
Referring again to
Referring again to
In an embodiment, the input to second transmitter front end 460 is selectable between an output of radar circuitry 456 and an output of communication circuitry 458 via a circuit represented by switch 459. When second transmitter front end 460 receives input from radar circuitry 456, both first transmitter front end 454 and second transmitter front end 460 can be used to build a holographic radar. On the other hand, when second transmitter front end 460 receives its input from communication circuitry 458, first transmitter front end 454 provides a radar signal to transmit antenna 470a and second transmitter front end 460 provides a communications signal to transmit antenna 470b. This communications signal may be a carrier modulated signal. In one example, the second transmitter front end 460 may transmit to satellite radar device 480 a bipolar phase-shift keyed (BPSK) modulated signal that contains data. In some embodiments, a data link between radar transceiver device 452 and satellite radar device 480 may be used to coordinate RF transmission and reception between radar transceiver device 452 and satellite radar device 480 to implement phased-array beam steering. In some embodiments, satellite radar device 480 may also be capable of data transmission and radar transceiver device 452 may be configured to receive data from satellite radar device 480 via antennas 472a-d.
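As a rough sketch of the BPSK modulation mentioned above (one carrier cycle per bit, with sample count and carrier mapping chosen as assumptions for the example, not taken from the disclosure):

```python
import math

SAMPLES_PER_BIT = 8  # assumed oversampling, one carrier cycle per bit

def bpsk_modulate(bits):
    """Map each bit to a carrier segment with a 0 or 180 degree phase."""
    samples = []
    for i, bit in enumerate(bits):
        phase = 0.0 if bit else math.pi  # bit 1 -> 0 deg, bit 0 -> 180 deg
        for n in range(SAMPLES_PER_BIT):
            t = i * SAMPLES_PER_BIT + n
            samples.append(math.cos(2 * math.pi * t / SAMPLES_PER_BIT + phase))
    return samples
```

A demodulator at satellite radar device 480 would recover each bit from the sign of the carrier phase; the phase flip between successive differing bits is what carries the data.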
In an embodiment, radar transceiver device 452, or portions of radar transceiver device 452, may be implemented in a package that contains first transmitter front end 454, second transmitter front end 460, receiver front end 462, as well as transmit antennas 470a and 470b and receive antennas 472a-d. In an embodiment, a ball grid array (BGA) package that contains patch antennas may be used to implement antennas 470a, 470b and 472a-d. In alternative embodiments, other antenna elements may be used besides patch antennas; for example, a Yagi-Uda antenna may be used to provide sensing from the side of the packaged chip and antenna module.
In an embodiment, the frequency of operation of radar system 450, as well as other embodiments disclosed in this disclosure, is between about 57 GHz and about 66 GHz. Alternatively, embodiment systems may also operate at frequencies outside of this range. For example, in an embodiment the frequency of operation of radar system 450, as well as other embodiments disclosed in this disclosure, is between about 57 GHz and about 71 GHz.
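A quick check of why these bands are called millimeter wave: the free-space wavelength is the speed of light divided by the carrier frequency, which stays in the low single-digit millimeters across the stated 57–71 GHz range.

```python
C = 299_792_458.0  # speed of light, m/s

def wavelength_mm(freq_hz: float) -> float:
    """Free-space wavelength in millimeters for a given carrier frequency."""
    return C / freq_hz * 1000.0

# 57 GHz -> roughly 5.3 mm; 71 GHz -> roughly 4.2 mm
```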
A millimeter-wave radar-based gesture sensor has been described in U.S. application Ser. No. 14/954,198, filed on Nov. 30, 2015, which application is incorporated herein by reference in its entirety.
Flow continues at step 605, where the mobile device detects which gesture, movement, hand sign, etc. (referred to in this disclosure as a “gesture”) has created the radar signal by analyzing the manner in which the frequency content of the radar signal changes over time. In some embodiments, the radar signal is converted into the frequency domain using transform methods known in the art. These transform methods include, but are not limited to, a discrete Fourier transform (DFT), a fast Fourier transform (FFT), a short-time Fourier transform (STFT), and spectrogram analysis. During step 605, multiple transforms may be calculated over a sliding time window, and peak frequencies of these multiple transforms may be tracked to analyze how the range, Doppler, and velocity of the target change over time. For example, one or more peak frequency vs. time signals may be generated to track how the frequency content of the radar signal changes over time. In various embodiments, each gesture has a specific, pre-determined signature with respect to how the frequency content of the radar signal changes over time. Each gesture has previously been categorized by its associated radar signature, and these pre-determined signatures have been stored in a look-up table (LUT) in the mobile device. During operation, the mobile device checks whether the signal detected by the sensor corresponds to one of the signatures stored in the LUT. By comparing the tracked frequency content of the radar signal with the stored signatures, a received gesture may be determined. Micro-Doppler analysis methods may also be used to analyze the radar signal and classify the gesture in some embodiments.
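The sliding-window peak tracking and LUT matching of step 605 can be sketched as follows. This is a toy illustration, assuming one dominant frequency per window and a nearest-signature classifier; a practical implementation would use an FFT and richer signatures:

```python
import cmath
import math

def dft_peak_bin(frame):
    """Return the frequency bin with the largest DFT magnitude."""
    n = len(frame)
    best_bin, best_mag = 0, -1.0
    for k in range(n // 2):  # real signal: positive-frequency bins only
        acc = sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n)
                  for t in range(n))
        if abs(acc) > best_mag:
            best_bin, best_mag = k, abs(acc)
    return best_bin

def peak_track(signal, win=16, hop=8):
    """Short-time analysis: track the peak frequency bin over time."""
    return [dft_peak_bin(signal[i:i + win])
            for i in range(0, len(signal) - win + 1, hop)]

def classify(track, signature_lut):
    """Match a peak-frequency track to the closest stored signature."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(signature_lut, key=lambda g: dist(track, signature_lut[g]))
```

For example, a signal whose dominant frequency steps from bin 2 to bin 5 produces the track `[2, 5]`, which a LUT of stored signatures can then map to a gesture label.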
At step 607, the mobile device generates a first message indicating which gesture has been detected. At step 609, the mobile device transmits this first message to a remote device, such as, e.g., remote device 104 of
Referring again to
At step 809, the remote device generates feedback on whether the remote device successfully performed the operation associated with the detected gesture, and then transmits this feedback to the mobile device. The associated operation may include, as a first example, adjusting a setting within a range of values. For example, the operation may be adjusting a temperature setting, and a corresponding feedback may include a variable feedback signal that varies based on the adjustment of the temperature setting. As a second example, the associated operation may be to initiate an emergency call based on emergency contact information received from the user at step 801.
Referring again to
In an embodiment, the memory 2006 includes a non-transitory computer readable medium storing programming for execution by the processor 204. In an embodiment, this programming includes instructions to process a message received from an external mobile device. These instructions may allow the processor 204 to detect a gesture associated with an operation of the host device and then cause the host device to perform the operation based on this detected gesture. In an embodiment, this received message includes radar data, and the instructions to process the message include analyzing the frequency content of the radar data. In an embodiment, the memory 2006 includes an LUT 2008 storing pre-determined time domain radar signature(s) uniquely identifying one or more gesture(s) and associating each gesture with a respective operation of the remote device 106. In an embodiment, an operation of the host device that is associated with a gesture may include any of the following: turning on the host device, turning off the host device, setting a temperature setting of the host device, setting a fan setting of the host device, turning on a sound system of the host device, turning off the sound system, setting a volume setting of the sound system, selecting an input of the sound system, tuning a radio channel of the host device, opening a mechanical latch of the host device, turning on a light of the host device, turning off the light, setting a dimming setting of the light, changing a color of the light, turning on a user display of the host device, turning off the user display, changing visible content of the user display, setting a cruise control setting, auto-pilot program, or voice recognition program of the host device, mapping a route, looking up weather/road/traffic information, initiating a diagnostic test of the host device, enabling or disabling a security system of the host device, initiating a communications session, etc.
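The association between detected gestures and host-device operations described above can be sketched as a dispatch table. Gesture names and operation behaviors below are illustrative assumptions, not identifiers from the disclosure:

```python
# Sketch of a gesture-to-operation dispatch table on the host device.
def open_trunk():
    return "trunk opened"

def raise_temp():
    return "temperature raised"

def toggle_lights():
    return "lights toggled"

OPERATION_LUT = {
    "thumbs_up": open_trunk,
    "swipe_up":  raise_temp,
    "open_palm": toggle_lights,
}

def perform(gesture: str) -> str:
    """Look up the detected gesture and perform its associated operation."""
    op = OPERATION_LUT.get(gesture)
    return op() if op else "unknown gesture: no operation"
```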
In the embodiment of
Referring again to
In some embodiments, the processing system 2000 is included in a network device that is accessing, or otherwise part of, a telecommunications network. In one example, the processing system 2000 is in a network-side device in a wireless or wireline telecommunications network, such as a base station, a relay station, a scheduler, a controller, a gateway, a router, an applications server, or any other device in the telecommunications network. In other embodiments, the processing system 2000 is in a user-side device accessing a wireless or wireline telecommunications network, such as a mobile station, a user equipment (UE), a personal computer (PC), a tablet, a wearable communications device (e.g., a smartwatch, etc.), or any other device adapted to access a telecommunications network.
In some embodiments, one or more of the interfaces 2010, 2012, 2014 connects the processing system 2000 to a transceiver adapted to transmit and receive signaling over the telecommunications network.
The transceiver 2100 may transmit and receive signaling over any type of communications medium. In some embodiments, the transceiver 2100 transmits and receives signaling over a wireless medium. For example, the transceiver 2100 may be a wireless transceiver adapted to communicate in accordance with a wireless telecommunications protocol, such as a cellular protocol (e.g., long-term evolution (LTE), etc.), a wireless local area network (WLAN) protocol (e.g., Wi-Fi, etc.), or any other type of wireless protocol (e.g., Bluetooth, near field communication (NFC), etc.). In such embodiments, the network-side interface 2102 comprises one or more antenna/radiating elements. For example, the network-side interface 2102 may include a single antenna, multiple separate antennas, or a multi-antenna array configured for multi-layer communication, e.g., single input multiple output (SIMO), multiple input single output (MISO), multiple input multiple output (MIMO), etc. In other embodiments, the transceiver 2100 transmits and receives signaling over a wireline medium, e.g., twisted-pair cable, coaxial cable, optical fiber, etc. Specific processing systems and/or transceivers may utilize all of the components shown, or only a subset of the components, and levels of integration may vary from device to device.
Illustrative embodiments may use a remote device controlled by a mobile device that includes a radar-based gesture sensor having a small field of view relative to that of a gesture sensor located in the remote device. The merits of such embodiments may include preventing stray gestures and other interfering radar reflections from disrupting the correct detection of a target gesture.
The following additional example embodiments of the present invention are also provided. In accordance with a first example embodiment of the present invention, a method for operating a mobile device is provided. The method includes detecting a gesture by the mobile device. Detecting the gesture includes receiving a reflected millimeter wave signal by the mobile device, generating a first message in accordance with the detected gesture, and transmitting the first message from the mobile device to an external remote device. The detected gesture is associated with an operation of the remote device.
Also, the foregoing first example embodiment may be implemented to include one or more of the following additional features. The method may also be implemented such that detecting the gesture further includes transmitting, by the mobile device, a first millimeter wave signal to a location of the gesture. The reflected millimeter wave signal includes a reflection of the first millimeter wave signal reflected from the location of the gesture.
The method may also be implemented such that the gesture is located a distance of not greater than 30 centimeters from the mobile device. The method may also be implemented such that the mobile device includes at least one of a mobile phone, a watch, or a key for the remote device. The method may also be implemented further including wirelessly charging a battery of the mobile device, by the mobile device, from a power source of the remote device. The method may also be implemented such that the remote device includes a vehicle, and the mobile device includes at least one of a vehicle access key or a vehicle ignition key. The method may also be implemented such that the operation includes at least one of setting an air conditioning setting of the vehicle, setting a seat heating setting of the vehicle, opening a trunk of the vehicle, setting a lighting setting of the vehicle, or initiating an emergency call. The method may also be implemented such that the gesture includes a hand gesture. The method may also be implemented such that detecting the gesture further includes determining frequency content of the reflected millimeter wave signal, tracking the frequency content of the reflected millimeter wave signal over time, and comparing the tracked frequency content to a pre-determined gesture signature stored in a look-up table. The method may also be implemented such that the mobile device further includes a first antenna and a second antenna. Receiving the reflected millimeter wave signal includes receiving using the first antenna and transmitting the first message includes transmitting using the second antenna. The method may also be implemented such that the first antenna includes a receive antenna array and a transmit antenna array. The method may also be implemented such that transmitting the first message includes encrypting, by the mobile device, the first message using a shared security key of the mobile device and the remote device.
The method may also be implemented further including receiving feedback, by the mobile device from the remote device, such that the feedback includes a status of whether the remote device successfully performed the operation associated with the gesture. The method may also be implemented further including generating, by the mobile device, a sound in accordance with the received feedback. The method may also be implemented such that the operation includes adjusting a temperature setting, the received feedback includes a variable feedback signal that varies in accordance with the adjusted temperature setting, and generating the sound includes adjusting the sound in accordance with the variable feedback signal. The method may also be implemented such that the first message includes at least one of radar timing information or radar frequency information.
In accordance with a second example embodiment of the present invention, a method for operating a first device is provided. The method includes receiving, by the first device from an external mobile device, a first message generated using millimeter wave radar signaling in a field of view of the mobile device. The method also includes processing, by the first device, the first message to detect a gesture associated with an operation of the first device, and performing, by the first device, an operation in accordance with the detected gesture.
Also, the foregoing second example embodiment may be implemented to include one or more of the following additional features. The method may also be implemented further including decrypting the first message, by the first device, in accordance with a shared security key of the mobile device and the first device. The method may also be implemented further including wirelessly charging the mobile device by a power source, such that the first device includes the power source. The method may also be implemented such that the first device includes a vehicle and the mobile device includes at least one of a vehicle access key or a vehicle ignition key.
The method may also be implemented such that the first message includes radar data. Processing the first message includes performing a spectral analysis of the radar data, tracking frequency content of the radar data over time based on the spectral analysis; and matching the tracked frequency content with a pre-determined gesture signature stored in a look-up table.
The method may also be implemented further including generating feedback including a status of whether the first device successfully performed the operation associated with the gesture and transmitting the feedback from the first device to the mobile device. The method may also be implemented such that performing the operation includes adjusting a temperature setting and the feedback includes a variable feedback signal that varies in accordance with the adjusted temperature setting. The method may also be implemented such that performing the operation includes initiating an emergency call. The method may also be implemented further including receiving, by the first device, a user input including emergency contact information, such that initiating the emergency call is in accordance with the emergency contact information. The method may also be implemented further including receiving, by the first device, user input including a desired operation of the first device to be associated with a desired gesture, and associating the desired operation with the desired gesture.
In accordance with a third example embodiment of the present invention, a control circuit for a first device is provided. The circuit includes a receiver configured to receive, from an external mobile device, a first message including radar data generated using millimeter wave radar signaling in a field of view of the mobile device. The circuit also includes a processor and a non-transitory computer readable medium storing programming for execution by the processor. The programming includes instructions to process the first message to detect a gesture associated with an operation of the first device, and to perform the operation of the first device in accordance with the detected gesture.
Also, the foregoing third example embodiment may be implemented to include one or more of the following additional features. The circuit may also be implemented further including a look-up table (LUT), such that the first message includes radar data. The instructions to process the first message include instructions to perform a spectral analysis of the radar data, track frequency content of the radar data over time based on the spectral analysis, and match the tracked frequency content with a pre-determined gesture signature stored in the LUT. The circuit may also be implemented such that the operation includes at least one of turning on the first device, turning off the first device, setting a temperature setting, setting a fan setting, turning on a sound system, turning off the sound system, setting a volume setting of the sound system, selecting an input of the sound system, tuning a radio channel, opening a mechanical latch, turning on a light, turning off the light, setting a dimming setting of the light, changing a color of the light, turning on a user display, turning off the user display, changing visible content of the user display, setting a cruise control setting, turning on an auto-pilot program, turning off the auto-pilot program, turning on a voice recognition program, turning off the voice recognition program, mapping a route, looking up weather information, looking up information about a road condition, looking up information about traffic, initiating a diagnostic test, enabling a security system, disabling the security system, generating a sound, generating a haptic output, and initiating a communications session.
While this invention has been described with reference to illustrative embodiments, this description is not intended to be construed in a limiting sense. Various modifications and combinations of the illustrative embodiments, as well as other embodiments of the invention, will be apparent to persons skilled in the art upon reference to the description. It is therefore intended that the appended claims encompass any such modifications or embodiments.
This application is a continuation of U.S. patent application Ser. No. 15/401,598, filed on Jan. 9, 2017, which application is hereby incorporated herein by reference.
Diederichs, Kaitlyn et al., “Wireless Biometric Individual Identification Utilizing Millimeter Waves”, IEEE Sensors Letters, vol. 1, No. 1, IEEE Sensors Council 3500104, Feb. 2017, 4 pages. |
Dooring Alert Systems, “Riders Matter”, http://dooringalertsystems.com, printed Oct. 4, 2017, 16 pages. |
Filippelli, Mario et al., “Respiratory dynamics during laughter”, J Appl Physiol, (90), 1441-1446, Apr. 2001, http://iap.physiology.org/content/jap/90/4/1441.full.pdf. |
Fox, Ben, “The Simple Technique That Could Save Cyclists' Lives”, https://www.outsideonline.com/2115116/simple-technique-could-save-cyclists-lives, Sep. 19, 2016, 6 pages. |
Gu, Changzhan et al., “Assessment of Human Respiration Patterns via Noncontact Sensing Using Doppler Multi-Radar System”, Sensors, Mar. 2015, 15(3), 6383-6398, doi: 10.3390/s150306383, 17 pages. |
Guercan, Yalin, “Super-resolution Algorithms for Joint Range-Azimuth-Doppler Estimation in Automotive Radars”, Technische Universiteit Delft (TU Delft), Jan. 25, 2017, 72 pages. |
Inac, Ozgur et al., “A Phased Array RFIC with Built-In Self-Test Capabilities”, IEEE Transactions on Microwave Theory and Techniques, vol. 60, No. 1, Jan. 2012, 10 pages. |
Kizhakkel, V., “Pulsed Radar Target Recognition Based on Micro-Doppler Signatures Using Wavelet Analysis”, A Thesis, Graduate Program in Electrical and Computer Engineering, Ohio State University, Jan. 2013-May 2013, 118 pages. |
Kuehnke, Lutz, “Phased Array Calibration Procedures Based on Measured Element Patterns”, 2001 Eleventh International Conference on Antennas and Propagation, IEEE Conf., Publ. No. 480, Apr. 17-20, 2001, 4 pages. |
Lim, Soo-Chul et al., “Expansion of Smartwatch Touch Interface from Touchscreen to Around Device Interface Using Infrared Line Image Sensors”, Sensors 2015, ISSN 1424-8220, vol. 15, 16642-16653, doi:10.3390/s150716642, www.mdpi.com/journal/sensors, Jul. 15, 2015, 12 pages. |
Lin, Jau-Jr et al., “Design of an FMCW radar baseband signal processing system for automotive application”, SpringerPlus a SpringerOpen Journal, (2016) 5:42, http://creativecommons.org/licenses/by/4.0/, DOI 10.1186/s40064-015-1583-5; Jan. 2016, 16 pages. |
Microwave Journal Frequency Matters, “Single-Chip 24 GHz Radar Front End”, Infineon Technologies AG, www.microwavejournal.com/articles/print/21553-single-chip-24-ghz-radar-front-end, Feb. 13, 2014, 2 pages. |
Schroff, Florian et al., “FaceNet: A Unified Embedding for Face Recognition and Clustering”, CVF, CVPR2015, IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Mar. 12, 2015, pp. 815-823. |
Simon, W., et al., “Highly Integrated KA-Band Tx Frontend Module Including 8×8 Antenna Array”, IMST GmbH, Germany, Asia Pacific Microwave Conference, Dec. 7-10, 2009, 63 pages. |
Suleymanov, Suleyman, “Design and Implementation of an FMCW Radar Signal Processing Module for Automotive Applications”, Master Thesis, University of Twente, Aug. 31, 2016, 61 pages. |
Thayaparan, T. et al., “Intelligent target recognition using micro-Doppler radar signatures”, Defence R&D Canada, Radar Sensor Technology III, Proc. of SPIE, vol. 7308, 730817, Dec. 9, 2009, 11 pages. |
Thayaparan, T. et al., “Micro-Doppler Radar Signatures for Intelligent Target Recognition,” Defence Research and Development Canada, Technical Memorandum, DRDC Ottawa TM 2004-170, Sep. 2004, 73 pages. |
Wilder, Carol N., et al., “Respiratory patterns in infant cry”, Canada Journal of Speech, Human Communication Winter, 1974-75, http://cjslpa.ca/files/1974_HumComm_Vol_01/No_03_2-60/Wilder_Baken_HumComm_1974.pdf, pp. 18-34. |
Xin, Qin et al., “Signal Processing for Digital Beamforming FMCW SAR”, Hindawi Publishing Corporation, Mathematical Problems in Engineering, vol. 2014, Article ID 859890, http://dx.doi.org/10.1155/2014/859890, 11 pages. |
Liming, Gong, “Interpretation on NVIDIA's Radar Gesture Recognition to Have a Full Picture of Project Soli”, CSDN, Jun. 12, 2016, 15 pages. |
Number | Date | Country | |
---|---|---|---|
20200019235 A1 | Jan 2020 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15401598 | Jan 2017 | US |
Child | 16580373 | US |