The present disclosure relates to vital sign estimation and monitoring.
Sleep monitoring tracks the activity of the human body during sleep and provides information on brain activity and other physiological factors during sleep. Hence, sleep monitoring has been used to diagnose or understand the underlying causes of sleep disorders such as sleep apnea, sleep-related seizures, insomnia, etc., and also to measure different aspects of sleep, such as quality and activity.
Common methods of sleep monitoring require wearables or other contact sensors to measure vital signs. Such wearables can be uncomfortable to wear and can make falling asleep difficult or even affect sleep behavior itself. Furthermore, several of these approaches also require overnight monitoring in clinics, increasing cost and patient discomfort. For these reasons, non-contact approaches to sleep monitoring have tremendous utility.
Research into sleep monitoring approaches has been varied. As mentioned above, sleep monitoring using wearables is one of the most common approaches. Non-contact sleep monitoring approaches also exist, such as camera-based and sensor-based approaches. One example uses a low power radar-based biomotion sensor that detects movement to measure sleep/wake patterns during sleep and perform sleep/wake classification.
Methods and systems for remote sleep monitoring are provided. Such methods and systems provide non-contact sleep monitoring via remote sensing or radar sensors. In this regard, when processing backscattered radar signals from a sleeping subject on a normal mattress, a breathing motion magnification effect is observed from mattress surface displacement due to human respiratory activity. This undesirable motion artifact causes existing approaches for accurate heart-rate estimation to fail. Embodiments of the present disclosure use a novel active motion suppression technique to deal with this problem by intelligently selecting a slow-time series from multiple ranges and examining a corresponding phase difference. This approach facilitates improved sleep monitoring, where one or more subjects can be remotely monitored during an evaluation period (which corresponds to an expected sleep cycle).
An exemplary embodiment provides a method for remote sleep monitoring. The method includes transmitting a radar signal toward a subject and receiving a radio frequency (RF) response signal corresponding to the radar signal. The method further includes monitoring vital signs of the subject, which includes processing the RF response signal to produce one or more vital sign signals and monitoring the vital sign signals over an evaluation period. The method further includes classifying a sleep state of the subject based on results from monitoring the vital signs.
Another exemplary embodiment provides a sleep monitoring device. The sleep monitoring device includes a radar sensor configured to receive an RF response signal to a radar signal and a processing circuit coupled to the radar sensor. The processing circuit is configured to monitor vital signs of a subject from the RF response signal by processing the RF response signal to produce a plurality of vital sign signals and classifying a sleep state of the subject based on monitoring the plurality of vital sign signals over an evaluation period.
Another exemplary embodiment provides a system for remote sleep monitoring. The system includes a radar sensor configured to receive an RF response signal to a radar signal transmitted toward one or more subjects, a database, and a processing circuit coupled to the radar sensor and the database. The processing circuit is configured to monitor vital signs of the one or more subjects over an evaluation period by processing the RF response signal to produce vital sign signals for each of the one or more subjects, storing the vital sign signals in the database, and classifying a sleep state of each of the one or more subjects based on the vital sign signals stored in the database.
Those skilled in the art will appreciate the scope of the present disclosure and realize additional aspects thereof after reading the following detailed description of the preferred embodiments in association with the accompanying drawing figures.
The accompanying drawing figures incorporated in and forming a part of this specification illustrate several aspects of the disclosure, and together with the description serve to explain the principles of the disclosure.
The embodiments set forth below represent the necessary information to enable those skilled in the art to practice the embodiments and illustrate the best mode of practicing the embodiments. Upon reading the following description in light of the accompanying drawing figures, those skilled in the art will understand the concepts of the disclosure and will recognize applications of these concepts not particularly addressed herein. It should be understood that these concepts and applications fall within the scope of the disclosure and the accompanying claims.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the present disclosure. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected” or “directly coupled” to another element, there are no intervening elements present.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes,” and/or “including” when used herein specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. It will be further understood that terms used herein should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Methods and systems for remote sleep monitoring are provided. Such methods and systems provide non-contact sleep monitoring via remote sensing or radar sensors. In this regard, when processing backscattered radar signals from a sleeping subject on a normal mattress, a breathing motion magnification effect is observed from mattress surface displacement due to human respiratory activity. This undesirable motion artifact causes existing approaches for accurate heart-rate estimation to fail. Embodiments of the present disclosure use a novel active motion suppression technique to deal with this problem by intelligently selecting a slow-time series from multiple ranges and examining a corresponding phase difference. This approach facilitates improved sleep monitoring, where one or more subjects can be remotely monitored during an evaluation period (which corresponds to an expected sleep cycle).
I. Signal Model
Remote vital sign detection (e.g., for remote sleep monitoring) can be characterized with a signal model. For example, a synthesized transmitted pulse for remote vital sign detection can be modeled as a cosine wave with a Gaussian envelope:
$p_{tx}(\tau) = p_0(\tau)\cos(2\pi F_c \tau)$ (Equation 1)
where $p_0(\tau)$ denotes the Gaussian pulse envelope and is designed to satisfy an emission mask, and $F_c$ denotes the nominal operating frequency. A receive signal in response to the transmitted pulse can be modeled as:
$p_{rx}(\tau) = A_T\,p_0(\tau - \tau_D(t))\cos\big(2\pi F_c(\tau - \tau_D(t))\big)$ (Equation 2)
where $A_T$ denotes the target response and $\tau_D(t)$ denotes the time-varying time delay due to vital sign motion. $\tau$ and $t$ denote the fast-time and slow-time scales, respectively.
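Where helpful, the pulse model of Equations 1 and 2 can be sketched in a few lines of Python. This is a minimal illustration only; the carrier frequency, sampling rate, envelope width, and target response below are assumed values, not parameters taken from the disclosure.

```python
import numpy as np

# Assumed illustrative parameters (not specified by the disclosure)
FC = 7.29e9      # nominal operating frequency Fc (Hz)
FS = 64e9        # fast-time sampling rate (Hz)
SIGMA = 0.2e-9   # Gaussian envelope width (s), standing in for an emission-mask design

def p0(tau):
    """Gaussian pulse envelope p0(tau)."""
    return np.exp(-tau**2 / (2 * SIGMA**2))

def p_tx(tau):
    """Transmitted pulse, Equation 1."""
    return p0(tau) * np.cos(2 * np.pi * FC * tau)

def p_rx(tau, tau_d, a_t=0.5):
    """Received echo, Equation 2: a scaled, delayed copy of the transmit pulse."""
    return a_t * p0(tau - tau_d) * np.cos(2 * np.pi * FC * (tau - tau_d))

tau = np.arange(-5e-9, 5e-9, 1 / FS)    # fast-time axis
echo = p_rx(tau, tau_d=2 * 0.5 / 3e8)   # round-trip delay for a target 0.5 m away
```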
The vital sign motion $V(t)$ of a subject at a nominal distance $d_0$ can be modeled as a sum of two non-stationary, periodic-like signals, $X_B(t)$ for respiratory activity and $X_H(t)$ for cardiac activity:
$d(t) = d_0 + V(t) = d_0 + X_B(t) + X_H(t)$ (Equation 3)
Then the motion-modulated round-trip time delay is written as $\tau_D(t) = 2d(t)/c$, where $c$ is the speed of light.
The complex baseband signal is obtained by mixing the receive signal with a term $e^{j2\pi F_c \tau}$ and low-pass filtering:
$s(t) = A(t)e^{j2\pi F_c \tau_D(t)}$ (Equation 4)
where $A(t)$ denotes a time-varying amplitude. The ideal phase signal is then derived as:
$\Phi(t) = 2\pi F_c \tau_D(t) = \frac{4\pi}{\lambda}\big(X_B(t) + X_H(t)\big) + \varphi_0$ (Equation 5)
where $\lambda = c/F_c$ represents the wavelength and $\varphi_0 = 4\pi d_0/\lambda$ is the initial phase term.
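To make Equations 3 through 5 concrete, the following sketch synthesizes a hypothetical vital sign motion and derives the ideal phase. The sinusoidal breathing and heartbeat waveforms, their rates, and their amplitudes are stand-ins for the non-stationary, periodic-like signals described above.

```python
import numpy as np

C = 3e8                        # speed of light (m/s)
FC = 7.29e9                    # assumed carrier frequency (Hz)
LAM = C / FC                   # wavelength, lambda = c / Fc
t = np.arange(0, 30, 0.01)     # slow-time axis: 100 Hz for 30 s

d0 = 1.0                                       # nominal distance d0 (m), assumed
x_b = 4e-3 * np.sin(2 * np.pi * 0.25 * t)      # respiration: ~4 mm at 15 breaths/min
x_h = 1e-4 * np.sin(2 * np.pi * 1.2 * t)       # heartbeat: ~0.1 mm at 72 beats/min

d = d0 + x_b + x_h             # Equation 3: d(t) = d0 + XB(t) + XH(t)
tau_d = 2 * d / C              # motion-modulated round-trip delay
phi = 4 * np.pi * d / LAM      # ideal phase, Equation 5, with phi0 = 4*pi*d0/lambda
```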
II. Breathing Magnification Effect
An interesting motion artifact has been observed during a sleep monitoring study: a motion magnification effect on the respiration signal that makes heartbeat detection impossible using existing methods, such as radar-based methods.
In general, the respiration signal in the radar return is much stronger than that of the heartbeat signal, for two reasons: radar cross section and physical displacement. Even though a radar sensor illuminates the entire body of a subject, only a small portion of this response contains skin motion or vasomotion due to cardiac activity. The physical displacement due to normal breathing ranges from 1 mm (shallow) to 1 cm (deep), while the physical displacement near the skin surface as a result of heartbeat motion is on the order of 0.1 mm. Furthermore, in the supine position, a non-rigid mattress tends to support and conform closely to the back of the human body. As a result, the mattress moves with the human body during respiration, causing a motion magnification effect due to respiration.
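As a quick back-of-the-envelope check on these figures, taking a mid-range breathing displacement shows the respiration return dominating the heartbeat return by more than an order of magnitude, even before any mattress magnification:

```python
breathing_disp = 5e-3    # mid-range breathing displacement (~5 mm)
heartbeat_disp = 1e-4    # heartbeat displacement near the skin (~0.1 mm)
print(breathing_disp / heartbeat_disp)   # ~50x, before any magnification effect
```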
In order to validate this important observation, a test subject was instructed to lie down on both a rigid, firm surface (e.g., a wooden bed frame) and a non-rigid surface (e.g., a soft mattress) and breathe normally. Radar response signals for both situations were recorded and processed for comparison. As illustrated by the comparison, the respiration-induced displacement observed on the non-rigid surface is substantially magnified relative to that observed on the rigid surface.
III. Active Motion Suppression Technique
Prior work has shown that radar-based heartbeat detection is respiration-interference limited. In the supine position on a non-rigid surface, this problem becomes even more challenging, as explained above with respect to the breathing magnification effect of Section II.
Embodiments described herein provide a new active motion suppression technique to recover the heartbeat signal in the presence of magnified respiration interference. This approach exploits the fact that the human body is an extended dynamic target, and thus vital signs can be observed in multiple range bins. Across these range bins, breathing energy spreads over a larger number of bins than heartbeat (pulse) energy does. The range bins of interest can therefore be divided into two categories: bins containing breathing only, and bins containing a mixture of respiration and heartbeat signals. A goal of the active motion suppression technique is to find the two ‘best’ range bins such that their phase difference in the slow-time series generates a maximum pulse signal-to-noise ratio (SNR).
In an exemplary aspect, the subject 14 lies on a rigid or non-rigid surface 18 (e.g., a mattress and/or a rigid frame) during a sleep evaluation (e.g., an evaluation period corresponding to an expected sleep cycle, such as four or more hours, six or more hours, or eight or more hours). The radar sensor 12 is configured to receive (e.g., sense) a radio frequency (RF) response signal 20 which corresponds to a radar signal 22 emitted toward the subject 14. In some examples, the radar sensor 12 is a transceiver which transmits the radar signal 22 and receives the RF response signal 20. In other examples, the radar signal 22 is emitted from another device (e.g., a transmitter which may be in communication with the radar sensor 12). The processing device 16 records and/or analyzes the RF response signal 20 to produce and/or monitor vital signs of the subject. The processing device 16 is further used to classify a sleep state of the subject (e.g., awake, asleep, depth of sleep, etc.), and may also provide a diagnosis of sleep conditions (e.g., sleep apnea, insomnia, etc.).
The radar sensor 12 in the embodiment of the illustrated system is a pulse-based radar consistent with the signal model of Section I. It should be understood that in other embodiments the radar sensor 12 can include a different radar, such as a radar having a bandwidth between 1.0 GHz and 2.0 GHz. It should also be understood that the radar sensor 12 may in other embodiments be positioned differently, such as wall-mounted or as a free-standing device placed on a surface and aimed toward the subject 14.
With reference to the constellation plot of the received I/Q samples, the ideal samples in a given range bin lie on a circle, as shown in Equation 6:
$I(t) + jQ(t) = A_0 e^{j\Phi(t)}$ (Equation 6)
where $A_0$ is a nominal signal amplitude. After normalizing the samples by $A_0$, a distance error from the unit circle is computed for each channel:
$d_{err} = \sqrt{\frac{1}{N}\sum\big(\sqrt{I^2 + Q^2} - 1\big)^2} \le 0.3$ (Equation 7)
Here the upper limit is selected as thirty percent of the unit radius, which means that if the distance error from the I/Q samples is over 0.3, then this channel is too noisy and thus is ignored.
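A minimal sketch of this screening step, assuming each range bin's I/Q samples have already been normalized to the nominal unit radius, might look like the following (the function names and data layout are illustrative):

```python
import numpy as np

def distance_error(i, q):
    """Equation 7: RMS deviation of normalized I/Q samples from the unit circle."""
    radius = np.sqrt(i**2 + q**2)
    return np.sqrt(np.mean((radius - 1.0)**2))

def screen_range_bins(iq_per_bin, threshold=0.3):
    """Keep range bins whose distance error is within 30% of the unit radius."""
    return [idx for idx, (i, q) in enumerate(iq_per_bin)
            if distance_error(i, q) <= threshold]
```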
At this point, a set V of channels (range bins) with clean phase is obtained, with outliers eliminated. But each set of I/Q samples from the selected range bins can generate a different phase trajectory, with varying arc lengths and starting phases in the constellation plot. An example of such differing phase trajectories is shown in the accompanying drawing figures.
The basic idea of phase-based motion cancellation to suppress the magnified respiration signal is to select one channel from the breathing-only range bins and another from the range bins containing both respiration and heartbeat signals:
$\Phi_m(t) = \frac{4\pi}{\lambda}X_B(t) + \varphi_m$ (Equation 8)
$\Phi_n(t) = \frac{4\pi}{\lambda}\big(X_B(t) + X_H(t)\big) + \varphi_n$ (Equation 9)
where the DC offsets in the m-th channel and the n-th channel are assumed to be compensated perfectly. Ideally, the phase difference is only a function of the heartbeat signal:
$\Phi_{Diff}(t) = \Phi_n(t) - \Phi_m(t) = \frac{4\pi}{\lambda}X_H(t) + \Delta\varphi$ (Equation 10)
But in reality the motion interference cannot be perfectly cancelled out due to phase noise. This leads to an extra term relative to Equation 10, representing motion residual noise $X_B^{Resi}(t)$:
$\tilde{\Phi}_{Diff}(t) = \frac{4\pi}{\lambda}\big(X_H(t) + X_B^{Resi}(t)\big) + \Delta\varphi$ (Equation 11)
The two most appropriate channels are chosen based on the following equation:
$\{m, n\} = \arg\max_{m,n \in V} \gamma_H$ (Equation 12)
The phase difference of every pairwise combination of channels from the set V (obtained as described above) is evaluated for its resulting $\gamma_H$, and the combination that maximizes $\gamma_H$ gets selected. Here $\gamma_H$ is defined as the signal-to-noise ratio at the heartbeat frequency of interest, found by inspecting the spectrum of the phase difference $\tilde{\Phi}_{Diff}(t)$.
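Putting Equations 8 through 12 together, one possible sketch of the pair selection evaluates the phase difference of every channel pair in V and scores each by its spectral SNR near the heartbeat frequency. The heartbeat band limits and the noise-floor estimate below are assumptions; the disclosure only specifies that γ_H is read from the spectrum of the phase difference.

```python
import numpy as np
from itertools import combinations

def gamma_h(phase_diff, fs, band=(0.8, 2.0)):
    """SNR at the heartbeat frequency of interest, from the phase-difference spectrum."""
    spectrum = np.abs(np.fft.rfft(phase_diff - phase_diff.mean()))**2
    freqs = np.fft.rfftfreq(len(phase_diff), 1 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    peak = spectrum[in_band].max()              # candidate heartbeat spectral line
    noise = np.median(spectrum[freqs > 0])      # crude broadband noise floor
    return peak / noise

def select_best_pair(phases, fs):
    """Equation 12: arg max over channel pairs (m, n) from V of gamma_H."""
    best_pair, best_snr = None, -np.inf
    for m, n in combinations(range(len(phases)), 2):
        snr = gamma_h(phases[n] - phases[m], fs)
        if snr > best_snr:
            best_pair, best_snr = (m, n), snr
    return best_pair, best_snr
```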
In order to demonstrate the effectiveness of motion suppression through the proposed technique, no filter is applied in generating the illustrated results, so any suppression of the respiration interference is attributable to the phase-difference operation itself rather than to filtering.
IV. Method for Remote Sleep Monitoring
The process begins at operation 1002, with receiving an RF response signal corresponding to a radar signal transmitted toward a subject. The process continues at operation 1004, with monitoring vital signs of the subject. Operation 1004 includes suboperation 1006, with processing the RF response signal to produce one or more vital sign signals. In an exemplary aspect, processing the RF response signal includes producing a respiratory signal (e.g., from a respiratory-only spatial channel) and a mixed respiratory and cardiac spatial channel, and extracting a cardiac signal from the mixed channel. Operation 1004 further includes suboperation 1008, with monitoring the vital sign signals over an evaluation period. In an exemplary aspect, the evaluation period corresponds to an expected sleep cycle of the subject.
The process continues at operation 1010, with classifying a sleep state of the subject based on results from monitoring the vital signs. For example, the sleep state of the subject (e.g., awake, asleep, depth of sleep, etc.) may be determined throughout the evaluation period (e.g., to determine duration and/or quality of sleep). This analysis of the vital sign signals, including classification of the sleep state, may also be used to provide a diagnosis of sleep conditions (e.g., sleep apnea, insomnia, etc.).
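As a rough end-to-end illustration of operations 1004 through 1010, rate estimation and sleep-state classification might be sketched as below. The spectral-peak rate estimator, the frequency bands, and the threshold rule are all placeholder assumptions; the disclosure does not prescribe a particular classifier.

```python
import numpy as np

def estimate_rate(vital_signal, fs, band):
    """Dominant in-band spectral peak of a vital sign signal, in cycles per minute."""
    spectrum = np.abs(np.fft.rfft(vital_signal - vital_signal.mean()))
    freqs = np.fft.rfftfreq(len(vital_signal), 1 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return 60.0 * freqs[mask][np.argmax(spectrum[mask])]

def classify_sleep_state(resp_signal, cardiac_signal, fs):
    """Toy per-window classifier over monitored vital signs (thresholds assumed)."""
    resp_rate = estimate_rate(resp_signal, fs, band=(0.1, 0.5))      # breaths/min
    heart_rate = estimate_rate(cardiac_signal, fs, band=(0.8, 2.0))  # beats/min
    if resp_rate > 20 or heart_rate > 90:
        return "awake"
    return "asleep"
```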
Although the operations above are illustrated in a series, this is for illustrative purposes, and in some embodiments the operations may be performed in a different order, combined, or performed in parallel.
V. Computer System
The exemplary computer system 1100 in this embodiment includes a processing device 1102 or processor, a main memory 1104 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM), such as synchronous DRAM (SDRAM), etc.), and a static memory 1106 (e.g., flash memory, static random access memory (SRAM), etc.), which may communicate with each other via a data bus 1108. Alternatively, the processing device 1102 may be connected to the main memory 1104 and/or static memory 1106 directly or via some other connectivity means. In an exemplary aspect, the processing device 1102 could be used to perform any of the methods or functions described above.
The processing device 1102 represents one or more general-purpose processing devices, such as a microprocessor, central processing unit (CPU), or the like. More particularly, the processing device 1102 may be a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a processor implementing other instruction sets, or other processors implementing a combination of instruction sets. The processing device 1102 is configured to execute processing logic in instructions for performing the operations and steps discussed herein.
The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with the processing device 1102, which may be a microprocessor, field programmable gate array (FPGA), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or other programmable logic device, a discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. Furthermore, the processing device 1102 may be a microprocessor, or may be any conventional processor, controller, microcontroller, or state machine. The processing device 1102 may also be implemented as a combination of computing devices (e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration).
The computer system 1100 may further include a network interface device 1110. The computer system 1100 also may or may not include an input 1112, configured to receive input and selections to be communicated to the computer system 1100 when executing instructions. The input 1112 may include, but not be limited to, a touch sensor (e.g., a touch display), an alphanumeric input device (e.g., a keyboard), and/or a cursor control device (e.g., a mouse). In an exemplary aspect, the radar sensor 12 of
The computer system 1100 may or may not include a data storage device that includes instructions 1116 stored in a computer-readable medium 1118. The instructions 1116 may also reside, completely or at least partially, within the main memory 1104 and/or within the processing device 1102 during execution thereof by the computer system 1100, the main memory 1104, and the processing device 1102 also constituting computer-readable medium. The instructions 1116 may further be transmitted or received via the network interface device 1110.
While the computer-readable medium 1118 is shown in an exemplary embodiment to be a single medium, the term “computer-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions 1116. The term “computer-readable medium” shall also be taken to include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the processing device 1102 and that causes the processing device 1102 to perform any one or more of the methodologies of the embodiments disclosed herein. The term “computer-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical medium, and magnetic medium.
The operational steps described in any of the exemplary embodiments herein are described to provide examples and discussion. The operations described may be performed in numerous different sequences other than the illustrated sequences. Furthermore, operations described in a single operational step may actually be performed in a number of different steps. Additionally, one or more operational steps discussed in the exemplary embodiments may be combined.
Those skilled in the art will recognize improvements and modifications to the preferred embodiments of the present disclosure. All such improvements and modifications are considered within the scope of the concepts disclosed herein and the claims that follow.
This application is a 35 USC 371 national phase filing of International Application No. PCT/US2020/057452, filed Oct. 27, 2020, which claims the benefit of U.S. Provisional Patent Application No. 62/926,717, filed Oct. 28, 2019, the disclosures of which are incorporated herein by reference in their entireties.