Radar imaging on a mobile computing device

Information

  • Patent Grant
  • Patent Number
    10,823,841
  • Date Filed
    Friday, April 29, 2016
  • Date Issued
    Tuesday, November 3, 2020
Abstract
Systems and methods of capturing images are disclosed. For instance, a plurality of position signals associated with a mobile computing device can be received, the position signals being obtained at least in part using one or more sensors implemented within the mobile computing device. A relative motion between the mobile computing device and a scattering point associated with a target can be determined. A plurality of return signals reflected from the scattering point can be received. Each return signal can correspond to a pulse transmitted by the mobile computing device. A target response associated with the scattering point can be determined based at least in part on the relative motion between the mobile computing device and the scattering point.
Description
FIELD

The present disclosure relates generally to capturing images, and more particularly to generating images using synthetic aperture radar imaging techniques on a mobile computing device.


BACKGROUND

Synthetic aperture radar technology is typically implemented in platforms such as aircraft, satellites, and/or fixed-track moving radar platforms. In particular, such synthetic aperture radar implementations are typically designed for systems wherein the motion of the platform is very precisely constrained to predetermined trajectories and/or precisely measured with GPS. Further, such synthetic aperture radar implementations typically require a large amount of space due to the size of the radar hardware (e.g. circuitry, antennas, etc.). Some synthetic aperture radar implementations can be configured to capture images of a target by simulating a synthetic aperture based on a relative motion between the target and the radar. Such imaging implementations typically generate images through post-processing techniques based at least in part on the predetermined platform trajectory.


Such systems do not meet the constraints of a consumer mobile computing device having limited size, cost, and processing resources. In addition, such a mobile computing device may not have a predetermined trajectory that can be used to create the synthetic aperture.


SUMMARY

Aspects and advantages of embodiments of the present disclosure will be set forth in part in the following description, or may be learned from the description, or may be learned through practice of the embodiments.


One example aspect of the present disclosure is directed to a computer-implemented method of capturing images using a mobile computing device. The method includes receiving, by a mobile computing device, a plurality of position signals associated with the mobile computing device. The plurality of position signals are obtained at least in part using one or more sensors implemented within the mobile computing device. The method further includes determining, by the mobile computing device, a relative motion between the mobile computing device and a scattering point associated with a target scene based at least in part on the plurality of position signals. The method further includes receiving, by the mobile computing device, a plurality of return signals reflected from the scattering point. Each return signal corresponds to a pulse transmitted by the mobile computing device while the mobile computing device is in view of the scattering point. The method further includes determining, by the mobile computing device, a target response associated with the scattering point based at least in part on the relative motion between the mobile computing device and the scattering point.


Other example aspects of the present disclosure are directed to systems, apparatus, tangible non-transitory computer-readable media, user interfaces, memory devices, and electronic devices for capturing synthetic aperture radar images using a mobile computing device.


These and other features, aspects and advantages of various embodiments will become better understood with reference to the following description and appended claims. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the present disclosure and, together with the description, serve to explain the related principles.





BRIEF DESCRIPTION OF THE DRAWINGS

Detailed discussion of embodiments directed to one of ordinary skill in the art is set forth in the specification, which makes reference to the appended figures, in which:



FIG. 1 depicts an example system for capturing images according to example embodiments of the present disclosure;



FIG. 2 depicts an example system for capturing images according to example embodiments of the present disclosure;



FIG. 3 depicts an example antenna configuration according to example embodiments of the present disclosure;



FIG. 4 depicts a flow diagram of an example method of capturing images according to example embodiments of the present disclosure; and



FIG. 5 depicts an example system according to example embodiments of the present disclosure.





DETAILED DESCRIPTION

Reference now will be made in detail to embodiments, one or more examples of which are illustrated in the drawings. Each example is provided by way of explanation of the embodiments, not limitation of the present disclosure. In fact, it will be apparent to those skilled in the art that various modifications and variations can be made to the embodiments without departing from the scope or spirit of the present disclosure. For instance, features illustrated or described as part of one embodiment can be used with another embodiment to yield a still further embodiment. Thus, it is intended that aspects of the present disclosure cover such modifications and variations.


Example aspects of the present disclosure are directed to determining synthetic aperture radar images by a mobile computing device, such as a smartphone, tablet computing device, wearable computing device, laptop computing device, or any other suitable computing device capable of being carried by a user while in operation. Synthetic aperture radar images are typically captured from radar platforms having a substantially fixed and/or predetermined path or trajectory, such as an airplane, satellite, or a fixed track moving radar system. Such images can be captured by simulating a synthesized antenna aperture by capturing radar data associated with a target at a number of positions and times as the platform travels along its trajectory. The simulated aperture can have a larger size than the physical aperture of the antenna. The captured data can then be combined to form an image. The fixed or predetermined motion of the platform can be used to account for a migration of the target within the captured data at each of the different data capture positions.


Synthetic aperture imaging techniques are not typically implemented within consumer mobile devices because the motion of the device in capturing an image is not fixed or predetermined. In this manner, the migration of the target within the captured data at each position does not typically follow a known or predetermined pattern, and is not easily determined. In addition, computing, hardware, and/or size constraints associated with mobile computing devices can preclude implementation of such imaging techniques.


According to example aspects of the present disclosure, a motion of a mobile device can be determined in real time as the mobile device transmits and receives radar data with respect to a target scene, and the determined motion can be used to update a target response associated with received radar data obtained by the mobile device. In particular, data associated with a plurality of locations of a mobile computing device can be received. The data can be determined at least in part by one or more sensors or other components implemented within the mobile computing device. The plurality of locations of the mobile device can correspond to a movement of the mobile device along a trajectory proximate a target scene by a user during an image capture period. One or more relative positions between the mobile device and the target scene can be determined based at least in part on the data associated with the plurality of locations. As the mobile device is being moved along the trajectory, the mobile device can transmit a sequence of pulses and receive one or more reflected signals indicative of the target scene. The mobile device can then determine one or more radar images by compensating for the trajectory of the mobile device in real-time. In various implementations, the image can be a two-dimensional (2D) image, a three-dimensional (3D) image, and/or a see-through-the-wall (STTW) image determined using one or more synthetic aperture radar processing techniques.


More particularly, the mobile device can include a radar module having one or more antenna elements configured to transmit a sequence of pulses and/or to receive return signals from the target scene. The mobile device can further be configured to determine positional information of the mobile device using one or more accelerometers, gyroscopes, depth cameras, optical cameras, ranging base stations, and/or various other suitable components. Upon initiation of an image capture period, for instance, in response to an input from a user, the mobile device can begin transmitting a periodic sequence of modulated pulses as the user moves the mobile device on a trajectory proximate the target scene. As another example, during the image capture period, the target scene may be moved on a trajectory proximate the mobile device. Such relative motion between the mobile device and the target scene can be used to simulate a synthetic antenna aperture that is larger than the physical antenna aperture of the mobile device. A plurality of return signals can be received corresponding to time-delayed versions of the transmitted signals. The plurality of return signals can correspond to a superposition of reflections from all scattering points within the antenna field of view. For instance, the return signals can correspond to a sum of the contribution of all scattering points in the target scene. In some implementations, the return signals can include amplitude data and/or phase data associated with the return signals. Return signals can be received corresponding to each transmitted pulse. In this manner, data indicative of a particular scattering point of the target scene can be received multiple times as the mobile device moves proximate the target scene.
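The superposition of scattering-point reflections described above can be sketched as a simple simulation. This is an illustrative model only; the pulse shape, sampling rate, and scatterer list below are assumptions, not parameters from the disclosure:

```python
import numpy as np

def simulate_return(pulse, scatterers, device_pos, fs, c=3e8):
    """Model the received signal as a superposition of time-delayed,
    reflectivity-weighted copies of the transmitted pulse -- one copy
    per scattering point in the antenna field of view."""
    rx = np.zeros(4 * len(pulse), dtype=complex)
    for point_pos, reflectivity in scatterers:
        r = np.linalg.norm(np.asarray(point_pos) - np.asarray(device_pos))
        delay = int(round(2 * r / c * fs))  # round-trip delay in samples
        if delay + len(pulse) <= len(rx):
            rx[delay:delay + len(pulse)] += reflectivity * pulse
    return rx

# A scatterer 3 m away, sampled at 1 GHz, appears 20 samples into the return.
rx = simulate_return(np.ones(8), [((3.0, 0.0, 0.0), 1.0)], (0.0, 0.0, 0.0), fs=1e9)
```

As the device moves, rerunning this model with a new `device_pos` shifts each scatterer's delay, which is exactly the pulse-to-pulse variation the imaging process must track.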


One or more target responses associated with the target scene can be determined in real-time based at least in part on the relative motion between the mobile device and the target scene. A target response can be indicative of reflected energy received by the mobile device, and can vary based at least in part on the relative range and velocity between the target scene and the mobile device. In some implementations, the range or distance from the mobile device to the target scene (e.g., to one or more scattering points within the target scene) can vary with each pulse transmitted and received by the mobile device. In particular, the reflections from the scattering points within the antenna field of view may be modulated by the relative range and velocity between the various scattering points and the mobile device as the mobile device moves relative to the target scene. In this manner, as the mobile device moves relative to the target scene, the target response can be updated to compensate for the modulated return signal.
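The range- and velocity-dependent modulation described here follows the standard narrowband radar signal model; the sketch below is illustrative, and the 60 GHz carrier is an assumption drawn from the 50-70 GHz band cited later in the disclosure:

```python
import numpy as np

C = 3e8          # speed of light (m/s)
FC = 60e9        # assumed carrier; the disclosure cites 50-70 GHz signals
WAVELENGTH = C / FC

def echo_phase(range_m):
    """Round-trip phase of an echo from a scatterer at the given range.
    As the range changes pulse to pulse, this phase rotation is the
    modulation that the updated target response must compensate for."""
    return np.exp(-1j * 4 * np.pi * range_m / WAVELENGTH)

def doppler_shift(radial_velocity_mps):
    """Doppler frequency induced by relative radial velocity: f_d = 2*v/lambda."""
    return 2 * radial_velocity_mps / WAVELENGTH
```

At a 5 mm wavelength, even a half-wavelength change in range rotates the echo through a full cycle, which is why per-pulse motion compensation is needed.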


In particular, the target response can be updated based at least in part on a relative position and/or velocity between the mobile device and the target scene. As indicated, as the mobile device moves proximate the target scene during an imaging period, a trajectory and/or velocity of the mobile device can be monitored. For instance, in some implementations, the movement of the mobile device by the user can be an ad hoc movement that does not follow a predefined motion or path. In this manner, the relative motion between the target scene and the mobile device during an imaging period is not known prior to the imaging period. In some implementations, the relative trajectory and/or velocity of the mobile device and/or target can be determined using one or more onboard position sensors, such as one or more accelerometers, gyroscopes, depth cameras, optical cameras, etc. in conjunction with the return signals received by the mobile device.


The relative trajectory and/or velocity between the mobile device and the target can be used to determine a migration of an individual scattering point. In particular, as the scattering point moves through the aperture of the receiving antenna of the mobile device over a plurality of pulses during an imaging period, the range between the scattering point and the aperture varies. The determined relative trajectory and/or velocity can be used to determine the degree of variation of the range. The mobile device can compensate for such range variations to reduce or eliminate the range variation based at least in part on the determined degree of variation. The target response associated with the return signals can be updated to reflect the compensated return signals. In particular, the target response can be determined by combining the compensated return signals for each scattering point in the target scene. In some implementations, one or more synthetic aperture radar processing techniques can be used to compensate the return signals and/or to determine the updated target response. For instance, one or more pulse compression techniques, range-Doppler techniques, range and/or Doppler migration correction techniques, and/or Doppler mapping techniques can be used.
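One concrete way to perform this compensate-and-combine step over an arbitrary, measured trajectory is time-domain back-projection. The sketch below is illustrative (it is not necessarily the technique the disclosure uses) and assumes the per-pulse device positions and complex-sampled returns are already available:

```python
import numpy as np

C = 3e8  # speed of light (m/s)

def backproject_pixel(pulses, device_positions, pixel_pos, fs, fc):
    """Focus one image pixel: for each pulse, read the sample at the
    round-trip delay from the measured device position to the pixel,
    undo the range-dependent phase, and coherently sum over the
    trajectory. Range migration is handled implicitly because the
    delay is recomputed per pulse from the actual measured positions."""
    acc = 0j
    for rx, pos in zip(pulses, device_positions):
        r = np.linalg.norm(np.asarray(pixel_pos) - np.asarray(pos))
        idx = int(round(2 * r / C * fs))  # round-trip delay in samples
        if idx < len(rx):
            acc += rx[idx] * np.exp(1j * 4 * np.pi * fc * r / C)  # phase correction
    return acc
```

Echoes from a scatterer at the pixel add in phase across pulses, while other returns average toward zero; repeating this over a grid of pixels yields the image.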


In some implementations, such processing techniques can further be used to generate an image of the radio frequency (RF) reflectivity of the target scene. Such generated image can have a higher resolution than an image generated using real aperture imaging techniques. In various implementations, the generated image can be a 2D image providing range and azimuth information associated with the target scene, or a 3D image providing range, azimuth, and elevation information associated with the target scene. In some implementations, the generated image may be a STTW image of one or more objects located behind a wall or other barrier. The type of image that is generated can be based on the motion or trajectory of the mobile device during the imaging period. In particular, the motion of the mobile device can be used to simulate an antenna aperture suitable for capturing different image types. For instance, a trajectory of the mobile device having only transverse motion relative to the target scene can be suitable for generating a 2D image. As another example, a trajectory of the mobile device having transverse motion and longitudinal motion relative to the target scene can be suitable for generating a 3D image. In this manner, the trajectory of the mobile device during an imaging period can be chosen by a user to generate a desired image type.


As an example, a user of a mobile computing device can initiate an imaging process, for instance, through interaction with a user interface associated with the user device. The mobile device can then prompt the user to move the mobile device with respect to a target scene of which the user desires to capture an image. As the user moves the mobile device with respect to the target scene, the mobile device can begin transmitting a sequence of pulses and receiving return data associated with a target scene. The mobile device can further determine a relative motion between the mobile device and target scene. The mobile device can then generate one or more images of the target scene by combining the return data received as the mobile device moved with respect to the target scene, and compensating for the determined relative motion. The mobile device can provide the image for display on the user interface of the mobile device.


With reference now to the figures, example embodiments of the present disclosure will be discussed in more detail. For instance, FIG. 1 depicts an example system 100 for capturing a synthetic aperture radar image according to example embodiments of the present disclosure. System 100 includes an image capture device 102 configured to capture one or more images of a target 104. Image capture device 102 can be any suitable device configured to capture one or more radar images. For instance, image capture device 102 can be a standalone image capture device, or can be implemented or integrated within a mobile computing device. For instance, the mobile computing device can be any suitable mobile computing device, such as a smartphone, tablet, laptop, wearable computing device, or other suitable mobile computing device capable of being carried by a user while in operation. Image capture device 102 includes one or more antenna elements 106, one or more position sensors 108, and a SAR controller 110. Image capture device 102 can be configured to generate energy signals, such as a modulated periodic signal (e.g. pulse train), to be transmitted in a general direction of target 104. In some implementations, the generated energy signals can be frequency-modulated continuous wave (FMCW) energy signals having various frequencies between about 50 GHz and about 70 GHz. Antenna element(s) 106 can be configured to broadcast or transmit the generated energy signals into space in the direction of target 104, and to receive one or more reflected signals indicative of target 104. For instance, the energy signals can be periodically transmitted at a pulse repetition frequency (PRF) rate. During each transmission, one or more receiver antenna elements can simultaneously be powered. One or more intercepted RF signals can, for instance, be mixed to an intermediate frequency and converted to discrete samples that are provided to SAR controller 110. 
In some implementations, image capture device 102 can be configured to amplify and demodulate the received signals, and to provide the amplified and demodulated signals to SAR controller 110.
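For an FMCW system like the one described, mixing the received chirp with the transmitted chirp produces an intermediate-frequency beat tone whose frequency is proportional to the round-trip delay. A minimal sketch of that standard relationship (the bandwidth and chirp duration used in the usage line are illustrative assumptions):

```python
def fmcw_beat_frequency(range_m, bandwidth_hz, chirp_time_s, c=3e8):
    """Beat frequency produced by a reflector at the given range:
    f_beat = 2 * B * R / (c * T_chirp)."""
    return 2 * bandwidth_hz * range_m / (c * chirp_time_s)

def fmcw_range_from_beat(beat_hz, bandwidth_hz, chirp_time_s, c=3e8):
    """Invert the beat frequency back to range: R = c * f_beat * T_chirp / (2 * B)."""
    return c * beat_hz * chirp_time_s / (2 * bandwidth_hz)

# Assumed 4 GHz sweep over 1 ms: a target 1 m away beats at about 26.7 kHz.
f_beat = fmcw_beat_frequency(1.0, bandwidth_hz=4e9, chirp_time_s=1e-3)
```

Sampling the mixed-down signal and taking its spectrum therefore yields the range profile directly, which is why the intercepted RF can be mixed to an intermediate frequency before sampling.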


For instance, FIG. 2 depicts an example image data acquisition process 140 according to example embodiments of the present disclosure. In particular, image capture device 102 can be configured to transmit energy signals (e.g. original waves) 142 towards target 104, and to intercept or receive return signals (e.g. reflected waves) 144 from target 104. In this manner, image capture device 102 can both emit electromagnetic radio frequency (RF) waves, and receive reflected waves. In particular, energy signals can be propagated in a straight line at the speed of light in one or more directions associated with an antenna beam pattern of antenna element(s) 106. Objects within the propagation path (e.g. target 104) can either absorb the electromagnetic energy or scatter the electromagnetic energy, which results in a change of wave direction.


Electromagnetic radiation reflected coherently back in the direction of image capture device 102 can be intercepted by antenna element(s) 106 (e.g. one or more receiving antenna elements). The received return signal is a superposition of reflections from a plurality of scattering points within the field of view of image capture device 102. As indicated, data indicative of the received return signals can be provided to SAR controller 110 for processing to generate one or more images.


As indicated above, image capture device 102 can be configured to simulate a synthesized antenna aperture using synthetic aperture radar techniques. For instance, during an image capture process or sequence, a user of image capture device 102 can facilitate a relative motion 112 between image capture device 102 and target 104. For instance, a user may facilitate relative motion 112 by moving image capture device 102 in various directions and distances with respect to target 104. For instance, relative motion 112 can be a non-predefined motion that has no set path or trajectory. In this manner, relative motion 112 can include a user-defined motion that is not known prior to initiation of relative motion 112. In some implementations, the user can rotate image capture device 102 about one or more axes during the image capture sequence, such that image capture device 102 and/or an antenna beam associated with image capture device 102 is continuously oriented in a general direction of target 104 during the image capture sequence. As another example, the user can facilitate relative motion 112 by maintaining image capture device 102 in a substantially constant position while target 104 moves in one or more non-predefined directions relative to image capture device 102.


Referring back to FIG. 1, antenna element(s) 106 can include various suitable antenna element types. In addition, antenna element(s) 106 can include various suitable numbers of antenna elements configured in various suitable manners. For instance, in some implementations, antenna element(s) 106 can include one or more microstrip antennas configured as a steered or unsteered array. For instance, antenna element(s) 106 can include multiple transmitter-receiver pairs configured to produce a broad antenna beam that can be digitally steered with respect to target 104. As another example, a single transmitter-receiver pair or an unsteered combination of antenna elements can be enabled to produce one-dimensional ranging and tracking along a line of sight associated with image capture device 102. In this manner, one or more transmitter-receiver pairs can be configured to provide one or more channels of data associated with the received signals.


As an example, FIG. 3 depicts an example antenna configuration 120 according to example embodiments of the present disclosure. Antenna configuration 120 can correspond to antenna element(s) 106 of FIG. 1. As shown, antenna configuration 120 can include transmitting antennas 122 and 124, and receiving antennas 126, 128, 130, and 132. Antenna configuration 120 further includes a feed network 134 configured to transmit energy signals to transmitting antennas 122, 124, and from receiving antennas 126-132. Antenna configuration 120 can be configured as a steered array (e.g. phased array) configured to electronically or digitally steer an associated radar beam with respect to target 104. Each transmitting antenna 122, 124 can be paired with each receiving antenna 126-132 to provide eight channels of data, which can be coherently combined, for instance, in a digital domain using one or more suitable beam-forming algorithms to enable 3D spatial discrimination.
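The eight-channel pairing described above is a standard MIMO arrangement: each transmit/receive pair can be treated, under the usual far-field approximation, as a single virtual element at the pair's midpoint. The element coordinates below are illustrative assumptions, since the disclosure does not give spacings:

```python
from itertools import product

def virtual_channels(tx_positions, rx_positions):
    """Pair every transmitting antenna with every receiving antenna.
    Each pair yields one data channel, approximated as a virtual
    element located at the midpoint of the Tx and Rx positions."""
    return [((tx, rx), tuple((t + r) / 2 for t, r in zip(tx, rx)))
            for tx, rx in product(tx_positions, rx_positions)]

# Two transmitters and four receivers, as in FIG. 3 (positions assumed).
tx = [(0.0, 0.0), (0.01, 0.0)]
rx = [(0.0, 0.0025 * k) for k in range(4)]
channels = virtual_channels(tx, rx)  # 2 x 4 = 8 coherently combinable channels
```

The eight virtual elements span a larger effective aperture than either physical array alone, which is what enables the 3D spatial discrimination mentioned above.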


Example antenna configuration 120 is provided for illustrative purposes only. As indicated, it will be appreciated that various other suitable antenna configurations can be used without deviating from the scope of the present disclosure. For instance, various suitable antenna configurations can be used having various suitable antenna element types, numbers, and/or arrangements, and having various suitable feed networks.


Referring back to FIG. 1, image capture device 102 can further include one or more position sensor(s) 108. In particular, position sensor(s) 108 can be integrated or implemented within image capture device 102. In some implementations, one or more position sensors 108 can be external to image capture device 102. As indicated above, position sensor(s) 108 can include one or more accelerometers, gyroscopes, depth cameras, optical cameras, and/or ranging base stations. Position sensor(s) 108 can be configured to monitor real-time motion and/or position of image capture device 102. For instance, image capture device 102 can be configured to determine a plurality of positions of image capture device 102 based at least in part on a plurality of positioning signals obtained by position sensor(s) 108. In this manner, a position and orientation of image capture device 102 can be tracked as image capture device 102 transmits and receives energy signals during an image capture sequence. In some implementations, a velocity of image capture device 102 can further be determined based at least in part on the positioning signals obtained by position sensor(s) 108. In this manner, a position, velocity, orientation, or other physical characteristics of image capture device 102 can be determined during an image capture sequence.
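As an illustration of how raw inertial samples could become the per-pulse positions needed for image formation, the sketch below double-integrates accelerometer readings (naive dead reckoning). This is an assumed, simplified scheme rather than the disclosure's method; in practice the drift of pure integration is why camera and ranging measurements would be fused in as well:

```python
def integrate_trajectory(accels, dt, v0=(0.0, 0.0, 0.0), p0=(0.0, 0.0, 0.0)):
    """Double-integrate 3-axis acceleration samples (spacing dt seconds)
    into a sequence of positions. Drift grows quadratically with time,
    so a real device would fuse this with optical or ranging data."""
    v, p = list(v0), list(p0)
    positions = [tuple(p)]
    for a in accels:
        for i in range(3):
            v[i] += a[i] * dt  # velocity update
            p[i] += v[i] * dt  # position update
        positions.append(tuple(p))
    return positions

# One second of constant 1 m/s^2 acceleration along x, sampled at 10 Hz.
traj = integrate_trajectory([(1.0, 0.0, 0.0)] * 10, dt=0.1)
```

Each entry of `traj` can then be associated with the pulse transmitted at that instant, supplying the measured trajectory the SAR processing needs.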


As an example, if the user facilitates relative motion 112 by moving image capture device 102 with respect to the target 104 during an image capture sequence, position sensor(s) 108 can monitor the motion and/or velocity of image capture device 102. For instance, position sensor(s) 108 can obtain a plurality of positioning signals indicative of one or more positions, orientations, velocities, etc. of image capture device 102 as the user moves image capture device 102 to facilitate relative motion 112. The positioning signals, along with the return signals received by antenna element(s) 106 can be used to determine relative motion 112.


In particular, a timing associated with the return signals can be used to determine a position (e.g. spatial coordinates) of target 104 and/or a range between target 104 and image capture device 102. Such determined position and/or range can be used in conjunction with the obtained positioning signals to determine relative motion 112. In some implementations, one or more relative positions and/or one or more relative velocities can be determined.
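The timing-to-range relationship here is the usual pulse radar one, R = c·t/2; differencing ranges measured on successive pulses then gives one simple estimate of relative radial velocity. A minimal sketch:

```python
C = 3e8  # speed of light (m/s)

def range_from_round_trip(t_seconds):
    """Range to a scattering point from the pulse round-trip time: R = c*t/2."""
    return C * t_seconds / 2

def radial_velocities(ranges_m, pri_s):
    """Relative radial velocity between successive pulses, given the
    pulse repetition interval pri_s (positive means receding)."""
    return [(r1 - r0) / pri_s for r0, r1 in zip(ranges_m, ranges_m[1:])]
```

Combined with the device positions from the position sensors, these per-pulse ranges and velocities constrain the relative motion 112 used for image formation.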


As indicated above, SAR controller 110 can be configured to receive the (digitally sampled) return signals intercepted by antenna element(s) 106 and to process the signals to generate one or more images depicting target 104 and/or a scene surrounding target 104. For instance, SAR controller 110 can implement one or more SAR or inverse SAR (ISAR) processing techniques to resolve spatially separated points associated with target 104 and/or the surrounding scene. In example implementations, such processing techniques can include one or more pulse compression techniques, range-Doppler processing techniques, Doppler mapping techniques, range migration correction techniques, Doppler migration correction techniques, and/or other suitable processing techniques.


As indicated above, a plurality of target responses can be determined and/or updated based at least in part on relative motion 112. In particular, a target response can be indicative of the energy reflected by a scattering point associated with target 104. The reflected energy can be modulated based at least in part on the relative range and the relative velocity between image capture device 102 and target 104. As image capture device 102 moves relative to target 104, the relative range and velocities may vary. In this manner, the modulation of various return signals obtained at different times and/or positions may vary. For instance, a first return signal associated with a scattering point obtained at a first position and/or velocity may have different characteristics than a second return signal associated with the scattering point obtained at a second position and/or velocity.


In this manner, the varying return signals can be resolved and combined to generate an image. For instance, a first target response associated with the reflected energy received at a first position can be updated to reflect the relative motion between image capture device 102 and target 104. In particular, such updated target response can compensate for the varying modulations in return signals associated with a scattering point that were received at different relative positions. As indicated above, such modulation can cause a response associated with a scattering point to migrate in a non-predetermined manner. For instance, the range of the scattering point with respect to image capture device 102 can vary based at least in part on relative motion 112. Such range migration can be corrected or compensated for based at least in part on the determined relative positions and/or velocities. The updated target response can reflect such compensated range migration. In this manner, the target response can be updated one or more times to reflect various relative positions and/or velocities between image capture device 102 and target 104.


As indicated, such described SAR techniques can be used to process the received return signals to generate an image depicting target 104. Such image can be a radar image having a higher resolution than an image generated using real aperture radar imaging techniques associated with antenna element(s) 106. In various implementations, the image can be a 2D image, a 3D image, and/or a STTW image. For instance, target 104 may include an occluded object 105 located behind target 104 relative to image capture device 102. Such STTW image may depict occluded object 105.


As indicated above, in implementations wherein a 2D image is generated using SAR processing techniques, the image can provide range and azimuth information associated with target 104. Such 2D image can be captured by facilitating a transverse relative motion 112 between image capture device 102 and target 104. For instance, the user can facilitate a motion of image capture device 102 along a single plane parallel to a face of target 104. In implementations wherein a 3D image is generated using SAR processing techniques, the image can provide range, azimuth, and elevation information associated with target 104. Such 3D image can be captured by facilitating a transverse and longitudinal relative motion 112 between image capture device 102 and target 104. For instance, a user can facilitate a motion of image capture device 102 along a parallel plane relative to the face of target 104 and along a perpendicular plane relative to the face of target 104.


In some implementations, image capture device 102 can include a user interface configured to receive an input from the user requesting initiation of an image capture sequence. In some implementations, the user interface can prompt the user to select an image type to be generated (e.g. 2D, 3D, STTW, etc.). The user interface can further prompt the user to move image capture device 102 in an appropriate manner based at least in part on the selected image type. Upon initiation of the image capture sequence, the user can begin moving image capture device 102 in accordance with the selected image type, and image capture device 102 can begin transmitting energy signals. In implementations wherein antenna element(s) 106 are configured as an electronically steered array, image capture device 102 may identify target 104 and steer the antenna beam generated by antenna element(s) 106 towards target 104, such that the main lobe of the antenna beam is pointed towards target 104 throughout the image capture sequence. In this manner, a signal-to-noise ratio of the return signals can be improved.



FIG. 4 depicts a flow diagram of an example method (200) of capturing one or more images according to example embodiments of the present disclosure. Method (200) can be implemented by one or more computing devices, such as one or more of the computing devices depicted in FIG. 5. In particular implementations, method (200) can be implemented by the SAR controller 110 depicted in FIG. 1. In addition, FIG. 4 depicts steps performed in a particular order for purposes of illustration and discussion. Those of ordinary skill in the art, using the disclosures provided herein, will understand that the steps of any of the methods discussed herein can be adapted, rearranged, expanded, omitted, or modified in various ways without deviating from the scope of the present disclosure.


At (202), method (200) can include transmitting a plurality of pulses during an image capture period associated with a mobile computing device. For instance, the pulses can be periodically transmitted frequency-modulated continuous wave (FMCW) RF signals. As indicated, the image capture period can be associated with a synthetic aperture radar imaging technique wherein a synthetic aperture is simulated using a relative motion between the mobile computing device and a target.
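An FMCW pulse of the kind referenced above is a linear frequency-modulated chirp. The following is a minimal complex-baseband generator; the sample rate, bandwidth, and duration used below are illustrative values only, as the disclosure does not specify the pulse parameters.

```python
import numpy as np

def fmcw_chirp(f_start_hz, bandwidth_hz, duration_s, sample_rate_hz):
    """Complex-baseband linear-FM (FMCW) pulse whose instantaneous
    frequency sweeps from f_start_hz to f_start_hz + bandwidth_hz
    over duration_s seconds."""
    n = round(duration_s * sample_rate_hz)
    t = np.arange(n) / sample_rate_hz
    slope = bandwidth_hz / duration_s  # sweep rate, Hz per second
    phase = 2.0 * np.pi * (f_start_hz * t + 0.5 * slope * t**2)
    return np.exp(1j * phase)
```

The quadratic phase term is what makes the instantaneous frequency (the derivative of phase over 2π) sweep linearly across the bandwidth.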


At (204), method (200) can include receiving a plurality of return signals associated with a scattering point associated with the target. For instance, the scattering point can be a point located on the target that reflects received energy in a direction of the mobile computing device. In some implementations, each return signal associated with the scattering point can be received from a different relative position between the target and the mobile computing device.


At (206), method (200) can include receiving a plurality of positioning signals associated with the mobile computing device during the image capture period. As indicated, the positioning signals can be obtained by one or more position sensors associated with the mobile computing device. For instance, the position sensors can be embedded within the mobile computing device and/or external to the mobile computing device. In particular, the positioning signals can be indicative of one or more positions, orientations, velocities, and/or other physical characteristics of the mobile computing device as the mobile computing device moves with respect to a target.
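A simple way to turn such positioning signals into position estimates is dead reckoning: integrating accelerometer samples twice. The sketch below covers a single axis with trapezoidal integration; it is an illustration only, as a practical device would fuse several sensors and correct for integration drift.

```python
import numpy as np

def integrate_motion(accel_mps2, dt_s, v0=0.0, p0=0.0):
    """Dead-reckon one-axis velocity and position from uniformly
    sampled accelerometer readings via trapezoidal integration."""
    a = np.asarray(accel_mps2, dtype=float)
    dv = np.cumsum((a[1:] + a[:-1]) / 2.0 * dt_s)   # integrate accel -> velocity
    v = v0 + np.concatenate(([0.0], dv))
    dp = np.cumsum((v[1:] + v[:-1]) / 2.0 * dt_s)   # integrate velocity -> position
    p = p0 + np.concatenate(([0.0], dp))
    return v, p
```

For a constant 2 m/s² acceleration over one second, this recovers the expected 2 m/s final velocity and 1 m displacement.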


At (208), method (200) can include determining a relative motion between the mobile computing device and the target. As indicated above, the relative motion can be determined based at least in part on the positioning signals. The relative motion can further be determined based at least in part on the received return signals. For instance, the relative motion can be determined at least in part from a timing between transmission of pulses and reception of the corresponding return signals by the mobile computing device. Such timing can be indicative of a range and/or distance between the target and the mobile computing device. Doppler frequencies associated with the return signals can further be used to determine a velocity of the target relative to the mobile computing device. Such range and velocity determined from the return signals can be used in conjunction with the positioning signals obtained by the position sensors to determine the relative motion.
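The timing and Doppler relations referenced above are the standard radar equations R = c·τ/2 (round-trip delay to range) and f_d = 2·v·f/c (two-way Doppler shift to radial velocity). A minimal sketch:

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_delay(round_trip_s):
    """Target range implied by the pulse round-trip time: R = c*tau/2."""
    return C * round_trip_s / 2.0

def velocity_from_doppler(doppler_hz, carrier_hz):
    """Radial velocity implied by the two-way Doppler shift
    fd = 2*v*f/c (a positive shift indicates closing geometry)."""
    return doppler_hz * C / (2.0 * carrier_hz)
```

For example, a 1 microsecond round trip corresponds to a range of roughly 150 meters, and the two relations are inverses of one another for any carrier frequency.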


At (210), method (200) can include determining a target response associated with the scattering point based at least in part on the relative motion. As indicated, the target response can be indicative of received energy reflected by the scattering point.


At (212), method (200) can include updating the target response based at least in part on the determined relative motion between the target and the mobile computing device. As indicated, the updated target response can be determined to compensate for variations in return signals obtained from different positions relative to the scattering point. In some implementations, updating the target response can include determining a second target response to reflect the discrepancies between the relative range and velocity of the mobile computing device and the target associated with different return signals.
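One common way to compensate a complex target response for such position-to-position variations is a phase correction tied to the range deviation: a change dR in the one-way range changes the two-way path by 2·dR and hence the return phase by 4π·dR/λ. The sketch below assumes a narrowband signal and a known range error; the disclosure does not mandate this particular correction.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def motion_compensate(target_response, range_error_m, carrier_hz):
    """Remove the phase contributed by a deviation range_error_m
    between the measured and nominal device-to-scatterer range.
    The two-way path change of 2*dR adds 4*pi*dR/lambda of phase."""
    wavelength = C / carrier_hz
    return target_response * np.exp(-4j * np.pi * range_error_m / wavelength)
```

Applying the conjugate phase in this way aligns returns collected from different relative positions so that they add coherently in the image formation step.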


At (214), method (200) can include generating an image depicting the target based at least in part on the updated target response. The image can be a 2D image, a 3D image and/or a STTW image depicting an occluded object associated with the target.
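One standard algorithm for forming an image from motion-compensated returns is time-domain backprojection: for every image pixel, remove each pulse's two-way propagation phase for that pixel and sum the returns coherently, so that energy focuses at pixels containing actual scattering points. The single-range-gate sketch below is an illustration under that assumption, not the disclosure's mandated image formation method.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def backproject(positions, returns, pixels, carrier_hz):
    """Time-domain backprojection for a single range gate.
    positions: (P, 2) device position per pulse, meters
    returns:   (P,) complex return sample per pulse
    pixels:    (N, 2) image pixel coordinates, meters
    Returns the magnitude image over the N pixels."""
    wavelength = C / carrier_hz
    image = np.zeros(len(pixels), dtype=complex)
    for pos, ret in zip(positions, returns):
        r = np.linalg.norm(pixels - pos, axis=1)        # device-to-pixel ranges
        image += ret * np.exp(1j * 4 * np.pi * r / wavelength)  # undo two-way phase
    return np.abs(image) / len(positions)
```

Simulating a point scatterer one meter from a 20 cm aperture of device positions, the image peaks at the scatterer's true location and falls off at neighboring pixels, which is the synthetic-aperture focusing effect the method relies on.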



FIG. 5 depicts an example computing system 300 that can be used to implement the methods and systems according to example aspects of the present disclosure. The system 300 can be implemented using a client-server architecture that includes a mobile computing device 310 that communicates with one or more servers 330 over a network 340. The system 300 can be implemented using other suitable architectures, such as a single computing device.


The system 300 includes a mobile computing device 310. The mobile computing device 310 can be implemented using any suitable computing device(s). The mobile computing device 310 can correspond to image capture device 102 of FIG. 1 or other device. In some implementations, the mobile computing device 310 can be a smartphone, tablet, wearable computing device, laptop, or any other suitable computing device capable of being carried by a user while in operation. The mobile computing device 310 can have one or more processors 312 and one or more memory devices 314. The mobile computing device 310 can also include a network interface used to communicate with one or more servers 330 over the network 340. The network interface can include any suitable components for interfacing with one or more networks, including for example, transmitters, receivers, ports, controllers, antennas, or other suitable components.


The one or more processors 312 can include any suitable processing device, such as a microprocessor, microcontroller, integrated circuit, logic device, or other suitable processing device. The one or more memory devices 314 can include one or more computer-readable media, including, but not limited to, non-transitory computer-readable media, RAM, ROM, hard drives, flash drives, or other memory devices. The one or more memory devices 314 can store information accessible by the one or more processors 312, including computer-readable instructions 316 that can be executed by the one or more processors 312. The instructions 316 can be any set of instructions that, when executed by the one or more processors 312, cause the one or more processors 312 to perform operations. For instance, the instructions 316 can be executed by the one or more processors 312 to implement a user interface 320 for capturing images according to example embodiments of the present disclosure and a SAR controller 110 described with reference to FIG. 2.


As shown in FIG. 5, the one or more memory devices 314 can also store data 318 that can be retrieved, manipulated, created, or stored by the one or more processors 312. The data 318 can include, for instance, return signals generated according to example aspects of the present disclosure, and other data. The data 318 can be stored in one or more databases. The one or more databases can be connected to the mobile computing device 310 by a high bandwidth LAN or WAN, or can also be connected to mobile computing device 310 through network 340. The one or more databases can be split up so that they are located in multiple locales.


The mobile computing device 310 can include various input/output devices for providing information to and receiving information from a user, such as a touch screen, touch pad, data entry keys, speakers, and/or a microphone suitable for voice recognition. For instance, the mobile computing device 310 can have a display device 335 for presenting a user interface for displaying images according to example aspects of the present disclosure. Mobile computing device 310 can further include position sensors 108 described with respect to FIG. 1.


Mobile computing device 310 can further include a radar module 322. Radar module 322 can include one or more antenna elements 106 as described with reference to FIG. 1. Radar module 322 can further include a signal generator configured to generate energy signals to be transmitted, and to provide the generated energy signals to antenna element(s) 106.


In some implementations, the mobile computing device 310 can exchange data with one or more remote computing devices, such as server 330 over the network 340. For instance, server 330 can be a web server. Server 330 can be implemented using any suitable type of computing device. Similar to the mobile computing device 310, a server 330 can include one or more processor(s) 332 and a memory 334. The one or more processor(s) 332 can include one or more central processing units (CPUs), graphics processing units (GPUs) dedicated to efficiently rendering images or performing other specialized calculations, and/or other processing devices. The memory 334 can include one or more computer-readable media and can store information accessible by the one or more processors 332, including instructions 336 that can be executed by the one or more processors 332 and data 338.


In some implementations, one or more example aspects of the present disclosure can be performed by server 330. For instance, one or more operations associated with SAR controller 110 can be performed by server 330 and communicated to the mobile computing device 310 via the network 340.


The server 330 can also include a network interface used to communicate with one or more remote computing devices (e.g. mobile computing device 310) over the network 340. The network interface can include any suitable components for interfacing with one or more networks, including for example, transmitters, receivers, ports, controllers, antennas, or other suitable components.


The network 340 can be any type of communications network, such as a local area network (e.g. intranet), wide area network (e.g. Internet), cellular network, or some combination thereof. The network 340 can also include a direct connection between a server 330 and the mobile computing device 310. In general, communication between the mobile computing device 310 and a server 330 can be carried via a network interface using any type of wired and/or wireless connection, using a variety of communication protocols (e.g. TCP/IP, HTTP, SMTP, FTP), encodings or formats (e.g. HTML, XML), and/or protection schemes (e.g. VPN, secure HTTP, SSL).


The technology discussed herein makes reference to servers, databases, software applications, and other computer-based systems, as well as actions taken and information sent to and from such systems. One of ordinary skill in the art will recognize that the inherent flexibility of computer-based systems allows for a great variety of possible configurations, combinations, and divisions of tasks and functionality between and among components. For instance, server processes discussed herein may be implemented using a single server or multiple servers working in combination. Databases and applications may be implemented on a single system or distributed across multiple systems. Distributed components may operate sequentially or in parallel.


While the present subject matter has been described in detail with respect to specific example embodiments thereof, it will be appreciated that those skilled in the art, upon attaining an understanding of the foregoing, may readily produce alterations to, variations of, and equivalents to such embodiments. Accordingly, the scope of the present disclosure is by way of example rather than by way of limitation, and the subject disclosure does not preclude inclusion of such modifications, variations and/or additions to the present subject matter as would be readily apparent to one of ordinary skill in the art.

Claims
  • 1. A method performed by a computing device for creating radar-based images, the computing device comprising a smartphone, tablet computing device, wearable computing device, or laptop computing device, the method comprising:
    at a first location of the computing device relative to a fixed target:
      transmitting, by the computing device, a first radar signal;
      receiving, by the computing device, a first return signal caused by a reflection of the first radar signal off the fixed target, the first return signal comprising a plurality of channels of first radar data from a plurality of receive antennas; and
      determining, by the computing device, a first target response associated with the fixed target based on the first return signal, the first target response determined by combining the channels of first radar data in a digital domain using a beamforming algorithm to enable spatial discrimination; and
    at a second location of the computing device relative to the fixed target, the second location not corresponding to a pre-determined trajectory of the computing device:
      transmitting, by the computing device, a second radar signal;
      receiving, by the computing device, a second return signal caused by a reflection of the second radar signal off the fixed target, the second return signal comprising a plurality of channels of second radar data from the receive antennas;
      determining, by the computing device, a second target response associated with the fixed target based on the second return signal, the second target response determined by combining the channels of second radar data in the digital domain using the beamforming algorithm;
      determining, by the computing device, a movement of the computing device relative to the fixed target between the first and second locations based on at least one of a difference between the first and second target responses or sensor data from at least one sensor implemented within the computing device, the movement comprising a change in range to the fixed target and a velocity of the computing device relative to the fixed target;
      adjusting, by the computing device, the second target response based on the movement; and
      creating, by the computing device, a radar-based image of the fixed target based on the first and adjusted second target responses.
  • 2. The method of claim 1, wherein the millimeter wave radar signals have frequencies between 50 gigahertz (GHz) and 70 GHz.
  • 3. The method of claim 1, wherein the sensor comprises an accelerometer, gyroscope, depth camera, optical camera, or ranging device.
  • 4. The method of claim 1, wherein: the second radar signal is transmitted via a plurality of transmit antennas; and the second radar signal is steered towards the fixed target using beamforming techniques.
  • 5. The method of claim 4, wherein the second radar signal is steered based on the movement.
  • 6. The method of claim 4: further comprising identifying, by the computing device, a location of the fixed target relative to the computing device based on the first target response; and wherein the second radar signal is steered based on the identified location of the fixed target and the movement.
  • 7. The method of claim 1, wherein the movement further comprises a change in orientation of the computing device relative to the fixed target.
  • 8. The method of claim 1, wherein the radar-based image is a see-through-the-wall image.
  • 9. The method of claim 1, wherein the radar-based image is a three-dimensional image.
  • 10. The method of claim 1, wherein: the movement is used to determine a migration of an individual scattering point of the fixed target relative to the computing device; and the second target response is adjusted based on the migration of the individual scattering point.
  • 11. A computing system comprised by a smartphone, tablet computing device, wearable computing device, or laptop computing device, the computing system comprising:
    at least one processor;
    at least one radar transmitter;
    at least one radar receiver;
    a plurality of receive antennas;
    at least one sensor; and
    at least one memory device, the memory device storing computer-readable instructions that when executed by the processor cause the processor to:
      at a first location of the computing system relative to a fixed target:
        cause the radar transmitter to transmit a first radar signal;
        receive, via the radar receiver, a first return signal caused by a reflection of the first radar signal off the fixed target, the first return signal comprising channels of first radar data from each of the receive antennas; and
        determine a first target response associated with the fixed target based on the first return signal, the first target response determined by combining the channels of first radar data in a digital domain using a beamforming algorithm to enable spatial discrimination; and
      at a second location of the computing system relative to the fixed target, the second location not corresponding to a pre-determined trajectory of the computing system:
        cause the radar transmitter to transmit a second radar signal;
        receive, via the radar receiver, a second return signal caused by a reflection of the second radar signal off the fixed target, the second return signal comprising channels of second radar data from each of the receive antennas;
        determine a second target response associated with the fixed target based on the second return signal, the second target response determined by combining the channels of second radar data in the digital domain using the beamforming algorithm;
        determine a movement of the computing system relative to the fixed target between the first and second locations based on at least one of a difference between the first and second target responses or sensor data from the at least one sensor, the movement comprising a change in range to the fixed target and a velocity of the computing device relative to the fixed target;
        adjust the second target response based on the movement; and
        create a radar-based image of the fixed target based on the first and adjusted second target responses.
  • 12. The computing system of claim 11, wherein the millimeter wave radar signals have frequencies between 50 gigahertz (GHz) and 70 GHz.
  • 13. The computing system of claim 11, wherein the sensor comprises an accelerometer, gyroscope, depth camera, optical camera, or ranging device.
  • 14. The computing system of claim 11, wherein: the computing system further comprises a plurality of transmit antennas; the second radar signal is transmitted via the plurality of transmit antennas; and the second radar signal is steered using beamforming techniques.
  • 15. The computing system of claim 14, wherein the second radar signal is steered towards the fixed target.
  • 16. The computing system of claim 15, wherein the second radar signal is steered based on the movement.
  • 17. The computing system of claim 15, wherein: the instructions further cause the processor to identify a location of the fixed target relative to the computing system based on the first target response; and the second radar signal is steered based on the identified location of the fixed target and the relative movement.
  • 18. The computing system of claim 11, wherein the movement further comprises a change in orientation of the computing device relative to the fixed target.
  • 19. The computing system of claim 11, wherein the radar-based image is one or more of a three-dimensional image or a see-through-the-wall image.
  • 20. The computing system of claim 11, wherein: the movement is used to determine a migration of an individual scattering point of the fixed target relative to the computing device; and the second target response is adjusted based on the migration of the individual scattering point.
PRIORITY CLAIM

The present application is based on and claims priority to U.S. Provisional Application 62/237,975 having a filing date of Oct. 6, 2015, which is incorporated by reference herein.

20150002391 Chen Jan 2015 A1
20150009096 Lee et al. Jan 2015 A1
20150026815 Barrett Jan 2015 A1
20150029050 Driscoll et al. Jan 2015 A1
20150030256 Brady et al. Jan 2015 A1
20150040040 Balan et al. Feb 2015 A1
20150046183 Cireddu Feb 2015 A1
20150062033 Ishihara Mar 2015 A1
20150068069 Tran et al. Mar 2015 A1
20150077282 Mohamadi Mar 2015 A1
20150077345 Hwang et al. Mar 2015 A1
20150085060 Fish et al. Mar 2015 A1
20150091820 Rosenberg et al. Apr 2015 A1
20150091858 Rosenberg et al. Apr 2015 A1
20150091859 Rosenberg et al. Apr 2015 A1
20150091903 Costello et al. Apr 2015 A1
20150099941 Tran Apr 2015 A1
20150100328 Kress et al. Apr 2015 A1
20150109164 Takaki Apr 2015 A1
20150112606 He et al. Apr 2015 A1
20150133017 Liao et al. May 2015 A1
20150143601 Longinotti-Buitoni et al. May 2015 A1
20150145805 Liu May 2015 A1
20150162729 Reversat et al. Jun 2015 A1
20150177866 Hwang et al. Jun 2015 A1
20150185314 Corcos et al. Jul 2015 A1
20150199045 Robucci et al. Jul 2015 A1
20150205358 Lyren Jul 2015 A1
20150223733 Al-Alusi Aug 2015 A1
20150226004 Thompson Aug 2015 A1
20150229885 Offenhaeuser Aug 2015 A1
20150256763 Niemi Sep 2015 A1
20150261320 Leto Sep 2015 A1
20150268027 Gerdes Sep 2015 A1
20150268799 Starner et al. Sep 2015 A1
20150277569 Sprenger et al. Oct 2015 A1
20150280102 Tajitsu et al. Oct 2015 A1
20150285906 Hooper et al. Oct 2015 A1
20150287187 Redtel Oct 2015 A1
20150301167 Sentelle Oct 2015 A1
20150312041 Choi Oct 2015 A1
20150314780 Stenneth et al. Nov 2015 A1
20150317518 Fujimaki et al. Nov 2015 A1
20150323993 Levesque et al. Nov 2015 A1
20150332075 Burch Nov 2015 A1
20150341550 Lay Nov 2015 A1
20150346820 Poupyrev et al. Dec 2015 A1
20150350902 Baxley et al. Dec 2015 A1
20150351703 Phillips et al. Dec 2015 A1
20150370250 Bachrach et al. Dec 2015 A1
20150375339 Sterling et al. Dec 2015 A1
20160018948 Parvarandeh et al. Jan 2016 A1
20160026253 Bradski et al. Jan 2016 A1
20160038083 Ding et al. Feb 2016 A1
20160041617 Poupyrev Feb 2016 A1
20160041618 Poupyrev Feb 2016 A1
20160042169 Polehn Feb 2016 A1
20160048235 Poupyrev Feb 2016 A1
20160048236 Poupyrev Feb 2016 A1
20160048672 Lux et al. Feb 2016 A1
20160054792 Poupyrev Feb 2016 A1
20160054803 Poupyrev Feb 2016 A1
20160054804 Gollakata et al. Feb 2016 A1
20160055201 Poupyrev et al. Feb 2016 A1
20160077202 Hirvonen Mar 2016 A1
20160090839 Stolarcyzk Mar 2016 A1
20160098089 Poupyrev Apr 2016 A1
20160100166 Dragne et al. Apr 2016 A1
20160103500 Hussey et al. Apr 2016 A1
20160106328 Mestha et al. Apr 2016 A1
20160131741 Park May 2016 A1
20160140872 Palmer et al. May 2016 A1
20160145776 Roh May 2016 A1
20160170491 Jung Jun 2016 A1
20160171293 Li et al. Jun 2016 A1
20160186366 McMaster Jun 2016 A1
20160206244 Rogers Jul 2016 A1
20160213331 Gil et al. Jul 2016 A1
20160216825 Forutanpour Jul 2016 A1
20160220152 Meriheina et al. Aug 2016 A1
20160234365 Alameh et al. Aug 2016 A1
20160249698 Berzowska et al. Sep 2016 A1
20160252607 Saboo et al. Sep 2016 A1
20160252965 Mandella et al. Sep 2016 A1
20160253044 Katz Sep 2016 A1
20160259037 Molchanov et al. Sep 2016 A1
20160262685 Wagner et al. Sep 2016 A1
20160282988 Poupyrev Sep 2016 A1
20160283101 Schwesig et al. Sep 2016 A1
20160284436 Fukuhara et al. Sep 2016 A1
20160287172 Morris et al. Oct 2016 A1
20160299526 Inagaki et al. Oct 2016 A1
20160306034 Trotta et al. Oct 2016 A1
20160320852 Poupyrev Nov 2016 A1
20160320853 Lien et al. Nov 2016 A1
20160320854 Lien et al. Nov 2016 A1
20160321428 Rogers Nov 2016 A1
20160338599 DeBusschere et al. Nov 2016 A1
20160345638 Robinson et al. Dec 2016 A1
20160349790 Connor Dec 2016 A1
20160349845 Poupyrev et al. Dec 2016 A1
20160377712 Wu et al. Dec 2016 A1
20170029985 Tajitsu et al. Feb 2017 A1
20170052618 Lee et al. Feb 2017 A1
20170060254 Molchanov et al. Mar 2017 A1
20170060298 Hwang et al. Mar 2017 A1
20170075481 Chou et al. Mar 2017 A1
20170075496 Rosenberg et al. Mar 2017 A1
20170097413 Gillian et al. Apr 2017 A1
20170097684 Lien Apr 2017 A1
20170115777 Poupyrev Apr 2017 A1
20170124407 Micks et al. May 2017 A1
20170125940 Karagozler et al. May 2017 A1
20170168630 Khoshkava et al. Jun 2017 A1
20170192523 Poupyrev Jul 2017 A1
20170192629 Takada et al. Jul 2017 A1
20170196513 Longinotti-Buitoni et al. Jul 2017 A1
20170232538 Robinson et al. Aug 2017 A1
20170233903 Jeon Aug 2017 A1
20170249033 Podhajny et al. Aug 2017 A1
20170322633 Shen et al. Nov 2017 A1
20170325337 Karagozler et al. Nov 2017 A1
20170325518 Poupyrev et al. Nov 2017 A1
20170329412 Schwesig et al. Nov 2017 A1
20170329425 Karagozler et al. Nov 2017 A1
20180000354 DeBusschere et al. Jan 2018 A1
20180000355 DeBusschere et al. Jan 2018 A1
20180004301 Poupyrev Jan 2018 A1
20180005766 Fairbanks et al. Jan 2018 A1
20180046258 Poupyrev Feb 2018 A1
20180106897 Shouldice et al. Apr 2018 A1
20180113032 Dickey et al. Apr 2018 A1
20180157330 Gu et al. Jun 2018 A1
20180160943 Fyfe et al. Jun 2018 A1
20180177464 DeBusschere et al. Jun 2018 A1
20180196527 Poupyrev et al. Jul 2018 A1
20180256106 Rogers et al. Sep 2018 A1
20180296163 DeBusschere et al. Oct 2018 A1
20190033981 Poupyrev Jan 2019 A1
20190232156 Amihood et al. Aug 2019 A1
20190257939 Schwesig et al. Aug 2019 A1
20190321719 Gillian et al. Oct 2019 A1
20200089314 Poupyrev et al. Mar 2020 A1
20200264765 Poupyrev et al. Aug 2020 A1
Foreign Referenced Citations (98)
Number Date Country
1462382 Dec 2003 CN
101751126 Jun 2010 CN
102184020 Sep 2011 CN
102414641 Apr 2012 CN
102473032 May 2012 CN
102782612 Nov 2012 CN
102893327 Jan 2013 CN
202887794 Apr 2013 CN
103076911 May 2013 CN
103502911 Jan 2014 CN
102660988 Mar 2014 CN
104035552 Sep 2014 CN
104838336 Aug 2015 CN
103355860 Jan 2016 CN
102011075725 Nov 2012 DE
102013201359 Jul 2014 DE
0161895 Nov 1985 EP
1785744 May 2007 EP
1815788 Aug 2007 EP
2417908 Feb 2012 EP
2637081 Sep 2013 EP
2770408 Aug 2014 EP
2953007 Dec 2015 EP
3201726 Aug 2017 EP
3017722 Aug 2015 FR
2070469 Sep 1981 GB
2443208 Apr 2008 GB
113860 Apr 1999 JP
11168268 Jun 1999 JP
2003280049 Oct 2003 JP
2006234716 Sep 2006 JP
2007011873 Jan 2007 JP
2007132768 May 2007 JP
2007266772 Oct 2007 JP
2008287714 Nov 2008 JP
2008293501 Dec 2008 JP
2009037434 Feb 2009 JP
2011102457 May 2011 JP
201218583 Sep 2012 JP
2012198916 Oct 2012 JP
2012208714 Oct 2012 JP
2013196047 Sep 2013 JP
2013251913 Dec 2013 JP
2014532332 Dec 2014 JP
1020080102516 Nov 2008 KR
100987650 Oct 2010 KR
20140027837 Mar 2014 KR
1020140055985 May 2014 KR
101914850 Oct 2018 KR
201425974 Jul 2014 TW
9001895 Mar 1990 WO
0130123 Apr 2001 WO
2001027855 Apr 2001 WO
0175778 Oct 2001 WO
2002082999 Oct 2002 WO
2004004557 Jan 2004 WO
2004053601 Jun 2004 WO
2005033387 Apr 2005 WO
2005103863 Nov 2005 WO
2007125298 Nov 2007 WO
2008061385 May 2008 WO
2009032073 Mar 2009 WO
2009083467 Jul 2009 WO
2010032173 Mar 2010 WO
2010101697 Sep 2010 WO
2012026013 Mar 2012 WO
2012064847 May 2012 WO
2012152476 Nov 2012 WO
2013082806 Jun 2013 WO
2013084108 Jun 2013 WO
2013137412 Sep 2013 WO
2013154864 Oct 2013 WO
2013186696 Dec 2013 WO
2013191657 Dec 2013 WO
2013192166 Dec 2013 WO
2014019085 Feb 2014 WO
2014085369 Jun 2014 WO
2014116968 Jul 2014 WO
2014124520 Aug 2014 WO
2014136027 Sep 2014 WO
2014138280 Sep 2014 WO
2014160893 Oct 2014 WO
2014165476 Oct 2014 WO
2014204323 Dec 2014 WO
2015017931 Feb 2015 WO
2015022671 Feb 2015 WO
2015149049 Oct 2015 WO
2016053624 Apr 2016 WO
2016118534 Jul 2016 WO
2016176471 Nov 2016 WO
2016176600 Nov 2016 WO
2016178797 Nov 2016 WO
2017019299 Feb 2017 WO
2017062566 Apr 2017 WO
2017200570 Nov 2017 WO
2017200571 Nov 2017 WO
20170200949 Nov 2017 WO
2018106306 Jun 2018 WO
Non-Patent Literature Citations (323)
Entry
David P. Duncan, “Motion Compensation of Synthetic Aperture Radar”, Microwave Earth Remote Sensing Laboratory, Brigham Young University, Apr. 15, 2003, 5 pages.
“Written Opinion”, PCT Application No. PCT/US2016/065295, dated Apr. 13, 2018, 8 pages.
“First Examination Report”, GB Application No. 1621332.4, dated May 16, 2017, 7 pages.
“International Search Report and Written Opinion”, PCT Application No. PCT/US2016/065295, dated Mar. 14, 2017, 12 pages.
Antonimuthu, “Google's Project Soli brings Gesture Control to Wearables using Radar”, YouTube [online], Available from https://www.youtube.com/watch?v=czJfcgvQcNA as accessed on May 9, 2017; See whole video, especially 6:05-6:35.
“Preliminary Report on Patentability”, PCT Application No. PCT/US2016/065295, dated Jul. 24, 2018, 18 pages.
“Advisory Action”, U.S. Appl. No. 14/504,139, dated Aug. 28, 2017, 3 pages.
“Apple Watch Used Four Sensors to Detect your Pulse”, retrieved from http://www.theverge.com/2014/9/9/6126991/apple-watch-four-back-sensors-detect-activity on Sep. 23, 2017 as cited in PCT search report for PCT Application No. PCT/US2016/026756 dated Nov. 10, 2017; The Verge, paragraph 1, Sep. 9, 2014, 4 pages.
“Cardio”, Retrieved From: <http://www.cardiio.com/> Apr. 15, 2015 App Information Retrieved From: <https://itunes.apple.com/us/app/cardiio-touchless-camera-pulse/id542891434?Is=1&mt=8> Apr. 15, 2015, Feb. 24, 2015, 6 pages.
“Clever Toilet Checks on Your Health”, CNN.Com; Technology, Jun. 28, 2005, 2 pages.
“Combined Search and Examination Report”, GB Application No. 1620892.8, dated Apr. 6, 2017, 5 pages.
“Combined Search and Examination Report”, GB Application No. 1620891.0, dated May 31, 2017, 9 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 15/362,359, dated Sep. 17, 2018, 10 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/582,896, dated Dec. 19, 2016, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/504,061, dated Dec. 27, 2016, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/582,896, dated Feb. 6, 2017, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/582,896, dated Feb. 23, 2017, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/930,220, dated Mar. 20, 2017, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/930,220, dated May 11, 2017, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/312,486, dated Oct. 28, 2016, 4 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/312,486, dated Jan. 23, 2017, 4 pages.
“Extended European Search Report”, EP Application No. 15170577.9, dated Nov. 5, 2015, 12 pages.
“Final Office Action”, U.S. Appl. No. 14/504,061, dated Mar. 9, 2016, 10 pages.
“Final Office Action”, U.S. Appl. No. 14/681,625, dated Dec. 7, 2016, 10 pages.
“Final Office Action”, U.S. Appl. No. 15/287,253, dated Apr. 2, 2019, 10 pages.
“Final Office Action”, U.S. Appl. No. 15/398,147, dated Jun. 30, 2017, 11 pages.
“Final Office Action”, U.S. Appl. No. 14/959,799, dated Jul. 19, 2017, 12 pages.
“Final Office Action”, U.S. Appl. No. 14/731,195, dated Oct. 11, 2018, 12 pages.
“Final Office Action”, U.S. Appl. No. 15/595,649, dated May 23, 2018, 13 pages.
“Final Office Action”, U.S. Appl. No. 14/715,454, dated Sep. 7, 2017, 14 pages.
“Final Office Action”, U.S. Appl. No. 14/504,139, dated May 1, 2018, 14 pages.
“Final Office Action”, U.S. Appl. No. 15/286,512, dated Dec. 26, 2018, 15 pages.
“Final Office Action”, U.S. Appl. No. 15/142,619, dated Feb. 8, 2018, 15 pages.
“Final Office Action”, U.S. Appl. No. 14/504,121, dated Aug. 8, 2017, 16 pages.
“Final Office Action”, U.S. Appl. No. 14/959,730, dated Nov. 22, 2017, 16 pages.
“Final Office Action”, U.S. Appl. No. 15/142,689, dated Jun. 1, 2018, 16 pages.
“Final Office Action”, U.S. Appl. No. 14/959,799, dated Jan. 4, 2018, 17 pages.
“Final Office Action”, U.S. Appl. No. 14/720,632, dated Jan. 9, 2018, 18 pages.
“Final Office Action”, U.S. Appl. No. 14/518,863, dated May 5, 2017, 18 pages.
“Final Office Action”, U.S. Appl. No. 14/959,901, dated Aug. 25, 2017, 19 pages.
“Final Office Action”, U.S. Appl. No. 15/093,533, dated Mar. 21, 2018, 19 pages.
“Final Office Action”, U.S. Appl. No. 14/715,454, dated Apr. 17, 2018, 19 pages.
“Final Office Action”, U.S. Appl. No. 14/518,863, dated Apr. 5, 2018, 21 pages.
“Final Office Action”, U.S. Appl. No. 14/959,901, dated Jun. 15, 2018, 21 pages.
“Final Office Action”, U.S. Appl. No. 15/287,308, dated Feb. 8, 2019, 23 pages.
“Final Office Action”, U.S. Appl. No. 14/599,954, dated Aug. 10, 2016, 23 pages.
“Final Office Action”, U.S. Appl. No. 14/504,038, dated Sep. 27, 2016, 23 pages.
“Final Office Action”, U.S. Appl. No. 14/504,121, dated Jul. 9, 2018, 23 pages.
“Final Office Action”, U.S. Appl. No. 15/286,152, dated Jun. 26, 2018, 25 pages.
“Final Office Action”, U.S. Appl. No. 15/403,066, dated Oct. 5, 2017, 31 pages.
“Final Office Action”, U.S. Appl. No. 15/267,181, dated Jun. 7, 2018, 31 pages.
“Final Office Action”, U.S. Appl. No. 14/312,486, dated Jun. 3, 2016, 32 pages.
“Final Office Action”, U.S. Appl. No. 15/166,198, dated Sep. 27, 2018, 33 pages.
“Final Office Action”, U.S. Appl. No. 14/699,181, dated May 4, 2018, 41 pages.
“Final Office Action”, U.S. Appl. No. 14/715,793, dated Sep. 12, 2017, 7 pages.
“Final Office Action”, U.S. Appl. No. 14/809,901, dated Dec. 13, 2018, 7 pages.
“Final Office Action”, U.S. Appl. No. 14/874,955, dated Jun. 30, 2017, 9 pages.
“Final Office Action”, U.S. Appl. No. 14/874,955, dated Jun. 11, 2018, 9 pages.
“First Action Interview OA”, U.S. Appl. No. 14/715,793, dated Jun. 21, 2017, 3 pages.
“First Action Interview Office Action”, U.S. Appl. No. 14/959,901, dated Apr. 14, 2017, 3 pages.
“First Action Interview Office Action”, U.S. Appl. No. 14/731,195, dated Jun. 21, 2018, 4 pages.
“First Action Interview Office Action”, U.S. Appl. No. 15/286,152, dated Mar. 1, 2018, 5 pages.
“First Action Interview Office Action”, U.S. Appl. No. 15/166,198, dated Apr. 25, 2018, 8 pages.
“First Action Interview Pilot Program Pre-Interview Communication”, U.S. Appl. No. 14/731,195, dated Aug. 1, 2017, 3 pages.
“Foreign Office Action”, Chinese Application No. 201580034536.8, dated Oct. 9, 2018.
“Foreign Office Action”, KR Application No. 10-2016-7036023, dated Aug. 11, 2017, 10 pages.
“Foreign Office Action”, Japanese Application No. 2018-501256, dated Jul. 24, 2018, 11 pages.
“Foreign Office Action”, Chinese Application No. 201580036075.8, dated Jul. 4, 2018, 14 pages.
“Foreign Office Action”, CN Application No. 201580034908.7, dated Jul. 3, 2018, 17 pages.
“Foreign Office Action”, Chinese Application No. 201721290290.3, dated Mar. 9, 2018, 2 pages.
“Foreign Office Action”, JP App. No. 2016-567813, dated Jan. 16, 2018, 3 pages.
“Foreign Office Action”, Korean Application No. 10-2016-7036015, dated Oct. 15, 2018, 3 pages.
“Foreign Office Action”, Japanese Application No. 2018501256, dated Feb. 26, 2019, 3 pages.
“Foreign Office Action”, Japanese Application No. 2016-567839, dated Apr. 3, 2018, 3 pages.
“Foreign Office Action”, European Application No. 16784352.3, dated May 16, 2018, 3 pages.
“Foreign Office Action”, Chinese Application No. 201721290290.3, dated Jun. 6, 2018, 3 pages.
“Foreign Office Action”, KR Application No. 10-2016-7035397, dated Sep. 20, 2017, 5 pages.
“Foreign Office Action”, UK Application No. 1620891.0, dated Dec. 6, 2018, 5 pages.
“Foreign Office Action”, Chinese Application No. 201580036075.8, dated Feb. 19, 2019, 5 pages.
“Foreign Office Action”, Korean Application No. 1020187012629, dated May 24, 2018, 6 pages.
“Foreign Office Action”, EP Application No. 15170577.9, dated May 30, 2017, 7 pages.
“Foreign Office Action”, Korean Application No. 10-2016-7036396, dated Jan. 3, 2018, 7 pages.
“Foreign Office Action”, JP Application No. 2016567813, dated Sep. 22, 2017, 8 pages.
“Foreign Office Action”, Japanese Application No. 2018021296, dated Dec. 25, 2018, 8 pages.
“Foreign Office Action”, EP Application No. 15754323.2, dated Mar. 9, 2018, 8 pages.
“Frogpad Introduces Wearable Fabric Keyboard with Bluetooth Technology”, Retrieved From: <http://www.geekzone.co.nz/content.asp?contentid=3898> Mar. 16, 2015, Jan. 7, 2005, 2 pages.
“International Preliminary Report on Patentability”, PCT Application No. PCT/US2016/063874, dated Nov. 29, 2018, 12 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/030388, dated Dec. 15, 2016, 12 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/043963, dated Feb. 16, 2017, 12 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/050903, dated Apr. 13, 2017, 12 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/043949, dated Feb. 16, 2017, 13 pages.
“International Preliminary Report on Patentability”, PCT Application No. PCT/US2017/032733, dated Nov. 29, 2018, 7 pages.
“International Preliminary Report on Patentability”, PCT Application No. PCT/US2016/026756, dated Oct. 19, 2017, 8 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/044774, dated Mar. 2, 2017, 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/060399, dated Jan. 30, 2017, 11 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/044774, dated Nov. 3, 2015, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/042013, dated Oct. 26, 2016, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/062082, dated Feb. 23, 2017, 12 pages.
“International Search Report and Written Opinion”, PCT/US2017/047691, dated Nov. 16, 2017, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/024267, dated Jun. 20, 2016, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/024273, dated Jun. 20, 2016, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/032307, dated Aug. 25, 2016, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/034366, dated Nov. 17, 2016, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/029820, dated Jul. 15, 2016, 14 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/055671, dated Dec. 1, 2016, 14 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/030177, dated Aug. 2, 2016, 15 pages.
“International Search Report and Written Opinion”, PCT Application No. PCT/US2017/051663, dated Nov. 29, 2017, 16 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/043963, dated Nov. 24, 2015, 16 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/024289, dated Aug. 25, 2016, 17 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/043949, dated Dec. 1, 2015, 18 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/050903, dated Feb. 19, 2016, 18 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/030115, dated Aug. 8, 2016, 18 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/063874, dated May 11, 2017, 19 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/033342, dated Oct. 27, 2016, 20 pages.
“Life:X Lifestyle eXplorer”, Retrieved from <https://web.archive.org/web/20150318093841/http://research.microsoft.com/en-us/projects/lifex>, Feb. 3, 2017, 2 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/596,702, dated Jan. 4, 2019, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/286,837, dated Oct. 26, 2018, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/504,139, dated Jan. 27, 2017, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/959,799, dated Jan. 27, 2017, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/398,147, dated Mar. 9, 2017, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/504,139, dated Oct. 18, 2017, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/287,155, dated Dec. 10, 2018, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/666,155, dated Feb. 3, 2017, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/504,121, dated Jan. 9, 2017, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/809,901, dated May 24, 2018, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/959,730, dated Jun. 23, 2017, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/862,409, dated Jun. 22, 2017, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/930,220, dated Sep. 14, 2016, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/286,512, dated Jul. 19, 2018, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/142,829, dated Aug. 16, 2018, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/720,632, dated Jun. 14, 2017, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/142,619, dated Aug. 25, 2017, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/959,799, dated Sep. 8, 2017, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/715,454, dated Jan. 11, 2018, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/595,649, dated Oct. 31, 2017, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/504,139, dated Oct. 5, 2018, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/518,863, dated Oct. 14, 2016, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/599,954, dated Jan. 26, 2017, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/862,409, dated Dec. 14, 2017, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/599,954, dated Feb. 2, 2016, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/287,253, dated Apr. 5, 2018, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/093,533, dated Aug. 24, 2017, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/142,689, dated Oct. 4, 2017, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/287,308, dated Oct. 15, 2018, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/286,537, dated Nov. 19, 2018, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/504,121, dated Jan. 2, 2018, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/287,253, dated Sep. 7, 2018, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/518,863, dated Sep. 29, 2017, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/720,632, dated May 18, 2018, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/959,901, dated Jan. 8, 2018, 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/959,901, dated Oct. 11, 2018, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/504,038, dated Feb. 26, 2016, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/312,486, dated Oct. 23, 2015, 25 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/286,152, dated Oct. 19, 2018, 27 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/267,181, dated Feb. 8, 2018, 29 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/403,066, dated May 4, 2017, 31 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/699,181, dated Oct. 18, 2017, 33 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/504,038, dated Mar. 22, 2017, 33 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/287,394, dated Mar. 22, 2019, 39 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/398,147, dated Sep. 8, 2017, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/874,955, dated Feb. 8, 2018, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/681,625, dated Mar. 6, 2017, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/586,174, dated Jun. 18, 2018, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/504,061, dated Nov. 4, 2015, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/874,955, dated Feb. 27, 2017, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/582,896, dated Jun. 29, 2016, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/681,625, dated Aug. 12, 2016, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/666,155, dated Aug. 24, 2016, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/513,875, dated Feb. 21, 2017, 9 pages.
“Non-Invasive Quantification of Peripheral Arterial Volume Distensibility and its Non-Linear Relationship with Arterial Pressure”, Journal of Biomechanics, Pergamon Press, vol. 42, No. 8; as cited in the search report for PCT/US2016/013968 citing the whole document, but in particular the abstract, dated May 29, 2009, 2 pages.
“Notice of Allowance”, U.S. Appl. No. 14/599,954, dated May 24, 2017, 11 pages.
“Notice of Allowance”, U.S. Appl. No. 14/312,486, dated Oct. 7, 2016, 15 pages.
“Notice of Allowance”, U.S. Appl. No. 14/504,038, dated Aug. 7, 2017, 17 pages.
“Notice of Allowance”, U.S. Appl. No. 15/403,066, dated Jan. 8, 2018, 18 pages.
“Notice of Allowance”, U.S. Appl. No. 15/287,200, dated Nov. 6, 2018, 19 pages.
“Notice of Allowance”, U.S. Appl. No. 15/286,152, dated Mar. 5, 2019, 23 pages.
“Notice of Allowance”, U.S. Appl. No. 14/715,793, dated Jul. 6, 2018, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 15/286,495, dated Jan. 17, 2019, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 15/595,649, dated Jan. 3, 2019, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 14/715,793, dated Dec. 18, 2017, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 14/666,155, dated Feb. 20, 2018, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 14/582,896, dated Nov. 7, 2016, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 15/586,174, dated Sep. 24, 2018, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 14/513,875, dated Jun. 28, 2017, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 14/666,155, dated Jul. 10, 2017, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 14/874,955, dated Oct. 20, 2017, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 14/504,061, dated Sep. 12, 2016, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 14/494,863, dated May 30, 2017, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 14/681,625, dated Jun. 7, 2017, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 15/286,837, dated Mar. 6, 2019, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 14/862,409, dated Jun. 6, 2018, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 15/362,359, dated Aug. 3, 2018, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 14/681,625, dated Oct. 23, 2017, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 14/874,955, dated Oct. 4, 2018, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 15/398,147, dated Nov. 15, 2017, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 14/959,730, dated Feb. 22, 2018, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 15/142,829, dated Feb. 6, 2019, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 14/930,220, dated Feb. 2, 2017, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 15/595,649, dated Sep. 14, 2018, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 15/343,067, dated Jul. 27, 2017, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 15/142,689, dated Oct. 30, 2018, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 14/504,137, dated Feb. 6, 2019, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 15/142,619, dated Aug. 13, 2018, 9 pages.
“Philips Vital Signs Camera”, Retrieved From: <http://www.vitalsignscamera.com/> Apr. 15, 2015, Jul. 17, 2013, 2 pages.
“Pre-Interview Communication”, U.S. Appl. No. 15/287,359, dated Jul. 24, 2018, 2 pages.
“Pre-Interview Communication”, U.S. Appl. No. 14/513,875, dated Oct. 21, 2016, 3 pages.
“Pre-Interview Communication”, U.S. Appl. No. 14/959,901, dated Feb. 10, 2017, 3 pages.
“Pre-Interview Communication”, U.S. Appl. No. 14/959,730, dated Feb. 15, 2017, 3 pages.
“Pre-Interview Communication”, U.S. Appl. No. 14/715,793, dated Mar. 20, 2017, 3 pages.
“Pre-Interview Communication”, U.S. Appl. No. 14/715,454, dated Apr. 14, 2017, 3 pages.
“Pre-Interview Communication”, U.S. Appl. No. 15/343,067, dated Apr. 19, 2017, 3 pages.
“Pre-Interview Communication”, U.S. Appl. No. 15/286,495, dated Sep. 10, 2018, 4 pages.
“Pre-Interview Communication”, U.S. Appl. No. 15/362,359, dated May 17, 2018, 4 pages.
“Pre-Interview Communication”, U.S. Appl. No. 14/494,863, dated Jan. 27, 2017, 5 pages.
“Pre-Interview Communication”, U.S. Appl. No. 15/166,198, dated Mar. 8, 2018, 8 pages.
“Pre-Interview First Office Action”, U.S. Appl. No. 15/286,152, dated Feb. 8, 2018, 4 pages.
“Pre-Interview Office Action”, U.S. Appl. No. 14/862,409, dated Sep. 15, 2017, 16 pages.
“Pre-Interview Office Action”, U.S. Appl. No. 14/731,195, dated Dec. 20, 2017, 4 pages.
“Preliminary Report on Patentability”, PCT Application No. PCT/US2016/042013, dated Jan. 30, 2018, 7 pages.
“Preliminary Report on Patentability”, PCT Application No. PCT/US2016/062082, dated Nov. 15, 2018, 8 pages.
“Preliminary Report on Patentability”, PCT Application No. PCT/US2016/055671, dated Apr. 10, 2018, 9 pages.
“Preliminary Report on Patentability”, PCT Application No. PCT/US2016/032307, dated Dec. 7, 2017, 9 pages.
“Pressure-Volume Loop Analysis in Cardiology”, retrieved from https://en.wikipedia.org/w/index.php?title=Pressure-volume_loop_analysis_in_cardiology&oldid=636928657 on Sep. 23, 2017; Obtained per link provided in search report from PCT/US2016/01398 dated Jul. 28, 2016, Dec. 6, 2014, 10 pages.
“Restriction Requirement”, U.S. Appl. No. 15/362,359, dated Jan. 8, 2018, 5 pages.
“Restriction Requirement”, U.S. Appl. No. 14/666,155, dated Jul. 22, 2016, 5 pages.
“Restriction Requirement”, U.S. Appl. No. 15/462,957, dated Jan. 4, 2019, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 15/352,194, dated Feb. 6, 2019, 8 pages.
“Restriction Requirement”, U.S. Appl. No. 15/286,537, dated Aug. 27, 2018, 8 pages.
“Textile Wire Brochure”, Retrieved at: http://www.textile-wire.ch/en/home.html, Aug. 7, 2004, 17 pages.
“The Dash smart earbuds play back music, and monitor your workout”, Retrieved from <http://newatlas.com/bragi-dash-tracking-earbuds/30808/>, Feb. 13, 2014, 3 pages.
“The Instant Blood Pressure app estimates blood pressure with your smartphone and our algorithm”, Retrieved at: http://www.instantbloodpressure.com/ on Jun. 23, 2016, 6 pages.
“Thermofocus No Touch Forehead Thermometer”, Technimed, Internet Archive. Dec. 24, 2014. https://web.archive.org/web/20141224070848/http://www.tecnimed.it:80/thermofocus-forehead-thermometer-H1N1-swine-flu.html, Dec. 24, 2018, 4 pages.
“Written Opinion”, PCT Application No. PCT/US2017/032733, dated Jul. 24, 2017, 5 pages.
“Written Opinion”, PCT Application No. PCT/US2017/032733, dated Jul. 26, 2017, 5 pages.
“Written Opinion”, PCT Application No. PCT/US2016/042013, dated Feb. 2, 2017, 6 pages.
“Written Opinion”, PCT Application No. PCT/US2016/026756, dated Nov. 10, 2016, 7 pages.
“Written Opinion”, PCT Application No. PCT/US2016/055671, dated Apr. 13, 2017, 8 pages.
“Written Opinion”, PCT Application No. PCT/US2017/051663, dated Oct. 12, 2018, 8 pages.
“Written Opinion”, PCT Application No. PCT/US2016/013968, dated Jul. 28, 2016, 9 pages.
“Written Opinion”, PCT Application No. PCT/US2016/030177, dated Nov. 3, 2016, 9 pages.
Arbabian, Amin et al., “A 94GHz mm-Wave to Baseband Pulsed-Radar for Imaging and Gesture Recognition”, 2012 IEEE Symposium on VLSI Circuits Digest of Technical Papers, Jan. 1, 2012, 2 pages.
Balakrishnan, Guha et al., “Detecting Pulse from Head Motions in Video”, In Proceedings: CVPR '13 Proceedings of the 2013 IEEE Conference on Computer Vision and Pattern Recognition Available at: <http://people.csail.mit.edu/mrub/vidmag/papers/Balakrishnan_Detecting_Pulse_from_2013_CVPR_paper.pdf>, Jun. 23, 2013, 8 pages.
Bondade, Rajdeep et al., “A linear-assisted DC-DC hybrid power converter for envelope tracking RF power amplifiers”, 2014 IEEE Energy Conversion Congress and Exposition (ECCE), IEEE, Sep. 14, 2014, pp. 5769-5773, XP032680873, DOI: 10.1109/ECCE.2014.6954193, Sep. 14, 2014, 5 pages.
Cheng, Jingyuan “Smart Textiles: From Niche to Mainstream”, IEEE Pervasive Computing, pp. 81-84.
Couderc, Jean-Philippe et al., “Detection of Atrial Fibrillation using Contactless Facial Video Monitoring”, In Proceedings: Heart Rhythm Society, vol. 12, Issue 1 Available at: <http://www.heartrhythmjournal.com/article/S1547-5271(14)00924-2/pdf>, 7 pages.
Espina, Javier et al., “Wireless Body Sensor Network for Continuous Cuff-less Blood Pressure Monitoring”, International Summer School on Medical Devices and Biosensors, 2006, 5 pages.
Fan, Tenglong et al., “Wireless Hand Gesture Recognition Based on Continuous-Wave Doppler Radar Sensors”, IEEE Transactions on Microwave Theory and Techniques, Plenum, USA, vol. 64, No. 11, Nov. 1, 2016 (Nov. 1, 2016), pp. 4012-4012, XP011633246, ISSN: 0018-9480, DOI: 10.1109/TMTT.2016.2610427, Nov. 1, 2016, 9 pages.
Farringdon, Jonny et al., “Wearable Sensor Badge & Sensor Jacket for Context Awareness”, Third International Symposium on Wearable Computers, 7 pages.
Garmatyuk, Dmitriy S. et al., “Ultra-Wideband Continuous-Wave Random Noise Arc-SAR”, IEEE Transaction on Geoscience and Remote Sensing, vol. 40, No. 12, Dec. 2002, Dec. 2002, 10 pages.
Geisheimer, Jonathan L. et al., “A Continuous-Wave (CW) Radar for Gait Analysis”, IEEE 2001, 2001, 5 pages.
Godana, Bruhtesfa E. “Human Movement Characterization in Indoor Environment using GNU Radio Based Radar”, Retrieved at: http://repository.tudelft.nl/islandora/object/uuid:414e1868-dd00-4113-9989-4c213f1f7094?collection=education, Nov. 30, 2009, 100 pages.
Gürbüz, Sevgi Z. et al., “Detection and Identification of Human Targets in Radar Data”, Proc. SPIE 6567, Signal Processing, Sensor Fusion, and Target Recognition XVI, 656701, May 7, 2007, 12 pages.
He, David D. “A Continuous, Wearable, and Wireless Heart Monitor Using Head Ballistocardiogram (BCG) and Head Electrocardiogram (ECG) with a Nanowatt ECG Heartbeat Detection Circuit”, In Proceedings: Thesis, Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology Available at: <http://dspace.mit.edu/handle/1721.1/79221>, 137 pages.
Holleis, Paul et al., “Evaluating Capacitive Touch Input on Clothes”, Proceedings of the 10th International Conference on Human Computer Interaction, Jan. 1, 2008, 10 pages.
Ishijima, Masa “Unobtrusive Approaches to Monitoring Vital Signs at Home”, Medical & Biological Engineering and Computing, Springer, Berlin, DE, vol. 45, No. 11, as cited in search report for PCT/US2016/013968 dated Jul. 28, 2016, Sep. 26, 2007, 3 pages.
Klabunde, Richard E. “Ventricular Pressure-Volume Loop Changes in Valve Disease”, Retrieved From <https://web.archive.org/web/20101201185256/http://cvphysiology.com/Heart%20Disease/HD009.htm>, Dec. 1, 2010, 8 pages.
Kubota, Yusuke et al., “A Gesture Recognition Approach by using Microwave Doppler Sensors”, IPSJ SIG Technical Report, 2009 (6), Information Processing Society of Japan, Apr. 15, 2010, pp. 1-8, Apr. 15, 2010, 13 pages.
Lien, Jaime et al., “Soli: Ubiquitous Gesture Sensing with Millimeter Wave Radar”, ACM Transactions on Graphics (TOG), ACM, Us, vol. 35, No. 4, Jul. 11, 2016 (Jul. 11, 2016), pp. 1-19, XP058275791, ISSN: 0730-0301, DOI: 10.1145/2897824.2925953, Jul. 11, 2016, 19 pages.
Martinez-Garcia, Herminio et al., “Four-quadrant linear-assisted DC/DC voltage regulator”, Analog Integrated Circuits and Signal Processing, Springer New York LLC, US, vol. 88, No. 1, Apr. 23, 2016 (Apr. 23, 2016), pp. 151-160, XP035898949, ISSN: 0925-1030, DOI: 10.1007/S10470-016-0747-8, Apr. 23, 2016, 10 pages.
Matthews, Robert J. “Venous Pulse”, Retrieved at: http://www.rjmatthewsmd.com/Definitions/venous_pulse.htm on Nov. 30, 2016, Apr. 13, 2013, 7 pages.
Nakajima, Kazuki et al., “Development of Real-Time Image Sequence Analysis for Evaluating Posture Change and Respiratory Rate of a Subject in Bed”, In Proceedings: Physiological Measurement, vol. 22, No. 3 Retrieved From: <http://iopscience.iop.org/0967-3334/22/3/401/pdf/0967-3334_22_3_401.pdf> Feb. 27, 2015, 8 pages.
Otto, Chris et al., “System Architecture of a Wireless Body Area Sensor Network for Ubiquitous Health Monitoring”, Journal of Mobile Multimedia; vol. 1, No. 4, Jan. 10, 2006, 20 pages.
Palese, et al., “The Effects of Earphones and Music on the Temperature Measured by Infrared Tympanic Thermometer: Preliminary Results”, ORL—head and neck nursing: official journal of the Society of Otorhinolaryngology and Head-Neck Nurses 32.2, Jan. 1, 2013, pp. 8-12.
Patel, P C. et al., “Applications of Electrically Conductive Yarns in Technical Textiles”, International Conference on Power System Technology (POWERCON), Oct. 30, 2012, 6 pages.
Poh, Ming-Zher et al., “A Medical Mirror for Non-contact Health Monitoring”, In Proceedings: ACM SIGGRAPH Emerging Technologies Available at: <http://affect.media.mit.edu/pdfs/11.Poh-etal-SIGGRAPH.pdf>, Jan. 1, 2011, 1 page.
Poh, Ming-Zher et al., “Non-contact, Automated Cardiac Pulse Measurements Using Video Imaging and Blind Source Separation.”, In Proceedings: Optics Express, vol. 18, No. 10 Available at: <http://www.opticsinfobase.org/view_article.cfm?gotourl=http%3A%2F%2Fwww%2Eopticsinfobase%2Eorg%2FDirectPDFAccess%2F77B04D55%2DBC95%2D6937%2D5BAC49A426378C02%5F199381%2Foe%2D18%2D10%2D10762%2Ep, May 7, 2010, 13 pages.
Pu, Qifan et al., “Gesture Recognition Using Wireless Signals”, pp. 15-18.
Pu, Qifan et al., “Whole-Home Gesture Recognition Using Wireless Signals”, MobiCom'13, Sep. 30-Oct. 4, Miami, FL, USA, 2013, 12 pages.
Pu, Qifan et al., “Whole-Home Gesture Recognition Using Wireless Signals”, Proceedings of the 19th annual international conference on Mobile computing & networking (MobiCom'13), US, ACM, Sep. 30, 2013, pp. 27-38, Sep. 30, 2013, 12 pages.
Pu, Qifan et al., “Whole-Home Gesture Recognition Using Wireless Signals”, MobiCom '13 Proceedings of the 19th annual international conference on Mobile computing & networking, Aug. 27, 2013, 12 pages.
Schneegass, Stefan et al., “Towards a Garment OS: Supporting Application Development for Smart Garments”, Wearable Computers, ACM, Sep. 13, 2014, 6 pages.
Skolnik, Merrill I. “CW and Frequency-Modulated Radar”, In: “Introduction to Radar Systems”, Jan. 1, 1981 (Jan. 1, 1981), McGraw Hill, XP055047545, ISBN: 978-0-07-057909-5, pp. 68-100, p. 95-p. 97, Jan. 1, 1981, 18 pages.
Stoppa, Matteo “Wearable Electronics and Smart Textiles: A Critical Review”, In Proceedings of Sensors, vol. 14, Issue 7, Jul. 7, 2014, pp. 11957-11992.
Wang, Wenjin et al., “Exploiting Spatial Redundancy of Image Sensor for Motion Robust rPPG”, In Proceedings: IEEE Transactions on Biomedical Engineering, vol. 62, Issue 2, Jan. 19, 2015 , 11 pages.
Wang, Yazhou et al., “Micro-Doppler Signatures for Intelligent Human Gait Recognition Using a UWB Impulse Radar”, 2011 IEEE International Symposium on Antennas and Propagation (APSURSI), Jul. 3, 2011, pp. 2103-2106.
Wijesiriwardana, R et al., “Capacitive Fibre-Meshed Transducer for Touch & Proximity Sensing Applications”, IEEE Sensors Journal, IEEE Service Center, Oct. 1, 2005, 5 pages.
Zhadobov, Maxim et al., “Millimeter-Wave Interactions with the Human Body: State of Knowledge and Recent Advances”, International Journal of Microwave and Wireless Technologies, p. 1 of 11, © Cambridge University Press and the European Microwave Association, 2011, doi:10.1017/S1759078711000122, 2011.
Zhadobov, Maxim et al., “Millimeter-wave Interactions with the Human Body: State of Knowledge and Recent Advances”, International Journal of Microwave and Wireless Technologies, Mar. 1, 2011, 11 pages.
Zhang, Ruquan et al., “Study of the Structural Design and Capacitance Characteristics of Fabric Sensor”, Advanced Materials Research (vols. 194-196), Feb. 21, 2011 , 8 pages.
Zheng, Chuan et al., “Doppler Bio-Signal Detection Based Time-Domain Hand Gesture Recognition”, 2013 IEEE MTT-S International Microwave Workshop Series on RF and Wireless Technologies for Biomedical and Healthcare Applications (IMWS-BIO), IEEE, Dec. 9, 2013 (Dec. 9, 2013), p. 3, XP032574214, DOI: 10.1109/IMWS-BIO.2013.6756200, Dec. 9, 2013, 3 Pages.
“EP Appeal Decision”, European Application No. 10194359.5, dated May 28, 2019, 20 pages.
“Final Office Action”, U.S. Appl. No. 15/287,394, dated Sep. 30, 2019, 38 Pages.
“Foreign Office Action”, Korean Application No. 1020197004803, dated Oct. 14, 2019, 2 pages.
“Foreign Office Action”, Japanese Application No. 2018501256, dated Oct. 23, 2019, 5 pages.
“Foreign Office Action”, British Application No. 1621332.4, dated Nov. 6, 2019, 3 pages.
“Foreign Office Action”, Japanese Application No. 2018156138, dated Sep. 30, 2019, 3 pages.
“Galaxy S4 Air Gesture”, Galaxy S4 Guides, https://allaboutgalaxys4.com/galaxy-s4-features-explained/air-gesture/, 4 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/791,044, dated Sep. 30, 2019, 22 Pages.
“Non-Final Office Action”, U.S. Appl. No. 15/596,702, dated Oct. 21, 2019, 21 Pages.
“Samsung Galaxy S4 Air Gestures”, Video from https://www.youtube.com/watch?v=375Hb87yGcg, May 7, 2013.
Amihood, et al., “Closed-Loop Manufacturing System Using Radar”, Technical Disclosure Commons; Retrieved from http://www.tdcommons.org/dpubs_series/464, Apr. 17, 2017, 8 pages.
Karagozler, et al., “Embedding Radars in Robots to Accurately Measure Motion”, Technical Disclosure Commons; Retrieved from http://www.tdcommons.org/dpubs_series/454, Mar. 30, 2017, 8 pages.
Lien, et al., “Embedding Radars in Robots for Safety and Obstacle Detection”, Technical Disclosure Commons; Retrieved from http://www.tdcommons.org/dpubs_series/455, Apr. 2, 2017, 10 pages.
“Final Office Action”, U.S. Appl. No. 15/287,155, dated Apr. 10, 2019, 11 pages.
“Final Office Action”, U.S. Appl. No. 15/286,537, dated Apr. 19, 2019, 21 pages.
“Final Office Action”, U.S. Appl. No. 15/596,702, dated Jun. 13, 2019, 21 pages.
“Foreign Office Action”, Japanese Application No. 2018156138, dated May 22, 2019, 3 pages.
“Foreign Office Action”, Korean Application No. 1020197004803, dated Apr. 26, 2019, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 15/286,512, dated Apr. 9, 2019, 14 pages.
“Final Office Action”, U.S. Appl. No. 15/287,359, dated Feb. 19, 2020, 16 Pages.
“Foreign Office Action”, Korean Application No. 1020187004283, dated Jan. 3, 2020, 8 pages.
“Foreign Office Action”, Chinese Application No. 201611159870.9, dated Dec. 17, 2019, 15 pages.
“Foreign Office Action”, Korean Application No. 1020197004803, dated Dec. 6, 2019, 2 pages.
“Notice of Allowance”, U.S. Appl. No. 15/791,044, dated Feb. 12, 2020, 8 Pages.
“Notice of Allowance”, U.S. Appl. No. 15/287,394, dated Mar. 4, 2020, 11 Pages.
“Final Office Action”, U.S. Appl. No. 15/596,702, dated Apr. 14, 2020, 27 Pages.
“Foreign Office Action”, Japanese Application No. 2018156138, dated Apr. 22, 2020, 3 pages.
“Notice of Allowance”, U.S. Appl. No. 16/401,611, dated Jun. 10, 2020, 17 Pages.
“Pre-Interview Communication”, U.S. Appl. No. 16/401,611, dated Apr. 13, 2020, 4 Pages.
“Pre-Interview Communication”, U.S. Appl. No. 16/380,245, dated Jun. 15, 2020, 3 Pages.
“Extended European Search Report”, European Application No. 19164113.3, dated Jun. 13, 2019, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/286,537, dated Sep. 3, 2019, 28 Pages.
“Notice of Allowance”, U.S. Appl. No. 15/287,308, dated Jul. 17, 2019, 17 Pages.
“Notice of Allowance”, U.S. Appl. No. 15/287,253, dated Aug. 26, 2019, 13 Pages.
“Notice of Allowance”, U.S. Appl. No. 15/287,155, dated Jul. 25, 2019, 7 pages.
“First Action Interview Office Action”, U.S. Appl. No. 16/080,293, Jul. 23, 2020, 3 Pages.
“Foreign Office Action”, British Application No. 1621192.2, Jun. 17, 2020, 5 pages.
“Foreign Office Action”, Chinese Application No. 201680038897.4, Jun. 29, 2020, 28 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/287,359, Jun. 26, 2020, 19 Pages.
“Non-Final Office Action”, U.S. Appl. No. 16/503,234, Aug. 5, 2020, 18 Pages.
“Non-Final Office Action”, U.S. Appl. No. 15/596,702, Aug. 19, 2020, 27 Pages.
“Pre-Interview Communication”, U.S. Appl. No. 16/080,293, Jun. 25, 2020, 3 Pages.
“Search Report”, UK Application No. 2007255.9, Jul. 6, 2020, 1 page.
Provisional Applications (1)
Number Date Country
62237975 Oct 2015 US