Radar-based object detection for vehicles

Information

  • Patent Grant
  • Patent Number
    10,459,080
  • Date Filed
    Thursday, October 6, 2016
  • Date Issued
    Tuesday, October 29, 2019
Abstract
This document describes techniques and devices for radar-based object detection for vehicles. A radar-based object detection component implemented in a vehicle is configured to detect characteristics of persons within the vehicle, such as a driver or other passengers. Based on the detected characteristics, an activity of the person can be determined and various operations can be initiated based on the activity, such as initiating a warning when the driver is not paying attention to driving or automatically slowing down the vehicle. In some cases, the radar-based object detection component can also be implemented to detect characteristics of objects positioned external to the vehicle, such as pedestrians, other vehicles, or objects in the road. The radar-based object detection component may also be implemented to authenticate a driver of the vehicle, such as by detecting biometric characteristics of the driver or recognizing a series of gestures corresponding to an authentication sequence.
Description
BACKGROUND

This background description is provided for the purpose of generally presenting the context of the disclosure. Unless otherwise indicated herein, material described in this section is neither expressly nor impliedly admitted to be prior art to the present disclosure or the appended claims.


Vehicles, such as automobiles, boats, or aircraft, can be dangerous when the driver of the vehicle fails to pay attention to driving. This may also be the case in autonomous driving experiences, where the driver may be relied upon as a fallback mechanism in the event that the autonomous driving system fails or is unable to handle a particular type of navigation.


SUMMARY

This document describes techniques and devices for radar-based object detection for vehicles. The techniques describe a radar-based object detection component implemented in a vehicle that is configured to detect characteristics of persons within the vehicle, such as a driver or other passengers. Then, based on the detected characteristics, an activity of the person can be determined and various operations can be initiated based on the activity, such as initiating a warning when the driver is not paying attention to driving, automatically slowing down the vehicle, and so forth. In some cases, the radar-based object detection component can also be implemented to detect characteristics of objects positioned external to the vehicle, such as pedestrians, other vehicles, foreign objects in the road, and so forth. The radar-based object detection component may also be implemented to authenticate a driver of the vehicle, such as by detecting biometric characteristics of the driver or recognizing a series of gestures corresponding to an authentication sequence. This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter.





BRIEF DESCRIPTION OF THE DRAWINGS

Various aspects of radar-based object detection for vehicles are described with reference to the following drawings. The same numbers are used throughout the drawings to reference like features and components:



FIG. 1 is an illustration of an environment in an example implementation that is operable to employ radar-based object detection for a vehicle.



FIG. 2 illustrates an example implementation of the vehicle computing system of FIG. 1 in greater detail.



FIG. 3 illustrates an example of RF wave propagation, and a corresponding reflected wave propagation.



FIG. 4 illustrates an example environment in which multiple antennas are used to ascertain information about a target object.



FIG. 5 is a flow diagram depicting a procedure in an example implementation.



FIG. 6 illustrates various components of an example vehicle computing system that incorporates radar-based object detection for vehicles as described with reference to FIGS. 1-5.





DETAILED DESCRIPTION

Overview


This document describes techniques and devices for radar-based object detection for vehicles. The techniques describe a radar-based object detection component for a vehicle (e.g., an automobile, boat, or plane) that is configured to detect various characteristics of persons within the vehicle (e.g., a driver and passengers), as well as characteristics of objects external to the vehicle (e.g., pedestrians, other vehicles, or foreign objects in the road).


For example, the radar-based object detection component can monitor a presence and attention level of a driver of the vehicle while the vehicle is moving by initiating transmission of an outgoing RF signal via a radar-emitting element of a radar sensor, receiving, via an antenna of the radar sensor, an incoming RF signal generated by the outgoing RF signal reflecting off the driver of the vehicle, and analyzing the incoming RF signal to detect one or more characteristics of the driver. Such characteristics, for example, can include a position and movement of the driver's body or a specific body part, such as the driver's hands, mouth, eyes, and so forth.
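By way of illustration, analyzing an incoming RF signal builds on two generic radar relationships: target range follows from the pulse round-trip delay, and radial velocity follows from the Doppler shift of the reflected carrier. The sketch below is a minimal worked example of those relationships only; the 60 GHz carrier frequency is an assumed value, as the patent does not specify an operating frequency.

```python
C = 3.0e8  # speed of light, m/s

def range_from_round_trip(delay_s: float) -> float:
    """Target range from pulse round-trip delay (out and back, hence /2)."""
    return C * delay_s / 2.0

def radial_velocity(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Radial velocity of the target from the Doppler shift of the carrier."""
    return C * doppler_shift_hz / (2.0 * carrier_hz)

# A reflection arriving 4 ns after transmission is ~0.6 m away, roughly the
# distance from a dashboard-mounted sensor to the driver.
print(range_from_round_trip(4e-9))     # 0.6 (meters)
print(radial_velocity(533.0, 60e9))    # ~1.33 m/s (assumed 60 GHz carrier)
```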


Then, based on the detected characteristics, the radar-based object detection component can determine an activity of the driver. As described herein, an activity of the driver corresponds to an activity currently being performed by the driver, such as driving with one or both hands on the steering wheel, being awake, drowsy, or asleep, interacting with a mobile device (e.g., texting), looking straight ahead, sideways, or backwards, talking (e.g., to a passenger in the vehicle or during a phone call), sitting somewhere other than the driver's seat (e.g., in an autonomous driving experience), looking in the glove compartment, and so forth. Based on the determined activity, one or more operations can be initiated. Generally, the operations improve the driving experience, increase the safety of the driving experience, provide security for the vehicle, or control navigation of the vehicle.


For example, in some cases, the radar-based object detection component determines an attention level of the driver, based on the activity, and then initiates the one or more operations based on the determined attention level. For example, activities such as texting, looking backwards, or being drowsy or sleepy, may be indicative of a low attention level. In contrast, activities such as driving with both hands on the wheel are indicative of a high attention level. Thus, in some cases, a warning (e.g., audible, visual, or tactile) may be initiated, in response to determining that the driver has a low attention level, in order to alert the driver to pay attention to driving. In this way, the radar-based object detection component monitors the driver without requiring the driver's deliberate or conscious interaction with the system.


In one or more implementations, the radar-based object detection component monitors the presence and attention level of the driver in an autonomous or semi-autonomous vehicle system. Autonomous vehicles are developed to navigate and operate either unmanned or to assist a vehicle operator, and can utilize many different types of sensors, automation, robotics, and other computer-controlled systems and mechanisms. However, in many cases, the driver must still act as a “fallback mechanism” in order to handle driving duties in certain instances where the autonomous system fails or is unable to control navigation. In these cases, the radar-based object detection component may monitor the driver's presence and attention level to ensure that the driver is a suitable backup in the event that the autonomous system needs to switch over to the manual system. In the event that the driver's attention is low, or the driver is not present in the driver's seat, the component may initiate various warnings to ensure that the driver is reminded that he may be needed as a fallback mechanism. Furthermore, in some cases, the radar-based object detection component may prevent a transition from an autonomous driving mode to a manual driving mode if the attention level of the driver is below a threshold indicating that the driver is not paying attention or is not present in the driver's seat. As another example, if the system is in autonomous driving mode or cruise control mode, the component may detect if the driver has a low attention level, and in response, cause the vehicle to slow down while at the same time alerting the driver.
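By way of example and not limitation, the mode-transition gating described above reduces to a simple decision rule. In the sketch below, the threshold value and the controller responses are illustrative assumptions rather than values taken from the patent.

```python
ATTENTION_THRESHOLD = 0.6  # illustrative value, not specified by the patent

def gate_manual_handoff(attention_score: float, driver_in_seat: bool) -> str:
    """Decide whether an autonomous-to-manual transition may proceed."""
    if not driver_in_seat:
        return "deny handoff: driver absent; remain autonomous and warn"
    if attention_score < ATTENTION_THRESHOLD:
        return "deny handoff: low attention; slow vehicle and warn driver"
    return "allow handoff: transition to manual driving mode"

print(gate_manual_handoff(0.2, True))   # deny handoff: low attention; ...
print(gate_manual_handoff(0.9, True))   # allow handoff: ...
```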


Similar techniques may also be applied to passengers within the vehicle. For example, radar-emitting elements and antennas may be positioned in the rear of the vehicle to monitor a toddler or baby, and may provide status updates to the driver, such as to let the driver know that the baby is sleeping, waking up, or choking on a piece of food.


In one or more implementations, the radar-based object detection system may also include radar-emitting elements and antennas positioned on the exterior of the vehicle in order to sense and detect various external objects, such as pedestrians, other vehicles, debris or foreign objects in the road, objects on the side of the road, and so forth. The detection of such external objects may be used as part of the autonomous driving experience, as part of a “cruise control” experience, or in order to provide a warning (or automatic braking) when objects are detected in close proximity to the driving path (e.g., when a pedestrian steps into the road in front of the vehicle).


In one or more implementations, the radar-based object detection component is further configured to augment a keyless entry system to verify that a person is actually present, or as part of an authentication procedure to authenticate the driver as a known person permitted to drive the vehicle. For example, the component may prevent the car from being driven unless the driver is authenticated as a known person permitted to drive the vehicle. In order to authenticate the driver, the component may detect biometric characteristics of the driver (e.g., a height or skeletal structure) and/or a series of “in air” gestures corresponding to a specific authentication sequence.


Example Environment



FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ radar-based object detection in a vehicle. The illustrated environment 100 includes a vehicle computing system 102, which is configurable in a variety of ways and includes a radar-based object detection component 104, a radar module 106, and a vehicle controller 108. The vehicle computing system 102 may be incorporated as part of a vehicle 110, which in this example is illustrated as an automobile, but may include any type of vehicle such as a boat, plane, train, and so forth.


As used herein, the term “automobile” refers to a passenger vehicle designed for operation on roads and having one or more engines used to rotate wheels, causing the automobile to be propelled. Examples of automobiles include cars, trucks, sport utility vehicles, vans, and the like. In one or more implementations, vehicle 110 is “autonomous” or at least “partially autonomous”. Autonomous vehicles are developed to navigate and operate either unmanned or to assist a vehicle operator, and can utilize many different types of sensors, automation, robotics, and other computer-controlled systems and mechanisms.


In this example, radar-based object detection component 104 is a hardware component of vehicle computing system 102. The radar-based object detection component 104 is configurable to detect objects in three dimensions, such as to identify the object, an orientation of the object, and/or movement of the object.


In order to detect object characteristics, radar-based object detection component 104 includes one or more radar sensors 112. Generally, radar sensors 112 include one or more antennas that are configured to transmit one or more RF signals. As a transmitted signal reaches an object (e.g., the driver of vehicle 110), at least a portion reflects back to the radar sensor 112 and is processed, as further described below, in order to detect characteristics of the object. In some cases, each radar sensor 112 includes a radar-emitting element configured to transmit the RF signal, and an antenna configured to capture the reflections of the RF signal. However, the radar sensor can include different combinations of radar-emitting elements and antennas. For instance, a single antenna could be utilized to capture reflections from three different radar-emitting elements, or vice versa. The RF signals can have any suitable combination of energy level, carrier frequency, burst periodicity, pulse width, modulation type, waveform, phase relationship, and so forth. In some cases, some or all of the respective signals transmitted in the RF signals differ from one another to create a specific diversity scheme, such as a time diversity scheme that transmits multiple versions of a same signal at different points in time, a frequency diversity scheme that transmits signals using several different frequency channels, a space diversity scheme that transmits signals over different propagation paths, etc.
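The transmit-signal parameters enumerated above can be grouped into a single configuration, sketched below. The field names, types, and example values are illustrative assumptions; the patent does not define a concrete configuration structure.

```python
from dataclasses import dataclass
from enum import Enum

class Diversity(Enum):
    TIME = "time"         # multiple versions of a signal at different times
    FREQUENCY = "freq"    # several different frequency channels
    SPACE = "space"       # different propagation paths

@dataclass
class TxConfig:
    carrier_hz: float       # carrier frequency
    power_dbm: float        # energy level
    pulse_width_s: float    # pulse width
    burst_period_s: float   # burst periodicity
    modulation: str         # modulation type, e.g. "pulsed" or "FMCW"
    phase_rad: float        # phase relationship
    diversity: Diversity    # diversity scheme

cfg = TxConfig(60e9, 10.0, 1e-7, 1e-3, "pulsed", 0.0, Diversity.TIME)
```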


Radar-based object detection component 104 can be implemented with just one, or multiple radar sensors 112. For example, in some cases, a single radar sensor 112 can be positioned proximate the driver's seat of the vehicle 110 (e.g., on the steering wheel or dashboard) in order to capture characteristics of the driver of vehicle 110. In other cases, multiple radar sensors 112 can be positioned throughout the interior of vehicle 110 in order to detect characteristics of the driver, as well as other passengers within the vehicle. In this example, radar sensors 112 are also shown as being positioned on the exterior of vehicle 110 in order to detect characteristics of objects external to vehicle 110, such as pedestrians, other vehicles, or foreign objects (e.g., trees, buildings, or debris in the road). For example, radar sensors 112 could be positioned on the front of the exterior of the vehicle in order to detect characteristics of objects within the path of the moving vehicle, as well as on the sides and rear of the exterior of vehicle 110.


The radar module 106 is representative of functionality to detect the presence or activity of persons within the vehicle 110, or objects external to the vehicle 110, and to initiate various operations based on the detection. For example, the radar module 106 may receive inputs from the radar sensors 112 that are usable to detect characteristics or attributes to identify an object (e.g., the driver of vehicle 110, a passenger, or objects located outside of the vehicle), orientation of the object, and/or movement of the object. Based on recognition of a combination of one or more of the characteristics or attributes, the radar module 106 may initiate an operation.


When radar sensors 112 are positioned proximate to the driver of vehicle 110, the radar-based object detection component 104 can monitor a presence and attention level of a driver of vehicle 110 while the vehicle is moving by initiating transmission of an outgoing RF signal via the radar sensors 112 (e.g., via a radar-emitting element), and receiving, via the radar sensor 112 (e.g., via an antenna), an incoming RF signal generated by the outgoing RF signal reflecting off the driver of the vehicle 110. Radar-based object detection component 104 can then analyze the incoming RF signal to detect one or more characteristics of the driver. Such characteristics, for example, can include a position and movement of the driver's body or a specific body part, such as the driver's hands, mouth, eyes, and so forth.


Then, based on the detected characteristics, the radar module 106 can determine an activity of the driver. As described herein, an activity of the driver corresponds to an activity currently being performed by the driver, such as driving with one or both hands on the steering wheel, being awake, drowsy, or asleep, interacting with a mobile device (e.g., texting), looking straight ahead, sideways, or backwards, talking (e.g., talking with a passenger in the vehicle or talking during a phone call), sitting somewhere other than the driver's seat (e.g., in an autonomous driving experience), or looking in the glove compartment, to name just a few.


Based on the determined activity, the radar module 106 initiates one or more operations. In some cases, the one or more operations may be initiated by sending control signals to the vehicle controller 108, in order to cause the vehicle controller 108 to control the vehicle 110 to output audible warnings, visual notifications, control navigation of the vehicle (e.g., causing the vehicle to slow down or speed up), and so forth. For example, the vehicle controller can control the audio system of the vehicle 110 to output an audible warning, such as a loud beep, or voice narration that instructs the driver to “pay attention” or “wake up”, or notifies the driver that “the baby is asleep”. As another example, the vehicle controller 108 can control the navigation system of the vehicle (e.g., the cruise control system or autonomous driving system) based on the driver's activity, such as by slowing down when the driver is not paying attention.


In some cases, the radar module 106 determines an attention level of the driver, based on the activity, and then initiates the one or more operations based on the determined attention level. For example, activities such as texting, looking backwards, or being drowsy or sleepy, may be indicative of a low attention level. In contrast, activities such as driving with both hands on the wheel are indicative of a high attention level. Thus, in some cases, a warning (e.g., audible, visual, or tactile) may be initiated, in response to determining that the driver has a low attention level, in order to alert the driver to pay attention to driving. In this way, the radar module 106 monitors the driver without requiring the driver's deliberate or conscious interaction with the system.


In one or more implementations, in order to determine whether the driver is paying attention, the attention level is first determined as a score, based on the various detected characteristics of the driver. Then, the attention level is compared to a threshold. If the attention level of the driver is above the threshold, then radar module 106 determines that the driver is paying attention to driving vehicle 110. Alternately, if the attention level of the driver is below the threshold, then radar module 106 determines that the driver is not paying attention.
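A minimal sketch of this score-and-threshold logic follows; the activity scores and threshold are illustrative assumptions, as the patent does not assign numeric values.

```python
ACTIVITY_SCORES = {            # illustrative evidence per detected activity
    "both_hands_on_wheel": 0.9,
    "one_hand_on_wheel": 0.7,
    "talking": 0.5,
    "texting": 0.1,
    "looking_backwards": 0.2,
    "drowsy": 0.1,
}
THRESHOLD = 0.6

def is_paying_attention(detected_activities: list) -> bool:
    """Score the driver's detected activities and compare to the threshold."""
    if not detected_activities:
        return False  # e.g., driver not present in the driver's seat
    score = min(ACTIVITY_SCORES.get(a, 0.5) for a in detected_activities)
    return score >= THRESHOLD

print(is_paying_attention(["both_hands_on_wheel"]))           # True
print(is_paying_attention(["one_hand_on_wheel", "texting"]))  # False
```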


In one or more implementations, the radar-based object detection component 104 monitors the presence and attention level of the driver in an autonomous or semi-autonomous vehicle system. Autonomous vehicles are developed to navigate and operate either unmanned or to assist a vehicle operator, and can utilize many different types of sensors, automation, robotics, and other computer-controlled systems and mechanisms. However, in many cases, the driver must still act as a “fallback mechanism” in order to handle driving duties in certain instances where the autonomous system fails or is unable to control navigation. In these cases, the radar-based object detection component 104 may monitor the driver's presence and attention level to ensure that the driver is a suitable backup in the event that the autonomous system needs to switch over to the manual system. In the event that the driver's attention is low, or the driver is not present in the driver's seat, the component may initiate various warnings to ensure that the driver is reminded that he may be needed as a fallback mechanism. Furthermore, in some cases, the radar-based object detection component may prevent a transition from an autonomous driving mode to a manual driving mode if the attention level of the driver is below the threshold indicating that the driver is not paying attention or is not present in the driver's seat.


Similar techniques may also be applied to passengers within the vehicle. For example, radar sensors 112 may be positioned in the rear of the vehicle 110 to monitor a toddler or baby in order to determine an activity of the passenger, such as sleeping, waking up, choking on a piece of food, and so forth. Then, an operation may be initiated, based on the activity of the passenger, such as by alerting the driver that the baby is asleep, waking up, or choking on a piece of food.


In one or more implementations, the radar-based object detection component 104 is further configured to augment a keyless entry system for the vehicle to verify that a person is actually present, or as part of an authentication procedure to authenticate the driver as a known person permitted to drive the vehicle 110. For example, the radar sensor 112 can detect one or more biometric characteristics of the driver, such as height, skeletal structure, and so forth. Then, radar module 106 can compare the detected biometric characteristics of the driver to stored biometric characteristics of known persons that are permitted to drive the vehicle. If the detected biometric characteristics of the driver match the stored biometric characteristics, then radar module 106 authenticates the driver as a known person permitted to drive vehicle 110. Alternately or additionally, the radar sensor 112 can detect one or more gestures performed by the driver. Then, radar module 106 can compare the detected one or more gestures to stored gestures corresponding to an authentication sequence. If the detected gestures performed by the driver match the stored gestures, then radar module 106 authenticates the driver as a known person permitted to drive vehicle 110. In some cases, a two-stage authentication process may be applied, whereby the driver is authenticated based on biometric characteristics as well as detection of one or more recognized gestures performed by the driver. Once the driver is authenticated, the driver is then permitted to drive the vehicle. Alternately, if the driver is not authenticated, the driver may be prevented from driving the vehicle.
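The two-stage authentication flow described above can be sketched as follows; the stored profile, matching tolerance, and gesture sequence are illustrative assumptions rather than details from the patent.

```python
STORED_PROFILES = {  # biometric characteristics of known, permitted drivers
    "driver_a": {"height_m": 1.70, "shoulder_width_m": 0.41},
}
STORED_GESTURES = ["swipe_left", "circle", "tap"]  # authentication sequence

def biometrics_match(measured: dict, stored: dict, tol: float = 0.05) -> bool:
    """Stage 1: every stored characteristic must match within a tolerance."""
    return all(abs(measured[key] - val) <= tol for key, val in stored.items())

def authenticate(measured_biometrics: dict, gestures: list) -> bool:
    """Stage 1: biometric characteristics. Stage 2: gesture sequence."""
    known = any(biometrics_match(measured_biometrics, profile)
                for profile in STORED_PROFILES.values())
    return known and gestures == STORED_GESTURES

print(authenticate({"height_m": 1.71, "shoulder_width_m": 0.40},
                   ["swipe_left", "circle", "tap"]))  # True -> permit driving
```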


As described herein, biometric characteristics correspond to distinctive, measurable characteristics that can be used to identify a particular known person, or a particular “type” of person (e.g., an adult versus a child). Biometric characteristics are often categorized as physiological versus behavioral characteristics. Physiological characteristics are related to the shape of the body and may include, by way of example and not limitation, height, skeletal structure, fingerprint, palm veins, face recognition, DNA, palm print, hand geometry, iris recognition, retina scan, heart rate, and so forth. Behavioral characteristics are related to the pattern of behavior of a person, including but not limited to a walking gait, typing rhythm, and so forth.


When radar sensors 112 are implemented on the exterior of vehicle 110, the radar module 106 can be implemented to detect characteristics of various external objects, such as pedestrians, other vehicles, debris or foreign objects in the road, objects on the side of the road, and so forth. Vehicle controller 108 may then control navigation of the vehicle based on the detected characteristics of external objects. The detection of such external objects may be used as part of the autonomous driving experience, as part of a “cruise control” experience, or in order to provide a warning (or automatic braking) when objects are detected in close proximity to the driving path (e.g., when a pedestrian steps into the road in front of the vehicle). For example, radar module 106 can recognize the detected characteristics as certain objects, such as a pedestrian or a vehicle. Vehicle controller 108 may then control navigation of vehicle 110 based on the recognized objects, such as by slowing down when a pedestrian steps in front of the vehicle, speeding up when a vehicle approaches quickly from the rear, or swerving when a vehicle approaches from the side.


Having generally described an environment in which radar-based object detection for vehicles may be implemented, now consider FIG. 2, which illustrates an example implementation of vehicle computing system 102 of FIG. 1 in greater detail. As discussed above, vehicle computing system 102 represents any suitable type of computing system that is implemented within a vehicle, such as an automobile, plane, boat, and so forth.


Vehicle computing system 102 includes processor(s) 202 and computer-readable media 204. Radar module 106 and vehicle controller 108 from FIG. 1, embodied as computer-readable instructions on the computer-readable media 204, can be executed by the processor(s) 202 to invoke or interface with some or all of the functionalities described herein, such as through Application Programming Interfaces (APIs) 206.


APIs 206 provide programming access into various routines and functionality incorporated into radar-based object detection component 104. For instance, radar-based object detection component 104 can have a programmatic interface (socket connection, shared memory, read/write registers, hardware interrupts, etc.) that can be used in concert with APIs 206 to allow applications external to radar-based object detection component 104 a way to communicate or configure the component. In some embodiments, APIs 206 provide high-level access into radar-based object detection component 104 in order to abstract implementation details and/or hardware access from a calling program, request notifications related to identified events, query for results, and so forth. APIs 206 can also provide low-level access to radar-based object detection component 104, where a calling program can control direct or partial hardware configuration of radar-based object detection component 104. In some cases, APIs 206 provide programmatic access to input configuration parameters that configure transmit signals and/or select object recognition algorithms. These APIs enable programs, such as radar module 106, to incorporate the functionality provided by radar-based object detection component 104 into executable code. For instance, radar module 106 can call or invoke APIs 206 to register for, or request, an event notification when a particular object characteristic has been detected, enable or disable wireless gesture recognition in vehicle computing system 102, and so forth. At times, APIs 206 can access and/or include low level hardware drivers that interface with hardware implementations of radar-based object detection component 104. Alternately or additionally, APIs 206 can be used to access various algorithms that reside on radar-based object detection component 104 to configure algorithms, extract additional information (such as 3D tracking information, angular extent, reflectivity profiles from different aspects, correlations between transforms/features from different channels, etc.), change an operating mode of radar-based object detection component 104, and so forth.
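The event-registration pattern that APIs 206 enable can be illustrated with the toy sketch below. Every name in it (the class, method, and event strings) is invented for illustration; the patent does not define a concrete API surface.

```python
class RadarObjectDetectionApi:
    """Hypothetical high-level wrapper over component 104 (names assumed)."""
    def __init__(self):
        self._handlers = {}

    def on_event(self, event: str, handler) -> None:
        """Register for a notification, e.g. a detected object characteristic."""
        self._handlers.setdefault(event, []).append(handler)

    def _emit(self, event: str, payload: dict) -> None:
        """Deliver a detection event to every registered handler."""
        for handler in self._handlers.get(event, []):
            handler(payload)

api = RadarObjectDetectionApi()
api.on_event("driver_attention_low", lambda p: print("warn driver:", p))
api._emit("driver_attention_low", {"score": 0.2})  # simulate a detection
```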


Radar-based object detection component 104 represents functionality that wirelessly detects objects, such as a driver or passenger within vehicle 110, or objects external to vehicle 110. Radar-based object detection component 104 can be implemented as a chip embedded within vehicle computing system 102, such as a System-on-Chip (SoC). However, it is to be appreciated that radar-based object detection component 104 can be implemented in any other suitable manner, such as one or more Integrated Circuits (ICs), as a processor with embedded processor instructions or configured to access processor instructions stored in memory, as hardware with embedded firmware, a printed circuit board with various hardware components, or any combination thereof. Here, radar-based object detection component 104 includes radar-emitting element 208, antenna(s) 210, digital signal processing component 212, machine-learning component 214, and an object characteristics library 216, which can be used in concert to detect object characteristics using radar techniques.


Generally, radar-emitting element 208 is configured to provide a radar field. In some cases, the radar field is configured to at least partially reflect off a target object, such as a driver of vehicle 110, other passengers within vehicle 110, or objects external to vehicle 110. The radar field can also be configured to penetrate fabric or other obstructions and reflect from human tissue. These fabrics or obstructions can include wood, glass, plastic, cotton, wool, nylon and similar fibers, and so forth, while reflecting from human tissues, such as a person's body, or a part of the person's body, such as a hand, face, and so forth.


A radar field can have a small size, such as 1 millimeter to 1.5 meters, or an intermediate size, such as 1 to 30 meters. It is to be appreciated that these sizes are merely for discussion purposes, and that any other suitable range can be used. When the radar field has an intermediate size, radar-based object detection component 104 is configured to receive and process reflections of the radar field to detect large-body movements based on reflections from human tissue caused by body, arm, or leg movements. In this way, user actions, such as texting, reaching into a glove compartment, or talking with other passengers, may be detected. In other cases, the radar field can be configured to enable radar-based object detection component 104 to detect smaller and more precise movements, such as movement of the eyes of the driver or passenger in vehicle 110, or micro-gestures used to authenticate the driver. Radar-emitting element 208 can be configured to emit continuously modulated radiation, ultra-wideband radiation, or sub-millimeter-frequency radiation.


Antenna(s) 210 transmit and receive RF signals. In some cases, radar-emitting element 208 couples with antenna(s) 210 to transmit a radar field. As one skilled in the art will appreciate, this is achieved by converting electrical signals into electromagnetic waves for transmission, and vice versa for reception. Radar-based object detection component 104 can include any suitable number of antennas in any suitable configuration. For instance, any of the antennas can be configured as a dipole antenna, a parabolic antenna, a helical antenna, a monopole antenna, and so forth. In some embodiments, antenna(s) 210 are constructed on-chip (e.g., as part of an SoC), while in other embodiments, antenna(s) 210 are separate components, metal, hardware, etc. that attach to, or are included within, radar-based object detection component 104. An antenna can be single-purpose (e.g., a first antenna directed towards transmitting signals, a second antenna directed towards receiving signals, etc.), or multi-purpose (e.g., an antenna is directed towards transmitting and receiving signals). Thus, some embodiments utilize varying combinations of antennas, such as an embodiment that utilizes two single-purpose antennas directed towards transmission in combination with four single-purpose antennas directed towards reception. The placement, size, and/or shape of antenna(s) 210 can be chosen to enhance a specific transmission pattern or diversity scheme, such as a pattern or scheme designed to capture information about a micro-gesture performed by the hand. In some cases, the antennas can be physically separated from one another by a distance that allows radar-based object detection component 104 to collectively transmit and receive signals directed to a target object over different channels, different radio frequencies, and different distances. In some cases, antenna(s) 210 are spatially distributed to support triangulation techniques, while in others the antennas are collocated to support beamforming techniques. While not illustrated, each antenna can correspond to a respective transceiver path that physically routes and manages the outgoing signals for transmission and the incoming signals for capture and analysis.


Digital signal processing component 212 generally represents functionality that digitally captures and processes a signal. For instance, digital signal processing component 212 samples analog RF signals received by antenna(s) 210 to generate digital samples that represent the RF signals, and then processes these samples to extract information about the target object. Alternately or additionally, digital signal processing component 212 controls the configuration of signals generated and transmitted by radar-emitting element 208 and/or antenna(s) 210, such as configuring a plurality of signals to form a specific diversity scheme like a beamforming diversity scheme. In some cases, digital signal processing component 212 receives input configuration parameters that control an RF signal's transmission parameters (e.g., frequency channel, power level, etc.), such as through APIs 206. In turn, digital signal processing component 212 modifies the RF signal based upon the input configuration parameters. At times, the signal processing functions of digital signal processing component 212 are included in a library of signal processing functions or algorithms that are also accessible and/or configurable via APIs 206. Thus, digital signal processing component 212 can be programmed or configured via APIs 206 (and a corresponding programmatic interface of radar-based object detection component 104) to dynamically select algorithms and/or dynamically reconfigure. Digital signal processing component 212 can be implemented in hardware, software, firmware, or any combination thereof.
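One representative digital-processing step, sketched below, is to window the digitized samples and take an FFT so that spectral peaks map to target information (here, a simulated beat tone). This is a common radar DSP technique offered for illustration only; the patent does not mandate a particular algorithm, and the sample rate is an assumed value.

```python
import numpy as np

fs = 1e6                                        # sample rate, Hz (assumed)
t = np.arange(1024) / fs
samples = np.sin(2 * np.pi * 50e3 * t)          # simulated 50 kHz beat tone
windowed = samples * np.hanning(samples.size)   # reduce spectral leakage
spectrum = np.abs(np.fft.rfft(windowed))
peak_bin = int(np.argmax(spectrum))
beat_hz = peak_bin * fs / samples.size
print(beat_hz)   # ~50 kHz; in an FMCW system this maps to a target range
```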


Among other things, machine-learning component 214 receives information processed or extracted by digital signal processing component 212, and uses that information to classify or recognize various aspects of the target object. In some cases, machine-learning component 214 applies one or more algorithms to probabilistically determine an action of a driver or passenger based on an input signal and previously learned object characteristic features corresponding to the action. As in the case of digital signal processing component 212, machine-learning component 214 can include a library of multiple machine-learning algorithms, such as a Random Forest algorithm, deep learning algorithms (e.g., artificial neural network algorithms, convolutional neural network algorithms, etc.), clustering algorithms, Bayesian algorithms, and so forth. Machine-learning component 214 can be trained to identify various object characteristics corresponding to a user action using input data that consists of example user actions. In turn, machine-learning component 214 uses the input data to learn what features can be attributed to a specific action. These features are then used to identify when the specific action occurs. In some embodiments, APIs 206 can be used to configure machine-learning component 214 and/or its corresponding algorithms. Thus, machine-learning component 214 can be configured via APIs 206 (and a corresponding programmatic interface of radar-based object detection component 104) to dynamically select algorithms and/or dynamically reconfigure.
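As an illustrative sketch of this training-and-classification step, the snippet below fits a random forest to toy feature vectors extracted from reflected signals. The feature layout, labels, and values are assumptions for illustration only, not data from the patent.

```python
from sklearn.ensemble import RandomForestClassifier

# Each row: [range_m, hand_speed_mps, reflected_energy] (assumed features)
X_train = [
    [0.60, 0.05, 0.9],   # both hands resting on the wheel
    [0.50, 0.35, 0.4],   # texting: small, fast, low-energy hand motion
    [0.62, 0.08, 0.8],
    [0.48, 0.40, 0.3],
]
y_train = ["hands_on_wheel", "texting", "hands_on_wheel", "texting"]

clf = RandomForestClassifier(n_estimators=10, random_state=0)
clf.fit(X_train, y_train)
print(clf.predict([[0.52, 0.33, 0.35]]))   # ['texting'] on these toy data
```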


Object characteristics library 216 represents data used by digital signal processing component 212 and/or machine-learning component 214 to identify a target object and/or detect known actions or gestures performed by the driver or passenger, or known external objects. For instance, object characteristics library 216 can store signal characteristics, characteristics about a target object that are discernable from a signal, or a customized machine-learning model that can be used to identify a user action, unique in-the-air gesture, a user identity, user presence, and so forth. In addition, certain data stored in object characteristics library 216 may be treated in one or more ways before it is stored or used, so that personally identifiable information is removed. For example, a user's identity may be treated so that no personally identifiable information can be determined for the user, or a user's geographic location may be generalized where location information is obtained (such as to a city, ZIP code, or state level), so that a particular location of a user cannot be determined. Thus, the user may have control over what information is collected about the user, how that information is used, and what information is provided to the user.



FIG. 3 illustrates an example of RF wave propagation, and a corresponding reflected wave propagation. It is to be appreciated that the following discussion has been simplified, and is not intended to describe all technical aspects of RF wave propagation, reflected wave propagation, or detection techniques.


Environment 300a includes source device 302 and object 304. Object 304, for example, could be a driver or passenger in vehicle 110, or an external object (e.g., a pedestrian, other vehicle, or foreign object). Source device 302 includes antenna 306, which generally represents functionality configured to transmit and receive electromagnetic waves in the form of an RF signal. It is to be appreciated that antenna 306 can be coupled to a source, such as a radar-emitting element, to achieve transmission of a signal. In this example, source device 302 transmits a series of RF pulses, illustrated here as RF pulse 308a, RF pulse 308b, and RF pulse 308c. As indicated by their ordering and distance from source device 302, RF pulse 308a is transmitted first in time, followed by RF pulse 308b, and then RF pulse 308c. For discussion purposes, these RF pulses have the same pulse width, power level, and transmission periodicity between pulses, but any other suitable type of signal with alternate configurations can be transmitted without departing from the scope of the claimed subject matter.


Generally speaking, electromagnetic waves can be characterized by the frequency or wavelength of their corresponding oscillations. Being a form of electromagnetic radiation, RF signals adhere to various wave and particle properties, such as reflection. When an RF signal reaches an object, it will undergo some form of transition. Specifically, there will be some reflection off the object. Environment 300b illustrates the reflection of RF pulses 308a-308c reflecting off of object 304, where RF pulse 310a corresponds to a reflection originating from RF pulse 308a reflecting off of object 304, RF pulse 310b corresponds to a reflection originating from RF pulse 308b, and so forth. In this simple case, source device 302 and object 304 are stationary, and RF pulses 308a-308c are transmitted via a single antenna (antenna 306) over a same RF channel, and are transmitted directly towards object 304 with a perpendicular impact angle. Similarly, RF pulses 310a-310c are shown as reflecting directly back to source device 302, rather than with some angular deviation. However, as one skilled in the art will appreciate, these signals can alternately be transmitted or reflected with variations in their transmission and reflection directions based upon the configuration of source device 302, object 304, transmission parameters, variations in real-world factors, and so forth. Upon receiving and capturing RF pulses 310a-310c, source device 302 can then analyze the pulses, either individually or in combination, to identify characteristics related to object 304. For example, source device 302 can analyze all of the received RF pulses to obtain temporal information and/or spatial information about object 304. Accordingly, source device 302 can use knowledge about a transmission signal's configuration (such as pulse widths, spacing between pulses, pulse power levels, phase relationships, and so forth), and further analyze a reflected RF pulse to identify various characteristics about object 304, such as size, shape, movement speed, movement direction, surface smoothness, material composition, and so forth.
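A short worked example of the temporal analysis described above: the change in round-trip delay across successive pulses reveals movement speed and direction. The delays and pulse interval below are illustrative numbers, not measurements from the patent.

```python
C = 3.0e8                                   # speed of light, m/s
pulse_interval_s = 1e-3                     # time between pulses (assumed)
delays_s = [4.00e-9, 3.98e-9, 3.96e-9]      # measured round-trip delays

ranges_m = [C * d / 2 for d in delays_s]    # 0.600, 0.597, 0.594 m
speed_mps = (ranges_m[0] - ranges_m[-1]) / (2 * pulse_interval_s)
print(ranges_m)      # monotonically decreasing: target approaching the sensor
print(speed_mps)     # 3.0 m/s toward the sensor
```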


Now consider FIG. 4, which builds upon the above discussion of FIG. 3. FIG. 4 illustrates example environment 400 in which multiple antennas are used to ascertain information about a target object. Environment 400 includes source device 402 and a target object, shown here as hand 404. It is to be appreciated, however, that similar techniques may also be applied to other target objects, such as a driver or passenger in vehicle 110, or an external object. Generally speaking, source device 402 includes antennas 406a-406d to transmit and receive multiple RF signals. In some embodiments, source device 402 includes radar-based object detection component 104, and antennas 406a-406d correspond to antenna(s) 210. While source device 402 in this example includes four antennas, it is to be appreciated that any suitable number of antennas can be used. Each antenna of antennas 406a-406d is used by source device 402 to transmit a respective RF signal (e.g., antenna 406a transmits RF signal 408a, antenna 406b transmits RF signal 408b, and so forth). As discussed above, these RF signals can be configured to form a specific transmission pattern or diversity scheme when transmitted together. For example, the configuration of RF signals 408a-408d, as well as the placement of antennas 406a-406d relative to a target object, can be based upon beamforming techniques to produce constructive interference or destructive interference patterns, or alternately configured to support triangulation techniques. At times, source device 402 configures RF signals 408a-408d based upon an expected information extraction algorithm, as further described below.


When RF signals 408a-408d reach hand 404, they generate reflected RF signals 410a-410d. Similar to the discussion of FIG. 3 above, source device 402 captures these reflected RF signals, and then analyzes them to identify various properties or characteristics of hand 404, such as a particular action or micro-gesture. For instance, in this example, RF signals 408a-408d are illustrated with the bursts of the respective signals being transmitted synchronously in time. In turn, and based upon the shape and positioning of hand 404, reflected signals 410a-410d return to source device 402 at different points in time (e.g., reflected signal 410b is received first, followed by reflected signal 410c, then reflected signal 410a, and then reflected signal 410d). Reflected signals 410a-410d can be received by source device 402 in any suitable manner. For example, antennas 406a-406d can each receive all of reflected signals 410a-410d, or receive varying subset combinations of reflected signals 410a-410d (e.g., antenna 406a receives reflected signal 410a and reflected signal 410d, antenna 406b receives reflected signal 410a, reflected signal 410b, and reflected signal 410c, etc.). Thus, each antenna can receive reflected signals generated by transmissions from another antenna. By analyzing the various return times of each reflected signal, source device 402 can determine shape and corresponding distance information associated with hand 404. When reflected pulses are analyzed over time, source device 402 can additionally discern movement. Thus, by analyzing various properties of the reflected signals, as well as the transmitted signals, various information about hand 404 can be extracted, as further described below. It is to be appreciated that the above example has been simplified for discussion purposes, and is not intended to be limiting.
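By way of a simplified, two-dimensional illustration of such multi-antenna analysis, the sketch below converts per-antenna round-trip delays into range circles and solves for the reflecting point by least squares. The antenna geometry and timings are constructed for the example, not taken from the patent.

```python
import numpy as np
from scipy.optimize import least_squares

C = 3.0e8
antennas = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1]])   # positions, m
round_trips_s = np.array([4.00e-9, 3.56e-9, 3.56e-9])       # measured delays
dists = C * round_trips_s / 2                                # one-way ranges

def residuals(p):
    """Mismatch between hypothesized point p and each antenna's range."""
    return np.linalg.norm(antennas - p, axis=1) - dists

sol = least_squares(residuals, x0=np.array([0.3, 0.3]))
print(sol.x)   # ~[0.424, 0.424]: the point consistent with all three ranges
```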


As in the case of FIG. 3, FIG. 4 illustrates RF signals 408a-408d as propagating at a 90° angle from source device 402 and in phase with one another. Similarly, reflected signals 410a-410d each propagate back at a 90° angle from hand 404 and, as in the case of RF signals 408a-408d, are in phase with one another. However, as one skilled in the art will appreciate, more complex transmission signal configurations, and signal analysis on the reflected signals, can be utilized, examples of which are provided above and below. In some embodiments, RF signals 408a-408d can each be configured with different directional transmission angles, signal phases, power levels, modulation schemes, RF transmission channels, and so forth. These differences result in variations between reflected signals 410a-410d. In turn, these variations each provide different perspectives of the target object which can be combined using data fusion techniques to yield a better estimate of hand 404, how it is moving, its 3-dimensional (3D) spatial profile, a corresponding micro-gesture, etc.


Example Procedures



FIG. 5 is a flow diagram depicting a procedure 500 in an example implementation. The following discussion describes techniques that may be implemented utilizing the previously described systems and devices. Aspects of the procedure may be implemented in hardware, firmware, software, or a combination thereof. The procedure is shown as a set of blocks that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks.


Transmission of an outgoing RF signal is initiated via a radar-emitting element of a radar sensor implemented in a vehicle (block 502). For example, radar module 106 initiates transmission of an outgoing RF signal via radar-emitting element 208 of radar sensor 112 implemented in vehicle 110.


An incoming RF signal generated by the outgoing RF signal reflecting off an object is received by an antenna of the radar sensor (block 504). For example, antenna 210 of radar sensor 112 receives an incoming RF signal generated by the outgoing RF signal reflecting off an object, such as a driver or passenger within vehicle 110, or an object external to the vehicle 110.


The incoming RF signal is analyzed to detect one or more characteristics of the object (block 506). For example, radar module 106 may analyze the incoming RF signal to detect one or more characteristics of the driver, passenger, or external object.


Based on the detected one or more characteristics of the object, an operation is initiated (block 508). For example, the radar module 106 may determine an activity of the driver of the vehicle 110, based on the detected characteristics, and then initiate an operation based on the activity of the driver. Other examples include providing operations based on detected characteristics of passengers in the vehicle, authenticating the driver based on detected biometric characteristics or gestures, and controlling the vehicle based on the detection of external objects.
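Blocks 502-508 can be wired together into a single detection cycle, as in the sketch below. The component interfaces (and the stubs standing in for them) are illustrative assumptions, not APIs defined by the patent.

```python
class _StubSensor:
    def transmit(self): pass                       # block 502
    def receive(self): return "rf-samples"         # block 504

class _StubRadarModule:
    def analyze(self, rf): return {"hand_speed": 0.35}   # block 506
    def classify(self, traits): return "texting"

class _StubController:
    def warn_driver(self): print("warning driver")
    def slow_down(self): print("slowing vehicle")

def run_detection_cycle(sensor, radar_module, vehicle_controller) -> None:
    sensor.transmit()                              # block 502
    incoming = sensor.receive()                    # block 504
    traits = radar_module.analyze(incoming)        # block 506
    activity = radar_module.classify(traits)
    if activity in ("texting", "drowsy", "looking_backwards"):
        vehicle_controller.warn_driver()           # block 508
        vehicle_controller.slow_down()

run_detection_cycle(_StubSensor(), _StubRadarModule(), _StubController())
```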


Example Vehicle Computing System



FIG. 6 illustrates various components of an example vehicle computing system 600 that incorporates radar-based object detection for vehicles as described with reference to FIGS. 1-5. Vehicle computing system 600 may be implemented as any type of a fixed or mobile device, in any form of a consumer, computer, portable, user, communication, phone, navigation, gaming, audio, camera, messaging, media playback, and/or other type of electronic device, such as vehicle computing system 102. In light of this, it is to be appreciated that various alternate embodiments can include additional components that are not described, or exclude components that are described, with respect to vehicle computing system 600.


Vehicle computing system 600 includes communication devices 602 that enable wired and/or wireless communication of device data 604 (e.g., received data, data that is being received, data scheduled for broadcast, data packets of the data, etc.). The device data 604 or other device content can include configuration settings of the device and/or information associated with a user of the device.


Vehicle computing system 600 also includes communication interfaces 606 that can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface. The communication interfaces 606 provide a connection and/or communication links between vehicle computing system 600 and a communication network by which other electronic, computing, and communication devices communicate data with vehicle computing system 600.


Vehicle computing system 600 includes one or more processors 608 (e.g., any of microprocessors, controllers, and the like) which process various computer-executable instructions to control the operation of vehicle computing system 600 and to implement embodiments of the techniques described herein. Alternatively or in addition, vehicle computing system 600 can be implemented with any one or combination of hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 610. Although not shown, vehicle computing system 600 can include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.


Vehicle computing system 600 also includes computer-readable media 612, such as one or more memory components, examples of which include random access memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. A disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable compact disc (CD), any type of a digital versatile disc (DVD), and the like.


Computer-readable media 612 provides data storage mechanisms to store the device data 604, as well as various applications 614 and any other types of information and/or data related to operational aspects of vehicle computing system 600. The applications 614 can include a device manager (e.g., a control application, software application, signal processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, etc.). Computer-readable media 612 also includes APIs 616.


APIs 616 provide programmatic access to a radar-based object detection component, examples of which are provided above. The programmatic access can range from high-level program access that obscures underlying details of how a function is implemented, to low-level programmatic access that enables access to hardware. In some cases, APIs can be used to send input configuration parameters associated with modifying how signals are transmitted, received, and/or processed by the radar-based object detection component.


Vehicle computing system 600 also includes audio and/or video processing system 618 that processes audio data and/or passes through the audio and video data to audio system 620 and/or to display system 622 (e.g., a screen of a smart phone or camera). Audio system 620 and/or display system 622 may include any devices that process, display, and/or otherwise render audio, video, display, and/or image data. Display data and audio signals can be communicated to an audio component and/or to a display component via an RF link, S-video link, HDMI, composite video link, component video link, DVI, analog audio connection, or other similar communication link, such as media data port 624. In some implementations, audio system 620 and/or display system 622 are external components to vehicle computing system 600. Alternatively or additionally, display system 622 can be an integrated component of the example electronic device, such as part of an integrated touch interface.


Vehicle computing system 600 also includes a radar-based object detection component 626 that wirelessly identifies one or more features of a target object, such as a micro-gesture performed by a hand as further described above. Radar-based object detection component 626 can be implemented as any suitable combination of hardware, software, firmware, and so forth. In some embodiments, radar-based object detection component 626 is implemented as an SoC. Among other things, radar-based object detection component 626 includes radar-emitting element 628, antennas 630, digital signal processing component 632, machine-learning component 634, and object characteristics library 636.


Radar-emitting element 628 is configured to provide a radar field. In some cases, the radar field is configured to at least partially reflect off a target object. The radar field can also be configured to penetrate fabric or other obstructions and reflect from human tissue. These fabrics or obstructions can include wood, glass, plastic, cotton, wool, nylon and similar fibers, and so forth, while reflecting from human tissues, such as a person's hand. Radar-emitting element 628 works in concert with antennas 630 to provide the radar field.


Antenna(s) 630 transmit and receive RF signals under the control of radar-based object detection component 626. Each respective antenna of antennas 630 can correspond to a respective transceiver path internal to radar-based object detection component 626 that physically routes and manages outgoing signals for transmission and the incoming signals for capture and analysis as further described above.


Digital signal processing component 632 digitally processes RF signals received via antennas 630 to extract information about the target object. This can be high-level information that simply identifies a target object, or lower level information that identifies a particular micro-gesture performed by a hand. In some embodiments, digital signal processing component 632 additionally configures outgoing RF signals for transmission on antennas 630. Some of the information extracted by digital signal processing component 632 is used by machine-learning component 634. Digital signal processing component 632 at times includes multiple digital signal processing algorithms that can be selected or deselected for an analysis, examples of which are provided above. Thus, digital signal processing component 632 can generate key information from RF signals that can be used to determine what gesture might be occurring at any given moment. At times, an application, such as those illustrated by applications 614, can configure the operating behavior of digital signal processing component 632 via APIs 616.


Machine-learning component 634 receives input data, such as a transformed raw signal or high-level information about a target object, and analyzes the input data to identify or classify various features contained within the data. As in the case above, machine-learning component 634 can include multiple machine-learning algorithms that can be selected or deselected for an analysis. Among other things, machine-learning component 634 can use the key information generated by digital signal processing component 632 to detect relationships and/or correlations between the generated key information and previously learned gestures to probabilistically decide which gesture is being performed. At times, an application, such as those illustrated by applications 614, can configure the operating behavior of machine-learning component 634 via APIs 616.


Object characteristics library 636 represents data used by radar-based object detection component 626 to identify a target object and/or gestures performed by the target object. For instance, object characteristics library 636 can store signal characteristics, or characteristics about a target object that are discernable from a signal, that can be used to identify a user action, a unique in-the-air gesture, biometric characteristics, a user identity, user presence, and so forth. In addition, certain data stored in object characteristics library 636 may be treated in one or more ways before it is stored or used, so that personally identifiable information is removed. For example, a user's identity may be treated so that no personally identifiable information can be determined for the user, or a user's geographic location may be generalized where location information is obtained (such as to a city, ZIP code, or state level), so that a particular location of a user cannot be determined. Thus, the user may have control over what information is collected about the user, how that information is used, and what information is provided to the user.


Although the embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the various embodiments defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the various embodiments.

Claims
  • 1. A radar-based object detection component implemented in a vehicle, the radar-based object detection component comprising: at least one radar sensor positioned within an interior of the vehicle, the at least one radar sensor comprising at least one radar-emitting element and at least one antenna; and a radar module implemented at least partially in hardware and configured to authenticate a driver of the vehicle by: receiving, via the antenna, an incoming RF signal generated by an outgoing RF signal emitted by the radar-emitting element of the radar sensor reflecting off a portion of the driver that is performing at least one gesture; determining the gesture based on the incoming RF signal; comparing the determined gesture to stored gestures corresponding to known persons permitted to drive the vehicle; and authenticating the driver as a known person permitted to drive the vehicle based on the determined gesture matching one of the stored gestures.
  • 2. The radar-based object detection component of claim 1, wherein the radar module is further configured to cause the vehicle to allow the driver to drive the vehicle based on the authenticating.
  • 3. The radar-based object detection component of claim 1, further comprising at least one radar sensor configured to detect one or more characteristics of a passenger in the vehicle, and wherein the radar module is further configured to: determine an activity of the passenger based on the one or more detected characteristics of the passenger; and initiate an operation based on the determined activity of the passenger.
  • 4. The radar-based object detection component of claim 1, further comprising at least one radar sensor positioned on an exterior of the vehicle and configured to detect characteristics of external objects, and wherein a vehicle controller of the vehicle is configured to control navigation of the vehicle based on the detected characteristics of the external objects.
  • 5. The radar-based object detection component of claim 1, wherein the authenticating the driver is further based on at least one detected biometric characteristic of the driver that is determined from the radar sensor.
  • 6. The radar-based object detection component of claim 5, wherein the authenticating the driver is further based on comparing the detected biometric characteristic to stored biometric characteristics corresponding to the known persons permitted to drive the vehicle.
  • 7. The radar-based object detection component of claim 5, wherein the at least one detected biometric characteristic comprises a height or skeletal structure.
  • 8. The radar-based object detection component of claim 7, wherein the height or skeletal structure is used to determine whether the driver is a child or an adult.
  • 9. A method comprising: initiating transmission of an outgoing RF signal via at least one radar-emitting element implemented within at least one radar sensor of a vehicle; receiving, via at least one antenna of the radar sensor, an incoming RF signal generated by the outgoing RF signal reflecting off a portion of a driver that is performing at least one gesture; determining the gesture based on the incoming RF signal; comparing the determined gesture to stored gestures corresponding to known persons permitted to drive the vehicle; and authenticating the driver as a known person permitted to drive the vehicle based on the determined gesture matching one of the stored gestures.
  • 10. The method of claim 9, further comprising allowing the driver to drive the vehicle based on the authenticating.
  • 11. The method of claim 9, further comprising: detecting one or more characteristics of a passenger in the vehicle; determining an activity of the passenger based on the detected characteristics of the passenger; and initiating an operation based on the determined activity of the passenger.
  • 12. The method of claim 9, further comprising: detecting characteristics of external objects; and controlling navigation of the vehicle based on the detected characteristics of the external objects.
  • 13. The method of claim 9, wherein the authenticating the driver is further based on at least one detected biometric characteristic of the driver that is determined from the radar sensor.
  • 14. The method of claim 13, wherein the authenticating the driver is further based on comparing the detected biometric characteristic to stored biometric characteristics corresponding to the known persons permitted to drive the vehicle.
  • 15. The method of claim 13, wherein the at least one detected biometric characteristic comprises a height or skeletal structure.
  • 16. The method of claim 15, wherein the height or skeletal structure is used to determine whether the driver is a child or an adult.
  • 17. A vehicle comprising: a vehicle controller configured to provide an autonomous driving mode for the vehicle; a radar-based object detection component comprising: at least one interior radar sensor positioned within an interior of the vehicle and configured to detect a series of gestures performed by a driver of the vehicle, and at least one exterior radar sensor positioned on an exterior of the vehicle and configured to detect characteristics of external objects; and a radar module implemented at least partially in hardware and configured to communicate control signals to the vehicle controller to allow the driver to operate the vehicle based on the detected gestures of the driver matching a predefined series of gestures corresponding to a known driver.
  • 18. The vehicle of claim 17, wherein: the radar sensor is further configured to detect at least one biometric characteristic of the driver; and the communication of the control signals to the vehicle controller to allow the driver to operate the vehicle is further based on the detected biometric characteristic of the driver.
  • 19. The vehicle of claim 17, wherein the detected gestures of the driver are compared to a plurality of series of gestures corresponding to respective known drivers.
  • 20. The vehicle of claim 17, wherein the detected gestures are recognized as three-dimensional “in-the-air” gestures performed by the driver.
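For illustration, the authentication flow recited in claims 1 and 9 reduces to a short determine-compare-authenticate sequence. The following Python sketch uses invented gesture names and a stand-in determine_gesture step; it is a schematic of the claimed flow, not an implementation of it.

```python
from typing import Optional

# Stored gestures corresponding to known persons permitted to drive the vehicle.
STORED_GESTURES = {"circle_then_swipe": "known_driver_1"}

def determine_gesture(incoming_rf: list[float]) -> str:
    # Stand-in for the DSP and machine-learning stages described above.
    return "circle_then_swipe"

def authenticate(incoming_rf: list[float]) -> Optional[str]:
    """Determine the gesture from the incoming RF signal, compare it to the
    stored gestures, and authenticate the driver on a match."""
    gesture = determine_gesture(incoming_rf)
    return STORED_GESTURES.get(gesture)

driver = authenticate([0.0] * 256)   # "known_driver_1" on a match, else None
```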
PRIORITY

This application claims priority to U.S. Provisional Patent Application Ser. No. 62/237,975 filed on Oct. 6, 2015, the disclosure of which is incorporated by reference herein in its entirety.

US Referenced Citations (563)
Number Name Date Kind
3610874 Gagliano Oct 1971 A
3752017 Lloyd et al. Aug 1973 A
3953706 Harris et al. Apr 1976 A
4104012 Ferrante Aug 1978 A
4654967 Thenner Apr 1987 A
4700044 Hokanson et al. Oct 1987 A
4795998 Dunbar et al. Jan 1989 A
4838797 Dodier Jun 1989 A
5016500 Conrad et al. May 1991 A
5121124 Spivey et al. Jun 1992 A
5298715 Chalco et al. Mar 1994 A
5341979 Gupta Aug 1994 A
5410471 Alyfuku et al. Apr 1995 A
5468917 Brodsky et al. Nov 1995 A
5564571 Zanotti Oct 1996 A
5656798 Kubo et al. Aug 1997 A
5724707 Kirk et al. Mar 1998 A
5798798 Rector et al. Aug 1998 A
6032450 Blum Mar 2000 A
6037893 Lipman Mar 2000 A
6080690 Lebby et al. Jun 2000 A
6101431 Niwa et al. Aug 2000 A
6210771 Post et al. Apr 2001 B1
6254544 Hayashi Jul 2001 B1
6313825 Gilbert Nov 2001 B1
6340979 Beaton et al. Jan 2002 B1
6386757 Konno May 2002 B1
6440593 Ellison et al. Aug 2002 B2
6492980 Sandbach Dec 2002 B2
6493933 Post et al. Dec 2002 B1
6513833 Breed Feb 2003 B2
6513970 Tabata et al. Feb 2003 B1
6524239 Reed et al. Feb 2003 B1
6543668 Fujii et al. Apr 2003 B1
6616613 Goodman Sep 2003 B1
6711354 Kameyama Mar 2004 B2
6717065 Hosaka et al. Apr 2004 B2
6802720 Weiss et al. Oct 2004 B2
6833807 Flacke et al. Dec 2004 B2
6835898 Eldridge et al. Dec 2004 B2
6854985 Weiss Feb 2005 B1
6929484 Weiss et al. Aug 2005 B2
6970128 Dwelly et al. Nov 2005 B1
6997882 Parker et al. Feb 2006 B1
7019682 Louberg et al. Mar 2006 B1
7134879 Sugimoto et al. Nov 2006 B2
7158076 Fiore et al. Jan 2007 B2
7164820 Eves et al. Jan 2007 B2
7194371 McBride et al. Mar 2007 B1
7205932 Fiore Apr 2007 B2
7223105 Weiss et al. May 2007 B2
7230610 Jung et al. Jun 2007 B2
7249954 Weiss Jul 2007 B2
7266532 Sutton et al. Sep 2007 B2
7299964 Jayaraman et al. Nov 2007 B2
7310236 Takahashi et al. Dec 2007 B2
7317416 Flom et al. Jan 2008 B2
7348285 Dhawan et al. Mar 2008 B2
7365031 Swallow et al. Apr 2008 B2
7421061 Boese et al. Sep 2008 B2
7462035 Lee et al. Dec 2008 B2
7528082 Krans et al. May 2009 B2
7544627 Tao et al. Jun 2009 B2
7578195 DeAngelis et al. Aug 2009 B2
7644488 Aisenbrey Jan 2010 B2
7647093 Bojovic et al. Jan 2010 B2
7670144 Ito et al. Mar 2010 B2
7677729 Vilser et al. Mar 2010 B2
7691067 Westbrook et al. Apr 2010 B2
7698154 Marchosky Apr 2010 B2
7791700 Bellamy Sep 2010 B2
7834276 Chou et al. Nov 2010 B2
7845023 Swatee Dec 2010 B2
7941676 Glaser May 2011 B2
7952512 Delker et al. May 2011 B1
7999722 Beeri et al. Aug 2011 B2
8062220 Kurtz et al. Nov 2011 B2
8063815 Valo et al. Nov 2011 B2
8169404 Boillot May 2012 B1
8179604 Prada Gomez et al. May 2012 B1
8193929 Siu et al. Jun 2012 B1
8199104 Park et al. Jun 2012 B2
8282232 Hsu et al. Oct 2012 B2
8289185 Alonso Oct 2012 B2
8301232 Albert et al. Oct 2012 B2
8314732 Oswald et al. Nov 2012 B2
8334226 Nhan et al. Dec 2012 B2
8341762 Balzano Jan 2013 B2
8344949 Moshfeghi Jan 2013 B2
8367942 Howell et al. Feb 2013 B2
8475367 Yuen et al. Jul 2013 B1
8505474 Kang et al. Aug 2013 B2
8509882 Albert et al. Aug 2013 B2
8514221 King et al. Aug 2013 B2
8527146 Jackson et al. Sep 2013 B1
8549829 Song et al. Oct 2013 B2
8560972 Wilson Oct 2013 B2
8562526 Heneghan et al. Oct 2013 B2
8569189 Bhattacharya et al. Oct 2013 B2
8614689 Nishikawa et al. Dec 2013 B2
8655004 Prest et al. Feb 2014 B2
8700137 Albert Apr 2014 B2
8758020 Burdea et al. Jun 2014 B2
8759713 Sheats Jun 2014 B2
8764651 Tran Jul 2014 B2
8785778 Streeter et al. Jul 2014 B2
8790257 Libbus et al. Jul 2014 B2
8814574 Selby et al. Aug 2014 B2
8819812 Weber et al. Aug 2014 B1
8854433 Rafii Oct 2014 B1
8860602 Nohara et al. Oct 2014 B2
8921473 Hyman Dec 2014 B1
8948839 Longinotti-Buitoni et al. Feb 2015 B1
9055879 Selby et al. Jun 2015 B2
9093289 Vicard et al. Jul 2015 B2
9125456 Chow Sep 2015 B2
9141194 Keyes et al. Sep 2015 B1
9148949 Guofu et al. Sep 2015 B2
9223494 Desalvo et al. Dec 2015 B1
9229102 Wright et al. Jan 2016 B1
9230160 Kanter Jan 2016 B1
9235241 Newham et al. Jan 2016 B2
9316727 Sentelle et al. Apr 2016 B2
9331422 Nazzaro et al. May 2016 B2
9335825 Rautianinen et al. May 2016 B2
9346167 O'Connor et al. May 2016 B2
9354709 Heller et al. May 2016 B1
9508141 Khachaturian et al. Nov 2016 B2
9569001 Mistry et al. Feb 2017 B2
9575560 Poupyrev et al. Feb 2017 B2
9588625 Poupyrev Mar 2017 B2
9594443 VanBlon et al. Mar 2017 B2
9600080 Poupyrev Mar 2017 B2
9693592 Robinson et al. Jul 2017 B2
9746551 Scholten et al. Aug 2017 B2
9766742 Papakostas Sep 2017 B2
9778749 Poupyrev Oct 2017 B2
9811164 Poupyrev Nov 2017 B2
9817109 Saboo et al. Nov 2017 B2
9837760 Karagozler et al. Dec 2017 B2
9848780 DeBusschere et al. Dec 2017 B1
9921660 Poupyrev Mar 2018 B2
9933908 Poupyrev Apr 2018 B2
9947080 Nguyen et al. Apr 2018 B2
9971414 Gollakota et al. May 2018 B2
9971415 Poupyrev et al. May 2018 B2
9983747 Poupyrev May 2018 B2
9994233 Diaz-Jimenez Jun 2018 B2
10016162 Rogers et al. Jul 2018 B1
10034630 Lee Jul 2018 B2
10073590 Dascola et al. Sep 2018 B2
10080528 DeBusschere et al. Sep 2018 B2
10082950 Lapp Sep 2018 B2
10088908 Poupyrev et al. Oct 2018 B1
10139916 Poupyrev Nov 2018 B2
10155274 Robinson et al. Dec 2018 B2
10175781 Karagozler et al. Jan 2019 B2
10203763 Poupyrev et al. Feb 2019 B1
10222469 Gillian et al. Mar 2019 B1
10241581 Lien et al. Mar 2019 B2
10268321 Poupyrev Apr 2019 B2
10285456 Poupyrev et al. May 2019 B2
10300370 Amihood et al. May 2019 B1
10310620 Lien et al. Jun 2019 B2
10310621 Lien et al. Jun 2019 B1
10379621 Schwesig et al. Aug 2019 B2
10401490 Gillian et al. Sep 2019 B2
10409385 Poupyrev Sep 2019 B2
20010035836 Miceli et al. Nov 2001 A1
20020009972 Amento et al. Jan 2002 A1
20020080156 Abbott et al. Jun 2002 A1
20020170897 Hall Nov 2002 A1
20030005030 Sutton et al. Jan 2003 A1
20030071750 Benitz Apr 2003 A1
20030093000 Nishio et al. May 2003 A1
20030100228 Bungo et al. May 2003 A1
20030119391 Swallow et al. Jun 2003 A1
20030122677 Kail Jul 2003 A1
20040009729 Hill et al. Jan 2004 A1
20040102693 Jenkins May 2004 A1
20040249250 McGee et al. Dec 2004 A1
20040259391 Jung et al. Dec 2004 A1
20050069695 Jung et al. Mar 2005 A1
20050128124 Greneker et al. Jun 2005 A1
20050148876 Endoh et al. Jul 2005 A1
20050231419 Mitchell Oct 2005 A1
20050267366 Murashita et al. Dec 2005 A1
20060035554 Glaser et al. Feb 2006 A1
20060040739 Wells Feb 2006 A1
20060047386 Kanevsky et al. Mar 2006 A1
20060061504 Leach, Jr. et al. Mar 2006 A1
20060125803 Westerman et al. Jun 2006 A1
20060136997 Telek et al. Jun 2006 A1
20060139162 Flynn Jun 2006 A1
20060139314 Bell Jun 2006 A1
20060148351 Tao et al. Jul 2006 A1
20060157734 Onodero et al. Jul 2006 A1
20060166620 Sorensen Jul 2006 A1
20060170584 Romero et al. Aug 2006 A1
20060209021 Yoo et al. Sep 2006 A1
20060258205 Locher et al. Nov 2006 A1
20060284757 Zemany Dec 2006 A1
20070024488 Zemany et al. Feb 2007 A1
20070026695 Lee et al. Feb 2007 A1
20070027369 Pagnacco et al. Feb 2007 A1
20070118043 Oliver et al. May 2007 A1
20070161921 Rausch Jul 2007 A1
20070164896 Suzuki et al. Jul 2007 A1
20070176821 Flom et al. Aug 2007 A1
20070192647 Glaser Aug 2007 A1
20070197115 Eves et al. Aug 2007 A1
20070197878 Shklarski Aug 2007 A1
20070210074 Maurer et al. Sep 2007 A1
20070237423 Tico et al. Oct 2007 A1
20080001735 Tran Jan 2008 A1
20080002027 Kondo et al. Jan 2008 A1
20080015422 Wessel Jan 2008 A1
20080024438 Collins et al. Jan 2008 A1
20080039731 McCombie et al. Feb 2008 A1
20080059578 Albertson et al. Mar 2008 A1
20080065291 Breed Mar 2008 A1
20080134102 Movold et al. Jun 2008 A1
20080136775 Conant Jun 2008 A1
20080168396 Matas et al. Jul 2008 A1
20080194204 Duet et al. Aug 2008 A1
20080194975 MacQuarrie et al. Aug 2008 A1
20080211766 Westerman et al. Sep 2008 A1
20080233822 Swallow et al. Sep 2008 A1
20080278450 Lashina Nov 2008 A1
20080282665 Speleers Nov 2008 A1
20080291158 Park et al. Nov 2008 A1
20080303800 Elwell Dec 2008 A1
20080316085 Rofougaran et al. Dec 2008 A1
20080320419 Matas et al. Dec 2008 A1
20090018408 Ouchi et al. Jan 2009 A1
20090018428 Dias et al. Jan 2009 A1
20090033585 Lang Feb 2009 A1
20090053950 Surve Feb 2009 A1
20090056300 Chung et al. Mar 2009 A1
20090058820 Hinckley Mar 2009 A1
20090113298 Jung et al. Apr 2009 A1
20090115617 Sano et al. May 2009 A1
20090118648 Kandori et al. May 2009 A1
20090149036 Lee et al. Jun 2009 A1
20090177068 Stivoric et al. Jul 2009 A1
20090203244 Toonder Aug 2009 A1
20090226043 Angell et al. Sep 2009 A1
20090253585 Diatchenko et al. Oct 2009 A1
20090270690 Roos et al. Oct 2009 A1
20090278915 Kramer et al. Nov 2009 A1
20090288762 Wolfel Nov 2009 A1
20090295712 Ritzau Dec 2009 A1
20090319181 Khosravy et al. Dec 2009 A1
20100045513 Pett et al. Feb 2010 A1
20100050133 Nishihara et al. Feb 2010 A1
20100053151 Marti et al. Mar 2010 A1
20100060570 Underkoffler et al. Mar 2010 A1
20100065320 Urano Mar 2010 A1
20100069730 Bergstrom et al. Mar 2010 A1
20100071205 Graumann et al. Mar 2010 A1
20100094141 Puswella Apr 2010 A1
20100109938 Oswald et al. May 2010 A1
20100152600 Droitcour et al. Jun 2010 A1
20100179820 Harrison et al. Jul 2010 A1
20100198067 Mahfouz et al. Aug 2010 A1
20100201586 Michalk Aug 2010 A1
20100204550 Heneghan et al. Aug 2010 A1
20100205667 Anderson et al. Aug 2010 A1
20100208035 Pinault et al. Aug 2010 A1
20100225562 Smith Sep 2010 A1
20100234094 Gagner et al. Sep 2010 A1
20100241009 Petkie Sep 2010 A1
20100002912 Solinsky Oct 2010 A1
20100281438 Latta et al. Nov 2010 A1
20100292549 Schuler Nov 2010 A1
20100306713 Geisner et al. Dec 2010 A1
20100313414 Sheats Dec 2010 A1
20100324384 Moon et al. Dec 2010 A1
20100325770 Chung et al. Dec 2010 A1
20110003664 Richard Jan 2011 A1
20110010014 Oexman et al. Jan 2011 A1
20110018795 Jang Jan 2011 A1
20110029038 Hyde et al. Feb 2011 A1
20110073353 Lee et al. Mar 2011 A1
20110083111 Forutanpour et al. Apr 2011 A1
20110093820 Zhang et al. Apr 2011 A1
20110118564 Sankai May 2011 A1
20110119640 Berkes et al. May 2011 A1
20110166940 Bangera et al. Jul 2011 A1
20110181509 Rautiainen et al. Jul 2011 A1
20110181510 Hakala et al. Jul 2011 A1
20110193939 Vassigh et al. Aug 2011 A1
20110197263 Stinson, III Aug 2011 A1
20110202404 van der Riet Aug 2011 A1
20110213218 Weiner et al. Sep 2011 A1
20110221666 Newton et al. Sep 2011 A1
20110234492 Ajmera et al. Sep 2011 A1
20110239118 Yamaoka et al. Sep 2011 A1
20110245688 Arora et al. Oct 2011 A1
20110279303 Smith et al. Nov 2011 A1
20110286585 Hodge Nov 2011 A1
20110303341 Meiss et al. Dec 2011 A1
20110307842 Chiang et al. Dec 2011 A1
20110316888 Sachs et al. Dec 2011 A1
20110318985 McDermid Dec 2011 A1
20120001875 Li et al. Jan 2012 A1
20120019168 Noda et al. Jan 2012 A1
20120029369 Icove et al. Feb 2012 A1
20120047468 Santos et al. Feb 2012 A1
20120068876 Bangera et al. Mar 2012 A1
20120092284 Rofougaran et al. Apr 2012 A1
20120123232 Najarian et al. May 2012 A1
20120127082 Kushler et al. May 2012 A1
20120144934 Russell et al. Jun 2012 A1
20120150493 Casey et al. Jun 2012 A1
20120154313 Au et al. Jun 2012 A1
20120156926 Kato et al. Jun 2012 A1
20120174299 Balzano Jul 2012 A1
20120174736 Wang et al. Jul 2012 A1
20120182222 Moloney Jul 2012 A1
20120193801 Gross et al. Aug 2012 A1
20120220835 Chung Aug 2012 A1
20120248093 Ulrich et al. Oct 2012 A1
20120254810 Heck et al. Oct 2012 A1
20120268416 Pirogov et al. Oct 2012 A1
20120280900 Wang et al. Nov 2012 A1
20120298748 Factor et al. Nov 2012 A1
20120310665 Xu et al. Dec 2012 A1
20130016070 Starner et al. Jan 2013 A1
20130027218 Schwarz et al. Jan 2013 A1
20130035563 Angelides Feb 2013 A1
20130046544 Kay et al. Feb 2013 A1
20130053653 Cuddihy et al. Feb 2013 A1
20130078624 Holmes et al. Mar 2013 A1
20130082922 Miller Apr 2013 A1
20130083173 Geisner et al. Apr 2013 A1
20130086533 Stienstra Apr 2013 A1
20130096439 Lee et al. Apr 2013 A1
20130102217 Jeon Apr 2013 A1
20130104084 Mlyniec et al. Apr 2013 A1
20130113647 Sentelle et al. May 2013 A1
20130113830 Suzuki May 2013 A1
20130117377 Miller May 2013 A1
20130132931 Bruns et al. May 2013 A1
20130147833 Aubauer et al. Jun 2013 A1
20130150735 Cheng Jun 2013 A1
20130161078 Li Jun 2013 A1
20130169471 Lynch Jul 2013 A1
20130176161 Derham et al. Jul 2013 A1
20130194173 Zhu et al. Aug 2013 A1
20130195330 Kim et al. Aug 2013 A1
20130196716 Muhammad Aug 2013 A1
20130207962 Oberdorfer et al. Aug 2013 A1
20130229508 Li et al. Sep 2013 A1
20130241765 Kozma et al. Sep 2013 A1
20130245986 Grokop et al. Sep 2013 A1
20130253029 Jain et al. Sep 2013 A1
20130260630 Ito et al. Oct 2013 A1
20130278499 Anderson Oct 2013 A1
20130278501 Bulzacki Oct 2013 A1
20130281024 Rofougaran et al. Oct 2013 A1
20130283203 Batraski et al. Oct 2013 A1
20130322729 Mestha et al. Dec 2013 A1
20130332438 Li et al. Dec 2013 A1
20130345569 Mestha et al. Dec 2013 A1
20140005809 Frei et al. Jan 2014 A1
20140022108 Alberth et al. Jan 2014 A1
20140028539 Newham et al. Jan 2014 A1
20140049487 Konertz et al. Feb 2014 A1
20140050354 Heim et al. Feb 2014 A1
20140051941 Messerschmidt Feb 2014 A1
20140070957 Longinotti-Buitoni et al. Mar 2014 A1
20140072190 Wu et al. Mar 2014 A1
20140073486 Ahmed et al. Mar 2014 A1
20140073969 Zou et al. Mar 2014 A1
20140081100 Muhsin et al. Mar 2014 A1
20140095480 Marantz et al. Apr 2014 A1
20140097979 Nohara et al. Apr 2014 A1
20140121540 Raskin May 2014 A1
20140135631 Brumback et al. May 2014 A1
20140139422 Mistry et al. May 2014 A1
20140139616 Pinter et al. May 2014 A1
20140143678 Mistry et al. May 2014 A1
20140149859 Van Dyken et al. May 2014 A1
20140184496 Gribetz et al. Jul 2014 A1
20140184499 Kim Jul 2014 A1
20140191939 Penn et al. Jul 2014 A1
20140200416 Kashef et al. Jul 2014 A1
20140201690 Holz Jul 2014 A1
20140208275 Mongia et al. Jul 2014 A1
20140215389 Ivalsh et al. Jul 2014 A1
20140239065 Zhou et al. Aug 2014 A1
20140244277 Krishna Rao et al. Aug 2014 A1
20140246415 Wittkowski Sep 2014 A1
20140247212 Kim et al. Sep 2014 A1
20140250515 Jakobsson Sep 2014 A1
20140253431 Gossweiler et al. Sep 2014 A1
20140253709 Bresch et al. Sep 2014 A1
20140262478 Harris et al. Sep 2014 A1
20140275854 Venkatraman et al. Sep 2014 A1
20140280295 Kurochikin et al. Sep 2014 A1
20140281975 Anderson et al. Sep 2014 A1
20140282877 Mahaffey et al. Sep 2014 A1
20140297006 Sadhu Oct 2014 A1
20140298266 Lapp Oct 2014 A1
20140300506 Alton et al. Oct 2014 A1
20140306936 Dahl et al. Oct 2014 A1
20140309855 Tran Oct 2014 A1
20140316261 Lux et al. Oct 2014 A1
20140318699 Longinotti-Buitoni et al. Oct 2014 A1
20140324888 Xie et al. Oct 2014 A1
20140329567 Chan et al. Nov 2014 A1
20140333467 Inomata Nov 2014 A1
20140343392 Yang Nov 2014 A1
20140347295 Kim et al. Nov 2014 A1
20140357369 Callens et al. Dec 2014 A1
20140368378 Crain et al. Dec 2014 A1
20140368441 Touloumtzis Dec 2014 A1
20140376788 Xu et al. Dec 2014 A1
20150002391 Chen Jan 2015 A1
20150009096 Lee et al. Jan 2015 A1
20150026815 Barrett Jan 2015 A1
20150029050 Driscoll et al. Jan 2015 A1
20150030256 Brady et al. Jan 2015 A1
20150040040 Balan et al. Feb 2015 A1
20150046183 Cireddu Feb 2015 A1
20150062033 Ishihara Mar 2015 A1
20150068069 Tran et al. Mar 2015 A1
20150077282 Mohamadi Mar 2015 A1
20150085060 Fish et al. Mar 2015 A1
20150091820 Rosenberg et al. Apr 2015 A1
20150091858 Rosenberg et al. Apr 2015 A1
20150091859 Rosenberg et al. Apr 2015 A1
20150091903 Costello et al. Apr 2015 A1
20150095987 Potash et al. Apr 2015 A1
20150099941 Tran Apr 2015 A1
20150100328 Kress et al. Apr 2015 A1
20150106770 Shah et al. Apr 2015 A1
20150109164 Takaki Apr 2015 A1
20150112606 He et al. Apr 2015 A1
20150133017 Liao et al. May 2015 A1
20150143601 Longinotti-Buitoni et al. May 2015 A1
20150145805 Liu May 2015 A1
20150162729 Reversat et al. Jun 2015 A1
20150177866 Hwang et al. Jun 2015 A1
20150185314 Corcos et al. Jul 2015 A1
20150199045 Robucci et al. Jul 2015 A1
20150205358 Lyren Jul 2015 A1
20150223733 Al-Alusi Aug 2015 A1
20150226004 Thompson Aug 2015 A1
20150229885 Offenhaeuser Aug 2015 A1
20150256763 Niemi Sep 2015 A1
20150261320 Leto Sep 2015 A1
20150268027 Gerdes Sep 2015 A1
20150268799 Starner et al. Sep 2015 A1
20150277569 Sprenger et al. Oct 2015 A1
20150280102 Tajitsu et al. Oct 2015 A1
20150285906 Hooper et al. Oct 2015 A1
20150287187 Redtel Oct 2015 A1
20150301167 Sentelle et al. Oct 2015 A1
20150312041 Choi Oct 2015 A1
20150314780 Stenneth Nov 2015 A1
20150317518 Fujimaki et al. Nov 2015 A1
20150323993 Levesque et al. Nov 2015 A1
20150332075 Burch Nov 2015 A1
20150341550 Lay Nov 2015 A1
20150346820 Poupyrev et al. Dec 2015 A1
20150350902 Baxley et al. Dec 2015 A1
20150351703 Phillips et al. Dec 2015 A1
20150375339 Sterling et al. Dec 2015 A1
20160018948 Parvarandeh et al. Jan 2016 A1
20160026253 Bradski et al. Jan 2016 A1
20160038083 Ding et al. Feb 2016 A1
20160041617 Poupyrev Feb 2016 A1
20160041618 Poupyrev Feb 2016 A1
20160042169 Polehn Feb 2016 A1
20160048235 Poupyrev Feb 2016 A1
20160048236 Poupyrev Feb 2016 A1
20160048672 Lux et al. Feb 2016 A1
20160054792 Poupyrev Feb 2016 A1
20160054803 Poupyrev Feb 2016 A1
20160054804 Gollakata et al. Feb 2016 A1
20160055201 Poupyrev et al. Feb 2016 A1
20160090839 Stolarcyzk Mar 2016 A1
20160098089 Poupyrev Apr 2016 A1
20160100166 Dragne et al. Apr 2016 A1
20160103500 Hussey et al. Apr 2016 A1
20160106328 Mestha et al. Apr 2016 A1
20160131741 Park May 2016 A1
20160140872 Palmer et al. May 2016 A1
20160145776 Roh May 2016 A1
20160146931 Rao et al. May 2016 A1
20160170491 Jung Jun 2016 A1
20160171293 Li et al. Jun 2016 A1
20160186366 McMaster Jun 2016 A1
20160206244 Rogers Jul 2016 A1
20160213331 Gil et al. Jul 2016 A1
20160216825 Forutanpour Jul 2016 A1
20160220152 Meriheina et al. Aug 2016 A1
20160249698 Berzowska et al. Sep 2016 A1
20160252607 Saboo et al. Sep 2016 A1
20160252965 Mandella et al. Sep 2016 A1
20160253044 Katz Sep 2016 A1
20160259037 Molchanov et al. Sep 2016 A1
20160262685 Wagner et al. Sep 2016 A1
20160282988 Poupyrev Sep 2016 A1
20160283101 Schwesig et al. Sep 2016 A1
20160284436 Fukuhara et al. Sep 2016 A1
20160287172 Morris et al. Oct 2016 A1
20160299526 Inagaki et al. Oct 2016 A1
20160320852 Poupyrev Nov 2016 A1
20160320853 Lien et al. Nov 2016 A1
20160320854 Lien et al. Nov 2016 A1
20160321428 Rogers Nov 2016 A1
20160338599 DeBusschere et al. Nov 2016 A1
20160345638 Robinson et al. Dec 2016 A1
20160349790 Connor Dec 2016 A1
20160349845 Poupyrev et al. Dec 2016 A1
20160377712 Wu et al. Dec 2016 A1
20170029985 Tajitsu et al. Feb 2017 A1
20170052618 Lee Feb 2017 A1
20170060254 Molchanov et al. Mar 2017 A1
20170060298 Hwang et al. Mar 2017 A1
20170075481 Chou et al. Mar 2017 A1
20170075496 Rosenberg et al. Mar 2017 A1
20170097413 Gillian et al. Apr 2017 A1
20170097684 Lien Apr 2017 A1
20170115777 Poupyrev Apr 2017 A1
20170124407 Micks May 2017 A1
20170125940 Karagozler et al. May 2017 A1
20170192523 Poupyrev Jul 2017 A1
20170192629 Takada et al. Jul 2017 A1
20170196513 Longinotti-Buitoni et al. Jul 2017 A1
20170232538 Robinson et al. Aug 2017 A1
20170233903 Jeon Aug 2017 A1
20170249033 Podhajny et al. Aug 2017 A1
20170322633 Shen et al. Nov 2017 A1
20170325337 Karagozler et al. Nov 2017 A1
20170325518 Poupyrev et al. Nov 2017 A1
20170329412 Schwesig et al. Nov 2017 A1
20170329425 Karagozler et al. Nov 2017 A1
20180000354 DeBusschere et al. Jan 2018 A1
20180000355 DeBusschere et al. Jan 2018 A1
20180004301 Poupyrev Jan 2018 A1
20180005766 Fairbanks et al. Jan 2018 A1
20180046258 Poupyrev Feb 2018 A1
20180095541 Gribetz et al. Apr 2018 A1
20180106897 Shouldice et al. Apr 2018 A1
20180113032 Dickey et al. Apr 2018 A1
20180157330 Gu et al. Jun 2018 A1
20180160943 Fyfe et al. Jun 2018 A1
20180177464 DeBusschere et al. Jun 2018 A1
20180196527 Poupyrev et al. Jul 2018 A1
20180256106 Rogers et al. Sep 2018 A1
20180296163 DeBusschere et al. Oct 2018 A1
20180321841 Lapp Nov 2018 A1
20190033981 Poupyrev Jan 2019 A1
20190138109 Poupyrev et al. May 2019 A1
20190155396 Lien et al. May 2019 A1
20190208837 Poupyrev et al. Jul 2019 A1
20190232156 Amihood et al. Aug 2019 A1
20190243464 Lien et al. Aug 2019 A1
20190257939 Schwesig et al. Aug 2019 A1
Foreign Referenced Citations (83)
Number Date Country
1462382 Dec 2003 CN
101751126 Jun 2010 CN
102414641 Apr 2012 CN
102782612 Nov 2012 CN
102893327 Jan 2013 CN
202887794 Apr 2013 CN
103076911 May 2013 CN
103502911 Jan 2014 CN
102660988 Mar 2014 CN
104035552 Sep 2014 CN
103355860 Jan 2016 CN
102011075725 Nov 2012 DE
102013201359 Jul 2014 DE
0161895 Nov 1985 EP
1785744 May 2007 EP
1815788 Aug 2007 EP
2417908 Feb 2012 EP
2637081 Sep 2013 EP
2770408 Aug 2014 EP
2953007 Dec 2015 EP
3201726 Aug 2017 EP
3017722 Aug 2015 FR
2070469 Sep 1981 GB
2443208 Apr 2008 GB
113860 Apr 1999 JP
11168268 Jun 1999 JP
2003280049 Oct 2003 JP
2006234716 Sep 2006 JP
2007011873 Jan 2007 JP
2007132768 May 2007 JP
2008287714 Nov 2008 JP
2009037434 Feb 2009 JP
2011102457 May 2011 JP
201218583 Sep 2012 JP
2012198916 Oct 2012 JP
2013196047 Sep 2013 JP
2014532332 Dec 2014 JP
1020080102516 Nov 2008 KR
100987650 Oct 2010 KR
1020140055985 May 2014 KR
101914850 Oct 2018 KR
201425974 Jul 2014 TW
9001895 Mar 1990 WO
WO-0130123 Apr 2001 WO
WO-2001027855 Apr 2001 WO
WO-0175778 Oct 2001 WO
WO-2002082999 Oct 2002 WO
2004004557 Jan 2004 WO
WO-2005033387 Apr 2005 WO
2007125298 Nov 2007 WO
WO-2008061385 May 2008 WO
WO-2009032073 Mar 2009 WO
2009083467 Jul 2009 WO
WO-2010032173 Mar 2010 WO
2010101697 Sep 2010 WO
WO-2012026013 Mar 2012 WO
2012064847 May 2012 WO
WO-2012152476 Nov 2012 WO
WO-2013082806 Jun 2013 WO
WO-2013084108 Jun 2013 WO
2013192166 Dec 2013 WO
WO-2013186696 Dec 2013 WO
WO-2013191657 Dec 2013 WO
WO-2014019085 Feb 2014 WO
2014085369 Jun 2014 WO
WO-2014116968 Jul 2014 WO
2014124520 Aug 2014 WO
WO-2014136027 Sep 2014 WO
WO-2014138280 Sep 2014 WO
WO-2014160893 Oct 2014 WO
WO-2014165476 Oct 2014 WO
WO-2014204323 Dec 2014 WO
WO-2015017931 Feb 2015 WO
WO-2015022671 Feb 2015 WO
2016053624 Apr 2016 WO
2016118534 Jul 2016 WO
2016176471 Nov 2016 WO
2016178797 Nov 2016 WO
2017019299 Feb 2017 WO
2017062566 Apr 2017 WO
2017200571 Nov 2017 WO
20170200949 Nov 2017 WO
2018106306 Jun 2018 WO
Non-Patent Literature Citations (297)
Entry
“Final Office Action”, U.S. Appl. No. 14/518,863, dated Apr. 5, 2018, 21 pages.
“Final Office Action”, U.S. Appl. No. 14/504,139, dated May 1, 2018, 14 pages.
“Final Office Action”, U.S. Appl. No. 15/595,649, dated May 23, 2018, 13 pages.
“Final Office Action”, U.S. Appl. No. 15/142,689, dated Jun. 1, 2018, 16 pages.
“Final Office Action”, U.S. Appl. No. 14/874,955, dated Jun. 11, 2018, 9 pages.
“Final Office Action”, U.S. Appl. No. 14/959,901, dated Jun. 15, 2018, 21 pages.
“Final Office Action”, U.S. Appl. No. 15/286,152, dated Jun. 26, 2018, 25 pages.
“Final Office Action”, U.S. Appl. No. 15/267,181, dated Jun. 7, 2018, 31 pages.
“First Action Interview Office Action”, U.S. Appl. No. 15/166,198, dated Apr. 25, 2018, 8 pages.
“Foreign Office Action”, European Application No. 16784352.3, dated May 16, 2018, 3 pages.
“Foreign Office Action”, Chinese Application No. 201721290290.3, dated Jun. 6, 2018, 3 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/287,253, dated Apr. 5, 2018, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/586,174, dated Jun. 18, 2018, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 14/862,409, dated Jun. 6, 2018, 7 pages.
“Pre-Interview Communication”, U.S. Appl. No. 15/362,359, dated May 17,2018, 4 pages.
“Preliminary Report on Patentability”, PCT Application No. PCT/US2016/055671, dated Apr. 10, 2018, 9 pages.
“Written Opinion”, PCT Application No. PCT/US2017/032733, dated Jul. 24, 2017, 5 pages.
“Advisory Action”, U.S. Appl. No. 14/504,139, dated Aug. 28, 2017, 3 pages.
“Final Office Action”, U.S. Appl. No. 14/959,901, dated Aug. 25, 2017, 19 pages.
“Final Office Action”, U.S. Appl. No. 15/403,066, dated Oct. 5, 2017, 31 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/093,533, dated Aug. 24, 2017, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/142,619, dated Aug. 25, 2017, 16 pages.
Non-Final Office Action, U.S. Appl. No. 14/959,799, dated Sep. 8, 2017, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/398,147, dated Sep. 8, 2017, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/518,863, dated Sep. 29, 2017, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/142,689, dated Oct. 4, 2017, 18 pages.
“Pre-Interview Office Action”, U.S. Appl. No. 14/862,409, dated Sep. 15, 2017, 16 pages.
“Written Opinion”, PCT Application No. PCT/US2016/055671, dated Apr. 13, 2017, 8 pages.
“Written Opinion”, PCT Application No. PCT/US2017/032733, dated Jul. 26, 2017, 5 pages.
“Final Office Action”, U.S. Appl. No. 14/504,061, dated Mar. 9, 2016, 10 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/044774, dated Nov. 3, 2015, 12 pages.
“Extended European Search Report”, EP Application No. 15170577.9, dated Nov. 5, 2015, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/024267, dated Jun. 20, 2016, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/024273, dated Jun. 20, 2016, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/032307, dated Aug. 25, 2016, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/029820, dated Jul. 15, 2016, 14 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/030177, dated Aug. 2, 2016, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/930,220, dated Sep. 14, 2016, 15 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/043963, dated Nov. 24, 2015, 16 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/050903, dated Feb. 19, 2016, 18 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/030115, dated Aug. 8, 2016, 18 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/043949, dated Dec. 1, 2015, 18 pages.
“Frogpad Introduces Wearable Fabric Keyboard with Bluetooth Technology”, Retrieved From: <http://www.geekzone.co.nz/content.asp?contentid=3898> Mar. 16, 2015, Jan. 7, 2005, 2 pages.
“Philips Vital Signs Camera”, Retrieved From: <http://www.vitalsignscamera.com/> Apr. 15, 2015, Jul. 17, 2013, 2 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/504,038, dated Feb. 26, 2016, 22 pages.
“Final Office Action”, U.S. Appl. No. 14/504,038, dated Sep. 27, 2016, 23 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/312,486, dated Oct. 23, 2015, 25 pages.
“Final Office Action”, U.S. Appl. No. 14/312,486, dated Jun. 3, 2016, 32 pages.
“Restriction Requirement”, U.S. Appl. No. 14/666,155, dated Jul. 22, 2016, 5 pages.
“The Instant Blood Pressure app estimates blood pressure with your smartphone and our algorithm”, Retrieved at: http://www.instantbloodpressure.com/—on Jun. 23, 2016, 6 pages.
“Cardiio”, Retrieved From: <http://www.cardiio.com/> Apr. 15, 2015 App Information Retrieved From: <https://itunes.apple.com/us/app/cardiio-touchless-camera-pulse/id542891434?ls=1&mt=8> Apr. 15, 2015, Feb. 24, 2015, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 14/504,061, dated Sep. 12, 2016, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/504,061, dated Nov. 4, 2015, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/582,896, dated Jun. 29, 2016, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/681,625, dated Aug. 12, 2016, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/666,155, dated Aug. 24, 2016, 9 pages.
Arbabian,“A 94GHz mm-Wave to Baseband Pulsed-Radar for Imaging and Gesture Recognition”, 2012 IEEE, 2012 Symposium on VLSI Circuits Digest of Technical Papers, 2012, 2 pages.
Balakrishnan,“Detecting Pulse from Head Motions in Video”, In Proceedings: CVPR '13 Proceedings of the 2013 IEEE Conference on Computer Vision and Pattern Recognition Available at: <http://people.csail.mitedu/mrub/vidmag/papers/Balakrishnan_Detecting_Pulse_from_2013_CVPR_paper.pdf>, Jun. 23, 2013, 8 pages.
Couderc,“Detection of Atrial Fibrillation using Contactless Facial Video Monitoring”, In Proceedings: Heart Rhythm Society, vol. 12, Issue 1 Available at: <http://www.heartrhythmjournal.com/article/S1547-5271(14)00924-2/pdf>, Jan. 2015, 7 pages.
Espina,“Wireless Body Sensor Network for Continuous Cuff-less Blood Pressure Monitoring”, International Summer School on Medical Devices and Biosensors, 2006, Sep. 2006, 5 pages.
Godana,“Human Movement Characterization in Indoor Environment using GNU Radio Based Radar”, Retrieved at: http://repository.tudelft.nl/islandora/object/uuid:414e1868-dd00-4113-9989-4c213f1f7094?collection=education, Nov. 30, 2009, 100 pages.
He,“A Continuous, Wearable, and Wireless Heart Monitor Using Head Ballistocardiogram (BCG) and Head Electrocardiogram (EEC) with a Nanowatt ECG Heartbeat Detection Circuit”, In Proceedings: Thesis, Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology Available at: <http://dspace.mit.edu/handle/1721.1/79221>, Feb. 2013, 137 pages.
Holleis,“Evaluating Capacitive Touch Input on Clothes”, Proceedings of the 10th International Conference on Human Computer Interaction, Jan. 1, 2008, 10 pages.
Nakajima,“Development of Real-Time Image Sequence Analysis for Evaluating Posture Change and Respiratory Rate of a Subject in Bed”, In Proceedings: Physiological Measurement, vol. 22, No. 3 Retrieved From: <http://iopscience.iop.org/0967-3334/22/3/401/pdf/0967-3334_22_3 _401.pdf> Feb. 27, 2015, Aug. 2001, 8 pages.
Patel,“Applications of Electrically Conductive Yarns in Technical Textiles”, International Conference on Power System Technology (POWECON), Oct. 30, 2012, 6 pages.
Poh,“A Medical Mirror for Non-contact Health Monitoring”, In Proceedings: ACM SIGGRPH Emerging Technologies Available at: <http://affect.media.mit.edu/pdfs/11.Poh-etal-SIGGRAPH.pdf>, 2011, 1 page.
Poh,“Non-contact, Automated Cardiac Pulse Measurements Using Video Imaging and Blind Source Separation.”, In Proceedings: Optics Express, vol. 18, No. 10 Available at: <http://www.opticsinfobase.org/view_article.cfm?gotourl=http%3A%2F%2Fwww%2Eopticsinfobase%2Eorg%2FDirectPDFAccess%2F77B04D55%2DBC95%2D6937%2D5BAC49A426378C02%5F199381%2Foe%2D18%2D10%2D10762%2EP, May 7, 2010, 13 pages.
Pu,“Whole-Home Gesture Recognition Using Wireless Signals”, MobiCom '13 Proceedings of the 19th annual international conference on Mobile computing & networking, Aug. 27, 2013, 12 pages.
Wang,“Exploiting Spatial Redundancy of Image Sensor for Motion Robust rPPG”, In Proceedings: IEEE Transactions on Biomedical Engineering, vol. 62, Issue 2, Jan. 19, 2015, 11 pages.
Wang,“Micro-Doppler Signatures for Intelligent Human Gait Recognition Using a UWB Impulse Radar”, 2011 IEEE International Symposium on Antennas and Propagation (APSURSI), Jul. 3, 2011, pp. 2103-2106.
Wijesiriwardana,“Capacitive Fibre-Meshed Transducer for Touch & Proximity Sensing Applications”, IEEE Sensors Journal, IEEE Service Center, Oct. 1, 2005, 5 pages.
Zhadobov,“Millimeter-wave Interactions with the Human Body: State of Knowledge and Recent Advances”, International Journal of Microwave and Wireless Technologies, Mar. 1, 2011, 11 pages.
Zhang,“Study of the Structural Design and Capacitance Characteristics of Fabric Sensor”, Advanced Materials Research (vols. 194-196), Feb. 21, 2011, 8 pages.
“Combined Search and Examination Report”, GB Application No. 1620892.8, dated Apr. 6, 2017, 5 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/930,220, dated Mar. 20, 2017, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/930,220, dated May 11, 2017, 2 pages.
“Final Office Action”, U.S. Appl. No. 14/518,863, dated May 5, 2017, 18 pages.
“First Action Interview Office Action”, U.S. Appl. No. 14/959,901, dated Apr. 14, 2017, 3 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/050903, dated Apr. 13, 2017, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/060399, dated Jan. 30, 2017, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/504,038, dated Mar. 22, 2017, 33 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/398,147, dated Mar. 9, 2017, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/403,066, dated May 4, 2017, 31 pages.
“Notice of Allowance”, U.S. Appl. No. 14/494,863, dated May 30, 2017, 7 pages.
“Pre-Interview Communication”, U.S. Appl. No. 15/343,067, dated Apr. 19, 2017, 3 pages.
“Textile Wire Brochure”, Retrieved at: http://www.textile-wire.ch/en/home.html, Aug. 7, 2004, 17 pages.
Stoppa,“Wearable Electronics and Smart Textiles: A Critical Review”, In Proceedings of Sensors, vol. 14, Issue 7, Jul. 7, 2014, pp. 11957-11992.
“Combined Search and Examination Report”, GB Application No. 1620891.0, dated May 31, 2017, 9 pages.
“Final Office Action”, U.S. Appl. No. 15/398,147, dated Jun. 30, 2017, 11 pages.
“Final Office Action”, U.S. Appl. No. 14/874,955, dated Jun. 30, 2017, 9 pages.
“Final Office Action”, U.S. Appl. No. 14/959,799, dated Jul. 19, 2017, 12 pages.
“Final Office Action”, U.S. Appl. No. 14/504,121, dated Aug. 8, 2017, 16 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/063874, dated May 11, 2017, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/862,409, dated Jun. 22, 2017, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/959,730, dated Jun. 23, 2017, 14 pages.
“Notice of Allowance”, U.S. Appl. No. 14/513,875, dated Jun. 28, 2017, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 15/343,067, dated Jul. 27, 2017, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 14/504,038, dated Aug. 7, 2017, 17 pages.
“Final Office Action”, U.S. Appl. No. 15/142,619, dated Feb. 8, 2018, 15 pages.
“Final Office Action”, U.S. Appl. No. 15/093,533, dated Mar. 21, 2018, 19 pages.
“First Action Interview Office Action”, U.S. Appl. No. 15/286,152, dated Mar. 1, 2018, 5 pages.
“Foreign Office Action”, Chinese Application No. 201721290290.3, dated Mar. 9, 2018, 2 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/267,181, dated Feb. 8, 2018, 29 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/874,955, dated Feb. 8, 2018, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 14/959,730, dated Feb. 22, 2018, 8 pages.
“Pre-Interview Communication”, U.S. Appl. No. 15/166,198, dated Mar. 8, 2018, 8 pages.
“Pre-Interview First Office Action”, U.S. Appl. No. 15/286,152, dated Feb. 8, 2018, 4 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/504,061, dated Dec. 27, 2016, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/582,896, dated Dec. 19, 2016, 2 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/024289, dated Aug. 25, 2016, 17 pages.
Cheng,“Smart Textiles: From Niche to Mainstream”, IEEE Pervasive Computing, Jul. 2013, pp. 81-84.
Farringdon,“Wearable Sensor Badge & Sensor Jacket for Context Awareness”, Third International Symposium on Wearable Computers, Oct. 1999, 7 pages.
Schneegass,“Towards a Garment OS: Supporting Application Development for Smart Garments”, Wearable Computers, ACM, Sep. 2014, 6 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/312,486, dated Jan. 23, 2017, 4 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/582,896, dated Feb. 6, 2017, 2 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/582,896, dated Feb. 23, 2017, 2 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/043963, dated Feb. 16, 2017, 12 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/030388, dated Dec. 15, 2016, 12 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/043949, dated Feb. 16, 2017, 13 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/044774, dated Mar. 2, 2017, 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/062082, dated Feb. 23, 2017, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/055671, dated Dec. 1, 2016, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/504,121, dated Jan. 9, 2017, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/504,139, dated Jan. 27, 2017, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/513,875, dated Feb. 21, 2017, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/874,955, dated Feb. 27, 2017, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/959,799, dated Jan. 27, 2017, 10 pages.
“Notice of Allowance”, U.S. Appl. No. 14/930,220, dated Feb. 2, 2017, 8 pages.
“Pre-Interview Communication”, U.S. Appl. No. 14/494,863, dated Jan. 27, 2017, 5 pages.
“Pre-Interview Communication”, U.S. Appl. No. 14/959,730, dated Feb. 15, 2017, 3 pages.
“Pre-Interview Communication”, U.S. Appl. No. 14/959,901, dated Feb. 10, 2017, 3 pages.
“Final Office Action”, U.S. Appl. No. 14/959,799, dated Jan. 4, 2018, 17 pages.
“Final Office Action”, U.S. Appl. No. 14/959,730, dated Nov. 22, 2017, 16 pages.
“International Search Report and Written Opinion”, PCT/US20171047691, dated Nov. 16, 2017, 13.
“International Search Report and Written Opinion”, PCT Application No. PCT/US2017/051663, dated Nov. 29, 2017, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/504,121, dated Jan. 2, 2018, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/959,901, dated Jan. 8, 2018, 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/504,139, dated Oct. 18, 2017, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/595,649, dated Oct. 31, 2017, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/862,409, dated Dec. 14, 2017, 17 pages.
“Notice of Allowance”, U.S. Appl. No. 15/403,066, dated Jan. 8, 2018, 18 pages.
“Notice of Allowance”, U.S. Appl. No. 14/874,955, dated Oct. 20. 2017, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 15/398,147, dated Nov. 15, 2017, 8 pages.
“Notice of Publication”, U.S. Appl. No. 15/703,511, dated Jan. 4, 2018, 1 page.
“Restriction Requirement”, U.S. Appl. No. 15/362,359, dated Jan. 8, 2018, 5 pages.
Bondade, et al., “A linear-assisted DC-DC hybrid power converter for envelope tracking RF power amplifiers”, 2014 IEEE Energy Conversion Congress and Exposition (ECCE), IEEE, Sep. 14, 2014, pp. 5769-5773, XP032680873, DOI: 10.1109/ECCE.2014.6954193, Sep. 14, 2014, 5 pages.
Fan, et al., “Wireless Hand Gesture Recognition Based on Continuous-Wave Doppler Radar Sensors”, IEEE Transactions on Microwave Theory and Techniques, Plenum, USA, vol. 64, No. 11, Nov. 1, 2016 (Nov. 1, 2016), pp. 4012-4012, XP011633246, ISSN: 0018-9480, DOI: 101109/TMTT.2016.2610427, Nov. 1, 2016, 9 pages.
Lien, et al., “Soli: Ubiquitous Gesture Sensing with Millimeter Wave Radar”, ACM Transactions on Graphics (TOG), ACM, Us, vol. 35, No. 4, Jul. 11, 2016 (Jul. 11, 2016), pp. 1-19, XP058275791, ISSN: 0730-0301, DOI: 10.1145/2897824.2925953, Jul. 11, 2016, 19 pages.
Martinez-Garcia, et al., “Four-quadrant linear-assisted DC/DC voltage regulator”, Analog Integrated Circuits and Signal Processing, Springer New York LLC, US, vol. 88, No. 1, Apr. 23, 2016 (Apr. 23, 2016) , pp. 151-160, XP035898949, ISSN: 0925-1030, DOI: 10.1007/S10470-016-0747-8, Apr. 23, 2016, 10 pages.
Skolnik, “CW and Frequency-Modulated Radar”, In: “Introduction to Radar Systems”, Jan. 1, 1981 (Jan. 1, 1981), McGraw Hill, XP055047545, ISBN: 978-0-07-057909-5 pp. 68-100, p. 95-p. 97, Jan. 1, 1981, 18 pages.
Zheng, et al., “Doppler Bio-Signal Detection Based Time-Domain Hand Gesture Recognition”, 2013 IEEE MTT-S International Microwave Workshop Series on RF and Wireless Technologies for Biomedical and Healthcare Applications (IMWS-BIO), IEEE, Dec. 9, 2013 (Dec. 9, 2013), p. 3, XP032574214, DOI: 10.1109/IMWS-BIO.2013.6756200, Dec. 9, 2013, 3 Pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 14/312,486, dated Oct. 28, 2016, 4 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/518,863, dated Oct. 14, 2016, 16 pages.
“Notice of Allowance”, U.S. Appl. No. 14/312,486, dated Oct. 7, 2016, 15 pages.
“Notice of Allowance”, U.S. Appl. No. 14/582,896, dated Nov. 7, 2016, 5 pages.
“Pre-Interview Communication”, U.S. Appl. No. 14/513,875, dated Oct. 21, 2016, 3 pages.
Pu,“Gesture Recognition Using Wireless Signals”, Oct. 2014, pp. 15-18.
“Corrected Notice of Allowance”, U.S. Appl. No. 15/362,359, dated Sep. 17, 2018, 10 pages.
“Final Office Action”, U.S. Appl. No. 14/504,121, dated Jul. 9, 2018, 23 pages.
“Final Office Action”, U.S. Appl. No. 15/166,198, dated Sep. 27, 2018, 33 pages.
“Foreign Office Action”, Japanese Application No. 2018-501256, dated Jul. 24, 2018, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/287,253, dated Jul. 7, 2018, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/504,139, dated Oct. 5, 2018, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/286,512, dated Jul. 19, 2018, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/142,829, dated Aug. 16, 2018, 15 pages.
“Notice of Allowance”, U.S. Appl. No. 15/362,359, dated Aug. 3, 2018, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 14/874,955, dated Oct. 4, 2018, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 15/142,619, dated Aug. 13, 2018, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 15/595,649, dated Sep. 14, 2018, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 15/586,174, dated Sep. 24, 2018, 5 pages.
“Pre-Interview Communication”, U.S. Appl. No. 15/286,495, dated Sep. 10, 2018, 4 pages.
“Pre-Interview Communication”, U.S. Appl. No. 15/287,359, dated Jul. 24, 2018, 2 pages.
“Restriction Requirement”, U.S. Appl. No. 15/286,537, dated Aug. 27, 2018, 8 pages.
“Apple Watch Used Four Sensors to Detect your Pulse”, retrieved from http://www.theverge.com/2014/9/9/6126991 / apple-watch-four-back-sensors-detect-activity on Sep. 23, 2017 as cited in PCT search report for PCT Application No. PCT/US2016/026756 dated Nov. 10, 2017; the Verge, paragraph 1, Sep. 9, 2014, 4 pages.
“Clever Toilet Checks on Your Health”, CNN.Com; Technology, Jun. 28, 2005, 2 pages.
“Final Office Action”, U.S. Appl. No. 14/681,625, dated Dec. 7, 2016, 10 pages.
“Final Office Action”, U.S. Appl. No. 14/731,195, dated Oct. 11, 2018, 12 pages.
“Final Office Action”, U.S. Appl. No. 14/715,454, dated Sep. 7, 2017, 14 pages.
“Final Office Action”, U.S. Appl. No. 15/286,512, dated Dec. 26, 2018, 15 pages.
“Final Office Action”, U.S. Appl. No. 14/720,632, dated Jan. 9, 2018, 18 pages.
“Final Office Action”, U.S. Appl. No. 14/715,454, dated Apr. 17, 2018, 19 pages.
“Final Office Action”, U.S. Appl. No. 15/287,308, dated Feb. 8, 2019, 23 pages.
“Final Office Action”, U.S. Appl. No. 14/599,954, dared Aug. 10, 2016, 23 pages.
“Final Office Action”, U.S. Appl. No. 14/699,181, dated May 4, 2018, 41 pages.
“Final Office Action”, U.S. Appl. No. 14/715,793, dated Sep. 12, 2017, 7 pages.
“Final Office Action”, U.S. Appl. No. 14/809,901, dated Dec. 13, 2018, 7 pages.
“First Action Interview OA”, U.S. Appl. No. 14/715,793, dated Jun. 21, 2017, 3 pages.
“First Action Interview Office Action”, U.S. Appl. No. 15/142,471, dated Feb. 5, 2019, 29 pages.
“First Action Interview Office Action”, U.S. Appl. No. 14/731,195, dated Jun. 21, 2018, 4 pages.
“First Action Interview Pilot Program Pre-Interview Communication”, U.S. Appl. No. 14/731,195, dated Aug. 1, 2017, 3 pages.
“First Examination Report”, GB Application No. 1621332.4, dated May 16, 2017, 7 pages.
“Foreign Office Action”, Chinese Application No. 201580034536.8, dated Oct. 9, 2018.
“Foreign Office Action”, KR Application No. 10-2016-7036023, dated Aug. 11, 2017, 10 pages.
“Foreign Office Action”, Chinese Application No. 201580036075.8, dated Jul. 4, 2018, 14 page.
“Foreign Office Action”, CN Application No. 201580034908.7, dated Jul. 3, 2018, 17 pages.
“Foreign Office Action”, JP App. No. 2016-567813, dated Jan. 16, 2018, 3 pages.
“Foreign Office Action”, Korean Application No. 10-2016-7036015, dated Oct. 15, 2018, 3 pages.
“Foreign Office Action”, Japanese Application No. 2016-567839, dated Apr. 3, 2018, 3 pages.
“Foreign Office Action”, KR Application No. 10-2016-7035397, dated Sep. 20, 2017, 5 pages.
“Foreign Office Action”, Korean Application No. 1020187012629, dated May 24, 2018, 6 pages.
“Foreign Office Action”, EP Application No. 15170577.9, dated May 30, 2017, 7 pages.
“Foreign Office Action”, Korean Application No. 10-2016-7036396, dated Jan. 3, 2018, 7 pages.
“Foreign Office Action”, JP Application No. 2016567813, dated Sep. 22, 2017, 8 pages.
“Foreign Office Action”, Japanese Application No. 2018021296, dated Dec. 25, 2018, 8 pages.
“Foreign Office Action”, EP Application No. 15754323.2, dated Mar. 9, 2018, 8 pages.
“Foreign Office Action—Needs Translation”, Japanese Application No. 2018501256, dated Feb. 26, 2019, 3 pages.
“International Preliminary Report on Patentability”, PCT Application No. PCT/US2017/032733, dated Nov. 29, 2018, 7 pages.
“International Preliminary Report on Patentability”, PCT Application No. PCT/US2016/026756, dated Oct. 19, 2017, 8 pages.
“International Search Report and Written Opinion”, PCT Application No. PCT/US2016/065295, dated Mar. 14, 2017, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/042013, dated Oct. 26, 2016, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/034366, dated Nov. 17, 2016, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2016/033342, dated Oct. 27, 2016, 20 pages.
“Life:X Lifestyle eXplorer”, Retrieved from <https://web.archive.org/web/20150318093841/http://research.microsoft.com/en-us/projects/lifex >, Feb. 3, 2017, 2 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/596,702, dated Jan. 4, 2019, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/286,837, dated Oct. 26, 2018, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/666,155, dated Feb. 3, 2017, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/809,901, dated May 24, 2018, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/720,632, dated Jun. 14, 2017, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/715,454, dated Jan. 11, 2018, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/599,954, dated Jan. 26, 2017, 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/599,954, dated Feb. 2, 2016, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/287,308, dated Oct. 15, 2018, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/286,537, dated Nov. 19, 2018, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/720,632, dated May 18, 2018, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/959,901, dated Oct. 11, 2018, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/286,152, dated Oct. 19, 2018, 27 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/699,181, dated Oct. 18, 2017, 33 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/166,198, dated Feb. 21, 2019, 48 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/681,625, dated Mar. 6, 2017, 7 pages.
“Non-Invasive Quantification of Peripheral Arterial Volume Distensibilitiy and its Non-Lineaer Relationship with Arterial Pressure”, Journal of Biomechanics, Pergamon Press, vol. 42, No. 8; as cited in the search report for PCT/US2016/013968 citing the whole document, but in particular the abstract, dated May 29, 2009, 2 pages.
“Notice of Allowance”, U.S. Appl. No. 14/599,954, dated May 24, 2017, 11 pages.
“Notice of Allowance”, U.S. Appl. No. 15/287,200, dated Nov. 6, 2018, 19 pages.
“Notice of Allowance”, U.S. Appl. No. 15/286,152, dated Mar. 5, 2019, 23 pages.
“Notice of Allowance”, U.S. Appl. No. 14/715,793, dated Jul. 6, 2018, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 15/286,495, dated Jan. 17, 2019, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 15/595,649, dated Jan. 3, 2019, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 14/715,793, dated Dec. 18, 2017, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 14/666,155, dated Feb. 20, 2018, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 14/666,155, dated Jul. 10, 2017, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 14/681,625, dated Jun. 7, 2017, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 14/681,625, dated Oct. 23, 2017, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 15/142,829, dated Feb. 6, 2019, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 15/142,689, dated Oct. 30, 2018, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 14/504,137, dated Feb. 6, 2019, 9 pages.
“Pre-Interview Communication”, U.S. Appl. No. 15/142,471, dated Dec. 12, 2018, 3 pages.
“Pre-Interview Communication”, U.S. Appl. No. 14/715,793, dated Mar. 20, 2017, 3 pages.
“Pre-Interview Communication”, U.S. Appl. No. 14/715,454, dated Apr. 14, 2017, 3 pages.
“Pre-Interview Communication”, U.S. Appl. No. 15/703,511, dated Feb. 22, 2019, 5 pages.
“Pre-Interview Office Action”, U.S. Appl. No. 14/731,195, dated Dec. 20, 2017, 4 pages.
“Preliminary Report on Patentability”, PCT Application No. PCT/US2016/065295, dated Jul. 24, 2018, 18 pages.
“Preliminary Report on Patentability”, PCT Application No. PCT/US2016/042013, dated Jan. 30, 2018, 7 pages.
“Preliminary Report on Patentability”, PCT Application No. PCT/US2016/032307, dated Dec. 7, 2017, 9 pages.
“Pressure-Volume Loop Analysis in Cardiology”, retrieved from https://en.wikipedia.org/w/index.php?t itle=Pressure-volume loop analysis in card iology&oldid=636928657 on Sep. 23, 2017; Obtained per link provided in search report from PCT/US2016/01398 dated Jul. 28, 2016, Dec. 6, 2014, 10 pages.
“Restriction Requirement”, U.S. Appl. No. 15/462,957, dated Jan. 4, 2019, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 15/352,194, dated Feb. 6, 2019, 8 pages.
“The Dash smart earbuds play back music, and monitor your workout”, Retrieved from < http://newatlas.com/bragi-dash-tracking-earbuds/30808/>, Feb. 13, 2014, 3 pages.
“Thermofocus No Touch Forehead Thermometer”, Technimed, Internet Archive. Dec. 24, 2014. https://web.archive.org/web/20141224070848/http://www.tecnimed.it:80/thermofocus-forehead-thermometer-H1N1-swine-flu.html, Dec. 24, 2018, 4 pages.
“Written Opinion”, PCT Application No. PCT/US2016/042013, dated Feb. 2, 2017, 6 pages.
“Written Opinion”, PCT Application No. PCT/US2016/026756, dated Nov. 10, 2016, 7 pages.
“Written Opinion”, PCT Application No. PCT/US2017/051663, dated Oct. 12, 2018, 8 pages.
“Written Opinion”, PCT Application No. PCT/US2016/065295, dated Apr. 13, 2018, 8 pages.
“Written Opinion”, PCT Application No. PCT/US2016/013968, dated Jul. 28, 2016, 9 pages.
Antonimuthu, “Google's Project Soli brings Gesture Control to Wearables using Radar”, YouTube [online], Available from https://www.youtube.com/watch?v=czJfcgvQcNA as accessed on May 9, 2017; see whole video, especially 6:05-6:35.
Duncan, David P., “Motion Compensation of Synthetic Aperture Radar”, Microwave Earth Remote Sensing Laboratory, Brigham Young University, Apr. 15, 2003, 5 pages.
Garmatyuk, Dmitriy S. et al., “Ultra-Wideband Continuous-Wave Random Noise Arc-SAR”, IEEE Transactions on Geoscience and Remote Sensing, vol. 40, No. 12, Dec. 2002, 10 pages.
Geisheimer, Jonathan L. et al., “A Continuous-Wave (CW) Radar for Gait Analysis”, IEEE, 2001, 5 pages.
Gürbüz, Sevgi Z. et al., “Detection and Identification of Human Targets in Radar Data”, Proc. SPIE 6567, Signal Processing, Sensor Fusion, and Target Recognition XVI, 656701, May 7, 2007, 12 pages.
Ishijima, Masa, “Unobtrusive Approaches to Monitoring Vital Signs at Home”, Medical & Biological Engineering and Computing, Springer, Berlin, DE, vol. 45, No. 11 as cited in search report for PCT/US2016/013968 dated Jul. 28, 2016, Sep. 25, 2007, 3 pages.
Klabunde, Richard E., “Ventricular Pressure-Volume Loop Changes in Valve Disease”, Retrieved From <https://web.archive.org/web/20101201185256/http://cvphysiology.com/Heart%20Disease/HD009.htm>, Dec. 1, 2010, 8 pages.
Kubota, Yusuke et al., “A Gesture Recognition Approach by using Microwave Doppler Sensors”, IPSJ SIG Technical Report, 2009 (6), Information Processing Society of Japan, Apr. 15, 2010, pp. 1-8, 13 pages.
Matthews, Robert J., “Venous Pulse”, Retrieved at: http://www.rjmatthewsmd.com/Definitions/venous_pulse.htm on Nov. 30, 2016, Apr. 13, 2013, 7 pages.
Otto, Chris et al., “System Architecture of a Wireless Body Area Sensor Network for Ubiquitous Health Monitoring”, Journal of Mobile Multimedia; vol. 1, No. 4, Jan. 10, 2006, 20 pages.
Palese, et al., “The Effects of Earphones and Music on the Temperature Measured by Infrared Tympanic Thermometer: Preliminary Results”, ORL-head and neck nursing: official journal of the Society of Otorhinolaryngology and Head-Neck Nurses 32.2, Jan. 1, 2013, pp. 8-12.
Pu, Qifan et al., “Whole-Home Gesture Recognition Using Wireless Signals”, MobiCom'13, Sep. 30-Oct. 4, Miami, FL, USA, 2013, 12 pages.
Pu, Qifan et al., “Whole-Home Gesture Recognition Using Wireless Signals”, Proceedings of the 19th annual international conference on Mobile computing & networking (MobiCom'13), ACM, Sep. 30, 2013, pp. 27-38, 12 pages.
Zhadobov, Maxim et al., “Millimeter-Wave Interactions with the Human Body: State of Knowledge and Recent Advances”, International Journal of Microwave and Wireless Technologies, Cambridge University Press and the European Microwave Association, doi:10.1017/S1759078711000122, 2011, 11 pages.
“Final Office Action”, U.S. Appl. No. 15/286,537, dated Apr. 19, 2019, 21 pages.
“Final Office Action”, U.S. Appl. No. 15/287,253, dated Apr. 2, 2019, 10 pages.
“Final Office Action”, U.S. Appl. No. 14/959,901, dated May 30, 2019, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/287,394, dated Mar. 22, 2019, 39 pages.
“Non-Final Office Action”, U.S. Appl. No. 16/238,464, dated Mar. 7, 2019, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/424,263, dated May 23, 2019, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/462,957, dated May 24, 2019, 14 pages.
“Notice of Allowance”, U.S. Appl. No. 15/286,837, dated Mar. 6, 2019, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 15/703,511, dated Apr. 16, 2019, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 15/286,512, dated Apr. 9, 2019, 14 pages.
“Pre-Interview Communication”, U.S. Appl. No. 15/917,238, dated May 1, 2019, 6 pages.
“Final Office Action”, U.S. Appl. No. 15/142,471, dated Jun. 20, 2019, 26 pages.
“Final Office Action”, U.S. Appl. No. 16/238,464, dated Jul. 25, 2019, 15 pages.
“First Action Interview Office Action”, U.S. Appl. No. 15/917,238, dated Jun. 6, 2019, 6 pages.
“International Preliminary Report on Patentability”, PCT Application No. PCT/US2017/051663, dated Jun. 20, 2019, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 15/286,537, dated Sep. 3, 2019, 28 pages.
“Notice of Allowance”, U.S. Appl. No. 15/287,308, dated Jul. 17, 2019, 17 pages.
“Notice of Allowance”, U.S. Appl. No. 15/917,238, dated Aug. 21, 2019, 13 pages.
“Notice of Allowance”, U.S. Appl. No. 16/389,402, dated Aug. 21, 2019, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 15/287,253, dated Aug. 26, 2019, 13 pages.
“Notice of Allowance”, U.S. Appl. No. 15/352,194, dated Jun. 26, 2019, 8 pages.
Provisional Applications (1)
Number      Date       Country
62/237,975  Oct. 2015  US