This background description is provided for the purpose of generally presenting the context of the disclosure. Unless otherwise indicated herein, material described in this section is neither expressly nor impliedly admitted to be prior art to the present disclosure or the appended claims.
As computing devices evolve with more computing power, they are able to evolve how they receive input commands or information. One type of evolving input mechanism relates to capturing user gestures. For instance, a user can attach a first peripheral device to their arm or hand that reads muscle activity, or hold a second peripheral device that contains an accelerometer that detects motion. In turn, these peripherals then communicate with a receiving computing device based upon a detected gesture. With these types of peripheral devices, a user physically connects the peripheral device to a corresponding body part that performs the gesture. However, this constrains the user, in that the user must not only acquire these peripheral devices, but must couple them to the receiving computing device. Thus, it would be advantageous to capture various gestures without attaching a peripheral device to the user.
Further, a computing device may allow a user to authenticate themselves in a variety of different ways in order to access the device or applications executing on the device, such as by gesturing or entering a password to unlock a laptop, receiving fingerprint sensor data to unlock a smartphone, using a microphone to recognize the user's voice, using a camera to recognize the user's face, and so forth. However, in many cases, the user would like to be authenticated without touching the device, or to be authenticated without any type of active interaction with the device to allow for seamless access to devices and applications, while at the same time preventing certain people from access. One way to authenticate users includes using a camera to capture video or photos of a user. However, using a camera to authenticate users may result in a loss of privacy for users who do not wish, or expect, that their photo will be captured.
This document describes techniques and devices for radar-based authentication. The techniques describe a radar-based authentication component that is configured to recognize biometric characteristics associated with a person or gestures performed by the person. Then, by comparing the biometric characteristics or gestures with an authentication library, an authentication state may be determined which allows or restricts access to a device or application. This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter.
Various aspects of radar-based authentication are described with reference to the following drawings. The same numbers are used throughout the drawings to reference like features and components:
This document describes techniques and devices for radar-based authentication. The techniques describe a radar-based authentication component that is configured to recognize biometric characteristics associated with a person, and to determine an authentication state, based on the recognized biometric characteristics, which allows or restricts access to a device or application. Thus, a person can be authenticated even without the person's active engagement.
As described herein, biometric characteristics correspond to distinctive, measurable characteristics that can be used to identify a particular known person, or a particular “type” of person (e.g., an adult versus a child). Biometric characteristics are often categorized as physiological versus behavioral characteristics. Physiological characteristics are related to the shape of the body and may include, by way of example and not limitation, height, skeletal structure, fingerprint, palm veins, face recognition, DNA, palm print, hand geometry, iris recognition, retina scans, heart conditions, and so forth. Behavioral characteristics are related to the pattern of behavior of a person, including but not limited to a walking gait, typing rhythm, and so forth.
Alternately or additionally, the authentication component can be configured to recognize a gesture (or series of gestures) by the person, and to determine the authentication state based on the gesture. Thus, in this implementation, the user actively engages and interacts with the authentication component in order to be authenticated. In some cases, a two-stage authentication process may be applied, whereby the authentication component authenticates a person based on biometric characteristics as well as detection of one or more recognized gestures performed by the person.
In order to provide radar-based authentication techniques, the authentication component incorporates techniques usable to detect biometric characteristics and/or gestures that can be compared to an authentication library in order to authenticate the person to various types of applications or devices. For instance, a three-dimensional object detection system may be employed in which object characteristics are detected in free space, without any attachment or peripheral device connected to a corresponding body part, such as a hand. The three-dimensional object detection system, for instance, may leverage transmission and reception of radio signals to detect orientation and movement of an object in three-dimensional space. Such movement, for example, could include movement of a person's body (e.g., the person's walking gait), or movement of a single body part (e.g., gestures performed by movement of a person's hand).
Inputs resulting from this detection are then processed by leveraging an authentication library of the authentication component. The authentication library, for instance, may define gestures as well as biometric characteristics of a person (e.g., height, skeletal structure, or walking gait) that are detectable and correspond to various authentication states which enable or restrict access to various devices or applications. Accordingly, the object characteristics of the inputs may be compared to the authentication library to recognize corresponding object characteristics maintained in the library, and an authentication state corresponding to the recognized object characteristics.
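By way of a simplified, hypothetical illustration, the following Python sketch shows one way such a comparison against an authentication library could be structured; the data structure, field names, and matching tolerance are assumptions made for illustration and are not a prescribed implementation.

```python
# Hypothetical sketch: comparing detected object characteristics against an
# authentication library that maps characteristics to authentication states.
# All names, fields, and tolerances are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class LibraryEntry:
    name: str              # e.g., a known person or a pre-configured gesture
    characteristics: dict  # e.g., {"height_cm": 168.0, "gait_period_s": 1.05}
    auth_state: str        # e.g., "unlock" or "restrict"

def matches(detected: dict, entry: LibraryEntry, tolerance: float = 0.05) -> bool:
    """Return True if every library characteristic is matched within a relative tolerance."""
    for key, expected in entry.characteristics.items():
        observed = detected.get(key)
        if observed is None or abs(observed - expected) > tolerance * abs(expected):
            return False
    return True

def resolve_auth_state(detected: dict, library: list) -> str:
    """Return the authentication state mapped to the first matching entry, if any."""
    for entry in library:
        if matches(detected, entry):
            return entry.auth_state
    return "restrict"  # no recognized characteristics: access restricted by default

library = [LibraryEntry("alice", {"height_cm": 168.0, "gait_period_s": 1.05}, "unlock")]
print(resolve_auth_state({"height_cm": 167.2, "gait_period_s": 1.07}, library))  # unlock
```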
For example, the authentication component can accurately recognize gestures or a series of gestures that are made in three dimensions, such as in-the-air gestures. Based on a comparison with the authentication library, the gestures can be recognized as a pre-configured unlock gesture or other input to authenticate a person, such that the person is permitted to access a device or application. These in-the-air gestures can be made from varying distances, such as from a person sitting on a couch to control a television, a person standing in a kitchen to control an oven or refrigerator, or millimeters from a desktop computer's display.
Alternately or additionally, the authentication component can be used to recognize biometric characteristics of a person. In some cases, the recognized biometric characteristics of the person can be used to determine that the person is a particular known person. For example, the authentication component can accurately identify a person based on biometric characteristics of the person, such as the person's height, skeletal structure, facial shape, hair, walking gait, and so forth. By so doing, the authentication component may determine that a person is a particular known person that is listed in the authentication library. Alternately, the authentication component can identify a person as a particular “type” of person, thereby differentiating the person from other types of persons. For example, a person could be recognized as an “adult” and differentiated from a “child”, based on the detection of biometric characteristics such as height or skeletal structure.
An authentication state is then initiated as a result of this comparison. Generally, an authentication state either permits or restricts access to a device or application. A series of gestures, for instance, may be detected by the authentication component and compared to the authentication library to determine an authentication state. If the series of gestures match a particular authentication state in which access to a device or application is permitted, then the authentication component may initiate the authentication state, such as by causing the computing device to transition from a locked state to an unlocked state.
As another example, biometric characteristics of a person may be detected by the authentication component and compared to the authentication library to determine an authentication state. If the biometric characteristics of the person match a particular known person and correspond to an authentication state, then the authentication component may initiate the authentication state, such as by causing the computing device to permit access to the person. Alternately, if the person is not recognized as a particular known person, the authentication component may initiate the authentication state by restricting access to the person. Similarly, if the biometric characteristics are recognized as corresponding to a particular type of person, then the authentication component may initiate an associated authentication state, such as by enabling access to the computing device if the person is recognized as an adult, but preventing the access if the person is recognized as a child. In this way, young children may be prevented from accessing certain devices or dangerous objects.
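As a simplified, hypothetical sketch of this decision logic, the following Python example first attempts to match a particular known person and otherwise differentiates an adult from a child using only a height characteristic; the threshold, tolerance, and state names are illustrative assumptions.

```python
# Hypothetical sketch: initiating an authentication state from detected biometric
# characteristics, first trying to match a known person and otherwise
# differentiating an adult from a child by an assumed height threshold.
ADULT_HEIGHT_THRESHOLD_CM = 150.0  # assumed example threshold, not a prescribed value

def authentication_state_for(detected_height_cm: float, known_heights: dict) -> str:
    # First, try to match a particular known person (within an assumed tolerance).
    for person, enrolled_height in known_heights.items():
        if abs(detected_height_cm - enrolled_height) <= 3.0:
            return f"permit:{person}"
    # Otherwise, fall back to differentiating a type of person (adult vs. child).
    if detected_height_cm >= ADULT_HEIGHT_THRESHOLD_CM:
        return "permit:adult"
    return "restrict:child"

known_heights = {"alice": 168.0}
print(authentication_state_for(167.0, known_heights))  # permit:alice
print(authentication_state_for(112.0, known_heights))  # restrict:child
```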
Notably, the radar-based authentication techniques described herein can authenticate a person at approximately the same resolution as a camera. However, unlike a camera, radar does not actually “see” a person, and thus the person's privacy is not violated in many cases. As such, these types of systems can be implemented in places where cameras are not permitted or expected by users.
Example Environment
The computing device 102, for instance, may be configured as a wearable device having a housing that is configured to be worn by or attached to a user. As such, the housing of the wearable device 106 may take a variety of different forms, such as a ring, brooch, or pendant, a form configured to be worn on a wrist of a user as illustrated, glasses as illustrated at 108, and so forth. The computing device 102 may also be configured to include a housing configured to be held by one or more hands of a user, such as a mobile phone or tablet as illustrated at 110, a laptop computer 112, a dedicated camera 114, and so forth. Other examples include incorporation of the computing device 102 as part of a vehicle 116 (e.g., plane, train, boat, aircraft, and balloon), or as part of the “Internet-of-things,” such as a thermostat 118, appliance, vent, furnace, and so forth. Additional forms of computing devices 102 include desktop computers, game consoles, media consumption devices, televisions, and so on.
Thus, the computing device 102 ranges from full-resource devices with substantial memory and processor resources (e.g., personal computers, game consoles) to low-resource devices with limited memory and/or processing resources (e.g., wearables). Although single computing device examples are shown, a computing device may be representative of a plurality of different devices (e.g., a television and remote control) as further described in relation to
The computing device 102, regardless of configuration, is configured to include an authentication component 104 to initiate various authentication states based on the detection of one or more biometric characteristics of a person or gestures performed by the person. The authentication component 104 in the illustrated example includes a three dimensional (3D) object detection system 120 and an authentication module 122 that is implemented at least partially in hardware. The authentication component 104 is representative of functionality to identify gestures made by a person 124 (e.g., either directly by the person and/or with an object) as well as biometric characteristics of the person 124 (e.g., height, skeletal structure, or walking gait), and to transition to various authentication states. For example, the authentication module 122 may receive inputs from the 3D object detection system 120 that are usable to detect characteristics or attributes to identify an object (e.g., person 124), orientation of the object, and/or movement of the object. Based on recognition of a combination of one or more of the characteristics or attributes, the authentication module 122 may initiate an authentication state, such as to detect a series of gestures by a hand of person 124 and cause a user interface output by the computing device 102 to transition from displaying a “lockscreen” to displaying a home screen. As another example, authentication module 122 may recognize person 124 as a particular known person based on a detected height and walking gait of the person and unlock the door of the person's house or car.
The 3D object detection system 120 is configurable to detect objects in three dimensions, such as to identify the object, an orientation of the object, and/or movement of the object. The detection may be performed using a variety of different techniques, such as cameras (e.g., a time-of-flight camera), sound waves, and so on. In the illustrated example, the 3D object detection system 120 is configured to use radar techniques and radio waves through use of a radio wave transmitter/receiver 126 and a radar processing module 128. The radio wave transmitter/receiver 126, for instance, transmits radio waves in the radio frequency range corresponding to one or more Wi-Fi frequency bands, e.g., IEEE 802.11 and so forth. The radar processing module 128 then detects return of these radio waves to detect objects, which may be performed at a resolution of less than one centimeter as further described beginning in relation to
Through use of radio waves, the 3D object detection system 120 may detect objects that are located behind other objects, e.g., that are at least partially obscured from “view” by another object. The 3D object detection system 120 may also transmit through materials such as fabric and plastics, and even through a housing of the computing device 102 itself, such that the housing may be made with lower cost and increased protection against outside elements. These techniques may also be leveraged to detect gestures or biometric characteristics of person 124 while the computing device 102 is in the person's 124 pocket. Complementary detection techniques may also be used, such as for the radar processing module 128 to leverage inputs from a plurality of computing devices, such as a watch and phone as illustrated, to detect a gesture or biometric characteristics. In the following, a variety of gesture detection and biometric detection techniques are described, which may be implemented using radar or other object detection techniques.
In this example, the authentication library 202 includes one or more biometric characteristics 204 of a person 124 or type of person and/or gestures 206 that are mapped to an authentication state 208. In some cases, the authentication library 202 may include pre-configured gestures 206 that are mapped to various authentication states 208. For instance, a specific gesture could be mapped to an authentication state which transitions a computing device from a “locked” state to an “unlocked” state. Similarly, biometric characteristics, such as a particular height threshold, can be mapped to an authentication state. Alternately or additionally, during an enrollment or registration process, biometric characteristics 204 or gesture characteristics 206 may be captured by the 3D object detection system 120 and mapped to an authentication state 208. For instance, biometric characteristics associated with person 124 can be captured and mapped to an authentication state that enables access to a device. In addition, certain data stored in authentication library 202 may be treated in one or more ways before it is stored or used, so that personally identifiable information is removed. For example, a person's identity may be treated so that no personally identifiable information can be determined for the person, or a person's geographic location may be generalized where location information is obtained (such as to a city, ZIP code, or state level), so that a particular location of a person cannot be determined. Thus, the person may have control over what information is collected about the person, how that information is used, and what information is provided to the person.
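The following hypothetical Python sketch illustrates one way enrollment into such a library might be combined with removal or generalization of personally identifiable information; the pseudonymization and location-generalization choices shown are assumptions for illustration only.

```python
# Hypothetical sketch: enrolling captured characteristics into an authentication
# library while removing or generalizing personally identifiable information.
# The field names, hashing, and generalization choices are illustrative assumptions.
import hashlib
from typing import Optional

def enroll(library: dict, person_id: str, characteristics: dict,
           location: Optional[str], auth_state: str) -> None:
    # Replace the person's identity with a one-way pseudonym before storage.
    pseudonym = hashlib.sha256(person_id.encode("utf-8")).hexdigest()[:16]
    # Generalize location to a coarse level (e.g., city) rather than storing it exactly.
    coarse_location = location.split(",")[0] if location else None
    library[pseudonym] = {
        "characteristics": characteristics,  # e.g., height, skeletal ratios, gait period
        "location": coarse_location,
        "auth_state": auth_state,
    }

library = {}
enroll(library, "alice@example.com", {"height_cm": 168.0}, "Seattle, WA, 98101", "unlock")
print(library)
```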
The authentication library 202 is usable by the authentication module 122 to recognize the one or more biometric characteristics 204 of a person 124 or gestures 206 performed by person 124, as well as the authentication state 208 that is mapped to the recognized gesture 206 or biometric characteristics 204.
The 3D object detection system 120, for instance, may detect an object (e.g., a body of person 124 or a specific part of the body of person 124) and movement of the object. Data that results from this detection is then used by the authentication module 122 to identify which of the biometric characteristics 204 or gestures 206 in the authentication library 202 correspond to the detected object characteristics, if any. From this, the authentication module 122 also identifies an authentication state 208 that corresponds to the recognized biometric characteristic 204 or gesture 206.
Based on this comparison, authentication module 122 initiates the recognized authentication state 208, if any, which permits or restricts access to computing device 102, applications 210 executing on computing device 102, or a remote device or application. For example, a result of the comparison can be output by the authentication module 122 via one or more application programming interfaces 212 to applications 210 that are executable by the computing device 102, such as third-party applications, an operating system, and so forth. The applications 210, for instance, may then cause the transition to the recognized authentication state 208, such as by unlocking computing device 102 to enable user access.
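A minimal, hypothetical sketch of how an application might consume such a result and transition its own lock state accordingly follows; the names and states are illustrative assumptions, not a published API.

```python
# Hypothetical sketch: an application consuming a comparison result delivered
# (e.g., via an API callback) and transitioning its lock state accordingly.
# The class, method, and state names are illustrative assumptions.
class DeviceLockScreen:
    def __init__(self) -> None:
        self.state = "locked"

    def on_authentication_result(self, auth_state: str) -> None:
        """Invoked with the authentication state recognized from the library."""
        if auth_state == "unlock":
            self.state = "unlocked"  # e.g., dismiss the lockscreen, show the home screen
        else:
            self.state = "locked"    # remain locked / restrict access

lockscreen = DeviceLockScreen()
lockscreen.on_authentication_result("unlock")
print(lockscreen.state)  # unlocked
```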
For example, a gesture 214 is illustrated as being performed by a hand 216 of the person 124 of
The authentication module 122 accepts data as an input from the 3D object detection system 120 that describes this positioning and movement. This data is then compared with the biometric characteristics 204 or gestures 206 stored in the authentication library 202 to recognize the gesture 214 (or biometric characteristics), and an authentication state 208 that corresponds to the gesture or biometric characteristics. In the illustrated example, the gesture 214 is an “unlock gesture” that is configured to transition the computing device 102 or applications 210 from a “locked state” to an “unlocked state”.
Accordingly, based on this comparison the authentication module 122 may initiate an authentication state 208, which may include permitting or restricting access to computing device 102, application 210, or a remote device or application. The biometric characteristics or gestures may be detected using a variety of techniques, an example of which that includes use of RF signals is described in the following.
In some cases, the authentication module 122 is configured to initiate a first authentication state which permits the person to access a device or application if the person is recognized as the known person based on detected biometric characteristics of the person, and to initiate a second authentication state which restricts the person from accessing the device or application if the person is not recognized as the known person. In this way, the authentication component can authenticate the person without the person's active engagement.
As another example, the authentication module 122 can be configured to initiate a first authentication state which permits the person to access a device or application if the person is recognized as a first type of person (e.g., an adult) based on detected biometric characteristics of the person, and to initiate a second authentication state which restricts the person from accessing the device or application if the person is recognized as a second type of person (e.g., a child). For example, types or groups of people can be differentiated based on biometric characteristics such as height, weight, or the length or size of various body parts. For instance, height may be usable to differentiate an adult from a young child.
In one or more implementations, the authentication module 122 is configured to determine the authentication state based on a two stage authentication process. For example, the authentication module can authenticate a person based on the person being recognized as a known person (based on biometric characteristics) as well as detecting a pre-configured authentication gesture performed by the person.
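The two-stage process can be summarized with a short, hypothetical Python sketch in which access is permitted only when both stages succeed; the function names, tolerance, and gesture labels are illustrative assumptions.

```python
# Hypothetical sketch of the two-stage process: access is permitted only if the
# person is recognized as a known person (stage one) AND a pre-configured
# authentication gesture is detected (stage two). Names and values are
# illustrative assumptions.
def recognized_as_known_person(biometrics: dict, enrolled: dict) -> bool:
    return abs(biometrics.get("height_cm", 0.0) - enrolled["height_cm"]) <= 3.0

def two_stage_authenticate(biometrics: dict, detected_gesture: str,
                           enrolled: dict, unlock_gesture: str) -> str:
    stage_one = recognized_as_known_person(biometrics, enrolled)
    stage_two = (detected_gesture == unlock_gesture)
    return "permit" if (stage_one and stage_two) else "restrict"

enrolled = {"height_cm": 168.0}
print(two_stage_authenticate({"height_cm": 167.0}, "double_tap", enrolled, "double_tap"))  # permit
print(two_stage_authenticate({"height_cm": 167.0}, "left_swipe", enrolled, "double_tap"))  # restrict
```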
The 3D object detection system 120 represents functionality that wirelessly captures characteristics of a target object, illustrated here as a hand 216, but also including biometric characteristics of a person's body or a portion thereof. In this example, 3D object detection system 120 is a hardware component of the computing device 102. In some cases, 3D object detection system 120 not only captures characteristics about the hand 216, but can additionally identify a specific gesture performed by the hand 216 from other gestures. Any suitable type of characteristic or gesture can be captured or identified, such as an associated size of the hand, a directional movement of the hand, a micro-gesture performed by all or a portion of the hand (e.g., a single-tap gesture, a double-tap gesture, a left-swipe, a forward-swipe, a right-swipe, a finger making a shape, etc.), list of physical characteristics, and so forth. The term micro-gesture is used to signify a gesture that can be identified from other gestures based on differences in movement using a scale on the order of millimeters to sub-millimeters. Alternately or additionally, authentication component 104 can be configured to identify gestures on a larger scale than a micro-gesture (i.e., a macro-gesture that is identified by differences with a coarser resolution than a micro-gesture, such as differences measured in centimeters or meters).
Hand 216 of the person 124 represents a target object that the authentication component 104 is in the process of detecting. Here, the hand 216 resides in free space with no devices attached to it. Being in free space, the hand 216 has no physical devices attached to it that couple to, or communicate with, computing device 102 and/or 3D object detection system 120. While this example is described in the context of detecting the hand 216, it is to be appreciated that 3D object detection system 120 can similarly be used to capture biometric characteristics of person 124, as well as characteristics of any other suitable type of target object, whether part of, or separate from, the person 124.
As part of this, the 3D object detection system 120 is configured to transmit and receive radio frequency (RF) signals. In an implementation, 3D object detection system 120 transmits the RF signals as radar signals, each on a respective antenna, that are directed towards hand 216 or the body of person 124. As the transmitted signals reach the hand 216 or body of person 124, at least some reflect back to 3D object detection system 120 and are processed, as further described below. Signals detected by the 3D object detection system 120 can have any suitable combination of energy level, carrier frequency, burst periodicity, pulse width, modulation type, waveform, phase relationship, and so forth. In some cases, some or all of the respective transmitted signals differ from one another to create a specific diversity scheme, such as a time diversity scheme that transmits multiple versions of a same signal at different points in time, a frequency diversity scheme that transmits signals using several different frequency channels, a space diversity scheme that transmits signals over different propagation paths, and so forth.
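The following hypothetical sketch shows one way a set of transmitted signals could be described so that they collectively form a diversity scheme; the antenna count, timing, channel, and power values are illustrative assumptions, not prescribed radar parameters.

```python
# Hypothetical sketch: describing a set of transmitted signals that together form
# a diversity scheme (space, time, and frequency diversity). Antenna count,
# timing, channels, and power levels are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TransmitSignal:
    antenna: int          # which antenna transmits the signal (space diversity)
    start_time_us: float  # transmission time offset (time diversity)
    frequency_ghz: float  # carrier frequency / channel (frequency diversity)
    power_dbm: float

def build_diversity_scheme() -> list:
    signals = []
    for antenna in range(2):                           # two antennas (assumed)
        for repeat in range(2):                        # each signal repeated in time
            signals.append(TransmitSignal(
                antenna=antenna,
                start_time_us=repeat * 100.0,          # bursts 100 microseconds apart (assumed)
                frequency_ghz=5.18 + 0.02 * antenna,   # distinct channels in a Wi-Fi band (assumed)
                power_dbm=10.0,
            ))
    return signals

for signal in build_diversity_scheme():
    print(signal)
```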
Having generally described an environment in which radar-based authentication may be implemented, now consider
Computer-readable storage media 304 also includes the Application Programming Interfaces (APIs) 212 to provide programming access into various routines and tools provided by the authentication component 104 of
These APIs 212 enable applications 210 to incorporate the functionality provided by the authentication component 104 into executable code. For instance, applications 210 can call or invoke APIs 212 to register for, or request, an event notification when a particular gesture has been detected, enable or disable wireless gesture recognition in computing device 102, and so forth. At times, APIs 212 can access and/or include low level hardware drivers that interface with hardware implementations of authentication component 104. Alternately or additionally, APIs 212 can be used to access various algorithms that reside on authentication component 104 to perform additional functionality or extract additional information, such as 3D tracking information, angular extent, reflectivity profiles from different aspects, correlations between transforms/features from different channels, and so forth.
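As a simplified, hypothetical illustration of such application-facing access, the following Python sketch lets an application register for a gesture event notification and toggle wireless gesture recognition; the class and method names are assumptions for illustration rather than the actual APIs 212.

```python
# Hypothetical sketch: an application-facing interface for registering a gesture
# event notification and enabling/disabling wireless gesture recognition.
# Class and method names are illustrative assumptions, not the actual APIs 212.
from typing import Callable

class AuthenticationApi:
    def __init__(self) -> None:
        self.enabled = True
        self._gesture_listeners = {}  # gesture name -> list of callbacks

    def set_gesture_recognition_enabled(self, enabled: bool) -> None:
        self.enabled = enabled

    def register_gesture_listener(self, gesture: str, callback: Callable[[], None]) -> None:
        self._gesture_listeners.setdefault(gesture, []).append(callback)

    def _on_gesture_detected(self, gesture: str) -> None:
        # Called by the authentication component when a gesture is recognized.
        if not self.enabled:
            return
        for callback in self._gesture_listeners.get(gesture, []):
            callback()

api = AuthenticationApi()
api.register_gesture_listener("unlock_gesture", lambda: print("application notified: unlock"))
api._on_gesture_detected("unlock_gesture")
```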
The 3D object detection system 120 and authentication module 122 of the authentication component 104 represent functionality that wirelessly detects a variety of gestures, such as gestures performed by a hand 216 of
Generally, radar-emitting element 306 is configured to provide a radar field. In some cases, the radar field is configured to at least partially reflect off a target object. The radar field can also be configured to penetrate fabric or other obstructions and reflect from human tissue. These fabrics or obstructions can include wood, glass, plastic, cotton, wool, nylon and similar fibers, and so forth, while reflecting from human tissues, such as a person's hand or body.
A radar field can be small, such as about one millimeter to 1.5 meters, or intermediate, such as 1 to 30 meters. It is to be appreciated that these sizes are merely for discussion purposes, and that any other suitable range can be used. When the radar field has an intermediate size, 3D object detection system 120 is configured to receive and process reflections of the radar field to provide large-body gestures based on reflections from human tissue caused by body, arm, or leg movements, as well as to detect biometric characteristics of a person's body. In other cases, the radar field can be configured to enable 3D object detection system 120 to detect smaller and more-precise gestures, such as micro-gestures. Example intermediate-sized radar fields include those in which a user makes gestures to control a television from a couch, change a song or volume from a stereo across a room, turn off an oven or oven timer (a near field would also be useful here), turn lights on or off in a room, and so forth. Radar-emitting element 306 can be configured to emit continuously modulated radiation, ultra-wideband radiation, or sub-millimeter-frequency radiation.
Antenna(s) 308 transmit and receive RF signals. In some cases, radar-emitting element 306 couples with antenna(s) 308 to transmit a radar field. As one skilled in the art will appreciate, this is achieved by converting electrical signals into electromagnetic waves for transmission, and vice versa for reception. Authentication component 104 can include any suitable number of antennas in any suitable configuration. For instance, any of the antennas can be configured as a dipole antenna, a parabolic antenna, a helical antenna, a monopole antenna, and so forth. In some embodiments, antennas 308 are constructed on-chip (e.g., as part of an SoC), while in other embodiments, antennas 308 are separate components, metal, hardware, etc. that attach to, or are included within, 3D object detection system 120. An antenna can be single-purpose (e.g., a first antenna can be directed towards transmitting signals, and a second antenna can be directed towards receiving signals), or multi-purpose (e.g., an antenna is directed towards transmitting and receiving signals). Thus, some embodiments utilize varying combinations of antennas, such as an embodiment that utilizes two single-purpose antennas directed towards transmission in combination with four single-purpose antennas directed towards reception. The placement, size, and/or shape of antennas 308 can be chosen to enhance a specific transmission pattern or diversity scheme, such as a pattern or scheme designed to capture information about a micro-gesture performed by the hand, as further described above and below. In some cases, the antennas can be physically separated from one another by a distance that allows authentication component 104 to collectively transmit and receive signals directed to a target object over different channels, different radio frequencies, and different distances. In some cases, antennas 308 are spatially distributed to support triangulation techniques, while in others the antennas are collocated to support beamforming techniques. While not illustrated, each antenna can correspond to a respective transceiver path that physically routes and manages the outgoing signals for transmission and the incoming signals for capture and analysis.
Digital signal processing component 310 generally represents functionality to digitally capture and process a signal. For instance, digital signal processing component 310 samples analog RF signals received by antenna(s) 308 to generate digital samples that represent the RF signals, and then processes these samples to extract information about the target object. Alternately or additionally, digital signal processing component 310 controls the configuration of signals generated and transmitted by radar-emitting element 306 and/or antennas 308, such as configuring a plurality of signals to form a specific diversity scheme like a beamforming diversity scheme. In some cases, digital signal processing component 310 receives input configuration parameters that control an RF signal's transmission parameters (e.g., frequency channel, power level, etc.), such as through APIs 212. In turn, digital signal processing component 310 modifies the RF signal based upon the input configuration parameters. At times, the signal processing functions of digital signal processing component 310 are included in a library of signal processing functions or algorithms that are also accessible and/or configurable via APIs 212, e.g., authentication library 202. Digital signal processing component 310 can be implemented in hardware, software, firmware, or any combination thereof.
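A simplified, hypothetical sketch of this capture-and-extract step follows: digitized reflections are reduced to a small per-pulse feature vector (peak delay and peak amplitude) that a later classification stage could consume; the sample rate, waveform, and feature choices are illustrative assumptions.

```python
# Hypothetical sketch: reducing digitized reflections to a small per-pulse feature
# vector (peak delay and peak amplitude) for a later classification stage. The
# sample rate, waveform, and feature choices are illustrative assumptions.
def extract_features(pulses: list, sample_rate_hz: float) -> list:
    """For each received pulse (a list of samples), return (delay_seconds, peak_amplitude)."""
    features = []
    for samples in pulses:
        peak_index = max(range(len(samples)), key=lambda i: abs(samples[i]))
        features.append((peak_index / sample_rate_hz, abs(samples[peak_index])))
    return features

# Two synthetic pulses whose reflection peak drifts by one sample, as a moving
# hand might cause between successive transmissions.
pulse_a = [0.01] * 64
pulse_a[20] = 0.8
pulse_b = [0.01] * 64
pulse_b[21] = 0.7
print(extract_features([pulse_a, pulse_b], sample_rate_hz=1.0e9))
```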
Among other things, machine-learning component 312 receives information processed or extracted by digital signal processing component 310, and uses that information to classify or recognize various aspects of the target object. In some cases, machine-learning component 312 applies one or more algorithms to probabilistically determine biometric characteristics, or which gesture has occurred, given an input signal and previously learned gestures and biometric characteristics, by leveraging the authentication library 202. As in the case of digital signal processing component 310, machine-learning component 312 can include a library of multiple machine-learning algorithms, such as a Random Forest algorithm, deep learning algorithms (e.g., artificial neural network algorithms, convolutional neural network algorithms, etc.), clustering algorithms, Bayesian algorithms, and so forth. Machine-learning component 312 can be trained on how to identify various gestures using input data that consists of example gesture(s) to learn. In turn, machine-learning component 312 uses the input data to learn what features can be attributed to a specific gesture. These features are then used to identify when the specific gesture occurs. In some embodiments, APIs 212 can be used to configure machine-learning component 312 and/or its corresponding algorithms.
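To illustrate the classification step, the following hypothetical sketch trains a random forest (one of the algorithm families mentioned above, here via scikit-learn) on synthetic feature vectors and then probabilistically classifies a new observation; the features, labels, and values are illustrative assumptions.

```python
# Hypothetical sketch: probabilistically classifying a gesture from extracted
# features using a random forest. The feature vectors and labels are synthetic,
# illustrative data rather than real radar measurements.
from sklearn.ensemble import RandomForestClassifier

# Each row is a feature vector (e.g., peak delay in ns, peak amplitude); labels
# name the gesture that produced the example during training/enrollment.
X_train = [[20.0, 0.8], [21.0, 0.7], [40.0, 0.3], [41.0, 0.2]]
y_train = ["double_tap", "double_tap", "left_swipe", "left_swipe"]

classifier = RandomForestClassifier(n_estimators=25, random_state=0)
classifier.fit(X_train, y_train)

observed = [[20.5, 0.75]]
print(classifier.predict(observed))        # most likely gesture label
print(classifier.predict_proba(observed))  # per-class probabilities
```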
Computing device 102 also includes I/O ports 314 and network interfaces 316. I/O ports 314 can include a variety of ports, such as by way of example and not limitation, high-definition multimedia (HDMI), digital video interface (DVI), display port, fiber-optic or light-based, audio ports (e.g., analog, optical, or digital), Universal Serial Bus (USB) ports, serial advanced technology attachment (SATA) ports, peripheral component interconnect (PCI) express based ports or card slots, serial ports, parallel ports, or other legacy ports. Computing device 102 may also include the network interfaces 316 for communicating data over wired, wireless, or optical networks. By way of example and not limitation, the network interfaces 316 may communicate data over a local-area-network (LAN), a wireless local-area-network (WLAN), a personal-area-network (PAN), a wide-area-network (WAN), an intranet, the Internet, a peer-to-peer network, point-to-point network, a mesh network, and the like.
As technology advances, users have an expectation that new devices will provide additional freedoms and flexibility over past devices. One such example is the inclusion of wireless capabilities in a device. Consider the case of a wireless mouse input device. A wireless mouse input device receives input from a user in the format of button clicks and movement in position, and wirelessly transmits this information to a corresponding computing device. The wireless nature obviates the need to have a wired connection between the wireless mouse input device and the computing device, which gives more freedom to the user with the mobility and placement of the mouse. However, the user still physically interacts with the wireless mouse input device as a way to enter input into the computing device. Accordingly, if the wireless mouse input device gets lost or is misplaced, the user is unable to enter input with that mechanism. Thus, removing the need for a peripheral device as an input mechanism gives additional freedom to the user. One such example is performing input to a computing device via a hand gesture.
Gestures provide a user with a simple and readily available mechanism to input commands to a computing device. However, detection of some gestures, such as hand gestures, can pose certain problems. For example, attaching a movement sensing device to a hand does not remove a user's dependency upon a peripheral device. Instead, it is a solution that simply trades one input peripheral for another. As an alternative, cameras can capture images, which can then be compared and analyzed to identify the hand gestures. However, this option may not yield a fine enough resolution to detect micro-gestures. An alternate solution involves usage of radar systems to transmit RF signals to a target object, and determine information about that target based upon an analysis of the reflected signal.
Various implementations described herein are used to wirelessly detect biometric characteristics or gestures using multiple antennas. Each antenna can be configured to transmit a respective RF signal to enable detection of biometric characteristics or a micro-gesture performed by a hand. In some embodiments, the collective transmitted RF signals are configured to radiate a specific transmission pattern or specific diversity scheme. RF signals reflected off of the hand or body of person 124 can be captured by the antenna, and further analyzed to identify temporal variations in the RF signals. In turn, these temporal variations can be used to identify biometric characteristics or gestures.
Consider
Environment 400a includes source device 402 and object 404. Source device 402 includes antenna 406, which generally represents functionality configured to transmit and receive electromagnetic waves in the form of an RF signal. It is to be appreciated that antenna 406 can be coupled to a source, such as a radar-emitting element, to achieve transmission of a signal. In this example, source device 402 transmits a series of RF pulses, illustrated here as RF pulse 408a, RF pulse 408b, and RF pulse 408c. As indicated by their ordering and distance from source device 402, RF pulse 408a is transmitted first in time, followed by RF pulse 408b, and then RF pulse 408c. For discussion purposes, these RF pulses have the same pulse width, power level, and transmission periodicity between pulses, but any other suitable type of signal with alternate configurations can be transmitted without departing from the scope of the claimed subject matter.
Generally speaking, electromagnetic waves can be characterized by the frequency or wavelength of their corresponding oscillations. Being a form of electromagnetic radiation, RF signals adhere to various wave and particle properties, such as reflection. When an RF signal reaches an object, it will undergo some form of transition. Specifically, there will be some reflection off the object. Environment 400b illustrates the reflection of RF pulses 408a-408c reflecting off of object 404, where RF pulse 410a corresponds to a reflection originating from RF pulse 408a reflecting off of object 404, RF pulse 410b corresponds to a reflection originating from RF pulse 408b, and so forth. In this simple case, source device 402 and object 404 are stationary, and RF pulses 408a-408c are transmitted via a single antenna (antenna 406) over a same RF channel, and are transmitted directly towards object 404 with a perpendicular impact angle. Similarly, RF pulses 410a-410c are shown as reflecting directly back to source device 402, rather than with some angular deviation. However, as one skilled in the art will appreciate, these signals can alternately be transmitted or reflected with variations in their transmission and reflection directions based upon the configuration of source device 402, object 404, transmission parameters, variations in real-world factors, and so forth. Upon receiving and capturing RF pulses 410a-410c, source device 402 can then analyze the pulses, either individually or in combination, to identify characteristics related to object 404. For example, source device 402 can analyze all of the received RF pulses to obtain temporal information and/or spatial information about object 404. Accordingly, source device 402 can use knowledge about a transmission signal's configuration (such as pulse widths, spacing between pulses, pulse power levels, phase relationships, and so forth), and further analyze a reflected RF pulse to identify various characteristics about object 404, such as size, shape, movement speed, movement direction, surface smoothness, material composition, and so forth.
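One piece of this temporal analysis can be shown with a short, hypothetical sketch: converting a reflected pulse's round-trip time into a one-way distance using d = c * t / 2; the timing value is an illustrative assumption.

```python
# Hypothetical sketch of one piece of the temporal analysis: converting a
# reflected pulse's round-trip time into a one-way distance using d = c * t / 2.
# The timing value is an illustrative assumption.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def distance_to_object_m(round_trip_time_s: float) -> float:
    """Half the round-trip path length gives the one-way distance to the object."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

print(distance_to_object_m(20e-9))  # a 20 ns round trip corresponds to roughly 3 meters
```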
Now consider
When RF signals 508a-508d reach hand 504, they generate reflected RF signals 510a-510d. Similar to the discussion of
As in the case of
Building further upon the discussions with respect to
Desktop computer 604 includes, or is associated with, 3D object detection system 602-1. These devices work together to improve user interaction with desktop computer 604. Assume, for example, that desktop computer 604 includes a touch screen 610 through which display and user interaction can be performed. This touch screen 610 can present some challenges to users, such as needing a person to sit in a particular orientation, such as upright and forward, to be able to touch the screen. Further, the size of controls selectable through touch screen 610 can make interaction difficult and time-consuming for some users. Consider, however, 3D object detection system 602-1, which provides near radar field 608-1 enabling a user's hands to interact with desktop computer 604, such as with small or large, simple or complex gestures, including those with one or two hands, and in three dimensions. As is readily apparent, a large volume through which a user may make selections can be substantially easier to use, and can provide a better experience than a flat surface, such as that of touch screen 610.
Similarly, consider 3D object detection system 602-2, which provides intermediate radar field 608-2. Providing a radar field enables a user to interact with television 606 from a distance and through various gestures, ranging from hand gestures, to arm gestures, to full-body gestures. By so doing, user selections can be made simpler and easier than with a flat surface (e.g., touch screen 610), a remote control (e.g., a gaming or television remote), or other conventional control mechanisms.
3D object detection systems can interact with applications or an operating system of computing devices, or remotely through a communication network by transmitting input responsive to recognizing gestures. Gestures can be mapped to various applications and devices, thereby enabling control of many devices and applications. Many complex and unique gestures can be recognized by radar-based gesture recognition systems, thereby permitting precise and/or single-gesture control, even for multiple applications. Radar-based gesture recognition systems, whether integrated with a computing device, having computing capabilities, or having few computing abilities, can each be used to interact with various devices and applications.
The radar field can also include a surface applied to human tissue. Consider, for example,
Example 700 includes a hand 702 and a surface radar field 704 provided by 3D object detection system 120 that is included in a laptop 706. Radar-emitting element 306 (not shown) provides surface radar field 704, which penetrates chair 708 and is applied to hand 702. In this case, antenna 308 (not shown) is configured to receive a reflection caused by an interaction on the surface of hand 702 that penetrates (e.g., reflects back through) chair 708. Similarly, digital signal processing component 310 and/or machine-learning component 312 are configured to process the received surface reflection to provide gesture data usable to determine a gesture and/or biometric data usable to determine biometric characteristics that identify a person. Note that with surface radar field 704, another hand may interact to perform gestures, such as to tap on the surface of hand 702, thereby interacting with surface radar field 704. Example gestures include single and multi-finger swipe, spread, squeeze, non-linear movements, and so forth. Or hand 702 may simply move or change shape to cause reflections, thereby also performing an occluded gesture.
With respect to human-tissue reflection, the authentication component can process reflections of the radar field from human tissue to determine identifying indicia, and confirm that the identifying indicia match recorded identifying indicia for a person, such that the person may be authenticated. These identifying indicia can include various biometric identifiers, such as a size, shape, ratio of sizes, cartilage structure, and bone structure for the person or a portion of the person, such as the person's hand. These identifying indicia may also be associated with a device worn by the person permitted to control the mobile computing device, such as a device having a unique or difficult-to-copy reflection (e.g., a wedding ring of 14 carat gold and three diamonds, which reflects radar in a particular manner). In addition, radar-based gesture detection systems can be configured so that personally identifiable information is removed. For example, a user's identity may be treated so that no personally identifiable information can be determined for the user, or a user's geographic location may be generalized where location information is obtained (such as to a city, ZIP code, or state level), so that a particular location of a user cannot be determined. Thus, the user may have control over what information is collected about the user, how that information is used, and what information is provided to the user.
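A minimal, hypothetical sketch of confirming identifying indicia follows, using simple hand-geometry ratios as stand-ins for the indicia described above; the specific ratios, tolerance, and values are illustrative assumptions.

```python
# Hypothetical sketch: confirming that identifying indicia derived from a
# human-tissue reflection (here, simple hand-geometry ratios) match recorded
# indicia for a person. The indicia, tolerance, and data are illustrative assumptions.
def indicia_match(measured: dict, recorded: dict, tolerance: float = 0.08) -> bool:
    """Every recorded ratio must be reproduced within a relative tolerance."""
    for name, expected in recorded.items():
        observed = measured.get(name)
        if observed is None or abs(observed - expected) > tolerance * expected:
            return False
    return True

recorded_indicia = {"palm_width_to_length": 0.82, "index_to_middle_finger": 0.94}
measured_indicia = {"palm_width_to_length": 0.80, "index_to_middle_finger": 0.95}
print(indicia_match(measured_indicia, recorded_indicia))  # True -> person authenticated
```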
By way of further example, consider
Example 800 includes a computing device 802, a radar field 804, and three persons, 806, 808, and 810. The 3D object detection system 120 can detect characteristics about each of persons 806, 808, and 810, such as height, weight, skeletal structure, facial shape, and hair (or lack thereof). By so doing, 3D object detection system 120 may determine, for example, that person 810 is a particular known person, a particular type of person (e.g., an adult to differentiate person 810 from a child, or a man to differentiate person 810 from person 808 who is a woman), or simply identify person 810 to differentiate him from the other persons in the room (persons 806 and 808). Furthermore, in order to identify the person 810, the person may do little if anything explicitly, though explicit interaction is also permitted. For example, person 810 may simply walk in and sit down on a stool, and by so doing walks into radar field 804, which then automatically detects biometric characteristics of the person 810. For example, the 3D object detection system 120 senses this interaction based on received reflections from person 810.
Example Procedures
An input is detected using a three dimensional object detection system of an authentication component of a computing device (block 902). The input may describe an object, such as a person 124 (or a part of the person 124), a physical object associated with the person 124, as well as objects in a physical environment in which the person 124 is disposed.
At least one characteristic is recognized by the authentication component based on the detected input through comparison with an authentication library maintained by the authentication component (block 904). An authentication module 122, for instance, may compare the input from the 3D object detection system 120 with an authentication library 202 to determine a gesture 206 which corresponds with the input, if any. Alternately or additionally, the authentication module 122 may compare the input from the 3D object detection system 120 with the authentication library 202 to identify biometric characteristics 204 associated with a known person or a type of person.
An authentication state is then recognized that corresponds to the recognized at least one characteristic by the authentication component using the authentication library (block 906). The biometric characteristics 204 or gesture 206, for instance, may be associated with a corresponding authentication state 208 within the authentication library 202.
The recognized authentication state is then initiated by the authentication component (block 910). Continuing with the above example, this may include causing a device or application to transition to the authentication state, thereby enabling or restricting access to the device or application.
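For illustration only, the following hypothetical Python sketch strings the blocks of this procedure together into a single pass; the function names and the simple dictionary library are assumptions standing in for the components described above.

```python
# Hypothetical sketch tying the procedure together: detect an input, recognize a
# characteristic against the library, look up the mapped authentication state,
# and initiate it. Names are illustrative stand-ins for the blocks above.
from typing import Optional

def detect_input() -> dict:                                                     # block 902
    return {"gesture": "unlock_gesture"}

def recognize_characteristic(detected: dict, library: dict) -> Optional[str]:   # block 904
    gesture = detected.get("gesture")
    return gesture if gesture in library else None

def lookup_authentication_state(characteristic: str, library: dict) -> str:     # block 906
    return library[characteristic]

def initiate_state(state: str) -> None:                                         # block 910
    print(f"transitioning device to state: {state}")

library = {"unlock_gesture": "unlocked"}
characteristic = recognize_characteristic(detect_input(), library)
if characteristic is not None:
    initiate_state(lookup_authentication_state(characteristic, library))
```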
Example Electronic Device
Electronic device 1000 includes communication devices 1002 that enable wired and/or wireless communication of device data 1004 (e.g., received data, data that is being received, data scheduled for broadcast, data packets of the data, etc.). The device data 1004 or other device content can include configuration settings of the device and/or information associated with a user of the device.
Electronic device 1000 also includes communication interfaces 1006 that can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface. The communication interfaces 1006 provide a connection and/or communication links between electronic device 1000 and a communication network by which other electronic, computing, and communication devices communicate data with electronic device 1000.
Electronic device 1000 includes one or more processors 1008 (e.g., any of microprocessors, controllers, and the like) which process various computer-executable instructions to control the operation of electronic device 1000 and to implement embodiments of the techniques described herein. Alternatively or in addition, electronic device 1000 can be implemented with any one or combination of hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 1010. Although not shown, electronic device 1000 can include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
Electronic device 1000 also includes computer-readable media 1012, such as one or more memory components, examples of which include random access memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. A disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable compact disc (CD), any type of a digital versatile disc (DVD), and the like.
Computer-readable media 1012 provides data storage mechanisms to store the device data 1004, as well as various applications 1014 and any other types of information and/or data related to operational aspects of electronic device 1000. The applications 1014 can include a device manager (e.g., a control application, software application, signal processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, etc.). Computer-readable media 1012 also includes APIs 1016.
APIs 1016 provide programmatic access to an authentication component, examples of which are provided above. The programmatic access can range from high-level program access that obscures underlying details of how a function is implemented, to low-level programmatic access that enables access to hardware. In some cases, APIs can be used to send input configuration parameters associated with modifying how signals are transmitted, received, and/or processed by an authentication component.
Electronic device 1000 also includes audio and/or video processing system 1018 that processes audio data and/or passes through the audio and video data to audio system 1020 and/or to display system 1022 (e.g., a screen of a smart phone or camera). Audio system 1020 and/or display system 1022 may include any devices that process, display, and/or otherwise render audio, video, display, and/or image data. Display data and audio signals can be communicated to an audio component and/or to a display component via an RF link, S-video link, HDMI, composite video link, component video link, DVI, analog audio connection, or other similar communication link, such as media data port 1024. In some implementations, audio system 1020 and/or display system 1022 are external components to electronic device 1000. Alternatively or additionally, display system 1022 can be an integrated component of the example electronic device, such as part of an integrated touch interface.
Electronic device 1000 also includes authentication component 1026 that wirelessly identifies one or more features of a target object, such as a micro-gesture performed by a hand as further described above. Authentication component 1026 can be implemented as any suitable combination of hardware, software, firmware, and so forth. In some embodiments, authentication component 1026 is implemented as an SoC. Among other things, authentication component 1026 includes radar-emitting element 1028, antennas 1030, digital signal processing component 1032, machine-learning component 1034, and authentication library 1036.
Radar-emitting element 1028 is configured to provide a radar field. In some cases, the radar field is configured to at least partially reflect off a target object. The radar field can also be configured to penetrate fabric or other obstructions and reflect from human tissue. These fabrics or obstructions can include wood, glass, plastic, cotton, wool, nylon and similar fibers, and so forth, while reflecting from human tissues, such as a person's hand. Radar-emitting element 1028 works in concert with antennas 1030 to provide the radar field.
Antenna(s) 1030 transmit and receive RF signals under the control of authentication component 1026. Each respective antenna of antennas 1030 can correspond to a respective transceiver path internal to authentication component 1026 that physically routes and manages outgoing signals for transmission and the incoming signals for capture and analysis, as further described above.
Digital signal processing component 1032 digitally processes RF signals received via antennas 1030 to extract information about the target object. This can be high-level information that simply identifies a target object, or lower-level information that identifies a particular micro-gesture performed by a hand. In some embodiments, digital signal processing component 1032 additionally configures outgoing RF signals for transmission on antennas 1030. Some of the information extracted by digital signal processing component 1032 is used by machine-learning component 1034. Digital signal processing component 1032 at times includes multiple digital signal processing algorithms that can be selected or deselected for an analysis, examples of which are provided above. Thus, digital signal processing component 1032 can generate key information from RF signals that can be used to determine what gesture might be occurring at any given moment. At times, an application, such as those illustrated by applications 1014, can configure the operating behavior of digital signal processing component 1032 via APIs 1016.
Machine-learning component 1034 receives input data, such as a transformed raw signal or high-level information about a target object, and analyzes the input data to identify or classify various features contained within the data. As in the case above, machine-learning component 1034 can include multiple machine-learning algorithms that can be selected or deselected for an analysis. Among other things, machine-learning component 1034 can use the key information generated by digital signal processing component 1032 to detect relationships and/or correlations between the generated key information and previously learned gestures to probabilistically decide which gesture is being performed. At times, an application, such as those illustrated by applications 1014, can configure the operating behavior of machine-learning component 1034 via APIs 1016.
Authentication library 1036 represents data used by authentication component 1026 to identify a target object and/or gestures performed by the target object. For instance, authentication library 1036 can store signal characteristics, or characteristics about a target object that are discernable from a signal, that can be used to identify a unique in-the-air gesture, biometric characteristics, a user identity, user presence, and so forth. In addition, certain data stored in authentication library 1036 may be treated in one or more ways before it is stored or used, so that personally identifiable information is removed. For example, a user's identity may be treated so that no personally identifiable information can be determined for the user, or a user's geographic location may be generalized where location information is obtained (such as to a city, ZIP code, or state level), so that a particular location of a user cannot be determined. Thus, the user may have control over what information is collected about the user, how that information is used, and what information is provided to the user.
Although the embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the various embodiments defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the various embodiments.
This application claims priority to U.S. Provisional Patent Application Ser. No. 62/237,975 filed on Oct. 6, 2015, the disclosure of which is incorporated by reference herein in its entirety.
8475367 | Yuen et al. | Jul 2013 | B1 |
8505474 | Kang et al. | Aug 2013 | B2 |
8509882 | Albert et al. | Aug 2013 | B2 |
8514221 | King et al. | Aug 2013 | B2 |
8527146 | Jackson | Sep 2013 | B1 |
8549829 | Song et al. | Oct 2013 | B2 |
8560972 | Wilson | Oct 2013 | B2 |
8562526 | Heneghan et al. | Oct 2013 | B2 |
8569189 | Bhattacharya et al. | Oct 2013 | B2 |
8614689 | Nishikawa et al. | Dec 2013 | B2 |
8655004 | Prest et al. | Feb 2014 | B2 |
8700137 | Albert | Apr 2014 | B2 |
8758020 | Burdea et al. | Jun 2014 | B2 |
8759713 | Sheats | Jun 2014 | B2 |
8764651 | Tran | Jul 2014 | B2 |
8785778 | Streeter et al. | Jul 2014 | B2 |
8790257 | Libbus et al. | Jul 2014 | B2 |
8814574 | Selby et al. | Aug 2014 | B2 |
8819812 | Weber et al. | Aug 2014 | B1 |
8854433 | Rafii | Oct 2014 | B1 |
8860602 | Nohara et al. | Oct 2014 | B2 |
8921473 | Hyman | Dec 2014 | B1 |
8948839 | Longinotti-Buitoni et al. | Feb 2015 | B1 |
9055879 | Selby et al. | Jun 2015 | B2 |
9075429 | Karakotsios et al. | Jul 2015 | B1 |
9093289 | Vicard et al. | Jul 2015 | B2 |
9125456 | Chow | Sep 2015 | B2 |
9141194 | Keyes et al. | Sep 2015 | B1 |
9148949 | Zhou et al. | Sep 2015 | B2 |
9223494 | Desalvo et al. | Dec 2015 | B1 |
9229102 | Wright et al. | Jan 2016 | B1 |
9230160 | Kanter | Jan 2016 | B1 |
9235241 | Newham et al. | Jan 2016 | B2 |
9316727 | Sentelle et al. | Apr 2016 | B2 |
9331422 | Nazzaro et al. | May 2016 | B2 |
9335825 | Rautianinen et al. | May 2016 | B2 |
9346167 | O'Connor et al. | May 2016 | B2 |
9354709 | Heller et al. | May 2016 | B1 |
9508141 | Khachaturian et al. | Nov 2016 | B2 |
9569001 | Mistry et al. | Feb 2017 | B2 |
9575560 | Poupyrev et al. | Feb 2017 | B2 |
9588625 | Poupyrev | Mar 2017 | B2 |
9594443 | VanBlon et al. | Mar 2017 | B2 |
9600080 | Poupyrev | Mar 2017 | B2 |
9693592 | Robinson et al. | Jul 2017 | B2 |
9746551 | Scholten et al. | Aug 2017 | B2 |
9766742 | Papakostas | Sep 2017 | B2 |
9778749 | Poupyrev | Oct 2017 | B2 |
9811164 | Poupyrev | Nov 2017 | B2 |
9817109 | Saboo et al. | Nov 2017 | B2 |
9837760 | Karagozler et al. | Dec 2017 | B2 |
9848780 | DeBusschere et al. | Dec 2017 | B1 |
9921660 | Poupyrev | Mar 2018 | B2 |
9933908 | Poupyrev | Apr 2018 | B2 |
9947080 | Nguyen et al. | Apr 2018 | B2 |
9971414 | Gollakota et al. | May 2018 | B2 |
9971415 | Poupyrev et al. | May 2018 | B2 |
9983747 | Poupyrev | May 2018 | B2 |
9994233 | Diaz-Jimenez et al. | Jun 2018 | B2 |
10016162 | Rogers et al. | Jul 2018 | B1 |
10034630 | Lee et al. | Jul 2018 | B2 |
10073590 | Dascola et al. | Sep 2018 | B2 |
10080528 | DeBusschere et al. | Sep 2018 | B2 |
10082950 | Lapp | Sep 2018 | B2 |
10088908 | Poupyrev et al. | Oct 2018 | B1 |
10139916 | Poupyrev | Nov 2018 | B2 |
10155274 | Robinson et al. | Dec 2018 | B2 |
10175781 | Karagozler et al. | Jan 2019 | B2 |
10203763 | Poupyrev et al. | Feb 2019 | B1 |
10222469 | Gillian et al. | Mar 2019 | B1 |
10241581 | Lien et al. | Mar 2019 | B2 |
10268321 | Poupyrev | Apr 2019 | B2 |
10285456 | Poupyrev et al. | May 2019 | B2 |
10300370 | Amihood et al. | May 2019 | B1 |
10310620 | Lien et al. | Jun 2019 | B2 |
10310621 | Lien et al. | Jun 2019 | B1 |
10379621 | Schwesig et al. | Aug 2019 | B2 |
10401490 | Gillian et al. | Sep 2019 | B2 |
10409385 | Poupyrev | Sep 2019 | B2 |
20010035836 | Miceli et al. | Nov 2001 | A1 |
20020009972 | Amento et al. | Jan 2002 | A1 |
20020080156 | Abbott et al. | Jun 2002 | A1 |
20020170897 | Hall | Nov 2002 | A1 |
20030005030 | Sutton et al. | Jan 2003 | A1 |
20030071750 | Benitz | Apr 2003 | A1 |
20030093000 | Nishio et al. | May 2003 | A1 |
20030100228 | Bungo et al. | May 2003 | A1 |
20030119391 | Swallow et al. | Jun 2003 | A1 |
20030122677 | Kail | Jul 2003 | A1 |
20040009729 | Hill et al. | Jan 2004 | A1 |
20040102693 | Jenkins | May 2004 | A1 |
20040249250 | McGee et al. | Dec 2004 | A1 |
20040259391 | Jung et al. | Dec 2004 | A1 |
20050069695 | Jung et al. | Mar 2005 | A1 |
20050128124 | Greneker et al. | Jun 2005 | A1 |
20050148876 | Endoh et al. | Jul 2005 | A1 |
20050231419 | Mitchell | Oct 2005 | A1 |
20050267366 | Murashita et al. | Dec 2005 | A1 |
20060035554 | Glaser et al. | Feb 2006 | A1 |
20060040739 | Wells | Feb 2006 | A1 |
20060047386 | Kanevsky et al. | Mar 2006 | A1 |
20060061504 | Leach, Jr. et al. | Mar 2006 | A1 |
20060125803 | Westerman et al. | Jun 2006 | A1 |
20060136997 | Telek | Jun 2006 | A1 |
20060139162 | Flynn | Jun 2006 | A1 |
20060139314 | Bell | Jun 2006 | A1 |
20060148351 | Tao et al. | Jul 2006 | A1 |
20060157734 | Onodero et al. | Jul 2006 | A1 |
20060166620 | Sorensen | Jul 2006 | A1 |
20060170584 | Romero et al. | Aug 2006 | A1 |
20060209021 | Yoo et al. | Sep 2006 | A1 |
20060258205 | Locher et al. | Nov 2006 | A1 |
20060284757 | Zemany | Dec 2006 | A1 |
20070024488 | Zemany et al. | Feb 2007 | A1 |
20070026695 | Lee et al. | Feb 2007 | A1 |
20070027369 | Pagnacco et al. | Feb 2007 | A1 |
20070118043 | Oliver et al. | May 2007 | A1 |
20070161921 | Rausch | Jul 2007 | A1 |
20070164896 | Suzuki et al. | Jul 2007 | A1 |
20070176821 | Flom | Aug 2007 | A1 |
20070192647 | Glaser | Aug 2007 | A1 |
20070197115 | Eves et al. | Aug 2007 | A1 |
20070197878 | Shklarski | Aug 2007 | A1 |
20070210074 | Maurer et al. | Sep 2007 | A1 |
20070237423 | Tico et al. | Oct 2007 | A1 |
20080001735 | Tran | Jan 2008 | A1 |
20080002027 | Kondo et al. | Jan 2008 | A1 |
20080015422 | Wessel | Jan 2008 | A1 |
20080024438 | Collins et al. | Jan 2008 | A1 |
20080039731 | McCombie et al. | Feb 2008 | A1 |
20080059578 | Albertson et al. | Mar 2008 | A1 |
20080065291 | Breed | Mar 2008 | A1 |
20080074307 | Boric-Lubecke et al. | Mar 2008 | A1 |
20080134102 | Movold et al. | Jun 2008 | A1 |
20080136775 | Conant | Jun 2008 | A1 |
20080168396 | Matas et al. | Jul 2008 | A1 |
20080194204 | Duet | Aug 2008 | A1 |
20080194975 | MacQuarrie et al. | Aug 2008 | A1 |
20080211766 | Westerman et al. | Sep 2008 | A1 |
20080233822 | Swallow et al. | Sep 2008 | A1 |
20080278450 | Lashina | Nov 2008 | A1 |
20080282665 | Speleers | Nov 2008 | A1 |
20080291158 | Park et al. | Nov 2008 | A1 |
20080303800 | Elwell | Dec 2008 | A1 |
20080316085 | Rofougaran et al. | Dec 2008 | A1 |
20080320419 | Matas et al. | Dec 2008 | A1 |
20090018408 | Ouchi et al. | Jan 2009 | A1 |
20090018428 | Dias et al. | Jan 2009 | A1 |
20090033585 | Lang | Feb 2009 | A1 |
20090053950 | Surve | Feb 2009 | A1 |
20090056300 | Chung et al. | Mar 2009 | A1 |
20090058820 | Hinckley | Mar 2009 | A1 |
20090113298 | Jung et al. | Apr 2009 | A1 |
20090115617 | Sano et al. | May 2009 | A1 |
20090118648 | Kandori et al. | May 2009 | A1 |
20090149036 | Lee et al. | Jun 2009 | A1 |
20090177068 | Stivoric et al. | Jul 2009 | A1 |
20090203244 | Toonder | Aug 2009 | A1 |
20090226043 | Angell et al. | Sep 2009 | A1 |
20090253585 | Diatchenko et al. | Oct 2009 | A1 |
20090270690 | Roos et al. | Oct 2009 | A1 |
20090278915 | Kramer et al. | Nov 2009 | A1 |
20090288762 | Wolfel | Nov 2009 | A1 |
20090295712 | Ritzau | Dec 2009 | A1 |
20090319181 | Khosravy et al. | Dec 2009 | A1 |
20100002912 | Solinsky | Jan 2010 | A1 |
20100045513 | Pett et al. | Feb 2010 | A1 |
20100050133 | Nishihara et al. | Feb 2010 | A1 |
20100053151 | Marti et al. | Mar 2010 | A1 |
20100060570 | Underkoffler et al. | Mar 2010 | A1 |
20100065320 | Urano | Mar 2010 | A1 |
20100069730 | Bergstrom et al. | Mar 2010 | A1 |
20100071205 | Graumann et al. | Mar 2010 | A1 |
20100094141 | Puswella | Apr 2010 | A1 |
20100109938 | Oswald et al. | May 2010 | A1 |
20100152600 | Droitcour et al. | Jun 2010 | A1 |
20100179820 | Harrison et al. | Jul 2010 | A1 |
20100198067 | Mahfouz et al. | Aug 2010 | A1 |
20100201586 | Michalk | Aug 2010 | A1 |
20100204550 | Heneghan et al. | Aug 2010 | A1 |
20100205667 | Anderson et al. | Aug 2010 | A1 |
20100208035 | Pinault et al. | Aug 2010 | A1 |
20100225562 | Smith | Sep 2010 | A1 |
20100234094 | Gagner et al. | Sep 2010 | A1 |
20100241009 | Petkie | Sep 2010 | A1 |
20100281438 | Latta et al. | Nov 2010 | A1 |
20100292549 | Schuler | Nov 2010 | A1 |
20100306713 | Geisner et al. | Dec 2010 | A1 |
20100313414 | Sheats | Dec 2010 | A1 |
20100324384 | Moon et al. | Dec 2010 | A1 |
20100325770 | Chung et al. | Dec 2010 | A1 |
20110003664 | Richard | Jan 2011 | A1 |
20110010014 | Oexman et al. | Jan 2011 | A1 |
20110018795 | Jang | Jan 2011 | A1 |
20110029038 | Hyde et al. | Feb 2011 | A1 |
20110073353 | Lee et al. | Mar 2011 | A1 |
20110083111 | Forutanpour et al. | Apr 2011 | A1 |
20110093820 | Zhang et al. | Apr 2011 | A1 |
20110118564 | Sankai | May 2011 | A1 |
20110119640 | Berkes et al. | May 2011 | A1 |
20110166940 | Bangera et al. | Jul 2011 | A1 |
20110181509 | Rautiainen et al. | Jul 2011 | A1 |
20110181510 | Hakala et al. | Jul 2011 | A1 |
20110193939 | Vassigh et al. | Aug 2011 | A1 |
20110197263 | Stinson, III | Aug 2011 | A1 |
20110202404 | van der Riet | Aug 2011 | A1 |
20110213218 | Weiner et al. | Sep 2011 | A1 |
20110221666 | Newton et al. | Sep 2011 | A1 |
20110234492 | Ajmera et al. | Sep 2011 | A1 |
20110239118 | Yamaoka et al. | Sep 2011 | A1 |
20110245688 | Arora et al. | Oct 2011 | A1 |
20110279303 | Smith | Nov 2011 | A1 |
20110286585 | Hodge | Nov 2011 | A1 |
20110303341 | Meiss et al. | Dec 2011 | A1 |
20110307842 | Chiang et al. | Dec 2011 | A1 |
20110316888 | Sachs et al. | Dec 2011 | A1 |
20110318985 | McDermid | Dec 2011 | A1 |
20120001875 | Li et al. | Jan 2012 | A1 |
20120019168 | Noda et al. | Jan 2012 | A1 |
20120029369 | Icove et al. | Feb 2012 | A1 |
20120047468 | Santos et al. | Feb 2012 | A1 |
20120068876 | Bangera et al. | Mar 2012 | A1 |
20120092284 | Rofougaran et al. | Apr 2012 | A1 |
20120123232 | Najarian et al. | May 2012 | A1 |
20120127082 | Kushler et al. | May 2012 | A1 |
20120144934 | Russell et al. | Jun 2012 | A1 |
20120150493 | Casey et al. | Jun 2012 | A1 |
20120154313 | Au et al. | Jun 2012 | A1 |
20120156926 | Kato et al. | Jun 2012 | A1 |
20120174299 | Balzano | Jul 2012 | A1 |
20120174736 | Wang et al. | Jul 2012 | A1 |
20120182222 | Moloney | Jul 2012 | A1 |
20120193801 | Gross et al. | Aug 2012 | A1 |
20120220835 | Chung | Aug 2012 | A1 |
20120248093 | Ulrich et al. | Oct 2012 | A1 |
20120254810 | Heck et al. | Oct 2012 | A1 |
20120268416 | Pirogov et al. | Oct 2012 | A1 |
20120270564 | Gum et al. | Oct 2012 | A1 |
20120280900 | Wang | Nov 2012 | A1 |
20120298748 | Factor et al. | Nov 2012 | A1 |
20120310665 | Xu et al. | Dec 2012 | A1 |
20130016070 | Starner et al. | Jan 2013 | A1 |
20130027218 | Schwarz et al. | Jan 2013 | A1 |
20130035563 | Angelides | Feb 2013 | A1 |
20130046544 | Kay et al. | Feb 2013 | A1 |
20130053653 | Cuddihy et al. | Feb 2013 | A1 |
20130078624 | Holmes et al. | Mar 2013 | A1 |
20130082922 | Miller | Apr 2013 | A1 |
20130083173 | Geisner et al. | Apr 2013 | A1 |
20130086533 | Stienstra | Apr 2013 | A1 |
20130096439 | Lee et al. | Apr 2013 | A1 |
20130102217 | Jeon | Apr 2013 | A1 |
20130104084 | Mlyniec et al. | Apr 2013 | A1 |
20130113647 | Sentelle et al. | May 2013 | A1 |
20130113830 | Suzuki | May 2013 | A1 |
20130117377 | Miller | May 2013 | A1 |
20130132931 | Bruns et al. | May 2013 | A1 |
20130147833 | Aubauer et al. | Jun 2013 | A1 |
20130150735 | Cheng | Jun 2013 | A1 |
20130161078 | Li | Jun 2013 | A1 |
20130169471 | Lynch | Jul 2013 | A1 |
20130176161 | Derham | Jul 2013 | A1 |
20130194173 | Zhu et al. | Aug 2013 | A1 |
20130195330 | Kim et al. | Aug 2013 | A1 |
20130196716 | Khurram | Aug 2013 | A1 |
20130207962 | Oberdorfer et al. | Aug 2013 | A1 |
20130229508 | Li et al. | Sep 2013 | A1 |
20130241765 | Kozma et al. | Sep 2013 | A1 |
20130245986 | Grokop et al. | Sep 2013 | A1 |
20130253029 | Jain et al. | Sep 2013 | A1 |
20130260630 | Ito et al. | Oct 2013 | A1 |
20130278499 | Anderson | Oct 2013 | A1 |
20130278501 | Bulzacki | Oct 2013 | A1 |
20130281024 | Rofougaran et al. | Oct 2013 | A1 |
20130283203 | Batraski et al. | Oct 2013 | A1 |
20130322729 | Mestha et al. | Dec 2013 | A1 |
20130332438 | Li et al. | Dec 2013 | A1 |
20130345569 | Mestha et al. | Dec 2013 | A1 |
20140005809 | Frei et al. | Jan 2014 | A1 |
20140022108 | Alberth et al. | Jan 2014 | A1 |
20140028539 | Newham et al. | Jan 2014 | A1 |
20140049487 | Konertz et al. | Feb 2014 | A1 |
20140050354 | Heim et al. | Feb 2014 | A1 |
20140051941 | Messerschmidt | Feb 2014 | A1 |
20140070957 | Longinotti-Buitoni et al. | Mar 2014 | A1 |
20140072190 | Wu et al. | Mar 2014 | A1 |
20140073486 | Ahmed et al. | Mar 2014 | A1 |
20140073969 | Zou et al. | Mar 2014 | A1 |
20140081100 | Muhsin et al. | Mar 2014 | A1 |
20140095480 | Marantz et al. | Apr 2014 | A1 |
20140097979 | Nohara et al. | Apr 2014 | A1 |
20140121540 | Raskin | May 2014 | A1 |
20140135631 | Brumback et al. | May 2014 | A1 |
20140139422 | Mistry et al. | May 2014 | A1 |
20140139616 | Pinter et al. | May 2014 | A1 |
20140143678 | Mistry et al. | May 2014 | A1 |
20140149859 | Van Dyken et al. | May 2014 | A1 |
20140184496 | Gribetz et al. | Jul 2014 | A1 |
20140184499 | Kim | Jul 2014 | A1 |
20140191939 | Penn et al. | Jul 2014 | A1 |
20140200416 | Kashef et al. | Jul 2014 | A1 |
20140201690 | Holz | Jul 2014 | A1 |
20140208275 | Mongia et al. | Jul 2014 | A1 |
20140215389 | Walsh et al. | Jul 2014 | A1 |
20140239065 | Zhou et al. | Aug 2014 | A1 |
20140244277 | Krishna Rao et al. | Aug 2014 | A1 |
20140246415 | Wittkowski | Sep 2014 | A1 |
20140247212 | Kim et al. | Sep 2014 | A1 |
20140250515 | Jakobsson | Sep 2014 | A1 |
20140253431 | Gossweiler et al. | Sep 2014 | A1 |
20140253709 | Bresch et al. | Sep 2014 | A1 |
20140262478 | Harris et al. | Sep 2014 | A1 |
20140275854 | Venkatraman et al. | Sep 2014 | A1 |
20140280295 | Kurochikin et al. | Sep 2014 | A1 |
20140281975 | Anderson | Sep 2014 | A1 |
20140282877 | Mahaffey | Sep 2014 | A1 |
20140297006 | Sadhu | Oct 2014 | A1 |
20140298266 | Lapp | Oct 2014 | A1 |
20140300506 | Teter | Oct 2014 | A1 |
20140306936 | Dahl | Oct 2014 | A1 |
20140309855 | Tran | Oct 2014 | A1 |
20140316261 | Lux et al. | Oct 2014 | A1 |
20140318699 | Longinotti-Buitoni et al. | Oct 2014 | A1 |
20140324888 | Xie et al. | Oct 2014 | A1 |
20140329567 | Chan et al. | Nov 2014 | A1 |
20140333467 | Inomata | Nov 2014 | A1 |
20140343392 | Yang | Nov 2014 | A1 |
20140347295 | Kim et al. | Nov 2014 | A1 |
20140357369 | Callens et al. | Dec 2014 | A1 |
20140368378 | Crain et al. | Dec 2014 | A1 |
20140368441 | Touloumtzis | Dec 2014 | A1 |
20140376788 | Xu et al. | Dec 2014 | A1 |
20150002391 | Chen | Jan 2015 | A1 |
20150009096 | Lee et al. | Jan 2015 | A1 |
20150026815 | Barrett | Jan 2015 | A1 |
20150029050 | Driscoll et al. | Jan 2015 | A1 |
20150030256 | Brady et al. | Jan 2015 | A1 |
20150040040 | Balan et al. | Feb 2015 | A1 |
20150046183 | Cireddu | Feb 2015 | A1 |
20150062033 | Ishihara | Mar 2015 | A1 |
20150068069 | Tran et al. | Mar 2015 | A1 |
20150077282 | Mohamadi | Mar 2015 | A1 |
20150085060 | Fish et al. | Mar 2015 | A1 |
20150091820 | Rosenberg et al. | Apr 2015 | A1 |
20150091858 | Rosenberg et al. | Apr 2015 | A1 |
20150091859 | Rosenberg et al. | Apr 2015 | A1 |
20150091903 | Costello et al. | Apr 2015 | A1 |
20150095987 | Potash et al. | Apr 2015 | A1 |
20150099941 | Tran | Apr 2015 | A1 |
20150100328 | Kress et al. | Apr 2015 | A1 |
20150106770 | Shah et al. | Apr 2015 | A1 |
20150109164 | Takaki | Apr 2015 | A1 |
20150112606 | He et al. | Apr 2015 | A1 |
20150133017 | Liao et al. | May 2015 | A1 |
20150143601 | Longinotti-Buitoni et al. | May 2015 | A1 |
20150145805 | Liu | May 2015 | A1 |
20150162729 | Reversat et al. | Jun 2015 | A1 |
20150177866 | Hwang et al. | Jun 2015 | A1 |
20150185314 | Corcos et al. | Jul 2015 | A1 |
20150199045 | Robucci et al. | Jul 2015 | A1 |
20150205358 | Lyren | Jul 2015 | A1 |
20150223733 | Al-Alusi | Aug 2015 | A1 |
20150226004 | Thompson | Aug 2015 | A1 |
20150229885 | Offenhaeuser | Aug 2015 | A1 |
20150256763 | Niemi | Sep 2015 | A1 |
20150261320 | Leto | Sep 2015 | A1 |
20150268027 | Gerdes | Sep 2015 | A1 |
20150268799 | Starner et al. | Sep 2015 | A1 |
20150277569 | Sprenger et al. | Oct 2015 | A1 |
20150280102 | Tajitsu et al. | Oct 2015 | A1 |
20150285906 | Hooper et al. | Oct 2015 | A1 |
20150287187 | Redtel | Oct 2015 | A1 |
20150301167 | Sentelle et al. | Oct 2015 | A1 |
20150312041 | Choi | Oct 2015 | A1 |
20150314780 | Stenneth et al. | Nov 2015 | A1 |
20150317518 | Fujimaki et al. | Nov 2015 | A1 |
20150323993 | Levesque et al. | Nov 2015 | A1 |
20150332075 | Burch | Nov 2015 | A1 |
20150341550 | Lay | Nov 2015 | A1 |
20150346820 | Poupyrev et al. | Dec 2015 | A1 |
20150350902 | Baxley | Dec 2015 | A1 |
20150351703 | Phillips et al. | Dec 2015 | A1 |
20150375339 | Sterling et al. | Dec 2015 | A1 |
20160018948 | Parvarandeh et al. | Jan 2016 | A1 |
20160026253 | Bradski et al. | Jan 2016 | A1 |
20160038083 | Ding et al. | Feb 2016 | A1 |
20160041617 | Poupyrev | Feb 2016 | A1 |
20160041618 | Poupyrev | Feb 2016 | A1 |
20160042169 | Polehn | Feb 2016 | A1 |
20160048235 | Poupyrev | Feb 2016 | A1 |
20160048236 | Poupyrev | Feb 2016 | A1 |
20160048672 | Lux | Feb 2016 | A1 |
20160054792 | Poupyrev | Feb 2016 | A1 |
20160054803 | Poupyrev | Feb 2016 | A1 |
20160054804 | Gollakata et al. | Feb 2016 | A1 |
20160055201 | Poupyrev et al. | Feb 2016 | A1 |
20160090839 | Stolarcyzk | Mar 2016 | A1 |
20160098089 | Poupyrev | Apr 2016 | A1 |
20160100166 | Dragne et al. | Apr 2016 | A1 |
20160103500 | Hussey et al. | Apr 2016 | A1 |
20160106328 | Mestha et al. | Apr 2016 | A1 |
20160131741 | Park | May 2016 | A1 |
20160140872 | Palmer et al. | May 2016 | A1 |
20160145776 | Roh | May 2016 | A1 |
20160146931 | Rao et al. | May 2016 | A1 |
20160170491 | Jung | Jun 2016 | A1 |
20160171293 | Li et al. | Jun 2016 | A1 |
20160186366 | McMaster | Jun 2016 | A1 |
20160206244 | Rogers | Jul 2016 | A1 |
20160213331 | Gil et al. | Jul 2016 | A1 |
20160216825 | Forutanpour | Jul 2016 | A1 |
20160220152 | Meriheina et al. | Aug 2016 | A1 |
20160249698 | Berzowska et al. | Sep 2016 | A1 |
20160252607 | Saboo et al. | Sep 2016 | A1 |
20160252965 | Mandella et al. | Sep 2016 | A1 |
20160253044 | Katz | Sep 2016 | A1 |
20160259037 | Molchanov et al. | Sep 2016 | A1 |
20160262685 | Wagner et al. | Sep 2016 | A1 |
20160282988 | Poupyrev | Sep 2016 | A1 |
20160283101 | Schwesig et al. | Sep 2016 | A1 |
20160284436 | Fukuhara et al. | Sep 2016 | A1 |
20160287172 | Morris et al. | Oct 2016 | A1 |
20160299526 | Inagaki et al. | Oct 2016 | A1 |
20160320852 | Poupyrev | Nov 2016 | A1 |
20160320853 | Lien et al. | Nov 2016 | A1 |
20160320854 | Lien et al. | Nov 2016 | A1 |
20160321428 | Rogers | Nov 2016 | A1 |
20160338599 | DeBusschere et al. | Nov 2016 | A1 |
20160345638 | Robinson et al. | Dec 2016 | A1 |
20160349790 | Connor | Dec 2016 | A1 |
20160349845 | Poupyrev et al. | Dec 2016 | A1 |
20160377712 | Wu et al. | Dec 2016 | A1 |
20170029985 | Tajitsu et al. | Feb 2017 | A1 |
20170052618 | Lee et al. | Feb 2017 | A1 |
20170060254 | Molchanov et al. | Mar 2017 | A1 |
20170060298 | Hwang et al. | Mar 2017 | A1 |
20170075481 | Chou et al. | Mar 2017 | A1 |
20170075496 | Rosenberg et al. | Mar 2017 | A1 |
20170097413 | Gillian et al. | Apr 2017 | A1 |
20170097684 | Lien | Apr 2017 | A1 |
20170115777 | Poupyrev | Apr 2017 | A1 |
20170124407 | Micks et al. | May 2017 | A1 |
20170125940 | Karagozler et al. | May 2017 | A1 |
20170192523 | Poupyrev | Jul 2017 | A1 |
20170192629 | Takada et al. | Jul 2017 | A1 |
20170196513 | Longinotti-Buitoni et al. | Jul 2017 | A1 |
20170231089 | Van Keymeulen | Aug 2017 | A1 |
20170232538 | Robinson et al. | Aug 2017 | A1 |
20170233903 | Jeon | Aug 2017 | A1 |
20170249033 | Podhajny et al. | Aug 2017 | A1 |
20170322633 | Shen et al. | Nov 2017 | A1 |
20170325337 | Karagozler et al. | Nov 2017 | A1 |
20170325518 | Poupyrev et al. | Nov 2017 | A1 |
20170329412 | Schwesig et al. | Nov 2017 | A1 |
20170329425 | Karagozler et al. | Nov 2017 | A1 |
20180000354 | DeBusschere et al. | Jan 2018 | A1 |
20180000355 | DeBusschere et al. | Jan 2018 | A1 |
20180004301 | Poupyrev | Jan 2018 | A1 |
20180005766 | Fairbanks et al. | Jan 2018 | A1 |
20180046258 | Poupyrev | Feb 2018 | A1 |
20180095541 | Gribetz et al. | Apr 2018 | A1 |
20180106897 | Shouldice | Apr 2018 | A1 |
20180113032 | Dickey et al. | Apr 2018 | A1 |
20180157330 | Gu et al. | Jun 2018 | A1 |
20180160943 | Fyfe | Jun 2018 | A1 |
20180177464 | DeBusschere et al. | Jun 2018 | A1 |
20180196527 | Poupyrev et al. | Jul 2018 | A1 |
20180256106 | Rogers et al. | Sep 2018 | A1 |
20180296163 | DeBusschere et al. | Oct 2018 | A1 |
20180321841 | Lapp | Nov 2018 | A1 |
20190033981 | Poupyrev | Jan 2019 | A1 |
20190138109 | Poupyrev et al. | May 2019 | A1 |
20190155396 | Lien et al. | May 2019 | A1 |
20190208837 | Poupyrev et al. | Jul 2019 | A1 |
20190232156 | Amihood et al. | Aug 2019 | A1 |
20190243464 | Lien et al. | Aug 2019 | A1 |
20190257939 | Schwesig et al. | Aug 2019 | A1 |
Number | Date | Country |
---|---|---|
1462382 | Dec 2003 | CN |
101751126 | Jun 2010 | CN |
102414641 | Apr 2012 | CN |
102782612 | Nov 2012 | CN |
102893327 | Jan 2013 | CN |
202887794 | Apr 2013 | CN |
103076911 | May 2013 | CN |
103502911 | Jan 2014 | CN |
102660988 | Mar 2014 | CN |
104035552 | Sep 2014 | CN |
103355860 | Jan 2016 | CN |
102011075725 | Nov 2012 | DE |
102013201359 | Jul 2014 | DE |
0161895 | Nov 1985 | EP |
1785744 | May 2007 | EP |
1815788 | Aug 2007 | EP |
2417908 | Feb 2012 | EP |
2637081 | Sep 2013 | EP |
2770408 | Aug 2014 | EP |
2953007 | Dec 2015 | EP |
3201726 | Aug 2017 | EP |
3017722 | Aug 2015 | FR |
2070469 | Sep 1981 | GB |
2443208 | Apr 2008 | GB |
113860 | Apr 1999 | JP |
11168268 | Jun 1999 | JP |
2003280049 | Oct 2003 | JP |
2006234716 | Sep 2006 | JP |
2007011873 | Jan 2007 | JP |
2007132768 | May 2007 | JP |
2008287714 | Nov 2008 | JP |
2009037434 | Feb 2009 | JP |
2011102457 | May 2011 | JP |
2012185833 | Sep 2012 | JP |
2012198916 | Oct 2012 | JP |
2013196047 | Sep 2013 | JP |
2014532332 | Dec 2014 | JP |
1020080102516 | Nov 2008 | KR |
100987650 | Oct 2010 | KR |
1020140055985 | May 2014 | KR |
101914850 | Oct 2018 | KR |
201425974 | Jul 2014 | TW |
9001895 | Mar 1990 | WO |
WO-0130123 | Apr 2001 | WO |
WO-2001027855 | Apr 2001 | WO |
WO-0175778 | Oct 2001 | WO |
WO-2002082999 | Oct 2002 | WO |
2004004557 | Jan 2004 | WO |
2004053601 | Jun 2004 | WO |
WO-2005033387 | Apr 2005 | WO |
2007125298 | Nov 2007 | WO |
WO-2008061385 | May 2008 | WO |
WO-2009032073 | Mar 2009 | WO |
2009083467 | Jul 2009 | WO |
WO-2010032173 | Mar 2010 | WO |
2010101697 | Sep 2010 | WO |
WO-2012026013 | Mar 2012 | WO |
2012064847 | May 2012 | WO |
WO-2012152476 | Nov 2012 | WO |
WO-2013082806 | Jun 2013 | WO |
WO-2013084108 | Jun 2013 | WO |
2013192166 | Dec 2013 | WO |
WO-2013186696 | Dec 2013 | WO |
WO-2013191657 | Dec 2013 | WO |
WO-2014019085 | Feb 2014 | WO |
2014085369 | Jun 2014 | WO |
WO-2014116968 | Jul 2014 | WO |
2014124520 | Aug 2014 | WO |
WO-2014136027 | Sep 2014 | WO |
WO-2014138280 | Sep 2014 | WO |
WO-2014160893 | Oct 2014 | WO |
WO-2014165476 | Oct 2014 | WO |
WO-2014204323 | Dec 2014 | WO |
WO-2015017931 | Feb 2015 | WO |
WO-2015022671 | Feb 2015 | WO |
2015149049 | Oct 2015 | WO |
2016053624 | Apr 2016 | WO |
2016118534 | Jul 2016 | WO |
2016176471 | Nov 2016 | WO |
2016178797 | Nov 2016 | WO |
2017019299 | Feb 2017 | WO |
2017062566 | Apr 2017 | WO |
2017200571 | Nov 2017 | WO |
20170200949 | Nov 2017 | WO |
2018106306 | Jun 2018 | WO |
Entry |
---|
S. Z. Gurbuz, W. L. Melvin, D. B. Williams “Detection and identification of human targets in radar data”—Proc. SPIE 6567, Signal Processing, Sensor Fusion, and Target Recognition XVI, 65670I (May 7, 2007) (Year: 2007). |
Jonathan L. Geisheimer, Eugene Greneker “A continuous-wave (CW) radar for gait analysis”, IEEE, 2011, pp. 834-838 (Year: 2001). |
Dmitry S. Garmatyuk and Ram M. Narayanan “Ultra-wideband continuous-wave random noise Arc-SAR”, IEEE, 2002, pp. 2543-2552 (Year: 2002). |
“Combined Search and Examination Report”, GB Application No. 1620892.8, dated Apr. 6, 2017, 5 pages. |
“Corrected Notice of Allowance”, U.S. Appl. No. 14/930,220, dated Mar. 20, 2017, 2 pages. |
“Corrected Notice of Allowance”, U.S. Appl. No. 14/930,220, dated May 11, 2017, 2 pages. |
“Final Office Action”, U.S. Appl. No. 14/518,863, dated May 5, 2017, 18 pages. |
“First Action Interview Office Action”, U.S. Appl. No. 14/959,901, dated Apr. 14, 2017, 3 pages. |
“International Preliminary Report on Patentability”, Application No. PCT/US2015/050903, dated Apr. 13, 2017, 12 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/060399, dated Jan. 30, 2017, 11 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/504,038, dated Mar. 22, 2017, 33 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/403,066, dated May 4, 2017, 31 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/494,863, dated May 30, 2017, 7 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 15/343,067, dated Apr. 19, 2017, 3 pages. |
“Textile Wire Brochure”, Retrieved at: http://www.textile-wire.ch/en/home.html, Aug. 7, 2004, 17 pages. |
Stoppa,“Wearable Electronics and Smart Textiles: A Critical Review”, In Proceedings of Sensors, vol. 14, Issue 7, Jul. 7, 2014, pp. 11957-11992. |
“Advisory Action”, U.S. Appl. No. 14/504,139, dated Aug. 28, 2017, 3 pages. |
“Final Office Action”, U.S. Appl. No. 14/959,901, dated Aug. 25, 2017, 19 pages. |
“Final Office Action”, U.S. Appl. No. 15/403,066, dated Oct. 5, 2017, 31 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/093,533, dated Aug. 24, 2017, 18 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/142,619, dated Aug. 25, 2017, 16 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/959,799, dated Sep. 8, 2017, 16 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/398,147, dated Sep. 8, 2017, 7 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/518,863, dated Sep. 29, 2017, 20 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/142,689, dated Oct. 4, 2017, 18 pages. |
“Pre-Interview Office Action”, U.S. Appl. No. 14/862,409, dated Sep. 15, 2017, 16 pages. |
“Written Opinion”, PCT Application No. PCT/US2016/055671, dated Apr. 13, 2017, 8 pages. |
“Written Opinion”, PCT Application No. PCT/US2017/032733, dated Jul. 26, 2017, 5 pages. |
“Final Office Action”, U.S. Appl. No. 14/959,799, dated Jan. 4, 2018, 17 pages. |
“Final Office Action”, U.S. Appl. No. 14/959,730, dated Nov. 22, 2017, 16 pages. |
“International Search Report and Written Opinion”, PCT/US2017/047691, dated Nov. 16, 2017, 13. |
“International Search Report and Written Opinion”, PCT Application No. PCT/US2017/051663, dated Nov. 29, 2017, 16 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/504,121, dated Jan. 2, 2018, 19 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/959,901, dated Jan. 8, 2018, 21 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/504,139, dated Oct. 18, 2017, 12 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/595,649, dated Oct. 31, 2017, 16 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/862,409, dated Dec. 14, 2017, 17 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/403,066, dated Jan. 8, 2018, 18 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/874,955, dated Oct. 20, 2017, 7 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/398,147, dated Nov. 15, 2017, 8 pages. |
“Notice of Publication”, U.S. Appl. No. 15/703,511, dated Jan. 4, 2018, 1 page. |
“Restriction Requirement”, U.S. Appl. No. 15/362,359, dated Jan. 8, 2018, 5 pages. |
Bondade, et al., “A linear-assisted DC-DC hybrid power converter for envelope tracking RF power amplifiers”, 2014 IEEE Energy Conversion Congress and Exposition (ECCE), IEEE, Sep. 14, 2014, pp. 5769-5773, XP032680873, DOI: 10.1109/ECCE.2014.6954193, Sep. 14, 2014, 5 pages. |
Fan, et al., “Wireless Hand Gesture Recognition Based on Continuous-Wave Doppler Radar Sensors”, IEEE Transactions on Microwave Theory and Techniques, Plenum, USA, vol. 64, No. 11, Nov. 1, 2016 (Nov. 1, 2016), pp. 4012-4012, XP011633246, ISSN: 0018-9480, DOI: 10.1109/TMTT.2016.2610427, Nov. 1, 2016, 9 pages. |
Lien, et al., “Soli: Ubiquitous Gesture Sensing with Millimeter Wave Radar”, ACM Transactions on Graphics (TOG), ACM, Us, vol. 35, No. 4, Jul. 11, 2016 (Jul. 11, 2016), pp. 1-19, XP058275791, ISSN: 0730-0301, DOI: 10.1145/2897824.2925953, Jul. 11, 2016, 19 pages. |
Martinez-Garcia, et al., “Four-quadrant linear-assisted DC/DC voltage regulator”, Analog Integrated Circuits and Signal Processing, Springer New York LLC, US, vol. 88, No. 1, Apr. 23, 2016 (Apr. 23, 2016) , pp. 151-160, XP035898949, ISSN: 0925-1030, DOI: 10.1007/S10470-016-0747-8, Apr. 23, 2016, 10 pages. |
Skolnik, “CW and Frequency-Modulated Radar”, In: “Introduction to Radar Systems”, Jan. 1, 1981 (Jan. 1, 1981), McGraw Hill, XP055047545, ISBN: 978-0-07-057909-5 pp. 68-100, p. 95-p. 97, Jan. 1, 1981, 18 pages. |
Zheng, et al., “Doppler Bio-Signal Detection Based Time-Domain Hand Gesture Recognition”, 2013 IEEE MTT-S International Microwave Workshop Series on RF and Wireless Technologies for Biomedical and Healthcare Applications (IMWS-BIO), IEEE, Dec. 9, 2013 (Dec. 9, 2013), p. 3, XP032574214, DOI: 10.1109/IMWS-BIO.2013.6756200, Dec. 9, 2013, 3 Pages. |
“Cardiio”, Retrieved From: <http://www.cardiio.com/> Apr. 15, 2015 App Information Retrieved From: <https://itunes.apple.com/us/app/cardiio-touchless-camera-pulse/id542891434?Is=1&mt=8> Apr. 15, 2015, Feb. 24, 2015, 6 pages. |
“Extended European Search Report”, EP Application No. 15170577.9, dated Nov. 5, 2015, 12 pages. |
“Final Office Action”, U.S. Appl. No. 14/312,486, dated Jun. 3, 2016, 32 pages. |
“Final Office Action”, U.S. Appl. No. 14/504,038, dated Sep. 27, 2016, 23 pages. |
“Final Office Action”, U.S. Appl. No. 14/504,061, dated Mar. 9, 2016, 10 pages. |
“Frogpad Introduces Wearable Fabric Keyboard with Bluetooth Technology”, Retrieved From: <http://www.geekzone.co.nz/content.asp?contentid=3898> Mar. 16, 2015, Jan. 7, 2005, 2 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2015/044774, dated Nov. 3, 2015, 12 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/024267, dated Jun. 20, 2016, 13 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/024273, dated Jun. 20, 2016, 13 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/032307, dated Aug. 25, 2016, 13 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/029820, dated Jul. 15, 2016, 14 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/030177, dated Aug. 2, 2016, 15 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2015/043963, dated Nov. 24, 2015, 16 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2015/050903, dated Feb. 19, 2016, 18 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/030115, dated Aug. 8, 2016, 18 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2015/043949, dated Dec. 1, 2015, 18 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/312,486, dated Oct. 23, 2015, 25 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/504,038, dated Feb. 26, 2016, 22 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/504,061, dated Nov. 4, 2015, 8 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/582,896, dated Jun. 29, 2016, 9 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/666,155, dated Aug. 24, 2016, 9 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/681,625, dated Aug. 12, 2016, 9 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/930,220, dated Sep. 14, 2016, 15 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/504,061, dated Sep. 12, 2016, 7 pages. |
“Philips Vital Signs Camera”, Retrieved From: <http://www.vitalsignscamera.com/> Apr. 15, 2015, Jul. 17, 2013, 2 pages. |
“Restriction Requirement”, U.S. Appl. No. 14/666,155, dated Jul. 22, 2016, 5 pages. |
“The Instant Blood Pressure app estimates blood pressure with your smartphone and our algorithm”, Retrieved at: http://www.instantbloodpressure.com/—on Jun. 23, 2016, 6 pages. |
Arbabian,“A 94GHz mm-Wave to Baseband Pulsed-Radar for Imaging and Gesture Recognition”, 2012 IEEE, 2012 Symposium on VLSI Circuits Digest of Technical Papers, 2012, 2 pages. |
Balakrishnan,“Detecting Pulse from Head Motions in Video”, In Proceedings: CVPR '13 Proceedings of the 2013 IEEE Conference on Computer Vision and Pattern Recognition Available at: <http://people.csail.mit.edu/mrub/vidmag/papers/Balakrishnan_Detecting_Pulse_from_2013_CVPR_paper.pdf>, Jun. 23, 2013, 8 pages. |
Couderc,“Detection of Atrial Fibrillation using Contactless Facial Video Monitoring”, In Proceedings: Heart Rhythm Society, vol. 12, Issue 1 Available at: <http://www.heartrhythmjournal.com/article/S1547-5271(14)00924-2/pdf>, Jan. 2015, 7 pages. |
Espina,“Wireless Body Sensor Network for Continuous Cuff-less Blood Pressure Monitoring”, International Summer School on Medical Devices and Biosensors, 2006, Sep. 2006, 5 pages. |
Godana,“Human Movement Characterization in Indoor Environment using GNU Radio Based Radar”, Retrieved at: http://repository.tudelft.nl/islandora/object/uuid:414e1868-dd00-4113-9989-4c213f1f7094?collection=education, Nov. 30, 2009, 100 pages. |
He,“A Continuous, Wearable, and Wireless Heart Monitor Using Head Ballistocardiogram (BCG) and Head Electrocardiogram (ECG) with a Nanowatt ECG Heartbeat Detection Circuit”, In Proceedings: Thesis, Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology Available at: <http://dspace.mit.edu/handle/1721.1/79221>, Feb. 2013, 137 pages. |
Holleis,“Evaluating Capacitive Touch Input on Clothes”, Proceedings of the 10th International Conference on Human Computer Interaction, Jan. 1, 2008, 10 pages. |
Nakajima,“Development of Real-Time Image Sequence Analysis for Evaluating Posture Change and Respiratory Rate of a Subject in Bed”, In Proceedings: Physiological Measurement, vol. 22, No. 3 Retrieved From: <http://iopscience.iop.org/0967-3334/22/3/401/pdf/0967-3334_22_3_401.pdf> Feb. 27, 2015, Aug. 2001, 8 pages. |
Patel,“Applications of Electrically Conductive Yarns in Technical Textiles”, International Conference on Power System Technology (POWECON), Oct. 30, 2012, 6 pages. |
Poh,“A Medical Mirror for Non-contact Health Monitoring”, In Proceedings: ACM SIGGRAPH Emerging Technologies Available at: <http://affect.media.mit.edu/pdfs/11.Poh-etal-SIGGRAPH.pdf>, 2011, 1 page. |
Poh,“Non-contact, Automated Cardiac Pulse Measurements Using Video Imaging and Blind Source Separation.”, In Proceedings: Optics Express, vol. 18, No. 10 Available at: <http://www.opticsinfobase.org/view_article.cfm?gotourl=http%3A%2F%2Fwww%2Eopticsinfobase%2Eorg%2FDirectPDFAccess%2F77B04D55%2DBC95%2D6937%2D5BAC49A426378C02%5F199381%2Foe%2D18%2D10%2D10762%2Ep, May 7, 2010, 13 pages. |
Pu,“Whole-Home Gesture Recognition Using Wireless Signals”, MobiCom '13 Proceedings of the 19th annual international conference on Mobile computing & networking, Aug. 27, 2013, 12 pages. |
Wang,“Exploiting Spatial Redundancy of Image Sensor for Motion Robust rPPG”, In Proceedings: IEEE Transactions on Biomedical Engineering, vol. 62, Issue 2, Jan. 19, 2015, 11 pages. |
Wang,“Micro-Doppler Signatures for Intelligent Human Gait Recognition Using a UWB Impulse Radar”, 2011 IEEE International Symposium on Antennas and Propagation (APSURSI), Jul. 3, 2011, pp. 2103-2106. |
Wijesiriwardana,“Capacitive Fibre-Meshed Transducer for Touch & Proximity Sensing Applications”, IEEE Sensors Journal, IEEE Service Center, Oct. 1, 2005, 5 pages. |
Zhadobov,“Millimeter-wave Interactions with the Human Body: State of Knowledge and Recent Advances”, International Journal of Microwave and Wireless Technologies, Mar. 1, 2011, 11 pages. |
Zhang,“Study of the Structural Design and Capacitance Characteristics of Fabric Sensor”, Advanced Materials Research (vols. 194-196), Feb. 21, 2011, 8 pages. |
“Corrected Notice of Allowance”, U.S. Appl. No. 15/362,359, dated Sep. 17, 2018, 10 pages. |
“Final Office Action”, U.S. Appl. No. 14/504,121, dated Jul. 9, 2018, 23 pages. |
“Final Office Action”, U.S. Appl. No. 15/166,198, dated Sep. 27, 2018, 33 pages. |
“Foreign Office Action”, Japanese Application No. 2018-501256, dated Jul. 24, 2018, 11 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/287,253, dated Sep. 7, 2018, 20 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/504,139, dated Oct. 5, 2018, 16 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/286,512, dated Jul. 19, 2018, 15 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/142,829, dated Aug. 16, 2018, 15 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/362,359, dated Aug. 3, 2018, 8 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/874,955, dated Oct. 4, 2018, 8 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/142,619, dated Aug. 13, 2018, 9 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/595,649, dated Sep. 14, 2018, 8 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/586,174, dated Sep. 24, 2018, 5 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 15/286,495, dated Sep. 10, 2018, 4 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 15/287,359, dated Jul. 24, 2018, 2 pages. |
“Restriction Requirement”, U.S. Appl. No. 15/286,537, dated Aug. 27, 2018, 8 pages. |
“Final Office Action”, U.S. Appl. No. 14/518,863, dated Apr. 5, 2018, 21 pages. |
“Final Office Action”, U.S. Appl. No. 14/504,139, dated May 1, 2018, 14 pages. |
“Final Office Action”, U.S. Appl. No. 15/595,649, dated May 23, 2018, 13 pages. |
“Final Office Action”, U.S. Appl. No. 15/142,689, dated Jun. 1, 2018, 16 pages. |
“Final Office Action”, U.S. Appl. No. 14/874,955, dated Jun. 11, 2018, 9 pages. |
“Final Office Action”, U.S. Appl. No. 14/959,901, dated Jun. 15, 2018, 21 pages. |
“Final Office Action”, U.S. Appl. No. 15/286,152, dated Jun. 26, 2018, 25 pages. |
“Final Office Action”, U.S. Appl. No. 15/267,181, dated Jun. 7, 2018, 31 pages. |
“First Action Interview Office Action”, U.S. Appl. No. 15/166,198, dated Apr. 25, 2018, 8 pages. |
“Foreign Office Action”, European Application No. 16784352.3, dated May 16, 2018, 3 pages. |
“Foreign Office Action”, Chinese Application No. 201721290290.3, dated Jun. 6, 2018, 3 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/287,253, dated Apr. 5, 2018, 17 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/586,174, dated Jun. 18, 2018, 7 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/862,409, dated Jun. 6, 2018, 7 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 15/362,359, dated May 17, 2018, 4 pages. |
“Preliminary Report on Patentability”, PCT Application No. PCT/US2016/055671, dated Apr. 10, 2018, 9 pages. |
“Written Opinion”, PCT Application No. PCT/US2017/032733, dated Jul. 24, 2017, 5 pages. |
“Corrected Notice of Allowance”, U.S. Appl. No. 14/312,486, dated Oct. 28, 2016, 4 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/024289, dated Aug. 25, 2016, 17 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/518,863, dated Oct. 14, 2016, 16 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/312,486, dated Oct. 7, 2016, 15 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/582,896, dated Nov. 7, 2016, 5 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 14/513,875, dated Oct. 21, 2016, 3 pages. |
Cheng,“Smart Textiles: From Niche to Mainstream”, IEEE Pervasive Computing, Jul. 2013, pp. 81-84. |
Farringdon,“Wearable Sensor Badge & Sensor Jacket for Context Awareness”, Third International Symposium on Wearable Computers, Oct. 1999, 7 pages. |
Pu,“Gesture Recognition Using Wireless Signals”, Oct. 2014, pp. 15-18. |
Schneegass,“Towards a Garment OS: Supporting Application Development for Smart Garments”, Wearable Computers, ACM, Sep. 2014, 6 pages. |
“Combined Search and Examination Report”, GB Application No. 1620891.0, dated May 31, 2017, 9 pages. |
“Final Office Action”, U.S. Appl. No. 15/398,147, dated Jun. 30, 2017, 11 pages. |
“Final Office Action”, U.S. Appl. No. 14/874,955, dated Jun. 30, 2017, 9 pages. |
“Final Office Action”, U.S. Appl. No. 14/959,799, dated Jul. 19, 2017, 12 pages. |
“Final Office Action”, U.S. Appl. No. 14/504,121, dated Aug. 8, 2017, 16 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/063874, dated May 11, 2017, 19 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/862,409, dated Jun. 22, 2017, 15 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/959,730, dated Jun. 23, 2017, 14 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/513,875, dated Jun. 28, 2017, 7 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/343,067, dated Jul. 27, 2017, 9 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/504,038, dated Aug. 7, 2017, 17 pages. |
“Corrected Notice of Allowance”, U.S. Appl. No. 14/582,896, dated Feb. 23, 2017, 2 pages. |
“International Preliminary Report on Patentability”, Application No. PCT/US2015/043963, dated Feb. 16, 2017, 12 pages. |
“International Preliminary Report on Patentability”, Application No. PCT/US2015/030388, dated Dec. 15, 2016, 12 pages. |
“International Preliminary Report on Patentability”, Application No. PCT/US2015/043949, dated Feb. 16, 2017, 13 pages. |
“International Preliminary Report on Patentability”, Application No. PCT/US2015/044774, dated Mar. 2, 2017, 8 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/062082, dated Feb. 23, 2017, 12 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/055671, dated Dec. 1, 2016, 14 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/513,875, dated Feb. 21, 2017, 9 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/874,955, dated Feb. 27, 2017, 8 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/398,147, dated Mar. 9, 2017, 10 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 14/959,730, dated Feb. 15, 2017, 3 pages. |
“Corrected Notice of Allowance”, U.S. Appl. No. 14/312,486, dated Jan. 23, 2017, 4 pages. |
“Corrected Notice of Allowance”, U.S. Appl. No. 14/504,061, dated Dec. 27, 2016, 2 pages. |
“Corrected Notice of Allowance”, U.S. Appl. No. 14/582,896, dated Feb. 6, 2017, 2 pages. |
“Corrected Notice of Allowance”, U.S. Appl. No. 14/582,896, dated Dec. 19, 2016, 2 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/504,121, dated Jan. 9, 2017, 13 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/504,139, dated Jan. 27, 2017, 10 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/959,799, dated Jan. 27, 2017, 10 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/930,220, dated Feb. 2, 2017, 8 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 14/494,863, dated Jan. 27, 2017, 5 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 14/959,901, dated Feb. 10, 2017, 3 pages. |
“Final Office Action”, U.S. Appl. No. 15/142,619, dated Feb. 8, 2018, 15 pages. |
“Final Office Action”, U.S. Appl. No. 15/093,533, dated Mar. 21, 2018, 19 pages. |
“First Action Interview Office Action”, U.S. Appl. No. 15/286,152, dated Mar. 1, 2018, 5 pages. |
“Foreign Office Action”, Chinese Application No. 201721290290.3, dated Mar. 9, 2018, 2 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/267,181, dated Feb. 8, 2018, 29 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/874,955, dated Feb. 8, 2018, 7 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/959,730, dated Feb. 22, 2018, 8 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 15/166,198, dated Mar. 8, 2018, 8 pages. |
“Pre-Interview First Office Action”, U.S. Appl. No. 15/286,152, dated Feb. 8, 2018, 4 pages. |
“Final Office Action”, U.S. Appl. No. 15/286,512, dated Dec. 26, 2018, 15 pages. |
“International Preliminary Report on Patentability”, PCT Application No. PCT/US2017/032733, dated Nov. 29, 2018, 7 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/959,901, dated Oct. 11, 2018, 22 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/286,152, dated Oct. 19, 2018, 27 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/286,837, dated Oct. 26, 2018, 10 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/286,537, dated Nov. 19, 2018, 18 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/287,155, dated Dec. 10, 2018, 12 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/142,689, dated Oct. 30, 2018, 9 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/287,200, dated Nov. 6, 2018, 19 pages. |
“Written Opinion”, PCT Application No. PCT/US2017/051663, dated Oct. 12, 2018, 8 pages. |
“Apple Watch Used Four Sensors to Detect your Pulse”, retrieved from http://www.theverge.com/2014/9/9/6126991 / apple-watch-four-back-sensors-detect-activity on Sep. 23, 2017 as cited in PCT search report for PCT Application No. PCT/US2016/026756 dated Nov. 10, 2017; The Verge, paragraph 1, Sep. 9, 2014, 4 pages. |
“Clever Toilet Checks on Your Health”, CNN.Com; Technology, Jun. 28, 2005, 2 pages. |
“Final Office Action”, U.S. Appl. No. 14/681,625, dated Dec. 7, 2016, 10 pages. |
“Final Office Action”, U.S. Appl. No. 14/731,195, dated Oct. 11, 2018, 12 pages. |
“Final Office Action”, U.S. Appl. No. 14/715,454, dated Sep. 7, 2017, 14 pages. |
“Final Office Action”, U.S. Appl. No. 14/720,632, dated Jan. 9, 2018, 18 pages. |
“Final Office Action”, U.S. Appl. No. 14/715,454, dated Apr. 17, 2018, 19 pages. |
“Final Office Action”, U.S. Appl. No. 14/599,954, dated Aug. 10, 2016, 23 pages. |
“Final Office Action”, U.S. Appl. No. 14/699,181, dated May 4, 2018, 41 pages. |
“Final Office Action”, U.S. Appl. No. 14/715,793, dated Sep. 12, 2017, 7 pages. |
“Final Office Action”, U.S. Appl. No. 14/809,901, dated Dec. 13, 2018, 7 pages. |
“First Action Interview OA”, U.S. Appl. No. 14/715,793, dated Jun. 21, 2017, 3 pages. |
“First Action Interview Office Action”, U.S. Appl. No. 15/142,471, dated Feb. 5, 2019, 29 pages. |
“First Action Interview Office Action”, U.S. Appl. No. 14/731,195, dated Jun. 21, 2018, 4 pages. |
“First Action Interview Pilot Program Pre-Interview Communication”, U.S. Appl. No. 14/731,195, dated Aug. 1, 2017, 3 pages. |
“First Examination Report”, GB Application No. 1621332.4, dated May 16, 2017, 7 pages. |
“Foreign Office Action”, Chinese Application No. 201580034536.8, dated Oct. 9, 2018. |
“Foreign Office Action”, KR Application No. 10-2016-7036023, dated Aug. 11, 2017, 10 pages. |
“Foreign Office Action”, Chinese Application No. 201580036075.8, dated Jul. 4, 2018, 14 page. |
“Foreign Office Action”, JP App. No. 2016-567813, dated Jun. 16, 2018, 3 pages. |
“Foreign Office Action”, Japanese Application No. 2016-567839, dated Apr. 3, 2018, 3 pages. |
“Foreign Office Action”, KR Application No. 10-2016-7035397, dated Sep. 20, 2017, 5 pages. |
“Foreign Office Action”, Korean Application No. 1020187012629, dated May 24, 2018, 6 pages. |
“Foreign Office Action”, EP Application No. 15170577.9, dated May 30, 2017, 7 pages. |
“Foreign Office Action”, Korean Application No. 10-2016-7036396, dated Jan. 3, 2018, 7 pages. |
“Foreign Office Action”, JP Application No. 2016567813, dated Sep. 22, 2017, 8 pages. |
“Foreign Office Action”, Japanese Application No. 2018021296, dated Dec. 25, 2018, 8 pages. |
“Foreign Office Action”, EP Application No. 15754323.2, dated Mar. 9, 2018, 8 pages. |
“Foreign Office Action ”, CN Application No. 201580034908.7, dated Jul. 3, 2018, 12 pages. |
“International Preliminary Report on Patentability”, PCT Application No. PCT/US2016/026756, dated Oct. 10, 2017, 8 pages. |
“International Search Report and Written Opinion”, PCT Application No. PCT/US2016/065295, dated Mar. 14, 2017, 5 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/042013, dated Oct. 26, 2016, 12 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/034366, dated Nov. 17, 2016, 13 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2016/033342, dated Oct. 27, 2016, 20 pages. |
“Life:X Lifestyle eXplorer”, Retrieved from <https://web.archive.org/web/20150318093841/http://research.microsoft.com/en-us/projects/lifex >, Feb. 3, 2017, 2 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/596,702, dated Jan. 4, 2019, 10 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/666,155, dated Feb. 3, 2017, 12 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/809,901, May 24, 2018, 13 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/720,632, dated Jun. 14, 2017, 16 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/715,454, dated Jan. 11, 2018, 16 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/599,954, dated Jan. 26, 2017, 16 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/599,954, dated Feb. 2, 2016, 17 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/720,632, May 18, 2018, 20 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/699,181, dated Oct. 18, 2017, 33 pages. |
“Non-Final Office Action”, U.S. Appl. No. 14/681,625, dated Mar. 6, 2017, 7 pages. |
“Non-Invasive Quantification of Peripheral Arterial Volume Distensibilitiy and its Non-Lineaer Relationship with Arterial Pressure”, Journal of Biomechanics, Pergamon Press, vol. 42, No. 8; as cited in the search report for PCT/US2016/013968 citing the whole document, but in particular the abstract, dated May 29, 2009, 2 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/599,954, dated May 24, 2017, 11 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/715,793, dated Jul. 6, 2018, 5 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/286,495, dated Jan. 17, 2019, 5 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/595,649, dated Jan. 3, 2019, 5 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/715,793, dated Dec. 18, 2017, 5 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/666,155, dated Feb. 20, 2018, 5 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/666,155, dated Jul. 10, 2017, 7 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/681,625, dated Jun. 7, 2017, 7 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/681,625, dated Oct. 23, 2017, 8 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/142,829, dated Feb. 6, 2019, 8 pages. |
“Notice of Allowance”, U.S. Appl. No. 14/504,137, dated Feb. 6, 2019, 9 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 15/142,471, dated Dec. 12, 2018, 3 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 14/715,793, dated Mar. 20, 2017, 3 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 14/715,454, dated Apr. 14, 2017, 3 pages. |
“Pre-Interview Office Action”, U.S. Appl. No. 14/731,195, dated Dec. 20, 2017, 4 pages. |
“Preliminary Report on Patentability”, PCT Application No. PCT/US2016/065295, dated Jul. 24, 2018, 18 pages. |
“Preliminary Report on Patentability”, PCT Application No. PCT/US2016/042013, dated Jan. 30, 2018, 7 pages. |
“Preliminary Report on Patentability”, PCT Application No. PCT/US2016/032307, dated Dec. 7, 2017, 9 pages. |
“Pressure-Volume Loop Analysis in Cardiology”, retrieved from https://en.wikipedia.org/w/index.php?t itle=Pressure-volume loop analysis in card iology&oldid=636928657 on Sep. 23, 2017; Obtained per link provided in search report from PCT/US2016/01398 dated Jul. 28, 2016, Dec. 6, 2014, 10 pages. |
“Restriction Requirement”, U.S. Appl. No. 15/462,957, dated Jan. 4, 2019, 6 pages. |
“Restriction Requirement”, U.S. Appl. No. 15/352,194, dated Feb. 6, 2019, 8 pages. |
“The Dash smart earbuds play back music, and monitor your workout”, Retrieved from < http://newatlas.com/bragi-dash-tracking-earbuds/30808/>, Feb. 13, 2014, 3 pages. |
“Thermofocus No Touch Forehead Thermometer”, Technimed, Internet Archive. Dec. 24, 2014. https://web.archive.org/web/20141224070848/http://www.tecnimed.it:80/thermofocus-forehead-thermometer-H1N1-swine-flu.html, Dec. 24, 2018, 4 pages. |
“Written Opinion”, PCT Application No. PCT/US2016/042013, dated Feb. 2, 2017, 6 pages. |
“Written Opinion”, PCT Application No. PCT/US2016/026756, dated Nov. 10, 2016, 7 pages. |
“Written Opinion”, PCT Application No. PCT/US2016/065295, dated Apr. 13, 2018, 8 pages. |
“Written Opinion”, PCT Application PCT/US2016/013968, dated Jul. 28, 2016, 9 pages. |
Duncan, David P. “Motion Compensation of Synthetic Aperture Radar”, Microwave Earth Remote Sensing Laboratory, Brigham Young University, Apr. 15, 2003, 5 pages. |
Ishijima, Masa “Unobtrusive Approaches to Monitoring Vital Signs at Home”, Medical & Biological Engineering and Computing, Springer, Berlin, DE, vol. 45, No. 11 as cited in search report for PCT/US2016/013968 dated Jul. 28, 2016, Sep. 26, 2007, 3 pages. |
Klabunde, Richard E. “Ventricular Pressure-Volume Loop Changes in Valve Disease”, Retrieved From <https://web.archive.org/web/20101201185256/http://cvphysiology.com/Heart%20Disease/HD009.htm>, Dec. 1, 2010, 8 pages. |
Matthews, Robert J. “Venous Pulse”, Retrieved at: http://www.rjmatthewsmd.com/Definitions/venous_pulse.htm—on Nov. 30, 2016, Apr. 13, 2013, 7 pages. |
Otto, Chris et al., “System Architecture of a Wireless Body Area Sensor Network for Ubiquitous Health Monitoring”, Journal of Mobile Multimedia; vol. 1, No. 4, Jan. 10, 2006, 20 pages. |
Palese, et al., “The Effects of Earphones and Music on the Temperature Measured by Infrared Tympanic Thermometer: Preliminary Results”, ORL-head and neck nursing: official journal of the Society of Otorhinolaryngology and Head-Neck Nurses 32.2, Jan. 1, 2013, pp. 8-12. |
Pu, Qifan et al., “Whole-Home Gesture Recognition Using Wireless Signals”, MobiCom'13, Sep. 30-Oct. 4, Miami, FL, USA, 2013, 12 pages. |
Pu, Qifan et al., “Whole-Home Gesture Recognition Using Wireless Signals”, Proceedings of the 19th annual international conference on Mobile computing & networking (MobiCom'13), US, ACM, Sep. 30, 2013, pp. 27-38, Sep. 30, 2013, 12 pages. |
Zhadobov, Maxim et al., “Millimeter-Wave Interactions with the Human Body: State of Knowledge and Recent Advances”, International Journal of Microwave and Wireless Technologies, p. 1 of 11, Cambridge University Press and the European Microwave Association, 2011, doi:10.1017/S1759078711000122, 2011. |
“Final Office Action”, U.S. Appl. No. 15/287,155, dated Apr. 10, 2019, 11 pages. |
“Final Office Action”, U.S. Appl. No. 15/286,537, dated Apr. 19, 2019, 21 pages. |
“Final Office Action”, U.S. Appl. No. 15/287,253, dated Apr. 2, 2019, 10 pages. |
“Foreign Office Action”, Japanese Application No. 2018501256, dated Feb. 26, 2019, 3 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/166,198, dated Feb. 21, 2019, 48 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/287,394, dated Mar. 22, 2019, 39 pages. |
“Non-Final Office Action”, U.S. Appl. No. 16/238,464, dated Mar. 7, 2019, 15 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/286,152, dated Mar. 5, 2019, 23 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/286,837, dated Mar. 6, 2019, 7 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/703,511, dated Apr. 16, 2019, 5 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/286,512, dated Apr. 9, 2019, 14 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 15/917,238, dated May 1, 2019, 6 pages. |
“Pre-Interview Communication”, U.S. Appl. No. 15/703,511, dated Feb. 11, 2019, 5 pages. |
“Final Office Action”, U.S. Appl. No. 14/959,901, dated May 30, 2019, 18 pages. |
“Final Office Action”, U.S. Appl. No. 15/142,471, dated Jun. 20, 2019, 26 pages. |
“Final Office Action”, U.S. Appl. No. 16/238,464, dated Jul. 25, 2019, 15 pages. |
“First Action Interview Office Action”, U.S. Appl. No. 15/917,238, Jun. 6, 2019, 6 pages. |
“Foreign Office Action”, Korean Application No. 10-2016-7036015, Oct. 15, 2018, 3 pages. |
“International Preliminary Report on Patentability”, PCT Application No. PCT/US2017/051663, Jun. 20, 2019, 10 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/424,263, dated May 23, 2019, 12 pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/462,957, dated May 24, 2019, 14 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/352,194, dated Jun. 26, 2019, 8 pages. |
“Notice of Allowance”, U.S. Appl. No. 15/287,155, dated Jul. 25, 2019, 7 pages. |
Kubota, et al., “A Gesture Recognition Approach by using Microwave Doppler Sensors”, IPSJ SIG Technical Report, 2009 (6), Information Processing Society of Japan, Apr. 15, 2010, pp. 1-8, Apr. 15, 2010, 12 pages. |
“Final Office Action”, U.S. Appl. No. 15/287,394, dated Sep. 30, 2019, 38 Pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/286,537, dated Sep. 3, 2019, 28 Pages. |
“Non-Final Office Action”, U.S. Appl. No. 15/791,044, dated Sep. 30, 2019, 22 Pages. |
“Notice of Allowance”, U.S. Appl. No. 15/917,238, dated Aug. 21, 2019, 13 pages. |
“Notice of Allowance”, U.S. Appl. No. 16/389,402, dated Aug. 21, 2019, 7 Pages. |
“Notice of Allowance”, U.S. Appl. No. 15/287,253, dated Aug. 26, 2019, 13 Pages. |
“Notice of Allowance”, U.S. Appl. No. 16/356,748, dated Oct. 17, 2019, 9 Pages. |
Number | Date | Country |
---|---|---|
62237975 | Oct 2015 | US |