Methods and apparatus for autocalibration of a wearable electrode sensor system

Information

  • Patent Grant
  • Patent Number
    11,941,176
  • Date Filed
    Wednesday, December 21, 2022
  • Date Issued
    Tuesday, March 26, 2024
Abstract
Methods and systems used in calibrating the position and/or orientation of a wearable device configured to be worn on a wrist or forearm of a user. The method comprises sensing a plurality of neuromuscular signals from the user using a plurality of sensors arranged on the wearable device, providing the plurality of neuromuscular signals and/or signals derived from the plurality of neuromuscular signals as inputs to one or more trained autocalibration models, determining, based at least in part on the output of the one or more trained autocalibration models, a current position and/or orientation of the wearable device on the user, and generating a control signal based, at least in part, on the current position and/or orientation of the wearable device on the user and the plurality of neuromuscular signals.
Description
BACKGROUND

Some smart wearable devices that detect user-generated signals can be worn by users in various orientations and positions on the body. A common issue for these types of devices is that signal detection can be negatively affected if the wearable device is worn at a location or in an orientation for which the device was not optimized.


SUMMARY

Some embodiments are directed to a system for calibrating the position and/or orientation of a wearable device configured to be worn on a wrist or forearm of a user. The system comprises a plurality of sensors arranged on the wearable device, wherein the plurality of sensors are configured to continuously sense a plurality of neuromuscular signals from the user, and at least one computer processor. The at least one computer processor is programmed to provide the plurality of neuromuscular signals and/or signals derived from the plurality of neuromuscular signals as inputs to one or more trained autocalibration models, determine based, at least in part, on the output of the one or more trained autocalibration models, a current position and/or orientation of the wearable device on the user, and generate a control signal based, at least in part, on the current position and/or orientation of the wearable device on the user and the plurality of neuromuscular signals.


In one aspect, the at least one computer processor is programmed to determine the current position and/or orientation of the wearable device on the user without the user performing a particular pose or gesture during sensing of the plurality of neuromuscular signals.


In another aspect, the at least one computer processor is programmed to process the sensed plurality of neuromuscular signals prior to providing the processed neuromuscular signals to the one or more autocalibration models.


In another aspect, the at least one programmed processor is further programmed to generate calibrated neuromuscular signals based, at least in part, on the current position and/or orientation of the wearable device and the plurality of neuromuscular signals, and generating a control signal comprises generating a control signal based, at least in part, on the calibrated neuromuscular signals.


In another aspect, the at least one programmed processor is further programmed to select or modify one or more inference models based, at least in part, on the current position and/or orientation of the wearable device, and provide the plurality of neuromuscular signals as input to the selected or modified one or more inference models, and generating a control signal is further based, at least in part, on an output of the selected or modified one or more inference models.


In another aspect, the one or more autocalibration models used to determine the current position and/or orientation of the wearable device on the user include a neural network.


In another aspect, the neural network is an LSTM neural network.


In another aspect, the LSTM neural network includes at least one pooling layer.


In another aspect, the at least one pooling layer comprises a max pooling layer.


In another aspect, the at least one pooling layer provides rotation invariance of ±1 sensor locations.


In another aspect, the one or more autocalibration models are trained to identify the orientation and/or location of the wearable device based on neuromuscular signals obtained from a plurality of users.


In another aspect, the neuromuscular signals obtained from a plurality of users were obtained as each of the plurality of users performed a plurality of hand gestures and/or poses while the wearable device was oriented and/or positioned in multiple different configurations on a forearm and/or wrist of the user.


In another aspect, the plurality of neuromuscular sensors comprises a plurality of electromyography (EMG) sensors.


In another aspect, the at least one computer processor is further programmed to identify that at least a portion of the wearable device has moved on the wrist or forearm of the user, and determine the current position and/or orientation of the wearable device on the user in response to identifying that at least a portion of the wearable device has moved on the wrist or forearm of the user.


In another aspect, identifying the at least a portion of the wearable device has moved on the wrist or forearm of the user comprises determining that the wearable device has rotated around and/or translated along the user's wrist or forearm.


In another aspect, identifying the at least a portion of the wearable device has moved on the wrist or forearm of the user comprises detecting one or more movement artifacts in the sensed plurality of neuromuscular signals.


In another aspect, the system further comprises at least one inertial measurement unit (IMU) sensor, and identifying the at least a portion of the wearable device has moved on the wrist or forearm of the user is based, at least in part, on at least one signal sensed by the at least one IMU sensor.


In another aspect, determining the current position and/or orientation of the wearable device comprises determining a rotation of the wearable device relative to a virtual reference orientation of the wearable device.


In another aspect, the control signal is a control signal to control one or more operations of or within a virtual reality system or an augmented reality system.


Some embodiments are directed to a method for calibrating the position and/or orientation of a wearable band on a user. The method comprises sensing a plurality of neuromuscular signals from the user using a plurality of sensors arranged on the wearable device, providing the plurality of neuromuscular signals and/or signals derived from the plurality of neuromuscular signals as inputs to one or more trained autocalibration models, determining based, at least in part, on the output of the one or more trained autocalibration models, a current position and/or orientation of the wearable device on the user, and generating a control signal based, at least in part, on the current position and/or orientation of the wearable device on the user and the plurality of neuromuscular signals.


In one aspect, determining the current position and/or orientation of the wearable device comprises determining the current position and/or orientation of the wearable device without the user performing a particular pose or gesture during sensing of the plurality of neuromuscular signals.


In another aspect, the method further comprises processing the sensed plurality of neuromuscular signals prior to providing the processed neuromuscular signals to the one or more autocalibration models.


In another aspect, the method further comprises generating calibrated neuromuscular signals based, at least in part, on the current position and/or orientation of the wearable device and the plurality of neuromuscular signals, and generating a control signal comprises generating a control signal based, at least in part, on the calibrated neuromuscular signals. In another aspect, the method further comprises selecting or modifying one or more inference models based, at least in part, on the current position and/or orientation of the wearable device, and providing the plurality of neuromuscular signals as input to the selected or modified one or more inference models, wherein generating a control signal is further based, at least in part, on an output of the selected or modified one or more inference models.


In another aspect, the one or more autocalibration models used to determine the current position and/or orientation of the wearable device on the user include a neural network. In another aspect, the one or more autocalibration models are trained to identify the orientation and/or location of the wearable device based on neuromuscular signals obtained from a plurality of users.


In another aspect, the neuromuscular signals obtained from a plurality of users were obtained as each of the plurality of users performed a plurality of hand gestures and/or poses while the wearable device was oriented and/or positioned in multiple different configurations on a forearm and/or wrist of the user.


In another aspect, determining the current position and/or orientation of the wearable device comprises determining a rotation of the wearable device relative to a virtual reference orientation of the wearable device.


In another aspect, the method further comprises identifying that at least a portion of the wearable device has moved on the wrist or forearm of the user, and determining the current position and/or orientation of the wearable device on the user in response to identifying that at least a portion of the wearable device has moved on the wrist or forearm of the user.


In another aspect, identifying the at least a portion of the wearable device has moved on the wrist or forearm of the user comprises determining that the wearable device has rotated around and/or translated along the user's wrist or forearm.


In another aspect, identifying the at least a portion of the wearable device has moved on the wrist or forearm of the user comprises detecting one or more movement artifacts in the sensed plurality of neuromuscular signals.


In another aspect, identifying the at least a portion of the wearable device has moved on the wrist or forearm of the user is based, at least in part, on at least one signal sensed by at least one inertial measurement unit (IMU) sensor.


Some embodiments are directed to a method for training an inference model for autocalibration of a wearable device. The method comprises receiving neuromuscular signals sensed from a plurality of users as the wearable device was worn by each of the users while performing a plurality of hand gestures and/or poses, wherein for each of the plurality of hand gestures and/or poses, the wearable device was oriented or positioned differently on the user, labeling the received neuromuscular signals for each of the plurality of hand gestures and/or poses based on the orientation or position of the wearable device during sensing of the corresponding neuromuscular signals, and training one or more autocalibration models based, at least in part, on the labeled neuromuscular signals.


In one aspect, the method further comprises extracting, from the received neuromuscular signals, a plurality of templates, where each of the plurality of templates corresponds to one of the plurality of hand gestures and/or poses, and generating calibrated neuromuscular signals based, at least in part, on the extracted plurality of templates, and training the one or more autocalibration models comprises training the one or more autocalibration models based, at least in part, on the calibrated neuromuscular signals.


In another aspect, the method further comprises predicting a plurality of rotation offsets of the wearable device using a plurality of simulations of different orientations.


In another aspect, the one or more autocalibration models include a neural network.


In another aspect, the neural network is an LSTM neural network.


Some embodiments are directed to a non-transitory computer-readable storage medium encoded with a plurality of processor-executable instructions that, when executed by one or more computer processors, perform one or more of the foregoing methods.


It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the inventive subject matter disclosed herein. In particular, all combinations of claimed subject matter appearing at the end of this disclosure are contemplated as being part of the inventive subject matter disclosed herein.





BRIEF DESCRIPTION OF DRAWINGS

Various non-limiting embodiments of the technology will be described with reference to the following figures. It should be appreciated that the figures are not necessarily drawn to scale.



FIG. 1 illustrates a wearable system with sixteen EMG sensors arranged circumferentially around a band configured to be worn around a user's lower arm or wrist, in accordance with some embodiments of the technology described herein;



FIGS. 2A and 2B schematically illustrate a computer-based system that includes a wearable portion and a dongle portion, respectively, in accordance with some embodiments of the technology described herein;



FIG. 3 is a plot illustrating an example distribution of outputs generated by an autocalibration model trained in accordance with some embodiments. The distribution of outputs is generated across a dataset with data collected from different users;



FIG. 4 shows predicted values output from an autocalibration model trained in accordance with some embodiments. The predicted values are averaged across time, which results in model predictions with greater confidence values;



FIG. 5 shows a plot of the accuracy of an autocalibration model trained in accordance with some embodiments. The accuracy is expressed as the Area Under the Receiver Operating Characteristic Curve (AUC);



FIG. 6 shows a plot of a correlation between a baseline model and an augmented model for autocalibrating sensors of a wearable device in accordance with some embodiments;



FIG. 7 shows a plot of correlations of augmented inference models trained with electrode invariances set to ±1, ±2, and ±3 along with the baseline model in accordance with some embodiments;



FIG. 8 schematically illustrates a visualization of an energy plot for a single EMG electrode on a wearable device at a first position and a second position in which the wearable device has been rotated;



FIG. 9 illustrates a flowchart of a process for offline training of an autocalibration model based on neuromuscular signals recorded from a plurality of users in accordance with some embodiments; and



FIG. 10 illustrates a flowchart of a process for using the trained autocalibration model to calibrate the position and/or orientation of sensors on a wearable device in accordance with some embodiments.





DETAILED DESCRIPTION

The disclosed methods and apparatus herein relate to autocalibration of sensor systems such that the systems perform well when the sensors are worn by users in various orientations and/or locations on a body part. In some embodiments of the technology described herein, sensor signals may be used to predict information about a position and/or a movement of one or more portions of a user's body (e.g., a leg, an arm, and/or a hand), which may be represented as a multi-segment articulated rigid-body system with joints connecting the multiple segments of the rigid-body system. For example, in the case of a hand movement, signals sensed by wearable neuromuscular sensors placed at locations on the user's body (e.g., the user's arm and/or wrist) may be provided as input to one or more inference models trained to predict estimates of the position (e.g., absolute position, relative position, orientation) and the force(s) associated with a plurality of rigid segments in a computer-based musculoskeletal representation associated with a hand, for example, when the user performs one or more hand movements. The combination of position information and force information associated with segments of the musculoskeletal representation associated with a hand may be referred to herein as a “handstate” of the musculoskeletal representation. As a user performs different movements, a trained inference model may interpret neuromuscular signals sensed by the wearable neuromuscular sensors into position and force estimates (handstate information) that are used to update the musculoskeletal representation.


In certain embodiments, processed neuromuscular signals and/or neuromuscular signal patterns (e.g., EMG signals and/or EMG signal patterns) are collected from multiple users, and the processed signals and/or signal patterns are used to generate generalized models to predict the orientation and/or location of an armband containing EMG sensors. For example, spatiotemporal patterns of EMG signals can be obtained as users perform different handstate configurations with or without system supervision (e.g., gestures or poses comprising various pinches and wrist movements in the four cardinal directions) when the armband is positioned on the forearm and/or oriented in various ways, and these signals can be used to train generalized inference models to predict the user's gestures based on detected EMG signals. In certain embodiments, the systems and methods disclosed herein comprise identifying a “virtual” reference electrode and mapping the specific orientation of an armband to an offset value associated with an electrode and the “virtual” reference electrode.


As used herein, the term “gestures” may refer to a static or dynamic configuration of one or more body parts including a position of the one or more body parts and forces associated with the configuration. For example, gestures may include discrete gestures, such as placing or pressing the palm of a hand down on a solid surface, or grasping a ball, or pinching two fingers together (e.g., to form a pose); or continuous gestures, such as waving a finger back and forth, grasping and throwing a ball, rotating a wrist in a direction; or a combination of discrete and continuous gestures. Gestures may include covert gestures that may be imperceptible to another person, such as slightly tensing a joint by co-contracting opposing muscles or using sub-muscular activations, or “off-manifold” activations. In training an inference model, gestures may be defined using an application configured to prompt a user to perform the gestures or, alternatively, gestures may be arbitrarily defined by a user. The gestures performed by the user may include symbolic gestures (e.g., gestures mapped to other gestures, interactions, or commands, for example, based on a gesture vocabulary that specifies the mapping). In some cases, hand and arm gestures may be symbolic and used to communicate according to cultural standards.


Following autocalibration of the system(s), in various embodiments, a number of muscular activation states of a user may be identified from the recorded and/or detected signals and/or information based on the signals, to enable improved selection and/or control of objects in the user's environment when those objects are configured to receive control signals. The control of the objects can be performed directly from a neuromuscular activity device or indirectly via another system such as an augmented reality (AR) system or any extended or cross reality system (XR system or environment), including but not limited to mixed reality (MR), virtual reality (VR), etc.


The description set forth below is intended as a description of various configurations of the subject technology and is not intended to represent the only configurations in which the subject technology may be practiced. The appended drawings are incorporated herein and constitute a part of the detailed description. The detailed description includes specific details for the purpose of providing a thorough understanding of the subject technology. It will be clear and apparent, however, that the subject technology is not limited to the specific details set forth herein and may be practiced without these specific details. In some instances, well-known structures and components are shown in block diagram form to avoid obscuring the concepts of the subject technology.


The terms “computer”, “processor”, “computer processor”, “compute device” or the like should be expansively construed to cover any kind of electronic device with data processing capabilities including, by way of non-limiting example, a digital signal processor (DSP), a microcontroller, a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), or any other electronic computing device comprising one or more processors of any kind, or any combination thereof.


As used herein, the phrases “for example,” “such as,” “for instance,” and variants thereof describe non-limiting embodiments of the presently disclosed subject matter. Reference in the specification to “one case”, “some cases”, “other cases”, or variants thereof means that a particular feature, structure or characteristic described in connection with the embodiment(s) is included in at least one embodiment of the presently disclosed subject matter. Thus, the appearance of the phrase “one case”, “some cases”, “other cases” or variants thereof does not necessarily refer to the same embodiment(s).


It is appreciated that, unless specifically stated otherwise, certain features of the presently disclosed subject matter, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the presently disclosed subject matter, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination.



FIG. 1 illustrates a band system with sixteen EMG sensors arranged circumferentially around an elastic band 102 configured to be worn around a user's lower arm. For example, FIG. 1 shows EMG sensors 101 arranged circumferentially (e.g., symmetrically spaced) around elastic band 102. It should be appreciated that any suitable number of neuromuscular sensors may be used and the number and arrangement of neuromuscular sensors used may depend on the particular application for which the wearable device is used. For example, a wearable armband or wristband may be used to generate control information for controlling an augmented reality system, controlling a robot, controlling a vehicle, scrolling through text, controlling a virtual avatar, or performing any other suitable control task. Further, the band system can be configured to be worn around other parts of a user's body, such as their thigh or calf, for example.


In some embodiments, sensors 101 include only a plurality of neuromuscular or muscular sensors or electrodes (e.g., EMG electrodes/sensors, MMG electrodes/sensors, SMG electrodes/sensors, etc.). In other embodiments, sensors 101 include a plurality of neuromuscular sensors and at least one “auxiliary” sensor configured to continuously record a plurality of auxiliary signals. Examples of auxiliary sensors include, but are not limited to, other sensors such as inertial measurement unit (IMU) sensors, microphones, imaging devices (e.g., a camera), radiation-based sensors for use with a radiation-generation device (e.g., a laser-scanning device), or other types of sensors such as a heart-rate monitor.


In some embodiments, the output of one or more of the sensors may be optionally processed using hardware signal processing circuitry (e.g., to perform amplification, filtering, and/or rectification). In other embodiments, at least some signal processing of the output of the sensors may be performed in software. Thus, signal processing of signals sensed by the sensors may be performed in hardware, software, or by any suitable combination of hardware and software, as aspects of the technology described herein are not limited in this respect. A non-limiting example of a signal-processing procedure used to process recorded data from the sensors 101 is discussed in more detail below in connection with FIGS. 2A and 2B.



FIGS. 2A and 2B illustrate a schematic diagram with internal components of a wearable system with sixteen sensors (e.g., EMG sensors), in accordance with some embodiments of the technology described herein. As shown, the wearable system includes a wearable portion 210 (FIG. 2A) and a dongle portion 220 (FIG. 2B). Although not illustrated, the dongle portion 220 is in communication with the wearable portion 210 (e.g., via Bluetooth or another suitable short range wireless communication technology). As shown in FIG. 2A, the wearable portion 210 includes the sensors 101, examples of which are described above in connection with FIG. 1. The sensors 101 provide output (e.g., sensed signals) to an analog front end 230, which performs analog processing (e.g., noise reduction, filtering, etc.) on the sensed signals. Processed analog signals produced by the analog front end 230 are then provided to an analog-to-digital converter 232, which converts the processed analog signals to digital signals that can be processed by one or more computer processors. An example of a computer processor that may be used in accordance with some embodiments is a microcontroller (MCU) 234. As shown in FIG. 2A, the MCU 234 may also receive inputs from other sensors (e.g., an IMU 240) and from a power and battery module 242. As will be appreciated, the MCU 234 may receive data from other devices not specifically shown. The processed output of the MCU 234 may be provided to an antenna 250 for transmission to the dongle portion 220, shown in FIG. 2B. The dongle portion 220 includes an antenna 252 that communicates with the antenna 250 of the wearable portion 210. Communication between the antennas 250 and 252 may occur using any suitable wireless technology and protocol, non-limiting examples of which include radiofrequency signaling and Bluetooth. As shown, the signals received by the antenna 252 of the dongle portion 220 may be provided to a host computer for further processing, for display, and/or for effecting control of a particular physical or virtual object or objects (e.g., to perform a control operation in an AR or VR environment).


Although the examples provided with reference to FIGS. 1, 2A, and 2B are discussed in the context of interfaces with EMG sensors, it is to be understood that the wearable systems described herein can also be implemented with other types of sensors, including, but not limited to, mechanomyography (MMG) sensors, sonomyography (SMG) sensors, and electrical impedance tomography (EIT) sensors.


As described above in connection with FIGS. 2A and 2B, the wearable system may include one or more computer processors (e.g., MCU 234) programmed to communicate with the sensors (e.g., neuromuscular sensors 101 and/or IMU sensor(s) 240). For example, signals recorded by one or more of the sensors may be provided to the processor(s), which may be programmed to execute one or more trained inference models or machine learning models that process signals captured by the sensors.


In some embodiments, the trained inference model(s) may comprise a neural network and, for example, may comprise a recurrent neural network. In some embodiments, the recurrent neural network may be a long short-term memory (LSTM) neural network. It should be appreciated, however, that the recurrent neural network is not limited to being an LSTM neural network and may have any other suitable architecture. For example, in some embodiments, the recurrent neural network may be a fully recurrent neural network, a gated recurrent neural network, a recursive neural network, a Hopfield neural network, an associative memory neural network, an Elman neural network, a Jordan neural network, an echo state neural network, a second order recurrent neural network, and/or any other suitable type of recurrent neural network. In other embodiments, neural networks that are not recurrent neural networks may be used. For example, deep neural networks, convolutional neural networks, and/or feedforward neural networks may be used. In some implementations, the inference model(s) can comprise an unsupervised machine learning model, i.e., users are not required to perform a predetermined set of handstate configurations for which an inference model was previously trained to predict or identify. In other embodiments, the inference model(s) can comprise one or more supervised machine learning models wherein users performed specific gestures or handstate configurations in response to instructions, and the detected and processed signals from the electrodes or sensors have been associated with the performed gestures or handstate configurations.


In some embodiments, one or more inference models (e.g., a neural network as discussed above) can be implemented to classify sets of EMG signals or patterns characterized by unique spatio-temporal patterns that vary in amplitude for different amounts of forces generated by users' muscles or motor units. The processed EMG signals and/or patterns can be associated with a manner or way a user wears the band system during signal sensing (e.g., the orientation and/or positioning of the band on the user's forearm). Accordingly, in one embodiment, the inference model(s) can associate one or more unique EMG signals or patterns with a specific orientation and/or location of the neuromuscular armband on the user (e.g., in a manner in which the electrodes are in contact with certain areas of the user's skin). Likewise, the inference model(s) can associate one or more unique EMG signals or patterns with a rotated orientation of the armband and/or with a re-located positioning of the armband when the user moves the armband from a lower forearm position to an upper forearm position (or the other way around, i.e., from an upper forearm position to a lower forearm position). Thus, the inference model(s) can be trained to associate various armband rotations and/or positional offsets with detected and processed EMG signals or signal patterns. Once the system can identify the specific orientation and/or positioning of the band on the user, the system can select and apply one or more previously-trained inference model(s) at that specific orientation and/or positioning in order to predict, for example, user handstate configurations with a higher level of accuracy compared to other inference model(s) that may have been previously trained using different orientations and/or positions of the band on users. Differently stated, in certain embodiments, the armband system can be calibrated via selection of particular inference model(s) such that the system adapts to any rotation and/or arm position offset without interfering with the user experience, also referred to herein as autocalibration. In other embodiments, the detected offset in orientation and/or relative difference in location of the band on the user compared to the orientation and/or positioning of the band used to train one or more inference models can be used to “normalize” the orientation and/or location of the band for input into the inference model(s). A virtual reference electrode (e.g., the “0” electrode) may be defined, and a rotation offset relative to the virtual reference electrode may be used to normalize the orientation and/or location of the band. For example, it may be determined based on the detected and processed EMG signals that the band has a rotational offset of three electrodes from the virtual reference electrode. In such an instance, the detected EMG signals can be further processed to account for this particular offset (either before or after being input into the trained inference models).
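

To make the normalization step concrete, the following is a minimal illustrative sketch (not taken from the patent; it assumes 16 EMG channels stored in a NumPy array and a rotation offset that has already been detected, such as the three-electrode offset in the example above) of how sensed signals could be re-indexed relative to the virtual reference electrode:

```python
import numpy as np

def normalize_channels(emg, rotation_offset):
    """Re-index EMG channels so the channel aligned with the virtual
    reference ("0") electrode appears first, undoing the detected rotation.

    emg: array of shape (n_samples, n_channels), e.g. (T, 16)
    rotation_offset: detected offset in electrode slots (e.g. 3 means the band
        is rotated three electrode positions from the reference orientation).
    """
    # np.roll performs a circular rotation of the channel axis; shifting by
    # -rotation_offset maps the physical channel under the virtual reference
    # back to index 0.
    return np.roll(emg, shift=-rotation_offset, axis=1)

# Example: a band detected as rotated by three electrodes
emg = np.random.randn(2000, 16)   # 2000 samples from 16 EMG channels
calibrated = normalize_channels(emg, rotation_offset=3)
```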


In some implementations, the armband system can be autocalibrated such that the armband system adapts to users that may have injured or missing muscles, different amounts of adipose tissue (fat), and other anatomic variables. Although discussed below in the context of multiple trained inference models, it is appreciated that the embodiments discussed herein can in some instances be implemented as a single or sole trained inference model or machine learning model. It is also appreciated that at least some of the inference models may be trained from data collected from multiple users. For instance, data can be collected from recorded EMG signals of multiple users while they perform one or more handstate configurations, e.g., poses or gestures.


In one embodiment, the inference model that classifies EMG patterns can be created as follows: 1) build a new inference model/experiment class that takes as input(s) a set of preprocessed EMG signals; 2) generate training data by randomly applying a rotation offset to the preprocessed EMG signals; 3) produce positive labels when the augmented offset is 0, and null otherwise; 4) calibrate the training data to have calibrated data at offset=0; and 5) train an inference model using the calibrated training data, and evaluate the performance of the trained inference model by testing different rotation offsets.
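

A hedged sketch of steps 2) and 3) above, assuming preprocessed EMG windows stored as NumPy arrays with the channel axis last (the function and variable names are illustrative, not from the patent):

```python
import numpy as np

def make_offset_training_data(windows, n_channels=16, rng=None):
    """Apply a random rotation offset to each preprocessed EMG window and
    label it positive (1) only when the applied offset is 0.

    windows: (n_windows, n_samples, n_channels) array, assumed already
             calibrated at offset 0.
    """
    if rng is None:
        rng = np.random.default_rng()
    offsets = rng.integers(0, n_channels, size=len(windows))
    augmented = np.stack(
        [np.roll(w, shift=int(o), axis=1) for w, o in zip(windows, offsets)]
    )
    labels = (offsets == 0).astype(np.int64)   # positive only when offset is 0
    return augmented, labels, offsets
```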


An example of a distribution of outputs generated by a trained inference model across a dataset with data collected from different users is shown in FIG. 3. It can be appreciated that, on average, the trained model outputs show higher confidence values when the training data is calibrated (i.e., offset=0) compared to when the training data is not calibrated. In this example, the trained model produces a prediction for every 80 millisecond (ms) chunk of collected EMG data; however, other time intervals can be used analogously. Also, in this example, a binary classifier is used to determine whether the band is calibrated at the “0” electrode in the preferred orientation. FIG. 4 shows predicted values averaged across time, which results in predictions with greater confidence values. It can be appreciated that by applying a smoothing factor of 10 seconds, the predicted offset can be acquired with greater accuracy, e.g., an accuracy within the ±1 electrode range. FIG. 5 shows the accuracy of the model expressed as the Area Under the Receiver Operating Characteristic Curve (AUC).
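

For illustration, the per-chunk outputs could be smoothed as follows (a sketch assuming a one-dimensional array of per-80 ms classifier probabilities; with 80 ms chunks, the 10-second smoothing window corresponds to 125 chunks):

```python
import numpy as np

def smooth_predictions(chunk_probs, chunk_ms=80, window_s=10.0):
    """Average per-chunk classifier outputs over a sliding window (e.g. 10 s of
    80 ms chunks = 125 chunks) to obtain higher-confidence estimates."""
    window = max(1, int(window_s * 1000 / chunk_ms))
    kernel = np.ones(window) / window
    # 'valid' keeps only windows fully covered by data
    return np.convolve(chunk_probs, kernel, mode="valid")
```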


In some implementations, an electrode invariance factor associated with a small offset error can be provided as an input to an inference model, e.g., during the training phase. Such an inference model is also referred to herein as an augmented inference model. The augmented inference model can be implemented by, for instance, expanding a batch of training data and creating a new dimension for each sample in a training data batch. The new dimension includes concatenations of different versions of the same feature vector with different electrode rotations. For example, given a feature vector of dimension 384, an invariance of ±1 electrode can be set by providing as input to the augmented inference model a feature tensor of dimension 3×384. Such a feature tensor includes the extra dimension containing the same feature vector with the three rotation offsets [−1, 0, 1]. Then, the augmented inference model may be initiated with a first input dense layer, and this extra dimension may be reduced by pooling across it, for instance, by taking the maximum activation over the extra dimension. Accordingly, the augmented inference model applies the same set of coefficients (for the first dense layer) for each of the three offsets, then selects for each filter the largest activation over the three proposed offsets. FIG. 6 illustrates a correlation between a baseline model and the augmented inference model. It should be appreciated that the baseline model is more sensitive to the offset errors, as its performance drops sharply when it is not calibrated (i.e., offset ≠ 0). FIG. 7 illustrates correlations of augmented inference models trained with electrode invariances set to ±1, ±2, and ±3 along with the baseline model.
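

One possible realization of the augmented inference model's input layer and offset pooling, sketched in PyTorch (the layer sizes, helper names, and the assumed feature layout of 16 channels × 24 features = 384 are illustrative assumptions only):

```python
import torch
import torch.nn as nn

def rotated_feature_stack(features, shifts=(-1, 0, 1), n_channels=16):
    """Build the (n_offsets, 384) input by rotating per-channel feature blocks;
    assumes the 384-dimensional vector is laid out as 16 channels x 24 features."""
    per_ch = features.reshape(n_channels, -1)
    return torch.stack(
        [torch.roll(per_ch, shifts=s, dims=0).reshape(-1) for s in shifts]
    )                                          # shape (3, 384)

class OffsetInvariantEncoder(nn.Module):
    """Applies the same first dense layer to each rotated copy of the feature
    vector, then max-pools across the offset dimension."""
    def __init__(self, feat_dim=384, hidden=256):
        super().__init__()
        self.dense = nn.Linear(feat_dim, hidden)   # shared weights for all offsets

    def forward(self, x):                  # x: (batch, n_offsets, feat_dim)
        h = torch.relu(self.dense(x))      # (batch, n_offsets, hidden)
        h, _ = h.max(dim=1)                # largest activation over the offsets
        return h
```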


In another embodiment, the autocalibration process comprises numerically aligning the band position (e.g., rotation, location on the forearm, and/or orientation) via common reference point(s) across different users and their forearms and/or wrists, so that the system can identify virtual reference point(s) for each electrode and can consistently represent the same part of the forearm and/or wrist irrespective of the band position. For example, this can be done by applying an electrode permutation (e.g., reversing the order of the electrodes depending on the band orientation as applied to the forearm, or as applied to the left or right forearms) and applying a circular rotation to the electrodes. FIG. 8 schematically shows a visualization of an energy plot for a single EMG electrode “0” at a first position 810, and at a second position 820 in which the EMG electrode is rotated three positions. The disclosed systems and methods herein can utilize the difference(s) in such energy plots in order to achieve the autocalibration results as described herein.
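

A brief sketch of such an alignment, assuming the electrode permutation reduces to an optional order reversal followed by a circular rotation of the channel axis (the parameter names are illustrative, not from the patent):

```python
import numpy as np

def align_band(emg, rotation_offset, reverse_order=False):
    """Map raw channel indices to a canonical ordering shared across users.

    reverse_order: apply the electrode permutation that reverses channel order
        (e.g., for a band orientation on the opposite forearm).
    rotation_offset: circular rotation, in electrode slots, relative to the
        virtual reference point.
    """
    if reverse_order:
        emg = emg[:, ::-1]                                  # electrode permutation
    return np.roll(emg, shift=-rotation_offset, axis=1)     # circular rotation
```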


In some embodiments, the user need not perform a specific calibration gesture to register the band position. Instead, an autocalibration model can be trained to recognize and predict the specific electrode permutation that generated a given set of EMG signals or patterns. Data augmentation techniques can be implemented on data sets obtained in a supervised machine learning fashion. For example, offline EMG data can be collected from a set of users who performed a set of canonical gestures or poses comprising various pinches and wrist movements in the four cardinal directions to register the band rotations based on the collected and processed EMG data associated with those gestures or poses. With this process, the system can generate labels to train a model (e.g., an autocalibration model) to recognize different simulated band rotations and/or orientations (e.g., data augmentation). The number of gestures or poses upon which the autocalibration model is trained can vary, but may comprise approximately eight gestures. In this embodiment, the autocalibration task can be regarded as a classification task, and more collected and labeled data may lead to better precision on the predicted offset of the electrode(s). During this offline training, the band can be rotated and additional data can be collected from users in each of the rotated positions. For example, if the band has 16 electrode channels, data can be collected from the same user or across multiple users at each of 16 different orientations. Some embodiments employ an autocalibration model to predict that the wearable system is rotated into one of multiple (e.g., 16) positions.



FIG. 9 shows an example of a flowchart for offline training of an autocalibration model in accordance with some embodiments. In act 910, EMG data is sensed as one or more users are performing multiple gestures or poses with different band orientations. As discussed above, any suitable number of gestures or poses (e.g., 8 gestures or poses) may be used. The process then proceeds to act 912, where templates are extracted for each gesture or pose, wherein each of the templates corresponds to spatio-temporal patterns of the sensed EMG signals. The process then proceeds to act 914, where the extracted templates are used to generate calibrated EMG data. Any given electrode channel can be designated the “0” channel electrode at which the system is set, by default, to process and interpret EMG inputs to predict handstate configurations.
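

As one illustration of the template-extraction step in act 912, a simplified spatial template per gesture could be computed as the trial-averaged per-channel RMS energy (the patent describes spatio-temporal patterns; this purely spatial simplification and the function names are assumptions, and keeping the time axis would give a time-resolved variant):

```python
import numpy as np

def extract_gesture_templates(trials, labels):
    """Simplified act 912: one spatial template per gesture, taken as the
    trial-averaged per-channel RMS energy of the sensed EMG.

    trials: (n_trials, n_samples, n_channels) EMG segments
    labels: (n_trials,) gesture/pose identifiers
    """
    templates = {}
    for g in np.unique(labels):
        seg = trials[labels == g]
        rms = np.sqrt((seg ** 2).mean(axis=1))   # per-trial, per-channel RMS
        templates[g] = rms.mean(axis=0)          # (n_channels,) spatial pattern
    return templates
```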


In act 916, each of the band orientations during data collection is assigned an offset value depending on the degree of rotation of the armband when the data was collected. In this way, each labeled data set collected and processed for the specific gestures and/or poses can further be labeled with an offset value depending on the band orientation, and an associated simulated position can be generated for the band. The labeled data for calibrated EMG signals, which have been assigned to gestures and/or poses, and EMG signals identified based on specific electrode channel orientation(s) can be input into an autocalibration model in act 920 to train the model. In some embodiments, the autocalibration model is a neural network (such as an LSTM, for example), and the autocalibration model is trained to predict the orientation of the armband based on received EMG signal data. Once properly trained, the process proceeds to act 922, where the autocalibration model can be implemented for any user to predict the orientation and/or rotation offset of the band (e.g., with respect to the “0” channel) based on received EMG data from the user and simulations of the band positioning and/or orientation based on previously-collected user EMG data. In some embodiments, the model may be trained such that it can predict the degree of rotation or offset to an error of +/−1 electrode channel. After the autocalibration model has been suitably trained to predict the orientation of the band within an error of +/−1 electrode channel, the autocalibration model can be used “online” to predict the band orientation on any given user during the user's performance of one or more motor actions or intended motor actions.
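

A minimal PyTorch sketch of an LSTM-based autocalibration classifier and one supervised training step over rotation-labeled windows, consistent with acts 916 and 920 (the hyperparameters and names are illustrative assumptions, not the patented implementation):

```python
import torch
import torch.nn as nn

class AutocalibrationLSTM(nn.Module):
    """Illustrative LSTM classifier mapping a window of EMG samples to one of
    n_positions rotation offsets (16 for a 16-electrode band)."""
    def __init__(self, n_channels=16, hidden=128, n_positions=16):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_positions)

    def forward(self, x):                  # x: (batch, time, n_channels)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])       # logits over rotation offsets

def train_step(model, optimizer, emg_windows, offset_labels):
    """One supervised step on windows labeled with their simulated band
    rotation (act 916 labels feeding the model training of act 920)."""
    criterion = nn.CrossEntropyLoss()
    optimizer.zero_grad()
    loss = criterion(model(emg_windows), offset_labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```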


By contrast to techniques that require the user to perform a specific gesture or pose for calibration, some embodiments use autocalibration to automatically detect the position of the wearable device on a body part using an autocalibration model that does not require the user to perform a specific calibration gesture or pose. Such embodiments permit continuous or periodic autocalibration of the device as the user wears the device without requiring the user to stop what they are doing and perform a particular calibration gesture or pose. FIG. 10 illustrates a flowchart for online usage of a trained autocalibration model in accordance with some embodiments. In act 1010, EMG data is sensed from a user wearing the wearable device (e.g., the wearable device shown in FIG. 1). In act 1012, the sensed EMG data is provided to a trained autocalibration model (e.g., the autocalibration model trained in act 920 of the process in FIG. 9). The process of FIG. 10 then proceeds to act 1014 where the output of the model is used to predict a position and/or orientation of the wearable device by, for example, classifying the EMG signal or signal patterns into one of multiple (e.g., 16) rotation positions. The process then proceeds to act 1016 where a correction based on the predicted output of the autocalibration model is applied to the sensed EMG data to generate calibrated EMG data. In other embodiments, the sensed EMG data is not corrected, and the sensed EMG data is provided as input to selected inferential models trained at the detected orientation(s) and/or position(s) or provided as input to models that have been modified to interpret the sensed EMG data and accurately output handstate configurations, poses, and/or gestures.
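

An illustrative sketch of the online loop of FIG. 10, assuming a trained classifier such as the LSTM sketched above and a buffer holding roughly the last 10 seconds of per-window outputs (all names here are hypothetical):

```python
import numpy as np
import torch

def autocalibrate_online(model, emg_window, recent_outputs):
    """Classify the latest EMG window into one of the rotation positions,
    smooth across recent windows, and correct the channel order (acts
    1012-1016).

    emg_window: (n_samples, n_channels) array of sensed EMG data
    recent_outputs: e.g. collections.deque(maxlen=125) of past probability vectors
    """
    with torch.no_grad():
        x = torch.as_tensor(emg_window, dtype=torch.float32).unsqueeze(0)
        probs = torch.softmax(model(x), dim=-1).squeeze(0).numpy()
    recent_outputs.append(probs)
    offset = int(np.mean(recent_outputs, axis=0).argmax())    # predicted rotation
    calibrated = np.roll(emg_window, shift=-offset, axis=1)   # act 1016 correction
    return offset, calibrated
```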


As described above, in some embodiments, the autocalibration model may be implemented as an LSTM neural network. In certain embodiments, it is desirable to address the +/−1 electrode channel error rate for the rotation offset (see FIG. 6 showing an augmented model performing equally well for a +/−1 electrode offset, but the baseline model not performing as well for a +/−1 electrode offset). This can be done by implementing a pooling layer within the neural network as shown schematically in act 1020 of FIG. 10. For example, the pooling layer may be a max pooling layer, where the max pooling layer can be configured to determine the location of the “0” electrode channel (e.g., based on a probabilistic calculation for each of the three electrode locations), so the system can virtually label a specific electrode channel as the “0” reference channel out of a possible set of three channels initially identified by the autocalibration model. It should be understood that other types of pooling layers that are well known in the art can be used, e.g., average pooling layers.


In certain embodiments, the band detects and analyzes EMG signals or EMG signal patterns either continuously for a certain period of time or at discrete time points. In either case, during this defined period of time or at discrete time points, the autocalibration model(s) can be used to determine the orientation and/or positioning of the band on the user's body part, e.g., forearm. In certain embodiments, the band detects and processes EMG signals or EMG signal patterns initially upon the user putting the band on and/or in response to detecting band movement subsequent to initial placement of the band. For example, the system may identify that at least a portion of the wearable device (e.g., one or more EMG sensors) has moved, e.g., rotated around and/or translated along the wrist or forearm of the user, and in response to determining that the at least a portion of the wearable device (e.g., the band) has moved, the system may perform autocalibration by determining a current position and/or orientation of the wearable device in accordance with one or more of the autocalibration techniques described herein. By automatically restarting calibration after identifying movement of the wearable device, re-calibration can be performed in an efficient and timely manner.


Movement of the wearable device on the user can be detected in any suitable manner. In some embodiments, detection of movement of the wearable device is based on one or more detected movement artifacts identified in the neuromuscular signals sensed by the plurality of neuromuscular sensors. In other embodiments that include at least one inertial measurement unit (IMU) sensor, movement of the wearable device may be detected based at least in part, on at least one signal sensed by the at least one IMU sensor. In yet other embodiments, detection of movement of the wearable device may be based, at least in part, on signals from multiple types of sensors (e.g., at least one EMG sensor and at least one IMU sensor) and/or sensors included on or external to the wearable device (e.g., at least one camera sensor).
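

As a purely illustrative example of IMU-based movement detection (the threshold heuristic below is an assumption, not the patent's method), re-calibration could be triggered when accelerometer readings deviate from gravity for several consecutive samples:

```python
import numpy as np

def band_moved(accel, threshold_g=0.5, min_samples=5):
    """Heuristic check: flag re-calibration when the accelerometer magnitude
    deviates from gravity (1 g) by more than a threshold for several
    consecutive samples.

    accel: (n_samples, 3) accelerometer readings in units of g.
    """
    deviation = np.abs(np.linalg.norm(accel, axis=1) - 1.0)
    exceeded = (deviation > threshold_g).astype(int)
    # require a short run above threshold to reject isolated spikes
    runs = np.convolve(exceeded, np.ones(min_samples, dtype=int), mode="valid")
    return bool((runs >= min_samples).any())
```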


Using the techniques described herein, the system may be able to calibrate the orientation and/or position of the armband in less than 10 seconds based on one or more sampling rates or frequencies. In other embodiments, due to settling time of the EMG electrodes (e.g., from skin-electrode interactions) after band movements, which can lead to an initially degraded signal quality, it may take a little longer to get a reliable estimation of the band position. For example, the system can initiate an autocalibration sequence upon detection of band movement, and the sequence can run for approximately 30 seconds to get a more reliable prediction of band orientation and/or position. Shorter time periods can be used to run the autocalibration sequences provided that the settling time can be reduced and little to no band movement is detected during the autocalibration sequence. In certain embodiments, the autocalibration sequence can be initiated and re-run upon the detection of any band movements no matter how slight in order to maximize the accuracy of the predictions associated with EMG signals or EMG signal patterns.


In other embodiments, the techniques described herein can be used to identify a specific position on the user's body part, e.g., how far up the band is sitting on the user's forearm as it relates to the distances between the band and the user's elbow joint or wrist joint. Similar to the embodiment described above (e.g., in FIG. 9) in which a generalized, offline model is generated and trained from labeled EMG signal or EMG signal pattern data with users performing different gestures and/or poses at different orientations of the band, in other embodiments additional data can be collected from users performing gestures and/or poses at different or the same orientations of the band but at different positions on the user's body part (e.g., along portions of the forearm). In this way, one or more generalized models can be trained to predict not only the specific orientation (e.g., rotation) of the band, but also its positioning on the user's body part (e.g., relative to a reference point such as the elbow joint or wrist joint). Such embodiments can be used to better predict user handstate configurations given potential variability in EMG signals or EMG signal patterns depending on the specific location of the band on the user. Given neuroanatomical constraints and specific user neuroanatomy, it may be advantageous to analyze EMG signals on more distal or more proximal points of the user's body part (e.g., along different portions of the forearm or wrist). In certain embodiments, the autocalibration model can detect a specific positioning of the band on the user's forearm, and generate an input signal to an interface to instruct the user to reposition the band in a more distal or more proximal location depending on the specific outcome desired (e.g., to move the band farther up the arm so that relatively more EMG data can be collected or farther down the arm because certain input models to be used were trained on collected data from users with the band closer to their wrists).


While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. Where methods and/or schematics described above indicate certain events and/or flow patterns occurring in certain order, the ordering of certain events and/or flow patterns may be modified. While the embodiments have been particularly shown and described, it will be understood that various changes in form and details may be made. Additionally, certain of the steps may be performed concurrently in a parallel process when possible, as well as performed sequentially as described above. Although various embodiments have been described as having particular features and/or combinations of components, other embodiments are possible having any combination or sub-combination of any features and/or components from any of the embodiments described herein. Furthermore, although various embodiments are described as having a particular entity associated with a particular compute device, in other embodiments different entities can be associated with other and/or different compute devices.


It is intended that the systems and methods described herein can be performed by software (stored in memory and/or executed on hardware), hardware, or a combination thereof. Hardware modules may include, for example, a general-purpose processor, a field programmable gate array (FPGA), and/or an application specific integrated circuit (ASIC). Software modules (executed on hardware) can be expressed in a variety of software languages (e.g., computer code), including Unix utilities, C, C++, Java™, JavaScript, Ruby, SQL, SAS®, Python, Fortran, the R programming language/software environment, Visual Basic™, and other object-oriented, procedural, or other programming language and development tools. Examples of computer code include, but are not limited to, micro-code or micro-instructions, machine instructions, such as produced by a compiler, code used to produce a web service, and files containing higher-level instructions that are executed by a computer using an interpreter. Additional examples of computer code include, but are not limited to, control signals, encrypted code, and compressed code. Each of the devices described herein can include one or more processors as described above.


Some embodiments described herein relate to devices with a non-transitory computer-readable medium (also can be referred to as a non-transitory processor-readable medium or memory) having instructions or computer code thereon for performing various computer-implemented operations. The computer-readable medium (or processor-readable medium) is non-transitory in the sense that it does not include transitory propagating signals per se (e.g., a propagating electromagnetic wave carrying information on a transmission medium such as space or a cable). The media and computer code (also can be referred to as code) may be those designed and constructed for the specific purpose or purposes. Examples of non-transitory computer-readable media include, but are not limited to: magnetic storage media such as hard disks, floppy disks, and magnetic tape; optical storage media such as Compact Disc/Digital Video Discs (CD/DVDs), Compact Disc-Read Only Memories (CD-ROMs), and holographic devices; magneto-optical storage media such as optical disks; carrier wave signal processing modules; and hardware devices that are specially configured to store and execute program code, such as Application-Specific Integrated Circuits (ASICs), Programmable Logic Devices (PLDs), Read-Only Memory (ROM) and Random-Access Memory (RAM) devices. Other embodiments described herein relate to a computer program product, which can include, for example, the instructions and/or computer code discussed herein.


Processor-executable instructions can be in many forms, such as program modules, executed by one or more compute devices, and can include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular data types, and the functionality can be combined and/or distributed as appropriate for various embodiments. Data structures can be stored in processor-readable media in a number of suitable forms. For simplicity of illustration, data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a processor-readable medium that conveys relationship(s) between the fields. However, any suitable mechanism/tool can be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags or other mechanisms/tools that establish relationship between data elements.


Various disclosed concepts can be embodied as one or more methods, of which examples have been provided. The acts performed as part of a particular method can be ordered in any suitable way. Accordingly, embodiments can be constructed in which acts are performed in an order different than illustrated/discussed, which can include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments. The use of flow diagrams is not meant to be limiting with respect to the order of operations performed. The herein described subject matter sometimes illustrates different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely exemplary, and that in fact many other architectures can be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively “associated” such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as “associated with” each other such that the desired functionality is achieved, irrespective of architectures or intermediate components. Likewise, any two components so associated can also be viewed as being “operably connected,” or “operably coupled,” to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable,” to each other to achieve the desired functionality. Specific examples of operably couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interacting and/or logically interactable components.


All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.


The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”


The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.


As used herein in the specification and in the claims, “or” should be understood to have the same meaning as “and/or” as defined above. For example, when separating items in a list, “or” or “and/or” shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as “only one of” or “exactly one of,” or, when used in the claims, “consisting of,” will refer to the inclusion of exactly one element of a number or list of elements. In general, the term “or” as used herein shall only be interpreted as indicating exclusive alternatives (i.e. “one or the other but not both”) when preceded by terms of exclusivity, such as “either,” “one of,” “only one of,” or “exactly one of.” “Consisting essentially of,” when used in the claims, shall have its ordinary meaning as used in the field of patent law.


As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, “at least one of A and B” (or, equivalently, “at least one of A or B,” or, equivalently “at least one of A and/or B”) can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.


In the claims, as well as in the specification above, all transitional phrases such as “comprising,” “including,” “carrying,” “having,” “containing,” “involving,” “holding,” “composed of,” and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases “consisting of” and “consisting essentially of” shall be closed or semi-closed transitional phrases, respectively, as set forth in the United States Patent Office Manual of Patent Examining Procedures, Section 2111.03.

Claims
  • 1. A system comprising: a plurality of sensors configured to continuously sense neuromuscular signals from a body part of a user; and at least one computer processor programmed to: determine based, at least in part, on the neuromuscular signals sensed from the body part of the user, a current position and/or orientation of the plurality of sensors on the user's body part; present a user interface to the user indicating a current position and/or orientation of the plurality of sensors; and generate a control signal based, at least in part, on the current position and/or orientation of the plurality of sensors on the user's body part, wherein the control signal causes the user interface to instruct the user to variably reposition the plurality of sensors to a different position on the user's body part based on the determined current position and/or orientation and further based on which position and/or orientation of the user's body part is associated with a specific set of training data generated from a specific input model.
  • 2. The system of claim 1, wherein the plurality of sensors comprises a wearable device configured to be worn on a wrist or forearm of the user.
  • 3. The system of claim 2, further comprising providing the plurality of neuromuscular signals as inputs to one or more trained autocalibration models to calibrate the position and/or orientation of the wearable device.
  • 4. The system of claim 1, wherein the current position and/or orientation are determined based on one or more specified neuromuscular signals or patterns that indicate a distal or proximal position of the user's body part on which the plurality of sensors is located.
  • 5. The system of claim 4, wherein the control signal causes the user interface to instruct the user to reposition the plurality of sensors to a more distal or more proximal location on the user's body part based on the determined current position and/or orientation.
  • 6. The system of claim 5, wherein the instruction to reposition the plurality of sensors to a more distal or more proximal location on the user's body part is based on and is specific to the neuroanatomy of the user's body part.
  • 7. The system of claim 4, further comprising: determining that an increased amount of data is needed from the plurality of sensors; and causing the user interface, via the control signal, to instruct the user to reposition the plurality of sensors to a more proximal location on the user's body part.
  • 8. The system of claim 4, further comprising: determining that additional training data is available for the plurality of sensors at a distal location on the user's body part; and causing the user interface, via the control signal, to instruct the user to reposition the plurality of sensors to a more distal location on the user's body part.
  • 9. The system of claim 1, wherein the at least one computer processor is programmed to determine the current position and/or orientation of the plurality of sensors on the user without the user performing a particular pose or gesture during sensing of the plurality of neuromuscular signals.
  • 10. The system of claim 1, wherein the at least one programmed processor is further programmed to: generate, using an autocalibration model, one or more calibrated neuromuscular signals based, at least in part, on the current position and/or orientation of the plurality of sensors, wherein generating the control signal comprises generating a control signal based, at least in part, on the calibrated neuromuscular signals.
  • 11. The system of claim 10, wherein the autocalibration model used to generate the calibrated neuromuscular signals comprises a neural network.
  • 12. The system of claim 1, wherein the control signal is configured to control one or more operations of or within a virtual reality system or an augmented reality system.
  • 13. A computer-implemented method comprising: determining based, at least in part, on one or more neuromuscular signals sensed from a body part of a user wearing a plurality of sensors, a current position and/or orientation of the plurality of sensors on the user's body part; presenting a user interface to the user indicating a current position and/or orientation of the plurality of sensors; and generating a control signal based, at least in part, on the current position and/or orientation of the plurality of sensors on the user's body part, wherein the control signal causes the user interface to variably instruct the user to reposition the plurality of sensors to a different position on the user's body part based on the determined current position and/or orientation and further based on which position and/or orientation of the user's body part is associated with a specific set of training data generated from a specific input model.
  • 14. The method of claim 13, wherein the plurality of sensors comprises a wearable device configured to be worn on a wrist or forearm of the user.
  • 15. The method of claim 14, further comprising providing the plurality of neuromuscular signals as inputs to one or more trained autocalibration models to calibrate the position and/or orientation of the wearable device.
  • 16. The method of claim 13, wherein the current position and/or orientation are determined based on one or more specified neuromuscular signals or patterns that indicate a distal or proximal position of the user's body part on which the plurality of sensors is located.
  • 17. The method of claim 16, wherein the control signal causes the user interface to instruct the user to reposition the plurality of sensors to a more distal or more proximal location on the user's body part based on the determined current position and/or orientation.
  • 18. The method of claim 17, wherein the instruction to reposition the plurality of sensors to a more distal or more proximal location on the user's body part is based on and is specific to the neuroanatomy of the user's body part.
  • 19. The method of claim 16, further comprising: determining that an increased amount of data is needed from the plurality of sensors; and causing the user interface, via the control signal, to instruct the user to reposition the plurality of sensors to a more proximal location on the user's body part.
  • 20. A wearable device comprising: a plurality of sensors configured to continuously sense neuromuscular signals from a body part of a user; and at least one computer processor programmed to: determine based, at least in part, on the neuromuscular signals sensed from the body part of the user, a current position and/or orientation of the plurality of sensors on the user's body part; present a user interface to the user indicating a current position and/or orientation of the plurality of sensors; and generate a control signal based, at least in part, on the current position and/or orientation of the plurality of sensors on the user's body part, wherein the control signal causes the user interface to variably instruct the user to reposition the plurality of sensors to a different position on the user's body part based on the determined current position and/or orientation and further based on which position and/or orientation of the user's body part is associated with a specific set of training data generated from a specific input model.
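As a non-limiting aid to reading the claims above, the following Python sketch traces the recited flow: neuromuscular signals are passed to a trained autocalibration model that estimates the wearable device's placement, and the estimate is mapped to a user-interface prompt asking the user to shift the sensors distally or proximally. Every function name, threshold, and dimension below is a hypothetical placeholder; the claims do not prescribe any particular implementation.

```python
# Illustrative sketch only; it is not the claimed implementation. The model,
# thresholds, and band positions below are hypothetical placeholders.
from typing import Callable, Sequence, Tuple

def estimate_placement(
    emg_window: Sequence[Sequence[float]],
    autocalibration_model: Callable[[Sequence[Sequence[float]]], Tuple[int, float]],
) -> Tuple[int, float]:
    """Return (rotation_index, proximal_offset_cm) inferred from sensed signals."""
    return autocalibration_model(emg_window)

def repositioning_instruction(proximal_offset_cm: float,
                              preferred_offset_cm: float = 5.0,
                              tolerance_cm: float = 1.5) -> str:
    """Map the estimated placement to a UI prompt, as recited in the claims above."""
    delta = preferred_offset_cm - proximal_offset_cm
    if abs(delta) <= tolerance_cm:
        return "Sensor position OK - no adjustment needed."
    direction = "proximal (toward the elbow)" if delta > 0 else "distal (toward the wrist)"
    return f"Move the sensors about {abs(delta):.0f} cm {direction}."

# Example with a stub standing in for the one or more trained autocalibration models.
stub_model = lambda window: (2, 8.0)        # pretends the device sits 8 cm up the forearm
rotation, offset = estimate_placement([[0.0] * 16] * 50, stub_model)
print(repositioning_instruction(offset))    # -> suggests moving ~3 cm distally
```

In practice, the stub would be replaced by the trained autocalibration model(s) referenced in claims 3 and 15, and the returned prompt would drive the user interface recited in claims 1, 13, and 20.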
RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 17/297,449, filed May 26, 2021, which claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Application No. 62/771,957, filed Nov. 27, 2018, and entitled "METHODS AND APPARATUS FOR AUTOCALIBRATION OF A WEARABLE SURFACE EMG SENSOR SYSTEM," the entire contents of which are incorporated by reference herein.

Costanza E., et al., “EMG as a Subtle Input Interface for Mobile Computing,” Mobile HCI, LNCS 3160, 2004, pp. 426-430.
Costanza E., et al., “Toward Subtle Intimate Interfaces for Mobile Devices Using an EMG Controller,” CHI, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Apr. 2-7, 2005, pp. 481-489.
Cote-Allard U., et al., “Deep Learning for Electromyographic Hand Gesture Signal Classification Using Transfer Learning,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, Jan. 26, 2019, vol. 27 (4), 11 Pages.
Csapo A.B., et al., “Evaluation of Human-Myo Gesture Control Capabilities in Continuous Search and Select Operations,” 7th IEEE International Conference on Cognitive Infocommunications, Oct. 16-18, 2016, pp. 000415-000420.
Cui L., et al., “Diffraction From Angular Multiplexing Slanted Volume Hologram Gratings,” Optik, 2005, vol. 116, pp. 118-122.
Curatu C., et al., “Dual Purpose Lens for an Eye-Tracked Projection Head-Mounted Display,” International Optical Design Conference SPIE-OSA, 2006, vol. 6342, pp. 63420X-1-63420X-7.
Curatu C., et al., “Projection-Based Head-Mounted Display With Eye-Tracking Capabilities,” Proceedings of SPIE, 2005, vol. 5875, pp. 58750J-1-58750J-9.
Davoodi R., et al., “Development of a Physics-Based Target Shooting Game to Train Amputee Users of Multijoint Upper Limb Prostheses,” Presence, Massachusetts Institute of Technology, 2012, vol. 21 (1), pp. 85-95.
Delis A.L., et al., “Development of a Myoelectric Controller Based on Knee Angle Estimation,” Biodevices, International Conference on Biomedical Electronics and Devices, Jan. 17, 2009, 7 pages.
Diener L., et al., “Direct Conversion From Facial Myoelectric Signals to Speech Using Deep Neural Networks,” International Joint Conference on Neural Networks (IJCNN), Oct. 1, 2015, 7 pages.
Ding I-J., et al., “HMM with Improved Feature Extraction-Based Feature Parameters for Identity Recognition of Gesture Command Operators by Using a Sensed Kinect-Data Stream,” Neurocomputing, 2017, vol. 262, pp. 108-119.
Essex D., “Tutorial on Optomechanical Beam Steering Mechanisms,” OPTI 521 Tutorial, College of Optical Sciences, University of Arizona, 2006, 8 pages.
European Search Report for European Application No. 19861903.3, dated Oct. 12, 2021, 2 pages.
European Search Report for European Application No. 19863248.1, dated Oct. 19, 2021, 2 pages.
European Search Report for European Application No. 19868789.9, dated May 9, 2022, 9 pages.
European Search Report for European Application No. 19890394.0, dated Apr. 29, 2022, 9 pages.
Extended European Search Report for European Application No. 18879156.0, dated Mar. 12, 2021, 11 pages.
Extended European Search Report for European Application No. 19743717.1, dated Mar. 3, 2021, 12 pages.
Extended European Search Report for European Application No. 19744404.5, dated Mar. 29, 2021, 11 pages.
Extended European Search Report for European Application No. 19799947.7, dated May 26, 2021, 10 pages.
Extended European Search Report for European Application No. 17835111.0, dated Nov. 21, 2019, 6 pages.
Extended European Search Report for European Application No. 17835112.8, dated Feb. 5, 2020, 17 pages.
Extended European Search Report for European Application No. 17835140.9, dated Nov. 26, 2019, 10 Pages.
Extended European Search Report for European Application No. 18869441.8, dated Nov. 17, 2020, 20 Pages.
Extended European Search Report for European Application No. 19806723.3, dated Jul. 7, 2021, 13 pages.
Extended European Search Report for European Application No. 19810524.9, dated Mar. 17, 2021, 11 pages.
Extended European Search Report for European Application No. 19850130.6, dated Sep. 1, 2021, 14 Pages.
Extended European Search Report for European Application No. 19855191.3, dated Dec. 6, 2021, 11 pages.
Extended European Search Report for European Application No. 19883839.3, dated Dec. 15, 2021, 7 pages.
Farina D., et al., “Man/Machine Interface Based on the Discharge Timings of Spinal Motor Neurons After Targeted Muscle Reinnervation,” Nature Biomedical Engineering, Feb. 6, 2017, vol. 1, Article No. 0025, pp. 1-12.
Farina D., et al., “The Extraction of Neural Information from the Surface EMG for the Control of Upper-Limb Prostheses: Emerging Avenues and Challenges,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 22, No. 4, Jul. 1, 2014, pp. 797-809.
Favorskaya M., et al., “Localization and Recognition of Dynamic Hand Gestures Based on Hierarchy of Manifold Classifiers,” International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, May 25-27, 2015, vol. XL-5/W6, pp. 1-8.
Fernandez E., et al., “Optimization of a Thick Polyvinyl Alcohol-Acrylamide Photopolymer for Data Storage Using a Combination of Angular and Peristrophic Holographic Multiplexing,” Applied Optics, Oct. 10, 2009, vol. 45 (29), pp. 7661-7666.
Final Office Action dated Jun. 2, 2020 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 127 Pages.
Final Office Action dated Jun. 2, 2020 for U.S. Appl. No. 16/557,383, filed Aug. 30, 2019, 66 Pages.
Final Office Action dated Jan. 3, 2019 for U.S. Appl. No. 14/465,194, filed Aug. 21, 2014, 61 Pages.
Final Office Action dated Nov. 3, 2020 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 27 Pages.
Final Office Action dated Feb. 4, 2020 for U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 76 Pages.
Final Office Action dated Feb. 4, 2021 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 42 Pages.
Final Office Action dated Jun. 5, 2020 for U.S. Appl. No. 16/557,427, filed Aug. 30, 2019, 95 Pages.
Final Office Action dated Oct. 8, 2020 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 73 Pages.
Final Office Action dated Apr. 9, 2020 for U.S. Appl. No. 15/974,454, filed May 8, 2018, 19 Pages.
Final Office Action dated Jan. 10, 2018 for U.S. Appl. No. 14/465,194, filed Aug. 21, 2014, 50 Pages.
Final Office Action dated Dec. 11, 2019 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 30 Pages.
Final Office Action dated Jan. 13, 2021 for U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 91 Pages.
Final Office Action dated Dec. 18, 2019 for U.S. Appl. No. 16/258,279, filed Jan. 25, 2019, 45 Pages.
Final Office Action dated Nov. 18, 2020 for U.S. Appl. No. 14/461,044, filed Aug. 15, 2014, 14 Pages.
Final Office Action dated Feb. 19, 2021 for U.S. Appl. No. 16/258,279, filed Jan. 25, 2019, 58 Pages.
Final Office Action dated Oct. 21, 2021 for U.S. Appl. No. 16/899,843, filed Jun. 12, 2020, 29 Pages.
Final Office Action dated Jul. 23, 2021 for U.S. Appl. No. 14/461,044, filed Aug. 15, 2014, 15 Pages.
Kessler D., “Optics of Near to Eye Displays (NEDs),” Presentation—Oasis, Tel Aviv, Feb. 19, 2013, 37 pages.
Kim H., et al., “Real-Time Human Pose Estimation and Gesture Recognition from Depth Images Using Superpixels and SVM Classifier,” Sensors, 2015, vol. 15, pp. 12410-12427.
Kipke D.R., et al., “Silicon-Substrate Intracortical Microelectrode Arrays for Long-Term Recording of Neuronal Spike Activity in Cerebral Cortex,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, Jun. 2003, vol. 11 (2), 5 pages, Retrieved on Oct. 7, 2019, Retrieved from the Internet: URL: https://www.ece.uvic.ca/-bctill/papers/neurimp/Kipke_etal_2003_01214707.pdf.
Koerner M.D., “Design and Characterization of the Exo-Skin Haptic Device: A Novel Tendon Actuated Textile Hand Exoskeleton,” Abstract of thesis for Drexel University Masters Degree [online], Nov. 2, 2017, 5 pages, Retrieved from the Internet: URL: https://dialog.proquest.com/professional/docview/1931047627?accountid=153692.
Kress B.C., et al., “Diffractive and Holographic Optics as Optical Combiners in Head Mounted Displays,” UbiComp, Zurich, Switzerland, Sep. 8-12, 2013, pp. 1479-1482.
Kress B., et al., “A Review of Head-Mounted Displays (HMD) Technologies and Applications for Consumer Electronics,” Proceedings of SPIE, 2013, vol. 8720, pp. 87200A-1-87200A-13.
Kress B., “Optical Architectures for See-Through Wearable Displays,” Presentation, Bay Area SID Seminar, Apr. 30, 2014, 156 pages.
Lake et al., “Method and Apparatus for Analyzing Capacitive EMG and IMU Sensor Signals for Gesture Control,” Amendment filed Aug. 21, 2015, for U.S. Appl. No. 14/186,878, 13 pages.
Lake et al., “Method and Apparatus for Analyzing Capacitive EMG and IMU Sensor Signals for Gesture Control,” Office Action dated Jun. 17, 2015, for U.S. Appl. No. 14/186,878, 13 pages.
Lake et al., “Method and Apparatus for Analyzing Capacitive EMG and IMU Sensor Signals for Gesture Control,” Preliminary Amendment filed May 9, 2014, for U.S. Appl. No. 14/186,878, 9 pages.
Lake et al., “Method and Apparatus for Analyzing Capacitive EMG and IMU Sensor Signals for Gesture Control,” U.S. Appl. No. 14/186,878, filed Feb. 21, 2014, 29 pages.
Lake et al., “Methods and Devices for Combining Muscle Activity Sensor Signals and Inertial Sensor Signals for Gesture-Based Control,” Amendment filed Jan. 8, 2016, for U.S. Appl. No. 14/186,889, 16 pages.
Lake et al., “Methods and Devices for Combining Muscle Activity Sensor Signals and Inertial Sensor Signals for Gesture-Based Control,” Amendment filed Jul. 13, 2016, for U.S. Appl. No. 14/186,889, 12 pages.
Lake et al., “Methods and Devices for Combining Muscle Activity Sensor Signals and Inertial Sensor Signals for Gesture-Based Control,” Office Action dated Jun. 16, 2016, for U.S. Appl. No. 14/186,889, 13 pages.
Lake et al., “Methods and Devices for Combining Muscle Activity Sensor Signals and Inertial Sensor Signals for Gesture-Based Control,” Office Action dated Nov. 5, 2015, for U.S. Appl. No. 14/186,889, 11 pages.
Lake et al., “Methods and Devices That Combine Muscle Activity Sensor Signals and Inertial Sensor Signals for Gesture-Based Control,” U.S. Appl. No. 14/186,889, filed Feb. 21, 2014, 58 pages.
Lee D.C., et al., “Motion and Force Estimation System of Human Fingers,” Journal of Institute of Control, Robotics and Systems, 2011, vol. 17 (10), pp. 1014-1020.
Levola T., “7.1: Invited Paper: Novel Diffractive Optical Components for Near to Eye Displays,” SID Symposium Digest of Technical Papers, 2006, vol. 37 (1), pp. 64-67.
Li Y., et al., “Motor Function Evaluation of Hemiplegic Upper-Extremities Using Data Fusion from Wearable Inertial and Surface EMG Sensors,” Sensors, MDPI, 2017, vol. 17 (582), pp. 1-17.
Liao C.D., et al., “The Evolution of MEMS Displays,” IEEE Transactions on Industrial Electronics, Apr. 2009, vol. 56 (4), pp. 1057-1065.
Lippert T.M., “Chapter 6: Display Devices: RSD™ (Retinal Scanning Display),” The Avionics Handbook, CRC Press, 2001, 8 pages.
Lopes J., et al., “Hand/Arm Gesture Segmentation by Motion Using IMU and EMG Sensing,” ScienceDirect, Jun. 27-30, 2017, vol. 11, pp. 107-113.
Majaranta P., et al., “Chapter 3: Eye Tracking and Eye-Based Human-Computer Interaction,” Advances in Physiological Computing, Springer-Verlag London, 2014, pp. 39-65.
Marcard T.V., et al., “Sparse Inertial Poser: Automatic 3D Human Pose Estimation from Sparse IMUs,” arxiv.org, Computer Graphics Forum, 2017, vol. 36 (2), 12 pages, XP080759137.
Martin H., et al., “A Novel Approach of Prosthetic Arm Control using Computer Vision, Biosignals, and Motion Capture,” IEEE Symposium on Computational Intelligence in Robotic Rehabilitation and Assistive Technologies (CIR2AT), 2014, 5 pages.
McIntee S.S., “A Task Model of Free-Space Movement-Based Gestures,” Dissertation, Graduate Faculty of North Carolina State University, Computer Science, 2016, 129 pages.
Mendes Jr. J.J.A., et al., “Sensor Fusion and Smart Sensor in Sports and Biomedical Applications,” Sensors, 2016, vol. 16 (1569), pp. 1-31.
Merriam-Webster, “Radio Frequencies,” download date Jul. 12, 2017, 2 pages, Retrieved from the Internet: URL: https://www.merriam-webster.com/table/collegiate/radiofre.htm.
Mohamed O.H., “Homogeneous Cognitive Based Biometrics for Static Authentication,” Dissertation submitted to University of Victoria, Canada, 2010, [last accessed Oct. 11, 2019], 149 pages, Retrieved from the Internet: URL: http://hdl.handle.net/1828/321.
Morris D., et al., “Emerging Input Technologies for Always-Available Mobile Interaction,” Foundations and Trends in Human-Computer Interaction, 2010, vol. 4 (4), pp. 245-316.
Morun C., et al., “Systems, Articles, and Methods for Capacitive Electromyography Sensors,” U.S. Appl. No. 16/437,351, filed Jun. 11, 2019, 51 pages.
Naik G.R., et al., “Source Separation and Identification issues in Bio Signals: A Solution using Blind Source Separation,” Chapter 4 of Recent Advances in Biomedical Engineering, Intech, 2009, 23 pages.
Naik G.R., et al., “Real-Time Hand Gesture Identification for Human Computer Interaction Based on ICA of Surface Electromyogram,” IADIS International Conference Interfaces and Human Computer Interaction, 2007, pp. 83-90.
Naik G.R., et al., “Subtle Hand Gesture Identification for HCI Using Temporal Decorrelation Source Separation BSS of Surface EMG,” Digital Image Computing Techniques and Applications, IEEE Computer Society, 2007, pp. 30-37.
Negro F., et al., “Multi-Channel Intramuscular and Surface EMG Decomposition by Convolutive Blind Source Separation,” Journal of Neural Engineering, Feb. 29, 2016, vol. 13, 18 Pages.
Non-Final Office Action dated Mar. 1, 2018 for U.S. Appl. No. 14/553,657, filed Nov. 25, 2014, 29 Pages.
Non-Final Office Action dated Mar. 2, 2021 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 32 Pages.
Non-Final Office Action dated May 2, 2018 for U.S. Appl. No. 15/799,628, filed Oct. 31, 2017, 25 Pages.
Non-Final Office Action dated Sep. 2, 2020 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 66 Pages.
Non-Final Office Action dated Aug. 3, 2020 for U.S. Appl. No. 16/593,446, filed Oct. 4, 2019, 44 pages.
Non-Final Office Action dated Jun. 3, 2021 for U.S. Appl. No. 15/816,435, filed Nov. 17, 2017, 32 Pages.
Non-Final Office Action dated Jun. 5, 2020 for U.S. Appl. No. 15/659,072, filed Jul. 25, 2017, 59 Pages.
Non-Final Office Action dated Oct. 5, 2022 for U.S. Appl. No. 17/576,815, filed Jan. 14, 2022, 14 pages.
Non-Final Office Action dated Nov. 6, 2018 for U.S. Appl. No. 16/057,573, filed Aug. 7, 2018, 14 Pages.
Non-Final Office Action dated Sep. 6, 2019 for U.S. Appl. No. 16/424,144, filed May 28, 2019, 11 Pages.
Non-Final Office Action dated May 7, 2021 for U.S. Appl. No. 16/899,843, filed Jun. 12, 2020, 24 Pages.
Non-Final Office Action dated Oct. 7, 2022 for U.S. Appl. No. 17/141,646, filed Jan. 5, 2021, 6 pages.
Non-Final Office Action dated Feb. 8, 2021 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 11 Pages.
Non-Final Office Action dated Oct. 8, 2020 for U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 51 Pages.
Non-Final Office Action dated Apr. 9, 2019 for U.S. Appl. No. 16/258,409, filed Jan. 25, 2019, 71 Pages.
Tibold R., et al., “Prediction of Muscle Activity during Loaded Movements of The Upper Limb,” Journal of NeuroEngineering and Rehabilitation, 2015, vol. 12, No. 6, DOI: https://doi.org/10.1186/1743-0003-12-6, 12 pages.
Torres T., “Myo Gesture Control Armband,” PCMag, Jun. 8, 2015, 9 pages, Retrieved from the Internet: URL: https://www.pcmag.com/article2/0,2817,2485462,00.asp.
Ueno A., et al., “A Capacitive Sensor System for Measuring Laplacian Electromyogram through Cloth: A Pilot Study,” Proceedings of the 29th Annual International Conference of the IEEE EMBS, Cite Internationale, Lyon, France, Aug. 23-26, 2007, pp. 5731-5734.
Ueno A., et al., “Feasibility of Capacitive Sensing of Surface Electromyographic Potential through Cloth,” Sensors and Materials, 2012, vol. 24 (6), pp. 335-346.
Urey H., “Diffractive Exit-Pupil Expander for Display Applications,” Applied Optics, Nov. 10, 2001, vol. 40 (32), pp. 5840-5851.
Urey H., et al., “Optical Performance Requirements for MEMS-Scanner Based Microdisplays,” Conferences on MOEMS and Miniaturized Systems, SPIE, 2000, vol. 4178, pp. 176-185.
Valero-Cuevas F.J., et al., “Computational Models for Neuromuscular Function,” IEEE Reviews in Biomedical Engineering, 2009, vol. 2, NIH Public Access Author Manuscript [online], Jun. 16, 2011 [Retrieved on Jul. 29, 2019], 52 pages, Retrieved from the Internet: URL: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3116649/.
Viirre E., et al., “The Virtual Retinal Display: A New Technology for Virtual Reality and Augmented Vision in Medicine,” Proceedings of Medicine Meets Virtual Reality, IOS Press and Ohmsha, 1998, pp. 252-257.
Wijk U., et al., “Forearm Amputee's Views of Prosthesis Use and Sensory Feedback,” Journal of Hand Therapy, Jul. 2015, vol. 28 (3), pp. 269-278.
Wittevrongel B., et al., “Spatiotemporal Beamforming: A Transparent and Unified Decoding Approach to Synchronous Visual Brain-Computer Interfacing,” Frontiers in Neuroscience, Nov. 15, 2017, vol. 11, Article No. 630, 13 Pages.
Wodzinski M., et al., “Sequential Classification of Palm Gestures Based on A* Algorithm and MLP Neural Network for Quadrocopter Control,” Metrology and Measurement Systems, 2017, vol. 24 (2), pp. 265-276.
Written Opinion for International Application No. PCT/US2014/057029, dated Feb. 24, 2015, 9 Pages.
Xiong A., et al., “A Novel HCI based on EMG and IMU,” Proceedings of the 2011 IEEE International Conference on Robotics and Biomimetics, Phuket, Thailand, Dec. 7-11, 2011, pp. 2653-2657.
Xu Z., et al., “Hand Gesture Recognition and Virtual Game Control Based on 3D Accelerometer and EMG Sensors,” Proceedings of the 14th International Conference on Intelligent User Interfaces, Sanibel Island, Florida, Feb. 8-11, 2009, pp. 401-406.
Xue Y., et al., “Multiple Sensors Based Hand Motion Recognition Using Adaptive Directed Acyclic Graph,” Applied Sciences, MDPI, 2017, vol. 7 (358), pp. 1-14.
Yang Z., et al., “Surface EMG Based Handgrip Force Predictions Using Gene Expression Programming,” Neurocomputing, 2016, vol. 207, pp. 568-579.
Zacharaki E.I., et al., “Spike Pattern Recognition by Supervised Classification in Low Dimensional Embedding Space,” Brain Informatics, 2016, vol. 3, pp. 73-83.
Zhang X., et al., “A Framework for Hand Gesture Recognition Based on Accelerometer and EMG Sensors,” IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, Nov. 2011, vol. 41 (6), pp. 1064-1076.
Office Action dated Jan. 20, 2023 for Chinese Application No. 201780059093.7, filed Jul. 25, 2017, 16 pages.
Notice of Allowance dated Dec. 14, 2022 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 10 pages.
Notice of Allowance dated Feb. 6, 2020 for U.S. Appl. No. 16/424,144, filed May 28, 2019, 28 Pages.
Notice of Allowance dated Feb. 8, 2019 for U.S. Appl. No. 16/023,276, filed Jun. 29, 2018, 15 pages.
Notice of Allowance dated Feb. 9, 2022 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 9 pages.
Notice of Allowance dated Nov. 10, 2021 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 6 pages.
Notice of Allowance dated Mar. 11, 2020 for U.S. Appl. No. 14/465,194, filed Aug. 21, 2014, 29 Pages.
Notice of Allowance dated Jul. 15, 2021 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 2 pages.
Notice of Allowance dated Jun. 15, 2018 for U.S. Appl. No. 15/799,621, filed Oct. 31, 2017, 27 pages.
Notice of Allowance dated Dec. 16, 2020 for U.S. Appl. No. 16/593,446, filed Oct. 4, 2019, 44 pages.
Notice of Allowance dated Jul. 18, 2022 for U.S. Appl. No. 16/550,905, filed Aug. 26, 2019, 7 pages.
Notice of Allowance dated May 18, 2020 for U.S. Appl. No. 16/258,279, filed Jan. 25, 2019, 42 Pages.
Notice of Allowance dated May 18, 2022 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 10 pages.
Notice of Allowance dated Aug. 19, 2020 for U.S. Appl. No. 16/557,427, filed Aug. 30, 2019, 22 Pages.
Notice of Allowance dated Jul. 19, 2019 for U.S. Appl. No. 16/258,409, filed Jan. 25, 2019, 36 Pages.
Notice of Allowance dated Apr. 20, 2022 for U.S. Appl. No. 14/461,044, filed Aug. 15, 2014, 8 pages.
Notice of Allowance dated May 20, 2020 for U.S. Appl. No. 16/389,419, filed Apr. 19, 2019, 28 Pages.
Notice of Allowance dated Aug. 22, 2022 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 9 pages.
Notice of Allowance dated Oct. 22, 2021 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 8 pages.
Notice of Allowance dated Aug. 23, 2021 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 12 pages.
Notice of Allowance dated Dec. 23, 2020 for U.S. Appl. No. 15/659,072, filed Jul. 25, 2017, 26 Pages.
Notice of Allowance dated Sep. 24, 2020 for U.S. Appl. No. 16/292,609, filed Mar. 5, 2019, 20 Pages.
Notice of Allowance dated Mar. 25, 2022 for U.S. Appl. No. 16/550,905, filed Aug. 26, 2019, 7 pages.
Notice of Allowance dated Sep. 25, 2018 for U.S. Appl. No. 14/553,657, filed Nov. 25, 2014, 25 Pages.
Notice of Allowance dated Jan. 28, 2019 for U.S. Appl. No. 16/023,300, filed Jun. 29, 2018, 31 pages.
Notice of Allowance dated Jun. 28, 2021 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 18 pages.
Notice of Allowance dated Nov. 3, 2022 for U.S. Appl. No. 16/899,843, filed Jun. 12, 2020, 10 pages.
Notice of Allowance dated Mar. 30, 2018 for U.S. Appl. No. 14/539,773, filed Nov. 12, 2014, 17 pages.
Notice of Allowance dated Nov. 30, 2018 for U.S. Appl. No. 15/799,628, filed Oct. 31, 2017, 19 Pages.
Notice of Allowance dated Jul. 31, 2019 for U.S. Appl. No. 16/257,979, filed Jan. 25, 2019, 22 Pages.
Notice of Allowance received for U.S. Appl. No. 14/155,107 dated Aug. 30, 2019, 16 pages.
Office action for European Application No. 17835112.8, dated Feb. 11, 2022, 11 Pages.
Office Action for European Application No. 19806723.3, dated Oct. 27, 2022, 8 pages.
Office Action for European Patent Application No. 19743717.1, dated Apr. 11, 2022, 10 pages.
Office Action dated Sep. 28, 2022 for Chinese Application No. 201780059093.7, filed Jul. 25, 2017, 16 pages.
Partial Supplementary European Search Report for European Application No. 18879156.0, dated Dec. 7, 2020, 9 pages.
Picard R.W., et al., “Affective Wearables,” Proceedings of the IEEE 1st International Symposium on Wearable Computers, ISWC, Cambridge, MA, USA, Oct. 13-14, 1997, pp. 90-97.
Preinterview First Office Action dated Jun. 24, 2020 for U.S. Appl. No. 16/785,680, filed Feb. 10, 2020, 90 Pages.
Rekimoto J., “GestureWrist and GesturePad: Unobtrusive Wearable Interaction Devices,” ISWC Proceedings of the 5th IEEE International Symposium on Wearable Computers, 2001, 7 pages.
Restriction Requirement dated Aug. 8, 2017 for U.S. Appl. No. 14/553,657, filed Nov. 25, 2014, 7 Pages.
Saponas T.S., et al., “Demonstrating the Feasibility of Using Forearm Electromyography for Muscle-Computer Interfaces,” CHI Proceedings, Physiological Sensing for Input, Apr. 5-10, 2008, pp. 515-524.
Saponas T.S., et al., “Enabling Always-Available Input with Muscle-Computer Interfaces,” Conference: Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology, Oct. 7, 2009, pp. 167-176.
Saponas T.S., et al., “Making Muscle-Computer Interfaces More Practical,” CHI, Atlanta, Georgia, USA, Apr. 10-15, 2010, 4 pages.
Sartori M., et al., “Neural Data-Driven Musculoskeletal Modeling for Personalized Neurorehabilitation Technologies,” IEEE Transactions on Biomedical Engineering, May 5, 2016, vol. 63 (5), pp. 879-893.
Sato M., et al., “Touche: Enhancing Touch Interaction on Humans, Screens, Liquids, and Everyday Objects,” CHI, Austin, Texas, May 5-10, 2012, 10 pages.
Sauras-Perez P., et al., “A Voice and Pointing Gesture Interaction System for Supporting Human Spontaneous Decisions in Autonomous Cars,” Clemson University, All Dissertations, May 2017, 174 pages.
Schowengerdt B.T., et al., “Stereoscopic Retinal Scanning Laser Display With Integrated Focus Cues for Ocular Accommodation,” Proceedings of SPIE-IS&T Electronic Imaging, 2004, vol. 5291, pp. 366-376.
Shen S., et al., “I Am a Smartwatch and I Can Track My User's Arm,” University of Illinois at Urbana-Champaign, MobiSys, Jun. 25-30, 2016, 12 pages.
Silverman N.L., et al., “58.5L: Late-News Paper: Engineering a Retinal Scanning Laser Display with Integrated Accommodative Depth Cues,” SID 03 Digest, 2003, pp. 1538-1541.
Son M., et al., “Evaluating the Utility of Two Gestural Discomfort Evaluation Methods,” PLOS One, Apr. 19, 2017, 21 pages.
Strbac M., et al., “Microsoft Kinect-Based Artificial Perception System for Control of Functional Electrical Stimulation Assisted Grasping,” Hindawi Publishing Corporation, BioMed Research International [online], 2014, Article No. 740469, 13 pages, Retrieved from the Internet: URL: https://dx.doi.org/10.1155/2014/740469.
Takatsuka Y., et al., “Retinal Projection Display Using Diffractive Optical Element,” Tenth International Conference on Intelligent Information Hiding and Multimedia Signal Processing, IEEE, 2014, pp. 403-406.
Final Office Action dated Sep. 23, 2020 for U.S. Appl. No. 15/816,435, filed Nov. 17, 2017, 70 Pages.
Final Office Action dated Jan. 28, 2020 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 15 Pages.
Final Office Action dated Jul. 28, 2017 for U.S. Appl. No. 14/505,836, filed Oct. 3, 2014, 52 Pages.
Final Office Action dated Jun. 28, 2021 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 11 Pages.
Final Office Action dated Nov. 29, 2019 for U.S. Appl. No. 15/659,072, filed Jul. 25, 2017, 36 Pages.
Final Office Action dated Nov. 29, 2019 for U.S. Appl. No. 16/353,998, filed Mar. 14, 2019, 33 Pages.
Final Office Action received for U.S. Appl. No. 14/155,087 dated Dec. 16, 2016, 32 pages.
Final Office Action received for U.S. Appl. No. 14/155,087 dated Jul. 20, 2015, 27 pages.
Final Office Action received for U.S. Appl. No. 14/155,087 dated Jul. 8, 2016, 27 pages.
Final Office Action received for U.S. Appl. No. 14/155,087 dated Nov. 27, 2017, 40 pages.
Gourmelon L., et al., “Contactless Sensors for Surface Electromyography,” Proceedings of the 28th IEEE EMBS Annual International Conference, New York City, NY, Aug. 30-Sep. 3, 2006, pp. 2514-2517.
Hainich R.R., et al., “Chapter 10: Near-Eye Displays,” Displays: Fundamentals & Applications, AK Peters/CRC Press, 2011, 65 pages.
Hauschild M., et al., “A Virtual Reality Environment for Designing and Fitting Neural Prosthetic Limbs,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, Mar. 2007, vol. 15 (1), pp. 9-15.
Hornstein S., et al., “Maradin's Micro-Mirror—System Level Synchronization Notes,” SID Digest, 2012, pp. 981-984.
“IEEE 100 The Authoritative Dictionary of IEEE Standards Terms,” Seventh Edition, Standards Information Network IEEE Press, Dec. 2000, 3 pages.
International Search Report and Written Opinion for International Application No. PCT/US2014/017799, dated May 16, 2014, 9 pages.
International Search Report and Written Opinion for International Application No. PCT/US2014/037863, dated Aug. 21, 2014, 10 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2017/043693, dated Feb. 7, 2019, 7 Pages.
International Preliminary Report on Patentability for International Application No. PCT/US2017/043791, dated Feb. 7, 2019, 9 Pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/031114, dated Nov. 19, 2020, 16 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/049094, dated Mar. 11, 2021, 24 Pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/052151, dated Apr. 1, 2021, 9 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2014/017799, dated Sep. 3, 2015, 8 Pages.
International Preliminary Report on Patentability for International Application No. PCT/US2014/037863, dated Nov. 26, 2015, 8 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2014/052143, dated Mar. 3, 2016, 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2014/067443, dated Jun. 9, 2016, 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2015/015675, dated Aug. 25, 2016, 8 Pages.
International Preliminary Report on Patentability for International Application No. PCT/US2017/043686, dated Feb. 7, 2019, 8 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2017/043792, dated Feb. 7, 2019, 8 Pages.
International Preliminary Report on Patentability for International Application No. PCT/US2018/056768, dated Apr. 30, 2020, 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2018/061409, dated May 28, 2020, 10 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/015174, dated Aug. 6, 2020, 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/015183, dated Aug. 6, 2020, 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/015238, dated Aug. 6, 2020, 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/028299, dated Dec. 10, 2020, 11 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/034173, dated Dec. 10, 2020, 9 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/046351, dated Feb. 25, 2021, 8 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/052131, dated Apr. 1, 2021, 8 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/054716, dated Apr. 15, 2021, 10 pages.
Non-Final Office Action dated Aug. 11, 2021 for U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 35 Pages.
Non-Final Office Action dated Sep. 11, 2019 for U.S. Appl. No. 14/465,194, filed Aug. 21, 2014, 72 Pages.
Non-Final Office Action dated May 12, 2022 for U.S. Appl. No. 16/899,843, filed Jun. 12, 2020, 34 Pages.
Non-Final Office Action dated Jun. 13, 2019 for U.S. Appl. No. 16/258,279, filed Jan. 25, 2019, 38 Pages.
Non-Final Office Action dated Sep. 14, 2017 for U.S. Appl. No. 14/539,773, filed Nov. 12, 2014, 28 pages.
Non-Final Office Action dated Aug. 15, 2018 for U.S. Appl. No. 14/465,194, filed Aug. 21, 2014, 64 Pages.
Non-Final Office Action dated Jun. 15, 2020 for U.S. Appl. No. 16/292,609, filed Mar. 5, 2019, 26 Pages.
Non-Final Office Action dated Jun. 15, 2020 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 46 Pages.
Non-Final Office Action dated Jan. 16, 2020 for U.S. Appl. No. 16/389,419, filed Apr. 19, 2019, 26 Pages.
Non-Final Office Action dated May 16, 2019 for U.S. Appl. No. 15/974,384, filed May 8, 2018, 13 Pages.
Non-Final Office Action dated May 16, 2019 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 12 Pages.
Non-Final Office Action dated Aug. 17, 2017 for U.S. Appl. No. 14/465,194, filed Aug. 21, 2014, 81 Pages.
Non-Final Office Action dated Dec. 17, 2018 for U.S. Appl. No. 16/137,960, filed Sep. 21, 2018, 10 pages.
Non-Final Office Action dated Jan. 18, 2018 for U.S. Appl. No. 15/799,621, filed Oct. 31, 2017, 10 pages.
Non-Final Office Action dated Nov. 19, 2019 for U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 32 Pages.
Non-Final Office Action dated Aug. 20, 2020 for U.S. Appl. No. 15/974,454, filed May 8, 2018, 59 Pages.
Non-Final Office Action dated Dec. 20, 2019 for U.S. Appl. No. 15/974,454, filed May 8, 2018, 41 Pages.
Non-Final Office Action dated Jan. 22, 2020 for U.S. Appl. No. 15/816,435, filed Nov. 17, 2017, 35 Pages.
Non-Final Office Action dated Jun. 22, 2017 for U.S. Appl. No. 14/461,044, filed Aug. 15, 2014, 21 Pages.
Non-Final Office Action dated Oct. 22, 2019 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 16 Pages.
Non-Final Office Action dated Dec. 23, 2019 for U.S. Appl. No. 16/557,383, filed Aug. 30, 2019, 53 Pages.
Non-Final Office Action dated Dec. 23, 2019 for U.S. Appl. No. 16/557,427, filed Aug. 30, 2019, 52 Pages.
Non-Final Office Action dated Feb. 23, 2017 for U.S. Appl. No. 14/505,836, filed Oct. 3, 2014, 54 Pages.
Non-Final Office Action dated Jul. 23, 2020 for U.S. Appl. No. 16/593,446, filed Oct. 4, 2019, 28 pages.
Non-Final Office Action dated May 24, 2019 for U.S. Appl. No. 16/353,998, filed Mar. 14, 2019, 20 Pages.
Non-Final Office Action dated Feb. 25, 2021 for U.S. Appl. No. 14/461,044, filed Aug. 15, 2014, 17 Pages.
Non-Final Office Action dated May 26, 2020 for U.S. Appl. No. 16/353,998, filed Mar. 14, 2019, 60 Pages.
Non-Final Office Action dated Nov. 27, 2020 for U.S. Appl. No. 16/258,279, filed Jan. 25, 2019, 44 Pages.
Non-Final Office Action dated Aug. 28, 2018 for U.S. Appl. No. 16/023,276, filed Jun. 29, 2018, 10 pages.
Non-Final Office Action dated Aug. 28, 2018 for U.S. Appl. No. 16/023,300, filed Jun. 29, 2018, 11 pages.
Non-Final Office Action dated Jun. 28, 2021 for U.S. Appl. No. 16/550,905, filed Aug. 26, 2019, 5 Pages.
Non-Final Office Action dated Apr. 29, 2019 for U.S. Appl. No. 16/257,979, filed Jan. 25, 2019, 63 Pages.
Non-Final Office Action dated Apr. 30, 2019 for U.S. Appl. No. 15/659,072, filed Jul. 25, 2017, 99 Pages.
Non-Final Office Action dated Apr. 30, 2020 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 57 Pages.
Non-Final Office Action dated Dec. 30, 2019 for U.S. Appl. No. 16/593,446, filed Oct. 4, 2019, 43 pages.
Non-Final Office Action dated Jun. 30, 2016 for U.S. Appl. No. 14/505,836, filed Oct. 3, 2014, 37 Pages.
Non-Final Office Action dated Oct. 30, 2019 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 22 Pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,087 dated Aug. 16, 2016, 28 pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,087 dated Aug. 7, 2017, 28 pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,087 dated Feb. 17, 2016, 26 pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,087 dated Mar. 31, 2015, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,107 dated Aug. 17, 2016, 37 pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,107 dated Aug. 7, 2017, 34 pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,107 dated Feb. 11, 2016, 42 pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,107 dated Jul. 13, 2018, 45 pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,107 dated Mar. 31, 2015, 26 pages.
Notice of Allowance dated May 1, 2019 for U.S. Appl. No. 16/137,960, filed Sep. 21, 2018, 14 pages.
Notice of Allowance dated Nov. 2, 2020 for U.S. Appl. No. 15/974,454, filed May 8, 2018, 24 Pages.
Notice of Allowance dated Nov. 4, 2019 for U.S. Appl. No. 15/974,384, filed May 8, 2018, 39 Pages.
Notice of Allowance dated Mar. 5, 2019 for U.S. Appl. No. 16/057,573, filed Aug. 7, 2018, 31 Pages.
European Search Report for European Patent Application No. 23186202.0, dated Aug. 2, 2023, 7 pages.
Khezri M., et al., “A Novel Approach to Recognize Hand Movements Via sEMG Patterns,” 2007 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Aug. 22, 2007, pp. 4907-4910.
Naik G.R., et al., “SEMG for Identifying Hand Gestures using ICA,” In Proceedings of the 2nd International Workshop on Biosignal Processing and Classification, Jan. 31, 2006, pp. 61-67.
Office Action dated Sep. 14, 2023 for Chinese Application No. 201980035465.1, filed May 28, 2019, 9 pages.
Office Action dated Aug. 15, 2023 for Japanese Patent Application No. 2021-507757, filed on Feb. 15, 2021, 9 pages.
Office Action dated Aug. 16, 2023 for Chinese Application No. 201880082887.X, filed Oct. 19, 2018, 17 pages.
Office Action dated Aug. 16, 2023 for Chinese Application No. 202080062417.4, filed Sep. 3, 2020, 11 pages.
Office Action dated Aug. 21, 2023 for Chinese Patent Application No. 201980062920.7, filed Sep. 20, 2019, 21 pages.
Office Action dated Jun. 22, 2023 for European Patent Application No. 19863248.1, filed on Sep. 20, 2019, 5 pages.
Office Action dated Sep. 28, 2023 for Chinese Application No. 201980022051.5, filed Jan. 25, 2019, 10 pages.
Office Action dated Aug. 29, 2023 for Japanese Application No. 2021-506985, filed Feb. 9, 2021, 6 pages.
Office Action dated Aug. 31, 2023 for Chinese Application No. 201980045972.3, filed May 7, 2021, 20 pages.
Valero-Cuevas F.J., et al., “Computational Models for Neuromuscular Function,” IEEE Reviews in Biomedical Engineering, Dec. 31, 2009, vol. 2, pp. 110-135.
Final Office Action received for U.S. Appl. No. 14/155,107 dated Dec. 19, 2016, 35 pages.
Final Office Action received for U.S. Appl. No. 14/155,107 dated Jan. 17, 2019, 46 pages.
Final Office Action received for U.S. Appl. No. 14/155,107 dated Jul. 16, 2015, 28 pages.
Final Office Action received for U.S. Appl. No. 14/155,107 dated Jul. 8, 2016, 31 pages.
Final Office Action received for U.S. Appl. No. 14/155,107 dated Nov. 27, 2017, 44 pages.
First Office Action dated Nov. 25, 2020, for Canadian Application No. 2921954, filed Aug. 21, 2014, 4 pages.
Fong H.C., et al., “PepperGram With Interactive Control,” 22nd International Conference on Virtual System & Multimedia (VSMM), Oct. 17, 2016, 5 pages.
Gallina A., et al., “Surface EMG Biofeedback,” Surface Electromyography: Physiology, Engineering, and Applications, 2016, pp. 485-500.
Gargiulo G., et al., “Giga Ohm High-Impedance FET Input Amplifiers for Dry Electrode Biosensor Circuits and Systems,” Integrated Microsystems: Electronics, Photonics, and Biotechnology, Dec. 19, 2017, 41 Pages, Retrieved from the Internet: URL: https://www.researchgate.net/profile/Aiistair_Mcewan/publication/255994293_Gigaohm_high_impedance_FETinput_amplifiers_for_dry_electrode_biosensor_circuits_and_systems.
Ghasemzadeh H., et al., “A Body Sensor Network With Electromyogram and Inertial Sensors: Multimodal Interpretation of Muscular Activities,” IEEE Transactions on Information Technology in Biomedicine, Mar. 2010, vol. 14 (2), pp. 198-206.
Gopura R.A.R.C., et al., “A Human Forearm and Wrist Motion Assist Exoskeleton Robot With EMG-Based Fuzzy-Neuro Control,” Proceedings of the 2nd Biennial IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics, Oct. 19-22, 2008, 6 pages.
Provisional Applications (1)
Number Date Country
62771957 Nov 2018 US
Continuations (1)
Number Date Country
Parent 17297449 US
Child 18069248 US