The accompanying drawings illustrate a number of exemplary embodiments and are a part of the specification. Together with the following description, these drawings demonstrate and explain various principles of the present disclosure.
Throughout the drawings, identical reference characters and descriptions indicate similar, but not necessarily identical, elements. While the exemplary embodiments described herein are susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and are described in detail herein. However, the exemplary embodiments described herein are not intended to be limited to the particular forms disclosed. Rather, the present disclosure covers all modifications, equivalents, and alternatives falling within the scope of the appended claims.
Extended reality systems, such as augmented reality or virtual reality systems, often include one or more sensors configured to detect the actions or intent of a user. Such sensors may be included in a control device, such as a wearable device configured to be worn by the user. For example, a wearable control device may include a plurality of electromyography (EMG) sensors that include electrodes designed to contact the skin of a user when the device is worn. EMG signals generated by these EMG sensors may, in turn, be used to generate a control signal that may be used to modify the extended reality experience of the user. However, these EMG signals are often susceptible to noise, such as electromagnetic noise generated by an electronic circuit of the control device and/or other components (such as magnetic trackers). Unfortunately, these noise signals often have undesirable effects on the control signal and, hence, on the extended reality experience of the user.
As is explained in greater detail below, the instant disclosure describes a variety of approaches to reducing or substantially eliminating the effects of noise, from any source, on detected sensor signals. For example, a control device according to the principles described herein may include an analog circuit with an amplifier configured to receive sensor signals from a plurality of electrodes, an analog-to-digital converter (ADC) configured to receive analog sensor signals from the analog circuit and to provide digital sensor signals, and a processor configured to receive the digital sensor signals and provide digital control signals based on the sensor signals.
In some examples, the control device may be configured to reduce the effects of electromagnetic noise. For example, the amplifier and/or the ADC may be configured to reduce noise signals, and an anti-aliasing filter may also be introduced into the analog circuit to prevent problematic under-sampling of the noise signal. The control device may also be shielded, and the arrangement of components within the control device may be configured to reduce the amplitude of the noise signal. Improved control signals may then be generated by the control device, allowing improved control of an extended reality view in response to the control signals.
In some embodiments, the output of one or more of the sensing components may be optionally processed using hardware signal-processing circuitry (e.g., to perform amplification, filtering, and/or rectification). In some embodiments, at least some signal processing of the output of the sensing components may be performed in software. Thus, signal processing of signals sampled by the sensors may be performed in hardware, software, or by any suitable combination of hardware and software, as aspects of the technology described herein are not limited in this respect. A non-limiting example of a signal processing chain used to process sensor data from sensors 210 is discussed in more detail below, for example, with reference to
In some examples, the dongle may be inserted into a separate computer device, which may be located within the same environment as the user but not carried by the user. This separate computer may receive signals from the control device and further process these signals to provide control signals to the head-mounted device. In some examples, the dongle may be network enabled, allowing communication with a remote computer through a network, and the remote computer may provide control signals to the head-mounted device to modify an extended reality (XR) image (e.g., a VR or AR image) presented to the user. In some examples, a dongle may be inserted into a head-mounted device to provide improved communications functionality, and the head-mounted device may perform further processing (e.g., modification of the XR image) based on the control signal received from the control device 310.
In some examples, an apparatus may not include a separate dongle portion. The functionality of the dongle portion may instead be included in a head-mounted device, such as an extended reality headset, or in another device such as a remote computer device. In some examples, the circuit described above in
A head-mounted device may include an antenna similar to antenna 352 described above in relation to
Although the examples provided with reference to
In some examples, electromagnetic interference may be reduced by increasing the distance between a device and its associated analog circuit and a magnetic tracker transmitter that generates an AC magnetic field. In some embodiments, a shielding material may be arranged around at least a portion of the analog circuit to shield the circuit, at least in part, from the effects of the AC magnetic field. In yet further embodiments, one or more components of the analog circuit of the EMG control device may be configured to reduce electromagnetic interference induced on one or more conductors of the EMG control device. One or more of the various techniques for reducing electromagnetic interference described herein may be used alone or in combination.
In some examples, the control glove 430 (which may be referred to more simply as a glove) may include one or more magnetic tracker receivers. For example, a finger of the glove may include at least one receiver coil, and detection of a tracker signal from the at least one receiver coil, induced by a magnetic tracker transmitter, may be used to determine the position and/or orientation of at least a portion of the finger. One or more receiver coils may be associated with each portion of a hand, such as a finger (including the thumb), the palm, and the like. The glove may also include other sensors providing sensor signals indicative of the position and/or configuration of the hand, such as electroactive sensors. Sensor signals, such as magnetic tracker receiver signals, may be transmitted to a control device, such as a wearable control device. In some examples, a control device (such as a wrist-mounted control device) may be in communication with a control glove and receive sensor data from the control glove using wired and/or wireless communication. For example, a flexible electrical connector may extend between a control device (e.g., a wrist-mounted control device) and the glove.
In some examples, the control device 420 may include an EMG control interface similar to the device illustrated in
Electromagnetic interference reduction techniques may be integrated with magnetic trackers having configurations similar to the configuration shown in
In some examples, electromagnetic interference may be reduced by increasing the physical distance between the magnetic tracker transmitter and the EMG control interface. In some embodiments, electromagnetic noise induced in the circuit of the EMG control interface may be reduced, at least in part, prior to analog-to-digital conversion using additional circuit components introduced in the analog signal chain. Although introducing additional circuit components into the analog signal chain increases the amount of area that the analog signal chain circuit consumes on a printed circuit board, in some embodiments the increase in area is offset by noise reduction benefits, such as those described in more detail below.
In some examples, the ADC 530 may be a differential ADC, configured to output a digital signal based on the difference between two analog input voltages. If a similar noise signal is present in both analog input voltages, the difference between the input voltages, and hence the digital signal, may be generally independent of the electromagnetic noise. For example, noise signals 544 and 546 may be present in both ADC inputs, and may have reduced or effectively no effect on the output of ADC 530 functioning as a differential ADC. In some examples, two digital signals may be generated, and digital subtraction (or division, or another comparison method) may be used to remove common mode noise signals. Alternatively, a difference signal may be generated and digitized. In this context, a common mode noise signal may refer to similar noise signals present in multiple sensor signals, or data channels derived therefrom.
In some examples, a device may include an analog circuit configured so that a first noise signal present at the non-inverting input of a differential amplifier may be similar to, and in phase with, a second noise signal generated at the inverting input of the differential amplifier. The differential amplifier may then effectively subtract the second noise signal from the first noise signal, so that the differential amplifier output may be effectively noise free, or have reduced noise in the differential amplifier output (e.g., compared to the output of a non-differential amplifier used in a similar circuit). In some examples, a device may include one or more fully differential amplifiers, where the difference between the two output voltages may be based on the difference between the two input voltages (optionally including multiplication by the gain, if any). For both differential amplifiers and fully differential amplifiers, the noise signal may be a common mode signal, having a similar form in both inputs, that is thereby greatly reduced or substantially eliminated in the output voltage(s). In some examples, negative feedback may be provided to reduce the gain and/or improve the signal bandwidth.
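The common-mode cancellation described above can be illustrated numerically. The sketch below is a minimal idealized model (all amplitudes, frequencies, and the gain value are illustrative assumptions, not taken from the disclosure): the same noise waveform appears at both amplifier inputs, and differencing removes it while the wanted signal is amplified.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0.0, 0.5, 1.0 / 6000.0)           # assumed 6 kHz sampling, 0.5 s

emg = 0.1 * rng.standard_normal(t.size)          # stand-in for a broadband EMG signal
noise = 0.5 * np.sin(2 * np.pi * 25_000.0 * t)   # common-mode tracker noise (assumed 25 kHz)

v_plus = emg + noise    # non-inverting input: signal plus noise
v_minus = noise         # inverting input: the same noise waveform

gain = 100.0
v_out = gain * (v_plus - v_minus)                # idealized differential amplification

# The common-mode noise cancels; only the amplified EMG remains.
residual = np.max(np.abs(v_out - gain * emg))
print(residual)
```

In practice the two inputs never carry exactly identical noise, so cancellation is limited by the amplifier's common-mode rejection ratio rather than being perfect as in this idealized model.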
In some examples, the ADC and anti-aliasing filter may be integrated into a single package, for example, a single integrated circuit (IC, or chip). Shielding may be located proximate, adjacent, or within the ADC/anti-aliasing chip to reduce noise generation in the chip.
In some embodiments, attenuation of noise generated by an external electromagnetic source may be achieved using a higher-order anti-aliasing filter arranged between an input amplifier and an ADC within an analog-signal chain of an EMG control interface. The higher-order anti-aliasing filter may, for example, in combination with the amplifier, provide a transfer function such that the amplified in-band signals are at least 90 dB higher than the attenuated noise signals.
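The 90 dB figure above can be sanity-checked with a filter-design sketch. The following is illustrative only (the cutoff frequency, noise frequency, and filter order are assumptions, and the disclosure does not specify a Butterworth response): an eighth-order low-pass with a 3 kHz corner attenuates a 25 kHz noise tone well beyond 90 dB relative to the passband, even before amplifier gain is counted.

```python
import numpy as np
from scipy import signal

f_cut = 3_000.0     # assumed passband edge (upper end of the EMG band), Hz
f_noise = 25_000.0  # assumed magnetic tracker noise frequency, Hz
order = 8           # an example of a "higher-order" filter

# Analog Butterworth low-pass prototype (Wn in rad/s for analog design).
b, a = signal.butter(order, 2 * np.pi * f_cut, btype="low", analog=True)

# Evaluate the magnitude response at the noise frequency.
_, h = signal.freqs(b, a, worN=[2 * np.pi * f_noise])
atten_db = -20 * np.log10(np.abs(h[0]))
print(f"attenuation at {f_noise / 1e3:.0f} kHz: {atten_db:.0f} dB")
```

A Butterworth filter rolls off at roughly 20 dB per decade per order, so 8 orders over the 3 kHz to 25 kHz span gives on the order of 140 to 150 dB of stopband attenuation in this sketch.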
In some embodiments, electromagnetic noise may be reduced by changing a characteristic of the ADC circuit. A conventional ADC circuit is often susceptible to aliasing, as discussed above. In some embodiments, a continuous-time ADC, which does not exhibit the same aliasing behavior, is used in the analog signal chain of the EMG control interface. Although continuous-time ADCs may be more expensive and consume more power than a conventional ADC circuit, the tradeoff of improved electromagnetic interference reduction may be suitable for some applications.
In some examples, the shielding material may include an electrically conductive material. A shielding material may include a metal layer, such as an aluminum layer, having a metal layer thickness of 2 mm or less, for example, a thickness of 1 mm or less. In some embodiments, multiple layers of shielding material may be used, for example, if one layer of magnetic shielding does not offer the desired attenuation of noise signals. The shielding material as disclosed herein can be formed from or include any suitable material (including flexible and lightweight materials) provided it achieves the functionality described herein. In addition to those mentioned above, such materials include but are not limited to: one or more metals and/or alloys or compounds (e.g., those comprising aluminum, bronze, tin, copper, and/or mu-metals), carbon-filled nylon, conductive paint (e.g., silver and/or carbon-based paint), conductive fabric (e.g., silver nanowire), conductive polymers (e.g., carbon or graphene filled polylactic acid (PLA)), conductive plastics, conductive rubbers, conductive silicones, or combinations thereof. The shielding material may also include one or more non-conductive components that may be combined with any one or more conductive components, such as the aforementioned examples.
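A rough way to relate shield thickness to the noise frequencies at issue is the skin depth of the conductor. The calculation below uses textbook values for aluminum and an assumed 30 kHz field (both illustrative; the disclosure does not specify them): the skin depth comes out near half a millimeter, so a layer in the 1 to 2 mm range discussed above spans one or more skin depths at typical tracker frequencies.

```python
import math

# Skin depth: delta = sqrt(2 * rho / (omega * mu)).
rho = 2.65e-8            # resistivity of aluminum, ohm * m (textbook value)
mu = 4 * math.pi * 1e-7  # permeability of free space (aluminum is non-magnetic)
f = 30_000.0             # assumed representative tracker frequency, Hz

omega = 2 * math.pi * f
delta = math.sqrt(2 * rho / (omega * mu))
print(f"skin depth at {f / 1e3:.0f} kHz: {delta * 1e3:.2f} mm")
```

Eddy-current shielding of an AC magnetic field improves as the layer thickness grows relative to the skin depth, which is one reason multiple layers may be stacked when a single layer attenuates insufficiently.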
In some examples, a method of reducing electromagnetic interference in an analog circuit of a control device for an extended reality (XR) system may include: providing an analog signal chain circuit that includes at least one amplifier and an analog-to-digital converter coupled to an amplifier by one or more electrical conductors; and reducing electromagnetic interference induced on the one or more electrical conductors by an external AC magnetic field by configuring at least one component of the control device to reduce the electromagnetic interference. The step of reducing the electromagnetic interference may include providing, in the analog signal chain circuit, at least one fully differential amplifier configured to subtract electromagnetic noise present on the one or more electrical conductors, which may include providing at least two fully differential amplifiers in the analog signal chain circuit. Reducing the electromagnetic interference may also include providing, in the analog signal chain circuit, at least one anti-aliasing filter arranged between an amplifier and the analog-to-digital converter, and/or arranging an anti-aliasing filter to be closer to the analog-to-digital converter than an amplifier. An anti-aliasing filter may include an anti-aliasing filter having at least two stages. In addition, reducing the electromagnetic interference may include forming a shielding material around at least a portion of the analog signal chain circuit. In one example, the method may also include providing, in the analog signal chain circuit, a plurality of analog-to-digital converters, each of which is configured to process the output of a single signal channel of a plurality of signal channels.
In another example, the method may also include reducing the electromagnetic interference by integrating a magnetic tracker receiver within the control device and configuring a distance between the magnetic tracker receiver and a magnetic tracker transmitter of the XR system to reduce the electromagnetic interference.
In some examples, electromagnetic noise reduction may include the use of hardware, software, or a combination thereof. When implemented in software, the software code may be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers. Any component or collection of components that perform the functions described above may be generically considered as one or more controllers that control the above-discussed functions. The one or more controllers may be implemented in numerous ways, such as with dedicated hardware or with one or more processors programmed using microcode or software to perform the functions recited above.
In some examples, a device may include at least one non-transitory computer readable storage medium (e.g., a computer memory, a portable memory, a compact disk, etc.) encoded with a computer program (i.e., a plurality of instructions), which, when executed on a processor, performs the above-discussed functions of the described examples. The computer-readable storage medium may be transportable such that the program stored thereon may be loaded onto any computer resource to implement any suitable aspects of the described examples. In addition, the reference to a computer program which, when executed, performs the above-discussed functions, is not limited to an application program running on a host computer. Rather, the term computer program may be used herein to reference any type of computer code (e.g., software or microcode) that may be employed to program a processor to implement one or more aspects of the described examples.
In some examples, a device (such as a control device for an extended reality (XR) system) includes an analog signal chain circuit including at least one amplifier configured to amplify analog electrical signals recorded from a body of a user on which the device is worn, and an analog-to-digital converter configured to convert the amplified analog electrical signals to digital signals. In these examples, at least one component of the device is configured to reduce electromagnetic interference induced on one or more conductors within the analog signal chain circuit by an external AC magnetic field.
In some examples, the amplifier may include at least one fully differential amplifier configured to reduce the electromagnetic interference. In some examples, the amplifier may include at least two fully differential amplifiers. In some examples, the analog signal chain circuit may further include an anti-aliasing filter arranged between an amplifier and a respective analog-to-digital converter, where the anti-aliasing filter is configured to reduce electromagnetic interference. In some examples, the anti-aliasing filter may include an anti-aliasing filter arranged closer to the analog-to-digital converter than the amplifier. In some examples, the distance between the anti-aliasing filter and the analog-to-digital converter is less than 2 cm. The anti-aliasing filter may have one or more stages, such as at least two stages. An example device may further include a shielding material formed around at least a portion of the analog signal chain circuit, where the shielding material is configured to reduce the electromagnetic interference.
In some examples, a device includes a plurality of signal channels, where each signal channel is configured to record an analog electrical signal from the body of the user. The analog signal chain circuit may further include a plurality of analog-to-digital converters, each of which is configured to process the analog electrical signal from one of the plurality of signal channels. In some examples, the control device may include a magnetic tracker receiver. The distance between the magnetic tracker receiver and the magnetic tracker transmitter of the XR system may be configured to reduce the electromagnetic interference. An example device, such as a control device, may include a plurality of EMG sensors configured to record a plurality of EMG signals from the body of the user, with an amplifier coupled to one or more of the plurality of EMG sensors. The analog-to-digital converter may include a continuous-time analog-to-digital converter configured to reduce the electromagnetic interference.
In some examples, a method of reducing electromagnetic interference in an analog circuit of a control device for an extended reality (XR) system includes: providing an analog signal chain circuit including at least one amplifier and an analog-to-digital converter coupled to an amplifier by one or more electrical conductors; and reducing electromagnetic interference induced on the one or more electrical conductors by an external AC magnetic field by configuring at least one component of the control device to reduce the electromagnetic interference. The step of reducing the electromagnetic interference may include providing, in the analog signal chain circuit, at least one fully differential amplifier configured to subtract electromagnetic noise present on the one or more electrical conductors, which may include providing at least two fully differential amplifiers in the analog signal chain circuit. Reducing the electromagnetic interference may also include providing, in the analog signal chain circuit, at least one anti-aliasing filter arranged between an amplifier and the analog-to-digital converter, and/or arranging an anti-aliasing filter to be closer to the analog-to-digital converter than an amplifier. An anti-aliasing filter may include an anti-aliasing filter having at least two stages. Reducing the electromagnetic interference may include forming a shielding material around at least a portion of the analog signal chain circuit. An example method may further include providing, in the analog signal chain circuit, a plurality of analog-to-digital converters, each of which is configured to process the output of a single signal channel of a plurality of signal channels.
An example method may further include reducing the electromagnetic interference by integrating a magnetic tracker receiver within the control device such that a distance between the magnetic tracker receiver and a magnetic tracker transmitter of the XR system is configured to reduce the electromagnetic interference. In some examples, the magnetic tracker transmitter may be located within or supported by a head-mounted device or positioned in another location away from the sensors within the control device. The control device may include one or more magnetic tracker receiver coils, and/or receive signals from one or more magnetic tracker receiver coils. Receiver coils may be located on, for example, the hand, wrist, limb segments, joints, head, or other locations on the user's body.
In some examples, an extended reality (XR) system may include a head-mounted device, such as a headset configured to be worn on a user's head, and a control device configured to be worn on the user's arm or wrist. In these examples, the control device includes an analog signal chain circuit including at least one amplifier configured to amplify analog electrical signals recorded from a body of the user and an analog-to-digital converter configured to convert the amplified analog electrical signals to digital signals. In addition, at least one component of the XR system may be configured to reduce electromagnetic interference induced on one or more conductors within the analog signal chain circuit by an external AC magnetic field.
As detailed above, an electromagnetic field, such as a transmitter signal, may induce noise signals within the receiver of an apparatus. This transmitter signal may be generated by passing an alternating current from an alternating voltage source through a coil. The transmitter signal may generate a noise signal within a closed-loop or open-loop conductor within the receiver due, for example, to an interaction with the transmitter signal. A closed-loop or open-loop conductor may be formed, at least in part, by conducting tracks within the receiver circuit.
An example control device may include EMG sensors arranged circumferentially around a band, such as an elastic band configured to be worn around a body part of a user, such as the lower arm, wrist, one or more fingers, ankle, foot, head, or chest. Any suitable number of neuromuscular sensors may be used. The number and arrangement of neuromuscular sensors within a device may depend on the particular application for which the control device is used. In some examples, the sensors of an apparatus may be coupled together, for example, using flexible electronics incorporated into a control device, for example, within a flexible band.
In some examples, an apparatus, such as a control device (e.g., including an armband, wristband, and/or a head-mounted device) may be configured to generate a control signal for controlling an external device. The external device that may be controlled by the apparatus may include one or more of the following: an augmented reality system, a robot, an appliance (such as a television, radio, or other audiovisual device), an in-house system (such as heating or air conditioning), a vehicle, or other electronic device including a screen (e.g., to scroll through text, interact with a user interface, or control the operation of software). In some examples, an apparatus may be configured to control a virtual avatar within an augmented reality or virtual reality environment, or to perform any other suitable control task.
In some embodiments, the output of one or more of the sensors may be optionally processed using hardware signal-processing circuitry (e.g., to perform amplification, filtering, and/or rectification). In some embodiments, at least some signal processing of the output of the sensors may be performed in software. Thus, signal processing of signals sampled by the sensors may be performed in hardware, software, or by any suitable combination of hardware and software, as aspects of the technology described herein are not limited in this respect.
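A software version of the amplification, filtering, and rectification steps mentioned above can be sketched as follows. This is a generic illustration, not the disclosed implementation: the sampling rate, band edges, filter orders, and the synthetic input are all assumptions chosen for a plausible EMG-style pipeline.

```python
import numpy as np
from scipy import signal

fs = 2_000.0                        # assumed sampling rate, Hz
t = np.arange(0.0, 1.0, 1.0 / fs)
rng = np.random.default_rng(1)

# Stand-in raw signal: broadband "EMG" plus a low-frequency motion artifact.
raw = rng.standard_normal(t.size) + 2.0 * np.sin(2 * np.pi * 2.0 * t)

# Band-pass filter to an assumed EMG band (20-450 Hz), then rectify.
sos = signal.butter(4, [20.0, 450.0], btype="bandpass", fs=fs, output="sos")
filtered = signal.sosfilt(sos, raw)
rectified = np.abs(filtered)

# Low-pass the rectified signal to obtain an amplitude envelope.
env_sos = signal.butter(2, 5.0, btype="low", fs=fs, output="sos")
envelope = signal.sosfilt(env_sos, rectified)
```

The same three stages (band-limit, rectify, smooth) could equally be split between hardware and software, consistent with the flexibility described above.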
An example device may include a control device and one or more dongle portions in communication with the control device (e.g., via BLUETOOTH or another suitable short-range wireless communication technology). The control device may include one or more sensors, which may include electrical sensors including one or more electrodes. The electrical outputs from the electrodes, which may be referred to as sensor signals, may be provided to an analog circuit configured to perform analog processing (e.g., filtering, etc.) of the sensor signals. The processed sensor signals may then be provided to an analog-to-digital converter (ADC), which may be configured to convert analog signals to digital signals that may be processed by one or more computer processors. Example computer processors may include one or more microcontrollers (MCUs), such as the nRF52840 (manufactured by NORDIC SEMICONDUCTOR). The MCU may also receive inputs from one or more other sensors. The device may include one or more other sensors, such as an orientation sensor, which may be an absolute orientation sensor and may include an inertial measurement unit. An example orientation sensor may include a BNO055 inertial measurement unit (manufactured by BOSCH SENSORTEC). The device may also include a dedicated power supply, such as a power and battery module. The output of the processing performed by the MCU may be provided to an antenna for transmission to the dongle portion or another device. Other sensors may include mechanomyography (MMG) sensors, sonomyography (SMG) sensors, electrical impedance tomography (EIT) sensors, and other suitable types of sensors.
A dongle portion may include one or more antennas configured to communicate with the control device and/or other devices. Communication between device components may use any suitable wireless protocol, such as radio-frequency signaling and BLUETOOTH. Signals received by the antenna of dongle portion may be provided to a computer through an output, such as a USB output, for further processing, display, and/or for effecting control of a particular physical or virtual object or objects.
In some examples, a magnetic tracker transmitter may be provided by a separate device, such as a separate computer device that may not be supported by a user. For example, a magnetic tracker transmitter may be located at a fixed location relative to the environment (e.g., a room) in which the user is located.
In some examples, a device according to the principles disclosed herein may include a higher-order anti-aliasing filter to attenuate the noise signals. This filter, in combination with the amplifier block, may provide a transfer function such that the amplified in-band signals are at least 90 dB higher than the attenuated noise signals. Such a configuration may use traces between the ADC inputs and the anti-aliasing filter outputs that are as short as practically possible. The noise signal coupled into unprotected traces may be negligible if such traces are kept short (e.g., approximately 3 mm in length or less).
In some examples, an ADC may be located within each analog channel, and may be located as close to the amplifier output as possible. For example, the analog signal may be converted into a corresponding digital form soon after it is outputted by the amplifier, for example, using a trace (e.g., a PCB track or other electrical conductor) having a length of approximately 3 mm or less. In some examples, the trace length may be approximately equal to or less than 2 mm to substantially avoid noise generation through the alternating electromagnetic field.
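The benefit of short traces can be made concrete with a first-order estimate of the voltage induced in the loop a trace pair forms. All numbers below (field strength, frequency, trace spacing, and the helper function itself) are illustrative assumptions: the point is simply that the induced EMF scales with loop area, so a 3 mm trace picks up an order of magnitude less than a 30 mm one.

```python
import math

f = 25_000.0    # assumed tracker transmitter frequency, Hz
b_field = 1e-6  # assumed flux density at the circuit, tesla

def induced_emf_uV(trace_len_mm: float, spacing_mm: float = 1.0) -> float:
    """Peak EMF (microvolts) for a rectangular loop formed by a trace pair,
    using the Faraday-law estimate V = omega * B * A (hypothetical helper)."""
    area = (trace_len_mm * 1e-3) * (spacing_mm * 1e-3)  # loop area, m^2
    return 2 * math.pi * f * b_field * area * 1e6

print(induced_emf_uV(30.0))  # a 30 mm trace run
print(induced_emf_uV(3.0))   # a 3 mm trace run: ten times less pickup
```

Because EMG signals are themselves in the microvolt-to-millivolt range, even sub-microvolt pickup can matter before amplification, which motivates placing the ADC as close to the amplifier output as practical.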
Examples include various methods and apparatuses for reducing electromagnetic interference in sensors used in extended reality (XR) environments, such as augmented reality (AR) or virtual reality (VR) environments. As is explained in greater detail below, positional tracking may be used in XR environments (such as AR or VR environments) to track movements, for example, with six degrees of freedom. A corresponding computing device may be configured to estimate a position of an object relative to the environment using one or more positional tracking approaches. Positional tracking may include magnetic tracking, in which the magnitude of a magnetic field may be measured in different directions.
An example apparatus may include a control device, which may be configured to be worn on the wrist of a user, and a head-mounted device. The control device, such as a wearable control device, may be configured to be supported on the wrist or lower arm of a user, and may include one or more sensors. The head-mounted device may include a headset, augmented reality spectacles, or another device configured to be supported by the head of a user, for example, by one or more frame elements, straps, and/or other support elements. The headset may take the form of a visor or helmet, or may be supported by a frame similar to those of spectacles. The head-mounted device may be configured to provide an extended reality environment to a user, such as a virtual reality or augmented reality environment. The control device and the head-mounted device may be in communication with one another, such as via wireless or wired communication components. The control device may detect gestures or other movements of the hands of the user, and provide control signals to the head-mounted device. The control signals may be used to modify augmented or virtual reality image elements displayed to a user. In some examples, control signals may be used to control real (physical) devices, which may be viewed by the user as part of an extended reality experience. In some examples, the apparatus may include a control element used to send control signals to a computer device.
An example magnetic tracker (which may also be referred to as a magnetic tracking system) may determine the intensity of a magnetic field using one or more electromagnetic sensors, such as magnetic sensors. The magnetic tracker may include a base station having a transmitter configured to generate an alternating or static electromagnetic field, and one or more sensors that may be configured to send sensor data to a computing device. The sensor data may be related to a position and/or orientation of the sensor with respect to the transmitter. The magnetic tracker may also be configured to enable the determination of object orientation. For example, if a tracked object is rotated, the distribution of the magnetic field along various axes (e.g., orthogonal axes in relation to the sensor) may change. The resulting change in the sensor signal may be used to determine the orientation of the sensor, and, optionally, the orientation of an object on which the sensor is located.
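As a simplified illustration of how field magnitude relates to position in such a tracker, the sketch below inverts an idealized dipole falloff to estimate transmitter-to-sensor distance. The calibration constant, the helper name, and the pure 1/r³ model are all assumptions for illustration; a real tracker solves for full position and orientation from multiple field components rather than distance alone.

```python
# Idealized model: field magnitude of a dipole-like transmitter falls off
# as 1 / r^3, so |B| = K / r^3 for some calibration constant K.
K = 1e-7  # assumed calibration constant, tesla * m^3 (illustrative)

def estimate_distance_m(b_magnitude_T: float) -> float:
    """Invert |B| = K / r^3 for r (hypothetical helper, dipole model only)."""
    return (K / b_magnitude_T) ** (1.0 / 3.0)

# A stronger measured field implies a closer sensor.
print(estimate_distance_m(1e-4))  # ~0.1 m in this model
print(estimate_distance_m(1e-7))  # ~1.0 m in this model
```

The steep 1/r³ falloff also explains the observation elsewhere in this disclosure that induced noise grows rapidly as the transmitter is moved closer to the sensing circuit.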
In one embodiment, an example apparatus may include an improved human-machine interface for XR devices, such as AR or VR devices, and an apparatus configured to control computing devices or other electronic devices. This example apparatus may also include a control device configured to receive and process electrical signals derived from the body of a user to provide a control signal. The control signal may be used for object manipulation within an XR environment, control of a computing device, or control of any other suitable electronic device. The control device may include one or more sensors, which may include one or more of an electromyography (EMG) sensor, mechanomyography (MMG) sensor, sonomyography (SMG) sensor, electrical impedance tomography (EIT) sensor, and/or any other suitable sensor.
In some examples, an apparatus, such as a control device, may include one or more printed circuit boards (PCBs), which may be electrically interconnected. In some examples, a PCB may include a plurality of electrically conducting traces, which in some examples may be configured to sense signals, such as signals from the body of a user. The apparatus may be configured to include one or more electronic circuits configured to provide signal amplification, data acquisition, wireless transmission, or other suitable signal acquisition and processing operations.
An apparatus including a magnetic tracker may allow, for example, manipulation of objects in XR environments. An alternating or static magnetic field produced by a transmitter of a magnetic tracker may induce voltage and/or current within an apparatus. For example, electromagnetic fields generated by the transmitter may induce electrical signals within the apparatus, for example, due to electromagnetic coupling to electrical conductors within the apparatus, such as copper tracks within a PCB. In some examples, copper tracks may help form electrically conducting loops, and stray signals may be induced within such loops. The stray signals may induce noise signals within the device, and the noise signals may have the same frequency as the transmitter. The transmitter frequency may be, for example, in the range of approximately 10 kHz to approximately 50 kHz. The transmitter frequency may be higher than the frequency of signals obtained from the body of the user, for instance, higher than electromyography signals, which are normally in the frequency range of approximately 20 Hz to approximately 3 kHz. A sampling frequency that may be approximately twice the highest frequency of the biometric signals (e.g., approximately 6 kHz) may be used to convert the analog signals into digital signals. Even though the induced noise frequency may be much higher than that of the biometric signals from human bodies, after the analog-to-digital conversion stage, the noise may be under-sampled, and alias signals originating from the under-sampling of the noise signal may be introduced into the frequency band of the biometric signals. Effectively, high-frequency noise signals may be “down-converted” into the frequency band and be combined with the biometric signals of interest. The noise signal may become stronger as the transmitter is moved closer to the apparatus.
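The down-conversion effect described above can be illustrated numerically. The sketch below is illustrative only: the 25 kHz tracker frequency and 6 kHz sampling rate are assumed values chosen from the ranges given above, and the `alias_frequency` helper is a hypothetical function, not part of any device firmware.

```python
def alias_frequency(f_signal_hz, f_sample_hz):
    """Return the apparent frequency of a component after sampling.

    Any component above the Nyquist rate (f_sample / 2) folds back
    into the band [0, f_sample / 2].
    """
    folded = f_signal_hz % f_sample_hz
    return min(folded, f_sample_hz - folded)

# Assumed values: tracker at 25 kHz (within 10-50 kHz), EMG sampled at 6 kHz.
print(alias_frequency(25_000, 6_000))  # 1000 -- lands inside the EMG band
print(alias_frequency(2_000, 6_000))   # 2000 -- in-band EMG is unaffected
```

With these assumed numbers, the 25 kHz tracker noise aliases to 1 kHz, squarely inside the approximately 20 Hz to 3 kHz electromyography band, which is the interference mechanism the remainder of this disclosure addresses.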
Positional tracking may be used in XR environments to track movements with up to six degrees of freedom. Computer devices may be configured, with hardware and/or software, to estimate the positions of objects relative to the environment using positional tracking technologies. Magnetic tracking is a technique in which the magnitude of a magnetic field may be measured in different directions to track the positions of one or more objects in an environment.
A magnetic tracking system (or magnetic tracker) may be configured to measure the intensity of an inhomogeneous magnetic field using one or more electromagnetic sensors. An example magnetic tracker may include a transmitter configured to generate an alternating or static electromagnetic field, and one or more electromagnetic sensors configured to provide a respective sensor position (e.g., with respect to the transmitter) to a computer device. The orientation of an object may also be determined using a magnetic tracker. For instance, if a tracked object is rotated, the distribution of the magnetic field along the various axes may change, and these changes may be used to determine the object's orientation.
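The relationship between field magnitude and sensor position can be roughly illustrated with a simplified point-dipole model, in which field strength falls off with the cube of distance. This is an assumption for illustration; commercial trackers solve full three-axis field equations rather than the single-magnitude estimate sketched here.

```python
def distance_from_field(b_measured, b_ref, r_ref=1.0):
    """Estimate sensor-to-transmitter distance from field magnitude,
    assuming a point-dipole field that falls off as 1/r**3, so that
    r = r_ref * (b_ref / b_measured) ** (1/3).

    b_ref is the magnitude measured at the reference distance r_ref.
    """
    return r_ref * (b_ref / b_measured) ** (1.0 / 3.0)

# A field magnitude dropping to 1/8 of its reference value implies the
# sensor-to-transmitter distance has doubled.
print(round(distance_from_field(0.125, 1.0), 6))  # 2.0
```

Orientation follows from the complementary observation in the text: rotating the sensor redistributes the field among the sensor's axes while the total magnitude is unchanged, so the per-axis ratios encode orientation.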
An example apparatus may include control devices configured as human-machine interfaces, and may be used for immersive XR applications (such as virtual reality applications) and, more generally, to control computer devices. An example interface device may be configured to process electrical signals derived from the body of a user, and may be used to achieve realistic object manipulation in an XR environment. Example devices may include one or more sensors, including one or more electromyography (EMG) sensors, mechanomyography (MMG) sensors, sonomyography (SMG) sensors, electrical impedance tomography (EIT) sensors, and/or other suitable sensors. Example devices may include one or more printed circuit boards (PCBs), which may include multiple interconnected boards containing many (e.g., thousands of) electrically conductive traces routed to achieve certain functionalities, such as sensing signals from a user's body, signal amplification, data acquisition, wireless transmission, and/or other suitable signal acquisition and processing operations.
Example devices including a magnetic tracker may enable, for example, manipulation of objects in XR environments. However, an alternating or static magnetic field produced by the transmitter of a magnetic tracker may induce a voltage or current within open or closed electrically conductive loops within the device. An electrically conductive loop may include device components (e.g., copper traces on a PCB, electronic components, wires, and the like), resulting in noise being introduced into the device. The introduced noise may fluctuate at the same frequency used by the magnetic tracker, which may operate, for example, at a frequency within a range of 10 kHz to 50 kHz. In some examples, a magnetic tracker transmitter and/or a corresponding magnetic tracker receiver may include at least one coil, such as a 3-axis coil arrangement. Signal processing may be used to establish the three-dimensional relationship between transmitter and receiver coil arrangements.
Magnetic tracker frequencies may be higher than frequencies associated with signals recorded and/or derived from the user's body. For instance, frequencies associated with EMG signals typically range between approximately 20 Hz and approximately 3 kHz. In some examples, a device may use a signal sampling frequency twice as large as the highest frequency of the signal of interest (e.g., approximately 6 kHz) to convert the analog signals into digital signals (e.g., using an analog-to-digital converter (ADC) circuit). Despite the induced noise frequency being substantially higher than the frequency of the recorded biometric signals, when the high-frequency noise is provided as input to the ADC circuit, it may be undersampled, and an aliased image of the noise may interfere with the frequency band of interest associated with the biometric signals. The high-frequency noise signal may be “down-converted” into the frequency band of interest by the ADC circuit. In some examples, a device is configured to reduce electromagnetic interference in a control device, such as a wearable control device.
In some examples, an apparatus may include a control device for an extended reality system. The control device may include an analog signal chain circuit including at least one amplifier configured to amplify analog electrical signals recorded from a body of a user on which the control device is worn, and an analog-to-digital converter configured to convert the amplified analog electrical signals to digital signals, where at least one component of the control device is configured to reduce electromagnetic interference induced on one or more conductors within the analog signal chain circuit by an external AC magnetic field. In some examples, an amplifier may include at least one fully differential amplifier configured to reduce the electromagnetic interference. In some examples, an amplifier includes at least two fully differential amplifiers. In some examples, the analog signal chain circuit may further include at least one anti-aliasing filter arranged between an amplifier and the analog-to-digital converter, where an anti-aliasing filter may be configured to reduce the electromagnetic noise within the analog circuit. In some examples, an anti-aliasing filter may be located closer to the analog-to-digital converter than to the associated amplifier. In some examples, a distance between the anti-aliasing filter and the analog-to-digital converter may be less than 20 mm, and in some examples may be less than 5 mm, such as less than 2 mm. In some examples, an anti-aliasing filter may have at least two stages. In some examples, the control device may further include a shielding material formed around at least a portion of the analog circuit.
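The benefit of a multi-stage anti-aliasing filter can be sketched by evaluating the magnitude response of one versus two cascaded first-order low-pass sections. The 3 kHz cutoff and 25 kHz noise frequency below are assumed values consistent with the bands discussed above; real devices may use different filter topologies and component values.

```python
import math

def lowpass_gain(f_hz, f_cutoff_hz, stages=1):
    """Magnitude response of `stages` identical cascaded first-order
    low-pass sections: |H(f)| = (1 + (f / fc) ** 2) ** (-stages / 2)."""
    return (1.0 + (f_hz / f_cutoff_hz) ** 2) ** (-stages / 2.0)

fc = 3_000        # Hz: cutoff just above the EMG band (assumed)
f_noise = 25_000  # Hz: assumed magnetic tracker frequency

for stages in (1, 2):
    gain = lowpass_gain(f_noise, fc, stages)
    print(f"{stages}-stage attenuation: {20 * math.log10(gain):.1f} dB")
```

With these assumptions, adding a second stage roughly doubles the attenuation in decibels at the tracker frequency (from about -18 dB to about -37 dB), which is why a filter with at least two stages, placed close to the converter input, can meaningfully suppress the noise before it is undersampled.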
In some examples, a device, such as a control device, may further include a plurality of signal channels, where each signal channel is configured to record an analog electrical signal from the body of the user, and where the analog circuit further includes a plurality of analog-to-digital converters, each of which is configured to process the analog electrical signal within one of the plurality of signal channels. In other words, the analog circuit may include a plurality of signal channels, and each signal channel may include a respective analog-to-digital converter.
In some examples, a device, such as a control device, may further include a magnetic tracking system receiver (also referred to as a magnetic tracker receiver), where, in some examples, a distance between the magnetic tracker receiver and the magnetic tracker transmitter of the XR system may be configured to reduce the electromagnetic noise in the analog circuit. For example, the magnetic tracker receiver may be located adjacent or otherwise proximate the head of the user. In some examples, the control device may further include a plurality of EMG sensors configured to record a plurality of EMG signals from the body of the user, where an amplifier is coupled to one or more of the plurality of EMG sensors. In some examples, the analog-to-digital converter may include a continuous-time analog-to-digital converter configured to reduce the electromagnetic interference. In some examples, the magnetic tracker system may be trained, for example, by comparison of receiver signals with analysis of images determined using an optical imaging system, or using a training process in which a user places body parts, such as hands, into predetermined configurations. Magnetic tracker receivers may be distributed over the body of a user, for example, distributed over the torso, limb segments, and joints of a user. The magnetic tracking data may also be used in conjunction with a musculo-skeletal model of the user.
Some embodiments are directed to methods of reducing electromagnetic interference in an analog circuit of a control device for an extended reality system. An example method may include providing an analog circuit including at least one amplifier and an analog-to-digital converter coupled to an amplifier by one or more electrical conductors, and reducing electromagnetic interference induced on the one or more electrical conductors by an external AC magnetic field by configuring at least one component of the control device to reduce the electromagnetic interference.
In some examples, a method of reducing the electromagnetic interference includes providing in the analog signal chain circuit at least one fully differential amplifier configured to subtract electromagnetic noise present on the one or more electrical conductors. In some examples, the analog circuit may include at least one fully differential amplifier, such as at least two fully differential amplifiers. In some examples, reducing the electromagnetic interference includes providing in the analog circuit at least one anti-aliasing filter arranged between an amplifier and the analog-to-digital converter. In some examples, reducing the electromagnetic interference further includes arranging an anti-aliasing filter closer to the analog-to-digital converter than to an amplifier. In some examples, an anti-aliasing filter may include an anti-aliasing filter having at least two stages. In some examples, reducing the electromagnetic interference may include forming a shielding material around at least a portion of the analog signal chain circuit.
In some examples, a method includes reducing electromagnetic noise induced in an analog circuit by using a fully differential amplifier to reduce the effect of electromagnetic noise signals (e.g., signals present on both inputs of the fully differential amplifier) on the outputs of the fully differential amplifier, and further reducing the noise signal using a differential analog-to-digital converter configured to receive the outputs of the fully differential amplifier. The electromagnetic noise may be generated by a transmitter of a magnetic tracker system. The analog circuit may be configured to receive and process sensor signals from a plurality of electromyography sensors. In some examples, a method may further include using an anti-aliasing filter to further reduce the electromagnetic noise.
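The common-mode rejection underlying this method can be sketched in a few lines. The values below (a 1 mV differential EMG signal, 50 mV of coupled noise, a gain of 100) are illustrative assumptions, and the amplifier is idealized with perfect common-mode rejection; a physical fully differential amplifier rejects common-mode noise only up to its finite CMRR.

```python
def fully_differential_amp(v_pos, v_neg, gain=100.0):
    """Idealized fully differential amplifier: only the difference
    between the two inputs is amplified, so noise that appears equally
    on both inputs (common-mode noise) cancels in the output."""
    return gain * (v_pos - v_neg)

emg = 1e-3     # V: differential signal of interest (assumed)
noise = 50e-3  # V: tracker noise coupled equally onto both inputs (assumed)

out = fully_differential_amp(+emg / 2 + noise, -emg / 2 + noise)
print(round(out, 6))  # 0.1 -- the 50 mV common-mode noise is rejected
```

Because tracker noise couples similarly onto both conductors of a differential pair, it appears as a common-mode term and is subtracted out, while the body-derived signal, which appears differentially, is amplified.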
In some examples, the method may further include providing in the analog signal chain circuit a plurality of analog-to-digital converters, each of which is configured to process the output of a single signal channel of a plurality of signal channels. In some examples, reducing the electromagnetic interference may include integrating a magnetic tracking system receiver with the control device such that a distance between the magnetic tracking system receiver and a magnetic tracking system transmitter of the XR system is configured to reduce the electromagnetic interference.
Some embodiments are directed to an XR system. The XR system may include a headset configured to be worn on a user's head, and a control device configured to be worn on the user's arm or wrist. The control device may include an analog signal chain circuit including at least one amplifier configured to amplify analog electrical signals recorded from a body of the user, and an analog-to-digital converter configured to convert the amplified analog electrical signals to digital signals, where at least one component of the XR system is configured to reduce electromagnetic interference induced on one or more conductors within the analog signal chain circuit by an external AC magnetic field. All combinations of the concepts discussed herein are contemplated as being part of the disclosed subject matter (provided such concepts are not mutually inconsistent).
In some examples, an apparatus may include at least one physical processor, and physical memory including computer-executable instructions that, when executed by the physical processor, cause the physical processor to provide control signals to an extended reality headset (or other head-mounted device) based on detected EMG signals and/or to perform any of the other methods described herein.
In some examples, a non-transitory computer-readable medium may include one or more computer-executable instructions that, when executed by at least one processor of a computing device, cause the computing device to provide control signals to an extended reality headset (or other head-mounted device) based on detected EMG signals and/or to perform any of the other methods described herein.
Example 1. An example apparatus includes a head-mounted device that, when worn by a user, is configured to present an extended reality view to the user; and a control device including one or more electrodes that, when worn on a wrist of the user, contact the skin of the user, the control device including: an analog circuit including an amplifier configured to receive sensor signals from the one or more electrodes; an analog-to-digital converter (ADC) configured to receive analog sensor signals from the analog circuit and provide digital sensor signals; a processor configured to receive the digital sensor signals and provide control signals based on the digital sensor signals; and a control device antenna configured to transmit the control signals to the head-mounted device, wherein the head-mounted device is configured to modify the extended reality view in response to the control signals.
Example 2. The apparatus of example 1, wherein the amplifier includes a differential amplifier configured to remove electromagnetic noise generated within the analog circuit.
Example 3. The apparatus of any of examples 1-2, wherein the differential amplifier includes a fully differential amplifier.
Example 4. The apparatus of any of examples 1-3, wherein the ADC includes a differential ADC configured to remove electromagnetic noise generated within the analog circuit.
Example 5. The apparatus of any of examples 1-4, wherein the analog circuit includes a plurality of analog channels, and each analog channel has an associated one of a plurality of ADCs, wherein the plurality of ADCs includes the ADC.
Example 6. The apparatus of any of examples 1-5, wherein the analog circuit includes an anti-aliasing filter.
Example 7. The apparatus of any of examples 1-6, wherein the anti-aliasing filter is located proximate the ADC such that an electrical connection between the anti-aliasing filter and the ADC is shorter than approximately 1.5 cm.
Example 8. The apparatus of any of examples 1-7, wherein the amplifier is located proximate the ADC such that an electrical connection between an output of the amplifier and an input of the ADC is less than approximately 3 mm.
Example 9. The apparatus of any of examples 1-8, further including a magnetic tracker including a transmitter and a plurality of receivers.
Example 10. The apparatus of example 9, wherein the transmitter of the magnetic tracker is supported by the head-mounted device.
Example 11. The apparatus of any of examples 9-10, wherein at least one of the plurality of receivers of the magnetic tracker is configured to determine an orientation of a finger of the user.
Example 12. The apparatus of any of examples 9-11, wherein the plurality of receivers of the magnetic tracker are located within a control glove configured to be worn on a hand of the user, and the control glove is in communication with the control device.
Example 13. The apparatus of any of examples 1-12, wherein the control device includes a shielding layer configured to shield the analog circuit from electromagnetic radiation.
Example 14. The apparatus of example 13, wherein the shielding layer includes a ferrite material.
Example 15. The apparatus of any of examples 13-14, wherein the shielding layer includes at least one of the following: a metal layer, alloy layer, conductive polymer, carbon-filled nylon, conductive paint, conductive fabric, conductive plastic, conductive rubber, and/or conductive silicone.
Example 16. The apparatus of any of examples 1-15, wherein the head-mounted device includes at least one of a virtual-reality headset or augmented reality spectacles.
Example 17. The apparatus of any of examples 1-16, wherein the control device includes one or more electromyography sensors, wherein: the one or more electromyography sensors include the one or more electrodes, and the one or more electromyography sensors provide the sensor signals received by the analog circuit.
Example 18. The apparatus of any of examples 1-17, wherein the processor further receives inertial sensor signals from an inertial sensor, and the control signals are based on the digital sensor signals and the inertial sensor signals.
Example 19. An example method includes reducing electromagnetic noise in an analog circuit by: applying a fully differential amplifier to at least one output of the analog circuit; and further reducing the electromagnetic noise in the analog circuit by applying a differential analog-to-digital converter to at least one output of the fully differential amplifier, wherein the analog circuit is configured to receive and process sensor signals from one or more neuromuscular sensors.
Example 20. The method of example 19, further including applying an anti-aliasing filter to at least one output of the analog circuit to further reduce the electromagnetic noise.
Example 21. The method of any of examples 19-20, wherein the one or more neuromuscular sensors include sensors for detecting EMG signals.
Example 22. An example apparatus includes: a head-mounted device that includes a magnetic tracking system transmitter, and when worn by a user, is configured to present an extended reality view to the user; a control device including one or more electromyography electrodes that contact the skin of the user when worn by the user, the control device including: an analog circuit including an anti-aliasing filter and an amplifier configured to receive electromyography sensor signals from the one or more electrodes; a shielding layer configured to shield the analog circuit from electromagnetic radiation; an analog-to-digital converter (ADC) configured to receive analog sensor signals from the analog circuit and provide digital sensor signals; a processor configured to receive the digital sensor signals and provide control signals based on the digital sensor signals; a magnetic tracking system receiver; and a control device antenna configured to transmit the control signals to the head-mounted device, wherein the head-mounted device is configured to modify the extended reality view in response to the control signals.
Embodiments of the present disclosure may include or be implemented in conjunction with various types of artificial reality systems. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, for example, a virtual reality, an augmented reality, a mixed reality, a hybrid reality, or some combination and/or derivative thereof. Artificial-reality content may include completely computer-generated content or computer-generated content combined with captured (e.g., real-world) content. The artificial-reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional (3D) effect to the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, for example, create content in an artificial reality and/or are otherwise used in (e.g., to perform activities in) an artificial reality.
Artificial-reality systems may be implemented in a variety of different form factors and configurations. Some artificial reality systems may be designed to work without near-eye displays (NEDs). Other artificial reality systems may include an NED that also provides visibility into the real world (e.g., augmented-reality system 1300 in
Turning to
In some embodiments, augmented-reality system 1300 may include one or more sensors, such as sensor 1340. Sensor 1340 may represent one or more sensors of the same or different sensing modalities. Sensor 1340 may generate measurement signals in response to the motion of augmented-reality system 1300, and may be located on substantially any portion of frame 1310. Sensor 1340 may represent one or more of a position sensor, an inertial sensor such as an inertial measurement unit (IMU), a depth camera assembly, a structured light emitter and/or detector, or any combination thereof. In some embodiments, augmented-reality system 1300 may or may not include sensor 1340 or may include more than one sensor. In some embodiments, sensor 1340 may include an IMU, and the IMU may generate calibration data based on measurement signals from sensor 1340. Examples of sensor 1340 may include, without limitation, accelerometers, gyroscopes, magnetometers, other suitable types of sensors that detect motion, sensors used for error correction of the IMU, or some combination thereof. Augmented-reality system 1300 may also include a microphone array with a plurality of acoustic transducers 1320(A)-1320(J), referred to collectively as acoustic transducers 1320. Acoustic transducers 1320 may be transducers that detect air pressure variations induced by sound waves. Each acoustic transducer 1320 may be configured to detect sound and convert the detected sound into an electronic format (e.g., an analog or digital format). The microphone array in
In some embodiments, one or more of acoustic transducers 1320(A)-(F) may be used as output transducers (e.g., speakers). For example, acoustic transducers 1320(A) and/or 1320(B) may be earbuds or any other suitable type of headphone or speaker.
The configuration of acoustic transducers 1320 of the microphone array may vary. While augmented-reality system 1300 is shown in
Acoustic transducers 1320(A) and 1320(B) may be positioned on different parts of the user's ear, such as behind the pinna, behind the tragus, and/or within the auricle or fossa. Or, there may be additional acoustic transducers 1320 on or surrounding the ear in addition to acoustic transducers 1320 inside the ear canal. Having an acoustic transducer 1320 positioned next to an ear canal of a user may enable the microphone array to collect information on how sounds arrive at the ear canal. By positioning at least two of acoustic transducers 1320 on either side of a user's head (e.g., as binaural microphones), augmented-reality device 1300 may simulate binaural hearing and capture a 3D stereo sound field around a user's head. In some embodiments, acoustic transducers 1320(A) and 1320(B) may be connected to augmented-reality system 1300 via a wired connection 1330, and in other embodiments acoustic transducers 1320(A) and 1320(B) may be connected to augmented-reality system 1300 via a wireless connection (e.g., a BLUETOOTH connection). In still other embodiments, acoustic transducers 1320(A) and 1320(B) may not be used at all in conjunction with augmented-reality system 1300.
Acoustic transducers 1320 on frame 1310 may be positioned along the length of the temples, across the bridge, above or below display devices 1315(A) and 1315(B), or some combination thereof. Acoustic transducers 1320 may be oriented such that the microphone array is able to detect sounds in a wide range of directions surrounding the user wearing the augmented-reality system 1300. In some embodiments, an optimization process may be performed during manufacturing of augmented-reality system 1300 to determine relative positioning of each acoustic transducer 1320 in the microphone array.
In some examples, augmented-reality system 1300 may include or be connected to an external device (e.g., a paired device), such as neckband 1305. Neckband 1305 generally represents any type or form of paired device. Thus, the following discussion of neckband 1305 may also apply to various other paired devices, such as charging cases, smart watches, smart phones, wrist bands, other wearable devices, hand-held controllers, tablet computers, laptop computers, other external computer devices, etc.
As shown, neckband 1305 may be coupled to eyewear device 1302 via one or more connectors. The connectors may be wired or wireless and may include electrical and/or non-electrical (e.g., structural) components. In some cases, eyewear device 1302 and neckband 1305 may operate independently without any wired or wireless connection between them. While
Pairing external devices, such as neckband 1305, with augmented-reality eyewear devices may enable the eyewear devices to achieve the form factor of a pair of glasses while still providing sufficient battery and computation power for expanded capabilities. Some or all of the battery power, computational resources, and/or additional features of augmented-reality system 1300 may be provided by a paired device or shared between a paired device and an eyewear device, thus reducing the weight, heat profile, and form factor of the eyewear device overall while still retaining desired functionality. For example, neckband 1305 may allow components that would otherwise be included on an eyewear device to be included in neckband 1305 since users may tolerate a heavier weight load on their shoulders than they would tolerate on their heads. Neckband 1305 may also have a larger surface area over which to diffuse and disperse heat to the ambient environment. Thus, neckband 1305 may allow for greater battery and computation capacity than might otherwise have been possible on a stand-alone eyewear device. Since weight carried in neckband 1305 may be less invasive to a user than weight carried in eyewear device 1302, a user may tolerate wearing a lighter eyewear device and carrying or wearing the paired device for greater lengths of time than a user would tolerate wearing a heavy standalone eyewear device, thereby enabling users to more fully incorporate artificial reality environments into their day-to-day activities.
Neckband 1305 may be communicatively coupled with eyewear device 1302 and/or to other devices. These other devices may provide certain functions (e.g., tracking, localizing, depth mapping, processing, storage, etc.) to augmented-reality system 1300. In the embodiment of
Acoustic transducers 1320(I) and 1320(J) of neckband 1305 may be configured to detect sound and convert the detected sound into an electronic format (analog or digital). In the embodiment of
Controller 1325 of neckband 1305 may process information generated by the sensors on neckband 1305 and/or augmented-reality system 1300. For example, controller 1325 may process information from the microphone array that describes sounds detected by the microphone array. For each detected sound, controller 1325 may perform a direction-of-arrival (DOA) estimation to estimate a direction from which the detected sound arrived at the microphone array. As the microphone array detects sounds, controller 1325 may populate an audio data set with the information. In embodiments in which augmented-reality system 1300 includes an inertial measurement unit, controller 1325 may compute all inertial and spatial calculations from the IMU located on eyewear device 1302. A connector may convey information between augmented-reality system 1300 and neckband 1305 and between augmented-reality system 1300 and controller 1325. The information may be in the form of optical data, electrical data, wireless data, or any other transmittable data form. Moving the processing of information generated by augmented-reality system 1300 to neckband 1305 may reduce weight and heat in eyewear device 1302, making it more comfortable to the user.
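Direction-of-arrival estimation of the kind described for controller 1325 can take many forms; one textbook approach derives the arrival angle from the time delay between two microphones. The sketch below is a simplified far-field illustration under assumed values (15 cm microphone spacing, sound speed of 343 m/s), not the controller's actual algorithm.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature

def doa_from_delay(delay_s, mic_spacing_m):
    """Far-field direction of arrival (radians from broadside) for a
    two-microphone array: sin(theta) = c * delay / spacing."""
    return math.asin(SPEED_OF_SOUND * delay_s / mic_spacing_m)

# A source 30 degrees off broadside across an assumed 15 cm spacing
# produces this inter-microphone delay:
delay = 0.15 * math.sin(math.radians(30)) / SPEED_OF_SOUND
print(round(math.degrees(doa_from_delay(delay, 0.15)), 1))  # 30.0
```

Practical DOA estimators extend this idea across many microphone pairs and frequency bands, which is one reason the relative positioning of the transducers in the array matters.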
Power source 1335 in neckband 1305 may provide power to eyewear device 1302 and/or to neckband 1305. Power source 1335 may include, without limitation, lithium ion batteries, lithium-polymer batteries, primary lithium batteries, alkaline batteries, or any other form of power storage. In some cases, power source 1335 may be a wired power source. Including power source 1335 on neckband 1305 instead of on eyewear device 1302 may help better distribute the weight and heat generated by power source 1335.
As noted, some artificial reality systems may, instead of blending an artificial reality with actual reality, substantially replace one or more of a user's sensory perceptions of the real world with a virtual experience. One example of this type of system is a head-worn display system, such as virtual-reality system 1400 in
Artificial reality systems may include a variety of types of visual feedback mechanisms. For example, display devices in augmented-reality system 1300 and/or virtual-reality system 1400 may include one or more liquid crystal displays (LCDs), light emitting diode (LED) displays, organic LED (OLED) displays, digital light processing (DLP) micro-displays, liquid crystal on silicon (LCoS) micro-displays, and/or any other suitable type of display screen. Artificial reality systems may include a single display screen for both eyes or may provide a display screen for each eye, which may allow for additional flexibility for varifocal adjustments or for correcting a user's refractive error. Some artificial reality systems may also include optical subsystems having one or more lenses (e.g., conventional concave or convex lenses, Fresnel lenses, adjustable liquid lenses, etc.) through which a user may view a display screen. These optical subsystems may serve a variety of purposes, including to collimate (e.g., make an object appear at a greater distance than its physical distance), to magnify (e.g., make an object appear larger than its actual size), and/or to relay light (e.g., to the viewer's eyes). These optical subsystems may be used in a non-pupil-forming architecture (such as a single lens configuration that directly collimates light but results in so-called pincushion distortion) and/or a pupil-forming architecture (such as a multi-lens configuration that produces so-called barrel distortion to nullify pincushion distortion).
In addition to or instead of using display screens, some artificial reality systems may include one or more projection systems. For example, display devices in augmented-reality system 1300 and/or virtual-reality system 1400 may include micro-LED projectors that project light (using, e.g., a waveguide) into display devices, such as clear combiner lenses that allow ambient light to pass through. The display devices may refract the projected light toward a user's pupil and may enable a user to simultaneously view both artificial reality content and the real world. The display devices may accomplish this using any of a variety of different optical components, including waveguide components (e.g., holographic, planar, diffractive, polarized, and/or reflective waveguide elements), light-manipulation surfaces and elements (such as diffractive, reflective, and refractive elements and gratings), coupling elements, etc. Artificial reality systems may also be configured with any other suitable type or form of image projection system, such as retinal projectors used in virtual retina displays.
Artificial reality systems may also include various types of computer vision components and subsystems. For example, augmented-reality system 1300 and/or virtual-reality system 1400 may include one or more optical sensors, such as two-dimensional (2D) or 3D cameras, structured light transmitters and detectors, time-of-flight depth sensors, single-beam or sweeping laser rangefinders, 3D LiDAR sensors, and/or any other suitable type or form of optical sensor. An artificial reality system may process data from one or more of these sensors to identify a location of a user, to map the real world, to provide a user with context about real-world surroundings, and/or to perform a variety of other functions.
Artificial reality systems may also include one or more input and/or output audio transducers, which may capture audio from a user's environment and/or provide audible content to the user.
Some artificial reality systems may also include tactile (i.e., haptic) feedback systems that provide haptic sensations to a user.
By providing haptic sensations, audible content, and/or visual content, artificial reality systems may create an entire virtual experience or enhance a user's real-world experience in a variety of contexts and environments. For instance, artificial reality systems may assist or extend a user's perception, memory, or cognition within a particular environment. Some systems may enhance a user's interactions with other people in the real world or may enable more immersive interactions with other people in a virtual world. Artificial reality systems may also be used for educational purposes (e.g., for teaching or training in schools, hospitals, government organizations, military organizations, business enterprises, etc.), entertainment purposes (e.g., for playing video games, listening to music, watching video content, etc.), and/or for accessibility purposes (e.g., as hearing aids, visual aids, etc.). The embodiments disclosed herein may enable or enhance a user's artificial reality experience in one or more of these contexts and environments and/or in other contexts and environments.
Some augmented reality systems may map a user's and/or device's environment using techniques referred to as “simultaneous localization and mapping” (SLAM). SLAM mapping and location-identifying techniques may involve a variety of hardware and software tools that can create or update a map of an environment while simultaneously keeping track of a user's location within the mapped environment. SLAM may use many different types of sensors to create a map and determine a user's position within the map. Data from magnetic trackers may be used to create the map and determine the position of the user, and of portions of the user's body, within the mapped environment.
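The core SLAM cycle of predicting the user's pose from motion data and then correcting it against a mapped landmark can be sketched as follows. The landmark coordinates, gain, and function names below are illustrative assumptions, not part of this disclosure.

```python
import math

def predict(pose, velocity, heading, dt):
    """Dead-reckoning step: advance the pose estimate from motion data."""
    x, y = pose
    return (x + velocity * math.cos(heading) * dt,
            y + velocity * math.sin(heading) * dt)

def correct(pose, landmark, measured_range, gain=0.5):
    """Correction step: pull the pose estimate toward agreement with the
    range measured to a landmark at a known map position."""
    lx, ly = landmark
    dx, dy = pose[0] - lx, pose[1] - ly
    predicted_range = math.hypot(dx, dy)
    if predicted_range == 0:
        return pose
    scale = 1 + gain * (measured_range - predicted_range) / predicted_range
    return (lx + dx * scale, ly + dy * scale)

# One predict/correct cycle: start at the origin, move east for one second,
# then correct against a landmark mapped at (2, 0).
pose = predict((0.0, 0.0), velocity=1.0, heading=0.0, dt=1.0)  # -> (1.0, 0.0)
pose = correct(pose, landmark=(2.0, 0.0), measured_range=0.8)  # pulled toward landmark
```

A full system would replace the fixed gain with a filter (e.g., an extended Kalman filter) that weighs each sensor by its noise characteristics.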
SLAM techniques may, for example, implement optical sensors to determine a user's location. Radios, including WiFi, BLUETOOTH, global positioning system (GPS), cellular, or other communication devices, may also be used to determine a user's location relative to a radio transceiver or group of transceivers (e.g., a WiFi router or group of GPS satellites). Acoustic sensors such as microphone arrays or 2D or 3D sonar sensors may also be used to determine a user's location within an environment. Augmented reality and virtual reality devices (such as systems 1300 and 1400 described above) may incorporate any or all of these types of sensors to perform SLAM operations.
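Locating a user relative to a group of transceivers, as described above, amounts to trilateration: intersecting range circles around transceivers at known positions. The sketch below is a minimal 2D version with hypothetical anchor positions; subtracting the first circle equation from the others linearizes the system so it can be solved directly.

```python
import math

def trilaterate(anchors, ranges):
    """Estimate a 2D position from ranges to three transceivers ("anchors")
    at known positions. Subtracting the first circle equation from the other
    two yields a linear 2x2 system, solved here with Cramer's rule:
    2(xi - x0) x + 2(yi - y0) y = r0^2 - ri^2 + xi^2 - x0^2 + yi^2 - y0^2"""
    (x0, y0), (x1, y1), (x2, y2) = anchors
    r0, r1, r2 = ranges
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = r0**2 - r1**2 + x1**2 - x0**2 + y1**2 - y0**2
    b2 = r0**2 - r2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a11 * a22 - a12 * a21  # nonzero when anchors are not collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Three anchors at known positions; ranges measured from a device at (1, 2).
anchors = [(0, 0), (4, 0), (0, 4)]
ranges = [math.hypot(1 - ax, 2 - ay) for ax, ay in anchors]
print(trilaterate(anchors, ranges))  # -> (1.0, 2.0)
```

With more than three noisy range measurements, the same linearized equations are typically solved by least squares instead.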
As detailed above, the computing devices and systems described and/or illustrated herein broadly represent any type or form of computing device or system capable of executing computer-readable instructions, such as those contained within the modules described herein. In their most basic configuration, these computing device(s) may each include at least one memory device and at least one physical processor.
In some examples, the term “memory device” generally refers to any type or form of volatile or non-volatile storage device or medium capable of storing data and/or computer-readable instructions. In one example, a memory device may store, load, and/or maintain one or more of the modules described herein. Examples of memory devices include, without limitation, Random Access Memory (RAM), Read Only Memory (ROM), flash memory, Hard Disk Drives (HDDs), Solid-State Drives (SSDs), optical disk drives, caches, variations or combinations of one or more of the same, or any other suitable storage memory.
In some examples, the term “physical processor” generally refers to any type or form of hardware-implemented processing unit capable of interpreting and/or executing computer-readable instructions. In one example, a physical processor may access and/or modify one or more modules stored in the above-described memory device. Examples of physical processors include, without limitation, microprocessors, microcontrollers, Central Processing Units (CPUs), Field-Programmable Gate Arrays (FPGAs) that implement softcore processors, Application-Specific Integrated Circuits (ASICs), portions of one or more of the same, variations or combinations of one or more of the same, or any other suitable physical processor.
Although illustrated as separate elements, the modules described and/or illustrated herein may represent portions of a single module or application. In addition, in certain embodiments one or more of these modules may represent one or more software applications or programs that, when executed by a computing device, may cause the computing device to perform one or more tasks. For example, one or more of the modules described and/or illustrated herein may represent modules stored and configured to run on one or more of the computing devices or systems described and/or illustrated herein. One or more of these modules may also represent all or portions of one or more special-purpose computers configured to perform one or more tasks.
In addition, one or more of the modules described herein may transform data, physical devices, and/or representations of physical devices from one form to another. For example, one or more of the modules recited herein may receive data to be transformed, transform the data, output a result of the transformation, use the result of the transformation to perform a function, and store the result of the transformation. Data may include physiological data from a user, such as neuromuscular signals, eye tracking data, or other data. A function may include control of a computerized device, selection and/or control of objects within an environment, such as an augmented reality environment, and the like. Additionally or alternatively, one or more of the modules recited herein may transform a processor, volatile memory, non-volatile memory, and/or any other portion of a physical computing device from one form to another by executing on the computing device, storing data on the computing device, and/or otherwise interacting with the computing device.
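One of the modules described above might, for example, transform raw neuromuscular signal data into a control signal. The sketch below is only an illustrative assumption of such a transform — rectification, a moving-average envelope, and a threshold — and its function name, window size, and threshold are hypothetical rather than taken from this disclosure.

```python
def emg_to_control(samples, window=4, threshold=0.5):
    """Illustrative transform module: rectify raw EMG samples, smooth them
    with a moving-average envelope, and emit 1 (activate) or 0 (idle)
    depending on whether the envelope crosses the threshold."""
    rectified = [abs(s) for s in samples]  # EMG polarity carries no intent
    controls = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)
        envelope = sum(rectified[lo:i + 1]) / (i + 1 - lo)
        controls.append(1 if envelope >= threshold else 0)
    return controls

# Low-amplitude noise produces no control events; a burst of muscle
# activity drives the envelope over the threshold and activates control.
signal = [0.05, -0.1, 0.08, 0.9, -1.1, 1.0, 0.07, -0.04]
print(emg_to_control(signal))  # -> [0, 0, 0, 0, 1, 1, 1, 1]
```

The smoothing window is what gives such a module its noise tolerance: isolated noise spikes are averaged away, while sustained activation is preserved.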
In some embodiments, the term “computer-readable medium” generally refers to any form of device, carrier, or medium capable of storing or carrying computer-readable instructions. Examples of computer-readable media include, without limitation, transmission-type media, such as carrier waves, and non-transitory-type media, such as magnetic-storage media (e.g., hard disk drives, tape drives, and floppy disks), optical-storage media (e.g., Compact Disks (CDs), Digital Video Disks (DVDs), and BLU-RAY disks), electronic-storage media (e.g., solid-state drives and flash media), and other distribution systems.
The process parameters and sequence of the steps described and/or illustrated herein are given by way of example only and may be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed. The various exemplary methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps in addition to those disclosed.
The preceding description has been provided to enable others skilled in the art to best utilize various aspects of the exemplary embodiments disclosed herein. This exemplary description is not intended to be exhaustive or to limit the present disclosure to any precise form disclosed. Many modifications and variations are possible without departing from the spirit and scope of the present disclosure. The embodiments disclosed herein should be considered in all respects illustrative and not restrictive. Reference should be made to the appended claims and their equivalents in determining the scope of the present disclosure.
Unless otherwise noted, the terms “connected to” and “coupled to” (and their derivatives), as used in the specification and claims, are to be construed as permitting both direct and indirect (i.e., via other elements or components) connection. In addition, the terms “a” or “an,” as used in the specification and claims, are to be construed as meaning “at least one of.” Finally, for ease of use, the terms “including” and “having” (and their derivatives), as used in the specification and claims, are interchangeable with and have the same meaning as the word “comprising.”
This application claims the benefit of U.S. Provisional Application No. 62/826,574, filed Mar. 29, 2019, the disclosure of which is incorporated, in its entirety, by this reference.
Number | Name | Date | Kind |
---|---|---|---|
1411995 | Dull | Apr 1922 | A |
3408133 | Lee | Oct 1968 | A |
3580243 | Johnson | May 1971 | A |
3620208 | Wayne et al. | Nov 1971 | A |
3712716 | Cornsweet et al. | Jan 1973 | A |
3735425 | Hoshall et al. | May 1973 | A |
3880146 | Everett et al. | Apr 1975 | A |
4055168 | Miller et al. | Oct 1977 | A |
4602639 | Hoogendoorn et al. | Jul 1986 | A |
4705408 | Jordi | Nov 1987 | A |
4817064 | Milles | Mar 1989 | A |
4896120 | Kamil | Jan 1990 | A |
4978213 | El Hage | Dec 1990 | A |
5003978 | Dunseath, Jr. | Apr 1991 | A |
D322227 | Warhol | Dec 1991 | S |
5081852 | Cox | Jan 1992 | A |
5103323 | Magarinos et al. | Apr 1992 | A |
5231674 | Cleveland et al. | Jul 1993 | A |
5251189 | Thorp | Oct 1993 | A |
D348660 | Parsons | Jul 1994 | S |
5445869 | Ishikawa et al. | Aug 1995 | A |
5462065 | Cusimano | Oct 1995 | A |
5467104 | Furness, III et al. | Nov 1995 | A |
5482051 | Reddy et al. | Jan 1996 | A |
5589956 | Morishima et al. | Dec 1996 | A |
5596339 | Furness, III et al. | Jan 1997 | A |
5605059 | Woodward | Feb 1997 | A |
5625577 | Tosiyasu et al. | Apr 1997 | A |
5683404 | Johnson | Nov 1997 | A |
5742421 | Wells et al. | Apr 1998 | A |
6005548 | Latypov et al. | Dec 1999 | A |
6008781 | Furness, III et al. | Dec 1999 | A |
6009210 | Kang | Dec 1999 | A |
6027216 | Guyton et al. | Feb 2000 | A |
6032530 | Hock | Mar 2000 | A |
D422617 | Simioni | Apr 2000 | S |
6066794 | Longo | May 2000 | A |
6184847 | Fateh et al. | Feb 2001 | B1 |
6236476 | Son et al. | May 2001 | B1 |
6238338 | DeLuca et al. | May 2001 | B1 |
6244873 | Hill et al. | Jun 2001 | B1 |
6317103 | Furness, III et al. | Nov 2001 | B1 |
6377277 | Yamamoto | Apr 2002 | B1 |
D459352 | Giovanniello | Jun 2002 | S |
6411843 | Zarychta | Jun 2002 | B1 |
6487906 | Hock | Dec 2002 | B1 |
6510333 | Licata et al. | Jan 2003 | B1 |
6527711 | Stivoric et al. | Mar 2003 | B1 |
6619836 | Silvant et al. | Sep 2003 | B1 |
6639570 | Furness, III et al. | Oct 2003 | B2 |
6658287 | Litt et al. | Dec 2003 | B1 |
6720984 | Jorgensen et al. | Apr 2004 | B1 |
6743982 | Biegelsen et al. | Jun 2004 | B2 |
6771294 | Pulli et al. | Aug 2004 | B1 |
6774885 | Even-Zohar | Aug 2004 | B1 |
6807438 | Brun Del Re et al. | Oct 2004 | B1 |
D502661 | Rapport | Mar 2005 | S |
D502662 | Rapport | Mar 2005 | S |
6865409 | Getsla et al. | Mar 2005 | B2 |
D503646 | Rapport | Apr 2005 | S |
6880364 | Vidolin et al. | Apr 2005 | B1 |
6901286 | Sinderby et al. | May 2005 | B1 |
6927343 | Watanabe et al. | Aug 2005 | B2 |
6942621 | Avinash et al. | Sep 2005 | B2 |
6965842 | Rekimoto | Nov 2005 | B2 |
6972734 | Ohshima et al. | Dec 2005 | B1 |
6984208 | Zheng | Jan 2006 | B2 |
7022919 | Brist et al. | Apr 2006 | B2 |
7028507 | Rapport | Apr 2006 | B2 |
7086218 | Pasach | Aug 2006 | B1 |
7089148 | Bachmann et al. | Aug 2006 | B1 |
D535401 | Travis et al. | Jan 2007 | S |
7173437 | Hervieux et al. | Feb 2007 | B2 |
7209114 | Radley-Smith | Apr 2007 | B2 |
D543212 | Marks | May 2007 | S |
7265298 | Maghribi et al. | Sep 2007 | B2 |
7271774 | Puuri | Sep 2007 | B2 |
7333090 | Tanaka et al. | Feb 2008 | B2 |
7351975 | Brady et al. | Apr 2008 | B2 |
7450107 | Radley-Smith | Nov 2008 | B2 |
7473888 | Wine et al. | Jan 2009 | B2 |
7491892 | Wagner et al. | Feb 2009 | B2 |
7517725 | Reis | Apr 2009 | B2 |
7558622 | Tran | Jul 2009 | B2 |
7574253 | Edney et al. | Aug 2009 | B2 |
7580742 | Tan et al. | Aug 2009 | B2 |
7596393 | Jung et al. | Sep 2009 | B2 |
7618260 | Daniel et al. | Nov 2009 | B2 |
7636549 | Ma et al. | Dec 2009 | B2 |
7640007 | Chen et al. | Dec 2009 | B2 |
7660126 | Cho et al. | Feb 2010 | B2 |
7684105 | Lamontagne et al. | Mar 2010 | B2 |
7747113 | Mukawa et al. | Jun 2010 | B2 |
7761390 | Ford | Jul 2010 | B2 |
7773111 | Cleveland et al. | Aug 2010 | B2 |
7787946 | Stahmann et al. | Aug 2010 | B2 |
7805386 | Greer | Sep 2010 | B2 |
7809435 | Ettare et al. | Oct 2010 | B1 |
7844310 | Anderson | Nov 2010 | B2 |
D628616 | Yuan | Dec 2010 | S |
7850306 | Uusitalo et al. | Dec 2010 | B2 |
7870211 | Pascal et al. | Jan 2011 | B2 |
D633939 | Puentes et al. | Mar 2011 | S |
D634771 | Fuchs | Mar 2011 | S |
7901368 | Flaherty et al. | Mar 2011 | B2 |
7925100 | Howell et al. | Apr 2011 | B2 |
7948763 | Chuang | May 2011 | B2 |
D640314 | Yang | Jun 2011 | S |
D643428 | Janky et al. | Aug 2011 | S |
D646192 | Woode | Oct 2011 | S |
D649177 | Cho et al. | Nov 2011 | S |
8054061 | Prance et al. | Nov 2011 | B2 |
D654622 | Hsu | Feb 2012 | S |
8120828 | Schwerdtner | Feb 2012 | B2 |
8170656 | Tan et al. | May 2012 | B2 |
8179604 | Prada Gomez et al. | May 2012 | B1 |
8188937 | Amafuji et al. | May 2012 | B1 |
8190249 | Gharieb et al. | May 2012 | B1 |
D661613 | Demeglio | Jun 2012 | S |
8203502 | Chi et al. | Jun 2012 | B1 |
8207473 | Axisa et al. | Jun 2012 | B2 |
8212859 | Tang et al. | Jul 2012 | B2 |
D667482 | Healy et al. | Sep 2012 | S |
D669522 | Klinar et al. | Oct 2012 | S |
D669523 | Wakata et al. | Oct 2012 | S |
D671590 | Klinar et al. | Nov 2012 | S |
8311623 | Sanger | Nov 2012 | B2 |
8348538 | Van Loenen et al. | Jan 2013 | B2 |
8351651 | Lee | Jan 2013 | B2 |
8355671 | Kramer et al. | Jan 2013 | B2 |
8384683 | Luo | Feb 2013 | B2 |
8386025 | Hoppe | Feb 2013 | B2 |
8389862 | Arora et al. | Mar 2013 | B2 |
8421634 | Tan et al. | Apr 2013 | B2 |
8427977 | Workman et al. | Apr 2013 | B2 |
D682343 | Waters | May 2013 | S |
D682727 | Bulgari | May 2013 | S |
8435191 | Barboutis et al. | May 2013 | B2 |
8437844 | Syed Momen et al. | May 2013 | B2 |
8447704 | Tan et al. | May 2013 | B2 |
D685019 | Li | Jun 2013 | S |
8467270 | Gossweiler, III et al. | Jun 2013 | B2 |
8469741 | Oster et al. | Jun 2013 | B2 |
D687087 | Iurilli | Jul 2013 | S |
8484022 | Vanhoucke | Jul 2013 | B1 |
D689862 | Liu | Sep 2013 | S |
D692941 | Klinar et al. | Nov 2013 | S |
8591411 | Banet et al. | Nov 2013 | B2 |
D695333 | Farnam et al. | Dec 2013 | S |
D695454 | Moore | Dec 2013 | S |
8620361 | Bailey et al. | Dec 2013 | B2 |
8624124 | Koo et al. | Jan 2014 | B2 |
8634119 | Bablumyan et al. | Jan 2014 | B2 |
D701555 | Markovitz et al. | Mar 2014 | S |
8666212 | Amirparviz | Mar 2014 | B1 |
8702629 | Giuffrida et al. | Apr 2014 | B2 |
8704882 | Turner | Apr 2014 | B2 |
D704248 | Dichiara | May 2014 | S |
8718980 | Garudadri et al. | May 2014 | B2 |
8743052 | Keller et al. | Jun 2014 | B1 |
8744543 | Li et al. | Jun 2014 | B2 |
8754862 | Zaliva | Jun 2014 | B2 |
8777668 | Ikeda et al. | Jul 2014 | B2 |
D716457 | Brefka et al. | Oct 2014 | S |
D717685 | Bailey et al. | Nov 2014 | S |
8879276 | Wang | Nov 2014 | B2 |
8880163 | Barachant et al. | Nov 2014 | B2 |
8883287 | Boyce et al. | Nov 2014 | B2 |
8890875 | Jammes et al. | Nov 2014 | B2 |
8892479 | Tan et al. | Nov 2014 | B2 |
8895865 | Lenahan et al. | Nov 2014 | B2 |
D719568 | Heinrich et al. | Dec 2014 | S |
D719570 | Heinrich et al. | Dec 2014 | S |
8912094 | Koo et al. | Dec 2014 | B2 |
8914472 | Lee et al. | Dec 2014 | B1 |
8922481 | Kauffmann et al. | Dec 2014 | B1 |
D723093 | Li | Feb 2015 | S |
8954135 | Yuen et al. | Feb 2015 | B2 |
D724647 | Rohrbach | Mar 2015 | S |
8970571 | Wong et al. | Mar 2015 | B1 |
8971023 | Olsson et al. | Mar 2015 | B2 |
9018532 | Wesselmann et al. | Apr 2015 | B2 |
9037530 | Tan et al. | May 2015 | B2 |
9086687 | Park et al. | Jul 2015 | B2 |
9092664 | Forutanpour et al. | Jul 2015 | B2 |
D736664 | Paradise et al. | Aug 2015 | S |
9107586 | Tran | Aug 2015 | B2 |
D738373 | Davies et al. | Sep 2015 | S |
9135708 | Ebisawa | Sep 2015 | B2 |
9146730 | Lazar | Sep 2015 | B2 |
D741855 | Park et al. | Oct 2015 | S |
9170674 | Forutanpour et al. | Oct 2015 | B2 |
D742272 | Bailey et al. | Nov 2015 | S |
D742874 | Cheng et al. | Nov 2015 | S |
D743963 | Osterhout | Nov 2015 | S |
9182826 | Powledge et al. | Nov 2015 | B2 |
9211417 | Heldman et al. | Dec 2015 | B2 |
9218574 | Phillipps et al. | Dec 2015 | B2 |
D747714 | Erbeus | Jan 2016 | S |
D747759 | Ho | Jan 2016 | S |
9235934 | Mandella et al. | Jan 2016 | B2 |
9240069 | Li | Jan 2016 | B1 |
D750623 | Park et al. | Mar 2016 | S |
D751065 | Magi | Mar 2016 | S |
9278453 | Assad | Mar 2016 | B2 |
9299248 | Lake et al. | Mar 2016 | B2 |
D756359 | Bailey et al. | May 2016 | S |
9329694 | Slonneger | May 2016 | B2 |
9341659 | Poupyrev et al. | May 2016 | B2 |
9349280 | Baldwin et al. | May 2016 | B2 |
9351653 | Harrison | May 2016 | B1 |
D758476 | Ho | Jun 2016 | S |
D760313 | Ho et al. | Jun 2016 | S |
9367139 | Ataee et al. | Jun 2016 | B2 |
9372535 | Bailey et al. | Jun 2016 | B2 |
9389694 | Ataee et al. | Jul 2016 | B2 |
9393418 | Giuffrida et al. | Jul 2016 | B2 |
9402582 | Parviz et al. | Aug 2016 | B1 |
9408316 | Bailey et al. | Aug 2016 | B2 |
9418927 | Axisa et al. | Aug 2016 | B2 |
D766895 | Choi | Sep 2016 | S |
9439566 | Arne et al. | Sep 2016 | B2 |
D768627 | Rochat et al. | Oct 2016 | S |
9459697 | Bedikian et al. | Oct 2016 | B2 |
9472956 | Michaelis et al. | Oct 2016 | B2 |
9477313 | Mistry et al. | Oct 2016 | B2 |
D771735 | Lee et al. | Nov 2016 | S |
9483123 | Aleem et al. | Nov 2016 | B2 |
9529434 | Choi et al. | Dec 2016 | B2 |
D780828 | Bonaventura et al. | Mar 2017 | S |
D780829 | Bonaventura et al. | Mar 2017 | S |
9597015 | McNames et al. | Mar 2017 | B2 |
9600030 | Bailey et al. | Mar 2017 | B2 |
9612661 | Wagner et al. | Apr 2017 | B2 |
9613262 | Holz | Apr 2017 | B2 |
9652047 | Mullins et al. | May 2017 | B2 |
9654477 | Kotamraju | May 2017 | B1 |
9659403 | Horowitz | May 2017 | B1 |
9687168 | John | Jun 2017 | B2 |
9696795 | Marcolina et al. | Jul 2017 | B2 |
9720515 | Wagner et al. | Aug 2017 | B2 |
9741169 | Holz | Aug 2017 | B1 |
9766709 | Holz | Sep 2017 | B2 |
9785247 | Horowitz et al. | Oct 2017 | B1 |
9788789 | Bailey | Oct 2017 | B2 |
9807221 | Bailey et al. | Oct 2017 | B2 |
9864431 | Keskin et al. | Jan 2018 | B2 |
9867548 | Le et al. | Jan 2018 | B2 |
9880632 | Ataee et al. | Jan 2018 | B2 |
9891718 | Connor | Feb 2018 | B2 |
9921641 | Worley, III et al. | Mar 2018 | B1 |
10042422 | Morun et al. | Aug 2018 | B2 |
10070799 | Ang et al. | Sep 2018 | B2 |
10078435 | Noel | Sep 2018 | B2 |
10101809 | Morun et al. | Oct 2018 | B2 |
10152082 | Bailey | Dec 2018 | B2 |
10185416 | Mistry et al. | Jan 2019 | B2 |
10188309 | Morun et al. | Jan 2019 | B2 |
10199008 | Aleem et al. | Feb 2019 | B2 |
10203751 | Keskin et al. | Feb 2019 | B2 |
10216274 | Chapeskie et al. | Feb 2019 | B2 |
10251577 | Morun et al. | Apr 2019 | B2 |
10310601 | Morun et al. | Jun 2019 | B2 |
10331210 | Morun et al. | Jun 2019 | B2 |
10362958 | Morun et al. | Jul 2019 | B2 |
10409371 | Kaifosh et al. | Sep 2019 | B2 |
10429928 | Morun et al. | Oct 2019 | B2 |
10437335 | Daniels | Oct 2019 | B2 |
10460455 | Giurgica-Tiron et al. | Oct 2019 | B2 |
10489986 | Kaifosh et al. | Nov 2019 | B2 |
10496168 | Kaifosh et al. | Dec 2019 | B2 |
10504286 | Kaifosh et al. | Dec 2019 | B2 |
10520378 | Brown et al. | Dec 2019 | B1 |
10528135 | Bailey et al. | Jan 2020 | B2 |
10558273 | Park et al. | Feb 2020 | B2 |
10592001 | Berenzweig et al. | Mar 2020 | B2 |
10610737 | Crawford | Apr 2020 | B1 |
10676083 | De Sapio et al. | Jun 2020 | B1 |
10687759 | Guo et al. | Jun 2020 | B2 |
10905350 | Berenzweig et al. | Feb 2021 | B2 |
10905383 | Barachant | Feb 2021 | B2 |
10937414 | Berenzweig et al. | Mar 2021 | B2 |
10990174 | Kaifosh et al. | Apr 2021 | B2 |
11009951 | Bailey et al. | May 2021 | B2 |
11150730 | Anderson et al. | Oct 2021 | B1 |
20010033402 | Popovich | Oct 2001 | A1 |
20020003627 | Rieder | Jan 2002 | A1 |
20020009972 | Amento et al. | Jan 2002 | A1 |
20020030636 | Richards | Mar 2002 | A1 |
20020032386 | Sackner et al. | Mar 2002 | A1 |
20020077534 | DuRousseau | Jun 2002 | A1 |
20020094701 | Biegelsen et al. | Jul 2002 | A1 |
20020120415 | Millott et al. | Aug 2002 | A1 |
20020120916 | Snider, Jr. | Aug 2002 | A1 |
20020198472 | Kramer | Dec 2002 | A1 |
20030030595 | Radley-Smith | Feb 2003 | A1 |
20030036691 | Stanaland et al. | Feb 2003 | A1 |
20030051505 | Robertson et al. | Mar 2003 | A1 |
20030144586 | Tsubata | Jul 2003 | A1 |
20030144829 | Geatz et al. | Jul 2003 | A1 |
20030171921 | Manabe et al. | Sep 2003 | A1 |
20030182630 | Saund et al. | Sep 2003 | A1 |
20030184544 | Prudent | Oct 2003 | A1 |
20040010210 | Avinash et al. | Jan 2004 | A1 |
20040024312 | Zheng | Feb 2004 | A1 |
20040054273 | Finneran et al. | Mar 2004 | A1 |
20040068409 | Tanaka et al. | Apr 2004 | A1 |
20040073104 | Brun Del Re et al. | Apr 2004 | A1 |
20040080499 | Lui | Apr 2004 | A1 |
20040092839 | Shin et al. | May 2004 | A1 |
20040194500 | Rapport | Oct 2004 | A1 |
20040210165 | Marmaropoulos et al. | Oct 2004 | A1 |
20040243342 | Rekimoto | Dec 2004 | A1 |
20040254617 | Hemmerling et al. | Dec 2004 | A1 |
20050005637 | Rapport | Jan 2005 | A1 |
20050012715 | Ford | Jan 2005 | A1 |
20050070227 | Shen et al. | Mar 2005 | A1 |
20050070791 | Edney et al. | Mar 2005 | A1 |
20050115561 | Stahmann et al. | Jun 2005 | A1 |
20050119701 | Lauter et al. | Jun 2005 | A1 |
20050177038 | Kolpin et al. | Aug 2005 | A1 |
20050179644 | Alsio et al. | Aug 2005 | A1 |
20060018833 | Murphy et al. | Jan 2006 | A1 |
20060037359 | Stinespring | Feb 2006 | A1 |
20060058699 | Vitiello et al. | Mar 2006 | A1 |
20060061544 | Min et al. | Mar 2006 | A1 |
20060121958 | Jung et al. | Jun 2006 | A1 |
20060129057 | Maekawa et al. | Jun 2006 | A1 |
20060132705 | Li | Jun 2006 | A1 |
20060149338 | Flaherty et al. | Jul 2006 | A1 |
20060211956 | Sankai | Sep 2006 | A1 |
20060238707 | Elvesjo et al. | Oct 2006 | A1 |
20070009151 | Pittman et al. | Jan 2007 | A1 |
20070016265 | Davoodi et al. | Jan 2007 | A1 |
20070023662 | Brady et al. | Feb 2007 | A1 |
20070078308 | Daly | Apr 2007 | A1 |
20070132785 | Ebersole, Jr. et al. | Jun 2007 | A1 |
20070148624 | Nativ | Jun 2007 | A1 |
20070172797 | Hada et al. | Jul 2007 | A1 |
20070177770 | Derchak et al. | Aug 2007 | A1 |
20070185697 | Tan et al. | Aug 2007 | A1 |
20070256494 | Nakamura et al. | Nov 2007 | A1 |
20070276270 | Tran | Nov 2007 | A1 |
20070279852 | Daniel et al. | Dec 2007 | A1 |
20070285399 | Lund | Dec 2007 | A1 |
20080001735 | Tran | Jan 2008 | A1 |
20080032638 | Anderson | Feb 2008 | A1 |
20080051673 | Kong et al. | Feb 2008 | A1 |
20080052643 | Ike et al. | Feb 2008 | A1 |
20080058668 | Seyed Momen et al. | Mar 2008 | A1 |
20080103639 | Troy et al. | May 2008 | A1 |
20080103769 | Schultz et al. | May 2008 | A1 |
20080136775 | Conant | Jun 2008 | A1 |
20080152217 | Greer | Jun 2008 | A1 |
20080163130 | Westerman | Jul 2008 | A1 |
20080214360 | Stirling et al. | Sep 2008 | A1 |
20080221487 | Zohar et al. | Sep 2008 | A1 |
20080262772 | Luinge et al. | Oct 2008 | A1 |
20080278497 | Jammes et al. | Nov 2008 | A1 |
20080285805 | Luinge et al. | Nov 2008 | A1 |
20090005700 | Joshi et al. | Jan 2009 | A1 |
20090007597 | Hanevold | Jan 2009 | A1 |
20090027337 | Hildreth | Jan 2009 | A1 |
20090031757 | Harding | Feb 2009 | A1 |
20090040016 | Ikeda | Feb 2009 | A1 |
20090051544 | Niknejad | Feb 2009 | A1 |
20090079607 | Denison et al. | Mar 2009 | A1 |
20090079813 | Hildreth | Mar 2009 | A1 |
20090082692 | Hale et al. | Mar 2009 | A1 |
20090082701 | Zohar et al. | Mar 2009 | A1 |
20090085864 | Kutliroff et al. | Apr 2009 | A1 |
20090102580 | Uchaykin | Apr 2009 | A1 |
20090109241 | Tsujimoto | Apr 2009 | A1 |
20090112080 | Matthews | Apr 2009 | A1 |
20090124881 | Rytky | May 2009 | A1 |
20090147004 | Ramon et al. | Jun 2009 | A1 |
20090179824 | Tsujimoto et al. | Jul 2009 | A1 |
20090189864 | Walker et al. | Jul 2009 | A1 |
20090189867 | Krah et al. | Jul 2009 | A1 |
20090195497 | Fitzgerald et al. | Aug 2009 | A1 |
20090204031 | Mcnames et al. | Aug 2009 | A1 |
20090207464 | Wiltshire et al. | Aug 2009 | A1 |
20090209878 | Sanger | Aug 2009 | A1 |
20090251407 | Flake et al. | Oct 2009 | A1 |
20090258669 | Nie et al. | Oct 2009 | A1 |
20090265671 | Sachs et al. | Oct 2009 | A1 |
20090318785 | Ishikawa et al. | Dec 2009 | A1 |
20090319230 | Case, Jr. et al. | Dec 2009 | A1 |
20090322653 | Putilin et al. | Dec 2009 | A1 |
20090326406 | Tan et al. | Dec 2009 | A1 |
20090327171 | Tan et al. | Dec 2009 | A1 |
20100030532 | Arora et al. | Feb 2010 | A1 |
20100041974 | Ting et al. | Feb 2010 | A1 |
20100063794 | Hernandez-Rebollar | Mar 2010 | A1 |
20100066664 | Son et al. | Mar 2010 | A1 |
20100106044 | Linderman | Apr 2010 | A1 |
20100113910 | Brauers et al. | May 2010 | A1 |
20100142015 | Kuwahara et al. | Jun 2010 | A1 |
20100149073 | Chaum et al. | Jun 2010 | A1 |
20100150415 | Atkinson et al. | Jun 2010 | A1 |
20100228487 | Leuthardt et al. | Sep 2010 | A1 |
20100234696 | Li et al. | Sep 2010 | A1 |
20100240981 | Barboutis et al. | Sep 2010 | A1 |
20100249635 | Van Der Reijden | Sep 2010 | A1 |
20100280628 | Sankai | Nov 2010 | A1 |
20100292595 | Paul | Nov 2010 | A1 |
20100292606 | Prakash et al. | Nov 2010 | A1 |
20100292617 | Lei et al. | Nov 2010 | A1 |
20100293115 | Seyed Momen | Nov 2010 | A1 |
20100306713 | Geisner et al. | Dec 2010 | A1 |
20100315266 | Gunawardana et al. | Dec 2010 | A1 |
20100317958 | Beck et al. | Dec 2010 | A1 |
20110007035 | Shai | Jan 2011 | A1 |
20110018754 | Tojima et al. | Jan 2011 | A1 |
20110025982 | Takahashi | Feb 2011 | A1 |
20110054360 | Son et al. | Mar 2011 | A1 |
20110065319 | Oster et al. | Mar 2011 | A1 |
20110066381 | Garudadri et al. | Mar 2011 | A1 |
20110072510 | Cheswick | Mar 2011 | A1 |
20110077484 | Van Slyke et al. | Mar 2011 | A1 |
20110082838 | Niemela | Apr 2011 | A1 |
20110092826 | Lee et al. | Apr 2011 | A1 |
20110119216 | Wigdor | May 2011 | A1 |
20110133934 | Tan et al. | Jun 2011 | A1 |
20110134026 | Kang et al. | Jun 2011 | A1 |
20110151974 | Deaguero | Jun 2011 | A1 |
20110166434 | Gargiulo | Jul 2011 | A1 |
20110172503 | Knepper et al. | Jul 2011 | A1 |
20110173204 | Murillo et al. | Jul 2011 | A1 |
20110173574 | Clavin et al. | Jul 2011 | A1 |
20110181527 | Capela et al. | Jul 2011 | A1 |
20110205242 | Friesen | Aug 2011 | A1 |
20110213278 | Horak et al. | Sep 2011 | A1 |
20110221672 | Osterhout et al. | Sep 2011 | A1 |
20110224507 | Banet et al. | Sep 2011 | A1 |
20110224556 | Moon et al. | Sep 2011 | A1 |
20110224564 | Moon et al. | Sep 2011 | A1 |
20110230782 | Bartol et al. | Sep 2011 | A1 |
20110248914 | Sherr | Oct 2011 | A1 |
20110262002 | Lee | Oct 2011 | A1 |
20110270135 | Dooley et al. | Nov 2011 | A1 |
20110295100 | Hegde et al. | Dec 2011 | A1 |
20110313762 | Ben-David et al. | Dec 2011 | A1 |
20120002256 | Lacoste et al. | Jan 2012 | A1 |
20120007821 | Zaliva | Jan 2012 | A1 |
20120029322 | Wartena et al. | Feb 2012 | A1 |
20120051005 | Vanfleteren et al. | Mar 2012 | A1 |
20120052268 | Axisa et al. | Mar 2012 | A1 |
20120053439 | Ylostalo et al. | Mar 2012 | A1 |
20120066163 | Balls et al. | Mar 2012 | A1 |
20120071092 | Pasquero et al. | Mar 2012 | A1 |
20120071780 | Barachant | Mar 2012 | A1 |
20120101357 | Hoskuldsson et al. | Apr 2012 | A1 |
20120117514 | Kim et al. | May 2012 | A1 |
20120139817 | Freeman | Jun 2012 | A1 |
20120157789 | Kangas et al. | Jun 2012 | A1 |
20120157886 | Tenn et al. | Jun 2012 | A1 |
20120165695 | Kidmose et al. | Jun 2012 | A1 |
20120182309 | Griffin et al. | Jul 2012 | A1 |
20120184838 | John et al. | Jul 2012 | A1 |
20120188158 | Tan et al. | Jul 2012 | A1 |
20120203076 | Fatta et al. | Aug 2012 | A1 |
20120209134 | Morita et al. | Aug 2012 | A1 |
20120226130 | De Graff et al. | Sep 2012 | A1 |
20120249797 | Haddick et al. | Oct 2012 | A1 |
20120265090 | Fink et al. | Oct 2012 | A1 |
20120265480 | Oshima | Oct 2012 | A1 |
20120275621 | Elko | Nov 2012 | A1 |
20120283526 | Gommesen et al. | Nov 2012 | A1 |
20120283896 | Persaud et al. | Nov 2012 | A1 |
20120293548 | Perez et al. | Nov 2012 | A1 |
20120302858 | Kidmose et al. | Nov 2012 | A1 |
20120320532 | Wang | Dec 2012 | A1 |
20120323521 | De Foras et al. | Dec 2012 | A1 |
20130004033 | Trugenberger | Jan 2013 | A1 |
20130005303 | Song et al. | Jan 2013 | A1 |
20130016292 | Miao et al. | Jan 2013 | A1 |
20130016413 | Saeedi et al. | Jan 2013 | A1 |
20130020948 | Han et al. | Jan 2013 | A1 |
20130027341 | Mastandrea | Jan 2013 | A1 |
20130038707 | Cunningham et al. | Feb 2013 | A1 |
20130077820 | Marais et al. | Mar 2013 | A1 |
20130080794 | Hsieh | Mar 2013 | A1 |
20130106686 | Bennett | May 2013 | A1 |
20130123656 | Heck | May 2013 | A1 |
20130123666 | Giuffrida et al. | May 2013 | A1 |
20130127708 | Jung et al. | May 2013 | A1 |
20130131538 | Gaw et al. | May 2013 | A1 |
20130135223 | Shai | May 2013 | A1 |
20130135722 | Yokoyama | May 2013 | A1 |
20130141375 | Ludwig et al. | Jun 2013 | A1 |
20130144629 | Johnston et al. | Jun 2013 | A1 |
20130165813 | Chang et al. | Jun 2013 | A1 |
20130191741 | Dickinson et al. | Jul 2013 | A1 |
20130198694 | Rahman et al. | Aug 2013 | A1 |
20130207889 | Chang et al. | Aug 2013 | A1 |
20130215235 | Russell | Aug 2013 | A1 |
20130217998 | Mahfouz et al. | Aug 2013 | A1 |
20130221996 | Poupyrev et al. | Aug 2013 | A1 |
20130222384 | Futterer | Aug 2013 | A1 |
20130232095 | Tan et al. | Sep 2013 | A1 |
20130259238 | Xiang et al. | Oct 2013 | A1 |
20130265229 | Forutanpour et al. | Oct 2013 | A1 |
20130265437 | Thorn et al. | Oct 2013 | A1 |
20130271292 | McDermott | Oct 2013 | A1 |
20130285901 | Lee et al. | Oct 2013 | A1 |
20130285913 | Griffin et al. | Oct 2013 | A1 |
20130293580 | Spivack | Nov 2013 | A1 |
20130310979 | Herr et al. | Nov 2013 | A1 |
20130312256 | Wesselmann et al. | Nov 2013 | A1 |
20130317382 | Le et al. | Nov 2013 | A1 |
20130317648 | Assad | Nov 2013 | A1 |
20130332196 | Pinsker | Dec 2013 | A1 |
20130335302 | Crane et al. | Dec 2013 | A1 |
20140005743 | Giuffrida et al. | Jan 2014 | A1 |
20140020945 | Hurwitz et al. | Jan 2014 | A1 |
20140028539 | Newham et al. | Jan 2014 | A1 |
20140028546 | Jeon et al. | Jan 2014 | A1 |
20140045547 | Singamsetty et al. | Feb 2014 | A1 |
20140049417 | Abdurrahman et al. | Feb 2014 | A1 |
20140051946 | Arne et al. | Feb 2014 | A1 |
20140052150 | Taylor et al. | Feb 2014 | A1 |
20140074179 | Heldman et al. | Mar 2014 | A1 |
20140092009 | Yen et al. | Apr 2014 | A1 |
20140094675 | Luna et al. | Apr 2014 | A1 |
20140098018 | Kim et al. | Apr 2014 | A1 |
20140100432 | Golda et al. | Apr 2014 | A1 |
20140107493 | Yuen et al. | Apr 2014 | A1 |
20140121471 | Walker | May 2014 | A1 |
20140122958 | Greenberg et al. | May 2014 | A1 |
20140132512 | Gomez Sainz-Garcia | May 2014 | A1 |
20140139422 | Mistry et al. | May 2014 | A1 |
20140142937 | Powledge et al. | May 2014 | A1 |
20140143064 | Tran | May 2014 | A1 |
20140147820 | Snow et al. | May 2014 | A1 |
20140157168 | Albouyeh et al. | Jun 2014 | A1 |
20140194062 | Palin et al. | Jul 2014 | A1 |
20140196131 | Lee | Jul 2014 | A1 |
20140198034 | Bailey et al. | Jul 2014 | A1 |
20140198035 | Bailey et al. | Jul 2014 | A1 |
20140198944 | Forutanpour et al. | Jul 2014 | A1 |
20140200432 | Banerji et al. | Jul 2014 | A1 |
20140201666 | Bedikian et al. | Jul 2014 | A1 |
20140202643 | Hikmet et al. | Jul 2014 | A1 |
20140204455 | Popovich et al. | Jul 2014 | A1 |
20140223462 | Aimone et al. | Aug 2014 | A1 |
20140226193 | Sun | Aug 2014 | A1 |
20140232651 | Kress et al. | Aug 2014 | A1 |
20140236031 | Banet et al. | Aug 2014 | A1 |
20140240103 | Lake et al. | Aug 2014 | A1 |
20140240223 | Lake et al. | Aug 2014 | A1 |
20140245200 | Holz | Aug 2014 | A1 |
20140249397 | Lake | Sep 2014 | A1 |
20140257141 | Giuffrida et al. | Sep 2014 | A1 |
20140258864 | Shenoy et al. | Sep 2014 | A1 |
20140277622 | Raniere | Sep 2014 | A1 |
20140278139 | Hong et al. | Sep 2014 | A1 |
20140278441 | Ton et al. | Sep 2014 | A1 |
20140279860 | Pan et al. | Sep 2014 | A1 |
20140282282 | Holz | Sep 2014 | A1 |
20140285326 | Luna et al. | Sep 2014 | A1 |
20140285429 | Simmons | Sep 2014 | A1 |
20140297528 | Agrawal et al. | Oct 2014 | A1 |
20140299362 | Park et al. | Oct 2014 | A1 |
20140304665 | Holz | Oct 2014 | A1 |
20140310595 | Acharya et al. | Oct 2014 | A1 |
20140330404 | Abdelghani et al. | Nov 2014 | A1 |
20140334083 | Bailey | Nov 2014 | A1 |
20140334653 | Luna et al. | Nov 2014 | A1 |
20140337861 | Chang et al. | Nov 2014 | A1 |
20140340857 | Hsu et al. | Nov 2014 | A1 |
20140344731 | Holz | Nov 2014 | A1 |
20140349257 | Connor | Nov 2014 | A1 |
20140354528 | Laughlin et al. | Dec 2014 | A1 |
20140354529 | Laughlin et al. | Dec 2014 | A1 |
20140355825 | Kim et al. | Dec 2014 | A1 |
20140358024 | Nelson et al. | Dec 2014 | A1 |
20140358825 | Phillipps | Dec 2014 | A1 |
20140359540 | Kelsey et al. | Dec 2014 | A1 |
20140361988 | Katz et al. | Dec 2014 | A1 |
20140364703 | Kim et al. | Dec 2014 | A1 |
20140365163 | Jallon | Dec 2014 | A1 |
20140368424 | Choi et al. | Dec 2014 | A1 |
20140368428 | Pinault | Dec 2014 | A1 |
20140368474 | Kim et al. | Dec 2014 | A1 |
20140368896 | Nakazono et al. | Dec 2014 | A1 |
20140375465 | Fenuccio et al. | Dec 2014 | A1 |
20140376773 | Holz | Dec 2014 | A1 |
20150006120 | Sett et al. | Jan 2015 | A1 |
20150010203 | Muninder et al. | Jan 2015 | A1 |
20150011857 | Henson et al. | Jan 2015 | A1 |
20150019135 | Kacyvenski | Jan 2015 | A1 |
20150025355 | Bailey et al. | Jan 2015 | A1 |
20150029092 | Holz et al. | Jan 2015 | A1 |
20150035827 | Yamaoka et al. | Feb 2015 | A1 |
20150036221 | Stephenson | Feb 2015 | A1 |
20150045689 | Barone | Feb 2015 | A1 |
20150045699 | Mokaya et al. | Feb 2015 | A1 |
20150051470 | Bailey et al. | Feb 2015 | A1 |
20150057506 | Luna et al. | Feb 2015 | A1 |
20150057770 | Bailey et al. | Feb 2015 | A1 |
20150065840 | Bailey et al. | Mar 2015 | A1 |
20150070270 | Bailey et al. | Mar 2015 | A1 |
20150070274 | Morozov | Mar 2015 | A1 |
20150072326 | Mauri et al. | Mar 2015 | A1 |
20150084860 | Aleem et al. | Mar 2015 | A1 |
20150091790 | Forutanpour et al. | Apr 2015 | A1 |
20150094564 | Tashman et al. | Apr 2015 | A1 |
20150099946 | Sahin | Apr 2015 | A1 |
20150106052 | Balakrishnan et al. | Apr 2015 | A1 |
20150109202 | Ataee et al. | Apr 2015 | A1 |
20150124566 | Lake et al. | May 2015 | A1 |
20150128094 | Baldwin et al. | May 2015 | A1 |
20150141784 | Morun et al. | May 2015 | A1 |
20150148641 | Morun et al. | May 2015 | A1 |
20150148728 | Sallum et al. | May 2015 | A1 |
20150157944 | Gottlieb | Jun 2015 | A1 |
20150160621 | Yilmaz | Jun 2015 | A1 |
20150169074 | Ataee et al. | Jun 2015 | A1 |
20150170421 | Mandella et al. | Jun 2015 | A1 |
20150177841 | Vanblon et al. | Jun 2015 | A1 |
20150182113 | Utter, II | Jul 2015 | A1 |
20150182130 | Utter, II | Jul 2015 | A1 |
20150182160 | Kim et al. | Jul 2015 | A1 |
20150182163 | Utter | Jul 2015 | A1 |
20150182164 | Utter, II | Jul 2015 | A1 |
20150182165 | Miller et al. | Jul 2015 | A1 |
20150185838 | Camacho-Perez et al. | Jul 2015 | A1 |
20150185853 | Clausen et al. | Jul 2015 | A1 |
20150186609 | Utter, II | Jul 2015 | A1 |
20150187355 | Parkinson et al. | Jul 2015 | A1 |
20150193949 | Katz et al. | Jul 2015 | A1 |
20150199025 | Holz | Jul 2015 | A1 |
20150205126 | Schowengerdt | Jul 2015 | A1 |
20150205134 | Bailey et al. | Jul 2015 | A1 |
20150213191 | Abdelghani et al. | Jul 2015 | A1 |
20150216475 | Luna et al. | Aug 2015 | A1 |
20150220152 | Tait et al. | Aug 2015 | A1 |
20150223716 | Korkala et al. | Aug 2015 | A1 |
20150230756 | Luna et al. | Aug 2015 | A1 |
20150234426 | Bailey et al. | Aug 2015 | A1 |
20150237716 | Su et al. | Aug 2015 | A1 |
20150242009 | Xiao et al. | Aug 2015 | A1 |
20150242120 | Rodriguez | Aug 2015 | A1 |
20150242575 | Abovitz et al. | Aug 2015 | A1 |
20150261306 | Lake | Sep 2015 | A1 |
20150261318 | Scavezze et al. | Sep 2015 | A1 |
20150272483 | Etemad et al. | Oct 2015 | A1 |
20150277575 | Ataee et al. | Oct 2015 | A1 |
20150288944 | Nistico et al. | Oct 2015 | A1 |
20150289995 | Wilkinson et al. | Oct 2015 | A1 |
20150296553 | DiFranco et al. | Oct 2015 | A1 |
20150302168 | De Sapio et al. | Oct 2015 | A1 |
20150305672 | Grey et al. | Oct 2015 | A1 |
20150309563 | Connor | Oct 2015 | A1 |
20150309582 | Gupta | Oct 2015 | A1 |
20150310766 | Alshehri et al. | Oct 2015 | A1 |
20150312175 | Langholz | Oct 2015 | A1 |
20150313496 | Connor | Nov 2015 | A1 |
20150323998 | Kudekar et al. | Nov 2015 | A1 |
20150325202 | Lake et al. | Nov 2015 | A1 |
20150332013 | Lee et al. | Nov 2015 | A1 |
20150346701 | Gordon et al. | Dec 2015 | A1 |
20150351690 | Toth et al. | Dec 2015 | A1 |
20150355716 | Balasubramanian et al. | Dec 2015 | A1 |
20150355718 | Slonneger | Dec 2015 | A1 |
20150362734 | Moser et al. | Dec 2015 | A1 |
20150366504 | Connor | Dec 2015 | A1 |
20150370326 | Chapeskie et al. | Dec 2015 | A1 |
20150370333 | Ataee et al. | Dec 2015 | A1 |
20150378161 | Bailey et al. | Dec 2015 | A1 |
20150378162 | Bailey et al. | Dec 2015 | A1 |
20150378164 | Bailey et al. | Dec 2015 | A1 |
20150379770 | Haley, Jr. et al. | Dec 2015 | A1 |
20160011668 | Gilad-Bachrach et al. | Jan 2016 | A1 |
20160020500 | Matsuda | Jan 2016 | A1 |
20160026853 | Wexler et al. | Jan 2016 | A1 |
20160033771 | Tremblay et al. | Feb 2016 | A1 |
20160049073 | Lee | Feb 2016 | A1 |
20160050037 | Webb | Feb 2016 | A1 |
20160071319 | Fallon et al. | Mar 2016 | A1 |
20160092504 | Mitri et al. | Mar 2016 | A1 |
20160099010 | Sainath et al. | Apr 2016 | A1 |
20160107309 | Walsh et al. | Apr 2016 | A1 |
20160113587 | Kothe et al. | Apr 2016 | A1 |
20160144172 | Hsueh et al. | May 2016 | A1 |
20160150636 | Otsubo | May 2016 | A1 |
20160156762 | Bailey et al. | Jun 2016 | A1 |
20160162604 | Xiaoli et al. | Jun 2016 | A1 |
20160170710 | Kim et al. | Jun 2016 | A1 |
20160187992 | Yamamoto et al. | Jun 2016 | A1 |
20160195928 | Wagner et al. | Jul 2016 | A1 |
20160199699 | Klassen | Jul 2016 | A1 |
20160202081 | Debieuvre et al. | Jul 2016 | A1 |
20160206206 | Avila et al. | Jul 2016 | A1 |
20160207201 | Herr et al. | Jul 2016 | A1 |
20160217614 | Kraver et al. | Jul 2016 | A1 |
20160235323 | Tadi et al. | Aug 2016 | A1 |
20160238845 | Alexander et al. | Aug 2016 | A1 |
20160239080 | Marcolina et al. | Aug 2016 | A1 |
20160242646 | Obma | Aug 2016 | A1 |
20160259407 | Schick | Sep 2016 | A1 |
20160262687 | Vaidyanathan et al. | Sep 2016 | A1 |
20160263458 | Mather et al. | Sep 2016 | A1 |
20160274365 | Bailey et al. | Sep 2016 | A1 |
20160274732 | Bang et al. | Sep 2016 | A1 |
20160274758 | Bailey | Sep 2016 | A1 |
20160275726 | Mullins | Sep 2016 | A1 |
20160282947 | Schwarz et al. | Sep 2016 | A1 |
20160291768 | Cho et al. | Oct 2016 | A1 |
20160292497 | Kehtarnavaz et al. | Oct 2016 | A1 |
20160309249 | Wu et al. | Oct 2016 | A1 |
20160313798 | Connor | Oct 2016 | A1 |
20160313801 | Wagner et al. | Oct 2016 | A1 |
20160313890 | Walline et al. | Oct 2016 | A1 |
20160313899 | Noel | Oct 2016 | A1 |
20160314623 | Coleman et al. | Oct 2016 | A1 |
20160327796 | Bailey et al. | Nov 2016 | A1 |
20160327797 | Bailey et al. | Nov 2016 | A1 |
20160342227 | Natzke et al. | Nov 2016 | A1 |
20160349514 | Alexander et al. | Dec 2016 | A1 |
20160349515 | Alexander et al. | Dec 2016 | A1 |
20160349516 | Alexander et al. | Dec 2016 | A1 |
20160350973 | Shapira et al. | Dec 2016 | A1 |
20160377865 | Alexander et al. | Dec 2016 | A1 |
20160377866 | Alexander et al. | Dec 2016 | A1 |
20170025026 | Ortiz Catalan | Jan 2017 | A1 |
20170031502 | Rosenberg et al. | Feb 2017 | A1 |
20170035313 | Hong et al. | Feb 2017 | A1 |
20170061817 | Mettler | Mar 2017 | A1 |
20170068095 | Holland et al. | Mar 2017 | A1 |
20170068445 | Lee et al. | Mar 2017 | A1 |
20170075426 | Camacho Perez et al. | Mar 2017 | A1 |
20170079828 | Pedtke et al. | Mar 2017 | A1 |
20170080346 | Abbas | Mar 2017 | A1 |
20170090604 | Barbier | Mar 2017 | A1 |
20170091567 | Wang et al. | Mar 2017 | A1 |
20170095178 | Schoen et al. | Apr 2017 | A1 |
20170097753 | Bailey et al. | Apr 2017 | A1 |
20170115483 | Aleem et al. | Apr 2017 | A1 |
20170119472 | Herrmann et al. | May 2017 | A1 |
20170123487 | Hazra et al. | May 2017 | A1 |
20170124474 | Kashyap | May 2017 | A1 |
20170124816 | Yang et al. | May 2017 | A1 |
20170127354 | Garland et al. | May 2017 | A1 |
20170147077 | Park et al. | May 2017 | A1 |
20170153701 | Mahon et al. | Jun 2017 | A1 |
20170161635 | Oono et al. | Jun 2017 | A1 |
20170188878 | Lee | Jul 2017 | A1 |
20170188980 | Ash | Jul 2017 | A1 |
20170197142 | Stafford et al. | Jul 2017 | A1 |
20170205876 | Vidal et al. | Jul 2017 | A1 |
20170209055 | Pantelopoulos et al. | Jul 2017 | A1 |
20170212290 | Alexander et al. | Jul 2017 | A1 |
20170212349 | Bailey et al. | Jul 2017 | A1 |
20170219829 | Bailey | Aug 2017 | A1 |
20170220923 | Bae et al. | Aug 2017 | A1 |
20170237789 | Harner et al. | Aug 2017 | A1 |
20170237901 | Lee et al. | Aug 2017 | A1 |
20170259167 | Cook et al. | Sep 2017 | A1 |
20170262064 | Ofir et al. | Sep 2017 | A1 |
20170277282 | Go | Sep 2017 | A1 |
20170285744 | Juliato | Oct 2017 | A1 |
20170285756 | Wang et al. | Oct 2017 | A1 |
20170285757 | Robertson et al. | Oct 2017 | A1 |
20170285848 | Rosenberg et al. | Oct 2017 | A1 |
20170296363 | Yetkin et al. | Oct 2017 | A1 |
20170299956 | Holland et al. | Oct 2017 | A1 |
20170301630 | Nguyen et al. | Oct 2017 | A1 |
20170308118 | Ito | Oct 2017 | A1 |
20170312614 | Tran et al. | Nov 2017 | A1 |
20170329392 | Keskin et al. | Nov 2017 | A1 |
20170329404 | Keskin et al. | Nov 2017 | A1 |
20170340506 | Zhang et al. | Nov 2017 | A1 |
20170344706 | Torres et al. | Nov 2017 | A1 |
20170347908 | Watanabe et al. | Dec 2017 | A1 |
20170371403 | Wetzler | Dec 2017 | A1 |
20180000367 | Longinotti-Buitoni | Jan 2018 | A1 |
20180018825 | Kim et al. | Jan 2018 | A1 |
20180020285 | Zass | Jan 2018 | A1 |
20180020951 | Kaifosh | Jan 2018 | A1 |
20180020978 | Kaifosh et al. | Jan 2018 | A1 |
20180020990 | Park et al. | Jan 2018 | A1 |
20180024634 | Kaifosh et al. | Jan 2018 | A1 |
20180024635 | Kaifosh et al. | Jan 2018 | A1 |
20180024641 | Mao et al. | Jan 2018 | A1 |
20180064363 | Morun et al. | Mar 2018 | A1 |
20180067553 | Morun et al. | Mar 2018 | A1 |
20180068489 | Kim et al. | Mar 2018 | A1 |
20180074332 | Li et al. | Mar 2018 | A1 |
20180081439 | Daniels | Mar 2018 | A1 |
20180088675 | Vogel et al. | Mar 2018 | A1 |
20180088765 | Bailey | Mar 2018 | A1 |
20180092599 | Kerth et al. | Apr 2018 | A1 |
20180093181 | Goslin et al. | Apr 2018 | A1 |
20180095542 | Mallinson | Apr 2018 | A1 |
20180095630 | Bailey | Apr 2018 | A1 |
20180101235 | Bodensteiner et al. | Apr 2018 | A1 |
20180101289 | Bailey | Apr 2018 | A1 |
20180107275 | Chen et al. | Apr 2018 | A1 |
20180120948 | Aleem et al. | May 2018 | A1 |
20180133551 | Chang et al. | May 2018 | A1 |
20180140441 | Poirters | May 2018 | A1 |
20180150033 | Lake et al. | May 2018 | A1 |
20180153430 | Ang et al. | Jun 2018 | A1 |
20180153444 | Yang et al. | Jun 2018 | A1 |
20180154140 | Bouton et al. | Jun 2018 | A1 |
20180168905 | Goodall et al. | Jun 2018 | A1 |
20180178008 | Bouton et al. | Jun 2018 | A1 |
20180217249 | La Salla et al. | Aug 2018 | A1 |
20180239430 | Tadi et al. | Aug 2018 | A1 |
20180240459 | Weng et al. | Aug 2018 | A1 |
20180247443 | Briggs et al. | Aug 2018 | A1 |
20180279919 | Bansbach et al. | Oct 2018 | A1 |
20180301057 | Hargrove et al. | Oct 2018 | A1 |
20180307314 | Connor | Oct 2018 | A1 |
20180314879 | Khwaja et al. | Nov 2018 | A1 |
20180321745 | Morun et al. | Nov 2018 | A1 |
20180321746 | Morun et al. | Nov 2018 | A1 |
20180330549 | Brenton | Nov 2018 | A1 |
20180333575 | Bouton | Nov 2018 | A1 |
20180344195 | Morun et al. | Dec 2018 | A1 |
20180356890 | Zhang et al. | Dec 2018 | A1 |
20180360379 | Harrison et al. | Dec 2018 | A1 |
20190008453 | Spoof | Jan 2019 | A1 |
20190025919 | Tadi et al. | Jan 2019 | A1 |
20190027141 | Strong et al. | Jan 2019 | A1 |
20190033967 | Morun et al. | Jan 2019 | A1 |
20190033974 | Mu et al. | Jan 2019 | A1 |
20190038166 | Tavabi et al. | Feb 2019 | A1 |
20190076716 | Chiou et al. | Mar 2019 | A1 |
20190089898 | Kim et al. | Mar 2019 | A1 |
20190113973 | Coleman et al. | Apr 2019 | A1 |
20190121305 | Kaifosh et al. | Apr 2019 | A1 |
20190121306 | Kaifosh et al. | Apr 2019 | A1 |
20190146809 | Lee et al. | May 2019 | A1 |
20190150777 | Guo | May 2019 | A1 |
20190192037 | Morun et al. | Jun 2019 | A1 |
20190196585 | Laszlo et al. | Jun 2019 | A1 |
20190196586 | Laszlo et al. | Jun 2019 | A1 |
20190197778 | Sachdeva et al. | Jun 2019 | A1 |
20190209034 | Deno et al. | Jul 2019 | A1 |
20190212817 | Kaifosh et al. | Jul 2019 | A1 |
20190216619 | McDonnall et al. | Jul 2019 | A1 |
20190223748 | Al-Natsheh et al. | Jul 2019 | A1 |
20190227627 | Kaifosh et al. | Jul 2019 | A1 |
20190228330 | Kaifosh et al. | Jul 2019 | A1 |
20190228533 | Giurgica-Tiron et al. | Jul 2019 | A1 |
20190228579 | Kaifosh et al. | Jul 2019 | A1 |
20190228590 | Kaifosh et al. | Jul 2019 | A1 |
20190228591 | Giurgica-Tiron et al. | Jul 2019 | A1 |
20190247650 | Tran | Aug 2019 | A1 |
20190279407 | McHugh et al. | Sep 2019 | A1 |
20190294243 | Laszlo et al. | Sep 2019 | A1 |
20190056422 | Park et al. | Oct 2019 | A1 |
20190324549 | Araki et al. | Oct 2019 | A1 |
20190332140 | Wang | Oct 2019 | A1 |
20190348026 | Berenzweig et al. | Nov 2019 | A1 |
20190348027 | Berenzweig et al. | Nov 2019 | A1 |
20190357787 | Barachant et al. | Nov 2019 | A1 |
20190362557 | Lacey et al. | Nov 2019 | A1 |
20200042089 | Ang et al. | Feb 2020 | A1 |
20200057661 | Bendfeldt | Feb 2020 | A1 |
20200065569 | Nduka et al. | Feb 2020 | A1 |
20200069210 | Berenzweig et al. | Mar 2020 | A1 |
20200069211 | Berenzweig et al. | Mar 2020 | A1 |
20200073483 | Berenzweig et al. | Mar 2020 | A1 |
20200077955 | Shui et al. | Mar 2020 | A1 |
20200097081 | Stone et al. | Mar 2020 | A1 |
20200097083 | Mao et al. | Mar 2020 | A1 |
20200111260 | Osborn et al. | Apr 2020 | A1 |
20200125171 | Morun et al. | Apr 2020 | A1 |
20200142490 | Xiong et al. | May 2020 | A1 |
20200143795 | Park et al. | May 2020 | A1 |
20200159322 | Morun et al. | May 2020 | A1 |
20200163562 | Neaves | May 2020 | A1 |
20200205932 | Zar et al. | Jul 2020 | A1 |
20200225320 | Belskikh et al. | Jul 2020 | A1 |
20200245873 | Frank et al. | Aug 2020 | A1 |
20200249752 | Parshionikar | Aug 2020 | A1 |
20200275895 | Barachant | Sep 2020 | A1 |
20200301509 | Liu et al. | Sep 2020 | A1 |
20200305795 | Floyd et al. | Oct 2020 | A1 |
20200320335 | Shamun et al. | Oct 2020 | A1 |
20210109598 | Zhang et al. | Apr 2021 | A1 |
20210117523 | Kim et al. | Apr 2021 | A1 |
20210290159 | Bruinsma et al. | Sep 2021 | A1 |
20220256706 | Xiong et al. | Aug 2022 | A1 |
Number | Date | Country |
---|---|---|
2902045 | Aug 2014 | CA |
2921954 | Feb 2015 | CA |
2939644 | Aug 2015 | CA |
1838933 | Sep 2006 | CN |
101310242 | Nov 2008 | CN |
102246125 | Nov 2011 | CN |
102349037 | Feb 2012 | CN |
103777752 | May 2014 | CN |
103886215 | Jun 2014 | CN |
105009031 | Oct 2015 | CN |
105190477 | Dec 2015 | CN |
105190578 | Dec 2015 | CN |
105511615 | Apr 2016 | CN |
106067178 | Nov 2016 | CN |
106102504 | Nov 2016 | CN |
106108898 | Nov 2016 | CN |
107203272 | Sep 2017 | CN |
109620651 | Apr 2019 | CN |
110300542 | Oct 2019 | CN |
111616847 | Sep 2020 | CN |
111902077 | Nov 2020 | CN |
112074225 | Dec 2020 | CN |
112469469 | Mar 2021 | CN |
112822992 | May 2021 | CN |
4412278 | Oct 1995 | DE |
0301790 | Feb 1989 | EP |
1345210 | Sep 2003 | EP |
1408443 | Oct 2006 | EP |
2198521 | Jun 2012 | EP |
2541763 | Jan 2013 | EP |
2733578 | May 2014 | EP |
2959394 | Dec 2015 | EP |
3104737 | Dec 2016 | EP |
3200051 | Aug 2017 | EP |
3487395 | May 2019 | EP |
3697297 | Dec 2020 | EP |
2959394 | May 2021 | EP |
S61198892 | Sep 1986 | JP |
H05277080 | Oct 1993 | JP |
H0639754 | Feb 1994 | JP |
H07248873 | Sep 1995 | JP |
3103427 | Oct 2000 | JP |
2001054507 | Feb 2001 | JP |
2002287869 | Oct 2002 | JP |
2003303047 | Oct 2003 | JP |
2005095561 | Apr 2005 | JP |
2005352739 | Dec 2005 | JP |
2008192004 | Aug 2008 | JP |
2009050679 | Mar 2009 | JP |
2010520561 | Jun 2010 | JP |
2013160905 | Aug 2013 | JP |
2015512550 | Apr 2015 | JP |
2015514467 | May 2015 | JP |
2016507098 | Mar 2016 | JP |
2016507851 | Mar 2016 | JP |
2016540276 | Dec 2016 | JP |
2017509386 | Apr 2017 | JP |
2019023941 | Feb 2019 | JP |
2019185531 | Oct 2019 | JP |
2021072136 | May 2021 | JP |
20110040165 | Apr 2011 | KR |
20120094870 | Aug 2012 | KR |
20120097997 | Sep 2012 | KR |
20150123254 | Nov 2015 | KR |
20160121552 | Oct 2016 | KR |
20170067873 | Jun 2017 | KR |
20170107283 | Sep 2017 | KR |
10-1790147 | Oct 2017 | KR |
20190022329 | Mar 2019 | KR |
9527341 | Oct 1995 | WO |
2006086504 | Aug 2006 | WO |
2008109248 | Sep 2008 | WO |
2009042313 | Apr 2009 | WO |
2010095636 | Aug 2010 | WO |
2010104879 | Sep 2010 | WO |
2011011750 | Jan 2011 | WO |
2011070554 | Jun 2011 | WO |
2012155157 | Nov 2012 | WO |
2013154864 | Oct 2013 | WO |
2014130871 | Aug 2014 | WO |
2014155288 | Oct 2014 | WO |
2014186370 | Nov 2014 | WO |
2014194257 | Dec 2014 | WO |
2014197443 | Dec 2014 | WO |
2015027089 | Feb 2015 | WO |
2015073713 | May 2015 | WO |
2015063520 | May 2015 | WO |
2015081113 | Jun 2015 | WO |
2015100172 | Jul 2015 | WO |
2015123445 | Aug 2015 | WO |
2015123775 | Aug 2015 | WO |
2015184760 | Dec 2015 | WO |
2015192117 | Dec 2015 | WO |
2015199747 | Dec 2015 | WO |
2016041088 | Mar 2016 | WO |
2017062544 | Apr 2017 | WO |
2017075611 | May 2017 | WO |
2017092225 | Jun 2017 | WO |
2017120669 | Jul 2017 | WO |
2017172185 | Oct 2017 | WO |
2017208167 | Dec 2017 | WO |
2018022602 | Feb 2018 | WO |
2018098046 | May 2018 | WO |
2019099758 | May 2019 | WO |
2019147953 | Aug 2019 | WO |
2019147958 | Aug 2019 | WO |
2019147996 | Aug 2019 | WO |
2019217419 | Nov 2019 | WO |
2019226259 | Nov 2019 | WO |
2019231911 | Dec 2019 | WO |
2020047429 | Mar 2020 | WO |
2020061440 | Mar 2020 | WO |
2020061451 | Mar 2020 | WO |
2020072915 | Apr 2020 | WO |
Entry |
---|
Gargiulo et al., "Giga-Ohm High-Impedance FET Input Amplifiers for Dry Electrode Biosensor Circuits and Systems," in Integrated Microsystems: Electronics, Photonics, and Biotechnology, Chapter 8, pp. 165-194, CRC Press, Editor: Iniewski, Kris (Year: 2011). |
Extended European Search Report for European Application No. 18869441.8, dated Nov. 17, 2020, 20 Pages. |
Extended European Search Report for European Application No. 19806723.3, dated Jul. 7, 2021, 13 pages. |
Extended European Search Report for European Application No. 19810524.9, dated Mar. 17, 2021, 11 pages. |
Extended European Search Report for European Application No. 19850130.6, dated Sep. 1, 2021, 14 Pages. |
Extended European Search Report for European Application No. 19855191.3, dated Dec. 6, 2021, 11 pages. |
Extended European Search Report for European Application No. 19883839.3, dated Dec. 15, 2021, 7 pages. |
Farina D., et al., “Man/Machine Interface Based on the Discharge Timings of Spinal Motor Neurons After Targeted Muscle Reinnervation,” Nature Biomedical Engineering, Feb. 6, 2017, vol. 1, Article No. 0025, pp. 1-12. |
Favorskaya M., et al., “Localization and Recognition of Dynamic Hand Gestures Based on Hierarchy of Manifold Classifiers,” International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, May 25-27, 2015, vol. XL-5/W6, pp. 1-8. |
Final Office Action dated Jun. 2, 2020 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 127 Pages. |
Final Office Action dated Jun. 2, 2020 for U.S. Appl. No. 16/557,383, filed Aug. 30, 2019, 66 Pages. |
Final Office Action dated Nov. 3, 2020 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 27 Pages. |
Final Office Action dated Feb. 4, 2020 for U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 76 Pages. |
Final Office Action dated Feb. 4, 2021 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 42 Pages. |
Final Office Action dated Jun. 5, 2020 for U.S. Appl. No. 16/557,427, filed Aug. 30, 2019, 95 Pages. |
Final Office Action dated Oct. 8, 2020 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 73 Pages. |
Final Office Action dated Apr. 9, 2020 for U.S. Appl. No. 15/974,454, filed May 8, 2018, 19 Pages. |
Final Office Action dated Dec. 11, 2019 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 30 Pages. |
Final Office Action dated Jan. 13, 2021 for U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 91 Pages. |
Final Office Action dated Dec. 18, 2019 for U.S. Appl. No. 16/258,279, filed Jan. 25, 2019, 45 Pages. |
Final Office Action dated Feb. 19, 2021 for U.S. Appl. No. 16/258,279, filed Jan. 25, 2019, 58 Pages. |
Final Office Action dated Sep. 23, 2020 for U.S. Appl. No. 15/816,435, filed Nov. 17, 2017, 70 Pages. |
Final Office Action dated Jan. 28, 2020 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 15 Pages. |
Final Office Action dated Jul. 28, 2017 for U.S. Appl. No. 14/505,836, filed Oct. 3, 2014, 52 Pages. |
Final Office Action dated Jun. 28, 2021 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 11 Pages. |
Final Office Action dated Nov. 29, 2019 for U.S. Appl. No. 15/659,072, filed Jul. 25, 2017, 36 Pages. |
Final Office Action dated Nov. 29, 2019 for U.S. Appl. No. 16/353,998, filed Mar. 14, 2019, 33 Pages. |
Fong H.C., et al., "PepperGram With Interactive Control," 22nd International Conference on Virtual System & Multimedia (VSMM), Oct. 17, 2016, 5 pages. |
Gallina A., et al., “Surface EMG Biofeedback,” Surface Electromyography: Physiology, Engineering, and Applications, 2016, pp. 485-500. |
Ghasemzadeh H., et al., “A Body Sensor Network With Electromyogram and Inertial Sensors: Multimodal Interpretation of Muscular Activities,” IEEE Transactions on Information Technology in Biomedicine, Mar. 2010, vol. 14 (2), pp. 198-206. |
Gopura R.A.R.C., et al., “A Human Forearm and Wrist Motion Assist Exoskeleton Robot With EMG-Based Fuzzy-Neuro Control,” Proceedings of the 2nd Biennial IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics, Oct. 19-22, 2008, 6 pages. |
Gourmelon L., et al., “Contactless Sensors for Surface Electromyography,” Proceedings of the 28th IEEE EMBS Annual International Conference, New York City, NY, Aug. 30-Sep. 3, 2006, pp. 2514-2517. |
Hauschild M., et al., “A Virtual Reality Environment for Designing and Fitting Neural Prosthetic Limbs,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, Mar. 2007, vol. 15 (1), pp. 9-15. |
International Search Report and Written Opinion for International Application No. PCT/US2014/017799, dated May 16, 2014, 9 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2014/037863, dated Aug. 21, 2014, 10 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2017/043693, dated Feb. 7, 2019, 7 Pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2017/043791, dated Feb. 7, 2019, 9 Pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2019/031114, dated Nov. 19, 2020, 16 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2019/049094, dated Mar. 11, 2021, 24 Pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2019/052151, dated Apr. 1, 2021, 9 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2014/017799, dated Sep. 3, 2015, 8 Pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2014/037863, dated Nov. 26, 2015, 8 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2014/052143, dated Mar. 3, 2016, 7 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2014/067443, dated Jun. 9, 2016, 7 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2015/015675, dated Aug. 25, 2016, 8 Pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2017/043686, dated Feb. 7, 2019, 8 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2017/043792, dated Feb. 7, 2019, 8 Pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2018/056768, dated Apr. 30, 2020, 7 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2018/061409, dated May 28, 2020, 10 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2019/015174, dated Aug. 6, 2020, 7 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2019/015183, dated Aug. 6, 2020, 7 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2019/015238, dated Aug. 6, 2020, 7 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2019/028299, dated Dec. 10, 2020, 11 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2019/034173, dated Dec. 10, 2020, 9 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2019/046351, dated Feb. 25, 2021, 8 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2019/052131, dated Apr. 1, 2021, 8 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2019/054716, dated Apr. 15, 2021, 10 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2019/061759, dated May 27, 2021, 12 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2019/063587, dated Jun. 10, 2021, 13 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2020/049274, dated Mar. 17, 2022, 14 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2020/061392, dated Jun. 9, 2022, 11 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2014/052143, dated Nov. 21, 2014, 8 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2014/067443, dated Feb. 27, 2015, 8 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2015/015675, dated May 27, 2015, 9 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2017/043686, dated Oct. 6, 2017, 9 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2017/043693, dated Oct. 6, 2017, 7 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2017/043791, dated Oct. 5, 2017, 10 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2018/056768, dated Jan. 15, 2019, 8 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2018/061409, dated Mar. 12, 2019, 11 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2018/063215, dated Mar. 21, 2019, 17 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/015167, dated May 21, 2019, 7 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/015174, dated May 21, 2019, 8 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/015244, dated May 16, 2019, 8 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/020065, dated May 16, 2019, 10 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/028299, dated Aug. 9, 2019, 12 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/031114, dated Dec. 20, 2019, 18 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/034173, dated Sep. 18, 2019, 10 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/037302, dated Oct. 11, 2019, 13 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/042579, dated Oct. 31, 2019, 8 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/046351, dated Nov. 7, 2019, 9 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/049094, dated Jan. 9, 2020, 27 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/052131, dated Dec. 6, 2019, 8 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/052151, dated Jan. 15, 2020, 10 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/054716, dated Dec. 20, 2019, 11 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/061759, dated Jan. 29, 2020, 12 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/063587, dated Mar. 25, 2020, 16 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2020/025735, dated Jun. 22, 2020, 10 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2020/025772, dated Aug. 3, 2020, 11 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2020/025797, dated Jul. 9, 2020, 10 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2020/049274, dated Feb. 1, 2021, 17 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2020/061392, dated Mar. 12, 2021, 12 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2017/043792, dated Oct. 5, 2017, 9 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/015134, dated May 15, 2019, 11 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/015180, dated May 28, 2019, 10 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/015183, dated May 3, 2019, 8 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2019/015238, dated May 16, 2019, 8 Pages. |
Invitation to Pay Additional Fees for International Application No. PCT/US2019/031114, dated Aug. 6, 2019, 7 pages. |
Invitation to Pay Additional Fees for International Application No. PCT/US2019/049094, dated Oct. 24, 2019, 2 Pages. |
Jiang H., “Effective and Interactive Interpretation of Gestures by Individuals with Mobility Impairments,” Thesis/Dissertation Acceptance, Purdue University Graduate School, Graduate School Form 30, Updated on Jan. 15, 2015, 24 pages. |
Kainz et al., “Approach to Hand Tracking and Gesture Recognition Based on Depth-Sensing Cameras and EMG Monitoring,” ACTA Informatica Pragensia, vol. 3, Jan. 1, 2014, pp. 104-112, Retrieved from the Internet: URL: https://aip.vse.cz/pdfs/aip/2014/01/08.pdf. |
Kawaguchi J., et al., “Estimation of Finger Joint Angles Based on Electromechanical Sensing of Wrist Shape,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, Sep. 2017, vol. 25 (9), pp. 1409-1418. |
Kim H., et al., “Real-Time Human Pose Estimation and Gesture Recognition from Depth Images Using Superpixels and SVM Classifier,” Sensors, 2015, vol. 15, pp. 12410-12427. |
Kipke D.R., et al., “Silicon-Substrate Intracortical Microelectrode Arrays for Long-Term Recording of Neuronal Spike Activity in Cerebral Cortex,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, Jun. 2003, vol. 11 (2), 5 pages, Retrieved on Oct. 7, 2019 [Oct. 7, 2019] Retrieved from the Internet: URL: https://www.ece.uvic.ca/~bctill/papers/neurimp/Kipke_etal_2003_01214707.pdf. |
Koerner M.D., “Design and Characterization of the Exo-Skin Haptic Device: A Novel Tendon Actuated Textile Hand Exoskeleton,” Abstract of thesis for Drexel University Masters Degree [online], Nov. 2, 2017, 5 pages, Retrieved from the Internet: URL: https://dialog.proquest.com/professional/docview/1931047627?accountid=153692. |
Lee D.C., et al., “Motion and Force Estimation System of Human Fingers,” Journal of Institute of Control, Robotics and Systems, 2011, vol. 17 (10), pp. 1014-1020. |
Li Y., et al., “Motor Function Evaluation of Hemiplegic Upper-Extremities Using Data Fusion from Wearable Inertial and Surface EMG Sensors,” Sensors, MDPI, 2017, vol. 17 (582), pp. 1-17. |
Lopes J., et al., “Hand/Arm Gesture Segmentation by Motion Using IMU and EMG Sensing,” ScienceDirect, Jun. 27-30, 2017, vol. 11, pp. 107-113. |
Marcard T.V., et al., “Sparse Inertial Poser: Automatic 3D Human Pose Estimation from Sparse IMUs,” arxiv.org, Computer Graphics Forum, 2017, vol. 36 (2), 12 pages, XP080759137. |
Martin H., et al., “A Novel Approach of Prosthetic Arm Control using Computer Vision, Biosignals, and Motion Capture,” IEEE Symposium on Computational Intelligence in Robotic Rehabilitation and Assistive Technologies (CIR2AT), 2014, 5 pages. |
McIntee S.S., “A Task Model of Free-Space Movement-Based Gestures,” Dissertation, Graduate Faculty of North Carolina State University, Computer Science, 2016, 129 pages. |
Mendes Jr. J.J.A., et al., “Sensor Fusion and Smart Sensor in Sports and Biomedical Applications,” Sensors, 2016, vol. 16 (1569), pp. 1-31. |
Mohamed O.H., “Homogeneous Cognitive Based Biometrics for Static Authentication,” Dissertation submitted to University of Victoria, Canada, 2010, [last accessed Oct. 11, 2019], 149 pages, Retrieved from the Internet: URL: http://hdl.handle.net/1828/321. |
Morris D., et al., “Emerging Input Technologies for Always-Available Mobile Interaction,” Foundations and Trends in Human-Computer Interaction, 2010, vol. 4 (4), pp. 245-316. |
Naik G.R., et al., “Source Separation and Identification issues in Bio Signals: A Solution using Blind Source Separation,” Chapter 4 of Recent Advances in Biomedical Engineering, Intech, 2009, 23 pages. |
Naik G.R., et al., “Real-Time Hand Gesture Identification for Human Computer Interaction Based on ICA of Surface Electromyogram,” IADIS International Conference Interfaces and Human Computer Interaction, 2007, pp. 83-90. |
Naik G.R., et al., “Subtle Hand Gesture Identification for HCI Using Temporal Decorrelation Source Separation BSS of Surface EMG,” Digital Image Computing Techniques and Applications, IEEE Computer Society, 2007, pp. 30-37. |
Negro F., et al., “Multi-Channel Intramuscular and Surface EMG Decomposition by Convolutive Blind Source Separation,” Journal of Neural Engineering, Feb. 29, 2016, vol. 13, 18 Pages. |
Non-Final Office Action dated Mar. 2, 2021 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 32 Pages. |
Non-Final Office Action dated Sep. 2, 2020 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 66 Pages. |
Non-Final Office Action dated Aug. 3, 2020 for U.S. Appl. No. 16/593,446, filed Oct. 4, 2019, 44 pages. |
Non-Final Office Action dated Jun. 3, 2021 for U.S. Appl. No. 15/816,435, filed Nov. 17, 2017, 32 Pages. |
Non-Final Office Action dated Jun. 5, 2020 for U.S. Appl. No. 15/659,072, filed Jul. 25, 2017, 59 Pages. |
Non-Final Office Action dated Sep. 6, 2019 for U.S. Appl. No. 16/424,144, filed May 28, 2019, 11 Pages. |
Non-Final Office Action dated Feb. 8, 2021 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 11 Pages. |
Non-Final Office Action dated Oct. 8, 2020 for U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 51 Pages. |
Non-Final Office Action dated Apr. 9, 2019 for U.S. Appl. No. 16/258,409, filed Jan. 25, 2019, 71 Pages. |
Non-Final Office Action dated Aug. 11, 2021 for U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 35 Pages. |
Non-Final Office Action dated Jun. 13, 2019 for U.S. Appl. No. 16/258,279, filed Jan. 25, 2019, 38 Pages. |
Non-Final Office Action dated Jun. 15, 2020 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 46 Pages. |
Non-Final Office Action dated Jan. 16, 2020 for U.S. Appl. No. 16/389,419, filed Apr. 19, 2019, 26 Pages. |
Non-Final Office Action dated May 16, 2019 for U.S. Appl. No. 15/974,384, filed May 8, 2018, 13 Pages. |
Non-Final Office Action dated May 16, 2019 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 12 Pages. |
Non-Final Office Action dated Nov. 19, 2019 for U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 32 Pages. |
Non-Final Office Action dated Aug. 20, 2020 for U.S. Appl. No. 15/974,454, filed May 8, 2018, 59 Pages. |
Non-Final Office Action dated Dec. 20, 2019 for U.S. Appl. No. 15/974,454, filed May 8, 2018, 41 Pages. |
Non-Final Office Action dated Jan. 22, 2020 for U.S. Appl. No. 15/816,435, filed Nov. 17, 2017, 35 Pages. |
Non-Final Office Action dated Oct. 22, 2019 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 16 Pages. |
Non-Final Office Action dated Dec. 23, 2019 for U.S. Appl. No. 16/557,383, filed Aug. 30, 2019, 53 Pages. |
Non-Final Office Action dated Dec. 23, 2019 for U.S. Appl. No. 16/557,427, filed Aug. 30, 2019, 52 Pages. |
Non-Final Office Action dated Feb. 23, 2017 for U.S. Appl. No. 14/505,836, filed Oct. 3, 2014, 54 Pages. |
Non-Final Office Action dated Jul. 23, 2020 for U.S. Appl. No. 16/593,446, filed Oct. 4, 2019, 28 pages. |
Non-Final Office Action dated May 24, 2019 for U.S. Appl. No. 16/353,998, filed Mar. 14, 2019, 20 Pages. |
Non-Final Office Action dated May 26, 2020 for U.S. Appl. No. 16/353,998, filed Mar. 14, 2019, 60 Pages. |
Non-Final Office Action dated Nov. 27, 2020 for U.S. Appl. No. 16/258,279, filed Jan. 25, 2019, 44 Pages. |
Non-Final Office Action dated Apr. 29, 2019 for U.S. Appl. No. 16/257,979, filed Jan. 25, 2019, 63 Pages. |
Non-Final Office Action dated Apr. 30, 2019 for U.S. Appl. No. 15/659,072, filed Jul. 25, 2017, 99 Pages. |
Non-Final Office Action dated Apr. 30, 2020 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 57 Pages. |
Non-Final Office Action dated Dec. 30, 2019 for U.S. Appl. No. 16/593,446, filed Oct. 4, 2019, 43 pages. |
Non-Final Office Action dated Jun. 30, 2016 for U.S. Appl. No. 14/505,836, filed Oct. 3, 2014, 37 Pages. |
Non-Final Office Action dated Oct. 30, 2019 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 22 Pages. |
Al-Jumaily A., et al., “Electromyogram (EMG) Driven System based Virtual Reality for Prosthetic and Rehabilitation Devices,” Proceedings of the 11th International Conference on Information Integration and Web-Based Applications & Services, Jan. 1, 2009, pp. 582-586. |
Al-Mashhadany Y.I., “Inverse Kinematics Problem (IKP) of 6-DOF Manipulator By Locally Recurrent Neural Networks (LRNNs),” Management and Service Science (MASS), International Conference on Management and Service Science, IEEE, Aug. 24, 2010, 5 pages. |
Al-Timemy A.H., et al., “Improving the Performance Against Force Variation of EMG Controlled Multifunctional Upper-Limb Prostheses for Transradial Amputees,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, Jun. 2016, vol. 24 (6), 12 Pages. |
Arkenbout E.A., et al., “Robust Hand Motion Tracking through Data Fusion of 5DT Data Glove and Nimble VR Kinect Camera Measurements,” Sensors, 2015, vol. 15, pp. 31644-31671. |
Benko H., et al., “Enhancing Input On and Above the Interactive Surface with Muscle Sensing,” The ACM International Conference on Interactive Tabletops and Surfaces (ITS), Nov. 23-25, 2009, pp. 93-100. |
Berenzweig A., et al., “Wearable Devices and Methods for Improved Speech Recognition,” U.S. Appl. No. 16/785,680, filed Feb. 10, 2020, 67 pages. |
Boyali A., et al., “Spectral Collaborative Representation based Classification for Hand Gestures Recognition on Electromyography Signals,” Biomedical Signal Processing and Control, 2016, vol. 24, pp. 11-18. |
Brownlee J., “Finite State Machines (FSM): Finite State Machines as a Control Technique in Artificial Intelligence (AI),” FSM, Jun. 2002, 12 pages. |
Cannan J., et al., “A Wearable Sensor Fusion Armband for Simple Motion Control and Selection for Disabled and Non-Disabled Users,” Computer Science and Electronic Engineering Conference, IEEE, Sep. 12, 2012, pp. 216-219, XP032276745. |
Cheng J., et al., “A Novel Phonology- and Radical-Coded Chinese Sign Language Recognition Framework Using Accelerometer and Surface Electromyography Sensors,” Sensors, 2015, vol. 15, pp. 23303-23324. |
Communication Pursuant to Article 94(3) for European Patent Application No. 17835112.8, dated Dec. 14, 2020, 6 Pages. |
Communication Pursuant to Rule 164(1) EPC, Partial Supplementary European Search Report for European Application No. 14753949.8, dated Sep. 30, 2016, 7 pages. |
Co-pending U.S. Appl. No. 15/659,072, inventors Kaifosh; Patrick et al., filed Jul. 25, 2017. |
Co-pending U.S. Appl. No. 15/816,435, inventors Guo; Ning et al., filed Nov. 17, 2017. |
Co-pending U.S. Appl. No. 15/882,858, inventors Lake; Stephen et al., filed Jan. 29, 2018. |
Co-pending U.S. Appl. No. 15/974,430, inventors Berenzweig; Adam et al., filed May 8, 2018. |
Co-pending U.S. Appl. No. 16/353,998, inventors Kaifosh; Patrick et al., filed Mar. 14, 2019. |
Co-pending U.S. Appl. No. 16/557,383, inventors Berenzweig; Adam et al., filed Aug. 30, 2019. |
Co-pending U.S. Appl. No. 16/557,427, inventors Berenzweig; Adam et al., filed Aug. 30, 2019. |
Co-Pending U.S. Appl. No. 15/974,430, filed May 8, 2018, 44 Pages. |
Co-Pending U.S. Appl. No. 16/353,998, filed Mar. 14, 2019, 43 pages. |
Co-Pending U.S. Appl. No. 16/557,383, filed Aug. 30, 2019, 94 Pages. |
Co-Pending U.S. Appl. No. 16/557,427, filed Aug. 30, 2019, 93 Pages. |
Co-Pending U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 67 Pages. |
Co-Pending U.S. Appl. No. 14/505,836, filed Oct. 3, 2014, 59 Pages. |
Co-Pending U.S. Appl. No. 15/816,435, filed Nov. 17, 2017, 24 Pages. |
Co-Pending U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 54 Pages. |
Co-Pending U.S. Appl. No. 15/974,384, filed May 8, 2018, 44 Pages. |
Co-Pending U.S. Appl. No. 15/974,454, filed May 8, 2018, 45 Pages. |
Co-Pending U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 93 Pages. |
Corazza S., et al., “A Markerless Motion Capture System to Study Musculoskeletal Biomechanics: Visual Hull and Simulated Annealing Approach,” Annals of Biomedical Engineering, Jul. 2006, vol. 34 (6), pp. 1019-1029, [Retrieved on Dec. 11, 2019], 11 pages, Retrieved from the Internet: URL: https://www.researchgate.net/publication/6999610_A_Markerless_Motion_Capture_System_to_Study_Musculoskeletal_Biomechanics_Visual_Hull_and_Simulated_Annealing_Approach. |
Costanza E., et al., “EMG as a Subtle Input Interface for Mobile Computing,” Mobile HCI, LNCS 3160, 2004, pp. 426-430. |
Costanza E., et al., “Toward Subtle Intimate Interfaces for Mobile Devices Using an EMG Controller,” CHI, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Apr. 2-7, 2005, pp. 481-489. |
Cote-Allard U., et al., “Deep Learning for Electromyographic Hand Gesture Signal Classification Using Transfer Learning,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, Jan. 26, 2019, vol. 27 (4), 11 Pages. |
Csapo A.B., et al., “Evaluation of Human-Myo Gesture Control Capabilities in Continuous Search and Select Operations,” 7th IEEE International Conference on Cognitive Infocommunications, Oct. 16-18, 2016, pp. 000415-000420. |
Davoodi R., et al., “Development of a Physics-Based Target Shooting Game to Train Amputee Users of Multijoint Upper Limb Prostheses,” Presence, Massachusetts Institute of Technology, 2012, vol. 21 (1), pp. 85-95. |
Delis A.L., et al., “Development of a Myoelectric Controller Based on Knee Angle Estimation,” Biodevices, International Conference on Biomedical Electronics and Devices, Jan. 17, 2009, 7 pages. |
Diener L., et al., “Direct Conversion From Facial Myoelectric Signals to Speech Using Deep Neural Networks,” International Joint Conference on Neural Networks (IJCNN), Oct. 1, 2015, 7 pages. |
Ding I-J., et al., “HMM with Improved Feature Extraction-Based Feature Parameters for Identity Recognition of Gesture Command Operators by Using a Sensed Kinect-Data Stream,” Neurocomputing, 2017, vol. 262, pp. 108-119. |
European Search Report for European Application No. 19861903.3, dated Oct. 12, 2021, 2 pages. |
European Search Report for European Application No. 19863248.1, dated Oct. 19, 2021, 2 pages. |
European Search Report for European Application No. 19868789.9, dated May 9, 2022, 9 pages. |
European Search Report for European Application No. 19890394.0, dated Apr. 29, 2022, 9 pages. |
Extended European Search Report for European Application No. 18879156.0, dated Mar. 12, 2021, 11 pages. |
Extended European Search Report for European Application No. 19743717.1, dated Mar. 3, 2021, 12 pages. |
Extended European Search Report for European Application No. 19744404.5, dated Mar. 29, 2021, 11 pages. |
Extended European Search Report for European Application No. 19799947.7, dated May 26, 2021, 10 pages. |
Extended European Search Report for European Application No. 17835111.0, dated Nov. 21, 2019, 6 pages. |
Extended European Search Report for European Application No. 17835112.8, dated Feb. 5, 2020, 17 pages. |
Extended European Search Report for European Application No. 17835140.9, dated Nov. 26, 2019, 10 Pages. |
Notice of Allowance dated Aug. 22, 2022 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 9 pages. |
Notice of Allowance dated Nov. 2, 2020 for U.S. Appl. No. 15/974,454, filed May 8, 2018, 24 Pages. |
Notice of Allowance dated Nov. 4, 2019 for U.S. Appl. No. 15/974,384, filed May 8, 2018, 39 Pages. |
Notice of Allowance dated Feb. 6, 2020 for U.S. Appl. No. 16/424,144, filed May 28, 2019, 28 Pages. |
Notice of Allowance dated Feb. 9, 2022 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 9 pages. |
Notice of Allowance dated Nov. 10, 2021 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 6 pages. |
Notice of Allowance dated Jul. 15, 2021 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 2 pages. |
Notice of Allowance dated Dec. 16, 2020 for U.S. Appl. No. 16/593,446, filed Oct. 4, 2019, 44 pages. |
Notice of Allowance dated May 18, 2020 for U.S. Appl. No. 16/258,279, filed Jan. 25, 2019, 42 Pages. |
Notice of Allowance dated May 18, 2022 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 10 pages. |
Notice of Allowance dated Aug. 19, 2020 for U.S. Appl. No. 16/557,427, filed Aug. 30, 2019, 22 Pages. |
Notice of Allowance dated Jul. 19, 2019 for U.S. Appl. No. 16/258,409, filed Jan. 25, 2019, 36 Pages. |
Notice of Allowance dated May 20, 2020 for U.S. Appl. No. 16/389,419, filed Apr. 19, 2019, 28 Pages. |
Notice of Allowance dated Oct. 22, 2021 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 8 pages. |
Notice of Allowance dated Aug. 23, 2021 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 12 pages. |
Notice of Allowance dated Dec. 23, 2020 for U.S. Appl. No. 15/659,072, filed Jul. 25, 2017, 26 Pages. |
Notice of Allowance dated Jun. 28, 2021 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 18 pages. |
Notice of Allowance dated Jul. 31, 2019 for U.S. Appl. No. 16/257,979, filed Jan. 25, 2019, 22 Pages. |
Office Action for European Application No. 17835112.8, dated Feb. 11, 2022, 11 Pages. |
Office Action for European Patent Application No. 19743717.1, dated Apr. 11, 2022, 10 pages. |
Partial Supplementary European Search Report for European Application No. 18879156.0, dated Dec. 7, 2020, 9 pages. |
Picard R.W., et al., “Affective Wearables,” Proceedings of the IEEE 1st International Symposium on Wearable Computers, ISWC, Cambridge, MA, USA, Oct. 13-14, 1997, pp. 90-97. |
Preinterview First Office Action dated Jun. 24, 2020 for U.S. Appl. No. 16/785,680, filed Feb. 10, 2020, 90 Pages. |
Rekimoto J., “GestureWrist and GesturePad: Unobtrusive Wearable Interaction Devices,” ISWC Proceedings of the 5th IEEE International Symposium on Wearable Computers, 2001, 7 pages. |
Saponas T.S., et al., “Demonstrating the Feasibility of Using Forearm Electromyography for Muscle-Computer Interfaces,” CHI Proceedings, Physiological Sensing for Input, Apr. 5-10, 2008, pp. 515-524. |
Saponas T.S., et al., “Enabling Always-Available Input with Muscle-Computer Interfaces,” Conference: Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology, Oct. 7, 2009, pp. 167-176. |
Saponas T.S., et al., “Making Muscle-Computer Interfaces More Practical,” CHI, Atlanta, Georgia, USA, Apr. 10-15, 2010, 4 pages. |
Sartori M., et al., “Neural Data-Driven Musculoskeletal Modeling for Personalized Neurorehabilitation Technologies,” IEEE Transactions on Biomedical Engineering, May 5, 2016, vol. 63 (5), pp. 879-893. |
Sato M., et al., “Touche: Enhancing Touch Interaction on Humans, Screens, Liquids, and Everyday Objects,” CHI, Austin, Texas, May 5-10, 2012, 10 pages. |
Sauras-Perez P., et al., “A Voice and Pointing Gesture Interaction System for Supporting Human Spontaneous Decisions in Autonomous Cars,” Clemson University, All Dissertations, May 2017, 174 pages. |
Shen S., et al., “I Am a Smartwatch and I Can Track My User's Arm,” University of Illinois at Urbana-Champaign, MobiSys, Jun. 25-30, 2016, 12 pages. |
Son M., et al., “Evaluating the Utility of Two Gestural Discomfort Evaluation Methods,” PLoS One, Apr. 19, 2017, 21 pages. |
Strbac M., et al., “Microsoft Kinect-Based Artificial Perception System for Control of Functional Electrical Stimulation Assisted Grasping,” Hindawi Publishing Corporation, BioMed Research International [online], 2014, Article No. 740469, 13 pages, Retrieved from the Internet: URL: https://dx.doi.org/10.1155/2014/740469. |
Torres T., “Myo Gesture Control Armband,” PCMag, Jun. 8, 2015, 9 pages, Retrieved from the Internet: URL: https://www.pcmag.com/article2/0,2817,2485462,00.asp. |
Ueno A., et al., “A Capacitive Sensor System for Measuring Laplacian Electromyogram through Cloth: A Pilot Study,” Proceedings of the 29th Annual International Conference of the IEEE EMBs, Cite Internationale, Lyon, France, Aug. 23-26, 2007, pp. 5731-5734. |
Ueno A., et al., “Feasibility of Capacitive Sensing of Surface Electromyographic Potential through Cloth,” Sensors and Materials, 2012, vol. 24 (6), pp. 335-346. |
Valero-Cuevas F.J., et al., “Computational Models for Neuromuscular Function,” IEEE Reviews in Biomedical Engineering, 2009, vol. 2, NIH Public Access Author Manuscript [online], Jun. 16, 2011 [Retrieved on Jul. 29, 2019], 52 pages, Retrieved from the Internet: URL: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3116649/. |
Wittevrongel B., et al., “Spatiotemporal Beamforming: A Transparent and Unified Decoding Approach to Synchronous Visual Brain-Computer Interfacing,” Frontiers in Neuroscience, Nov. 15, 2017, vol. 11, Article No. 630, 13 Pages. |
Wodzinski M., et al., “Sequential Classification of Palm Gestures Based on A* Algorithm and MLP Neural Network for Quadrocopter Control,” Metrology and Measurement Systems, 2017, vol. 24 (2), pp. 265-276. |
Xiong A., et al., “A Novel HCI based on EMG and IMU,” Proceedings of the 2011 IEEE International Conference on Robotics and Biomimetics, Phuket, Thailand, Dec. 7-11, 2011, pp. 2653-2657. |
Xu Z., et al., “Hand Gesture Recognition and Virtual Game Control Based on 3D Accelerometer and EMG Sensors,” Proceedings of the 14th International Conference on Intelligent User Interfaces, Sanibel Island, Florida, Feb. 8-11, 2009, pp. 401-406. |
Xue Y., et al., “Multiple Sensors Based Hand Motion Recognition Using Adaptive Directed Acyclic Graph,” Applied Sciences, MDPI, 2017, vol. 7 (358), pp. 1-14. |
Yang Z., et al., “Surface EMG Based Handgrip Force Predictions Using Gene Expression Programming,” Neurocomputing, 2016, vol. 207, pp. 568-579. |
Zacharaki E.I., et al., “Spike Pattern Recognition by Supervised Classification in Low Dimensional Embedding Space,” Brain Informatics, 2016, vol. 3, pp. 73-83. |
Zhang X., et al., “A Framework for Hand Gesture Recognition Based on Accelerometer and EMG Sensors,” IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, Nov. 2011, vol. 41 (6), pp. 1064-1076. |
Tibold R., et al., “Prediction of Muscle Activity during Loaded Movements of The Upper Limb,” Journal of NeuroEngineering and Rehabilitation, 2015, vol. 12, No. 6, DOI: https://doi.org/10.1186/1743-0003-12-6, 12 pages. |
Amitai Y., “P-27: A Two-Dimensional Aperture Expander for Ultra-Compact, High-Performance Head-Worn Displays,” SID Symposium Digest of Technical Papers, 2005, vol. 36 (1), pp. 360-363. |
Ayras P., et al., “Exit Pupil Expander With a Large Field of View Based on Diffractive Optics,” Journal of the SID, 2009, vol. 17 (8), pp. 659-664. |
Bailey et al., “Wearable Muscle Interface Systems, Devices and Methods That Interact With Content Displayed on an Electronic Display,” Office Action dated Mar. 31, 2015, for U.S. Appl. No. 14/155,107, 17 pages. |
Bailey et al., “Muscle Interface Device and Method for Interacting With Content Displayed on Wearable Head Mounted Displays,” Amendment filed Aug. 25, 2015, for U.S. Appl. No. 14/155,087, 10 pages. |
Bailey et al., “Muscle Interface Device and Method for Interacting With Content Displayed on Wearable Head Mounted Displays,” Amendment filed Aug. 9, 2016, for U.S. Appl. No. 14/155,087, 8 pages. |
Bailey et al., “Muscle Interface Device and Method for Interacting With Content Displayed on Wearable Head Mounted Displays,” Amendment filed May 17, 2016, for U.S. Appl. No. 14/155,087, 13 pages. |
Bailey et al., “Muscle Interface Device and Method for Interacting With Content Displayed on Wearable Head Mounted Displays,” Office Action dated Feb. 17, 2016, for U.S. Appl. No. 14/155,087, 16 pages. |
Bailey et al., “Muscle Interface Device and Method for Interacting With Content Displayed on Wearable Head Mounted Displays,” Office Action dated Jul. 20, 2015, for U.S. Appl. No. 14/155,087, 14 pages. |
Bailey et al., “Muscle Interface Device and Method for Interacting With Content Displayed on Wearable Head Mounted Displays,” Office Action dated Jul. 8, 2016, for U.S. Appl. No. 14/155,087, 16 pages. |
Bailey et al., “Muscle Interface Device and Method for Interacting With Content Displayed on Wearable Head Mounted Displays,” Office Action dated Mar. 31, 2015, for U.S. Appl. No. 14/155,087, 15 pages. |
Bailey et al., “Muscle Interface Device and Method for Interacting With Content Displayed on Wearable Head Mounted Displays,” Preliminary Amendment filed Jan. 28, 2014, for U.S. Appl. No. 14/155,087, 8 pages. |
Bailey et al., “Wearable Muscle Interface Systems, Devices and Methods That Interact With Content Displayed on an Electronic Display,” Amendment filed Aug. 9, 2016, for U.S. Appl. No. 14/155,107, 8 pages. |
Bailey et al., “Wearable Muscle Interface Systems, Devices and Methods That Interact With Content Displayed on an Electronic Display,” Amendment filed May 11, 2016, for U.S. Appl. No. 14/155,107, 15 pages. |
Bailey et al., “Wearable Muscle Interface Systems, Devices and Methods That Interact With Content Displayed on an Electronic Display,” Office Action dated Feb. 11, 2016, for U.S. Appl. No. 14/155,107, 20 pages. |
Bailey et al., “Wearable Muscle Interface Systems, Devices and Methods That Interact With Content Displayed on an Electronic Display,” Office Action dated Jul. 16, 2015, for U.S. Appl. No. 14/155,107, 20 pages. |
Bailey et al., “Wearable Muscle Interface Systems, Devices and Methods That Interact With Content Displayed on an Electronic Display,” Office Action dated Jul. 8, 2016, for U.S. Appl. No. 14/155,107, 21 pages. |
Chellappan K.V., et al., “Laser-Based Displays: A Review,” Applied Optics, Sep. 1, 2010, vol. 49 (25), pp. F79-F98. |
Co-Pending U.S. Appl. No. 16/430,299, filed Jun. 3, 2019, 42 Pages. |
Cui L., et al., “Diffraction From Angular Multiplexing Slanted Volume Hologram Gratings,” Optik, 2005, vol. 116, pp. 118-122. |
Curatu C., et al., “Dual Purpose Lens for an Eye-Tracked Projection Head-Mounted Display,” International Optical Design Conference SPIE-OSA, 2006, vol. 6342, pp. 63420X-1-63420X-7. |
Curatu C., et al., “Projection-Based Head-Mounted Display With Eye-Tracking Capabilities,” Proceedings of SPIE, 2005, vol. 5875, pp. 58750J-1-58750J-9. |
Essex D., “Tutorial on Optomechanical Beam Steering Mechanisms,” OPTI 521 Tutorial, College of Optical Sciences, University of Arizona, 2006, 8 pages. |
Farina D., et al., “The Extraction of Neural Information from the Surface EMG for the Control of Upper-Limb Prostheses: Emerging Avenues and Challenges,” IEEE Transactions on Neural Systems andRehabilitation Engineering, vol. 22, No. 4, Jul. 1, 2014, pp. 797-809. |
Fernandez E., et al., “Optimization of a Thick Polyvinyl Alcohol-Acrylamide Photopolymer for Data Storage Using a Combination of Angular and Peristrophic Holographic Multiplexing,” Applied Optics, Oct. 10, 2009, vol. 45 (29), pp. 7661-7666. |
Final Office Action dated Jan. 3, 2019 for U.S. Appl. No. 14/465,194, filed Aug. 21, 2014, 61 Pages. |
Final Office Action dated Jan. 10, 2018 for U.S. Appl. No. 14/465,194, filed Aug. 21, 2014, 50 Pages. |
Final Office Action dated Nov. 18, 2020 for U.S. Appl. No. 14/461,044, filed Aug. 15, 2014, 14 Pages. |
Final Office Action dated Oct. 21, 2021 for U.S. Appl. No. 16/899,843, filed Jun. 12, 2020, 29 Pages. |
Final Office Action dated Jul. 23, 2021 for U.S. Appl. No. 14/461,044, filed Aug. 15, 2014, 15 Pages. |
Final Office Action received for U.S. Appl. No. 14/155,087 dated Dec. 16, 2016, 32 pages. |
Final Office Action received for U.S. Appl. No. 14/155,087 dated Jul. 20, 2015, 27 pages. |
Final Office Action received for U.S. Appl. No. 14/155,087 dated Jul. 8, 2016, 27 pages. |
Final Office Action received for U.S. Appl. No. 14/155,087 dated Nov. 27, 2017, 40 pages. |
Final Office Action received for U.S. Appl. No. 14/155,107 dated Dec. 19, 2016, 35 pages. |
Final Office Action received for U.S. Appl. No. 14/155,107 dated Jan. 17, 2019, 46 pages. |
Final Office Action received for U.S. Appl. No. 14/155,107 dated Jul. 16, 2015, 28 pages. |
Final Office Action received for U.S. Appl. No. 14/155,107 dated Jul. 8, 2016, 31 pages. |
Final Office Action received for U.S. Appl. No. 14/155,107 dated Nov. 27, 2017, 44 pages. |
First Office Action dated Nov. 25, 2020, for Canadian Application No. 2921954, filed Aug. 21, 2014, 4 pages. |
Hainich R.R., et al., “Chapter 10: Near-Eye Displays,” Displays: Fundamentals & Applications, AK Peters/CRC Press, 2011, 65 pages. |
Hornstein S., et al., “Maradin's Micro-Mirror—System Level Synchronization Notes,” SID Digest, 2012, pp. 981-984. |
“IEEE 100 The Authoritative Dictionary of IEEE Standards Terms,” Seventh Edition, Standards Information Network IEEE Press, Dec. 2000, 3 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2016/018293, dated Jun. 8, 2016, 17 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2016/018298, dated Jun. 8, 2016, 14 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2016/018299, dated Jun. 8, 2016, 12 Pages. |
International Search Report and Written Opinion for International Application No. PCT/US2016/067246, dated Apr. 25, 2017, 10 Pages. |
Itoh Y., et al., “Interaction-Free Calibration for Optical See-Through Head-Mounted Displays based on 3D Eye Localization,” IEEE Symposium on 3D User Interfaces (3DUI), 2014, pp. 75-82. |
Janssen C., “Radio Frequency (RF),” 2013, [Retrieved on Jul. 12, 2017], 2 pages, Retrieved from the Internet: URL: https://web.archive.org/web/20130726153946/https://www.techopedia.com/definition/5083/radio-frequency-rf. |
Kessler D., “Optics of Near to Eye Displays (NEDs),” Presentation—Oasis, Tel Aviv, Feb. 19, 2013, 37 pages. |
Kress B.C., et al., “Diffractive and Holographic Optics as Optical Combiners in Head Mounted Displays,” UbiComp, Zurich, Switzerland, Sep. 8-12, 2013, pp. 1479-1482. |
Kress B., et al., “A Review of Head-Mounted Displays (HMD) Technologies and Applications for Consumer Electronics,” Proceedings of SPIE, 2013, vol. 8720, pp. 87200A-1-87200A-13. |
Kress B., “Optical Architectures for See-Through Wearable Displays,” Presentation, Bay Area SID Seminar, Apr. 30, 2014, 156 pages. |
Lake et al., “Method and Apparatus for Analyzing Capacitive EMG and IMU Sensor Signals for Gesture Control,” Amendment filed Aug. 21, 2015, for U.S. Appl. No. 14/186,878, 13 pages. |
Lake et al., “Method and Apparatus for Analyzing Capacitive EMG and IMU Sensor Signals for Gesture Control,” Office Action dated Jun. 17, 2015, for U.S. Appl. No. 14/186,878, 13 pages. |
Lake et al., “Method and Apparatus for Analyzing Capacitive EMG and IMU Sensor Signals for Gesture Control,” Preliminary Amendment filed May 9, 2014, for U.S. Appl. No. 14/186,878, 9 pages. |
Lake et al., “Method and Apparatus for Analyzing Capacitive EMG and IMU Sensor Signals for Gesture Control,” U.S. Appl. No. 14/186,878, filed Feb. 21, 2014, 29 pages. |
Lake et al., “Methods and Devices for Combining Muscle Activity Sensor Signals and Inertial Sensor Signals for Gesture-Based Control,” Amendment filed Jan. 8, 2016, for U.S. Appl. No. 14/186,889, 16 pages. |
Lake et al., “Methods and Devices for Combining Muscle Activity Sensor Signals and Inertial Sensor Signals for Gesture-Based Control,” Amendment filed Jul. 13, 2016, for U.S. Appl. No. 14/186,889, 12 pages. |
Lake et al., “Methods and Devices for Combining Muscle Activity Sensor Signals and Inertial Sensor Signals for Gesture-Based Control,” Office Action dated Jun. 16, 2016, for U.S. Appl. No. 14/186,889, 13 pages. |
Lake et al., “Methods and Devices for Combining Muscle Activity Sensor Signals and Inertial Sensor Signals for Gesture-Based Control,” Office Action dated Nov. 5, 2015, for U.S. Appl. No. 14/186,889, 11 pages. |
Lake et al., “Methods and Devices That Combine Muscle Activity Sensor Signals and Inertial Sensor Signals for Gesture-Based Control,” U.S. Appl. No. 14/186,889, filed Feb. 21, 2014, 58 pages. |
Levola T., “7.1: Invited Paper: Novel Diffractive Optical Components for Near to Eye Displays,” SID Symposium Digest of Technical Papers, 2006, vol. 37 (1), pp. 64-67. |
Liao C.D., et al., “The Evolution of MEMS Displays,” IEEE Transactions on Industrial Electronics, Apr. 2009, vol. 56 (4), pp. 1057-1065. |
Lippert T.M., “Chapter 6: Display Devices: RSD™ (Retinal Scanning Display),” The Avionics Handbook, CRC Press, 2001, 8 pages. |
Majaranta P., et al., “Chapter 3: Eye Tracking and Eye-Based Human-Computer Interaction,” Advances in Physiological Computing, Springer-Verlag London, 2014, pp. 39-65. |
Merriam-Webster, “Radio Frequencies,” download date Jul. 12, 2017, 2 pages, Retrieved from the Internet: URL: https://www.merriam-webster.com/table/collegiate/radiofre.htm. |
Morun C., et al., “Systems, Articles, and Methods for Capacitive Electromyography Sensors,” U.S. Appl. No. 16/437,351, filed Jun. 11, 2019, 51 pages. |
Non-Final Office Action dated Mar. 1, 2018 for U.S. Appl. No. 14/553,657, filed Nov. 25, 2014, 29 pages. |
Non-Final Office Action dated May 2, 2018 for U.S. Appl. No. 15/799,628, filed Oct. 31, 2017, 25 pages. |
Non-Final Office Action dated Oct. 5, 2022 for U.S. Appl. No. 17/576,815, filed Jan. 14, 2022, 14 pages. |
Non-Final Office Action dated Nov. 6, 2018 for U.S. Appl. No. 16/057,573, filed Aug. 7, 2018, 14 pages. |
Non-Final Office Action dated May 7, 2021 for U.S. Appl. No. 16/899,843, filed Jun. 12, 2020, 24 pages. |
Non-Final Office Action dated Oct. 7, 2022 for U.S. Appl. No. 17/141,646, filed Jan. 5, 2021, 6 pages. |
Non-Final Office Action dated Sep. 11, 2019 for U.S. Appl. No. 14/465,194, filed Aug. 21, 2014, 72 pages. |
Non-Final Office Action dated May 12, 2022 for U.S. Appl. No. 16/899,843, filed Jun. 12, 2020, 34 pages. |
Non-Final Office Action dated Sep. 14, 2017 for U.S. Appl. No. 14/539,773, filed Nov. 12, 2014, 28 pages. |
Non-Final Office Action dated Aug. 15, 2018 for U.S. Appl. No. 14/465,194, filed Aug. 21, 2014, 64 pages. |
Non-Final Office Action dated Jun. 15, 2020 for U.S. Appl. No. 16/292,609, filed Mar. 5, 2019, 26 pages. |
Non-Final Office Action dated Aug. 17, 2017 for U.S. Appl. No. 14/465,194, filed Aug. 21, 2014, 81 pages. |
Non-Final Office Action dated Dec. 17, 2018 for U.S. Appl. No. 16/137,960, filed Sep. 21, 2018, 10 pages. |
Non-Final Office Action dated Jan. 18, 2018 for U.S. Appl. No. 15/799,621, filed Oct. 31, 2017, 10 pages. |
Non-Final Office Action dated Jun. 22, 2017 for U.S. Appl. No. 14/461,044, filed Aug. 15, 2014, 21 pages. |
Non-Final Office Action dated Feb. 25, 2021 for U.S. Appl. No. 14/461,044, filed Aug. 15, 2014, 17 pages. |
Non-Final Office Action dated Aug. 28, 2018 for U.S. Appl. No. 16/023,276, filed Jun. 29, 2018, 10 pages. |
Non-Final Office Action dated Aug. 28, 2018 for U.S. Appl. No. 16/023,300, filed Jun. 29, 2018, 11 pages. |
Non-Final Office Action dated Jun. 28, 2021 for U.S. Appl. No. 16/550,905, filed Aug. 26, 2019, 5 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/155,087 dated Aug. 16, 2016, 28 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/155,087 dated Aug. 7, 2017, 28 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/155,087 dated Feb. 17, 2016, 26 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/155,087 dated Mar. 31, 2015, 22 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/155,107 dated Aug. 17, 2016, 37 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/155,107 dated Aug. 7, 2017, 34 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/155,107 dated Feb. 11, 2016, 42 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/155,107 dated Jul. 13, 2018, 45 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/155,107 dated Mar. 31, 2015, 26 pages. |
Notice of Allowance dated May 1, 2019 for U.S. Appl. No. 16/137,960, filed Sep. 21, 2018, 14 pages. |
Notice of Allowance dated Mar. 5, 2019 for U.S. Appl. No. 16/057,573, filed Aug. 7, 2018, 31 pages. |
Notice of Allowance dated Feb. 8, 2019 for U.S. Appl. No. 16/023,276, filed Jun. 29, 2018, 15 pages. |
Notice of Allowance dated Mar. 11, 2020 for U.S. Appl. No. 14/465,194, filed Aug. 21, 2014, 29 pages. |
Notice of Allowance dated Jun. 15, 2018 for U.S. Appl. No. 15/799,621, filed Oct. 31, 2017, 27 pages. |
Notice of Allowance dated Jul. 18, 2022 for U.S. Appl. No. 16/550,905, filed Aug. 26, 2019, 7 pages. |
Notice of Allowance dated Apr. 20, 2022 for U.S. Appl. No. 14/461,044, filed Aug. 15, 2014, 8 pages. |
Notice of Allowance dated Sep. 24, 2020 for U.S. Appl. No. 16/292,609, filed Mar. 5, 2019, 20 pages. |
Notice of Allowance dated Mar. 25, 2022 for U.S. Appl. No. 16/550,905, filed Aug. 26, 2019, 7 pages. |
Notice of Allowance dated Sep. 25, 2018 for U.S. Appl. No. 14/553,657, filed Nov. 25, 2014, 25 pages. |
Notice of Allowance dated Jan. 28, 2019 for U.S. Appl. No. 16/023,300, filed Jun. 29, 2018, 31 pages. |
Notice of Allowance dated Nov. 3, 2022 for U.S. Appl. No. 16/899,843, filed Jun. 12, 2020, 10 pages. |
Notice of Allowance dated Mar. 30, 2018 for U.S. Appl. No. 14/539,773, filed Nov. 12, 2014, 17 pages. |
Notice of Allowance dated Nov. 30, 2018 for U.S. Appl. No. 15/799,628, filed Oct. 31, 2017, 19 pages. |
Notice of Allowance received for U.S. Appl. No. 14/155,107 dated Aug. 30, 2019, 16 pages. |
Office Action for European Application No. 19806723.3, dated Oct. 27, 2022, 8 pages. |
Office Action dated Sep. 28, 2022 for Chinese Application No. 201780059093.7, filed Jul. 25, 2017, 16 pages. |
Restriction Requirement dated Aug. 8, 2017 for U.S. Appl. No. 14/553,657, filed Nov. 25, 2014, 7 pages. |
Schowengerdt B.T., et al., “Stereoscopic Retinal Scanning Laser Display With Integrated Focus Cues for Ocular Accommodation,” Proceedings of SPIE-IS&T Electronic Imaging, 2004, vol. 5291, pp. 366-376. |
Silverman N.L., et al., “58.5L: Late-News Paper: Engineering a Retinal Scanning Laser Display with Integrated Accommodative Depth Cues,” SID 03 Digest, 2003, pp. 1538-1541. |
Takatsuka Y., et al., “Retinal Projection Display Using Diffractive Optical Element,” Tenth International Conference on Intelligent Information Hiding and Multimedia Signal Processing, IEEE, 2014, pp. 403-406. |
Urey H., “Diffractive Exit-Pupil Expander for Display Applications,” Applied Optics, Nov. 10, 2001, vol. 40 (32), pp. 5840-5851. |
Urey H., et al., “Optical Performance Requirements for MEMS-Scanner Based Microdisplays,” Conferences on MOEMS and Miniaturized Systems, SPIE, 2000, vol. 4178, pp. 176-185. |
Viirre E., et al., “The Virtual Retinal Display: A New Technology for Virtual Reality and Augmented Vision in Medicine,” Proceedings of Medicine Meets Virtual Reality, IOS Press and Ohmsha, 1998, pp. 252-257. |
Wijk U., et al., “Forearm Amputee's Views of Prosthesis Use and Sensory Feedback,” Journal of Hand Therapy, Jul. 2015, vol. 28 (3), pp. 269-278. |
Written Opinion for International Application No. PCT/US2014/057029, dated Feb. 24, 2015, 9 pages. |
Office Action dated Feb. 7, 2023 for European Application No. 19810524.9, filed May 28, 2019, 7 pages. |
Notice of Allowance dated Dec. 14, 2022 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 10 pages. |
Office Action dated Jan. 20, 2023 for Chinese Application No. 201780059093.7, filed Jul. 25, 2017, 16 pages. |
European Search Report for European Patent Application No. 23186202.0, dated Aug. 2, 2023, 7 pages. |
Khezri M., et al., “A Novel Approach to Recognize Hand Movements Via sEMG Patterns,” 2007 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Aug. 22, 2007, pp. 4907-4910. |
Naik G.R., et al., “SEMG for Identifying Hand Gestures using ICA,” In Proceedings of the 2nd International Workshop on Biosignal Processing and Classification, Jan. 31, 2006, pp. 61-67. |
Office Action dated Sep. 14, 2023 for Chinese Application No. 201980035465.1, filed May 28, 2019, 9 pages. |
Office Action dated Aug. 15, 2023 for Japanese Patent Application No. 2021-507757, filed on Feb. 15, 2021, 9 pages. |
Office Action dated Aug. 16, 2023 for Chinese Application No. 201880082887.X, filed Oct. 19, 2018, 17 pages. |
Office Action dated Aug. 16, 2023 for Chinese Application No. 202080062417.4, filed Sep. 3, 2020, 11 pages. |
Office Action dated Aug. 21, 2023 for Chinese Patent Application No. 201980062920.7, filed Sep. 20, 2019, 21 pages. |
Office Action dated Jun. 22, 2023 for European Patent Application No. 19863248.1, filed on Sep. 20, 2019, 5 pages. |
Office Action dated Aug. 29, 2023 for Japanese Application No. 2021-506985, filed Feb. 9, 2021, 6 pages. |
Office Action dated Aug. 31, 2023 for Chinese Application No. 201980045972.3, filed May 7, 2021, 20 pages. |
Valero-Cuevas F.J., et al., “Computational Models for Neuromuscular Function,” IEEE Reviews in Biomedical Engineering, Dec. 31, 2009, vol. 2, pp. 110-135. |
Number | Date | Country |
---|---|---|
62826574 | Mar 2019 | US |