Devices, systems, and methods for controlling computing devices via neuromuscular signals of users

Information

  • Patent Grant
  • Patent Number
    11,481,031
  • Date Filed
    Wednesday, September 8, 2021
  • Date Issued
    Tuesday, October 25, 2022
Abstract
The disclosed human computer interface (HCI) system may include (1) at least one processor, (2) a plurality of sensors that detect one or more neuromuscular signals from a forearm or wrist of a user, and (3) memory that stores (A) one or more trained inferential models that determine an amount of force associated with the one or more neuromuscular signals and (B) computer-executable instructions that, when executed by the at least one processor, cause the at least one processor to (I) identify the amount of force determined by the one or more trained inferential models, (II) determine that the amount of force satisfies a threshold force value, and in accordance with the determination that the amount of force satisfies the threshold force value, (III) generate a first input command for the HCI system. Various other devices, systems, and methods are also disclosed.
Description
BRIEF DESCRIPTION OF DRAWINGS AND APPENDIX

The accompanying Drawings illustrate a number of exemplary embodiments and are a part of the specification. Together with the following description, the Drawings demonstrate and explain various principles of the instant disclosure.



FIG. 1 is a block diagram of an exemplary wearable device for controlling computing devices via neuromuscular signals of users.



FIG. 2 is an illustration of an exemplary system for controlling computing devices via neuromuscular signals of users.



FIG. 3 is an illustration of a user wearing and operating an exemplary wearable device for controlling computing devices via neuromuscular signals.



FIG. 4A is an illustration of an exemplary wearable device for controlling computing devices via neuromuscular signals of users.



FIG. 4B is an illustration of an exemplary wearable device for controlling computing devices via neuromuscular signals of users.



FIG. 5A is an illustration of an exemplary wearable device for controlling computing devices via neuromuscular signals of users.



FIG. 5B is an illustration of an exemplary wearable device for controlling computing devices via neuromuscular signals of users.



FIG. 6 is an illustration of an exemplary wearable device for controlling computing devices via neuromuscular signals of users.



FIG. 7 is an illustration of an exemplary state of a body part of a user donning a wearable device for controlling computing devices via neuromuscular signals.



FIG. 8A is an illustration of an exemplary state of a body part of a user donning a wearable device for controlling computing devices via neuromuscular signals.



FIG. 8B is an illustration of an exemplary state of a body part of a user donning a wearable device for controlling computing devices via neuromuscular signals.



FIG. 8C is an illustration of an exemplary state of a body part of a user donning a wearable device for controlling computing devices via neuromuscular signals.



FIG. 9A is an illustration of an exemplary signal representative of a state pattern corresponding to a body part of a user donning a wearable device for controlling computing devices via neuromuscular signals.



FIG. 9B is an illustration of an exemplary signal representative of a state pattern corresponding to a body part of a user donning a wearable device for controlling computing devices via neuromuscular signals.



FIG. 10A is an illustration of an exemplary state of a body part of a user donning a wearable device for controlling computing devices via neuromuscular signals.



FIG. 10B is an illustration of an exemplary state of a body part of a user donning a wearable device for controlling computing devices via neuromuscular signals.



FIG. 10C is an illustration of an exemplary state of a body part of a user donning a wearable device for controlling computing devices via neuromuscular signals.



FIG. 10D is an illustration of an exemplary state of a body part of a user donning a wearable device for controlling computing devices via neuromuscular signals.



FIG. 10E is an illustration of an exemplary action that is performed by a computing device in response to the state of the user's body part illustrated in FIG. 10A.



FIG. 10F is an illustration of an exemplary action that is performed by a computing device in response to the state of the user's body part illustrated in FIG. 10B.



FIG. 10G is an illustration of an exemplary action that is performed by a computing device in response to the state of the user's body part illustrated in FIG. 10C.



FIG. 10H is an illustration of an exemplary action that is performed by a computing device in response to the state of the user's body part illustrated in FIG. 10D.



FIG. 11A is an illustration of an exemplary radial menu capable of being controlled by a wearable device via neuromuscular signals of users.



FIG. 11B is an illustration of an exemplary radial menu capable of being controlled by a wearable device via neuromuscular signals of users.



FIG. 11C is an illustration of an exemplary sequential menu capable of being controlled by a wearable device via neuromuscular signals of users.



FIG. 11D is an illustration of an exemplary sequential menu capable of being controlled by a wearable device via neuromuscular signals of users.



FIG. 11E is an illustration of an exemplary sequential menu capable of being controlled by a wearable device via neuromuscular signals of users.



FIG. 12 is an illustration of an exemplary menu bar icon indicating whether a wearable device donned by a user is connected to a computing device.



FIG. 13 is an exemplary popup menu display that enables a user to activate and/or deactivate certain mappings between possible states of the user's body parts and actions capable of being performed by a computing device.



FIG. 14 is an exemplary popup menu display that enables a user to activate and/or deactivate certain mappings between possible states of the user's body parts and actions capable of being performed by a computing device.



FIG. 15 is an exemplary popup menu display that enables a user to activate and/or deactivate certain mappings between possible states of the user's body parts and actions capable of being performed by a computing device.



FIG. 16 is an exemplary popup menu display that enables a user to activate and/or deactivate certain mappings between possible states of the user's body parts and actions capable of being performed by a computing device.



FIG. 17 is an exemplary popup menu display that enables a user to activate and/or deactivate certain mappings between possible states of the user's body parts and actions capable of being performed by a computing device.



FIG. 18 is a flow diagram of an exemplary method for controlling a graphical user interface of a computing device via a wearable device donned by a user.



FIG. 19 is an illustration of an exemplary highlighted link activated in a web page in connection with a link-activate setting selected via the popup menu display illustrated in FIG. 17.



FIG. 20 is an illustration of an exemplary transition between mappings of possible states of the user's body parts and actions capable of being performed by a computing device.



FIG. 21A is an illustration of an exemplary wearable device for controlling computing devices via neuromuscular signals of users.



FIG. 21B is an illustration of an exemplary dongle that is connected to a computing device and facilitates interfacing a wearable device with the computing device.



FIG. 22 is a flowchart of an exemplary method for controlling computing devices via neuromuscular signals of users.



FIG. 23 is an illustration of an exemplary drawing application that includes a virtual drawing instrument whose width is capable of being controlled and/or modified in accordance with certain states of the user's body parts.



FIG. 24 is an illustration of an exemplary multi-state user interface that enables a user to select and/or define certain mappings between possible states of the user's body parts and actions capable of being performed by a computing device.



FIG. 25 is an illustration of exemplary augmented-reality glasses that may be used in connection with embodiments of this disclosure.



FIG. 26 is an illustration of an exemplary virtual-reality headset that may be used in connection with embodiments of this disclosure.


While the exemplary embodiments described herein are susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. However, the exemplary embodiments described herein are not intended to be limited to the particular forms disclosed. Rather, the instant disclosure covers all modifications, combinations, equivalents, and alternatives falling within this disclosure.







DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS

The present disclosure is generally directed to devices, systems, and methods for controlling computing devices via neuromuscular signals of users. As will be explained in greater detail below, these devices, systems, and methods may provide numerous features and benefits.


Human computer interfaces (HCIs) often encompass and/or refer to the means and/or mechanisms with which humans communicate with, instruct, and/or control computers. Examples of such HCIs include, without limitation, mice, keyboards, touchscreens, touchpads, joysticks, styluses, buttons, handheld controllers, combinations or variations of one or more of the same, and/or any other suitable HCIs.


Some interactions between humans and computers may necessitate and/or call for the use and/or application of multiple HCIs simultaneously. In some examples, a user may switch back and forth between different HCIs to engage with interactive media presented and/or displayed on a computer. For example, a user may switch between using a mouse and using a keyboard multiple times during a single interactive media session. Moreover, as computing devices become more portable, the development of HCIs may become more complex due at least in part to design tradeoffs resulting from size constraints and/or mobility requirements of portable devices. Unfortunately, as portable computing devices become even more ubiquitous, traditional HCIs may become less desirable and/or convenient for users. The instant disclosure, therefore, identifies and addresses a need for additional devices, systems, and methods for controlling computing devices via neuromuscular signals of users.


As will be described in greater detail below, the various devices, systems, and methods described herein may involve the use of a wearable device capable of detecting and/or sensing neuromuscular signals traversing through a user's body. For example, a user may wear a smart wristband with multiple surface electromyography (EMG) sensors that detect and/or sense neuromuscular signals traversing the user's arm, wrist, and/or hand. In this example, the smart wristband may be communicatively coupled to a nearby computing device. In response to certain neuromuscular signals detected via the user's body, the smart wristband may direct the computing device to perform one or more actions that account for those neuromuscular signals.


Accordingly, the smart wristband may enable the user to engage with interactive media presented and/or displayed on the computing device in less restrictive ways than traditional HCIs. The smart wristband may be used to control certain elements of interactive media based at least in part on EMG signals that correlate to predefined states of one or more body parts of the user. The smart wristband may enable the user to direct the computing device to perform certain interactive tasks. Examples of such interactive tasks include, without limitation, map navigation, page browsing, gaming controls, flight controls, interactions with graphical objects presented on a display, cursor control, link and/or button selection, combinations of one or more of the same, and/or any other suitable interactive tasks.


In some implementations, a wearable device may facilitate web browsing based at least in part on configured and/or programmed controls or commands. Such controls and/or commands may include and/or involve scrolling up or down a webpage, moving a cursor across a webpage, and/or clicking on one or more webpage elements. In one example, the wearable device may enable users to control web browsing interactions, thereby emulating controls and/or commands provided by traditional HCIs. In another example, the wearable device may also facilitate and/or emulate flight controls, such as pitch, yaw, roll, and/or throttle. Additional examples of such controls and/or commands include, without limitation, activating, selecting, pitching, rotating, rolling, and/or dragging visual objects, navigating, combinations of one or more of the same, and/or any other suitable controls and/or commands.


In some implementations, a wearable device may be used to transition between different mappings of body part states and responsive actions. For example, the wearable device may detect and/or sense certain neuromuscular signals traversing a user's body. In this example, those neuromuscular signals may correspond to and/or represent a specific state of one or more of the user's body parts. As a result, the wearable device may be able to detect and/or sense one or more positions, movements, forces, contractions, poses, and/or gestures made by those body parts of the user. One mapping may cause the wearable device and/or the target computing device to perform a certain action in response to the detection of a specific state of those body parts. However, another mapping may cause the wearable device and/or the target computing device to perform a different action in response to the detection of the same state of those body parts. The wearable device may enable the user to transition between those mappings via neuromuscular signals.


In some implementations, one or more states of the user's body parts may correspond to and/or represent control actions used to interact with a radial menu presented on a display. For example, a first pose may cause the wearable device to direct a computing device to display a radial menu for selection by the user. In this example, a wrist movement (e.g., rotation) may cause the wearable device to direct the computing device to select an item or option available in the radial menu. Additionally or alternatively, a finger pinch pose may cause the wearable device to direct the computing device to click a selected menu item. Further, an open hand pose may cause the wearable device to direct the computing device to close the radial menu.


In some examples, the terms “wearable” and “wearable device” may refer to any type or form of computing device that is worn by a user of an artificial-reality system and/or visual display system as part of an article of clothing, an accessory, and/or an implant. In one example, a wearable device may include and/or represent a wristband secured to and/or worn by the wrist of a user. Additional examples of wearable devices include, without limitation, armbands, pendants, bracelets, rings, jewelry, anklebands, clothing, electronic textiles, shoes, clips, headsets, headbands, head-mounted displays, gloves, glasses, variations or combinations of one or more of the same, and/or any other suitable wearable devices.


The following will provide, with reference to FIGS. 1-6 and 21, detailed descriptions of various devices, systems, components, and/or implementations for controlling computing devices via neuromuscular signals of users. The discussion corresponding to FIGS. 7-19 and 23 will provide detailed descriptions of exemplary neuromuscular signals, exemplary states of body parts capable of being detected via neuromuscular signals, and/or exemplary actions performed in response to the detection of such body part states. The discussion corresponding to FIG. 20 will provide detailed descriptions of exemplary transitions between different mappings of body part states and responsive actions. Additionally, the discussion corresponding to FIG. 22 will provide detailed descriptions of an exemplary method for controlling computing devices via neuromuscular signals of users. Finally, the discussion corresponding to FIGS. 25 and 26 will provide detailed descriptions of exemplary types of artificial-reality devices and/or systems capable of being controlled by neuromuscular signals of users.



FIG. 1 illustrates an exemplary wearable device 102 capable of controlling computing devices via neuromuscular signals of users. As illustrated in FIG. 1, exemplary wearable device 102 may include and/or represent a set of sensors 104(1)-(N) that detect and/or sense neuromuscular signals traversing the body of a user. In some examples, exemplary wearable device 102 may also include and/or represent a processing device 106 communicatively coupled to sensors 104(1)-(N) and/or memory 108. In such examples, memory 108 may include and/or store one or more trained inferential models that determine amounts of force associated with the neuromuscular signals detected by sensors 104(1)-(N). Additionally or alternatively, memory 108 may include and/or store computer-executable instructions that, when executed by processing device 106, cause processing device 106 to (1) identify an amount of force associated with the neuromuscular signals as determined by the one or more trained inferential models, (2) determine that the amount of force satisfies a threshold force value, and/or in accordance with the determination that the amount of force satisfies the threshold force value, (3) generate a first input command for an HCI system (such as HCI system 200 in FIG. 2).
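By way of illustration only, the following Python sketch outlines the threshold logic described above: identify the force inferred by a trained model, check it against a threshold, and generate an input command when the threshold is satisfied. The names used here (e.g., infer_force, THRESHOLD_FORCE, and the command dictionary) are hypothetical and are not drawn from any particular implementation.

# Minimal sketch of the force-threshold flow described above (assumed names).

THRESHOLD_FORCE = 0.6  # assumed normalized force value in the range [0, 1]

def process_window(samples, model):
    """Identify inferred force and, if it satisfies the threshold, emit a command."""
    force = model.infer_force(samples)        # (1) identify the inferred force
    if force >= THRESHOLD_FORCE:              # (2) threshold check
        return {"type": "primary_input", "force": force}  # (3) input command
    return None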


In some examples, processing device 106 may determine, based at least in part on those neuromuscular signals, a state of at least one body part of the user. Additionally or alternatively, processing device 106 may generate one or more input commands for a separate computing system (not necessarily illustrated in FIG. 1). Such input commands may account for the state of the user's body part.


In some examples, sensors 104(1)-(N) may each constitute and/or represent any type or form of sensor capable of detecting and/or sensing neuromuscular signals via a user's body. In one example, sensors 104(1)-(N) may include and/or represent one or more neuromuscular sensors and/or EMG sensors arranged circumferentially around wearable device 102. Additional examples of sensors 104(1)-(N) include, without limitation, mechanomyography (MMG) sensors, sonomyography (SMG) sensors, combinations or variations of one or more of the same, and/or any other suitable sensors. Any suitable number and/or arrangement of sensors 104(1)-(N) may be applied to wearable device 102.


In some embodiments, sensors 104(1)-(N) may include one or more EMG sensors, MMG sensors, and/or SMG sensors as well as one or more auxiliary sensors that record auxiliary signals and/or information. Examples of such auxiliary sensors include, without limitation, inertial measurement unit (IMU) sensors, position-tracking sensors, microphones, imaging sensors (e.g., cameras), radiation-based sensors for use with radiation-generation devices (e.g., laser-scanning devices), heart-rate monitors, combinations or variations of one or more of the same, and/or any other suitable auxiliary sensors.


In some examples, sensors 104(1)-(N) may be communicatively coupled to one another and/or to processing device 106 by flexible electronics, connectors, and/or wiring. Additionally or alternatively, sensors 104(1)-(N) may be integrated with and/or into an elastic band of wearable device 102.


In some embodiments, the output of one or more of sensors 104(1)-(N) may be processed, amplified, rectified, and/or filtered via hardware signal processing circuitry. Additionally or alternatively, the output of one or more of sensors 104(1)-(N) may be processed, amplified, rectified, and/or filtered via signal processing software or firmware. Accordingly, the processing of neuromuscular signals may be performed in hardware, software, and/or firmware.
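By way of example and not limitation, the software-side portion of this conditioning could resemble the following Python sketch, which amplifies, band-pass filters, and rectifies one channel of raw sensor output using NumPy and SciPy. The sampling rate, gain, and filter band are assumptions chosen only for illustration.

import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000.0  # assumed sampling rate in Hz

def condition(raw, gain=1000.0):
    """Amplify, band-pass filter, and rectify a single channel of raw samples."""
    amplified = gain * np.asarray(raw, dtype=float)
    b, a = butter(4, [20.0, 450.0], btype="band", fs=FS)  # example 20-450 Hz band
    filtered = filtfilt(b, a, amplified)                  # zero-phase filtering
    return np.abs(filtered)                               # full-wave rectification

raw_window = np.random.randn(1000)   # e.g., one second of samples from one channel
conditioned = condition(raw_window)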


As illustrated in FIG. 1, exemplary wearable device 102 may also include one or more processors, such as processing device 106. In some examples, processing device 106 may include and/or represent any type or form of hardware-implemented processing device capable of interpreting and/or executing computer-readable instructions. In one example, processing device 106 may access and/or modify certain software modules to facilitate controlling computing devices via neuromuscular signals of users. Examples of processing device 106 include, without limitation, physical processors, central processing units (CPUs), microprocessors, microcontrollers, field-programmable gate arrays (FPGAs) that implement softcore processors, application-specific integrated circuits (ASICs), portions of one or more of the same, variations or combinations of one or more of the same, and/or any other suitable processing device.


As illustrated in FIG. 1, exemplary wearable device 102 may further include one or more memory devices, such as memory 108. Memory 108 generally represents any type or form of volatile or non-volatile storage device or medium capable of storing data and/or computer-readable instructions. In one example, memory 108 may store, load, and/or maintain one or more trained inferential models that perform certain tasks, classifications, and/or determinations in connection with controlling computing devices via neuromuscular signals. Examples of memory 108 include, without limitation, Random Access Memory (RAM), Read Only Memory (ROM), flash memory, Hard Disk Drives (HDDs), Solid-State Drives (SSDs), optical disk drives, caches, variations or combinations of one or more of the same, and/or any other suitable storage memory.


In some examples, wearable device 102 may include and/or incorporate a wearable band. For example, wearable device 102 may include and/or represent a strap and/or band designed and/or dimensioned to at least partially encompass the user's wrist and/or arm. The strap and/or band may include and/or contain a variety of different materials. Examples of such materials include, without limitation, cottons, polyesters, nylons, elastics, plastics, neoprene, rubbers, metals, woods, composites, combinations or variations of one or more of the same, and/or any other suitable materials. The strap and/or band may be defined and/or formed in a variety of shapes and/or sizes with the aim of securing wearable device 102 to the user's wrist and/or arm. In one example, the strap and/or band may include and/or represent one or more segments, links, and/or sections. Additionally or alternatively, the strap and/or band may be adjustable to provide a one-size-fits-most feature.


In some embodiments, wearable device 102 may include and/or incorporate one or more additional components that are not represented and/or illustrated in FIG. 1. For example, although not necessarily illustrated and/or labeled in this way in FIG. 1, wearable device 102 may also include and/or incorporate circuitry, transistors, resistors, capacitors, diodes, transceivers, sockets, wiring, and/or circuit boards, among other components.


In some examples, when wearable device 102 is worn by the user, sensors 104(1)-(N) may interface and/or make physical contact with the user's skin. In one example, wearable device 102 may be communicatively coupled to a computing system (such as a virtual reality headset, an augmented reality headset, a laptop, a desktop, a smart television, a monitor, etc.). In this example, the user may put and/or place his or her body in a certain state and/or condition to control and/or modify the presentation or performance of the computing system. As the user puts and/or places his or her body in that state and/or condition, the user's body may generate and/or produce neuromuscular signals representative, indicative, and/or suggestive of that state or condition.


In some examples, the neuromuscular signals may traverse and/or travel through the user's body. For example, the user may make a pose and/or gesture that generates neuromuscular signals that traverse down his or her arm toward the hand. In one example, one or more of sensors 104(1)-(N) may detect and/or sense the neuromuscular signals as they traverse down the arm toward the hand. In response to detecting and/or sensing those signals, one or more of sensors 104(1)-(N) may generate and/or produce data representative of those signals.


In some examples, those sensors may provide and/or deliver a version of the data representative of the detected neuromuscular signals to at least one processing device (e.g., processing device 106, a processor incorporated in the computing system to which wearable device 102 is communicatively coupled, and/or a processor incorporated in an intermediary communication link or dongle). This data may undergo certain processing and/or conversions prior to being provided and/or delivered to the processing device. Accordingly, the version of data provided and/or delivered to the processing device may be any derivation and/or processed representation of the output received from the sensors. Examples of this version of the data include, without limitation, raw data produced and/or output by the sensors, digital conversions and/or representations of analog signals output by the sensors, processed digital representations of signals output by the sensors, combinations or variations of one or more of the same, and/or any other suitable version of data representative of neuromuscular signals.


In this example, the processing device may analyze and/or evaluate the data representative of the neuromuscular signals to determine the state of one or more body parts of the user. For example, the processing device may implement a trained inferential model. The processing device may input and/or feed the data representative of the neuromuscular signals to the inferential model. From that data, the trained inferential model may then output and/or produce a classification that identifies and/or indicates the state of such body parts. Accordingly, the processing device may determine the state of such body parts based at least in part on the output of the inferential model.


Various states of the user's body parts may be discernible and/or detectable based at least in part on neuromuscular signals traversing the user's body. Examples of such body part states include, without limitation, relative positions of certain body parts, movements of certain body parts, forces applied and/or exerted by certain body parts, isometric contractions made by certain body parts, poses made by certain body parts, gestures made by certain body parts, activations of certain body parts (e.g., muscles), changes in activation of certain body parts, combinations of one or more of the same, and/or any other discernible or detectable states of such body parts.


In some examples, the processing device may be able to determine the amount of force produced and/or exerted by one or more body parts of the user based at least in part on the neuromuscular signals detected by sensors 104(1)-(N). For example, from the data representative of the detected neuromuscular signals, the trained inferential model may output and/or produce an indication or measurement that identifies and/or specifies the amount of force exerted by those body parts. In response to determining the state of those body parts and the amount of force produced by those body parts, the processing device may generate one or more input commands for the computing system. Such input commands may account for the state of the user's body parts and the amount of force produced and/or exerted by those body parts.


In some examples, the processing device may cause the computing system to which wearable device 102 is communicatively coupled to perform one or more actions mapped to the state of those body parts and/or the amount of force exerted by those body parts. For example, the processing device may direct the computing system to perform those actions by sending and/or providing those input commands to the computing system. In one example, the processing device may determine and/or identify one or more characteristics of those actions to be regulated in accordance with the amount of force produced by the user's body parts. In this example, the processing device may formulate the input command to account for the amount of force produced by the user's body parts such that the characteristics of those actions correspond to the amount of force produced by the user's body parts.


Various actions may be mapped to different states of the user's body parts. Examples of such actions include, without limitation, scrolling through a graphical user interface (GUI), selecting a visual element of a GUI, clicking on a visual element of a GUI, displaying a visual element in a GUI, drawing and/or painting a visual element on a GUI, moving a cursor displayed on a GUI, associating a cursor of the computing system with a visual element displayed in a GUI based at least in part on an updated position of the cursor relative to the visual element, providing a feedback indication (whether visual, auditory, and/or haptic) of an association made between a cursor of the computing system and a visual element displayed in a GUI, inputting data, modifying interface controls, navigating or scrolling a GUI, transitioning from one mapping to another, combinations or variations of one or more of the same, and/or any other suitable actions.


Similarly, various degrees of force may be mapped to and/or be commensurate with different characteristics of such actions. For example, one characteristic may include and/or represent the scrolling speed with which the GUI is scrolled. In one example, as the amount of force produced by the user's body parts increases, so too may the scrolling speed. Conversely, as the amount of force produced by the user's body parts decreases, so too may the scrolling speed.


As another example, one characteristic may include and/or represent the width of a virtual drawing instrument and/or a virtual paint brushstroke. In one example, as the amount of force produced by the user's body parts increases, so too may the width of the virtual drawing instrument and/or the virtual paint brushstroke. Conversely, as the amount of force produced by the user's body parts decreases, so too may the width of the virtual drawing instrument and/or the virtual paint brushstroke.
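A simple way to make such a characteristic commensurate with force is a linear mapping from a normalized force value to the characteristic's range, as in the following illustrative Python sketch. The force scale and the output ranges shown are assumptions.

def scale(force, lo, hi):
    """Linearly map a normalized force in [0, 1] to the range [lo, hi]."""
    force = min(max(force, 0.0), 1.0)
    return lo + (hi - lo) * force

scroll_speed = scale(0.8, lo=50.0, hi=600.0)   # e.g., pixels per second
brush_width = scale(0.3, lo=1.0, hi=24.0)      # e.g., pixels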


Various forms of feedback may be provided to the user as the computing system performs the actions mapped to the state of the user's body parts. For example, one feedback indication of an association made between the cursor of the computing system and a visual element of the GUI may involve and/or entail modifying one or more characteristics (e.g., color, size, transparency, shadow, font, shape, fill, emphasis, orientation, animation, line type, and/or line width) of the visual element of the GUI. Another exemplary feedback indication of an association made between the cursor of the computing system and a visual element of the GUI may involve and/or entail adding, to the GUI, at least one further visual element that represents the association.


Associations may be made between the cursor of the computing system and the visual element for a variety of reasons. For example, the processing device and/or the computing system may determine that an updated position of the cursor is within a certain distance of the visual element of the GUI. In one example, the processing device and/or the computing system may identify the position of the visual element within the GUI and/or the position(s) of one or more additional visual elements within the GUI. In this example, the processing device and/or the computing system may determine that the updated position of the cursor is closer to the position of the visual element than to the positions of the additional visual elements within the GUI. In response to determining that the updated position of the cursor is within the certain distance of the visual element, the processing device and/or the computing system may associate the cursor with the visual element (instead of, e.g., the additional visual elements).


As another example, the processing device and/or the computing system may determine the speed at which the cursor moved or is moving within the GUI to reach the updated position. The processing device and/or the computing system may then associate the cursor with the visual element based at least in part on the speed at which the cursor moved or is moving to reach the updated position.


In a further example, the processing device and/or the computing system may detect a direction in which the cursor moved or is moving within the GUI to reach the updated position. The processing device and/or the computing system may then associate the cursor with the visual element based at least in part on the direction in which the cursor moved or is moving to reach the updated position.
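One possible implementation of the distance-based association described above is sketched below in Python; the same structure could be extended to weight candidate elements by the cursor's speed or direction of travel. The element representation and the snap distance are hypothetical.

import math

SNAP_DISTANCE = 40.0  # assumed maximum association distance in pixels

def associate(cursor_xy, elements):
    """Return the visual element closest to the cursor, if within SNAP_DISTANCE."""
    best, best_dist = None, float("inf")
    for element in elements:                      # each element: {"x": ..., "y": ...}
        dist = math.hypot(element["x"] - cursor_xy[0], element["y"] - cursor_xy[1])
        if dist < best_dist:
            best, best_dist = element, dist
    return best if best_dist <= SNAP_DISTANCE else None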


In some examples, the processing device and/or the computing system may maintain one or more mappings between possible states of the body parts and responsive actions capable of being performed by the computing system. For example, the processing device and/or the computing system may maintain a first mapping between possible states of a body part and a first set of actions as well as a second mapping between possible states of the body part and a second set of actions. In one example, the processing device and/or the computing system may activate the first mapping and/or deactivate the second mapping such that one or more of the actions in the first set are performed in response to one or more detected states of the body part.


In some examples, the user may be able to switch between the mappings by changing the state of one or more body parts. For example, the user may make a pose and/or gesture with his or her hand. As the user does so, sensors 104(1)-(N) may detect and/or sense certain neuromuscular signals generated by the user's body in connection with the pose and/or gesture. In this example, the processing device and/or the computing system may determine the state of the user's body parts based at least in part on those neuromuscular signals.


In some examples, this state of the user's body parts may correspond and/or be mapped to a transition command and/or action that causes the processing device and/or the computing system to switch mappings. In such examples, in response to determining this state of the user's body parts, the processing device and/or the computing system may transition from one mapping to another mapping. For example, the processing device and/or the computing system may deactivate one mapping and activate another mapping. As a result of this mapping transition, the computing device may be configured and/or programmed to perform one or more actions assigned by the other mapping to the possible state of a body part in response to the subsequent detection of that body part state.


In some examples, the processing device and/or computing system may map any number of conditions to a single action. In these examples, to initiate performance of the action, the processing device and/or computing system may ensure and/or determine that all the conditions have been satisfied. For example, the processing device and/or computing system may map the rotation of the user's arm while making a fist pose to navigating a radial menu in a certain direction. In this example, the user may be able to navigate the radial menu in that direction by rotating his or her arm while making a fist pose. However, if the user rotates his or her arm without making a fist pose, the user's arm rotation may have no effect on the radial menu.
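The mapping, transition, and compound-condition behavior described above can be illustrated with the following Python sketch. The gesture names, the transition gesture, and the actions are assumptions used only to show the structure.

# Illustrative state-to-action mappings, a mapping transition, and a compound condition.

MAPPING_A = {"index_pinch": "click", "wrist_rotation": "scroll"}
MAPPING_B = {"index_pinch": "open_radial_menu", "wrist_rotation": "rotate_selection"}

active_mapping = MAPPING_A

def handle_state(state):
    """Return the action assigned to a detected body part state, if any."""
    global active_mapping
    if state == "open_hand_hold":                 # assumed transition gesture
        active_mapping = MAPPING_B if active_mapping is MAPPING_A else MAPPING_A
        return "switched_mapping"
    return active_mapping.get(state)

def radial_menu_step(pose, arm_rotated):
    """Navigate only when the fist pose and the arm rotation occur together."""
    if pose == "fist" and arm_rotated:
        return "navigate_radial_menu"
    return None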



FIG. 2 illustrates an exemplary HCI system 200 that includes wearable device 102, an interface system 220, and/or an application system 230. In some examples, wearable device 102, interface system 220, and/or application system 230 may each include an instance of processing device 106 and/or memory 108. In addition, HCI system 200 may include one or more additional wearable devices capable of implementing and/or performing any of the same functionality as wearable device 102. Accordingly, any of the tasks described above as being performed by wearable device 102 in connection with FIG. 1 may additionally or alternatively be performed by interface system 220, application system 230, and/or any additional wearable devices included in HCI system 200.


In some examples, wearable device 102 may communicate with interface system 220 and/or application system 230. In such examples, when worn on the body of a user, wearable device 102 may detect neuromuscular signals traversing the user's body via sensors 104(1)-(N). Wearable device 102 may record, store, and/or analyze those neuromuscular signals.


In some implementations, wearable device 102 may record, store, and/or analyze auxiliary position, velocity, and/or acceleration information together with the neuromuscular signals. In such implementations, wearable device 102 may perform analog processing (e.g., noise reduction, filtering, etc.) and/or analog-to-digital conversion of recorded neuromuscular signals. Wearable device 102 may communicate with interface system 220 via any suitable wireless technology, protocol, and/or signaling. In one example, wearable device 102 may provide and/or transfer the recorded neuromuscular signals, features extracted from such signals, and/or commands or instructions based on such signals to interface system 220 and/or application system 230.


In some examples, interface system 220 may receive the recorded neuromuscular signals, features extracted from such signals, and/or commands or instructions based on such signals from wearable device 102. In one example, interface system 220 may generate data, commands, and/or instructions for use or consumption by application system 230. In another example, interface system 220 may identify and/or determine the current state of a body part of the user by implementing and/or applying an inferential model. In this example, interface system 220 may communicate and/or disclose the identified or determined current state of the user's body part to application system 230. For example, interface system 220 may provide the position, orientation, joint angle, force, movement, contraction, pose, and/or gesture information to application system 230. Interface system 220 may communicate with application system 230 via any suitable wireless technology, protocol, and/or signaling.


In some examples, the state of the user's body part may include and/or represent a relative position, orientation, joint angle, force, movement, pose, or gesture of that body part. In one example, the state of the user's body part may describe a configuration of one or more segments in a musculoskeletal representation of that body part and/or the user's body. In this example, the musculoskeletal representation may model that body part and/or the user's body as a multi-segment articulated rigid body system. The musculoskeletal representation may also model certain joints that form the interfaces between the different segments and/or certain joint angles that define the spatial relationships between connected segments.


In the model, the spatial relationships between the connected segments may conform and/or be subject to human anatomical constraints. In some examples, the musculoskeletal segments may be modeled as rigid bodies. Additionally or alternatively, the musculoskeletal segments in the model may conform and/or be subject to inter-segment movements (e.g., a forearm modeled as a semi-rigid segment to account for the motion of the ulna and radius bones). In one example, the position, orientation, and/or joint angle of the segments, as well as their respective time derivatives (e.g., linear or angular velocity or acceleration), may be described and/or modeled with respect to one or more fixed coordinate systems.


In some examples, the state of the user's body part may include and/or represent the amount of force applied by and/or to that body part. For example, wearable device 102 may measure, identify and/or determine the amount of linear force and/or rotational (torque) force exerted by one or more segments of the musculoskeletal representation based at least in part on neuromuscular signals traversing the user's body.


Examples of such linear forces include, without limitation, the force of a finger or hand pressing on a solid object (e.g., a table), the force exerted when two segments (e.g., two fingers) are pinched together, variations or combinations of one or more of the same, and/or any other suitable linear forces. Examples of such rotational forces include, without limitation, the force created as segments in the wrist or fingers are twisted or flexed, the force created by twisting or flexing the user's arm or waist, variations or combinations of one or more of the same, and/or any other suitable rotational forces. In some embodiments, the state of the user's body part may include and/or involve pinching force information, grasping force information, and/or information about co-contraction forces between muscles represented by the musculoskeletal representation.


In some examples, the state of the user's body part may include and/or represent a pose made by the user's body and/or one or more of the user's body parts. In one example, a pose may indicate a static configuration and/or positioning of one or more body parts. In this example, the static configuration may describe the position of those body parts relative to one another. For example, a pose may include and/or represent clenching a fist, forming an open hand, statically pressing the user's index finger against the user's thumb, pressing the palm of one hand down on a solid surface, and/or gripping or holding a ball.


In some examples, the state of the user's body part may correspond to and/or represent positional information (e.g., segment coordinates, joint angles, or similar information) for a pose. Additionally or alternatively, the state of the user's body part may correspond to and/or represent an identifier assigned and/or specific to a pose (e.g., a parameter, function argument, or variable value).


In some examples, the state of the user's body part may include and/or represent a gesture made by the user's body and/or one or more of the user's body parts. In one example, a gesture may indicate a dynamic configuration of one or more body parts. In this example, the dynamic configuration may describe the position of those body parts relative to one another, the movement of those body parts relative to one another, and/or forces applied to and/or exerted by those body parts. For example, a gesture may constitute and/or represent waving a finger back and forth, throwing a ball, and/or grasping or palming a ball. Additionally or alternatively, a gesture may constitute and/or represent the activation and/or change in activation of certain muscles in the user's body.


In some embodiments, wearable device 102 may generate, store, and/or record state information that describes states of the user's body parts. In one example, such state information may describe a pose and/or gesture made with a hand of the user. In this example, such state information may also include a data-based model of the user's hand as a multi-segment representation. The joints in the user's wrist and fingers may form interfaces between the multiple segments in the data-based model.


In various embodiments, the state of the user's body may describe a hand in combination with one or more arm segments. In other embodiments, the state of the user's body may describe portions of the user's body other than the hand or fingers, such as an arm, a leg, a foot, a torso, a neck, variations or combinations of one or more of the same, and/or any other suitable body parts of the user.


The inferential model implemented by wearable device 102, interface system 220, and/or application system 230 may include and/or represent at least one statistical or machine learning model. For example, the inferential model may include and/or represent a neural network (e.g., a recurrent neural network) used to determine and/or classify body part states based at least in part on neuromuscular signals. In one example, the neural network may include and/or represent a long short-term memory (LSTM) neural network. Additionally or alternatively, the neural network may include and/or represent a fully recurrent neural network, a gated recurrent neural network, a recursive neural network, a Hopfield neural network, an associative memory neural network, an Elman neural network, a Jordan neural network, an echo state neural network, a second order recurrent neural network, deep neural networks, convolutional neural networks, feedforward neural networks, variations or combinations of one or more of the same, and/or any other suitable type of neural network.
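As a non-limiting illustration of such a recurrent model, the following PyTorch sketch classifies a window of multi-channel neuromuscular samples into one of several body part states. The channel count, hidden size, number of states, and window length are assumptions.

import torch
import torch.nn as nn

class StateClassifier(nn.Module):
    def __init__(self, n_channels=16, hidden=64, n_states=8):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_states)

    def forward(self, x):            # x: (batch, time, channels)
        _, (h, _) = self.lstm(x)     # h: (num_layers, batch, hidden)
        return self.head(h[-1])      # one score per body part state

model = StateClassifier()
window = torch.randn(1, 200, 16)     # e.g., 200 samples across 16 channels
state_scores = model(window)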


In some examples, the inferential model may include and/or represent a supervised machine learning model in which the user makes certain positions, movements, forces, contractions, poses, and/or gestures with his or her body. In such examples, the inferential model may obtain sensor data samples representative of those positions, movements, forces, contractions, poses, and/or gestures via wearable device 102. The inferential model may then be trained (or further trained) based at least in part on those sensor data samples. In other examples, the inferential model may include and/or represent an unsupervised machine learning model that is trained without the user making such positions, movements, forces, contractions, poses, and/or gestures with his or her body. The inferential model may also be trained from data samples collected from multiple users.


In some implementations, the recorded neuromuscular signals may exhibit spatio-temporal (e.g., spatio-frequential) patterns that depend on the way in which the user wears wearable device 102. For example, one body part state may be associated with a first spatio-temporal pattern when the user is donning wearable device 102 in a first manner (e.g., where the electrodes are in contact with certain areas of the user's skin) and a second spatio-temporal pattern when the user rotates wearable device 102 on his or her body or when the user moves wearable device 102 to a different part of the body (e.g., from a lower arm position to an upper arm position). Accordingly, the inferential model may be trained to identify one or more body part states by the exhibited spatio-temporal patterns.


In some implementations, wearable device 102 may be configured to determine a rotation and/or position of wearable device 102. In such implementations, wearable device 102 may be able to select and/or apply an inferential model trained and/or adapted for identifying body part states at the determined rotation and/or position. In other words, wearable device 102 may be configured to auto-calibrate by adapting to any rotation and/or arm position offset without interfering with the user experience. By auto-calibrating in this way, wearable device 102 may be able to account for the manner in which the user is donning wearable device 102 relative to the user's underlying musculature and other anatomy that has the potential to affect the recording of the neuromuscular signals traversing the user's body. Moreover, wearable device 102 may be able to adapt to users with varying body types and/or abnormalities, including those who have injured or missing muscles, differing amounts of adipose tissue, and/or other anatomic variables.


In some examples, HCI system 200 may build an inferential model that classifies neuromuscular signal patterns for auto-calibration by (1) building a new statistical model/experiment class that takes a set of preprocessed neuromuscular signals as input, (2) generating a batch of training data by randomly applying a rotation offset to the preprocessed signals, (3) producing positive labels when the augmented offset is zero and null labels when the augmented offset is not zero, (4) calibrating the batch of training data to have an offset of zero, and (5) training an inferential model and evaluating its performance by testing different rotation offsets.
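The augmentation step of that procedure could look like the following Python sketch, in which a rotation offset is modeled as a circular shift of the electrode channels and the label is positive only when the applied offset is zero. The window shape, channel count, and shift-based model of rotation are assumptions.

import numpy as np

def augment(window, n_channels=16, rng=np.random):
    """window: array of shape (time, channels). Returns (augmented window, label)."""
    offset = rng.randint(0, n_channels)              # random rotation offset
    rotated = np.roll(window, shift=offset, axis=1)  # circular shift of channels
    label = 1 if offset == 0 else 0                  # positive only at zero offset
    return rotated, label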


In some examples, application system 230 may receive body state information from interface system 220. In response to this information, application system 230 may perform certain actions on one or more applications. Examples of such actions include, without limitation, changing an execution state of an application (e.g., starting, stopping, suspending, or resuming the application), communicating with an application (e.g., providing commands and/or data to the application), moving a cursor in connection with an application, associating a cursor with a visual element displayed in a GUI, presenting and/or highlighting a visual element within a GUI, selecting and/or clicking on a visual indicator displayed in a GUI, transitioning from one mapping to another, and/or any other suitable actions.


In some examples, application system 230 may be configured to provide a GUI to the user donning wearable device 102. In one example, the GUI may generate and/or deliver visual, auditory, and/or haptic feedback in response to commands, instructions, and/or data received from application system 230. For example, a user donning wearable device 102 may interact with graphical controls and/or indicators displayed in the GUI of application system 230 via wearable device 102. As an additional example, the GUI may generate and/or deliver auditory prompts and/or feedback through speakers incorporated into HCI system 200. As a further example, the GUI may provide haptic prompts and/or feedback via one or more actuators that apply certain forces to the user's body (e.g., vibrations generated by a linear resonant actuator or eccentric rotating mass actuator).


In some embodiments, wearable device 102, interface system 220, and/or application system 230 may be combined into a single standalone computing device or unit. In other embodiments, wearable device 102 may include and/or represent a single standalone computing device or unit, and interface system 220 and application system 230 may be combined into a different standalone computing device or unit. In further embodiments, wearable device 102 and interface system 220 may be combined into a single standalone computing device or unit, and application system 230 may include and/or represent a different standalone computing device or unit. In additional embodiments, wearable device 102, interface system 220, and/or application system 230 may each include and/or represent a separate standalone computing device or unit.


In some examples, wearable device 102 may implement and/or be configured with one or more trained inferential models. In such examples, wearable device 102 may record neuromuscular signals, use the trained inferential models to identify one or more states of the user's body parts, and/or provide one or more indications of the identified body states to a separate computing device implementing interface system 220 and/or application system 230. Additionally or alternatively, wearable device 102 may communicate and/or disclose certain features extracted from the recorded neuromuscular signals and/or one or more commands or instructions based on such signals to a separate computing device implementing interface system 220 and/or application system 230.


In some examples, the separate computing device implementing interface system 220 and/or application system 230 may identify and/or determine the states of the user's body parts by feeding the recorded neuromuscular signals and/or certain features extracted from such signals into one or more trained inferential models. The identified states may be mapped to specific actions capable of being executed and/or performed by the computing device implementing application system 230. For example, a given body part state may cause application system 230 to execute and/or perform one or more actions in connection with an application running on that computing device.


In some examples, wearable device 102 or another portion of HCI system 200 may determine whether the amount of force exerted by the user satisfies multiple threshold force values. In one example, each of these threshold force values may be associated with a different action and/or input command. For example, wearable device 102 or another portion of HCI system 200 may determine that the amount of force exerted by the user while performing a certain hand gesture satisfies a first threshold force value and a second threshold force value. In this example, the first threshold force value and the second threshold force value may differ from one another.


In response to the determination that the amount of force exerted by the user satisfies the first and second threshold force values, wearable device 102 or another portion of HCI system 200 may generate a first input command corresponding to the first threshold force value having been satisfied and a second input command corresponding to the second threshold force value having been satisfied. In this example, the first and second input commands may differ from one another.


In some examples, wearable device 102 or another portion of HCI system 200 may forego generating input commands corresponding to threshold force values that have not been satisfied. For example, wearable device 102 or another portion of HCI system 200 may determine that the amount of force exerted by the user while performing a certain hand gesture does not satisfy a first threshold force value. In response to this determination, wearable device 102 or another portion of HCI system 200 may forgo generating an input command corresponding to the first threshold force value in connection with that gesture.


In some examples, wearable device 102 or another portion of HCI system 200 may determine that the amount of force exerted by the user satisfies one threshold force value but not another threshold force value. For example, wearable device 102 or another portion of HCI system 200 may determine that the amount of force exerted by the user while performing a certain gesture satisfies a first threshold force value but does not satisfy a second threshold force value. In response to this determination, wearable device 102 or another portion of HCI system 200 may generate a first input command corresponding to the first threshold force value having been satisfied but forgo generating a second input command corresponding to the second threshold force value having been satisfied. Alternatively, in response to this determination, wearable device 102 or another portion of HCI system 200 may generate a first input command corresponding to the first threshold force value having been satisfied and a second input command corresponding to the second threshold force value having not been satisfied.
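The multi-threshold behavior described in the preceding paragraphs is summarized by the following Python sketch, which generates one input command per satisfied threshold and forgoes commands for thresholds that are not satisfied. The threshold names and values are assumptions.

THRESHOLDS = {"light_press": 0.3, "firm_press": 0.7}  # assumed threshold force values

def commands_for(gesture, force):
    """Generate one command per threshold that the exerted force satisfies."""
    return [{"gesture": gesture, "command": name, "force": force}
            for name, threshold in THRESHOLDS.items()
            if force >= threshold]

commands_for("index_pinch", 0.75)   # satisfies both thresholds -> two commands
commands_for("index_pinch", 0.45)   # satisfies only the lighter threshold -> one command
commands_for("index_pinch", 0.10)   # satisfies neither threshold -> no commands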


In some examples, wearable device 102 or another portion of HCI system 200 may determine whether the user implemented and/or performed certain combinations of hand gestures and force. For example, wearable device 102 or another portion of HCI system 200 may determine that the user exerted a first amount of force while performing a first hand gesture. In this example, wearable device 102 or another portion of HCI system 200 may determine that the user exerted a second amount of force while performing a second hand gesture. Additionally or alternatively, wearable device 102 or another portion of HCI system 200 may determine that the first amount of force satisfies a first threshold force value and the second amount of force satisfies a second threshold force value.


In response to the determination that the first amount of force satisfies the first threshold force value, wearable device 102 or another portion of HCI system 200 may generate a first input command that accounts for the first amount of force exerted by the user while performing the first hand gesture. Moreover, in response to the determination that the second amount of force satisfies the second threshold force value, wearable device 102 or another portion of HCI system 200 may generate a second input command that accounts for the second amount of force exerted by the user while performing the second hand gesture.


In some examples, wearable device 102 or another portion of HCI system 200 may determine whether the user increases or decreases the amount of force exerted while performing a single hand gesture or pose. For example, wearable device 102 or another portion of HCI system 200 may determine and/or identify a single hand gesture (e.g., forming and/or holding a fist) performed by the user over a certain period of time. In this example, wearable device 102 or another portion of HCI system 200 may determine and/or identify a first amount of force exerted by the user at a first point in time while performing that hand gesture. In response, wearable device 102 or another portion of HCI system 200 may generate a first input command that accounts for the first amount of force exerted by the user while performing that hand gesture.


Subsequently, wearable device 102 or another portion of HCI system 200 may determine and/or identify a second amount of force exerted by the user at a second point in time while performing that hand gesture. In response, wearable device 102 or another portion of HCI system 200 may generate a second input command that accounts for the second amount of force exerted by the user while performing that hand gesture. Accordingly, wearable device 102 or another portion of HCI system 200 may generate multiple input commands that correspond to and/or are commensurate with a varying scale of force exerted by the user while performing that hand gesture over time.


As a specific example, wearable device 102 or another portion of HCI system 200 may determine and/or identify a fist formed and/or held by the hand of the user. In this example, the user may increase and/or decrease the amount of force applied to the fist over time. For example, wearable device 102 or another portion of HCI system 200 may formulate a first input command to control the speed of a cursor implemented on application system 230 based at least in part on the amount of force applied to the fist at a first point in time. Subsequent to the first point in time, wearable device 102 or another portion of HCI system 200 may detect an increase and/or decrease in the amount of force exerted by the user in forming or holding the fist. In response, wearable device 102 or another portion of HCI system 200 may formulate a second input command to increase and/or decrease the speed of the cursor implemented on application system 230 based at least in part on the increase and/or decrease in the amount of force exerted by the user. Accordingly, light fist squeezes may correspond to and/or result in relatively slow cursor speeds, whereas heavy fist squeezes may correspond to and/or result in relatively fast cursor speeds (or vice versa).
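

To picture the fist-squeeze example above, the following Python sketch maps a normalized force estimate onto a cursor speed that can be re-evaluated as the squeeze tightens or relaxes; the force range, speed limits, and function name are illustrative assumptions only.

def cursor_speed_from_force(force, min_speed=50.0, max_speed=800.0):
    """Map a normalized force estimate (0.0-1.0) to a cursor speed.

    Light fist squeezes produce speeds near min_speed; heavy squeezes
    approach max_speed. Values are illustrative, in pixels per second.
    """
    force = max(0.0, min(1.0, force))                 # clamp to the expected range
    return min_speed + force * (max_speed - min_speed)

# Re-evaluating the mapping at successive points in time yields a new
# input command whenever the user's squeeze tightens or relaxes.
for sampled_force in (0.1, 0.4, 0.9):
    print(round(cursor_speed_from_force(sampled_force), 1))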


In some examples, wearable device 102 or another portion of HCI system 200 may rely in part on photographic data to determine and/or identify certain hand gestures performed by the user. For example, wearable device 102 or another portion of HCI system 200 may identify photographic data representative of the one or more hand gestures as captured by a camera incorporated into an artificial-reality system. In one example, the camera may generate and/or capture this photographic data of the hand gestures from a head-mounted display worn by the user. Additionally or alternatively, the camera may generate and/or capture this photographic data of the hand gestures from a mount, pedestal, and/or base positioned in the surrounding environment of the user.


In one example, wearable device 102 or another portion of HCI system 200 may provide the photographic data to one or more trained inferential models to enable such trained inferential models to determine the one or more hand gestures based at least in part on the neuromuscular signals detected by the sensors and the photographic data. By doing so, wearable device 102 or another portion of HCI system 200 may be able to improve the accuracy of its hand gesture detection and/or identification, thereby mitigating the number of false positives and/or negatives produced by the trained inferential models. For example, the neuromuscular signals detected by the sensors may indicate and/or suggest that a certain hand gesture performed by the user is either an index finger pinch or a middle finger pinch. However, without further information, the trained inferential models may be unable to conclusively decide on the hand gesture being one or the other. In this example, the trained inferential models may rely on a combination of those neuromuscular signals and photographic data representative of the user's hands captured at the time of the gesture to accurately determine whether the user is performing an index finger pinch or a middle finger pinch.
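

One simple, purely illustrative way to fuse the two sources of evidence is to combine per-gesture scores from an electromyography-based model with scores from a camera-based model, as in the Python sketch below; the score values, gesture names, and fusion rule are assumptions and are not the disclosed trained inferential models.

def fuse_gesture_scores(emg_scores, camera_scores):
    """Combine per-gesture probabilities from two hypothetical models.

    Both inputs map gesture names to scores in [0, 1]. Multiplying the
    scores and renormalizing lets an ambiguous EMG reading be resolved
    by the photographic evidence.
    """
    fused = {g: emg_scores.get(g, 0.0) * camera_scores.get(g, 0.0)
             for g in set(emg_scores) | set(camera_scores)}
    total = sum(fused.values()) or 1.0
    return {g: s / total for g, s in fused.items()}

# EMG alone cannot separate the two pinches; the camera tips the balance.
emg = {"index_pinch": 0.48, "middle_pinch": 0.47, "fist": 0.05}
cam = {"index_pinch": 0.80, "middle_pinch": 0.15, "fist": 0.05}
fused = fuse_gesture_scores(emg, cam)
print(max(fused, key=fused.get))   # index_pinch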



FIG. 3 is an illustration of an exemplary implementation 300 in which a user 310 is donning and/or operating wearable device 102 along with a head-mounted display 322. In one example, wearable device 102 may be communicatively coupled to head-mounted display 322. In this example, the user may be able to control and/or manipulate one or more visual elements presented via head-mounted display 322 by making certain poses, gestures, and/or isometric contractions with his or her right hand. More specifically, such poses, gestures, and/or isometric contractions may involve and/or entail certain neuromuscular signals that are detected by sensors 104(1)-(N) of wearable device 102. In response to those neuromuscular signals, a processing device of wearable device 102 and/or head-mounted display 322 may be able to discern and/or identify the poses, gestures, and/or contractions made by the user's right hand. Head-mounted display 322 may then manipulate and/or modify one or more visual elements presented to the user based at least in part on such poses, gestures, and/or contractions.



FIG. 4A illustrates an exemplary implementation of wearable device 102 with sixteen sensors 410 (e.g., EMG sensors) arranged circumferentially around an elastic band 420. As illustrated in FIG. 4A, elastic band 420 may be dimensioned and/or configured to be worn around a user's lower arm or wrist. In some examples, the number and/or arrangement of sensors 410 may depend on the particular application for which wearable device 102 is used and/or deployed. For example, wearable device 102 may be used and/or deployed to generate control information for controlling a virtual reality system, an augmented reality system, a robot, a vehicle, a computer application, a scrolling feature, a virtual avatar, and/or any other suitable control task.


As illustrated in FIG. 4A, sensors 410 may be coupled together using flexible electronics 1630 incorporated into wearable device 102. FIG. 4B illustrates an exemplary cross-sectional view through one of the sensors 410 of wearable device 102 in FIG. 4A. FIGS. 5A, 5B, and 6 illustrate an alternative implementation of wearable device 102 capable of executing and/or performing one or more of the signal processing techniques described herein without external assistance. Accordingly, wearable device 102 in FIGS. 5A, 5B, and 6 may include and/or incorporate a physical compute module and/or unit that, along with the neuromuscular sensors, is integrated into the elastic band.



FIGS. 7 and 8A-8C illustrate exemplary pinch poses 700, 800, 802, and 804 made by a user. As illustrated in FIG. 7, exemplary pinch pose 700 may involve and/or represent a positioning of the user's right index finger and right thumb in a pinch pose. Accordingly, pinch pose 700 may be executed and/or performed as the user pinches, presses, and/or holds his or her right index finger and right thumb together. In one example, pinch pose 700 may be mapped to a click action such that, when the user executes and/or performs pinch pose 700 for a predetermined duration, application system 230 in FIG. 2 may direct and/or cause a corresponding application to click and/or select a certain feature and/or visual element presented in a GUI of that application. This clicking and/or selection of the feature or visual element may be executed and/or performed in connection with the current cursor position.


In another example, pinch pose 700 may be mapped to an activation action, such as activation action 1900 in FIG. 19. For example, when the user executes and/or performs pinch pose 700 for a predetermined duration, application system 230 in FIG. 2 may direct and/or cause a cursor 1902 in FIG. 19 to move toward element 1906 in a GUI of a corresponding application. As cursor 1902 approaches element 1906, the application may activate element 1906 due at least in part to an association between element 1906 and cursor 1902.


In some examples, the application may provide one or more feedback indicators of this association to the user. Such feedback indicators may inform the user that the HCI system has detected pinch pose 700. For example, the application may indicate and/or show the activation of this association with a connector 1904 between cursor 1902 and element 1906. Additionally or alternatively, the application may indicate and/or show the activation of this association with a box that surrounds and/or encompasses element 1906.


As another example, the application may indicate and/or show the activation of this association by modifying a certain characteristic or feature of a GUI. For example, the application may transition the appearance of cursor 1902 from an empty circle to a filled circle (e.g., as the user holds pinch pose 700 for the predetermined duration). In this example, the circle may appear empty at the initiation of pinch pose 700 and then appear to fill as pinch pose 700 is held over the predetermined duration. In a further example, the application may modify and/or alter the shape and/or color of cursor 1902.


In one example, element 1906 may include and/or represent a hyperlink. In this example, to activate element 1906, the application may cause the GUI to render and/or display a webpage linked to or by element 1906. To exit and/or return from this webpage, the user may execute and/or perform another pose and/or gesture represented in the active mapping. For example, if an open hand pose is mapped to an exit and/or return action, the user may accomplish exiting and/or returning from this webpage by executing and/or performing the open hand pose.


As illustrated in FIG. 8A, exemplary pinch pose 800 may involve and/or represent a positioning of the user's right ring finger and right thumb in a pinch pose. Accordingly, pinch pose 800 may be executed and/or performed as the user pinches, presses, and/or holds his or her right ring finger and right thumb together. In one example, pinch pose 800 may be mapped to a click action such that, when the user executes and/or performs pinch pose 800 for three seconds, application system 230 in FIG. 2 may direct and/or cause a corresponding application to display an identifier for the action in a status bar of a GUI. Additionally or alternatively, application system 230 in FIG. 2 may direct and/or cause the application to display a countdown of the remaining time required for pinch pose 800 to be held by the user. Once the user has held pinch pose 800 for three seconds, application system 230 in FIG. 2 may direct and/or cause the application to perform the mapped action and/or terminate the display of the action identifier and/or the countdown.
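

A pose that must be held for a predetermined duration (three seconds in the example above) can be tracked with a simple timer whose remaining time also drives the displayed countdown; the Python sketch below is an illustrative assumption rather than the claimed implementation.

import time

class PoseHoldTracker:
    """Track how long a pose has been held and report the remaining time."""

    def __init__(self, required_seconds=3.0):
        self.required_seconds = required_seconds
        self.started_at = None

    def update(self, pose_detected, now=None):
        """Return (action_ready, seconds_remaining) for the current frame."""
        now = time.monotonic() if now is None else now
        if not pose_detected:
            self.started_at = None                    # pose released; reset countdown
            return False, self.required_seconds
        if self.started_at is None:
            self.started_at = now                     # pose just began
        remaining = max(0.0, self.required_seconds - (now - self.started_at))
        return remaining == 0.0, remaining

# The remaining time can drive the countdown shown in the status bar.
tracker = PoseHoldTracker()
print(tracker.update(True, now=0.0))    # (False, 3.0)
print(tracker.update(True, now=1.5))    # (False, 1.5)
print(tracker.update(True, now=3.0))    # (True, 0.0) -> perform the mapped action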


In one example, pinch pose 802 in FIG. 8B may be mapped to a scrolling action such that, when the user executes and/or performs pinch pose 802, application system 230 in FIG. 2 may direct and/or cause a corresponding application to scroll up a GUI. In this example, pinch pose 804 in FIG. 8C may be mapped to another scrolling action such that, when the user executes and/or performs pinch pose 804, application system 230 in FIG. 2 may direct and/or cause the application to scroll down the GUI.



FIGS. 9A and 9B illustrate exemplary time and amplitude criteria for discrete event detection that may be used in connection with some embodiments. As illustrated in FIG. 9A, a first portion of force time series 902 may satisfy an event criterion 912, and a second portion of force time series 902 may satisfy an event pattern criterion 914. In one example, event criterion 912 and event pattern criterion 914 may be specified and/or defined in any suitable way (such as minimum and/or maximum amplitude values, maximum degrees of force, thresholds or limits, etc.). As illustrated in FIG. 9B, force time series 920 may satisfy force time series criteria by falling within upper bound 916 and lower bound 918 over a certain time interval.
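

As a purely illustrative rendering of the criterion of FIG. 9B, the Python sketch below checks whether every sample of a force time series falls within upper and lower amplitude bounds over a window; the sample values and bounds are hypothetical.

def satisfies_bounds(force_samples, lower_bound, upper_bound):
    """Check a force time series against amplitude bounds over a window.

    Returns True only if every sample in the window falls on or between
    the lower and upper bounds, mirroring the criterion of FIG. 9B.
    """
    return all(lower_bound <= sample <= upper_bound for sample in force_samples)

# A series that stays inside the band satisfies the criterion; a single
# spike outside the band does not.
print(satisfies_bounds([0.42, 0.48, 0.45, 0.44], lower_bound=0.4, upper_bound=0.5))  # True
print(satisfies_bounds([0.42, 0.61, 0.45, 0.44], lower_bound=0.4, upper_bound=0.5))  # False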


In some implementations, a combination of pose and force may be used for one-dimensional control. For example, the identified body state may include and/or represent a pose and a force. In this example, the identified pose may dictate and/or influence the responsive action, whereas the identified degree of force may dictate and/or influence a specific characteristic of the responsive action. For example, if the action includes scrolling a GUI in a certain direction, the identified degree of force may dictate and/or influence the speed of that scrolling (e.g., the speed of scrolling may be proportional to the degree of force). As an additional example, if the action includes painting pixels or voxels in a virtual painting application, the identified degree of force may dictate and/or influence the width of the virtual brushstroke.



FIG. 23 illustrates an exemplary drawing application 2300 that includes a virtual drawing instrument whose width may be controllable and/or modifiable by certain states of the user's body parts. As illustrated in FIG. 23, drawing application 2300 may include a virtual drawing instrument 2302 capable of drawing lines of varying widths. In some examples, the user may be able to control and/or modify the width of such lines based at least in part on the identified degree of force applied in the user's body state. For example, the user may apply one degree of force that causes application system 230 to select a width 2310 for lines drawn by virtual drawing instrument 2302 and/or another degree of force that causes application system 230 to select a width 2312 for lines drawn by virtual drawing instrument 2302. In this example, the user may apply a further degree of force that causes application system 230 to select a width 2314 for lines drawn by virtual drawing instrument 2302 and/or an even further degree of force that causes application system 230 to select a width 2316 for lines drawn by virtual drawing instrument 2302. Additionally or alternatively, these degrees of force may be used to increase and/or decrease the width of virtual drawing instrument 2302 by discrete increments and/or decrements.
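

The discrete width selection described above might, for instance, be realized by bucketing the force estimate as in the Python sketch below; the particular force cut points and pixel widths are assumptions standing in for widths 2310 through 2316.

def brush_width_from_force(force):
    """Select a discrete brush width (in pixels) from a normalized force.

    The cut points and widths are illustrative stand-ins for widths
    2310, 2312, 2314, and 2316 of FIG. 23.
    """
    if force < 0.25:
        return 2      # lightest touch -> thinnest line
    if force < 0.50:
        return 6
    if force < 0.75:
        return 12
    return 20         # hardest squeeze -> widest line

for f in (0.1, 0.4, 0.6, 0.9):
    print(f, "->", brush_width_from_force(f))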


In some embodiments, application system 230 may be configured to provide visual feedback of both the identified pose and the identified force. For example, when the action includes scrolling a GUI, application system 230 may display a cursor in connection with that scrolling. In this example, the cursor may be presented and/or shown as a horizontal line with a bar extending above or below the line, depending on the scrolling direction. Further, the distance to which the bar extends above or below the line (e.g., the height of the bar) may depend on the identified degree of force applied to the pose. As an additional example, when the action includes painting virtual pixels or voxels, application system 230 may vary the size of a cursor depending on the identified degree of force. In this example, the size of the cursor may indicate the width of the virtual brushstroke.
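

As one hypothetical rendering of that scroll feedback, the Python sketch below computes the height of the bar drawn above or below the cursor line from the identified force and scroll direction; the maximum bar height and function name are illustrative assumptions.

def scroll_feedback_bar(force, direction, max_height_px=40):
    """Return (bar_height_px, above_line) for the scroll-feedback cursor.

    The bar grows with the identified degree of force and is drawn above
    the line when scrolling up, below it when scrolling down.
    """
    force = max(0.0, min(1.0, force))
    return round(force * max_height_px), direction == "up"

print(scroll_feedback_bar(0.25, "up"))     # (10, True)
print(scroll_feedback_bar(0.8, "down"))    # (32, False)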



FIGS. 10A-10H illustrate exemplary interactions between a user and an exemplary radial menu 1000 in accordance with some embodiments. In some implementations, application system 230 may be configured to present and/or display radial menu 1000. In one example, application system 230 may incorporate radial menu 1000 into a GUI (e.g., a web browser) and/or a multi-state user interface (e.g., multi-state user interface 2000 in FIG. 20). As illustrated in FIGS. 10E-10H, radial menu 1000 may include certain visual indicators, such as an interface state indicator 1010, a selection indicator 1020, an action indicator 1040, and/or action indicator 1050.


In some examples, interface state indicator 1010 may indicate a transition from a disabled radial menu (e.g., a mode in which the user is not able to interact with the radial menu) to an enabled radial menu (e.g., a mode in which the user is able to interact with the radial menu). In such examples, selection indicator 1020 may indicate a currently selected action (e.g., either action indicator 1040 or action indicator 1050). In one example, application system 230 may perform an action associated with the selected action indicator. For example, if radial menu 1000 is used with a web browser, action indicator 1040 may be associated with a forward action, and/or action indicator 1050 may be associated with a backward action.



FIGS. 10A-10D illustrate exemplary poses and/or gestures suitable for use in connection with radial menu 1000. In some examples, wearable device 102 may detect and/or record a plurality of neuromuscular signals via the body of a user. For example, wearable device 102 may detect and/or record neuromuscular signals from the arm and/or wrist of the user. In this example, wearable device 102, interface system 220, and/or application system 230 may be configured to determine and/or identify a first pose using the recorded signals inputted to an inferential model. Application system 230 may be configured to provide commands and/or instructions to control aspects of radial menu 1000 in response to the identification of the first pose.



FIG. 10A illustrates an exemplary fist pose 1002. In some examples, fist pose 1002 may be mapped to a command and/or instruction to display, enter, and/or activate radial menu 1000 within a GUI of application system 230. In such examples, prior to the identification of fist pose 1002, some or all of radial menu 1000 may be withheld and/or hidden from view within the GUI. In one example, state indicator 1010 may be displayed within the GUI upon identification of fist pose 1002.


As shown in FIG. 10E, state indicator 1010 may include and/or represent a circle. In one example, as the user holds fist pose 1002 for the predetermined duration, the circle may transition from empty to filled. Once fist pose 1002 has been held for the predetermined duration, selection indicator 1020 and/or action indicators 1040 and 1050 may be displayed within the GUI. Alternatively, selection indicator 1020 and/or action indicators 1040 and 1050 may be displayed upon identification of fist pose 1002. In another example, state indicator 1010, selection indicator 1020, and/or action indicators 1040 and 1050 may be displayed prior to identification of fist pose 1002.



FIG. 10B illustrates an exemplary gesture 1004 that includes and/or represents a fist pose combined with a specific movement of the user's wrist. In some examples, the specific movement may involve and/or entail a flexion, extension, deviation, and/or rotation of the user's wrist while he or she holds the fist pose. In such examples, the fist pose combined with the specific movement may be mapped to a command and/or instruction to select a visual indicator in radial menu 1000 (e.g., either action indicator 1040 or action indicator 1050). In one example, the flexion of the wrist may be mapped to selecting action indicator 1050, and the extension of the wrist may be mapped to selecting action indicator 1040.


As shown in FIG. 10F, radial menu 1000 may be configured to indicate and/or identify a selected action indicator within the GUI. For example, in response to a certain pose and/or gesture made by the user, selection indicator 1020 may change from the position shown in FIG. 10E to the position shown in FIG. 10F. In this example, upon completion of this change, selection indicator 1020 may identify and/or point toward action indicator 1050.


More generally, the selection of a visual indicator (such as action indicator 1050) may be demonstrated and/or confirmed using visual, auditory, or haptic feedback. For example, in response to the selection of the visual indicator, application system 230 may play a sound (e.g., a click sound) and/or cause an actuator to vibrate with haptic feedback for the user. In some examples, visual feedback may include and/or represent the change of a characteristic of a visual element within radial menu 1000 in response to the selection of the visual indicator. Examples of such a characteristic change include, without limitation, a position change, an orientation change, a color change, a size change, a transparency change, a fill change, an emphasis change, a shadow change, an animation change, a font change, a line type change, a line width change, combinations or variations of one or more of the same, and/or any other suitable characteristic changes.



FIG. 10C illustrates an exemplary finger pinch pose 1006. In some examples, finger pinch pose 1006 may be mapped to a command and/or instruction to click a visual indicator in radial menu 1000 (e.g., either action indicator 1040 or action indicator 1050). In some implementations, the effect of the click may be analogous to a mouse button click and/or a keypress in certain conventional computer systems. In one example, rather than simply mapping finger pinch pose 1006 to the click command, the combination of finger pinch pose 1006 and a certain degree of force may be mapped to the click command. For example, the user may clench his or her fingers with at least a threshold amount of force while maintaining finger pinch pose 1006 to initiate and/or cause the execution of the click command.


As shown in FIG. 10G, radial menu 1000 may be configured to click (e.g., engage with or activate a function of) the currently selected action indicator. In one example, application system 230 may click action indicator 1052 in response to an identified click gesture. In this example, upon performing the click action, application system 230 may cause the visual indicator to appear to depress and/or release (e.g., similar to the pressing and/or releasing of a physical button).



FIG. 10D illustrates an exemplary open hand pose 1008. In some examples, open hand pose 1008 may be mapped to a command and/or instruction to hide, exit, and/or deactivate radial menu 1000. In some implementations, the user may need to hold open hand pose 1008 for a predetermined amount of time before application system 230 executes and/or performs the deactivation command. Following identification of open hand pose 1008, some or all of radial menu 1000 may be obscured and/or hidden from view within the GUI.


As shown in FIG. 10H, a state indicator 1012 may include and/or represent a circle. In one example, as the user holds open hand pose 1008 for the predetermined duration, the circle may transition from empty to filled (or from filled to empty), and/or selection indicator 1020 and action indicators 1040 and 1050 may no longer be displayed within the GUI. Alternatively, selection indicator 1020 and/or action indicators 1040 and 1050 may disappear from the GUI upon identification of open hand pose 1008. In another example, state indicator 1010, selection indicator 1020, and/or action indicators 1040 and 1050 may remain displayed within the GUI following the deactivation command. However, radial menu 1000 may not recognize and/or respond to a subsequent selection (e.g., a wrist gesture) or click (e.g., a finger pinch pose) command until receiving a subsequent activation command.
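

Taken together, FIGS. 10A through 10H describe what amounts to a small state machine; the Python sketch below arranges those transitions (fist pose to activate, wrist movement to select, finger pinch to click, open hand pose to deactivate) in one hypothetical form and is not the disclosed implementation.

class RadialMenuController:
    """Minimal state machine for a two-item radial menu."""

    def __init__(self):
        self.active = False
        self.selected = "action_1040"     # default selection

    def handle(self, body_state):
        """Apply one identified body state and return a description of the effect."""
        if body_state == "fist_pose":
            self.active = True
            return "menu activated"
        if not self.active:
            return "ignored (menu inactive)"
        if body_state == "wrist_extension":
            self.selected = "action_1040"
            return "selected action_1040"
        if body_state == "wrist_flexion":
            self.selected = "action_1050"
            return "selected action_1050"
        if body_state == "finger_pinch":
            return f"clicked {self.selected}"
        if body_state == "open_hand_pose":
            self.active = False
            return "menu deactivated"
        return "unmapped body state"

menu = RadialMenuController()
for state in ("finger_pinch", "fist_pose", "wrist_flexion", "finger_pinch", "open_hand_pose"):
    print(state, "->", menu.handle(state))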



FIG. 11A illustrates an exemplary implementation of radial menu 1000 in accordance with some embodiments. As illustrated in FIG. 11A, radial menu 1000 may be superimposed over a webpage 1102. In one example, when associated with webpage 1102, radial menu 1000 may function and/or serve as a navigation menu that enables the user to move back to a previous page or forward to a subsequent page depending on the menu button selected (e.g., using a wrist gesture) and/or clicked (e.g., using a finger pinch pose) by the user in the superimposed navigation menu. In this example, the user may cause the superimposed menu to disappear from webpage 1102 by holding open hand pose 1008 as described above.



FIGS. 11B and 11C illustrate alternative implementations of an exemplary radial menu 1110 in accordance with some embodiments. As illustrated in FIGS. 11B and 11C, radial menu 1110 may include a set of action indicators (e.g., indicator 1111 and indicator 1113). The indicators may be displayed in the GUI along a substantially circular arc. In one example, a first body state may be mapped to the selection of the next indicator in a given sequence. For example, in response to identification of the first body state, application system 230 may deselect indicator 1111 and/or select indicator 1113. Additionally or alternatively, application system 230 may demonstrate and/or confirm the selection and/or deselection of a visual indicator using visual, auditory, and/or haptic feedback.


In some embodiments, the first body state and a second body state may represent counterparts of one another. For example, the first body state may include a wrist extension, and the second body state may include a wrist flexion. Additionally or alternatively, the first body state may include a clockwise wrist rotation, and the second body state may include a counterclockwise wrist rotation. In a further example, the first body state may include a radial deviation, and the second body state may include an ulnar deviation.



FIGS. 11D and 11E illustrate implementations of an exemplary sequential menu 1120 in accordance with some embodiments. In certain examples, sequential menu 1120 may include and/or represent a sequence of visual indicators (depicted as "A" through "F"). In such examples, sequential menu 1120 may provide a suitable spatial arrangement among these visual indicators (e.g., throughout the rows and columns illustrated in FIGS. 11D and 11E). In one example, a first body state may be mapped to the selection of the next visual indicator in the sequence, and a second body state may be mapped to the selection of the previous visual indicator in the sequence. For example, in response to identification of the second body state, application system 230 may deselect subsequent indicator 1123 and/or select prior indicator 1121.



FIG. 20 illustrates a state diagram of an exemplary multi-state user interface 2000 in accordance with some embodiments. Multi-state user interface 2000 may be implemented by any suitable computing system, including any of the devices incorporated into HCI system 200 in FIG. 2 (e.g., wearable device 102, interface system 220, and/or application system 230). In one example, multi-state user interface 2000 may receive body state information from wearable device 102, interface system 220, and/or application system 230. Multi-state user interface 2000 may then identify, determine, and/or recognize certain body states, as defined by the user or by default, based at least in part on such information. In some cases, certain hand and arm poses and/or gestures may be symbolic and/or communicate meaning according to cultural standards.


Multi-state user interface 2000 may be configured and/or programmed with multiple interface states (e.g., interface state 2020 and interface state 2030). Each of the multiple interface states may implement and/or represent a mapping from one or more body states to a set of responsive actions. As an example, interface state 2020 may implement and/or represent a first mapping from a first set of body states to a first set of responsive actions 2022, and interface state 2030 may implement and/or represent a second mapping from a second set of body states to a second set of actions 2032. The first set of body states may differ from the second set of body states, and the first set of actions 2022 may differ from the second set of actions 2032. Alternatively, the same body states may map to differing responsive actions in different interface states. Further, differing body states may map to the same actions across different interface states.
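

One plausible way to represent such per-interface-state mappings is a dictionary keyed first by interface state and then by body state, as in the Python sketch below; the state names and action names are placeholders rather than elements of multi-state user interface 2000.

# Hypothetical mappings: the same body state can trigger different actions
# depending on which interface state is currently active.
MAPPINGS = {
    "interface_state_2020": {
        "index_pinch": "click",
        "ring_pinch": "scroll_up",
        "pinky_pinch": "scroll_down",
    },
    "interface_state_2030": {
        "index_pinch": "activate_link",
        "ring_pinch": "increment_selector",
        "pinky_pinch": "decrement_selector",
    },
}

def action_for(interface_state, body_state):
    """Look up the responsive action for a body state in the active interface state."""
    return MAPPINGS.get(interface_state, {}).get(body_state)

print(action_for("interface_state_2020", "index_pinch"))   # click
print(action_for("interface_state_2030", "index_pinch"))   # activate_link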


In some embodiments, multi-state user interface 2000 may provide information about the current status of HCI system 200 via one or more visual, auditory, or haptic indicators. For example, multi-state user interface 2000 may be configured to display a connection status between interface system 220 (or application system 230) and wearable device 102. FIG. 12 illustrates an exemplary menu bar icon 1200 that demonstrates different appearances of a menu button displaying such a connection status. This menu button may be displayed in a menu bar of multi-state user interface 2000 (or another application running on application system 230).


The menu button may change in appearance to indicate the status of interface system 220. For example, a first appearance of the menu button may indicate that wearable device 102 is connected and sending data to interface system 220. In this example, a second appearance of the menu button may indicate that wearable device 102 is connected but not sending data to interface system 220 or application system 230. Finally, a third appearance of the menu button may indicate that wearable device 102 is not connected to interface system 220.


In some implementations, multi-state user interface 2000 may be configured to provide and/or indicate its current interface state and/or setting (e.g., whether interface state 2020 or interface state 2030 is currently active), a current body state, and/or a current action corresponding to the identified body state. For example, when a body state mapped to a responsive action includes a pose held for a predetermined duration, multi-state user interface 2000 may provide an indication that HCI system 200 has recognized the pose. Furthermore, multi-state user interface 2000 may provide an indication of the remaining time necessary for the user to hold the pose before initiating performance of the action mapped to the pose.


In some implementations, multi-state user interface 2000 may include one or more graphical elements for displaying the current interface state, the current body state, and/or the responsive action. For example, the title bar, the menu bar, and/or the status bar of a GUI may display the current interface state, the current body state, and/or the responsive action. Additionally or alternatively, multi-state user interface 2000 may modify a visual characteristic (e.g., size, shape, fill, emphasis, orientation, animation, etc.) of one or more elements of the GUI (e.g., cursor, control element, indicator element, etc.) to indicate the current interface state, current body state, and/or the responsive action.


In some embodiments, multi-state user interface 2000 may be configured to indicate the current interface state, the current body state, and/or the responsive action with visual, auditory, or haptic feedback. For example, the transition to a new interface state or the performance of a responsive action may be accompanied by a graphical presentation, sound, and/or vibration provided to the user.


In some embodiments, multi-state user interface 2000 may be configurable and/or programmable by a user. Accordingly, the user may be able to specify and/or select mappings between certain body states and responsive actions for one or more interface states. FIG. 24 illustrates an exemplary multi-state user interface 2400 that enables the user to select and/or define certain mappings between body part states 2402 and actions 2420. As illustrated in FIG. 24, body part states 2402 may include and/or represent a ring finger pinch 2404, a middle finger pinch 2406, a pinky finger pinch 2408, an index finger pinch 2410, a fist pose 2412, and/or an open hand pose 2414. In this example, actions 2420 may include and/or represent a scroll up 2424, a scroll down 2426, an activate link 2428, a deactivate link 2430, an increment selector 2432, and/or a decrement selector 2434.


In some examples, the user may be able to select and/or define a mapping 2440 between one or more of body part states 2402 and actions 2420 via multi-state user interface 2400. For example, the user may direct multi-state user interface 2400 to map ring finger pinch 2404 to scroll up 2424 via mapping 2440 such that, when the user makes a ring finger pinch pose, the page and/or browser displayed on application system 230 scrolls up. Additionally or alternatively, the user may direct multi-state user interface 2400 to map pinky finger pinch 2408 to scroll down 2426 such that, when the user makes a pinky finger pinch pose, the page and/or browser displayed on application system 230 scrolls down.



FIGS. 13-17 illustrate exemplary portions and/or views of a multi-state user interface 1300 that enables a user to specify and/or select the mappings. For example, multi-state user interface 1300 in FIG. 13 may include and/or represent a popup box and/or dialog that appears in response to certain user input. In this example, multi-state user interface 1300 may facilitate enabling and/or disabling body state control (e.g., using the “enabled” button). In addition, multi-state user interface 1300 may indicate a current status of HCI system 200 (e.g., using the “API” and “Data” indicators). As illustrated in FIG. 14, multi-state user interface 1300 may include a drop-down menu control from which the user is able to select certain display settings and/or map certain body states to responsive actions.


Through multi-state user interface 1300, the user may modify the mappings between body states and actions. For example, if multi-state user interface 1300 includes and/or represents a web browser, the user may configure web navigation settings by selecting setting options shown in a drop-down menu of the web browser. As shown in FIG. 15, multi-state user interface 1300 may enable the user to click on and/or select "scroll:options" to configure a type of pose that initiates a scrolling control or feature. In this example, the user may map the scroll-down action to a ring finger pinch pose (as shown in FIG. 8A) and the scroll-up action to a pinky finger pinch pose (as shown in FIG. 8B). Accordingly, when the user holds a ring finger pinch pose, the web browser may receive a scroll-down command from application system 230. The web browser may then scroll down the displayed webpage.


As an additional example, FIG. 16 illustrates an exemplary "link activate" setting used to highlight links included in a web page. As shown in FIG. 16, multi-state user interface 1300 may enable the user to click on "linksActivate:options" to configure a type of pose and predetermined pose duration that initiates the activation and/or rendering of a link displayed in a webpage. In this example, the user may map the link-activation action to a middle finger pinch pose (as shown in FIG. 8C). Accordingly, when the user holds a middle finger pinch pose for the selected pose duration, the web browser may receive an activate-link command from application system 230. The web browser may then activate and/or render a link associated with the current cursor position.


As a further example, FIG. 17 illustrates an exemplary “click action” setting used to map a pose to a click action. As shown in FIG. 17, multi-state user interface 1300 may enable the user to click on “click:options” to configure a type of pose and predetermined pose duration that initiates a click action in an application (e.g., emulating a trackball or mouse click). In this example, the user may map the click action to an index finger pinch pose (as shown in FIG. 7). Accordingly, when the user holds an index finger pinch pose for the selected pose duration, the application may receive a click command from application system 230. The application may then perform the click action on a control and/or feature associated with the current cursor position.


In some examples, multi-state user interface 1300 may be configured and/or programmed as a plugin or API for use with existing applications. For example, multi-state user interface 1300 may be formatted and/or packaged to provide a body state recognition functionality as a web browser plugin for use with existing web browsers running on application system 230.



FIG. 18 illustrates an exemplary method 1800 for assisted user interface navigation in accordance with some embodiments. Method 1800 may facilitate convenient selection and/or activation of elements in a GUI by associating such elements with the current position of the user's cursor. Upon making this association, the user may activate one or more of those elements in the GUI to initiate a certain action in the application. As a result, the user may be able to interact with an application running on application system 230 in FIG. 2 without necessitating precise control over the cursor position.


As illustrated in FIG. 18, method 1800 may include a step 1802 of updating a cursor position in a GUI of an application. In this example, method 1800 may also include a step 1804 of associating an element of the GUI with the updated cursor position. The element may be any suitable feature of the GUI, including a hyperlink, a data entry field, and/or a control indicator. In one example, this association may be formed and/or initiated as a result of the cursor being positioned within a certain distance and/or range of the element. For example, application system 230 may be configured to associate the cursor with the closest applicable element within the GUI, especially if the determined distance between the cursor and that element is within a predetermined threshold or range.
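

The association of step 1804 could, for example, select the element whose position lies closest to the updated cursor position, provided that the distance falls within the predetermined threshold; the Python sketch below assumes simple two-dimensional coordinates and hypothetical element names and is not the claimed method.

import math

def associate_cursor(cursor_pos, elements, max_distance=80.0):
    """Return the name of the GUI element nearest the cursor, or None.

    cursor_pos -- (x, y) of the updated cursor position
    elements   -- mapping of element name to (x, y) center position
    """
    best_name, best_distance = None, max_distance
    for name, (ex, ey) in elements.items():
        distance = math.hypot(cursor_pos[0] - ex, cursor_pos[1] - ey)
        if distance <= best_distance:
            best_name, best_distance = name, distance
    return best_name

elements = {"hyperlink_1906": (120, 80), "button_ok": (400, 300)}
print(associate_cursor((130, 90), elements))   # hyperlink_1906
print(associate_cursor((600, 600), elements))  # None (nothing within range)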


Method 1800 may further include a step 1806 of providing a feedback indication of the association to the user. For example, application system 230 may provide a visual, auditory, or haptic indication of the association to the user. In one example, application system 230 may provide auditory prompts and/or feedback representative of the association using speakers associated with HCI system 200. As a further example, application system 230 may provide haptic prompts or feedback representative of the association using actuators that apply forces to the user's body. Additionally or alternatively, application system 230 may provide a visual indication of the association by modifying a characteristic of the cursor and/or the associated element.


Method 1800 may additionally include a step 1808 of performing an activation action based at least in part on the association between the cursor and the element. For example, an activation action may be mapped to a specific body state. In response to the identification of the mapped body state, application system 230 may determine whether the cursor is associated with a certain element of the GUI. If the cursor is associated with that element of the GUI, application system 230 may perform the mapped activation action on the associated element.



FIGS. 21A and 21B illustrate an exemplary block diagram of wearable device 102 with sixteen EMG sensors. As shown in FIG. 21A, wearable device 102 may include sensors 2110 that record neuromuscular signals traversing the user's body. The output of the sensors 2110 may be provided to analog front end 2130, which performs analog processing (e.g., noise reduction, filtering, etc.) on the recorded signals. The processed analog signals may then be provided to analog-to-digital converter (ADC) 2132, which converts the analog signals to digital signals for further processing by a computer processor (MCU) 2134. In one example, MCU 2134 may receive inputs from other sensors (e.g., IMU sensor 2140) and/or electric current from power and battery module 2142. The output of the processing performed by MCU 2134 may be provided to antenna 2150 for transmission to dongle 2120 shown in FIG. 21B.


In one example, dongle 2120 in FIG. 21B may communicate with the wearable device 102 (e.g., via Bluetooth or another suitable short-range wireless communication technology). In this example, dongle 2120 may include antenna 2152 configured to communicate with antenna 2150 of wearable device 102. The signals received by antenna 2152 of dongle 2120 may be provided to a host computer for further processing, display, and/or effecting control of a particular physical or virtual element of that host computer.



FIG. 22 is a flow diagram of an exemplary method 2200 for controlling computing devices via neuromuscular signals of users. The steps shown in FIG. 22 may be performed during the operation of an HCI system implemented and/or deployed by a user. Additionally or alternatively, the steps shown in FIG. 22 may also incorporate and/or involve various sub-steps and/or variations consistent with the descriptions provided above in connection with FIGS. 1-21.


As illustrated in FIG. 22, method 2200 may include a step 2210 in which one or more neuromuscular signals are detected and/or sensed from a forearm or wrist of a user. For example, a user donning wearable device 102 may make a pose and/or gesture that causes neuromuscular signals to traverse down the user's arm toward his or her hand. In this example, wearable device 102 may include and/or incorporate a plurality of sensors that detect, sense, and/or measure those neuromuscular signals.


As illustrated in FIG. 22, method 2200 may also include a step 2220 in which an amount of force associated with the one or more neuromuscular signals is determined. For example, wearable device 102 or another portion of HCI system 200 may include and/or incorporate at least one processor that implements one or more trained inferential models. In this example, the one or more trained inferential models may analyze and/or consume data representative of the neuromuscular signals. Upon doing so, the one or more trained inferential models may determine the amount of force associated with the neuromuscular signals.


As illustrated in FIG. 22, method 2200 may also include a step 2230 in which the amount of force associated with the one or more neuromuscular signals is determined to have satisfied a threshold force value. For example, wearable device 102 or another portion of HCI system 200 may include and/or incorporate at least one processor that determines that the amount of force associated with the neuromuscular signals satisfies a threshold force value. In this example, the threshold force value may represent a certain level of force associated with a specific action to be performed by HCI system 200.


As illustrated in FIG. 22, method 2200 may further include a step 2240 in which a first input command is generated in accordance with the determination that the amount of force satisfies the threshold force value. For example, wearable device 102 or another portion of HCI system 200 may include and/or incorporate at least one processor that generates a first input command for HCI system 200 in response to the determination that the amount of force satisfies the threshold force value. In this example, the first input command may direct and/or cause HCI system 200 to perform a specific action that corresponds to the amount of force exerted by the user.
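

Putting steps 2210 through 2240 together, the overall flow of method 2200 can be pictured as the short Python pipeline below; the model stub, threshold value, and command name are hypothetical placeholders for the trained inferential model and command set actually used.

def run_method_2200(emg_samples, infer_force, threshold_force=0.5):
    """End-to-end sketch of method 2200.

    emg_samples -- raw neuromuscular samples from the wrist-worn sensors
    infer_force -- stand-in for the trained inferential model
    """
    force = infer_force(emg_samples)             # step 2220: estimate force
    if force >= threshold_force:                 # step 2230: threshold check
        return {"command": "primary_input", "force": force}   # step 2240
    return None                                  # no command generated

def toy_model(samples):
    """Toy stand-in for a trained inferential model: mean rectified amplitude."""
    return sum(abs(s) for s in samples) / len(samples)

print(run_method_2200([0.7, 0.9, 0.8], toy_model))   # command generated
print(run_method_2200([0.1, 0.2, 0.1], toy_model))   # None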


EXAMPLE EMBODIMENTS

Example 1: A human computer interface (HCI) system comprising (1) a plurality of sensors that detect one or more neuromuscular signals via a body of a user of a computing system and (2) at least one processing device that (A) determines, based at least in part on the neuromuscular signals detected by the plurality of sensors, a state of at least one body part of the user and, in response to determining the state of the body part, (B) generates an input command for the computing system that accounts for the state of the body part.


Example 2: The HCI system of Example 1, wherein (1) the plurality of sensors generate data representative of the neuromuscular signals detected via the body of the user and (2) the processing device (A) provides the data representative of the neuromuscular signals to an inferential model and (B) determines, based at least in part on an output of the inferential model, the state of the body part of the user.


Example 3: The HCI system of Example 1, wherein the processing device (1) determines, based at least in part on the neuromuscular signals detected by the plurality of sensors, an amount of force produced by the body part of the user and, in response to determining the state of the body part and the amount of force produced by the body part, (2) generates the input command for the computing system to account for the state of the body part and the amount of force produced by the body part.


Example 4: The HCI system of Example 3, wherein (1) the plurality of sensors generate data representative of the neuromuscular signals detected via the body of the user, and (2) the processing device (A) provides the data representative of the neuromuscular signals to an inferential model and (B) determines, based at least in part on an output of the inferential model, the state of the body part of the user and the amount of force produced by the body part.


Example 5: The HCI system of Example 1, wherein the processing device causes the computing system to perform at least one action based at least in part on the input command.


Example 6: The HCI system of Example 5, wherein the processing device (1) identifies at least one characteristic of the action to be regulated in accordance with an amount of force produced by the body part and (2) formulates the input command to account for the amount of force produced by the body part such that the characteristic of the action corresponds to the amount of force produced by the body part.


Example 7: The HCI system of Example 6, wherein (1) the action comprises at least one of (A) scrolling through a graphical user interface (GUI) of an application running on the computing system or (B) drawing a visual element on a GUI of an application running on the computing system, and (2) the characteristic of the action comprises at least one of (A) a scrolling speed or (B) a width of a virtual drawing instrument.


Example 8: The HCI system of Example 5, wherein the action comprises at least one of (1) moving a cursor displayed on a GUI of an application running on the computing system, (2) associating a cursor of the computing system with a visual element displayed in a GUI of an application running on the computing system based at least in part on an updated position of the cursor relative to the visual element, or (3) providing, to the user of the computing system, a feedback indication of an association made between a cursor of the computing system and a visual element displayed in a GUI of an application running on the computing system.


Example 9: The HCI system of Example 8, wherein the feedback indication of the association comprises at least one of (1) modifying at least one characteristic of the visual element of the GUI or (2) adding, to the GUI, at least one further visual element that represents the association.


Example 10: The HCI system of Example 8, wherein associating the cursor with the visual element comprises (1) determining that the updated position of the cursor is within a certain distance of the visual element in the GUI and, in response to determining that the updated position of the cursor is within the certain distance of the visual element, (2) associating the cursor with the visual element.


Example 11: The HCI system of Example 10, wherein determining that the updated position of the cursor is within the certain distance of the visual element comprises (1) identifying a position of the visual element within the GUI, (2) identifying at least one position of at least one additional visual element within the GUI, and (3) determining that the updated position of the cursor is closer to the position of the visual element than the position of the additional visual element.


Example 12: The HCI system of Example 8, wherein associating the cursor with the visual element comprises (1) detecting a direction in which the cursor moved within the GUI to reach the updated position and (2) associating the cursor with the visual element based at least in part on the direction in which the cursor moved to reach the updated position.


Example 13: The HCI system of Example 8, wherein associating the cursor with the visual element comprises (1) detecting a speed at which the cursor moved within the GUI to reach the updated position and (2) associating the cursor with the visual element based at least in part on the speed at which the cursor moved to reach the updated position.


Example 14: The HCI system of Example 5, wherein the processing device (1) maintains a mapping between possible states of the body part and actions capable of being performed by the computing system and (2) determines the action to be performed by the computing system based at least in part on the mapping and the state of at least one body part.


Example 15: The HCI system of Example 14, wherein the processing device (1) maintains an additional mapping between the possible states of the body part and additional actions capable of being performed by the computing system and (2) activates the mapping such that one of the actions is performed by the computing system in response to one of the possible states of the body part.


Example 16: The HCI system of Example 15, wherein the processing device (1) determines, based at least in part on additional neuromuscular signals detected by the plurality of sensors, an additional state of the body part and, in response to determining the additional state of the body part, (2) transitions from the mapping to the additional mapping by (A) deactivating the mapping and (B) activating the additional mapping such that one of the additional actions is performed by the computing system in response to the one of the possible states of the body part.


Example 17: The HCI system of Example 5, wherein the action comprises at least one of (1) selecting a visual element of the GUI, (2) clicking on a visual element of the GUI, or (3) displaying a visual element in the GUI.


Example 18: The HCI system of Example 1, wherein the state of the body part comprises at least one of (1) a pose of the body part, (2) a gesture of the body part, or (3) an isometric contraction of the body part.


Example 19: A wearable device comprising (1) a plurality of sensors that detect one or more neuromuscular signals via a body of a user of a computing system and (2) at least one processing device communicatively coupled to the plurality of sensors, wherein the processing device (A) determines, based at least in part on the neuromuscular signals detected by the plurality of sensors, a state of at least one body part of the user and, in response to determining the state of the body part, (B) generates an input command for the computing system that accounts for the state of the body part.


Example 20: A method comprising (1) detecting, by a plurality of sensors incorporated into a wearable donned by a user of a computing system, one or more neuromuscular signals via a body of the user, (2) determining, by a processing device, a state of at least one body part of the user based at least in part on the neuromuscular signals detected by the plurality of sensors, and (3) generating, by the processing device in response to determining the state of the body part, an input command for the computing system that accounts for the state of the body part.


Embodiments of the present disclosure may include or be implemented in conjunction with various types of artificial-reality systems. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, for example, a virtual reality, an augmented reality, a mixed reality, a hybrid reality, or some combination and/or derivative thereof. Artificial-reality content may include completely computer-generated content or computer-generated content combined with captured (e.g., real-world) content. The artificial-reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional (3D) effect to the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, for example, create content in an artificial reality and/or are otherwise used in (e.g., to perform activities in) an artificial reality.


Artificial-reality systems may be implemented in a variety of different form factors and configurations. Some artificial-reality systems may be designed to work without near-eye displays (NEDs). Other artificial-reality systems may include an NED that also provides visibility into the real world (such as, e.g., augmented-reality system 2500 in FIG. 25) or that visually immerses a user in an artificial reality (such as, e.g., virtual-reality system 2600 in FIG. 26). While some artificial-reality devices may be self-contained systems, other artificial-reality devices may communicate and/or coordinate with external devices to provide an artificial-reality experience to a user. Examples of such external devices include handheld controllers, mobile devices, desktop computers, devices worn by a user, devices worn by one or more other users, and/or any other suitable external system.


Turning to FIG. 25, augmented-reality system 2500 may include an eyewear device 2502 with a frame 2510 configured to hold a left display device 2515(A) and a right display device 2515(B) in front of a user's eyes. Display devices 2515(A) and 2515(B) may act together or independently to present an image or series of images to a user. While augmented-reality system 2500 includes two displays, embodiments of this disclosure may be implemented in augmented-reality systems with a single NED or more than two NEDs.


In some embodiments, augmented-reality system 2500 may include one or more sensors, such as sensor 2540. Sensor 2540 may generate measurement signals in response to motion of augmented-reality system 2500 and may be located on substantially any portion of frame 2510. Sensor 2540 may represent one or more of a variety of different sensing mechanisms, such as a position sensor, an inertial measurement unit (IMU), a depth camera assembly, a structured light emitter and/or detector, or any combination thereof. In some embodiments, augmented-reality system 2500 may or may not include sensor 2540 or may include more than one sensor. In embodiments in which sensor 2540 includes an IMU, the IMU may generate calibration data based on measurement signals from sensor 2540. Examples of sensor 2540 may include, without limitation, accelerometers, gyroscopes, magnetometers, other suitable types of sensors that detect motion, sensors used for error correction of the IMU, or some combination thereof.


In some examples, augmented-reality system 2500 may also include a microphone array with a plurality of acoustic transducers 2520(A)-2520(J), referred to collectively as acoustic transducers 2520. Acoustic transducers 2520 may represent transducers that detect air pressure variations induced by sound waves. Each acoustic transducer 2520 may be configured to detect sound and convert the detected sound into an electronic format (e.g., an analog or digital format). The microphone array in FIG. 25 may include, for example, ten acoustic transducers: 2520(A) and 2520(B), which may be designed to be placed inside a corresponding ear of the user, acoustic transducers 2520(C), 2520(D), 2520(E), 2520(F), 2520(G), and 2520(H), which may be positioned at various locations on frame 2510, and/or acoustic transducers 2520(I) and 2520(J), which may be positioned on a corresponding neckband 2505.


In some embodiments, one or more of acoustic transducers 2520(A)-(F) may be used as output transducers (e.g., speakers). For example, acoustic transducers 2520(A) and/or 2520(B) may be earbuds or any other suitable type of headphone or speaker.


The configuration of acoustic transducers 2520 of the microphone array may vary. While augmented-reality system 2500 is shown in FIG. 25 as having ten acoustic transducers 2520, the number of acoustic transducers 2520 may be greater or less than ten. In some embodiments, using higher numbers of acoustic transducers 2520 may increase the amount of audio information collected and/or the sensitivity and accuracy of the audio information. In contrast, using a lower number of acoustic transducers 2520 may decrease the computing power required by an associated controller 2550 to process the collected audio information. In addition, the position of each acoustic transducer 2520 of the microphone array may vary. For example, the position of an acoustic transducer 2520 may include a defined position on the user, a defined coordinate on frame 2510, an orientation associated with each acoustic transducer 2520, or some combination thereof.


Acoustic transducers 2520(A) and 2520(B) may be positioned on different parts of the user's ear, such as behind the pinna, behind the tragus, and/or within the auricle or fossa. Additionally or alternatively, there may be acoustic transducers 2520 on or surrounding the ear in addition to acoustic transducers 2520 inside the ear canal. Having an acoustic transducer 2520 positioned next to an ear canal of a user may enable the microphone array to collect information on how sounds arrive at the ear canal. By positioning at least two of acoustic transducers 2520 on either side of a user's head (e.g., as binaural microphones), augmented-reality system 2500 may simulate binaural hearing and capture a 3D stereo sound field around a user's head. In some embodiments, acoustic transducers 2520(A) and 2520(B) may be connected to augmented-reality system 2500 via a wired connection 2530, and in other embodiments acoustic transducers 2520(A) and 2520(B) may be connected to augmented-reality system 2500 via a wireless connection (e.g., a Bluetooth connection). In still other embodiments, acoustic transducers 2520(A) and 2520(B) may not be used at all in conjunction with augmented-reality system 2500.


Acoustic transducers 2520 on frame 2510 may be positioned in a variety of different ways, including along the length of the temples, across the bridge, above or below display devices 2515(A) and 2515(B), or some combination thereof. Acoustic transducers 2520 may also be oriented such that the microphone array is able to detect sounds in a wide range of directions surrounding the user wearing the augmented-reality system 2500. In some embodiments, an optimization process may be performed during manufacturing of augmented-reality system 2500 to determine relative positioning of each acoustic transducer 2520 in the microphone array.


In some examples, augmented-reality system 2500 may include or be connected to an external device (e.g., a paired device), such as neckband 2505. Neckband 2505 generally represents any type or form of paired device. Thus, the following discussion of neckband 2505 may also apply to various other paired devices, such as charging cases, smart watches, smart phones, wrist bands, other wearable devices, hand-held controllers, tablet computers, laptop computers, other external compute devices, etc.


As shown, neckband 2505 may be coupled to eyewear device 2502 via one or more connectors. The connectors may be wired or wireless and may include electrical and/or non-electrical (e.g., structural) components. In some cases, eyewear device 2502 and neckband 2505 may operate independently without any wired or wireless connection between them. While FIG. 25 illustrates the components of eyewear device 2502 and neckband 2505 in example locations on eyewear device 2502 and neckband 2505, the components may be located elsewhere and/or distributed differently on eyewear device 2502 and/or neckband 2505. In some embodiments, the components of eyewear device 2502 and neckband 2505 may be located on one or more additional peripheral devices paired with eyewear device 2502, neckband 2505, or some combination thereof.


Pairing external devices, such as neckband 2505, with augmented-reality eyewear devices may enable the eyewear devices to achieve the form factor of a pair of glasses while still providing sufficient battery and computation power for expanded capabilities. Some or all of the battery power, computational resources, and/or additional features of augmented-reality system 2500 may be provided by a paired device or shared between a paired device and an eyewear device, thus reducing the weight, heat profile, and form factor of the eyewear device overall while still retaining desired functionality. For example, neckband 2505 may allow components that would otherwise be included on an eyewear device to be included in neckband 2505 since users may tolerate a heavier weight load on their shoulders than they would tolerate on their heads. Neckband 2505 may also have a larger surface area over which to diffuse and disperse heat to the ambient environment. Thus, neckband 2505 may allow for greater battery and computation capacity than might otherwise have been possible on a stand-alone eyewear device. Since weight carried in neckband 2505 may be less invasive to a user than weight carried in eyewear device 2502, a user may tolerate wearing a lighter eyewear device and carrying or wearing the paired device for greater lengths of time than a user would tolerate wearing a heavy standalone eyewear device, thereby enabling users to more fully incorporate artificial-reality environments into their day-to-day activities.


Neckband 2505 may be communicatively coupled with eyewear device 2502 and/or to other devices. These other devices may provide certain functions (e.g., tracking, localizing, depth mapping, processing, storage, etc.) to augmented-reality system 2500. In the embodiment of FIG. 25, neckband 2505 may include two acoustic transducers (e.g., 2520(I) and 2520(J)) that are part of the microphone array (or potentially form their own microphone subarray). Neckband 2505 may also include a controller 2525 and a power source 2535.


Acoustic transducers 2520(I) and 2520(J) of neckband 2505 may be configured to detect sound and convert the detected sound into an electronic format (analog or digital). In the embodiment of FIG. 25, acoustic transducers 2520(I) and 2520(J) may be positioned on neckband 2505, thereby increasing the distance between the neckband acoustic transducers 2520(I) and 2520(J) and other acoustic transducers 2520 positioned on eyewear device 2502. In some cases, increasing the distance between acoustic transducers 2520 of the microphone array may improve the accuracy of beamforming performed via the microphone array. For example, if a sound is detected by acoustic transducers 2520(C) and 2520(D) and the distance between acoustic transducers 2520(C) and 2520(D) is greater than, e.g., the distance between acoustic transducers 2520(D) and 2520(E), the determined source location of the detected sound may be more accurate than if the sound had been detected by acoustic transducers 2520(D) and 2520(E).
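By way of a purely illustrative example (with a hypothetical 48 kHz sampling rate and hypothetical transducer spacings that are not taken from this disclosure), the following Python sketch shows why a larger spacing between two acoustic transducers yields a finer angular resolution for a time-difference-of-arrival (TDOA) estimate under the far-field relation θ = arcsin(c·τ/d):

    import math

    SPEED_OF_SOUND = 343.0  # m/s, approximate speed of sound in air
    SAMPLE_RATE = 48_000    # Hz, hypothetical sampling rate for the microphone array

    def angular_step_degrees(spacing_m: float) -> float:
        """Angle change (near broadside) produced by a one-sample change in TDOA.

        Uses the far-field relation theta = arcsin(c * tau / d), so a one-sample
        timing step corresponds to roughly arcsin(c / (fs * d)).
        """
        return math.degrees(math.asin(SPEED_OF_SOUND / (SAMPLE_RATE * spacing_m)))

    # Hypothetical spacings: a closely spaced pair on the frame versus a pair that
    # spans the frame and the neckband.
    for spacing in (0.02, 0.15):  # meters
        print(f"{spacing * 100:4.0f} cm spacing -> ~{angular_step_degrees(spacing):5.2f} deg per TDOA sample")

With the wider (hypothetical) 15 cm spacing, a one-sample timing error shifts the estimated direction by only a few degrees, whereas the same error with a 2 cm spacing shifts it by roughly twenty degrees, which is consistent with the accuracy benefit described above.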


Controller 2525 of neckband 2505 may process information generated by the sensors on neckband 2505 and/or augmented-reality system 2500. For example, controller 2525 may process information from the microphone array that describes sounds detected by the microphone array. For each detected sound, controller 2525 may perform a direction-of-arrival (DOA) estimation to estimate a direction from which the detected sound arrived at the microphone array. As the microphone array detects sounds, controller 2525 may populate an audio data set with the information. In embodiments in which augmented-reality system 2500 includes an inertial measurement unit, controller 2525 may compute all inertial and spatial calculations from the IMU located on eyewear device 2502. A connector may convey information between augmented-reality system 2500 and neckband 2505 and between augmented-reality system 2500 and controller 2525. The information may be in the form of optical data, electrical data, wireless data, or any other transmittable data form. Moving the processing of information generated by augmented-reality system 2500 to neckband 2505 may reduce weight and heat in eyewear device 2502, making it more comfortable to the user.
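The disclosure does not specify a particular DOA algorithm; as a rough, hypothetical sketch of this kind of processing, the Python snippet below estimates a direction of arrival for a detected sound from a single pair of transducers using a plain cross-correlation TDOA estimate and appends the result to an audio data set (the spacing, sampling rate, test signal, and data-set structure are assumptions made solely for illustration):

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s
    SAMPLE_RATE = 48_000    # Hz, assumed sampling rate
    MIC_SPACING = 0.15      # m, assumed distance between the two transducers

    def estimate_doa_degrees(ch_a: np.ndarray, ch_b: np.ndarray) -> float:
        """Estimate a far-field direction of arrival from two synchronized channels."""
        # Cross-correlate the channels and take the lag with the largest correlation.
        corr = np.correlate(ch_a, ch_b, mode="full")
        lag_samples = int(np.argmax(corr)) - (len(ch_b) - 1)
        tdoa = lag_samples / SAMPLE_RATE
        # Clamp to the physically valid range before applying theta = arcsin(c * tau / d).
        ratio = np.clip(SPEED_OF_SOUND * tdoa / MIC_SPACING, -1.0, 1.0)
        return float(np.degrees(np.arcsin(ratio)))

    # Populate a simple audio data set as sounds are detected (synthetic test signal).
    audio_data_set = []
    t = np.arange(0, 0.01, 1.0 / SAMPLE_RATE)
    tone = np.sin(2 * np.pi * 1000 * t)
    delayed = np.roll(tone, 5)  # the same (periodic) tone arriving 5 samples later at one transducer
    audio_data_set.append({"doa_degrees": estimate_doa_degrees(tone, delayed), "num_samples": len(tone)})
    print(audio_data_set)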


Power source 2535 in neckband 2505 may provide power to eyewear device 2502 and/or to neckband 2505. Power source 2535 may include, without limitation, lithium-ion batteries, lithium-polymer batteries, primary lithium batteries, alkaline batteries, or any other form of power storage. In some cases, power source 2535 may be a wired power source. Including power source 2535 on neckband 2505 instead of on eyewear device 2502 may help better distribute the weight and heat generated by power source 2535.


As noted, some artificial-reality systems may, instead of blending an artificial reality with actual reality, substantially replace one or more of a user's sensory perceptions of the real world with a virtual experience. One example of this type of system is a head-worn display system, such as virtual-reality system 2600 in FIG. 26, that mostly or completely covers a user's field of view. Virtual-reality system 2600 may include a front rigid body 2602 and a band 2604 shaped to fit around a user's head. Virtual-reality system 2600 may also include output audio transducers 2606(A) and 2606(B). Furthermore, while not shown in FIG. 26, front rigid body 2602 may include one or more electronic elements, including one or more electronic displays, one or more inertial measurement units (IMUs), one or more tracking emitters or detectors, and/or any other suitable device or system for creating an artificial-reality experience.


Artificial-reality systems may include a variety of types of visual feedback mechanisms. For example, display devices in augmented-reality system 2500 and/or virtual-reality system 2600 may include one or more liquid crystal displays (LCDs), light emitting diode (LED) displays, organic LED (OLED) displays, digital light processing (DLP) micro-displays, liquid crystal on silicon (LCoS) micro-displays, and/or any other suitable type of display screen. These artificial-reality systems may include a single display screen for both eyes or may provide a display screen for each eye, which may allow for additional flexibility for varifocal adjustments or for correcting a user's refractive error. Some of these artificial-reality systems may also include optical subsystems having one or more lenses (e.g., conventional concave or convex lenses, Fresnel lenses, adjustable liquid lenses, etc.) through which a user may view a display screen. These optical subsystems may serve a variety of purposes, including to collimate (e.g., make an object appear at a greater distance than its physical distance), to magnify (e.g., make an object appear larger than its actual size), and/or to relay light (to, e.g., the viewer's eyes). These optical subsystems may be used in a non-pupil-forming architecture (such as a single-lens configuration that directly collimates light but results in so-called pincushion distortion) and/or a pupil-forming architecture (such as a multi-lens configuration that produces so-called barrel distortion to nullify pincushion distortion).
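As a simple, hypothetical illustration of the collimating and magnifying roles noted above (the focal length and display distance below are invented for the example and are not optical prescriptions from this disclosure), the thin-lens equation 1/f = 1/d_o + 1/d_i shows how a display placed just inside the focal length of a viewing lens produces a virtual image that appears both farther away and larger:

    def thin_lens_image(focal_length_m: float, object_dist_m: float) -> tuple[float, float]:
        """Return (image distance, lateral magnification) from the thin-lens equation.

        A negative image distance indicates a virtual image on the same side as the
        object, which is what a near-eye viewing lens produces.
        """
        image_dist = 1.0 / (1.0 / focal_length_m - 1.0 / object_dist_m)
        magnification = -image_dist / object_dist_m
        return image_dist, magnification

    # Hypothetical numbers: a 40 mm focal-length lens with the display 38 mm away.
    d_i, m = thin_lens_image(0.040, 0.038)
    print(f"virtual image at {abs(d_i) * 100:.0f} cm, magnification {abs(m):.0f}x")

    # Placing the display exactly at the focal plane (d_o = f) collimates the light:
    # the term 1/f - 1/d_o goes to zero and the virtual image recedes to infinity.

Moving the display slightly closer to or farther from the lens shifts the virtual image distance, which is the basic mechanism behind the varifocal adjustments mentioned above.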


In addition to or instead of using display screens, some of the artificial-reality systems described herein may include one or more projection systems. For example, display devices in augmented-reality system 2500 and/or virtual-reality system 2600 may include micro-LED projectors that project light (using, e.g., a waveguide) into display devices, such as clear combiner lenses that allow ambient light to pass through. The display devices may refract the projected light toward a user's pupil and may enable a user to simultaneously view both artificial-reality content and the real world. The display devices may accomplish this using any of a variety of different optical components, including waveguide components (e.g., holographic, planar, diffractive, polarized, and/or reflective waveguide elements), light-manipulation surfaces and elements (such as diffractive, reflective, and refractive elements and gratings), coupling elements, etc. Artificial-reality systems may also be configured with any other suitable type or form of image projection system, such as retinal projectors used in virtual retina displays.


The artificial-reality systems described herein may also include various types of computer vision components and subsystems. For example, augmented-reality system 2500 and/or virtual-reality system 2600 may include one or more optical sensors, such as two-dimensional (2D) or 3D cameras, structured light transmitters and detectors, time-of-flight depth sensors, single-beam or sweeping laser rangefinders, 3D LiDAR sensors, and/or any other suitable type or form of optical sensor. An artificial-reality system may process data from one or more of these sensors to identify a location of a user, to map the real world, to provide a user with context about real-world surroundings, and/or to perform a variety of other functions.


The artificial-reality systems described herein may also include one or more input and/or output audio transducers. Output audio transducers may include voice coil speakers, ribbon speakers, electrostatic speakers, piezoelectric speakers, bone conduction transducers, cartilage conduction transducers, tragus-vibration transducers, and/or any other suitable type or form of audio transducer. Similarly, input audio transducers may include condenser microphones, dynamic microphones, ribbon microphones, and/or any other type or form of input transducer. In some embodiments, a single transducer may be used for both audio input and audio output.


In some embodiments, the artificial-reality systems described herein may also include tactile (i.e., haptic) feedback systems, which may be incorporated into headwear, gloves, body suits, handheld controllers, environmental devices (e.g., chairs, floormats, etc.), and/or any other type of device or system. Haptic feedback systems may provide various types of cutaneous feedback, including vibration, force, traction, texture, and/or temperature. Haptic feedback systems may also provide various types of kinesthetic feedback, such as motion and compliance. Haptic feedback may be implemented using motors, piezoelectric actuators, fluidic systems, and/or a variety of other types of feedback mechanisms. Haptic feedback systems may be implemented independent of other artificial-reality devices, within other artificial-reality devices, and/or in conjunction with other artificial-reality devices.


By providing haptic sensations, audible content, and/or visual content, artificial-reality systems may create an entire virtual experience or enhance a user's real-world experience in a variety of contexts and environments. For instance, artificial-reality systems may assist or extend a user's perception, memory, or cognition within a particular environment. Some systems may enhance a user's interactions with other people in the real world or may enable more immersive interactions with other people in a virtual world. Artificial-reality systems may also be used for educational purposes (e.g., for teaching or training in schools, hospitals, government organizations, military organizations, business enterprises, etc.), entertainment purposes (e.g., for playing video games, listening to music, watching video content, etc.), and/or for accessibility purposes (e.g., as hearing aids, visual aids, etc.). The embodiments disclosed herein may enable or enhance a user's artificial-reality experience in one or more of these contexts and environments and/or in other contexts and environments.


The process parameters and sequence of the steps described and/or illustrated herein are given by way of example only and can be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed. The various exemplary methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps in addition to those disclosed.


Unless otherwise noted, the terms “connected to” and “coupled to” (and their derivatives), as used in the specification and claims, are to be construed as permitting both direct and indirect (i.e., via other elements or components) connection. In addition, the terms “a” or “an,” as used in the specification and claims, are to be construed as meaning “at least one of.” Finally, for ease of use, the terms “including” and “having” (and their derivatives), as used in the specification and claims, are interchangeable with and have the same meaning as the word “comprising.”

Claims
  • 1. A human-computer interface (HCI) system comprising: at least one processor; a plurality of sensors that detect one or more neuromuscular signals from a forearm or wrist of a user, wherein the plurality of sensors are arranged on one or more wearable devices; and memory that stores: one or more trained inferential models that: determine an amount of force associated with the one or more neuromuscular signals detected by the plurality of sensors; and determine a single hand gesture performed by the user based at least in part on the one or more neuromuscular signals detected by the plurality of sensors; and computer-executable instructions that, when executed by the at least one processor, cause the at least one processor to: identify the amount of force associated with the one or more neuromuscular signals as determined by the one or more trained inferential models; determine whether the amount of force satisfies a threshold force value; determine whether the amount of force and an additional amount of force were both exerted by the user while performing the single hand gesture; in accordance with a determination that the amount of force satisfies the threshold force value and was exerted by the user while performing the single hand gesture, generate a first input command for the HCI system; and in accordance with a determination that the additional amount of force was exerted by the user while performing the single hand gesture, generate a second input command for the HCI system.
  • 2. The HCI system of claim 1, wherein: the threshold force value is a first threshold force value; and the computer-executable instructions, when executed by the at least one processor, further cause the at least one processor to: determine whether the amount of force also satisfies a second threshold force value that is greater than the first threshold force value; and in accordance with a determination that the amount of force satisfies the first threshold force value and the second threshold force value, generate a second input command that differs from the first input command for the HCI system.
  • 3. The HCI system of claim 2, wherein: the one or more trained inferential models determine an additional amount of force associated with the one or more neuromuscular signals detected by the plurality of sensors; and the computer-executable instructions, when executed by the at least one processor, further cause the at least one processor to: identify the additional amount of force associated with the one or more neuromuscular signals as determined by the one or more trained inferential models; determine whether the additional amount of force does not satisfy the first threshold force value; and in accordance with a determination that the additional amount of force does not satisfy the first threshold force value, forgo generation of an additional input command for the HCI system.
  • 4. The HCI system of claim 1, wherein: the one or more trained inferential models further determine one or more hand gestures performed by the user based at least in part on the one or more neuromuscular signals detected by the plurality of sensors; and the computer-executable instructions, when executed by the at least one processor, further cause the at least one processor to: identify the one or more hand gestures performed by the user as determined by the one or more trained inferential models; determine whether the amount of force was exerted by the user while performing the one or more hand gestures; and generate the first input command for the HCI system in accordance with a determination that the amount of force was exerted by the user while performing the one or more hand gestures.
  • 5. The HCI system of claim 4, wherein: the one or more hand gestures comprise a first hand gesture and a second hand gesture; and the computer-executable instructions, when executed by the at least one processor, further cause the at least one processor to: determine that the amount of force was exerted by the user while performing the first hand gesture; determine whether an additional amount of force that satisfies an additional threshold force value was exerted by the user while performing the second hand gesture; generate the first input command for the HCI system in accordance with a determination that the amount of force was exerted by the user while performing the first hand gesture; and generate a second input command for the HCI system in accordance with a determination that the additional amount of force that satisfies the additional threshold force value was exerted by the user while performing the second hand gesture.
  • 6. The HCI system of claim 4, wherein the computer-executable instructions, when executed by the at least one processor, further cause the at least one processor to: maintain a mapping between a set of hand gestures and a set of actions capable of being performed by the HCI system; select an action to be performed by the HCI system based at least in part on the mapping and the one or more hand gestures performed by the user; and formulate the first input command to direct the HCI system to perform the action in accordance with the mapping.
  • 7. The HCI system of claim 6, wherein the computer-executable instructions, when executed by the at least one processor, further cause the at least one processor to: maintain an additional mapping between the set of hand gestures and an additional set of actions capable of being performed by the HCI system; and prior to formulating the first input command for the HCI system, activate the mapping such that the action is selected in response to the one or more hand gestures performed by the user.
  • 8. The HCI system of claim 7, wherein the computer-executable instructions, when executed by the at least one processor, further cause the at least one processor to: deactivate the mapping after formulating the first input command for the HCI system; activate the additional mapping; and after activating the additional mapping, formulate a second input command to direct the HCI system to perform a different action included in the additional mapping in accordance with a subsequent determination that the user performed the one or more hand gestures.
  • 9. The HCI system of claim 6, wherein the computer-executable instructions, when executed by the at least one processor, further cause the at least one processor to: map the action to be performed by the HCI system to a plurality of conditions; determine, based at least in part on the one or more neuromuscular signals detected by the plurality of sensors, that the plurality of conditions mapped to the action have been satisfied; and generate the first input command for the HCI system due at least in part to the plurality of conditions having been satisfied.
  • 10. The HCI system of claim 4, wherein the computer-executable instructions, when executed by the at least one processor, further cause the at least one processor to: identify photographic data representative of the one or more hand gestures as captured by a camera; and provide the photographic data to the one or more trained inferential models to enable the one or more trained inferential models to determine the one or more hand gestures based at least in part on the one or more neuromuscular signals and the photographic data.
  • 11. The HCI system of claim 10, wherein: the one or more hand gestures comprise either: an index finger pinch; or a middle finger pinch; and the one or more trained inferential models are able to distinguish between an index finger pinch and a middle finger pinch based at least in part on the one or more neuromuscular signals and the photographic data.
  • 12. The HCI system of claim 1, wherein: the single hand gesture comprises a fist formed by a hand of the user; and the computer-executable instructions, when executed by the at least one processor, further cause the at least one processor to: formulate the first input command to control a speed of a cursor implemented on the HCI system in accordance with the amount of force exerted by the user while performing the single hand gesture; and formulate the first input command to increase or decrease the speed of the cursor implemented on the HCI system in accordance with the additional amount of force exerted by the user while performing the single hand gesture.
  • 13. The HCI system of claim 1, wherein: the one or more trained inferential models determine an additional amount of force associated with the one or more neuromuscular signals detected by the plurality of sensors; and the computer-executable instructions, when executed by the at least one processor, further cause the at least one processor to: identify the additional amount of force associated with the one or more neuromuscular signals as determined by the one or more trained inferential models; determine whether the additional amount of force does not satisfy the threshold force value; and in accordance with a determination that the additional amount of force does not satisfy the threshold force value, generate a second input command that differs from the first input command for the HCI system.
  • 14. The HCI system of claim 13, wherein the first input command or the second input command directs the HCI system to perform at least one of: navigate a graphical user interface (GUI) of an application; draw a visual element on a GUI of an application; control a scrolling speed of a GUI of an application; scroll through a GUI of an application; modify a width of a virtual drawing instrument used to draw in a GUI of an application; associate a cursor with a visual element displayed in a GUI of an application; provide a feedback indication of an association made between a cursor and a visual element displayed in a GUI of an application; move a selection indicator forward or backward along a radial menu displayed in a GUI of an application; modify a characteristic of a visual element displayed in a GUI of an application; display a visual element in a GUI of an application; click a visual element on a GUI of an application; or select a visual element on a GUI of an application.
  • 15. The HCI system of claim 1, wherein the one or more wearable devices, the at least one processor, and the memory are incorporated into an artificial-reality system that includes a head-mounted display.
  • 16. An artificial-reality system comprising: a head-mounted display; and a human-computer interface (HCI) system communicatively coupled to the head-mounted display, wherein the HCI system comprises: at least one processor; a plurality of sensors that detect one or more neuromuscular signals from a forearm or wrist of a user, wherein the plurality of sensors are arranged on one or more wearable devices; and memory that stores: one or more trained inferential models that: determine an amount of force associated with the one or more neuromuscular signals detected by the plurality of sensors; and determine a single hand gesture performed by the user based at least in part on the one or more neuromuscular signals detected by the plurality of sensors; and computer-executable instructions that, when executed by the at least one processor, cause the at least one processor to: identify the amount of force associated with the one or more neuromuscular signals as determined by the one or more trained inferential models; determine whether the amount of force satisfies a threshold force value; determine whether the amount of force and an additional amount of force were both exerted by the user while performing the single hand gesture; in accordance with a determination that the amount of force satisfies the threshold force value and was exerted by the user while performing the single hand gesture, generate a first input command for the artificial-reality system; and in accordance with a determination that the additional amount of force was exerted by the user while performing the single hand gesture, generate a second input command for the HCI system.
  • 17. The artificial-reality system of claim 16, wherein: the threshold force value is a first threshold force value; and the computer-executable instructions, when executed by the at least one processor, further cause the at least one processor to: determine whether the amount of force also satisfies a second threshold force value that is greater than the first threshold force value; and in accordance with a determination that the amount of force satisfies the first threshold force value and the second threshold force value, generate a second input command that differs from the first input command for the artificial-reality system.
  • 18. The artificial-reality system of claim 17, wherein: the one or more trained inferential models determine an additional amount of force associated with the one or more neuromuscular signals detected by the plurality of sensors; and the computer-executable instructions, when executed by the at least one processor, further cause the at least one processor to: identify the additional amount of force associated with the one or more neuromuscular signals as determined by the one or more trained inferential models; determine whether the additional amount of force does not satisfy the first threshold force value; and in accordance with a determination that the additional amount of force does not satisfy the first threshold force value, forgo generation of an additional input command for the artificial-reality system.
  • 19. A method comprising: detecting, by a plurality of sensors incorporated into one or more wearable devices of a human-computer interface (HCI) system, one or more neuromuscular signals from a forearm or wrist of a user; determining, by one or more trained inferential models implemented by at least one processor of the HCI system, an amount of force associated with the one or more neuromuscular signals detected by the plurality of sensors; determining, by the one or more trained inferential models, a single hand gesture performed by the user based at least in part on the one or more neuromuscular signals detected by the plurality of sensors; determining, by the at least one processor of the HCI system, whether the amount of force associated with the one or more neuromuscular signals satisfies a threshold force value; determining, by the one or more trained inferential models, whether the amount of force and an additional amount of force were both exerted by the user while performing the single hand gesture; generating, by the at least one processor of the HCI system, a first input command for the HCI system in accordance with a determination that the amount of force satisfies the threshold force value and was exerted by the user while performing the single hand gesture; and generating, by the at least one processor of the HCI system, a second input command for the HCI system in accordance with a determination that the additional amount of force was exerted by the user while performing the single hand gesture.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 16/863,098 filed 30 Apr. 2020, the disclosure of which is incorporated in its entirety by this reference. This application also claims the benefit of priority to U.S. Provisional Application Nos. 62/840,947; 62/840,980; 62/840,966; 62/841,069; 62/841,100; and 62/841,107, all of which were filed Apr. 30, 2019. The contents of these provisional applications are also incorporated herein by reference in their entirety.

US Referenced Citations (723)
Number Name Date Kind
1411995 Dull Apr 1922 A
3580243 Johnson May 1971 A
3620208 Wayne et al. Nov 1971 A
3735425 Hoshall et al. May 1973 A
3880146 Everett et al. Apr 1975 A
4055168 Miller et al. Oct 1977 A
4602639 Hoogendoorn et al. Jul 1986 A
4705408 Jordi Nov 1987 A
4817064 Milles Mar 1989 A
4896120 Kamil Jan 1990 A
5003978 Dunseath, Jr. Apr 1991 A
D322227 Warhol Dec 1991 S
5081852 Cox Jan 1992 A
5251189 Thorp Oct 1993 A
D348660 Parsons Jul 1994 S
5445869 Ishikawa et al. Aug 1995 A
5462065 Cusimano Oct 1995 A
5482051 Reddy et al. Jan 1996 A
5605059 Woodward Feb 1997 A
5625577 Kunii et al. Apr 1997 A
5683404 Johnson Nov 1997 A
6005548 Latypov et al. Dec 1999 A
6009210 Kang Dec 1999 A
6032530 Hock Mar 2000 A
6066794 Longo May 2000 A
6184847 Fateh et al. Feb 2001 B1
6238338 DeLuca et al. May 2001 B1
6244873 Hill et al. Jun 2001 B1
6377277 Yamamoto Apr 2002 B1
D459352 Giovanniello Jun 2002 S
6411843 Zarychta Jun 2002 B1
6487906 Hock Dec 2002 B1
6510333 Licata et al. Jan 2003 B1
6527711 Stivoric et al. Mar 2003 B1
6619836 Silvant et al. Sep 2003 B1
6658287 Litt et al. Dec 2003 B1
6720984 Jorgensen et al. Apr 2004 B1
6743982 Biegelsen et al. Jun 2004 B2
6771294 Pulli et al. Aug 2004 B1
6774885 Even-Zohar Aug 2004 B1
6807438 Brun Del Re et al. Oct 2004 B1
D502661 Rapport Mar 2005 S
D502662 Rapport Mar 2005 S
6865409 Getsla et al. Mar 2005 B2
D503646 Rapport Apr 2005 S
6880364 Vidolin et al. Apr 2005 B1
6901286 Sinderby et al. May 2005 B1
6927343 Watanabe et al. Aug 2005 B2
6942621 Avinash et al. Sep 2005 B2
6965842 Rekimoto Nov 2005 B2
6972734 Ohshima et al. Dec 2005 B1
6984208 Zheng Jan 2006 B2
7022919 Brist et al. Apr 2006 B2
7086218 Pasach Aug 2006 B1
7089148 Bachmann et al. Aug 2006 B1
D535401 Travis et al. Jan 2007 S
7173437 Hervieux et al. Feb 2007 B2
7209114 Radley-Smith Apr 2007 B2
D543212 Marks May 2007 S
7265298 Maghribi et al. Sep 2007 B2
7271774 Puuri Sep 2007 B2
7333090 Tanaka et al. Feb 2008 B2
7351975 Brady et al. Apr 2008 B2
7450107 Radley-Smith Nov 2008 B2
7491892 Wagner et al. Feb 2009 B2
7517725 Reis Apr 2009 B2
7558622 Tran Jul 2009 B2
7574253 Edney et al. Aug 2009 B2
7580742 Tan et al. Aug 2009 B2
7596393 Jung et al. Sep 2009 B2
7618260 Daniel et al. Nov 2009 B2
7636549 Ma et al. Dec 2009 B2
7640007 Chen et al. Dec 2009 B2
7660126 Cho et al. Feb 2010 B2
7761390 Ford Jul 2010 B2
7787946 Stahmann et al. Aug 2010 B2
7805386 Greer Sep 2010 B2
7809435 Ettare et al. Oct 2010 B1
7844310 Anderson Nov 2010 B2
7870211 Pascal et al. Jan 2011 B2
7901368 Flaherty et al. Mar 2011 B2
7925100 Howell et al. Apr 2011 B2
7948763 Chuang May 2011 B2
D643428 Janky et al. Aug 2011 S
D646192 Woode Oct 2011 S
8054061 Prance et al. Nov 2011 B2
D654622 Hsu Feb 2012 S
8170656 Tan et al. May 2012 B2
8179604 Prada Gomez et al. May 2012 B1
8188937 Amafuji et al. May 2012 B1
8190249 Gharieb et al. May 2012 B1
D661613 Demeglio Jun 2012 S
8203502 Chi et al. Jun 2012 B1
8207473 Axisa et al. Jun 2012 B2
8212859 Tang et al. Jul 2012 B2
8311623 Sanger Nov 2012 B2
8348538 Van Loenen et al. Jan 2013 B2
8351651 Lee Jan 2013 B2
8355671 Kramer et al. Jan 2013 B2
8384683 Luo Feb 2013 B2
8389862 Arora et al. Mar 2013 B2
8421634 Tan et al. Apr 2013 B2
8427977 Workman et al. Apr 2013 B2
D682727 Bulgari May 2013 S
8435191 Barboutis et al. May 2013 B2
8437844 Syed Momen et al. May 2013 B2
8447704 Tan et al. May 2013 B2
8467270 Gossweiler, III et al. Jun 2013 B2
8469741 Oster et al. Jun 2013 B2
8484022 Vanhoucke Jul 2013 B1
D689862 Liu Sep 2013 S
8591411 Banet et al. Nov 2013 B2
D695454 Moore Dec 2013 S
8620361 Bailey et al. Dec 2013 B2
8624124 Koo et al. Jan 2014 B2
8702629 Giuffrida et al. Apr 2014 B2
8704882 Turner Apr 2014 B2
8718980 Garudadri et al. May 2014 B2
8743052 Keller et al. Jun 2014 B1
8744543 Li et al. Jun 2014 B2
8754862 Zaliva Jun 2014 B2
8777668 Ikeda et al. Jul 2014 B2
D716457 Brefka et al. Oct 2014 S
D717685 Bailey et al. Nov 2014 S
8879276 Wang Nov 2014 B2
8880163 Barachant et al. Nov 2014 B2
8883287 Boyce et al. Nov 2014 B2
8890875 Jammes et al. Nov 2014 B2
8892479 Tan et al. Nov 2014 B2
8895865 Lenahan et al. Nov 2014 B2
8912094 Koo et al. Dec 2014 B2
8914472 Lee et al. Dec 2014 B1
8922481 Kauffmann et al. Dec 2014 B1
8970571 Wong et al. Mar 2015 B1
8971023 Olsson et al. Mar 2015 B2
9018532 Wesselmann et al. Apr 2015 B2
9037530 Tan et al. May 2015 B2
9086687 Park et al. Jul 2015 B2
9092664 Forutanpour et al. Jul 2015 B2
D736664 Paradise et al. Aug 2015 S
9146730 Lazar Sep 2015 B2
D741855 Park et al. Oct 2015 S
9170674 Forutanpour et al. Oct 2015 B2
D742272 Bailey et al. Nov 2015 S
D742874 Cheng et al. Nov 2015 S
D743963 Osterhout Nov 2015 S
9182826 Powledge et al. Nov 2015 B2
9211417 Heldman et al. Dec 2015 B2
9218574 Phillipps et al. Dec 2015 B2
D747714 Erbeus Jan 2016 S
9235934 Mandella et al. Jan 2016 B2
9240069 Li Jan 2016 B1
D750623 Park et al. Mar 2016 S
D751065 Magi Mar 2016 S
9278453 Assad Mar 2016 B2
9299248 Lake et al. Mar 2016 B2
D756359 Bailey et al. May 2016 S
9329694 Slonneger May 2016 B2
9341659 Poupyrev et al. May 2016 B2
9351653 Harrison May 2016 B1
9367139 Ataee et al. Jun 2016 B2
9372535 Bailey et al. Jun 2016 B2
9389694 Ataee et al. Jul 2016 B2
9393418 Giuffrida et al. Jul 2016 B2
9402582 Parviz et al. Aug 2016 B1
9408316 Bailey et al. Aug 2016 B2
9418927 Axisa et al. Aug 2016 B2
9439566 Arne et al. Sep 2016 B2
9459697 Bedikian et al. Oct 2016 B2
9472956 Michaelis et al. Oct 2016 B2
9477313 Mistry et al. Oct 2016 B2
9483123 Aleem et al. Nov 2016 B2
9529434 Choi et al. Dec 2016 B2
9597015 McNames et al. Mar 2017 B2
9600030 Bailey et al. Mar 2017 B2
9612661 Wagner et al. Apr 2017 B2
9613262 Holz Apr 2017 B2
9654477 Kotamraju May 2017 B1
9659403 Horowitz May 2017 B1
9687168 John Jun 2017 B2
9696795 Marcolina et al. Jul 2017 B2
9720515 Wagner et al. Aug 2017 B2
9741169 Holz Aug 2017 B1
9766709 Holz Sep 2017 B2
9785247 Horowitz et al. Oct 2017 B1
9788789 Bailey Oct 2017 B2
9864431 Keskin et al. Jan 2018 B2
9867548 Le et al. Jan 2018 B2
9880632 Ataee et al. Jan 2018 B2
9891718 Connor Feb 2018 B2
9921641 Worley, III et al. Mar 2018 B1
10042422 Morun et al. Aug 2018 B2
10070799 Ang et al. Sep 2018 B2
10078435 Noel Sep 2018 B2
10101809 Morun et al. Oct 2018 B2
10152082 Bailey Dec 2018 B2
10185416 Mistry et al. Jan 2019 B2
10188309 Morun et al. Jan 2019 B2
10199008 Aleem et al. Feb 2019 B2
10203751 Keskin et al. Feb 2019 B2
10216274 Chapeskie et al. Feb 2019 B2
10251577 Morun et al. Apr 2019 B2
10310601 Morun et al. Jun 2019 B2
10331210 Morun et al. Jun 2019 B2
10362958 Morun et al. Jul 2019 B2
10409371 Kaifosh et al. Sep 2019 B2
10437335 Daniels Oct 2019 B2
10460455 Giurgica-Tiron et al. Oct 2019 B2
10489986 Kaifosh et al. Nov 2019 B2
10496168 Kaifosh et al. Dec 2019 B2
10504286 Kaifosh et al. Dec 2019 B2
10520378 Brown et al. Dec 2019 B1
10528135 Bailey et al. Jan 2020 B2
10558273 Park et al. Feb 2020 B2
10592001 Berenzweig et al. Mar 2020 B2
10610737 Crawford Apr 2020 B1
10676083 De Sapio et al. Jun 2020 B1
10687759 Guo et al. Jun 2020 B2
10905350 Berenzweig et al. Feb 2021 B2
10905383 Barachant Feb 2021 B2
10937414 Berenzweig et al. Mar 2021 B2
10990174 Kaifosh et al. Apr 2021 B2
11009951 Bailey et al. May 2021 B2
11150730 Anderson Oct 2021 B1
20020032386 Sackner et al. Mar 2002 A1
20020077534 DuRousseau Jun 2002 A1
20020094701 Biegelsen et al. Jul 2002 A1
20020198472 Kramer Dec 2002 A1
20030036691 Stanaland et al. Feb 2003 A1
20030051505 Robertson et al. Mar 2003 A1
20030144586 Tsubata Jul 2003 A1
20030144829 Geatz et al. Jul 2003 A1
20030171921 Manabe et al. Sep 2003 A1
20030184544 Prudent Oct 2003 A1
20040010210 Avinash et al. Jan 2004 A1
20040054273 Finneran et al. Mar 2004 A1
20040068409 Tanaka Apr 2004 A1
20040073104 Brun Del Re et al. Apr 2004 A1
20040080499 Lui Apr 2004 A1
20040092839 Shin et al. May 2004 A1
20040194500 Rapport Oct 2004 A1
20040210165 Marmaropoulos et al. Oct 2004 A1
20040243342 Rekimoto Dec 2004 A1
20040254617 Hemmerling et al. Dec 2004 A1
20050005637 Rapport Jan 2005 A1
20050012715 Ford Jan 2005 A1
20050070227 Shen et al. Mar 2005 A1
20050070791 Edney et al. Mar 2005 A1
20050115561 Stahmann et al. Jun 2005 A1
20050119701 Lauter et al. Jun 2005 A1
20050177038 Kolpin et al. Aug 2005 A1
20060018833 Murphy et al. Jan 2006 A1
20060037359 Stinespring Feb 2006 A1
20060058699 Vitiello et al. Mar 2006 A1
20060061544 Min et al. Mar 2006 A1
20060121958 Jung et al. Jun 2006 A1
20060129057 Maekawa et al. Jun 2006 A1
20060149338 Flaherty et al. Jul 2006 A1
20060211956 Sankai Sep 2006 A1
20070009151 Pittman et al. Jan 2007 A1
20070016265 Davoodi Jan 2007 A1
20070023662 Brady et al. Feb 2007 A1
20070132785 Ebersole, Jr. et al. Jun 2007 A1
20070148624 Nativ Jun 2007 A1
20070172797 Hada et al. Jul 2007 A1
20070177770 Derchak et al. Aug 2007 A1
20070185697 Tan et al. Aug 2007 A1
20070256494 Nakamura et al. Nov 2007 A1
20070285399 Lund Dec 2007 A1
20080001735 Tran Jan 2008 A1
20080051673 Kong et al. Feb 2008 A1
20080052643 Ike et al. Feb 2008 A1
20080058668 Seyed Momen et al. Mar 2008 A1
20080103639 Troy et al. May 2008 A1
20080103769 Schultz et al. May 2008 A1
20080136775 Conant Jun 2008 A1
20080152217 Greer Jun 2008 A1
20080163130 Westerman Jul 2008 A1
20080214360 Stirling et al. Sep 2008 A1
20080221487 Zohar et al. Sep 2008 A1
20080262772 Luinge et al. Oct 2008 A1
20080278497 Jammes et al. Nov 2008 A1
20080285805 Luinge et al. Nov 2008 A1
20090005700 Joshi et al. Jan 2009 A1
20090007597 Hanevold Jan 2009 A1
20090027337 Hildreth Jan 2009 A1
20090031757 Harding Feb 2009 A1
20090040016 Ikeda Feb 2009 A1
20090051544 Niknejad Feb 2009 A1
20090079607 Denison et al. Mar 2009 A1
20090079813 Hildreth Mar 2009 A1
20090082692 Hale et al. Mar 2009 A1
20090082701 Zohar et al. Mar 2009 A1
20090102580 Uchaykin Apr 2009 A1
20090112080 Matthews Apr 2009 A1
20090124881 Rytky May 2009 A1
20090189864 Walker et al. Jul 2009 A1
20090189867 Krah et al. Jul 2009 A1
20090195497 Fitzgerald et al. Aug 2009 A1
20090204031 McNames et al. Aug 2009 A1
20090209878 Sanger Aug 2009 A1
20090251407 Flake et al. Oct 2009 A1
20090265671 Sachs et al. Oct 2009 A1
20090318785 Ishikawa et al. Dec 2009 A1
20090319230 Case, Jr. et al. Dec 2009 A1
20090326406 Tan et al. Dec 2009 A1
20090327171 Tan Dec 2009 A1
20100030532 Arora et al. Feb 2010 A1
20100041974 Ting et al. Feb 2010 A1
20100063794 Hernandez-Rebollar Mar 2010 A1
20100106044 Linderman Apr 2010 A1
20100113910 Brauers et al. May 2010 A1
20100228487 Leuthardt et al. Sep 2010 A1
20100234696 Li et al. Sep 2010 A1
20100240981 Barboutis et al. Sep 2010 A1
20100249635 Van Der Reijden Sep 2010 A1
20100280628 Sankai Nov 2010 A1
20100292595 Paul Nov 2010 A1
20100292606 Prakash et al. Nov 2010 A1
20100292617 Lei et al. Nov 2010 A1
20100293115 Seyed Momen Nov 2010 A1
20100306713 Geisner et al. Dec 2010 A1
20100315266 Gunawardana et al. Dec 2010 A1
20100317958 Beck et al. Dec 2010 A1
20110007035 Shai Jan 2011 A1
20110018754 Tojima et al. Jan 2011 A1
20110066381 Garudadri et al. Mar 2011 A1
20110077484 Van Slyke et al. Mar 2011 A1
20110082838 Niemela Apr 2011 A1
20110092826 Lee et al. Apr 2011 A1
20110119216 Wigdor May 2011 A1
20110133934 Tan et al. Jun 2011 A1
20110134026 Kang et al. Jun 2011 A1
20110151974 Deaguero Jun 2011 A1
20110166434 Gargiulo Jul 2011 A1
20110172503 Knepper et al. Jul 2011 A1
20110173204 Murillo et al. Jul 2011 A1
20110173574 Clavin et al. Jul 2011 A1
20110205242 Friesen Aug 2011 A1
20110213278 Horak et al. Sep 2011 A1
20110221672 Osterhout et al. Sep 2011 A1
20110224556 Moon et al. Sep 2011 A1
20110224564 Moon et al. Sep 2011 A1
20110230782 Bartol et al. Sep 2011 A1
20110248914 Sherr Oct 2011 A1
20110262002 Lee Oct 2011 A1
20110270135 Dooley et al. Nov 2011 A1
20110295100 Hegde et al. Dec 2011 A1
20110313762 Ben-David et al. Dec 2011 A1
20120007821 Zaliva Jan 2012 A1
20120029322 Wartena et al. Feb 2012 A1
20120051005 Vanfleteren et al. Mar 2012 A1
20120066163 Balls et al. Mar 2012 A1
20120071780 Barachant et al. Mar 2012 A1
20120101357 Hoskuldsson et al. Apr 2012 A1
20120117514 Kim et al. May 2012 A1
20120157789 Kangas et al. Jun 2012 A1
20120165695 Kidmose et al. Jun 2012 A1
20120184838 John Jul 2012 A1
20120188158 Tan et al. Jul 2012 A1
20120203076 Fatta et al. Aug 2012 A1
20120209134 Morita Aug 2012 A1
20120265090 Fink et al. Oct 2012 A1
20120265480 Oshima Oct 2012 A1
20120283526 Gommesen et al. Nov 2012 A1
20120293548 Perez et al. Nov 2012 A1
20120302858 Kidmose et al. Nov 2012 A1
20120323521 De Foras et al. Dec 2012 A1
20130004033 Trugenberger Jan 2013 A1
20130005303 Song et al. Jan 2013 A1
20130020948 Han et al. Jan 2013 A1
20130027341 Mastandrea Jan 2013 A1
20130038707 Cunningham et al. Feb 2013 A1
20130077820 Marais et al. Mar 2013 A1
20130080794 Hsieh Mar 2013 A1
20130106686 Bennett May 2013 A1
20130123656 Heck May 2013 A1
20130127708 Jung et al. May 2013 A1
20130131538 Gaw et al. May 2013 A1
20130135223 Shai May 2013 A1
20130141375 Ludwig et al. Jun 2013 A1
20130144629 Johnston et al. Jun 2013 A1
20130165813 Chang et al. Jun 2013 A1
20130191741 Dickinson et al. Jul 2013 A1
20130198694 Rahman et al. Aug 2013 A1
20130207889 Chang et al. Aug 2013 A1
20130217998 Mahfouz et al. Aug 2013 A1
20130221996 Poupyrev et al. Aug 2013 A1
20130232095 Tan et al. Sep 2013 A1
20130259238 Xiang et al. Oct 2013 A1
20130265229 Forutanpour et al. Oct 2013 A1
20130265437 Thorn et al. Oct 2013 A1
20130271292 McDermott Oct 2013 A1
20130285913 Griffin et al. Oct 2013 A1
20130293580 Spivack Nov 2013 A1
20130310979 Herr et al. Nov 2013 A1
20130312256 Wesselmann et al. Nov 2013 A1
20130317382 Le Nov 2013 A1
20130317648 Assad Nov 2013 A1
20130332196 Pinsker Dec 2013 A1
20140020945 Hurwitz et al. Jan 2014 A1
20140028546 Jeon et al. Jan 2014 A1
20140045547 Singamsetty et al. Feb 2014 A1
20140049417 Abdurrahman et al. Feb 2014 A1
20140052150 Taylor et al. Feb 2014 A1
20140092009 Yen et al. Apr 2014 A1
20140094675 Luna et al. Apr 2014 A1
20140098018 Kim et al. Apr 2014 A1
20140100432 Golda et al. Apr 2014 A1
20140107493 Yuen et al. Apr 2014 A1
20140121471 Walker May 2014 A1
20140122958 Greenebrg et al. May 2014 A1
20140142937 Powledge et al. May 2014 A1
20140143064 Tran May 2014 A1
20140147820 Snow et al. May 2014 A1
20140194062 Palin et al. Jul 2014 A1
20140196131 Lee Jul 2014 A1
20140198034 Bailey et al. Jul 2014 A1
20140198035 Bailey et al. Jul 2014 A1
20140198944 Forutanpour et al. Jul 2014 A1
20140200432 Banerji et al. Jul 2014 A1
20140201666 Bedikian et al. Jul 2014 A1
20140223462 Aimone et al. Aug 2014 A1
20140236031 Banet et al. Aug 2014 A1
20140240103 Lake et al. Aug 2014 A1
20140240223 Lake et al. Aug 2014 A1
20140245200 Holz Aug 2014 A1
20140249397 Lake et al. Sep 2014 A1
20140257141 Giuffrida et al. Sep 2014 A1
20140277622 Raniere Sep 2014 A1
20140278139 Hong et al. Sep 2014 A1
20140278441 Ton et al. Sep 2014 A1
20140279860 Pan et al. Sep 2014 A1
20140282282 Holz Sep 2014 A1
20140285326 Luna et al. Sep 2014 A1
20140297528 Agrawal et al. Oct 2014 A1
20140299362 Park et al. Oct 2014 A1
20140304665 Holz Oct 2014 A1
20140310595 Acharya et al. Oct 2014 A1
20140330404 Abdelghani et al. Nov 2014 A1
20140334083 Bailey Nov 2014 A1
20140334653 Luna et al. Nov 2014 A1
20140337861 Chang et al. Nov 2014 A1
20140340857 Hsu et al. Nov 2014 A1
20140344731 Holz Nov 2014 A1
20140349257 Connor Nov 2014 A1
20140354528 Laughlin et al. Dec 2014 A1
20140354529 Laughlin et al. Dec 2014 A1
20140355825 Kim et al. Dec 2014 A1
20140358024 Nelson et al. Dec 2014 A1
20140358825 Phillipps et al. Dec 2014 A1
20140359540 Kelsey et al. Dec 2014 A1
20140361988 Katz et al. Dec 2014 A1
20140364703 Kim et al. Dec 2014 A1
20140365163 Jallon Dec 2014 A1
20140368428 Pinault Dec 2014 A1
20140368474 Kim et al. Dec 2014 A1
20140375465 Fenuccio et al. Dec 2014 A1
20140376773 Holz Dec 2014 A1
20150006120 Sett et al. Jan 2015 A1
20150010203 Muninder et al. Jan 2015 A1
20150011857 Henson et al. Jan 2015 A1
20150019135 Kacyvenski et al. Jan 2015 A1
20150025355 Bailey et al. Jan 2015 A1
20150029092 Holz et al. Jan 2015 A1
20150035827 Yamaoka et al. Feb 2015 A1
20150045689 Barone Feb 2015 A1
20150045699 Mokaya et al. Feb 2015 A1
20150051470 Bailey et al. Feb 2015 A1
20150057506 Luna et al. Feb 2015 A1
20150057770 Bailey et al. Feb 2015 A1
20150065840 Bailey Mar 2015 A1
20150070270 Bailey et al. Mar 2015 A1
20150070274 Morozov Mar 2015 A1
20150072326 Mauri et al. Mar 2015 A1
20150084860 Aleem et al. Mar 2015 A1
20150091790 Forutanpour et al. Apr 2015 A1
20150094564 Tashman et al. Apr 2015 A1
20150099946 Sahin Apr 2015 A1
20150106052 Balakrishnan et al. Apr 2015 A1
20150109202 Ataee et al. Apr 2015 A1
20150124566 Lake et al. May 2015 A1
20150128094 Baldwin et al. May 2015 A1
20150141784 Morun et al. May 2015 A1
20150148641 Morun et al. May 2015 A1
20150148728 Sallum May 2015 A1
20150157944 Gottlieb Jun 2015 A1
20150160621 Yilmaz Jun 2015 A1
20150169074 Ataee et al. Jun 2015 A1
20150170421 Mandella et al. Jun 2015 A1
20150177841 Vanblon et al. Jun 2015 A1
20150182113 Utter, II Jul 2015 A1
20150182130 Utter, II Jul 2015 A1
20150182160 Kim et al. Jul 2015 A1
20150182163 Utter Jul 2015 A1
20150182164 Utter, II Jul 2015 A1
20150182165 Miller et al. Jul 2015 A1
20150186609 Utter, II Jul 2015 A1
20150187355 Parkinson et al. Jul 2015 A1
20150193949 Katz et al. Jul 2015 A1
20150199025 Holz Jul 2015 A1
20150213191 Abdelghani et al. Jul 2015 A1
20150216475 Luna et al. Aug 2015 A1
20150220152 Tait et al. Aug 2015 A1
20150223716 Korkala et al. Aug 2015 A1
20150230756 Luna et al. Aug 2015 A1
20150234426 Bailey et al. Aug 2015 A1
20150237716 Su et al. Aug 2015 A1
20150242009 Xiao et al. Aug 2015 A1
20150242575 Abovitz et al. Aug 2015 A1
20150261306 Lake Sep 2015 A1
20150261318 Scavezze et al. Sep 2015 A1
20150272483 Etemad et al. Oct 2015 A1
20150277575 Ataee et al. Oct 2015 A1
20150288944 Nistico et al. Oct 2015 A1
20150289995 Wilkinson et al. Oct 2015 A1
20150296553 DiFranco et al. Oct 2015 A1
20150302168 De Sapio et al. Oct 2015 A1
20150305672 Grey et al. Oct 2015 A1
20150309563 Connor Oct 2015 A1
20150309582 Gupta Oct 2015 A1
20150312175 Langholz Oct 2015 A1
20150313496 Connor Nov 2015 A1
20150323998 Kudekar et al. Nov 2015 A1
20150325202 Lake et al. Nov 2015 A1
20150332013 Lee et al. Nov 2015 A1
20150346701 Gordon et al. Dec 2015 A1
20150351690 Toth et al. Dec 2015 A1
20150355716 Balasubramanian et al. Dec 2015 A1
20150355718 Slonneger Dec 2015 A1
20150366504 Connor Dec 2015 A1
20150370326 Chapeskie et al. Dec 2015 A1
20150370333 Ataee et al. Dec 2015 A1
20150379770 Haley, Jr. et al. Dec 2015 A1
20160011668 Gilad-Bachrach et al. Jan 2016 A1
20160020500 Matsuda Jan 2016 A1
20160026853 Wexler et al. Jan 2016 A1
20160049073 Lee Feb 2016 A1
20160050037 Webb Feb 2016 A1
20160071319 Fallon et al. Mar 2016 A1
20160092504 Mitri et al. Mar 2016 A1
20160099010 Sainath et al. Apr 2016 A1
20160107309 Walsh et al. Apr 2016 A1
20160113587 Kothe et al. Apr 2016 A1
20160144172 Hsueh et al. May 2016 A1
20160150636 Otsubo May 2016 A1
20160156762 Bailey et al. Jun 2016 A1
20160162604 Xiaoli et al. Jun 2016 A1
20160170710 Kim et al. Jun 2016 A1
20160187992 Yamamoto et al. Jun 2016 A1
20160195928 Wagner et al. Jul 2016 A1
20160199699 Klassen Jul 2016 A1
20160202081 Debieuvre et al. Jul 2016 A1
20160206206 Avila et al. Jul 2016 A1
20160207201 Herr et al. Jul 2016 A1
20160217614 Kraver et al. Jul 2016 A1
20160235323 Tadi et al. Aug 2016 A1
20160239080 Marcolina et al. Aug 2016 A1
20160242646 Obma Aug 2016 A1
20160259407 Schick Sep 2016 A1
20160262687 Vaidyanathan et al. Sep 2016 A1
20160263458 Mather et al. Sep 2016 A1
20160274732 Bang et al. Sep 2016 A1
20160274758 Bailey Sep 2016 A1
20160275726 Mullins Sep 2016 A1
20160282947 Schwarz et al. Sep 2016 A1
20160291768 Cho et al. Oct 2016 A1
20160292497 Kehtarnavaz et al. Oct 2016 A1
20160309249 Wu et al. Oct 2016 A1
20160313798 Connor Oct 2016 A1
20160313801 Wagner et al. Oct 2016 A1
20160313890 Walline et al. Oct 2016 A1
20160313899 Noel Oct 2016 A1
20160314623 Coleman et al. Oct 2016 A1
20160342227 Natzke et al. Nov 2016 A1
20160350973 Shapira et al. Dec 2016 A1
20170025026 Ortiz Catalan Jan 2017 A1
20170031502 Rosenberg et al. Feb 2017 A1
20170035313 Hong et al. Feb 2017 A1
20170061817 Mettler May Mar 2017 A1
20170068445 Lee et al. Mar 2017 A1
20170075426 Camacho Perez et al. Mar 2017 A1
20170079828 Pedtke et al. Mar 2017 A1
20170080346 Abbas Mar 2017 A1
20170090604 Barbier Mar 2017 A1
20170091567 Wang et al. Mar 2017 A1
20170095178 Schoen et al. Apr 2017 A1
20170119472 Herrmann et al. May 2017 A1
20170123487 Hazra et al. May 2017 A1
20170124474 Kashyap May 2017 A1
20170124816 Yang et al. May 2017 A1
20170147077 Park et al. May 2017 A1
20170161635 Oono et al. Jun 2017 A1
20170188878 Lee Jul 2017 A1
20170188980 Ash Jul 2017 A1
20170197142 Stafford et al. Jul 2017 A1
20170209055 Pantelopoulos et al. Jul 2017 A1
20170220923 Bae et al. Aug 2017 A1
20170237789 Harner et al. Aug 2017 A1
20170237901 Lee et al. Aug 2017 A1
20170259167 Cook et al. Sep 2017 A1
20170262064 Ofir et al. Sep 2017 A1
20170277282 Go Sep 2017 A1
20170285744 Juliato Oct 2017 A1
20170285756 Wang et al. Oct 2017 A1
20170285757 Robertson et al. Oct 2017 A1
20170285848 Rosenberg et al. Oct 2017 A1
20170296363 Yetkin et al. Oct 2017 A1
20170301630 Nguyen et al. Oct 2017 A1
20170308118 Ito Oct 2017 A1
20170312614 Tran et al. Nov 2017 A1
20170329392 Keskin et al. Nov 2017 A1
20170329404 Keskin et al. Nov 2017 A1
20170340506 Zhang et al. Nov 2017 A1
20170344706 Torres et al. Nov 2017 A1
20170347908 Watanabe et al. Dec 2017 A1
20170371403 Wetzler et al. Dec 2017 A1
20180000367 Longinotti-Buitoni Jan 2018 A1
20180018825 Kim et al. Jan 2018 A1
20180020285 Zass Jan 2018 A1
20180020951 Kaifosh et al. Jan 2018 A1
20180020978 Kaifosh et al. Jan 2018 A1
20180020990 Park et al. Jan 2018 A1
20180024634 Kaifosh et al. Jan 2018 A1
20180024635 Kaifosh et al. Jan 2018 A1
20180024641 Mao et al. Jan 2018 A1
20180064363 Morun et al. Mar 2018 A1
20180067553 Morun et al. Mar 2018 A1
20180068489 Kim et al. Mar 2018 A1
20180074332 Li et al. Mar 2018 A1
20180081439 Daniels Mar 2018 A1
20180088675 Vogel et al. Mar 2018 A1
20180088765 Bailey Mar 2018 A1
20180092599 Kerth et al. Apr 2018 A1
20180093181 Goslin et al. Apr 2018 A1
20180095542 Mallinson Apr 2018 A1
20180095630 Bailey Apr 2018 A1
20180101235 Bodensteiner et al. Apr 2018 A1
20180101289 Bailey Apr 2018 A1
20180107275 Chen et al. Apr 2018 A1
20180120948 Aleem et al. May 2018 A1
20180133551 Chang et al. May 2018 A1
20180140441 Poirters May 2018 A1
20180150033 Lake et al. May 2018 A1
20180153430 Ang et al. Jun 2018 A1
20180153444 Yang et al. Jun 2018 A1
20180154140 Bouton et al. Jun 2018 A1
20180168905 Goodall et al. Jun 2018 A1
20180178008 Bouton et al. Jun 2018 A1
20180217249 La Salla et al. Aug 2018 A1
20180239430 Tadi et al. Aug 2018 A1
20180240459 Weng et al. Aug 2018 A1
20180247443 Briggs et al. Aug 2018 A1
20180279919 Bansbach et al. Oct 2018 A1
20180301057 Hargrove et al. Oct 2018 A1
20180307314 Connor Oct 2018 A1
20180314879 Khwaja et al. Nov 2018 A1
20180321745 Morun et al. Nov 2018 A1
20180321746 Morun et al. Nov 2018 A1
20180330549 Brenton Nov 2018 A1
20180333575 Bouton Nov 2018 A1
20180344195 Morun et al. Dec 2018 A1
20180356890 Zhang et al. Dec 2018 A1
20180360379 Harrison et al. Dec 2018 A1
20190008453 Spoof Jan 2019 A1
20190025919 Tadi et al. Jan 2019 A1
20190027141 Strong et al. Jan 2019 A1
20190033967 Morun et al. Jan 2019 A1
20190033974 Mu et al. Jan 2019 A1
20190038166 Tavabi et al. Feb 2019 A1
20190056422 Park et al. Feb 2019 A1
20190076716 Chiou et al. Mar 2019 A1
20190089898 Kim et al. Mar 2019 A1
20190113973 Coleman et al. Apr 2019 A1
20190121305 Kaifosh et al. Apr 2019 A1
20190121306 Kaifosh Apr 2019 A1
20190146809 Lee et al. May 2019 A1
20190150777 Guo et al. May 2019 A1
20190192037 Morun et al. Jun 2019 A1
20190196585 Laszlo et al. Jun 2019 A1
20190196586 Laszlo et al. Jun 2019 A1
20190197778 Sachdeva et al. Jun 2019 A1
20190209034 Deno Jul 2019 A1
20190212817 Kaifosh et al. Jul 2019 A1
20190216619 McDonnall Jul 2019 A1
20190223748 Al-Natsheh et al. Jul 2019 A1
20190227627 Kaifosh et al. Jul 2019 A1
20190228330 Kaifosh et al. Jul 2019 A1
20190228533 Giurgica-Tiron et al. Jul 2019 A1
20190228579 Kaifosh et al. Jul 2019 A1
20190228590 Kaifosh et al. Jul 2019 A1
20190228591 Giurgica-Tiron et al. Jul 2019 A1
20190247650 Tran Aug 2019 A1
20190279407 McHugh et al. Sep 2019 A1
20190294243 Laszlo et al. Sep 2019 A1
20190324549 Araki et al. Oct 2019 A1
20190348026 Berenzweig et al. Nov 2019 A1
20190348027 Berenzweig et al. Nov 2019 A1
20190357787 Barachant et al. Nov 2019 A1
20190362557 Lacey et al. Nov 2019 A1
20200042089 Ang et al. Feb 2020 A1
20200057661 Bendfeldt Feb 2020 A1
20200065569 Nduka et al. Feb 2020 A1
20200069210 Berenzweig et al. Mar 2020 A1
20200069211 Berenzweig et al. Mar 2020 A1
20200073483 Berenzweig et al. Mar 2020 A1
20200077955 Shui Mar 2020 A1
20200097081 Stone et al. Mar 2020 A1
20200111260 Osborn et al. Apr 2020 A1
20200142490 Xiong et al. May 2020 A1
20200143795 Park et al. May 2020 A1
20200163562 Neaves May 2020 A1
20200205932 Zar et al. Jul 2020 A1
20200225320 Belskikh et al. Jul 2020 A1
20200245873 Frank et al. Aug 2020 A1
20200275895 Barachant Sep 2020 A1
20200301509 Liu et al. Sep 2020 A1
20200305795 Floyd et al. Oct 2020 A1
20200320335 Shamun et al. Oct 2020 A1
20210109598 Zhang et al. Apr 2021 A1
20210117523 Kim et al. Apr 2021 A1
20210290159 Bruinsma et al. Sep 2021 A1
20220256706 Xiong et al. Aug 2022 A1
Foreign Referenced Citations (80)
Number Date Country
2902045 Aug 2014 CA
2921954 Feb 2015 CA
2939644 Aug 2015 CA
1838933 Sep 2006 CN
103777752 May 2014 CN
105009031 Oct 2015 CN
105190477 Dec 2015 CN
105190578 Dec 2015 CN
106102504 Nov 2016 CN
110300542 Oct 2019 CN
111902077 Nov 2020 CN
112074225 Dec 2020 CN
112469469 Mar 2021 CN
112822992 May 2021 CN
4412278 Oct 1995 DE
0301790 Feb 1989 EP
1345210 Sep 2003 EP
2198521 Jun 2012 EP
2541763 Jan 2013 EP
2959394 Dec 2015 EP
3104737 Dec 2016 EP
3200051 Aug 2017 EP
3487395 May 2019 EP
2959394 May 2021 EP
H05277080 Oct 1993 JP
3103427 Oct 2000 JP
2002287869 Oct 2002 JP
2005095561 Apr 2005 JP
2009050679 Mar 2009 JP
2010520561 Jun 2010 JP
2016507851 Mar 2016 JP
2017509386 Apr 2017 JP
2019023941 Feb 2019 JP
2021072136 May 2021 JP
20120094870 Aug 2012 KR
20120097997 Sep 2012 KR
20150123254 Nov 2015 KR
20160121552 Oct 2016 KR
20170067873 Jun 2017 KR
20170107283 Sep 2017 KR
101790147 Oct 2017 KR
9527341 Oct 1995 WO
2006086504 Aug 2006 WO
2008109248 Sep 2008 WO
2009042313 Apr 2009 WO
2010104879 Sep 2010 WO
2011070554 Jun 2011 WO
2012155157 Nov 2012 WO
2014130871 Aug 2014 WO
2014186370 Nov 2014 WO
2014194257 Dec 2014 WO
2014197443 Dec 2014 WO
2015027089 Feb 2015 WO
2015073713 May 2015 WO
2015081113 Jun 2015 WO
2015100172 Jul 2015 WO
2015123445 Aug 2015 WO
2015184760 Dec 2015 WO
2015192117 Dec 2015 WO
2015199747 Dec 2015 WO
2016041088 Mar 2016 WO
2017062544 Apr 2017 WO
2017075611 May 2017 WO
2017092225 Jun 2017 WO
2017120669 Jul 2017 WO
2017172185 Oct 2017 WO
2017208167 Dec 2017 WO
2018022602 Feb 2018 WO
2018098046 May 2018 WO
2019099758 May 2019 WO
2019147953 Aug 2019 WO
2019147958 Aug 2019 WO
2019147996 Aug 2019 WO
2019217419 Nov 2019 WO
2019226259 Nov 2019 WO
2019231911 Dec 2019 WO
2020047429 Mar 2020 WO
2020061440 Mar 2020 WO
2020061451 Mar 2020 WO
2020072915 Apr 2020 WO
Non-Patent Literature Citations (245)
Gargiulo G., et al., "GIGA-OHM High Impedance FET Input Amplifiers for Dry Electrode Biosensor Circuits and Systems," Integrated Microsystems: Electronics, Photonics, and Biotechnology, Dec. 19, 2017, 41 pages.
Al-Timemy A.H., et al., “Improving the Performance Against Force Variation of EMG Controlled Multifunctional Upper-Limb Prostheses for Transradial Amputees,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, Jun. 2016, vol. 24 (6), 12 Pages.
Morris D., et al., “Emerging Input Technologies for Always-Available Mobile Interaction,” Foundations and Trends in Human-Computer Interaction, 2010, vol. 4 (4), pp. 245-316.
Naik G.R., et al., “Source Separation and Identification issues in Bio Signals: A Solution using Blind Source Separation,” Chapter 4 of Recent Advances in Biomedical Engineering, Intech, 2009, 23 pages.
Naik G.R., et al., “Real-Time Hand Gesture Identification for Human Computer Interaction Based on ICA of Surface Electromyogram,” IADIS International Conference Interfaces and Human Computer Interaction, 2007, pp. 83-90.
Naik G.R., et al., “Subtle Hand Gesture Identification for HCI Using Temporal Decorrelation Source Separation BSS of Surface EMG,” Digital Image Computing Techniques and Applications, IEEE Computer Society, 2007, pp. 30-37.
Negro F., et al., “Multi-Channel Intramuscular and Surface EMG Decomposition by Convolutive Blind Source Separation,” Journal of Neural Engineering, Feb. 29, 2016, vol. 13, 18 Pages.
Non-Final Office Action dated Mar. 2, 2021 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 32 Pages.
Non-Final Office Action dated Sep. 2, 2020 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 66 Pages.
Non-Final Office Action dated Aug. 3, 2020 for U.S. Appl. No. 16/593,446, filed Oct. 4, 2019, 44 pages.
Non-Final Office Action dated Jun. 3, 2021 for U.S. Appl. No. 15/816,435, filed Nov. 17, 2017, 32 Pages.
Non-Final Office Action dated Jun. 5, 2020 for U.S. Appl. No. 15/659,072, filed Jul. 25, 2017, 59 Pages.
Non-Final Office Action dated Feb. 8, 2021 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 11 Pages.
Non-Final Office Action dated Oct. 8, 2020 for U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 51 Pages.
Non-Final Office Action dated Aug. 11, 2021 for U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 35 Pages.
Non-Final Office Action dated Jun. 13, 2019 for U.S. Appl. No. 16/258,279, filed Jan. 25, 2019, 38 Pages.
Non-Final Office Action dated Jun. 15, 2020 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 46 Pages.
Non-Final Office Action dated Jan. 16, 2020 for U.S. Appl. No. 16/389,419, filed Apr. 19, 2019, 26 Pages.
Non-Final Office Action dated May 16, 2019 for U.S. Appl. No. 15/974,384, filed May 8, 2018, 13 Pages.
Non-Final Office Action dated May 16, 2019 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 12 Pages.
Non-Final Office Action dated Nov. 19, 2019 for U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 32 Pages.
Non-Final Office Action dated Aug. 20, 2020 for U.S. Appl. No. 15/974,454, filed May 8, 2018, 59 Pages.
Non-Final Office Action dated Dec. 20, 2019 for U.S. Appl. No. 15/974,454, filed May 8, 2018, 41 Pages.
Non-Final Office Action dated Jan. 22, 2020 for U.S. Appl. No. 15/816,435, filed Nov. 17, 2017, 35 Pages.
Non-Final Office Action dated Oct. 22, 2019 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 16 Pages.
Non-Final Office Action dated Dec. 23, 2019 for U.S. Appl. No. 16/557,383, filed Aug. 30, 2019, 53 Pages.
Non-Final Office Action dated Dec. 23, 2019 for U.S. Appl. No. 16/557,427, filed Aug. 30, 2019, 52 Pages.
Non-Final Office Action dated Feb. 23, 2017 for U.S. Appl. No. 14/505,836, filed Oct. 3, 2014, 54 Pages.
Non-Final Office Action dated Jul. 23, 2020 for U.S. Appl. No. 16/593,446, filed Oct. 4, 2019, 28 pages.
Non-Final Office Action dated May 24, 2019 for U.S. Appl. No. 16/353,998, filed Mar. 14, 2019, 20 Pages.
Non-Final Office Action dated May 26, 2020 for U.S. Appl. No. 16/353,998, filed Mar. 14, 2019, 60 Pages.
Non-Final Office Action dated Nov. 27, 2020 for U.S. Appl. No. 16/258,279, filed Jan. 25, 2019, 44 Pages.
Non-Final Office Action dated Apr. 30, 2019 for U.S. Appl. No. 15/659,072, filed Jul. 25, 2017, 99 Pages.
Non-Final Office Action dated Apr. 30, 2020 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 57 Pages.
Non-Final Office Action dated Dec. 30, 2019 for U.S. Appl. No. 16/593,446, filed Oct. 4, 2019, 43 pages.
Non-Final Office Action dated Jun. 30, 2016 for U.S. Appl. No. 14/505,836, filed Oct. 3, 2014, 37 Pages.
Non-Final Office Action dated Oct. 30, 2019 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 22 Pages.
Notice of Allowance dated Nov. 2, 2020 for U.S. Appl. No. 15/974,454, filed May 8, 2018, 24 Pages.
Notice of Allowance dated Nov. 4, 2019 for U.S. Appl. No. 15/974,384, filed May 8, 2018, 39 Pages.
Notice of Allowance dated Feb. 9, 2022 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 9 pages.
Notice of Allowance dated Nov. 10, 2021 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 6 pages.
Notice of Allowance dated Jul. 15, 2021 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 2 pages.
Notice of Allowance dated Dec. 16, 2020 for U.S. Appl. No. 16/593,446, filed Oct. 4, 2019, 44 pages.
Notice of Allowance dated May 18, 2020 for U.S. Appl. No. 16/258,279, filed Jan. 25, 2019, 42 Pages.
Notice of Allowance dated Aug. 19, 2020 for U.S. Appl. No. 16/557,427, filed Aug. 30, 2019, 22 Pages.
Notice of Allowance dated May 20, 2020 for U.S. Appl. No. 16/389,419, filed Apr. 19, 2019, 28 Pages.
Notice of Allowance dated Oct. 22, 2021 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 8 pages.
Notice of Allowance dated Aug. 23, 2021 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 12 pages.
Notice of Allowance dated Dec. 23, 2020 for U.S. Appl. No. 15/659,072, filed Jul. 25, 2017, 26 Pages.
Notice of Allowance dated Jun. 28, 2021 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 18 pages.
Office action for European Application No. 17835112.8, dated Feb. 11, 2022, 11 Pages.
Partial Supplementary European Search Report for European Application No. 18879156.0, dated Dec. 7, 2020, 9 pages.
International Search Report and Written Opinion for International Application No. PCT/US2014/052143, dated Nov. 21, 2014, 8 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2014/067443, dated Feb. 27, 2015, 8 pages.
International Search Report and Written Opinion for International Application No. PCT/US2015/015675, dated May 27, 2015, 9 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2017/043686, dated Oct. 6, 2017, 9 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2017/043693, dated Oct. 6, 2017, 7 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2017/043791, dated Oct. 5, 2017, 10 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2018/056768, dated Jan. 15, 2019, 8 pages.
International Search Report and Written Opinion for International Application No. PCT/US2018/061409, dated Mar. 12, 2019, 11 pages.
International Search Report and Written Opinion for International Application No. PCT/US2018/063215, dated Mar. 21, 2019, 17 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/015167, dated May 21, 2019, 7 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/015174, dated May 21, 2019, 8 pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/015244, dated May 16, 2019, 8 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/020065, dated May 16, 2019, 10 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/028299, dated Aug. 9, 2019, 12 pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/031114, dated Dec. 20, 2019, 18 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/034173, dated Sep. 18, 2019, 10 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/037302, dated Oct. 11, 2019, 13 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/042579, dated Oct. 31, 2019, 8 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/046351, dated Nov. 7, 2019, 9 pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/049094, dated Jan. 9, 2020, 27 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/052131, dated Dec. 6, 2019, 8 pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/052151, dated Jan. 15, 2020, 10 pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/054716, dated Dec. 20, 2019, 11 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/061759, dated Jan. 29, 2020, 12 pages.
International Search Report and Written Opinion for International Application No. PCT/US2020/025735, dated Jun. 22, 2020, 10 pages.
International Search Report and Written Opinion for International Application No. PCT/US2020/025772, dated Aug. 3, 2020, 11 pages.
International Search Report and Written Opinion for International Application No. PCT/US2020/025797, dated Jul. 9, 2020, 10 pages.
International Search Report and Written Opinion for International Application No. PCT/US2020/049274, dated Feb. 1, 2021, 17 pages.
International Search Report and Written Opinion for International Application No. PCT/US2020/061392, dated Mar. 12, 2021, 12 pages.
International Search Report and Written Opinion for International Application No. PCT/US2017/043792, dated Oct. 5, 2017, 9 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/015134, dated May 15, 2019, 11 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/015180, dated May 28, 2019, 10 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/015183, dated May 3, 2019, 8 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/015238, dated May 16, 2019, 8 Pages.
Invitation to Pay Additional Fees for International Application No. PCT/US2019/031114, mailed Aug. 6, 2019, 7 pages.
Invitation to Pay Additional Fees for International Application No. PCT/US2019/049094, mailed Oct. 24, 2019, 2 Pages.
Jiang H., “Effective and Interactive Interpretation of Gestures by Individuals with Mobility Impairments,” Thesis/Dissertation Acceptance, Purdue University Graduate School, Graduate School Form 30, Updated on Jan. 15, 2015, 24 pages.
Kainz et al., “Approach to Hand Tracking and Gesture Recognition Based on Depth-Sensing Cameras and EMG Monitoring,” ACTA Informatica Pragensia, vol. 3, Jan. 1, 2014, pp. 104-112, Retrieved from the Internet: URL: https://aip.vse.cz/pdfs/aip/2014/01/08.pdf.
Kawaguchi J., et al., “Estimation of Finger Joint Angles Based on Electromechanical Sensing of Wrist Shape,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, Sep. 2017, vol. 25 (9), pp. 1409-1418.
Kim H., et al., “Real-Time Human Pose Estimation and Gesture Recognition from Depth Images Using Superpixels and SVM Classifier,” Sensors, 2015, vol. 15, pp. 12410-12427.
Kipke D.R., et al., "Silicon-Substrate Intracortical Microelectrode Arrays for Long-Term Recording of Neuronal Spike Activity in Cerebral Cortex," IEEE Transactions on Neural Systems and Rehabilitation Engineering, Jun. 2003, vol. 11 (2), 5 pages, Retrieved on Oct. 7, 2019, Retrieved from the Internet: URL: https://www.ece.uvic.ca/bctill/papers/neurimp/Kipke_etal_2003_01214707.pdf.
Koerner M.D., “Design and Characterization of the Exo-Skin Haptic Device: A Novel Tendon Actuated Textile Hand Exoskeleton,” Abstract of thesis for Drexel University Masters Degree [online], Nov. 2, 2017, 5 pages, Retrieved from the Internet: URL: https://dialog.proquest.com/professional/docview/1931047627?accountid=153692.
Lee D.C., et al., “Motion and Force Estimation System of Human Fingers,” Journal of Institute of Control, Robotics and Systems, 2011, vol. 17 (10), pp. 1014-1020.
Li Y., et al., “Motor Function Evaluation of Hemiplegic Upper-Extremities Using Data Fusion from Wearable Inertial and Surface EMG Sensors,” Sensors, MDPI, 2017, vol. 17 (582), pp. 1-17.
Lopes J., et al., “Hand/Arm Gesture Segmentation by Motion Using IMU and EMG Sensing,” ScienceDirect, Jun. 27-30, 2017, vol. 11, pp. 107-113.
Marcard T.V., et al., “Sparse Inertial Poser: Automatic 3D Human Pose Estimation from Sparse IMUs,” arxiv.org, Computer Graphics Forum, 2017, vol. 36 (2), 12 pages, XP080759137.
Martin H., et al., “A Novel Approach of Prosthetic Arm Control using Computer Vision, Biosignals, and Motion Capture,” IEEE Symposium on Computational Intelligence in Robotic Rehabilitation and Assistive Technologies (CIR2AT), 2014, 5 pages.
McIntee S.S., "A Task Model of Free-Space Movement-Based Gestures," Dissertation, Graduate Faculty of North Carolina State University, Computer Science, 2016, 129 pages.
Mendes Jr. J.J.A., et al., "Sensor Fusion and Smart Sensor in Sports and Biomedical Applications," Sensors, 2016, vol. 16 (1569), pp. 1-31.
Mohamed O.H., “Homogeneous Cognitive Based Biometrics for Static Authentication,” Dissertation submitted to University of Victoria, Canada, 2010, [last accessed Oct. 11, 2019], 149 pages, Retrieved from the Internet: URL: http://hdl.handle.net/1828/321.
Extended European Search Report for European Application No. 19855191.3, dated Dec. 6, 2021, 11 pages.
Extended European Search Report for European Application No. 19883839.3, dated Dec. 15, 2021, 7 pages.
Farina D., et al., “Man/Machine Interface Based on the Discharge Timings of Spinal Motor Neurons After Targeted Muscle Reinnervation,” Nature Biomedical Engineering, Feb. 6, 2017, vol. 1, Article No. 0025, pp. 1-12.
Favorskaya M., et al., “Localization and Recognition of Dynamic Hand Gestures Based on Hierarchy of Manifold Classifiers,” International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, May 25-27, 2015, vol. XL-5/W6, pp. 1-8.
Final Office Action dated Jun. 2, 2020 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 127 Pages.
Final Office Action dated Jun. 2, 2020 for U.S. Appl. No. 16/557,383, filed Aug. 30, 2019, 66 Pages.
Final Office Action dated Nov. 3, 2020 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 27 Pages.
Final Office Action dated Feb. 4, 2020 for U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 76 Pages.
Final Office Action dated Feb. 4, 2021 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 42 Pages.
Final Office Action dated Jun. 5, 2020 for U.S. Appl. No. 16/557,427, filed Aug. 30, 2019, 95 Pages.
Final Office Action dated Oct. 8, 2020 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 73 Pages.
Final Office Action dated Apr. 9, 2020 for U.S. Appl. No. 15/974,454, filed May 8, 2018, 19 Pages.
Final Office Action dated Dec. 11, 2019 for U.S. Appl. No. 15/974,430, filed May 8, 2018, 30 Pages.
Final Office Action dated Jan. 13, 2021 for U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 91 Pages.
Final Office Action dated Dec. 18, 2019 for U.S. Appl. No. 16/258,279, filed Jan. 25, 2019, 45 Pages.
Final Office Action dated Feb. 19, 2021 for U.S. Appl. No. 16/258,279, filed Jan. 25, 2019, 58 Pages.
Final Office Action dated Sep. 23, 2020 for U.S. Appl. No. 15/816,435, filed Nov. 17, 2017, 70 Pages.
Final Office Action dated Jan. 28, 2020 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 15 Pages.
Final Office Action dated Jul. 28, 2017 for U.S. Appl. No. 14/505,836, filed Oct. 3, 2014, 52 Pages.
Final Office Action dated Jun. 28, 2021 for U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 11 Pages.
Final Office Action dated Nov. 29, 2019 for U.S. Appl. No. 15/659,072, filed Jul. 25, 2017, 36 Pages.
Final Office Action dated Nov. 29, 2019 for U.S. Appl. No. 16/353,998, filed Mar. 14, 2019, 33 Pages.
Fong H.C., et al., "PepperGram With Interactive Control," 22nd International Conference on Virtual System & Multimedia (VSMM), Oct. 17, 2016, 5 pages.
Gallina A., et al., “Surface EMG Biofeedback,” Surface Electromyography: Physiology, Engineering, and Applications, 2016, pp. 485-500.
Ghasemzadeh H., et al., “A Body Sensor Network With Electromyogram and Inertial Sensors: Multimodal Interpretation of Muscular Activities,” IEEE Transactions on Information Technology in Biomedicine, Mar. 2010, vol. 14 (2), pp. 198-206.
Gopura R.A.R.C., et al., “A Human Forearm and Wrist Motion Assist Exoskeleton Robot With EMG-Based Fuzzy-Neuro Control,” Proceedings of the 2nd Biennial IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics, Oct. 19-22, 2008, 6 pages.
Gourmelon L., et al., “Contactless Sensors for Surface Electromyography,” Proceedings of the 28th IEEE EMBS Annual International Conference, New York City, NY, Aug. 30-Sep. 3, 2006, pp. 2514-2517.
Hauschild M., et al., “A Virtual Reality Environment for Designing and Fitting Neural Prosthetic Limbs,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, Mar. 2007, vol. 15 (1), pp. 9-15.
International Search Report and Written Opinion for International Application No. PCT/US2014/017799, dated May 16, 2014, 9 pages.
International Search Report and Written Opinion for International Application No. PCT/US2014/037863, dated Aug. 21, 2014, 10 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2017/043693, dated Feb. 7, 2019, 7 Pages.
International Preliminary Report on Patentability for International Application No. PCT/US2017/043791, dated Feb. 7, 2019, 9 Pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/031114, dated Nov. 19, 2020, 16 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/049094, dated Mar. 11, 2021, 24 Pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/052151, dated Apr. 1, 2021, 9 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2014/017799, dated Sep. 3, 2015, 8 Pages.
International Preliminary Report on Patentability for International Application No. PCT/US2014/037863, dated Nov. 26, 2015, 8 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2014/052143, dated Mar. 3, 2016, 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2014/067443, dated Jun. 9, 2016, 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2015/015675, dated Aug. 25, 2016, 8 Pages.
International Preliminary Report on Patentability for International Application No. PCT/US2017/043686, dated Feb. 7, 2019, 8 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2017/043792, dated Feb. 7, 2019, 8 Pages.
International Preliminary Report on Patentability for International Application No. PCT/US2018/061409, dated May 28, 2020, pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/015174, dated Aug. 6, 2020, 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/028299, dated Dec. 10, 2020, 11 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/046351, dated Feb. 25, 2021, 8 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/052131, dated Apr. 1, 2021, 8 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/054716, dated Apr. 15, 2021, 10 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/061759, dated May 27, 2021, 12 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2020/049274, dated Mar. 17, 2022, 14 pages.
Ali B., et al., “Spectral Collaborative Representation based Classification for Hand Gestures Recognition on Electromyography Signals,” Biomedical Signal Processing and Control, 2016, vol. 24, pp. 11-18.
Al-Jumaily A., et al., "Electromyogram (EMG) Driven System based Virtual Reality for Prosthetic and Rehabilitation Devices," Proceedings of the 11th International Conference on Information Integration and Web-Based Applications & Services, Jan. 1, 2009, pp. 582-586.
Al-Mashhadany Y.I., "Inverse Kinematics Problem (IKP) of 6-DOF Manipulator by Locally Recurrent Neural Networks (LRNNs)," Management and Service Science (MASS), International Conference on Management and Service Science, IEEE, Aug. 24, 2010, 5 pages.
Arkenbout E.A., et al., “Robust Hand Motion Tracking through Data Fusion of 5DT Data Glove and Nimble VR Kinect Camera Measurements,” Sensors, 2015, vol. 15, pp. 31644-31671.
Benko H., et al., “Enhancing Input on and Above the Interactive Surface with Muscle Sensing,” The ACM International Conference on Interactive Tabletops and Surfaces (ITS), Nov. 23-25, 2009, pp. 93-100.
Berenzweig A., et al., “Wearable Devices and Methods for Improved Speech Recognition,” U.S. Appl. No. 16/785,680, filed Feb. 10, 2020, 67 pages.
Boyali A., et al., “Spectral Collaborative Representation based Classification for Hand Gestures Recognition on Electromyography Signals,” Biomedical Signal Processing and Control, 2016, vol. 24, pp. 11-18.
Brownlee J., “Finite State Machines (FSM): Finite State Machines as a Control Technique in Artificial Intelligence (AI),” FSM, Jun. 2002, 12 pages.
Cannan J., et al., “A Wearable Sensor Fusion Armband for Simple Motion Control and Selection for Disabled and Non-Disabled Users,” Computer Science and Electronic Engineering Conference, IEEE, Sep. 12, 2012, pp. 216-219, XP032276745.
Cheng J., et al., “A Novel Phonology- and Radical-Coded Chinese Sign Language Recognition Framework Using Accelerometer and Surface Electromyography Sensors,” Sensors, 2015, vol. 15, pp. 23303-23324.
Communication Pursuant to Article 94(3) for European Patent Application No. 17835112.8, dated Dec. 14, 2020, 6 Pages.
Communication Pursuant to Rule 164(1) EPC, Partial Supplementary European Search Report for European Application No. 14753949.8, dated Sep. 30, 2016, 7 pages.
Co-pending U.S. Appl. No. 15/659,072, inventors Patrick Kaifosh et al., filed Jul. 25, 2017.
Co-pending U.S. Appl. No. 15/816,435, inventors Ning Guo et al., filed Nov. 17, 2017.
Co-pending U.S. Appl. No. 15/882,858, inventors Stephen Lake et al., filed Jan. 29, 2018.
Co-pending U.S. Appl. No. 15/974,430, inventors Adam Berenzweig et al., filed May 8, 2018.
Co-pending U.S. Appl. No. 16/353,998, inventors Patrick Kaifosh et al., filed Mar. 14, 2019.
Co-pending U.S. Appl. No. 16/557,383, inventors Adam Berenzweig et al., filed Aug. 30, 2019.
Co-pending U.S. Appl. No. 16/557,427, inventors Adam Berenzweig et al., filed Aug. 30, 2019.
Co-Pending U.S. Appl. No. 15/974,430, filed May 8, 2018, 44 Pages.
Co-Pending U.S. Appl. No. 16/353,998, filed Mar. 14, 2019, 43 pages.
Co-Pending U.S. Appl. No. 16/557,383, filed Aug. 30, 2019, 94 Pages.
Co-Pending U.S. Appl. No. 16/557,427, filed Aug. 30, 2019, 93 Pages.
Co-Pending U.S. Appl. No. 16/577,207, filed Sep. 20, 2019, 67 Pages.
Co-Pending U.S. Appl. No. 14/505,836, filed Oct. 3, 2014, 59 Pages.
Co-Pending U.S. Appl. No. 15/816,435, filed Nov. 17, 2017, 24 Pages.
Co-Pending U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 54 Pages.
Co-Pending U.S. Appl. No. 15/974,384, filed May 8, 2018, 44 Pages.
Co-Pending U.S. Appl. No. 15/974,454, filed May 8, 2018, 45 Pages.
Co-Pending U.S. Appl. No. 16/557,342, filed Aug. 30, 2019, 93 Pages.
Corazza S., et al., "A Markerless Motion Capture System to Study Musculoskeletal Biomechanics: Visual Hull and Simulated Annealing Approach," Annals of Biomedical Engineering, Jul. 2006, vol. 34 (6), pp. 1019-1029, [Retrieved on Dec. 11, 2019], 11 pages, Retrieved from the Internet: URL: https://www.researchgate.net/publication/6999610_A_Markerless_Motion_Capture_System_to_Study_Musculoskeletal_Biomechanics_Visual_Hull_and_Simulated_Annealing_Approach.
Costanza E., et al., “EMG as a Subtle Input Interface for Mobile Computing,” Mobile HCI, LNCS 3160, 2004, pp. 426-430.
Costanza E., et al., “Toward Subtle Intimate Interfaces for Mobile Devices Using an EMG Controller,” CHI, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Apr. 2-7, 2005, pp. 481-489.
Cote-Allard U., et al., “Deep Learning for Electromyographic Hand Gesture Signal Classification Using Transfer Learning,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, Jan. 26, 2019, vol. 27 (4), 11 Pages.
Csapo A.B., et al., “Evaluation of Human-Myo Gesture Control Capabilities in Continuous Search and Select Operations,” 7th IEEE International Conference on Cognitive Infocommunications, Oct. 16-18, 2016, pp. 000415-000420.
Davoodi R., et al., "Development of a Physics-Based Target Shooting Game to Train Amputee Users of Multijoint Upper Limb Prostheses," Presence, Massachusetts Institute of Technology, 2012, vol. 21 (1), pp. 85-95.
Delis A.L., et al., “Development of a Myoelectric Controller Based on Knee Angle Estimation,” Biodevices, International Conference on Biomedical Electronics and Devices, Jan. 17, 2009, 7 pages.
Diener L., et al., “Direct Conversion From Facial Myoelectric Signals to Speech Using Deep Neural Networks,” International Joint Conference on Neural Networks (IJCNN), Oct. 1, 2015, 7 pages.
Ding I-J., et al., “HMM with Improved Feature Extraction-Based Feature Parameters for Identity Recognition of Gesture Command Operators by Using a Sensed Kinect-Data Stream,” Neurocomputing, 2017, vol. 262, pp. 108-119.
European Search Report for European Application No. 19861903.3, dated Oct. 12, 2021, 2 pages.
European Search Report for European Application No. 19863248.1, dated Oct. 19, 2021, 2 pages.
European Search Report for European Application No. 19868789.9, dated May 9, 2022, 9 pages.
Extended European Search Report for European Application No. 18879156.0, dated Mar. 12, 2021, 11 pages.
Extended European Search Report for European Application No. 19744404.5, dated Mar. 29, 2021, 11 pages.
Extended European Search Report for European Application No. 19799947.7, dated May 26, 2021, 10 pages.
Extended European Search Report for European Application No. 17835111.0, dated Nov. 21, 2019, 6 pages.
Extended European Search Report for European Application No. 17835112.8, dated Feb. 5, 2020, 17 pages.
Extended European Search Report for European Application No. 17835140.9, dated Nov. 26, 2019, 10 Pages.
Extended European Search Report for European Application No. 19806723.3, dated Jul. 7, 2021, 13 pages.
Extended European Search Report for European Application No. 19850130.6, dated Sep. 1, 2021, 14 Pages.
Picard R.W., et al., “Affective Wearables,” Proceedings of the IEEE 1st International Symposium on Wearable Computers, ISWC, Cambridge, MA, USA, Oct. 13-14, 1997, pp. 90-97.
Preinterview First Office Action dated Jun. 24, 2020 for U.S. Appl. No. 16/785,680, filed Feb. 10, 2020, 90 Pages.
Rekimoto J., “GestureWrist and GesturePad: Unobtrusive Wearable Interaction Devices,” ISWC Proceedings of the 5th IEEE International Symposium on Wearable Computers, 2001, 7 pages.
Saponas T.S., et al., “Demonstrating the Feasibility of Using Forearm Electromyography for Muscle-Computer Interfaces,” CHI Proceedings, Physiological Sensing for Input, Apr. 5-10, 2008, pp. 515-524.
Saponas T.S., et al., “Enabling Always-Available Input with Muscle-Computer Interfaces,” Conference: Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology, Oct. 7, 2009, pp. 167-176.
Saponas T.S., et al., “Making Muscle-Computer Interfaces More Practical,” CHI, Atlanta, Georgia, USA, Apr. 10-15, 2010, 4 pages.
Sartori M., et al., “Neural Data-Driven Musculoskeletal Modeling for Personalized Neurorehabilitation Technologies,” IEEE Transactions on Biomedical Engineering, May 5, 2016, vol. 63 (5), pp. 879-893.
Sato M., et al., “Touche: Enhancing Touch Interaction on Humans, Screens, Liquids, and Everyday Objects,” CHI, Austin, Texas, May 5-10, 2012, 10 pages.
Sauras-Perez P., et al., “A Voice and Pointing Gesture Interaction System for Supporting Human Spontaneous Decisions in Autonomous Cars,” Clemson University, All Dissertations, May 2017, 174 pages.
Shen S., et al., “I am a Smartwatch and I Can Track My User's Arm,” University of Illinois at Urbana-Champaign, MobiSys, Jun. 25-30, 2016, 12 pages.
Son M., et al., "Evaluating the Utility of Two Gestural Discomfort Evaluation Methods," PLOS One, Apr. 19, 2017, 21 pages.
Strbac M., et al., "Microsoft Kinect-Based Artificial Perception System for Control of Functional Electrical Stimulation Assisted Grasping," Hindawi Publishing Corporation, BioMed Research International [online], 2014, Article No. 740469, 13 pages, Retrieved from the Internet: URL: https://dx.doi.org/10.1155/2014/740469.
Torres T., “Myo Gesture Control Armband,” PCMag, Jun. 8, 2015, 9 pages, Retrieved from the Internet: URL: https://www.pcmag.com/article2/0,2817,2485462,00.asp.
Ueno A., et al., “A Capacitive Sensor System for Measuring Laplacian Electromyogram through Cloth: A Pilot Study,” Proceedings of the 29th Annual International Conference of the IEEE EMBS, Cite Internationale, Lyon, France, Aug. 23-26, 2007, pp. 5731-5734.
Ueno A., et al., “Feasibility of Capacitive Sensing of Surface Electromyographic Potential through Cloth,” Sensors and Materials, 2012, vol. 24 (6), pp. 335-346.
Valero-Cuevas F.J., et al., “Computational Models for Neuromuscular Function,” IEEE Reviews in Biomedical Engineering, 2009, vol. 2, NIH Public Access Author Manuscript [online], Jun. 16, 2011 [Retrieved on Jul. 29, 2019], 52 pages, Retrieved from the Internet: URL: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3116649/.
Wittevrongel B., et al., “Spatiotemporal Beamforming: A Transparent and Unified Decoding Approach to Synchronous Visual Brain-Computer Interfacing,” Frontiers in Neuroscience, Nov. 15, 2017, vol. 11, Article No. 630, 13 Pages.
Wodzinski M., et al., “Sequential Classification of Palm Gestures Based on A* Algorithm and MLP Neural Network for Quadrocopter Control,” Metrology and Measurement Systems, 2017, vol. 24 (2), pp. 265-276.
Xiong A., et al., “A Novel HCI based on EMG and IMU,” Proceedings of the 2011 IEEE International Conference on Robotics and Biomimetics, Phuket, Thailand, Dec. 7-11, 2011, pp. 2653-2657.
Xu Z., et al., "Hand Gesture Recognition and Virtual Game Control Based on 3D Accelerometer and EMG Sensors," Proceedings of the 14th International Conference on Intelligent User Interfaces, Sanibel Island, Florida, Feb. 8-11, 2009, pp. 401-406.
Xue Y., et al., “Multiple Sensors Based Hand Motion Recognition Using Adaptive Directed Acyclic Graph,” Applied Sciences, MDPI, 2017, vol. 7 (358), pp. 1-14.
Yang Z., et al., “Surface EMG Based Handgrip Force Predictions Using Gene Expression Programming,” Neurocomputing, 2016, vol. 207, pp. 568-579.
Zacharaki E.I., et al., “Spike Pattern Recognition by Supervised Classification in Low Dimensional Embedding Space,” Brain Informatics, 2016, vol. 3, pp. 73-83.
Zhang X., et al., “A Framework for Hand Gesture Recognition Based on Accelerometer and EMG Sensors,” IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, Nov. 2011, vol. 41 (6), pp. 1064-1076.
European Search Report for European Application No. 19890394.0, dated Apr. 29, 2022, 9 pages.
Extended European Search Report for European Application No. 19743717.1, dated Mar. 3, 2021, 12 pages.
Extended European Search Report for European Application No. 18869441.8, dated Nov. 17, 2020, 20 Pages.
International Preliminary Report on Patentability for International Application No. PCT/US2018/056768, dated Apr. 30, 2020, 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/015183, dated Aug. 6, 2020, 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/015238, dated Aug. 6, 2020, 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/034173, dated Dec. 10, 2020, 9 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2019/063587, dated Jun. 10, 2021, 13 pages.
International Search Report and Written Opinion for International Application No. PCT/US2019/063587, dated Mar. 25, 2020, 16 pages.
Non-Final Office Action dated Sep. 6, 2019 for U.S. Appl. No. 16/424,144, filed May 28, 2019, 11 Pages.
Non-Final Office Action dated Apr. 9, 2019 for U.S. Appl. No. 16/258,409, filed Jan. 25, 2019, 71 Pages.
Non-Final Office Action dated Apr. 29, 2019 for U.S. Appl. No. 16/257,979, filed Jan. 25, 2019, 63 Pages.
Notice of Allowance dated Feb. 6, 2020 for U.S. Appl. No. 16/424,144, filed May 28, 2019, 28 Pages.
Notice of Allowance dated May 18, 2022 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 10 pages.
Notice of Allowance dated Jul. 19, 2019 for U.S. Appl. No. 16/258,409, filed Jan. 25, 2019, 36 Pages.
Notice of Allowance dated Jul. 31, 2019 for U.S. Appl. No. 16/257,979, filed Jan. 25, 2019, 22 Pages.
Office Action for European Patent Application No. 19743717.1, dated Apr. 11, 2022, 10 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2020/061392, dated Jun. 9, 2022, 11 pages.
Notice of Allowance dated Aug. 22, 2022 for U.S. Appl. No. 15/882,858, filed Jan. 29, 2018, 9 pages.
Provisional Applications (6)
Number Date Country
62840947 Apr 2019 US
62841069 Apr 2019 US
62841100 Apr 2019 US
62840980 Apr 2019 US
62840966 Apr 2019 US
62841107 Apr 2019 US
Continuations (1)
Number Date Country
Parent 16863098 Apr 2020 US
Child 17469537 US