Wearable muscle interface systems, devices and methods that interact with content displayed on an electronic display

Information

  • Patent Grant
  • Patent Number
    11,009,951
  • Date Filed
    Tuesday, November 26, 2019
  • Date Issued
    Tuesday, May 18, 2021
Abstract
Systems, devices and methods that enable a user to access and interact with content displayed on a portable electronic display in an inconspicuous, hands-free manner are described. There is disclosed a completely wearable system comprising a wearable muscle interface device and a wearable head-mounted display, as well as methods for using the wearable system to effect interactions between the user and content displayed on the wearable head-mounted display. The wearable muscle interface device includes muscle activity sensors worn on an arm of the user to detect muscle activity generated when the user performs a physical gesture. The wearable system is adapted to recognize a plurality of gestures made by the user and, in response to each recognized gesture, to effect one or more interaction(s) with content displayed on the wearable head-mounted display.
Description
BACKGROUND
Technical Field

The present systems, devices, and methods relate generally to wearable muscle interfaces, and more specifically to a wearable muscle interface that interacts with content displayed on a wearable head-mounted display.


Description of the Related Art
Wearable Electronic Devices

Electronic devices are commonplace throughout most of the world today. Advancements in integrated circuit technology have enabled the development of electronic devices that are sufficiently small and lightweight to be carried by the user. Such “portable” electronic devices may include on-board power supplies (such as batteries or other power storage systems) and may be designed to operate without a physical wire-connection to any stationary (i.e., non-portable) electronic system (except, in some cases, during charging).


The convenience afforded by the portability of electronic devices has fostered a huge industry. Smartphones, audio players, laptop computers, tablet computers, and ebook readers are all examples of portable electronic devices. However, the convenience of being able to carry a portable electronic device has also introduced the inconvenience of having one's hand(s) encumbered by the device itself. This problem is addressed by making an electronic device not only portable, but wearable.


A wearable electronic device is any portable electronic device that a user can carry without physically grasping, clutching, or otherwise holding onto the device with their hand(s). For example, a wearable electronic device may be attached or coupled to the user by a strap or straps, a band or bands, a clip or clips, an adhesive, a pin and clasp, an article of clothing, tension or elastic support, an interference fit, an ergonomic form, etc. Examples of wearable electronic devices include digital wristwatches, electronic armbands, electronic rings, electronic ankle-bracelets or “anklets,” head-mounted electronic displays, hearing aids, and so on.


Human-Electronics Interfaces

A wearable electronic device may provide direct functionality for a user (such as audio playback, data display, measurement and monitoring, computing functions, “virtual reality,” “augmented reality,” etc.) or it may provide electronics to interact with, communicate with, or control another electronic device. For example, a wearable electronic device may include sensors that detect inputs effected by a user and transmit signals to another electronic device based on those inputs. Sensor-types and input-types may each take on a variety of forms, including but not limited to: tactile sensors (e.g., buttons, switches, touchpads, or keys) providing manual control, acoustic sensors providing voice-control, electromyography sensors providing gesture control, and/or accelerometers providing gesture control.


A human-computer interface (“HCI”) is an example of a human-electronics interface.


Electromyography Devices

Electromyography (“EMG”) is a process for detecting and processing the electrical signals generated by muscle activity. EMG devices employ EMG sensors that are responsive to the range of electrical potentials (typically μV-mV) involved in muscle activity. EMG signals may be used in a wide variety of applications, including: medical monitoring and diagnosis, muscle rehabilitation, exercise and training, prosthetic control, and even in controlling functions of electronic devices.


Human-electronics interfaces that employ EMG have been proposed. For example, U.S. Pat. Nos. 6,244,873 and 8,170,656 both describe proposals in which a user dons a wearable EMG device and performs physical gestures to control functions of a separate electronic device. In both cases, the separate electronic device is not itself a wearable electronic device, so true hands-free operation of and/or access to the separate electronic device is not achieved. For example, both cases describe using EMG signals to control mobile phones, smart phones, computers, laptop computers, and so on, all of which still typically require the user to use their hand(s) to carry the device and/or to orient the device in such a way that the user may see, access, receive feedback from, and/or generally interact with a display screen on the device.


Interacting with Head-Mounted Displays

As described above, portable electronic devices that include display screens typically require the user to use their hand(s) to carry the device and/or to orient the device so that the user may see, access, receive feedback from, and/or generally interact with the device's display screen. Occupying the user's hand(s) is an inconvenience that can significantly hinder the user's ability to interact with the portable electronic device and/or to interact with other aspects of their environment while operating the portable electronic device. However, this hindrance is at least partially overcome by making the display screen of the portable electronic device wearable. Making the display screen of the portable electronic device wearable enables the user to see, access, and/or receive feedback from the display screen without using their hand(s). In recent years, wearable head-mounted displays have begun to gain wider acceptance, with a number of recently introduced wearable head-mounted display devices having the potential for widespread adoption by consumers.


One such device disclosed in U.S. Pat. No. 8,203,502 issued to Chi et al. utilizes a finger operable input device such as a touch pad built into the wearable head-mounted display (e.g. built into a side-arm of a pair of glasses, with one of the lenses functioning as a display screen) such that a user can interact with and control content appearing on the display screen with positioning and movement of a finger along a surface of the input device. A potential drawback of this approach is that a user is required to conspicuously raise his or her hand to touch the input device each time the user wants to interact with content displayed on the screen. Furthermore, even though the display itself is wearable, it is still controlled by touch and so is not actually hands-free (thus negating part of the benefit of making the display wearable in the first place).


Another such device is disclosed in US 2012/0293548 (Perez et al.) in which a head-mounted display provides users with supplemental information on a display screen provided in at least one of the lenses of a pair of glasses. A processing unit may be connected to the head-mounted display to provide the computing power necessary for its operation. However, the method of user interaction with the display is not specified.


Yet another example of such a device is disclosed in U.S. Pat. No. 8,212,859 issued to Tang et al. in which a source image is projected onto screens built into head-mounted displays worn by a user. Tang et al. focuses on the method and system for projection, and does not specify the manner of user interaction with the head-mounted display device.


U.S. Pat. No. 5,482,051 ('051 patent) describes a human-electronics interface in which a user's EMG signals are detected and used to interact with content that is ultimately displayed on a head-mounted visual display unit. However, the interface described in the '051 patent is not a portable system. The human-electronics interface described in the '051 patent consists of at least three disparate components that are communicatively coupled in series with one another: i) a set of EMG sensors, ii) a stand-alone processing system, and iii) a head-mounted visual display unit. Although the set of EMG sensors and the head-mounted visual display unit are both physically coupled to (i.e., worn by) the user, there is no direct communication between the set of EMG sensors and the head-mounted visual display unit. Detected EMG signals are sent from the set of EMG sensors to the stand-alone processing system (i.e., off the body of the user) where they are processed to achieve some effect, and then signals that represent the effect are sent from the processing system to the head-mounted visual display unit where the effect is displayed to the user. The stand-alone processing system mediates all communication between the set of EMG sensors and the head-mounted visual display unit. The processing system is not worn by the user and is not portable (i.e., it is stationary), and therefore the human-electronics interface described in the '051 patent is limited in that the user must be in close proximity to the stationary processing system in order to use the interface.


What is needed is a completely wearable (i.e., completely portable) user interface that enables a user to see, access and interact with an electronic display in an inconspicuous, hands-free manner.


BRIEF SUMMARY

The present disclosure relates to a muscle interface device and method for interacting with content displayed on wearable head-mounted displays.


More generally, the muscle interface device comprises a sensor worn on the forearm of a user, and the sensor is adapted to recognize a plurality of gestures made by the user's hand and/or wrist to interact with content displayed on the wearable head-mounted display.


In an embodiment, the muscle interface device utilizes a plurality of electromyographic (EMG) sensors to detect electrical activity produced by muscles during contraction and to convert the detected electrical signals for processing. The electrical signals detected from the muscles are interpreted as gestures (e.g., a combination of hand, wrist, and arm movements) made by a user, which provide a control input to a wearable head-mounted display. The control input is preferably provided wirelessly via a wireless communication protocol, such as Near-Field Communication (“NFC”) or Bluetooth™, for example.


In another embodiment, various other types of sensors may be used in lieu of, or in combination with, EMG sensors to detect gestures made by a user, for processing as a control input for interacting with a wearable head-mounted display. These may include one or more mechanomyographic (MMG) sensors to detect vibrations made by muscles during contraction, or one or more accelerometers to detect larger movements.


In another embodiment, the muscle interface device includes a calibration module with a routine for calibrating the muscle interface device for use with the wearable head-mounted display.


Other features and advantages will become apparent from the following detailed description and accompanying drawings. It should be understood, however, that the detailed description and specific examples are given by way of illustration and not limitation. Many modifications and changes within the scope of the present invention may be made without departing from the spirit thereof, and the invention includes all such modifications.


A wearable muscle interface device that in use interacts with content displayed on a wearable head-mounted display may be summarized as including: a plurality of muscle activity sensors to be worn on an arm of a user, the muscle activity sensors responsive to signals generated by muscles in the arm of the user; and a transmitter communicatively coupled to the plurality of muscle activity sensors, wherein in use the transmitter transmits at least one signal from the wearable muscle interface device directly to a receiver on the wearable head-mounted display based on the signals detected by the muscle activity sensors; wherein the at least one signal transmitted, in use, from the wearable muscle interface device directly to the receiver on the wearable head-mounted display effects at least one interaction with content displayed on the wearable head-mounted display. The wearable muscle interface device may further include a processor that in use interprets the signals detected by the muscle activity sensors as a gesture, wherein the processor is communicatively coupled in between the transmitter and the plurality of muscle activity sensors, and wherein the at least one signal that, in use, is transmitted from the wearable muscle interface device may be based on the gesture interpreted by the processor of the wearable muscle interface device. The wearable head-mounted display may include a processor communicatively coupled to the receiver of the wearable head-mounted display, and the at least one signal that, in use, is transmitted from the wearable muscle interface device to the wearable head-mounted display may be interpreted as a gesture by the processor of the wearable head-mounted display.


The wearable muscle interface device may further include a haptic feedback module that in use provides haptic feedback to the user, the haptic feedback module including a vibratory motor. The plurality of muscle activity sensors may include at least one muscle activity sensor selected from the group consisting of: an electromyographic (EMG) sensor and a mechanomyographic (MMG) sensor. The wearable muscle interface device may further include at least one accelerometer that in use detects signals generated by motion of the arm of the user, the at least one accelerometer communicatively coupled to the transmitter, and wherein in use the at least one signal transmitted from the transmitter of the wearable muscle interface device directly to the receiver on the wearable head-mounted display may be based on both the signals detected by the muscle activity sensors and the signals detected by the at least one accelerometer. The transmitter may include a wireless transmitter.


A wearable system that in use provides hands-free access to and control of a portable electronic display may be summarized as including: i) a wearable muscle interface device comprising: a plurality of muscle activity sensors to be worn on an arm of a user, the muscle activity sensors responsive to signals generated by muscles in the arm of the user; and a transmitter communicatively coupled to the plurality of muscle activity sensors, wherein in use the transmitter transmits at least one signal from the wearable muscle interface device based on the signals detected by the muscle activity sensors; and ii) a wearable head-mounted display comprising: at least one display screen to be worn on a head of the user, the at least one display screen arranged to be positioned in front of at least one eye of the user when worn on the head of the user; a receiver communicatively coupled to the at least one display screen, wherein in use the receiver directly receives the at least one signal transmitted from the transmitter of the wearable muscle interface device; and a processor communicatively coupled to the receiver and to the at least one display screen, wherein in use the at least one signal received directly from the transmitter of the wearable muscle interface device by the receiver of the wearable head-mounted display effects control of at least one function of the wearable head-mounted display. The transmitter of the wearable muscle interface device may include a wireless transmitter and the receiver of the wearable head-mounted display may include a wireless receiver. The wearable muscle interface device of the wearable system may further include a processor that in use interprets the signals detected by the muscle activity sensors as a gesture, wherein the processor of the wearable muscle interface device is communicatively coupled in between the transmitter and the plurality of muscle activity sensors, and wherein the at least one signal that, in use, is transmitted from the wearable muscle interface device may be based on the gesture interpreted by the processor of the wearable muscle interface device.


The plurality of muscle activity sensors in the wearable muscle interface device of the wearable system may include at least one muscle activity sensor selected from the group consisting of: an electromyographic (EMG) sensor and a mechanomyographic (MMG) sensor. The wearable muscle interface device of the wearable system may further include at least one accelerometer that in use detects signals generated by motion of the arm of the user, the at least one accelerometer communicatively coupled to the transmitter, and wherein in use the at least one signal transmitted by the transmitter of the wearable muscle interface device may be based on both the signals detected by the muscle activity sensors and the signals detected by the at least one accelerometer.


A method of using a wearable system to achieve hands-free access to and control of a portable electronic display, wherein the wearable system includes a wearable muscle interface device and a wearable head-mounted display, may be summarized as including: detecting muscle activity corresponding to a physical gesture performed by a user of the wearable system by at least one muscle activity sensor of the wearable muscle interface device; transmitting at least one signal from the wearable muscle interface device by a transmitter of the wearable muscle interface device based at least in part on the muscle activity detected by at least one muscle activity sensor of the wearable muscle interface device; receiving the at least one signal directly from the wearable muscle interface device by a receiver of the wearable head-mounted display; processing the at least one signal by a processor of the wearable head-mounted display; and effecting at least one interaction between the user and the wearable head-mounted display by the processor of the wearable head-mounted display based on the processing of the at least one signal by the processor of the wearable head-mounted display. The method may further include, in response to detecting muscle activity corresponding to a physical gesture performed by a user of the wearable system by at least one muscle activity sensor of the wearable muscle interface device, processing the detected muscle activity by a processor of the wearable muscle interface device, and transmitting at least one signal from the wearable muscle interface device by a transmitter of the wearable muscle interface device based at least in part on the muscle activity detected by at least one muscle activity sensor of the wearable muscle interface device may include transmitting at least one signal from the wearable muscle interface device by the transmitter of the wearable muscle interface device based at least in part on processing the detected muscle activity by the processor of the wearable muscle interface device.


The method may further include detecting motion of the wearable muscle interface device corresponding to the physical gesture performed by the user of the wearable system by at least one accelerometer of the wearable muscle interface device, and transmitting at least one signal from the wearable muscle interface device by a transmitter of the wearable muscle interface device based at least in part on the muscle activity detected by at least one muscle activity sensor of the wearable muscle interface device may include transmitting at least one signal from the wearable muscle interface device by the transmitter of the wearable muscle interface device based on both the muscle activity detected by at least one muscle activity sensor of the wearable muscle interface device and the motion detected by at least one accelerometer of the wearable muscle interface device. Transmitting at least one signal from the wearable muscle interface device by a transmitter of the wearable muscle interface device may include wirelessly transmitting at least one signal from the wearable muscle interface device by a wireless transmitter of the wearable muscle interface device. Receiving the at least one signal directly from the wearable muscle interface device by a receiver of the wearable head-mounted display may include wirelessly receiving the at least one signal directly from the wearable muscle interface device by a wireless receiver of the wearable head-mounted display.





BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS

In the drawings, identical reference numbers identify similar elements or acts. The sizes and relative positions of elements in the drawings are not necessarily drawn to scale. For example, the shapes of various elements and angles are not drawn to scale, and some of these elements are arbitrarily enlarged and positioned to improve drawing legibility. Further, the particular shapes of the elements as drawn are not intended to convey any information regarding the actual shape of the particular elements, and have been solely selected for ease of recognition in the drawings.



FIG. 1 is a side plan view that illustrates a user wearing a head mounted display and a muscle interface device in accordance with the present systems, devices, and methods.



FIG. 2A is an isometric view that illustrates a detailed view of a muscle interface device in accordance with the present systems, devices, and methods.



FIG. 2B is a data graph that illustrates an electrical signal detected by an EMG sensor.



FIG. 3 is a schematic view that illustrates wireless communication between a head mounted display and a muscle interface device in accordance with the present systems, devices, and methods.



FIG. 4 is a schematic view that illustrates a user's hand and wrist gesture processed as a control signal by the muscle interface device for interacting with content displayed on the head mounted display.



FIG. 5 is a schematic view of a system architecture of a muscle interface device in accordance with the present systems, devices, and methods.



FIG. 6 is a flow chart of a method of using a wearable system to achieve hands-free access to and control of a portable electronic display in accordance with the present systems, devices, and methods.



FIG. 7 is a flow-diagram showing a method of using a wearable system to achieve hands-free access to and control of a portable electronic display in accordance with the present systems, devices, and methods.





In the drawings, embodiments of the invention are illustrated by way of example. It is to be expressly understood that the description and drawings are only for the purpose of illustration and as an aid to understanding, and are not intended as a definition of the limits of the invention.


DETAILED DESCRIPTION

In the following description, certain specific details are set forth in order to provide a thorough understanding of various disclosed embodiments. However, one skilled in the relevant art will recognize that embodiments may be practiced without one or more of these specific details, or with other methods, components, materials, etc. In other instances, well-known structures associated with electronic devices, and in particular portable electronic devices such as wearable electronic devices, have not been shown or described in detail to avoid unnecessarily obscuring descriptions of the embodiments.


Unless the context requires otherwise, throughout the specification and claims which follow, the word “comprise” and variations thereof, such as, “comprises” and “comprising” are to be construed in an open, inclusive sense, that is as “including, but not limited to.”


Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.


As used in this specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the content clearly dictates otherwise. It should also be noted that the term “or” is generally employed in its broadest sense, that is as meaning “and/or” unless the content clearly dictates otherwise.


The headings and Abstract of the Disclosure provided herein are for convenience only and do not interpret the scope or meaning of the embodiments.


The present disclosure relates to muscle interface systems, devices and methods that enable a user to access and interact with content displayed on an electronic display in an inconspicuous, hands-free manner.


In an aspect, a wearable system includes a wearable muscle interface device comprising a plurality of muscle activity sensors worn on an arm of a user. The plurality of muscle activity sensors are responsive to signals generated by muscles in the arm of the user. For example, when the user performs a physical gesture that involves one or more muscle(s) in the arm upon which the muscle interface device is worn, at least one of the muscle activity sensors may detect signals generated by the one or more muscle(s). The wearable muscle interface device is adapted to recognize gestures made by the user and to interact with content displayed on a wearable head-mounted display in response to the recognized gestures. To this end, the wearable system further includes a wearable head-mounted display and the wearable muscle interface device includes a transmitter communicatively coupled to the plurality of muscle activity sensors. In use, the transmitter of the wearable muscle interface device transmits at least one signal from the wearable muscle interface device directly to a receiver on the wearable head-mounted display based on the signals detected by the muscle activity sensors. The at least one signal transmitted from the wearable muscle interface device directly to the receiver on the wearable head-mounted display effects at least one interaction with content displayed on the wearable head-mounted display.


In another aspect, a muscle interface method comprises processing at least one signal based on one or more gesture(s) made by the user's hand, wrist, and/or arm to interact with content displayed on the wearable head-mounted display.


The plurality of muscle activity sensors in and/or on-board the wearable muscle interface device may include electromyography (EMG) sensors and/or mechanomyography (MMG) sensors to detect electrical signals and/or vibrations, respectively, produced by muscles in the user's arm and to provide one or more signal(s) in response to the detected electrical signals and/or vibrations. The electrical signals and/or vibrations detected from the muscles are interpreted as gestures made by the user which provide a direct control input to a wearable head-mounted display.


The control input is provided directly from the wearable muscle interface device to the wearable head-mounted display. Preferably, the control input is provided wirelessly from the wearable muscle interface device directly to the wearable head-mounted display via a wireless communication protocol, such as NFC or Bluetooth™, for example. However, it will be appreciated that other types of wireless communications may be used, including any wireless communication protocol developed for smart phones and similar devices. In some applications, a direct wire connection between the wearable muscle interface device and the wearable head-mounted display may be used.


In addition to EMG and/or MMG sensors, various other types of sensors may be used to detect gestures made by the user. For example, inertial sensors such as accelerometers and/or gyroscopes may be used to detect signals generated by motion of the arm of the user in response to the user performing the physical gesture. The wearable muscle interface device may include one or more accelerometer sensors that, in use, detect signals generated by motion of the arm of the user and/or measure characteristics of gestures made by the user, including gestures involving the elbow or even the shoulders of the user. When used together with EMG and/or MMG sensors for detecting gestures, the accelerometer sensors may be utilized to increase the variety of control inputs that may be generated for direct interaction with a wearable head-mounted display.


An illustrative example will now be described with reference to the drawings.


Shown in FIG. 1 is an illustrative user 100 wearing a wearable system 150 that in use provides hands-free access to and control of a portable electronic display 310 in accordance with the present systems, devices, and methods. Wearable system 150 includes a wearable head-mounted display 310 with on-board display control 300, and a wearable muscle interface device 200 having a plurality of muscle activity sensors in accordance with the present systems, devices, and methods. In this illustrative example, wearable muscle interface device 200 is a flexible, stretchable band that may be worn on the arm (e.g., the forearm) of user 100 as shown. As discussed in more detail herein, wearable muscle interface device 200 includes a transmitter (e.g., a wireless transmitter) and wearable head-mounted display 310 includes a receiver (e.g., a wireless receiver) such that at least one signal may be transmitted from wearable muscle interface device 200 directly to wearable head-mounted display 310 (i.e., without being received and re-transmitted by any intervening device, such as a stationary, non-portable intervening device) in response to signals detected by the muscle activity sensors of wearable muscle interface device 200 in order to effect interactions with and/or control of content displayed on or by wearable head-mounted display 310.



FIG. 2A illustrates a detailed view of wearable muscle interface device 200 from wearable system 150 of FIG. 1 in accordance with the present systems, devices, and methods. As shown, wearable muscle interface device 200 may comprise a processor 210 (e.g., a central processing unit, a digital microcontroller, a digital signal processor, or similar), and one or more batteries 220, which may be rechargeable, and which may be utilized concurrently or sequentially in a conventional manner. As shown, wearable muscle interface device 200 is a band to be worn on an arm of a user (e.g., a forearm of a user) and includes a plurality of muscle activity sensors 230 which may be positioned radially around the circumference of the band, such that the sensors 230 can, when in use, detect signals generated by muscles in the arm of user 100 in response to user 100 performing a physical gesture. Wearable muscle interface device 200 may further include transmitter 250 (e.g., a wireless transmitter) communicatively coupled to the plurality of muscle activity sensors 230 which, in use, transmits at least one signal from wearable muscle interface device 200 directly to a receiver on a wearable head-mounted display 310 based on the signals detected by muscle activity sensors 230. Wearable muscle interface device 200 may include a feedback mechanism (e.g., a haptic feedback module) such as a vibratory motor 240 to provide haptic feedback as described further below.


Further exemplary details that may be included in wearable muscle interface device 200 include the systems, articles, and methods described in, without limitation: U.S. Provisional Patent Application Ser. No. 61/768,322, U.S. Provisional Patent Application Ser. No. 61/771,500, U.S. Provisional Patent Application Ser. No. 61/857,105, U.S. Provisional Patent Application Ser. No. 61/860,063, U.S. Provisional Patent Application Ser. No. 61/822,740, U.S. Provisional Patent Application Ser. No. 61/866,960, U.S. Provisional Patent Application Ser. No. 61/869,526, U.S. Provisional Patent Application Ser. No. 61/874,846, U.S. Provisional Patent Application Ser. No. 61/872,569, U.S. Provisional Patent Application Ser. No. 61/881,064, U.S. Provisional Patent Application Ser. No. 61/894,263, U.S. Provisional Patent Application Ser. No. 61/903,238, U.S. Provisional Patent Application Ser. No. 61/909,786, and/or U.S. Provisional Patent Application Ser. No. 61/915,338, all of which are incorporated by reference herein in their entirety.


Wearable muscle interface device 200 may be calibrated when first worn, prior to operation, such that muscle interface device 200 may perform reliable gesture identification regardless of the exact positioning of the muscle activity sensors 230 on the user's arm.
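The calibration routine itself is not specified in this disclosure. Purely as an illustrative sketch, one plausible approach is to record a short window of rest and a short window of a known reference contraction when the band is first donned, then derive per-channel scale factors; all names and values below are assumptions, not taken from the patent.

```python
import numpy as np

def calibrate(rest: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Derive per-channel scale factors from two short recordings.

    rest      -- samples recorded while the arm is relaxed, shape (n, channels)
    reference -- samples recorded during a known contraction, shape (m, channels)
    Returns per-channel gains that map the reference contraction to roughly 1.0,
    so gesture features are comparable regardless of where the band sits on the arm.
    """
    noise_floor = np.sqrt(np.mean(rest ** 2, axis=0))  # per-channel RMS at rest
    active = np.sqrt(np.mean(reference ** 2, axis=0))  # per-channel RMS while contracting
    span = np.maximum(active - noise_floor, 1e-9)      # guard against divide-by-zero
    return 1.0 / span

# Illustrative usage with synthetic data standing in for sensor recordings.
rng = np.random.default_rng(0)
gains = calibrate(rest=0.01 * rng.standard_normal((2000, 8)),
                  reference=0.15 * rng.standard_normal((2000, 8)))
normalized_window = 0.15 * rng.standard_normal((200, 8)) * gains
```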


By way of example, muscle activity sensors 230 may include one or more EMG sensor(s), each of which may provide a respective EMG signal in the form of an oscillating waveform that varies in both frequency and amplitude. A majority of signal information that is needed for reliable gesture identification may be contained within a limited bandwidth of such an oscillating waveform, such as in the 5 Hz to 250 Hz frequency band. An illustrative example of an EMG signal 200B is shown in FIG. 2B.


As previously described, the plurality of muscle activity sensors 230 may include one or more MMG sensor(s) comprising piezoelectric sensors, which may be used to measure the vibrations at the surface of the skin produced by the underlying muscles when contracted. By way of example, the MMG signal generated may be an oscillating waveform that varies in both frequency and amplitude, and a majority of signal information that is needed for reliable gesture identification may be contained within a limited bandwidth, such as in the 5 Hz to 250 Hz frequency band. Because the MMG signal is acquired via mechanical means, electrical variations like skin impedance may not have a significant effect on the signal. The MMG signal may be very similar to the illustrative example of EMG signal 200B shown in FIG. 2B.


As previously described, wearable muscle interface device 200 may include one or more accelerometer sensor(s) 260 that, in use, detect additional aspects of gestures made by user 100 in, for example, three degrees of freedom. For example, at least one accelerometer 260 may be communicatively coupled to transmitter 250 of wearable muscle interface device 200 and, in use, the at least one signal transmitted from transmitter 250 directly to the receiver on the wearable head-mounted display 310 may be based on both the signals detected by muscle activity sensors 230 and the signals detected by the at least one accelerometer 260. An accelerometer signal may, for example, consist of three digital channels of data, each representing the acceleration in a respective one of three orthogonal directions (e.g., the x, y, and z directions). The signal may be representative of all of the accelerations that the user's arm is subject to, and may further represent motion of the body as a whole.
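As a minimal sketch of how the two sensor streams might be combined, the following assumes windowed RMS for the muscle activity channels and per-axis mean acceleration, concatenated into a single feature vector; the feature choices are illustrative assumptions, not prescribed by this disclosure.

```python
import numpy as np

def feature_vector(emg_window: np.ndarray, accel_window: np.ndarray) -> np.ndarray:
    """Combine muscle-activity and motion data into one feature vector.

    emg_window   -- shape (n_samples, n_emg_channels)
    accel_window -- shape (n_samples, 3), one column per orthogonal axis (x, y, z)
    """
    emg_rms = np.sqrt(np.mean(emg_window ** 2, axis=0))  # muscle activation level per channel
    accel_mean = np.mean(accel_window, axis=0)           # gross arm motion per axis
    return np.concatenate([emg_rms, accel_mean])

rng = np.random.default_rng(1)
features = feature_vector(rng.standard_normal((200, 8)), rng.standard_normal((200, 3)))
print(features.shape)  # (11,) -- 8 muscle channels + 3 acceleration axes
```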


Now referring to FIG. 3, shown is wearable system 150 from FIG. 1 with an illustration of direct wireless communication (e.g., Bluetooth™, NFC, etc.) between wearable muscle interface device 200 and wearable head-mounted display 310 in accordance with the present systems, devices, and methods. This wireless communication is utilized to transmit one or more signal(s) from wearable muscle interface device 200 directly to wearable head-mounted display 310 (e.g., to a wireless receiver 350 located in or on display control 300 of wearable head-mounted display 310) without any intervening communicative couplings or links. In this way, the user 100 may access and control or otherwise interact with a portable electronic display in an inconspicuous and hands-free manner. User 100 does not need to use his or her hand(s) to position or orient the portable electronic display of wearable head-mounted display 310 in order to be able to see, access, receive feedback from, or otherwise interact with the portable electronic display of wearable head-mounted display 310 because wearable head-mounted display 310 is arranged such that at least one display screen is positioned in front of at least one eye of user 100 at all times while wearable head-mounted display 310 is worn on user 100's head, regardless of the direction that user 100 is facing. Furthermore, wearable muscle interface device 200 enables user 100 to control or otherwise interact with content displayed on wearable head-mounted display 310 in an inconspicuous manner by using touchless gestures.
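The disclosure names Bluetooth™ and NFC but fixes no profile or payload format. As one hedged illustration, the sketch below shows the display side subscribing to gesture notifications over Bluetooth Low Energy using the third-party Python library `bleak`; the device address, characteristic UUID, and one-byte gesture encoding are all hypothetical.

```python
import asyncio
from bleak import BleakClient  # third-party BLE library, used here for illustration

ARMBAND_ADDRESS = "AA:BB:CC:DD:EE:FF"                        # hypothetical device address
GESTURE_CHAR_UUID = "0000feed-0000-1000-8000-00805f9b34fb"   # hypothetical characteristic
GESTURES = {0: "rest", 1: "scroll_down", 2: "scroll_up", 3: "select"}  # illustrative codes

def on_gesture(_sender, payload: bytearray) -> None:
    # Each notification is assumed to carry a one-byte gesture code.
    print("gesture:", GESTURES.get(payload[0], "unknown"))

async def main() -> None:
    # The head-mounted display acts as the BLE central and receives signals
    # directly from the armband, with no intervening device.
    async with BleakClient(ARMBAND_ADDRESS) as client:
        await client.start_notify(GESTURE_CHAR_UUID, on_gesture)
        await asyncio.sleep(30)  # listen for notifications for 30 s

asyncio.run(main())
```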


Inconspicuous gesture-based control of and/or interactions with wearable head-mounted display 310 is illustrated by way of example in FIG. 4, in which user 100's hand and wrist gesture is detected and processed by wearable muscle interface device 200 and transmitted directly from transmitter 250 to receiver 350 of wearable head-mounted display 310 for interacting with content displayed thereon.


In this particular example, a gesture 410 made by user 100 extending an index finger and making a wrist flexion motion 420 is detected by the muscle activity sensors 230 (and/or accelerometer sensors 260, if included) of wearable muscle interface device 200 (not visible in FIG. 4). Signals provided by the muscle activity sensors 230 in response to the detected gesture 410 are processed by processor 210 (FIG. 2), which interprets the signals to identify gesture 410 performed by user 100. A corresponding signal is produced based on the gesture 410 interpreted by the processor 210, and the signal is transmitted from transmitter 250 directly to receiver 350 of wearable head-mounted display 310, which causes a menu appearing on wearable head-mounted display 310 to scroll downwards.


As another example, a similar gesture in which user 100 extends the index finger and makes a wrist extension motion may be detected by muscle activity sensors 230 (and/or accelerometer sensors 260 if included) of wearable muscle interface device 200 and processed by processor 210 (FIG. 2). Processor 210 may interpret the detected muscle activity to identify the gesture performed, and a corresponding signal may be transmitted from transmitter 250 directly to receiver 350 of wearable head-mounted display 310 to cause a menu appearing on wearable head-mounted display 310 to scroll upwards.


As yet another example, a gesture in which user 100 extends the index finger and makes a poking motion involving a slight movement of the elbow and shoulder may be detected by muscle activity sensors 230 (and/or accelerometer sensors 260 if included) of wearable muscle interface device 200 and processed by processor 210 (FIG. 2). Processor 210 may interpret the detected muscle activity to identify the gesture performed, and a corresponding signal may be transmitted from transmitter 250 directly to receiver 350 of wearable head-mounted display 310 to cause a highlighted menu item appearing on wearable head-mounted display 310 to be selected.


If the user extends a finger other than the index finger, muscle activity sensors 230 may detect this, a different gesture may be identified by wearable muscle interface device 200, and a different signal may be transmitted directly to wearable head-mounted display 310 to effect a different interaction or function thereof. For example, extending the little finger or “pinky” finger instead of the index finger may cause wearable system 150 to associate the user's gestures with functions analogous to clicking a right mouse button rather than a left mouse button in a conventional mouse user interface. Extending both the index and pinky fingers at the same time may cause wearable system 150 to associate the user's gestures with yet other functions analogous to clicking a third mouse button in a conventional mouse user interface.


Thus, wearable muscle interface device 200 may be adapted and/or calibrated to recognize a wide range of gestures made by a user 100, based on measurements from a plurality of muscle activity sensors 230 (and, in some implementations, one or more accelerometer sensor(s) 260) in the wearable muscle interface device 200.


Wearable muscle interface device 200 may itself be operative to interpret the gestures from the detected signals as described above by, for example, using an on-board processor 210 to process the EMG signals and interpret the EMG signals as a gesture via a gesture identification process (e.g., by invoking data and/or instructions stored in an on-board non-transitory computer-readable storage medium that, when executed by processor 210, cause processor 210 to identify the gesture performed by user 100). Wearable muscle interface device 200 may then transmit one or more signal(s) from transmitter 250 directly to receiver 350 of wearable head-mounted display 310 in order to effect some interaction with wearable head-mounted display 310 based on the interpreted gesture. In this example, the processor 210 may be communicatively coupled in between the transmitter 250 and the plurality of muscle activity sensors 230 such that transmitter 250 transmits one or more signal(s) provided by processor 210 (e.g., corresponding to an interpreted gesture) based at least in part on the signals provided by muscle activity sensors 230.
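The gesture identification process is left open by this disclosure. Purely as a sketch, a nearest-centroid classifier over windowed RMS features could look like the following; the centroid values and gesture names are invented for illustration.

```python
import numpy as np

# Hypothetical per-gesture centroids (RMS feature vectors over 8 muscle activity
# channels), e.g., learned during a calibration/training phase.
CENTROIDS = {
    "scroll_down": np.array([0.9, 0.2, 0.1, 0.1, 0.7, 0.1, 0.1, 0.2]),
    "scroll_up":   np.array([0.1, 0.8, 0.6, 0.1, 0.1, 0.2, 0.1, 0.1]),
    "select":      np.array([0.4, 0.4, 0.4, 0.5, 0.4, 0.5, 0.4, 0.4]),
}

def identify_gesture(window: np.ndarray) -> str:
    """Map one window of EMG samples, shape (n, 8), to the nearest gesture centroid."""
    features = np.sqrt(np.mean(window ** 2, axis=0))
    distances = {name: np.linalg.norm(features - c) for name, c in CENTROIDS.items()}
    return min(distances, key=distances.get)

rng = np.random.default_rng(2)
print(identify_gesture(0.5 * rng.standard_normal((200, 8))))
```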


However, in an alternative implementation, the detected EMG signals may be transmitted directly to the receiver 350 of wearable head-mounted display 310 from transmitter 250 (e.g., without being processed by processor 210, which may or may not be included in device 200 in this example) and wearable head-mounted display 310 may include a processor 320 (e.g., a central processing unit, a digital microcontroller, a digital signal processor, or similar, located in or on display control 300) communicatively coupled to receiver 350 to process the EMG signals and interpret the EMG signals as a gesture via a gesture identification process (e.g., by invoking data and/or instructions stored in an on-board non-transitory computer-readable storage medium that, when executed by processor 320, cause processor 320 to identify the gesture performed by user 100). Wearable head-mounted display 310 may then effect some interaction with content displayed thereon based on the interpreted gesture. Whether the detected EMG signals are interpreted at the device 200 or at the display 310, the detected EMG signals are first interpreted as a recognized gesture in order to interact with content displayed on the display 310.


Wearable muscle interface device 200 may include a haptic feedback module to provide feedback that a gesture has been recognized. This haptic feedback may provide a user 100 with confirmation that the user 100's gesture has been recognized, and successfully converted to a signal to interact with content displayed on wearable head-mounted display 310. The haptic feedback module may comprise, for example, a vibrating mechanism such as a vibratory motor 240 built into the wearable muscle interface device 200.


Alternatively, rather than haptic feedback provided by the wearable muscle interface device 200, confirmation of recognition of a gesture may be provided by auditory feedback, generated either by a speaker on the wearable muscle interface device 200 or by a speaker operatively connected to the wearable head-mounted display 310.


As another alternative, confirmation of recognition of a gesture may be provided visually on the wearable head-mounted display 310 itself. If there is more than one possible gesture that may be interpreted from the detected signals, rather than providing a possibly erroneous signal, the wearable muscle interface device 200 and/or the wearable head-mounted display 310 may provide a selection of two or more possible gestures as possible interpretations, and the user may be prompted to select from one of them to confirm the intended gesture and corresponding control.
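One illustrative way to surface multiple candidate interpretations is to return every gesture whose probability falls within a margin of the best match; the margin and the probabilities below are assumptions for the sake of the sketch.

```python
from typing import Dict, List

def candidate_gestures(probabilities: Dict[str, float], margin: float = 0.15) -> List[str]:
    """Return all gestures within `margin` of the top probability.

    If more than one gesture is returned, the system may prompt the user to
    confirm the intended gesture rather than act on a possibly erroneous one.
    """
    ranked = sorted(probabilities.items(), key=lambda kv: kv[1], reverse=True)
    top_probability = ranked[0][1]
    return [name for name, p in ranked if top_probability - p <= margin]

print(candidate_gestures({"scroll_down": 0.45, "scroll_up": 0.40, "select": 0.15}))
# -> ['scroll_down', 'scroll_up']: ambiguous, so prompt the user to choose
```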


Now referring to FIG. 5, shown is an illustrative schematic system architecture 500 of the wearable muscle interface device 200 component of a wearable system 150 providing inconspicuous and hands-free access to and control of a portable electronic display in accordance with the present systems, devices, and methods. As shown, system architecture 500 includes a CPU 502 (e.g., a processor, such as a digital microprocessor or microcontroller), non-transitory computer-readable memory 504, system clock 506, a wireless communication module 508 (e.g., Bluetooth™, NFC, or the like), and a direct memory access (DMA) controller 510. As shown, DMA controller 510 is adapted to receive inputs from various sensors on-board the wearable muscle interface device 200, including one or more EMG sensors 520, MMG sensors 530 and/or accelerometer sensors 540.


In the illustrative example of system architecture 500, detected signals from one or more EMG sensors 520 are processed through signal filter 522 and converted from analog to digital signals by ADC 524. If one or more MMG sensors 530 are used (either in addition to or instead of EMG sensors 520), then the detected signals from the MMG sensors 530 are processed through signal filter 532 and converted from analog to digital signals by ADC 534. Digital signals from one or more accelerometer sensors 540 may also be processed through signal filter 542 and received by DMA controller 510.


The data from the various types of sensors 520, 530, 540 may be acquired through an analog filtering chain. The data may be band-passed through filters 522, 532 between about 10 Hz and about 500 Hz, and amplified (e.g., by a total of about 1000 times). This filtering and amplification may be adjusted as required to suit the software parameters. A notch filter at 60 Hz, or at any other relevant frequency, may also be used to remove powerline noise.
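The chain above is described in analog hardware; as a hedged sketch, a digital equivalent built with scipy might look as follows. The 2 kHz sampling rate is an assumption (the disclosure does not specify one), and zero-phase filtering is used here for offline simplicity where an embedded implementation would filter causally sample by sample.

```python
import numpy as np
from scipy.signal import butter, iirnotch, sosfiltfilt, tf2sos

FS = 2000.0  # assumed sampling rate (Hz); not specified in this disclosure

# Band-pass of roughly 10 Hz - 500 Hz, mirroring filters 522/532.
bandpass = butter(4, [10.0, 500.0], btype="bandpass", fs=FS, output="sos")

# Notch at 60 Hz to remove powerline noise.
b, a = iirnotch(w0=60.0, Q=30.0, fs=FS)
notch = tf2sos(b, a)

GAIN = 1000.0  # total amplification described in the text

def condition(raw: np.ndarray) -> np.ndarray:
    """Apply band-pass, notch, and gain to one channel of raw sensor data."""
    filtered = sosfiltfilt(bandpass, raw)   # zero-phase, for offline illustration
    filtered = sosfiltfilt(notch, filtered)
    return GAIN * filtered

# Illustrative usage: a 100 Hz in-band tone plus 60 Hz mains hum.
t = np.arange(0, 1.0, 1.0 / FS)
raw = 1e-3 * np.sin(2 * np.pi * 100 * t) + 1e-3 * np.sin(2 * np.pi * 60 * t)
clean = condition(raw)  # hum suppressed, in-band tone amplified
```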


Data from the sensors 520, 530 may be converted to, e.g., 12-bit digital data by ADCs 524, 534, and then clocked into on-board memory 504 by the DMA controller 510, using system clock 506, to be processed by the CPU 502.
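To make the scaling concrete, the arithmetic below converts a 12-bit ADC reading back to the signal level at the electrode, assuming a 3.3 V ADC reference (an assumption; the reference voltage is not given in this disclosure) and the roughly 1000x analog gain described above.

```python
V_REF = 3.3         # assumed ADC reference voltage (volts); not specified here
ADC_BITS = 12       # per the text: 12-bit digital data
ANALOG_GAIN = 1000  # total amplification ahead of the ADC, per the text

def counts_to_skin_microvolts(counts: int) -> float:
    """Convert one 12-bit ADC reading back to the signal level at the electrode."""
    volts_at_adc = counts * V_REF / (2 ** ADC_BITS - 1)
    return volts_at_adc / ANALOG_GAIN * 1e6

# A full-scale reading (4095) corresponds to 3.3 V at the ADC, i.e., ~3.3 mV at
# the skin, consistent with the uV-mV range of EMG signals noted earlier.
print(counts_to_skin_microvolts(4095))  # ~3300 uV
```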


Now referring to FIG. 6, shown is a schematic flow chart of a method 600 of using a wearable system (e.g., 150) to achieve hands-free access to and control of a portable electronic display in accordance with the present systems, devices, and methods. As shown, method 600 begins at block 602, where method 600 pairs a wearable muscle interface device 200 with a wearable head-mounted display 310. Method 600 then proceeds to block 604, where content and/or a user interface (UI) is displayed on the wearable head-mounted display 310.


Method 600 then proceeds to block 606, where method 600 determines if the displayed content and/or UI is navigable. If no, method 600 returns to block 604. If yes, method 600 proceeds to block 608, where the wearable muscle interface device 200 detects muscle activity corresponding to a physical gesture performed by a user of the wearable system 150 (i.e., at least one muscle activity sensor 230 of the wearable muscle interface device 200 detects the user's intentional hand/arm movements and positions), and wirelessly sends/transmits at least one signal corresponding to an identified gesture from the wearable muscle interface device 200 to the wearable head-mounted display 310. The at least one signal may be sent by a transmitter 250 of the wearable muscle interface device 200 based on the muscle activity detected by at least one muscle activity sensor 230 of the wearable muscle interface device 200.


Method 600 then proceeds to block 610, where a receiver 350 on the wearable head-mounted display 310 receives the at least one signal directly from the transmitter 250 of the wearable muscle interface device 200. A processor 320 of the wearable head-mounted display 310 processes the at least one signal, and effects at least one interaction between the user 100 and the wearable head-mounted display 310 based on the processing of the at least one signal by processor 320 of the wearable head-mounted display 310.
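A compact sketch of the control flow of method 600 follows, with stub classes standing in for the two devices; every method name here is a hypothetical placeholder for the corresponding block of FIG. 6.

```python
class Armband:
    """Stand-in for wearable muscle interface device 200."""
    def pair(self, display) -> None:                     # block 602
        print("paired with", display.name)
    def detect_and_transmit(self) -> str:                # block 608 (sensing stubbed)
        return "select"

class Display:
    """Stand-in for wearable head-mounted display 310."""
    name = "HMD"
    def render_content(self) -> None:                    # block 604
        pass
    def content_is_navigable(self) -> bool:              # block 606
        return True
    def receive_and_effect(self, signal: str) -> None:   # block 610
        print("effecting interaction:", signal)

def run_once(armband: Armband, display: Display) -> None:
    armband.pair(display)
    display.render_content()
    if display.content_is_navigable():
        display.receive_and_effect(armband.detect_and_transmit())

run_once(Armband(), Display())
```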


Another example of a method employing a wearable system in accordance with the present systems, devices, and methods is illustrated in FIG. 7. FIG. 7 is a flow-diagram showing a method 700 of using wearable system 150 to achieve hands-free access to and control of a portable electronic display. The wearable system 150 includes a wearable muscle interface device 200 and a wearable head-mounted display 310. Method 700 includes five acts 701, 702, 703, 704, and 705, although those of skill in the art will appreciate that in alternative embodiments certain acts may be omitted and/or additional acts may be added. Those of skill in the art will also appreciate that the illustrated order of the acts is shown for exemplary purposes only and may change in alternative embodiments. For the purpose of method 700, the term “user” refers to a person that is wearing both the wearable muscle interface device 200 (e.g., worn on at least one of the user's arms) and the wearable head-mounted display 310 of the wearable system 150 (e.g., worn on the user's head).


At 701, the user performs a physical gesture, and muscle activity corresponding to the physical gesture is detected by at least one muscle activity sensor 230 of the wearable muscle interface device 200. The muscle activity sensors 230 may include at least one EMG sensor that detects electrical signals generated by the muscle activity and/or at least one MMG sensor that detects vibrations generated by the muscle activity. In addition to muscle activity, motion of the wearable muscle interface device 200 corresponding to the physical gesture may be detected by at least one accelerometer 260 on-board the wearable muscle interface device 200.


At 702, at least one signal is transmitted by a transmitter 250 of the wearable muscle interface device 200 based at least in part on the muscle activity detected at 701. As previously described, transmitter 250 may be a wireless transmitter such that transmitting at least one signal by transmitter 250 includes wirelessly transmitting the at least one signal by transmitter 250. In implementations in which motion of the wearable muscle interface device 200 is also detected by at least one accelerometer 260, transmitting at least one signal by transmitter 250 based at least in part on the muscle activity detected at 701 may include transmitting at least one signal by transmitter 250 based on both the muscle activity detected by at least one muscle activity sensor 230 and the motion detected by at least one accelerometer 260.


In response to detecting muscle activity corresponding to a physical gesture performed by the user at 701, method 700 may include processing the detected muscle activity by a processor 210 communicatively coupled in between the muscle activity sensors 230 and the transmitter 250 (e.g., to interpret the signals provided by the muscle activity sensors 230 and/or to identify the user-performed gesture). In this case, transmitting at least one signal by transmitter 250 based at least in part on the muscle activity detected at 701 may include transmitting at least one signal by transmitter 250 based at least in part on processing the detected muscle activity by the processor 210 of the wearable muscle interface device 200.


At 703, the at least one signal is received directly from transmitter 250 by a receiver 350 of the wearable head-mounted display 310. In implementations where transmitter 250 is a wireless transmitter, receiver 350 may include a wireless receiver such that receiving the at least one signal by receiver 350 includes wirelessly receiving the at least one signal by receiver 350. The at least one signal is transmitted directly from transmitter 250 to receiver 350 without routing through any intervening devices or systems.


At 704, the at least one signal received by receiver 350 is processed by a processor 320 of the wearable head-mounted display 310. Processing the at least one signal by the processor 320 of the wearable head-mounted display 310 may include, for example, mapping or otherwise associating the at least one signal to/with one or more function(s) of the wearable head-mounted display 310 based on data and/or instructions stored in a non-transitory computer-readable storage medium on-board the wearable head-mounted display 310 (data and/or instructions which, when executed by the processor 320 of the wearable head-mounted display 310, cause the processor 320 to effect one or more function(s) of the wearable head-mounted display 310).
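One simple realization of such a mapping is a lookup table from received signals to display functions; the signal names and functions below are hypothetical stand-ins, not taken from this disclosure.

```python
from typing import Callable, Dict

# Hypothetical display functions; the disclosure only says the mapping lives in
# on-board non-transitory storage of the head-mounted display.
def scroll_menu_down() -> None: print("menu scrolled down")
def scroll_menu_up() -> None: print("menu scrolled up")
def select_highlighted_item() -> None: print("item selected")

SIGNAL_TO_FUNCTION: Dict[str, Callable[[], None]] = {
    "index_flexion": scroll_menu_down,      # cf. gesture 410 in FIG. 4
    "index_extension": scroll_menu_up,
    "index_poke": select_highlighted_item,
}

def process_signal(signal: str) -> None:
    """Acts 704/705: associate a received signal with a function and effect it."""
    action = SIGNAL_TO_FUNCTION.get(signal)
    if action is not None:
        action()

process_signal("index_flexion")  # -> "menu scrolled down"
```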


At 705, at least one interaction between the user and the wearable head-mounted display 310 is effected by the processor 320 of the wearable head-mounted display 310 based on the processing of the at least one signal at 704. The at least one interaction may include any function or operation that prompts, modifies, changes, elicits, or otherwise involves visual information provided to the user by the wearable head-mounted display 310, including without limitation: interacting with visual material such as a photograph or video, navigating a menu, interacting with visually displayed elements such as a map or an element of a video game, and so on. Depending on the specific application, elements displayed on the wearable head-mounted display 310 may or may not accommodate or otherwise take into account aspects of the user's environment that may be visible to the user. For example, elements displayed on the wearable head-mounted display 310 may obscure, overlay, augment, highlight, block, be superimposed on, and/or semi-transparently project in front of elements of the user's environment.


As will be appreciated, the systems, devices, and methods described herein, which enable a user to access and interact with content displayed on an electronic display in an inconspicuous, hands-free manner, may be used in virtually any application in which portable electronic displays are contemplated. By providing a discreet method of interacting with a wearable head-mounted display, a user is able to interact with such a display in any operating environment, including situations where overt gesturing (e.g., raising the hand to touch an input device provided on the wearable head-mounted display itself) is not desirable.


While various embodiments and illustrative examples have been described above, it will be appreciated that these embodiments and illustrative examples are not limiting, and the scope of the invention is defined by the following claims.


The various embodiments described herein provide, at least, a wearable system (e.g., 150) including a wearable muscle interface device (e.g., 200) that, in use, is to be worn on an arm of a user in order to enable hands-free access to, and control of, a wearable head-mounted display (e.g., 310). As described previously, the singular forms “a,” “an,” and “the” used in this specification and the appended claims include plural referents unless the content clearly dictates otherwise. In some applications, it can be advantageous or otherwise desirable for such a wearable system (150) to employ two or more wearable muscle interface devices (e.g., two or more wearable muscle interface devices 200) worn on both of the user's arms (e.g., at least a respective wearable muscle interface device 200 worn on each of the user's arms) as described in U.S. Provisional Patent Application Ser. No. 61/874,846. Such may enable a greater number and/or diversity of gestures to be used to interact with content displayed on the wearable head-mounted display (e.g., 310). Furthermore, in various embodiments the gesture-based interaction systems, devices, and methods described herein may be combined with other forms of touchless control, including without limitation: voice/speech-based control techniques such as Siri®, control techniques based on eye/vision tracking and/or blinking, electroencephalography (EEG), or the like.


Throughout this specification and the appended claims, the terms “head-mounted display” and “heads-up display” are used substantially interchangeably to refer to an electronic display that is worn on the head of a user and arranged so that at least one electronic display is positioned in front of at least one eye of the user when the head-mounted/heads-up display is worn on the head of the user. For greater clarity, “positioned in front of at least one eye of the user” means that the content displayed on or by the electronic display is displayed, projected, or otherwise provided generally in front of at least one eye of the user and is visible by that at least one eye regardless of the orientation or position of the user's head. An electronic display that is “positioned in front of at least one eye of the user” may correspond to a projection, reflection, refraction, diffraction, or direct display of optical signals and may be located in the user's direct line of sight or may be located off of the user's direct line of sight such that the user may or may not need to deliberately direct one or more eye(s), without necessarily moving their head, towards the electronic display in order to see (i.e., access) the content displayed thereby.


Throughout this specification and the appended claims, the term “gesture” is used to generally refer to a physical action (e.g., a movement, a stretch, a flex, a pose) performed or otherwise effected by a user. Any physical action performed or otherwise effected by a user that involves detectable muscle activity (detectable, e.g., by at least one appropriately positioned muscle activity sensor) and/or detectable motion (detectable, e.g., by at least one appropriately positioned inertial sensor, such as an accelerometer and/or a gyroscope) may constitute a gesture in the present systems, articles, and methods.
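

To make this definition concrete, the sketch below shows one hypothetical way a time-slice of muscle activity and motion signals might be represented and screened for detectable activity in software. It is a minimal illustration only: the names (SensorFrame, involves_detectable_activity), channel counts, units, and threshold values are assumptions chosen for this example and are not taken from the specification.

```python
from dataclasses import dataclass
from typing import Sequence

@dataclass
class SensorFrame:
    """One time-slice of the signals that may evidence a gesture (illustrative only)."""
    emg: Sequence[float]    # one sample per muscle activity sensor (hypothetical channel count)
    accel: Sequence[float]  # accelerometer reading (x, y, z), in g
    gyro: Sequence[float]   # gyroscope reading (x, y, z), in degrees/second
    timestamp_ms: int       # capture time, in milliseconds

def involves_detectable_activity(frame: SensorFrame,
                                 emg_threshold: float = 0.05,
                                 motion_threshold: float = 0.1) -> bool:
    """Per the definition above, a physical action is a candidate gesture if it
    produces detectable muscle activity and/or detectable motion.
    Threshold values here are arbitrary placeholders."""
    muscle_active = any(abs(s) > emg_threshold for s in frame.emg)
    in_motion = (any(abs(a) > motion_threshold for a in frame.accel)
                 or any(abs(w) > motion_threshold for w in frame.gyro))
    return muscle_active or in_motion
```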


Throughout this specification and the appended claims, the term "communicative," as in "communicative pathway" or "communicative coupling," and in variants such as "communicatively coupled," is generally used to refer to any arrangement for transferring and/or exchanging information. Exemplary communicative pathways include, but are not limited to, electrically conductive pathways (e.g., electrically conductive wires, electrically conductive traces), magnetic pathways (e.g., magnetic media), and/or optical pathways (e.g., optical fiber), and exemplary communicative couplings include, but are not limited to, electrical couplings, magnetic couplings, and/or optical couplings.


Throughout this specification and the appended claims, the term "provide" and variants such as "provided" and "providing" are frequently used in the context of signals. For example, a muscle activity sensor is described as "providing at least one signal" and an inertial sensor is described as "providing at least one signal." Unless the specific context requires otherwise, the term "provide" is used in its most general sense to cover any form of providing a signal, including but not limited to: relaying a signal, outputting a signal, generating a signal, routing a signal, creating a signal, transducing a signal, and so on. For example, a surface EMG sensor may include at least one electrode that resistively or capacitively couples to electrical signals from muscle activity. This coupling induces a change in the charge or electrical potential of the at least one electrode, which change is then relayed through the sensor circuitry and output, or "provided," by the sensor. Thus, the surface EMG sensor may "provide" an electrical signal by relaying an electrical signal from a muscle (or muscles) to an output (or outputs). In contrast, an inertial sensor may include components (e.g., piezoelectric, piezoresistive, capacitive, etc.) that convert physical motion into electrical signals. The inertial sensor may "provide" an electrical signal by detecting motion and generating an electrical signal in response to the motion.
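

The relay-versus-generate distinction drawn above can be expressed in software by giving both sensor types a common "provide" interface. The following is a minimal sketch under that assumption; the class and method names, the gain value, and the callables are hypothetical and not part of the specification.

```python
from abc import ABC, abstractmethod

class SignalProvider(ABC):
    """'Provide' in its most general sense: relay, output, generate,
    route, create, or transduce a signal."""
    @abstractmethod
    def provide(self) -> float:
        ...

class SurfaceEMGSensor(SignalProvider):
    """Provides a signal by relaying: an electrode resistively or capacitively
    couples to muscle activity, and the resulting change in electrode potential
    is amplified and passed through to the sensor's output."""
    def __init__(self, read_electrode_potential, gain: float = 1000.0):
        self._read = read_electrode_potential  # callable returning volts at the electrode
        self._gain = gain                      # illustrative amplifier gain
    def provide(self) -> float:
        return self._gain * self._read()

class InertialSensor(SignalProvider):
    """Provides a signal by generating: physical motion is transduced (e.g.,
    piezoelectrically, piezoresistively, or capacitively) into a new electrical
    signal rather than relayed from the body."""
    def __init__(self, read_acceleration):
        self._read = read_acceleration  # callable returning acceleration in g
    def provide(self) -> float:
        return self._read()

# Hypothetical usage: both sensor types "provide" signals through the same interface.
emg = SurfaceEMGSensor(read_electrode_potential=lambda: 2e-5)
imu = InertialSensor(read_acceleration=lambda: 0.98)
signals = [sensor.provide() for sensor in (emg, imu)]
```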


Throughout this specification and the appended claims, "identifying" or "interpreting signals as" a gesture means associating a set of signals provided by one or more muscle activity sensor(s) (230) with a particular gesture. In the various embodiments described herein, "identifying" or "interpreting signals as" a gesture includes determining which gesture in a gesture library is most probable (relative to the other gestures in the gesture library) to be the gesture that a user has performed or is performing in order to produce the signals upon which the gesture identification is at least partially based. The wearable muscle interface devices described herein are generally not operative to identify any arbitrary gesture performed by a user. Rather, the wearable muscle interface devices described herein are operative to identify when a user performs one of a specified set of gestures, and that specified set of gestures is referred to herein as a gesture library. A gesture library may include any number of gestures, though a person of skill in the art will appreciate that the precision/accuracy of gesture identification may be inversely related to the number of gestures in the gesture library. A gesture library may be expanded by adding one or more gesture(s) or reduced by removing one or more gesture(s). Furthermore, in accordance with the present systems, articles, and methods, a gesture library may include a "rest" gesture corresponding to a state for which no activity is detected and/or an "unknown" gesture corresponding to a state for which activity is detected but the activity does not correspond to any other gesture in the gesture library.
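

A minimal sketch of this most-probable-gesture selection follows, including the "rest" and "unknown" fallbacks described above and the ambiguous case contemplated in the claims below, where the two most probable interpretations are both surfaced so that input can be requested from the user to select between them. The function names, threshold values, and the idea of a ready-made posterior-probability dictionary are illustrative assumptions; the specification does not prescribe any particular classification method.

```python
REST, UNKNOWN = "rest", "unknown"

def identify_gesture(posteriors: dict, activity_level: float,
                     activity_threshold: float = 0.05,
                     confidence_threshold: float = 0.6) -> str:
    """Return the most probable gesture in the library, or a fallback.

    posteriors maps each library gesture to its probability given the sensor
    signals; how those probabilities are computed is left open here."""
    if activity_level < activity_threshold:
        return REST  # no detectable activity
    best, best_p = max(posteriors.items(), key=lambda kv: kv[1])
    # Activity was detected, but no library gesture matches well enough.
    return best if best_p >= confidence_threshold else UNKNOWN

def first_and_second_interpretations(posteriors: dict):
    """When identification is ambiguous, the two most probable gestures can be
    presented so the user is asked to select between them."""
    ranked = sorted(posteriors.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[0][0], ranked[1][0]

# Hypothetical usage:
p = {"fist": 0.45, "swipe": 0.40, "pinch": 0.15}
identify_gesture(p, activity_level=0.3)    # -> 'unknown' (0.45 < 0.6)
first_and_second_interpretations(p)        # -> ('fist', 'swipe')
```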


Throughout this specification and the appended claims, infinitive verb forms are often used. Examples include, without limitation: "to detect," "to provide," "to transmit," "to communicate," "to process," "to route," and the like. Unless the specific context requires otherwise, such infinitive verb forms are used in an open, inclusive sense, that is, as "to, at least, detect," "to, at least, provide," "to, at least, transmit," and so on.


The above description of illustrated embodiments, including what is described in the Abstract, is not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. Although specific embodiments and examples are described herein for illustrative purposes, various equivalent modifications can be made without departing from the spirit and scope of the disclosure, as will be recognized by those skilled in the relevant art. The teachings provided herein of the various embodiments can be applied to other portable and/or wearable electronic devices, not necessarily the exemplary wearable electronic devices generally described above.


For instance, the foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, schematics, and examples. Insofar as such block diagrams, schematics, and examples contain one or more functions and/or operations, it will be understood by those skilled in the art that each function and/or operation within such block diagrams, schematics, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. In one embodiment, the present subject matter may be implemented via Application Specific Integrated Circuits (ASICs). However, those skilled in the art will recognize that the embodiments disclosed herein, in whole or in part, can be equivalently implemented in standard integrated circuits, as one or more computer programs executed by one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs executed by one or more controllers (e.g., microcontrollers), as one or more programs executed by one or more processors (e.g., microprocessors, central processing units, graphics processing units), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one of ordinary skill in the art in light of the teachings of this disclosure.


When logic is implemented as software and stored in memory, logic or information can be stored on any computer-readable medium for use by or in connection with any processor-related system or method. In the context of this disclosure, a memory is a computer-readable medium that is an electronic, magnetic, optical, or other physical device or means that contains or stores a computer and/or processor program. Logic and/or information can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions associated with the logic and/or information.


In the context of this specification, a "non-transitory computer-readable medium" can be any element that can store the program associated with logic and/or information for use by or in connection with the instruction execution system, apparatus, and/or device. The computer-readable medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: a portable computer diskette (magnetic, compact flash card, secure digital, or the like), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM, EEPROM, or Flash memory), a portable compact disc read-only memory (CDROM), digital tape, and other non-transitory media.


The various embodiments described above can be combined to provide further embodiments. To the extent that they are not inconsistent with the specific teachings and definitions herein, all of the U.S. patents, U.S. patent application publications, U.S. patent applications, foreign patents, foreign patent applications and non-patent publications referred to in this specification and/or listed in the Application Data Sheet, including but not limited to U.S. Provisional Patent Application No. 61/752,226, filed Jan. 14, 2013, U.S. Provisional Patent Application Ser. No. 61/768,322, U.S. Provisional Patent Application Ser. No. 61/771,500, U.S. Provisional Patent Application Ser. No. 61/857,105, U.S. Provisional Patent Application Ser. No. 61/860,063, U.S. Provisional Patent Application Ser. No. 61/822,740, U.S. Provisional Patent Application Ser. No. 61/866,960, U.S. Provisional Patent Application Ser. No. 61/869,526, U.S. Provisional Patent Application Ser. No. 61/874,846, U.S. Provisional Patent Application Ser. No. 61/872,569, U.S. Provisional Patent Application Ser. No. 61/881,064, U.S. Provisional Patent Application Ser. No. 61/894,263, U.S. Provisional Patent Application Ser. No. 61/903,238, U.S. Provisional Patent Application Ser. No. 61/909,786, and/or U.S. Provisional Patent Application Ser. No. 61/915,338, are incorporated herein by reference, in their entirety. Aspects of the embodiments can be modified, if necessary, to employ systems, circuits and concepts of the various patents, applications and publications to provide yet further embodiments.


These and other changes can be made to the embodiments in light of the above-detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.

Claims
  • 1. A wearable muscle interface device that in use interacts with content displayed on a wearable head-mounted display, the wearable muscle interface device comprising:
    a plurality of muscle activity sensors to be worn on an arm of a user, the muscle activity sensors responsive to electrical signals generated by muscles in the arm of the user;
    at least one accelerometer;
    a processor configured to:
      generate, based on at least the electrical signals detected by the muscle activity sensors and motion of the arm of the user detected by the at least one accelerometer, a first interpretation of a gesture performed by the user and a second interpretation of the gesture performed by the user, and
      generate at least one control signal to request input from the user to select one of the first interpretation of the gesture or the second interpretation of the gesture, wherein the first interpretation of the gesture and the second interpretation of the gesture represent different gestures selected by the processor from a set of at least three defined gestures;
    a transmitter communicatively coupled to the plurality of muscle activity sensors, wherein in use the transmitter transmits at least one signal from the wearable muscle interface device directly to a receiver on the wearable head-mounted display based on the signals detected by the muscle activity sensors;
    wherein the at least one signal transmitted, in use, from the wearable muscle interface device directly to the receiver on the wearable head-mounted display effects at least one interaction with content displayed on the wearable head-mounted display, including requesting the input from the user to select one of the first interpretation of the gesture performed by the user or the second interpretation of the gesture performed by the user.
  • 2. The wearable muscle interface device of claim 1, wherein the processor interprets the signals detected by the muscle activity sensors as a gesture, wherein the processor is communicatively coupled in between the transmitter and the plurality of muscle activity sensors, and wherein the at least one signal that, in use, is transmitted from the wearable muscle interface device is based on the gesture interpreted by the processor of the wearable muscle interface device.
  • 3. The wearable muscle interface device of claim 1, wherein the wearable head-mounted display includes a processor communicatively coupled to the receiver of the wearable head-mounted display, and wherein the at least one signal that, in use, is transmitted from the wearable muscle interface device to the wearable head-mounted display is interpreted as a gesture by the processor of the wearable head-mounted display.
  • 4. The wearable muscle interface device of claim 1, further comprising a haptic feedback module that in use provides haptic feedback to the user, the haptic feedback module including a vibratory motor.
  • 5. The wearable muscle interface device of claim 1, wherein the plurality of muscle activity sensors includes at least one muscle activity sensor selected from the group consisting of: an electromyographic (EMG) sensor and a mechanomyographic (MMG) sensor.
  • 6. The wearable muscle interface device of claim 1, wherein the at least one accelerometer detects signals generated by motion of the arm of the user, the at least one accelerometer communicatively coupled to the transmitter, and wherein in use the at least one signal transmitted from the transmitter of the wearable muscle interface device directly to the receiver on the wearable head-mounted display is based on both the signals detected by the muscle activity sensors and the signals detected by the at least one accelerometer.
  • 7. The wearable muscle interface device of claim 1, wherein the transmitter includes a wireless transmitter.
  • 8. A wearable system that in use provides hands-free access to and control of a portable electronic display, the wearable system comprising:
    i) a wearable muscle interface device comprising:
      a plurality of muscle activity sensors to be worn on an arm of a user, the muscle activity sensors responsive to electrical signals generated by muscles in the arm of the user;
      at least one accelerometer;
      a processor configured to:
        generate, based on at least the electrical signals detected by the muscle activity sensors and motion of the arm of the user detected by the at least one accelerometer, a first interpretation of a gesture performed by the user and a second interpretation of the gesture performed by the user, and
        generate at least one control signal to request input from the user to select one of the first interpretation of the gesture or the second interpretation of the gesture, wherein the first interpretation of the gesture and the second interpretation of the gesture represent different gestures selected by the processor from a set of at least three defined gestures;
      a transmitter communicatively coupled to the plurality of muscle activity sensors, wherein in use the transmitter transmits at least one signal from the wearable muscle interface device based on the signals detected by the muscle activity sensors; and
    ii) a wearable head-mounted display comprising:
      at least one display screen to be worn on a head of the user, the at least one display screen arranged to be positioned in front of at least one eye of the user when worn on the head of the user;
      a receiver communicatively coupled to the at least one display screen, wherein in use the receiver directly receives the at least one signal transmitted from the transmitter of the wearable muscle interface device; and
      a processor communicatively coupled to the receiver and to the at least one display screen, wherein in use the at least one signal received directly from the transmitter of the wearable muscle interface device by the receiver of the wearable head-mounted display effects control of at least one function of the wearable head-mounted display, including requesting the input from the user to select one of the first interpretation of the gesture performed by the user or the second interpretation of the gesture performed by the user.
  • 9. The wearable system of claim 8, wherein the transmitter of the wearable muscle interface device includes a wireless transmitter and the receiver of the wearable head-mounted display includes a wireless receiver.
  • 10. The wearable system of claim 8, wherein the processor interprets the signals detected by the muscle activity sensors as a gesture, wherein the processor of the wearable muscle interface device is communicatively coupled in between the transmitter and the plurality of muscle activity sensors, and wherein the at least one signal that, in use, is transmitted from the wearable muscle interface device is based on the gesture interpreted by the processor of the wearable muscle interface device.
  • 11. The wearable system of claim 8, wherein the plurality of muscle activity sensors includes at least one muscle activity sensor selected from the group consisting of: an electromyographic (EMG) sensor and a mechanomyographic (MMG) sensor.
  • 12. The wearable system of claim 8, wherein the at least one accelerometer detects signals generated by motion of the arm of the user, the at least one accelerometer communicatively coupled to the transmitter, and wherein in use the at least one signal transmitted by the transmitter of the wearable muscle interface device is based on both the signals detected by the muscle activity sensors and the signals detected by the at least one accelerometer.
  • 13. A method of using a wearable system to achieve hands-free access to and control of a portable electronic display, wherein the wearable system includes a wearable muscle interface device, at least one accelerometer, a processor, and a wearable head-mounted display, the method comprising:
    detecting muscle activity corresponding to a physical gesture performed by a user of the wearable system by at least one muscle activity sensor of the wearable muscle interface device;
    generating, based on at least the electrical signals detected by the at least one muscle activity sensor and motion of the arm of the user detected by the at least one accelerometer, a first interpretation of a gesture performed by the user and a second interpretation of the gesture performed by the user;
    generating at least one control signal to request input from the user to select one of the first interpretation of the gesture or the second interpretation of the gesture, wherein the first interpretation of the gesture and the second interpretation of the gesture represent different gestures selected by the processor from a set of at least three defined gestures;
    transmitting at least one signal from the wearable muscle interface device by a transmitter of the wearable muscle interface device based at least in part on the muscle activity detected by at least one muscle activity sensor of the wearable muscle interface device;
    receiving the at least one signal directly from the wearable muscle interface device by a receiver of the wearable head-mounted display;
    processing the at least one signal by a processor of the wearable head-mounted display; and
    effecting at least one interaction between the user and the wearable head-mounted display by the processor of the wearable head-mounted display based on the processing of the at least one signal by the processor of the wearable head-mounted display, including requesting the input from the user to select one of the first interpretation of the gesture performed by the user or the second interpretation of the gesture performed by the user.
  • 14. The method of claim 13, further comprising: in response to detecting muscle activity corresponding to a physical gesture performed by a user of the wearable system by at least one muscle activity sensor of the wearable muscle interface device, processing the detected muscle activity by the processor of the wearable muscle interface device, and wherein transmitting at least one signal from the wearable muscle interface device by a transmitter of the wearable muscle interface device based at least in part on the muscle activity detected by at least one muscle activity sensor of the wearable muscle interface device includes transmitting at least one signal from the wearable muscle interface device by the transmitter of the wearable muscle interface device based at least in part on processing the detected muscle activity by the processor of the wearable muscle interface device.
  • 15. The method of claim 13, further comprising: detecting motion of the wearable muscle interface device corresponding to the physical gesture performed by the user of the wearable system by the at least one accelerometer of the wearable muscle interface device, and wherein transmitting at least one signal from the wearable muscle interface device by a transmitter of the wearable muscle interface device based at least in part on the muscle activity detected by at least one muscle activity sensor of the wearable muscle interface device includes transmitting at least one signal from the wearable muscle interface device by the transmitter of the wearable muscle interface device based on both the muscle activity detected by at least one muscle activity sensor of the wearable muscle interface device and the motion detected by at least one accelerometer of the wearable muscle interface device.
  • 16. The method of claim 13, wherein: transmitting at least one signal from the wearable muscle interface device by a transmitter of the wearable muscle interface device includes wirelessly transmitting at least one signal from the wearable muscle interface device by a wireless transmitter of the wearable muscle interface device; and receiving the at least one signal directly from the wearable muscle interface device by a receiver of the wearable head-mounted display includes wirelessly receiving the at least one signal directly from the wearable muscle interface device by a wireless receiver of the wearable head-mounted display.
  • 17. The method of claim 13, wherein a haptic feedback module provides haptic feedback to the user.
  • 18. The method of claim 17, wherein the haptic feedback module includes a vibratory motor.
  • 19. The method of claim 13, wherein the at least one muscle activity sensor comprises a muscle activity sensor selected from the group consisting of: an electromyographic (EMG) sensor and a mechanomyographic (MMG) sensor.
  • 20. The method of claim 13, wherein the transmitter of the wearable muscle interface device includes a wireless transmitter and the receiver of the wearable head-mounted display includes a wireless receiver.
US Referenced Citations (357)
Number Name Date Kind
1411995 Dull Apr 1922 A
3408133 Lee Oct 1968 A
3620208 Higley et al. Nov 1971 A
3712716 Cornsweet et al. Jan 1973 A
3880146 Everett et al. Apr 1975 A
4602639 Hoogendoorn et al. Jul 1986 A
4705408 Jordi Nov 1987 A
4817064 Milles Mar 1989 A
4978213 El Hage Dec 1990 A
5003978 Dunseath, Jr. Apr 1991 A
D322227 Warhol Dec 1991 S
5081852 Cox Jan 1992 A
5103323 Magarinos et al. Apr 1992 A
5231674 Cleveland et al. Jul 1993 A
5251189 Thorp Oct 1993 A
D348660 Parsons Jul 1994 S
5445869 Ishikawa et al. Aug 1995 A
5467104 Furness, III et al. Nov 1995 A
5482051 Reddy et al. Jan 1996 A
5589956 Morishima et al. Dec 1996 A
5596339 Furness, III et al. Jan 1997 A
5605059 Woodward Feb 1997 A
5683404 Johnson Nov 1997 A
5742421 Wells et al. Apr 1998 A
6008781 Furness, III et al. Dec 1999 A
6027216 Guyton et al. Feb 2000 A
6032530 Hock Mar 2000 A
D422617 Simioni Apr 2000 S
6184847 Fateh et al. Feb 2001 B1
6236476 Son et al. May 2001 B1
6238338 DeLuca et al. May 2001 B1
6244873 Hill et al. Jun 2001 B1
6317103 Furness, III et al. Nov 2001 B1
6377277 Yamamoto Apr 2002 B1
D459352 Giovanniello Jul 2002 S
6487906 Hock Dec 2002 B1
6510333 Licata et al. Jan 2003 B1
6527711 Stivoric et al. Mar 2003 B1
6619836 Silvant et al. Sep 2003 B1
6639570 Furness, III et al. Oct 2003 B2
6720984 Jorgensen et al. Apr 2004 B1
6743982 Biegelsen et al. Jun 2004 B2
6807438 Brun Del Re et al. Oct 2004 B1
D502661 Rapport Mar 2005 S
D502662 Rapport Mar 2005 S
6865409 Getsla et al. Mar 2005 B2
D503646 Rapport Apr 2005 S
6880364 Vidolin et al. Apr 2005 B1
6927343 Watanabe et al. Aug 2005 B2
6965842 Rekimoto Nov 2005 B2
6972734 Ohshima et al. Dec 2005 B1
6984208 Zheng Jan 2006 B2
7022919 Brist et al. Apr 2006 B2
7086218 Pasach Aug 2006 B1
D535401 Travis et al. Jan 2007 S
7173437 Hervieux et al. Feb 2007 B2
7209114 Radley-Smith Apr 2007 B2
D543212 Marks May 2007 S
7265298 Maghribi et al. Sep 2007 B2
7271774 Puuri Sep 2007 B2
7333090 Tanaka et al. Feb 2008 B2
7450107 Radley-Smith Nov 2008 B2
7473888 Wine Jan 2009 B2
7491892 Wagner et al. Feb 2009 B2
7517725 Reis Apr 2009 B2
7558622 Tran Jul 2009 B2
7596393 Jung et al. Sep 2009 B2
7618260 Daniel et al. Nov 2009 B2
7636549 Ma et al. Dec 2009 B2
7640007 Chen et al. Dec 2009 B2
7660126 Cho et al. Feb 2010 B2
7684105 Lamontagne et al. Mar 2010 B2
7747113 Mukawa et al. Jun 2010 B2
7773111 Cleveland et al. Aug 2010 B2
7809435 Ettare et al. Oct 2010 B1
7844310 Anderson Nov 2010 B2
D628616 Yuan Dec 2010 S
7850306 Uusitalo et al. Dec 2010 B2
7870211 Pascal et al. Jan 2011 B2
D633939 Puentes et al. Mar 2011 S
D634771 Fuchs Mar 2011 S
7925100 Howell et al. Apr 2011 B2
7948763 Chuang May 2011 B2
D640314 Yang Jun 2011 S
D643428 Janky et al. Aug 2011 S
D646192 Woode Oct 2011 S
D649177 Cho et al. Nov 2011 S
8054061 Prance et al. Nov 2011 B2
D654622 Hsu Feb 2012 S
8120828 Schwerdtner Feb 2012 B2
8170656 Tan et al. May 2012 B2
8179604 Prada Gomez et al. May 2012 B1
8188937 Amafuji et al. May 2012 B1
D661613 Demeglio Jun 2012 S
8203502 Chi et al. Jun 2012 B1
8207473 Axisa et al. Jun 2012 B2
8212859 Tang et al. Jul 2012 B2
D667482 Healy et al. Sep 2012 S
D669522 Klinar et al. Oct 2012 S
D669523 Wakata et al. Oct 2012 S
D671590 Klinar et al. Nov 2012 S
8355671 Kramer et al. Jan 2013 B2
8389862 Arora et al. Mar 2013 B2
8427977 Workman et al. Apr 2013 B2
D682343 Waters May 2013 S
D682727 Bulgari May 2013 S
8447704 Tan et al. May 2013 B2
D685019 Li Jun 2013 S
8467270 Gossweiler, III et al. Jun 2013 B2
8469741 Oster et al. Jun 2013 B2
D687087 Iurilli Jul 2013 S
D689862 Liu Sep 2013 S
D692941 Klinar et al. Nov 2013 S
8591411 Banet et al. Nov 2013 B2
D695333 Farnam et al. Dec 2013 S
D695454 Moore Dec 2013 S
8620361 Bailey et al. Dec 2013 B2
8624124 Koo et al. Jan 2014 B2
8634119 Bablumyan et al. Jan 2014 B2
D701555 Markovitz et al. Mar 2014 S
8666212 Amirparviz Mar 2014 B1
8702629 Giuffrida et al. Apr 2014 B2
8704882 Turner Apr 2014 B2
D704248 Dichiara May 2014 S
8777668 Ikeda et al. Jul 2014 B2
D716457 Brefka et al. Oct 2014 S
D717685 Bailey et al. Nov 2014 S
8879276 Wang Nov 2014 B2
8883287 Boyce et al. Nov 2014 B2
8895865 Lenahan et al. Nov 2014 B2
D719568 Heinrich et al. Dec 2014 S
D719570 Heinrich et al. Dec 2014 S
8912094 Koo et al. Dec 2014 B2
8922481 Kauffmann et al. Dec 2014 B1
D723093 Li Feb 2015 S
8954135 Yuen et al. Feb 2015 B2
D724647 Rohrbach Mar 2015 S
8970571 Wong et al. Mar 2015 B1
8971023 Olsson et al. Mar 2015 B2
9018532 Wesselmann et al. Apr 2015 B2
9086687 Park et al. Jul 2015 B2
D736664 Paradise et al. Aug 2015 S
D738373 Davies et al. Sep 2015 S
9135708 Ebisawa Sep 2015 B2
9146730 Lazar Sep 2015 B2
D741855 Park et al. Oct 2015 S
D742272 Bailey et al. Nov 2015 S
D742874 Cheng et al. Nov 2015 S
D743963 Osterhout Nov 2015 S
9211417 Heldman et al. Dec 2015 B2
D747714 Erbeus Jan 2016 S
D747759 Ho Jan 2016 S
D750623 Park et al. Mar 2016 S
D751065 Magi Mar 2016 S
D756359 Bailey et al. May 2016 S
D758476 Ho Jun 2016 S
D760313 Ho et al. Jun 2016 S
9367139 Ataee et al. Jun 2016 B2
9372535 Bailey et al. Jun 2016 B2
9393418 Giuffrida et al. Jul 2016 B2
9418927 Axisa et al. Aug 2016 B2
D766895 Choi Sep 2016 S
9439566 Arne et al. Sep 2016 B2
D768627 Rochat et al. Oct 2016 S
9472956 Michaelis et al. Oct 2016 B2
9477313 Mistry et al. Oct 2016 B2
D771735 Lee et al. Nov 2016 S
9529434 Choi et al. Dec 2016 B2
D780828 Bonaventura et al. Mar 2017 S
D780829 Bonaventura et al. Mar 2017 S
10528135 Bailey et al. Jan 2020 B2
20010033402 Popovich Oct 2001 A1
20020003627 Rieder Jan 2002 A1
20020030636 Richards Mar 2002 A1
20020032386 Sackner et al. Mar 2002 A1
20020077534 DuRousseau Jun 2002 A1
20020120916 Snider, Jr. Aug 2002 A1
20030036691 Stanaland et al. Feb 2003 A1
20030051505 Robertson et al. Mar 2003 A1
20030144586 Tsubata Jul 2003 A1
20040073104 Brun del Re et al. Apr 2004 A1
20040194500 Rapport Oct 2004 A1
20040210165 Marmaropoulos et al. Oct 2004 A1
20050005637 Rapport Jan 2005 A1
20050012715 Ford Jan 2005 A1
20050070227 Shen et al. Mar 2005 A1
20050119701 Lauter et al. Jun 2005 A1
20050177038 Kolpin et al. Aug 2005 A1
20060037359 Stinespring Feb 2006 A1
20060061544 Min et al. Mar 2006 A1
20060132705 Li Jun 2006 A1
20060238707 Elvesjo et al. Oct 2006 A1
20070078308 Daly Apr 2007 A1
20080136775 Conant Jun 2008 A1
20090007597 Hanevold Jan 2009 A1
20090031757 Harding Feb 2009 A1
20090040016 Ikeda Feb 2009 A1
20090051544 Niknejad Feb 2009 A1
20090102580 Uchaykin Apr 2009 A1
20090109241 Tsujimoto Apr 2009 A1
20090147004 Ramon et al. Jun 2009 A1
20090179824 Tsujimoto et al. Jul 2009 A1
20090189867 Krah et al. Jul 2009 A1
20090207464 Wiltshire et al. Aug 2009 A1
20090251407 Flake et al. Oct 2009 A1
20090318785 Ishikawa et al. Dec 2009 A1
20090322653 Putilin et al. Dec 2009 A1
20090327171 Tan et al. Dec 2009 A1
20100041974 Ting et al. Feb 2010 A1
20100142015 Kuwahara et al. Jun 2010 A1
20100149073 Chaum et al. Jun 2010 A1
20100150415 Atkinson et al. Jun 2010 A1
20100280628 Sankai Nov 2010 A1
20100293115 Seyed Momen Nov 2010 A1
20100317958 Beck et al. Dec 2010 A1
20110018754 Tojima et al. Jan 2011 A1
20110072510 Cheswick Mar 2011 A1
20110134026 Kang et al. Jun 2011 A1
20110166434 Gargiulo Jul 2011 A1
20110172503 Knepper et al. Jul 2011 A1
20110181527 Capela et al. Jul 2011 A1
20110213278 Horak et al. Sep 2011 A1
20110224556 Moon et al. Sep 2011 A1
20110224564 Moon et al. Sep 2011 A1
20120002256 Lacoste et al. Jan 2012 A1
20120029322 Wartena et al. Feb 2012 A1
20120051005 Vanfleteren et al. Mar 2012 A1
20120053439 Ylostalo et al. Mar 2012 A1
20120101357 Hoskuldsson et al. Apr 2012 A1
20120139817 Freeman Jun 2012 A1
20120157789 Kangas et al. Jun 2012 A1
20120165695 Kidmose et al. Jun 2012 A1
20120182309 Griffin et al. Jul 2012 A1
20120188158 Tan et al. Jul 2012 A1
20120203076 Fatta et al. Aug 2012 A1
20120209134 Morita et al. Aug 2012 A1
20120226130 De Graff et al. Sep 2012 A1
20120249797 Haddick Oct 2012 A1
20120265090 Fink et al. Oct 2012 A1
20120293548 Perez et al. Nov 2012 A1
20120302858 Kidmose et al. Nov 2012 A1
20120323521 De Foras et al. Dec 2012 A1
20130005303 Song et al. Jan 2013 A1
20130016292 Miao et al. Jan 2013 A1
20130016413 Saeedi et al. Jan 2013 A1
20130020948 Han et al. Jan 2013 A1
20130027341 Mastandrea Jan 2013 A1
20130080794 Hsieh Mar 2013 A1
20130123666 Giuffrida et al. May 2013 A1
20130127708 Jung et al. May 2013 A1
20130135722 Yokoyama May 2013 A1
20130165813 Chang et al. Jun 2013 A1
20130191741 Dickinson et al. Jul 2013 A1
20130198694 Rahman Aug 2013 A1
20130215235 Russell Aug 2013 A1
20130222384 Futterer Aug 2013 A1
20130265229 Forutanpour et al. Oct 2013 A1
20130265437 Thorn Oct 2013 A1
20130271292 McDermott Oct 2013 A1
20130285901 Lee et al. Oct 2013 A1
20130293580 Spivack Nov 2013 A1
20130312256 Wesselmann et al. Nov 2013 A1
20130317648 Assad Nov 2013 A1
20130332196 Pinkser Dec 2013 A1
20130335302 Crane et al. Dec 2013 A1
20140020945 Hurwitz et al. Jan 2014 A1
20140028539 Newham et al. Jan 2014 A1
20140028546 Jeon et al. Jan 2014 A1
20140045547 Singamsetty et al. Feb 2014 A1
20140049417 Abdurrahman et al. Feb 2014 A1
20140094675 Luna et al. Apr 2014 A1
20140121471 Walker May 2014 A1
20140122958 Greeneberg et al. May 2014 A1
20140194062 Palin et al. Jul 2014 A1
20140198034 Bailey et al. Jul 2014 A1
20140202643 Hikmet et al. Jul 2014 A1
20140204455 Popovich et al. Jul 2014 A1
20140226193 Sun Aug 2014 A1
20140232651 Kress et al. Aug 2014 A1
20140236031 Banet et al. Aug 2014 A1
20140249397 Lake et al. Sep 2014 A1
20140257141 Giuffrida et al. Sep 2014 A1
20140285326 Luna et al. Sep 2014 A1
20140285429 Simmons Sep 2014 A1
20140299362 Park et al. Oct 2014 A1
20140334083 Bailey Nov 2014 A1
20140334653 Luna et al. Nov 2014 A1
20140337861 Chang et al. Nov 2014 A1
20140340857 Hsu et al. Nov 2014 A1
20140349257 Connor Nov 2014 A1
20140354528 Laughlin et al. Dec 2014 A1
20140354529 Laughlin et al. Dec 2014 A1
20140364703 Kim et al. Dec 2014 A1
20140368896 Nakazono et al. Dec 2014 A1
20140375465 Fenuccio et al. Dec 2014 A1
20150011857 Henson et al. Jan 2015 A1
20150025355 Bailey et al. Jan 2015 A1
20150036221 Stephenson Feb 2015 A1
20150051470 Bailey et al. Feb 2015 A1
20150057506 Luna et al. Feb 2015 A1
20150057770 Bailey et al. Feb 2015 A1
20150065840 Bailey Mar 2015 A1
20150084860 Aleem et al. Mar 2015 A1
20150106052 Balakrishnan et al. Apr 2015 A1
20150109202 Ataee et al. Apr 2015 A1
20150124566 Lake et al. May 2015 A1
20150141784 Morun et al. May 2015 A1
20150148641 Morun et al. May 2015 A1
20150160621 Yilmaz Jun 2015 A1
20150182113 Utter, II Jul 2015 A1
20150182130 Utter, II Jul 2015 A1
20150182163 Utter Jul 2015 A1
20150182164 Utter, II Jul 2015 A1
20150185838 Camacho-Perez et al. Jul 2015 A1
20150186609 Utter, II Jul 2015 A1
20150205126 Schowengerdt Jul 2015 A1
20150205134 Bailey et al. Jul 2015 A1
20150216475 Luna et al. Aug 2015 A1
20150230756 Luna et al. Aug 2015 A1
20150234426 Bailey et al. Aug 2015 A1
20150237716 Su et al. Aug 2015 A1
20150261306 Lake Sep 2015 A1
20150277575 Ataee et al. Oct 2015 A1
20150296553 DiFranco et al. Oct 2015 A1
20150325202 Lake et al. Nov 2015 A1
20150362734 Moser et al. Dec 2015 A1
20150370333 Ataee et al. Dec 2015 A1
20150378161 Bailey et al. Dec 2015 A1
20150378162 Bailey et al. Dec 2015 A1
20150378164 Bailey et al. Dec 2015 A1
20160020500 Matsuda Jan 2016 A1
20160033771 Tremblay et al. Feb 2016 A1
20160150636 Otsubo May 2016 A1
20160156762 Bailey et al. Jun 2016 A1
20160199699 Klassen Jul 2016 A1
20160202081 Debieuvre et al. Jul 2016 A1
20160238845 Alexander et al. Aug 2016 A1
20160274365 Bailey et al. Sep 2016 A1
20160274758 Bailey Sep 2016 A1
20160309249 Wu et al. Oct 2016 A1
20160313899 Noel Oct 2016 A1
20160327796 Bailey et al. Nov 2016 A1
20160327797 Bailey et al. Nov 2016 A1
20160349514 Alexander et al. Dec 2016 A1
20160349515 Alexander et al. Dec 2016 A1
20160349516 Alexander et al. Dec 2016 A1
20160377865 Alexander et al. Dec 2016 A1
20160377866 Alexander et al. Dec 2016 A1
20170068095 Holland et al. Mar 2017 A1
20170097753 Bailey et al. Apr 2017 A1
20170115483 Aleem et al. Apr 2017 A1
20170153701 Mahon et al. Jun 2017 A1
20170205876 Vidal et al. Jul 2017 A1
20170212290 Alexander et al. Jul 2017 A1
20170212349 Bailey et al. Jul 2017 A1
20170219829 Bailey Aug 2017 A1
20170299956 Holland et al. Oct 2017 A1
Foreign Referenced Citations (10)
Number Date Country
1 412 278 Oct 1995 DE
0 301 790 Feb 1989 EP
S61-198892 Sep 1986 JP
2009-050679 Mar 2009 JP
2013-160905 Aug 2013 JP
10-2012-0094870 Aug 2012 KR
10-2012-0097997 Sep 2012 KR
2011070554 Jun 2011 WO
2014155288 Oct 2014 WO
2015123775 Aug 2015 WO
Non-Patent Literature Citations (91)
Entry
Non-Final Office Action received for U.S. Appl. No. 14/155,087 dated Mar. 31, 2015, 22 pages.
Final Office Action received for U.S. Appl. No. 14/155,087 dated Jul. 20, 2015, 27 pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,087 dated Feb. 17, 2016, 26 pages.
Final Office Action received for U.S. Appl. No. 14/155,087 dated Jul. 8, 2016, 27 pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,087 dated Aug. 16, 2016, 28 pages.
Final Office Action received for U.S. Appl. No. 14/155,087 dated Dec. 16, 2016, 32 pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,087 dated Aug. 7, 2017, 28 pages.
Final Office Action received for U.S. Appl. No. 14/155,087 dated Nov. 27, 2017, 40 pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,107 dated Mar. 31, 2015, 26 pages.
Final Office Action received for U.S. Appl. No. 14/155,107 dated Jul. 16, 2015, 28 pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,107 dated Feb. 11, 2016, 42 pages.
Final Office Action received for U.S. Appl. No. 14/155,107 dated Jul. 8, 2016, 31 pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,107 dated Aug. 17, 2016, 37 pages.
Final Office Action received for U.S. Appl. No. 14/155,107 dated Dec. 19, 2016, 35 pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,107 dated Aug. 7, 2017, 34 pages.
Final Office Action received for U.S. Appl. No. 14/155,107 dated Nov. 27, 2017, 44 pages.
Non-Final Office Action received for U.S. Appl. No. 14/155,107 dated Jul. 13, 2018, 45 pages.
Final Office Action received for U.S. Appl. No. 14/155,107 dated Jan. 17, 2019, 46 pages.
Notice of Allowance received for U.S. Appl. No. 14/155,107 dated Aug. 30, 2019, 16 pages.
Bailey et al., “Muscle Interface Device and Method for Interacting With Content Displayed on Wearable Head Mounted Displays,” Preliminary Amendment filed Jan. 28, 2014, for U.S. Appl. No. 14/155,087, 8 pages.
Costanza et al., “EMG as a Subtle Input Interface for Mobile Computing,” Mobile HCI 2004, LNCS 3160, edited by S. Brewster and M. Dunlop, Springer-Verlag Berlin Heidelberg, pp. 426-430, 2004.
Costanza et al., “Toward Subtle Intimate Interfaces for Mobile Devices Using an EMG Controller,” CHI 2005, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 481-489, 2005.
Ghasemzadeh et al., “A Body Sensor Network With Electromyogram and Inertial Sensors: Multimodal Interpretation of Muscular Activities,” IEEE Transactions on Information Technology in Biomedicine, vol. 14, No. 2, pp. 198-206, Mar. 2010.
International Search Report and Written Opinion, dated May 16, 2014, for corresponding International Application No. PCT/US2014/017799, 11 pages.
Lake et al., “Method and Apparatus for Analyzing Capacitive EMG and IMU Sensor Signals for Gesture Control,” U.S. Appl. No. 14/186,878, filed Feb. 21, 2014, 29 pages.
Lake et al., "Method and Apparatus for Analyzing Capacitive EMG and IMU Sensor Signals for Gesture Control," Preliminary Amendment filed May 9, 2014, for U.S. Appl. No. 14/186,878, 9 pages.
Lake et al., “Methods and Devices That Combine Muscle Activity Sensor Signals and Inertial Sensor Signals for Gesture-Based Control,” U.S. Appl. No. 14/186,889, filed Feb. 21, 2014, 58 pages.
Morris et al., “Emerging Input Technologies for Always-Available Mobile Interaction,” Foundations and Trends in Human-Computer Interaction 4(4):245-316, 2011. (74 total pages).
Naik et al., “Real-Time Hand Gesture Identification for Human Computer Interaction Based on ICA of Surface Electromyogram,” IADIS International Conference Interfaces and Human Computer Interaction 2007, 8 pages.
Picard et al., "Affective Wearables," Proceedings of the IEEE 1st International Symposium on Wearable Computers, ISWC, Cambridge, MA, USA, Oct. 13-14, 1997, pp. 90-97.
Rekimoto, "GestureWrist and GesturePad: Unobtrusive Wearable Interaction Devices," ISWC '01 Proceedings of the 5th IEEE International Symposium on Wearable Computers, 2001, 7 pages.
Saponas et al., “Making Muscle-Computer Interfaces More Practical,” CHI 2010, Atlanta, Georgia, USA, Apr. 10-15, 2010, 4 pages.
Xiong et al., “A Novel HCI based on EMG and IMU,” Proceedings of the 2011 IEEE International Conference on Robotics and Biomimetics, Phuket, Thailand, Dec. 7-11, 2011, 5 pages.
Zhang et al., “A Framework for Hand Gesture Recognition Based on Accelerometer and EMG Sensors,” IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, vol. 41, No. 6, pp. 1064-1076, Nov. 2011.
Bailey et al., “Muscle Interface Device and Method for Interacting With Content Displayed on Wearable Head Mounted Displays,” Office Action dated Mar. 31, 2015, for U.S. Appl. No. 14/155,087, 15 pages.
Bailey et al., “Muscle Interface Device and Method for Interacting With Content Displayed on Wearable Head Mounted Displays,” Office Action dated Jul. 20, 2015, for U.S. Appl. No. 14/155,087, 14 pages.
Lake et al., “Method and Apparatus for Analyzing Capacitive EMG and IMU Sensor Signals for Gesture Control,” Office Action dated Jun. 17, 2015, for U.S. Appl. No. 14/186,878, 13 pages.
Lake et al., “Method and Apparatus for Analyzing Capacitive EMG and IMU Sensor Signals for Gesture Control,” Amendment filed Aug. 21, 2015, for U.S. Appl. No. 14/186,878, 13 pages.
Bailey et al., “Muscle Interface Device and Method for Interacting With Content Displayed on Wearable Head Mounted Displays,” Amendment filed Aug. 25, 2015, for U.S. Appl. No. 14/155,087, 10 pages.
Bailey et al., “Muscle Interface Device and Method for Interacting With Content Displayed on Wearable Head Mounted Displays,” Office Action dated Feb. 17, 2016, for U.S. Appl. No. 14/155,087, 16 pages.
Lake et al., “Methods and Devices for Combining Muscle Activity Sensor Signals and Inertial Sensor Signals for Gesture-Based Control,” Office Action dated Nov. 5, 2015, for U.S. Appl. No. 14/186,889, 11 pages.
Lake et al., “Methods and Devices for Combining Muscle Activity Sensor Signals and Inertial Sensor Signals for Gesture-Based Control,” Amendment filed Jan. 8, 2016, for U.S. Appl. No. 14/186,889, 16 pages.
Lake et al., “Methods and Devices for Combining Muscle Activity Sensor Signals and Inertial Sensor Signals for Gesture-Based Control,” Office Action dated Jun. 16, 2016, for U.S. Appl. No. 14/186,889, 13 pages.
Lake et al., “Methods and Devices for Combining Muscle Activity Sensor Signals and Inertial Sensor Signals for Gesture-Based Control,” Amendment filed Jul. 13, 2016, for U.S. Appl. No. 14/186,889, 12 pages.
Bailey et al., “Muscle Interface Device and Method for Interacting With Content Displayed on Wearable Head Mounted Displays,” Amendment filed May 17, 2016, for U.S. Appl. No. 14/155,087, 13 pages.
Bailey et al., “Muscle Interface Device and Method for Interacting With Content Displayed on Wearable Head Mounted Displays,” Office Action dated Jul. 8, 2016, for U.S. Appl. No. 14/155,087, 16 pages.
Bailey et al., “Muscle Interface Device and Method for Interacting With Content Displayed on Wearable Head Mounted Displays,” Amendment filed Aug. 9, 2016, for U.S. Appl. No. 14/155,087, 8 pages.
Amitai, “P-27: A Two-Dimensional Aperture Expander for Ultra-Compact, High-Performance Head-Worn Displays,” SID Symposium Digest of Technical Papers 36(1):360-363, 2005.
Ayras et al., “Exit pupil expander with a large field of view based on diffractive optics,” Journal of the SID 17 (8):659-664, 2009.
Chellappan et al., “Laser-based displays: a review,” Applied Optics 49(25):F79-F98, 2010.
Cui et al., “Diffraction from angular multiplexing slanted volume hologram gratings,” Optik 116:118-122, 2005.
Curatu et al., “Dual Purpose Lens for an Eye-Tracked Projection Head-Mounted Display,” International Optical Design Conference 2006, SPIE-OSA 6342:63420X-1-63420X-7, 2007.
Curatu et al., “Projection-based head-mounted display with eye-tracking capabilities,” Proc. of SPIE 5875:58750J-1-58750J-9, 2005.
Essex, “Tutorial on Optomechanical Beam Steering Mechanisms,” OPTI 521 Tutorial, College of Optical Sciences, University of Arizona, 8 pages, 2006.
Fernandez et al., “Optimization of a thick polyvinyl alcohol-acrylamide photopolymer for data storage using a combination of angular and peristrophic holographic multiplexing,” Applied Optics 45(29):7661-7666, 2009.
Gourmelon et al., “Contactless sensors for Surface Electromyography,” Proceedings of the 28th IEEE EMBS Annual International Conference, New York City, NY, Aug. 30-Sep. 3, 2006, pp. 2514-2517.
Hainich et al., “Chapter 10: Near-Eye Displays,” Displays: Fundamentals & Applications, AK Peters/CRC Press, 2011, 65 pages.
Hornstein et al., “Maradin's Micro-Mirror—System Level Synchronization Notes,” SID 2012 Digest, pp. 981-984.
International Search Report, dated Jun. 8, 2016, for PCT/US2016/018293, 17 pages.
International Search Report, dated Jun. 8, 2016, for PCT/US2016/018298, 14 pages.
International Search Report, dated Jun. 8, 2016, for PCT/US2016/018299, 12 pages.
International Search Report and Written Opinion, dated Aug. 21, 2014, for corresponding International Application No. PCT/US2014/037863, 10 pages.
International Search Report and Written Opinion, dated Nov. 21, 2014, for corresponding International Application No. PCT/US2014/052143, 9 pages.
International Search Report and Written Opinion, dated Feb. 27, 2015, for corresponding International Application No. PCT/US2014/067443, 10 pages.
International Search Report and Written Opinion, dated May 27, 2015, for corresponding International Application No. PCT/US2015/015675, 9 pages.
Itoh et al., “Interaction-Free Calibration for Optical See-Through Head-Mounted Displays based on 3D Eye Localization,” 2014 IEEE Symposium on 3D User Interfaces (3DUI), pp. 75-82, 2014.
Kessler, “Optics of Near to Eye Displays (NEDs),” Presentation—Oasis 2013, Tel Aviv, Feb. 19, 2013, 37 pages.
Kress et al., “A review of head-mounted displays (HMD) technologies and applications for consumer electronics,” Proc. of SPIE 8720:87200A-1-87200A-13, 2013.
Kress et al., “Diffractive and Holographic Optics as Optical Combiners in Head Mounted Displays,” Proceedings of the 2013 ACM Conference on Pervasive and Ubiquitous Computing Adjunct Publication, pp. 1479-1482, 2013.
Kress, “Optical architectures for see-through wearable displays,” Presentation—Bay Area—SID Seminar, Apr. 30, 2014, 156 pages.
Levola, “7.1: Invited Paper: Novel Diffractive Optical Components for Near to Eye Displays,” SID Symposium Digest of Technical Papers 37(1):64-67, 2006.
Liao et al., “The Evolution of MEMS Displays,” IEEE Transactions on Industrial Electronics 56(4):1057-1065, 2009.
Lippert, "Chapter 6: Display Devices: RSD™ (Retinal Scanning Display)," The Avionics Handbook, CRC Press, 2001, 8 pages.
Majaranta et al., “Chapter 3—Eye-Tracking and Eye-Based Human-Computer Interaction,” in Advances in Physiological Computing, Springer-Verlag London, 2014, pp. 17-39.
Sato et al., "Touché: Enhancing Touch Interaction on Humans, Screens, Liquids, and Everyday Objects," CHI '12, May 5-10, 2012, Austin, Texas.
Schowengerdt et al., "Stereoscopic retinal scanning laser display with integrated focus cues for ocular accommodation," Proc. of SPIE-IS&T Electronic Imaging 5291:366-376, 2004.
Silverman et al., “58.5L: Late-News Paper: Engineering a Retinal Scanning Laser Display with Integrated Accommodative Depth Cues,” SID 03 Digest, pp. 1538-1541, 2003.
Takatsuka et al., “Retinal projection display using diffractive optical element,” Tenth International Conference on Intelligent Information Hiding and Multimedia Signal Processing, IEEE, 2014, pp. 403-406.
Ueno et al., "A Capacitive Sensor System for Measuring Laplacian Electromyogram through Cloth: A Pilot Study," Proceedings of the 29th Annual International Conference of the IEEE EMBS, Cité Internationale, Lyon, France, Aug. 23-26, 2007.
Ueno et al., “Feasibility of Capacitive Sensing of Surface Electromyographic Potential through Cloth,” Sensors and Materials 24(6):335-346, 2012.
Urey et al., “Optical performance requirements for MEMS-scanner based microdisplays,” Conf. on MOEMS and Miniaturized Systems, SPIE 4178:176-185, 2000.
Urey, “Diffractive exit-pupil expander for display applications,” Applied Optics 40(32):5840-5851, 2001.
Viirre et al., “The Virtual Retinal Display: A New Technology for Virtual Reality and Augmented Vision in Medicine,” Proc. of Medicine Meets Virtual Reality, IOS Press and Ohmsha, 1998, pp. 252-257. (6 pages).
Xu et al., “Hand Gesture Recognition and Virtual Game Control Based on 3D Accelerometer and EMG Sensors,” Proceedings of the 14th international conference on Intelligent user interfaces, Sanibel Island, Florida, Feb. 8-11, 2009, pp. 401-406.
Communication pursuant to Rule 164(1) EPC, dated Sep. 30, 2016, for corresponding EP Application No. 14753949.8, 7 pages.
Brownlee, “Finite State Machines (FSM): Finite state machines as a control technique in Artificial Intelligence (AI),” Jun. 2002, 12 pages.
International Search Report and Written Opinion dated Apr. 25, 2017 for corresponding International Application No. PCT/US2016/067246, 12 pages.
Janssen, "What is Radio Frequency (RF)?" Technopedia, 2013, retrieved from https://web.archive.org/web/20130726153946/https://www.technopedia.com/definition/5083/radio-frequency-rf, retrieved on Jul. 12, 2017, 2 pages.
Merriam-Webster, “Radio Frequencies,” 2017, retrieved from https://www.merriam-webster.com/table/collegiate/radiofre.htm, retrieved on Jul. 12, 2017, 2 pages.
Bailey et al., “Wearable Muscle Interface Systems, Devices and Methods That Interact With Content Displayed on an Electronic Display,” Amendment filed May 11, 2016, for U.S. Appl. No. 14/155,107, 15 pages.
Bailey et al., "Wearable Muscle Interface Systems, Devices and Methods That Interact With Content Displayed on an Electronic Display," Amendment filed Aug. 9, 2016, for U.S. Appl. No. 14/155,107, 8 pages.
Related Publications (1)
Number Date Country
20200159325 A1 May 2020 US
Provisional Applications (1)
Number Date Country
61752226 Jan 2013 US
Continuations (1)
Number Date Country
Parent 14155107 Jan 2014 US
Child 16696760 US