The present invention relates to systems and methods for hands-free activation/deactivation of illumination and/or other systems, in particular, head-worn illumination and/or imaging systems.
It is common for aviators, especially those operating military fixed-wing aircraft, to wear respiration systems that include a mask and for such masks to include therein control switches that can be manipulated by the wearer using his/her lip or tongue. For example, U.S. Pat. No. 7,184,903 describes a hands-free, mouth-activated switch disposed within a cup-shaped, rigid portion of a pilot's oxygen mask. Among the elements controllable by such a switch is a night vision compatible light.
While such systems are common, they do not provide complete solutions for an entire aircraft crew. For example, not all crew members may wear or even have access to oxygen masks or such masks as include these types of switches. Moreover, it is highly unusual for civilian aircraft crews to have such masks as employ mouth-activated switches. Even if the mask-fitted, mouth-activated switches are employed, use thereof demands that the mask be worn. This would not be typical of flight crew members when embarking or disembarking the aircraft. Hence, the associated controlled systems (e.g., illumination systems) are predictably unavailable for use during these activities.
Embodiments of the invention include a head-worn system having one or more illumination, communication, and/or other controlled elements electrically connected to a switch element (e.g., a piezo switch), said switch element positioned on the head-worn system so as to be overlying an area of a wearer's masseter muscle when worn on the wearer's person, and said switch element configured to be activated responsive to clenching of the wearer's jaw. In some instances, the switch (or additional switches) may be positioned in locations other than over the wearer's masseter muscles, allowing activation/deactivation by means of muscles associated with a wearer's eyebrow, temple, etc. The electrical connection may be wired or wireless. The one or more illumination elements may be supported on one or more booms attached to a frame to provide directional lighting from an area of the wearer's zygomatic bones when the head-worn system is worn on the wearer's person. Alternatively, the one or more illumination elements may be integrated as part of or attached to a headset. The switch element may be positioned at an end of a clip mounted to an earphone cup of the headset. The clip may be mounted to the earphone cup by a band about a circumference of the earphone cup. Alternatively, the clip may be mounted to the earphone cup by a mounting plate affixed to the earphone cup. The one or more illumination elements may be light emitting diodes (LEDs), for example which may emit light in more than one wavelength. Alternatively, the one or more illumination elements may be LEDs and different ones of the LEDs may emit light in different wavelengths. The head-worn system may also include one or more imaging devices.
The head-worn system may be arranged so that at least one of the one or more illumination elements is supported on a boom attached to a frame and the boom may further support at least one imaging device. The boom may be positioned relative to the frame so as to provide directional lighting from an area of the wearer's zygomatic bones when the head-worn system is worn on the wearer's person. Alternatively, at least one of the one or more illumination elements may be supported on a first boom attached to a frame and the head-worn system may further include a second boom attached to the frame, the second boom having at least one imaging device supported thereon.
The switch element may be electrically connected to the one or more illumination elements via a controller, and the controller may be configured to activate and/or deactivate the one or more illumination elements responsive to different ones of multi-actuations of the switch element.
Another embodiment of the invention provides an actuator element for a hands-free switch system, said actuator element comprising a bracket and a piezo switch attached to a moveable portion of said bracket, said bracket including a mounting portion for affixing said bracket to a head-worn unit, and said moveable portion of said bracket being hingibly coupled to the mounting portion of the bracket.
Another embodiment of the invention provides a helmet having an illumination unit and a switch element electrically coupled to said illumination unit, said switch element positioned on the helmet so as to be overlying an area of a wearer's masseter muscle when the helmet is worn on the wearer's person, and said switch element configured to be activated responsive to clenching of the wearer's jaw. The illumination unit may be pivotably connected to the helmet. The illumination unit may include one or more LEDs. And, the helmet may further include one or more imaging devices.
Another embodiment of the invention provides a hands-free switching system, having a switch configured to be worn adjacent an outside part of a wearer's face and to be activated by a jaw clench of the wearer, said switch electrically connected to provide an output to a controller that includes a processor and an associated memory, said memory storing instructions for execution by the processor, and a lighting element, wherein said instructions, when executed by said processor, cause said processor to recognize a first sequence of pulses from the switch as indicating one or a number of commands for illuminating the lighting element. The hands-free switching system may also include an imaging unit and said instructions, when executed by said processor, may cause said processor to recognize a second sequence of pulses from the switch as indicating one or a number of commands for operating the imaging unit.
Still another embodiment of the invention may provide a head-worn device having a jaw clench-actuated interface configured to operate one or more illumination and/or imaging units of the head-worn device to provide directional lighting/imaging from an area of a wearer's zygomatic bones. The illumination units may be independently adjustable light sources (e.g., LEDs) that allow for illumination of two or more areas simultaneously. The independently adjustable light sources may allow for illumination of the two or more areas at two or more separate wavelengths. The jaw clench-actuated interface may include a piezo switch positioned on a boom so as to be near the face of the wearer, overlying an area of the wearer's masseter muscle, when the head-worn device is worn by the wearer.
A further embodiment of the invention provides a head-worn interface unit having a cursor control switch for a computer system, the cursor control switch positioned on the head-worn interface unit so as to be overlying an area of a wearer's masseter muscle when worn on the wearer's person, said cursor control switch configured to be activated responsive to clenching of the wearer's jaw, and a wireless communication interface configured to communicatively couple the cursor control switch to an input of a computer system via a wireless communication protocol.
These and still more embodiments of the invention are described in detail below with reference to the accompanying drawings.
The present invention is illustrated by way of example, and not limitation, in the figures of the accompanying drawings.
Described herein are systems and methods for hands-free activation/deactivation of illumination and/or other systems, for example, head-worn illumination and/or imaging systems. These systems and methods are characterized, in part, by employing a switch element, e.g., a piezo switch, that is positioned on or near the face of a user, overlying the area of the user's masseter muscle so that clenching/flexing of the jaw activates the switch. In one embodiment, the switch element is employed in combination with a head-mounted illumination device suitable for application in a variety of contexts, including military, law enforcement, health care, and others (e.g., consumer). Unlike helmet-mounted lights, which require the user to wear a helmet in order to use them, illumination devices configured in accordance with embodiments of the present invention can be worn with or without a helmet or other eyewear, communication devices, visioning systems, etc. Such illumination devices provide directional lighting from the area of the user's zygomatic bones. Placing the light source in this vicinity reduces light-blinding of others when communicating. Additionally, the use of one, two (left-side and right-side), or more independently adjustable light sources allows for illumination of one, two, or more areas simultaneously. The use of the switch element overlying the area of the user's masseter muscle so that clenching/flexing of the jaw activates the switch allows for hands-free operation of the light sources (either individually, in combination, or collectively). Other embodiments of the invention make use of the switch element as part of other head-worn illumination, imaging, and/or communication systems.
The use of “clench interactions” has been recognized as a viable control technique. For example, subsequent to the filing of the present applicant's U.S. Provisional Application No. 62/775,482, Xu et al., “Clench Interaction: Novel Biting Input Techniques,” Proc. 2019 CHI Conference on Human Factors in Computing Systems (CHI 2019), May 4-9, 2019, Glasgow, Scotland UK, reported on the use of different levels of bite force as a means of human-computer interaction. In doing so, they reviewed the existing literature and found research exploring tongue-based interfaces and tooth-click interfaces. Bite force measures, that is, the level of force applied between one's teeth during clench interactions, were proposed as an extension to the existing work in this area.
While bite force interfaces may afford some advantages in some applications, the present invention adopts a different approach inasmuch as it relies on sensors placed outside a user's oral cavity. Such sensors are more suitable for applications in which different users may employ a common head-worn illumination/imaging device at different times, and/or where the presence of sensors inside one's mouth may be uncomfortable or impractical.
Referring to the accompanying figures, an exemplary embodiment includes a lighting element 12 attached to a headset 14 and controlled by a switch 16 positioned at the end of a clip 18 mounted to an earphone cup of the headset, so that the switch overlies the area of the wearer's masseter muscle when the headset is worn.
As should be immediately apparent from these illustrations, use of the lighting element 12 or other illumination, imaging, or communications system(s) by means of switch 16 does not require donning a mask. Instead, the lighting element 12 (or other illumination, imaging, or communications system(s)) can be controlled using switch 16 at any time the headset 14 is worn. Wearing such headsets would typically be the norm for any member of an aircraft flight or operations crew, even when embarking/disembarking the aircraft. Indeed, headsets such as the one illustrated in these figures are not restricted to use by flight/aircraft crews and may be employed by ground forces, naval/coast guard personnel, and civilians. For example, headsets such as the ones shown in the accompanying figures may be used in any of these contexts.
In various embodiments, the lighting element 12 may consist of one or more light emitting diodes (LEDs), which emit light in one or more wavelengths. Further, the lighting element may include, in addition to one or more LEDs, one or more cameras for digital still and/or video imaging. In some instances, a lighting element may be worn on one side of the headset 14 while an imaging system is worn on the opposite side, each being controlled by separate switches 16 mounted on respective opposite sides of the headset, or by a single switch 16, in which case the lighting and imaging systems are responsive (under the control of a controller) to multi-actuations of switch 16, similar to the way computer cursor control devices (e.g., touch pads, mice, etc.) may be separately responsive to single, double, triple, or other multiple clicks.
Indeed, the switch 16 may itself be used to control a cursor as part of a user-computer interface. For example, any or all of cursor type, cursor movement, and cursor selection may be controlled using a switch 16 positioned so as to be flush against the wearer's face (or nearly so), over the area of the masseter muscle so that clenching/flexing of the jaw activates the switch. Applications for such uses include computer gaming interfaces, which today commonly include head-worn communication equipment. One or more switches 16 configured in accordance with embodiments of the invention may be fitted to such headgear (either when manufactured or as an after-market addition) to provide cursor control capabilities. Conventional Bluetooth or other wired or wireless communication means may be employed to provide a connection to a console, personal computer, tablet, mobile phone, or other device that serves as the gaming or other host. The use of such human-machine interfaces may find particular application for users that have no or limited use of their hands and afford them a convenient means of interacting with a personal computer, tablet, mobile phone, or similar device.
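By way of illustration only, the following sketch shows one way clench gestures recognized by such a switch might be mapped to cursor commands forwarded to a host over whatever link is available; the gesture names, command names, and the send_to_host stub are assumptions introduced for this example rather than elements of the described embodiments.

```python
# Illustrative sketch only: the gesture-to-command mapping and the
# send_to_host() transport are assumptions, not a mapping defined by
# the specification. A real device would use a standard transport
# (e.g., a Bluetooth HID profile) in place of the stub shown here.

from typing import Callable

# Hypothetical mapping of recognized clench gestures to cursor commands.
GESTURE_TO_COMMAND = {
    "single_clench": "cursor_select",        # analogous to a left click
    "double_clench": "cursor_context_menu",  # analogous to a right click
    "clench_and_hold": "cursor_drag_mode",   # hold to drag, release to drop
}

def dispatch_gesture(gesture: str, send_to_host: Callable[[str], None]) -> None:
    """Translate a recognized clench gesture into a host cursor command."""
    command = GESTURE_TO_COMMAND.get(gesture)
    if command is not None:
        send_to_host(command)

if __name__ == "__main__":
    dispatch_gesture("double_clench", send_to_host=print)  # -> cursor_context_menu
```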
Further, the element(s) that is/are under control of switch 16 may include one or more microphones. Such microphones may be boom mounted, as shown by microphone 28 in the accompanying figures.
In general, switch 16 is a device that requires little or no mechanical displacement of a control element in order to signal or effect a change (or desired change) in state of a controlled system. One example of such a device is a piezo switch, such as the Piezo Proximity Sensor produced by Communicate AT Pty Ltd. of Dee Why, Australia. Piezo switches generally have an on/off output state responsive to electrical pulses generated by a piezoelectric element. The electrical pulse is produced when the piezoelectric element is placed under stress, for example as a result of compressive forces resulting from a wearer clenching his/her jaw so that pressure is exerted against the switch 16 at the end of clip 18. Although the pulse is produced only when the compressive force is present (e.g., when the wearer's jaw is clenched), additional circuitry may be provided so that the output state of the switch is maintained in either an “on” or an “off” state until a second actuation of the switch occurs. For example, a flip-flop may be used to maintain a switch output logic high or logic low, with state changes occurring as a result of sequential input pulses from the piezoelectric element. One advantage of such a piezo switch is that there are no moving parts (other than a front plate that must deform by a few micrometers each time a wearer's jaw is clenched) and the entire switch can be sealed against the environment, making it especially useful for marine and/or outdoor applications.
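The following minimal sketch illustrates, in software form, the latching behavior described above, in which each momentary pulse toggles a maintained on/off output; the 0.2-second lockout standing in for debounce circuitry is an assumed value, not one specified herein.

```python
# Minimal software analogue of the latching behavior described above:
# each momentary pulse from the piezo element toggles a maintained
# on/off output, with a simple lockout window standing in for debounce.
# The 0.2-second window is an illustrative assumption.

class ToggleLatch:
    def __init__(self, lockout_s: float = 0.2):
        self.state = False          # maintained "on"/"off" output
        self.lockout_s = lockout_s  # ignore pulses arriving too close together
        self._last_pulse_t = None

    def pulse(self, t: float) -> bool:
        """Register a momentary pulse at time t (seconds); return the new state."""
        if self._last_pulse_t is None or (t - self._last_pulse_t) >= self.lockout_s:
            self.state = not self.state
            self._last_pulse_t = t
        return self.state

latch = ToggleLatch()
print(latch.pulse(0.0))   # True  -- first clench turns the output on
print(latch.pulse(0.05))  # True  -- too soon; treated as bounce, state held
print(latch.pulse(1.0))   # False -- next clench turns the output off
```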
Other embodiments may employ a switch 16 that is a micro tactile switch. Although tactile switches employ mechanical elements subject to wear, for some applications they may be more appropriate than piezo switches because they provide mechanical feedback to the user (although haptic feedback incorporated with a piezo switch may also provide an acceptable level of feedback for a user and so may be incorporated in the above-described embodiment). This feedback can provide assurance that the switch 16 has been activated/deactivated. Momentary contact tactile switches may also be used, but because they require continual force (e.g., as provided by clenching one's jaw against the switch), they are best suited to applications where only a momentary or short engagement of the active element under the control of switch 16 is desired, for example, signal light flashes, burst transmissions, or other short duration applications, or where a flip flop is used to maintain an output state until a subsequent input is received, as discussed above. Other forms of switches 16 include a ribbon switch (e.g., as made by Tapeswitch Corporation of Farmingdale, NY) and conductive printed circuit board surface elements activated via carbon pucks on an overlaid keypad.
Turning now to the mounting and construction of the hands-free switch 16, several arrangements are illustrated in the accompanying figures.
The hands-free switch 16 may be fashioned as an integral component of an actuator element 42, as shown in the accompanying figures. In such an arrangement, the actuator element 42 may include a bracket having a mounting portion for affixing the actuator element to a head-worn unit and a moveable portion, hingibly coupled to the mounting portion, that carries the switch 16.
Two-piece designs such as those illustrated in the accompanying figures may also be employed for mounting the switch 16 to the head-worn unit.
Referring now to the accompanying figures, an example control arrangement for the hands-free switch 16 is illustrated.
In this example, a lighting element 12 is illustrated, however, in other instances this unit may be another form of illumination, imaging, or communications system. Associated with lighting element 12 is a slide switch or rocker switch 84 having a plurality of operation positions. According to a current operation position of switch 84, the lighting element may be “off,” “on,” or under control of switch 16. When in this third position, firmware stored in memory 88 may cause processor 86 to recognize a sequence of pulses from switch 16 as indicating one or a number of commands for lighting element 12. For example, processor 86 may be programmed to ignore single presses of switch 16 and only accept as valid command inputs two pulses of switch 16 which occur within a prescribed time of one another (i.e., a “double click” of switch 16). Alternatively, or in addition, triple clicks may be recognized as valid command inputs, which may signify a different command than a double click. Further multiple clicks and/or click-and-hold inputs may also be recognized as signifying different commands. Such multi-click commands are useful for eliminating unintentional actuations of switch 16, as may be occasioned by involuntary muscle movements or by a wearer chewing food, gum, etc., or clenching his/her jaw during a flight operation. Valid forms of commands may be used to turn on/off the lighting element 12 and/or individual LEDs thereof, adjust the intensity of one or more illuminated LEDs, or to signal other desired operations. In general, switch 16 actuation sequence timings, repetitions, and durations may each be used, individually and/or in combination to specify different command inputs for one or more controlled elements such as lighting element 12 and/or an imaging unit.
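The following sketch illustrates one possible form of such pulse-sequence recognition; the timing thresholds and returned command labels are assumptions chosen for the example, and a deployed controller would tune them for the wearer and application.

```python
# Sketch of the kind of pulse-sequence recognition described above.
# The timing thresholds and the command names are illustrative assumptions.

DOUBLE_CLICK_WINDOW_S = 0.5   # max gap between clenches in a multi-click
HOLD_THRESHOLD_S = 0.8        # minimum duration for a clench-and-hold

def classify(events):
    """Classify a list of (press_time, release_time) clench events.

    Returns one of: "ignore" (single short press), "double_click",
    "triple_click", "click_and_hold", or "unrecognized".
    """
    if not events:
        return "ignore"
    # Clench-and-hold: a single long press.
    if len(events) == 1:
        press, release = events[0]
        return "click_and_hold" if (release - press) >= HOLD_THRESHOLD_S else "ignore"
    # Multi-click: successive presses must fall within the window.
    gaps = [events[i + 1][0] - events[i][1] for i in range(len(events) - 1)]
    if all(g <= DOUBLE_CLICK_WINDOW_S for g in gaps):
        return {2: "double_click", 3: "triple_click"}.get(len(events), "unrecognized")
    return "unrecognized"

print(classify([(0.0, 0.1)]))              # ignore (single press)
print(classify([(0.0, 0.1), (0.3, 0.4)]))  # double_click
print(classify([(0.0, 1.0)]))              # click_and_hold
```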
The processor 86 may also communicate with and control other peripherals, such as a heads-up display, audio input/output unit, off-headset unit, etc. Processor 86 is a hardware-implemented module and may be a general-purpose processor, or dedicated circuitry or logic, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC), or other form of processing unit. Memory 88 may be a readable/writeable memory, such as an electrically erasable programmable read-only memory, or other storage device.
In addition to lighting, the frame 100 of the present devices provides a platform for image and/or video capture and/or projection devices, for example as may be used with a helmet-worn or other heads-up display. Further, one or more microphones may be provided integral to or on the frame and/or on a boom associated with the frame that supports the illumination units. Hands-free operation of the illumination/imaging device 90 may be facilitated using a switch element 102, e.g., a piezo switch, that is positioned on a boom 104 so as to be near the face of a user, overlying the area of the user's masseter muscle so that clenching/flexing of the jaw activates the switch.
As shown in these illustrations, individual illumination elements, e.g., light emitting diodes (LEDs) 106, are included on or in a frame 100, which is worn over the ears and behind the head, and which may include an optional retracting head strap 110 connecting the two sides of the frame. At the front of the frame are located one or more booms 108 that extend over a portion of the wearer's face, below the eyes, and terminate in the area of the zygomatic bones. Two such booms, one each on the left side and right side of the wearer's face are shown in the illustrations, however, embodiments of the invention may provide just a single such boom on one side of the wearer's face, or multiple such booms on each side of the wearer's face. For some specialized applications it may be desirable to have different numbers of booms on each side of a wearer's face. The booms may or may not contact the wearer's face and may include a rubberized or other backing to provide a comfortable surface against the wearer's cheek.
Each of the booms 108 terminates with a hinged panel 112. The hinged panels are swivelly mounted to the booms, e.g., with a piano hinge, butt hinge, barrel hinge, butterfly hinge, pivot hinge, spring hinge, or other arrangement, and may be detachable from the boom so as to be replaceable/reconfigurable. For example, different arrangements of hinged panels 112 may be adapted to carry different illumination devices, sensors, imaging devices, and/or projection devices. In some examples, hinged panels 112 may be adapted for carrying LEDs that emit light in the visible spectrum. Other forms of hinged panels 112 may be adapted to carry LEDs that emit light in other wavelengths, in addition to or in lieu of the LEDs that emit light in the visible spectrum. Still further forms of hinged panels 112 may be adapted to carry light detectors and/or imaging devices (e.g., still image and/or video cameras), in addition to or in lieu of the LEDs that emit light in the visible spectrum. Also, as discussed below, some hinged panels 112 may be adapted to carry LEDs that emit light as well as image/video projectors for use with a heads-up display or other imaging system. Although the majority of the remaining discussion focuses on hinged panels adapted to carry LEDs that emit light in the visible spectrum, this discussion applies equally to the other forms of hinged panels and associated illumination, projection, and imaging devices described herein. Cabling for the illumination devices and other sensors, etc. may be provided by wiring run through hollow channels within the hinged panels, booms and harness (not shown). In instances where the hinged panels 112 are detachable from the booms 108, electrical contacts may be placed on both sides of the hinged panel-boom junction so as to provide electrical continuity and avoid the need for separately coupled wirings (although such wired connections may be used).
In some instances, the illumination may be provided by fiber optic cables terminating (e.g., with or without lens systems) at the hinged panels, in which case the illumination source may be positioned remotely from the hinged panel, for example worn elsewhere on the person of the user such as in a shoulder harness or utility belt. This would allow for larger power sources and illumination sources of significant luminance, while still providing the directional control afforded by the use of the harness and boom system of the present invention. Likewise, image capture components, such as imaging systems and storage devices could be worn on a shoulder harness or belt and the information obtained by image sensors positioned in the hinged panels 112 at the ends of booms 108 conveyed to such systems through the use of fiber optic waveguides routed through channels in the present headwear.
In the illustrated example, the hinged panels 112 at the ends of booms 108 are sized so as to provide one or more LEDs 106 (and/or other sensors and/or projecting elements) approximately below the wearer's eye(s) and facing forward, in the direction the wearer is looking, so that the LEDs illuminate the area of interest to the wearer. The booms 108 are sized so as to position the hinged panels 112 so that they just rest on the wearer's cheeks, preferably over the zygomatic bones, without putting undue pressure thereon. Accordingly, frame 100 may be provided in various sizes to accommodate head sizes and shapes of different wearers, and/or they may be adjustable at one or more points to accomplish same. In some instances, frames and booms may be personalized to a wearer by creating a model, either physical or digital, of the wearer's head and face and fabricating a harness specifically to suit the wearer according to the dimensions provided from the model. Modern additive manufacturing processes (commonly known as 3D printing) make such customizations economically feasible even for consumer applications, and custom harnesses could readily be produced from images of a wearer's head and face captured using computer-based cameras and transmitted to a remote server hosting a Web service for purchase of the frame and accessories therefor. For example, following instructions provided by the Web-based service, a user may capture multiple still images and/or a short video of his/her head and face. By including an object of known dimensions (e.g., a ruler, a credit card, etc.) within the field of view of the camera at the approximate position of the user's head as the images are captured, a 3D model of the user's head and face can be created at the server. The user can then be provided with an opportunity to customize a harness to be sized to the dimensions of the model, selecting, for example, the number of booms, the type and number of hinged panels (with illumination or other accessories), the positions over the ears, etc. at which the harness will be worn, and other parameters of the to-be-manufactured harness. Once the customizations are specified, and payment collected, the frame specification may be dispatched to a manufacturing facility at which the harness is fabricated.
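As a simple illustration of the scale-calibration step, the following sketch converts pixel measurements taken from a captured image to real-world units using a reference object of known size (a standard credit card is 85.6 mm wide); the measurement names are placeholders for this example only.

```python
# A minimal sketch of the scale-calibration step described above: an
# object of known size (here a standard credit card, 85.6 mm wide) in
# the captured image lets pixel measurements be converted to real
# units before the head/face model is built. The measurement names
# are illustrative assumptions.

CREDIT_CARD_WIDTH_MM = 85.6

def mm_per_pixel(reference_width_px: float,
                 reference_width_mm: float = CREDIT_CARD_WIDTH_MM) -> float:
    """Scale factor derived from the reference object in the image."""
    return reference_width_mm / reference_width_px

def scale_measurements(pixel_measurements: dict, scale: float) -> dict:
    """Convert per-image pixel measurements (e.g., ear-to-ear span) to mm."""
    return {name: px * scale for name, px in pixel_measurements.items()}

scale = mm_per_pixel(reference_width_px=412.0)             # card spans 412 px
print(scale_measurements({"ear_to_ear": 720.0}, scale))    # ≈ {'ear_to_ear': 149.6} (mm)
```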
The frame 100 may include one or more hinge points 114, one or more on each side, about which sections of the frame may articulate so as to allow for a comfortable fit on the wearer. This may be especially important in frames that are not fabricated for personalized fit, as it allows individual wearers to achieve a comfortable fit. The frame 100 may be worn next to the head, beneath a helmet 118. Accordingly, by allowing the frame to articulate in several places, the fit of the frame may be adjusted to accommodate the presence of the helmet and its associated retention straps, as well as other helmet-worn accessories such as a screen of a head-up display 120 (discussed below).
The hinge points 114 may be purely friction fit adjustments in which the relative friction between opposing cylindrical ribs is sufficient to keep the relative orientation of two hinged members constant during wear. Or, the hinge points may incorporate ratchet fittings that provide interlocking gear-like rings to assure that the relative positions of two members will not change with respect to one another unless a relatively significant force is applied. Other hinged arrangements may be used at points 114, such as swivel torque hinges, circle rotational hinges, click and pawl mechanisms, etc. In some cases, the hinge points 114 are fitted with O-rings to prevent moisture from intruding. In some embodiments additional hinge points may be provided along booms to allow the frame to be folded into a compact configuration with the booms folded inwards towards the rear of the frame. This allows easy storage of the device while preventing accidental damage to the booms. Additionally, one or more grip points along the inner surface of each boom or other parts of frame 100 may be fitted with silicone pads for contacting the wearer's skin. The pads assist in reducing slipping of the harness when worn, and also distribute pressure over a larger surface area than might otherwise be the case if they were not present. While silicone pads are preferred, pads made of other materials, e.g., cork, may be used.
One or more LEDs 106 may be included in each hinged panel 112 at the end of each boom 108. In addition to the LED(s), the hinged panels 112 may include heads-up display (HUD) projection optics oriented towards the wearer's eye to project information on a HUD screen 120 disposed in front of the wearer's eye(s). The screen 120 may be secured to helmet 118 on a hinge 122 so that it can be swiveled out of the wearer's line of sight when not in use, or it may exist in the form of a screen worn in front of the wearer's eyes in a fashion similar to a pair of spectacles. Alternatively, the projection optics may be oriented away from the user so that images can be projected onto a surface in front of the user. A power source and telemetry transmitter (e.g., for HUD data and audio communications) may be included in the frame 100 and/or a helmet 118 and attached to the various illumination and video elements, microphone(s), and earpiece(s) via one or more wire leads within the harness.
The frame 100 may further support one or more communication earpieces 124. Together with one or more microphones, which may be supported on one or more booms 108 or elsewhere on frame 100, the earpiece and microphone allow for communications to/from the wearer. The illumination devices, microphones, and other controlled elements may be configured for hands-free operation using the switch 102 positioned at the end of a boom 104 so as to be flush against the wearer's face (or nearly so), over the masseter muscle so that clenching/flexing of the jaw activates the switch. The switch 102 is electrically connected to control electronics and/or the controlled elements in the fashion described above. Alternatively, or in addition, earpiece and microphone elements may be communicatively connected to a transceiver carried elsewhere on the wearer's person, either using wired or wireless connections. In other embodiments, the earpiece and/or microphone may be eliminated, and audio communications facilitated through bone conduction elements. Portions of the frame 100 are in contact with the wearer's head. Hence, rather than an earpiece, a bone conduction headphone that decodes signals from a receiver and converts them to vibrations can transmit those vibrations directly to the wearer's cochlea. The receiver and bone conduction headphone(s) may be embedded directly in the frame 100, or in some cases the receiver may be external to the frame. One or more bone conduction headphones may be provided. For example, the headphone(s) may be similar to bone conduction speakers employed by scuba divers and may consist of a piezoelectric flexing disc encased in a molded portion of the frame 100 that contacts the wearer's head just behind one or both ears. Similarly, a bone conduction microphone may be provided in lieu of a boom microphone.
In some embodiments, frame 100 may include a sensor package 130 that allows for monitoring of the wearer's vital statistics. A power source and telemetry transmitter (not shown) may be included in frame 100 and attached to the sensor package via one or more wire leads. Thus, even with a helmet removed, the sensor package 130 can continue to relay information concerning the wearer's vital statistics and other monitored biometrics via the telemetry transmitter, because frame 100 remains attached to the wearer.
The sensor package may include one or more sensor pads constructed of conductive fabric that contact the wearer at or near the temple. Additional sensor pads may be integrated in the frame 100 or may be included in the retractable strap positioned over the head of the wearer. This would allow for additional sensor readings for electrophysiological or other noninvasive monitoring of the wearer.
The sensor pad(s) and associated electronics may allow for detection of electrical signals in the manner described by von Rosenberg, W. et al., “Smart Helmet: Monitoring Brain, Cardiac and Respiratory Activity,” Conf. Proc. IEEE Eng. Med. Biol. Soc. 2015, pp. 1829-32 (2015). For example, signals indicative of the wearer's brain, cardiac, and/or respiratory activity may be captured by electrodes positioned against the head and processed by processor 132.
In some embodiments, the sensor package may also include one or more accelerometers 144 which provide inputs to processor 132 concerning rapid accelerations/decelerations of the wearer's head. Such measurements may be important when assessing possible traumatic brain injuries, cervical spinal injuries, and the like.
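The following sketch illustrates, purely by way of example, how accelerometer samples might be screened for such rapid accelerations; the 10 g flag threshold is an assumed value for the illustration and is not a clinical or specified figure.

```python
# Sketch of how accelerometer samples might be screened for the rapid
# accelerations/decelerations mentioned above. The 10 g flag threshold
# is purely an illustrative assumption, not a clinical value.

import math

FLAG_THRESHOLD_G = 10.0  # assumed threshold for flagging an event for review

def peak_acceleration_g(samples):
    """Peak magnitude, in g, over (x, y, z) acceleration samples given in g."""
    return max(math.sqrt(x * x + y * y + z * z) for x, y, z in samples)

def flag_head_impact(samples) -> bool:
    """True if the peak acceleration exceeds the (assumed) flag threshold."""
    return peak_acceleration_g(samples) > FLAG_THRESHOLD_G

window = [(0.1, 0.0, 1.0), (4.0, 2.0, 9.5), (0.2, 0.1, 1.1)]
print(flag_head_impact(window))  # True -- peak ≈ 10.5 g exceeds threshold
```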
Although not shown in the various views, a power source for the electronics is provided and may be housed within the frame 100 or located external thereto (e.g., worn on a vest or pack). In some cases, a primary power source may be located external to the frame 100 and a secondary power source provided integral thereto. This would allow the primary power source to be decoupled from the harness, which would then revert to using the secondary power source (e.g., a small battery or the like), at least temporarily. This would allow for continuous monitoring of the biometric and vital signs and provision of related telemetry. Primary power may later be restored by an attending medic using a transportable power supply. To facilitate this operation, the harness may be provided with one or more ports allowing connection of different forms of power supplies.
Beyond comfort, the present head-worn illumination device offers beam separation/brightness consistency when closing distance/peering. For example, by having separate illumination sources on booms on either side of a wearer's face, with each being mounted on a pivotable, hinged panel, a wearer can aim each illumination source independently so as to combine the illumination of the respective beams at a desired point in front of the wearer (e.g., corresponding to an area of interest to the wearer), thereby maximizing the provided illumination at that point. Then, by moving his/her head towards/away from the area of interest, the user can provide a form of brightness control over that area of illumination. As the user moves his/her head, the beams provided by the illumination sources will separate, thereby adjusting the effective amount of illumination at the area of interest. In some embodiments, haptic feedback may be used for various indications, e.g., low battery, etc. Embodiments of the head-worn illumination device may also support other components of a head-worn “system” that includes integrated eyewear components, disposable masks and caps, heads-up display, sensors, data capture components, etc.
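As a simple numerical illustration of this beam-separation behavior, the following sketch models two sources a face-width apart, each aimed so that their beams cross at an initial aim distance, and reports how far apart the beam centers fall at other working distances; the dimensions used are assumptions for the example.

```python
# A small geometric sketch of the beam-separation behavior described
# above: two sources a face-width apart, each aimed so that the beams
# cross at an initial aim distance. The dimensions are illustrative
# assumptions.

def beam_center_separation(face_width_m: float, aim_distance_m: float,
                           working_distance_m: float) -> float:
    """Lateral separation of the two beam centers at the working surface.

    Each source sits face_width/2 off the centerline and is aimed at the
    point where the centerline meets the aim distance, so the separation
    shrinks to zero at the aim distance and grows again beyond it.
    """
    return face_width_m * abs(1.0 - working_distance_m / aim_distance_m)

FACE_WIDTH_M = 0.14   # assumed spacing between left and right sources
AIM_DISTANCE_M = 0.5  # beams adjusted to converge 0.5 m in front of the wearer

for d in (0.25, 0.5, 1.0):
    sep = beam_center_separation(FACE_WIDTH_M, AIM_DISTANCE_M, d)
    print(f"at {d:.2f} m the beam centers are {sep * 100:.1f} cm apart")
# -> 7.0 cm at 0.25 m, 0.0 cm at 0.50 m (full overlap), 14.0 cm at 1.00 m
```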
The illumination device 90 is configured for hands-free operation using the switch 102 positioned at the end of a boom 104 so as to be flush against the wearer's face (or nearly so), over the masseter muscle so that clenching/flexing of the jaw activates the switch. The switch 102 is electrically connected to control electronics and/or the illumination elements in the fashion described above. Alternatively, the switch may be worn on the user's face independently, for example using a temporary adhesive, and connected via any of the wireless communication means discussed above to a control unit so as to control the illumination/imaging elements.
Illumination devices of the kind described herein, and especially the frame, booms, and hinged panels thereof, may be fashioned from a variety of materials, including but not limited to plastics (e.g., zylonite), metals and/or metal alloys, carbon fiber, wood, cellulose acetates, nylon, natural horn and/or bone, leather, epoxy resins, and combinations of the foregoing. Fabrication processes include, but are not limited to, injection molding, sintering, milling, and die cutting. Alternatively, or in addition, one or more additive manufacturing processes, such as extrusion, vat photopolymerization, powder bed fusion, material jetting, or directed energy deposition, may be used to fashion the illumination device and/or components thereof.
The imaging devices associated with embodiments of the invention may include still image and/or video cameras. As such, the imaging devices are well suited to photo and/or video capture and may also be used as imaging portions of optical scanners. For example, the imaging device may be employed under the control of a scanning application running on a host processor (e.g., processor 132), and used to image optically encoded information in the form of a bar code, QR code, or similar machine-readable item. Machine-readable codes of this nature are ubiquitous and may be associated with any number of things that the wearer of the illumination/imaging device may need or use. For example, in the case of a field medic, the medic may need to administer drugs, intravenous fluids, or take other actions with respect to a casualty. By scanning the packaging associated with the drugs to be administered, for example by training the imaging device onto the packaging and operating the switch 102 by clenching his/her jaw, the medic may cause a bar code or other machine-readable label printed on the packaging to be recorded. This information may then be wirelessly transmitted to a remote location where it can be associated with the casualty's medical records, thereby alleviating the need for the medic to separately perform this operation. Similarly, workers in a warehouse or similar facility may scan a machine-readable label using the present illumination/imaging device and the associated data may be transmitted to a logistics system to update the status of an associated item. And, in some instances, the information flow may be bidirectional. For example, a repair technician may scan a machine-readable label associated with an item under inspection and, responsive to transmission of that information to a remote facility, troubleshooting information may be transferred to the technician's handheld device or even streamed for playback via a HUD display worn by the technician. Another form of bi-directional communication may involve the wearer transmitting location information (e.g., as provided by a personal GPS or other location finding unit) in response to a clench interaction via switch 102, and having directional information relayed back (e.g., via display on a HUD unit) so as to keep the wearer progressing on a defined route.
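Purely as an illustration of such a clench-triggered scan, the following sketch assumes an OpenCV-accessible camera, the pyzbar library for decoding, and a hypothetical records endpoint; none of these choices are mandated by the foregoing description.

```python
# Sketch of a clench-triggered scan, assuming an OpenCV-accessible
# camera, the pyzbar library for barcode/QR decoding, and a
# hypothetical records endpoint. These are illustrative assumptions.

import cv2                        # pip install opencv-python
import requests
from pyzbar.pyzbar import decode  # pip install pyzbar

RECORDS_URL = "https://example.invalid/api/scans"  # hypothetical endpoint

def scan_and_report(camera_index: int = 0) -> None:
    """Capture one frame, decode any machine-readable labels, transmit them."""
    camera = cv2.VideoCapture(camera_index)
    ok, frame = camera.read()
    camera.release()
    if not ok:
        return
    for symbol in decode(frame):
        payload = {"type": symbol.type, "data": symbol.data.decode("utf-8")}
        requests.post(RECORDS_URL, json=payload, timeout=5)

# In this sketch, the controller would call scan_and_report() when the
# pulse-sequence classifier recognizes the scan gesture (e.g., a double clench).
```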
Thus, systems and methods for hands-free activation/deactivation of illumination, imaging, and/or other systems, in particular, head-worn illumination, imaging, and/or other systems have been described.
This is a CONTINUATION of U.S. application Ser. No. 16/698,301, filed Nov. 27, 2019, which is a NONPROVISIONAL of, incorporates by reference, and claims priority to U.S. Provisional Application No. 62/775,482, filed Dec. 5, 2018, and is also a CONTINUATION-IN-PART of and incorporates by reference U.S. application Ser. No. 16/202,601, filed Nov. 28, 2018, which claims the priority benefit of U.S. Provisional Applications 62/596,046, filed Dec. 7, 2017, and 62/729,048, filed Sep. 10, 2018, each of which are also incorporated by reference herein.
Number | Name | Date | Kind |
---|---|---|---|
3227836 | Renwick, Sr. | Jan 1966 | A |
4920466 | Liu | Apr 1990 | A |
4970589 | Hanson et al. | Nov 1990 | A |
5083246 | Lambert | Jan 1992 | A |
5226712 | Luca | Jul 1993 | A |
5946071 | Feldman | Aug 1999 | A |
5951141 | Bradley | Sep 1999 | A |
6016160 | Coombs et al. | Jan 2000 | A |
6046712 | Beller et al. | Apr 2000 | A |
6126294 | Koyama et al. | Oct 2000 | A |
6560029 | Dobbie et al. | May 2003 | B1 |
6612695 | Waters | Sep 2003 | B2 |
6896389 | Paul | May 2005 | B1 |
7184903 | Williams et al. | Feb 2007 | B1 |
7303303 | Haynes | Dec 2007 | B1 |
7580028 | Jeong et al. | Aug 2009 | B2 |
7814903 | Osborne et al. | Oct 2010 | B2 |
8188937 | Amafuji et al. | May 2012 | B1 |
8337014 | Kokonaski et al. | Dec 2012 | B2 |
8587514 | Lundstrom | Nov 2013 | B2 |
8708483 | Kokonaski et al. | Apr 2014 | B2 |
9013264 | Parshionikar et al. | Apr 2015 | B2 |
9285609 | Rost | Mar 2016 | B1 |
9632318 | Goto et al. | Apr 2017 | B2 |
10477298 | Cruz-Hernandez | Nov 2019 | B2 |
10736560 | Haugland et al. | Aug 2020 | B2 |
20020027777 | Takasu | Mar 2002 | A1 |
20020122014 | Rajasingham | Sep 2002 | A1 |
20030202341 | McClanahan | Oct 2003 | A1 |
20040008158 | Chi et al. | Jan 2004 | A1 |
20040136178 | Yu | Jul 2004 | A1 |
20040189930 | Skuro | Sep 2004 | A1 |
20040252487 | McCullough et al. | Dec 2004 | A1 |
20050102133 | Rees | May 2005 | A1 |
20050105285 | Maden | May 2005 | A1 |
20050226433 | McClanahan | Oct 2005 | A1 |
20060048286 | Donato | Mar 2006 | A1 |
20060061544 | Min et al. | Mar 2006 | A1 |
20060119539 | Kato et al. | Jun 2006 | A1 |
20060238878 | Miyake et al. | Oct 2006 | A1 |
20070243835 | Zhu et al. | Oct 2007 | A1 |
20070277819 | Osborne | Dec 2007 | A1 |
20080216215 | Lee | Sep 2008 | A1 |
20090073082 | Yoshikawa | Mar 2009 | A1 |
20090187124 | Ludlow et al. | Jul 2009 | A1 |
20090251661 | Fuziak, Jr. | Oct 2009 | A1 |
20090267805 | Jin et al. | Oct 2009 | A1 |
20100014699 | Anderson et al. | Jan 2010 | A1 |
20100081895 | Zand | Apr 2010 | A1 |
20100177277 | Kokonaski et al. | Jul 2010 | A1 |
20100250231 | Almagro | Sep 2010 | A1 |
20100271588 | Kokonaski et al. | Oct 2010 | A1 |
20100283412 | Baudou | Nov 2010 | A1 |
20100327028 | Nakabayashi et al. | Dec 2010 | A1 |
20110089207 | Tricoukes et al. | Apr 2011 | A1 |
20110221672 | Osterhout | Sep 2011 | A1 |
20110288445 | Lillydahl et al. | Nov 2011 | A1 |
20110317402 | Cristoforo | Dec 2011 | A1 |
20120002046 | Rapoport et al. | Jan 2012 | A1 |
20120052469 | Sobel et al. | Mar 2012 | A1 |
20120127420 | Blum et al. | May 2012 | A1 |
20120127423 | Blum et al. | May 2012 | A1 |
20120206323 | Osterhout et al. | Aug 2012 | A1 |
20120229248 | Parshionikar et al. | Sep 2012 | A1 |
20120242698 | Haddick et al. | Sep 2012 | A1 |
20120262667 | Willey | Oct 2012 | A1 |
20120287284 | Jacobsen et al. | Nov 2012 | A1 |
20120312669 | Breeds et al. | Dec 2012 | A1 |
20130016426 | Chiang | Jan 2013 | A1 |
20130201439 | Kokonaski et al. | Aug 2013 | A1 |
20130278881 | Kokonaski et al. | Oct 2013 | A1 |
20130300649 | Parkinson et al. | Nov 2013 | A1 |
20130329183 | Blum et al. | Dec 2013 | A1 |
20140000014 | Redpath et al. | Jan 2014 | A1 |
20140028966 | Blum et al. | Jan 2014 | A1 |
20140079257 | Ruwe et al. | Mar 2014 | A1 |
20140082587 | Delaney | Apr 2014 | A1 |
20140160250 | Pomerantz et al. | Jun 2014 | A1 |
20140249354 | Anderson et al. | Sep 2014 | A1 |
20140259287 | Waters et al. | Sep 2014 | A1 |
20140259319 | Ross | Sep 2014 | A1 |
20140354397 | Quintal, Jr. et al. | Dec 2014 | A1 |
20150094715 | Laufer et al. | Apr 2015 | A1 |
20150109769 | Chang | Apr 2015 | A1 |
20160054570 | Bosveld et al. | Feb 2016 | A1 |
20160178903 | Nakajima | Jun 2016 | A1 |
20160216519 | Park et al. | Jul 2016 | A1 |
20160255305 | Ritchey | Sep 2016 | A1 |
20160313801 | Wagner et al. | Oct 2016 | A1 |
20160316181 | Hamra | Oct 2016 | A1 |
20170075198 | Kuroki | Mar 2017 | A1 |
20170215717 | Orringer et al. | Aug 2017 | A1 |
20170227780 | Tatsuta et al. | Aug 2017 | A1 |
20170257723 | Morishita et al. | Sep 2017 | A1 |
20170270820 | Ashby | Sep 2017 | A1 |
20170322641 | Osterhout | Nov 2017 | A1 |
20180003764 | Menon et al. | Jan 2018 | A1 |
20180242908 | Sazonov et al. | Aug 2018 | A1 |
20190142349 | Schorey et al. | May 2019 | A1 |
20190178476 | Ross | Jun 2019 | A1 |
20190178477 | Ross | Jun 2019 | A1 |
20190265802 | Parshionikar | Aug 2019 | A1 |
20200059467 | Chereshnev | Feb 2020 | A1 |
20200072596 | Pang et al. | Mar 2020 | A1 |
20200097084 | Ross | Mar 2020 | A1 |
20200249752 | Parshionikar | Aug 2020 | A1 |
20200285080 | Belli et al. | Sep 2020 | A1 |
20200320412 | Gillian et al. | Oct 2020 | A1 |
20210029435 | Siahaan et al. | Jan 2021 | A1 |
Number | Date | Country |
---|---|---|
2268980 | Apr 1998 | CA |
208338996 | Jan 2019 | CN |
10 2006 015334 | Oct 2007 | DE |
1 928 296 | May 2011 | EP |
2832906 | Jun 2003 | FR |
2002268815 | Sep 2002 | JP |
2009116609 | May 2009 | JP |
9637730 | Nov 1996 | WO |
2004087258 | Oct 2004 | WO |
2009133258 | Nov 2009 | WO |
2010062479 | Jun 2010 | WO |
2014068371 | May 2014 | WO |
2015124937 | Aug 2015 | WO |
2017065663 | Apr 2017 | WO |
2020117597 | Jun 2020 | WO |
2020248778 | Dec 2020 | WO |
2022098973 | May 2022 | WO |
Entry |
---|
International Search Report and Written Opinion mailed Jan. 25, 2022, from the ISA/European Patent Office, for International Application No. PCT/US2021/058209 (filed Nov. 5, 2021), 15 pgs. |
Etani, Takehito, “The Masticator”, The Masticator: the social mastiction (2016), downloaded from: http://www.takehitoetani.com/masticator, 5 pages. |
International Preliminary Report on Patentability mailed Aug. 4, 2020, from the IPEA/US, for International Application No. PCT/US2018/062767 (filed Nov. 28, 2018), 18 pgs. |
International Search Report and Written Opinion mailed Mar. 5, 2019, from the ISA/US, for International Application No. PCT/US18/62767 (filed Nov. 28, 2018), 15 pages. |
Invitation to Pay Additional Fees and Partial Search mailed Mar. 4, 2020, from the ISA/European Patent Office, for International Patent Application No. PCT/US2019/063717 (filed Nov. 27, 2019), 15 pages. |
Goel, Mayank; et al., “Tongue-in-Cheek: Using Wireless Signals to Enable Non-Intrusive and Flexible Facial Gestures Detection”, HMDs & Wearables to Overcome Disabilities, CHI 2015, Apr. 18-23, 2015, Crossings, Seoul, Korea, pp. 255-258. |
Von Rosenberg; et al., “Smart helmet: Monitoring brain, cardiac and respiratory activity,” 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, 2015, pp. 1829-1832. |
Xu; et al., “Clench Interaction: Novel Biting Input Techniques”, Human Factors in Computing Systems Proceedings (CHI 2019), May 4-9, 2019, 12 pages. |
International Search Report and Written Opinion mailed May 13, 2020, from the ISA/European Patent Office, for International Application No. PCT/US2019/063717 (filed Nov. 27, 2019), 16 pgs. |
Khoshnam; et al., “Hands-Free EEG-Based Control of a Computer Interface Based on Online Detection of Clenching of Jaw”, International Conference on Bioinformatics and Biomedical Engineering, IWBBIO 2017, Part I, Lecture Notes in Computer Science book series (LNCS, vol. 10208), pp. 497-507. |
Tato-Klesa, Hella, “Detection of Teeth Grinding and Clenching using Surface Electromyography”, Master's thesis, Jul. 29, 2020, Technische Universitat Munchen, Munich, Germany, 72 pgs. |
Matthew Temndrup, “Wiggle your nose to control VR experiences with Reach Bionics,” Upload VR (Jan. 12, 2016) (Available at https://uploadvr.com/reach-bionics-lets-you-control-vr-experiences-by-wiggling-your-nose/). |
R. Benjamin Knapp, “Biosignal Processing in Virtual Reality Applications,” Cal. State University Northridge Center on Disabilities Virtual Reality Conference 1993 (1993) (available at http://www.csun.edu/˜hfdss006/conf/1993/proceedings/BIOSIG˜1.htm). |
International Preliminary Report on Patentability mailed Mar. 3, 2021, from the IPEA/US, for International Patent Application No. PCT/US2019/063717 (filed Nov. 27, 2019), 6 pgs. |
International Search Report and Written Opinion mailed Apr. 20, 2022, from the ISA/European Patent Office, for International Patent Application No. PCT/US2022/012746 (filed Jan. 18, 2022), 15 pgs. |
International Search Report and Written Opinion mailed Oct. 20, 2021, from the ISA/European Patent Office, for International Application No. PCT/US2021/039395 (filed Jun. 28, 2021), 14 pgs. |
International Search Report and Written Opinion mailed Jul. 26, 2022, from the ISA/European Patent Office, for International Patent Application No. PCT/US2022/025324 (filed Apr. 19, 2022), 13 pgs. |
International Preliminary Report on Patentability mailed Jan. 12, 2023, from The International Bureau of WIPO, for International Patent Application No. PCT/US2021/039395 (filed Jun. 28, 2021), 11 pgs. |
Office Action dated Apr. 24, 2023, for U.S. Appl. No. 18/146,087, filed Dec. 23, 2022, 11 pgs. |
Communication pursuant to Article 94(3) EPC dated Apr. 28, 2023, from the European Patent Office, for European Patent Application No. 19827960.6, 7 pgs. |
Gu; et al. “Efficacy of biofeedback therapy via a mini wireless device on sleep bruxism contrasted with occlusal splint: a pilot study”, Journal of Biomedical Research, 2015, 29(2):160-168. |
“detent”, Merriam-Webster, definition, downloaded Jan. 23, 2024, from https://www.merriam-webster.com/dictionary/detent, 14 pgs. |
International Search Report and Written Opinion mailed Jan. 22, 2024, from the ISA/US, for International Application No. PCT/US23/32656 (filed Sep. 13, 2023), 13 pgs. |
Number | Date | Country | |
---|---|---|---|
20210405751 A1 | Dec 2021 | US |
Number | Date | Country | |
---|---|---|---|
62775482 | Dec 2018 | US | |
62729048 | Sep 2018 | US | |
62596046 | Dec 2017 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16698301 | Nov 2019 | US |
Child | 17305801 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16202601 | Nov 2018 | US |
Child | 16698301 | US |