Adjustable augmented reality eyewear for image-guided medical intervention

Information

  • Patent Grant
  • Patent Number
    12,044,858
  • Date Filed
    Thursday, December 28, 2023
  • Date Issued
    Tuesday, July 23, 2024
  • Inventors
  • Original Assignees
    • AUGMEDICS LTD.
  • Examiners
    • Wills-Burns; Chineyere D
    • Almeida; Cory A
  • Agents
    • KNOBBE, MARTENS, OLSON & BEAR, LLP
Abstract
A head-mounted display system includes a frame; an adjustable strap assembly that includes a first side strap coupled to the frame; a second side strap coupled to the frame; and an adjustment mechanism configured to adjust a circumferential size defined by the first side strap, the second side strap, and the frame; a see-through display assembly pivotably coupled to the frame; a first temple housing pivotably coupled to the frame and slidably coupled to the first side strap; and a second temple housing pivotably coupled to the frame and slidably coupled to the second side strap.
Description
FIELD

This disclosure relates generally to augmented reality systems, devices, and methods, including a head-mounted display (HMD) device of an augmented-reality (AR)-based image-guided system to facilitate surgery or other medical intervention (e.g., therapeutic and/or diagnostic procedures), as well as other uses.


BACKGROUND

Near-eye displays may be used in various types of AR applications. For example, Applicant has previously demonstrated that head-mounted devices having AR capabilities may be used for performing image-guided surgery (see, for example, Applicant's U.S. Pat. Nos. 11,382,712, 11,389,252, 10,939,977, and 9,928,629, Applicant's US Publication No. 2020/0163723, and Applicant's PCT Publication Nos. WO 2023/021448 and WO 2023/021450, which describe methods and systems for applying augmented reality and near-eye display techniques to an image-guided surgical system). The disclosures of these patents and published applications are incorporated herein by reference.


SUMMARY

Several embodiments of the disclosure that are described hereinbelow provide improved methods and systems for applying augmented reality (AR) techniques, peripheral devices and operational methods thereof to a head-mounted display (HMD) of, for example, an image-guided medical (e.g., surgical) system. In the context of the present disclosure and in the claims, the term head-mounted display or HMD shall be given its ordinary meaning and shall also refer to any suitable display apparatus configured to display information (e.g., images) over a scene (e.g., portion of a body of a patient for therapeutic or diagnostic intervention or assessment), such as an organ, skin, bones or joints of a patient, using AR techniques and/or other suitable displaying techniques. For example, the term HMD may refer to a helmet, AR glasses, goggles, spectacles, monocle, eyewear, headset, visor, head-up display, and any other suitable type of displaying device mounted on or worn by any portion of a user's or wearer's head, including but not limited to the face, crown, forehead, nose and ears. In some embodiments, the head-mounted displays are not used, or are used together with, stand-alone displays, such as monitors, portable devices, tablets, etc. The display may be a hands-free display such that the operator does not need to hold the display.


In accordance with several embodiments, head-mounted display devices described herein provide reduced stress or fatigue on a wearer, and/or provide additional comfort features. The head-mounted display devices may provide improved ergonomics, comfort, and/or the ability to enable a wearer, such as a surgeon, to wear the device for relatively long periods of time (such as for two, four, or six hours or more in one embodiment), without unnecessary fatigue and/or other negative consequences. For example, the head-mounted display device can be designed and adapted to distribute weight around the wearer's head, including to the wearer's forehead and the back of the wearer's head, to reduce at least some of the weight applied to the wearer's nose or other undesired location. Such a configuration can also reduce pressure on the wearer's temples, which can be another relatively weight-sensitive area, in addition to the nose. Stated another way, such a head-mounted display can more widely distribute pressure over larger and/or less sensitive areas, such as the forehead and the back of the head. Although medical applications are well-suited for several embodiments, non-medical applications also benefit from many embodiments described herein. For example, non-medical applications may involve consumer or commercial applications such as athletics and fitness, gaming, driving, product design, navigation, manufacturing, logistics, shopping and commerce, educational training, remote collaboration, etc.


The head-mounted display device may be substituted with an alternative hands-free device that is not worn by the operator, such as a portal, monitor or tablet. The display may be a head-up display or heads-up display.


In accordance with several implementations, a head-mounted display device includes: a frame extending from a first end to a second end, the first end configured to be positioned adjacent a first temple of a wearer (e.g., a surgeon or other user), and the second end configured to be positioned adjacent a second temple of the wearer. The device further includes an adjustable strap assembly including: a first side strap having a first end coupled to the first end of the frame; a second side strap having a first end coupled to the second end of the frame; and an adjustment mechanism (e.g., a knob, a rack and pinion, a mechanism that uses friction, a sliding mechanism, a ratchet mechanism, a snap or lock mechanism, etc.) configured to adjust a position of a second end of the first side strap with respect to a second end of the second side strap, in order to adjust a circumferential size defined by the first side strap, the second side strap, and the frame. The device also includes a see-through display assembly (including, e.g., a near-eye display, an augmented-reality display, a stereoscopic display, glasses, a visor, a head-up display, etc.) coupled (e.g., rotatably, pivotably, movably, or slidably coupled) to the frame such that a tilt angle (e.g., pantoscopic tilt angle) can be adjusted. The device further includes a first temple housing coupled (e.g., rotatably or pivotably coupled) to the first end of the frame and slidably coupled to the first side strap of the adjustable strap assembly and a second temple housing coupled (e.g., rotatably or pivotably coupled) to the second end of the frame and slidably coupled to the second side strap of the adjustable strap assembly.


In some implementations, the see-through display assembly is coupled (e.g., rotatably, pivotably, movably, or slidably coupled) to the frame with a pantoscopic tilting assembly that includes an arc-shaped slot that rotatably, pivotably, movably, or slidably couples a portion of the see-through display assembly to a portion of the frame.


In some implementations, the pantoscopic tilting assembly further includes a detent mechanism including a spring-loaded pin or ball and a plurality of detents (e.g., two, three, four, five, or more than five detents), the detent mechanism configured to selectively retain the see-through display assembly in any of a plurality of predefined positions with respect to the frame.


In some implementations, the detent mechanism further includes a guide member slidably engaged with the arc-shaped slot, the guide member configured to apply a force to the spring-loaded pin or ball to move the spring-loaded pin or ball from one of the plurality of detents to another of the plurality of detents.
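
To illustrate the effect of such a detent mechanism, the short sketch below snaps a requested tilt angle to the nearest of a set of predefined positions. It is an illustrative analogy only; the detent angles and the function are hypothetical and are not taken from the patent.

```python
# Illustrative analogy only: the patent describes a mechanical spring-loaded detent;
# the detent angles below are hypothetical values, not dimensions from the patent.
DETENT_ANGLES_DEG = [0.0, 5.0, 10.0, 15.0, 20.0]  # assumed predefined tilt positions

def nearest_detent(requested_angle_deg: float) -> float:
    """Return the predefined detent position closest to the requested tilt angle."""
    return min(DETENT_ANGLES_DEG, key=lambda d: abs(d - requested_angle_deg))

# Example: tilting the display toward ~12 degrees settles at the 10-degree detent.
assert nearest_detent(12.0) == 10.0
```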


In some implementations, the see-through display assembly includes a detachable lens assembly (e.g., clip-on lens assembly, snap-on lens assembly, a friction-fit lens assembly, a magnetic attachment assembly, etc.) that can be detached and replaced with a second detachable lens assembly for changing a prescription of lenses of the see-through display assembly (e.g., the detachable lens assemblies may be customized for a particular wearer such that they are swappable and the same device can be easily interchangeably used by multiple different wearers). In some implementations, the detachable lens assembly may be provided by the manufacturer and/or provider of the head-mounted display device with the head-mounted display device such that they are not required to be obtained separately by a user from an optometrist or third-party provider. In some implementations, the detachable lens assembly can be coupled to and detached from the head-mounted display device without requiring the use of any tools (e.g., “tools-free”). In some implementations, the detachable lens assembly comprises or functions as an adapter. In some implementations, the detachable lens assembly is non-magnetically coupled to the head-mounted display device.


In some implementations, the see-through display assembly includes a display assembly frame; a waveguide lens coupled to the display assembly frame; an anterior lens affixed to the waveguide lens or to the display assembly frame in front of the waveguide lens; a posterior lens frame detachably coupled to the display assembly frame using at least one of: a snap fit, a friction fit, or a clip; and a posterior lens affixed to the posterior lens frame.


In some implementations, the head-mounted display device further includes a flashlight assembly (e.g., headlamp assembly, headlight assembly, etc.) that may be detachably coupled to the frame. In some implementations, the flashlight may be permanently coupled to the frame.


In some implementations, the head-mounted display device further includes: a first follower (e.g., protrusion, slidable member, etc.) that slidably couples the first temple housing to the first side strap; and a second follower (e.g., protrusion, slidable member, etc.) that slidably couples the second temple housing to the second side strap.


In some implementations, the frame further includes a nose pad (e.g., nose support member, etc.) configured to engage a nose of the wearer. The frame may optionally not include a nose pad or may be configured not to engage a nose of the wearer.


In some implementations, the head-mounted display device further includes: a forehead support including: a first end coupled (e.g., rotatably or pivotably coupled) to the first side strap; a second end coupled (e.g., rotatably or pivotably coupled) to the second side strap; and a central support coupled to the frame.


In some implementations, the first side strap includes a connector that couples (e.g., rotatably or pivotably couples) a front portion of the first side strap to a rear portion of the first side strap, and that couples (e.g., rotatably or pivotably couples) the first end of the forehead support to the first side strap. The second side strap includes a connector that couples (e.g., rotatably or pivotably couples) a front portion of the second side strap to a rear portion of the second side strap, and that couples (e.g., rotatably or pivotably couples) the second end of the forehead support to the second side strap.


In some implementations, the head-mounted display device further includes: a first follower (e.g., protrusion, slidable member, etc.) that is slidably coupled to the first temple housing and that is coupled to the first side strap at a position between the connector of the first side strap and the first end of the first side strap; and a second follower (e.g., protrusion, slidable member, etc.) that is slidably coupled to the second temple housing and that is coupled to the second side strap at a position between the connector of the second side strap and the first end of the second side strap.


In some implementations, the head-mounted display device further optionally includes a top strap removably coupled at a first end to the forehead support and at a second end to the adjustment mechanism of the adjustable strap assembly.


In some implementations, each of the first side strap and the second side strap includes a rack, and the adjustment mechanism of the adjustable strap assembly includes: a pinion engaged with the rack of the first side strap and with the rack of the second side strap; and a knob configured to cause rotation of the pinion in order to adjust the circumferential size defined by the first side strap, the second side strap, and the frame (e.g., to customize the fit to a circumferential head size of a particular wearer).
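
As a rough illustration of how such a rack-and-pinion adjustment maps knob rotation to circumferential size, the sketch below computes the change in band circumference per knob turn under the assumption that a single pinion engages a rack on each side strap. The pinion radius is a hypothetical value, not a dimension from the patent.

```python
import math

# Illustrative geometry only; the pinion radius is an assumed value.
PINION_RADIUS_MM = 6.0

def circumference_change_mm(knob_turns: float) -> float:
    """Approximate change in headband circumference for a given number of knob turns.

    Because the single pinion engages racks on both side straps, each turn draws
    (or releases) both straps, so the circumference changes by twice the arc
    length rolled along one rack.
    """
    arc_per_turn = 2 * math.pi * PINION_RADIUS_MM  # rack travel per strap per turn
    return 2 * arc_per_turn * knob_turns

# Example: half a turn tightens the band by roughly 38 mm with these assumed dimensions.
print(round(circumference_change_mm(0.5), 1))
```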


In some implementations, the adjustment mechanism of the adjustable strap assembly further includes a tension mechanism (e.g., stop mechanism, one or more gears engaged with a tension member, etc.) that resists rotation of the knob until a threshold force is overcome.


In some implementations, the adjustment mechanism of the adjustable strap assembly further includes a pad configured to engage a back of a head of the wearer.


In some implementations, the see-through display assembly is configured to display to the wearer an augmented reality (AR) image including a virtual reality (VR) image presented over a portion of a body of a patient, and the head-mounted display device further includes or consists essentially of one or more processors configured to, e.g., upon execution of program instructions stored on a non-transitory computer readable medium, receive one or more anatomical images of the patient and signals indicative of at least a position of the see-through display assembly relative to the scene, and to render the AR image to the see-through display assembly. The see-through display assembly may be configured to allow the wearer to see AR images from both a standing and a sitting position without inconvenience or manual adjustment.
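
As a minimal sketch of the rendering step described above (and not the patented rendering pipeline), the code below projects anatomical points given in patient coordinates into 2D display coordinates, using a tracked pose of the see-through display relative to the patient and an assumed pinhole-style projection for the near-eye display. The intrinsic values and function names are hypothetical.

```python
import numpy as np

# Assumed display "camera" intrinsics (hypothetical values, for illustration only).
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])

def project_points(points_patient_mm: np.ndarray, T_display_from_patient: np.ndarray) -> np.ndarray:
    """Project Nx3 points given in patient coordinates into 2D display coordinates.

    T_display_from_patient is a 4x4 rigid transform derived from tracking signals
    indicative of the display's position relative to the scene.
    """
    pts_h = np.c_[points_patient_mm, np.ones(len(points_patient_mm))]   # homogeneous coords
    pts_display = (T_display_from_patient @ pts_h.T).T[:, :3]           # into display frame
    uvw = (K @ pts_display.T).T                                         # pinhole projection
    return uvw[:, :2] / uvw[:, 2:3]                                     # perspective divide
```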


In some implementations, at least one of the one or more processors is positioned within the first temple housing or the second temple housing. In some implementations, at least one of the one or more processors may be located in other locations on the head-mounted display device or separate from the head-mounted display device and in wireless communication with the head-mounted display device.


In accordance with several implementations, a head-mounted display device includes a frame extending from a first end to a second end, the first end configured to be positioned adjacent a first temple of a wearer (e.g., a surgeon or other user), and the second end configured to be positioned adjacent a second temple of the wearer. The frame further includes a nose pad (e.g., nose support member, etc.) configured to engage a nose of the wearer. The head-mounted display device further includes an adjustable strap assembly including a first side strap having a first end coupled to the first end of the frame; a second side strap having a first end coupled to the second end of the frame; and an adjustment mechanism (e.g., a knob, a rack and pinion, a mechanism that uses friction, etc.) configured to adjust a position of a second end of the first side strap with respect to a second end of the second side strap, in order to adjust a circumferential size defined by the first side strap, the second side strap, and the frame. The head-mounted display device further includes a forehead support including a first end coupled (e.g., rotatably or pivotably coupled) to the first side strap; a second end coupled (e.g., rotatably or pivotably coupled) to the second side strap; and a central support coupled to the frame. The head-mounted display device further includes a see-through display assembly (e.g., augmented-reality display, near-eye display, stereoscopic display, glasses, visor, headset, goggles, head-up display, etc.) coupled (e.g., rotatably, pivotably, movably, or slidably coupled) to the frame such that a tilt angle (e.g., pantoscopic tilt angle) can be adjusted. The see-through display assembly includes a detachable lens assembly (e.g., clip-on lens assembly, snap-on lens assembly, a friction-fit lens assembly, etc.) that can be detached and replaced with a second detachable lens assembly for changing a prescription of lenses of the see-through display assembly. The head-mounted display device also includes a first temple housing pivotably coupled to the first end of the frame and slidably coupled to the first side strap of the adjustable strap assembly and a second temple housing pivotably coupled to the second end of the frame and slidably coupled to the second side strap of the adjustable strap assembly. The head-mounted display device further includes a flashlight assembly (e.g., headlamp assembly, headlight assembly, etc.) detachably coupled to the frame.


In some implementations, the first side strap includes a connector that pivotably couples a front portion of the first side strap to a rear portion of the first side strap, and that pivotably couples the first end of the forehead support to the first side strap. The second side strap includes a connector that pivotably couples a front portion of the second side strap to a rear portion of the second side strap, and that pivotably couples the second end of the forehead support to the second side strap.


In some implementations, the head-mounted display device further includes a first follower (e.g., protrusion, slidable member, etc.) that is slidably coupled to the first temple housing and that is coupled to the first side strap at a position between the connector of the first side strap and the first end of the first side strap, and a second follower (e.g., protrusion, slidable member, etc.) that is slidably coupled to the second temple housing and that is coupled to the second side strap at a position between the connector of the second side strap and the first end of the second side strap.


In some implementations, each of the first side strap and the second side strap includes a rack. The adjustment mechanism of the adjustable strap assembly may include a pinion engaged with the rack of the first side strap and with the rack of the second side strap, and a knob configured to cause rotation of the pinion in order to adjust the circumferential size defined by the first side strap, the second side strap, and the frame.


In some implementations, the adjustment mechanism of the adjustable strap assembly further includes a tension mechanism (e.g., stop mechanism, one or more gears engaged with a tension member, etc.) that resists rotation of the knob until a threshold force is overcome.


In some implementations, the adjustment mechanism of the adjustable strap assembly further includes a pad configured to engage a back of a head of the wearer.


In some implementations, the head-mounted display device further includes a top strap removably coupled at a first end to the forehead support and at a second end to the adjustment mechanism of the adjustable strap assembly.


In some implementations, the see-through display assembly is configured to display to the wearer an augmented reality (AR) image including a virtual reality (VR) image presented over a scene on a body of a patient, and the head-mounted display device further includes one or more processors configured to receive one or more anatomical images of the patient and signals indicative of at least a position of the see-through display assembly relative to the scene, and to render the AR image to the see-through display assembly.


In some implementations, at least one of the one or more processors is positioned within the first temple housing or the second temple housing. In some implementations, at least one of the one or more processors may be located in other locations on the head-mounted display device or separate from the head-mounted display device and in wireless communication with the head-mounted display device.


In some implementations, each of the first temple housing and the second temple housing includes a plurality of heat-dissipation fins (e.g., protrusions, heatsinks, etc.).


In some implementations, the see-through display assembly is rotatably, pivotably, movably, or slidably coupled to the frame with a pantoscopic tilting assembly that includes an arc-shaped slot that rotatably, pivotably, movably, or slidably couples a portion of the see-through display assembly to a portion of the frame and a detent mechanism including a spring-loaded pin or ball and a plurality of detents. The detent mechanism may be configured to selectively retain the see-through display assembly in any of a plurality of predefined positions with respect to the frame.


In some implementations, the detent mechanism further includes a guide member slidably engaged with the arc-shaped slot, the guide member configured to apply a force to the spring-loaded pin or ball to move the spring-loaded pin or ball from one of the plurality of detents to another of the plurality of detents.


In some implementations, the frame includes a flashlight mounting socket including a first rod (e.g., post, protrusion, shaft, etc.) that defines a pivot or rotation axis and a second rod (e.g., post, protrusion, shaft, etc.) positioned parallel to the first rod. The flashlight assembly may include a first recess (e.g., opening, socket, depression, etc.) shaped to engage and pivot about the first rod of the flashlight mounting socket; a second recess (e.g., opening, socket, depression, etc.) shaped to engage the second rod of the flashlight mounting socket, the second recess being oriented such that the first recess cannot disengage the first rod when the second recess is engaged with the second rod; and a movable latch (e.g., hook, coupler, etc.) configured to selectively retain the second recess in engagement with the second rod.


In some implementations, the head-mounted display device further includes a spring or other biasing mechanism that biases the movable latch toward a position that retains the second recess in engagement with the second rod.


In some implementations, the flashlight assembly includes a flashlight (e.g., headlight, headlamp, etc.); a mounting base that includes the first recess, the second recess, and the movable latch; and one or more arms that pivotably or otherwise rotatably couple the flashlight to the mounting base.


In some implementations, the see-through display assembly includes a display assembly frame; a waveguide lens coupled to the display assembly frame; an anterior lens affixed to the waveguide lens or to the display assembly frame in front of the waveguide lens; a posterior lens frame detachably coupled to the display assembly frame using at least one of: a snap fit, a friction fit, a magnetic attachment, a hook-and-fastener attachment, or a clip; and a posterior lens affixed to the posterior lens frame.


In accordance with several implementations, a head-mounted display device includes a frame extending from a first end to a second end, the first end configured to be positioned adjacent a first temple of a wearer (e.g., a surgeon or other user), and the second end configured to be positioned adjacent a second temple of the wearer; a head mounting assembly configured to retain the frame in a position on a head of the wearer; a see-through display; and a tilting assembly (e.g., pantoscopic tilting assembly) that rotatably, pivotably, movably, or slidably couples the see-through display to the frame such that a tilt angle (e.g., pantoscopic tilt angle) can be adjusted. The tilting assembly (e.g., pantoscopic tilting assembly) includes an arc-shaped slot that rotatably, pivotably, movably, or slidably couples a portion of the see-through display assembly to a portion of the frame. The tilting assembly also includes a detent mechanism including a spring-loaded pin or ball and a plurality of detents, the detent mechanism configured to selectively retain the see-through display assembly in any of a plurality of predefined positions with respect to the frame.


In some implementations, the detent mechanism further includes a guide member slidably engaged with the arc-shaped slot, the guide member configured to apply a force to the spring-loaded pin or ball to move the spring-loaded pin or ball from one of the plurality of detents to another of the plurality of detents.


In some implementations, the arc-shaped slot defines a virtual hinge that includes an axis of rotation that is configured to be aligned with a center of an eyeball of the wearer. In some implementations, the virtual hinge, as opposed to a physical hinge, advantageously allows the peripheral vision of the wearer to remain unobstructed and undistorted.


In some implementations, the head mounting assembly includes an adjustable strap assembly including a first side strap having a first end coupled to the first end of the frame; a second side strap having a first end coupled to the second end of the frame; and an adjustment mechanism (e.g., a knob, a rack and pinion, a mechanism that uses friction, etc.) configured to adjust a position of a second end of the first side strap with respect to a second end of the second side strap, in order to adjust a circumferential size defined by the first side strap, the second side strap, and the frame.


In some implementations, the head-mounted display device further includes a forehead support including a first end pivotably coupled to the first side strap; a second end pivotably coupled to the second side strap; and a central support coupled to the frame.


In some implementations, the head-mounted display device further includes a first temple housing pivotably coupled to the first end of the frame and slidably coupled to the head mounting assembly; a second temple housing pivotably coupled to the second end of the frame and slidably coupled to the head mounting assembly; and one or more processors configured to (e.g., upon execution of program instructions stored on a non-transitory computer readable medium) render images for display by the see-through display. In some implementations, at least one of the one or more processors is positioned within the first temple housing or the second temple housing, although they may be in other locations as well.


In some implementations, the head mounting assembly includes a first temple arm coupled to the frame and configured to be placed over a first ear of the wearer; and a second temple arm coupled to the frame and configured to be placed over a second ear of the wearer.


In some implementations, the frame further includes a nose pad (e.g., nose support member, etc.) configured to engage a nose of the wearer.


In accordance with several implementations, a head-mounted display device includes a frame extending from a first end to a second end, the first end configured to be positioned adjacent a first temple of a wearer (e.g., a surgeon or other user), and the second end configured to be positioned adjacent a second temple of the wearer. The head-mounted display device also includes an adjustable strap assembly including a first side strap having a first end coupled to the first end of the frame; a second side strap having a first end coupled to the second end of the frame; and an adjustment mechanism (e.g., a knob, a rack and pinion, a mechanism that uses friction, etc.) configured to adjust a position of a second end of the first side strap with respect to a second end of the second side strap, in order to adjust a circumferential size defined by the first side strap, the second side strap, and the frame. The head-mounted display device further includes a first temple housing pivotably coupled to the first end of the frame and slidably coupled to the first side strap of the adjustable strap assembly and a second temple housing pivotably coupled to the second end of the frame and slidably coupled to the second side strap of the adjustable strap assembly. The head-mounted display device also includes a see-through display coupled to the frame.


In some implementations, the head-mounted display device further includes a first follower (e.g., protrusion, slidable member, etc.) that slidably couples the first temple housing to the first side strap; and a second follower (e.g., protrusion, slidable member, etc.) that slidably couples the second temple housing to the second side strap.


In some implementations, the first follower includes an elongate protrusion that can slide forward and backward within an elongate slot of the first temple housing responsive to pivoting of the first temple housing with respect to the frame. The second follower may include an elongate protrusion that can slide forward and backward within an elongate slot of the second temple housing responsive to pivoting of the second temple housing with respect to the frame.


In some implementations, each of the first side strap and the second side strap includes a rack, and the adjustment mechanism of the adjustable strap assembly includes a pinion engaged with the rack of the first side strap and with the rack of the second side strap, and a knob configured to cause rotation of the pinion in order to adjust the circumferential size defined by the first side strap, the second side strap, and the frame.


In some implementations, the adjustment mechanism of the adjustable strap assembly further includes a tension mechanism (e.g., stop mechanism, one or more gears engaged with a tension member, etc.) that resists rotation of the knob until a threshold force is overcome.


In some implementations, the adjustment mechanism of the adjustable strap assembly further includes a pad configured to engage a back of a head of the wearer.


In some implementations, the head-mounted display device further includes a forehead support including a first end pivotably coupled to the first side strap; a second end pivotably coupled to the second side strap; and a central support coupled to the frame.


In some implementations, the head-mounted display device further includes a top strap removably coupled at a first end to the forehead support and at a second end to the adjustment mechanism of the adjustable strap assembly.


In some implementations, the frame further includes a nose pad (e.g., nose support member, etc.) configured to engage a nose of the wearer.


In accordance with several implementations, a head-mounted display device includes a frame extending from a first end to a second end, the first end configured to be positioned adjacent a first temple of a wearer (e.g., a surgeon or other user), and the second end configured to be positioned adjacent a second temple of the wearer. The head-mounted display device further includes a head mounting assembly configured to retain the frame in a position on a head of the wearer; a see-through display; and a flashlight assembly (e.g., headlamp assembly, headlight assembly, etc.) detachably coupled to the frame. The frame includes a flashlight mounting socket including a first rod (e.g., post, protrusion, shaft, etc.) that defines a pivot axis; and a second rod (e.g., post, protrusion, shaft, etc.) positioned parallel to the first rod. The flashlight assembly includes a first recess (e.g., opening, socket, depression, etc.) shaped to engage and pivot about the first rod of the flashlight mounting socket; a second recess (e.g., opening, socket, depression, etc.) shaped to engage the second rod of the flashlight mounting socket, the second recess being oriented such that the first recess cannot disengage the first rod when the second recess is engaged with the second rod; and a movable latch (e.g., hook, coupler, etc.) configured to selectively retain the second recess in engagement with the second rod.


In some implementations, the head-mounted display device further includes a spring or other biasing structure that biases the movable latch toward a position that retains the second recess in engagement with the second rod.


In some implementations, the flashlight assembly includes a flashlight (e.g., headlight, headlamp, etc.); a mounting base that includes the first recess, the second recess, and the movable latch; and one or more arms that pivotably couple the flashlight to the mounting base.


In some implementations, the flashlight mounting socket further includes one or more electrical contacts configured to electrically couple to a corresponding one or more electrical contacts of the flashlight assembly.


In some implementations, the frame further includes a nose pad (e.g., nose support member, etc.) configured to engage a nose of the wearer.


In accordance with several implementations, a head-mounted display device includes a frame extending from a first end to a second end, the first end configured to be positioned adjacent a first temple of a wearer (e.g., a surgeon or other user), and the second end configured to be positioned adjacent a second temple of the wearer. The head-mounted display device further includes a head mounting assembly configured to retain the frame in a position on a head of the wearer. The head-mounted display device also includes a see-through display assembly (e.g., augmented-reality display, stereoscopic display, glasses, visor, etc.) coupled to the frame. The see-through display assembly includes a display assembly frame; a waveguide lens coupled to the display assembly frame; an anterior lens affixed to the waveguide lens or to the display assembly frame in front of the waveguide lens; a posterior lens frame detachably coupled to the display assembly frame using at least one of a snap fit, a friction fit, or a clip; and a posterior lens affixed to the posterior lens frame.


In some implementations, the head-mounted display device further includes a first seal between the anterior lens and the waveguide lens and a second seal between the posterior lens frame and the waveguide lens.


In some implementations, the posterior lens frame includes a first protrusion (e.g., clip, snap, etc.) at a top of the posterior lens frame that fits into a first corresponding recess (e.g., opening, hole, slot, etc.) of the display assembly frame. The posterior lens frame includes a second protrusion (e.g., clip, snap, etc.) at a bottom of the posterior lens frame that forms a snap fit with a second corresponding recess (e.g., opening, hole, slot, etc.) of the display assembly frame.


In some implementations, the see-through display assembly is coupled (e.g., rotatably, pivotably, movably, or slidably coupled) to the frame such that a tilt angle (e.g., pantoscopic tilt angle) can be adjusted.


In some implementations, the see-through display assembly is coupled (e.g., rotatably, pivotably, movably, or slidably coupled) to the frame with a pantoscopic tilting assembly that includes an arc-shaped slot that rotatably, pivotably, movably, or slidably couples a portion of the see-through display assembly to a portion of the frame and a detent mechanism including a spring-loaded pin or ball and a plurality of detents, the detent mechanism configured to selectively retain the see-through display assembly in any of a plurality of predefined positions with respect to the frame.


In some implementations, the detent mechanism further includes a guide member slidably engaged with the arc-shaped slot, the guide member configured to apply a force to the spring-loaded pin or ball to move the spring-loaded pin or ball from one of the plurality of detents to another of the plurality of detents.


In some implementations, the frame further includes a nose pad (e.g., nose support member, etc.) configured to engage a nose of the wearer.


In accordance with several implementations, a system includes a head-mounted display (HMD), including a frame configured to be mounted on a head of a user (e.g., a surgeon or other user); and a display (including, e.g., an augmented-reality display, a stereoscopic display, glasses, goggles, a head-up display, a visor, etc.), which is (i) connected to the frame and configured to rotate relative to a frontal plane of the user for setting a tilt angle (e.g., pantoscopic tilt angle) of the display, and (ii) at least partially transparent and configured to display to the user, an augmented reality (AR) image including a virtual reality (VR) image presented over a scene on a body of a patient; and at least one processor configured to (i) receive one or more anatomical images of the patient and signals indicative of at least the position of the display relative to the scene; and (ii) render the AR image to the display.


In some implementations, the frame includes an adjustable head mounting assembly including adjustable temple arms, which are placed over ears of the user and are configured to be adjusted for conforming with at least a section of temples of the user; an adjustable nose pad (e.g., nose support member, etc.), which is placed over a nose of the user and is configured to conform to a shape of at least part of the nose; and a housing, which is connected to the temple arms and nose pad.


In some implementations, at least one of the temple arms includes first and second sections and first and second tilting assemblies, wherein the first tilting assembly is configured to tilt the first section relative to the frame, and the second tilting assembly is configured to rotate the second section relative to the first section.


In some implementations, the second tilting assembly includes a rocker arm, which is configured to rotate about a hinge relative to a longitudinal axis of the first section.


In some implementations, the first section has an opening, and when the rocker arm is rotating, the opening is configured to contain at least part of the rocker arm.


In some implementations, the rocker arm includes a cushion, which is made from a viscoelastic foam shaped to conform with a nape of the user and to improve a clamping between the frame of the HMD and the head of the user.


In some implementations, the cushion includes a material selected from a list of materials consisting essentially of at least one of (i) silicone, (ii) neoprene, and (iii) polyurethane.


In some implementations, the cushion includes a sponge.


In some implementations, the second tilting assembly includes an alloy, which is coated by a viscoelastic foam and is shaped to conform with a nape of the user.


In some implementations, (i) the first and second sections, (ii) the first and second tilting assemblies, and (iii) the nose pad, are adapted to provide the user with two or more degrees of freedom (DOFs) for adjusting the frame to conform with a contour of the head of the user.


In some implementations, the second tilting assembly includes an array of rocker arms, each of the rocker arms is configured to rotate about a respective hinge.


In some implementations, the array of rocker arms is mounted on a common bar, an additional hinge is provided, and the common bar is configured to rotate about the additional hinge.


In some implementations, the frame includes an adjustable head mounting assembly including a first side strap having a first end coupled to a first end of the frame; a second side strap having a first end coupled to a second end of the frame; and an adjustment mechanism (e.g., a knob, a rack and pinion, a mechanism that uses friction, etc.) configured to adjust a position of a second end of the first side strap with respect to a second end of the second side strap, in order to adjust a circumferential size defined by the first side strap, the second side strap, and the frame.


In some implementations, the system further includes a first temple housing pivotably coupled to the first end of the frame and slidably coupled to the first side strap; and a second temple housing pivotably coupled to the second end of the frame and slidably coupled to the second side strap.


In some implementations, at least one eye of the user has a first optical axis, an optical engine of the display has a second optical axis, and the pantoscopic tilt angle is set for aligning the second optical axis with the first optical axis.
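
As an illustration of the axis-alignment idea (not the patented mechanism), the sketch below computes the signed angle between the eye's optical axis and the display's optical axis expressed as unit vectors in a common plane; tilting the display by this angle brings the two axes into alignment. The function and its inputs are hypothetical.

```python
import math

def axis_misalignment_deg(eye_axis, display_axis) -> float:
    """Signed angle (degrees) from the eye's optical axis to the display's optical axis.

    Both axes are 2D unit vectors in a common plane (e.g., the sagittal plane).
    """
    ex, ey = eye_axis
    dx, dy = display_axis
    # atan2 of the 2D cross and dot products gives the signed angle between the vectors.
    return math.degrees(math.atan2(ex * dy - ey * dx, ex * dx + ey * dy))

# Example: a display axis pitched 8 degrees above the line of sight; tilting the
# display down by this amount aligns the two axes.
print(round(axis_misalignment_deg((1.0, 0.0),
                                  (math.cos(math.radians(8)), math.sin(math.radians(8)))), 1))  # 8.0
```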


In some implementations, the HMD includes a pantoscopic-tilting assembly (PTA), which is connected to the frame and the display, and is configured to rotate the second optical axis relative to the first optical axis for adjusting the pantoscopic tilt angle.


In some implementations, the PTA includes a hinge connecting the first and second optical axes.


In some implementations, the PTA includes a bar, which is coupled to the optical engine including the display, and an edge of the bar is coupled to the hinge, and wherein the hinge is adapted to rotate the bar relative to the first axis of the frame.


In some implementations, the PTA includes a virtual axis including at least two parts other than a hinge, and adapted to rotate the second optical axis relative to the first optical axis.


In some implementations, the virtual axis includes (i) a bar coupled to an optical engine including the display and to a rotatable section of a disc having a slit, and (ii) an element configured to be inserted into the slit and to be moved along the slit and relative to the slit in a tangential direction.


In some implementations, the virtual axis includes an arm including first, second and third sections, wherein the first section is coupled to an optical engine including the display, the second section is coupled to the frame, and the third section is coupled between the first and second sections and is configured to bend in response to a force applied to the PTA for adjusting the pantoscopic tilt angle.


In some implementations, the virtual axis includes (i) a rigid arm coupled to the frame, and (ii) a flexible arm, which is coupled to an optical engine including the display, wherein the flexible arm is adapted to transform from an elastic deformation to a plastic deformation, and to retain a shape obtained in response to a force applied to the PTA for adjusting the pantoscopic tilt angle.


In some implementations, the PTA further includes a detent mechanism including a spring-loaded pin or ball and a plurality of detents, the detent mechanism configured to selectively retain the display in any of a plurality of predefined positions with respect to the frame.


In accordance with several implementations, a head-mounted display device includes a frame extending from a first end to a second end; a head mounting assembly configured to be adjustable and to retain the frame in a position on a head of the user (e.g., a surgeon or other user); a display (including, e.g., an augmented-reality display, a stereoscopic display, glasses, a visor, etc.) that is at least partially transparent and configured to display to the user an augmented reality (AR) image including a virtual reality (VR) image presented over a scene on a body of a patient; a pantoscopic tilting assembly that pivotably couples (e.g., rotatably, pivotably, movably, or slidably couples) the display to the frame such that a pantoscopic tilt angle can be adjusted; a first temple housing pivotably coupled to the first end of the frame by a first tilting assembly; and a second temple housing pivotably coupled to the second end of the frame by a second tilting assembly. At least one of the first temple housing or the second temple housing includes at least one processor configured to receive one or more anatomical images of the patient and signals indicative of at least a position of the display relative to the scene, and to render the AR image to the display.


In some implementations, the head mounting assembly includes an adjustable head strap configured to engage at least a back of a head of the user and a forehead of the user.


In some implementations, the first temple housing and the second temple housing are each part of the head mounting assembly.


In some implementations, the head mounting assembly further includes an adjustable nose pad (e.g., nose support member, etc.) configured to conform to a shape of at least a part of a nose of the user.


In some implementations, the pantoscopic tilting assembly includes a slot positioned to enable a pivoting element to slide tangential to the slot, the pivoting element configured to be locked at one or more predefined positions with respect to the slot.


In accordance with several implementations, a method of pairing head-mounted displays and workstations in medical center networks includes introducing a communication device of a head-mounted display into a first network; initiating pairing of the communication device of the head-mounted display to a first workstation of the first network using previous connection parameters, when the communication device of the head-mounted display is previously known to the first network. The method also includes initiating pairing of the communication device of the head-mounted display to the first workstation of the first network using a key exchanging process that generates new connection parameters, when the communication device of the head-mounted display is not previously known to the first network. The method further includes exchanging data between the first workstation and the communication device of the head-mounted display during a surgical operation, to enable the head-mounted display to display to a user an augmented reality (AR) image including a virtual reality (VR) image presented over a scene on a body of a patient. In some implementations, the head-mounted display comprises any of the head-mounted displays or head-mounted display devices disclosed herein.


In some implementations, the method further includes responsive to pairing not being completed successfully within a predefined time limit, initiating pairing of the communication device of the head-mounted display to a second workstation of the first network.


In some implementations, the method further includes unpairing the communication device of the head-mounted display from the first workstation of the first network; initiating pairing of the communication device of the head-mounted display to a second workstation of a second network; and exchanging data between the second workstation and the communication device of the head-mounted display during a surgical operation, to enable the head-mounted display to display to a user an augmented reality (AR) image including a virtual reality (VR) image presented over a scene on a body of a patient.
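
A minimal sketch of the pairing flow described above is shown below, assuming hypothetical object and method names (none of which correspond to a specific API in the patent): stored connection parameters are reused for a device already known to the network, a key exchange generating new parameters is run for an unknown device, and pairing falls back to the next workstation if it does not complete within a time limit.

```python
import time

PAIRING_TIMEOUT_S = 30  # assumed time limit, for illustration only

def pair_hmd(hmd_radio, network, workstations, stored_params):
    """Pair the HMD's communication device to a workstation on the given network."""
    for ws in workstations:
        params = stored_params.get((network.id, hmd_radio.id))
        if params is not None:
            # Device previously known to this network: reuse prior connection parameters.
            session = ws.connect(hmd_radio, params)
        else:
            # Unknown device: run a key exchange that generates new connection parameters.
            params = ws.key_exchange(hmd_radio)
            stored_params[(network.id, hmd_radio.id)] = params
            session = ws.connect(hmd_radio, params)
        deadline = time.time() + PAIRING_TIMEOUT_S
        while time.time() < deadline:
            if session.is_paired():
                return session  # data exchange (images, pose signals) can now begin
            time.sleep(0.5)
        # Pairing not completed within the time limit: fall back to the next workstation.
    raise RuntimeError("Pairing failed on all available workstations")
```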


Also described and contemplated herein is the use of any of the apparatus, systems, or methods for the treatment of a spine through a surgical intervention.


Also described and contemplated herein is the use of any of the apparatus, systems, or methods for the treatment of an orthopedic joint through a surgical intervention, including, optionally, a shoulder, a knee, an ankle, a hip, or other joint.


Also described and contemplated herein is the use of any of the apparatus, systems, or methods for the treatment of a cranium through a surgical intervention.


Also described and contemplated herein is the use of any of the apparatus, systems, or methods for the treatment of a jaw through a surgical intervention.


Also described and contemplated herein is the use of any of the apparatus, systems, or methods for the diagnosis of a spinal abnormality.


Also described and contemplated herein is the use of any of the apparatus, systems, or methods for the diagnosis of a spinal injury.


Also described and contemplated herein is the use of any of the apparatus, systems, or methods for the diagnosis of joint damage.


Also described and contemplated herein is the use of any of the apparatus, systems, or methods for the diagnosis of an orthopedic injury.


Also described and contemplated herein is the use of any of the apparatus, systems, or methods in non-medical applications, such as gaming, driving, product design, shopping, manufacturing, athletics or fitness, navigation, remote collaboration, and/or education.


For purposes of summarizing the disclosure, certain aspects, advantages, and novel features are discussed herein. It is to be understood that not necessarily all such aspects, advantages, or features will be embodied in any particular embodiment of the disclosure, and an artisan would recognize from the disclosure herein a myriad of combinations of such aspects, advantages, or features. The embodiments disclosed herein may be embodied or carried out in a manner that achieves or optimizes one advantage or group of advantages as taught or suggested herein without necessarily achieving other advantages as may be taught or suggested herein. The systems and methods of the disclosure each have several innovative aspects, no single one of which is solely responsible for the desirable attributes disclosed herein.


The present disclosure will be more fully understood from the following detailed description of the embodiments thereof, taken together with the drawings, claims and descriptions above. A brief description of the drawings follows.





BRIEF DESCRIPTION OF THE DRAWINGS

Non-limiting features of some embodiments of the inventions are set forth with particularity in the claims that follow. The following drawings are for illustrative purposes only and show non-limiting embodiments. Features from different figures may be combined in several embodiments. It should be understood that the figures are not necessarily drawn to scale. Distances, angles, etc. are merely illustrative and do not necessarily bear an exact relationship to actual dimensions and layout of the devices illustrated.



FIG. 1 is a schematic pictorial illustration of an augmented-reality based image-guided surgery system comprising a head mounted display (HMD).



FIGS. 2A and 2B are schematic pictorial illustrations of HMDs for use in the system of FIG. 1.



FIG. 3 is a schematic pictorial illustration of an HMD comprising a waveguide-based optical engine (OE) having a liquid crystal display (LCD) and a closed-loop control assembly for controlling light intensity in the OE.



FIGS. 4-8 are schematic pictorial illustrations of headlight assemblies (HAs) for use in any of the HMDs shown in FIGS. 1, 2A, 2B, 3, 21A-29E, and 31A-32.



FIG. 9 is a schematic pictorial illustration of a tilting assembly of a self-adjustable temple arm for use in any of the HMDs shown in FIGS. 1, 2A, 2B, 3, 21A-29E, and 31A-32.



FIGS. 10A and 10B are schematic pictorial illustrations of additional multiple Degrees Of Freedom (DOF) implementations in one or more tilting assemblies comprising self-adjustable and ergonomic temple arms and rocker arms for use in any of the HMDs shown in FIGS. 1, 2A, 2B, 3, and 29A.



FIG. 11 is a schematic pictorial illustration of another multiple-DOF implementation in one or more respective tilting assemblies of a self-adjustable and ergonomic temple arm for use in any of the HMDs shown in FIGS. 1, 2A, 2B, 3, and 29A.



FIG. 12 is a schematic pictorial illustration of a nose pad for use in any of the HMDs shown in FIGS. 1, 2A, 2B, 3, 21A-29E, and 31A-32.



FIGS. 13 and 14 are schematic pictorial illustrations of DOF implementations in pantoscopic tilting assemblies (PTAs) integrated with an optical engine for use in any of the HMDs shown in FIGS. 1, 2A, 2B, 3, 21A-29E, and 31A-32.



FIGS. 15A, 15B, and 15C are schematic pictorial illustrations of use cases of the PTA of FIG. 14 used in several types of surgical procedures.



FIG. 16 is a schematic pictorial illustration of a structured light projector for use in any of the HMDs shown in FIGS. 1, 2A, 2B, 3, 21A-29E, and 31A-32.



FIG. 17 is a diagram that schematically illustrates a method for generating a synchronously exposed image by applying an illumination strobe to a rolling shutter image sensor within an integration time of the rolling shutter for use in any of the HMDs shown in FIGS. 1, 2A, 2B, 3, 21A-29E, and 31A-32.



FIG. 18 is a schematic pictorial illustration of a direct pairing system for directly pairing between a workstation (WS) and any of the HMDs shown in FIGS. 1, 2A, 2B, 3, 21A-29E, and 31A-32.



FIG. 19 is a flow chart that schematically illustrates a method for directly pairing between the WS of FIG. 18 and multiple HMDs selected from any of the HMDs shown in FIGS. 1, 2A, 2B, 3, 21A-29E, and 31A-32.



FIG. 20 is a block diagram that schematically illustrates an implementation of an electronic architecture of the system of FIG. 1.



FIGS. 21A-21D illustrate another embodiment of a head-mounted display for use in the system of FIG. 1.



FIGS. 22A-22B and 23A-23D illustrate additional details of adjustment mechanisms of the head-mounted display of FIG. 21A.



FIGS. 24A-24B, 25, and 26A-26C illustrate embodiments of additional padding and/or straps that may be incorporated into any of the head-mounted displays disclosed herein.



FIGS. 27A-27D illustrate another embodiment of a headlight assembly for use with any of the head-mounted displays disclosed herein.



FIGS. 28A-28F illustrate another embodiment of a headlight assembly for use with any of the head-mounted displays disclosed herein.



FIGS. 29A-29E illustrate another embodiment of a head-mounted display incorporating a clip-on lens assembly that can be used with any of the head-mounted displays disclosed herein.



FIG. 30 is a schematic diagram illustrating different lenses and frames for use with the clip-on lens assembly of FIG. 29A.



FIGS. 31A-31E illustrate another embodiment of a clip-on lens assembly that can be used with any of the head-mounted displays disclosed herein.



FIG. 32 illustrates an embodiment of an AR display having changeable lenses that can be used with any of the head-mounted displays disclosed herein.





DETAILED DESCRIPTION

Some image-guided surgery systems may apply augmented reality (AR) techniques for displaying, over structures inside a patient's body intended to be operated on, one or more anatomical images of the structures (e.g., bones, joints, soft tissue, organs, cartilage). For example, the system may comprise a suitable head-mounted display (HMD), which is configured to display to a surgeon or other wearer, three-dimensional (3D) anatomical images, two-dimensional (2D) or 3D cross sections, tool trajectory, tool depth and additional information that assists the surgeon or other wearer to visualize structures (e.g., vertebrae, joints, bones, soft tissue, organs) that are hidden from actual view by overlying layers of tissue (for example, but not by way of limitation, during a minimally invasive interventional procedure or surgery that does not require open surgery to expose a target region of the body).


Several embodiments of the disclosure that are described herein provide assemblies and methods that may be implemented in conjunction with several types of HMDs for improving the quality of image-guided surgical or other interventional procedures, including spinal surgery and other sorts of orthopedic procedures (as well as other types or categories of procedures, such as dental procedures, cranial procedures, neurological procedures, joint surgery (e.g., shoulder, knee, hip, ankle, other joints), heart surgery, bariatric surgery, facial bone surgery, neurosurgery, and the like), including minimally invasive procedures that do not require open surgery but can be performed through small incisions (e.g., self-sealing incisions that do not require staples or stitches). Note that each of the HMDs described hereinafter may comprise a basic configuration and additional optional components and assemblies that may be implemented in one or more of the HMDs in addition to the basic configuration.


The systems, devices and methods described may be used in connection with other medical procedures (including therapeutic and diagnostic procedures) and with other instruments and devices or other non-medical display environments. The methods described herein further include the performance of the medical procedures (including but not limited to performing a surgical intervention such as treating a spine, shoulder, hip, knee, ankle, other joint, jaw, cranium, etc.). Although medical applications are well-suited for several embodiments, non-medical applications also benefit from many embodiments described herein. For example, non-medical applications may involve consumer or commercial applications such as athletics and fitness, gaming, driving, product design, navigation, manufacturing, logistics, shopping and commerce, educational training, remote collaboration, etc. The surgeon referenced herein may be a consumer or other wearer or user.



FIG. 1 is a schematic pictorial illustration of an augmented-reality (AR) based image-guided system 11 used to carry out a surgical or other interventional medical procedure, in accordance with an embodiment of the disclosure.


In one example, the surgical procedure (e.g., minimally invasive surgical procedure or open surgical procedure) comprises one or more orthopedic procedures performed on one or more vertebrae of the spine of a human subject, referred to herein as a patient 23, who is lying on an operating table 12. One example application is a Lateral Lumbar Interbody Fusion (LLIF) procedure for treating disc problems in the lower (e.g., lumbar or lumbosacral) back of patient 23. In other embodiments, the techniques described below are applicable, mutatis mutandis, to other sorts of surgical procedures carried out on other vertebrae (e.g., lumbar vertebrae, thoracic vertebrae, cervical vertebrae, sacral vertebrae), any suitable organ, bone(s), joints (e.g., sacroiliac joints, knee joints, shoulder joints, ankle joints, hip joints) or other tissue of patient 23. In non-limiting examples, system 11 may be used in other sorts of procedures performed on bone tissue of patient 23, such as in cranial procedures, oral procedures and in maxillofacial surgery, knee surgery, hip surgery, shoulder surgery. Moreover, system 11 may also be used, mutatis mutandis, in surgical or other interventional (e.g., therapeutic or diagnostic) procedures of soft tissue (e.g., neuro procedures, cranial procedures, joint repair or reconstruction procedures, scoping procedures, arthroscopic procedures, ablation procedures, etc.). System 11 may also be used in non-medical applications, including consumer or commercial applications such as athletics and fitness, gaming, driving, product design, navigation, manufacturing, logistics, shopping and commerce, educational training, remote collaboration, etc. and the wearer may be a person other than a surgeon or medical professional (such as a consumer or other user).


In some embodiments, during the procedure, a medical professional, also referred to herein as a physician or a surgeon 26, uses a suitable tool for making an incision 24 into the patient's back. In some embodiments, surgeon 26 inserts an anchoring device, such as spinous process clamp 30 into the incision 24, so that opposing jaws of the clamp 30 are located on opposite sides of the spinous processes. Subsequently, surgeon 26 slides the clamp 30 over the vertebral laminas, and adjusts the clamp 30 to grip one or more spinous processes, selected by the surgeon 26, of the patient 23. One optional implementation of clamp 30 is described in more detail in PCT Application Publication WO 2022/079565, whose disclosure is incorporated herein by reference.


In some embodiments, clamp 30 acts as a support for a patient marker 38, which is attached rigidly to the clamp. During substantially all of the procedure, e.g., during the initial as well as the subsequent stages, patient marker 38 serves as a fiducial for patient 23, since, due to its rigid connection to the patient 23, any movement of the patient 23 is reflected in a corresponding motion of the patient marker 38. Thus, at an initial stage of the procedure, marker 38 is registered with the anatomy of patient 23.


In some embodiments, the anchoring device may be a pin inserted into a bone of the patient, e.g., iliac bone. One optional implementation of such a pin is described in more detail in PCT Application Publication No. WO 2023/281395, whose disclosure is incorporated herein by reference.


Embodiments related to registration tools, markers, marks, adaptors, and methods are described in detail, for example, in U.S. Patent Application Publication 2022/0071712, U.S. Patent Application Publication 2022/0142730, U.S. Pat. No. 10,939,977, and U.S. Patent Application Publication 2021/0161614, whose disclosures are all incorporated herein by reference.


In some embodiments, system 11 comprises (i) a head-mounted display (HMD) 22, which is worn by surgeon 26 and is described in detail hereinafter, (ii) one or more surgical and/or diagnostic tools, such as but not limited to a surgical tool 190, and (iii) one or more reflectors, such as a reflector 194, mounted on tool 190. The reflectors may comprise markers for registration and/or calibration purposes.


Reference is now made to an inset 13 showing one optional implementation of HMD 22 shown in a front view.


In the context of the present disclosure, the term “front view” refers to the view of HMD 22 as seen by the eyes of a person located in front of surgeon 26 wearing HMD 22. In the example of FIG. 1, HMD 22 comprises a visor-based optical engine, a processor 33, and a communication device (shown and described in connection with FIGS. 2B and 18 below) configured to exchange signals or transmissions with entities external to HMD 22, such as a workstation and a remote control tablet described in connection with FIG. 18 below. Typically, the communication device is configured to send and receive the signals or transmissions using suitable wireless techniques, such as but not limited to Wi-Fi and Bluetooth wireless communication protocols or standards. HMD 22 further comprises one or more power supply devices, such as but not limited to a battery (shown in FIG. 20 below) and a supercapacitor or ultracapacitor described in more detail below.


In some embodiments, processor 33 is configured to receive information, such as anatomical images, and signals from one or more sensors (described below) and other entities of system 11, and to display to surgeon 26 one or more images overlaid on the surgeon's actual view of a portion of the exterior of the patient's body. For example, during a spinal surgery, processor 33 is configured to produce an augmented reality (AR) display that may show 3D images of the vertebrae overlaid on the patient's back, as seen by the surgeon's eyes. Certain embodiments related to the images, signals and AR display are described in more detail below.


In some embodiments, HMD 22 comprises, for each eye of surgeon 26, a visor 14 of a visor-based optical engine; the optical engine is not shown in FIG. 1, but example implementations thereof are shown and described hereinafter in connection with FIG. 2A below.


In some embodiments, the optical engine (OE) comprises (i) a projector configured to project the AR image produced by the processor, and (ii) optics configured to direct the projected AR image to the visor, also referred to herein as an AR display 15.


In various embodiments, the projector comprises one or more light sources and/or image sources. As one example, the projector comprises an organic light-emitting diode (OLED)-based image source and display comprising a matrix of LEDs having a total size (e.g. diagonal size) of about 0.5 inch. Other sizes of displays may also be implemented.


In the context of the present disclosure and in the claims, the term “AR image” and grammatical variations thereof refer to a virtual reality (VR) image displayed over or integrated with a display including at least partially transparent portions and having a scene in the background, so that a combination of the VR image and the scene is referred to herein as the AR image.


In some embodiments, AR display 15 is configured to display to surgeon 26, the AR image produced by processor 33 by reflecting the AR image into the pupil of the eye of surgeon 26. The optical engine is shown and described in connection with FIG. 2A below.


In other embodiments, the OE of HMD 22 may have different configurations and may be based on different techniques, such as but not limited to a waveguide and liquid crystal-based OE described in more detail in connection with FIGS. 3 and 29A-29E below.


In some embodiments, HMD 22 comprises one or more light sources for tracking applications configured to direct light beams to the surface of the organ or treatment region in question (e.g., back) of patient 23. In some embodiments, the light source comprises a pair of infrared (IR) LED projectors 17 configured to direct IR light beams to the surface of the treatment region. In other embodiments, the light source may comprise any other suitable type of one or more light sources, configured to direct any suitable wavelength or band of wavelengths of light, and mounted on HMD 22 or elsewhere in the operating room.


In some embodiments, HMD 22 comprises a camera 16. In some embodiments, camera 16 comprises a red green blue (RGB) camera having an IR-pass filter, referred to herein as an IR camera and also referred to herein as an IR tracker. In other embodiments, camera 16 may comprise a monochrome camera configured to operate in the IR wavelengths. Camera 16 is configured to capture images including reflectors 194, patient marker 38, and other markers (not shown) attached to patient 23. Although camera 16 in FIG. 1 is mounted on HMD 22, these images may additionally, or alternatively, be captured by a suitable camera mounted at any other suitable position on the head or body of surgeon 26 or mounted at any suitable position in the operating room. Camera 16 is configured to produce signals indicative of the captured images, also referred to herein as infrared (IR) images for embodiments incorporating an IR camera, and to send them to processor 33. In such embodiments, processor 33 is configured to process the IR images acquired by camera 16, in order to calculate the location and orientation of the tracking system reference point of HMD 22 relative to the one or more surgical tools (e.g., surgical tool 190) and to the organ, bone, joint, or other target treatment region or location in question.
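
By way of a non-limiting illustration only, the following minimal sketch shows one conventional way such a calculation could be carried out: estimating the pose of a single marker from the spots detected in an IR image by solving a Perspective-n-Point problem. The marker geometry, detected pixel coordinates, and camera intrinsics below are hypothetical placeholders, and the OpenCV-based approach is an assumption made for the example rather than the specific method of this disclosure.

```python
# Illustrative sketch only: estimating the pose of a retro-reflective marker
# from a single IR image, assuming a pinhole camera model and a known marker
# geometry. All numeric values are hypothetical placeholders.
import numpy as np
import cv2

# Known 3D positions of the marker's reflective spheres in the marker frame (mm).
marker_points_3d = np.array([
    [0.0, 0.0, 0.0],
    [50.0, 0.0, 0.0],
    [0.0, 40.0, 0.0],
    [35.0, 55.0, 0.0],
], dtype=np.float64)

# 2D centroids of the same spheres as detected in the IR image (pixels).
detected_points_2d = np.array([
    [612.3, 402.1],
    [688.0, 399.7],
    [615.9, 341.5],
    [671.4, 318.2],
], dtype=np.float64)

# Assumed pinhole intrinsics of the IR tracking camera.
camera_matrix = np.array([
    [900.0, 0.0, 640.0],
    [0.0, 900.0, 360.0],
    [0.0, 0.0, 1.0],
])
dist_coeffs = np.zeros(5)  # assume lens distortion is already corrected

# Solve the Perspective-n-Point problem: pose of the marker in the camera frame.
ok, rvec, tvec = cv2.solvePnP(marker_points_3d, detected_points_2d,
                              camera_matrix, dist_coeffs)
if ok:
    rotation, _ = cv2.Rodrigues(rvec)  # 3x3 rotation matrix
    print("marker position in camera frame (mm):", tvec.ravel())
```

In practice, the detected spots would first be segmented from the IR image, and the resulting pose estimate would typically be combined with the inertial measurements described below.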


In several embodiments, the tracking application that is based on the images produced by camera 16 requires monochromatic images. In some embodiments, camera 16 comprises a color image sensor. The addition of colors in the tracking images may, in at least some instances, lower image quality due to the de-bayering interpolation applied to the color pixels for producing a contiguous image based on the separated RGB pixels (one red, two green and one blue pixel per 2×2 cell) of the Bayer filter of the RGB image sensor.


In some embodiments, camera 16 comprises compact sensors (such as sensors designed for consumer products) having a color array filter (CAF) (also denoted a Bayer filter) giving each of the different color channels a unique response. By adding an external band-pass filter, the raw pixel data received from camera 16 can be treated as monochrome data.


In some embodiments, the band-pass filter (BPF) is applied to a selected section of the infrared zone (e.g., between about 830 nm and 870 nm, or using any other suitable range within the wavelengths of the infrared spectrum).


In some embodiments, processor 33 (or any other controller) is configured to apply a respective single gain value to each color channel, so as to offset the effects of the Bayer filter within the filter's pass band.
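
For illustration only, the following minimal sketch shows how such per-channel gains might be applied to raw mosaic data so that it can be treated as monochrome. The RGGB layout and the gain values are assumptions chosen for the example, not values specified by this disclosure.

```python
# Illustrative sketch only: flattening the Bayer channel response of a color
# sensor so that IR-band raw data can be treated as monochrome.
import numpy as np

def bayer_raw_to_monochrome(raw, gains=(1.00, 1.18, 1.18, 1.35)):
    """Apply one gain per Bayer channel (R, G1, G2, B) of an assumed RGGB mosaic.

    raw   : 2D array of raw sensor counts acquired through the IR band-pass filter.
    gains : per-channel gains, e.g., calibrated against a uniform IR target, that
            equalize the channels' responses within the pass band (hypothetical).
    """
    mono = raw.astype(np.float64)
    g_r, g_g1, g_g2, g_b = gains
    mono[0::2, 0::2] *= g_r    # red sites
    mono[0::2, 1::2] *= g_g1   # green sites on red rows
    mono[1::2, 0::2] *= g_g2   # green sites on blue rows
    mono[1::2, 1::2] *= g_b    # blue sites
    return mono

# Example: a synthetic 4x4 raw frame.
raw = np.array([[100, 85, 98, 86],
                [84, 74, 83, 75],
                [101, 84, 99, 85],
                [85, 73, 84, 74]], dtype=np.uint16)
print(bayer_raw_to_monochrome(raw))
```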


In some embodiments, the basic configuration of HMD 22 comprises the aforementioned processor 33, communication device (wireless and/or wired), camera 16 (e.g., an IR camera), projectors 17 (e.g., IR projectors), display 15, and the optical engine comprising the projector (shown in FIG. 2A below) that projects onto display 15.


In some embodiments, HMD 22 comprises components, which are additional to the basic configuration. For example, HMD 22 comprises an inertial measurement unit (IMU) 18, which is configured to produce position signals indicative of the position and orientation of HMD 22 at a frequency level between about 1 Hz and 10 kHz (e.g., between 1 Hz and 50 Hz, between 1 Hz and 200 Hz, between 50 Hz and 250 Hz, between 100 Hz and 200 Hz, between 50 Hz and 1 kHz, between 100 Hz and 10 kHz, overlapping ranges thereof, or any value within the recited ranges). Based on the position signals received from IMU 18, processor 33 may be configured to improve the response time of system 11, e.g., to any relative motion between HMD 22 and the organ in question (or other target treatment anatomy or region) of patient 23.


In some embodiments, IMU 18 is configured to operate in conjunction with camera 16 (which typically operates at a frequency of about 60 frames per second or at any other suitable frequency corresponding to any other suitable number of frames per second) and to provide processor 33 with the position and orientation of HMD 22 with a reduced latency compared to images received from camera 16. In such embodiments, the position and orientation of HMD 22 can be calculated with the reduced latency obtained by using IMU 18. Moreover, in case the optical path between camera 16 and one or more of the markers is occluded, processor 33 may rely on the signals from IMU 18 for calculating the position of HMD 22 relative to the organ in question (or other target treatment anatomy or region).
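
As a non-limiting illustration, the sketch below shows one simple way the orientation most recently reported by the camera could be propagated forward with high-rate gyroscope samples until the next camera frame arrives. The first-order integration model, sample rate, and numeric values are assumptions made for the example only.

```python
# Illustrative sketch only: bridging ~60 Hz camera pose updates with high-rate
# IMU samples so the displayed overlay follows head motion with reduced latency.
import numpy as np

def skew(w):
    """Skew-symmetric (cross-product) matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def propagate_orientation(R, gyro_rad_s, dt):
    """First-order propagation of a rotation matrix with one angular-rate sample."""
    return R @ (np.eye(3) + skew(gyro_rad_s) * dt)

# Last orientation delivered by the optical tracker (identity, for the example).
R_cam = np.eye(3)

# Ten IMU samples at 1 kHz arriving before the next camera frame (rad/s).
dt = 1e-3
gyro_samples = [np.array([0.0, 0.0, 0.5])] * 10  # slow rotation about z

R_pred = R_cam
for w in gyro_samples:
    R_pred = propagate_orientation(R_pred, w, dt)

print("predicted rotation about z (deg):",
      np.degrees(np.arctan2(R_pred[1, 0], R_pred[0, 0])))
```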


In some embodiments, processor 33 is configured to conduct a registration between the coordinate systems of IMU 18 and camera 16 and a synchronization between the signals received from IMU 18 and camera 16.


In some embodiments, HMD 22 comprises one or more additional cameras 25 (e.g., a pair of red-green-blue (RGB) cameras). Each additional camera 25 (e.g., RGB camera) is configured to produce high resolution (HR) images (e.g., HR RGB images) of the organ being operated on (or other target treatment anatomy or region), and processor 33 is configured to display the HR images on the display 15 of HMD 22.


Moreover, because HMD 22 comprises a pair of additional cameras 25, which are positioned at a known distance from one another, processor 33 is configured to produce a stereoscopic 3D image of the site being operated on (e.g., the organ in question or other target treatment anatomy or region). Such techniques are also referred to herein as a digital loupe for augmented-reality near-eye display, and are described in more detail, for example, in U.S. Provisional Patent Application 63/234,272 and in PCT Publication No. WO 2023/021450, the disclosures of both of which are incorporated herein by reference.
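
For illustration only, the following minimal sketch shows the standard pinhole relationship between disparity and depth that such a calibrated, rectified camera pair could rely on; the focal length, baseline, and disparity values are hypothetical.

```python
# Illustrative sketch only: depth from the disparity between two rectified views,
# assuming a pinhole model. focal_px and baseline_mm are hypothetical values.
def depth_from_disparity(disparity_px, focal_px=1400.0, baseline_mm=60.0):
    """Depth (mm) of a point whose horizontal disparity is disparity_px pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return focal_px * baseline_mm / disparity_px

# A feature on the surgical site seen 168 pixels apart between the two cameras:
print(depth_from_disparity(168.0), "mm")  # -> 500.0 mm
```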


In other embodiments, HMD 22 may comprise any other suitable number of additional cameras 25 having features similar to those of the RGB cameras. Alternatively, at least one of the additional cameras 25 may have different features compared to those of the RGB cameras.


In some embodiments, each additional camera 25 is further configured to acquire images containing a structured light pattern, which is directed to the site being operated on by surgeon 26, and based on the image of the structured light pattern, processor 33 is configured to improve the precision of the 3D imaging of the site being operated on.


In some embodiments, the structured light projector is configured to direct a large number of beams (e.g., hundreds, or thousands) to the organ in question (or other target treatment anatomy or region), so as to enable 3D imaging (and most importantly depth imaging) of the organ (or other target treatment anatomy or region). In some embodiments, the wavelength of the structured light must be suitable for producing spots on the skin and/or internal tissue being operated on. In some examples, the structured light may comprise green laser beams, blue laser beams, red laser beams, infrared laser beams or beams of any other suitable wavelength or range of wavelengths.


In some examples, the structured light comprises a visible wavelength (e.g., green), so that cameras 25 produce images of green spots on the skin and/or on the surface of the tissue being operated on. Based on the images received from cameras 25, processor 33 is configured to produce the 3D image of the organ (or other target treatment anatomy or region) being operated on. The structured light is described in more detail in connection with FIG. 16 below.
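
As a non-limiting illustration, the sketch below shows one simple way such green spots could be located in an RGB frame (channel thresholding followed by connected-component centroids). The thresholds and the SciPy-based approach are assumptions made for the example, not the detection method of this disclosure.

```python
# Illustrative sketch only: locating visible (green) structured-light spots in an
# RGB frame by thresholding the green channel and taking component centroids.
import numpy as np
from scipy import ndimage

def find_green_spots(rgb, min_excess=60, min_area=3):
    """Return (row, col) centroids of regions whose green channel dominates."""
    rgb = rgb.astype(np.int32)
    green_excess = rgb[..., 1] - np.maximum(rgb[..., 0], rgb[..., 2])
    mask = green_excess > min_excess
    labels, n = ndimage.label(mask)
    centroids = []
    for i in range(1, n + 1):
        if (labels == i).sum() >= min_area:          # ignore single-pixel noise
            centroids.append(ndimage.center_of_mass(mask, labels, i))
    return centroids

# Synthetic 8x8 frame with one bright green spot.
frame = np.zeros((8, 8, 3), dtype=np.uint8)
frame[3:5, 4:6, 1] = 200
print(find_green_spots(frame))  # -> [(3.5, 4.5)]
```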


In other cases, the structured light comprises beams having a non-visible wavelength, such as infrared wavelengths. In some embodiments, processor 33 is configured to produce one or more images of the IR spots on the surface of the organ in question (or other target treatment anatomy or region). The images may be two-dimensional (2D), or typically 3D, using images (e.g., IR images) acquired, for example, by camera 16 and/or using additional IR cameras that may be mounted on HMD 22 or at a known position in the operating room. The 2D and 3D images may be produced by processor 33 using the same techniques, mutatis mutandis, described above for producing the images having the visible spots of structured light.


In some embodiments, HMD 22 comprises a housing 29, which is configured to package all the components described above. In some embodiments, housing 29 comprises a surface 20, which is configured to receive a headlight assembly mounted thereon. Several types of headlight assemblies and mounting techniques thereof to surface 20 are described in detail in connection with FIGS. 4-8 and 27A-28F below.


In some embodiments, HMD 22 comprises a nose pad 28, which is adjustable and is configured to support HMD 22 over the nose of surgeon 26. Several embodiments related to the structure and functionality of nose pad 28 are described in detail in connection with FIG. 12 below.


In some embodiments, HMD 22 comprises one or more light sensors 19, also referred to herein as an ambient light sensor (ALS), which is configured to produce one or more signals indicative of the light in the ambient area surrounding HMD 22. Based on the signals received from light sensor 19, processor 33 is configured to adjust the brightness of the AR image presented on display 15. Additionally, or alternatively, based on the signals received from light sensor 19, processor 33 is configured to adjust the power applied to a light source mounted on HMD 22 and/or to a lighting assembly 27 described herein.
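
For illustration only, the following minimal sketch shows one possible mapping from an ambient-light reading to a normalized display brightness; the lux breakpoints and brightness limits are arbitrary placeholder values, not parameters of this disclosure.

```python
# Illustrative sketch only: mapping an ambient-light-sensor reading to a display
# brightness setting so the AR overlay stays legible as room lighting changes.
def display_brightness_from_lux(lux, min_level=0.2, max_level=1.0,
                                dark_lux=50.0, bright_lux=2000.0):
    """Linearly interpolate a normalized brightness between dark_lux and bright_lux."""
    if lux <= dark_lux:
        return min_level
    if lux >= bright_lux:
        return max_level
    t = (lux - dark_lux) / (bright_lux - dark_lux)
    return min_level + t * (max_level - min_level)

for reading in (10, 300, 1500, 5000):   # example ALS readings in lux
    print(reading, "lux ->", round(display_brightness_from_lux(reading), 2))
```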


In some embodiments, HMD 22 comprises an IR-based proximity sensor (not shown), which is configured to produce signals for various uses, such as but not limited to hand-gesture tracking.


Reference is now made back to the general view of FIG. 1. Additionally, or alternatively to the aforementioned headlight assemblies, system 11 may comprise a lighting assembly 27, which is mounted at any suitable position of the operating room and is configured to direct white light, or any other suitable wavelength or range of wavelengths of light, to the site being operated on in the organ in question (or other target treatment anatomy or region) of patient 23.


During the surgical or other interventional procedure, surgeon 26 wears HMD 22, which is configured to present to surgeon 26 the captured and stored images as well as additional calculated information based on the tracking system that are aligned with the organ (or other target treatment anatomy or region) being operated on.


In some embodiments, in serving as a fiducial, marker 38 performs two functions: in a first function, the marker is used to maintain registration between frames of reference of HMD 22 and the patient's anatomy, and in a second function, the marker is used to ascertain where the head and gaze of surgeon 26 are located with respect to the organ in question (or other target treatment anatomy or region) of patient 23.
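
As a non-limiting illustration, these two functions can be viewed as a chain of rigid transforms, as in the minimal sketch below: one transform is fixed at registration time and the other is re-measured on every tracking frame. The transform names, the 4x4 homogeneous formulation, and the numeric values are assumptions made for the example.

```python
# Illustrative sketch only: composing the registration transform (anatomy in the
# marker frame) with the per-frame tracking transform (marker in the HMD frame)
# to obtain the anatomy pose used for placing the AR overlay.
import numpy as np

def rigid(rz_deg=0.0, t=(0.0, 0.0, 0.0)):
    """Build a 4x4 rigid transform from a rotation about z and a translation."""
    a = np.radians(rz_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(a), -np.sin(a), 0.0],
                 [np.sin(a),  np.cos(a), 0.0],
                 [0.0, 0.0, 1.0]]
    T[:3, 3] = t
    return T

# Fixed after the initial registration step (marker rigidly clamped to the spine).
T_anatomy_in_marker = rigid(rz_deg=5.0, t=(12.0, -30.0, 8.0))

# Updated continuously from the HMD's tracking camera.
T_marker_in_hmd = rigid(rz_deg=-12.0, t=(40.0, 10.0, 450.0))

# Pose of the anatomy in the HMD frame, used to place the AR overlay.
T_anatomy_in_hmd = T_marker_in_hmd @ T_anatomy_in_marker
print(np.round(T_anatomy_in_hmd, 2))
```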


During the initial stage of the procedure, a registration marker (not shown) is placed on the patient's back or other anatomical location, and is used to implement the registration of patient marker 38 with the anatomy of patient 23. In contrast to patient marker 38, the registration marker, in some implementations, is only used during the initial stage of the procedure, e.g., for the registration of the patient marker 38, and once the registration has been performed, for the subsequent procedure stages the registration marker may be removed from the patient's back or other anatomical location.


In some embodiments, system 11 comprises a processing system 31, which is communicatively coupled, by cables and/or wirelessly, to HMD 22. In some embodiments, processing system 31 comprises a computer processor 32, a storage device 37 comprising stored images 35, a screen 34, and an input device 36 such as a pointing device, mouse, touchscreen input such as a touchpad or touchscreen display, keyboard, etc.


In some embodiments, processing system 31 is configured to analyze the images acquired by the one or more cameras of HMD 22, and to present over display 15, the aforementioned AR image to surgeon 26.


As described above, HMD 22 comprises processor 33 to carry out at least the functions described above, but in alternative embodiments, during operation HMD 22 is connected to processor 32 of processing system 31, so as to carry out these processing and display functions remotely, and the AR images are displayed to surgeon 26 over display(s) 15. The functions or tasks described herein as being implemented by processor 32 may be implemented by processor 33, or vice-versa.


The configuration of HMD 22 is provided by way of example and is simplified for the sake of conceptual clarity. In some embodiments, the processor 32 and the processor 33 can share processing tasks and/or allocate processing tasks between the processors 32, 33. Each of the processors 32, 33 may consist essentially of one processor or more than one processor.



FIG. 2A is a schematic rear view of HMD 22, in accordance with embodiments of the disclosure. In the context of the present disclosure, the term “rear view” refers to the view of HMD 22 as seen by the eyes of surgeon 26 while mounting HMD 22.


In some embodiments, HMD 22 comprises two optical engines (OEs) 40, one for each eye. In some embodiments, each OE 40 comprises an AR projector 42, which is configured to direct the AR image described in FIG. 1 above, to a respective AR display 15. In one example, each OE 40 comprises the OLED-based image source and display (described with reference to FIG. 1 above) configured to project the AR image produced by processor 33, and optics (e.g., one or more lenses and mirrors) configured to direct the AR image to AR display 15.


In some embodiments, AR display 15 is a section of visor 14 that is coated with one or more suitable layers configured to reflect the projected VR image to the pupil of the respective eye, so that surgeon 26 can see the VR image overlaid on a scene of interest (e.g., the organ (or other target treatment anatomy or region) being operated on), in the manner of augmented vision, i.e., a virtual overlay on the real world.


In some embodiments, visor 14 is fully transparent or partially transparent so that when directing the gaze away from AR display 15, surgeon 26 can see the scene around him without having the AR image overlaid thereon.


In some embodiments, HMD 22 comprises two temple arms described herein, and nose pad 28 for mechanically supporting the mounting of HMD 22 over the head of surgeon 26. In the context of the present disclosure and in the claims, the term “temple arm” and grammatical variations thereof refer to a section of the frame of HMD 22 (or any other suitable type of HMD), which is coupled to housing 29 and is typically (but not necessarily) mounted on an ear of the user (e.g., surgeon 26) and is positioned in contact with at least a section of a respective temple of surgeon 26.


In some embodiments, a left temple arm 43 comprises processor 33 and optionally other devices, such as a wireless communication device 45 configured to exchange signals between HMD 22 and external entities, and a storage device 46 configured to store images, signals, program instructions and additional data of HMD 22. Note that processor 33, wireless communication device 45, and storage device 46 appear in dashed lines for being embedded within the inner volume of left temple arm 43.


In some embodiments, processor 33, wireless communication device 45, and storage device 46 may be disposed on one or more suitable substrates, such as one or more printed circuit boards (PCBs).


In some embodiments, all the devices are disposed on a single rigid PCB. In some embodiments, at least one of the PCBs may be flexible. Additional embodiments related to suitable types of flexible PCB are described in connection with FIG. 14 below.


In some embodiments, HMD 22 comprises a right temple arm 44, which comprises an on/off or standby button 39 configured to turn the power on when using HMD 22 and to turn the power off when HMD 22 is not in use.


In some embodiments, temple arms 43 and 44 are configured to be adjusted to the shape of the respective left and right temples of surgeon 26 or of any other user, and to be mounted on the ears of surgeon 26 in order to hold the weight of HMD 22 (in one example, together with a nose pad described below). The structure and functionality of temple arms 43 and 44 is described in detail in connection with FIGS. 9-11 below.


In some embodiments, nose pad 28 is configured to be adjusted to the shape of the nose of surgeon 26 or to the nose of any other user. The structure and functionality of nose pad 28, as well as embodiments related to the combination of nose pad 28 and temple arms 43 and 44 are described in detail in connection with FIG. 12 below.


In the context of the present disclosure and in the claims, the terms “frame” and “head mounting assembly” are used interchangeably and may refer to the combination of two or more elements among housing 29, nose pad 28 and temple arms 43 and 44, including the combination of head strap 740 and knob 744 (that together form an adjustable strap assembly) of FIG. 2B or 21A or any other suitable assembly configured to hold HUD 700 or HMD 2122 at a selected position on the head of surgeon 26.


In some embodiments, a power cable (not shown) is threaded through a power cable strain relief 47 of HMD 22. In the present configuration, power cable strain relief 47 is mounted on right temple arm 44, and the power cable is configured to electrically connect between a power source (not shown) and several components of HMD 22, such as but not limited to an on/off button 39.


In some embodiments, the power source comprises a pack of suitable batteries, and one or more supercapacitors or ultracapacitors (not shown). In some embodiments, the pack of batteries comprises lithium-based batteries, such as but not limited to batteries produced by RRC Power Solutions GmbH (Hamburg, Germany).


In some embodiments, the supercapacitor or ultracapacitor can be used to reduce lengthy boot-up when changing the battery packs. Instead of powering down HMD 22, processor 33 may be configured to control the components of HMD 22 to enter a low current standby mode. By powering off all components and peripherals, current draw may be reduced to a minimum, so as to enable the supercapacitor or ultracapacitor to retain the state of HMD 22 for a time interval sufficiently long for switching the battery packs, without the need for an additional battery for the standby mode.
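
By way of a non-limiting, back-of-the-envelope illustration, the sketch below estimates how long a supercapacitor could retain the standby state during a battery swap. The capacitance, voltage window, and standby current are hypothetical values, not specifications of this disclosure.

```python
# Illustrative sketch only: hold-up time of a supercapacitor discharged by a
# roughly constant standby current, t = C * (V_start - V_min) / I.
def holdup_time_s(capacitance_f, v_start, v_min, standby_current_a):
    """Seconds the capacitor can supply standby_current_a over the voltage window."""
    return capacitance_f * (v_start - v_min) / standby_current_a

# e.g., a 10 F supercapacitor allowed to sag from 5.0 V to 3.0 V at 20 mA standby:
print(round(holdup_time_s(10.0, 5.0, 3.0, 0.020)), "seconds")  # -> 1000 s (~17 min)
```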


This particular configuration of HMD 22 is shown by way of example, in order to illustrate certain problems that are addressed by embodiments of the disclosure and to demonstrate the application of these embodiments in enhancing the performance of such a system. Embodiments of the disclosure, however, are by no means limited to this specific sort of example system, and the principles described herein may similarly be applied to other sorts of HMDs used in suitable types of AR-based image-guided surgical systems. In addition, HMD 22 may be used in non-medical applications, including consumer and commercial applications.


In some embodiments, system 11 is not limited only to augmented reality systems and/or to systems comprising one or more HMDs. For example, the tracking of patient and/or tool may be performed using stationary tracking systems (other than HMD) and the display may also be on a stationary display, which may or may not be displayed as augmented reality and may or may not be mounted on the head of the user (e.g., surgeon 26) of system 11.


In some embodiments, HMD 22 may comprise various types of image sources, such as but not limited to OLED and liquid-crystal on silicon (shown and described in connection with FIG. 3 below). These image sources may be combined with various sorts of optics (or optical setups), such as but not limited to visor-based, waveguide-based and birdbath-based optical engines. For example, HMD 22 may comprise a combination of an OLED image source and a visor, or a combination of a liquid-crystal-on-silicon image source and a waveguide. It is noted that all possible permutations of the above image sources and optics may be applicable for HMD 22.


In the example of FIG. 1, HMD 22 and components and assemblies thereof are implemented using the projection/visor based optical engine. In some embodiments, HMD 22 may be implemented using other techniques and configurations, such as a waveguide-based optical engine. Moreover, instead of glasses, HMD 22 may comprise a helmet-shaped headset also referred to herein as a head-up display (HUD), an example configuration of which is described in detail in connection with FIG. 2B below. In some embodiments, HMD 22 may include features of both the helmet-shaped headset of FIG. 2B and the version shown in FIG. 2A, for example as is shown in FIGS. 21A-21D, described below.


In some embodiments, an HMD 22 that includes all the parts described in FIGS. 1 and 2A above, and some of the additional parts (or variations thereof) described in connection with FIGS. 5-14 below, has a weight of about 200 grams or less and has a suitable distribution of weight, in order to provide a high level of comfort for surgeon 26 and to reduce or eliminate the fatigue associated with the larger weight (e.g., between about 0.5 Kg and 1.5 Kg) of other sorts of head-mounted devices.



FIG. 2B is a schematic pictorial illustration of an exemplary head-up display (HUD) 700, in accordance with one embodiment. HUD 700 also serves and is referred to herein as an HMD, and may replace, for example, HMD 22 of FIG. 1 above.


In some embodiments, HUD 700 comprises an optics housing 704, which incorporates a camera 708. More specifically, camera 708 may comprise an RGB camera configured as an IR camera using a suitable filter and software or the camera 708 may comprise an infrared camera or an RGB-IR camera. In some embodiments, housing 704 comprises an infrared transparent window 712, and within the housing, e.g., behind the window, are mounted one or more infrared projectors 716.


In some embodiments, HUD 700 comprises a pair of AR displays 720 that are mounted on housing 704. In some embodiments, displays 720 may comprise, for example, an optical combiner, a waveguide, or a visor, as described in connection with FIGS. 1 and 2A above.


In some embodiments, AR displays 720 allow surgeon 26 to view entities, such as part or all of a selected field-of-view (not shown), through AR displays 720, and are also configured to present to the surgeon images that may be received from processing system 31, or any other information.


In some embodiments, HUD 700 comprises a processor 724, which operates elements of HUD 700 and is mounted in a processor housing 726. Processor 724 typically communicates with processing system 31 via an antenna 728. In some embodiments, processor 724 may perform some of the functions performed by processing system 31. In some embodiments, processor 724 may completely replace processing system 31.


In some embodiments, HUD 700 comprises a flashlight 732, which is mounted on the front of HUD 700. Flashlight 732 is configured to direct a beam of visible spectrum light (e.g., wavelengths between about 350 nm and 800 nm or between about 300 nm and 900 nm) to selected objects, so that surgeon 26 or other wearer is able to clearly see the objects through displays 720.


In some embodiments, HUD 700 comprises a power source (e.g., a battery (not shown)), which is configured to supply power to several elements of HUD 700 via a battery cable input 736. The power source may additionally or alternatively include one or more capacitors, supercapacitors or ultracapacitors.


In some embodiments, HUD 700 is held and gripped in place on the head of surgeon 26 using a head strap 740, and comprises a knob 744 that the surgeon 26 may use to adjust the head strap of HUD 700. The head strap 740 and knob 744 may together be referred to as an adjustable strap assembly.


In some embodiments, HUD 700 may comprise additional components, such as but not limited to components described in FIGS. 1 and 2A above. For example, HUD 700 may comprise IMU 18, which is configured to produce position signals indicative of the position and orientation of a tracking system reference point/origin of HUD 700. Moreover, HUD 700 may comprise IR LED projectors, such as projectors 17 of HMD 22.


Additionally, or alternatively, flashlight 732 of HUD 700 may be coupled to housing 704 using a suitable detachable lighting fixture assembly (DLFA), which is configured to be attached to and detached from housing 704 and/or the upper bridge (not indicated by a numeral) of HUD 700, or any other suitable location of HUD 700. The ability to detach flashlight 732 reduces weight from HUD 700, and may also be performed in case lighting assembly 27 of FIG. 1 above directs sufficient light to perform the surgical procedure. Non-limiting examples of detachable lighting fixture assemblies are described in detail in connection with FIGS. 4-8 and 27A-28F below.


This particular configuration of HUD 700 is shown by way of example, in order to illustrate certain problems that are addressed by certain embodiments and to demonstrate the application of these embodiments in enhancing the performance of such a system. Embodiments of the disclosure, however, are by no means limited to this specific sort of example HMD, and the principles described herein may similarly be applied to other sorts of head-mounted displays and head-up displays used in suitable types of AR-based image-guided surgical systems. For example, additional features of a head-mounted display or a head-up display are described in detail, for example, in U.S. Patent Application Publication 2017/0178375, which is incorporated herein by reference.



FIG. 3 is a schematic pictorial illustration of an HMD 50 comprising a liquid-crystal based optical engine (OE) 55 and a closed-loop control assembly 59 configured to control the light intensity in OE 55, in accordance with an embodiment of the disclosure. HMD 50 may replace, for example, HMD 22 of FIG. 1 above.


In some embodiments, HMD 50 comprises a wave-guide based OE 55 comprising a backlight source 54. In some embodiments, backlight source 54 comprises one or more LEDs configured to supply visible light through a waveguide 52, which is coated with an opaque shroud to prevent, or at least reduce, the amount of stray light, which is an optical leakage of photons of the backlight. More specifically, backlight source 54 may comprise red, green and blue (RGB) LEDs and may be configured to emit a white light generated by combining the light of the RGB LEDs.


In some embodiments, the backlight passes through an optical coding device (e.g., a liquid-crystal on silicon (LCOS) device 51), which is configured to modulate the backlight based on information coded by processor 33. For example, in response to receiving a signal indicative of a coded slice of a computerized tomography (CT) image, LCOS device 51 is configured to modulate the backlight and to produce an image of the CT slice presented over a display 49a of HMD 50.


In some embodiments, OE 55 comprises a photosensor 53, which is configured to measure stray light between backlight source 54 and LCOS 51 without interfering with the operation of OE 55. In other words, in some embodiments, photosensor 53 is not positioned along the optical axis of the backlight intended to be modulated, and uses the stray light for measuring the intensity of the backlight emitted from backlight source 54 into waveguide 52.


Reference is now made to an inset 57 showing a block diagram of closed-loop control assembly 59. In some embodiments, in response to sensing the intensity of the backlight emitted from backlight source 54, photosensor 53 is configured to produce a signal indicative of the measured intensity of the backlight.


In some embodiments, based on the signal received from photosensor 53, processor 33 is configured to control a backlight driver 58 to adjust the current applied to the RGB LEDs of backlight source 54.


In principle, it is possible to control the current supplied to backlight source 54 without such feedback, but due to the non-uniform response of the LEDs (even from the same batch of LEDs) of any light source (such as backlight source 54), the intensity of the backlight may be non-uniform and may vary. More specifically, the backlight may be altered: (i) over time in the same backlight source 54, e.g., as the LEDs age and/or in response to changes in the temperature of parts surrounding the LEDs; (ii) between different backlight sources 54 of different respective OEs 55 (e.g., between the left and right OEs 55 of HMD 50); (iii) between OEs of different HMDs 50; or (iv) any combination thereof.


In other words, processor 33, or any suitable dedicated circuit, controls backlight driver 58 to adjust the current supplied to each of the LEDs of backlight source 54, so as to keep the light levels constant. Because a sequential strobing scheme is used, a single photosensor 53 may be sufficient for controlling the light emitted in all three color channels (RGB) in one embodiment.


Thus, in one embodiment, controlling the backlight based on direct off-axis measurement of the stray light of OE 55 improves the uniformity of the brightness of the AR image presented over AR display 49a.
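
For illustration only, the following minimal sketch shows one possible integral control loop of the kind described above, with one loop per color channel and the single photosensor read while the corresponding channel is strobed. The gains, targets, and toy sensor model are assumptions made for the example, not the control law of this disclosure.

```python
# Illustrative sketch only: a minimal integral controller that nudges each LED
# channel's drive current so the photosensor reading stays at a target level.
class BacklightChannelLoop:
    def __init__(self, target, k_i=0.05, i_min=0.0, i_max=1.0):
        self.target = target      # desired photosensor reading for this channel
        self.k_i = k_i            # integral gain
        self.current = 0.5        # normalized LED drive current
        self.i_min, self.i_max = i_min, i_max

    def update(self, measured):
        """One control step, executed while this channel is being strobed."""
        error = self.target - measured
        self.current = min(self.i_max, max(self.i_min, self.current + self.k_i * error))
        return self.current

# One loop per color channel; the single photosensor is sampled while the
# corresponding channel is strobed, so the three loops never see mixed light.
loops = {c: BacklightChannelLoop(target=1.0) for c in ("R", "G", "B")}

def fake_sensor(current, efficiency):
    return current * efficiency  # toy model of LED output versus drive current

efficiencies = {"R": 1.6, "G": 2.2, "B": 1.9}  # pretend aging/thermal differences
for _ in range(200):
    for c, loop in loops.items():
        loop.update(fake_sensor(loop.current, efficiencies[c]))

print({c: round(loop.current, 3) for c, loop in loops.items()})
```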


Reference is now made back to the general view of FIG. 3. In some embodiments, HMD 50 comprises an additional AR display 49b, which is configured to present to the left eye of surgeon 26, an additional AR image by applying the same technique to a similar optical engine (not shown) mounted on a section 56 (e.g., a bracket) of the left temple arm (not shown) of HMD 50. This implementation may also be incorporated in the other HMDs and/or HUDs described herein.


In some embodiments, processor 33 is configured to present different AR images over AR displays 49a and 49b, so as to display to surgeon 26 images such as a stereoscopic image (e.g., of a 3D CT image) of the organ (or other target treatment anatomy or region) being operated on.


In some embodiments, HMD 50 comprises an adaptor 48, which is formed in a frame 41 of HMD 50 and is adapted for mounting, on HMD 50, a suitable type of nose pad, such as nose pad 28 shown in FIG. 2A above and described in more detail in connection with FIG. 12 below. Moreover, the temple arms of HMD 50 extend from section 56 and may have a similar configuration to that of FIG. 2A above. Embodiments related to several configurations of the temple arms of the HMDs and HUD of the present disclosure are described in detail in connection with FIGS. 9-11 below.


This particular configuration of HMD 50 is shown by way of example, in order to illustrate certain problems that are addressed by certain embodiments and to demonstrate the application of these embodiments in enhancing the performance of such a system. Embodiments of the disclosure, however, are by no means limited to this specific sort of example HMD configuration, and the principles described herein may similarly be applied to other sorts of HMDs and HUDs used in any suitable types of near-eye display AR-based image-guided surgical systems. In addition, HMD 50 may be used with other medical systems or with non-medical systems, including for consumer or commercial applications.


Example Headlight Assemblies



FIG. 4 is a schematic pictorial illustration of a headlight assembly (HA) 60 for use in any of the HMDs and HUD shown in FIGS. 1, 2A, 2B, 3, 21A-29E, and 31A-32, in accordance with various embodiments.


In some embodiments, HA 60 comprises a flashlight 61, which may have features similar to those of flashlight 732 of FIG. 2B above, or of any other flashlight disclosed herein.


In some embodiments, HA 60 comprises a detachable lighting fixture assembly (DLFA) 66, which is adapted for attaching flashlight 61 to surface 20 of housing 29, and for detaching flashlight 61 from surface 20.


Reference is now made to insets 62, 63, and 64. Inset 62 shows DLFA 66 without flashlight 61. In some embodiments (such as shown in inset 63), DLFA 66 comprises one or more clips 65 (e.g., one clip, two clips, three clips or more than three clips), which are configured to: (i) attach DLFA 66 to a base 67 located on surface 20, when DLFA 66 (and flashlight 61) are moved toward surface 20, and (ii) detach DLFA 66 from base 67 when clip(s) 65 are pressed toward the inner volume of DLFA 66.


In some embodiments (such as shown in inset 64), base 67 comprises electrical connections 68 (e.g., two or more vertical pogo pins, three pogo pins, or more than three pogo pins) configured to conduct electrical power and/or signals or data between housing 29 and flashlight 61.



FIG. 5 is a schematic pictorial illustration of a headlight assembly (HA) 70 for use in any of the HMDs and HUD shown in FIGS. 1, 2A, 2B, 3, 21A-29E, and 31A-32, in accordance with several embodiments.


In some embodiments, HA 70 comprises a flashlight (not shown), which may have features similar to those of flashlight 61 of FIG. 4 above and/or flashlight 732 of FIG. 2B above, and/or of any other flashlight disclosed herein.


Reference is now made to an inset 72. In some embodiments, HA 70 comprises a DLFA 71, which is adapted for attaching the flashlight to surface 20 of housing 29, and for detaching the flashlight away from surface 20.


In some embodiments, DLFA 71 comprises one or more clips 73 (for example, one clip, two clips, three clips or more than three clips), which are configured to: (i) attach DLFA 71 to a base 74 located on surface 20, when DLFA 71 is moved in a direction 76 (e.g., away from the forehead of surgeon 26), and (ii) detach DLFA 71 from base 74 when clip(s) 73 are pressed toward base 74, and at the same time, DLFA 71 is moved in a direction 75 (e.g., toward the forehead of surgeon 26).


In some embodiments, base 74 comprises electrical connections 68 (e.g., two or more horizontal pogo pins, three pogo pins, or more than three pogo pins) configured to conduct electrical power and/or signals or data between housing 29 and the flashlight described above.



FIG. 6 is a schematic pictorial illustration of a headlight assembly (HA) 80 for use in any of the HMDs and HUD shown in FIGS. 1, 2A, 2B, 3, 21A-29E, and 31A-32, in accordance with various embodiments.


In some embodiments, HA 80 comprises a flashlight (not shown), which may have features similar to those of flashlight 61 of FIG. 4 above and flashlight 732 of FIG. 2B above, and/or of any other flashlight disclosed herein.


Reference is now made to an inset 81. In some embodiments, HA 80 comprises a DLFA 82, which is adapted for attaching the flashlight to surface 20 of housing 29, and for detaching the flashlight away from surface 20.


In some embodiments, a base 86 formed on surface 20 comprises trenches 85 configured to receive DLFA 82 as will be described herein.


In some embodiments, DLFA 82 comprises two leads 84 (one at each side of DLFA 82), which are configured to: (i) slide through trenches 85 along direction 75 for attaching DLFA 82 to base 86, and (ii) slide through trenches 85 along direction 76 for detaching DLFA 82 away from base 86.


In some embodiments, DLFA 82 comprises a handle 83 for moving DLFA 82 in directions 75 and 76, and one or more clips configured to attach and detach DLFA 82 in conjunction with the movement in directions 75 and 76.


In some embodiments, housing 29 comprises electrical connections (e.g., one or more vertical or horizontal pogo pins (not shown)) configured to conduct electrical power and/or communication signals or data between housing 29 and the flashlight described above, which is connected to DLFA 82.



FIG. 7 is a schematic pictorial illustration of a headlight assembly (HA) 90 for use in any of the HMDs and HUD shown in FIGS. 1, 2A, 2B, 3, 21A-29E, and 31A-32, in accordance with various embodiments.


In some embodiments, HA 90 comprises a flashlight (not shown), which may have features similar to those of flashlight 61 of FIG. 4 above, flashlight 732 of FIG. 2B above, and/or of any other flashlight disclosed herein.


Reference is now made to an inset 91. In some embodiments, HA 90 comprises a DLFA 92, which is adapted for attaching the flashlight to a base 93 coupled to surface 20 of housing 29, and for detaching the flashlight away from base 93.


In some embodiments, as shown in inset 91, DLFA 92 is attached to and detached from base 93 when moved in directions 75 and 76, respectively.


Reference is now made back to the general view of FIG. 7. In some embodiments, base 93 comprises a lead 94 configured to slide along a trench (not shown) formed at the lower surface of DLFA 92, which is facing base 93. Base 93 further comprises two fences 95 at the sides of the base.


In some embodiments, base 93 comprises electrical connections (e.g., two or more vertical and/or horizontal pogo pins (not shown)) configured to conduct electrical power and/or communication signals or data between housing 29 and the flashlight described above, which is connected to DLFA 92.


Reference is now made back to inset 91. In some embodiments, DLFA 92 comprises two pairs of flexible fins 96 (one pair at each side of DLFA 92), which are partially surrounding an opening 97 configured to contain fences 95 for attaching DLFA 92 to base 93.


In some embodiments, the configuration of at least part of DLFA 92 and base 93 is similar to a design of a GoPro®-style mount of cameras and other electronic accessories, which is produced by GoPro Corporate (3025 Clearview Way, San Mateo, CA).



FIG. 8 is a schematic pictorial illustration of a headlight assembly (HA) 100 for use in any of the HMDs and HUD shown in FIGS. 1, 2A, 2B, 3, 21A-29E, and 31A-32, in accordance with various embodiments.


In some embodiments, HA 100 comprises a flashlight 61 and a detachable lighting fixture assembly (DLFA) 101, which is adapted for attaching flashlight 61 to surface 20 of housing 29, and for detaching flashlight 61 away from surface 20.


Reference is now made to insets 102 and 103. In some embodiments, DLFA 101 comprises a bore 104, which is adapted to contain an axis (not shown) for controlling an elevation angle 105 of flashlight 61 relative to a virtual plane, e.g., parallel to surface 20 or to any other suitable reference plane.


With reference to inset 103, in some embodiments, DLFA 101 comprises a base 106, which is coupled to surface 20 and is configured to connect between DLFA 101 and surface 20 of housing 29.


Reference is now made to an inset 107. In some embodiments, base 106 comprises electrical connections 68 (e.g., two or more vertical or horizontal pogo pins) configured to conduct electrical power and/or signals or data between housing 29 and flashlight 61.


In some embodiments, base 106 comprises a pair of magnets 109a and 109b, and DLFA 101 comprises a pair of magnets 108a and 108b. When DLFA 101 is placed over base 106, magnets 108a and 109a attract one another, and similarly, magnets 108b and 109b attract one another.


In such embodiments, the magnetic-based attachment and detachment between DLFA 101 and base 106 are quick and easy because they do not demand a mechanical release of clips or any other sort of capturing and/or locking device. Thus, surgeon 26 or a medical staff member in the operating room can attach and detach DLFA 101 and flashlight 61 using one hand, and subsequently, adjust angle 105 for directing the light beam to a region of interest, e.g., the organ (or other target treatment anatomy or region) being operated on.


In some embodiments, DLFA 101 and base 106 may comprise three, four, or more than four pairs of magnets 108 and 109 for improving the stability of the magnetic-based coupling and preventing undesired rotation of DLFA 101 relative to base 106. In accordance with several embodiments, the size, shape, and magnetism level of magnets 108 and 109, and the distance between every pair of magnets 108 and 109, may define the stability of the magnetic-based coupling.


In some embodiments, a single pair of magnets 108 and 109 may be sufficient for enabling the stability of the magnetic-based coupling and preventing undesired rotation of DLFA 101 relative to base 106.


These particular configurations of the HAs and DLFAs of FIGS. 4-8 discussed above (and of FIGS. 27A-27D and 28A-28F discussed below) are shown by way of example, in order to illustrate certain problems that are addressed by various embodiments and to demonstrate the application of these embodiments in enhancing the illumination performance of such an HMD and/or HUD. Embodiments of the disclosure, however, are by no means limited to these specific example configurations, and the principles described herein may similarly be applied to other sorts of headlight assemblies integrated with any suitable types of HMDs and/or HUDs used in near-eye display AR-based image-guided surgical procedures.


In some embodiments, the headlight assemblies shown in FIGS. 4-8 discussed above (and shown in FIGS. 27A-27D and 28A-28F discussed below) may be removed from the configuration of the HMDs and/or HUDs in case lighting assembly 27 of FIG. 1 above directs sufficient light to perform the surgical procedure.


Example Tilting Assemblies


The embodiments related to tilting assemblies that are described in detail in connection with FIGS. 9, 10A, 10B, 11, 12, 13, 14, 15A, 15B, 15C, and 23B below may be implemented, mutatis mutandis, in any of the HMDs and HUD shown in FIGS. 1, 2A, 2B, 3, 21A-29E, and 31A-32, and the techniques described in connection with FIGS. 9-15C and 23B are also applicable to any other suitable configurations of AR-based glasses and/or helmets, and to other sorts of AR-based near-eye head-mounted display assemblies, whether for medical or non-medical applications.



FIG. 9 is a schematic pictorial illustration of a tilting assembly 111 (e.g., a temple tilting assembly), in accordance with an embodiment of the disclosure.


In some embodiments, tilting assembly 111 is configured for tilting temple arm 44 relative to housing 29. In other words, tilting assembly 111 provides surgeon 26 with a horizontal degree-of-freedom (DOF) for adjusting HMD 22 to the shape of the head of surgeon 26 or to the shape of the head of any other user of HMD 22. The surgeon 26 could be substituted with a consumer for non-medical applications.


In some embodiments, tilting assembly 111 may be implemented using a hinge (not shown), also referred to herein as an axis. In some embodiments, tilting assembly 111 is implemented in a virtual hinge, also referred to herein as a virtual axis. The terms “virtual hinge” and “virtual axis” and grammatical variations thereof refer to tilting one object relative to another object without using a real, or physical, hinge or a real, or physical, axis.


Reference is now made to an inset 110, which is a top view of tilting assembly 111 integrated in HMD 22.


In the example of FIG. 9, the virtual axis of tilting assembly 111 comprises three sections of a strap made from stainless steel or from any other suitable material. More specifically, tilting assembly 111 comprises a section 112 coupled to housing 29, a section 113 coupled to temple arm 44, and a section 114 configured to bend in response to the movement of temple arm 44 in directions 117 or 118. For example, in case a first surgeon has a smaller head than a second surgeon, tilting assembly 111 enables movement of temple arm 44 and section 113: (i) in direction 117 when the first surgeon wears HMD 22, and (ii) in direction 118 when the second surgeon wears HMD 22.


In the example implementation of the virtual axis of tilting assembly 111, sections 112 and 113 are coupled to housing 29 and temple arm 44, respectively, using screws 115 and bores 116. Section 112 and housing 29 move together as a rigid entity, and section 113 is moved in directions 117 and 118 by bending section 114.


In some embodiments, instead of using screws and bores, at least one of sections 112 and 113 of tilting assembly 111 may be coupled to the respective parts of HMD 22 using any other coupling technique, such as but not limited to fastening with devices other than screws, gluing, welding, and/or soldering.


Additionally, or alternatively, at least one of sections 112 and 113 of tilting assembly 111 may be formed as an integral part of HMD 22. For example, section 112 may be formed in one cast mold together with at least part of housing 29.


In some embodiments, tilting assembly 111 may comprise different materials used in at least two of the sections. For example, sections 112 and 113 are made of stainless steel, and section 114 is made of a softer and more flexible material, such as but not limited to a nickel titanium alloy, also known as nitinol, or any other suitable material with appropriate characteristics, e.g., flexibility that remains within the elastic range over the required movement range.


Reference is now made to an inset 119, which is a side view of tilting assembly 111 integrated in HMD 22. In the example of inset 119, the coupling of sections 112 and 113 to the respective parts of HMD 22 is shown using screw 115 and bores 116 (through which additional screws 115 are intended to be inserted in some embodiments).


In some embodiments, the techniques described for tilting assembly 111 are implemented, mutatis mutandis, also between temple arm 43 and housing 29.


In some embodiments, any other suitable type of tilting assembly may be implemented between temple arm 43 and housing 29.



FIG. 10A is a schematic pictorial illustration of a tilting assembly 77, in accordance with an embodiment.


In some embodiments, tilting assembly 77 is implemented in HMD 22, but may also be implemented in other sorts of HMDs.


In some embodiments, temple arm 44 comprises a section 121 configured to conform with the shape of the right temple of surgeon 26 or any other user of HMD 22. Temple arm 44 further comprises a section 122 configured to conform with the shape of the right side of the temple and/or the rear portion of human head, referred to herein as the nape of surgeon 26 or any other user of HMD 22.


Reference is now made to an inset 120, which is a side view of section 122 and tilting assembly 77 implemented in temple arm 44 for obtaining an additional adjustment DOF between HMD 22 and the head of surgeon 26 or any other user of HMD 22.


In some embodiments, tilting assembly 77 comprises a rotatable rocker arm 123, a hinge 124 connecting between section 121 and rocker arm 123, and a cushion 125 formed on the surface of rocker arm 123.


Reference is now made to an inset 131, which is a pictorial illustration of the inner structure of tilting assembly 77.


In some embodiments, rocker arm 123 (which may be made from polymer, such as but not limited to polycarbonate and/or Polyoxymethylene (POM) and/or any other suitable material) has a proximal section 127 and a distal section 129 configured to rotate about hinge 124, and a spring 130 (e.g., a torsion spring).


In some embodiments, rocker arm 123 is configured to rotate about hinge 124 relative to a longitudinal axis 191 of section 121.


For example, when HMD 22 is mounted on the head of the aforementioned first surgeon, section 129 moves from its stationary state all the way in direction 128a and follows the ergonomic structure of the user's head, usually slightly in direction 126a, while section 127 moves in direction 126b to conform with the shape of the nape of the first surgeon.


Similarly, when HMD 22 is mounted on the head of the second surgeon (having a different head shape compared to that of the first surgeon), section 129 moves in direction 126a and section 127 moves in direction 126b to conform with the shape of the rear head portion, also referred to herein as the nape, of the second surgeon. Torsion spring 130 may be configured to reverse the movement direction. In the illustrated example, torsion spring 130 moves section 127 in direction 128b for improving the clamping between rocker arm 123 and the head of the user (e.g., the second surgeon). Moreover, torsion spring 130 is configured to move section 127 in order to enable smooth placement of HMD 22 on the head of the respective surgeon.


In some embodiments, section 121 has an opening or a partial opening for containing section 127 when being rotated in direction 126b.


Reference is now made to FIG. 10B, showing an alternative configuration to the rocker arm shown in insets 120 and 131. In the example of FIG. 10B, HMD 22 may comprise an array 198 of rocker arms 193 (e.g., two rocker arms 193a and 193b). At least one of rocker arms 193 may be similar to rocker arm 123 (and may be adapted to fit into array 198), or may comprise any other suitable design of a rocker arm adapted to be incorporated in array 198.


In the example of FIG. 10B, the assembly of array 198 comprises a bar 196, also referred to herein as a common bar, which is configured to rotate about a hinge 197 coupling section 121 to bar 196. Moreover, rocker arms 193a and 193b are coupled to hinges 193a and 193b, respectively, which are in turn coupled to bar 196. In an embodiment, the configuration of hinges 197, 193a and 193b may be similar to that of hinge 124 shown in insets 120 and 131 of FIG. 10A. In another embodiment, at least one of hinges 197, 193a and 193b may have a configuration other than that of hinge 124.


Reference is now made to an inset 132 of FIG. 10A, showing cushion 125 of HMD 22.


In some embodiments, cushion 125 is disposed over the entire surface of rocker arm 123 that is facing the head of the surgeon.


In some embodiments, cushion 125 may be molded on rocker arm 123, or may comprise a separate part attached to rocker arm 123. Cushion 125 may comprise a large variety of materials, solid or sponge-like (for example, silicone, neoprene, polyurethane, and/or other suitable materials).


In some embodiments, the sponge may comprise closed cells that do not absorb fluids (e.g., sweat), or open cells adapted to absorb fluids.


In some embodiments, cushion 125 comprises a viscoelastic foam also referred to as a “memory foam” for obtaining a soft cushion.


In some embodiments, when a weighted object is positioned on the viscoelastic foam, the foam progressively conforms to the shape of the object, and after the weight (i.e., force) is removed, the foam slowly reassumes its original shape.


In some embodiments, when using viscoelastic material in cushion 125, a human body temperature between about 36° C. and 37° C. accelerates the conforming behavior of the memory foam described above.


In accordance with several embodiments, the soft cushion and the viscoelastic foam are adapted to generate a uniform distribution of the pressure applied to the head of the surgeon wearing HMD 22, thereby enabling effective clamping between rocker arm 123 and the head while retaining a feeling of comfort for the user (e.g., surgeon 26). Moreover, the clamping effect can be beneficial for safety reasons, in order to prevent HMD 22 from falling off the head during the surgical operation.


In some embodiments, HMD 22 comprises two or more DOFs obtained by tilting assemblies 111 and 77, by the two-section shape of rocker arm 123, and by the selected materials of cushion 125. In accordance with several embodiments, increasing the number of DOFs improves the grip and adjustability between the frame of HMD 22 (e.g., temple arms 43 and 44, and housing 29) and the contour and/or curvature of the head of the surgeon performing the surgical operation.


In some embodiments, the outer surface of cushion 125 has a suitable texture for improving grip on the nape of the head of surgeon 26. For example, the outer surface of cushion 125 (intended to be placed in contact with the nape) may comprise grooves as shown in the example of inset 132.



FIG. 11 is a schematic pictorial illustration of a tilting assembly 88, in accordance with one embodiment.


In some embodiments, tilting assembly 88 is implemented in HMD 22, but may also be implemented in other sorts of HMDs, such as the other HMDs described herein. Moreover, tilting assembly 88 may be used instead of tilting assembly 77 of FIG. 10A above, and due to the different configuration, the respective section has a numeral 137 instead of a numeral 122 of the corresponding section shown in FIG. 10A above.


Reference is now made to an inset 135 showing a side view of tilting assembly 88 in an XYZ coordinate system.


In some embodiments, tilting assembly 88 comprises a skeleton 136, which may be made from a suitable metal (e.g., Aluminum 5052 H32, supplied by Aleris International Inc. (Beachwood, OH), or another aluminum, metallic, metallic-alloy, polymeric, or elastomeric material) adapted to be shaped at least along an XY plane of the XYZ coordinate system. In some embodiments, skeleton 136 may also be shaped along the Z-axis to some extent (even though, in some implementations, this shape adjustment is not required).


In some embodiments, tilting assembly 88 comprises an upholstery 138, which is fitted over and may be coupled with skeleton 136. In some embodiments, upholstery 138 comprises an over molding cushion having similar properties and materials (e.g., viscoelastic foam) to that of cushion 125 of FIG. 10A above. In some implementations, a section 139 of temple arm 43 comprises a structure similar to that of the skeleton and upholstery of section 137, mutatis mutandis.


In some embodiments, the metal of skeleton 136 is adapted to transform from an elastic deformation (in which the skeleton returns to its original shape in response to applying a force and performing a small deformation) to a plastic deformation (in which the skeleton undergoes a larger deformation and retains the shape obtained in response to the applied force).


For example, with reference to the general view of FIG. 11, in case a minimal distance 140 between the edges of sections 137 and 139 of temple arms 44 and 43, respectively, is smaller than the size of the nape of surgeon 26, the skeleton of one or both of sections 137 and 139 may be deformed and shaped so that the revised (adjusted) distance 140 matches the corresponding size of the nape of surgeon 26. Moreover, in this configuration, upholsteries 138 of sections 137 and 139 snugly fit over the respective section of the nape of surgeon 26.


Reference is now made to inset 135. In some embodiments, upholstery 138 of sections 137 and 139 comprises a suitable texture for improving grip on the nape of the head of surgeon 26. For example, the outer surface of upholstery 138 (intended to be placed in contact with the nape) may comprise grooves having a suitable orientation (e.g., parallel to the XY plane, and/or parallel to the XZ plane, or parallel to the longitudinal axis of the respective section 137 or 139 of temple arms 44 and 43). In some embodiments, upholstery 138 may have any suitable texture other than grooves that is adapted to improve the grip between the nape of surgeon 26 and sections 137 and 139 of HMD 22.



FIG. 12 is a schematic pictorial illustration of a nose pad 28, in accordance with an embodiment.


In some embodiments, nose pad 28 is implemented in HMD 22, but in some embodiments, nose pad 28 may also be implemented, mutatis mutandis, in other sorts of HMDs, including any of the HMDs disclosed herein.


Reference is now made to the head of surgeon 26 for showing the position of nose pad 28 over a nose 141 of surgeon 26.


Reference is now made to an inset 148 showing a higher magnification of nose pad 28 placed over a section of nose 141. In the example of inset 148, nose pad 28 is presented with an upholstery 151 of polymer (e.g., a viscoelastic material, such as but not limited to the viscoelastic material of cushion 125 of FIG. 10A above, or any other suitable type of elastomer (e.g., silicone-based polymer), neoprene, polyurethane, or any other suitable material), which is shown partially transparent for the sake of presentation.


In some embodiments, nose pad 28 comprises a metal-based skeleton 150, which is surrounded by upholstery 151 and may comprise similar materials and functionality to those of skeleton 136 shown in FIG. 11 above.


Nose 141 has a forward section 146, whose cartilage and nerves may be sensitive to being in constant contact with nose pad 28. However, the skin on the left and right sides 147 of nose 141 is not in close proximity to the cartilage, and therefore is less sensitive to being in contact with nose pad 28.


Reference is now made to an inset 149 showing a frontal view of nose pad 28. Note that in the general view of HMD 22, nose pad 28 and HMD 22 are shown from a rear view, which is opposite to the front view of inset 149.


In some embodiments, nose pad 28 comprises a left section 142, a right section 143, and a middle section 144 connecting the left and right sections. In some embodiments, middle section 144 has an opening, but in some embodiments, middle section 144 may have a solid structure without an opening.


In some embodiments, nose pad 28 comprises a section 145 adapted to couple nose pad 28 to the frame of HMD 22. In an embodiment, section 145 may have a DOF for being adjusted relative to the frame of HMD 22. The DOF may be implemented using any suitable type of tilting assembly, which may be based on a hinge, or on a vertical axis as described in detail, for example, in any of FIGS. 9, 10A, 10B, and 11 above.


The DOF implemented in section 145 may be operated synchronously or asynchronously with an additional DOF (also referred to herein as a vertical DOF or a pantoscopic-tilting assembly, described in detail in connection with FIG. 14 below) between the frame and the AR display of HMD 22.


In some embodiments, section 145 may be rigidly coupled to the frame of HMD 22, without a degree of freedom for tilting relative to the frame.


In one implementation, nose pad 28 comprises skeleton 150 disposed in sections 142, 143 and 145, so as to adjust the shape of nose pad 28 to the shape of nose 141.


Reference is now made back to inset 148. In some embodiments, nose pad 28 is adapted to be shaped so that section 144 of nose pad 28 is not in direct contact with forward section 146 of nose 141, and in some embodiments, an air gap 153 serves as a buffer between section 144 and forward section 146.


In some embodiments, the surgeon may place section 144 directly over forward section 146 of the nose.


In some embodiments, section 145 may comprise an assembly configured to adjust the height (e.g., along the Z-axis) of nose pad 28 relative to the frame of HMD 22. The adjustment may be carried out in steps over a predefined movement range, or may be continuous, using one or more suitable assemblies implemented in nose pad 28, or in a part of the frame of HMD 22 (e.g., in housing 29), or therebetween.


In some embodiments, upholstery 151 may have a suitable texture, such as but not limited to the textures described above for cushion 125 and upholstery 138 of FIGS. 10A and 11, respectively.


In some embodiments, nose pad 28 may comprise two ball joints for sections 142 and 143, respectively, so as to provide the user with improved adjustment of the shape of nose pad 28 relative to the width of nose 141.



FIGS. 13 and 14 are schematic pictorial illustrations of pantoscopic tilting and a pantoscopic-tilting assembly (PTA) 155 of an optical engine (OE) 165, in accordance with various embodiments. Optical engine 165 may replace, for example, OE 42 of FIG. 2A above, and/or OE 55 of FIG. 3 above.


In some embodiments, PTA 155 may be implemented, mutatis mutandis, in HMDs 22 and 50, in HUD 700 and in any other sort of HMD assembly using embodiments described in connection with FIGS. 13 and 14 below, and/or in connection with FIGS. 21A-29E, and 31A-32. Moreover, the implementation of PTA 155 may be affected by several operational modes of surgical procedures shown and described, for example, in connection with FIGS. 15A-15C below. The PTA 155 may be used to, for example, rotate a see-through display assembly relative to a frontal plane of a user, such as surgeon 26.


Reference is now made to FIG. 13, which is used mainly as an introduction to pantoscopic tilting and optional implementations thereof.


In the example of FIG. 13, an eyeball 158 of surgeon 26 has an optical axis (OA) 160 (also referred to herein as a first optical axis) extending from the center of the eyeball through pupil 161 of surgeon 26. An angle 168 (e.g., of about 25°) is defined between a horizontal line 159 and OA 160.


In the example of FIG. 13, an optical axis 162 of OE 165 (also referred to herein as a second optical axis) is defined from an OE eye relief point 163 within OE 165.


Reference is now made to a frame 164 showing fixed pantoscopic tilts implemented with respective pantoscopic tilt angles of about 0° and 12° applied to the upper and lower glasses of frame 164. In the example of frame 164, the pantoscopic tilt angle is defined between a frame 169 and glasses 167 (e.g., corresponding to AR display 15 of HMD 22).


In some embodiments, in near-eye display AR-based systems, such as in HMD 22 of system 11, the pantoscopic tilt is set for aligning the second optical axis with the first optical axis. In the example of FIG. 13, the pantoscopic tilt is set for aligning optical axis 162 to converge with optical axis 160.
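
As a non-limiting illustration of this alignment, the following Python sketch computes the additional pantoscopic tilt that would bring the optical axis of the optical engine parallel to the declined optical axis of the eye. The function name and the example angles are hypothetical and are provided for illustration only; they are not part of the disclosed hardware.

```python
def required_pantoscopic_tilt(gaze_declination_deg: float,
                              display_axis_declination_deg: float) -> float:
    """Return the additional tilt (in degrees) needed so that the optical
    engine's optical axis becomes parallel to the wearer's declined gaze axis.

    gaze_declination_deg: angle of the eye's optical axis below horizontal
        (for example, about 25 degrees, as with angle 168 of FIG. 13).
    display_axis_declination_deg: current declination of the optical
        engine's axis below horizontal, before adjustment.
    """
    return gaze_declination_deg - display_axis_declination_deg


# Example: a downgaze of about 25 degrees and an untilted display axis
# suggest roughly 25 degrees of additional pantoscopic tilt.
print(required_pantoscopic_tilt(25.0, 0.0))  # -> 25.0
```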


Reference is now made to FIG. 14, showing DOFs of vertical axes implemented using examples of vertical tilting assemblies, referred to herein as pantoscopic tilting assemblies (PTAs) 155 and 156.


In some embodiments, a pantoscopic tilt angle 178 of about 35° is applied to HMD 22 using PTAs 155 and 156. However, other pantoscopic tilt angles may be applied (e.g., angles between 25 degrees and 45 degrees, angles between 30 degrees and 40 degrees, angles between 33 degrees and 37 degrees, overlapping ranges thereof, or any value within the recited ranges). Reference is now made to an inset 170 showing the vertical DOF implemented in PTA 155.


In some embodiments, PTA 155 comprises a bar 172 rigidly coupled to optical engine 165, and a hinge 171 configured to rotate bar 172 relative to temple arm 44 of the frame of HMD 22. In accordance with several embodiments, the movement is a relative movement about a vertical axis between the optical engine and the frame.


In one non-limiting example, the frame of HMD 22, and more specifically, housing 29, temple arms 43 and 44, and nose pad 28, are not moved when PTA 155 is tilting OE 165 by angle 178 to the desired pantoscopic tilt angle, and the same applies to the frame when PTA 156 (described herein) is tilting OE 165 by angle 178 to the desired pantoscopic tilt angle.


In some embodiments, OE 165 may comprise an optical assembly comprising one or more cameras, one or more light sources and other components, which are all moved in accordance with the DOF implemented using PTA 155 or PTA 156.


Reference is now made to an inset 175 showing the vertical DOF implemented in PTA 156. In some embodiments, PTA 156 comprises bar 172 coupled to (e.g., molded with) a rotatable section 179 of a disc having a slit 174. PTA 156 further comprises an assembly 177, so that when tilting OE 165 relative to temple arm 44 of the frame, section 179 is moved relative to assembly 177 in a selected tangential direction 176 and locked (e.g., using a locking element 173) at a predefined position within slit 174 to fixate OE 165 at the desired pantoscopic angle relative to temple arm 44 and the other components of the frame of HMD 22. The implementation of the vertical DOF using PTA 156 is also referred to herein as a virtual axis because the components are being moved (e.g., rotated about an axis) without using a physical hinge, such as hinge 171 of PTA 155.


In some embodiments, PTA 156 may comprise any other suitable type of a virtual axis. For example, with reference to inset 110 of FIG. 9 above, PTA 156 may comprise three sections, such as sections 112, 113 and 114 shaped to be adapted to frame and OE 165, and referred to herein as first, second and third sections, respectively. In such embodiments, the first section is coupled to OE 165, the second section is coupled to the frame (e.g., to the respective temple arm of each side, or to housing 29), and the third section is adapted to bend in response to a relative movement between the first and second sections when surgeon 26, or any other user, adjusts the pantoscopic tilt angle.


In some embodiments, PTA 156 may comprise a rigid bar coupled to the frame and a flexible arm having properties similar to that of skeleton 136 shown and described in inset 135 of FIG. 11 above. In such embodiments, in response to surgeon 26 adjusting the pantoscopic tilt angle, the flexible arm is adapted to transform from an elastic deformation to a plastic deformation, and to retain the shape obtained in response to the force applied for the adjustment of the pantoscopic tilt angle.


These particular configurations of PTAs 155 and 156 are shown by way of example, in order to illustrate certain problems that are addressed by the example implementation of the vertical DOF for controlling the pantoscopic angle of the optical engine relative to the frame of HMD 22. Embodiments of the disclosure, however, are by no means limited to these specific example configurations and implementations, and the principles described herein may similarly be implemented in other sorts of vertical DOFs used for controlling pantoscopic tilt angles in near-eye display AR-based image-guided surgical systems (including, but not limited to, the structures described below with reference to FIG. 23B). Embodiments may be used in non-medical applications as well, such as for commercial and/or consumer applications, including athletics and fitness, gaming, driving, product design, navigation, manufacturing, logistics, shopping and commerce, educational training, remote collaboration, etc.


In some embodiments, optical engine 165 typically comprises electronic devices configured for exchanging electrical signals with processor 33 (shown in FIG. 1) and other components of HMD 22 and system 11.


In some embodiments, HMD 22 comprises hardware configured to exchange the signals while executing the vertical DOF using PTAs 155 and 156.


Reference is now made to insets 180 and 186 showing two example implementations of the hardware configured to exchange the signals while applying the pantoscopic tilting to OE 165. In some embodiments, a rigid PCB 181, having electronic devices mounted thereon, is disposed in the frame of HMD 22 (e.g., in housing 29 of FIG. 1), a rigid PCB 182, having other electronic devices mounted thereon, is disposed in OE 165, and a flexible PCB 183 is configured to bend in at least two axes in order to enable smooth exchange of the electrical signals between housing 29 and OE 165.


In the configuration of inset 186, flexible PCB 184 has openings 185 shaped as slits along an axis of flexible PCB 184, so as to enable bending of flexible PCB 184 along two or more axes, and thereby to enable the exchange of electrical signals between housing 29 and OE 165.


The configurations of the rigid and flexible PCBs of insets 180 and 186 are shown by way of example, and in some embodiments, any other suitable configuration may be used for enabling the exchange of electrical signals between housing 29 and OE 165 while performing the pantoscopic tilting, as well as when HMD 22 operates at a preset pantoscopic tilt angle.


In some embodiments, the glasses of the display (e.g., displays 49a and 49b of FIG. 3 above) may have a separate frame (e.g., display assembly frame 41), which is separated from the optical frame that comprises the OEs, cameras, sensors, and other devices. In such embodiments, frame 41 remains static relative to the head (ears and nose), and the optical frame has a vertical DOF (based on a hinge or on a virtual axis) relative to frame 41 about the center of the eyeball of the surgeon.



FIGS. 15A, 15B and 15C are schematic pictorial illustrations of use cases of PTA 155 and 156 used in several types of surgical procedures (e.g., in both sitting and standing positions), in accordance with several embodiments. PTA 2355, described below with reference to FIG. 23B, may also be used in such procedures. These figures help to illustrate, among other things, certain advantages or benefits of including an adjustable pantoscopic tilt, such as to enable the PTA 155, 156, 2355 or the like to be set to different pantoscopic tilt angles for different types of surgical procedures that may each utilize a different Line of Sight (LOS) to an area of interest (e.g., surgery site on a patient body), in accordance with several embodiments.


Reference is now made to FIGS. 15A and 15B showing a spine surgery based on a lateral approach.


A surgeon's view of a patient, or Line of Sight (LOS) to an area of interest (e.g., surgery site on a patient body), during a medical intervention is typically downwards, e.g., in a vertical or oblique manner (e.g., since the patient or surgical site is located beneath the eyes of the surgeon). However, in some cases, the surgeon's view of the patient or surgical site, or the surgeon's LOS during the medical intervention, may be other than downwards, e.g., horizontal or substantially horizontal or straight ahead (e.g., in case the surgical site or area of interest is located in front of or substantially in front of the surgeon's eyes). In such cases, the HMD should allow a horizontal or substantially horizontal view and/or augmented reality view of the area of interest.


For example, in a Lateral Lumbar Interbody Fusion (LLIF) procedure, a lateral approach may be required. In such a procedure, patient 23 may be positioned on the side as shown in FIG. 15A, and the surgeons (shown in FIG. 15B) take a side approach and center an incision 187 over the flank of patient 23. The surgeon shown in FIG. 15B is in a sitting position, thus viewing the surgery site in a substantially horizontal manner.


Using the lateral approach, the surgeons, in one embodiment, insert the surgical tools from a lateral trajectory, and therefore, can reach the vertebrae and intervertebral discs without moving the nerves or opening up muscles in the back.


In some embodiments, the surgeons of FIG. 15B use an HMD 188 comprising PTA 155 or PTA 156 described in FIG. 14 above (or PTA 2355 described below with reference to FIG. 23B). In some embodiments, the PTA of HMD 188 is configured to set any suitable pantoscopic tilt angle. In the lateral approach example of FIG. 15B, a surgeon 99a may select a first pantoscopic tilt angle (e.g., an angle of about 15° or other angle, such as an angle between 5° and 30°, between 10° and 25°, between 10° and 20°, overlapping ranges thereof, or any value within the recited ranges), and a surgeon 99b may select a different pantoscopic tilt angle (e.g., an angle of about 30° or other angle, such as an angle between 20° and 40°, between 25° and 35°, between 30° and 40°, between 25° and 40°, between 30° and 50°, overlapping ranges thereof, or any value within the recited ranges).


Reference is now made to FIG. 15C showing a surgical approach in which patient 23 lies on the abdomen and chest, with his/her back facing surgeon 26, who is standing, as also shown in the surgical procedure described in FIG. 1 above.


In some embodiments, a surgeon 99c of FIG. 15C uses a HMD 189 comprising a suitable PTA, such as PTA 155 or PTA 156 (or PTA 2355 described below with reference to FIG. 23B). For the sake of conceptual clarity, when surgeon 99c is standing and is inserting the surgical tool(s) from an upper trajectory, or while looking downwards, the surgical approach is referred to herein as “a standing posture” (of the surgeon). In the standing posture example of FIG. 15C, surgeon 99c may select a pantoscopic tilt angle of about 35° or any other suitable pantoscopic tilt angle, and may also adjust the pantoscopic tilt angle during the surgical procedure using the vertical DOF enabled by PTAs 155 and 156 (and 2355).


As described with reference to FIG. 13 above, the pantoscopic tilt angle may be defined as the angle between an axis of the frame and an axis of the display (e.g., between a horizontal axis of the frame and an optical axis of the display). In some embodiments, the PTAs of both HMDs 188 and 189 are configured to set any suitable pantoscopic tilt angle between about 5° and about 40°, and the adjustment of the pantoscopic tilt angle may be continuous or in predefined steps (e.g., of about 1° or 5°). In some implementations, the PTAs are configured to move (e.g., to pivot, rotate, or slide) the display about an axis that is aligned or substantially aligned with a center of an eyeball of a user of the display. In some implementations, such movement can have a number of benefits, including, for example, alignment of the user's optical axis with the display's optical axis, the ability to focus on both near and distant objects using the same head-mounted display device (e.g., glasses) at the same time, and enhancing image quality for both straight ahead viewing and downgaze viewing. As used herein, pantoscopic tilt angle may refer to an amount of tilt toward a wearer's cheek (e.g., with a higher value referring to movement in a direction that moves the display closer to the wearer's cheek).
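
As a non-limiting illustration of adjustment within such a range, the following Python sketch clamps a requested pantoscopic tilt angle to a supported range and, when the mechanism adjusts in discrete steps, snaps it to the nearest step. The function name and the default range and step values are assumptions chosen to mirror the example values above.

```python
def quantize_tilt(requested_deg: float,
                  min_deg: float = 5.0,
                  max_deg: float = 40.0,
                  step_deg: float = 5.0) -> float:
    """Clamp a requested pantoscopic tilt to the supported range and, when the
    mechanism adjusts in predefined steps, snap it to the nearest detent."""
    clamped = max(min_deg, min(max_deg, requested_deg))
    if step_deg and step_deg > 0:
        # Snap to the nearest detent, measured from the minimum angle.
        n_steps = round((clamped - min_deg) / step_deg)
        clamped = max(min_deg, min(max_deg, min_deg + n_steps * step_deg))
    return clamped


# A request of 37 degrees with 5-degree detents is set to 35 degrees,
# while a request of 33 degrees with 1-degree detents remains 33 degrees.
print(quantize_tilt(37.0, step_deg=5.0))  # -> 35.0
print(quantize_tilt(33.0, step_deg=1.0))  # -> 33.0
```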


In some embodiments, HMDs 188 and 189 may replace, for example, any of HMD 22 (of FIGS. 1 and 2A above), HUD 700 (of FIG. 2B above), HMD 50 of FIG. 3 above, HMD 2122 of FIG. 21A, and/or HMD 2922 of FIG. 29A.


In some embodiments, the vertical DOF (implemented, for example, in PTAs 155, 156, and 2355) may be used, mutatis mutandis, in any other surgical or other interventional procedures. In such procedures, the surgeon or other professional may select any suitable posture for himself or herself and for the patient. Moreover, in accordance with several embodiments, even though specific procedures are typically performed while the surgeon and/or the patient are in a specific posture, the surgeon or other professional may decide to change his or her posture relative to that of the patient during the procedure, and/or the patient's posture may change, and therefore an adjustable pantoscopic tilt angle, as implemented for example in PTAs 155, 156, and 2355, is important for the quality of the procedure.


Example Structured Light Projector Implementation



FIG. 16 is a schematic pictorial illustration of a structured light projector (SLP) 200 implemented in HMD 22, in accordance with an embodiment. SLP 200 may also be implemented, mutatis mutandis, in HUD 700 of FIG. 2B, in HMD 50 of FIG. 3, in HMDs 188 and 189 of FIGS. 15B and 15C, respectively, in HMD 2122 of FIG. 21A, in HMD 2922 of FIG. 29A, and in any other suitable type of HMD used in any suitable type of near-eye-display surgical procedures or other medical therapeutic and/or diagnostic procedures. Although medical applications are well-suited for several embodiments, non-medical applications also benefit from many embodiments described herein. For example, non-medical applications may involve consumer or commercial applications such as athletics and fitness, gaming, driving, product design, navigation, manufacturing, logistics, shopping and commerce, educational training, remote collaboration, etc.


In some embodiments, the surgeon (e.g., surgeon 26) makes an incision 202 (e.g., similar to incision 24) in the skin 208 and other tissues of the back of patient 23, so as to expose one or more vertebrae 206 of patient 23 intended to be operated on. Some of the areas intended to be operated on may not be exposed or fully exposed by incision 202, depending on the medical application.


In some embodiments, SLP 200 comprises a laser dot pattern projector configured to apply to an area 204 on the organ or body region in question (e.g., the back) of patient 23 a structured light comprising a large number (e.g., between hundreds and hundreds of thousands) of dots 210 arranged in a suitable pattern. This pattern serves as an artificial texture for identifying positions on large anatomical structures lacking fine details of their own (e.g., skin 208 and the surface of vertebrae 206, other than the edges thereof).


In some embodiments, using a pseudo random pattern of dots 210, clusters can be uniquely identified and used for disparity measurements. In some embodiments, the disparity measurements are used for calculating depth, and for enhancing the precision of the 3D imaging of area 204.
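
As a non-limiting illustration of how a locally unique dot cluster can be matched between two images to obtain a disparity measurement, the following Python sketch performs a simple sum-of-squared-differences search along an image row. The function name and parameters are hypothetical; a practical implementation would typically use rectified images and more robust matching.

```python
import numpy as np


def find_disparity(left: np.ndarray, right: np.ndarray,
                   row: int, col: int,
                   patch: int = 7, max_disp: int = 64) -> int:
    """Return the horizontal shift (disparity, in pixels) of the patch centered
    at (row, col) in the left image when searched along the same row of the
    right image. The pseudo-random dot pattern makes each patch locally
    unique, so a simple sum-of-squared-differences search suffices here."""
    half = patch // 2
    template = left[row - half:row + half + 1,
                    col - half:col + half + 1].astype(float)
    best_disp, best_cost = 0, np.inf
    for d in range(max_disp):
        c = col - d
        if c - half < 0:
            break
        candidate = right[row - half:row + half + 1,
                          c - half:c + half + 1].astype(float)
        cost = np.sum((template - candidate) ** 2)
        if cost < best_cost:
            best_cost, best_disp = cost, d
    return best_disp
```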


In some embodiments, the wavelength of dots 210 may be visible to a human eye (e.g., blue, green, or red color) or invisible (e.g., infrared). In accordance with several embodiments, blue dots may advantageously retain their original shape (e.g., round) and appear sharp on skin 208. In some embodiments, SLP 200 is configured to direct blue laser dots or green laser dots (depending on the quality and other parameters of the laser source and optics) to area 204.


In some embodiments, cameras 25 (e.g., RGB cameras) may be used to capture images of area 204, and based on the images received from cameras 25, processor 33 is configured to produce a 3D image of area 204.


In some embodiments, an additional depth sensing technique may be implemented in HMD 22. The technique relies on a single camera with a precisely calibrated offset relative to SLP 200. In such embodiments, based on the calibrated offset, processor 33 is configured to produce depth information without the need for stereo cameras. The depth information may be obtained by identifying the relative shift of dot clusters.
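
As a non-limiting illustration of recovering depth from the shift of a dot cluster when the offset between the projector and the camera has been calibrated, the following Python sketch applies the standard triangulation relation Z = f * B / d. The function name and the example numbers are hypothetical.

```python
def depth_from_dot_shift(focal_length_px: float,
                         projector_camera_baseline_m: float,
                         dot_shift_px: float) -> float:
    """Estimate the depth (in meters) of a dot cluster from its observed shift.

    Uses Z = f * B / d, where f is the camera focal length in pixels, B is the
    calibrated offset (baseline) between the structured light projector and
    the camera, and d is the shift of the dot cluster, in pixels, relative to
    its position on a reference plane."""
    if dot_shift_px <= 0:
        raise ValueError("dot shift must be positive for a finite depth")
    return focal_length_px * projector_camera_baseline_m / dot_shift_px


# Example: a 1400-pixel focal length, a 60 mm projector-to-camera baseline,
# and a 120-pixel shift correspond to a depth of 0.7 m.
print(depth_from_dot_shift(1400.0, 0.06, 120.0))  # -> 0.7
```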


Additionally, or alternatively, system 11 may comprise a structured light projector mounted on a wall or on an arm of the operating room. In such embodiments, a calibration process between the structured light projector and the one or more cameras (e.g., cameras 25 on HMD 22, or one or more suitable cameras mounted at any suitable position of the operating room) may be required for obtaining the 3D image based on dots 210 projected on area 204.


In some embodiments, SLP 200 may apply an infrared or any other beam having an invisible wavelength (or range of wavelengths), and one or more cameras, such as camera 16 described in FIG. 1 above, may be used for obtaining the 3D image.


The position of SLP 200 in HMD 22 is selected by way of example, and in some embodiments, SLP 200 may be mounted on HMD 22 (or on any other of the HMDs and HUD described above) at any other suitable position.


Rolling Shutter Example



FIG. 17 is a diagram that schematically illustrates a method for generating an image by applying a strobe of an infrared (IR) light source to a rolling shutter image sensor within an integration time of the rolling shutter, in accordance with an embodiment.


In some embodiments, an image sensor of camera 16 (which may comprise an IR camera or an RGB camera configured to act as an IR camera) comprises any suitable number of pixels, for example, a 2-megapixel (MP) sensor comprising about 2 million pixels, or a sensor between 0.5 MP and 10 MP, between 1 MP and 5 MP, between 2 MP and 10 MP, overlapping ranges thereof, or any value within the recited ranges.


In some embodiments, each pixel has an integration time, which is a time interval in which the pixel is open for exposure. FIG. 17 shows the integration time of the three pixels along the time scale “t.”


In some embodiments, numerals 215, 216 and 217 refer to the integration times of the first pixel, the 1-millionth pixel, and the 2-millionth pixel of the image sensor, respectively. The first pixel is opened at t0 and closes at t2, defining an integration time 222 of the pixel.


In some embodiments, the duration of integration time 222 may be determined using the integral capacitance of each pixel of the image sensor. In some embodiments, at t0 the capacitor of the first pixel is opened for charging by the voltage produced by the first pixel in response to sensing photons on the surface of the first pixel. At t2 the charging of the capacitor of the first pixel is stopped, and the pixel is ready to be read to produce the IR image described in FIG. 1 above.


In some embodiments, the integration time of the first pixel (e.g., between t0 and t2) is between 5 milliseconds and 15 milliseconds (e.g., between 5 milliseconds and 10 milliseconds, between 6 milliseconds and 10 milliseconds, between 8 milliseconds and 15 milliseconds, overlapping ranges thereof, about 8 milliseconds, or any value within the recited ranges). Similarly, as shown in the row having numeral 217, the integration time of the 2-millionth pixel starts at t1 (about 7 ms after t0) and lasts for an integration time within the same ranges. Note that within the integration time of the first pixel (e.g., between t0 and t2), all the pixels of the image sensor have been opened. Moreover, at a time interval 220 between t1 and t2 (e.g., about 1 ms), all the pixels of the image sensor are open at the same time.


In some embodiments, at time interval 220, processor 33 (or any other processor or controller of system 11) controls (e.g., via a driver) IR projectors 17 to direct a strobe of an IR beam, referred to herein as an IR strobe, to the area being operated in the body of patient 23. Moreover, during the same time interval 220, processor 33 (or any other processor or controller of system 11) controls camera 16 to acquire the IR image from the area being operated.


In accordance with several embodiments, camera 16 comprises a rolling shutter, which reads out the pixels sequentially. In some embodiments, the rolling shutter of camera 16 is operated in a global-shutter mode by implementing a sufficiently long pixel integration time 222 and directing the IR strobe at time interval 220, in which all the pixels (e.g., 2 million pixels) of the image sensor are open.
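
As a non-limiting illustration of operating a rolling shutter in a global-shutter-like mode, the following Python sketch computes the interval during which all rows of the sensor are integrating simultaneously, within which the IR strobe may be fired. The function name and the example sensor parameters are assumptions chosen only to roughly match the timing described above.

```python
def strobe_window(num_rows: int, line_time_ms: float, exposure_ms: float):
    """Return (start_ms, end_ms) of the interval in which every row of a
    rolling-shutter sensor is integrating simultaneously, or None if the
    exposure is too short for such an interval to exist.

    Row i begins integrating at i * line_time_ms and stops at
    i * line_time_ms + exposure_ms, so all rows overlap between the start of
    the last row and the end of the first row."""
    start = (num_rows - 1) * line_time_ms  # last row opens
    end = exposure_ms                      # first row closes
    if end <= start:
        return None
    return start, end


# Example (assumed numbers): about 1080 rows read out over roughly 7 ms with
# an 8 ms exposure leave about a 1 ms window in which to fire the IR strobe.
window = strobe_window(num_rows=1080, line_time_ms=7.0 / 1079, exposure_ms=8.0)
if window is not None:
    start_ms, end_ms = window
    print(f"Fire IR strobe between {start_ms:.2f} ms and {end_ms:.2f} ms")
```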


In such embodiments, artifacts related to rolling shutters, such as but not limited to shifting of objects in the image due to the serial reading time of the pixels, are reduced or eliminated.


Example Pairing Subsystem



FIG. 18 is a schematic pictorial illustration of a direct pairing subsystem 230 for directly pairing between a workstation (WS) 232 and HMDs 22 and 50, in accordance with an embodiment.


In some embodiments, subsystem 230 is a communication subsystem of system 11 shown in FIG. 1 above. In accordance with several embodiments, subsystem 230 is configured to connect between WS 232 and any sort of one or more HMDs and HUD (including any of the HMDs or HUDs disclosed herein), in addition to or instead of HMDs 22 and 50.


In some embodiments, WS 232 serves as a Wi-Fi hotspot device, and the devices of system 11 (e.g., HMDs) are typically connected to WS 232 of subsystem 230. The pairing process between the devices is referred to herein as "hopping," and the pairing has two operational modes. The hopping and pairing are described hereinafter, and also in connection with FIG. 19 below.


Reference is now made to an inset 236 showing another configuration of a communication subsystem. In the example of inset 236 the subsystem comprises: (i) WS 232, (ii) HUD 700 (and/or other suitable HMDs described herein), and (iii) a wireless router 238. Router 238 may be hacked, and for cybersecurity reasons, there may be a motivation to exclude a router (such as router 238) from the configuration of the communication subsystem.


Reference is now made back to the general view of FIG. 18. In some embodiments, subsystem 230 comprises a remote control station (e.g., a tablet) 234 intended to be used by the medical staff in preparing for the surgical procedure, and during the surgical or other interventional procedure (in addition to the HMD).


In some embodiments, WS 232, HMDs 22 and 50 and tablet 234 are connected wirelessly using a service set identifier (SSID), which is a sequence of characters that uniquely names a wireless local area network (WLAN) comprising WS 232, HMDs 22 and 50, tablet 234 and optionally additional devices. The SSID may be configured to allow stations of subsystem 230 to connect to the desired network when multiple independent networks operate in the same physical area. Moreover, WS 232 may be configured to generate a password, such as password 240, and the password may be sent to HMDs 22 and 50 to enable a secured connection using a key exchange process described herein. This additional security layer may be used for improving the cybersecurity of the network of subsystem 230.


In some embodiments, in the example configuration of FIG. 18, WS 232 serves as a client and the HMD(s) intended to be connected serve(s) as a server. Embodiments related to the pairing process are described in connection with FIG. 19 below.


In some embodiments, the communication technique may comprise Wi-Fi, which is a family of network protocols, based on the IEEE 802.11 family of standards, which may be used in wireless local area networking (LAN) applications.


In some embodiments, WS 232 and HMDs 22 and 50 comprise Bluetooth (BT) adapters and the key exchange process is carried out using BT technology, which is a short-range wireless technology standard that is used for exchanging data between fixed devices (e.g., WS 232 implemented in a desktop or a laptop computer) and mobile devices (such as but not limited to HMDs 22 and 50) over short distances using ultra-high frequency (UHF) radio waves in the industrial, scientific and medical (ISM) bands (e.g., between 2.402 GHz and 2.48 GHz).


In some embodiments, WS 232 (and optionally other WSs located at the same medical center or facility) comprises an additional Wi-Fi adapter, also referred to herein as a second Wi-Fi adapter (not shown). In such embodiments, the key exchange process is carried out using a peer-to-peer (P2P) connection. In such embodiments, WS 232 uses two Wi-Fi connections: a first Wi-Fi connection for the hotspot connection, and a second Wi-Fi connection for the key exchange process using the second Wi-Fi adapter.


In some embodiments, WS 232 is configured to encode the hotspot key (e.g., password 240) into an optical code or other machine-readable code such as a barcode or a quick response (QR) code, generated using a suitable software or online tools and displayed over the display of WS 232 or on any other suitable display. In such embodiments, the HMD intended to be paired with WS 232 is configured to scan the optical code or machine-readable code (e.g., barcode or QR code) and decipher the key for performing the key exchange process and the pairing.
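
As a non-limiting illustration, the following Python sketch encodes an SSID and a hotspot key into a QR image using the third-party qrcode package, producing an image that a camera of the HMD could scan to complete the key exchange. The payload format, function name, and example values are hypothetical and shown for illustration only.

```python
# Requires the third-party "qrcode" package (e.g., pip install qrcode[pil]).
import json

import qrcode


def make_pairing_code(ssid: str, hotspot_key: str,
                      out_path: str = "pairing_qr.png") -> str:
    """Encode the workstation's SSID and hotspot key as a QR image that the
    HMD can scan and decipher to perform the key exchange."""
    payload = json.dumps({"ssid": ssid, "key": hotspot_key})
    qrcode.make(payload).save(out_path)
    return out_path


# Hypothetical values, for illustration only.
make_pairing_code("OR-3-WS", "example-hotspot-key")
```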


In some embodiments, the optical code or machine-readable code (e.g., barcode or QR code) scanning may be carried out using one or more additional cameras 25 (e.g., RGB cameras), or using camera 16 configured to capture a monochromatic image instead of an IR image (e.g., RGB camera that can also function as an IR camera).


This particular configuration of subsystem 230 is shown by way of example, in order to illustrate certain problems related to connectivity and cyber security that are addressed by embodiments of the disclosure, and to demonstrate the application of these embodiments in enhancing the performance of such a communication subsystem. Embodiments of the disclosure, however, are by no means limited to this specific sort of example communication subsystem, and the principles described herein may similarly be applied to other sorts of communication subsystems used in suitable types of AR-based image-guided surgical systems.



FIG. 19 is a flow chart that schematically illustrates a method for directly pairing between WS 232 and HMD 22, in accordance with various embodiments.


The flow chart may also be applicable, mutatis mutandis, for connecting between WS 232 and one or more of HMDs 50, 188, and 189 of FIGS. 3, 15B and 15C, respectively, and for connecting between WS 232 and HUD 700 of FIG. 2B, HMD 2122 of FIG. 21A and/or HMD 2922 of FIG. 29A.


Moreover, the method is applicable for connecting between any workstation and any suitable device configured to display information, such as but not limited to images and markers, over the organ being operated using augmented reality techniques or any suitable technique other than augmented reality.


The method begins at a scanning step 300, in which a communication device of HMD 22 is introduced into a medical center and processor 33 scans for available networks.


At a first decision step 302, the method differentiates between a first use case in which HMD 22 is known to the selected network, and a second use case in which HMD 22 is new to the selected network.


In the first use case, the method proceeds to a parameters application step 304, in which processor 33 uses a set of parameters, typically parameters used in the previous connection (also referred to herein as previous parameters) that are known based on the previous connection between HMD 22 and WS 232. Note that in this use case, the pairing process has a predefined time limit and is performed automatically, so that the method proceeds to check whether the pairing is successful (shown in a step 312 described hereinafter).


In the second use case, the method proceeds to a pairing initiation step 306, in which the workstation (e.g., WS 232) initiates the pairing process with HMD 22 for a given time interval. Note that the time intervals (also referred to herein as time limits) of steps 304 and 306 are determined by a user or an administrator of the system. Therefore, the time intervals of steps 304 and 306 may be similar to one another, or may differ from one another.


In some embodiments, the pairing process described below has a predefined time interval (e.g., about 15 seconds, about 20 seconds, about 30 seconds), also referred to herein as a time limit. The implications of the time limit are described in more detail below.


At a key exchange process step 310, the key exchange process is performed for pairing between HMD 22 and WS 232. The key exchange process may be based on Bluetooth, P2P, QR code or any other communication technique and protocols, as described in detail in connection with FIG. 18 above. In accordance with several embodiments, password 240 and the SSID of the network are stored in HMD 22 (or in any other HMD introduced into the medical center as described in step 300 above) in advance (e.g., typically before step 300).


At a second decision step 312, processor 33 of HMD 22 (and optionally the processor of WS 232) checks whether the pairing was performed successfully. Note that, in accordance with several embodiments, the pairing process has to be successful within the time limits described in steps 304 and 306 above for the two respective use cases.


As described above, in the first use case (e.g., HMD 22 has already been paired with the selected network), the method proceeds from step 304 directly to step 312 for checking whether or not the pairing has been successful.


In case the pairing fails and/or is not completed successfully within the predefined time limit, the method proceeds to an alternative WS pairing initiation step 314 in which a processor of another workstation (e.g., other than WS 232) associated with the selected network initiates the pairing process. Subsequently the method loops back to step 302 described above.


In case pairing is successful within the time limit, the method proceeds to a third decision step 316, in which the user of subsystem 230 (e.g., surgeon 26) and/or processor 33 checks whether to move HMD 22 to another network. The decision to move HMD 22 to another network may be made based on operational considerations, clinical considerations, technical (e.g., communication) considerations, or any other suitable consideration.


In case there is no need to move HMD 22 to another network, the method proceeds to a data exchanging step 318 in which HMD 22 and WS 232 exchange data during surgical operations or between surgical operations, as described above.


In case HMD 22 is moved to another network, the method proceeds to an unpairing step 320 in which HMD 22 is unpaired from WS 232, and the connection parameters for connecting with WS 232 are deleted from the storage of HMD 22. After unpairing between HMD 22 and WS 232, the method loops back to step 314 and further to step 306 as described above.
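
As a non-limiting illustration of the overall flow of FIG. 19, the following Python sketch loops over candidate workstations, reuses stored parameters when the HMD already knows the network, enforces a time limit on the key exchange, and falls back to another workstation when pairing fails. The object interfaces (hmd, workstations) and the default time limit are hypothetical placeholders rather than the actual implementation.

```python
import time


def pair_hmd(hmd, workstations, time_limit_s: float = 30.0):
    """Attempt to pair the HMD with one of the candidate workstations,
    enforcing a per-workstation time limit on the key exchange."""
    for ws in workstations:
        deadline = time.monotonic() + time_limit_s
        if hmd.knows_network(ws.ssid):
            params = hmd.stored_parameters(ws.ssid)  # previous parameters
        else:
            params = ws.initiate_pairing()           # new pairing process
        while time.monotonic() < deadline:
            if hmd.try_key_exchange(ws, params):     # Bluetooth, P2P, or QR
                return ws                            # paired successfully
            time.sleep(0.5)
        # Time limit exceeded: fall through to the next workstation.
    return None
```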


The method of FIG. 19 is provided by way of example in order to illustrate embodiments related to secure communication performed directly between a workstation and an HMD without having a router or any other sort of intermediate communication device. As described in connection with FIG. 18 above and in accordance with several embodiments, the elimination of the router is particularly advantageous for complying with cybersecurity requirements in communicating between HMD 22 and WS 232 or between any other two or more suitable medical and communication devices.


The method is simplified for the sake of conceptual clarity and relies on the description of the hardware of FIG. 18 above. In some embodiments, the method may be implemented, mutatis mutandis, on any other suitable hardware comprising any suitable type of medical devices and/or communication devices.


Example Electronic Subsystem



FIG. 20 is a block diagram that schematically illustrates an implementation of an architecture of an electronic subsystem (ES) 444 of system 11, in accordance with an embodiment of the disclosure.


In some embodiments, ES 444 may be implemented, mutatis mutandis, in any of the HMDs and HUD described above or below.


In some embodiments, ES 444 comprises a carrier board (CB) 400 made from a suitable PCB or any other suitable substrate having traces for exchanging signals between components described herein.


In some embodiments, ES 444 comprises a battery pack, referred to herein as a battery 420, or any other suitable power source configured to supply electrical power to ES 444. As described above, ES 444 may comprise a supercapacitor or ultracapacitor (not shown) connected to CB 400 in parallel with battery 420 and configured to be used for eliminating lengthy boot-up when changing battery 420 of HMD 22 (or any other HMD or HUD described above or below).


In some embodiments, ES 444 comprises processor 33, wireless communication device 45 (e.g., a Wi-Fi-6 transceiver connected to a Wi-Fi-6 antenna 406), and storage device 46, which are mounted on CB 400.


In some embodiments, ES 444 comprises a system-on-chip (SOC) device or a system-on-module (SOM) device comprising processor 33, wireless communication device 45, storage device 46, a graphic processing unit (GPU) (not shown), an artificial intelligence (AI) accelerator (not shown), image signal processors (ISPs) and/or other components. For example but not by way of limitation, the SOC device may comprise any suitable SOC device selected from the Snapdragon family produced by Qualcomm (San Diego, CA).


In some embodiments, ES 444 comprises controllers 417 configured to control and drive IR LED projectors 17 described in FIG. 1 above, an IMU controller 418 configured to drive IMU 18 described in FIG. 1 above, and an ambient light sensor (ALS) controller 419 of light sensor 19 described in FIG. 1 above. Controllers 417 and 418 are mounted on CB 400. In some embodiments, IMU 18 may comprise both the sensing device and the drive circuitry. Similarly, ALS controller 419 may be integrated with sensor 19, and controller 417 may be integrated with LED projectors 17.


In some embodiments, ES 444 comprises backlight driver 58, which is described with reference to FIG. 3 above and is configured to drive and adjust the current supplied to display 15.


In some embodiments, ES 444 comprises a microphone assembly (MA) 78 comprising a microphone and electronic circuitry thereof, and a speaker assembly (SA) 79 comprising a speaker and electronic circuitry thereof. MA 78 and SA 79 are mounted on CB 400.


In some embodiments, ES 444 comprises a bus 98 configured to conduct power signals and data signals between the aforementioned devices mounted on CB 400, and also to conduct power signals and data signals between CB 400 and external entities described herein.


In some embodiments, ES 444 comprises an interface 402 configured to exchange power signals and data signals between CB 400 and: (i) camera 16, (ii) HA 60 (or any other headlight assembly described above), (iii) SLP 200, and (iv) additional cameras 25 (e.g., RGB cameras).


In some embodiments, interface 402 comprises a Camera Serial Interface (CSI), which is a specification of the Mobile Industry Processor Interface (MIPI) Alliance, referred to herein as MIPI CSI, configured to conduct, between (i) cameras 16 and 25 and (ii) CB 400, the signals for producing the IR images and the RGB images, respectively.


In some embodiments, ES 444 comprises an interface 404 configured to output signals indicative of the AR images (described above) between CB 400 and the displays of the HMD. In some embodiments, the displays comprise display 15 associated with OE 42 (shown and described in FIG. 2A above). In some embodiments, the display 15 may comprise displays 49a and 49b of HMD 50 shown and described in FIG. 3 above.


In some embodiments, interface 404 comprises a MIPI Display Serial Interface (MIPI DSI), which is a high-speed interface that is used in various types of consumer devices.


In some embodiments, ES 444 is configured to exchange video signals with external entities. In some embodiments, ES 444 is configured to transmit video of the scene captured by RGB cameras 25 together with the rendered augmented reality images.


In some embodiments, the transmission may be carried out using Wi-Fi-6 (an IEEE standard for wireless local-area networks) for obtaining low-latency and high-speed transmission of the signals.


The transmission technique is not limited to Wi-Fi-6 and may also be carried out over a fifth generation (5G) cellular network or other communications networks.


In some embodiments, these communication techniques may be used to allow operating room staff to observe exactly what the surgeon (e.g., surgeon 26) sees, and for training and/or telesurgery applications.


In some embodiments, surgeon 26 performs the surgery or other medical intervention in the operating room shown in FIG. 1, and receives feedback/comments/instructions on how to proceed with the surgical or other interventional procedure, from remote surgeons or other professionals located at a remote station 412 having a suitable communication assembly 414 (e.g., Wi-Fi-6) configured to receive signals of a video output 415.


In some embodiments, the signals of video output 415 may be recorded for documenting the surgical or other interventional procedure in medical records.


In some embodiments, ES 444 is configured to receive and display input signals of video, and more specifically of high-resolution (HR) video images, also referred to herein as video input 411, received from an external source 408 having a suitable communication assembly 410 (e.g., Wi-Fi-6) configured to transmit the signals of video input 411.


In some embodiments, ES 444 is configured to receive input video 411 from a digital surgical microscope, and to display the received video signals using the aforementioned digital loupe techniques that are described, for example, in U.S. Provisional Patent Application 63/234,272, and in PCT Publication No. WO2023/021450, the disclosures of both of which are incorporated herein by reference. Note that in this use case, displaying such HR images received from the digital surgical microscope may provide surgeon 26 with HR images rendered in augmented reality.


In some embodiments, ES 444 is configured to receive and display input signals of input video 411 comprising endoscopic video received from an endoscopy system. This use case allows surgeon 26 to perform various types of endoscopic or laparoscopic surgical procedures without diverting the gaze to a remote monitor.


In some embodiments, external source 408 may comprise any other suitable video source(s) configured to produce any suitable type of HR video images. In such embodiments, processor 33 is configured to receive video input 411 comprising these HR video images, and to display the images over displays 15 (or any other suitable type of display) using the AR techniques described above or any other presentation technique. Moreover, processor 33 is configured to present any other types of patient information received from external source 408.


In some embodiments, the GPU and the AI accelerator of ES 444 may be used together with processor 33 for controlling system 11 (of FIG. 1 above) using voice commands. For example, the GPU and the AI accelerator of ES 444 may be used together with processor 33 for voice and/or command recognition, using any suitable type of artificial intelligence techniques.


This particular configuration of ES 444, external source 408 and remote station 412 is shown by way of example, in order to illustrate certain problems that are addressed by embodiments of the disclosure and to demonstrate the application of these embodiments in enhancing the performance of system 11 and similar types of image-guided surgical systems. Embodiments of the disclosure, however, are by no means limited to this specific sort of example electronic architecture, and the principles described herein may similarly be applied to other sorts of HMDs and hardware used in suitable types of AR-based image-guided surgical systems and other sorts of image-guided surgical systems. Although medical applications are well-suited for several embodiments, non-medical applications also benefit from many embodiments described herein. For example, non-medical applications may involve consumer or commercial applications such as athletics and fitness, gaming, driving, product design, navigation, manufacturing, logistics, shopping and commerce, educational training, remote collaboration, etc.


Additional Example Head-Mounted Display



FIGS. 21A-21D illustrate another example embodiment of a head-mounted display 2122. The head-mounted display 2122 has many similarities to other head-mounted displays disclosed herein, including, for example, head-mounted display 22 of FIG. 2A, head-mounted display 50 of FIG. 3, and head-mounted display 700 of FIG. 2B, discussed above, and the same or similar reference numbers are used to refer to the same or similar components. Accordingly, the present description of head-mounted display 2122 focuses on differences from the head-mounted displays 22, 50, and 700. Any elements or features discussed herein with reference to other head-mounted displays, such as head-mounted display 22, 50, or 700, may be incorporated into head-mounted display 2122, and likewise any elements or features discussed herein with reference to head-mounted display 2122 may be incorporated into other head-mounted displays discussed herein, such as head-mounted displays 22, 50, or 700.


Some benefits of the design of the head-mounted display 2122 are related to ergonomics, comfort, and/or the ability to enable a user, such as surgeon 26, to utilize the system for relatively long periods of time, such as for four hours or more, without unnecessary fatigue and/or other negative consequences. For example, in some embodiments, a head-mounted display like the head-mounted display 22 of FIG. 2A may weigh approximately 260 g, and such a design may transfer roughly 200 g of that weight to the user's nose through the nose pad 28. While this may be acceptable in some situations, in some situations, such as during longer operations, it may be desirable to reduce the load on the user's nose to, for example, 90 g or less. A design such as the head-mounted display 2122 shown in FIG. 21A can accomplish such a benefit. For example, the head-mounted display 2122 can distribute weight around the wearer's head, including to the wearer's forehead and the back of the wearer's head, to reduce at least some of the weight applied to the wearer's nose. Such a configuration can also reduce pressure on the wearer's temples which can be another relatively weight-sensitive area, in addition to the nose. Stated another way, such a head-mounted display can more widely distribute pressure over larger and/or less sensitive areas, such as the forehead and the back of the head.


The head-mounted display 2122 combines certain features of other head-mounted displays disclosed herein, including the head-mounted display 22, head-mounted display 50, and head-mounted display 700. For example, as discussed in greater detail below, the head-mounted display 2122 includes left and right temple housings that have some similarities to the left and right temple arms of head-mounted display 22, but that are not in contact with and/or being supported by the wearer's temples or ears. As another example, as discussed in greater detail below, the head-mounted display 2122 includes a rear pad and adjustable strap mechanism that can be similar to those used in the head-mounted display 700. Additionally, some embodiments can include an optional upper or top strap that can further distribute weight over the top of a wearer's head.


Another advantage of a design that distributes weight to less sensitive areas of a wearer's head, such as the design of the head-mounted display 2122, is that additional weight may be added to the head-mounted display without significantly increasing the pressure on the wearer's head in any particular spot or in a sensitive area. For example, a flashlight assembly may be attached to the head-mounted display without significantly increasing the pressure on the wearer's head in any particular spot or in a sensitive area.



FIG. 21A is a front perspective view of the head-mounted display 2122, FIG. 21B is a rear perspective view of the head-mounted display 2122, FIG. 21C is a left side view of the head-mounted display 2122, and FIG. 21D is a top view of the head-mounted display 2122. Some similarities of the head-mounted display 2122 to other head-mounted displays disclosed herein include, for example, a head strap 740 and knob 744 for adjusting the head strap 740, similar to in the head-mounted display 700 of FIG. 2B. The head strap 740 can have some additional features, however, in order to accommodate certain differences in the head-mounted display 2122, described in greater detail below.


Other similarities of the head-mounted display 2122 to other head-mounted displays disclosed herein include, for example, an optics housing 704 that comprises one or more cameras 708 and infrared projectors 716 (see FIG. 21A), one or more processors 33, wireless communication devices 45, and/or storage devices 46 (see FIG. 21B), an optical engine 55 that can include the same or similar components as optical engine 55 discussed above with reference to FIG. 3, a plurality of AR displays 49a, 49b similar to as discussed above with reference to FIG. 3, and an adjustable nose pad 28 similar to as discussed above with reference to FIG. 12. In this embodiment, the optical engine is housed within an optical engine housing or frame 2104. In some embodiments, the optics housing 704 and optical engine housing 2104 are part of the same frame or housing, and in some embodiments, the optics housing 704 and optical engine housing 2104 are separate components. In this embodiment, the optics housing 704 and optical engine housing 2104 are configured to tilt together (see FIG. 23B), but some embodiments may allow the optical engine housing 2104 to tilt or otherwise move with respect to optics housing 704. The optics housing 704, optical engine housing 2104, and the components housed within or attached to those housings (such as, but not limited to, displays 49a, 49b, and optical engine 55) may be collectively referred to as a see-through display assembly 2349. Further, in any embodiment disclosed herein that includes one or more displays that are configured to tilt using a pantoscopic tilting assembly (PTA), the displays and any other components, frames, and/or housings that tilt along with the displays can be collectively referred to as a see-through display assembly. In some embodiments, optical engine housing or frame 2104 includes some or all of the features of display assembly frame 41 of FIG. 3.


One difference in the head-mounted display 2122 from the head-mounted display 22 of FIG. 2A is that the head-mounted display 2122 comprises a left temple housing 2143 and a right temple housing 2144 instead of a left temple arm 43 and a right temple arm 44. The left and right temple housings 2143, 2144 may incorporate some or all of the same features as left and right temple arms 43, 44 of the head-mounted display 22 of FIG. 2A, such as including an on/off button 39, housing one or more processors 33, wireless communication devices 45, and/or storage devices 46, including a strain relief 47 for a cable, and/or the like. The head-mounted display 2122 is configured to mount to the user's head (e.g., to surgeon 26) differently, however, and the left and right temple housings 2143, 2144 accordingly do not include rocker arms like the head-mounted display 22. Specifically, the head-mounted display 2122 is configured to attach to a user's head using head strap 740. Head strap 740 has similarities to the head strap 740 of head-mounted display 700 of FIG. 2B, but also has some differences, described below.


With reference to FIG. 21B, the head strap 740 of head-mounted display 2122 comprises or consists essentially of a first or left side strap 2152 and a second or right side strap 2155. The left side strap 2152 comprises a first end 2153 that is affixed to a first or left end of a frame 2150, and the right side strap 2155 comprises a first end 2156 that is affixed to a second or right end of the frame 2150 (see FIG. 21A). The first or left end of the frame 2150 is configured to be positioned adjacent a first or left temple of a user (e.g., surgeon 26), and the second or right end of the frame 2150 is configured to be positioned adjacent a second or right temple of the user (e.g., surgeon 26). The left side strap 2152 further comprises a second end 2154, and the right side strap 2155 further comprises a second end 2157. As discussed in greater detail below, an adjustment mechanism 2158 can be used to adjust the relative positions of the second ends 2154, 2157, in order to adjust a circumferential size of the head strap 740. The adjustment mechanism 2158 and head strap 740 can together be referred to as an adjustable strap assembly.


With further reference to FIG. 21B, the head strap 740 further comprises a forehead support 2160 that comprises a strap extending from a first end 2161 that is pivotably coupled to the left side strap 2152 to a second end 2162 that is pivotably coupled to the right side strap 2155. The forehead support 2160 further comprises a central support 2163 extending down from the main strap and affixed to a middle or central location of the frame 2150. Although not shown in these figures, cushioning or padding may be added to any portion of the head strap 740, including to the forehead support 2160 and elsewhere. An example of such cushioning or padding is discussed below with reference to FIGS. 25 and 26A-26C.


With continued reference to FIG. 21B, in this embodiment, the left and right side straps 2152, 2155 each comprise a front portion 2165 that is pivotably coupled to a rear portion 2166 by a connector 2164. Further, in this embodiment, the front portions 2165 are pivotably connected at the same pivotal connection as the first and second ends 2161, 2162 of the forehead support 2160 strap. Such a configuration is not required, however, and various other embodiments may include more or fewer pivotal connections, or may not include pivotal connections. The pivotal configuration shown in FIG. 21B has been found to be desirable, however, in order to accommodate various users (e.g., surgeons 26) while comfortably and securely maintaining the head-mounted display 2122 on the user's head.


The head strap 740 also desirably comprises a pad or cushion 2170 attached to the adjustment mechanism 2158 that can engage the back of the user's head. Further, the head strap 740 in this embodiment also comprises a front slot 2171 and a rear slot 2172 that can be used to attach an optional upper or top strap, as described in more detail below with reference to FIGS. 24A-24B.


With reference to FIG. 21C, this figure illustrates that the optical engine housing 2104 and optics housing 704 can desirably pivot or tilt with respect to the frame 2150, in order to adjust pantoscopic tilt angle 2168. Additional details of the mechanism used for adjusting the pantoscopic tilt angle 2168 in head-mounted display 2122 are provided below with reference to FIG. 23B. Additionally or alternatively, the same or similar tilting mechanism may be used as described above, such as with reference to FIGS. 13 and 14.


With reference to FIG. 21D, this top view of the head-mounted display 2122 helps to show certain adjustability produced by the adjustment mechanism 2158. Specifically, the adjustment mechanism 2158 desirably adjusts a circumferential size defined essentially by the left side strap 2152, the right side strap 2155, and the frame 2150. For example, if the adjustment mechanism 2158 causes the ends 2154, 2157 to become closer together, then the circumferential size will increase, and if the adjustment mechanism 2158 causes the ends 2154, 2157 to become further apart, then the circumferential size will decrease. Arrow 2180 represents this increase or decrease in circumferential size.


With continued reference to FIG. 21D, this figure also shows that the left temple housing 2143 and right temple housing 2144 can desirably move or pivot inward and outward with respect to the ends 2181, 2182 of the frame 2150. This movement or pivoting is indicated by arrows 2183. The movement 2183 of the temple housings 2143, 2144 can be a result of a combination of adjustments to the circumferential size of the head strap 740 and flexing of the head strap 740 as a user installs the head-mounted display 2122 onto their head or removes the head-mounted display 2122 from their head. The movement or pivoting may be facilitated by the same or similar tilting assembly 111, as discussed above with reference to FIG. 9.


In order to assist with the adjustability of the head-mounted display 2122 and its head strap 740, the left and right temple housings 2143, 2144 can also desirably be movably coupled to a portion of the head strap 740. For example, in this embodiment, the left temple housing 2143 is slidably coupled to the left side strap 2152, and the right temple housing 2144 is slidably coupled to the right side strap 2155. More specifically, each of the temple housings 2143, 2144 desirably comprises or consists essentially of a follower 2185 that is slidable forward and backward, and that is affixed to the left or right side strap 2152, 2155 by one or more fasteners, such as fastener 2186. Further details of this structure are described below with reference to FIGS. 23A-23D.



FIGS. 21A and 21D also show that the head-mounted display 2122 comprises a cover 2190 that is removably coupled to the frame 2150. Desirably, this cover 2190 may be removed and replaced with a detachable flashlight assembly, such as is described in greater detail below, with reference to FIGS. 27A-27D and 28A-28F.


With reference to FIG. 21D, this figure also shows that each of the temple housings 2143, 2144 can comprise a plurality of fins 2199 (e.g., heat-dissipation fins) which can, for example, help with heat dissipation. Various other heat dissipation features may also or alternatively be used, including vents, heatsinks, fins, protrusions, active and/or passive cooling, and/or the like. Such features may be desirable, such as to dissipate heat generated by, for example, one or more processors 33 located within one or both of the temple housings 2143, 2144.


This particular configuration of HMD 2122 is shown by way of example, in order to illustrate certain problems that are addressed by certain embodiments and to demonstrate the application of these embodiments in enhancing the performance of such a system. Embodiments of the disclosure, however, are by no means limited to this specific sort of example HMD configuration, and the principles described herein may similarly be applied to other sorts of HMDs and HUDs used in any suitable types of near-eye display AR-based image-guided surgical systems. HMD 2122 may be used in non-medical applications. For example, non-medical applications may involve consumer or commercial applications such as athletics and fitness, gaming, driving, product design, navigation, manufacturing, logistics, shopping and commerce, educational training, remote collaboration, etc.


Example Adjustment Mechanism Details



FIGS. 22A and 22B illustrate additional detail of the adjustment mechanism 2158 that can be used to adjust a size of the head strap 740 of head-mounted display 2122. Additionally, FIGS. 23A-23D illustrate additional details of the follower mechanism that assists in controlling the inward and outward pivoting 2183 of the left and right temple housings 2143, 2144 (see FIG. 21D). These figures also illustrate additional details of the pantoscopic tilting mechanics.


As discussed above, the head-mounted display 2122 can desirably be configured to be usable by users (e.g., surgeons 26) having a variety of head sizes and/or shapes. Accordingly, the head strap 740 can desirably be adjustable, such as to adjust a circumferential size of the head strap 740 to accommodate users' heads of various shapes and sizes. In order to make the head-mounted display 2122 more comfortable for a user and to better accommodate users having differently sized and/or shaped heads, in addition to adjusting the circumferential size of the head strap 740, the left and right temple housings 2143, 2144 can desirably be movably or pivotably coupled to the frame 2150 (as discussed above with reference to FIG. 21D). Accordingly, when the head strap 740 is adjusted to be smaller (e.g., by using knob 744 to reduce the circumferential size of the head strap 740), the left and right temple housings 2143, 2144 can desirably pivot inward (e.g., in a direction towards the user's head) with respect to the frame 2150. Likewise, when the head strap 740 is adjusted to be larger (e.g., by using knob 744 to increase the circumferential size of the head strap 740), the left and right temple housings 2143, 2144 can desirably pivot outward (e.g., in a direction away from the user's head) with respect to the frame 2150. Such pivoting may also occur as a user installs or removes the head-mounted display 2122.



FIGS. 22A and 22B illustrate additional details of how the knob 744 of the adjustment mechanism 2158 causes the change in circumferential size of the head strap 740 (see, e.g., adjustment 2180 of FIG. 21D). Specifically, in this embodiment, each of the side straps 2152, 2155 comprises a rack 2210 that engages a pinion or pinion gear 2212. In this embodiment, there is a single pinion gear 2212 that rotates with the knob 744 and that engages both of the racks 2210. In some embodiments, however, there may be separate pinion gears, with each pinion gear engaging one of the racks 2210. Rotation of the knob 744, and thus the pinion gear 2212, desirably causes the ends of the straps 2157, 2154 to move closer together or further apart, thus changing the circumferential size of the head strap.
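

By way of a rough numerical illustration (the pinion radius and amount of knob rotation below are illustrative assumptions, not dimensions taken from this disclosure), the relationship between rotation of the knob 744 and the resulting change in circumferential size, for a single pinion engaging both racks, can be sketched as follows:

import math

def circumference_change_mm(pinion_radius_mm: float, knob_turns: float) -> float:
    """Rough estimate of the change in head-strap circumference.

    Each knob turn advances each rack by the pinion circumference
    (2*pi*r), and because the single pinion drives both racks at once,
    the relative travel of the two strap ends, and hence the change in
    circumferential size, is twice the per-rack travel. The pinion
    radius and number of turns are illustrative assumptions only.
    """
    travel_per_rack_mm = 2 * math.pi * pinion_radius_mm * knob_turns
    return 2 * travel_per_rack_mm

# Example: a hypothetical 5 mm pinion turned a quarter turn
print(f"{circumference_change_mm(5.0, 0.25):.1f} mm")  # roughly 15.7 mm

Under these assumed numbers, a quarter turn of the knob 744 would change the circumferential size by roughly 16 mm; the actual gearing of the adjustment mechanism 2158 may of course differ.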


In order to maintain the circumferential size of the head strap in a particular configuration, the adjustment mechanism 2158 further comprises a tension mechanism (e.g., stop mechanism) 2215. The tension mechanism 2215 is configured to maintain the knob 744 in a particular position until, for example, a user overcomes a threshold force of the tension mechanism 2215 in order to rotate the knob 744. For example, the tension mechanism 2215 shown in FIGS. 22A and 22B comprises a gear 2218 on an inside surface of the knob 744, with the gear 2218 engaging two other gears 2217. Further, a tension member 2219 is positioned between the gears 2217 and applies friction to the gears 2217 in order to inhibit or restrict their rotation until a threshold force is overcome. It should be noted that the tension mechanism 2215 shown in FIGS. 22A and 22B is merely one example, and various other techniques may be used to maintain the adjustment mechanism 2158 in a particular configuration until a user wishes to adjust the configuration. For example, other ways to generate friction may be used, a separate locking mechanism may be used, a gearing assembly that includes a mechanical advantage that is configured to be self-locking may be used, and/or the like.


It should be noted that this is merely one example of how an adjustment mechanism can work, and various other ways of adjusting the size of the head strap 740 may be used. For example, the left and right side straps 2152, 2155 may be adjusted and then held in place with respect to each other using a different type of geartrain or gearing mechanism, friction, and/or the like.


Turning to FIGS. 23A-23D, these figures illustrate additional details of the follower mechanism and tilting mechanism. Specifically, with reference to FIGS. 23B-23D, these figures illustrate additional details of the follower 2185 and fastener 2186 discussed above with reference to FIG. 21D. These figures illustrate the right temple housing 2144, and the same or similar features may be used in the left temple housing 2143. As can be seen in FIG. 23B, the follower 2185 desirably comprises or consists essentially of an elongate protrusion that is slidably coupled to an elongate slot 2387 in a side wall of the temple housing 2144. As discussed above, when the head strap 740 is adjusted and/or flexed, the follower 2185 can desirably slide with respect to the temple housing 2144, thus allowing the temple housing 2144 to move inward or outward with the head strap 740, while still being sufficiently vertically supported by the head strap 740.


Turning to FIGS. 23C and 23D, these figures show additional details of how the follower 2185 is coupled to the side strap 2155 and temple housing 2144. Specifically, a fastener 2186 and a cap or washer 2388 affix the follower 2185 to the side strap 2155. Further, the wall of the temple housing 2144 desirably comprises a portion 2389 that is positioned between the elongate protrusion of the follower 2185 and the strap 2155, thus retaining the follower 2185 in position with respect to the temple housing 2144.


It should be noted that this is merely one example way of movably or slidably mounting a temple housing to a side strap, and other techniques may also be used. Desirably, the temple housings are supported by the side straps in a generally vertical direction while being free to move in a generally horizontal direction with respect to the side straps. Such a configuration can lead to adequate vertical support of the temple housings by the head strap 740 while allowing adjustments for a comfortable and snug fit on the user's head.


Returning to FIG. 23B, this figure also illustrates additional details of a pantoscopic tilting assembly (PTA) 2355 that enables tilting (e.g., rotating, pivoting, moving, or sliding) of the optical engine housing 2104 and optical housing 704 with respect to the frame 2150. The PTA 2355 performs a similar function to the PTA 155 and PTA 156 discussed above with reference to FIG. 14, and the same or similar reference numbers are used to refer to the same or similar components. Further, any of the other PTAs disclosed herein, including but not limited to PTA 156 and PTA 155, may be incorporated into the head-mounted display 2122, and the PTA 2355 of head-mounted display 2122 may be incorporated into any of the other head-mounted displays disclosed herein.


The PTA 2355 comprises or consists essentially of a virtual hinge or axis (represented by point 2361), created by arc-shaped slot 2374 as a radial cam, and rotatable section or guide member 2379 as a follower, about which optical engine housing 2104 and housing 704 can pivot (e.g., tilt, rotate, move, or slide) with respect to the frame 2150, thus causing adjustment to the pantoscopic tilt angle 2168. The center of the virtual hinge or axis 2361 is desirably located at the center of the human eyeball (e.g., eyeball 158 of FIG. 13), so that movements of the portable or movable parts (e.g., of optical engine housing 2104 and housing 704) about or with respect to the stationary parts (e.g., frame 2150 and part 2372 of the frame 2150 that defines the slot 2374) maintain undistorted vision on the optical engine displays (e.g., display 49a of FIG. 21A). In order to maintain the optical engine housing 2104 and housing 704 at a particular tilt angle or position with respect to the frame 2150, the PTA 2355 further comprises a locking element 2373. In this embodiment, the locking element 2373 comprises a spring-loaded ball that can engage any one of a plurality of detents 2375 that are fixed with respect to the frame 2150, in order to maintain the optical engine housing 2104 and housing 704 (and other parts of the see-through display assembly 2349 that tilt along with housing 2104 and housing 704) in any one of a plurality of predefined positions corresponding to the plurality of detents 2375. Specifically, in this embodiment, the ball 2373 is coupled to an arm 2376 that is configured to pivot about axis 2380 (with axis 2380 being parallel to the plane of this drawing, and thus the pivoting action of the arm 2376 being into or out of the page). Further, the arm 2376, and thus also the ball 2373, is spring-loaded about the axis 2380 using, for example, a spring 2371 (positioned underneath a lower end of the arm 2376 in this cross-sectional view). When a user (such as surgeon 26) wishes to rotate the optical engine housing 2104 and optical housing 704 with respect to the frame 2150, the user may apply a force to the optical engine housing 2104 and/or housing 704 that causes the rotatable section 2379 to transfer the force to the locking element 2373 through the arm 2376. Once a sufficient force is applied to overcome a threshold force of the spring-loaded ball, the locking element 2373 may disengage one of the detents 2375 of the frame 2150 and allow the optical engine housing 2104 to pivot about the virtual axis 2361 with respect to the frame 2150. Once the optical engine housing 2104 is at the desired pantoscopic tilt angle 2168, the locking element 2373 can maintain the optical engine housing 2104 at that angle by re-engaging one of the detents 2375. The optical engine housing 2104 will desirably then be maintained at that angle with respect to the frame 2150 until the threshold force of the spring-loaded locking element 2373 is again exceeded. Although FIG. 23B illustrates the PTA 2355 of only the right side of head-mounted display 2122, the same or similar structures may also be used on the left side of head-mounted display 2122.


Various modifications to the pantoscopic tilting assembly 2355 may be made. For example, a spring-loaded pin may be used instead of a spring-loaded ball. As another example, the spring-loaded pin or ball may be spring-loaded by a spring that is directly engaged with and/or in line with the ball instead of engaging a lever arm (e.g., arm 2376) that in turn engages the ball. As another example, friction may be used to maintain the optical engine housing 2104 and optical housing 704 at a particular angle with respect to the frame 2150, instead of or in addition to a detent mechanism. Further, the described mechanism (e.g., PTA 2355) is only one example, which in this embodiment implements a virtual axis 2361 with five detents 2375, providing a range of 20° of adjustability (e.g., from 15° to 35° horizontal tilt), although various embodiments may include other ranges of adjustability, as discussed above with reference to FIGS. 15A-15C. Any other virtual axis mechanism can be implemented, such as using friction instead of detents, and/or a totally positive locking mechanism instead of a “semi positive” locking mechanism (like a spring-loaded ball and detents) that unlocks when a threshold force is exceeded. Additionally, some PTAs may utilize a real, or physical, axis (e.g., an axis created by a mechanical hinge) for pivoting instead of a virtual axis. It can be desirable to use a virtual axis, however, such as to reduce or eliminate side field of view limitations and/or obstacles. For example, with reference to FIG. 23B, if the frame 2150 and housing 2104 included mechanical components extending to point 2361 to form a physical hinge at point 2361 (instead of the current virtual axis or virtual hinge at point 2361), those mechanical components may be visible in the surgeon's peripheral vision, and may thus undesirably block some or all of the surgeon's peripheral vision. That said, the disclosure herein is not limited to PTAs that utilize virtual axes, and some embodiments may utilize a physical hinge to form the axis about which the PTA can rotate.
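

As a simple geometric sketch of the detent arrangement described above (the arc radius below is an illustrative assumption; this disclosure specifies only the five detents and the example 15° to 35° range), evenly spaced detents along an arc-shaped slot centered on the virtual axis 2361 could be laid out as follows:

import math

def detent_layout(radius_mm: float, min_deg: float = 15.0,
                  max_deg: float = 35.0, detents: int = 5):
    """Place evenly spaced detents along an arc centered on the virtual axis.

    Because the arc-shaped slot 2374 is centered on the virtual hinge
    (nominally the center of the wearer's eyeball), engaging successive
    detents tilts the see-through display assembly in equal angular steps
    without any physical hinge at that point. The radius is an
    illustrative assumption.
    """
    step = (max_deg - min_deg) / (detents - 1)
    layout = []
    for i in range(detents):
        angle_deg = min_deg + i * step
        a = math.radians(angle_deg)
        # x, y position of the detent relative to the virtual axis
        layout.append((angle_deg, radius_mm * math.cos(a), radius_mm * math.sin(a)))
    return layout

for angle_deg, x, y in detent_layout(100.0):
    print(f"tilt {angle_deg:.0f} deg -> detent at ({x:.1f}, {y:.1f}) mm")

With five detents over the example 15° to 35° range, the detents sit at 5° increments along the arc; a different radius, range, or detent count would simply change the spacing.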


Example Additional Head Strap Features


As discussed above with reference to the head-mounted display 2122 shown in FIGS. 21A-21D, the head strap 740 may optionally include a detachable upper or top strap. FIGS. 24A and 24B illustrate the head-mounted display 2122 with such a detachable upper or top strap 2410 coupled thereto. Specifically, the top strap 2410 comprises a first end 2411 that has been passed through slot 2171 and folded back on itself. The first end 2411 may comprise, for example, a hook and loop fastener, a button, and/or the like, that maintains the position of the first end 2411 with respect to the slot 2171. Similarly, the top strap 2410 comprises a second end 2412 that has been passed through slot 2172 and folded back on itself. The second end 2412 may comprise, for example, a hook and loop fastener, a button, and/or the like, that maintains the position of the second end 2412 with respect to the slot 2172.


It can be desirable to have an optional detachable top strap, such as top strap 2410, because some users may prefer to use the head-mounted display 2122 without a top strap, and some users may prefer to use the head-mounted display 2122 with a top strap. That said, some embodiments may include a permanently attached top strap, and some embodiments may not have an option to attach a top strap. It can also be desirable to have the top strap 2410 be adjustable, such as by adjusting a length of the top strap 2410. One way to accomplish this is to allow the amount of the strap at the first and/or second ends 2411, 2412 that is passed through the corresponding slot and folded back on itself to be varied. An additional or alternative way to accomplish this is to include an adjuster in the strap, such as a buckle, hook and loop fasteners (see, e.g., the description below related to FIGS. 26A-26C), and/or the like. The top strap 2410 may be formed from a variety of materials, such as flexible polymer, fabric, and/or the like. In some embodiments, the top strap 2410 comprises a soft fabric, which can help with comfort, for example.


As discussed above, FIGS. 21A-21D illustrate the head strap 740 of head-mounted display 2122 without any padding or cushioning added thereto, other than the rear pad 2170. Turning to FIG. 25, this figure shows an example of the head-mounted display 2122 with a pad 2520 having been added at a front of the head-mounted display 2122, in order to provide some cushion between the head-mounted display 2122 and a forehead of the user (e.g., surgeon 26). Specifically, the pad 2520 in this embodiment includes a first portion 2521 positioned between the main strap of the forehead support 2160 and the surgeon 26, a second portion 2522 positioned between the central support 2163 and the surgeon 26, and a third portion 2523 positioned between the frame 2150 and the surgeon 26.



FIG. 25 illustrates merely one example of cushioning or padding being added to the head-mounted display 2122; more or less cushioning may be provided, the cushioning may be provided in additional locations (such as along the side straps of the head-mounted display 2122), and/or the like.


Turning to FIGS. 26A-26C, these figures illustrate the same head-mounted display 2122 of FIG. 25 that has a forehead pad 2520 added, but also has a top strap 2610 added. The top strap 2610 is similar to the top strap 2410 of FIGS. 24A and 24B, except that additional features are shown in FIGS. 26A-26C. Specifically, FIG. 26C shows that each end of the top strap 2610 utilizes a button 2670 to maintain the top strap 2610 in position with respect to the rest of the head-mounted display 2122. Further, FIGS. 26A and 26C show that the top strap 2610 is provided in two parts, namely a rear portion 2671 and a front portion 2672. The rear portion 2671 is laid over the top of the front portion 2672. A hook and loop fastener 2673 desirably maintains the rear portion 2671 in position with respect to the front portion 2672. Such a design can desirably enable the length of the top strap 2610 to be adjusted by, for example, changing the relative position of the rear portion 2671 with respect to the front portion 2672. Although this embodiment desirably uses a hook and loop fastener between the front and rear portions 2672, 2671, other embodiments may use other techniques, such as buttons, a buckle mechanism, elastic, and/or the like.


Additional Example Detachable Lighting Systems


As discussed above, various embodiments of head-mounted displays may include a permanently attached or detachable headlight or lighting system that includes a flashlight, such as to illuminate the area in which a surgical procedure is being performed. Various examples of such lighting systems are described above with reference to, for example, the embodiments of FIGS. 2B, 4, 5, 6, 7, and 8. The head-mounted display 2122 of FIG. 21A, discussed above, may be modified to be used with any of such disclosed lighting systems. Additionally, FIGS. 27A-27D illustrate another example embodiment of a detachable lighting system 2701 in use with the head-mounted display 2122. The detachable lighting system 2701 may also be incorporated into any of the other head-mounted displays disclosed herein.



FIGS. 27A and 27B illustrate the detachable lighting system 2701 attached to a mounting socket 2710 of housing 704 of head-mounted display 2122. For example, the removable cover 2190 shown in FIG. 21A has been removed and replaced with mounting base 2705 of lighting system 2701. As can be seen in FIGS. 27A and 27B, the lighting system 2701 comprises or consists essentially of a flashlight 2703 that is pivotably coupled to the mounting base 2705 by an arm 2707. More specifically, the arm 2707 is pivotably coupled at a first end to the flashlight 2703 and at a second end to the mounting base 2705. These pivotal connections desirably allow the flashlight 2703 to be repositioned as needed. For example, in the head-mounted display 2122, the optics housing 704 that incorporates the flashlight mounting socket 2710 will tilt along with the optical engine housing 2104 when the pantoscopic tilt angle of the optical engine housing 2104 is adjusted with respect to the frame 2150. Accordingly, when the pantoscopic tilt angle is adjusted, the orientation of the flashlight 2703 may also change with respect to the frame 2150. In such a case, it may be desirable to reposition the flashlight 2703, such that the flashlight 2703 can be directed toward a same region of interest regardless of the current pantoscopic tilt angle of the optical engine housing 2104. The arm 2707 may be aimed so that the light beam is aligned with the wearer's line of sight for optimal visibility.



FIG. 27C illustrates a side view of the lighting system 2701 detached from the rest of the head-mounted display. This figure shows more detail of the mounting base 2705, which will be described in further detail below with respect to the embodiment of FIGS. 28A-28F. FIG. 27D illustrates a perspective view of the head-mounted display 2122 with the lighting system 2701 removed. This figure shows more detail of the flashlight mounting socket 2710, which will also be described in further detail below with respect to the embodiment of FIGS. 28A-28F.


Turning to FIGS. 28A-28F, these figures illustrate another embodiment of a lighting system 2801 that can be used with the head-mounted display 2122 (or any other head-mounted displays disclosed herein). The same or similar reference numbers are used as for lighting system 2701 of FIGS. 27A-27D to refer to the same or similar features. The main substantive difference between lighting system 2801 and lighting system 2701 is that lighting system 2801 comprises or consists essentially of two pivotably coupled arms 2707 instead of a single arm 2707. Specifically, with reference to FIG. 28A, the two arms 2707 are pivotably coupled together, with one of the arms 2707 being pivotably coupled to the mounting base 2705, and the other arm 2707 being pivotably coupled to the flashlight 2703. This design can, for example, enable a greater range of adjustability of the position and/or angle of the flashlight 2703 as compared to the lighting system 2701 of FIG. 27A. That said, the single-arm design of lighting system 2701 of FIG. 27A can also be beneficial, particularly if using the single arm 2707 enables a sufficient range of adjustment while reducing weight and/or complexity of the system.


Turning to FIG. 28E, this figure illustrates additional detail of the flashlight mounting socket 2710. Specifically, the flashlight mounting socket 2710 comprises or consists essentially of a first rod 2820 extending generally horizontally, and a second rod 2822 extending generally parallel to the first rod 2820. The flashlight mounting socket 2710 further comprises two electrical contacts 2834 at a base of the socket. The electrical contacts 2834 may, for example, be used to provide power for use by the flashlight 2703. More or fewer electrical contacts may be used (including zero electrical contacts). In some embodiments, one or more electrical contacts may be used for control signals, such as to turn flashlight 2703 on or off, adjust a brightness of flashlight 2703, and/or the like.


Turning to FIGS. 28B-28D, these figures illustrate the lighting system 2801, and enlarged views of the mounting base 2705 of the lighting system 2801, detached from the mounting socket 2710 of the head-mounted display 2122. The mounting base 2705 in this embodiment comprises a first recess 2821 and a second recess 2823. The first recess 2821 is sized and positioned to engage the first rod 2820 of the mounting socket 2710 (see FIG. 28E), and the second recess 2823 is sized and positioned to engage the second rod 2822 of the mounting socket 2710 (see FIG. 28E). For example, in order to attach the mounting base 2705 to the mounting socket 2710, a user may position the first recess 2821 such that it is engaged with the first rod 2820, and the user may then rotate the lighting system 2801 downward (e.g., pivoting about the first rod 2820) until the second recess 2823 has engaged the second rod 2822. As can be seen in FIG. 28C, the first recess 2821 opens in a generally rearward direction, and second recess 2823 opens in a generally downward direction. Because the two recesses open in different directions, when both recesses 2821, 2823 are engaged with the rods 2820, 2822, respectively, the engagement of the second recess 2823 with the second rod 2822 resists the first recess 2821 from becoming disengaged from the first rod 2820.


As can further be seen in FIGS. 28C and 28D, the mounting base 2705 further comprises a movable latch 2824 that can pivot about pivot axis 2828 in order to allow or disallow second rod 2822 from entering and/or exiting the second recess 2823. Accordingly, when the second recess 2823 is fully engaged with the second rod 2822, and the latch 2824 is in the closed position (e.g., the position shown in FIGS. 28C and 28D), the latch 2824 will resist second rod 2822 from becoming disengaged from second recess 2823, thus fully (or at least sufficiently) constraining the location of the lighting system 2801 with respect to the rest of the head-mounted display 2122.


With continued reference to FIG. 28D, the mounting base 2705 further comprises a user interface feature, such as a button 2826, that can be manipulated by a user in order to move the latch 2824 into the open position, thus allowing removal of the mounting base 2705 from the mounting socket 2710. Desirably, the latch 2824 is biased to the closed position by, for example, a spring, such as spring 2830 visible in FIG. 28D. As can also be seen in FIG. 28D, the latch 2824 desirably comprises an angled or tapered outer surface that will engage the second rod 2822 during installation and force the latch 2824 to pivot and allow the second rod 2822 to engage the second recess 2823 without necessarily requiring a user to manipulate the button 2826 during installation. Stated another way, the latch 2824 and button 2826 may be configured such that a user can attach the mounting base 2705 to the mounting socket 2710 without manipulating the button 2826, but may be configured to require the button 2826 to be manipulated in order to remove the mounting base 2705 from the mounting socket 2710.



FIG. 28D also illustrates an electrical contact 2832 extending downward from the bottom of the mounting base 2705. Desirably, the electrical contact 2832 may be spring-loaded and/or elastically bendable such that a sufficiently low resistance electrical connection may be made between the electrical contact 2832 and a corresponding one of the electrical contacts 2834 of the mounting socket 2710 (see FIG. 28E). Although the side view of FIG. 28D only shows one electrical contact 2832, the mounting base 2705 desirably comprises or consists essentially of two electrical contacts 2832, positioned corresponding to the positions of the two electrical contacts 2834 of the mounting socket 2710 shown in FIG. 28E. Only one is visible in FIG. 28D, however, because the second is directly behind the first in this side view orientation. Further, similar to as discussed above with respect to the electrical contacts 2834 of mounting socket 2710, the mounting base 2705 may include more or fewer electrical contacts 2832.


With reference to FIG. 28F, which shows the lighting system 2801 attached to the rest of the head-mounted display 2122, this figure also shows a strain relief member 2830 extending out a front of the mounting base 2705. Although not shown, the strain relief member 2830 may have one or more wires passing therethrough that, for example, transfer electrical power and/or control signals from the electrical contacts 2832 (received from electrical contacts 2834) to the flashlight 2703.


Example Optics Considerations and Clip-On Lens Assemblies


Various users of the head-mounted displays disclosed herein (e.g., surgeon 26) may require prescription lenses for vision correction. For example, at least some of such users may typically wear prescription eyeglasses to correct their vision. Some of the head-mounted displays disclosed herein may be able to fit over such prescription eyeglasses, but such a configuration may not be ideal. As an alternative to a surgeon needing to wear both prescription eyeglasses and a head-mounted display as disclosed herein, various embodiments of the head-mounted displays disclosed herein may be configured to have prescription lenses coupled thereto in order to allow the surgeon to use the head-mounted display with clear vision without needing to wear separate corrective devices, such as eyeglasses, contact lenses, and/or the like.



FIGS. 29A-29E illustrate one embodiment of a head-mounted display 2922 that includes clip-on, snap-on, removable, and/or replaceable prescription lenses, to enable the head-mounted display 2922 to be used by a variety of users (e.g., surgeons 26) with a variety of corrective lens requirements. The head-mounted display 2922 is similar to the head-mounted display 50 described above with reference to FIG. 3, and the same or similar reference numbers are used to refer to the same or similar components. The head-mounted display 2922 depicted in FIG. 29A includes two significant differences from the head-mounted display 50 depicted in FIG. 3. First, left and right temple arms 43, 44 (which may be included in head-mounted display 50, but are not shown in FIG. 3) are shown attached to the display assembly frame 41. The temple arms 43, 44 may be the same or similar to, for example, the left and right temple arms 43, 44 of the head-mounted display 22 of FIG. 2A. Second, the displays 49a, 49b have been modified to utilize posterior lenses 2912 that are part of a detachable lens assembly 2910 (e.g., clip-on lens assembly, snap-on lens assembly, and/or the like) (see FIG. 29B).



FIG. 29C depicts a cross-section of the right display 49a, with the left display 49b being similar in construction. The display 49a includes an anterior lens 2902 and a posterior lens 2912. Between the anterior lens 2902 and posterior lens 2912 is a reflective waveguide lens 2904, which can be used to display images transmitted from the waveguide 52 (see FIG. 3). For example, the waveguide lens 2904 may be a lens that is embedded with multiple partially reflective mirrors that enable a user to see images transmitted to the waveguide lens 2904 by the waveguide 52, while also allowing the user to see through the waveguide lens 2904. In some embodiments, the reflective waveguide lens 2904 comprises a reflective waveguide lens available from Lumus Ltd. (Ness Ziona, Israel). It should be noted that, although the embodiment shown in FIG. 29C includes a detachable lens assembly, the various lenses shown in FIG. 29C may also be used in a similar configuration in embodiments that are not intended to have any of the lenses be detachable. For example, some embodiments may include anterior lens 2902, posterior lens 2912, and reflective waveguide lens 2904 positioned in substantially the same relative positions as shown in FIG. 29C, but all permanently or semi-permanently installed, instead of having the posterior lens 2912 be removable.


In some embodiments, the posterior lens 2912 is by default (e.g., before consideration for a personal prescription compensation) shaped to provide a particular diopter to achieve a focus for an AR image at a particular operational distance. For example, in some embodiments, the posterior lens 2912 is by default shaped to provide −2D diopter compensation to achieve a focus at a 0.5 m (e.g., 50 cm plus or minus 20 cm) operational distance. The anterior lens 2902 may be shaped to compensate for the above-described effect of the posterior lens 2912 (e.g., to reduce or eliminate the effect of the posterior lens 2912 on the view of reality through the display 49a). For example, with the default −2D posterior lens 2912, the anterior lens 2902 may, for example, be shaped to provide a +2D diopter compensation. The considerations for focal distance in the disclosed systems can be different from those for normal eyeglasses or other augmented reality systems (such as consumer-focused augmented reality systems). For example, normal eyeglasses or other augmented reality systems (such as consumer-focused augmented reality systems) may be configured to achieve focus at a distance of approximately 3-4 m or greater. When using systems as disclosed herein with surgical or other medical procedures, however, the desired operational or focal distance may be significantly lower, such as approximately 50 cm, within a range of approximately 30 cm to 70 cm, and/or the like. For example, the wearer may be viewing a treatment or diagnostic site from a relatively close range from a standing or sitting position adjacent a patient. That said, the systems disclosed herein may also be used in other applications (e.g., athletics and fitness, gaming, driving, product design, navigation, manufacturing, logistics, shopping and commerce, educational training, remote collaboration, etc.), which each may have longer or shorter desired focal distances, and the diopter compensations in the anterior and posterior lenses may be adjusted accordingly to account for such different focal distances.
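

For reference, diopters and focal distance are related by D = 1/f, with f in meters, which is why a 0.5 m operational distance corresponds to a 2 diopter magnitude. The following is a minimal sketch of how a default posterior power and a wearer's spherical prescription might be combined; treating the prescription as simply additive is an illustrative assumption for explanation only, not a description of the actual lens design process:

def posterior_lens_power(operational_distance_m: float,
                         user_prescription_d: float = 0.0) -> float:
    """Sketch of the posterior lens power in diopters.

    The default power focuses the AR image at the operational distance
    (e.g., -2D for 0.5 m); adding the wearer's spherical prescription
    directly is an illustrative simplification.
    """
    default_power_d = -1.0 / operational_distance_m
    return default_power_d + user_prescription_d

def anterior_lens_power(operational_distance_m: float) -> float:
    """Anterior power that cancels the default posterior power (e.g., +2D for 0.5 m)."""
    return 1.0 / operational_distance_m

print(posterior_lens_power(0.5))        # -2.0 D (default, no correction)
print(posterior_lens_power(0.5, -1.5))  # -3.5 D for a hypothetical -1.5D myopic wearer
print(anterior_lens_power(0.5))         # +2.0 D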


With continued reference to FIG. 29C, for a user (e.g., surgeon 26) that does not require vision correction, the default posterior lens 2912 may be acceptable. For a user that requires vision correction, however, it may be desirable to replace the posterior lens 2912 with an alternative lens that changes the compensation in order to correct for that specific user's vision. Accordingly, the posterior lens 2912 in the head-mounted display 2922 is part of a detachable, removable and/or replaceable clip-on lens assembly 2910 that is shown in FIG. 29B. With reference to FIG. 29B, the clip-on lens assembly 2910 comprises or consists essentially of a frame 2911 (e.g., posterior lens frame) that holds left and right posterior lenses 2912, and that includes a bridge portion 2916 that fixes the distance between the two posterior lenses 2912. It should be noted that alternative embodiments could use individual clip-on lenses that, for example, do not have a bridge 2916 coupling them together to form a single clip-on lens assembly.


The frame 2911 of the clip-on lens assembly 2910 further comprises or consists essentially of two protrusions 2918 protruding from a top of the frame 2911. The protrusions 2918 are shaped to fit into corresponding recesses 2919 of frame 41 of the head-mounted display 2922 (see FIG. 29A). Additionally, the frame 2911 includes two clips 2914 protruding from a bottom portion of the frame 2911, and shaped to form a snap fit into a slot 2915 in the frame 41 (see FIG. 29C). In order to install the clip-on lens assembly 2910 onto the frame 41, a user may, for example, insert the protrusions 2918 into the recesses 2919, and then pivot the clip-on lens assembly 2910 into the final position shown in FIGS. 29A and 29C, with the clips 2914 snapped into corresponding slots 2915. Removal may be conducted by performing the opposite process, namely pulling the clips 2914 from the slots 2915, pivoting the bottom of the frame 2911 away from the frame 41, and then removing the protrusions 2918 from the recesses 2919 in the frame 41.


Various other mechanical methods of removably attaching a posterior lens assembly to the frame 41 may also or alternatively be used. For example, more clips 2914 and/or protrusions 2918 may be used, the clips 2914 and/or protrusions 2918 may be replaced by and/or supplemented by magnets, components that form a friction fit, adhesives, screws, other fasteners, and/or the like.


Turning now to FIG. 29D, this figure illustrates additional details of the cross-sectional view of FIG. 29C. Specifically, FIG. 29D illustrates how the lenses 2902 and 2912 can be sealed to the reflective waveguide lens 2904. For example, a first seal 2920 may be positioned about a perimeter of the anterior lens 2902 and seal anterior lens 2902 to a front side of the reflective waveguide lens 2904. FIG. 29E is a perspective view of one of the anterior lenses 2902 that shows an example of the seal 2920 positioned about the perimeter of the anterior lens 2902. In this embodiment, the seal 2920 may comprise double-sided tape, such as a double-sided foam tape that has adhesive on both sides and thus permanently or semi-permanently adheres the anterior lens 2902 to a front side of the reflective waveguide lens 2904.


With continued reference to FIG. 29D, a second seal 2922 is shown between the frame 2911 of the clip-on lens assembly 2910 and a rear side of the reflective waveguide lens 2904. In order to make the clip-on lens assembly 2910 more easily removable and/or replaceable, the seal 2922 may in some embodiments comprise a single-sided tape, such as a foam tape that has adhesive on one side, such as the side connected to the frame 2911. Accordingly, with such a configuration, the frame 2911 may be sealed against the reflective waveguide lens 2904, but may be easily removed and replaced without leaving adhesive residue on the reflective waveguide lens 2904. In some embodiments, one or both of seals 2920 and 2922 may also perform the function of acting as a spacer between the anterior and posterior lenses and the reflective waveguide lens, which can be beneficial in some cases.


In this embodiment, the posterior lens 2912 is desirably affixed to the frame 2911 of the clip-on lens assembly 2910. For example, as can be seen in FIG. 29C, the posterior lens 2912 may comprise a hump or protrusion 2930 that fits into a corresponding groove or depression 2932 in the frame 2911. The fit between the frame 2911 and posterior lens 2912 may desirably be an interference fit that fixedly retains and seals the posterior lens 2912 to the frame 2911. Additionally or alternatively, the perimeter of the posterior lens 2912 may be adhered to the frame 2911 with adhesive.


The design discussed above and shown in FIGS. 29A-29E has a number of beneficial features including, for example, relatively easy tool-free installation of custom prescription posterior lenses, maintaining sealing between both the anterior and posterior lenses and the reflective waveguide lens, and the potential for relatively quick production of customized head-mounted displays. The maintaining of sealing between the anterior and posterior lenses and the reflective waveguide lens can be particularly beneficial in, for example, maintaining ideal transparency of the reflective waveguide lens.


The frame 2911 of the clip-on lens assembly 2910 may be manufactured from a variety of materials. In some embodiments, it may be desirable to manufacture the frame 2911 from PEEK (polyetheretherketone), which may, for example, have a relatively favorable strength-to-weight ratio and may be suitable for use with a variety of cleaning procedures. In some embodiments, the weight of the frame 2911 may be approximately or no greater than 4 g. In some embodiments, other materials may be used for the frame 2911, such as polycarbonate, which may be a more efficient material to use in some cases.


Turning to FIG. 30, this figure illustrates schematically how the cross-sectional profile of the frame 2911 of the clip-on lens assembly 2910 may vary with posterior lenses 2912 having different diopter compensations. Specifically, FIG. 30 illustrates cross-sectional views of three different variations of the clip-on lens assembly 2910, namely clip-on lens assemblies 2910a, 2910b, and 2910c. Each of these variations shows a width W that indicates the overall width of the clip-on lens assembly 2910a, 2910b, 2910c. Clip-on lens assembly 2910b illustrates a diopter compensation of zero, and is the narrowest width assembly, with the width W of this assembly being approximately 4.7 mm. Clip-on lens assembly 2910a illustrates an example with a posterior lens 2912 having a diopter compensation of +6D, resulting in an assembly width W of approximately 7.7 mm. Finally, clip-on lens assembly 2910c illustrates an example with a posterior lens 2912 having a diopter compensation of −6D, resulting in an assembly width of approximately 10.5 mm.
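

A rough way to see why the assembly width grows with stronger corrections (the plano-surface thin-lens approximation, refractive index, and aperture below are illustrative assumptions, not dimensions from this disclosure) is to estimate the sag, or bulge depth, that a curved lens surface of a given power requires:

import math

def surface_sag_mm(power_d: float, semi_aperture_mm: float = 20.0,
                   refractive_index: float = 1.5) -> float:
    """Approximate sag of a single curved lens surface.

    Uses a plano-convex/plano-concave thin-lens approximation: surface
    radius R = (n - 1) / |D|, sag s = R - sqrt(R^2 - h^2). Real
    prescription lenses are typically meniscus shaped, so this is only
    an order-of-magnitude sketch.
    """
    if power_d == 0:
        return 0.0
    radius_mm = (refractive_index - 1.0) / abs(power_d) * 1000.0
    return radius_mm - math.sqrt(radius_mm ** 2 - semi_aperture_mm ** 2)

print(f"{surface_sag_mm(6.0):.1f} mm")  # about 2.4 mm for a 6D surface
print(f"{surface_sag_mm(2.0):.1f} mm")  # about 0.8 mm for a 2D surface

Under these assumptions, a 6-diopter correction adds a couple of millimeters of depth per curved surface, which is of the same order as the width differences between clip-on lens assemblies 2910a, 2910b, and 2910c described above.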


Turning to FIGS. 31A-31E, these figures illustrate additional views and/or portions of the head-mounted display 2122 of FIG. 21A, to show a detachable lens assembly 3110 (e.g., clip-on lens assembly, snap-on lens assembly, and/or the like). The clip-on lens assembly 3110 of head-mounted display 2122 is similar to the clip-on lens assembly 2910 of head-mounted display 2922 discussed above with reference to FIGS. 29A-29E, and the same or similar reference numbers are used to refer to the same or similar features.


One difference in the clip-on lens assembly 3110 of head-mounted display 2122 is that the posterior lenses 2912 and the frame 2911 (e.g., posterior lens frame) that the lenses fit within have a different outer profile shape. Specifically, the outer profile comprises a stepped shape in order to provide clearance for portions 3150 of the housing 2104 of optical engine 55, which are shaped somewhat differently than the corresponding portions of head-mounted display 2922. Another difference is that the configurations of protrusions 2918 and recesses 2919 are different. Specifically, the protrusions 2918 are part of housing 2104, and the recesses 2919 are part of the frame 2911, which is the opposite of the configuration discussed above with reference to FIGS. 29A-29E. The clip-on lens assembly 3110 still includes clips 2914 similar to those of the clip-on lens assembly 2910. Further, the overall cross-sectional construction of the displays 49a, 49b of head-mounted display 2122 can be functionally similar to the structure discussed above with reference to FIGS. 29C and 29D.


Finally, turning to FIG. 32, this figure illustrates a cross-section similar to the cross-sections of FIGS. 29C and 29D, but depicts an alternative screw-on design instead of a clip-on or snap-on design. Specifically, the structure shown in FIG. 32 includes a display assembly frame 41 that has a reflective waveguide lens 2904 coupled thereto, an anterior lens 2902 adhered to the front of the waveguide lens 2904 using seal 2920, and a posterior lens 2912 attached to the back of the waveguide lens 2904, with seal 2922 positioned between the posterior lens 2912 and waveguide lens 2904. Instead of frame 2911 being clipped or snapped on to frame 41, however, frame 2911 is screwed onto frame 41 with one or more screws 3202.


Another difference in the embodiment of FIG. 32 is that the front side of the posterior lens 2912 is shown as being flat or substantially flat. Such a design may help to, for example, save space in the display assembly. On the other hand, having a flat front surface for such an optical corrective lens may be nonstandard in the prescription lens industry, and thus may be more difficult and/or expensive to manufacture. Accordingly, in some embodiments, the front side of the posterior lens 2912 may also be curved, similar to the posterior lenses 2912 of FIG. 29C.


While certain examples of usage of the disclosed embodiments are given with respect to body portions containing spine vertebrae, the principles disclosed may also be used with respect to other bones and/or body portions than spine, including hip bones, pelvic bones, leg bones, arm bones, ankle bones, foot bones, shoulder bones, cranial bones, oral and maxillofacial bones, sacroiliac joints, etc.


The disclosed embodiments are presented with relation to image-guided surgery systems or methods, in general; accordingly, the disclosed systems and devices should not be considered limited to surgery or medical applications, but are applicable to non-medical applications as well. For example, the disclosed embodiments are applicable to consumer or commercial applications such as athletics and fitness, gaming, driving, product design, navigation, manufacturing, logistics, shopping and commerce, educational training, remote collaboration, etc.


The terms “top,” “bottom,” “first,” “second,” “upper,” “lower,” “height,” “width,” “length,” “end,” “side,” “horizontal,” “vertical,” and similar terms may be used herein; it should be understood that these terms have reference only to the structures shown in the figures and are utilized only to facilitate describing embodiments of the disclosure. Various embodiments of the disclosure have been presented in a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the disclosure. The ranges disclosed herein encompass any and all overlap, sub-ranges, and combinations thereof, as well as individual numerical values within that range. For example, description of a range such as from about 25 to about 45 degrees should be considered to have specifically disclosed subranges such as from 25 to 35 degrees, from 30 to 40 degrees, from 35 to 45 degrees etc., as well as individual numbers within that range (for example, 25, 30, 35, 40, 45, 32, 30.5 and any whole and partial increments therebetween). Language such as “up to,” “at least,” “greater than,” “less than,” “between,” and the like includes the number recited. Numbers preceded by a term such as “about” or “approximately” include the recited numbers. For example, “approximately 260 g” includes “260 g.” The terms “approximately”, “about”, and “substantially” as used herein represent an amount close to the stated amount that still performs a desired function or achieves a desired result.


Some embodiments comprise various features that are presented as single features (as opposed to multiple features). For example, in one embodiment, a system includes a single HMD, a single camera, a single processor, a single display, a single flashlight, a single PTA, a single PTA detent mechanism, a single head strap adjustment knob, etc. Multiple features or components are provided in alternate embodiments.


In some embodiments, the systems disclosed herein comprise one or more of the following: means for tilting (e.g., a hinge, a virtual hinge, an arc-shaped slot, detents, a strap configured to bend), means for adjusting (e.g., a knob, a rack and pinion), means for imaging (e.g., a camera or fluoroscope or MRI machine or CT machine), means for calibration (e.g., calibration jigs), means for registration (e.g., adapters, markers, objects, cameras), means for biasing (e.g., springs), means for fastening (e.g., anchors, adhesives, clamps, pins), means for segmentation (e.g., one or more neural networks), etc.


The processors described herein may include one or more central processing units (CPUs), processors, or microprocessors. The processors may be communicatively coupled to one or more memory units, such as random-access memory (RAM) for temporary storage of information, one or more read-only memory (ROM) units for permanent storage of information, and one or more mass storage devices, such as a hard drive, diskette, solid state drive, or optical media storage device. The processors (or memory units communicatively coupled thereto) may include modules comprising program instructions or algorithm steps configured for execution by the processors to perform any or all of the processes or algorithms discussed herein. The processors may be communicatively coupled to external devices (e.g., display devices, data storage devices, databases, servers, etc.) over a network via a network communications interface.
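
As a minimal sketch only, assuming a hypothetical module structure (none of the class or method names below come from this disclosure), such a processor-executed module might store anatomical images, accept signals indicative of the display pose relative to the scene, and produce a frame descriptor for rendering.

    # Minimal sketch, assuming a hypothetical module structure; all names are
    # illustrative, not an actual API of the disclosed system.
    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class PoseSignal:
        """Assumed 6-DOF pose of the display relative to the scene."""
        x: float
        y: float
        z: float
        yaw: float
        pitch: float
        roll: float

    @dataclass
    class RenderModule:
        anatomical_images: List[bytes] = field(default_factory=list)

        def receive_image(self, image: bytes) -> None:
            # Store an anatomical image (e.g., a CT or MRI slice payload).
            self.anatomical_images.append(image)

        def render(self, pose: PoseSignal) -> Tuple[int, Tuple[float, ...]]:
            # Combine the stored images with the current pose into a simple
            # frame descriptor that a display driver could consume.
            return (len(self.anatomical_images),
                    (pose.x, pose.y, pose.z, pose.yaw, pose.pitch, pose.roll))

    if __name__ == "__main__":
        module = RenderModule()
        module.receive_image(b"\x00" * 16)  # placeholder image payload
        print(module.render(PoseSignal(0.0, 0.0, 0.4, 0.0, 0.0, 0.0)))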


In general, the algorithms or processes described herein can be implemented by logic embodied in hardware or firmware, or by a collection of software instructions, possibly having entry and exit points, written in a programming language such as, for example, Python, Java, Lua, C, C#, or C++. A software module or product may be compiled and linked into an executable program, installed in a dynamic-link library, or written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules configured for execution on computing devices may be provided on a computer-readable medium, such as a compact disc, digital video disc, flash drive, or any other tangible medium. Such software code may be stored, partially or fully, on a memory device of the executing computing device, such as the processing system 31, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules described herein are preferably implemented as software modules but may also be represented in hardware or firmware. Generally, any modules, programs, or flowcharts described herein may refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.
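
The following is a minimal sketch of the notion of modules invoked in response to detected events, assuming a simple callback registry; the class, event, and payload names are illustrative only and are not part of this disclosure.

    # Minimal sketch of modules "invoked in response to detected events",
    # assuming a simple callback registry; names are illustrative only.
    from collections import defaultdict
    from typing import Callable, DefaultDict, Dict, List

    class EventBus:
        """Dispatches detected events to modules registered for them."""

        def __init__(self) -> None:
            self._handlers: DefaultDict[str, List[Callable[[Dict], None]]] = defaultdict(list)

        def register(self, event: str, handler: Callable[[Dict], None]) -> None:
            # A software module registers a callable to be invoked on the event.
            self._handlers[event].append(handler)

        def emit(self, event: str, payload: Dict) -> None:
            # Invoke every registered module for this event, in order.
            for handler in self._handlers[event]:
                handler(payload)

    if __name__ == "__main__":
        bus = EventBus()
        bus.register("pose_updated", lambda p: print("re-render with pose:", p))
        bus.emit("pose_updated", {"x": 0.0, "y": 0.0, "z": 0.4})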


The various features and processes described above may be used independently of one another or may be combined in various ways. All possible combinations and sub-combinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks or steps may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks, steps, or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks, steps, or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks, steps, or states may be performed in serial, in parallel, or in some other manner. Blocks, steps, or states may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.


Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process.


It will be appreciated that the systems and methods of the disclosure each have several innovative aspects, no single one of which is solely responsible or required for the desirable attributes disclosed herein. The various features and processes described above may be used independently of one another or may be combined in various ways. The section headings used herein are merely provided to enhance readability and are not intended to limit the scope of the embodiments disclosed in a particular section to the features or elements disclosed in that section.


Certain features that are described in this specification in the context of separate embodiments also may be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment also may be implemented in multiple embodiments separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination may in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination. No single feature or group of features is necessary or indispensable to each and every embodiment.


Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.


The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. In addition, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. In addition, the articles “a,” “an,” and “the” as used in this application and the appended claims are to be construed to mean “one or more” or “at least one” unless specified otherwise.

Claims
  • 1. A head-mounted display device comprising: a frame extending from a first end to a second end, the first end configured to be positioned adjacent a first temple of a wearer, and the second end configured to be positioned adjacent a second temple of the wearer, the frame further comprising a nose pad configured to engage a nose of the wearer; an adjustable strap assembly comprising: a first side strap having a first end coupled to the first end of the frame; a second side strap having a first end coupled to the second end of the frame; and an adjustment mechanism configured to adjust a position of a second end of the first side strap with respect to a second end of the second side strap, in order to adjust a circumferential size defined by the first side strap, the second side strap, and the frame; a forehead support comprising: a first end pivotably coupled to the first side strap; a second end pivotably coupled to the second side strap; and a central support coupled to the frame; a see-through display assembly pivotably coupled to the frame such that a pantoscopic tilt angle can be adjusted; a first temple housing pivotably coupled to the first end of the frame and slidably coupled to the first side strap of the adjustable strap assembly; a second temple housing pivotably coupled to the second end of the frame and slidably coupled to the second side strap of the adjustable strap assembly; and a lighting system detachably coupled to the frame.
  • 2. The head-mounted display device of claim 1, wherein the see-through display assembly comprises a detachable lens assembly that can be detached and replaced with a second detachable lens assembly for changing a prescription of lenses of the see-through display assembly.
  • 3. The head-mounted display device of claim 1, wherein the see-through display assembly comprises a display that is at least partially transparent and configured to display to the wearer an augmented reality (AR) image comprising a virtual reality (VR) image presented over a scene on a body of a patient.
  • 4. The head-mounted display device of claim 3, wherein at least one of the first temple housing or the second temple housing comprises at least one processor configured to receive one or more anatomical images of the patient and signals indicative of at least a position of the display relative to the scene, and to render the AR image to the display.
  • 5. The head-mounted display device of claim 1, wherein the first temple housing is pivotably coupled to the first end of the frame by a first tilting assembly, and the second temple housing is pivotably coupled to the second end of the frame by a second tilting assembly.
  • 6. The head-mounted display device of claim 1, wherein the lighting system comprises a flashlight.
  • 7. The head-mounted display device of claim 1, wherein the see-through display assembly is pivotably coupled to the frame with a pantoscopic tilting assembly that comprises: an arc-shaped slot that pivotably couples a portion of the see-through display assembly to a portion of the frame; and a detent mechanism comprising a spring-loaded pin or ball and a plurality of detents, the detent mechanism configured to selectively retain the see-through display assembly in any of a plurality of predefined positions with respect to the frame.
  • 8. The head-mounted display device of claim 7, wherein the detent mechanism further comprises a guide member slidably engaged with the arc-shaped slot, the guide member configured to apply a force to the spring-loaded pin or ball to move the spring-loaded pin or ball from one of the plurality of detents to another of the plurality of detents.
  • 9. The head-mounted display device of claim 1, wherein the first side strap comprises a connector that pivotably couples a front portion of the first side strap to a rear portion of the first side strap, and that pivotably couples the first end of the forehead support to the first side strap, and wherein the second side strap comprises a connector that pivotably couples a front portion of the second side strap to a rear portion of the second side strap, and that pivotably couples the second end of the forehead support to the second side strap.
  • 10. The head-mounted display device of claim 1, further comprising: a first follower that is coupled to the first side strap and that is slidably coupled to the first temple housing; and a second follower that is coupled to the second side strap and that is slidably coupled to the second temple housing.
  • 11. A head-mounted display device comprising: a frame extending from a first end to a second end; a head mounting assembly configured to be adjustable and to retain the frame in a position on a head of the user; a display that is at least partially transparent and configured to display to the user an augmented reality (AR) image comprising a virtual reality (VR) image presented over a scene on a body of a patient; a see-through display assembly that comprises the display, the see-through display assembly being pivotably coupled to the frame by a pantoscopic tilting assembly such that a pantoscopic tilt angle can be adjusted, the pantoscopic tilting assembly comprising: an arc-shaped slot that pivotably couples a portion of the see-through display assembly to a portion of the frame; and a detent mechanism comprising a spring-loaded pin or ball and a plurality of detents, the detent mechanism configured to selectively retain the see-through display assembly in any of a plurality of predefined positions with respect to the frame; a first temple housing pivotably coupled to the first end of the frame by a first tilting assembly; and a second temple housing pivotably coupled to the second end of the frame by a second tilting assembly, wherein at least one of the first temple housing or the second temple housing comprises at least one processor configured to receive one or more anatomical images of the patient and signals indicative of at least a position of the display relative to the scene, and to render the AR image to the display.
  • 12. The head-mounted display device of claim 11, wherein the see-through display assembly comprises a detachable lens assembly that can be detached and replaced with a second detachable lens assembly for changing a prescription of lenses of the see-through display assembly.
  • 13. The head-mounted display device of claim 11, further comprising a lighting system detachably coupled to the frame.
  • 14. The head-mounted display device of claim 11, wherein the first end of the frame is configured to be positioned adjacent a first temple of the user, and the second end of the frame configured to be positioned adjacent a second temple of the user.
  • 15. The head-mounted display device of claim 11, wherein the frame further comprises a nose pad configured to engage a nose of the user.
  • 16. The head-mounted display device of claim 11, wherein the head mounting assembly comprises an adjustable head strap configured to engage at least a back of a head of the user and a forehead of the user.
  • 17. The head-mounted display device of claim 16, wherein the first temple housing is slidably coupled to the adjustable head strap and the second temple housing is slidably coupled to the adjustable head strap.
  • 18. The head-mounted display device of claim 11, wherein the detent mechanism further comprises a guide member slidably engaged with the arc-shaped slot, the guide member configured to apply a force to the spring-loaded pin or ball to move the spring-loaded pin or ball from one of the plurality of detents to another of the plurality of detents.
  • 19. A head-mounted display device comprising: a frame extending from a first end to a second end; a head mounting assembly configured to be adjustable and to retain the frame in a position on a head of the user, wherein the head mounting assembly comprises an adjustable head strap configured to engage at least a back of a head of the user and a forehead of the user; a display that is at least partially transparent and configured to display to the user an augmented reality (AR) image comprising a virtual reality (VR) image presented over a scene on a body of a patient; a see-through display assembly that comprises the display, the see-through display assembly being pivotably coupled to the frame such that a pantoscopic tilt angle can be adjusted; a first temple housing pivotably coupled to the first end of the frame by a first tilting assembly; and a second temple housing pivotably coupled to the second end of the frame by a second tilting assembly, wherein the first temple housing is slidably coupled to the adjustable head strap and the second temple housing is slidably coupled to the adjustable head strap, and wherein at least one of the first temple housing or the second temple housing comprises at least one processor configured to receive one or more anatomical images of the patient and signals indicative of at least a position of the display relative to the scene, and to render the AR image to the display.
  • 20. The head-mounted display device of claim 19, wherein the see-through display assembly comprises a detachable lens assembly that can be detached and replaced with a second detachable lens assembly for changing a prescription of lenses of the see-through display assembly.
  • 21. The head-mounted display device of claim 19, further comprising a lighting system detachably coupled to the frame.
  • 22. The head-mounted display device of claim 19, wherein the first end of the frame is configured to be positioned adjacent a first temple of the user, and the second end of the frame configured to be positioned adjacent a second temple of the user.
  • 23. The head-mounted display device of claim 19, wherein the frame further comprises a nose pad configured to engage a nose of the user.
  • 24. The head-mounted display device of claim 19, wherein the see-through display assembly is pivotably coupled to the frame by a pantoscopic tilting assembly that comprises a slot positioned to enable a pivoting element to slide tangential to the slot, the pivoting element configured to be locked at one or more predefined positions with respect to the slot.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of International PCT Application PCT/IB2023/059049, filed Sep. 12, 2023, titled “AUGMENTED REALITY EYEWEAR FOR IMAGE-GUIDED MEDICAL INTERVENTION,” which claims priority to U.S. Provisional Application No. 63/405,901, filed Sep. 13, 2022, titled “AUGMENTED REALITY EYEWEAR FOR IMAGE-GUIDED MEDICAL INTERVENTION.” The disclosure of each of the foregoing applications is incorporated herein by reference in its entirety for all purposes.

20230397957 Crawford et al. Dec 2023 A1
20230410445 Elimelech et al. Dec 2023 A1
20240008935 Wolf et al. Jan 2024 A1
20240016549 Johnson et al. Jan 2024 A1
20240016572 Elimelech et al. Jan 2024 A1
20240020831 Johnson et al. Jan 2024 A1
20240020840 Johnson et al. Jan 2024 A1
20240020862 Johnson et al. Jan 2024 A1
20240022704 Benishti et al. Jan 2024 A1
20240023946 Wolf et al. Jan 2024 A1
20240041558 Siewerdsen et al. Feb 2024 A1
Foreign Referenced Citations (133)
Number Date Country
3022448 Feb 2018 CA
3034314 Feb 2018 CA
101379412 Mar 2009 CN
103106348 May 2013 CN
111915696 Nov 2020 CN
112489047 Mar 2021 CN
202004011567 Nov 2004 DE
102004011567 Sep 2005 DE
102014008153 Oct 2014 DE
202022103168 Jun 2022 DE
0933096 Aug 1999 EP
1640750 Mar 2006 EP
1757974 Feb 2007 EP
2119397 Nov 2009 EP
2134847 Dec 2009 EP
2557998 Feb 2013 EP
2823463 Jan 2015 EP
2868277 May 2015 EP
2891966 Jul 2015 EP
2963616 Jan 2016 EP
3028258 Jun 2016 EP
3034607 Jun 2016 EP
3037038 Jun 2016 EP
3069318 Sep 2016 EP
3076660 Oct 2016 EP
3121789 Jan 2017 EP
3123970 Feb 2017 EP
2654749 May 2017 EP
3175815 Jun 2017 EP
3216416 Sep 2017 EP
2032039 Oct 2017 EP
3224376 Oct 2017 EP
3247297 Nov 2017 EP
3256213 Dec 2017 EP
3306567 Apr 2018 EP
2030193 Jul 2018 EP
2225723 Feb 2019 EP
2892558 Apr 2019 EP
2635299 Jul 2019 EP
3505050 Jul 2019 EP
2875149 Dec 2019 EP
3593227 Jan 2020 EP
3634294 Apr 2020 EP
3206583 Sep 2020 EP
3711700 Sep 2020 EP
2625845 Mar 2021 EP
3789965 Mar 2021 EP
3858280 Aug 2021 EP
3913423 Nov 2021 EP
3952331 Feb 2022 EP
3960235 Mar 2022 EP
4173590 May 2023 EP
4252695 Oct 2023 EP
4270313 Nov 2023 EP
4287120 Dec 2023 EP
2507314 Apr 2014 GB
10-2014-0120155 Oct 2014 KR
0334705 Apr 2003 WO
2006002559 Jan 2006 WO
2007051304 May 2007 WO
2007115826 Oct 2007 WO
2008103383 Aug 2008 WO
2010067267 Jun 2010 WO
2010074747 Jul 2010 WO
2012061537 May 2012 WO
2012101286 Aug 2012 WO
2013112554 Aug 2013 WO
2014014498 Jan 2014 WO
2014024188 Feb 2014 WO
2014037953 Mar 2014 WO
2014113455 Jul 2014 WO
2014125789 Aug 2014 WO
2014167563 Oct 2014 WO
2014174067 Oct 2014 WO
2015058816 Apr 2015 WO
2015061752 Apr 2015 WO
2015109145 Jul 2015 WO
2016151506 Sep 2016 WO
2018052966 Mar 2018 WO
2018073452 Apr 2018 WO
2018200767 Nov 2018 WO
2018206086 Nov 2018 WO
2019083431 May 2019 WO
2019135209 Jul 2019 WO
2019161477 Aug 2019 WO
2019195926 Oct 2019 WO
2019210353 Nov 2019 WO
2019211741 Nov 2019 WO
2020109903 Jun 2020 WO
2020109904 Jun 2020 WO
2021017019 Feb 2021 WO
2021019369 Feb 2021 WO
2021021979 Feb 2021 WO
2021023574 Feb 2021 WO
2021046455 Mar 2021 WO
2021048158 Mar 2021 WO
2021061459 Apr 2021 WO
2021062375 Apr 2021 WO
2021073743 Apr 2021 WO
2021087439 May 2021 WO
2021091980 May 2021 WO
2021112918 Jun 2021 WO
2021130564 Jul 2021 WO
2021137752 Jul 2021 WO
2021141887 Jul 2021 WO
2021145584 Jul 2021 WO
2021154076 Aug 2021 WO
2021183318 Sep 2021 WO
2021188757 Sep 2021 WO
2021255627 Dec 2021 WO
2021257897 Dec 2021 WO
2021258078 Dec 2021 WO
2022009233 Jan 2022 WO
2022053923 Mar 2022 WO
2022079565 Apr 2022 WO
2023281395 Jan 2023 WO
2023007418 Feb 2023 WO
2023011924 Feb 2023 WO
2023021448 Feb 2023 WO
2023021450 Feb 2023 WO
2023021451 Feb 2023 WO
2023026229 Mar 2023 WO
2023047355 Mar 2023 WO
2023072887 May 2023 WO
2023088986 May 2023 WO
2023163933 Aug 2023 WO
2023186996 Oct 2023 WO
2023205212 Oct 2023 WO
2023209014 Nov 2023 WO
2023232492 Dec 2023 WO
2023240912 Dec 2023 WO
2024013642 Jan 2024 WO
2024018368 Jan 2024 WO
Non-Patent Literature Citations (47)
International Search Report and Written Opinion received for PCT Patent Application No. PCT/IB2023/059049, mailed on Feb. 8, 2024, 24 pages.
16 Augmented Reality Glasses of 2021 (with Features), Circuit Stream blog, May 6, 2022, accessed at https://web.archive.org/web/20221127195438/https://circuitstream.com/blog/16-augmented-reality-glasses-of-2021-with-features-breakdowns/.
Vuzix Blades, Prescription Lens Installation Guide, copyright 2020.
Frames Direct, InSpatialRx Prescription Insert, Prescription Insert for Magic Leap 1, accessed Mar. 8, 2024 at https://www.framesdirect.com/inspatialrx-prescription-insert.html.
Everysight, Installing your RX Adaptor, accessed Mar. 13, 2024 at https://support.everysight.com/hc/en-us/articles/115000984571-Installing-your-RX-Adaptor.
Reddit, Notice on Prescription Lenses for Nreal Glasses, accessed Mar. 13, 2024 at https://www.reddit.com/r/nreal/comments/x1fte5/notice_on_prescription_lenses_for_nreal_glasses/.
Everysight, Raptor User Manual, copyright 2017, in 46 pages.
U.S. Appl. No. 15/896,102, U.S. Pat. No. 10,134,166, filed Feb. 14, 2018, issued Nov. 20, 2018, Combining Video-Based and Optic-Based Augmented Reality in a Near Eye Display.
U.S. Appl. No. 16/159,740, U.S. Pat. No. 10,382,748, filed Oct. 15, 2018, issued Aug. 13, 2019, Combining Video-Based and Optic-Based Augmented Reality in a Near Eye Display.
U.S. Appl. No. 16/419,023, U.S. Pat. No. 11,750,794, filed May 22, 2019, issued Sep. 5, 2023, Combining Video-Based and Optic-Based Augmented Reality in a Near Eye Display.
U.S. Appl. No. 18/352,158, filed Jul. 13, 2023, Combining Video-Based and Optic-Based Augmented Reality in a Near Eye Display.
U.S. Appl. No. 18/365,643, filed Aug. 4, 2023, Head-Mounted Augmented Reality Near Eye Display Device.
U.S. Appl. No. 18/365,650, filed Aug. 4, 2023, Systems for Facilitating Augmented Reality-Assisted Medical Procedures.
U.S. Appl. No. 15/127,423, U.S. Pat. No. 9,928,629, filed Sep. 20, 2016, issued Mar. 27, 2018, Combining Video-Based and Optic-Based Augmented Reality in a Near Eye Display.
U.S. Appl. No. 16/120,480, U.S. Pat. No. 10,835,296, filed Sep. 4, 2018, issued Nov. 17, 2020, Spinous Process Clamp.
U.S. Appl. No. 17/067,831, filed Oct. 12, 2020, Spinous Process Clamp.
U.S. Appl. No. 18/030,072, filed Apr. 4, 2023, Spinous Process Clamp.
U.S. Appl. No. 18/365,590, filed Aug. 4, 2023, Registration of a Fiducial Marker for an Augmented Reality System.
U.S. Appl. No. 18/365,571, filed Aug. 4, 2023, Registration Marker for an Augmented Reality System.
U.S. Appl. No. 17/045,766, filed Oct. 7, 2020, Registration of a Fiducial Marker for an Augmented Reality System.
U.S. Appl. No. 16/199,281, U.S. Pat. No. 10,939,977, filed Nov. 26, 2018, issued Mar. 9, 2021, Positioning Marker.
U.S. Appl. No. 16/524,258, filed Jul. 29, 2019, Fiducial Marker.
U.S. Appl. No. 17/585,629, filed Jan. 27, 2022, Fiducial Marker.
U.S. Appl. No. 16/724,297, U.S. Pat. No. 11,382,712, filed Dec. 22, 2019, issued Jul. 12, 2022, Mirroring in Image Guided Surgery.
U.S. Appl. No. 17/827,710, U.S. Pat. No. 11,801,115, filed May 29, 2022, issued Oct. 31, 2023, Mirroring in Image Guided Surgery.
U.S. Appl. No. 18/352,181, filed Jul. 13, 2023, Mirroring in Image Guided Surgery.
U.S. Appl. No. 18/400,739, filed Dec. 29, 2023, Mirroring in Image Guided Surgery.
U.S. Appl. No. 16/200,144, U.S. Pat. No. 11,766,296, filed Nov. 26, 2018, issued Sep. 26, 2023, Tracking System for Image-Guided Surgery.
U.S. Appl. No. 18/470,809, filed Sep. 20, 2023, Tracking Methods for Image-Guided Surgery.
U.S. Appl. No. 17/015,199, filed Sep. 9, 2020, Universal Tool Adapter.
U.S. Appl. No. 18/598,965, filed Mar. 7, 2024, Universal Tool Adapter for Image Guided Surgery.
U.S. Appl. No. 18/044,380, filed Mar. 8, 2023, Universal Tool Adapter for Image-Guided Surgery.
U.S. Appl. No. 16/901,026, U.S. Pat. No. 11,389,252, filed Jun. 15, 2020, issued Jul. 19, 2022, Rotating Marker for Image Guided Surgery.
U.S. Appl. No. 18/008,980, filed Dec. 8, 2022, Rotating Marker.
U.S. Appl. No. 17/368,859, U.S. Pat. No. 11,896,445, filed Jul. 7, 2021, issued Feb. 13, 2024, Iliac Pin and Adapter.
U.S. Appl. No. 18/437,898, filed Feb. 9, 2024, Iliac Pin and Adapter.
U.S. Appl. No. 18/576,516, filed Jan. 4, 2024, Iliac Pin and Adapter.
U.S. Appl. No. 17/388,064, filed Jul. 29, 2021, Rotating Marker and Adapter for Image-Guided Surgery.
U.S. Appl. No. 18/291,731, filed Jan. 24, 2024, Rotating Marker and Adapter for Image-Guided Surgery.
U.S. Appl. No. 18/365,844, filed Aug. 4, 2023, Augmented-Reality Surgical System Using Depth Sensing.
U.S. Appl. No. 18/683,676, filed Feb. 14, 2024, Stereoscopic Display and Digital Loupe for Augmented-Reality Near-Eye Display.
U.S. Appl. No. 18/683,680, filed Feb. 14, 2024, Augmented Reality Assistance for Osteotomy and Discectomy.
U.S. Appl. No. 18/684,756, filed Feb. 19, 2024, Registration and Registration Validation in Image-Guided Surgery.
U.S. Appl. No. 18/365,566, filed Aug. 4, 2023, Systems for Medical Image Visualization.
U.S. Appl. No. 18/399,253, filed Dec. 28, 2023, Methods for Medical Image Visualization.
U.S. Appl. No. 18/399,433, filed Dec. 28, 2023, Configurable Augmented Reality Eyewear for Image-Guided Medical Intervention.
U.S. Appl. No. 35/508,942, U.S. Pat. No. D930,162, filed Feb. 13, 2020, issued Sep. 7, 2021, Medical Headset.
Related Publications (1)
Number Date Country
20240134206 A1 Apr 2024 US
Provisional Applications (1)
Number Date Country
63405901 Sep 2022 US
Continuations (1)
Number Date Country
Parent PCT/IB2023/059049 Sep 2023 WO
Child 18398837 US