This disclosure relates generally to augmented reality systems, devices, and methods, including a head-mounted display (HMD) device of an augmented-reality (AR)-based image-guided system to facilitate surgery or other medical intervention (e.g., therapeutic and/or diagnostic procedures), as well as other uses.
Near-eye displays may be used in various types of AR applications. For example, Applicant has previously demonstrated that head-mounted devices having AR capabilities may be used for performing image-guided surgery (see, for example, Applicant's U.S. Pat. Nos. 11,382,712, 11,389,252, 10,939,977, and 9,928,629, Applicant's US Publication No. 2020/0163723, and Applicant's PCT Publication Nos. WO 2023/021448 and WO 2023/021450, which describe methods and systems for applying augmented reality and near-eye display techniques to an image-guided surgical system). The disclosures of these patents and published applications are incorporated herein by reference.
Several embodiments of the disclosure that are described hereinbelow provide improved methods and systems for applying augmented reality (AR) techniques, peripheral devices and operational methods thereof to a head-mounted display (HMD) of, for example, an image-guided medical (e.g., surgical) system. In the context of the present disclosure and in the claims, the term head-mounted display or HMD shall be given its ordinary meaning and shall also refer to any suitable display apparatus configured to display information (e.g., images) over a scene (e.g., portion of a body of a patient for therapeutic or diagnostic intervention or assessment), such as an organ, skin, bones or joints of a patient, using AR techniques and/or other suitable displaying techniques. For example, the term HMD may refer to a helmet, AR glasses, goggles, spectacles, monocle, eyewear, headset, visor, head-up display, and any other suitable type of displaying device mounted on or worn by any portion of a user's or wearer's head, including but not limited to the face, crown, forehead, nose and ears. In some embodiments, the head-mounted displays are not used, or are used together with, stand-alone displays, such as monitors, portable devices, tablets, etc. The display may be a hands-free display such that the operator does not need to hold the display.
In accordance with several embodiments, head-mounted display devices described herein provide reduced stress or fatigue on a wearer, and/or provide additional comfort features. The head-mounted display devices may provide improved ergonomics, comfort, and/or the ability to enable a wearer, such as a surgeon, to wear the device for relatively long periods of time (such as for two, four, or six hours or more in one embodiment), without unnecessary fatigue and/or other negative consequences. For example, the head-mounted display device can be designed and adapted to distribute weight around the wearer's head, including to the wearer's forehead and the back of the wearer's head, to reduce at least some of the weight applied to the wearer's nose or other undesired location. Such a configuration can also reduce pressure on the wearer's temples, which can be another relatively weight-sensitive area, in addition to the nose. Stated another way, such a head-mounted display can more widely distribute pressure over larger and/or less sensitive areas, such as the forehead and the back of the head. Although medical applications are well-suited for several embodiments, non-medical applications also benefit from many embodiments described herein. For example, non-medical applications may involve consumer or commercial applications such as athletics and fitness, gaming, driving, product design, navigation, manufacturing, logistics, shopping and commerce, educational training, remote collaboration, etc.
The head-mounted display device may be substituted with an alternative hands-free device that is not worn by the operator, such as a portable device, monitor, or tablet. The display may be a head-up display or heads-up display.
In accordance with several implementations, a head-mounted display device includes: a frame extending from a first end to a second end, the first end configured to be positioned adjacent a first temple of a wearer (e.g., a surgeon or other user), and the second end configured to be positioned adjacent a second temple of the wearer. The device further includes an adjustable strap assembly including: a first side strap having a first end coupled to the first end of the frame; a second side strap having a first end coupled to the second end of the frame; and an adjustment mechanism (e.g., a knob, a rack and pinion, a mechanism that uses friction, a sliding mechanism, a ratchet mechanism, a snap or lock mechanism, etc.) configured to adjust a position of a second end of the first side strap with respect to a second end of the second side strap, in order to adjust a circumferential size defined by the first side strap, the second side strap, and the frame. The device also includes a see-through display assembly (including, e.g., a near-eye display, an augmented-reality display, a stereoscopic display, glasses, a visor, a head-up display, etc.) coupled (e.g., rotatably, pivotably, movably, or slidably coupled) to the frame such that a tilt angle (e.g., pantoscopic tilt angle) can be adjusted. The device further includes a first temple housing coupled (e.g., rotatably or pivotably coupled) to the first end of the frame and slidably coupled to the first side strap of the adjustable strap assembly and a second temple housing coupled (e.g., rotatably or pivotably coupled) to the second end of the frame and slidably coupled to the second side strap of the adjustable strap assembly.
In some implementations, the see-through display assembly is coupled (e.g., rotatably, pivotably, movably, or slidably coupled) to the frame with a pantoscopic tilting assembly that includes an arc-shaped slot that rotatably, pivotably, movably, or slidably couples a portion of the see-through display assembly to a portion of the frame.
In some implementations, the pantoscopic tilting assembly further includes a detent mechanism including a spring-loaded pin or ball and a plurality of detents (e.g., two, three, four, five, or more than five detents), the detent mechanism configured to selectively retain the see-through display assembly in any of a plurality of predefined positions with respect to the frame.
In some implementations, the detent mechanism further includes a guide member slidably engaged with the arc-shaped slot, the guide member configured to apply a force to the spring-loaded pin or ball to move the spring-loaded pin or ball from one of the plurality of detents to another of the plurality of detents.
In some implementations, the see-through display assembly includes a detachable lens assembly (e.g., clip-on lens assembly, snap-on lens assembly, a friction-fit lens assembly, a magnetic attachment assembly, etc.) that can be detached and replaced with a second detachable lens assembly for changing a prescription of lenses of the see-through display assembly (e.g., the detachable lens assemblies may be customized for a particular wearer such that they are swappable and the same device can be easily interchangeably used by multiple different wearers). In some implementations, the detachable lens assembly may be provided by the manufacturer and/or provider of the head-mounted display device with the head-mounted display device such that they are not required to be obtained separately by a user from an optometrist or third-party provider. In some implementations, the detachable lens assembly can be coupled and detached to the head-mounted display device without requiring the use of any tools (e.g., “tools-free”). In some implementations, the detachable lens assembly comprises or functions as an adapter. In some implementations, the detachable lens assembly is non-magnetically coupled to the head-mounted display device.
In some implementations, the see-through display assembly includes a display assembly frame; a waveguide lens coupled to the display assembly frame; an anterior lens affixed to the waveguide lens or to the display assembly frame in front of the waveguide lens; a posterior lens frame detachably coupled to the display assembly frame using at least one of: a snap fit, a friction fit, or a clip; and a posterior lens affixed to the posterior lens frame.
In some implementations, the head-mounted display device further includes a flashlight assembly (e.g., headlamp assembly, headlight assembly, etc.) that may be detachably coupled to the frame. In some implementations, the flashlight may be permanently coupled to the frame.
In some implementations, the head-mounted display device further includes: a first follower (e.g., protrusion, slidable member, etc.) that slidably couples the first temple housing to the first side strap; and a second follower (e.g., protrusion, slidable member, etc.) that slidably couples the second temple housing to the second side strap.
In some implementations, the frame further includes a nose pad (e.g., nose support member, etc.) configured to engage a nose of the wearer. The frame may optionally not include a nose pad or may be configured not to engage a nose of the wearer.
In some implementations, the head-mounted display device further includes: a forehead support including: a first end coupled (e.g., rotatably or pivotably coupled) to the first side strap; a second end coupled (e.g., rotatably or pivotably coupled) to the second side strap; and a central support coupled to the frame.
In some implementations, the first side strap includes a connector that couples (e.g., rotatably or pivotably couples) a front portion of the first side strap to a rear portion of the first side strap, and that couples (e.g., rotatably or pivotably couples) the first end of the forehead support to the first side strap. The second side strap includes a connector that couples (e.g., rotatably or pivotably couples) a front portion of the second side strap to a rear portion of the second side strap, and that couples (e.g., rotatably or pivotably couples) the second end of the forehead support to the second side strap.
In some implementations, the head-mounted display device further includes: a first follower (e.g., protrusion, slidable member, etc.) that is slidably coupled to the first temple housing and that is coupled to the first side strap at a position between the connector of the first side strap and the first end of the first side strap; and a second follower (e.g., protrusion, slidable member, etc.) that is slidably coupled to the second temple housing and that is coupled to the second side strap at a position between the connector of the second side strap and the first end of the second side strap.
In some implementations, the head-mounted display device further optionally includes a top strap removably coupled at a first end to the forehead support and at a second end to the adjustment mechanism of the adjustable strap assembly.
In some implementations, each of the first side strap and the second side strap includes a rack, and wherein the adjustment mechanism of the adjustable strap assembly includes: a pinion engaged with the rack of the first side strap and with the rack of the second side strap; and a knob configured to cause rotation of the pinion in order to adjust the circumferential size defined by the first side strap, the second side strap, and the frame (e.g., to customize the fit to a circumferential head size of a particular wearer).
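As an illustrative sketch (not part of the disclosure), the relationship between knob rotation and circumferential size in such a dual-rack-and-pinion adjuster can be modeled: rotating the pinion by a given angle moves each rack by the corresponding arc length, and because both side-strap racks engage the same pinion from opposite sides, the circumference changes by twice that arc length. The function name and pitch radius below are assumptions for illustration only.

```python
import math

def circumference_change(knob_turns: float, pinion_pitch_radius_mm: float) -> float:
    """Approximate change in headband circumference (mm) for a dual-rack
    adjuster: each rack travels r * theta, and the two racks engage the
    pinion from opposite sides, so the total change is twice that arc
    length. Names and values are illustrative assumptions, not taken
    from the disclosure."""
    theta = 2 * math.pi * knob_turns           # knob rotation in radians
    rack_travel = pinion_pitch_radius_mm * theta  # travel of one rack
    return 2 * rack_travel                     # both racks contribute

# Example: a quarter turn of a knob driving a 6 mm pitch-radius pinion
delta = circumference_change(0.25, 6.0)  # about 18.85 mm
```

This also illustrates why such a mechanism keeps the frame centered: the two straps tighten symmetrically by equal amounts.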
In some implementations, the adjustment mechanism of the adjustable strap assembly further includes a tension mechanism (e.g., stop mechanism, one or more gears engaged with a tension member, etc.) that resists rotation of the knob until a threshold force is overcome.
In some implementations, the adjustment mechanism of the adjustable strap assembly further includes a pad configured to engage a back of a head of the wearer.
In some implementations, the see-through display assembly is configured to display to the wearer an augmented reality (AR) image including a virtual reality (VR) image presented over a portion of a body of a patient, and the head-mounted display device further includes or consists essentially of one or more processors configured to, e.g., upon execution of program instructions stored on a non-transitory computer readable medium, receive one or more anatomical images of the patient and signals indicative of at least a position of the see-through display assembly relative to the scene, and to render the AR image to the see-through display assembly. The see-through display assembly may be configured to allow the wearer to see AR images from both a standing and a sitting position without inconvenience or manual adjustment.
In some implementations, at least one of the one or more processors is positioned within the first temple housing or the second temple housing. In some implementations, at least one of the one or more processors may be located in other locations on the head-mounted display device or separate from the head-mounted display device and in wireless communication with the head-mounted display device.
In accordance with several implementations, a head-mounted display device includes a frame extending from a first end to a second end, the first end configured to be positioned adjacent a first temple of a wearer (e.g., a surgeon or other user), and the second end configured to be positioned adjacent a second temple of the wearer. The frame further includes a nose pad (e.g., nose support member, etc.) configured to engage a nose of the wearer. The head-mounted display device further includes an adjustable strap assembly including a first side strap having a first end coupled to the first end of the frame; a second side strap having a first end coupled to the second end of the frame; and an adjustment mechanism (e.g., a knob, a rack and pinion, a mechanism that uses friction, etc.) configured to adjust a position of a second end of the first side strap with respect to a second end of the second side strap, in order to adjust a circumferential size defined by the first side strap, the second side strap, and the frame. The head-mounted display device further includes a forehead support including a first end coupled (e.g., rotatably or pivotably coupled) to the first side strap; a second end coupled (e.g., rotatably or pivotably coupled) to the second side strap; and a central support coupled to the frame. The head-mounted display device further includes a see-through display assembly (e.g., augmented-reality display, near-eye display, stereoscopic display, glasses, visor, headset, goggles, head-up display, etc.) coupled (e.g., rotatably, pivotably, movably, or slidably coupled) to the frame such that a tilt angle (e.g., pantoscopic tilt angle) can be adjusted. The see-through display assembly includes a detachable lens assembly (e.g., clip-on lens assembly, snap-on lens assembly, a friction-fit lens assembly, etc.) that can be detached and replaced with a second detachable lens assembly for changing a prescription of lenses of the see-through display assembly.
The head-mounted display device also includes a first temple housing pivotably coupled to the first end of the frame and slidably coupled to the first side strap of the adjustable strap assembly and a second temple housing pivotably coupled to the second end of the frame and slidably coupled to the second side strap of the adjustable strap assembly. The head-mounted display device further includes a flashlight assembly (e.g., headlamp assembly, headlight assembly, etc.) detachably coupled to the frame.
In some implementations, the first side strap includes a connector that pivotably couples a front portion of the first side strap to a rear portion of the first side strap, and that pivotably couples the first end of the forehead support to the first side strap. The second side strap includes a connector that pivotably couples a front portion of the second side strap to a rear portion of the second side strap, and that pivotably couples the second end of the forehead support to the second side strap.
In some implementations, the head-mounted display device further includes a first follower (e.g., protrusion, slidable member, etc.) that is slidably coupled to the first temple housing and that is coupled to the first side strap at a position between the connector of the first side strap and the first end of the first side strap, and a second follower (e.g., protrusion, slidable member, etc.) that is slidably coupled to the second temple housing and that is coupled to the second side strap at a position between the connector of the second side strap and the first end of the second side strap.
In some implementations, each of the first side strap and the second side strap includes a rack. The adjustment mechanism of the adjustable strap assembly may include a pinion engaged with the rack of the first side strap and with the rack of the second side strap and a knob configured to cause rotation of the pinion in order to adjust the circumferential size defined by the first side strap, the second side strap, and the frame.
In some implementations, the adjustment mechanism of the adjustable strap assembly further includes a tension mechanism (e.g., stop mechanism, one or more gears engaged with a tension member, etc.) that resists rotation of the knob until a threshold force is overcome.
In some implementations, the adjustment mechanism of the adjustable strap assembly further includes a pad configured to engage a back of a head of the wearer.
In some implementations, the head-mounted display device further includes a top strap removably coupled at a first end to the forehead support and at a second end to the adjustment mechanism of the adjustable strap assembly.
In some implementations, the see-through display assembly is configured to display to the wearer an augmented reality (AR) image including a virtual reality (VR) image presented over a scene on a body of a patient, and the head-mounted display device further includes one or more processors configured to receive one or more anatomical images of the patient and signals indicative of at least a position of the see-through display assembly relative to the scene, and to render the AR image to the see-through display assembly.
In some implementations, at least one of the one or more processors is positioned within the first temple housing or the second temple housing. In some implementations, at least one of the one or more processors may be located in other locations on the head-mounted display device or separate from the head-mounted display device and in wireless communication with the head-mounted display device.
In some implementations, each of the first temple housing and the second temple housing includes a plurality of heat-dissipation fins (e.g., protrusions, heatsinks, etc.).
In some implementations, the see-through display assembly is rotatably, pivotably, movably, or slidably coupled to the frame with a pantoscopic tilting assembly that includes an arc-shaped slot that rotatably, pivotably, movably, or slidably couples a portion of the see-through display assembly to a portion of the frame and a detent mechanism including a spring-loaded pin or ball and a plurality of detents. The detent mechanism may be configured to selectively retain the see-through display assembly in any of a plurality of predefined positions with respect to the frame.
In some implementations, the detent mechanism further includes a guide member slidably engaged with the arc-shaped slot, the guide member configured to apply a force to the spring-loaded pin or ball to move the spring-loaded pin or ball from one of the plurality of detents to another of the plurality of detents.
In some implementations, the frame includes a flashlight mounting socket including a first rod (e.g., post, protrusion, shaft, etc.) that defines a pivot or rotation axis and a second rod (e.g., post, protrusion, shaft, etc.) positioned parallel to the first rod. The flashlight assembly may include a first recess (e.g., opening, socket, depression, etc.) shaped to engage and pivot about the first rod of the flashlight mounting socket; a second recess (e.g., opening, socket, depression, etc.) shaped to engage the second rod of the flashlight mounting socket, the second recess being oriented such that the first recess cannot disengage the first rod when the second recess is engaged with the second rod; and a movable latch (e.g., hook, coupler, etc.) configured to selectively retain the second recess in engagement with the second rod.
In some implementations, the head-mounted display device further includes a spring or other biasing mechanism that biases the movable latch toward a position that retains the second recess in engagement with the second rod.
In some implementations, the flashlight assembly includes a flashlight (e.g., headlight, headlamp, etc.); a mounting base that includes the first recess, the second recess, and the movable latch; and one or more arms that pivotably or otherwise rotatably couple the flashlight to the mounting base.
In some implementations, the see-through display assembly includes a display assembly frame; a waveguide lens coupled to the display assembly frame; an anterior lens affixed to the waveguide lens or to the display assembly frame in front of the waveguide lens; a posterior lens frame detachably coupled to the display assembly frame using at least one of: a snap fit, a friction fit, a magnetic attachment, a hook-and-fastener attachment, or a clip; and a posterior lens affixed to the posterior lens frame.
In accordance with several implementations, a head-mounted display device includes a frame extending from a first end to a second end, the first end configured to be positioned adjacent a first temple of a wearer (e.g., a surgeon or other user), and the second end configured to be positioned adjacent a second temple of the wearer; a head mounting assembly configured to retain the frame in a position on a head of the wearer; a see-through display; and a tilting assembly (e.g., pantoscopic tilting assembly) that rotatably, pivotably, movably, or slidably couples the see-through display to the frame such that a tilt angle (e.g., pantoscopic tilt angle) can be adjusted. The tilting assembly (e.g., pantoscopic tilting assembly) includes an arc-shaped slot that rotatably, pivotably, movably, or slidably couples a portion of the see-through display assembly to a portion of the frame. The tilting assembly also includes a detent mechanism including a spring-loaded pin or ball and a plurality of detents, the detent mechanism configured to selectively retain the see-through display assembly in any of a plurality of predefined positions with respect to the frame.
In some implementations, the detent mechanism further includes a guide member slidably engaged with the arc-shaped slot, the guide member configured to apply a force to the spring-loaded pin or ball to move the spring-loaded pin or ball from one of the plurality of detents to another of the plurality of detents.
In some implementations, the arc-shaped slot defines a virtual hinge that includes an axis of rotation that is configured to be aligned with a center of an eyeball of the wearer. In some implementations, the virtual hinge, as opposed to a physical hinge, advantageously allows the wearer's peripheral vision to remain unobstructed and undistorted.
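The geometry of such a virtual hinge can be sketched as follows (an illustrative model only, with assumed coordinates and function names): because the arc-shaped slot's center of curvature sits at the approximate eyeball center, tilting the display moves every display point along a circle about that remote pivot, preserving its distance to the eye without placing any hinge hardware in the peripheral field of view.

```python
import math

def tilt_about_virtual_pivot(point, pivot, angle_deg):
    """Rotate a 2D point (e.g., a point on the display, in mm) about a
    remote virtual pivot (e.g., the approximate center of the wearer's
    eyeball). Illustrative geometry only; coordinates and names are
    assumptions, not taken from the disclosure."""
    angle = math.radians(angle_deg)
    dx, dy = point[0] - pivot[0], point[1] - pivot[1]
    return (pivot[0] + dx * math.cos(angle) - dy * math.sin(angle),
            pivot[1] + dx * math.sin(angle) + dy * math.cos(angle))

# Assume the display sits ~25 mm in front of the eye center. A
# 10-degree pantoscopic tilt moves it along an arc centered on the
# eye, so the display-to-eye distance is unchanged by the adjustment.
eye = (0.0, 0.0)
display_point = (25.0, 0.0)
tilted = tilt_about_virtual_pivot(display_point, eye, -10.0)
```

The invariant distance is what makes the detented tilt positions optically equivalent apart from the viewing angle.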
In some implementations, the head mounting assembly includes an adjustable strap assembly including a first side strap having a first end coupled to the first end of the frame; a second side strap having a first end coupled to the second end of the frame; and an adjustment mechanism (e.g., a knob, a rack and pinion, a mechanism that uses friction, etc.) configured to adjust a position of a second end of the first side strap with respect to a second end of the second side strap, in order to adjust a circumferential size defined by the first side strap, the second side strap, and the frame.
In some implementations, the head-mounted display device further includes a forehead support including a first end pivotably coupled to the first side strap; a second end pivotably coupled to the second side strap; and a central support coupled to the frame.
In some implementations, the head-mounted display device further includes a first temple housing pivotably coupled to the first end of the frame and slidably coupled to the head mounting assembly; a second temple housing pivotably coupled to the second end of the frame and slidably coupled to the head mounting assembly; and one or more processors configured to (e.g., upon execution of program instructions stored on a non-transitory computer readable medium) render images for display by the see-through display. In some implementations, at least one of the one or more processors is positioned within the first temple housing or the second temple housing, although they may be in other locations as well.
In some implementations, the head mounting assembly includes a first temple arm coupled to the frame and configured to be placed over a first ear of the wearer; and a second temple arm coupled to the frame and configured to be placed over a second ear of the wearer.
In some implementations, the frame further includes a nose pad (e.g., nose support member, etc.) configured to engage a nose of the wearer.
In accordance with several implementations, a head-mounted display device includes a frame extending from a first end to a second end, the first end configured to be positioned adjacent a first temple of a wearer (e.g., a surgeon or other user), and the second end configured to be positioned adjacent a second temple of the wearer. The head-mounted display device also includes an adjustable strap assembly including a first side strap having a first end coupled to the first end of the frame; a second side strap having a first end coupled to the second end of the frame; and an adjustment mechanism (e.g., a knob, a rack and pinion, a mechanism that uses friction, etc.) configured to adjust a position of a second end of the first side strap with respect to a second end of the second side strap, in order to adjust a circumferential size defined by the first side strap, the second side strap, and the frame. The head-mounted display device further includes a first temple housing pivotably coupled to the first end of the frame and slidably coupled to the first side strap of the adjustable strap assembly and a second temple housing pivotably coupled to the second end of the frame and slidably coupled to the second side strap of the adjustable strap assembly. The head-mounted display device also includes a see-through display coupled to the frame.
In some implementations, the head-mounted display device further includes a first follower (e.g., protrusion, slidable member, etc.) that slidably couples the first temple housing to the first side strap; and a second follower (e.g., protrusion, slidable member, etc.) that slidably couples the second temple housing to the second side strap.
In some implementations, the first follower includes an elongate protrusion that can slide forward and backward within an elongate slot of the first temple housing responsive to pivoting of the first temple housing with respect to the frame. The second follower may include an elongate protrusion that can slide forward and backward within an elongate slot of the second temple housing responsive to pivoting of the second temple housing with respect to the frame.
In some implementations, each of the first side strap and the second side strap includes a rack, and the adjustment mechanism of the adjustable strap assembly includes a pinion engaged with the rack of the first side strap and with the rack of the second side strap and a knob configured to cause rotation of the pinion in order to adjust the circumferential size defined by the first side strap, the second side strap, and the frame.
In some implementations, the adjustment mechanism of the adjustable strap assembly further includes a tension mechanism (e.g., stop mechanism, one or more gears engaged with a tension member, etc.) that resists rotation of the knob until a threshold force is overcome.
In some implementations, the adjustment mechanism of the adjustable strap assembly further includes a pad configured to engage a back of a head of the wearer.
In some implementations, the head-mounted display device further includes a forehead support including a first end pivotably coupled to the first side strap; a second end pivotably coupled to the second side strap; and a central support coupled to the frame.
In some implementations, the head-mounted display device further includes a top strap removably coupled at a first end to the forehead support and at a second end to the adjustment mechanism of the adjustable strap assembly.
In some implementations, the frame further includes a nose pad (e.g., nose support member, etc.) configured to engage a nose of the wearer.
In accordance with several implementations, a head-mounted display device includes a frame extending from a first end to a second end, the first end configured to be positioned adjacent a first temple of a wearer (e.g., a surgeon or other user), and the second end configured to be positioned adjacent a second temple of the wearer. The head-mounted display device further includes a head mounting assembly configured to retain the frame in a position on a head of the wearer; a see-through display; and a flashlight assembly (e.g., headlamp assembly, headlight assembly, etc.) detachably coupled to the frame. The frame includes a flashlight mounting socket including a first rod (e.g., post, protrusion, shaft, etc.) that defines a pivot axis; and a second rod (e.g., post, protrusion, shaft, etc.) positioned parallel to the first rod. The flashlight assembly includes a first recess (e.g., opening, socket, depression, etc.) shaped to engage and pivot about the first rod of the flashlight mounting socket; a second recess (e.g., opening, socket, depression, etc.) shaped to engage the second rod of the flashlight mounting socket, the second recess being oriented such that the first recess cannot disengage the first rod when the second recess is engaged with the second rod; and a movable latch (e.g., hook, coupler, etc.) configured to selectively retain the second recess in engagement with the second rod.
In some implementations, the head-mounted display device further includes a spring or other biasing structure that biases the movable latch toward a position that retains the second recess in engagement with the second rod.
In some implementations, the flashlight assembly includes a flashlight (e.g., headlight, headlamp, etc.); a mounting base that includes the first recess, the second recess, and the movable latch; and one or more arms that pivotably couple the flashlight to the mounting base.
In some implementations, the flashlight mounting socket further includes one or more electrical contacts configured to electrically couple to a corresponding one or more electrical contacts of the flashlight assembly.
In some implementations, the frame further includes a nose pad (e.g., nose support member, etc.) configured to engage a nose of the wearer.
In accordance with several implementations, a head-mounted display device includes a frame extending from a first end to a second end, the first end configured to be positioned adjacent a first temple of a wearer (e.g., a surgeon or other user), and the second end configured to be positioned adjacent a second temple of the wearer. The head-mounted display device further includes a head mounting assembly configured to retain the frame in a position on a head of the wearer. The head-mounted display device also includes a see-through display assembly (e.g., augmented-reality display, stereoscopic display, glasses, visor, etc.) coupled to the frame. The see-through display assembly includes a display assembly frame; a waveguide lens coupled to the display assembly frame; an anterior lens affixed to the waveguide lens or to the display assembly frame in front of the waveguide lens; a posterior lens frame detachably coupled to the display assembly frame using at least one of a snap fit, a friction fit, or a clip; and a posterior lens affixed to the posterior lens frame.
In some implementations, the head-mounted display device further includes a first seal between the anterior lens and the waveguide lens and a second seal between the posterior lens frame and the waveguide lens.
In some implementations, the posterior lens frame includes a first protrusion (e.g., clip, snap, etc.) at a top of the posterior lens frame that fits into a first corresponding recess (e.g., opening, hole, slot, etc.) of the display assembly frame. The posterior lens frame includes a second protrusion (e.g., clip, snap, etc.) at a bottom of the posterior lens frame that forms a snap fit with a second corresponding recess (e.g., opening, hole, slot, etc.) of the display assembly frame.
In some implementations, the see-through display assembly is coupled (e.g., rotatably, pivotably, movably, or slidably coupled) to the frame such that a tilt angle (e.g., pantoscopic tilt angle) can be adjusted.
In some implementations, the see-through display assembly is coupled (e.g., rotatably, pivotably, movably, or slidably coupled) to the frame with a pantoscopic tilting assembly that includes an arc-shaped slot that rotatably, pivotably, movably, or slidably couples a portion of the see-through display assembly to a portion of the frame and a detent mechanism including a spring-loaded pin or ball and a plurality of detents, the detent mechanism configured to selectively retain the see-through display assembly in any of a plurality of predefined positions with respect to the frame.
In some implementations, the detent mechanism further includes a guide member slidably engaged with the arc-shaped slot, the guide member configured to apply a force to the spring-loaded pin or ball to move the spring-loaded pin or ball from one of the plurality of detents to another of the plurality of detents.
In some implementations, the frame further includes a nose pad (e.g., nose support member, etc.) configured to engage a nose of the wearer.
In accordance with several implementations, a system includes a head-mounted display (HMD), including a frame configured to be mounted on a head of a user (e.g., a surgeon or other user); and a display (including, e.g., an augmented-reality display, a stereoscopic display, glasses, goggles, a head-up display, a visor, etc.), which is (i) connected to the frame and configured to rotate relative to a frontal plane of the user for setting a tilt angle (e.g., pantoscopic tilt angle) of the display, and (ii) at least partially transparent and configured to display to the user an augmented reality (AR) image including a virtual reality (VR) image presented over a scene on a body of a patient; and at least one processor configured to (i) receive one or more anatomical images of the patient and signals indicative of at least the position of the display relative to the scene; and (ii) render the AR image to the display.
In some implementations, the frame includes an adjustable head mounting assembly including adjustable temple arms, which are placed over ears of the user and are configured to be adjusted for conforming with at least a section of temples of the user; an adjustable nose pad (e.g., nose support member, etc.), which is placed over a nose of the user and is configured to conform to a shape of at least part of the nose; and a housing, which is connected to the temple arms and nose pad.
In some implementations, at least one of the temple arms includes first and second sections and first and second tilting assemblies, wherein the first tilting assembly is configured to tilt the first section relative to the frame, and the second tilting assembly is configured to rotate the second section relative to the first section.
In some implementations, the second tilting assembly includes a rocker arm, which is configured to rotate about a hinge relative to a longitudinal axis of the first section.
In some implementations, the first section has an opening, and the opening is configured to contain at least part of the rocker arm when the rocker arm rotates.
In some implementations, the rocker arm includes a cushion, which is made from a viscoelastic foam shaped to conform with a nape of the user and to improve a clamping between the frame of the HMD and the head of the user.
In some implementations, the cushion includes a material selected from a list of materials consisting essentially of at least one of (i) silicone, (ii) neoprene, and (iii) polyurethane.
In some implementations, the cushion includes a sponge.
In some implementations, the second tilting assembly includes an alloy, which is coated with a viscoelastic foam and is shaped to conform with a nape of the user.
In some implementations, (i) the first and second sections, (ii) the first and second tilting assemblies, and (iii) the nose pad, are adapted to provide the user with two or more degrees of freedom (DOFs) for adjusting the frame to conform with a contour of the head of the user.
In some implementations, the second tilting assembly includes an array of rocker arms, each of the rocker arms being configured to rotate about a respective hinge.
In some implementations, the array of rocker arms is mounted on a common bar, and the second tilting assembly includes an additional hinge, wherein the common bar is configured to rotate about the additional hinge.
In some implementations, the frame includes an adjustable head mounting assembly including a first side strap having a first end coupled to a first end of the frame; a second side strap having a first end coupled to a second end of the frame; and an adjustment mechanism (e.g., a knob, a rack and pinion, a mechanism that uses friction, etc.) configured to adjust a position of a second end of the first side strap with respect to a second end of the second side strap, in order to adjust a circumferential size defined by the first side strap, the second side strap, and the frame.
In some implementations, the system further includes a first temple housing pivotably coupled to the first end of the frame and slidably coupled to the first side strap; and a second temple housing pivotably coupled to the second end of the frame and slidably coupled to the second side strap.
In some implementations, at least one eye of the user has a first optical axis, an optical engine of the display has a second optical axis, and the pantoscopic tilt angle is set for aligning the second optical axis with the first optical axis.
In some implementations, the HMD includes a pantoscopic-tilting assembly (PTA), which is connected to the frame and the display, and is configured to rotate the second optical axis relative to the first optical axis for adjusting the pantoscopic tilt angle.
In some implementations, the PTA includes a hinge connecting the first and second optical axes.
In some implementations, the PTA includes a bar, which is coupled to the optical engine including the display, and an edge of the bar is coupled to the hinge, and wherein the hinge is adapted to rotate the bar relative to the first axis of the frame.
In some implementations, the PTA includes a virtual axis, which includes at least two parts other than a hinge and is adapted to rotate the second optical axis relative to the first optical axis.
In some implementations, the virtual axis includes (i) a bar coupled to an optical engine including the display and to a rotatable section of a disc having a slit, and (ii) an element configured to be inserted into the slit and to be moved along the slit and relative to the slit in a tangential direction.
In some implementations, the virtual axis includes an arm including first, second and third sections, wherein the first section is coupled to an optical engine including the display, the second section is coupled to the frame, and the third section is coupled between the first and second sections and configured to bend in response to a force applied to the PTA for adjusting the pantoscopic tilt angle.
In some implementations, the virtual axis includes (i) a rigid arm coupled to the frame, and (ii) a flexible arm, which is coupled to an optical engine including the display, and wherein the flexible arm is adapted to transform from an elastic deformation to a plastic deformation, and to retain a shape obtained in response to a force applied to the PTA for adjusting the pantoscopic tilt angle.
In some implementations, the PTA further includes a detent mechanism including a spring-loaded pin or ball and a plurality of detents, the detent mechanism configured to selectively retain the display in any of a plurality of predefined positions with respect to the frame.
In accordance with several implementations, a head-mounted display device includes a frame extending from a first end to a second end; a head mounting assembly configured to be adjustable and to retain the frame in a position on a head of the user (e.g., a surgeon or other user); a display (including, e.g., an augmented-reality display, a stereoscopic display, glasses, a visor, etc.) that is at least partially transparent and configured to display to the user an augmented reality (AR) image including a virtual reality (VR) image presented over a scene on a body of a patient; a pantoscopic tilting assembly that pivotably couples (e.g., rotatably, pivotably, movably, or slidably couples) the display to the frame such that a pantoscopic tilt angle can be adjusted; a first temple housing pivotably coupled to the first end of the frame by a first tilting assembly; and a second temple housing pivotably coupled to the second end of the frame by a second tilting assembly. At least one of the first temple housing or the second temple housing includes at least one processor configured to receive one or more anatomical images of the patient and signals indicative of at least a position of the display relative to the scene, and to render the AR image to the display.
In some implementations, the head mounting assembly includes an adjustable head strap configured to engage at least a back of a head of the user and a forehead of the user.
In some implementations, the first temple housing and the second temple housing are each part of the head mounting assembly.
In some implementations, the head mounting assembly further includes an adjustable nose pad (e.g., nose support member, etc.) configured to conform to a shape of at least a part of a nose of the user.
In some implementations, the pantoscopic tilting assembly includes a slot positioned to enable a pivoting element to slide tangential to the slot, the pivoting element configured to be locked at one or more predefined positions with respect to the slot.
In accordance with several implementations, a method of pairing head-mounted displays and workstations in medical center networks includes introducing a communication device of a head-mounted display into a first network; initiating pairing of the communication device of the head-mounted display to a first workstation of the first network using previous connection parameters, when the communication device of the head-mounted display is previously known to the first network. The method also includes initiating pairing of the communication device of the head-mounted display to the first workstation of the first network using a key exchanging process that generates new connection parameters, when the communication device of the head-mounted display is not previously known to the first network. The method further includes exchanging data between the first workstation and the communication device of the head-mounted display during a surgical operation, to enable the head-mounted display to display to a user an augmented reality (AR) image including a virtual reality (VR) image presented over a scene on a body of a patient. In some implementations, the head-mounted display comprises any of the head-mounted displays or head-mounted display devices disclosed herein.
In some implementations, the method further includes, responsive to pairing not being completed successfully within a predefined time limit, initiating pairing of the communication device of the head-mounted display to a second workstation of the first network.
In some implementations, the method further includes unpairing the communication device of the head-mounted display from the first workstation of the first network; initiating pairing of the communication device of the head-mounted display to a second workstation of a second network; and exchanging data between the second workstation and the communication device of the head-mounted display during a surgical operation, to enable the head-mounted display to display to a user an augmented reality (AR) image including a virtual reality (VR) image presented over a scene on a body of a patient.
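The pairing flow described above can be sketched in code. The following is an illustrative Python sketch only, under assumed names (`pair_hmd`, `known_devices`, `key_exchange`, `PAIRING_TIMEOUT_S`); it is not the disclosed implementation, but shows the decision logic: reuse stored connection parameters for a previously known device, run a key exchange for an unknown one, and fall back to another workstation if pairing does not complete in time.

```python
PAIRING_TIMEOUT_S = 30  # predefined time limit (illustrative assumption)

def pair_hmd(hmd_id, network):
    """Pair an HMD communication device to a workstation of `network`.

    `network` is assumed to be a dict with a list of candidate
    `workstations` and a `known_devices` store of previously generated
    connection parameters.
    """
    for workstation in network["workstations"]:
        if hmd_id in network["known_devices"]:
            # Device previously known to this network: reuse stored parameters.
            params = network["known_devices"][hmd_id]
        else:
            # Unknown device: run a key exchange to generate new parameters.
            params = workstation["key_exchange"](hmd_id)
            network["known_devices"][hmd_id] = params
        if workstation["connect"](hmd_id, params, timeout=PAIRING_TIMEOUT_S):
            return workstation["name"]
        # Pairing not completed within the time limit: try the next workstation.
    return None
```

In this sketch, unpairing from one network and re-pairing to a workstation of a second network simply amounts to calling `pair_hmd` again with the second network's descriptor.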
Also described and contemplated herein is the use of any of the apparatus, systems, or methods for the treatment of a spine through a surgical intervention.
Also described and contemplated herein is the use of any of the apparatus, systems, or methods for the treatment of an orthopedic joint through a surgical intervention, including, optionally, a shoulder, a knee, an ankle, a hip, or other joint.
Also described and contemplated herein is the use of any of the apparatus, systems, or methods for the treatment of a cranium through a surgical intervention.
Also described and contemplated herein is the use of any of the apparatus, systems, or methods for the treatment of a jaw through a surgical intervention.
Also described and contemplated herein is the use of any of the apparatus, systems, or methods for the diagnosis of a spinal abnormality.
Also described and contemplated herein is the use of any of the apparatus, systems, or methods for the diagnosis of a spinal injury.
Also described and contemplated herein is the use of any of the apparatus, systems, or methods for the diagnosis of joint damage.
Also described and contemplated herein is the use of any of the apparatus, systems, or methods for the diagnosis of an orthopedic injury.
Also described and contemplated herein is the use of any of the apparatus, systems, or methods in non-medical applications, such as gaming, driving, product design, shopping, manufacturing, athletics or fitness, navigation, remote collaboration, and/or education.
For purposes of summarizing the disclosure, certain aspects, advantages, and novel features are discussed herein. It is to be understood that not necessarily all such aspects, advantages, or features will be embodied in any particular embodiment of the disclosure, and an artisan would recognize from the disclosure herein a myriad of combinations of such aspects, advantages, or features. The embodiments disclosed herein may be embodied or carried out in a manner that achieves or optimizes one advantage or group of advantages as taught or suggested herein without necessarily achieving other advantages as may be taught or suggested herein. The systems and methods of the disclosure each have several innovative aspects, no single one of which is solely responsible for the desirable attributes disclosed herein.
The present disclosure will be more fully understood from the following detailed description of the embodiments thereof, taken together with the drawings, claims and descriptions above. A brief description of the drawings follows.
Non-limiting features of some embodiments of the inventions are set forth with particularity in the claims that follow. The following drawings are for illustrative purposes only and show non-limiting embodiments. Features from different figures may be combined in several embodiments. It should be understood that the figures are not necessarily drawn to scale. Distances, angles, etc. are merely illustrative and do not necessarily bear an exact relationship to actual dimensions and layout of the devices illustrated.
Some image-guided surgery systems may apply augmented reality (AR) techniques for displaying, over structures inside a patient's body intended to be operated on, one or more anatomical images of the structures (e.g., bones, joints, soft tissue, organs, cartilage). For example, the system may comprise a suitable head-mounted display (HMD), which is configured to display to a surgeon or other wearer, three-dimensional (3D) anatomical images, two-dimensional (2D) or 3D cross sections, tool trajectory, tool depth and additional information that assists the surgeon or other wearer to visualize structures (e.g., vertebrae, joints, bones, soft tissue, organs) that are hidden from actual view by overlying layers of tissue (for example, but not by way of limitation, during a minimally invasive interventional procedure or surgery that does not require open surgery to expose a target region of the body).
Several embodiments of the disclosure that are described herein provide assemblies and methods that may be implemented in conjunction with several types of HMDs for improving the quality of image-guided surgical or other interventional procedures, including spinal surgery and other sorts of orthopedic procedures (as well as other types or categories of procedures, such as dental procedures, cranial procedures, neurological procedures, joint surgery (e.g., shoulder, knee, hip, ankle, other joints), heart surgery, bariatric surgery, facial bone surgery, neurosurgery, and the like), including minimally invasive procedures that do not require open surgery but can be performed through small incisions (e.g., self-sealing incisions that do not require staples or stitches). Note that each of the HMDs described hereinafter may comprise a basic configuration and additional optional components and assemblies that may be implemented in one or more of the HMDs in addition to the basic configuration.
The systems, devices and methods described may be used in connection with other medical procedures (including therapeutic and diagnostic procedures) and with other instruments and devices or other non-medical display environments. The methods described herein further include the performance of the medical procedures (including but not limited to performing a surgical intervention such as treating a spine, shoulder, hip, knee, ankle, other joint, jaw, cranium, etc.). Although medical applications are well-suited for several embodiments, non-medical applications also benefit from many embodiments described herein. For example, non-medical applications may involve consumer or commercial applications such as athletics and fitness, gaming, driving, product design, navigation, manufacturing, logistics, shopping and commerce, educational training, remote collaboration, etc. The surgeon referenced herein may be a consumer or other wearer or user.
In one example, the surgical procedure (e.g., minimally invasive surgical procedure or open surgical procedure) comprises one or more orthopedic procedures performed on one or more vertebrae of the spine of a human subject, referred to herein as a patient 23, who is lying on an operating table 12. One example application is a Lateral Lumbar Interbody Fusion (LLIF) procedure for treating disc problems in the lower (e.g., lumbar or lumbosacral) back of patient 23. In other embodiments, the techniques described below are applicable, mutatis mutandis, to other sorts of surgical procedures carried out on other vertebrae (e.g., lumbar vertebrae, thoracic vertebrae, cervical vertebrae, sacral vertebrae), any suitable organ, bone(s), joints (e.g., sacroiliac joints, knee joints, shoulder joints, ankle joints, hip joints) or other tissue of patient 23. In non-limiting examples, system 11 may be used in other sorts of procedures performed on bone tissue of patient 23, such as in cranial procedures, oral procedures and in maxillofacial surgery, knee surgery, hip surgery, shoulder surgery. Moreover, system 11 may also be used, mutatis mutandis, in surgical or other interventional (e.g., therapeutic or diagnostic) procedures of soft tissue (e.g., neuro procedures, cranial procedures, joint repair or reconstruction procedures, scoping procedures, arthroscopic procedures, ablation procedures, etc.). System 11 may also be used in non-medical applications, including consumer or commercial applications such as athletics and fitness, gaming, driving, product design, navigation, manufacturing, logistics, shopping and commerce, educational training, remote collaboration, etc. and the wearer may be a person other than a surgeon or medical professional (such as a consumer or other user).
In some embodiments, during the procedure, a medical professional, also referred to herein as a physician or a surgeon 26, uses a suitable tool for making an incision 24 into the patient's back. In some embodiments, surgeon 26 inserts an anchoring device, such as spinous process clamp 30 into the incision 24, so that opposing jaws of the clamp 30 are located on opposite sides of the spinous processes. Subsequently, surgeon 26 slides the clamp 30 over the vertebral laminas, and adjusts the clamp 30 to grip one or more spinous processes, selected by the surgeon 26, of the patient 23. One optional implementation of clamp 30 is described in more detail in PCT Application Publication WO 2022/079565, whose disclosure is incorporated herein by reference.
In some embodiments, clamp 30 acts as a support for a patient marker 38, which is attached rigidly to the clamp. During substantially all of the procedure, e.g., during the initial as well as the subsequent stages, patient marker 38 is used as a fiducial for patient 23, since, due to its rigid connection to the patient 23, any movement of the patient 23 is reflected in a corresponding motion of the patient marker 38. Thus, at an initial stage of the procedure, marker 38 is registered with the anatomy of patient 23.
In some embodiments, the anchoring device may be a pin inserted into a bone of the patient, e.g., iliac bone. One optional implementation of such a pin is described in more detail in PCT Application Publication No. WO 2023/281395, whose disclosure is incorporated herein by reference.
Embodiments related to registration tools, markers, marks, adaptors, and methods are described in detail, for example, in U.S. Patent Application Publication 2022/0071712, U.S. Patent Application Publication 2022/0142730, U.S. Pat. No. 10,939,977 and U.S. Patent Application Publication 2021/0161614, whose disclosures are all incorporated herein by reference.
In some embodiments, system 11 comprises (i) a head-mounted display (HMD) 22, which is worn by surgeon 26 and is described in detail hereinafter, (ii) one or more surgical and/or diagnostic tools, such as but not limited to a surgical tool 190, and (iii) one or more reflectors, such as a reflector 194, mounted on tool 190. The reflectors may comprise markers for registration and/or calibration purposes.
Reference is now made to an inset 13 showing one optional implementation of HMD 22 shown in a front view.
In the context of the present disclosure, the term “front view” refers to the view of HMD 22 as seen by the eyes of a person located in front of surgeon 26 wearing HMD 22. In the example of
In some embodiments, processor 33 is configured to receive information, such as anatomical images, and signals from one or more sensors (described below) and other entities of system 11, and to display to surgeon 26 one or more images overlaid on the surgeon's actual view of a portion of the exterior of the patient's body. For example, during a spinal surgery, processor 33 is configured to produce an augmented reality (AR) display that may show 3D images of the vertebrae overlaid on the patient's back, as seen by the surgeon's eyes. Certain embodiments related to the images, signals and AR display are described in more detail below.
In some embodiments, HMD 22 comprises a visor 14 of a visor-based optical engine for each eye of surgeon 26, which is not shown in
In some embodiments, the optical engine (OE) comprises (i) a projector configured to project the AR image produced by the processor, and (ii) optics configured to direct the projected AR image to the visor, also referred to herein as an AR display 15.
In various embodiments, the projector comprises one or more light sources and/or image sources. As one example, the projector comprises an organic light-emitting diode (OLED)-based image source and display comprising a matrix of LEDs having a total size (e.g. diagonal size) of about 0.5 inch. Other sizes of displays may also be implemented.
In the context of the present disclosure and in the claims, the term “AR image” and grammatical variations thereof refer to a virtual reality (VR) image displayed over or integrated with a display including at least partially transparent portions and having a scene in the background, so that a combination of the VR image and the scene is referred to herein as the AR image.
In some embodiments, AR display 15 is configured to display to surgeon 26 the AR image produced by processor 33 by reflecting the AR image into the pupil of the eye of surgeon 26. The optical engine is shown and described in connection with
In other embodiments, the OE of HMD 22 may have different configurations and may be based on different techniques, such as but not limited to a waveguide and liquid crystal-based OE described in more detail in connection with
In some embodiments, HMD 22 comprises one or more light sources for tracking applications configured to direct light beams to the surface of the organ or treatment region in question (e.g., back) of patient 23. In some embodiments, the light source comprises a pair of infrared (IR) LED projectors 17 configured to direct IR light beams to the surface of the treatment region. In other embodiments, the light source may comprise any other suitable type of one or more light sources, configured to direct any suitable wavelength or band of wavelengths of light, and mounted on HMD 22 or elsewhere in the operating room.
In some embodiments, HMD 22 comprises a camera 16. In some embodiments, camera 16 comprises a red green blue (RGB) camera having an IR-pass filter, referred to herein as an IR camera and also referred to herein as an IR tracker. In other embodiments, camera 16 may comprise a monochrome camera configured to operate in the IR wavelengths. Camera 16 is configured to capture images including reflector 194, marker 38, and other markers (not shown) attached to patient 23. Although camera 16 in
In several embodiments, the tracking application that is based on the images produced by camera 16 requires monochromatic images. In some embodiments, camera 16 comprises a color image sensor. The addition of colors in the tracking images may, in at least some instances, lower image quality due to the de-bayering interpolation applied to the color pixels for producing a contiguous image based on the separated RGB pixels (one red, two green and one blue pixels) of the Bayer filter of the RGB image sensor.
In some embodiments, camera 16 comprises compact sensors (such as sensors designed for consumer products) having a color array filter (CAF) (also denoted a Bayer filter) giving each of the different color channels a unique response. By adding an external band-pass filter, the raw pixel data received from camera 16 can be treated as monochrome data.
In some embodiments, the band-pass filter (BPF) is applied to a selected section of the infrared zone (e.g., between about 830 nm and 870 nm, or using any other suitable range within the wavelengths of the infrared spectrum).
In some embodiments, processor 33 (or any other controller) is configured to apply to each channel a respective single gain value, so as to offset the effects of the Bayer filter within the band-pass filter's pass band.
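The per-channel gain correction described above can be illustrated with a short NumPy sketch. The function name (`bayer_to_monochrome`), the RGGB site layout, and the gain values are illustrative assumptions, not parameters of the disclosed system; in practice the gains would be measured for the specific sensor within the band-pass filter's pass band (e.g., approximately 830-870 nm).

```python
import numpy as np

def bayer_to_monochrome(raw, gains=(1.9, 1.0, 1.0, 2.1)):
    """Equalize the channel responses of a raw RGGB mosaic (2D array).

    gains = (R, G1, G2, B): response-equalizing factors, assumed to be
    measured for the sensor inside the band-pass filter's pass band.
    After scaling, the mosaic can be treated as monochrome data.
    """
    mono = raw.astype(np.float64).copy()
    mono[0::2, 0::2] *= gains[0]  # R sites (even rows, even columns)
    mono[0::2, 1::2] *= gains[1]  # G sites on red rows
    mono[1::2, 0::2] *= gains[2]  # G sites on blue rows
    mono[1::2, 1::2] *= gains[3]  # B sites (odd rows, odd columns)
    return mono
```

Because every pixel keeps its own site, no de-bayering interpolation is applied, avoiding the image-quality loss noted above.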
In some embodiments, the basic configuration of HMD 22 comprises the aforementioned processor 33, communication device (wireless and/or wired), camera 16 (e.g., an IR camera), projectors 17 (e.g., IR projectors), display 15 and the optical engine comprising the projector (shown in
In some embodiments, HMD 22 comprises components, which are additional to the basic configuration. For example, HMD 22 comprises an inertial measurement unit (IMU) 18, which is configured to produce position signals indicative of the position and orientation of HMD 22 at a frequency level between about 1 Hz and 10 kHz (e.g., between 1 Hz and 50 Hz, between 1 Hz and 200 Hz, between 50 Hz and 250 Hz, between 100 Hz and 200 Hz, between 50 Hz and 1 kHz, between 100 Hz and 10 kHz, overlapping ranges thereof, or any value within the recited ranges). Based on the position signals received from IMU 18, processor 33 may be configured to improve the response time of system 11, e.g., to any relative motion between HMD 22, and the organ in question (or other target treatment anatomy or region) of patient 23.
In some embodiments, IMU 18 is configured to operate in conjunction with camera 16 (which typically operates at a frequency of about 60 frames per second or at any other suitable frequency corresponding to any other suitable number of frames per second) and to provide processor 33 with the position and orientation of HMD 22 with a reduced latency compared to images received from camera 16. In such embodiments, the position and orientation of HMD 22 can be calculated with the reduced latency obtained by using IMU 18. Moreover, in case the optical path between camera 16 and one or more of the markers is occluded, processor 33 may rely on the signals from IMU 18 for calculating the position of HMD 22 relative to the organ in question (or other target treatment anatomy or region).
In some embodiments, processor 33 is configured to conduct a registration between the coordinate systems of IMU 18 and camera 16 and a synchronization between the signals received from IMU 18 and camera 16.
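The interplay between the lower-rate camera poses and the higher-rate IMU signals described above can be sketched as follows. This is an illustrative sketch only (all names are assumptions, and the pose is reduced to a single scalar for clarity); it shows the fallback logic, not the disclosed implementation: a valid camera fix resets the estimate, while between frames, or when the optical path to the markers is occluded, the estimate is dead-reckoned with IMU increments.

```python
def fuse_pose(camera_pose, camera_valid, imu_delta, state):
    """Return the current pose estimate of the HMD relative to the target.

    camera_pose : pose from the IR tracker image (None if unavailable)
    camera_valid: True when the markers were visible in the latest frame
    imu_delta   : incremental motion reported by the IMU since last call
    state       : dict holding the last fused pose
    """
    if camera_valid and camera_pose is not None:
        # Camera fix: reset the estimate, clearing accumulated IMU drift.
        state["pose"] = camera_pose
    else:
        # Occlusion or between frames: dead-reckon with the IMU increment.
        state["pose"] = state["pose"] + imu_delta
    return state["pose"]
```

A real implementation would additionally require the registration and synchronization between the IMU and camera coordinate systems noted above, typically via a filter (e.g., a Kalman-type estimator) over full 6-DOF poses.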
In some embodiments, HMD 22 comprises one or more additional cameras 25 (e.g., a pair of red-green-blue (RGB) cameras). Each additional camera 25 (e.g., RGB camera) is configured to produce high resolution (HR) images (e.g., HR RGB images) of the organ being operated on (or other target treatment anatomy or region), and processor 33 is configured to display the HR images on the display 15 of HMD 22.
Moreover, because HMD 22 comprises a pair of additional cameras 25, which are positioned at a known distance from one another, processor 33 is configured to produce a stereoscopic 3D image of the site being operated on (e.g., the organ in question or other target treatment anatomy or region). Such techniques are also referred to herein as a digital loupe, for augmented reality near eye display, and are described in more detail, for example in U.S. Provisional Patent Application 63/234,272, and in PCT Publication No. WO2023/021450, the disclosures of both of which are incorporated herein by reference.
In other embodiments, HMD 22 may comprise any other suitable number of additional cameras 25 having similar features to those of the RGB cameras. Alternatively, at least one of the additional cameras 25 may have different features compared to those of the RGB cameras.
In some embodiments, each additional camera 25 is further configured to acquire images containing a structured light pattern, which is directed to the site being operated on by surgeon 26, and based on the image of the structured light pattern, processor 33 is configured to improve the precision of the 3D imaging of the site being operated on.
In some embodiments, the structured light projector is configured to direct a large number of beams (e.g., hundreds, or thousands) to the organ in question (or other target treatment anatomy or region), so as to enable 3D imaging (and most importantly depth imaging) of the organ (or other target treatment anatomy or region). In some embodiments, the wavelength of the structured light must be suitable for producing spots on the skin and/or internal tissue being operated on. In some examples, the structured light may comprise green laser beams, blue laser beams, red laser beams, infrared laser beams or beams of any other suitable wavelength or range of wavelengths.
In some examples, the structured light comprises a visible wavelength (e.g., green), so that cameras 25 produce images of green spots on the skin and/or on the surface of the tissue being operated on. Based on the images received from cameras 25, processor 33 is configured to produce the 3D image of the organ (or other target treatment anatomy or region) being operated on. The structured light is described in more detail in connection with
In other cases, the structured light comprises beams having a non-visible wavelength, such as infrared wavelengths. In some embodiments, processor 33 is configured to produce one or more images of the IR spots on the surface of the organ in question (or other target treatment anatomy or region). The images may be two-dimensional (2D), or typically 3D, using images (e.g., IR images) acquired, for example, by camera 16 and/or using additional IR cameras that may be mounted on HMD 22 or at a known position in the operation room. The 2D and 3D images may be produced by processor 33 using the same techniques, mutatis mutandis, described above for producing the images having the visible spots of structured light.
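By way of illustration, the triangulation underlying structured-light depth imaging can be sketched for a single projected spot as follows (a simplified model with hypothetical names; an actual system would calibrate the projector and cameras jointly over many spots):

```python
import math

def spot_depth(alpha_rad, beta_rad, baseline_mm):
    """Triangulate the depth of one structured-light spot.

    alpha_rad:   angle of the projected beam from the projector's optical
                 axis, measured toward the camera side.
    beta_rad:    angle at which the camera sees the spot, measured from the
                 camera's optical axis toward the projector side.
    baseline_mm: known projector-to-camera separation.

    With both optical axes parallel, the depth Z of the spot satisfies
    Z * (tan(alpha) + tan(beta)) = baseline.
    """
    denom = math.tan(alpha_rad) + math.tan(beta_rad)
    if denom <= 0:
        raise ValueError("rays do not converge in front of the device")
    return baseline_mm / denom

# Hypothetical values: 80 mm baseline, small symmetric angles -> ~4 m depth.
z = spot_depth(math.atan(0.01), math.atan(0.01), 80.0)
```

Repeating this computation for each detected spot yields the 3D surface points from which the depth image of the site is assembled.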
In some embodiments, HMD 22 comprises a housing 29, which is configured to package all the components described above. In some embodiments, housing 29 comprises a surface 20, which is configured to receive a headlight assembly mounted thereon. Several types of headlight assemblies and mounting techniques thereof to surface 20 are described in detail in connection with
In some embodiments, HMD 22 comprises a nose pad 28, which is adjustable and is configured to support HMD 22 over the nose of surgeon 26. Several embodiments related to the structure and functionality of nose pad 28 are described in detail in connection with
In some embodiments, HMD 22 comprises one or more light sensors 19, also referred to herein as an ambient light sensor (ALS), configured to produce one or more signals indicative of the light in the ambient area surrounding HMD 22. Based on the signals received from light sensor 19, processor 33 is configured to adjust the brightness of the AR image presented on display 15. Additionally, or alternatively, based on the signals received from light sensor 19, processor 33 is configured to adjust the power applied to a light source mounted on HMD 22 and/or to a lighting assembly 27 described herein.
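By way of illustration, the brightness adjustment driven by the ALS signal can be sketched as follows (hypothetical thresholds and a linear mapping; a practical implementation may add filtering and hysteresis to avoid flicker):

```python
def brightness_from_ambient(lux, lo_lux=10.0, hi_lux=5000.0,
                            lo_level=0.15, hi_level=1.0):
    """Linearly interpolate a display brightness level (0..1) from the
    ambient light reading, clamped so the AR image stays visible in a
    darkened room and readable under bright operating-room lighting."""
    if lux <= lo_lux:
        return lo_level
    if lux >= hi_lux:
        return hi_level
    frac = (lux - lo_lux) / (hi_lux - lo_lux)
    return lo_level + frac * (hi_level - lo_level)
```

The same mapping, inverted, could modulate the power applied to a headlight so that its beam is dimmed when ambient light already suffices.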
In some embodiments, HMD 22 comprises an IR-based proximity sensor (not shown), which is configured to produce signals for various uses, such as but not limited to hand-gesture tracking.
Reference is now made back to the general view of
During the surgical or other interventional procedure, surgeon 26 wears HMD 22, which is configured to present to surgeon 26 the captured and stored images, as well as additional calculated information based on the tracking system, aligned with the organ (or other target treatment anatomy or region) being operated on.
In some embodiments, in serving as a fiducial, marker 38 performs two functions: in a first function, the marker is used to maintain registration between the frames of reference of HMD 22 and the patient's anatomy, and in a second function, the marker is used to ascertain where the head and gaze of surgeon 26 are located with respect to the organ in question (or other target treatment anatomy or region) of patient 23.
During the initial stage of the procedure, a registration marker (not shown) is placed on the patient's back or other anatomical location, and is used to implement the registration of patient marker 38 with the anatomy of patient 23. In contrast to patient marker 38, the registration marker, in some implementations, is only used during the initial stage of the procedure, e.g., for the registration of the patient marker 38, and once the registration has been performed, for the subsequent procedure stages the registration marker may be removed from the patient's back or other anatomical location.
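By way of illustration, once registration has been performed, mapping a point from the pre-acquired image frame (e.g., a CT scan) into the patient-marker frame reduces to applying a rigid homogeneous transform (hypothetical names; a simplified sketch of the registration step described above):

```python
def apply_registration(T, point_mm):
    """Map a point (x, y, z), in mm, from the image (e.g., CT) frame into
    the patient-marker frame using a 4x4 homogeneous rigid transform T,
    given as a nested list of rows (rotation plus translation)."""
    x, y, z = point_mm
    p = (x, y, z, 1.0)
    return tuple(sum(T[r][c] * p[c] for c in range(4)) for r in range(3))

# Hypothetical transform: identity rotation with a +100 mm translation
# along z, as might result from the initial registration stage.
T = [[1, 0, 0, 0],
     [0, 1, 0, 0],
     [0, 0, 1, 100.0],
     [0, 0, 0, 1]]
```

Because patient marker 38 is tracked continuously, the same transform chain keeps the displayed images aligned with the anatomy even as the patient or the HMD moves.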
In some embodiments, system 11 comprises a processing system 31, which is communicatively coupled, by cables and/or wirelessly, to HMD 22. In some embodiments, processing system 31 comprises a computer processor 32, a storage device 37 comprising stored images 35, a screen 34, and an input device 36 such as a pointing device, mouse, touchscreen input such as a touchpad or touchscreen display, keyboard, etc.
In some embodiments, processing system 31 is configured to analyze the images acquired by the one or more cameras of HMD 22, and to present over display 15, the aforementioned AR image to surgeon 26.
As described above, HMD 22 comprises processor 33 to carry out at least the functions described above, but in alternative embodiments, during operation HMD 22 is connected to processor 32 of processing system 31, so as to carry out these processing and displaying functions remotely, and the AR images are displayed to surgeon 26 over display(s) 15. The functions or tasks described herein as being implemented by processor 32 may be implemented by processor 33, or vice-versa.
The configuration of HMD 22 is provided by way of example and is simplified for the sake of conceptual clarity. In some embodiments, the processor 32 and the processor 33 can share processing tasks and/or allocate processing tasks between the processors 32, 33. Each of the processors 32, 33 may consist essentially of one processor or more than one processor.
In some embodiments, HMD 22 comprises two optical engines (OEs) 40, one for each eye. In some embodiments, each OE 40 comprises an AR projector 42, which is configured to direct the AR image described in
In some embodiments, AR display 15 is a section of visor 14, which is coated with one or more suitable layers configured to reflect the projected VR image to the pupil of the respective eye, so that surgeon 26 can see the VR image overlaid on a scene of interest (e.g., the organ (or other target treatment anatomy or region) being operated on) in the manner of augmented vision, i.e., a virtual overlay on the real world.
In some embodiments, visor 14 is fully transparent or partially transparent so that when directing the gaze away from AR display 15, surgeon 26 can see the scene around him without having the AR image overlaid thereon.
In some embodiments, HMD 22 comprises two temple arms described herein, and nose pad 28 for mechanically supporting the mounting of HMD 22 over the head of surgeon 26. In the context of the present disclosure and in the claims, the term “temple arm” and grammatical variations thereof refer to a section of the frame of HMD 22 (or any other suitable type of HMD), which is coupled to housing 29 and is typically (but not necessarily) mounted on an ear of the user (e.g., surgeon 26) and is positioned in contact with at least a section of a respective temple of surgeon 26.
In some embodiments, a left temple arm 43 comprises processor 33 and optionally other devices, such as a wireless communication device 45 configured to exchange signals between HMD 22 and external entities, and a storage device 46 configured to store images, signals, program instructions and additional data of HMD 22. Note that processor 33, wireless communication device 45, and storage device 46 appear in dashed lines because they are embedded within the inner volume of left temple arm 43.
In some embodiments, processor 33, wireless communication device 45, and storage device 46 may be disposed on one or more suitable substrates, such as one or more printed circuit boards (PCBs).
In some embodiments, all the devices are disposed on a single rigid PCB. In some embodiments, at least one of the PCBs may be flexible. Additional embodiments related to suitable types of flexible PCB are described in connection with
In some embodiments, HMD 22 comprises a right temple arm 44, which comprises an on/off or standby button 39 configured to turn the power on when using HMD 22 and to turn the power off when HMD 22 is not in use.
In some embodiments, temple arms 43 and 44 are configured to be adjusted to the shape of the respective left and right temples of surgeon 26 or of any other user, and to be mounted on the ears of surgeon 26 in order to hold the weight of HMD 22 (in one example, together with a nose pad described below). The structure and functionality of temple arms 43 and 44 is described in detail in connection with
In some embodiments, nose pad 28 is configured to be adjusted to the shape of the nose of surgeon 26 or to the nose of any other user. The structure and functionality of nose pad 28, as well as embodiments related to the combination of nose pad 28 and temple arms 43 and 44 are described in detail in connection with
In the context of the present disclosure and in the claims, the terms “frame” and “head mounting assembly” are used interchangeably and may refer to the combination of two or more elements among housing 29, nose pad 28 and temple arms 43 and 44, including the combination of head strap 740 and knob 744 (that together form an adjustable strap assembly) of
In some embodiments, a power cable (not shown) is threaded through a power cable strain relief 47 of HMD 22. In the present configuration, power cable strain relief 47 is mounted on right temple arm 44, and the power cable is configured to electrically connect between a power source (not shown) and several components of HMD 22, such as but not limited to an on/off button 39.
In some embodiments, the power source comprises a pack of suitable batteries, and one or more supercapacitors or ultracapacitors (not shown). In some embodiments, the pack of batteries comprises lithium-based batteries, such as but not limited to batteries produced by RRC Power Solutions GmbH (Hamburg, Germany).
In some embodiments, the supercapacitor or ultracapacitor can be used to avoid a lengthy boot-up when changing the battery packs. Instead of powering down HMD 22, processor 33 may be configured to control the components of HMD 22 to enter a low-current standby mode. By powering off all components and peripherals, current may be reduced to a minimum, so as to enable the supercapacitor or ultracapacitor to retain the state of HMD 22 for a sufficiently long time interval while switching the battery packs, without the need for an additional battery for the standby mode.
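By way of illustration, the low-current standby behavior can be sketched as the following state handler (hypothetical names and API; a simplified model of the power-management logic described above, not the disclosed firmware):

```python
class HmdPowerManager:
    """Toy model of entering a low-current standby mode during a battery
    swap, so that a supercapacitor can retain the HMD state without a
    separate standby battery."""

    def __init__(self, peripherals):
        # peripherals: list of dicts, e.g., cameras, projectors, radio
        self.peripherals = peripherals
        self.state = "RUNNING"

    def enter_standby(self):
        """Power off every peripheral to minimize current draw; processor
        state is assumed to be held up by the supercapacitor."""
        for p in self.peripherals:
            p["powered"] = False
        self.state = "STANDBY"

    def resume(self):
        """Restore peripherals once the fresh battery pack is inserted,
        skipping the full boot sequence."""
        for p in self.peripherals:
            p["powered"] = True
        self.state = "RUNNING"
```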
This particular configuration of HMD 22 is shown by way of example, in order to illustrate certain problems that are addressed by embodiments of the disclosure and to demonstrate the application of these embodiments in enhancing the performance of such a system. Embodiments of the disclosure, however, are by no means limited to this specific sort of example system, and the principles described herein may similarly be applied to other sorts of HMDs used in suitable types of AR-based image-guided surgical systems. In addition, HMD 22 may be used in non-medical applications, including consumer and commercial applications.
In some embodiments, system 11 is not limited only to augmented reality systems and/or to systems comprising one or more HMDs. For example, the tracking of the patient and/or tool may be performed using stationary tracking systems (other than an HMD), and the display may also be a stationary display, which may or may not present augmented reality and may or may not be mounted on the head of the user (e.g., surgeon 26) of system 11.
In some embodiments, HMD 22 may comprise various types of image sources, such as but not limited to OLED and liquid-crystal on silicon (shown and described in connection with
In the example of
In some embodiments, an HMD 22 that includes all the parts described in
In some embodiments, HUD 700 comprises an optics housing 704, which incorporates a camera 708. More specifically, camera 708 may comprise an RGB camera configured as an IR camera using a suitable filter and software, or camera 708 may comprise an infrared camera or an RGB-IR camera. In some embodiments, housing 704 comprises an infrared-transparent window 712, and within the housing, e.g., behind the window, are mounted one or more infrared projectors 716.
In some embodiments, HUD 700 comprises a pair of AR displays 720 that are mounted on housing 704. In some embodiments, displays 720 may comprise, for example, an optical combiner, a waveguide, or a visor, as described in connection with
In some embodiments, AR displays 720 allow surgeon 26 to view entities, such as part or all of a selected field-of-view (not shown) through AR displays 720, and which are also configured to present to the surgeon images that may be received from processing system 31 or any other information.
In some embodiments, HUD 700 comprises a processor 724, which operates elements of HUD 700 and is mounted in a processor housing 726. Processor 724 typically communicates with processing system 31 via an antenna 728. In some embodiments, processor 724 may perform some of the functions performed by processing system 31. In some embodiments, processor 724 may completely replace processing system 31.
In some embodiments, HUD 700 comprises a flashlight 732, which is mounted on the front of HUD 700. Flashlight 732 is configured to direct a beam of visible spectrum light (e.g., wavelengths between about 350 nm and 800 nm or between about 300 nm and 900 nm) to selected objects, so that surgeon 26 or other wearer is able to clearly see the objects through displays 720.
In some embodiments, HUD 700 comprises a power source (e.g., a battery (not shown)), which is configured to supply power to several elements of HUD 700 via a battery cable input 736. The power source may additionally or alternatively include one or more capacitors, supercapacitors or ultracapacitors.
In some embodiments, HUD 700 is held and gripped in place on the head of surgeon 26 using a head strap 740, and comprises a knob 744 that the surgeon 26 may use to adjust the head strap of HUD 700. The head strap 740 and knob 744 may together be referred to as an adjustable strap assembly.
In some embodiments, HUD 700 may comprise additional components, such as but not limited to components described in
Additionally, or alternatively, flashlight 732 of HUD 700 may be coupled to housing 704 using a suitable detachable lighting fixture assembly (DLFA), which is configured to be attached to and detached from housing 704 and/or the upper bridge (not indicated by a numeral) of HUD 700, or any other suitable location of HUD 700. The ability to detach flashlight 732 reduces weight from HUD 700, and may also be performed in case lighting assembly 27 of
This particular configuration of HUD 700 is shown by way of example, in order to illustrate certain problems that are addressed by certain embodiments and to demonstrate the application of these embodiments in enhancing the performance of such a system. Embodiments of the disclosure, however, are by no means limited to this specific sort of example HMD, and the principles described herein may similarly be applied to other sorts of head-mounted displays and head-up displays used in suitable types of AR-based image-guided surgical systems. For example, additional features of a head-mounted display or a head-up display are described in detail, for example, in U.S. Patent Application Publication 2017/0178375, which is incorporated herein by reference.
In some embodiments, HMD 50 comprises a waveguide-based OE 55 comprising a backlight source 54. In some embodiments, backlight source 54 comprises one or more LEDs configured to supply visible light through a waveguide 52, which is coated with an opaque shroud to prevent, or at least reduce, the amount of stray light, i.e., optical leakage of photons from the backlight. More specifically, backlight source 54 may comprise red, green and blue (RGB) LEDs and may be configured to emit a white light generated by combining the light of the RGB LEDs.
In some embodiments, the backlight passes through an optical coding device (e.g., a liquid-crystal-on-silicon (LCOS) device 51), which is configured to modulate the backlight based on information coded by processor 33. For example, in response to receiving a signal indicative of a coded slice of a computerized tomography (CT) image, LCOS device 51 is configured to modulate the backlight and to produce an image of the CT slice presented over a display 49a of HMD 50.
In some embodiments, OE 55 comprises a photosensor 53, which is configured to measure stray light between backlight source 54 and LCOS 51 without interfering with the operation of OE 55. In other words, in some embodiments, photosensor 53 is not positioned along the optical axis of the backlight intended to be modulated, and uses the stray light for measuring the intensity of the backlight emitted from backlight source 54 into waveguide 52.
Reference is now made to an inset 57 showing a block diagram of closed-loop control assembly 59. In some embodiments, in response to sensing the intensity of the backlight emitted from backlight source 54, photosensor 53 is configured to produce a signal indicative of the measured intensity of the backlight.
In some embodiments, based on the signal received from photosensor 53, processor 33 is configured to control a backlight driver 58 to adjust the current applied to the RGB LEDs of backlight source 54.
In principle, it is possible to control the current supplied to the backlight source, but due to the non-uniform response of the LEDs (even from the same batch of LEDs) of any light source (such as backlight source 54), the intensity of the backlight may be non-uniform and altered. More specifically, (i) the backlight may be altered over time in the same backlight source 54, e.g., as the LEDs age and/or in response to changes in the temperature of parts surrounding the LEDs, (ii) the backlight may be altered between different backlight sources 54 of different respective OEs 55 (e.g., between the left and right OEs 55 of HMD 50), (iii) the backlight may be altered between OEs of different HMDs 50, and (iv) any combination thereof.
In other words, processor 33, or any suitable dedicated circuit, controls backlight driver 58 to adjust the current supplied to each of the LEDs of backlight source 54, so as to keep the light levels constant. Because a sequential strobing scheme is used, a single photosensor 53 may be sufficient for controlling the light emitted in all three color channels (RGB) in one embodiment.
Thus, in one embodiment, controlling the backlight based on direct off-axis measurement of the stray light of OE 55 improves the uniformity of the brightness of the AR image presented over AR display 49a.
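By way of illustration, the closed-loop brightness control can be sketched as a simple feedback iteration (hypothetical gain, units, and current limits; an actual driver would control each RGB channel within its own strobe slot):

```python
def backlight_step(current_ma, measured, setpoint, gain=0.05,
                   min_ma=0.0, max_ma=150.0):
    """One control iteration: adjust the LED drive current in proportion
    to the brightness error reported by the photosensor, clamped to the
    driver's allowed current range."""
    current_ma += gain * (setpoint - measured)
    return max(min_ma, min(max_ma, current_ma))

# Simulate an aged LED whose efficiency has dropped to 0.8 brightness
# units per mA: the loop raises the current until the measured intensity
# converges on the setpoint of 100 units (steady state near 125 mA).
efficiency = 0.8
current = 100.0
for _ in range(500):
    current = backlight_step(current, efficiency * current, 100.0)
```

The same loop, run independently per OE, compensates for the unit-to-unit and aging variations enumerated above.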
Reference is now made back to the general view of
In some embodiments, processor 33 is configured to present different AR images over AR displays 49a and 49b, so as to display to surgeon 26 images such as a stereoscopic image (e.g., of a 3D CT image) of the organ (or other target treatment anatomy or region) being operated on.
In some embodiments, HMD 50 comprises an adaptor 48, which is formed in a frame 41 of HMD 50 and is adapted for mounting on HMD 50, and a suitable type of nose pad, such as nose pad 28 shown in
This particular configuration of HMD 50 is shown by way of example, in order to illustrate certain problems that are addressed by certain embodiments and to demonstrate the application of these embodiments in enhancing the performance of such a system. Embodiments of the disclosure, however, are by no means limited to this specific sort of example HMD configuration, and the principles described herein may similarly be applied to other sorts of HMDs and HUDs used in any suitable types of near-eye display AR-based image-guided surgical systems. In addition, HMD 50 may be used with other medical systems or with non-medical systems, including for consumer or commercial applications.
Example Headlight Assemblies
In some embodiments, HA 60 comprises a flashlight 61, which may have features similar to those of flashlight 732 of
In some embodiments, HA 60 comprises a detachable lighting fixture assembly (DLFA) 66, which is adapted for attaching flashlight 61 to surface 20 of housing 29, and for detaching flashlight 61 from surface 20.
Reference is now made to insets 62, 63, and 64. Inset 62 shows DLFA 66 without flashlight 61. In some embodiments (such as shown in inset 63), DLFA 66 comprises one or more clips 65 (e.g., one clip, two clips, three clips or more than three clips), which are configured to: (i) attach DLFA 66 to a base 67 located on surface 20, when DLFA 66 (and flashlight 61) are moved toward surface 20, and (ii) detach DLFA 66 from base 67 when clip(s) 65 are pressed toward the inner volume of DLFA 66.
In some embodiments (such as shown in inset 64), base 67 comprises electrical connections 68 (e.g., two or more vertical pogo pins, three pogo pins, or more than three pogo pins) configured to conduct electrical power and/or signals or data between housing 29 and flashlight 61.
In some embodiments, HA 70 comprises a flashlight (not shown), which may have features similar to those of flashlight 61 of
Reference is now made to an inset 72. In some embodiments, HA 70 comprises a DLFA 71, which is adapted for attaching the flashlight to surface 20 of housing 29, and for detaching the flashlight away from surface 20.
In some embodiments, DLFA 71 comprises one or more clips 73 (for example, one clip, two clips, three clips or more than three clips), which are configured to: (i) attach DLFA 71 to a base 74 located on surface 20, when DLFA 71 is moved in a direction 76 (e.g., away from the forehead of surgeon 26), and (ii) detach DLFA 71 from base 74 when clip(s) 73 are pressed toward base 74, and at the same time, DLFA 71 is moved in a direction 75 (e.g., toward the forehead of surgeon 26).
In some embodiments, base 74 comprises electrical connections 68 (e.g., two or more horizontal pogo pins, three pogo pins, or more than three pogo pins) configured to conduct electrical power and/or signals or data between housing 29 and the flashlight described above.
In some embodiments, HA 80 comprises a flashlight (not shown), which may have features similar to those of flashlights 61 of
Reference is now made to an inset 81. In some embodiments, HA 80 comprises a DLFA 82, which is adapted for attaching the flashlight to surface 20 of housing 29, and for detaching the flashlight away from surface 20.
In some embodiments, a base 86 formed on surface 20 comprises trenches 85 configured to receive DLFA 82 as will be described herein.
In some embodiments, DLFA 82 comprises two leads 84 (one at each side of DLFA 82), which are configured to: (i) slide through trenches 85 along direction 75 for attaching DLFA 82 to base 86, and (ii) slide through trenches 85 along direction 76 for detaching DLFA 82 away from base 86.
In some embodiments, DLFA 82 comprises a handle 83 for moving DLFA 82 in directions 75 and 76, and one or more clips configured to attach and detach DLFA 82 in conjunction with the movement in directions 75 and 76.
In some embodiments, housing 29 comprises electrical connections (e.g., one or more vertical or horizontal pogo pins (not shown)) configured to conduct electrical power and/or communication signals or data between housing 29 and the flashlight described above, which is connected to DLFA 82.
In some embodiments, HA 90 comprises a flashlight (not shown), which may have features similar to those of flashlights 61 of
Reference is now made to an inset 91. In some embodiments, HA 90 comprises a DLFA 92, which is adapted for attaching the flashlight to a base 93 coupled to surface 20 of housing 29, and for detaching the flashlight away from base 93.
In some embodiments, inset 91 shows how DLFA 92 is attached to and detached from base 93, when being moved in directions 75 and 76, respectively.
Reference is now made back to the general view of
In some embodiments, base 93 comprises electrical connections (e.g., two or more vertical and/or horizontal pogo pins (not shown)) configured to conduct electrical power and/or communication signals or data between housing 29 and the flashlight described above, which is connected to DLFA 92.
Reference is now made back to inset 91. In some embodiments, DLFA 92 comprises two pairs of flexible fins 96 (one pair at each side of DLFA 92), which are partially surrounding an opening 97 configured to contain fences 95 for attaching DLFA 92 to base 93.
In some embodiments, the configuration of at least part of DLFA 92 and base 93 is similar to the design of a GoPro®-style mount for cameras and other electronic accessories, produced by GoPro, Inc. (3025 Clearview Way, San Mateo, CA).
In some embodiments, HA 100 comprises a flashlight 61 and a detachable lighting fixture assembly (DLFA) 101, which is adapted for attaching flashlight 61 to surface 20 of housing 29, and for detaching flashlight 61 away from surface 20.
Reference is now made to insets 102 and 103. In some embodiments, DLFA 101 comprises a bore 104, which is adapted to contain an axis (not shown) for controlling an elevation angle 105 of flashlight 61 relative to a virtual plane, e.g., parallel to surface 20 or to any other suitable reference plane.
With reference to inset 103, in some embodiments, DLFA 101 comprises a base 106, which is coupled to surface 20 and is configured to connect between DLFA 101 and surface 20 of housing 29.
Reference is now made to an inset 107. In some embodiments, base 106 comprises electrical connections 68 (e.g., two or more vertical or horizontal pogo pins) configured to conduct electrical power and/or signals or data between housing 29 and flashlight 61.
In some embodiments, base 106 comprises a pair of magnets 109a and 109b, and DLFA 101 comprises a pair of magnets 108a and 108b. When DLFA 101 is placed over base 106, magnets 108a and 109a attract one another, and similarly, magnets 108b and 109b attract one another.
In such embodiments, the magnetic-based attachment and detachment between DLFA 101 and base 106 are quick and easy because they do not demand a mechanical release of clips or any other sort of capturing and/or locking device. Thus, surgeon 26 or a medical staff member in the operating room can attach and detach DLFA 101 and flashlight 61 using one hand, and subsequently, adjust angle 105 for directing the light beam to a region of interest, e.g., the organ (or other target treatment anatomy or region) being operated on.
In some embodiments, DLFA 101 and base 106 may comprise three, four, or more than four pairs of magnets 108 and 109 for improving the stability of the magnetic-based coupling and preventing undesired rotation of DLFA 101 relative to base 106. In accordance with several embodiments, the size, shape, and magnetism level of magnets 108 and 109, and the distance between every pair of magnets 108 and 109, may define the stability of the magnetic-based coupling.
In some embodiments, a single pair of magnets 108 and 109 may be sufficient for enabling the stability of the magnetic-based coupling and preventing undesired rotation of DLFA 101 relative to base 106.
These particular configurations of the HAs and DLFAs of
In some embodiments, the headlight assemblies shown in
Example Tilting Assemblies
The embodiments related to tilting assemblies that are described in detail in connection with
In some embodiments, tilting assembly 111 is configured for tilting temple arm 44 relative to housing 29. In other words, tilting assembly 111 provides surgeon 26 with a horizontal degree-of-freedom (DOF) for adjusting HMD 22 to the shape of the head of surgeon 26 or to the shape of the head of any other user of HMD 22. In non-medical applications, surgeon 26 may be replaced by any other user, such as a consumer.
In some embodiments, tilting assembly 111 may be implemented using a hinge (not shown), also referred to herein as an axis. In other embodiments, tilting assembly 111 is implemented using a virtual hinge, also referred to herein as a virtual axis. The terms “virtual hinge” and “virtual axis” and grammatical variations thereof refer to tilting one object relative to another object without using a real, or physical, hinge or a real, or physical, axis.
Reference is now made to an inset 110, which is a top view of tilting assembly 111 integrated in HMD 22.
In the example of
In the example implementation of the virtual axis of tilting assembly 111, sections 112 and 113 are coupled to housing 29 and temple arm 44, respectively, using screws 115 and bores 116. Section 112 and housing 29 move together as a rigid entity, and section 113 is moved in directions 117 and 118 by bending section 114.
In some embodiments, instead of using screws and bores, at least one of sections 112 and 113 of tilting assembly 111 may be coupled to the respective parts of HMD 22 using any other coupling technique, such as but not limited to fastening with devices other than screws, gluing, welding, and/or soldering.
Additionally, or alternatively, at least one of sections 112 and 113 of tilting assembly 111 may be formed as an integral part of HMD 22. For example, section 112 may be formed in one cast mold together with at least part of housing 29.
In some embodiments, tilting assembly 111 may comprise different materials in at least two of its sections. For example, sections 112 and 113 are made of stainless steel, and section 114 is made of a softer and more flexible material, such as but not limited to a nickel-titanium alloy, also known as nitinol, or any other suitable material with proper characteristics, e.g., flexibility within the elastic range over the required movement range.
Reference is now made to an inset 119, which is a side view of tilting assembly 111 integrated in HMD 22. In the example of inset 119, the coupling of sections 112 and 113 to the respective parts of HMD 22 is shown using screw 115 and bores 116 (through which additional screws 115 are intended to be inserted in some embodiments).
In some embodiments, the techniques described for tilting assembly 111 are implemented, mutatis mutandis, also between temple arm 43 and housing 29.
In some embodiments, any other suitable type of tilting assembly may be implemented between temple arm 43 and housing 29.
In some embodiments, tilting assembly 77 is implemented in HMD 22, but may also be implemented in other sorts of HMDs.
In some embodiments, temple arm 44 comprises a section 121 configured to conform with the shape of the right temple of surgeon 26 or any other user of HMD 22. Temple arm 44 further comprises a section 122 configured to conform with the shape of the right side of the temple and/or the rear portion of the human head, referred to herein as the nape, of surgeon 26 or any other user of HMD 22.
Reference is now made to an inset 120, which is a side view of section 122 and tilting assembly 77 implemented in temple arm 44 for obtaining an additional adjustment DOF between HMD 22 and the head of surgeon 26 or any other user of HMD 22.
In some embodiments, tilting assembly 77 comprises a rotatable rocker arm 123, a hinge 124 connecting between section 121 and rocker arm 123, and a cushion 125 formed on the surface of rocker arm 123.
Reference is now made to an inset 131, which is a pictorial illustration of the inner structure of tilting assembly 77.
In some embodiments, rocker arm 123 (which may be made from a polymer, such as but not limited to polycarbonate and/or polyoxymethylene (POM), and/or any other suitable material) has a proximal section 127 and a distal section 129 configured to rotate about hinge 124, and a spring 130 (e.g., a torsion spring).
In some embodiments, rocker arm 123 is configured to rotate about hinge 124 relative to a longitudinal axis 191 of section 121.
For example, when HMD 22 is mounted on the head of the aforementioned first surgeon, section 129 moves from its stationary state all the way in direction 128a, following the ergonomic structure of the user's head, usually slightly in direction 126a, and section 127 is moved in a direction 126b to conform with the shape of the nape of the first surgeon.
Similarly, when HMD 22 is mounted on the head of the second surgeon (having a different head shape compared to that of the first surgeon), section 129 is moved in a direction 126a and section 127 is moved in a direction 126b to conform with the shape of the rear head portion, also referred to herein as the nape, of the second surgeon. Torsion spring 130 may be configured to reverse the movement direction. In the illustrated example, torsion spring 130 moves section 127 in direction 128b for improving the clamping between rocker arm 123 and the head of the user (e.g., the second surgeon). Moreover, torsion spring 130 is configured to move section 127 so as to enable smooth placement of HMD 22 on the head of the respective surgeon.
In some embodiments, section 121 has an opening or a partial opening for containing section 127 when section 127 is rotated in direction 126b.
Reference is now made to
In the example of
Reference is now made to an inset 132 of
In some embodiments, cushion 125 is disposed over the entire surface of rocker arm 123 that is facing the head of the surgeon.
In some embodiments, cushion 125 may be molded on rocker arm 123, or may comprise a separate part attached to rocker arm 123. Cushion 125 may comprise a large variety of materials, solid or sponge-based (for example, silicone, neoprene, polyurethane, and/or other suitable materials).
In some embodiments, the sponge may comprise closed cells that do not absorb fluids (e.g., sweat), or open cells adapted to absorb fluids.
In some embodiments, cushion 125 comprises a viscoelastic foam also referred to as a “memory foam” for obtaining a soft cushion.
In some embodiments, when a weighted object is positioned on the viscoelastic foam, the foam progressively conforms to the shape of the object, and after the weight (i.e., force) is removed, the foam slowly reassumes its original shape.
In some embodiments, when using viscoelastic material in cushion 125, a human body temperature of between about 36° C. and 37° C. accelerates the conforming behavior of the memory foam described above.
In accordance with several embodiments, the soft cushion and the viscoelastic foam are adapted to generate a uniform distribution of the pressure applied to the head of the surgeon using HMD 22, thereby enabling effective clamping between rocker arm 123 and the head while retaining a feeling of comfort for the user (e.g., surgeon 26). Moreover, the clamping effect can be beneficial for safety reasons, in order to prevent HMD 22 from falling from the head during the surgical operation.
In some embodiments, HMD 22 comprises two or more DOFs obtained by tilting assemblies 111 and 77, and by using the two-section shape of rocker arm 123 and the selected materials of cushion 125. In accordance with several embodiments, increasing the number of DOFs improves the gripping and adjustments between the frame of HMD 22 (e.g., temple arms 43 and 44, and housing 29) and the contour and/or curvature of the head of the surgeon performing the surgical operation.
In some embodiments, the outer surface of cushion 125 has a suitable texture for improving the gripping to the nape of the head of surgeon 26. For example, the outer surface of cushion 125 (intended to be placed in contact with the nape) may comprise grooves shown in the example of inset 132.
In some embodiments, tilting assembly 88 is implemented in HMD 22, but may also be implemented in other sorts of HMDs, such as the other HMDs described herein. Moreover, tilting assembly 88 may be used instead of tilting assembly 77 of
Reference is now made to an inset 135 showing a side view of tilting assembly 88 in an XYZ coordinate system.
In some embodiments, tilting assembly 88 comprises a skeleton 136, which may be made from a suitable metal (e.g., aluminum 5052 H32, supplied by Aleris International Inc. (Beachwood, OH)), or another aluminum, metallic, metallic-alloy, polymeric, or elastomeric material, adapted to be shaped at least along an XY plane of the XYZ coordinate system. In some embodiments, skeleton 136 may also be shaped along the Z-axis to some extent (even though, in some implementations, this shape adjustment is not required).
In some embodiments, tilting assembly 88 comprises an upholstery 138, which is fitted over and may be coupled with skeleton 136. In some embodiments, upholstery 138 comprises an over molding cushion having similar properties and materials (e.g., viscoelastic foam) to that of cushion 125 of
In some embodiments, the metal of skeleton 136 is adapted to transform from an elastic deformation (in which the skeleton returns to its original shape in response to applying a force and performing a small deformation) to a plastic deformation (in which the skeleton undergoes a larger deformation and retains the shape obtained in response to the applied force).
For example, with reference to the general view of
Reference is now made to inset 135. In some embodiments, upholstery 138 of sections 137 and 139 comprises a suitable texture for improving the gripping to the nape of the head of surgeon 26. For example, the outer surface of upholstery 138 (intended to be placed in contact with the nape) may comprise grooves having a suitable orientation (e.g., parallel to the XY plane, and/or parallel to the XZ plane) or parallel to the longitudinal axis of the respective section 137 and 139 of temple arms 44 and 43. In some embodiments, upholstery 138 may have any other suitable texture other than grooves, which is adapted to improve the gripping between sections of the nape of surgeon 26 and sections 137 and 139 of HMD 22.
In some embodiments, nose pad 28 is implemented in HMD 22, but in some embodiments, nose pad 28 may also be implemented, mutatis mutandis, in other sorts of HMDs, including any of the HMDs disclosed herein.
Reference is now made to the head of surgeon 26 for showing the position of nose pad 28 over a nose 141 of surgeon 26.
Reference is now made to an inset 148 showing a higher magnification of nose pad 28 placed over a section of nose 141. In the example of inset 148, nose pad 28 is presented with an upholstery 151 of polymer (e.g., a viscoelastic material, such as but not limited to the viscoelastic material of cushion 125 of
In some embodiments, nose pad 28 comprises a metal-based skeleton 150, which is surrounded by upholstery 151 and may comprise similar materials and functionality to that of skeleton 136 shown in
Nose 141 has a forward section 146, whose cartilage and nerves may be sensitive to being in constant contact with nose pad 28. However, the skin on the left and right sides 147 of nose 141 is not in close proximity to the cartilage, and therefore, is less sensitive to being in contact with nose pad 28.
Reference is now made to an inset 149 showing a frontal view of nose pad 28. Note that in the general view of HMD 22, nose pad 28 and HMD 22 are shown from a rear view, which is opposite to the front view of inset 149.
In some embodiments, nose pad 28 comprises a left section 142, a right section 143, and a middle section 144 connecting between the left and right sections. In some embodiments, middle section 144 has an opening, but in some embodiments, middle section 144 may have a solid structure without an opening.
In some embodiments, nose pad 28 comprises a section 145 adapted to couple between nose pad 28 and the frame of HMD 22. In an embodiment, section 145 may have a DOF for being adjusted relative to the frame of HMD 22. The DOF may be implemented using any suitable type of tilting assembly, which may be based on a hinge, or on a vertical axis as described in detail, for example, in any of
The DOF implemented in section 145 may be operated synchronously or asynchronously with an additional DOF (also referred to herein as a vertical DOF or a pantoscopic-tilting assembly, and is described in detail in connection with
In some embodiments, section 145 may be rigidly coupled to the frame of HMD 22, without a degree of freedom for tilting relative to the frame.
In one implementation, nose pad 28 comprises skeleton 150 disposed in sections 142, 143 and 145, so as to adjust the shape of nose pad 28 to the shape of nose 141.
Reference is now made back to inset 148. In some embodiments, nose pad 28 is adapted to be shaped so that section 144 of nose pad 28 is not in direct contact with forward section 146 of nose 141, and in some embodiments, an air gap 153 buffers between section 144 and forward section 146.
In some embodiments, the surgeon may place section 144 directly over forward section 146 of the nose.
In some embodiments, section 145 may comprise an assembly configured to adjust the height (e.g., along the Z-axis) of nose pad 28 relative to the frame of HMD 22. The adjustment may be carried out in steps of predefined movement range, or may be continuous using one or more suitable assemblies implemented in nose pad 28, or in a part of the frame of HMD 22 (e.g., in housing 29), or therebetween.
In some embodiments, upholstery 151 may have a suitable texture, such as but not limited to the textures described above for cushion 125 and upholstery 138 of
In some embodiments, nose pad 28 may comprise two ball joints for sections 142 and 143, respectively, so as to provide the user with improved adjustment of the shape of nose pad 28 relative to the width of nose 141.
In some embodiments, PTA 155 may be implemented, mutatis mutandis, in HMDs 22 and 50, in HUD 700 and in any other sort of HMD assembly using embodiments described in connection with
Reference is now made to
In the example of
In the example of
Reference is now made to a frame 164 showing fixed pantoscopic tilts implemented at respective pantoscopic tilt angles of about 0° and 12° applied to the upper and lower glasses of frame 164. In the example of frame 164, the pantoscopic tilt angle is defined between a frame 169 and glasses 167 (e.g., corresponding to AR display 15 of HMD 22).
In some embodiments, in near-eye display AR-based systems, such as in HMD 22 of system 11, the pantoscopic tilt is set for aligning the second optical axis with the first optical axis. In the example of
Reference is now made to
In some embodiments, a pantoscopic tilt angle 178 of about 35° is applied to HMD 22 using PTAs 155 and 156. However, other pantoscopic tilt angles may be applied (e.g., angles between 25 degrees and 45 degrees, angles between 30 degrees and 40 degrees, angles between 33 degrees and 37 degrees, overlapping ranges thereof, or any value within the recited ranges). Reference is now made to an inset 170 showing the vertical DOF implemented in PTA 155.
In some embodiments, PTA 155 comprises a bar 172 rigidly coupled to optical engine 165, and a hinge 171 configured to rotate bar 172 relative to temple arm 44 of the frame of HMD 22. In accordance with several embodiments, the movement is a relative movement about a vertical axis between the optical engine and the frame.
In one non-limiting example, the frame of HMD 22, and more specifically, housing 29, temple arms 43 and 44, and nose pad 28, are not moved when PTA 155 is tilting OE 165 in angle 178 to the desired pantoscopic tilt angle, and the same applies to the frame when PTA 156 (described herein) is tilting OE 165 in angle 178 to the desired pantoscopic tilt angle.
In some embodiments, OE 165 may comprise an optical assembly comprising one or more cameras, one or more light sources and other components, which are all moved in accordance with the DOF implemented using PTA 155 or PTA 156.
Reference is now made to an inset 175 showing the vertical DOF implemented in PTA 156. In some embodiments, PTA 156 comprises bar 172 coupled to (e.g., molded with) a rotatable section 179 of a disc having a slit 174. PTA 156 further comprises an assembly 177, so that when tilting OE 165 relative to temple arm 44 of the frame, section 179 is moved relative to assembly 177 in a selected tangential direction 176 and locked (e.g., using a locking element 173) at a predefined position within slit 174 to fixate OE 165 at the desired pantoscopic angle relative to temple arm 44 and the other components of the frame of HMD 22. The implementation of the vertical DOF using PTA 156 is also referred to herein as a virtual axis because the components are being moved (e.g., rotated about an axis) without using a physical hinge, such as hinge 171 of PTA 155.
In some embodiments, PTA 156 may comprise any other suitable type of a virtual axis. For example, with reference to inset 110 of
In some embodiments, PTA 156 may comprise a rigid bar coupled to the frame and a flexible arm having properties similar to that of skeleton 136 shown and described in inset 135 of
These particular configurations of PTAs 155 and 156 are shown by way of example, in order to illustrate certain problems that are addressed by the example implementation of the vertical DOF for controlling the pantoscopic angle of the optical engine relative to the frame of HMD 22. Embodiments of the disclosure, however, are by no means limited to these specific example configurations and implementations, and the principles described herein may similarly be implemented in other sorts of vertical DOFs used for controlling pantoscopic tilt angles in near-eye display AR-based image-guided surgical systems (including, but not limited to, the structures described below with reference to
In some embodiments, optical engine 165 typically comprises electronic devices configured for exchanging electrical signals with processor 33 (shown in
In some embodiments, HMD 22 comprises hardware configured to exchange the signals while executing the vertical DOF using PTAs 155 and 156.
Reference is now made to insets 180 and 186 showing two example implementations of the hardware configured to exchange the signals while applying the pantoscopic tilting to OE 165. In some embodiments, a rigid PCB 181, having electronic devices mounted thereon, is disposed in the frame of HMD 22 (e.g., in housing 29 of
In the configuration of inset 186, flexible PCB 184 has openings 185 shaped as slits along an axis of flexible PCB 184, so as to enable bending of flexible PCB 184 along two or more axes, and thereby to enable the exchange of electrical signals between housing 29 and OE 165.
The configurations of the rigid and flexible PCBs of insets 180 and 186 are shown by way of example, and in some embodiments, any other suitable configuration may be used for enabling the exchange of electrical signals between housing 29 and OE 165 while performing the pantoscopic tilting, as well as when HMD 22 operates at a preset pantoscopic tilt angle.
In some embodiments, the glasses of the display (e.g., displays 49a and 49b of
Reference is now made to
A surgeon's view of a patient, or Line of Sight (LOS) to an area of interest (e.g., a surgery site on a patient's body) during a medical intervention, is typically downwards, e.g., in a vertical or oblique manner (e.g., since the patient or surgical site is located beneath the eyes of the surgeon). However, in some cases, the surgeon's view of the patient or surgical site, or the surgeon's LOS during the medical intervention, may be other than downwards, e.g., horizontal or substantially horizontal or straight ahead (e.g., in case the surgical site or area of interest is located in front or substantially in front of the surgeon's eyes). In such cases, the HMD should allow a horizontal or substantially horizontal view and/or augmented-reality view of the area of interest.
For example, in a Lateral Lumbar Interbody Fusion (LLIF) procedure, a lateral approach may be required. In such a procedure, patient 23 may be positioned on the side as shown in
Using the lateral approach, the surgeons, in one embodiment, insert the surgical tools from a lateral trajectory, and therefore, can reach the vertebrae and intervertebral discs without moving the nerves or opening up muscles in the back.
In some embodiments, the surgeons of
Reference is now made to
In some embodiments, a surgeon 99c of
As described with reference to
In some embodiments, HMDs 188 and 189 may replace, for example, any of HMD 22 (of
In some embodiments, the vertical DOF (implemented, for example, in PTAs 155, 156, and 2355) may be used, mutatis mutandis, in any other surgical or other interventional procedures. In such procedures, the surgeon or other professional may select any suitable posture for himself or herself and for the patient. Moreover, in accordance with several embodiments, even though specific procedures are typically performed while the surgeon and/or the patient are in a specific posture, the surgeon or other professional may decide to change his or her posture relative to that of the patient and/or the patient's posture during the procedure, and therefore, an adjustable pantoscopic tilt angle, as implemented for example in PTAs 155, 156, and 2355, is important for the quality of the procedure.
Example Structured Light Projector Implementation
In some embodiments, the surgeon (e.g., surgeon 26) makes an incision 202 (e.g., similar to incision 24) in the skin 208 and other tissues of the back of patient 23, so as to expose one or more vertebrae 206 of patient 23 intended to be operated. Some of the areas intended to be operated on may not be exposed or fully exposed by incision 202, depending on the medical application.
In some embodiments, SLP 200 comprises a laser dot-pattern projector configured to apply to an area 204 on the organ or body region in question (e.g., the back) of patient 23 a structured light comprising a large number (e.g., between hundreds and hundreds of thousands) of dots 210 arranged in a suitable pattern. This pattern serves as an artificial texture for identifying positions on large anatomical structures lacking fine details of their own (e.g., skin 208 and the surface of vertebrae 206, other than the edges thereof).
In some embodiments, using a pseudo random pattern of dots 210, clusters can be uniquely identified and used for disparity measurements. In some embodiments, the disparity measurements are used for calculating depth, and for enhancing the precision of the 3D imaging of area 204.
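The disparity-to-depth relation underlying such measurements can be illustrated with a minimal sketch. This is not code from the disclosed system; the focal length and baseline values below are hypothetical, and the computation simply applies the standard stereo-triangulation relation Z = f·B/d that disparity-based depth sensing relies on.

```python
# Illustrative sketch (assumption, not the disclosed implementation):
# converting the measured disparity of a projected dot cluster into depth
# using the standard triangulation relation Z = f * B / d.

def depth_from_disparity(disparity_px, focal_length_px, baseline_mm):
    """Depth (mm) of a dot cluster from its disparity (pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_mm / disparity_px

# Hypothetical parameters: 1400 px focal length, 60 mm camera baseline.
depths = [depth_from_disparity(d, focal_length_px=1400, baseline_mm=60)
          for d in (10.0, 20.0, 40.0)]
# Larger disparity corresponds to a closer surface point.
```

In a real system, the disparity of each uniquely identified dot cluster would be measured between the two camera images (or between the camera image and the projected reference pattern) before applying this relation.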
In some embodiments, the wavelength of dots 210 may be visible to a human eye (e.g., blue, green, or red color) or invisible (e.g., infrared). In accordance with several embodiments, blue dots may advantageously retain their original shape (e.g., round) and appear sharp on skin 208. In some embodiments, SLP 200 is configured to direct blue laser dots or green laser dots (depending on the quality and other parameters of the laser source and optics) to area 204.
In some embodiments, cameras 25 (e.g., RGB cameras) may be used for producing a 3D image of area 204, and based on the images received from cameras 25, processor 33 is configured to produce the 3D image of area 204.
In some embodiments, an additional depth sensing technique may be implemented in HMD 22. The technique relies on a single camera with a precisely calibrated offset relative to SLP 200. In such embodiments, based on the calibrated offset, processor 33 is configured to produce depth information without the need for stereo cameras. The depth information may be obtained by identifying the relative shift of dot clusters.
Additionally, or alternatively, system 11 may comprise a structured light projector mounted on a wall or on an arm of the operating room. In such embodiments, a calibration process between the structured light projector and the one or more cameras (e.g., cameras 25 on HMD 22, or one or more suitable cameras mounted at any suitable position of the operating room) may be required for obtaining the 3D image based on dots 210 projected on area 204.
In some embodiments, SLP 200 may apply an infrared or any other beam having an invisible wavelength (or range of wavelengths), and one or more cameras, such as camera 16 described in
The position of SLP 200 in HMD 22 is selected by way of example, and in some embodiments, SLP 200 may be mounted on HMD 22 (or on any other of the HMDs and HUD described above) at any other suitable position.
Rolling Shutter Example
In some embodiments, an image sensor of camera 16 (which may comprise an IR camera or an RGB camera configured to act as an IR camera) comprises any suitable number of pixels, for example, a 2 MP sensor comprising about 2 million pixels, or a sensor between 0.5 MP and 10 MP, between 1 MP and 5 MP, between 2 MP and 10 MP, overlapping ranges thereof, or any value within the recited ranges.
In some embodiments, each pixel has an integration time, which is a time interval in which the pixel is open for exposure.
In some embodiments, numerals 215, 216 and 217 refer to the integration times of the first pixel, the 1-millionth pixel, and the 2-millionth pixel of the image sensor, respectively. The first pixel is opened at t0 and closes at t2, defining an integration time 222 of the pixel.
In some embodiments, the duration of integration time 222 may be determined using the integral capacitance of each pixel of the image sensor. In some embodiments, at t0 the capacitor of the first pixel is opened for charging by voltage produced by the first pixel in response to sensing photons on the surface of the first pixel. At t2 the charging of the capacitor of the first pixel is stopped, and the capacitor is ready to be read to produce the IR image described in
In some embodiments, the integration time of the first pixel (e.g., between t0 and t2) is between 5 milliseconds and 15 milliseconds (e.g., between 5 milliseconds and 10 milliseconds, between 6 milliseconds and 10 milliseconds, between 8 milliseconds and 15 milliseconds, overlapping ranges thereof, about 8 milliseconds, or any value within the recited ranges). Similarly, as shown in the row having numeral 217, the integration time of the 2-millionth pixel starts at t1 (about 7 ms after t0) and lasts for a similar integration time. Note that within the integration time of the first pixel (e.g., between t0 and t2), all the pixels of the image sensor have been opened. Moreover, at a time interval 220 between t1 and t2 (e.g., about 1 ms), all the pixels of the image sensor are open at the same time.
In some embodiments, at time interval 220, processor 33 (or any other processor or controller of system 11) controls (e.g., via a driver) IR projectors 17 to direct a strobe of an IR beam, referred to herein as an IR strobe, to the area being operated in the body of patient 23. Moreover, during the same time interval 220, processor 33 (or any other processor or controller of system 11) controls camera 16 to acquire the IR image from the area being operated.
In accordance with several embodiments, camera 16 comprises a rolling shutter, which reads each pixel sequentially. In some embodiments, the rolling shutter of camera 16 is operated in a global-shutter mode by implementing a sufficiently long pixel integration time 222 and directing the IR strobe at time interval 220, in which all the pixels (e.g., 2 million pixels) of the image sensor are open.
In such embodiments, artifacts related to rolling shutters, such as but not limited to shifting of objects in the image due to the serial reading time of the pixels, are reduced or eliminated.
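The timing logic above can be sketched as follows. This is a hypothetical illustration, not device firmware: it computes the interval in which every row of a rolling-shutter sensor is simultaneously integrating, so that an IR strobe fired inside that interval emulates a global shutter. The numeric values mirror the example in the text (about 7 ms between the first and last rows opening, about 8 ms integration time).

```python
# Illustrative sketch (assumption): find the window in which all rows of a
# rolling-shutter sensor are open at once, i.e., between t1 (the last row
# opens) and t2 (the first row closes). The IR strobe is fired inside it.

def global_window(first_row_open_ms, last_row_open_ms, integration_ms):
    """Return (start, end) of the window in which all rows are open."""
    start = last_row_open_ms                   # t1: last row has opened
    end = first_row_open_ms + integration_ms   # t2: first row closes
    if end <= start:
        raise ValueError("integration time too short for a common window")
    return start, end

t1, t2 = global_window(first_row_open_ms=0.0, last_row_open_ms=7.0,
                       integration_ms=8.0)
# Yields a window of (7.0, 8.0): about 1 ms in which all pixels are open,
# corresponding to time interval 220 in the text.
```

The sketch also makes the constraint explicit: the integration time must exceed the row-open offset, otherwise no common window exists and no strobe timing can emulate a global shutter.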
Example Pairing Subsystem
In some embodiments, subsystem 230 is a communication subsystem of system 11 shown in
In some embodiments, WS 232 serves as a Wi-Fi hotspot device, and the devices of system 11 (e.g., HMDs) are typically connected to WS 232 of subsystem 230. The pairing process between the devices is referred to herein as “hopping,” and the pairing has two operational modes. The hopping and pairing are described hereinafter, and also in connection with
Reference is now made to an inset 236 showing another configuration of a communication subsystem. In the example of inset 236 the subsystem comprises: (i) WS 232, (ii) HUD 700 (and/or other suitable HMDs described herein), and (iii) a wireless router 238. Router 238 may be hacked, and for cybersecurity reasons, there may be a motivation to exclude a router (such as router 238) from the configuration of the communication subsystem.
Reference is now made back to the general view of
In some embodiments, WS 232, HMDs 22 and 50 and tablet 234 are connected wirelessly using a service set identifier (SSID), which is a sequence of characters that uniquely names a wireless local area network (WLAN) comprising WS 232, HMDs 22 and 50, tablet 234 and optionally additional devices. The SSID may be configured to allow stations of subsystem 230 to connect to the desired network when multiple independent networks operate in the same physical area. Moreover, WS 232 may be configured to generate a password, such as password 240, and the password may be sent to HMDs 22 and 50 to enable a secured connection using a key exchange process described herein. This additional security layer may be used for improving the cybersecurity of the network of subsystem 230.
In some embodiments, in the example configuration of
In some embodiments, the communication technique may comprise Wi-Fi, which is a family of network protocols, based on the IEEE 802.11 family of standards, which may be used in wireless local area networking (LAN) applications.
In some embodiments, WS 232 and HMDs 22 and 50 comprise Bluetooth (BT) adapters and the key exchange process is carried out using BT technology, which is a short-range wireless technology standard that is used for exchanging data between fixed devices (e.g., WS 232 implemented in a desktop or a laptop computer) and mobile devices (such as but not limited to HMDs 22 and 50) over short distances using ultra-high frequency (UHF) radio waves in the industrial, scientific and medical (ISM) bands (e.g., between 2.402 GHz and 2.48 GHz).
In some embodiments, WS 232 (and optionally other WSs located at the same medical center or facility) comprises an additional Wi-Fi adapter, also referred to herein as a second Wi-Fi adapter (not shown). In such embodiments, the key exchange process is carried out using a peer-to-peer (P2P) connection. In such embodiments, WS 232 uses two Wi-Fi connections: a first Wi-Fi connection for the hotspot connection, and a second Wi-Fi connection for the key exchange process using the second Wi-Fi adapter.
In some embodiments, WS 232 is configured to encode the hotspot key (e.g., password 240) into an optical code or other machine-readable code, such as a barcode or a quick response (QR) code, generated using suitable software or online tools and displayed on the display of WS 232 or on any other suitable display. In such embodiments, the HMD intended to be paired with WS 232 is configured to scan the optical code or machine-readable code (e.g., barcode or QR code) and decipher the key for performing the key exchange process and the pairing.
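The encode/scan/decipher round trip can be sketched minimally. This is a hypothetical illustration, not the disclosed implementation: the SSID and key values are invented, and the base64-encoded JSON payload simply stands in for the rendered optical code, which a real system would display as a barcode or QR image and recover by scanning.

```python
# Illustrative sketch (assumption): workstation side packs the SSID and
# hotspot key into a machine-readable payload; HMD side deciphers the
# scanned payload back into connection parameters for the key exchange.

import base64
import json

def encode_key_payload(ssid, password):
    """Workstation: pack SSID and hotspot key into a scannable payload."""
    blob = json.dumps({"ssid": ssid, "key": password}).encode("utf-8")
    return base64.b64encode(blob).decode("ascii")

def decode_key_payload(payload):
    """HMD: recover the connection parameters from the scanned payload."""
    blob = base64.b64decode(payload.encode("ascii"))
    return json.loads(blob.decode("utf-8"))

# Hypothetical names for illustration only.
payload = encode_key_payload("OR-3-WS232", "s3cret-hotspot-key")
params = decode_key_payload(payload)
# params["key"] now holds the hotspot key used in the key exchange process.
```

Encoding the key this way keeps it off the air until the HMD has physically imaged the workstation's display, which is the property that makes the optical-code path attractive for cybersecurity.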
In some embodiments, the optical code or machine-readable code (e.g., barcode or QR code) scanning may be carried out using one or more additional cameras 25 (e.g., RGB cameras), or using camera 16 configured to capture a monochromatic image instead of an IR image (e.g., RGB camera that can also function as an IR camera).
This particular configuration of subsystem 230 is shown by way of example, in order to illustrate certain problems related to connectivity and cyber security that are addressed by embodiments of the disclosure, and to demonstrate the application of these embodiments in enhancing the performance of such a communication subsystem. Embodiments of the disclosure, however, are by no means limited to this specific sort of example communication subsystem, and the principles described herein may similarly be applied to other sorts of communication subsystems used in suitable types of AR-based image-guided surgical systems.
The flow chart may also be applicable, mutatis mutandis, for connecting between WS 232 and one or more of HMDs 50, 188, and 189 of
Moreover, the method is applicable for connecting between any workstation and any suitable device configured to display information, such as but not limited to images and markers, over the organ being operated using augmented reality techniques or any suitable technique other than augmented reality.
The method begins at a scanning step 300, with processor 33 scanning for available networks when the communication device of HMD 22 is introduced into a medical center.
At a first decision step 302, the method differentiates between a first use case in which HMD 22 is known to the selected network, and a second use case in which HMD 22 is new to the selected network.
In the first use case, the method proceeds to a parameters application step 304, in which processor 33 uses a set of parameters, typically parameters used in the previous connection (also referred to herein as previous parameters) that are known based on the previous connection between HMD 22 and WS 232. Note that in this use case, the pairing process has a predefined time limit and is performed automatically, so that the method proceeds to check whether the pairing is successful (shown in a step 312 described hereinafter).
In the second use case, the method proceeds to a pairing initiation step 306, in which the workstation (e.g., WS 232) initiates the pairing process with HMD 22 for a given time interval. Note that the time intervals (also referred to herein as time limits) of steps 304 and 306 are determined by a user or an administrator of the system. Therefore, the time intervals of steps 304 and 306 may be similar to one another, or may differ from one another.
In some embodiments, the pairing process described below has a predefined time interval (e.g., about 15 seconds, about 20 seconds, about 30 seconds), also referred to herein as a time limit. The implications of the time limit are described in more detail below.
At a key exchange process step 310, the key exchange process is performed for pairing between HMD 22 and WS 232. The key exchange process may be based on Bluetooth, P2P, QR code or any other communication technique and protocols, as described in detail in connection with
At a second decision step 312, processor 33 of HMD 22 (and optionally the processor of WS 232) checks whether the pairing was performed successfully. Note that, in accordance with several embodiments, the pairing process has to be successful within the time limits described in steps 304 and 306 above for the two respective use cases.
As described above, in the first use case (e.g., HMD 22 has already been paired with the selected network), the method proceeds from step 304 directly to step 312 for checking whether or not the pairing has been successful.
In case the pairing fails and/or is not completed successfully within the predefined time limit, the method proceeds to an alternative WS pairing initiation step 314 in which a processor of another workstation (e.g., other than WS 232) associated with the selected network initiates the pairing process. Subsequently the method loops back to step 302 described above.
In case pairing is successful within the time limit, the method proceeds to a third decision step 316, in which the user of subsystem 230 (e.g., surgeon 26) and/or processor 33 check whether to move HMD 22 to another network. The decision to move HMD 22 to another network may be made based on operational considerations, clinical considerations, technical (e.g., communication) considerations, or any other suitable consideration.
In case there is no need to move HMD 22 to another network, the method proceeds to a data exchanging step 318 in which HMD 22 and WS 232 exchange data during surgical operations or between surgical operations, as described above.
In case HMD 22 is moved to another network, the method proceeds to an unpairing step 320 in which HMD 22 is unpaired from WS 232, and the connection parameters for connecting with WS 232 are deleted from the storage of HMD 22. After unpairing between HMD 22 and WS 232, the method loops back to step 314 and further to step 306 as described above.
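The pairing flow of steps 302-320 above can be sketched as a small state function. This is an illustrative sketch only, not the disclosed implementation: the function name, the `PairingOutcome` strings, the 20-second constant, and the `key_exchange` callable standing in for step 310 are all assumptions made for clarity.

```python
import time

# Predefined time limit for the pairing process (e.g., about 15, 20,
# or 30 seconds, per the embodiments described above).
PAIRING_TIME_LIMIT_S = 20

def attempt_pairing(already_paired: bool, key_exchange) -> str:
    """Return 'paired' on success within the time limit, else 'retry'.

    `key_exchange` is a placeholder callable for step 310 (Bluetooth,
    P2P, QR code, or another technique); it returns True on success.
    """
    deadline = time.monotonic() + PAIRING_TIME_LIMIT_S
    if already_paired:
        # First use case: HMD already paired with the selected network,
        # so the method proceeds from step 304 directly to step 312.
        success = True
    else:
        # Second use case: perform the key exchange of step 310.
        success = key_exchange()
    # Step 312: pairing must be successful within the time limit.
    if success and time.monotonic() <= deadline:
        return "paired"   # proceed to step 316 and, if staying, step 318
    return "retry"        # step 314: another workstation re-initiates
```

On a `"retry"` outcome, a caller would loop back to network selection (step 302), mirroring the flow described above.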
The method of
The method is simplified for the sake of conceptual clarity and relies on the description of the hardware of
Example Electronic Subsystem
In some embodiments, ES 444 may be implemented, mutatis mutandis, in any of the HMDs and HUD described above or below.
In some embodiments, ES 444 comprises a carrier board (CB) 400 made from a suitable PCB or any other suitable substrate having traces for exchanging signals between components described herein.
In some embodiments, ES 444 comprises a battery pack, referred to herein as a battery 420, or any other suitable power source configured to supply electrical power to ES 444. As described above, ES 444 may comprise a supercapacitor or ultracapacitor (not shown) connected to CB 400 in parallel with battery 420 and configured to be used for eliminating lengthy boot-up when changing battery 420 of HMD 22 (or any other HMD or HUD described above or below).
In some embodiments, ES 444 comprises processor 33, wireless communication device 45 (e.g., a Wi-Fi-6 transceiver connected to a Wi-Fi-6 antenna 406), and storage device 46, which are mounted on CB 400.
In some embodiments, ES 444 comprises a system-on-chip (SOC) device or a system-on-module (SOM) device comprising processor 33, wireless communication device 45, storage device 46, a graphic processing unit (GPU) (not shown), an artificial intelligence (AI) accelerator (not shown), image signal processors (ISPs) and/or other components. For example but not by way of limitation, the SOC device may comprise any suitable SOC device selected from the Snapdragon family produced by Qualcomm (San Diego, CA).
In some embodiments, ES 444 comprises controllers 417 configured to control and drive IR LED projectors 17 described in
In some embodiments, ES 444 comprises backlight driver 58, which is described with reference to
In some embodiments, ES 444 comprises a microphone assembly (MA) 78 comprising a microphone and electronic circuitry thereof, and a speaker assembly (SA) 79 comprising a speaker and electronic circuitry thereof. MA 78 and SA 79 are mounted on CB 400.
In some embodiments, ES 444 comprises a bus 98 configured to conduct power signals and data signals between the aforementioned devices mounted on CB 400, and also to conduct power signals and data signals between CB 400 and external entities described herein.
In some embodiments, ES 444 comprises an interface 402 configured to exchange power signals and data signals between CB 400 and: (i) camera 16, (ii) HA 60 (or any other headlight assembly described above), (iii) SLP 200, and (iv) additional cameras 25 (e.g., RGB cameras).
In some embodiments, interface 402 comprises a Camera Serial Interface (CSI), which is a specification of the Mobile Industry Processor Interface (MIPI) Alliance, referred to herein as MIPI CSI, configured to conduct, between (i) cameras 16 and 25 and (ii) CB 400, the signals for producing the IR images and the RGB images, respectively.
In some embodiments, ES 444 comprises an interface 404 configured to output signals indicative of the AR images (described above) between CB 400 and the displays of the HMD. In some embodiments, the displays comprise display 15 associated with OE 42 (shown and described in
In some embodiments, interface 404 comprises a MIPI Display Serial Interface (MIPI DSI), which is a high-speed interface that is used in various types of consumer devices.
In some embodiments, ES 444 is configured to exchange video signals with external entities. In some embodiments, ES 444 is configured to transmit video of the scene captured by RGB cameras 25 together with the rendered augmented reality images.
In some embodiments, the transmission may be carried out using Wi-Fi-6 (IEEE 802.11ax, a standard for wireless local-area networks) for obtaining low-latency and high-speed transmission of the signals.
The transmission technique is not limited to Wi-Fi-6 and may also be carried out over a fifth-generation (5G) cellular network or other communications networks.
In some embodiments, these communication techniques may be used for operating room staff to observe exactly what the surgeon (e.g., surgeon 26) sees, and for training and/or telesurgery applications.
In some embodiments, surgeon 26 performs the surgery or other medical intervention in the operating room shown in
In some embodiments, the signals of video output 415 may be recorded for documenting the surgical or other interventional procedure in medical records.
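One plausible way to transmit the camera video "together with" the rendered AR images, as described above, is to composite the AR render onto each scene frame before encoding. The sketch below shows only the compositing step; the array shapes, the RGBA overlay convention, and the `blend` function are illustrative assumptions and the disclosure does not specify whether the streams are composited or sent separately.

```python
import numpy as np

def blend(scene: np.ndarray, overlay_rgba: np.ndarray) -> np.ndarray:
    """Alpha-blend an RGBA AR overlay onto an RGB scene frame.

    `scene` is an (H, W, 3) uint8 frame from an RGB camera; `overlay_rgba`
    is an (H, W, 4) uint8 rendered AR image with an alpha channel.
    """
    rgb = overlay_rgba[..., :3].astype(np.float32)
    alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0  # (H, W, 1)
    # Standard "over" compositing: overlay where opaque, scene where clear.
    out = alpha * rgb + (1.0 - alpha) * scene.astype(np.float32)
    return out.astype(np.uint8)
```

The blended frames would then be handed to a video encoder and sent over the low-latency link (e.g., Wi-Fi-6) to remote station 412 or recorded as video output 415.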
In some embodiments, ES 444 is configured to receive and display input signals of video, and more specifically of high-resolution (HR) video images also referred to herein as video input 411, received from an external source 408 having a suitable communication assembly 410 (e.g., Wi-Fi-6) configured to receive signals of video input 411.
In some embodiments, ES 444 is configured to receive input video 411 from a digital surgical microscope, and to display the received video signals using the aforementioned digital loupe techniques that are described, for example, in U.S. Provisional Patent Application 63/234,272, and in PCT Publication No. WO2023/021450, the disclosures of both of which are incorporated herein by reference. Note that in this use case, displaying such HR images received from the digital surgical microscope may provide surgeon 26 with HR images rendered in augmented reality.
In some embodiments, ES 444 is configured to receive and display input signals of input video 411 comprising endoscopic video received from an endoscopy system. This use case allows surgeon 26 to perform various types of endoscopic or laparoscopic surgical procedures without diverting the gaze to a remote monitor.
In some embodiments, external source 408 may comprise any other suitable video source(s) configured to produce any suitable type of HR video images. In such embodiments, processor 33 is configured to receive video input 411 comprising these HR video images, and to display the images over displays 15 (or any other suitable type of display) using the AR techniques described above or any other presentation technique. Moreover, processor 33 is configured to present any other types of patient information received from external source 408.
In some embodiments, the GPU and the AI accelerator of ES 444 may be used together with processor 33 for controlling system 11 (of
This particular configuration of ES 444, external source 408 and remote station 412 is shown by way of example, in order to illustrate certain problems that are addressed by embodiments of the disclosure and to demonstrate the application of these embodiments in enhancing the performance of system 11 and similar types of image-guided surgical systems. Embodiments of the disclosure, however, are by no means limited to this specific sort of example electronic architecture, and the principles described herein may similarly be applied to other sorts of HMDs and hardware used in suitable types of AR-based image-guided surgical systems and other sorts of image-guided surgical systems. Although medical applications are well-suited for several embodiments, non-medical applications also benefit from many embodiments described herein. For example, non-medical applications may involve consumer or commercial applications such as athletics and fitness, gaming, driving, product design, navigation, manufacturing, logistics, shopping and commerce, educational training, remote collaboration, etc.
Additional Example Head-Mounted Display
Some benefits of the design of the head-mounted display 2122 are related to ergonomics, comfort, and/or the ability to enable a user, such as surgeon 26, to utilize the system for relatively long periods of time, such as for four hours or more, without unnecessary fatigue and/or other negative consequences. For example, in some embodiments, a head-mounted display like the head-mounted display 22 of
The head-mounted display 2122 combines certain features of other head-mounted displays disclosed herein, including the head-mounted display 22, head-mounted display 50, and head-mounted display 700. For example, as discussed in greater detail below, the head-mounted display 2122 includes left and right temple housings that have some similarities to the left and right temple arms of head-mounted display 22, but that are not in contact with and/or being supported by the wearer's temples or ears. As another example, as discussed in greater detail below, the head-mounted display 2122 includes a rear pad and adjustable strap mechanism that can be similar to those used in the head-mounted display 700. Additionally, some embodiments can include an optional upper or top strap that can further distribute weight over the top of a wearer's head.
Another advantage of a design that distributes weight to less sensitive areas of a wearer's head, such as the design of the head-mounted display 2122, is that additional weight may be added to the head-mounted display without significantly increasing the pressure on the wearer's head in any particular spot or in a sensitive area. For example, a flashlight assembly may be attached to the head-mounted display without significantly increasing such pressure.
Other similarities of the head-mounted display 2122 to other head-mounted displays disclosed herein include, for example, an optics housing 704 that comprises one or more cameras 708 and infrared projectors 716 (see
One difference in the head-mounted display 2122 from the head-mounted display 22 of
With reference to
With further reference to
With continued reference to
The head strap 740 also desirably comprises a pad or cushion 2170 attached to the adjustment mechanism 2158 that can engage the back of the user's head. Further, the head strap 740 in this embodiment also comprises a front slot 2171 and a rear slot 2172 that can be used to attach an optional upper or top strap, as described in more detail below with reference to
With reference to
With reference to
With continued reference to
In order to assist with the adjustability of the head-mounted display 2122 and its head strap 740, the left and right temple housings 2143, 2144 can also desirably be movably coupled to a portion of the head strap 740. For example, in this embodiment, the left temple housing 2143 is slidably coupled to the left side strap 2152, and the right temple housing 2144 is slidably coupled to the right side strap 2155. More specifically, each of the temple housings 2143, 2144 desirably comprises or consists essentially of a follower 2185 that is slidable forward and backward, and that is affixed to the left or right side strap 2152, 2155 by one or more fasteners, such as fastener 2186. Further details of this structure are described below with reference to
With reference to
This particular configuration of HMD 2122 is shown by way of example, in order to illustrate certain problems that are addressed by certain embodiments and to demonstrate the application of these embodiments in enhancing the performance of such a system. Embodiments of the disclosure, however, are by no means limited to this specific sort of example HMD configuration, and the principles described herein may similarly be applied to other sorts of HMDs and HUDs used in any suitable types of near-eye display AR-based image-guided surgical systems. HMD 2122 may be used in non-medical applications. For example, non-medical applications may involve consumer or commercial applications such as athletics and fitness, gaming, driving, product design, navigation, manufacturing, logistics, shopping and commerce, educational training, remote collaboration, etc.
Example Adjustment Mechanism Details
As discussed above, the head-mounted display 2122 can desirably be configured to be usable by users (e.g., surgeons 26) having a variety of head sizes and/or shapes. Accordingly, the head strap 740 can desirably be adjustable, such as to adjust a circumferential size of the head strap 740 to accommodate heads of the users having various shapes and sizes. In order to make the head-mounted display 2122 more comfortable for a user and to better accommodate users having differently sized and/or shaped heads, in addition to adjusting the circumferential size of the head strap 740, the left and right temple housings 2143, 2144 can desirably be movably or pivotably coupled to a frame 2150 (as discussed above with reference to
In order to maintain the circumferential size of the head strap in a particular configuration, the adjustment mechanism 2158 further comprises a tension mechanism (e.g., stop mechanism) 2215. The tension mechanism 2215 is configured to maintain the knob 744 in a particular position until, for example, a user overcomes a threshold force of the tension mechanism 2215 in order to rotate the knob 744. For example, the tension mechanism 2215 shown in
It should be noted that this is merely one example of how an adjustment mechanism can work, and various other ways of adjusting the size of the head strap 740 may be used. For example, the left and right side straps 2152, 2155 may be adjusted and then held in place with respect to each other using a different type of geartrain or gearing mechanism, friction, and/or the like.
Turning to
Turning to
It should be noted that this is merely one example way of movably or slidably mounting a temple housing to a side strap, and other techniques may also be used. Desirably, the temple housings are supported by the side straps in a generally vertical direction while being free to move in a generally horizontal direction with respect to the side straps. Such a configuration can lead to adequate vertical support of the temple housings by the head strap 740 while allowing adjustments for a comfortable and snug fit on the user's head.
Returning to
The PTA 2355 comprises or consists essentially of a virtual hinge or axis (represented by point 2361), created by arc-shaped slot 2374 as a radial cam, and rotatable section or guide member 2379 as a follower, about which optical engine housing 2104 and housing 704 can pivot (e.g., tilt, rotate, move, or slide) with respect to the frame 2150, thus causing adjustment to the pantoscopic tilt angle 2168. The center of the virtual hinge or axis 2361 is desirably located at the center of the human eyeball (e.g., eyeball 158 of
Various modifications to the pantoscopic tilting assembly 2355 may be made. For example, a spring-loaded pin may be used instead of a spring-loaded ball. As another example, the spring-loaded pin or ball may be spring-loaded by a spring that is directly engaged with and/or in line with the ball instead of engaging a lever arm (e.g., arm 2376) that in turn engages the ball. As another example, friction may be used to maintain the optical engine housing 2104 and optical housing 704 at a particular angle with respect to the frame 2150, instead of or in addition to a detent mechanism. Further, the described mechanism (e.g., PTA 2355) is only one example, which in this embodiment implements a virtual axis 2361 with five detents 2375, providing a range of 20° of adjustability (e.g., from 15° to 35° horizontal tilt), although various embodiments may include other ranges of adjustability, as discussed above with reference to
Example Additional Head Strap Features
As discussed above with reference to the head-mounted display 2122 shown in
It can be desirable to have an optional detachable top strap, such as top strap 2410, because some users may prefer to use the head-mounted display 2122 without a top strap, and some users may prefer to use the head-mounted display 2122 with a top strap. That said, some embodiments may include a permanently attached top strap, and some embodiments may not have an option to attach a top strap. It can also be desirable to have the top strap 2410 be adjustable, such as by adjusting a length of the top strap 2410. One way to accomplish this is to allow the amount of the strap at the first and/or second ends 2411, 2412 that is passed through the corresponding slot and folded back on itself to be varied. An additional or alternative way to accomplish this is to include an adjuster in the strap, such as using a buckle, hook and loop fasteners (see, e.g., the description below related to
As discussed above,
Turning to
Additional Example Detachable Lighting Systems
As discussed above, various embodiments of head-mounted displays may include a permanently attached or detachable headlight or lighting system that includes a flashlight, such as to illuminate the area in which a surgical procedure is being performed. Various examples of such lighting systems are described above with reference to, for example, the embodiments of
Turning to
Turning to
Turning to
As can further be seen in
With continued reference to
With reference to
Example Optics Considerations and Clip-On Lens Assemblies
Various users of the head-mounted displays disclosed herein (e.g., surgeon 26) may require prescription lenses for vision correction. For example, at least some of such users may typically wear prescription eyeglasses to correct their vision. Some of the head-mounted displays disclosed herein may be able to fit over such prescription eyeglasses, but such a configuration may not be ideal. As an alternative to a surgeon needing to wear both prescription eyeglasses and a head-mounted display as disclosed herein, various embodiments of the head-mounted displays disclosed herein may be configured to have prescription lenses coupled thereto in order to allow the surgeon to use the head-mounted display with clear vision without needing to wear separate corrective devices, such as eyeglasses, contact lenses, and/or the like.
In some embodiments, the posterior lens 2912 is by default (e.g., before consideration for a personal prescription compensation) shaped to provide a particular diopter to achieve a focus for an AR image at a particular operational distance. For example, in some embodiments, the posterior lens 2912 is by default shaped to provide −2D of compensation to achieve a focus at a 0.5 m (e.g., 50 cm plus or minus 20 cm) operational distance. The anterior lens 2902 may be shaped to compensate for the above-described effect of the posterior lens 2912 (e.g., to reduce or eliminate the effect of the posterior lens 2912 on the view of reality through the display 49a). For example, with the default −2D posterior lens 2912, the anterior lens 2902 may, for example, be shaped to provide a +2D compensation. The considerations for focal distance in the disclosed systems can be different than for normal eyeglasses or other augmented reality systems (such as consumer-focused augmented reality systems). For example, normal eyeglasses or other augmented reality systems (such as consumer-focused augmented reality systems) may be configured to achieve focus at a distance of approximately 3-4 m or greater. When using systems as disclosed herein with surgical or other medical procedures, however, the desired operational or focal distance may be significantly lower, such as approximately 50 cm, within a range of approximately 30 cm to 70 cm, and/or the like. For example, the wearer may be viewing a treatment or diagnostic site from a relatively close range from a standing or sitting position adjacent a patient.
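The −2D/+2D values above are consistent with the thin-lens relation between optical power P (in diopters) and focal distance f (in meters); a short check, assuming the stated 0.5 m operational distance:

```latex
P = \frac{1}{f} = \frac{1}{0.5\,\mathrm{m}} = 2\,\mathrm{D}
```

A posterior lens of −2D thus places the virtual AR image at approximately 0.5 m from the eye, and the anterior +2D lens cancels that power for the see-through path, leaving the view of the real scene nominally unaffected.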
That said, the systems disclosed herein may also be used in other applications (e.g., athletics and fitness, gaming, driving, product design, navigation, manufacturing, logistics, shopping and commerce, educational training, remote collaboration, etc.), which each may have longer or shorter desired focal distances, and the diopter compensations in the anterior and posterior lenses may be adjusted accordingly to account for such different focal distances.
With continued reference to
The frame 2911 of the clip-on lens assembly 2910 further comprises or consists essentially of two protrusions 2918 protruding from a top of the frame 2911. The protrusions 2918 are shaped to fit into corresponding recesses 2919 of frame 41 of the head-mounted display 2922 (see
Various other mechanical methods of removably attaching a posterior lens assembly to the frame 41 may also or alternatively be used. For example, more clips 2914 and/or protrusions 2918 may be used, the clips 2914 and/or protrusions 2918 may be replaced by and/or supplemented by magnets, components that form a friction fit, adhesives, screws, other fasteners, and/or the like.
Turning now to
With continued reference to
In this embodiment, the posterior lens 2912 is desirably affixed to the frame 2911 of the clip-on lens assembly 2910. For example, as can be seen in
The design discussed above and shown in
The frame 2911 of the clip-on lens assembly 2910 may be manufactured from a variety of materials. In some embodiments, it may be desirable to manufacture the frame 2911 from PEEK (polyetheretherketone), which may, for example, have a relatively desirable weight to strength ratio and may be suitable for use with a variety of cleaning procedures. In some embodiments, the weight of the frame 2911 may be approximately or no greater than 4 g. In some embodiments, other materials may be used for the frame 2911, such as polycarbonate, which may be a more efficient material to use in some cases.
Turning to
Turning to
One difference in the clip-on lens assembly 3110 of head-mounted display 2122 is that the outer profile of the posterior lenses 2912 and the frame 2911 (e.g., posterior lens frame) that the lenses fit within is shaped differently. Specifically, the outer profile comprises a stepped shape in order to provide clearance for portions 3150 of the housing 2104 of optical engine 55 which are shaped somewhat differently than the corresponding portions of head-mounted display 2922. Another difference is that the configurations of protrusions 2918 and recesses 2919 are different. Specifically, the protrusions 2918 are part of housing 2104, and the recesses 2919 are part of the frame 2911, which is the opposite of the configuration discussed above with reference to
Finally, turning to
Another difference in the embodiment of
While certain examples of usage of the disclosed embodiments are given with respect to body portions containing spine vertebrae, the principles disclosed may also be used with respect to other bones and/or body portions than spine, including hip bones, pelvic bones, leg bones, arm bones, ankle bones, foot bones, shoulder bones, cranial bones, oral and maxillofacial bones, sacroiliac joints, etc.
The disclosed embodiments are presented with relation to image-guided surgery systems or methods, in general; accordingly, the disclosed systems and devices should not be considered limited only to surgery or medical applications but are applicable to non-medical applications as well. For example, the disclosed embodiments are applicable to consumer or commercial applications such as athletics and fitness, gaming, driving, product design, navigation, manufacturing, logistics, shopping and commerce, educational training, remote collaboration, etc.
The terms “top,” “bottom,” “first,” “second,” “upper,” “lower,” “height,” “width,” “length,” “end,” “side,” “horizontal,” “vertical,” and similar terms may be used herein; it should be understood that these terms have reference only to the structures shown in the figures and are utilized only to facilitate describing embodiments of the disclosure. Various embodiments of the disclosure have been presented in a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the disclosure. The ranges disclosed herein encompass any and all overlap, sub-ranges, and combinations thereof, as well as individual numerical values within that range. For example, description of a range such as from about 25 to about 45 degrees should be considered to have specifically disclosed subranges such as from 25 to 35 degrees, from 30 to 40 degrees, from 35 to 45 degrees etc., as well as individual numbers within that range (for example, 25, 30, 35, 40, 45, 32, 30.5 and any whole and partial increments therebetween). Language such as “up to,” “at least,” “greater than,” “less than,” “between,” and the like includes the number recited. Numbers preceded by a term such as “about” or “approximately” include the recited numbers. For example, “approximately 260 g” includes “260 g.” The terms “approximately”, “about”, and “substantially” as used herein represent an amount close to the stated amount that still performs a desired function or achieves a desired result.
Some embodiments comprise various features that are presented as single features (as opposed to multiple features). For example, in one embodiment, a system includes a single HMD, a single camera, a single processor, a single display, a single flashlight, a single PTA, a single PTA detent mechanism, a single head strap adjustment knob, etc. Multiple features or components are provided in alternate embodiments.
In some embodiments, the systems disclosed herein comprise one or more of the following: means for tilting (e.g., a hinge, a virtual hinge, an arc-shaped slot, detents, a strap configured to bend), means for adjusting (e.g., a knob, a rack and pinion), means for imaging (e.g., a camera or fluoroscope or MRI machine or CT machine), means for calibration (e.g., calibration jigs), means for registration (e.g., adapters, markers, objects, cameras), means for biasing (e.g., springs), means for fastening (e.g., anchors, adhesives, clamps, pins), means for segmentation (e.g., one or more neural networks), etc.
The processors described herein may include one or more central processing units (CPUs) or processors or microprocessors. The processors may be communicatively coupled to one or more memory units, such as random-access memory (RAM) for temporary storage of information, one or more read only memory (ROM) for permanent storage of information, and one or more mass storage devices, such as a hard drive, diskette, solid state drive, or optical media storage device. The processors (or memory units communicatively coupled thereto) may include modules comprising program instructions or algorithm steps configured for execution by the processors to perform any or all of the processes or algorithms discussed herein. The processors may be communicatively coupled to external devices (e.g., display devices, data storage devices, databases, servers, etc.) over a network via a network communications interface.
In general, the algorithms or processes described herein can be implemented by logic embodied in hardware or firmware, or by a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Python, Java, Lua, C, C#, or C++. A software module or product may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, or any other tangible medium. Such software code may be stored, partially or fully, on a memory device of the executing computing device, such as the processing system 31, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules described herein are preferably implemented as software modules but may be represented in hardware or firmware. Generally, any modules or programs or flowcharts described herein may refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.
The various features and processes described above may be used independently of one another or may be combined in various ways. All possible combinations and sub-combinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks or steps may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks, steps, or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks, steps, or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks, steps, or states may be performed in serial, in parallel, or in some other manner. Blocks, steps, or states may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.
Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process.
It will be appreciated that the systems and methods of the disclosure each have several innovative aspects, no single one of which is solely responsible or required for the desirable attributes disclosed herein. The various features and processes described above may be used independently of one another or may be combined in various ways. The section headings used herein are merely provided to enhance readability and are not intended to limit the scope of the embodiments disclosed in a particular section to the features or elements disclosed in that section.
Certain features that are described in this specification in the context of separate embodiments also may be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment also may be implemented in multiple embodiments separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination may in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination. No single feature or group of features is necessary or indispensable to each and every embodiment.
Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.
The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. In addition, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. In addition, the articles “a,” “an,” and “the” as used in this application and the appended claims are to be construed to mean “one or more” or “at least one” unless specified otherwise.
This application is a continuation of U.S. patent application Ser. No. 18/398,837, filed Dec. 28, 2023, titled “ADJUSTABLE AUGMENTED REALITY EYEWEAR FOR IMAGE-GUIDED MEDICAL INTERVENTION,” which is a continuation of International PCT Application PCT/IB2023/059049, filed Sep. 12, 2023, titled “AUGMENTED REALITY EYEWEAR FOR IMAGE-GUIDED MEDICAL INTERVENTION,” which claims priority to U.S. Provisional Application No. 63/405,901, filed Sep. 13, 2022, titled “AUGMENTED REALITY EYEWEAR FOR IMAGE-GUIDED MEDICAL INTERVENTION.” The disclosure of each of the foregoing applications is incorporated herein by reference in its entirety for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
3101715 | Glassman | Aug 1963 | A |
3690776 | Zaporoshan | Sep 1972 | A |
4459358 | Berke | Jul 1984 | A |
4711512 | Upatnieks | Dec 1987 | A |
4863238 | Brewster | Sep 1989 | A |
4944739 | Torre | Jul 1990 | A |
5147365 | Whitlock et al. | Sep 1992 | A |
5357292 | Wiedner | Oct 1994 | A |
5441042 | Putman | Aug 1995 | A |
5442146 | Bell et al. | Aug 1995 | A |
5510832 | Garcia | Apr 1996 | A |
D370309 | Stucky | May 1996 | S |
5636255 | Ellis | Jun 1997 | A |
5665092 | Mangiardi et al. | Sep 1997 | A |
5771121 | Hentschke | Jun 1998 | A |
5792046 | Dobrovolny | Aug 1998 | A |
5841507 | Barnes | Nov 1998 | A |
6006126 | Cosman | Dec 1999 | A |
6038467 | De Bliek et al. | Mar 2000 | A |
6125164 | Murphy et al. | Sep 2000 | A |
6147805 | Fergason | Nov 2000 | A |
6227667 | Halldorsson et al. | May 2001 | B1 |
6256529 | Holupka et al. | Jul 2001 | B1 |
6285505 | Melville et al. | Sep 2001 | B1 |
6314310 | Ben-Haim et al. | Nov 2001 | B1 |
6349001 | Spitzer | Feb 2002 | B1 |
6444192 | Mattrey | Sep 2002 | B1 |
6447503 | Wynne et al. | Sep 2002 | B1 |
6449090 | Omar et al. | Sep 2002 | B1 |
6456405 | Horikoshi et al. | Sep 2002 | B2 |
6456868 | Saito et al. | Sep 2002 | B2 |
6474159 | Foxlin et al. | Nov 2002 | B1 |
6518939 | Kikuchi | Feb 2003 | B1 |
6527777 | Justin | Mar 2003 | B2 |
6529331 | Massof et al. | Mar 2003 | B2 |
6549645 | Oikawa et al. | Apr 2003 | B1 |
6578962 | Amir et al. | Jun 2003 | B1 |
6609022 | Vilsmeier et al. | Aug 2003 | B2 |
6610009 | Person | Aug 2003 | B2 |
D480476 | Martinson et al. | Oct 2003 | S |
6659611 | Amir et al. | Dec 2003 | B2 |
6675040 | Cosman | Jan 2004 | B1 |
6683584 | Ronzani et al. | Jan 2004 | B2 |
6690964 | Bieger et al. | Feb 2004 | B2 |
6714810 | Grzeszczuk et al. | Mar 2004 | B2 |
6737425 | Yamamoto et al. | May 2004 | B1 |
6740882 | Weinberg | May 2004 | B2 |
6757068 | Foxlin | Jun 2004 | B2 |
6759200 | Stanton, Jr. | Jul 2004 | B1 |
6847336 | Lemelson et al. | Jan 2005 | B1 |
6856324 | Sauer et al. | Feb 2005 | B2 |
6856826 | Seeley et al. | Feb 2005 | B2 |
6891518 | Sauer et al. | May 2005 | B2 |
6900777 | Hebert et al. | May 2005 | B1 |
6919867 | Sauer | Jul 2005 | B2 |
6921167 | Nagata | Jul 2005 | B2 |
6966668 | Cugini et al. | Nov 2005 | B2 |
6980849 | Sasso | Dec 2005 | B2 |
6993374 | Sasso | Jan 2006 | B2 |
6997552 | Hung | Feb 2006 | B1 |
6999239 | Martins et al. | Feb 2006 | B1 |
7000262 | Bielefeld | Feb 2006 | B2 |
7035371 | Boese et al. | Apr 2006 | B2 |
7043961 | Pandey et al. | May 2006 | B2 |
7072435 | Metz et al. | Jul 2006 | B2 |
7103233 | Stearns | Sep 2006 | B2 |
7107091 | Jutras et al. | Sep 2006 | B2 |
7112656 | Desnoyers et al. | Sep 2006 | B2 |
7141812 | Appleby et al. | Nov 2006 | B2 |
7157459 | Ohta et al. | Jan 2007 | B2 |
7169785 | Timmer et al. | Jan 2007 | B2 |
7171255 | Holupka et al. | Jan 2007 | B2 |
7176936 | Sauer et al. | Feb 2007 | B2 |
7187792 | Fu et al. | Mar 2007 | B2 |
7190331 | Genc et al. | Mar 2007 | B2 |
7194295 | Stefan | Mar 2007 | B2 |
7215322 | Genc et al. | May 2007 | B2 |
7229078 | Lechot | Jun 2007 | B2 |
7231076 | Fu et al. | Jun 2007 | B2 |
7235076 | Pacheco | Jun 2007 | B2 |
7239330 | Sauer et al. | Jul 2007 | B2 |
7241292 | Hooven | Jul 2007 | B2 |
7259266 | Carter et al. | Aug 2007 | B2 |
7260426 | Schweikard et al. | Aug 2007 | B2 |
7269192 | Hayashi | Sep 2007 | B2 |
7281826 | Huang | Oct 2007 | B2 |
7315636 | Kuduvalli | Jan 2008 | B2 |
7320556 | Vagn-Erik | Jan 2008 | B2 |
7330578 | Wang et al. | Feb 2008 | B2 |
7359535 | Salla et al. | Apr 2008 | B2 |
7364314 | Nilsen et al. | Apr 2008 | B2 |
7366934 | Narayan et al. | Apr 2008 | B1 |
7379077 | Bani-Hashemi et al. | May 2008 | B2 |
7431453 | Hogan | Oct 2008 | B2 |
7435219 | Kim | Oct 2008 | B2 |
7450743 | Sundar et al. | Nov 2008 | B2 |
7458977 | McGinley et al. | Dec 2008 | B2 |
7462852 | Appleby et al. | Dec 2008 | B2 |
7493153 | Ahmed et al. | Feb 2009 | B2 |
7505617 | Fu et al. | Mar 2009 | B2 |
7507968 | Wollenweber et al. | Mar 2009 | B2 |
7518136 | Appleby et al. | Apr 2009 | B2 |
7525735 | Sottilare et al. | Apr 2009 | B2 |
D592691 | Chang | May 2009 | S |
D592692 | Chang | May 2009 | S |
D592693 | Chang | May 2009 | S |
7536216 | Geiger et al. | May 2009 | B2 |
7542791 | Mire et al. | Jun 2009 | B2 |
7556428 | Sukovic et al. | Jul 2009 | B2 |
7557824 | Holliman | Jul 2009 | B2 |
7563228 | Ma et al. | Jul 2009 | B2 |
7567834 | Clayton et al. | Jul 2009 | B2 |
7570791 | Frank et al. | Aug 2009 | B2 |
7586686 | Hall | Sep 2009 | B1 |
D602620 | Cristoforo | Oct 2009 | S |
7605826 | Sauer | Oct 2009 | B2 |
7606613 | Simon et al. | Oct 2009 | B2 |
7607775 | Hermanson et al. | Oct 2009 | B2 |
7620223 | Xu et al. | Nov 2009 | B2 |
7623902 | Pacheco | Nov 2009 | B2 |
7627085 | Boyden et al. | Dec 2009 | B2 |
7630753 | Simon et al. | Dec 2009 | B2 |
7633501 | Wood et al. | Dec 2009 | B2 |
7645050 | Wilt et al. | Jan 2010 | B2 |
7653226 | Guhring et al. | Jan 2010 | B2 |
7657075 | Viswanathan | Feb 2010 | B2 |
7689019 | Boese et al. | Mar 2010 | B2 |
7689042 | Brunner et al. | Mar 2010 | B2 |
7689320 | Prisco et al. | Mar 2010 | B2 |
7699486 | Beiner | Apr 2010 | B1 |
7699793 | Goette et al. | Apr 2010 | B2 |
7719769 | Sugihara et al. | May 2010 | B2 |
D617825 | Chang | Jun 2010 | S |
7734327 | Colquhoun | Jun 2010 | B2 |
D619285 | Cristoforo | Jul 2010 | S |
7751865 | Jascob et al. | Jul 2010 | B2 |
7758204 | Klipstein et al. | Jul 2010 | B2 |
7768702 | Hirose et al. | Aug 2010 | B2 |
7769236 | Fiala | Aug 2010 | B2 |
7773074 | Arenson et al. | Aug 2010 | B2 |
7774044 | Sauer et al. | Aug 2010 | B2 |
7822483 | Stone et al. | Oct 2010 | B2 |
D628307 | Krause-Bonte | Nov 2010 | S |
7826902 | Stone et al. | Nov 2010 | B2 |
7831073 | Fu et al. | Nov 2010 | B2 |
7831096 | Williamson, Jr. | Nov 2010 | B2 |
7835778 | Foley et al. | Nov 2010 | B2 |
7835784 | Mire et al. | Nov 2010 | B2 |
7837987 | Shi et al. | Nov 2010 | B2 |
7840093 | Fu et al. | Nov 2010 | B2 |
7840253 | Tremblay et al. | Nov 2010 | B2 |
7840256 | Lakin et al. | Nov 2010 | B2 |
7853305 | Simon et al. | Dec 2010 | B2 |
7854705 | Pawluczyk et al. | Dec 2010 | B2 |
7857271 | Lees | Dec 2010 | B2 |
7860282 | Boese et al. | Dec 2010 | B2 |
D630766 | Harbin | Jan 2011 | S |
7865269 | Prisco et al. | Jan 2011 | B2 |
7874686 | Rossner et al. | Jan 2011 | B2 |
7881770 | Melkent et al. | Feb 2011 | B2 |
7893413 | Appleby et al. | Feb 2011 | B1 |
7894649 | Fu et al. | Feb 2011 | B2 |
7920162 | Masini et al. | Apr 2011 | B2 |
7922391 | Essenreiter et al. | Apr 2011 | B2 |
7938553 | Beiner | May 2011 | B1 |
7945310 | Gattani et al. | May 2011 | B2 |
7953471 | Clayton et al. | May 2011 | B2 |
7969383 | Eberl et al. | Jun 2011 | B2 |
7974677 | Mire et al. | Jul 2011 | B2 |
7985756 | Barlow et al. | Jul 2011 | B2 |
7991557 | Liew et al. | Aug 2011 | B2 |
7993353 | Roner et al. | Aug 2011 | B2 |
7996064 | Simon et al. | Aug 2011 | B2 |
8004524 | Deinzer | Aug 2011 | B2 |
8021300 | Ma et al. | Sep 2011 | B2 |
8022984 | Cheong et al. | Sep 2011 | B2 |
8045266 | Nakamura | Oct 2011 | B2 |
8060181 | Rodriguez et al. | Nov 2011 | B2 |
8068581 | Boese et al. | Nov 2011 | B2 |
8068896 | Daghighian et al. | Nov 2011 | B2 |
8077943 | Williams et al. | Dec 2011 | B2 |
8079957 | Ma et al. | Dec 2011 | B2 |
8081812 | Kreiser | Dec 2011 | B2 |
8085075 | Huffman et al. | Dec 2011 | B2 |
8085897 | Morton | Dec 2011 | B2 |
8090175 | Fu et al. | Jan 2012 | B2 |
8092400 | Warkentine et al. | Jan 2012 | B2 |
8108072 | Zhao et al. | Jan 2012 | B2 |
8112292 | Simon | Feb 2012 | B2 |
8116847 | Gattani et al. | Feb 2012 | B2 |
8120847 | Chang | Feb 2012 | B2 |
8121255 | Sugiyama | Feb 2012 | B2 |
8155479 | Hoffman et al. | Apr 2012 | B2 |
8180132 | Gorges et al. | May 2012 | B2 |
8180429 | Sasso | May 2012 | B2 |
8208599 | Ye et al. | Jun 2012 | B2 |
8216211 | Mathis et al. | Jul 2012 | B2 |
8221402 | Francischelli et al. | Jul 2012 | B2 |
8239001 | Verard et al. | Aug 2012 | B2 |
8244012 | Liang et al. | Aug 2012 | B2 |
8253778 | Atsushi | Aug 2012 | B2 |
8271069 | Jascob et al. | Sep 2012 | B2 |
8280491 | Kuduvalli et al. | Oct 2012 | B2 |
8285021 | Boese et al. | Oct 2012 | B2 |
8300315 | Kobayashi | Oct 2012 | B2 |
8305685 | Heine et al. | Nov 2012 | B2 |
8306305 | Porat et al. | Nov 2012 | B2 |
8309932 | Haselman et al. | Nov 2012 | B2 |
8317320 | Huang | Nov 2012 | B2 |
8328815 | Farr et al. | Dec 2012 | B2 |
8335553 | Rubner et al. | Dec 2012 | B2 |
8335557 | Maschke | Dec 2012 | B2 |
8340379 | Razzaque et al. | Dec 2012 | B2 |
8369925 | Giesel et al. | Feb 2013 | B2 |
8386022 | Jutras et al. | Feb 2013 | B2 |
8394144 | Zehavi et al. | Mar 2013 | B2 |
8398541 | Dimaio et al. | Mar 2013 | B2 |
8444266 | Waters | May 2013 | B2 |
8457719 | Moctezuma De La Barrera et al. | Jun 2013 | B2 |
8467851 | Mire et al. | Jun 2013 | B2 |
8469902 | Dick et al. | Jun 2013 | B2 |
8475470 | Von Jako | Jul 2013 | B2 |
8494612 | Vetter et al. | Jul 2013 | B2 |
8509503 | Nahum et al. | Aug 2013 | B2 |
8511827 | Hua et al. | Aug 2013 | B2 |
8531394 | Maltz | Sep 2013 | B2 |
8540364 | Waters | Sep 2013 | B2 |
8545012 | Waters | Oct 2013 | B2 |
8548567 | Maschke et al. | Oct 2013 | B2 |
8556883 | Saleh | Oct 2013 | B2 |
8559596 | Thomson et al. | Oct 2013 | B2 |
8567945 | Waters | Oct 2013 | B2 |
8571353 | Watanabe | Oct 2013 | B2 |
8585598 | Razzaque et al. | Nov 2013 | B2 |
8600001 | Schweizer | Dec 2013 | B2 |
8600477 | Beyar et al. | Dec 2013 | B2 |
8605199 | Imai | Dec 2013 | B2 |
8611988 | Miyamoto | Dec 2013 | B2 |
8612024 | Stone et al. | Dec 2013 | B2 |
8634897 | Simon et al. | Jan 2014 | B2 |
8641621 | Razzaque et al. | Feb 2014 | B2 |
8643950 | Jens | Feb 2014 | B2 |
8644907 | Hartmann et al. | Feb 2014 | B2 |
8674902 | Park et al. | Mar 2014 | B2 |
8686923 | Eberl et al. | Apr 2014 | B2 |
8690581 | Ruf et al. | Apr 2014 | B2 |
8690776 | Razzaque et al. | Apr 2014 | B2 |
8692845 | Fedorovskaya et al. | Apr 2014 | B2 |
8693632 | Allison | Apr 2014 | B2 |
8694075 | Groszmann et al. | Apr 2014 | B2 |
8699765 | Hao et al. | Apr 2014 | B2 |
8705829 | Frank et al. | Apr 2014 | B2 |
8737708 | Hartmann et al. | May 2014 | B2 |
8746887 | Shestak et al. | Jun 2014 | B2 |
8784450 | Moskowitz et al. | Jul 2014 | B2 |
8786689 | Liu | Jul 2014 | B1 |
D710545 | Wu | Aug 2014 | S |
D710546 | Wu | Aug 2014 | S |
8827934 | Chopra et al. | Sep 2014 | B2 |
8831706 | Fu et al. | Sep 2014 | B2 |
8836768 | Rafii et al. | Sep 2014 | B1 |
8838199 | Simon et al. | Sep 2014 | B2 |
8848977 | Bammer et al. | Sep 2014 | B2 |
8855395 | Baturin et al. | Oct 2014 | B2 |
8878900 | Yang et al. | Nov 2014 | B2 |
8879815 | Miao et al. | Nov 2014 | B2 |
8885177 | Ben-Yishai et al. | Nov 2014 | B2 |
8890772 | Woo et al. | Nov 2014 | B2 |
8890773 | Pederson | Nov 2014 | B1 |
8890943 | Lee et al. | Nov 2014 | B2 |
8897514 | Feikas et al. | Nov 2014 | B2 |
8900131 | Chopra et al. | Dec 2014 | B2 |
8903150 | Star-Lack et al. | Dec 2014 | B2 |
8908952 | Isaacs et al. | Dec 2014 | B2 |
8911358 | Koninckx et al. | Dec 2014 | B2 |
8917268 | Johnsen et al. | Dec 2014 | B2 |
8920776 | Gaiger et al. | Dec 2014 | B2 |
8922589 | Laor | Dec 2014 | B2 |
8941559 | Bar-Zeev et al. | Jan 2015 | B2 |
8942455 | Chou et al. | Jan 2015 | B2 |
8950877 | Northey et al. | Feb 2015 | B2 |
8953246 | Koenig | Feb 2015 | B2 |
8965583 | Ortmaier et al. | Feb 2015 | B2 |
8969829 | Wollenweber et al. | Mar 2015 | B2 |
8989349 | Thomson et al. | Mar 2015 | B2 |
8992580 | Bar et al. | Mar 2015 | B2 |
8994729 | Nakamura | Mar 2015 | B2 |
8994795 | Oh | Mar 2015 | B2 |
9004711 | Gerolemou | Apr 2015 | B2 |
9005211 | Brundobler et al. | Apr 2015 | B2 |
9011441 | Bertagnoli et al. | Apr 2015 | B2 |
9057759 | Klingenbeck et al. | Jun 2015 | B2 |
9060757 | Lawson et al. | Jun 2015 | B2 |
9066751 | Sasso | Jun 2015 | B2 |
9081436 | Berme et al. | Jul 2015 | B1 |
9084635 | Nuckley et al. | Jul 2015 | B2 |
9085643 | Svanborg et al. | Jul 2015 | B2 |
9087471 | Miao | Jul 2015 | B2 |
9100643 | McDowall et al. | Aug 2015 | B2 |
9101394 | Arata et al. | Aug 2015 | B2 |
9104902 | Xu et al. | Aug 2015 | B2 |
9111175 | Strommer et al. | Aug 2015 | B2 |
9123155 | Cunningham et al. | Sep 2015 | B2 |
9125556 | Zehavi et al. | Sep 2015 | B2 |
9129054 | Nawana et al. | Sep 2015 | B2 |
9129372 | Kriston et al. | Sep 2015 | B2 |
9132361 | Smithwick | Sep 2015 | B2 |
9135706 | Zagorchev et al. | Sep 2015 | B2 |
9141873 | Takemoto | Sep 2015 | B2 |
9142020 | Deguise et al. | Sep 2015 | B2 |
9149317 | Arthur et al. | Oct 2015 | B2 |
9165203 | McCarthy | Oct 2015 | B2 |
9165362 | Siewerdsen et al. | Oct 2015 | B2 |
9179984 | Teichman et al. | Nov 2015 | B2 |
D746354 | Chang | Dec 2015 | S |
9208916 | Appleby et al. | Dec 2015 | B2 |
9220573 | Kendrick et al. | Dec 2015 | B2 |
9225895 | Kozinski | Dec 2015 | B2 |
9232982 | Soler et al. | Jan 2016 | B2 |
9235934 | Mandella et al. | Jan 2016 | B2 |
9240046 | Carrell et al. | Jan 2016 | B2 |
9244278 | Sugiyama et al. | Jan 2016 | B2 |
9247240 | Park et al. | Jan 2016 | B2 |
9259192 | Ishihara | Feb 2016 | B2 |
9265572 | Fuchs et al. | Feb 2016 | B2 |
9269192 | Kobayashi | Feb 2016 | B2 |
9283052 | Rodriguez Ponce | Mar 2016 | B2 |
9286730 | Bar-Zeev et al. | Mar 2016 | B2 |
9289267 | Sauer et al. | Mar 2016 | B2 |
9294222 | Proctor, Jr. | Mar 2016 | B2 |
9300949 | Ahearn | Mar 2016 | B2 |
9310591 | Hua et al. | Apr 2016 | B2 |
9320474 | Demri et al. | Apr 2016 | B2 |
9323055 | Baillot | Apr 2016 | B2 |
9330477 | Rappel | May 2016 | B2 |
9335547 | Takano et al. | May 2016 | B2 |
9335567 | Nakamura | May 2016 | B2 |
9341704 | Picard et al. | May 2016 | B2 |
9344686 | Moharir | May 2016 | B2 |
9349066 | Koo et al. | May 2016 | B2 |
9349520 | Demetriou et al. | May 2016 | B2 |
9364294 | Razzaque et al. | Jun 2016 | B2 |
9370332 | Paladini et al. | Jun 2016 | B2 |
9373166 | Azar | Jun 2016 | B2 |
9375639 | Kobayashi et al. | Jun 2016 | B2 |
9378558 | Kajiwara et al. | Jun 2016 | B2 |
9380287 | Nistico et al. | Jun 2016 | B2 |
9387008 | Sarvestani et al. | Jul 2016 | B2 |
9392129 | Simmons | Jul 2016 | B2 |
9395542 | Tilleman et al. | Jul 2016 | B2 |
9398936 | Razzaque et al. | Jul 2016 | B2 |
9400384 | Griffith | Jul 2016 | B2 |
9414041 | Ko et al. | Aug 2016 | B2 |
9424611 | Kanjirathinkal et al. | Aug 2016 | B2 |
9424641 | Wiemker et al. | Aug 2016 | B2 |
9427286 | Siewerdsen et al. | Aug 2016 | B2 |
9438894 | Park et al. | Sep 2016 | B2 |
9443488 | Borenstein et al. | Sep 2016 | B2 |
9453804 | Tahtali | Sep 2016 | B2 |
9456878 | Macfarlane et al. | Oct 2016 | B2 |
9465235 | Chang | Oct 2016 | B2 |
9468373 | Larsen | Oct 2016 | B2 |
9470908 | Frankel et al. | Oct 2016 | B1 |
9473766 | Douglas et al. | Oct 2016 | B2 |
9492222 | Singh | Nov 2016 | B2 |
9495585 | Bicer et al. | Nov 2016 | B2 |
9498132 | Maier-Hein et al. | Nov 2016 | B2 |
9498231 | Haider et al. | Nov 2016 | B2 |
9499999 | Zhou | Nov 2016 | B2 |
9507155 | Morimoto | Nov 2016 | B2 |
9513495 | Waters | Dec 2016 | B2 |
9521966 | Schwartz | Dec 2016 | B2 |
9526443 | Berme et al. | Dec 2016 | B1 |
9530382 | Simmons | Dec 2016 | B2 |
9532846 | Nakamura | Jan 2017 | B2 |
9532849 | Anderson et al. | Jan 2017 | B2 |
9538962 | Hannaford et al. | Jan 2017 | B1 |
9545233 | Sirpad et al. | Jan 2017 | B2 |
9546779 | Rementer | Jan 2017 | B2 |
9547174 | Gao et al. | Jan 2017 | B2 |
9547940 | Sun et al. | Jan 2017 | B1 |
9557566 | Fujimaki | Jan 2017 | B2 |
9560318 | Reina et al. | Jan 2017 | B2 |
9561095 | Nguyen et al. | Feb 2017 | B1 |
9561446 | Brecher | Feb 2017 | B2 |
9565415 | Zhang et al. | Feb 2017 | B2 |
9572661 | Robin et al. | Feb 2017 | B2 |
9576556 | Simmons | Feb 2017 | B2 |
9581822 | Morimoto | Feb 2017 | B2 |
9610056 | Lavallee et al. | Apr 2017 | B2 |
9612657 | Bertram et al. | Apr 2017 | B2 |
9629595 | Walker et al. | Apr 2017 | B2 |
9633431 | Merlet | Apr 2017 | B2 |
9645395 | Bolas et al. | May 2017 | B2 |
9646423 | Sun et al. | May 2017 | B1 |
9672597 | Amiot et al. | Jun 2017 | B2 |
9672607 | Demri et al. | Jun 2017 | B2 |
9672640 | Kleiner | Jun 2017 | B2 |
9675306 | Morton | Jun 2017 | B2 |
9675319 | Razzaque et al. | Jun 2017 | B1 |
9684980 | Royalty et al. | Jun 2017 | B2 |
9690119 | Garofolo et al. | Jun 2017 | B2 |
RE46463 | Feinbloom et al. | Jul 2017 | E |
9693748 | Rai et al. | Jul 2017 | B2 |
9710968 | Dillavou et al. | Jul 2017 | B2 |
9713502 | Finkman et al. | Jul 2017 | B2 |
9724119 | Hissong et al. | Aug 2017 | B2 |
9724165 | Arata et al. | Aug 2017 | B2 |
9726888 | Giartosio et al. | Aug 2017 | B2 |
9728006 | Varga | Aug 2017 | B2 |
9729831 | Birnkrant et al. | Aug 2017 | B2 |
9757034 | Desjardins et al. | Sep 2017 | B2 |
9757087 | Simon et al. | Sep 2017 | B2 |
9766441 | Rappel | Sep 2017 | B2 |
9767608 | Lee et al. | Sep 2017 | B2 |
9770203 | Berme et al. | Sep 2017 | B1 |
9772102 | Ferguson | Sep 2017 | B1 |
9772495 | Tam et al. | Sep 2017 | B2 |
9791138 | Feinbloom et al. | Oct 2017 | B1 |
9800995 | Libin et al. | Oct 2017 | B2 |
9805504 | Zhang et al. | Oct 2017 | B2 |
9808148 | Miller et al. | Nov 2017 | B2 |
9839448 | Reckling et al. | Dec 2017 | B2 |
9844413 | Daon et al. | Dec 2017 | B2 |
9851080 | Wilt et al. | Dec 2017 | B2 |
9858663 | Penney et al. | Jan 2018 | B2 |
9861446 | Lang | Jan 2018 | B2 |
9864214 | Fass | Jan 2018 | B2 |
9872733 | Shoham et al. | Jan 2018 | B2 |
9875544 | Rai et al. | Jan 2018 | B2 |
9877642 | Duret | Jan 2018 | B2 |
9885465 | Nguyen | Feb 2018 | B2 |
9886552 | Dillavou et al. | Feb 2018 | B2 |
9886760 | Liu et al. | Feb 2018 | B2 |
9892564 | Cvetko et al. | Feb 2018 | B1 |
9898866 | Fuchs et al. | Feb 2018 | B2 |
9901414 | Lively et al. | Feb 2018 | B2 |
9911187 | Steinle et al. | Mar 2018 | B2 |
9927611 | Rudy et al. | Mar 2018 | B2 |
9928629 | Benishti et al. | Mar 2018 | B2 |
9940750 | Dillavou et al. | Apr 2018 | B2 |
9943374 | Merritt et al. | Apr 2018 | B2 |
9947110 | Haimerl | Apr 2018 | B2 |
9952664 | Border et al. | Apr 2018 | B2 |
9956054 | Aguirre-Valencia | May 2018 | B2 |
9958674 | Border | May 2018 | B2 |
9959620 | Merlet | May 2018 | B2 |
9959629 | Dillavou et al. | May 2018 | B2 |
9965681 | Border et al. | May 2018 | B2 |
9968297 | Connor | May 2018 | B2 |
9980780 | Lang | May 2018 | B2 |
9986228 | Woods | May 2018 | B2 |
D824523 | Paoli et al. | Jul 2018 | S |
10010379 | Gibby et al. | Jul 2018 | B1 |
10013531 | Richards et al. | Jul 2018 | B2 |
10015243 | Kazerani et al. | Jul 2018 | B2 |
10016243 | Esterberg | Jul 2018 | B2 |
10022064 | Kim et al. | Jul 2018 | B2 |
10022065 | Ben-Yishai et al. | Jul 2018 | B2 |
10022104 | Sell et al. | Jul 2018 | B2 |
10023615 | Bonny | Jul 2018 | B2 |
10026015 | Cavusoglu et al. | Jul 2018 | B2 |
10034713 | Yang et al. | Jul 2018 | B2 |
10046165 | Frewin et al. | Aug 2018 | B2 |
10055838 | Elenbaas et al. | Aug 2018 | B2 |
10066816 | Chang | Sep 2018 | B2 |
10067359 | Ushakov | Sep 2018 | B1 |
10073515 | Awdeh | Sep 2018 | B2 |
10080616 | Wilkinson et al. | Sep 2018 | B2 |
10082680 | Chung | Sep 2018 | B2 |
10085709 | Lavallee et al. | Oct 2018 | B2 |
10105187 | Corndorf et al. | Oct 2018 | B2 |
10107483 | Oren | Oct 2018 | B2 |
10108833 | Hong et al. | Oct 2018 | B2 |
10123840 | Dorman | Nov 2018 | B2 |
10130378 | Bryan | Nov 2018 | B2 |
10132483 | Feinbloom et al. | Nov 2018 | B1 |
10134166 | Benishti et al. | Nov 2018 | B2 |
10134194 | Kepner et al. | Nov 2018 | B2 |
10139652 | Windham | Nov 2018 | B2 |
10139920 | Isaacs et al. | Nov 2018 | B2 |
10142496 | Rao et al. | Nov 2018 | B1 |
10151928 | Ushakov | Dec 2018 | B2 |
10154239 | Casas | Dec 2018 | B2 |
10159530 | Lang | Dec 2018 | B2 |
10163207 | Merlet | Dec 2018 | B2 |
10166079 | McLachlin et al. | Jan 2019 | B2 |
10175507 | Nakamura | Jan 2019 | B2 |
10175753 | Boesen | Jan 2019 | B2 |
10181361 | Dillavou et al. | Jan 2019 | B2 |
10186055 | Takahashi et al. | Jan 2019 | B2 |
10188672 | Wagner | Jan 2019 | B2 |
10194131 | Casas | Jan 2019 | B2 |
10194990 | Amanatullah et al. | Feb 2019 | B2 |
10194993 | Roger et al. | Feb 2019 | B2 |
10195076 | Fateh | Feb 2019 | B2 |
10197803 | Badiali et al. | Feb 2019 | B2 |
10197816 | Waisman et al. | Feb 2019 | B2 |
10207315 | Appleby et al. | Feb 2019 | B2 |
10212517 | Beltran | Feb 2019 | B1 |
10230719 | Vaughn et al. | Mar 2019 | B2 |
10231893 | Lei et al. | Mar 2019 | B2 |
10235606 | Miao et al. | Mar 2019 | B2 |
10240769 | Braganca et al. | Mar 2019 | B1 |
10247965 | Ton | Apr 2019 | B2 |
10251724 | McLachlin et al. | Apr 2019 | B2 |
10261324 | Chuang et al. | Apr 2019 | B2 |
10262424 | Ketcha et al. | Apr 2019 | B2 |
10274731 | Maimone | Apr 2019 | B2 |
10278777 | Lang | May 2019 | B1 |
10292768 | Lang | May 2019 | B2 |
10296805 | Yang et al. | May 2019 | B2 |
10319154 | Chakravarthula et al. | Jun 2019 | B1 |
10326975 | Casas | Jun 2019 | B2 |
10332267 | Rai et al. | Jun 2019 | B2 |
10339719 | Jagga et al. | Jul 2019 | B2 |
10352543 | Braganca et al. | Jul 2019 | B1 |
10357146 | Fiebel et al. | Jul 2019 | B2 |
10357574 | Hilderbrand et al. | Jul 2019 | B2 |
10366489 | Boettger et al. | Jul 2019 | B2 |
10368947 | Lang | Aug 2019 | B2 |
10368948 | Tripathi | Aug 2019 | B2 |
10382748 | Benishti et al. | Aug 2019 | B2 |
10383654 | Yilmaz et al. | Aug 2019 | B2 |
10386645 | Abou Shousha | Aug 2019 | B2 |
10398514 | Ryan et al. | Sep 2019 | B2 |
10405825 | Rai et al. | Sep 2019 | B2 |
10405927 | Lang | Sep 2019 | B1 |
10413752 | Berlinger et al. | Sep 2019 | B2 |
10419655 | Sivan | Sep 2019 | B2 |
10420626 | Tokuda et al. | Sep 2019 | B2 |
10420813 | Newell-Rogers et al. | Sep 2019 | B2 |
10424115 | Ellerbrock | Sep 2019 | B2 |
D862469 | Sadot et al. | Oct 2019 | S |
10426554 | Siewerdsen et al. | Oct 2019 | B2 |
10429675 | Greget | Oct 2019 | B2 |
10431008 | Djajadiningrat et al. | Oct 2019 | B2 |
10433814 | Razzaque et al. | Oct 2019 | B2 |
10434335 | Takahashi et al. | Oct 2019 | B2 |
10441236 | Bar-Tal et al. | Oct 2019 | B2 |
10444514 | Abou Shousha et al. | Oct 2019 | B2 |
10447947 | Liu | Oct 2019 | B2 |
10448003 | Grafenberg | Oct 2019 | B2 |
10449040 | Lashinski et al. | Oct 2019 | B2 |
10453187 | Peterson et al. | Oct 2019 | B2 |
10463434 | Siegler et al. | Nov 2019 | B2 |
10465892 | Feinbloom et al. | Nov 2019 | B1 |
10466487 | Blum et al. | Nov 2019 | B2 |
10470732 | Baumgart et al. | Nov 2019 | B2 |
10473314 | Braganca et al. | Nov 2019 | B1 |
10485989 | Jordan et al. | Nov 2019 | B2 |
10488663 | Choi | Nov 2019 | B2 |
D869772 | Gand | Dec 2019 | S |
D870977 | Berggren et al. | Dec 2019 | S |
10492755 | Lin et al. | Dec 2019 | B2 |
10499997 | Weinstein et al. | Dec 2019 | B2 |
10502363 | Edwards | Dec 2019 | B2 |
10504231 | Fiala | Dec 2019 | B2 |
10507066 | Dimaio et al. | Dec 2019 | B2 |
10511822 | Casas | Dec 2019 | B2 |
10517544 | Taguchi et al. | Dec 2019 | B2 |
10537395 | Perez | Jan 2020 | B2 |
10540780 | Cousins et al. | Jan 2020 | B1 |
10543485 | Ismagilov et al. | Jan 2020 | B2 |
10546423 | Jones et al. | Jan 2020 | B2 |
10548557 | Lim et al. | Feb 2020 | B2 |
10555775 | Hoffman et al. | Feb 2020 | B2 |
10568535 | Roberts et al. | Feb 2020 | B2 |
10571696 | Urey et al. | Feb 2020 | B2 |
10571716 | Chapiro | Feb 2020 | B2 |
10573087 | Gallop et al. | Feb 2020 | B2 |
10577630 | Zhang et al. | Mar 2020 | B2 |
10586400 | Douglas | Mar 2020 | B2 |
10591737 | Yildiz et al. | Mar 2020 | B2 |
10592748 | Cousins et al. | Mar 2020 | B1 |
10594998 | Casas | Mar 2020 | B1 |
10595716 | Nazareth et al. | Mar 2020 | B2 |
10601950 | Devam et al. | Mar 2020 | B2 |
10602114 | Casas | Mar 2020 | B2 |
10603113 | Lang | Mar 2020 | B2 |
10603133 | Wang et al. | Mar 2020 | B2 |
10606085 | Toyama | Mar 2020 | B2 |
10610172 | Hummel et al. | Apr 2020 | B2 |
10610179 | Altmann | Apr 2020 | B2 |
10613352 | Knoll | Apr 2020 | B2 |
10617566 | Esmonde | Apr 2020 | B2 |
10620460 | Carabin | Apr 2020 | B2 |
10621738 | Miao et al. | Apr 2020 | B2 |
10625099 | Takahashi et al. | Apr 2020 | B2 |
10626473 | Mariani et al. | Apr 2020 | B2 |
10631905 | Asfora et al. | Apr 2020 | B2 |
10631907 | Zucker et al. | Apr 2020 | B2 |
10634331 | Feinbloom et al. | Apr 2020 | B1 |
10634921 | Blum et al. | Apr 2020 | B2 |
10638080 | Ovchinnikov et al. | Apr 2020 | B2 |
10646285 | Siemionow et al. | May 2020 | B2 |
10650513 | Penney et al. | May 2020 | B2 |
10650594 | Jones et al. | May 2020 | B2 |
10652525 | Woods | May 2020 | B2 |
10653495 | Gregerson et al. | May 2020 | B2 |
10660715 | Dozeman | May 2020 | B2 |
10663738 | Carlvik et al. | May 2020 | B2 |
10672145 | Albiol et al. | Jun 2020 | B2 |
10682112 | Pizaine et al. | Jun 2020 | B2 |
10682767 | Grafenberg et al. | Jun 2020 | B2 |
10687901 | Thomas | Jun 2020 | B2 |
10691397 | Clements | Jun 2020 | B1 |
10702713 | Mori et al. | Jul 2020 | B2 |
10706540 | Merlet | Jul 2020 | B2 |
10709398 | Schweizer | Jul 2020 | B2 |
10713801 | Jordan et al. | Jul 2020 | B2 |
10716643 | Justin et al. | Jul 2020 | B2 |
10722733 | Takahashi | Jul 2020 | B2 |
10725535 | Yu | Jul 2020 | B2 |
10731832 | Koo | Aug 2020 | B2 |
10732721 | Clements | Aug 2020 | B1 |
10742949 | Casas | Aug 2020 | B2 |
10743939 | Lang | Aug 2020 | B1 |
10743943 | Razeto et al. | Aug 2020 | B2 |
10747315 | Tungare et al. | Aug 2020 | B2 |
10748319 | Tao et al. | Aug 2020 | B1 |
10758315 | Johnson et al. | Sep 2020 | B2 |
10777094 | Rao et al. | Sep 2020 | B1 |
10777315 | Zehavi et al. | Sep 2020 | B2 |
10781482 | Gubatayao et al. | Sep 2020 | B2 |
10792110 | Leung et al. | Oct 2020 | B2 |
10799145 | West et al. | Oct 2020 | B2 |
10799296 | Lang | Oct 2020 | B2 |
10799298 | Crawford et al. | Oct 2020 | B2 |
10799316 | Sela et al. | Oct 2020 | B2 |
10810799 | Tepper et al. | Oct 2020 | B2 |
10818019 | Piat et al. | Oct 2020 | B2 |
10818101 | Gallop et al. | Oct 2020 | B2 |
10818199 | Buras et al. | Oct 2020 | B2 |
10825563 | Gibby et al. | Nov 2020 | B2 |
10831943 | Santarone et al. | Nov 2020 | B2 |
10835296 | Elimelech et al. | Nov 2020 | B2 |
10838206 | Fortin-Deschênes et al. | Nov 2020 | B2 |
10839629 | Jones et al. | Nov 2020 | B2 |
10839956 | Beydoun et al. | Nov 2020 | B2 |
10841556 | Casas | Nov 2020 | B2 |
10842002 | Chang | Nov 2020 | B2 |
10842461 | Johnson et al. | Nov 2020 | B2 |
10849691 | Zucker et al. | Dec 2020 | B2 |
10849693 | Lang | Dec 2020 | B2 |
10849710 | Liu | Dec 2020 | B2 |
10861236 | Geri et al. | Dec 2020 | B2 |
10865220 | Ebetino et al. | Dec 2020 | B2 |
10869517 | Halpern | Dec 2020 | B1 |
10869727 | Yanof et al. | Dec 2020 | B2 |
10872472 | Watola et al. | Dec 2020 | B2 |
10877262 | Luxembourg | Dec 2020 | B1 |
10877296 | Lindsey et al. | Dec 2020 | B2 |
10878639 | Douglas et al. | Dec 2020 | B2 |
10893260 | Trail et al. | Jan 2021 | B2 |
10895742 | Schneider et al. | Jan 2021 | B2 |
10895743 | Dausmann | Jan 2021 | B2 |
10895906 | West et al. | Jan 2021 | B2 |
10898151 | Harding et al. | Jan 2021 | B2 |
10921595 | Rakshit et al. | Feb 2021 | B2 |
10921613 | Gupta et al. | Feb 2021 | B2 |
10928321 | Rawle | Feb 2021 | B2 |
10928638 | Ninan et al. | Feb 2021 | B2 |
10929670 | Troy et al. | Feb 2021 | B1 |
10935815 | Castañeda | Mar 2021 | B1 |
10935816 | Ban et al. | Mar 2021 | B2 |
10936537 | Huston | Mar 2021 | B2 |
10939973 | Dimaio et al. | Mar 2021 | B2 |
10939977 | Messinger et al. | Mar 2021 | B2 |
10941933 | Ferguson | Mar 2021 | B2 |
10946108 | Zhang et al. | Mar 2021 | B2 |
10950338 | Douglas | Mar 2021 | B2 |
10951872 | Casas | Mar 2021 | B2 |
10964095 | Douglas | Mar 2021 | B1 |
10964124 | Douglas | Mar 2021 | B1 |
10966768 | Poulos | Apr 2021 | B2 |
10993754 | Kuntz et al. | May 2021 | B2 |
11000335 | Dorman | May 2021 | B2 |
11006093 | Hegyi | May 2021 | B1 |
11013550 | Rioux et al. | May 2021 | B2 |
11013560 | Lang | May 2021 | B2 |
11013562 | Marti et al. | May 2021 | B2 |
11013573 | Chang | May 2021 | B2 |
11013900 | Malek et al. | May 2021 | B2 |
11019988 | Fiebel et al. | Jun 2021 | B2 |
11027027 | Manning et al. | Jun 2021 | B2 |
11029147 | Abovitz et al. | Jun 2021 | B2 |
11030809 | Wang | Jun 2021 | B2 |
11041173 | Zhang et al. | Jun 2021 | B2 |
11045663 | Mori et al. | Jun 2021 | B2 |
11049293 | Chae et al. | Jun 2021 | B2 |
11049476 | Fuchs et al. | Jun 2021 | B2 |
11050990 | Casas | Jun 2021 | B2 |
11057505 | Dharmatilleke | Jul 2021 | B2 |
11058390 | Douglas | Jul 2021 | B1 |
11061257 | Hakim | Jul 2021 | B1 |
11064904 | Kay et al. | Jul 2021 | B2 |
11065062 | Frushour et al. | Jul 2021 | B2 |
11067387 | Marell et al. | Jul 2021 | B2 |
11071497 | Hallack et al. | Jul 2021 | B2 |
11079596 | Hua et al. | Aug 2021 | B2 |
11087039 | Duff et al. | Aug 2021 | B2 |
11090019 | Siemionow et al. | Aug 2021 | B2 |
11097129 | Sakata et al. | Aug 2021 | B2 |
11099376 | Steier et al. | Aug 2021 | B1 |
11103320 | Leboeuf et al. | Aug 2021 | B2 |
D930162 | Cremer et al. | Sep 2021 | S |
11109762 | Steier et al. | Sep 2021 | B1 |
11112611 | Kessler et al. | Sep 2021 | B1 |
11122164 | Gigante | Sep 2021 | B2 |
11123604 | Fung | Sep 2021 | B2 |
11129562 | Roberts et al. | Sep 2021 | B2 |
11132055 | Jones et al. | Sep 2021 | B2 |
11135015 | Crawford et al. | Oct 2021 | B2 |
11135016 | Frielinghaus et al. | Oct 2021 | B2 |
11137610 | Kessler et al. | Oct 2021 | B1 |
11141221 | Hobeika et al. | Oct 2021 | B2 |
11153549 | Casas | Oct 2021 | B2 |
11153555 | Healy et al. | Oct 2021 | B1 |
11163176 | Karafin et al. | Nov 2021 | B2 |
11164324 | Liu et al. | Nov 2021 | B2 |
11166006 | Hegyi | Nov 2021 | B2 |
11172990 | Lang | Nov 2021 | B2 |
11179136 | Kohli et al. | Nov 2021 | B2 |
11180557 | Noelle | Nov 2021 | B2 |
11181747 | Kessler et al. | Nov 2021 | B1 |
11185891 | Cousins et al. | Nov 2021 | B2 |
11202682 | Staunton et al. | Dec 2021 | B2 |
11207150 | Healy et al. | Dec 2021 | B2 |
11217028 | Jones et al. | Jan 2022 | B2 |
11224483 | Steinberg et al. | Jan 2022 | B2 |
11224763 | Takahashi et al. | Jan 2022 | B2 |
11227417 | Berlinger et al. | Jan 2022 | B2 |
11231787 | Isaacs et al. | Jan 2022 | B2 |
11244508 | Kazanzides et al. | Feb 2022 | B2 |
11253216 | Crawford et al. | Feb 2022 | B2 |
11253323 | Hughes et al. | Feb 2022 | B2 |
11257190 | Mao et al. | Feb 2022 | B2 |
11257241 | Tao | Feb 2022 | B2 |
11263772 | Siemionow et al. | Mar 2022 | B2 |
11269401 | West et al. | Mar 2022 | B2 |
11272151 | Casas | Mar 2022 | B2 |
11278359 | Siemionow et al. | Mar 2022 | B2 |
11278413 | Lang | Mar 2022 | B1 |
11280480 | Wilt et al. | Mar 2022 | B2 |
11284846 | Graumann et al. | Mar 2022 | B2 |
11291521 | Im | Apr 2022 | B2 |
11294167 | Ishimoda | Apr 2022 | B2 |
11297285 | Pierce | Apr 2022 | B2 |
11300252 | Nguyen | Apr 2022 | B2 |
11300790 | Cheng et al. | Apr 2022 | B2 |
11304621 | Merschon et al. | Apr 2022 | B2 |
11304759 | Kovtun et al. | Apr 2022 | B2 |
11307402 | Steier et al. | Apr 2022 | B2 |
11308663 | Alhrishy et al. | Apr 2022 | B2 |
11311341 | Lang | Apr 2022 | B2 |
11317973 | Calloway et al. | May 2022 | B2 |
11337763 | Choi | May 2022 | B2 |
11348257 | Lang | May 2022 | B2 |
11350072 | Quiles Casas | May 2022 | B1 |
11350965 | Yilmaz et al. | Jun 2022 | B2 |
11351006 | Aferzon et al. | Jun 2022 | B2 |
11354813 | Piat et al. | Jun 2022 | B2 |
11360315 | Tu et al. | Jun 2022 | B2 |
11382699 | Wassall et al. | Jul 2022 | B2 |
11382700 | Calloway et al. | Jul 2022 | B2 |
11382712 | Elimelech et al. | Jul 2022 | B2 |
11382713 | Healy et al. | Jul 2022 | B2 |
11389252 | Gera et al. | Jul 2022 | B2 |
11399895 | Soper et al. | Aug 2022 | B2 |
11402524 | Song et al. | Aug 2022 | B2 |
11406338 | Tolkowsky | Aug 2022 | B2 |
11423554 | Borsdorf et al. | Aug 2022 | B2 |
11432828 | Lang | Sep 2022 | B1 |
11432931 | Lang | Sep 2022 | B2 |
11452568 | Lang | Sep 2022 | B2 |
11460915 | Frielinghaus et al. | Oct 2022 | B2 |
11461983 | Jones et al. | Oct 2022 | B2 |
11464581 | Calloway | Oct 2022 | B2 |
11478214 | Siewerdsen et al. | Oct 2022 | B2 |
11483532 | Quiles Casas | Oct 2022 | B2 |
11490986 | Ben-Yishai | Nov 2022 | B2 |
11527002 | Govari | Dec 2022 | B2 |
11528393 | Garofolo et al. | Dec 2022 | B2 |
11627924 | Alexandroni et al. | Apr 2023 | B2 |
11648016 | Hathaway et al. | May 2023 | B2 |
11657518 | Ketcha et al. | May 2023 | B2 |
11666458 | Kim et al. | Jun 2023 | B2 |
11669984 | Siewerdsen et al. | Jun 2023 | B2 |
11712582 | Miyazaki et al. | Aug 2023 | B2 |
11750794 | Benishti et al. | Sep 2023 | B2 |
11766296 | Wolf et al. | Sep 2023 | B2 |
11798178 | Merlet | Oct 2023 | B2 |
11801097 | Crawford et al. | Oct 2023 | B2 |
11801115 | Elimelech et al. | Oct 2023 | B2 |
11826111 | Mahfouz | Nov 2023 | B2 |
11839501 | Takahashi et al. | Dec 2023 | B2 |
11885752 | St-Aubin et al. | Jan 2024 | B2 |
11896445 | Gera et al. | Feb 2024 | B2 |
20020082498 | Wendt et al. | Jun 2002 | A1 |
20030059097 | Abovitz et al. | Mar 2003 | A1 |
20030117393 | Sauer et al. | Jun 2003 | A1 |
20030130576 | Seeley et al. | Jul 2003 | A1 |
20030156144 | Morita | Aug 2003 | A1 |
20030210812 | Khamene et al. | Nov 2003 | A1 |
20030225329 | Rossner et al. | Dec 2003 | A1 |
20040019263 | Jutras et al. | Jan 2004 | A1 |
20040030237 | Lee et al. | Feb 2004 | A1 |
20040138556 | Cosman | Jul 2004 | A1 |
20040238732 | State et al. | Dec 2004 | A1 |
20050017972 | Poole et al. | Jan 2005 | A1 |
20050024586 | Teiwes et al. | Feb 2005 | A1 |
20050119639 | McCombs et al. | Jun 2005 | A1 |
20050203367 | Ahmed et al. | Sep 2005 | A1 |
20050203380 | Sauer et al. | Sep 2005 | A1 |
20050215879 | Chuanggui | Sep 2005 | A1 |
20060072124 | Smetak et al. | Apr 2006 | A1 |
20060134198 | Tawa et al. | Jun 2006 | A1 |
20060176242 | Jaramaz et al. | Aug 2006 | A1 |
20070018975 | Chuanggui et al. | Jan 2007 | A1 |
20070058261 | Sugihara et al. | Mar 2007 | A1 |
20070183041 | McCloy et al. | Aug 2007 | A1 |
20070233371 | Stoschek et al. | Oct 2007 | A1 |
20070273610 | Baillot | Nov 2007 | A1 |
20080002809 | Bodduluri | Jan 2008 | A1 |
20080007645 | McCutchen | Jan 2008 | A1 |
20080035266 | Danziger | Feb 2008 | A1 |
20080085033 | Haven et al. | Apr 2008 | A1 |
20080159612 | Fu et al. | Jul 2008 | A1 |
20080183065 | Goldbach | Jul 2008 | A1 |
20080221625 | Hufner et al. | Sep 2008 | A1 |
20080253527 | Boyden et al. | Oct 2008 | A1 |
20080262812 | Arata et al. | Oct 2008 | A1 |
20080287728 | Mostafavi et al. | Nov 2008 | A1 |
20090018437 | Cooke | Jan 2009 | A1 |
20090024127 | Lechner et al. | Jan 2009 | A1 |
20090036902 | Dimaio et al. | Feb 2009 | A1 |
20090062869 | Claverie et al. | Mar 2009 | A1 |
20090099445 | Burger | Apr 2009 | A1 |
20090123452 | Madison | May 2009 | A1 |
20090227847 | Tepper et al. | Sep 2009 | A1 |
20090285366 | Essenreiter et al. | Nov 2009 | A1 |
20090300540 | Russell | Dec 2009 | A1 |
20100076305 | Maier-Hein et al. | Mar 2010 | A1 |
20100094308 | Tatsumi et al. | Apr 2010 | A1 |
20100106010 | Rubner et al. | Apr 2010 | A1 |
20100114110 | Taft et al. | May 2010 | A1 |
20100138939 | Bentzon et al. | Jun 2010 | A1 |
20100149073 | Chaum et al. | Jun 2010 | A1 |
20100172567 | Prokoski | Jul 2010 | A1 |
20100210939 | Hartmann et al. | Aug 2010 | A1 |
20100266220 | Zagorchev et al. | Oct 2010 | A1 |
20100274124 | Jascob et al. | Oct 2010 | A1 |
20110004259 | Stallings et al. | Jan 2011 | A1 |
20110098553 | Robbins et al. | Apr 2011 | A1 |
20110105895 | Kornblau et al. | May 2011 | A1 |
20110216060 | Weising et al. | Sep 2011 | A1 |
20110245625 | Trovato et al. | Oct 2011 | A1 |
20110248064 | Marczyk | Oct 2011 | A1 |
20110254922 | Schaerer et al. | Oct 2011 | A1 |
20110306873 | Shenai et al. | Dec 2011 | A1 |
20120014608 | Watanabe | Jan 2012 | A1 |
20120068913 | Bar-Zeev et al. | Mar 2012 | A1 |
20120078236 | Schoepp | Mar 2012 | A1 |
20120109151 | Maier-Hein et al. | May 2012 | A1 |
20120143050 | Heigl | Jun 2012 | A1 |
20120155064 | Waters | Jun 2012 | A1 |
20120162452 | Liu | Jun 2012 | A1 |
20120182605 | Hall et al. | Jul 2012 | A1 |
20120201421 | Hartmann et al. | Aug 2012 | A1 |
20120216411 | Wevers et al. | Aug 2012 | A1 |
20120224260 | Healy | Sep 2012 | A1 |
20120238609 | Srivastava et al. | Sep 2012 | A1 |
20120289777 | Chopra et al. | Nov 2012 | A1 |
20120306850 | Balan et al. | Dec 2012 | A1 |
20120320100 | Machida et al. | Dec 2012 | A1 |
20130002928 | Imai | Jan 2013 | A1 |
20130009853 | Hesselink et al. | Jan 2013 | A1 |
20130038632 | Dillavou et al. | Feb 2013 | A1 |
20130050258 | Liu et al. | Feb 2013 | A1 |
20130050833 | Lewis et al. | Feb 2013 | A1 |
20130057581 | Meier | Mar 2013 | A1 |
20130083009 | Geisner et al. | Apr 2013 | A1 |
20130106833 | Fun | May 2013 | A1 |
20130135734 | Shafer et al. | May 2013 | A1 |
20130135738 | Shafer et al. | May 2013 | A1 |
20130190602 | Liao et al. | Jul 2013 | A1 |
20130195338 | Xu et al. | Aug 2013 | A1 |
20130209953 | Arlinsky et al. | Aug 2013 | A1 |
20130234914 | Fujimaki | Sep 2013 | A1 |
20130234935 | Griffith | Sep 2013 | A1 |
20130237811 | Mihailescu et al. | Sep 2013 | A1 |
20130245461 | Maier-Hein et al. | Sep 2013 | A1 |
20130249787 | Morimoto | Sep 2013 | A1 |
20130249945 | Kobayashi | Sep 2013 | A1 |
20130265623 | Sugiyama et al. | Oct 2013 | A1 |
20130267838 | Fronk et al. | Oct 2013 | A1 |
20130278635 | Maggiore | Oct 2013 | A1 |
20130300637 | Smits et al. | Nov 2013 | A1 |
20130300760 | Sugano et al. | Nov 2013 | A1 |
20130342571 | Kinnebrew et al. | Dec 2013 | A1 |
20140031668 | Mobasser et al. | Jan 2014 | A1 |
20140049629 | Siewerdsen et al. | Feb 2014 | A1 |
20140088402 | Xu | Mar 2014 | A1 |
20140088990 | Nawana et al. | Mar 2014 | A1 |
20140104505 | Koenig | Apr 2014 | A1 |
20140105912 | Noelle | Apr 2014 | A1 |
20140114173 | Bar-Tal et al. | Apr 2014 | A1 |
20140142426 | Razzaque et al. | May 2014 | A1 |
20140168261 | Margolis et al. | Jun 2014 | A1 |
20140176661 | Smurro et al. | Jun 2014 | A1 |
20140177023 | Gao et al. | Jun 2014 | A1 |
20140189508 | Granchi et al. | Jul 2014 | A1 |
20140198129 | Liu et al. | Jul 2014 | A1 |
20140218291 | Kirk | Aug 2014 | A1 |
20140240484 | Kodama et al. | Aug 2014 | A1 |
20140243614 | Rothberg et al. | Aug 2014 | A1 |
20140256429 | Kobayashi et al. | Sep 2014 | A1 |
20140266983 | Christensen | Sep 2014 | A1 |
20140268356 | Bolas et al. | Sep 2014 | A1 |
20140270505 | McCarthy | Sep 2014 | A1 |
20140275760 | Lee et al. | Sep 2014 | A1 |
20140285404 | Takano et al. | Sep 2014 | A1 |
20140285429 | Simmons | Sep 2014 | A1 |
20140300632 | Laor | Oct 2014 | A1 |
20140300967 | Tilleman et al. | Oct 2014 | A1 |
20140301624 | Barckow et al. | Oct 2014 | A1 |
20140303491 | Shekhar et al. | Oct 2014 | A1 |
20140320399 | Kim et al. | Oct 2014 | A1 |
20140333899 | Smithwick | Nov 2014 | A1 |
20140336461 | Reiter et al. | Nov 2014 | A1 |
20140340286 | Machida et al. | Nov 2014 | A1 |
20140361956 | Mikhailov et al. | Dec 2014 | A1 |
20150005772 | Anglin et al. | Jan 2015 | A1 |
20150018672 | Blumhofer et al. | Jan 2015 | A1 |
20150031985 | Reddy et al. | Jan 2015 | A1 |
20150043798 | Carrell et al. | Feb 2015 | A1 |
20150070347 | Hofmann et al. | Mar 2015 | A1 |
20150084990 | Laor | Mar 2015 | A1 |
20150150641 | Daon et al. | Jun 2015 | A1 |
20150182293 | Yang et al. | Jul 2015 | A1 |
20150192776 | Lee et al. | Jul 2015 | A1 |
20150209119 | Theodore et al. | Jul 2015 | A1 |
20150261922 | Nawana et al. | Sep 2015 | A1 |
20150277123 | Chaum et al. | Oct 2015 | A1 |
20150282735 | Rossner | Oct 2015 | A1 |
20150287188 | Gazit et al. | Oct 2015 | A1 |
20150287236 | Winne et al. | Oct 2015 | A1 |
20150297314 | Fowler et al. | Oct 2015 | A1 |
20150305828 | Park et al. | Oct 2015 | A1 |
20150310668 | Ellerbrock | Oct 2015 | A1 |
20150338652 | Lim | Nov 2015 | A1 |
20150350517 | Duret et al. | Dec 2015 | A1 |
20150351863 | Plassky et al. | Dec 2015 | A1 |
20150363978 | Maimone et al. | Dec 2015 | A1 |
20150366620 | Cameron et al. | Dec 2015 | A1 |
20160022287 | Nehls | Jan 2016 | A1 |
20160030131 | Yang et al. | Feb 2016 | A1 |
20160054571 | Tazbaz et al. | Feb 2016 | A1 |
20160086380 | Vayser et al. | Mar 2016 | A1 |
20160103318 | Du et al. | Apr 2016 | A1 |
20160125603 | Tanji | May 2016 | A1 |
20160133051 | Aonuma et al. | May 2016 | A1 |
20160143699 | Tanji | May 2016 | A1 |
20160153004 | Zhang et al. | Jun 2016 | A1 |
20160163045 | Penney et al. | Jun 2016 | A1 |
20160175064 | Steinle et al. | Jun 2016 | A1 |
20160178910 | Giudicelli et al. | Jun 2016 | A1 |
20160191887 | Casas | Jun 2016 | A1 |
20160223822 | Harrison et al. | Aug 2016 | A1 |
20160228033 | Rossner | Aug 2016 | A1 |
20160246059 | Halpin | Aug 2016 | A1 |
20160249989 | Devam et al. | Sep 2016 | A1 |
20160256223 | Haimerl et al. | Sep 2016 | A1 |
20160275684 | Elenbaas et al. | Sep 2016 | A1 |
20160302870 | Wilkinson et al. | Oct 2016 | A1 |
20160324580 | Esterberg | Nov 2016 | A1 |
20160324583 | Kheradpir et al. | Nov 2016 | A1 |
20160339337 | Ellsworth et al. | Nov 2016 | A1 |
20170014119 | Capote et al. | Jan 2017 | A1 |
20170024634 | Miao et al. | Jan 2017 | A1 |
20170027650 | Merck et al. | Feb 2017 | A1 |
20170031163 | Gao et al. | Feb 2017 | A1 |
20170031179 | Guillot et al. | Feb 2017 | A1 |
20170045742 | Greenhalgh | Feb 2017 | A1 |
20170068119 | Antaki et al. | Mar 2017 | A1 |
20170076501 | Jagga et al. | Mar 2017 | A1 |
20170086941 | Marti et al. | Mar 2017 | A1 |
20170112586 | Dhupar | Apr 2017 | A1 |
20170164919 | Lavallee et al. | Jun 2017 | A1 |
20170164920 | Lavallee et al. | Jun 2017 | A1 |
20170178375 | Benishti et al. | Jun 2017 | A1 |
20170220224 | Kodali et al. | Aug 2017 | A1 |
20170239015 | Sela et al. | Aug 2017 | A1 |
20170245944 | Crawford et al. | Aug 2017 | A1 |
20170251900 | Hansen et al. | Sep 2017 | A1 |
20170252109 | Yang et al. | Sep 2017 | A1 |
20170258526 | Lang | Sep 2017 | A1 |
20170281283 | Siegler et al. | Oct 2017 | A1 |
20170312032 | Amanatullah et al. | Nov 2017 | A1 |
20170348055 | Salcedo et al. | Dec 2017 | A1 |
20170348061 | Joshi et al. | Dec 2017 | A1 |
20170366773 | Kiraly et al. | Dec 2017 | A1 |
20170367766 | Mahfouz | Dec 2017 | A1 |
20170367771 | Tako et al. | Dec 2017 | A1 |
20170372477 | Penney et al. | Dec 2017 | A1 |
20180003981 | Urey | Jan 2018 | A1 |
20180018791 | Guoyi | Jan 2018 | A1 |
20180021597 | Berlinger et al. | Jan 2018 | A1 |
20180028266 | Barnes et al. | Feb 2018 | A1 |
20180036884 | Chen et al. | Feb 2018 | A1 |
20180049622 | Ryan et al. | Feb 2018 | A1 |
20180055579 | Daon et al. | Mar 2018 | A1 |
20180078316 | Schaewe et al. | Mar 2018 | A1 |
20180082480 | White et al. | Mar 2018 | A1 |
20180092667 | Heigl et al. | Apr 2018 | A1 |
20180092698 | Chopra et al. | Apr 2018 | A1 |
20180092699 | Finley | Apr 2018 | A1 |
20180116732 | Lin et al. | May 2018 | A1 |
20180117150 | O'Dwyer et al. | May 2018 | A1 |
20180133871 | Farmer | May 2018 | A1 |
20180153626 | Yang et al. | Jun 2018 | A1 |
20180182150 | Benishti et al. | Jun 2018 | A1 |
20180185100 | Weinstein et al. | Jul 2018 | A1 |
20180185113 | Gregerson et al. | Jul 2018 | A1 |
20180193097 | McLachlin et al. | Jul 2018 | A1 |
20180200002 | Kostrzewski et al. | Jul 2018 | A1 |
20180247128 | Alvi et al. | Aug 2018 | A1 |
20180262743 | Casas | Sep 2018 | A1 |
20180303558 | Thomas | Oct 2018 | A1 |
20180311011 | Van et al. | Nov 2018 | A1 |
20180317803 | Ben-Yishai et al. | Nov 2018 | A1 |
20180318035 | McLachlin et al. | Nov 2018 | A1 |
20180368898 | Divincenzo et al. | Dec 2018 | A1 |
20190000372 | Gullotti et al. | Jan 2019 | A1 |
20190000564 | Navab et al. | Jan 2019 | A1 |
20190015163 | Abhari et al. | Jan 2019 | A1 |
20190018235 | Ouderkirk et al. | Jan 2019 | A1 |
20190038362 | Nash et al. | Feb 2019 | A1 |
20190038365 | Soper et al. | Feb 2019 | A1 |
20190043238 | Benishti et al. | Feb 2019 | A1 |
20190043392 | Abele | Feb 2019 | A1 |
20190046272 | Zoabi et al. | Feb 2019 | A1 |
20190046276 | Inglese et al. | Feb 2019 | A1 |
20190053851 | Siemionow et al. | Feb 2019 | A1 |
20190069971 | Tripathi et al. | Mar 2019 | A1 |
20190080515 | Geri et al. | Mar 2019 | A1 |
20190105116 | Johnson et al. | Apr 2019 | A1 |
20190130792 | Rios et al. | May 2019 | A1 |
20190142519 | Siemionow et al. | May 2019 | A1 |
20190144443 | Jackson et al. | May 2019 | A1 |
20190175228 | Elimelech et al. | Jun 2019 | A1 |
20190192230 | Siemionow et al. | Jun 2019 | A1 |
20190200894 | Jung et al. | Jul 2019 | A1 |
20190201106 | Siemionow et al. | Jul 2019 | A1 |
20190216537 | Eltorai et al. | Jul 2019 | A1 |
20190254753 | Johnson et al. | Aug 2019 | A1 |
20190273916 | Benishti et al. | Sep 2019 | A1 |
20190310481 | Blum et al. | Oct 2019 | A1 |
20190333480 | Lang | Oct 2019 | A1 |
20190369660 | Wen | Dec 2019 | A1 |
20190369717 | Frielinghaus et al. | Dec 2019 | A1 |
20190387351 | Lyren et al. | Dec 2019 | A1 |
20200015895 | Frielinghaus et al. | Jan 2020 | A1 |
20200019364 | Pond | Jan 2020 | A1 |
20200020249 | Jarc et al. | Jan 2020 | A1 |
20200038112 | Amanatullah et al. | Feb 2020 | A1 |
20200043160 | Mizukura et al. | Feb 2020 | A1 |
20200078100 | Weinstein et al. | Mar 2020 | A1 |
20200085511 | Oezbek et al. | Mar 2020 | A1 |
20200088997 | Lee et al. | Mar 2020 | A1 |
20200100847 | Siegler et al. | Apr 2020 | A1 |
20200117025 | Sauer | Apr 2020 | A1 |
20200129058 | Li et al. | Apr 2020 | A1 |
20200129136 | Harding et al. | Apr 2020 | A1 |
20200129262 | Verard et al. | Apr 2020 | A1 |
20200129264 | Oativia et al. | Apr 2020 | A1 |
20200133029 | Yonezawa | Apr 2020 | A1 |
20200138518 | Lang | May 2020 | A1 |
20200138618 | Roszkowiak et al. | May 2020 | A1 |
20200143594 | Lal et al. | May 2020 | A1 |
20200146546 | Chene et al. | May 2020 | A1 |
20200151507 | Siemionow et al. | May 2020 | A1 |
20200156259 | Ruiz et al. | May 2020 | A1 |
20200159313 | Gibby et al. | May 2020 | A1 |
20200163723 | Wolf et al. | May 2020 | A1 |
20200163739 | Messinger et al. | May 2020 | A1 |
20200178916 | Lalys et al. | Jun 2020 | A1 |
20200184638 | Meglan et al. | Jun 2020 | A1 |
20200186786 | Gibby et al. | Jun 2020 | A1 |
20200188028 | Feiner et al. | Jun 2020 | A1 |
20200188034 | Lequette et al. | Jun 2020 | A1 |
20200201082 | Carabin | Jun 2020 | A1 |
20200229877 | Siemionow et al. | Jul 2020 | A1 |
20200237256 | Farshad et al. | Jul 2020 | A1 |
20200237459 | Racheli et al. | Jul 2020 | A1 |
20200237880 | Kent et al. | Jul 2020 | A1 |
20200242280 | Pavloff et al. | Jul 2020 | A1 |
20200246074 | Lang | Aug 2020 | A1 |
20200246081 | Johnson et al. | Aug 2020 | A1 |
20200264451 | Blum et al. | Aug 2020 | A1 |
20200265273 | Wei et al. | Aug 2020 | A1 |
20200275988 | Johnson et al. | Sep 2020 | A1 |
20200281554 | Trini et al. | Sep 2020 | A1 |
20200286222 | Essenreiter et al. | Sep 2020 | A1 |
20200288075 | Bonin et al. | Sep 2020 | A1 |
20200294233 | Merlet | Sep 2020 | A1 |
20200297427 | Cameron et al. | Sep 2020 | A1 |
20200305980 | Lang | Oct 2020 | A1 |
20200315734 | El Amm | Oct 2020 | A1 |
20200321099 | Holladay et al. | Oct 2020 | A1 |
20200323460 | Busza et al. | Oct 2020 | A1 |
20200323609 | Johnson et al. | Oct 2020 | A1 |
20200327721 | Siemionow et al. | Oct 2020 | A1 |
20200330179 | Ton | Oct 2020 | A1 |
20200337780 | Winkler et al. | Oct 2020 | A1 |
20200341283 | McCracken et al. | Oct 2020 | A1 |
20200352655 | Freese | Nov 2020 | A1 |
20200355927 | Marcellin-Dibon et al. | Nov 2020 | A1 |
20200360091 | Murray et al. | Nov 2020 | A1 |
20200375666 | Murphy | Dec 2020 | A1 |
20200377493 | Heiser et al. | Dec 2020 | A1 |
20200377956 | Vogelstein et al. | Dec 2020 | A1 |
20200388075 | Kazanzides et al. | Dec 2020 | A1 |
20200389425 | Bhatia et al. | Dec 2020 | A1 |
20200390502 | Holthuizen et al. | Dec 2020 | A1 |
20200390503 | Casas et al. | Dec 2020 | A1 |
20200402647 | Domracheva et al. | Dec 2020 | A1 |
20200409306 | Gelman et al. | Dec 2020 | A1 |
20200410687 | Siemionow et al. | Dec 2020 | A1 |
20200413031 | Khani et al. | Dec 2020 | A1 |
20210004956 | Book et al. | Jan 2021 | A1 |
20210009339 | Morrison et al. | Jan 2021 | A1 |
20210015560 | Boddington et al. | Jan 2021 | A1 |
20210015583 | Avisar et al. | Jan 2021 | A1 |
20210022599 | Freeman et al. | Jan 2021 | A1 |
20210022808 | Lang | Jan 2021 | A1 |
20210022811 | Mahfouz | Jan 2021 | A1 |
20210022828 | Elimelech et al. | Jan 2021 | A1 |
20210029804 | Chang | Jan 2021 | A1 |
20210030374 | Takahashi et al. | Feb 2021 | A1 |
20210030511 | Wolf et al. | Feb 2021 | A1 |
20210038339 | Yu et al. | Feb 2021 | A1 |
20210049825 | Wheelwright et al. | Feb 2021 | A1 |
20210052348 | Stifter et al. | Feb 2021 | A1 |
20210065911 | Goel et al. | Mar 2021 | A1 |
20210077195 | Saeidi et al. | Mar 2021 | A1 |
20210077210 | Itkowitz et al. | Mar 2021 | A1 |
20210080751 | Lindsey et al. | Mar 2021 | A1 |
20210090344 | Geri et al. | Mar 2021 | A1 |
20210093391 | Poltaretskyi et al. | Apr 2021 | A1 |
20210093392 | Poltaretskyi et al. | Apr 2021 | A1 |
20210093400 | Quaid et al. | Apr 2021 | A1 |
20210093417 | Liu | Apr 2021 | A1 |
20210104055 | Ni et al. | Apr 2021 | A1 |
20210107923 | Jackson et al. | Apr 2021 | A1 |
20210109349 | Schneider et al. | Apr 2021 | A1 |
20210109373 | Loo et al. | Apr 2021 | A1 |
20210110517 | Flohr et al. | Apr 2021 | A1 |
20210113269 | Vilsmeier et al. | Apr 2021 | A1 |
20210113293 | Silva et al. | Apr 2021 | A9 |
20210121238 | Palushi et al. | Apr 2021 | A1 |
20210137634 | Lang | May 2021 | A1 |
20210141887 | Kim et al. | May 2021 | A1 |
20210150702 | Claessen et al. | May 2021 | A1 |
20210157544 | Denton | May 2021 | A1 |
20210160472 | Casas | May 2021 | A1 |
20210161614 | Elimelech et al. | Jun 2021 | A1 |
20210162287 | Xing et al. | Jun 2021 | A1 |
20210165207 | Peyman | Jun 2021 | A1 |
20210169504 | Brown | Jun 2021 | A1 |
20210169578 | Calloway et al. | Jun 2021 | A1 |
20210169581 | Calloway et al. | Jun 2021 | A1 |
20210169605 | Calloway et al. | Jun 2021 | A1 |
20210186647 | Elimelech et al. | Jun 2021 | A1 |
20210196404 | Wang | Jul 2021 | A1 |
20210211640 | Bristol | Jul 2021 | A1 |
20210223577 | Zhang et al. | Jul 2021 | A1 |
20210227791 | De et al. | Jul 2021 | A1 |
20210235061 | Hegyi | Jul 2021 | A1 |
20210248822 | Choi et al. | Aug 2021 | A1 |
20210274281 | Zhang et al. | Sep 2021 | A1 |
20210278675 | Klug et al. | Sep 2021 | A1 |
20210282887 | Wiggermann | Sep 2021 | A1 |
20210290046 | Nazareth et al. | Sep 2021 | A1 |
20210290336 | Wang | Sep 2021 | A1 |
20210290394 | Mahfouz | Sep 2021 | A1 |
20210295512 | Knoplioch et al. | Sep 2021 | A1 |
20210298835 | Wang | Sep 2021 | A1 |
20210306599 | Pierce | Sep 2021 | A1 |
20210311322 | Belanger et al. | Oct 2021 | A1 |
20210314502 | Liu | Oct 2021 | A1 |
20210315636 | Akbarian et al. | Oct 2021 | A1 |
20210315662 | Freeman et al. | Oct 2021 | A1 |
20210325684 | Ninan et al. | Oct 2021 | A1 |
20210332447 | Lubelski et al. | Oct 2021 | A1 |
20210333561 | Oh et al. | Oct 2021 | A1 |
20210341739 | Cakmakci | Nov 2021 | A1 |
20210341740 | Cakmakci | Nov 2021 | A1 |
20210346115 | Dulin et al. | Nov 2021 | A1 |
20210349677 | Baldev et al. | Nov 2021 | A1 |
20210364802 | Uchiyama | Nov 2021 | A1 |
20210369226 | Siemionow et al. | Dec 2021 | A1 |
20210371413 | Thurston et al. | Dec 2021 | A1 |
20210373333 | Moon | Dec 2021 | A1 |
20210373344 | Loyola et al. | Dec 2021 | A1 |
20210378757 | Bay et al. | Dec 2021 | A1 |
20210386482 | Gera et al. | Dec 2021 | A1 |
20210389590 | Freeman et al. | Dec 2021 | A1 |
20210400247 | Casas | Dec 2021 | A1 |
20210401533 | Im | Dec 2021 | A1 |
20210402255 | Fung | Dec 2021 | A1 |
20210405369 | King | Dec 2021 | A1 |
20220003992 | Ahn | Jan 2022 | A1 |
20220007006 | Healy et al. | Jan 2022 | A1 |
20220008135 | Frielinghaus et al. | Jan 2022 | A1 |
20220038675 | Hegyi | Feb 2022 | A1 |
20220039873 | Harris | Feb 2022 | A1 |
20220051484 | Jones et al. | Feb 2022 | A1 |
20220054199 | Sivaprakasam et al. | Feb 2022 | A1 |
20220061921 | Crawford et al. | Mar 2022 | A1 |
20220071712 | Wolf et al. | Mar 2022 | A1 |
20220079675 | Lang | Mar 2022 | A1 |
20220087746 | Lang | Mar 2022 | A1 |
20220113810 | Isaacs et al. | Apr 2022 | A1 |
20220117669 | Nikou et al. | Apr 2022 | A1 |
20220121041 | Hakim | Apr 2022 | A1 |
20220133484 | Lang | May 2022 | A1 |
20220142730 | Wolf et al. | May 2022 | A1 |
20220155861 | Myung et al. | May 2022 | A1 |
20220159227 | Quiles Casas | May 2022 | A1 |
20220179209 | Cherukuri | Jun 2022 | A1 |
20220192776 | Gibby et al. | Jun 2022 | A1 |
20220193453 | Miyazaki et al. | Jun 2022 | A1 |
20220201274 | Achilefu et al. | Jun 2022 | A1 |
20220245400 | Siemionow et al. | Aug 2022 | A1 |
20220245821 | Ouzounis | Aug 2022 | A1 |
20220269077 | Adema | Aug 2022 | A1 |
20220270263 | Junio | Aug 2022 | A1 |
20220287676 | Steines et al. | Sep 2022 | A1 |
20220292786 | Pelzl et al. | Sep 2022 | A1 |
20220295033 | Quiles Casas | Sep 2022 | A1 |
20220304768 | Elimelech et al. | Sep 2022 | A1 |
20220351385 | Finley et al. | Nov 2022 | A1 |
20220358759 | Cork et al. | Nov 2022 | A1 |
20220392085 | Finley et al. | Dec 2022 | A1 |
20220405935 | Flossmann et al. | Dec 2022 | A1 |
20230009793 | Gera et al. | Jan 2023 | A1 |
20230027801 | Qian et al. | Jan 2023 | A1 |
20230034189 | Gera et al. | Feb 2023 | A1 |
20230073041 | Samadani et al. | Mar 2023 | A1 |
20230149083 | Lin et al. | May 2023 | A1 |
20230290037 | Tasse et al. | Sep 2023 | A1 |
20230295302 | Bhagavatheeswaran et al. | Sep 2023 | A1 |
20230316550 | Hiasa | Oct 2023 | A1 |
20230329799 | Gera et al. | Oct 2023 | A1 |
20230329801 | Elimelech et al. | Oct 2023 | A1 |
20230371984 | Leuthardt et al. | Nov 2023 | A1 |
20230372053 | Elimelech et al. | Nov 2023 | A1 |
20230372054 | Elimelech et al. | Nov 2023 | A1 |
20230377175 | Seok | Nov 2023 | A1 |
20230379448 | Benishti et al. | Nov 2023 | A1 |
20230379449 | Benishti et al. | Nov 2023 | A1 |
20230386153 | Rybnikov et al. | Nov 2023 | A1 |
20230397349 | Capelli et al. | Dec 2023 | A1 |
20230397957 | Crawford et al. | Dec 2023 | A1 |
20230410445 | Elimelech et al. | Dec 2023 | A1 |
20240008935 | Wolf et al. | Jan 2024 | A1 |
20240016549 | Johnson et al. | Jan 2024 | A1 |
20240016572 | Elimelech et al. | Jan 2024 | A1 |
20240020831 | Johnson et al. | Jan 2024 | A1 |
20240020840 | Johnson et al. | Jan 2024 | A1 |
20240020862 | Johnson et al. | Jan 2024 | A1 |
20240022704 | Benishti et al. | Jan 2024 | A1 |
20240023946 | Wolf et al. | Jan 2024 | A1 |
20240041558 | Siewerdsen et al. | Feb 2024 | A1 |
Number | Date | Country |
---|---|---|
3022448 | Feb 2018 | CA |
3034314 | Feb 2018 | CA |
101379412 | Mar 2009 | CN |
103106348 | May 2013 | CN |
111915696 | Nov 2020 | CN |
112489047 | Mar 2021 | CN |
202004011567 | Nov 2004 | DE |
102004011567 | Sep 2005 | DE |
102014008153 | Oct 2014 | DE |
202022103168 | Jun 2022 | DE |
0933096 | Aug 1999 | EP |
1640750 | Mar 2006 | EP |
1757974 | Feb 2007 | EP |
2119397 | Nov 2009 | EP |
2134847 | Dec 2009 | EP |
2557998 | Feb 2013 | EP |
2823463 | Jan 2015 | EP |
2868277 | May 2015 | EP |
2891966 | Jul 2015 | EP |
2963616 | Jan 2016 | EP |
3028258 | Jun 2016 | EP |
3034607 | Jun 2016 | EP |
3037038 | Jun 2016 | EP |
3069318 | Sep 2016 | EP |
3076660 | Oct 2016 | EP |
3121789 | Jan 2017 | EP |
3123970 | Feb 2017 | EP |
2654749 | May 2017 | EP |
3175815 | Jun 2017 | EP |
3216416 | Sep 2017 | EP |
2032039 | Oct 2017 | EP |
3224376 | Oct 2017 | EP |
3247297 | Nov 2017 | EP |
3256213 | Dec 2017 | EP |
3306567 | Apr 2018 | EP |
2030193 | Jul 2018 | EP |
2225723 | Feb 2019 | EP |
2892558 | Apr 2019 | EP |
2635299 | Jul 2019 | EP |
3505050 | Jul 2019 | EP |
2875149 | Dec 2019 | EP |
3593227 | Jan 2020 | EP |
3634294 | Apr 2020 | EP |
3206583 | Sep 2020 | EP |
3711700 | Sep 2020 | EP |
2625845 | Mar 2021 | EP |
3789965 | Mar 2021 | EP |
3858280 | Aug 2021 | EP |
3913423 | Nov 2021 | EP |
3952331 | Feb 2022 | EP |
3960235 | Mar 2022 | EP |
4173590 | May 2023 | EP |
4252695 | Oct 2023 | EP |
4270313 | Nov 2023 | EP |
4287120 | Dec 2023 | EP |
2507314 | Apr 2014 | GB |
10-2014-0120155 | Oct 2014 | KR |
0334705 | Apr 2003 | WO |
2006002559 | Jan 2006 | WO |
2007051304 | May 2007 | WO |
2007115826 | Oct 2007 | WO |
2008103383 | Aug 2008 | WO |
2010067267 | Jun 2010 | WO |
2010074747 | Jul 2010 | WO |
2012061537 | May 2012 | WO |
2012101286 | Aug 2012 | WO |
2013112554 | Aug 2013 | WO |
2014014498 | Jan 2014 | WO |
2014024188 | Feb 2014 | WO |
2014037953 | Mar 2014 | WO |
2014113455 | Jul 2014 | WO |
2014125789 | Aug 2014 | WO |
2014167563 | Oct 2014 | WO |
2014174067 | Oct 2014 | WO |
2015058816 | Apr 2015 | WO |
2015061752 | Apr 2015 | WO |
2015109145 | Jul 2015 | WO |
2016151506 | Sep 2016 | WO |
2018052966 | Mar 2018 | WO |
2018073452 | Apr 2018 | WO |
2018200767 | Nov 2018 | WO |
2018206086 | Nov 2018 | WO |
2019083431 | May 2019 | WO |
2019135209 | Jul 2019 | WO |
2019161477 | Aug 2019 | WO |
2019195926 | Oct 2019 | WO |
2019210353 | Nov 2019 | WO |
2019211741 | Nov 2019 | WO |
2020109903 | Jun 2020 | WO |
2020109904 | Jun 2020 | WO |
2021017019 | Feb 2021 | WO |
2021019369 | Feb 2021 | WO |
2021021979 | Feb 2021 | WO |
2021023574 | Feb 2021 | WO |
2021046455 | Mar 2021 | WO |
2021048158 | Mar 2021 | WO |
2021061459 | Apr 2021 | WO |
2021062375 | Apr 2021 | WO |
2021073743 | Apr 2021 | WO |
2021087439 | May 2021 | WO |
2021091980 | May 2021 | WO |
2021112918 | Jun 2021 | WO |
2021130564 | Jul 2021 | WO |
2021137752 | Jul 2021 | WO |
2021141887 | Jul 2021 | WO |
2021145584 | Jul 2021 | WO |
2021154076 | Aug 2021 | WO |
2021183318 | Sep 2021 | WO |
2021188757 | Sep 2021 | WO |
2021255627 | Dec 2021 | WO |
2021257897 | Dec 2021 | WO |
2021258078 | Dec 2021 | WO |
2022009233 | Jan 2022 | WO |
2022053923 | Mar 2022 | WO |
2022079565 | Apr 2022 | WO |
2023281395 | Jan 2023 | WO |
2023007418 | Feb 2023 | WO |
2023011924 | Feb 2023 | WO |
2023021448 | Feb 2023 | WO |
2023021450 | Feb 2023 | WO |
2023021451 | Feb 2023 | WO |
2023026229 | Mar 2023 | WO |
2023047355 | Mar 2023 | WO |
2023072887 | May 2023 | WO |
2023088986 | May 2023 | WO |
2023163933 | Aug 2023 | WO |
2023186996 | Oct 2023 | WO |
2023205212 | Oct 2023 | WO |
2023209014 | Nov 2023 | WO |
2023232492 | Dec 2023 | WO |
2023240912 | Dec 2023 | WO |
2024013642 | Jan 2024 | WO |
2024018368 | Jan 2024 | WO |
Entry |
---|
U.S. Appl. No. 16/159,740 U.S. Pat. No. 10,382,748, filed Oct. 15, 2018 Aug. 13, 2019, Combining Video-Based and Optic-Based Augmented Reality in a Near Eye Display. |
U.S. Appl. No. 16/419,023 U.S. Pat. No. 11,750,794, filed May 22, 2019 Sep. 5, 2023, Combining Video-Based and Optic-Based Augmented Reality in a Near Eye Display. |
U.S. Appl. No. 18/352,158, filed Jul. 13, 2023, Combining Video-Based and Optic-Based Augmented Reality in a Near Eye Display. |
U.S. Appl. No. 18/365,643, filed Aug. 4, 2023, Head-Mounted Augmented Reality Near Eye Display Device. |
U.S. Appl. No. 18/365,650, filed Aug. 4, 2023, Systems for Facilitating Augmented Reality-Assisted Medical Procedures. |
U.S. Appl. No. 15/127,423 U.S. Pat. No. 9,928,629, filed Sep. 20, 2016 Mar. 27, 2018, Combining Video-Based and Optic-Based Augmented Reality in a Near Eye Display. |
U.S. Appl. No. 16/120,480 U.S. Pat. No. 10,835,296, filed Sep. 4, 2018 Nov. 17, 2020, Spinous Process Clamp. |
U.S. Appl. No. 17/067,831, filed Oct. 12, 2020, Spinous Process Clamp. |
U.S. Appl. No. 18/030,072, filed Apr. 4, 2023, Spinous Process Clamp. |
U.S. Appl. No. 18/365,590, filed Aug. 4, 2023, Registration of a Fiducial Marker for an Augmented Reality System. |
U.S. Appl. No. 18/365,571, filed Aug. 4, 2023, Registration Marker for an Augmented Reality System. |
U.S. Appl. No. 17/045,766, filed Oct. 7, 2020, Registration of a Fiducial Marker for an Augmented Reality System. |
U.S. Appl. No. 16/199,281 U.S. Pat. No. 10,939,977, filed Nov. 26, 2018 Mar. 9, 2021, Positioning Marker. |
U.S. Appl. No. 16/524,258, filed Jul. 29, 2019, Fiducial Marker. |
U.S. Appl. No. 17/585,629, filed Jan. 27, 2022, Fiducial Marker. |
U.S. Appl. No. 16/724,297 U.S. Pat. No. 11,382,712, filed Dec. 22, 2019 Jul. 12, 2022, Mirroring in Image Guided Surgery. |
U.S. Appl. No. 17/827,710 U.S. Pat. No. 11,801,115, filed May 29, 2022 Oct. 31, 2023, Mirroring in Image Guided Surgery. |
U.S. Appl. No. 18/352,181, filed Jul. 13, 2023, Mirroring in Image Guided Surgery. |
U.S. Appl. No. 18/400,739, filed Dec. 29, 2023, Mirroring in Image Guided Surgery. |
U.S. Appl. No. 16/200,144 U.S. Pat. No. 11,766,296, filed Nov. 26, 2018 Sep. 26, 2023, Tracking System for Image-Guided Surgery. |
U.S. Appl. No. 18/470,809, filed Sep. 20, 2023, Tracking Methods for Image-Guided Surgery. |
U.S. Appl. No. 17/015,199, filed Sep. 9, 2020, Universal Tool Adapter. |
U.S. Appl. No. 18/598,965, filed Mar. 7, 2024, Universal Tool Adapter for Image Guided Surgery. |
U.S. Appl. No. 18/044,380, filed Mar. 8, 2023, Universal Tool Adapter for Image-Guided Surgery. |
U.S. Appl. No. 16/901,026 U.S. Pat. No. 11,389,252, filed Jun. 15, 2020 Jul. 19, 2022, Rotating Marker for Image Guided Surgery. |
U.S. Appl. No. 18/008,980, filed Dec. 8, 2022, Rotating Marker. |
U.S. Appl. No. 17/368,859 U.S. Pat. No. 11,896,445, filed Jul. 7, 2021 Feb. 13, 2024, Iliac Pin and Adapter. |
U.S. Appl. No. 18/437,898, filed Feb. 9, 2024, Iliac Pin and Adapter. |
U.S. Appl. No. 18/576,516, filed Jan. 4, 2024, Iliac Pin and Adapter. |
U.S. Appl. No. 17/388,064, filed Jul. 29, 2021, Rotating Marker and Adapter for Image-Guided Surgery. |
U.S. Appl. No. 18/291,731, filed Jan. 24, 2024, Rotating Marker and Adapter for Image-Guided Surgery. |
U.S. Appl. No. 18/365,844, filed Aug. 4, 2023, Augmented-Reality Surgical System Using Depth Sensing. |
U.S. Appl. No. 18/683,676, filed Feb. 14, 2024, Stereoscopic Display and Digital Loupe for Augmented-Reality Near-Eye Display. |
U.S. Appl. No. 18/683,680, filed Feb. 14, 2024, Augmented Reality Assistance for Osteotomy and Discectomy. |
U.S. Appl. No. 18/684,756, filed Feb. 19, 2024, Registration and Registration Validation in Image-Guided Surgery. |
U.S. Appl. No. 18/365,566, filed Aug. 4, 2023, Systems for Medical Image Visualization. |
U.S. Appl. No. 18/399,253, filed Dec. 28, 2023, Methods for Medical Image Visualization. |
U.S. Appl. No. 18/398,837, filed Dec. 28, 2023, Adjustable Augmented Reality Eyewear for Image-Guided Medical Intervention. |
U.S. Appl. No. 35/508,942 U.S. Pat. No. D. 930,162, filed Feb. 13, 2020 Sep. 7, 2021, Medical Headset. |
U.S. Appl. No. 15/896,102 U.S. Pat. No. 10,134,166, filed Feb. 14, 2018 Nov. 20, 2018, Combining Video-Based and Optic-Based Augmented Reality in a Near Eye Display. |
U.S. Appl. No. 18/693,338, filed Mar. 19, 2024, Surgical Planning and Display. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/IB2023/059049, mailed on Feb. 8, 2024, 24 pages. |
16 Augmented Reality Glasses of 2021 (with Features), dated May 6, 2022, accessed at https://web.archive.org/web/20221127195438/https://circuitstream.com/blog/16-augmented-reality-glasses-of-2021-with-features-breakdowns/. |
Vuzix Blades, Prescription Lens Installation Guide, copyright 2020. |
Frames Direct, InSpatialRx Prescription Insert, Prescription Insert for Magic Leap 1, accessed Mar. 8, 2024 at https://www.framesdirect.com/inspatialrx-prescription-insert.html. |
Everysight, Installing your RX Adaptor, accessed Mar. 13, 2024 at https://support.everysight.com/hc/en-us/articles/115000984571-Installing-your-RX-Adaptor. |
Reddit, Notice on Prescription Lenses for Nreal Glasses, accessed Mar. 13, 2024 at https://www.reddit.com/r/nreal/comments/x1fte5/notice_on_prescription_lenses_for_nreal_glasses/. |
Everysight, Raptor User Manual, copyright 2017, in 46 pages. |
Number | Date | Country | |
---|---|---|---|
20240126087 A1 | Apr 2024 | US |
Number | Date | Country | |
---|---|---|---|
63405901 | Sep 2022 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 18398837 | Dec 2023 | US |
Child | 18399433 | US | |
Parent | PCT/IB2023/059049 | Sep 2023 | WO |
Child | 18398837 | US |