Tracking system for image-guided surgery

Information

  • Patent Grant
  • Patent Number
    11,766,296
  • Date Filed
    Monday, November 26, 2018
  • Date Issued
    Tuesday, September 26, 2023
  • Inventors
  • Original Assignees
    • AUGMEDICS LTD.
  • Examiners
    • Fernandez; Katherine L
    • Bruce; Farouk A
  • Agents
    • KNOBBE, MARTENS, OLSON & BEAR, LLP
Abstract
Apparatus and methods are described including tracking a tool portion and a patient marker from a first line of sight, using a first tracking device disposed upon a first head-mounted device that includes a display. The tool portion and the patient marker are tracked from a second line of sight, using a second tracking device. When a portion of the patient marker and the tool portion are both within the first line of sight, an augmented reality image is generated upon the first display based upon data received from the first tracking device and without using data from the second tracking device. When at least the patient marker portion and the tool portion are not both within the first line of sight, a virtual image of the tool and anatomy of the patient is generated using data received from the second tracking device. Other applications are also described.
Description
FIELD OF EMBODIMENTS OF THE INVENTION

The present invention relates generally to an augmented reality system, and specifically to a tracking system for an augmented reality system that is used to perform image-guided surgery.


BACKGROUND

A head-mounted display is sometimes used as part of an augmented reality system. The display is used to generate an augmented reality scene, in which a scene that is being viewed by a user of the head-mounted display is altered, typically by being augmented or supplemented. The alteration is computer generated, and typically involves presenting real-time video, and/or non-real-time images, to the user while the user is gazing at the scene.


In some cases, an augmented reality system is used for performing image-guided surgery, as part of a medical procedure. For example, a computer-generated image may be presented to a healthcare professional who is performing the procedure. The image may be presented on a head-mounted display such that the image is aligned with an anatomical portion of a patient who is undergoing the procedure. Although some misalignment of the image with the patient's body may be acceptable, for satisfactory presentation of the images the misalignment may typically not be more than about 2-3 mm. In order to account for such a limit on the misalignment of the patient's anatomy with the presented images, the position of the patient's body or a portion thereof is typically tracked.


In some cases, an image of a tool that is used to perform the procedure is incorporated into the image that is displayed on the head-mounted display. In order to incorporate an image of the tool into the image, in a manner in which the position of the tool with respect to the image and/or the patient's anatomy is accurately reflected, the position of the tool or a portion thereof is typically tracked.


Triangulation techniques are commonly employed for tracking positions of a patient's body and/or a tool. In such techniques, a plurality of imaging devices, which are disposed at known locations with respect to each other, are used to detect a feature (such as a marker) on the patient's body, and/or on the tool. The location of the feature is then derived, using a combination of the known locations of the imaging devices, as well as the location of the feature as detected by each of the imaging devices.
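
For readers unfamiliar with the technique, the following Python sketch illustrates one common form of such triangulation: each imaging device contributes a ray from its known position toward the detected feature, and the feature location is estimated as the midpoint of the shortest segment joining the two rays. The function name and all numerical values are illustrative assumptions, not part of the patented system.

```python
# Hypothetical sketch: triangulating a feature's 3D position from two imaging
# devices at known locations (midpoint-of-closest-approach between two rays).
import numpy as np

def triangulate(origin_a, dir_a, origin_b, dir_b):
    """Return the midpoint of the shortest segment joining two rays.

    origin_*: 3-vector camera center; dir_*: unit viewing direction toward
    the detected feature (both expressed in a common world frame).
    """
    w0 = origin_a - origin_b
    a = np.dot(dir_a, dir_a)
    b = np.dot(dir_a, dir_b)
    c = np.dot(dir_b, dir_b)
    d = np.dot(dir_a, w0)
    e = np.dot(dir_b, w0)
    denom = a * c - b * b            # approaches 0 when the rays are parallel
    s = (b * e - c * d) / denom      # parameter along ray A
    t = (a * e - b * d) / denom      # parameter along ray B
    p_a = origin_a + s * dir_a       # closest point on ray A
    p_b = origin_b + t * dir_b       # closest point on ray B
    return 0.5 * (p_a + p_b)         # estimated feature location

# Example: two cameras 30 cm apart, both observing a point near (0.1, 0, 1.0) m.
cam_a = np.array([0.0, 0.0, 0.0])
cam_b = np.array([0.3, 0.0, 0.0])
target = np.array([0.1, 0.0, 1.0])
ray_a = (target - cam_a) / np.linalg.norm(target - cam_a)
ray_b = (target - cam_b) / np.linalg.norm(target - cam_b)
print(triangulate(cam_a, ray_a, cam_b, ray_b))  # approximately [0.1, 0.0, 1.0]
```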


SUMMARY OF EMBODIMENTS

In accordance with some applications of the present invention, a first healthcare professional (e.g., a surgeon performing a procedure) wears a first head-mounted device. Typically, the head-mounted device includes one or more head-mounted displays. For some applications, the head-mounted displays are generally similar to those described in U.S. Pat. No. 9,928,629 to Benishti, which is incorporated herein by reference. For example, the head-mounted displays may include a combiner that is controlled by a computer processor, such as to display an augmented reality image to the healthcare professional. For some applications, the image is presented on the head-mounted display such that (a) a computer-generated image is projected onto a first portion of the display, and (b) the computer-generated image is aligned with an anatomical portion of a patient who is undergoing the procedure, with the anatomical portion of a patient visible through a second portion of the display. Typically, the computer-generated image includes a virtual image of the tool overlaid upon a virtual image of the patient's anatomy. For some applications, a portion of the tool that would not otherwise be visible to the healthcare professional (for example, by virtue of being hidden by the patient's anatomy) is included in the computer-generated image.


Typically, the head-mounted device includes a tracking device that is configured to facilitate determination of the location and orientation of the head-mounted device with respect to a portion of the patient's body (e.g., the patient's back), and/or the position and orientation of the tool with respect to the patient's body. For example, the tracking device may include an image-capturing device, such as a camera, that is configured to image a patient marker and/or a tool marker. Typically, the patient marker is configured to provide data that is sufficient for the computer processor to determine the location and orientation of the head-mounted device with respect to the portion of the patient's body using data collected from a single tracking device that is disposed on the head-mounted display. For example, the patient marker may include an array of elements that is visible to the tracking device of the head-mounted device, and that is configured such that at any location and orientation of the head-mounted device with respect to the patient marker, the array of elements has an appearance that is unique to that location and orientation. In this manner, the computer processor is able to determine the location and orientation of the head-mounted device with respect to the portion of the patient's body without requiring the use of triangulation techniques. Typically, a single camera is used in the tracking device of the head-mounted device. For some applications, the camera is a high-speed camera. For example, the camera may acquire more than 50 frames per second.
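
One standard way to recover a pose from a single camera view of a marker whose element layout is known is a perspective-n-point (PnP) solve. The following sketch, assuming OpenCV's cv2.solvePnP together with illustrative marker geometry, camera intrinsics, and a synthetic ground-truth pose, shows the idea; it is not the patent's own algorithm, and every value below is a placeholder.

```python
# Hypothetical sketch: recovering the head-mounted device's pose relative to the
# patient marker from a single camera image, using a perspective-n-point solve.
import numpy as np
import cv2

# Known 3D layout of the marker's elements, in the marker's own frame (meters).
# Two elements are out of plane, which helps make the appearance pose-dependent.
marker_points_3d = np.array([
    [0.00, 0.00, 0.000],
    [0.05, 0.00, 0.000],
    [0.05, 0.04, 0.000],
    [0.00, 0.04, 0.000],
    [0.01, 0.02, 0.015],
    [0.04, 0.02, 0.015],
], dtype=np.float64)

# Intrinsic calibration of the tracking camera (focal length, principal point).
camera_matrix = np.array([[900.0, 0.0, 640.0],
                          [0.0, 900.0, 360.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)  # assume a pre-rectified image

# Synthesize the 2D detections from an assumed ground-truth marker pose.
true_rvec = np.array([0.1, -0.2, 0.05])
true_tvec = np.array([0.02, -0.01, 0.60])     # marker ~60 cm in front of the camera
detected_points_2d, _ = cv2.projectPoints(marker_points_3d, true_rvec, true_tvec,
                                          camera_matrix, dist_coeffs)

# Single-camera pose solve: no second camera and no triangulation required.
ok, rvec, tvec = cv2.solvePnP(marker_points_3d, detected_points_2d,
                              camera_matrix, dist_coeffs)

# rvec/tvec give the marker's pose in the camera frame; inverting it yields the
# camera's (and hence the head-mounted device's) position relative to the marker.
rotation, _ = cv2.Rodrigues(rvec)
camera_in_marker_frame = -rotation.T @ tvec
print(ok, camera_in_marker_frame.ravel())
```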


Typically, in order to generate the augmented reality image upon the head-mounted display, a computer processor determines the location and orientation of the head-mounted device with respect to a portion of the patient's body (e.g., the patient's back), and/or the position and orientation of the tool with respect to the portion of the patient's body. As described hereinabove, in general, the patient marker is configured to provide data that is sufficient for the computer processor to determine the location and orientation of the head-mounted device with respect to the portion of the patient's body using data collected from a single tracking device that is disposed on the head-mounted device. However, for some applications, at least under certain conditions, the computer processor is configured to incorporate tracking data that is received from at least one additional tracking device (i.e., a tracking device in addition to the tracking device that is included in the head-mounted device of the first healthcare professional), in order to generate the image upon the head-mounted display of the first healthcare professional.


For some such applications, the computer processor is configured to incorporate the additional data in cases in which the first tracking device that is included in the head-mounted device of the first healthcare professional loses its line of sight with the patient marker and/or the tool marker and/or portions thereof. For example, the computer processor may be configured to receive data from a tracking device of an additional head-mounted device that is configured to be worn by an additional healthcare professional who is present in the procedure (e.g., an accompanying surgeon or a nurse). Typically, the additional head-mounted device is generally similar to the first head-mounted device, and the tracking device of the additional head-mounted device is generally similar to that of the first head-mounted device. For some applications, when at least a portion of the patient marker and a portion of the tool (e.g., the tool marker) are both within the line of sight of the first tracking device, the computer processor generates an augmented reality image upon the head-mounted display, based upon data received from the first tracking device and without using data received from the additional tracking device. When at least the portion of the patient marker and the portion of the tool are not both within the line of sight of the first tracking device, the computer processor generates an augmented reality image upon the first head-mounted display, at least partially based upon data received from the additional tracking device.
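
The following minimal Python sketch captures the selection rule described above, under the assumption that each tracking device reports which markers it currently sees; the Observation structure, the function name, and the returned labels are illustrative, not taken from the patent.

```python
# Hypothetical sketch of the source-selection rule: the first head-mounted device's
# own tracking data is used exclusively while it sees both the patient marker and
# the tool; otherwise data from an additional tracking device is incorporated.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Observation:
    marker_visible: bool          # patient marker (or a portion of it) in line of sight
    tool_visible: bool            # tool / tool marker in line of sight
    marker_pose: Optional[object] = None   # pose estimate in the observing device's frame
    tool_pose: Optional[object] = None

def select_tracking_source(first: Observation, second: Observation) -> str:
    """Return which device(s) should drive the image on the first head-mounted display."""
    if first.marker_visible and first.tool_visible:
        return "first_only"       # augmented reality image from the first device alone
    if second.marker_visible and second.tool_visible:
        return "use_second"       # virtual image at least partially from the second device
    return "degraded"             # neither device sees both; hold the last known state

# Example: the surgeon's hand blocks the patient marker from the first device.
print(select_tracking_source(Observation(False, True), Observation(True, True)))
# -> "use_second"
```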


There is therefore provided, in accordance with some applications of the present invention, a method for use with a tool configured to be placed within a portion of a body of a patient, the method including:

    • tracking at least a portion of the tool and a patient marker that is placed upon the patient's body from a first line of sight, using a first tracking device that is disposed upon a first head-mounted device that is worn by a first person, the first head-mounted device including a first head-mounted display;
    • tracking at least the portion of the tool and the patient marker, from a second line of sight, using a second tracking device; and
    • using at least one computer processor:
      • when at least a portion of the patient marker and the portion of the tool are both within the first line of sight, generating an augmented reality image upon the first head-mounted display based upon data received from the first tracking device and without using data from the second tracking device, the augmented reality image including (a) a virtual image of the tool and anatomy of the patient, overlaid upon (b) the patient's body; and
      • when at least the portion of the patient marker and the portion of the tool are not both within the first line of sight, generating a virtual image of the tool and anatomy of the patient upon the first head-mounted display, at least partially based upon data received from the second tracking device.


In some applications, tracking the portion of the tool includes tracking a tool marker. In some applications, tracking at least the portion of the tool and the patient marker, from the second line of sight, using the second tracking device, includes tracking at least the portion of the tool and the patient marker from the second line of sight, using a second tracking device that is disposed in a stationary position. In some applications, tracking at least the portion of the tool and the patient marker using the first tracking device includes tracking at least the portion of the tool and the patient marker using a first camera, and tracking at least the portion of the tool and the patient marker using the second tracking device includes tracking at least the portion of the tool and the patient marker using a second camera.


In some applications, generating the virtual image of the tool and anatomy of the patient upon the first head-mounted display, at least partially based upon data received from the second tracking device includes (see the coordinate-frame sketch following this list):

    • in response to the portion of the patient marker being within the first line of sight, and the portion of the tool not being within the first line of sight:
      • determining a position of the tool with respect to the subject's anatomy using data received from the second tracking device;
      • generating the virtual image of the tool and anatomy of the patient upon the first head-mounted display, based upon the determined position of the tool with respect to the subject's anatomy;
      • determining a position of the patient's body with respect to the first head-mounted device based upon data received from the first tracking device; and
      • overlaying the virtual image upon the patient's body, based upon the determined position of the patient's body with respect to the first head-mounted device.
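
As referenced above, the steps in this list amount to composing two rigid transforms: the tool's pose in the patient frame, obtained from the second tracking device, and the patient's pose in the first head-mounted device's frame, obtained from the first tracking device. The sketch below shows that composition with 4x4 homogeneous matrices; the identity rotations, translations, and helper name are illustrative assumptions.

```python
# Hypothetical sketch: composing the two measurements listed above so that the
# virtual tool can be drawn in the first head-mounted display's frame.
import numpy as np

def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# From the second tracking device: tool pose in the patient (marker) frame.
T_patient_tool = make_transform(np.eye(3), np.array([0.02, -0.01, 0.08]))

# From the first tracking device: patient (marker) pose in the first HMD's frame.
T_hmd_patient = make_transform(np.eye(3), np.array([0.00, -0.15, 0.45]))

# Composition: tool pose in the first HMD's frame, used to overlay the virtual tool
# so that it stays registered to the patient's body as seen through the display.
T_hmd_tool = T_hmd_patient @ T_patient_tool
print(T_hmd_tool[:3, 3])   # tool position expressed in the first head-mounted device frame
```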


In some applications, generating the virtual image of the tool and anatomy of the patient upon the first head-mounted display, at least partially based upon data received from the second tracking device includes:

    • in response to the portion of the tool being within the first line of sight, and the portion of the patient marker not being within the first line of sight:
      • determining a position of the tool with respect to the subject's anatomy using data received from the second tracking device;
      • generating the virtual image of the tool and anatomy of the patient upon the first head-mounted display, based upon the determined position of the tool with respect to the subject's anatomy.


In some applications, generating the virtual image of the tool and anatomy of the patient upon the first head-mounted display further includes overlaying the virtual image upon the patient's body, based upon a position of the patient's body with respect to the first head-mounted device as determined based upon data received from the first tracking device at a time when the portion of the patient marker was within the first line of sight.


In some applications, overlaying the virtual image upon the patient's body includes tracking movements of the head-mounted device between the time when the portion of the patient marker was within the first line of sight and the portion of the patient marker not being within the first line of sight, using an inertial-measurement unit disposed upon the first head-mounted device.


In some applications, generating the virtual image of the tool and anatomy of the patient upon the first head-mounted display, at least partially based upon data received from the second tracking device includes:

    • in response to the portion of the tool and the portion of the patient marker both not being within the first line of sight:
      • determining a position of the tool with respect to the subject's anatomy using data received from the second tracking device;
      • generating the virtual image of the tool and anatomy of the patient upon the first head-mounted display, based upon the determined position of the tool with respect to the subject's anatomy.


In some applications, generating the virtual image of the tool and anatomy of the patient upon the first head-mounted display further includes overlaying the virtual image upon the patient's body, based upon a position of the patient's body with respect to the first head-mounted device as determined based upon data received from the first tracking device at a time when the portion of the patient marker was within the first line of sight. In some applications, overlaying the virtual image upon the patient's body includes tracking movements of the head-mounted device between the time when the portion of the patient marker was within the first line of sight and the portion of the patient marker not being within the first line of sight, using an inertial-measurement unit disposed upon the first head-mounted device.


In some applications, tracking at least the portion of the tool and the patient marker, from the second line of sight, using the second tracking device, includes tracking at least the portion of the tool and the patient marker from the second line of sight, using a second tracking device that is disposed upon a second head-mounted device that is worn by a second person. In some applications, the second head-mounted device includes a second head-mounted display, the method further including generating a further augmented-reality image upon the second head-mounted display.


There is further provided, in accordance with some applications of the present invention, apparatus for use with a tool configured to be placed within a portion of a body of a patient, the apparatus including:

    • a patient marker configured to be placed upon the patient's body;
    • a first head-mounted device including a first head-mounted display, and a first tracking device that is configured to track at least a portion of the tool and the patient marker from a first line of sight;
    • a second tracking device that is configured to track at least the portion of the tool and the patient marker from a second line of sight; and
    • at least one computer processor configured:
      • when at least a portion of the patient marker and the portion of the tool are both within the first line of sight, to generate an augmented reality image upon the first head-mounted display, based upon data received from the first tracking device and without using data from the second tracking device, the augmented reality image including (a) a virtual image of the tool and anatomy of the patient, overlaid upon (b) the patient's body; and
      • when at least the portion of the patient marker and the portion of the tool are not both within the first line of sight, to generate a virtual image of the tool and anatomy of the patient upon the first head-mounted display, at least partially based upon data received from the second tracking device.


There is further provided, in accordance with some applications of the present invention, a method for use with a tool configured to be placed within a portion of a body of a patient, the method including:

    • tracking at least a portion of the tool and a patient marker that is placed upon the patient's body from a first line of sight, using a first tracking device that is disposed upon a first head-mounted device that is worn by a first person, the first head-mounted device including a first head-mounted display;
    • tracking at least the portion of the tool and the patient marker from a second line of sight, using a second tracking device that is disposed upon a second head-mounted device that is worn by a second person; and
    • using at least one computer processor, generating an augmented reality image upon the first head-mounted display, based upon data received from the first tracking device in combination with data received from the second tracking device, the augmented reality image including (a) a virtual image of the tool and anatomy of the patient, overlaid upon (b) the patient's body.


In some applications, tracking the portion of the tool includes tracking a tool marker. In some applications, the second head-mounted device includes a second head-mounted display, the method further including generating a further augmented-reality image upon the second head-mounted display. In some applications, tracking at least the portion of the tool and the patient marker using the first tracking device includes tracking at least the portion of the tool and the patient marker using a first camera, and tracking at least the portion of the tool and the patient marker using the second tracking device includes tracking at least the portion of the tool and the patient marker using a second camera.


In some applications, generating the augmented reality image upon the first head-mounted display includes:

    • determining a position of the tool with respect to the subject's anatomy using data received from the first tracking device in combination with data received from the second tracking device;
    • generating the virtual image of the tool and anatomy of the patient upon the first head-mounted display, based upon the determined position of the tool with respect to the subject's anatomy;
    • determining a position of the patient's body with respect to the first head-mounted device based upon data received from the first tracking device; and
    • overlaying the virtual image upon the patient's body, based upon the determined position of the patient's body with respect to the first head-mounted device.


There is further provided, in accordance with some applications of the present invention, apparatus for use with a tool configured to be placed within a portion of a body of a patient, the apparatus including:

    • a patient marker configured to be placed upon the patient's body;
    • a first head-mounted device configured to be worn by a first person, the first head-mounted device including a first head-mounted display, and a first tracking device that is configured to track at least a portion of the tool and the patient marker from a first line of sight;
    • a second head-mounted device configured to be worn by a second person, the second head-mounted device including a second tracking device that is configured to track at least a portion of the tool and the patient marker from a second line of sight; and
    • at least one computer processor configured to generate an augmented reality image upon the first head-mounted display, based upon data received from the first tracking device in combination with data received from the second tracking device, the augmented reality image including (a) a virtual image of the tool and anatomy of the patient, overlaid upon (b) the patient's body.


The present invention will be more fully understood from the following detailed description of applications thereof, taken together with the drawings, in which:





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a schematic illustration of image-guided surgery being performed upon a patient, in accordance with some applications of the present invention;



FIG. 2 is a schematic illustration of a head-mounted device, in accordance with some applications of the present invention;



FIGS. 3A and 3B are schematic illustrations of examples of displays of head-mounted devices as worn by respective healthcare professionals, in accordance with some applications of the present invention;



FIGS. 4A and 4B are schematic illustrations of examples of displays of head-mounted devices as worn by respective healthcare professionals, when the line of sight between a tracking device of one of the healthcare professionals with respect to a patient marker is at least partially blocked, in accordance with some applications of the present invention;



FIGS. 5A and 5B are schematic illustrations of examples of displays of head-mounted devices as worn by respective healthcare professionals, when the line of sight between a tracking device of one of the healthcare professionals with respect to a patient marker is at least partially blocked, in accordance with some applications of the present invention; and



FIGS. 6A and 6B are schematic illustrations of examples of displays of head-mounted devices as worn by respective healthcare professionals, when the line of sight between a tracking device of one of the healthcare professionals with respect to a patient marker is at least partially blocked, in accordance with some applications of the present invention.





DETAILED DESCRIPTION OF EMBODIMENTS

Reference is now made to FIG. 1, which is a schematic illustration of a medical procedure that incorporates image-guided surgery being performed upon a patient 20, in accordance with some applications of the present invention. In the medical procedure shown in FIG. 1, a tool 22 is used to perform an action with respect to the patient's back, the tool being inserted via an incision 24 on the patient's back 27. However, the apparatus and techniques described herein may be used in any surgical procedure that is performed upon a patient's body, mutatis mutandis. Reference is also made to FIG. 2, which is a schematic illustration of a head-mounted device 28, in accordance with some applications of the present invention.


For some applications, a first healthcare professional 26 (e.g., a surgeon performing the procedure) wears a first head-mounted device 28. Typically, the head-mounted device includes one or more head-mounted displays 30. For some applications, the head-mounted displays are generally similar to those described in U.S. Pat. No. 9,928,629 to Benishti, which is incorporated herein by reference. For example, the head-mounted displays may include a combiner that is controlled by a computer processor (e.g., computer processor 32 and/or computer processor 45 described hereinbelow), such as to display an augmented reality image to the healthcare professional. For some applications, the image is presented on head-mounted display 30 such that (a) a computer-generated image is projected onto a first portion 33 of the display by projector 31, and (b) the computer-generated image is aligned with an anatomical portion of a patient who is undergoing the procedure, with the anatomical portion of a patient being visible through a second portion 35 of the display. Typically, the computer-generated image includes a virtual image of the tool overlaid upon a virtual image of the patient's anatomy. For some applications, a portion of the tool that would not otherwise be visible to the healthcare professional (for example, by virtue of being hidden by the patient's anatomy) is included in the computer-generated image.


Although some misalignment of the image with the patient's body may be acceptable, for satisfactory presentation of the images the misalignment may typically not be more than about 2-3 mm. In order to account for such a limit on the misalignment of the patient's anatomy with the presented images, the position of the patient's body, or a portion thereof, with respect to the head-mounted device is typically tracked. In some cases, an image of a tool that is used to perform the procedure is incorporated into the image that is displayed on the head-mounted display. In order to incorporate an image of the tool into the image, in a manner in which the position of the tool with respect to the patient's anatomy is accurately reflected, the position of the tool or a portion thereof (e.g., the tool marker) is typically tracked. It is typically desirable to determine the location of the tool with respect to the patient's body such that errors in the determined location of the tool with respect to the patient's body are less than 2 mm.


Typically, head-mounted device 28 includes a tracking device 34 that is configured to facilitate determination of the location and orientation of head-mounted device 28 with respect to a portion of the patient's body (e.g., the patient's back) and/or with respect to tool 22, and/or the position and orientation of the tool with respect to the portion of the patient's body. For example, the tracking device may include an image-capturing device 36, such as a camera, that is configured to image a patient marker 38 and/or a tool marker 40. Typically, a single camera is used in the tracking device of the head-mounted device. For some applications, the camera is a high-speed camera. For example, the camera may acquire more than 50 frames per second.


For some applications, tracking device 34 includes a light source 42, which is mounted on the head-mounted device. The light source is typically configured to irradiate the patient marker and/or the tool marker, such that light reflects from the markers toward the camera. For some applications, image-capturing device 36 is a monochrome camera that includes a filter that is configured to only allow light to pass through that is of a similar wavelength to the light that is generated by the light source. For example, the light source may be an infrared light source (for example, a light source that generates light at a wavelength of between 700 nm and 1000 nm (e.g., between 700 nm and 800 nm)), and the camera may include a corresponding infrared filter. For some applications, an inertial-measurement unit 44 (e.g., an inertial-measurement unit configured to measure in 6 degrees-of-freedom) is disposed on the head-mounted device, as described in further detail hereinbelow. For some applications, the head-mounted device includes additional cameras 43, which are configured to capture images of scenes in the visible spectrum, as described in U.S. Pat. No. 9,928,629 to Benishti, which is incorporated herein by reference. For some applications, head-mounted device 28 includes additional components, for example, as described in U.S. Pat. No. 9,928,629 to Benishti, which is incorporated herein by reference.


Typically, in order to generate an augmented reality image upon display 30, a computer processor determines the location and orientation of head-mounted device 28 with respect to a portion of the patient's body (e.g., the patient's back), and/or the position and orientation of the tool with respect to the patient's body. For example, a computer processor 45 that is integrated within the head-mounted device may perform the aforementioned functionalities. Alternatively or additionally, computer processor 32, which is disposed externally to the head-mounted device and is typically in wireless communication with the head-mounted device, may be used to perform these functionalities. Computer processor 32 typically comprises a portion of a processing system 50 that is used with the head-mounted device in order to facilitate the image-guided surgery. For some applications, the processing system additionally includes an output device 52 (e.g., a display, such as a monitor) for outputting information to an operator of the system, and/or an input device 54 (such as a pointing device, a keyboard, a mouse, etc.) configured to allow the operator to input data into the system. In general, in the context of the present application, when a computer processor is described as performing certain steps, these steps may be performed by external computer processor 32, and/or computer processor 45 that is integrated within the head-mounted device.


For some applications, the patient marker and/or the tool marker includes reflective elements that are configured to reflect light that is generated by light source 42. For some such applications, the location and orientation of a portion of the subject's body (e.g., the subject's back) with respect to the head-mounted device is tracked, by directing light from light source 42 toward a region of interest in which the patient marker is disposed. Alternatively or additionally, the location and orientation of the tool with respect to the portion of the subject's body is tracked by directing light from light source 42 toward a region of interest in which the patient marker and/or the tool marker is disposed. Typically, image-capturing device 36 is disposed upon the head-mounted device in close proximity to the light source, such that the image-capturing device is configured to capture light that is retro-reflected from the patient marker and/or the tool marker. As described hereinabove, for some applications, the image-capturing device is a monochrome camera that includes a filter that is configured to only allow light to pass through that is of a similar wavelength to the light that is generated by the light source. For such applications, the camera typically receives a grayscale image showing the reflective elements of the tool marker and/or the patient marker. Typically, the computer processor determines the location of a portion of the subject's body (e.g., the subject's back) with respect to the head-mounted device by analyzing the images acquired by the image-capturing device. Further typically, the computer processor determines the location and orientation of the tool with respect to the portion of the subject's body, by analyzing the images acquired by the image-capturing device.
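
A minimal sketch of how the bright retro-reflective elements might be located in such a grayscale image is shown below, assuming OpenCV 4's thresholding and contour functions; the threshold, the area limit, and the synthetic test frame are illustrative and are not values from the patent.

```python
# Hypothetical sketch: locating bright retro-reflective elements in a grayscale IR
# image by thresholding and taking contour centroids.
import numpy as np
import cv2

def detect_reflective_elements(gray_image: np.ndarray,
                               threshold: int = 200,
                               min_area: float = 5.0):
    """Return (x, y) pixel centroids of bright blobs in a grayscale image."""
    _, binary = cv2.threshold(gray_image, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for contour in contours:
        m = cv2.moments(contour)
        if m["m00"] >= min_area:                     # reject single-pixel noise
            centroids.append((m["m10"] / m["m00"],   # centroid x
                              m["m01"] / m["m00"]))  # centroid y
    return centroids

# Example: a synthetic frame containing two bright "reflective elements".
frame = np.zeros((480, 640), dtype=np.uint8)
cv2.circle(frame, (200, 240), 6, 255, -1)
cv2.circle(frame, (420, 250), 6, 255, -1)
print(detect_reflective_elements(frame))   # two centroids near (200, 240) and (420, 250)
```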


It is noted that the above-described technique for tracking the patient marker and/or the tool marker is presented by way of example, and that for some applications, alternative techniques are used for tracking the patient marker and/or the tool marker. For example, the patient marker and/or the tool marker may include light-absorbing elements, and/or light-generating elements, and the image-capturing device may be configured to track the patient marker and/or the tool marker by detecting these elements. Alternatively or additionally, a different type of detector may be used for tracking the patient marker and/or the tool marker.


Typically, the patient marker is configured to provide data that is sufficient for the computer processor to determine the location and orientation of the head-mounted device with respect to the portion of the patient's body using data collected from a single tracking device that is disposed on the head-mounted display. For example, the patient marker may include an array of elements that is visible to the tracking device of the head-mounted device, and that is configured such that at any location and orientation of the head-mounted device with respect to the patient marker, the array of elements has an appearance that is unique to that location and orientation. In this manner, the computer processor is able to determine the location and orientation of the head-mounted device with respect to the portion of the patient's body without requiring the use of triangulation techniques.


As described in the above paragraph, in general, the patient marker is configured to provide data that is sufficient for the computer processor to determine the location and orientation of the head-mounted device with respect to the portion of the patient's body using data collected from a single tracking device that is disposed on the head-mounted display. However, for some applications, at least under certain circumstances, the computer processor is configured to incorporate tracking data that is received from an additional tracking device (i.e., a tracking device in addition to first tracking device 34), in order to generate the image upon head-mounted display 30 of first head-mounted device 28 of first healthcare professional 26.


For some such applications, the computer processor is configured to incorporate the additional data in cases in which tracking device 34 loses its line of sight with the patient marker and/or the tool marker and/or portions thereof. An example of this is shown in FIG. 1, which shows that the right hand of first healthcare professional 26 is blocking the line of sight of his/her tracking device 34 with respect to patient marker 38. For some applications, in such cases, the computer processor is configured to receive data from a tracking device 34′ of an additional head-mounted device 28′ that is configured to be worn by an additional healthcare professional 26′ who is present in the procedure (e.g., an accompanying surgeon, or a nurse), e.g., as shown in FIG. 1. Typically, the additional head-mounted device 28′ is generally similar to the first head-mounted device 28, and the tracking device 34′ of the additional head-mounted device is generally similar to that of the first head-mounted device. For some applications, when at least a portion of the patient marker and a portion of the tool (e.g., the tool marker) are both within the line of sight of the first tracking device 34, the computer processor generates an augmented reality image upon the head-mounted display 30, based upon data received from first tracking device 34 and without using data received from tracking device 34′. When at least the portion of the patient marker and the portion of the tool are not both within the line of sight of first tracking device 34, the computer processor generates an augmented reality image upon head-mounted display 30, at least partially based upon data received from second tracking device 34′.


Alternatively or additionally, a tracking device 60, which is not mounted on a head-mounted device, is disposed in the operating room. Typically, tracking device 60 is disposed in a stationary position within the operating room. For example, tracking device 60 may be ceiling-mounted, wall-mounted, and/or disposed on a stand, such as a tripod. For some applications, tracking device 60 includes a light source 62 and an image-capturing device 64, which function in a generally similar manner to that described hereinabove with reference to light source 42 and image-capturing device 36.


Reference is now made to FIGS. 3A and 3B, which are schematic illustrations of examples of displays 30′, 30 of head-mounted devices 28′, 28 as worn by respective healthcare professionals 26′, 26, in accordance with some applications of the present invention. FIG. 3A shows an example of displays 30′ of second healthcare professional 26′, who is shown on the right side of the patient in FIG. 1, and FIG. 3B shows an example of displays 30 of first healthcare professional 26, who is shown on the left side of the patient in FIG. 1. Typically, the image that is generated upon each of head-mounted displays 30 and head-mounted displays 30′ is an augmented-reality view showing virtual patient anatomy aligned with the actual patient anatomy and a virtual tool aligned with the virtual anatomy. As described hereinabove, for some applications, the virtual tool and virtual anatomy are displayed upon a first portion 33 of head-mounted display 30, 30′, and the actual patient anatomy is visible through a second portion 35 of head-mounted display 30, 30′. For some applications, the computer processor is configured to generate such a view both in 2D and 3D. In order to generate such a view, it is typically necessary to track the location and orientation of the head-mounted device relative to the patient, in order to correctly align the virtual anatomy with the actual patient anatomy. FIGS. 3A and 3B show how the respective head-mounted displays typically appear when the tracking device 34, 34′ of each of the healthcare professionals has a clear line of sight of the patient marker.
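
As a rough illustration of the alignment step, the following sketch maps a point of the virtual anatomy, defined in the patient-marker frame, through an assumed head-mounted-device-to-patient pose and a simple pinhole projection so that it lands on the display pixel overlaying the corresponding real anatomy. All poses, focal length, and coordinates are illustrative placeholders, not a description of the actual display optics.

```python
# Hypothetical sketch: projecting a virtual-anatomy point into display coordinates
# using the tracked pose of the patient marker relative to the head-mounted device.
import numpy as np

def project_to_display(point_in_patient: np.ndarray,
                       T_display_patient: np.ndarray,
                       focal_px: float, center_px: np.ndarray) -> np.ndarray:
    """Map a 3D point in the patient frame to 2D display coordinates (pixels)."""
    p_h = np.append(point_in_patient, 1.0)          # homogeneous coordinates
    p_display = T_display_patient @ p_h              # into the display/HMD frame
    x, y, z = p_display[:3]
    return center_px + focal_px * np.array([x / z, y / z])

# Pose of the patient marker in the display frame, as reported by the tracking device.
T_display_patient = np.eye(4)
T_display_patient[:3, 3] = [0.0, -0.10, 0.50]        # patient ~50 cm ahead, 10 cm below

vertebra_point = np.array([0.01, 0.02, 0.03])        # a virtual-anatomy vertex (meters)
print(project_to_display(vertebra_point, T_display_patient,
                         focal_px=1000.0, center_px=np.array([640.0, 360.0])))
```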


Reference is now made to FIGS. 4A and 4B, which are schematic illustrations of examples of displays 30′, 30 of head-mounted devices 28′, 28 as worn by respective healthcare professionals 26′, 26, when the line of sight between tracking device 34 of first healthcare professional 26 with respect to patient marker 38 is at least partially blocked, in accordance with some applications of the present invention. FIG. 4A shows an example of display 30′ of second healthcare professional 26′, who is shown on the right side of the patient in FIG. 1, and FIG. 4B shows an example of display 30 of first healthcare professional 26, who is shown on the left side of the patient in FIG. 1.


For some such applications, the computer processor generates a virtual image upon head-mounted display 30 of first healthcare professional 26 that shows the virtual view of the second healthcare professional 26′ (i.e., the second healthcare professional's view of the virtual anatomy and the virtual tool), as determined based upon the data received from second tracking device 34′. For example, the overall view of the second healthcare professional (including both his/her view of the virtual anatomy and the virtual tool and his/her view of the actual patient anatomy) may be displayed upon head-mounted display 30 of the first healthcare professional. Such an example is shown in FIGS. 4A and 4B, which show the head-mounted displays 30 of first healthcare professional 26 showing the same overall view as that of the second healthcare professional 26′.


Reference is now made to FIGS. 5A and 5B, which are schematic illustrations of examples of displays 30′, 30 of head-mounted devices 28′, 28 as worn by respective healthcare professionals 26′, 26, when the line of sight between tracking device 34 of first healthcare professional 26 with respect to patient marker 38 is at least partially blocked, in accordance with some applications of the present invention. FIG. 5A shows an example of display 30′ of second healthcare professional 26′, who is shown on the right side of the patient in FIG. 1, and FIG. 5B shows an example of display 30 of first healthcare professional 26, who is shown on the left side of the patient in FIG. 1. For some applications, when the line of sight between tracking device 34 of first healthcare professional 26 with respect to patient marker 38 is at least partially blocked, the virtual image (of the tool and the anatomy) from the line of sight of the second healthcare professional is displayed, such that it fills substantially the whole head-mounted display 30 of the first healthcare professional, and the first healthcare professional is not shown any of the actual patient anatomy via a transparent portion of the display. An example of such an embodiment is shown in FIGS. 5A and 5B, which show the virtual image from head-mounted displays 30′ (shown in FIG. 5A) displayed within portion 33 of the head-mounted displays 30 of the first healthcare professional, and portion 33 filling substantially the whole of head-mounted displays 30 of the first healthcare professional (shown in FIG. 5B).


Reference is now made to FIGS. 6A and 6B, which are schematic illustrations of examples of displays 30′, 30 of head-mounted devices 28′, 28 as worn by respective healthcare professionals 26′, 26, when the line of sight between tracking device 34 of first healthcare professional 26 with respect to patient marker 38 is at least partially blocked, in accordance with some applications of the present invention. FIG. 6A shows an example of display 30′ of second healthcare professional 26′, who is shown on the right side of the patient in FIG. 1, and FIG. 6B shows an example of display 30 of first healthcare professional 26, who is shown on the left side of the patient in FIG. 1. For some applications, in response to detecting that tracking device 34 has lost its line of sight of the patient marker, such that the location and/or orientation of the head-mounted device relative to the patient cannot be determined to a given level of accuracy using tracking device 34, the computer processor generates an image of the virtual tool within the virtual anatomy of the subject, but without regard to aligning the computer-generated image with the actual patient anatomy. For some such applications, the virtual image that is generated in portion 33 of display 30 of the first healthcare professional continues to be shown from the first healthcare professional's previous known line of sight, but the position of the tool with respect to the anatomy is updated based upon data received from tracking device 34′. Second portion 35 of the display of the first healthcare professional is kept transparent such that the first healthcare professional sees the patient's anatomy from his/her own current line of sight. An example of such an embodiment is shown in FIGS. 6A and 6B. As shown in FIG. 6B, since changes in the location of head-mounted device 28 with respect to the patient marker are not tracked and accounted for, this may result in a slight misalignment of the virtual image (shown in portion 33) with respect to the patient's body (shown in portion 35). In this regard, it is noted that, in general, the first healthcare professional uses the virtual image of the tool overlaid upon the virtual image of the patient's anatomy, for navigation of the tool. As such, the healthcare professional is typically able to continue to navigate the tool, even though the virtual image of the tool and the patient's anatomy is not aligned with his/her view of the patient's anatomy.


For some applications, generally similar techniques to those described in the above paragraph are performed, but with the additional tracking data that is used for generating an image on head-mounted display 30 being received from tracking device 60, as an alternative to, or in addition to, being received from tracking device 34′ of second head-mounted device 28′.


For some applications, in response to detecting that tracking device 34 has lost its line of sight of the tool marker, such that the location and/or orientation of the tool with respect to the patient cannot be determined to a given level of accuracy, the computer processor determines the location of the tool relative to the patient, using data received from tracking device 34′ and/or tracking device 60. Typically, a virtual image, which includes the virtual patient anatomy and the virtual tool shown at its current location, is displayed on head-mounted display 30 of head-mounted device 28, with the current location of the tool with respect to the patient having been determined based upon the data received from tracking device 34′ and/or tracking device 60.


For some applications, the computer processor is configured to incorporate tracking data that is received from an additional tracking device (i.e., a tracking device in addition to tracking device 34) in order to generate an image upon head-mounted display 30 of first head-mounted device 28, even when the patient marker and the tool marker are within the line of sight of tracking device 34. For some applications, the computer processor determines the location of the tool with respect to the patient, using a combination of data received from tracking device 34′ and data received from tracking device 34, and/or using a combination of data received from tracking device 60 and data received from tracking device 34. For example, the computer processor may determine an average (e.g., a mean) current location of the tool with respect to the patient, using the aforementioned combinations of received data, and the computer processor may then generate an image of a virtual tool on virtual anatomy upon head-mounted display 30, in which the tool is positioned at the determined current position.
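
A minimal sketch of the averaging described above is shown below; the weighting parameter is an illustrative assumption, and with the default weight of 0.5 the result is simply the unweighted mean mentioned in the text.

```python
# Hypothetical sketch: combining two estimates of the tool's location relative to
# the patient (one per tracking device) into a single displayed position.
import numpy as np

def fuse_tool_positions(pos_from_first: np.ndarray,
                        pos_from_second: np.ndarray,
                        weight_first: float = 0.5) -> np.ndarray:
    """Weighted mean of two estimates of the tool position in the patient frame."""
    w = float(np.clip(weight_first, 0.0, 1.0))
    return w * pos_from_first + (1.0 - w) * pos_from_second

# Example: the two devices disagree by about a millimeter; the displayed virtual
# tool is positioned at the mean of the two estimates.
print(fuse_tool_positions(np.array([0.020, -0.010, 0.081]),
                          np.array([0.021, -0.009, 0.080])))
```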


For some applications, even if a portion of the tracking elements on the patient marker becomes obscured such that they are not within the line of sight of tracking device 34, the computer processor continues to track the location of the head-mounted device with respect to the patient by tracking the marker using a tracking algorithm (e.g., using a Kalman filter). Typically, in such cases, at least while the patient marker is partially obscured, the computer processor does not continue to actively identify the marker. Rather, the computer processor continues to track the already-identified marker using the aforementioned tracking algorithm. For some applications, if the patient marker becomes obscured (e.g., partially obscured or fully obscured) such that at least a portion of the patient marker is not within the line of sight of tracking device 34, the computer processor continues to determine the location of the patient relative to the head-mounted device, using inertial measurement unit 44, in combination with the last location of the patient marker as determined using data from tracking device 34.
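
The following sketch shows one conventional way to keep tracking an already-identified marker through a partial occlusion, using a constant-velocity Kalman filter over the marker's position; the frame rate and noise parameters are illustrative assumptions, and the inertial-measurement-unit fallback mentioned above is not shown.

```python
# Hypothetical sketch: a constant-velocity Kalman filter keeps predicting the
# marker's position when detections drop out, and corrects the estimate whenever
# a (possibly partial) detection returns.
import numpy as np

class MarkerTracker:
    """Constant-velocity Kalman filter over the marker's 3D position."""

    def __init__(self, dt: float = 1.0 / 60.0):
        self.F = np.eye(6)                    # motion model: position += velocity * dt
        self.F[:3, 3:] = dt * np.eye(3)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])   # only position is measured
        self.Q = 1e-5 * np.eye(6)             # process noise (illustrative)
        self.R = 1e-4 * np.eye(3)             # measurement noise (illustrative)
        self.x = np.zeros(6)                  # state: [position, velocity]
        self.P = np.eye(6)                    # state covariance

    def predict(self):
        """Advance the estimate one frame; used while the marker is obscured."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, measured_position: np.ndarray):
        """Correct the estimate with a marker detection."""
        y = measured_position - self.H @ self.x          # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)         # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P

# Example: the marker is detected, obscured for five frames, then detected again.
tracker = MarkerTracker()
tracker.update(np.array([0.10, 0.00, 0.50]))
for _ in range(5):
    tracker.predict()                         # line of sight blocked: predict only
tracker.update(np.array([0.101, 0.001, 0.499]))
print(tracker.x[:3])                          # current estimate of the marker position
```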


Applications of the invention described herein can take the form of a computer program product accessible from a computer-usable or computer-readable medium (e.g., a non-transitory computer-readable medium) providing program code for use by or in connection with a computer or any instruction execution system, such as computer processor 32 and/or 45. For the purpose of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Typically, the computer-usable or computer readable medium is a non-transitory computer-usable or computer readable medium.


Examples of a computer-readable medium include a semiconductor or solid-state memory, magnetic tape, a removable computer diskette, a random-access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.


A data processing system suitable for storing and/or executing program code will include at least one processor (e.g., computer processor 32 and/or 45) coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution. The system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments of the invention.


Network adapters may be coupled to the processor to enable the processor to become coupled to other processors or remote printers or storage devices through intervening private or public networks. Modems, cable modems, and Ethernet cards are just a few of the currently available types of network adapters.


Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the C programming language or similar programming languages.


It will be understood that the algorithms described herein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer (e.g., computer processor 32 and/or 45) or other programmable data processing apparatus, create means for implementing the functions/acts specified in the algorithms described in the present application. These computer program instructions may also be stored in a computer-readable medium (e.g., a non-transitory computer-readable medium) that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the algorithms. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the algorithms described in the present application.


Computer processor 32 and/or computer processor 45 is typically a hardware device programmed with computer program instructions to produce a special purpose computer. For example, when programmed to perform the algorithms described with reference to the figures, computer processor 32 and/or 45 typically acts as a special purpose image-generating computer processor. Typically, the operations described herein that are performed by computer processor 32 and/or 45 transform the physical state of a memory, which is a real physical article, to have a different magnetic polarity, electrical charge, or the like depending on the technology of the memory that is used. For some applications, operations that are described as being performed by a computer processor are performed by a plurality of computer processors in combination with each other.


It will be appreciated by persons skilled in the art that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof that are not in the prior art, which would occur to persons skilled in the art upon reading the foregoing description.

Claims
  • 1. A method for use with a tool configured to be placed within a portion of a body of a patient, the method comprising: tracking the tool and a patient marker that is placed upon the patient's body from a first line of sight, using a first tracking device that is disposed upon a first head-mounted device that is worn by a first person, the first head-mounted device including a first head-mounted display; tracking the tool and the patient marker, from a second line of sight, using a second tracking device; and using at least one computer processor for: generating an augmented reality image upon the first head-mounted display based upon data received from the first tracking device and without using data from the second tracking device, the augmented reality image including (a) a virtual image of the tool and anatomy of the patient, overlaid upon (b) the patient's body; detecting that the first tracking device no longer has both the patient marker and the tool in the first line of sight; in response to detecting that the first tracking device no longer has both the patient marker and the tool within the first line of sight, generating the virtual image of the tool and anatomy of the patient upon the first head-mounted display from the first line of sight of the first tracking device worn by the first person, by incorporating data received from the second tracking device with respect to a position of the tool in the body into the virtual image; and in response to detecting a portion of the tool being within the first line of sight, and a portion of the patient marker not being within the first line of sight: determining a position of the tool with respect to the anatomy of the patient using data received from the second tracking device; and generating the virtual image of the tool and anatomy of the patient upon the first head-mounted display, based upon the determined position of the tool with respect to the anatomy of the patient.
  • 2. The method according to claim 1, wherein tracking the tool comprises tracking a tool marker.
  • 3. The method according to claim 1, wherein tracking the tool and the patient marker, from the second line of sight, using the second tracking device, comprises tracking the tool and the patient marker from the second line of sight, using the second tracking device disposed in a stationary position.
  • 4. The method according to claim 1, wherein generating the virtual image of the tool and anatomy of the patient upon the first head-mounted display further comprises overlaying the virtual image upon the patient's body, based upon a position of the patient's body with respect to the first head-mounted device as determined based upon data received from the first tracking device at a time when the portion of the patient marker was within the first line of sight.
  • 5. The method according to claim 4, wherein overlaying the virtual image upon the patient's body comprises tracking movements of the first head-mounted device between the time when the portion of the patient marker was within the first line of sight and the portion of the patient marker not being within the first line of sight, using an inertial-measurement unit disposed upon the first head-mounted device.
  • 6. The method according to claim 1, wherein the second tracking device is disposed upon a second head-mounted device that is worn by a second person.
  • 7. The method according to claim 6, further comprising generating a further augmented-reality image upon a second head-mounted display of the second head-mounted device.
  • 8. The method according to claim 1, wherein the at least one computer processor is configured to generate the virtual image of the tool and anatomy of the patient upon the first head-mounted display by overlaying the virtual image of the tool and the anatomy of the patient upon the patient's body based upon a position of the first head-mounted device with respect to the tool as determined based upon data received from the first tracking device.
  • 9. The method according to claim 1, wherein the virtual image of the tool and anatomy of the patient comprises a virtual image of the tool overlaid on a virtual image of the anatomy of the patient, wherein the virtual image of the tool is aligned with the virtual image of the anatomy of the patient.
  • 10. The method according to claim 1, further comprising generating the virtual image of the tool and anatomy of the patient upon the first head-mounted display when at least the patient marker or the tool is within the first line of sight such that the virtual image of the tool and anatomy of the patient is aligned with actual patient anatomy based upon data received from the first tracking device.
  • 11. A method for use with a tool configured to be placed within a portion of a body of a patient, the method comprising:
    tracking the tool and a patient marker that is placed upon the patient's body from a first line of sight, using a first tracking device that is disposed upon a first head-mounted device that is worn by a first person, the first head-mounted device including a first head-mounted display;
    tracking the tool and the patient marker, from a second line of sight, using a second tracking device; and
    using at least one computer processor for:
      generating an augmented reality image upon the first head-mounted display based upon data received from the first tracking device and without using data from the second tracking device, the augmented reality image including (a) a virtual image of the tool and anatomy of the patient, overlaid upon (b) the patient's body;
      detecting that the first tracking device no longer has both the patient marker and the tool in the first line of sight;
      in response to detecting that the first tracking device no longer has both the patient marker and the tool within the first line of sight, generating the virtual image of the tool and anatomy of the patient upon the first head-mounted display from the first line of sight of the first tracking device worn by the first person, by incorporating data received from the second tracking device with respect to a position of the tool in the body into the virtual image; and
      in response to detecting a portion of the patient marker being within the first line of sight, and a portion of the tool not being within the first line of sight:
        determining a position of the tool with respect to the anatomy of the patient using data received from the second tracking device;
        generating the virtual image of the tool and anatomy of the patient upon the first head-mounted display, based upon the determined position of the tool with respect to the anatomy of the patient;
        determining a position of the patient's body with respect to the first head-mounted device based upon data received from the first tracking device; and
        overlaying the virtual image upon the patient's body, based upon the determined position of the patient's body with respect to the first head-mounted device.
  • 12. The method according to claim 11, wherein the virtual image of the tool and anatomy of the patient comprises a virtual image of the tool overlaid on a virtual image of the anatomy of the patient, wherein the virtual image of the tool is aligned with the virtual image of the anatomy of the patient.
  • 13. The method according to claim 11, further comprising generating the virtual image of the tool and anatomy of the patient upon the first head-mounted display when at least the patient marker or the tool is within the first line of sight such that the virtual image of the tool and anatomy of the patient is aligned with actual patient anatomy based upon data received from the first tracking device.
  • 14. Apparatus for use with a tool configured to be placed within a portion of a body of a patient, the apparatus comprising:
    a first head-mounted device comprising a first head-mounted display, and a first tracking device that is configured to track the tool and a patient marker from a first line of sight, wherein the patient marker is configured to be placed upon the patient's body;
    a second tracking device that is configured to track the tool and the patient marker from a second line of sight; and
    at least one computer processor configured:
      to generate an augmented reality image upon the first head-mounted display, based upon data received from the first tracking device and without using data from the second tracking device, the augmented reality image including (a) a virtual image of the tool and anatomy of the patient, overlaid upon (b) the patient's body;
      after generating the augmented reality image, to detect that the first tracking device no longer has both the patient marker and the tool in the first line of sight;
      in response to detecting that the first tracking device no longer has both the patient marker and the tool within the first line of sight, to generate the virtual image of the tool and anatomy of the patient upon the first head-mounted display from the first line of sight of the first tracking device worn by the first person, by incorporating data received from the second tracking device with respect to a position of the tool in the body into the virtual image; and
      in response to detecting a portion of the tool being within the first line of sight, and a portion of the patient marker not being within the first line of sight:
        to determine a position of the tool with respect to the anatomy of the patient using data received from the second tracking device; and
        to generate the virtual image of the tool and anatomy of the patient upon the first head-mounted display, based upon the determined position of the tool with respect to the anatomy of the patient.
  • 15. The apparatus according to claim 14, wherein the tool includes a tool marker, and wherein the first and second tracking devices are configured to track the tool by tracking the tool marker.
  • 16. The apparatus according to claim 14, wherein the second tracking device comprises a tracking device that is disposed in a stationary position.
  • 17. The apparatus according to claim 14, wherein the at least one computer processor is configured to generate the virtual image of the tool and anatomy of the patient upon the first head-mounted display by overlaying the virtual image upon the patient's body, based upon the position of the patient's body with respect to the first head-mounted device as determined based upon the data received from the first tracking device at a time when the portion of the patient marker was within the first line of sight.
  • 18. The apparatus according to claim 17, further comprising an inertial-measurement unit disposed upon the first head-mounted device, wherein the at least one computer processor is configured to overlay the virtual image upon the patient's body by tracking movements of the head-mounted device between the time when the portion of the patient marker was within the first line of sight and the portion of the patient marker not being within the first line of sight, using data from the inertial-measurement unit.
  • 19. The apparatus according to claim 14, wherein the first head-mounted device is configured to be worn by a first person, the apparatus further comprising a second head-mounted device that is configured to be worn by a second person, and wherein the second tracking device is disposed upon the second head-mounted device.
  • 20. The apparatus according to claim 14, further comprising the patient marker.
  • 21. The apparatus according to claim 14, wherein the at least one computer processor is configured to generate the virtual image of the tool and anatomy of the patient upon the first head-mounted display by overlaying the virtual image of the tool and the anatomy of the patient upon the patient's body based upon a position of the first head-mounted device with respect to the tool as determined based upon data received from the first tracking device.
  • 22. The apparatus according to claim 14, wherein the virtual image of the tool and anatomy of the patient comprises a virtual image of the tool overlaid on a virtual image of the anatomy of the patient, wherein the virtual image of the tool is aligned with the virtual image of the anatomy of the patient.
  • 23. The apparatus according to claim 14, wherein the at least one computer processor is further configured to generate the virtual image of the tool and anatomy of the patient upon the first head-mounted display when at least the patient marker or the tool is within the first line of sight such that the virtual image of the tool and anatomy of the patient is aligned with actual patient anatomy based upon data received from the first tracking device.
  • 24. Apparatus for use with a tool configured to be placed within a portion of a body of a patient, the apparatus comprising:
    a first head-mounted device comprising a first head-mounted display, and a first tracking device that is configured to track the tool and a patient marker from a first line of sight, wherein the patient marker is configured to be placed upon the patient's body;
    a second tracking device that is configured to track the tool and the patient marker from a second line of sight; and
    at least one computer processor configured:
      to generate an augmented reality image upon the first head-mounted display, based upon data received from the first tracking device and without using data from the second tracking device, the augmented reality image including (a) a virtual image of the tool and anatomy of the patient, overlaid upon (b) the patient's body;
      after generating the augmented reality image, to detect that the first tracking device no longer has both the patient marker and the tool in the first line of sight;
      in response to detecting that the first tracking device no longer has both the patient marker and the tool within the first line of sight, to generate the virtual image of the tool and anatomy of the patient upon the first head-mounted display from the first line of sight of the first tracking device worn by the first person, by incorporating data received from the second tracking device with respect to a position of the tool in the body into the virtual image; and
      in response to detecting a portion of the patient marker being within the first line of sight, and a portion of the tool not being within the first line of sight:
        to determine a position of the tool with respect to the anatomy of the patient using data received from the second tracking device;
        to generate the virtual image of the tool and anatomy of the patient upon the first head-mounted display, based upon the determined position of the tool with respect to the anatomy of the patient;
        to determine a position of the patient's body with respect to the first head-mounted device based upon data received from the first tracking device; and
        to overlay the virtual image upon the patient's body, based upon the determined position of the patient's body with respect to the first head-mounted device.
  • 25. The apparatus according to claim 24, wherein the virtual image of the tool and anatomy of the patient comprises a virtual image of the tool overlaid on a virtual image of the anatomy of the patient, wherein the virtual image of the tool is aligned with the virtual image of the anatomy of the patient.
  • 26. The apparatus according to claim 24, wherein the at least one computer processor is further configured to generate the virtual image of the tool and anatomy of the patient upon the first head-mounted display when at least the patient marker or the tool is within the first line of sight such that the virtual image of the tool and anatomy of the patient is aligned with actual patient anatomy based upon data received from the first tracking device.
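The independent claims above (claims 1 and 11, and their apparatus counterparts 14 and 24) recite a conditional scheme for choosing which tracking data drive the first head-mounted display, depending on whether the patient marker and the tool are within the first line of sight. The Python sketch below is offered only as one possible illustration of that branching logic; the data structures and names (Pose, TrackerSample, select_display_mode) are hypothetical and are not taken from the patent.

    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class Pose:
        """Rigid-body position and orientation reported by a tracking device."""
        position: Tuple[float, float, float]
        orientation: Tuple[float, float, float, float]  # quaternion

    @dataclass
    class TrackerSample:
        """One frame from a tracking device; None means 'not in this line of sight'."""
        marker_pose: Optional[Pose]
        tool_pose: Optional[Pose]

    def select_display_mode(first: TrackerSample, second: TrackerSample):
        """Choose what to render on the first head-mounted display.

        Mirrors the branching recited in the independent claims: both targets
        seen by the head-mounted (first) tracker -> augmented-reality overlay
        from first-tracker data alone; otherwise -> a virtual image of the tool
        and anatomy, with the tool position taken from the second tracker.
        """
        if first.marker_pose is not None and first.tool_pose is not None:
            # Claims 1/14: AR image based on the first tracking device only.
            return ("augmented_reality", first.marker_pose, first.tool_pose)

        if first.tool_pose is not None:
            # Claims 1/14, tool visible but marker not: the tool-versus-anatomy
            # position is determined from the second tracking device.
            return ("virtual_image", second.marker_pose, second.tool_pose)

        if first.marker_pose is not None:
            # Claims 11/24, marker visible but tool not: tool position from the
            # second tracker, patient position (for overlaying the virtual image
            # on the body) from the first tracker.
            return ("virtual_image_overlaid", first.marker_pose, second.tool_pose)

        # Neither target in the first line of sight: rely on the second tracker.
        return ("virtual_image", second.marker_pose, second.tool_pose)

In this sketch the returned mode string merely stands in for whichever rendering path the processor would invoke; the claims do not prescribe any particular software structure.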
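Dependent claims 5 and 18 recite tracking movements of the first head-mounted device with an inertial-measurement unit between the time the patient marker was last within the first line of sight and the time it is not, so that the virtual image can still be overlaid upon the patient's body. A minimal dead-reckoning sketch under those assumptions follows; the 4x4-transform convention and the propagate_patient_pose helper are illustrative choices, not the patent's implementation.

    import numpy as np

    def propagate_patient_pose(last_patient_in_head: np.ndarray,
                               imu_head_motions: list) -> np.ndarray:
        """Estimate the patient-marker pose in the current head frame.

        last_patient_in_head: 4x4 homogeneous pose of the patient marker in the
            head frame at the last time the marker was within the first line of sight.
        imu_head_motions: incremental 4x4 transforms derived from the IMU, each
            giving the pose of the new head frame expressed in the previous head frame.
        """
        pose = last_patient_in_head
        for motion in imu_head_motions:
            # The head moved by `motion`; a pose expressed in the old head frame is
            # re-expressed in the new head frame by the inverse of that motion.
            pose = np.linalg.inv(motion) @ pose
        return pose

Because integrating IMU increments accumulates drift, a scheme along these lines would only bridge short gaps in the marker's visibility before re-acquisition by one of the tracking devices is needed.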
Related Publications (1)
Number Date Country
20200163723 A1 May 2020 US