Many surgical procedures require large amounts of information for planning and/or undertaking the procedure. One way to manage this is to improve the way information is presented to a user, e.g., a surgeon. Augmented Reality (AR) provides an overlay of virtual information on or adjacent to a “real-world” object visually perceived by a user, usually through an AR device such as a headset, Google Glass, etc. An AR device is configured to display information, such as pictures, video, text, warnings, models, simulations, etc., while not obscuring the user's view of the real-world objects in her proximity.
However, the information displayed should be selectable, pertinent, and customizable. For example, it would be beneficial to provide information that helps a surgeon visualize items that cannot be directly perceived. Furthermore, specific use cases may present challenges that can be at least ameliorated by properly configured AR systems.
Accordingly, there is a need for improved systems, methods, and devices that employ AR to improve patient outcomes and surgical efficiency.
Systems and methods are disclosed for computer aided surgery (CAS), comprising an augmented reality (AR) system configured to display augmented reality information, a position tracking system configured to track positions of objects, an instrument coupled to a tracker detectable by the position tracking system, and a controller configured to: display augmented reality information comprising an indicator of a cut on a bone of a patient that would be produced by a current orientation of the instrument and an indicator of a predetermined target plane for the cut; determine, as the instrument is moved, when the current orientation of the instrument is aligned with the predetermined target plane; and, based on the determined alignment, display augmented reality information comprising an indicator representing that the current orientation of the instrument is aligned with the predetermined target plane. In some embodiments, the controller is further configured to display augmented reality information comprising an indicator of the axis of the bone. In some embodiments, the predetermined target plane is defined from acquired bony landmark points, such as in an image-less planning procedure.
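By way of a non-limiting illustration only, the alignment determination can be sketched as a tolerance check between the plane of the current cut and the predetermined target plane. The plane representation (a point plus a unit normal) and the tolerance values below are assumptions made for illustration, not requirements of the disclosure; the sketch is in Python:

import numpy as np

def is_aligned(cut_point, cut_normal, target_point, target_normal,
               angle_tol_deg=1.0, offset_tol_mm=1.0):
    # Illustrative tolerance check: the current orientation is treated as
    # "aligned" when the two plane normals differ by less than angle_tol_deg
    # and the cut plane sits within offset_tol_mm of the target plane.
    n_cut = np.asarray(cut_normal, float) / np.linalg.norm(cut_normal)
    n_tgt = np.asarray(target_normal, float) / np.linalg.norm(target_normal)
    cos_angle = abs(np.dot(n_cut, n_tgt))            # normals may point either way
    angle_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    offset_mm = abs(np.dot(n_tgt, np.asarray(cut_point, float) - np.asarray(target_point, float)))
    return angle_deg <= angle_tol_deg and offset_mm <= offset_tol_mm

When such a check returns true, the controller could switch to displaying the "aligned" indicator described above.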
Methods and implementations are provided to assist a surgeon in performing intra-operative (intra-op) visualization and/or planning from an AR headset, with minimal impact on the surgical workflow. AR provides control to the surgeon, for example, for orthopedic procedures. Example applications include knee surgery (e.g., total knee arthroplasty (TKA) or uni-compartmental knee arthroplasty (UKA)) and other orthopedic surgeries. In some embodiments, the system may enhance what the surgeon can see and help the surgeon visualize what cannot be seen directly. The display may include virtual targets on the anatomy and information about the instrument relative to the target. Provided is an AR system that has a user interface (e.g., with a controller) and a display, such as is typically associated with a headset, head-mounted display (HMD), Google Glass, etc. As will be described, navigation/tracking may be provided. In some embodiments, this application is directed to a computer aided surgery (CAS) system comprising an augmented reality (AR) system configured to display augmented reality information, a position tracking system configured to track positions of objects, an instrument coupled to a navigational tracker detectable by the position tracking system, and a controller configured to determine a position of the instrument and, based on the determined position, display augmented reality information using the AR system. The controller may be used to send and receive information to and from the AR system. The controller typically includes a power supply, AC/DC converters, control system interface circuits, and other components commonly included in computer aided surgery systems. The controller is also configured to perform the methods described herein.
The controller may be configured for navigation and/or tracking. A position tracking system may comprise a tracker comprising a navigation array including a plurality of markers in a unique constellation or geometric arrangement. For example, optical navigation or tracking systems may utilize stereoscopic sensors (of a tracking unit) to detect light emitting diodes (LEDs) or infra-red (IR) light reflected or emitted from one or more optical markers affixed to the array. For example, when the markers are reflective elements, once detected by the stereoscopic sensors, the relative arrangement of the elements in the sensors' field of view, in combination with the known geometric arrangement of the elements, may allow the system to determine a three-dimensional position of the array, and hence of the instrument. Other examples of tracking systems include ultrasonic sensors, radio-frequency identification (RFID) sensors or other radio frequency (RF) tracking systems, electromagnetic interference (EMI) tracking systems, etc. The tracking unit may detect the relative motions between any and all trackers in real time. The tracking unit (e.g., a stereoscopic reflector-detecting camera, an AR system camera, or a stand-alone camera) may track the trackers for purposes of determining their relative locations and orientations (e.g., position). A surgeon may view the patient through the AR system and may manipulate the instrument. In some embodiments, a computer aided surgery (CAS) system may comprise an augmented reality (AR) system configured to display augmented reality information, a position tracking system configured to track positions of objects, an instrument coupled to a tracker detectable by the position tracking system, and a controller. The controller may be configured to cause the AR system to display, such as on the headset, augmented reality information about a position of the instrument and a position of the patient. Other augmented reality information may be overlaid (e.g., displayed on the headset as superimposed on the real-world scene). The controller may be configured to, if the instrument moves to another position, cause the AR system to display updated augmented reality information, as will now be described in greater detail. As can be appreciated, using a two-dimensional display to guide a surgeon trying to manipulate or adjust an instrument in three dimensions can be an imposing challenge.
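For illustration only, the recovery of the array's three-dimensional position from the known marker geometry can be sketched as a standard rigid point-set alignment (the Kabsch/SVD method); this is a generic, textbook approach offered as an assumption, not a description of any particular tracking system's algorithm:

import numpy as np

def array_pose(model_points, detected_points):
    # model_points: Nx3 marker coordinates in the tracker's own frame (known geometry).
    # detected_points: Nx3 marker coordinates triangulated by the stereoscopic sensors.
    # Returns (R, t) such that detected is approximately R @ model + t (Kabsch/SVD).
    model = np.asarray(model_points, float)
    detected = np.asarray(detected_points, float)
    mc, dc = model.mean(axis=0), detected.mean(axis=0)
    H = (model - mc).T @ (detected - dc)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dc - R @ mc
    return R, t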
One or more trackers may be attached to a patient, and another tracker attached to the instrument, thus allowing the position of the instrument to be determined relative to the patient. The controller may determine (or be informed of) a position of a patient anatomy (a femur, a tibia, etc.). Additional tracker(s) may be present elsewhere in the operating theater, e.g., coupled to a surgical table. The additional tracker(s) may assist with tracking an anatomy (e.g., a bone) of interest. A patient coordinate system may be defined so that the position of the instrument can be expressed with respect to the patient.
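As a minimal sketch (with hypothetical variable names), expressing the tracked instrument pose in such a patient coordinate system amounts to composing the homogeneous transforms reported by the tracking system:

import numpy as np

def to_homogeneous(R, t):
    # Pack a 3x3 rotation and a translation vector into a 4x4 transform.
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def instrument_in_patient(T_cam_patient, T_cam_instrument):
    # Re-express the instrument pose (reported in the camera frame) in the patient frame.
    return np.linalg.inv(T_cam_patient) @ T_cam_instrument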
The camera (e.g., an AR system camera or a stand-alone camera) may track the trackers for purposes of determining their relative locations and orientations (e.g., position) and, in some cases, for displaying virtual information as will be described.
Optionally, the controller is further configured to determine an axis of the bone and display augmented reality information comprising an indicator of the axis of the bone. As depicted, the bone axis is a dotted line, but other indicators are contemplated. For example, a surgeon may manipulate a limb of the patient where the bone to be treated is located (e.g., in knee surgery, the surgeon moves the leg and flexes and extends the knee) and the controller (e.g., using the tracking system) may determine the axis. The optional bone axis indicator may be beneficial to the surgeon, for example in cases where the instrument plane should be perpendicular to the bone axis. Preferably, the indicator of the axis of the bone is displayed for at least a portion of the surgical planning procedure. The controller may be configured to toggle display of the bone axis indicator on or off upon an input from the user.
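One common imageless technique, offered here only as an assumed illustration of how an axis could be derived from such limb manipulation, fits a sphere to the femur-tracker positions recorded while the leg is pivoted (locating the hip center) and connects that center to a digitized knee center:

import numpy as np

def fit_sphere_center(points):
    # Least-squares sphere fit: |p - c|^2 = r^2 rearranges to
    # 2*c.p + (r^2 - |c|^2) = |p|^2, a linear system in c and (r^2 - |c|^2).
    p = np.asarray(points, float)
    A = np.hstack([2.0 * p, np.ones((len(p), 1))])
    b = (p ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3]                      # sphere center, e.g. the hip center

def bone_axis(hip_center, knee_center):
    # Unit vector from the knee center toward the hip center.
    axis = np.asarray(hip_center, float) - np.asarray(knee_center, float)
    return axis / np.linalg.norm(axis)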
Continuing with the example where the instrument is a cut guide, the controller is further configured to determine a cut on the bone that would be produced by a current orientation of the instrument. Moreover, the controller is further configured to display augmented reality information using the AR system, the augmented reality information comprising an indicator of the cut on the bone that would be produced by the current orientation of the instrument.
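As an illustrative sketch (the slot geometry, calibration input, and names below are assumptions), the cut that the current orientation would produce may be modeled by carrying the plane of the cut guide's slot through the tracked instrument pose into the patient coordinate system:

import numpy as np

def predicted_cut_plane(T_patient_instrument, slot_point_instr, slot_normal_instr):
    # slot_point_instr / slot_normal_instr: a point and unit normal defining the
    # guide slot plane in instrument coordinates (assumed known from calibration).
    R = T_patient_instrument[:3, :3]
    t = T_patient_instrument[:3, 3]
    point = R @ np.asarray(slot_point_instr, float) + t
    normal = R @ np.asarray(slot_normal_instr, float)
    return point, normal / np.linalg.norm(normal)

The resulting plane may then be rendered as the cut indicator described above.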
The controller may be further configured to display augmented reality information comprising a guide indicator that is a pair of lines extending between the indicator of the cut on the bone that would be produced by the current orientation of the instrument and the indicator of the predetermined target plane for the cut. The guide indicator may enhance perception of both an angular difference and the magnitude of the difference between the current orientation of the instrument and the predetermined target plane. For example, the length of each line may represent the relative difference. The guide indicator may also graphically represent a distance between the current orientation of the instrument and the predetermined target plane. For example, each line may be rendered as an arrow representing a certain preset length, such as 1 mm, 5 mm, etc.
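One possible construction of such a pair of lines, offered purely as an assumption about how the indicator could be computed (the anchor spacing and names are illustrative), places one line at each side of the target-plane indicator and extends it along the target normal to the plane of the current cut; unequal lengths then convey an angular difference, and the overall lengths convey distance:

import numpy as np

def guide_lines(cut_point, cut_normal, target_point, target_normal, half_width=25.0):
    # Assumes unit normals and a roughly aligned guide (planes not near-perpendicular).
    # half_width (mm) is an illustrative display size for placing the two anchors.
    n_tgt = np.asarray(target_normal, float)
    ref = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(ref, n_tgt)) > 0.9:
        ref = np.array([0.0, 1.0, 0.0])
    side = np.cross(n_tgt, ref)          # a unit direction lying in the target plane
    side /= np.linalg.norm(side)
    lines = []
    for s in (+1.0, -1.0):
        start = np.asarray(target_point, float) + s * half_width * side
        # Extend from the target plane along its normal until the current-cut plane is reached.
        d = np.dot(cut_normal, np.asarray(cut_point, float) - start) / np.dot(cut_normal, n_tgt)
        lines.append((start, start + d * n_tgt, abs(d)))   # endpoints and length (mm)
    return lines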
In another embodiment, the bone axis indicator is also displayed. The controller may be further configured to display augmented reality information comprising a guide indicator that is an ellipse centered on an intersection of the axis and the predetermined target plane. The guide indicator may enhance perception of a center of the predetermined target plane.
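For illustration (variable names assumed), the ellipse center can be obtained as the point where the bone axis pierces the predetermined target plane:

import numpy as np

def axis_plane_intersection(axis_point, axis_dir, plane_point, plane_normal):
    # Intersection of the bone axis (a point plus a direction) with the target plane.
    axis_dir = np.asarray(axis_dir, float)
    denom = np.dot(plane_normal, axis_dir)
    if abs(denom) < 1e-9:               # axis (nearly) parallel to the plane
        return None
    d = np.dot(plane_normal, np.asarray(plane_point, float) - np.asarray(axis_point, float)) / denom
    return np.asarray(axis_point, float) + d * axis_dir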
In a first embodiment, a computer aided surgery (CAS) system is provided. The CAS system comprises an augmented reality (AR) system configured to display augmented reality information; a position tracking system configured to track positions of objects; an instrument coupled to a tracker detectable by the position tracking system; and a controller configured to: receive information regarding at least three points on a bone of a patient; display augmented reality information using the AR system, the augmented reality information comprising an indicator of a cut on the bone that would be produced by a current orientation of the instrument and an indicator of a predetermined target plane for the cut; determine, as the instrument is moved, when the current orientation of the instrument is aligned with the predetermined target plane; and based on the determined alignment, display augmented reality information comprising an indicator representing that the current orientation of the instrument is aligned with the predetermined target plane.
In some embodiments, the at least three points on a bone of a patient are based on acquired points. In some embodiments, the at least three points on a bone of a patient are bony landmarks of the bone. In some embodiments, the controller is further configured to determine the target plane for the cut without using any pre-operative three dimensional (3D) images of the bone.
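As a minimal sketch of image-less planning, under the assumption that exactly three acquired landmark points define the plane, the target plane can be computed directly from those points with no pre-operative 3D images:

import numpy as np

def plane_from_landmarks(p1, p2, p3):
    # Plane (point, unit normal) through three acquired bony landmark points.
    p1, p2, p3 = (np.asarray(p, float) for p in (p1, p2, p3))
    normal = np.cross(p2 - p1, p3 - p1)
    norm = np.linalg.norm(normal)
    if norm == 0:
        raise ValueError("landmark points are collinear")
    return p1, normal / norm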
In some embodiments, the controller is further configured to determine an axis of the bone and display augmented reality information comprising an indicator of the axis of the bone.
In some embodiments, the controller is further configured to display each of the indicators as a different color. In some embodiments, the controller is further configured to display each of the indicators as an ellipse. In some embodiments, the controller is further configured to display the indicator of the cut on the bone that would be produced by the current orientation of the instrument as decreasing in size as the current orientation of the instrument is brought closer to alignment with the predetermined target plane.
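One simple way such a shrinking indicator could be driven, offered only as an assumed illustration, is to map the remaining angular error linearly onto the indicator's displayed size:

def indicator_radius(angle_error_deg, max_radius=30.0, min_radius=5.0, max_error_deg=10.0):
    # Illustrative mapping: the indicator shrinks linearly from max_radius (mm)
    # toward min_radius as the angular error falls from max_error_deg to zero.
    frac = min(max(angle_error_deg / max_error_deg, 0.0), 1.0)
    return min_radius + frac * (max_radius - min_radius)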
In some embodiments, the controller is further configured to display augmented reality information comprising a guide indicator that is a pair of lines extending between the indicator of the cut on the bone that would be produced by the current orientation of the instrument and the indicator of the predetermined target plane for the cut. In some embodiments, the guide indicator enhances perception of an angular difference between the current orientation of the instrument and the predetermined target plane. In some embodiments, the guide indicator graphically represents a distance between the current orientation of the instrument and the predetermined target plane.
In another embodiment, a computer aided surgery (CAS) system is provided. The CAS system comprises an augmented reality (AR) system configured to display augmented reality information; a position tracking system configured to track positions of objects; an instrument coupled to a tracker detectable by the position tracking system; and a controller configured to: receive information regarding at least three points on a bone of a patient; determine an axis of the bone and display augmented reality information using the AR system, the augmented reality information comprising an indicator of the axis of the bone; display augmented reality information comprising an indicator of a cut on the bone that would be produced by a current orientation of the instrument and an indicator of a predetermined target plane for the cut; determine, as the instrument is moved, when the current orientation of the instrument is aligned with the predetermined target plane; and based on the determined alignment, display augmented reality information comprising an indicator representing that the current orientation of the instrument is aligned with the predetermined target plane.
In some embodiments, the controller is further configured to display augmented reality information comprising a guide indicator that is an ellipse centered on an intersection of the axis and the predetermined target plane.
In some embodiments, the at least three points on a bone of a patient are based on acquired points. In some embodiments, the at least three points on a bone of a patient are bony landmarks of the bone. In some embodiments, the controller is further configured to determine the target plane for the cut without using any pre-operative three dimensional (3D) images of the bone.
In some embodiments, the controller is further configured to determine an axis of the bone and display augmented reality information comprising an indicator of the axis of the bone.
In some embodiments, the controller is further configured to display each of the indicators as a different color. In some embodiments, the controller is further configured to display each of the indicators as an ellipse. In some embodiments, the controller is further configured to display the indicator of the cut on the bone that would be produced by the current orientation of the instrument as decreasing in size as the current orientation of the instrument is brought closer to alignment with the predetermined target plane.
In some embodiments, the controller is further configured to display augmented reality information comprising a guide indicator that is a pair of lines extending between the indicator of the cut on the bone that would be produced by the current orientation of the instrument and the indicator of the predetermined target plane for the cut. In some embodiments, the guide indicator enhances perception of an angular difference between the current orientation of the instrument and the predetermined target plane. In some embodiments, the guide indicator graphically represents a distance between the current orientation of the instrument and the predetermined target plane.
In yet another embodiment, a method of computer aided surgery (CAS) is provided. The method comprises determining a position of an instrument, wherein the instrument is coupled to a navigational tracker detectable by a position tracking system; displaying, on an augmented reality (AR) system, augmented reality information comprising an indicator of a cut on a bone of a patient that would be produced by a current orientation of the instrument and an indicator of a predetermined target plane for the cut; determining, as the instrument is moved, when the current orientation of the instrument is aligned with the predetermined target plane; and based on the determined alignment, displaying augmented reality information comprising an indicator representing that the current orientation of the instrument is aligned with the predetermined target plane.
In some embodiments, the method further comprises determining an axis of the bone and displaying augmented reality information comprising an indicator of the axis of the bone.
In some embodiments, the method further comprises displaying augmented reality information comprising a guide indicator that is an ellipse centered on an intersection of the axis and the predetermined target plane.
In some embodiments, the method further comprises displaying the indicator of the cut on the bone that would be produced by the current orientation of the instrument as decreasing in size as the current orientation of the instrument is brought closer to alignment with the predetermined target plane.
The embodiments of the present disclosure described above are intended to be merely examples; numerous variations and modifications are possible within the scope of this disclosure. Accordingly, the disclosure is not to be limited by what has been particularly shown and described. All publications and references cited herein are expressly incorporated by reference in their entirety, except for any definitions, subject matter disclaimers or disavowals, and except to the extent that the incorporated material is inconsistent with the express disclosure herein, in which case the language in this disclosure controls.