Surgical joint repair procedures involve repair and/or replacement of a damaged or diseased joint. Many times, a surgical joint repair procedure, such as joint arthroplasty as an example, involves replacing the damaged joint with a prosthetic that is implanted into the patient's bone. Proper selection of a prosthetic that is appropriately sized and shaped and proper positioning of that prosthetic to ensure an optimal surgical outcome can be challenging.
This disclosure describes a variety of techniques for providing preoperative planning for surgical joint repair procedures (e.g., arthroplasty procedures). The techniques may be used independently or in various combinations to support particular phases or settings for surgical joint repair procedures or to provide a multi-faceted ecosystem to support surgical joint repair procedures.
A surgical joint repair procedure may involve a surgeon installing an implant in a bone of a patient. Prior to starting the joint repair procedure, the surgeon may select a size of the implant and determine a location at which to position the implant. One of the difficulties of a joint repair procedure is the planning stage, which may involve trade-offs and compromises among the surgical decisions in order to achieve the best outcome. An example of a trade-off for a total ankle repair (TAR) may be the decision to minimize the tibial implant overhang at the possible cost of deteriorating the Antero-Posterior (AP) alignment of the implant with the patient anatomy (e.g., the AP alignment of the patient's foot). Many factors influence the tibial implant overhang and AP alignment, such as implant size and implant location.
In accordance with one or more aspects of this disclosure, a system may provide automated alignment and sizing advice to a surgeon. For instance, the system may combine multiple surgery criteria and corresponding measures (e.g., implant overhang area and AP alignment angle) into a single planning quality measure to obtain the most appropriate trade-off among the surgery criteria. In some examples, the system may maximize the quality measure function using a non-linear optimization with the surgery criteria as optimization arguments. The non-linear optimization may yield an implant emplacement (i.e., a 3D coordinate) and an implant orientation (i.e., a 3D rotation matrix) that correspond to an appropriate compromise between a minimized implant overhang and a maximized implant alignment. However, automated alignment and sizing that relies on a non-linear optimization of surgery criteria and associated measures may require modeling and programming of each surgery criterion and trade-off. Such modeling and programming may be cumbersome and time-consuming. As such, it may be desirable for a system to provide automated alignment and sizing advice without such modeling and programming.
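The combined quality measure described above can be sketched as follows. The measure functions, the weights, and the grid search standing in for a full non-linear optimizer are all illustrative assumptions for a simplified 2D placement, not the actual clinical models:

```python
# Hypothetical sketch: combine an overhang measure and an AP-alignment
# measure into a single quality score and search for the placement that
# maximizes it. Both measure functions are illustrative stand-ins.

def overhang_area(x, y):
    # Stand-in: overhang grows as the implant drifts from one nominal center.
    return (x - 1.0) ** 2 + (y - 2.0) ** 2

def ap_alignment_error(x, y):
    # Stand-in: alignment error relative to a different nominal target, so
    # the two criteria pull the placement in different directions.
    return (x - 2.0) ** 2 + (y - 1.0) ** 2

def quality(x, y, w_overhang=0.5, w_alignment=0.5):
    # Higher is better: penalize both criteria with tunable trade-off weights.
    return -(w_overhang * overhang_area(x, y)
             + w_alignment * ap_alignment_error(x, y))

def best_placement(step=0.05):
    # A coarse grid search over candidate placements stands in for the
    # non-linear optimization over the surgery criteria.
    candidates = [(i * step, j * step) for i in range(81) for j in range(81)]
    return max(candidates, key=lambda p: quality(*p))
```

With equal weights, the maximizer lands midway between the two competing targets, i.e., the compromise placement `(1.5, 1.5)` rather than the optimum of either criterion alone.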
In accordance with one or more aspects of this disclosure, a system may provide automated alignment and sizing advice for a particular patient based on the alignment and sizing of implants of other patients. For instance, one or more processors of the system may obtain a target atlas of a particular patient on which an arthroplasty procedure is to be performed, and may obtain a plurality of reference atlases of other patients on which the arthroplasty procedure has been performed. The one or more processors may select, based on a comparison of values of the target atlas and the plurality of reference atlases, at least one reference atlas of the plurality of reference atlases of the other patients. For instance, the one or more processors may select a reference atlas of the plurality of reference atlases that is most similar to the target atlas. The one or more processors may determine, based on the selected at least one reference atlas, one or both of an implant size and an implant alignment for the particular patient. For instance, the one or more processors may recommend the implant size and/or implant alignment of the reference atlas as the implant size and/or implant alignment for the particular patient. As such, the system may exploit knowledge from retrospective surgeries to propose the most appropriate plan for the currently planned case, i.e., the size and alignment of the ankle implant.
As noted above, the system may utilize an atlas of a particular patient (i.e., a target atlas) and at least one reference atlas of another (i.e., a different) patient. A reference atlas of a patient may include such things as a preoperative scan of the patient, 3D models of a bone of the patient, reference anatomical axes of each bone, and/or the corresponding implant size and placement used to install an implant in the patient. A target atlas may include components similar to those of a reference atlas (but does not include the implant size and placement). An atlas, be it target or reference, may include other data points (e.g., cyst 3D models, or the age or weight of the patient).
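One hypothetical way to represent such atlases is sketched below. The field names and types are illustrative assumptions, not drawn from any actual product schema; the only structural point is that a target atlas leaves the implant fields unset while a reference atlas fills them:

```python
# Hypothetical atlas record; all field names are illustrative.
from dataclasses import dataclass
from typing import Optional

Point3D = tuple[float, float, float]

@dataclass
class Atlas:
    bone_model: list[Point3D]            # 3D bone model as a point set
    anatomical_axes: dict[str, Point3D]  # reference anatomical axis per bone
    age: Optional[int] = None            # optional demographic data points
    weight_kg: Optional[float] = None
    # Populated only for reference atlases (surgery already performed):
    implant_size: Optional[int] = None
    implant_placement: Optional[Point3D] = None

# A target atlas omits the implant fields; a reference atlas includes them.
target = Atlas(bone_model=[(0.0, 0.0, 0.0)],
               anatomical_axes={"tibia": (0.0, 0.0, 1.0)})
reference = Atlas(bone_model=[(0.1, 0.0, 0.0)],
                  anatomical_axes={"tibia": (0.0, 0.0, 1.0)},
                  implant_size=3, implant_placement=(0.5, 0.2, 0.0))
```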
The system may perform the reference atlas selection (e.g., selecting the one or more reference atlases that are most similar to the target atlas) using any suitable means. For instance, the system may determine a similarity measure between each of the plurality of reference atlases and the target atlas. The system may be configured to select the reference atlas (or atlases) with the greatest similarity measure as the one or more reference atlases. The similarity measure can be based on multiple criteria. For instance, a similarity measure between a particular reference atlas and a target atlas may be based on a difference between a 3D model of the particular reference atlas and a 3D model of the target atlas (e.g., a mean distance in millimeters, a Hausdorff distance, etc.). In some examples, the similarity measure can rely on criteria such as the surgery strategy (for example, a preferred implant type, thereby excluding some atlases from the search) or patient demographic information (e.g., the age or the weight of the patient).
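The selection step above can be sketched with a Hausdorff distance between bone models, treating the smallest distance as the greatest similarity. This is a minimal illustration under simplifying assumptions: the bone models are tiny point sets (real systems would compare dense 3D meshes), and each reference is a hypothetical (bone model, implant size) pair:

```python
# Minimal sketch: select the reference atlas whose bone model is nearest to
# the target bone model in Hausdorff distance, then reuse its implant size.
import math

def directed_hausdorff(a, b):
    # For each point of a, distance to its nearest point of b; take the max.
    return max(min(math.dist(p, q) for q in b) for p in a)

def hausdorff(a, b):
    # Symmetric Hausdorff distance: worst-case nearest-neighbor deviation.
    return max(directed_hausdorff(a, b), directed_hausdorff(b, a))

def select_reference(target_model, references):
    # references: list of (bone_model, implant_size) pairs; smallest
    # Hausdorff distance means greatest similarity.
    return min(references, key=lambda ref: hausdorff(target_model, ref[0]))

target_model = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
references = [
    ([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 3.0, 0.0)], 2),
    ([(0.1, 0.0, 0.0), (1.1, 0.0, 0.0), (0.1, 1.0, 0.0)], 3),
]
_, recommended_size = select_reference(target_model, references)
# recommended_size == 3: the second model is a small uniform shift of the
# target, while the first has a point 2.0 units from the target model.
```

Excluding atlases by surgery strategy or weighting in demographic differences would simply pre-filter `references` or extend the distance with additional terms.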
The details of various examples of the disclosure are set forth in the accompanying drawings and the description below. Various features, objects, and advantages will be apparent from the description, drawings, and claims.
Certain examples of this disclosure are described with reference to the accompanying drawings, wherein like reference numerals denote like elements. It should be understood, however, that the accompanying drawings illustrate only the various implementations described herein and are not meant to limit the scope of various technologies described herein. The drawings show and describe various examples of this disclosure.
In the following description, numerous details are set forth to provide an understanding of the present disclosure. However, it will be understood by those skilled in the art that the systems, devices and techniques of this disclosure may be practiced without these details and that numerous variations or modifications from the described examples may be possible.
Orthopedic surgery, such as a surgical joint repair procedure, can involve performing various steps to prepare bone for implantation of one or more prosthetic devices to repair or replace a patient's damaged or diseased joint. Virtual surgical planning tools may be available that use image data of the diseased or damaged joint to generate an accurate three-dimensional bone model that can be viewed and manipulated preoperatively by the surgeon. These tools can enhance surgical outcomes by allowing the surgeon to simulate the surgery, select or design an implant that more closely matches the contours of the patient's actual bone, and select or design surgical instruments and guide tools that are adapted specifically for repairing the bone of a particular patient. Use of these planning tools typically results in generation of a preoperative surgical plan, complete with an implant and surgical instruments that are selected or manufactured for the individual patient.
As noted above, a surgical joint repair procedure may involve a surgeon installing an implant in a bone of a patient. Prior to starting the joint repair procedure, the surgeon may select a size of the implant and determine a location at which to position the implant. One of the difficulties of a joint repair procedure is the planning stage, which may involve trade-offs and compromises among the surgical decisions in order to achieve the best outcome. An example of a trade-off for a total ankle repair (TAR) may be the decision to minimize the tibial implant overhang at the possible cost of deteriorating the Antero-Posterior (AP) alignment of the implant with the patient anatomy (e.g., the AP alignment of the patient's foot). Many factors influence the tibial implant overhang and AP alignment, such as implant size and implant location.
In accordance with one or more aspects of this disclosure, a system may provide automated alignment and sizing advice to a surgeon. For instance, the system may combine multiple surgery criteria and corresponding measures (e.g., implant overhang area and AP alignment angle) into a single planning quality measure to obtain the most appropriate trade-off among the surgery criteria. In some examples, the system may maximize the quality measure function using a non-linear optimization with the surgery criteria as optimization arguments. The non-linear optimization may yield an implant emplacement (i.e., a 3D coordinate) and an implant orientation (i.e., a 3D rotation matrix) that correspond to an appropriate compromise between a minimized implant overhang and a maximized implant alignment. However, automated alignment and sizing that relies on a non-linear optimization of surgery criteria and associated measures may require modeling and programming of each surgery criterion and trade-off. Such modeling and programming may be cumbersome and time-consuming. As such, it may be desirable for a system to provide automated alignment and sizing advice without such modeling and programming.
In accordance with one or more aspects of this disclosure, a system may provide automated alignment and sizing advice for a particular patient based on the alignment and sizing of implants of other patients. For instance, one or more processors of the system may obtain a target atlas of a particular patient on which an arthroplasty procedure is to be performed, and obtain a plurality of reference atlases of other patients on which the arthroplasty procedure has been performed. The one or more processors may select, based on a comparison of values of the target atlas and the plurality of reference atlases, at least one reference atlas of the plurality of reference atlases of the other patients. For instance, the one or more processors may select a reference atlas of the plurality of reference atlases that is most similar to the target atlas. The one or more processors may determine, based on the selected at least one reference atlas, one or both of an implant size and an implant alignment for the particular patient. For instance, the one or more processors may recommend the implant size and/or implant alignment of the reference atlas as the implant size and/or implant alignment for the particular patient. The system may store the recommended implant size and/or implant alignment in a preoperative surgical plan for the particular patient. As such, the system may exploit knowledge from retrospective surgeries to propose the most appropriate plan for the currently planned case, i.e., the size and alignment of the ankle implant.
As noted above, the system may utilize an atlas of a particular patient (i.e., a target atlas) and at least one reference atlas of another (i.e., a different) patient. A reference atlas of a patient may include a preoperative scan of the patient, 3D models of a bone of the patient, reference anatomical axes of each bone, and the corresponding implant size and placement used to install an implant in the patient. A target atlas may include similar components (but does not include the implant size and placement). An atlas, such as a target atlas or a reference atlas, may include other data points (e.g., cyst 3D models, or the age or weight of the patient).
The system may perform the reference atlas selection (e.g., selecting the one or more reference atlases that are most similar to the target atlas) using any suitable means. For instance, the system may determine a similarity measure between each of the plurality of reference atlases and the target atlas. The system may select the reference atlas (or atlases) with the greatest similarity measure as the one or more reference atlases. The similarity measure can be based on multiple criteria. For instance, a similarity measure between a particular reference atlas and a target atlas may be based on a difference between a 3D model of the particular reference atlas and a 3D model of the target atlas (e.g., a mean distance in millimeters, a Hausdorff distance, etc.). In some examples, the similarity measure can rely on criteria such as the surgery strategy (for example, a preferred implant type, thereby excluding some atlases from the search) or patient demographic information (e.g., the age or the weight of the patient).
In some situations, once in the actual operating environment, the surgeon may choose to verify the preoperative surgical plan intraoperatively relative to the patient's actual bone. This verification may result in a determination that an adjustment to the preoperative surgical plan is needed, such as a different implant, a different positioning or orientation of the implant, and/or a different surgical guide for carrying out the surgical plan. In addition, a surgeon may want to view details of the preoperative surgical plan relative to the patient's real bone during the actual procedure in order to more efficiently perform standard steps, perform ancillary steps, and accurately position and orient the implant components. For example, the surgeon may want to obtain intraoperative visualization that provides guidance for positioning and orientation of implant components, guidance for preparation of bone or tissue to receive the implant components, guidance for reviewing the details of a procedure or procedural step, and/or guidance for selection of tools or implants and tracking of surgical procedure workflow.
Accordingly, this disclosure describes systems and methods for using a mixed reality (MR) visualization system to assist with creation, implementation, verification, and/or modification of a surgical plan before and during a surgical procedure. Because MR, or in some instances virtual reality (VR), may be used to interact with the surgical plan, this disclosure may also refer to the surgical plan as a “virtual” surgical plan. Visualization tools other than or in addition to mixed reality visualization systems may be used in accordance with techniques of this disclosure. A surgical plan, e.g., as generated by the BLUEPRINT™ system or another surgical planning platform, may include information defining a variety of features of a surgical procedure, such as features of particular surgical procedure steps to be performed on a patient by a surgeon according to the surgical plan including, for example, bone or tissue preparation steps and/or steps for selection, modification and/or placement of implant components. Such information may include, in various examples, dimensions, shapes, angles, surface contours, and/or orientations of implant components to be selected or modified by surgeons, dimensions, shapes, angles, surface contours and/or orientations to be defined in bone or tissue by the surgeon in bone or tissue preparation steps, and/or positions, axes, planes, angle and/or entry points defining placement of implant components by the surgeon relative to patient bone or tissue. Information such as dimensions, shapes, angles, surface contours, and/or orientations of anatomical features of the patient may be derived from imaging (e.g., x-ray, CT, MRI, ultrasound or other images), direct observation, or other techniques.
In this disclosure, the term “mixed reality” (MR) refers to the presentation of virtual objects such that a user sees images that include both real, physical objects and virtual objects. Virtual objects may include text, 2-dimensional surfaces, 3-dimensional models, or other user-perceptible elements that are not actually present in the physical, real-world environment in which they are presented as coexisting. In addition, virtual objects described in various examples of this disclosure may include graphics, images, animations or videos, e.g., presented as 3D virtual objects or 2D virtual objects. Virtual objects may also be referred to as virtual elements. Such elements may or may not be analogs of real-world objects. In some examples, in mixed reality, a camera may capture images of the real world and modify the images to present virtual objects in the context of the real world. In such examples, the modified images may be displayed on a screen, which may be head-mounted, handheld, or otherwise viewable by a user. This type of mixed reality is increasingly common on smartphones, where, for example, a user can point the smartphone's camera at a sign written in a foreign language and see, on the smartphone's screen, a translation of the sign in the user's own language superimposed on the sign along with the rest of the scene captured by the camera. In some examples, in mixed reality, see-through (e.g., transparent) holographic lenses, which may be referred to as waveguides, may permit the user to view real-world objects, i.e., actual objects in a real-world environment, such as real anatomy, through the holographic lenses and also concurrently view virtual objects.
The Microsoft HOLOLENS™ headset, available from Microsoft Corporation of Redmond, Washington, is an example of an MR device that includes see-through holographic lenses, sometimes referred to as waveguides, that permit a user to view real-world objects through the lenses and concurrently view projected 3D holographic objects. The Microsoft HOLOLENS™ headset and similar waveguide-based visualization devices are examples of MR visualization devices that may be used in accordance with some examples of this disclosure. Some holographic lenses may present holographic objects with some degree of transparency through see-through holographic lenses so that the user views real-world objects and virtual, holographic objects. In some examples, some holographic lenses may, at times, completely prevent the user from viewing real-world objects and instead may allow the user to view entirely virtual environments. The term mixed reality may also encompass scenarios where one or more users are able to perceive one or more virtual objects generated by holographic projection. In other words, “mixed reality” may encompass the case where a holographic projector generates holograms of elements that appear to a user to be present in the user's actual physical environment.
In some examples, in mixed reality, the positions of some or all presented virtual objects are related to positions of physical objects in the real world. For example, a virtual object may be tethered to a table in the real world, such that the user can see the virtual object when the user looks in the direction of the table but does not see the virtual object when the table is not in the user's field of view. In some examples, in mixed reality, the positions of some or all presented virtual objects are unrelated to positions of physical objects in the real world. For instance, a virtual item may always appear in the top right of the user's field of vision, regardless of where the user is looking.
In this disclosure, the term augmented reality (AR) refers to technology that is similar to MR in the presentation of both real-world and virtual elements, but AR generally refers to presentations that are mostly real, with a few virtual additions to “augment” the real-world presentation. For purposes of this disclosure, MR is considered to include AR. For example, in AR, parts of the user's physical environment that are in shadow can be selectively brightened without brightening other areas of the user's physical environment. This example is also an instance of MR in that the selectively-brightened areas may be considered virtual objects superimposed on the parts of the user's physical environment that are in shadow.
Furthermore, in this disclosure, the term “virtual reality” (VR) refers to an immersive artificial environment that a user experiences through sensory stimuli (such as sights and sounds) provided by a computer. Thus, in virtual reality, the user may not see any physical objects as they exist in the real world. Video games set in imaginary worlds are a common example of VR. The term “VR” also encompasses scenarios where the user is presented with a fully artificial environment in which the locations of some virtual objects are based on the locations of corresponding physical objects as they relate to the user. Walk-through VR attractions are examples of this type of VR.
The term “extended reality” (XR) is used in this disclosure to encompass a spectrum of user experiences that includes virtual reality, mixed reality, augmented reality, and other user experiences that involve the presentation of at least some perceptible elements as existing in the user's environment even though they are not present in the user's real-world environment. Thus, the term “extended reality” may be considered a genus for MR, AR, and VR. XR visualizations may be presented using any of the techniques for presenting mixed reality discussed elsewhere in this disclosure or using techniques for presenting VR, such as VR goggles.
In some examples, mixed reality systems and methods can be part of an intelligent surgical planning system that includes multiple subsystems that can be used to enhance surgical outcomes. In addition to the preoperative and intraoperative applications discussed above, an intelligent surgical planning system can include postoperative tools to assist with patient recovery and which can provide information that can be used to assist with and plan future surgical revisions or surgical cases for other patients.
Accordingly, systems and methods are also described herein that can be incorporated into an intelligent surgical planning system, such as artificial intelligence systems to assist with planning, implants with embedded sensors (e.g., smart implants) to provide postoperative feedback for use by the healthcare provider and the artificial intelligence system, and mobile applications to monitor and provide information to the patient and the healthcare provider in real-time or near real-time.
Visualization tools are available that utilize patient image data to generate three-dimensional models of bone contours to facilitate preoperative planning for joint repairs and replacements. These tools allow surgeons to design and/or select surgical guides and implant components that closely match the patient's anatomy. These tools can improve surgical outcomes by customizing a surgical plan for each patient. An example of such a visualization tool for shoulder repairs is the BLUEPRINT™ system available from Wright Medical Technology, Inc. The BLUEPRINT™ system provides the surgeon with two-dimensional planar views of the bone repair region as well as a three-dimensional virtual model of the repair region. The surgeon can use the BLUEPRINT™ system to select, design or modify appropriate implant components, determine how best to position and orient the implant components and how to shape the surface of the bone to receive the components, and design, select or modify surgical guide tool(s) or instruments to carry out the surgical plan. The information generated by the BLUEPRINT™ system is compiled in a preoperative surgical plan for the patient that is stored in a database at an appropriate location (e.g., on a server in a wide area network, a local area network, or a global network) where it can be accessed by the surgeon or other care provider, including before and during the actual surgery.
Users of orthopedic surgical system 100 may utilize virtual planning system 102 to plan orthopedic surgeries. Users of orthopedic surgical system 100 may use planning support system 104 to review surgical plans generated using orthopedic surgical system 100. Manufacturing and delivery system 106 may assist with the manufacture and delivery of items needed to perform orthopedic surgeries. Intraoperative guidance system 108 provides guidance to assist users of orthopedic surgical system 100 in performing orthopedic surgeries. Medical education system 110 may assist with the education of users, such as healthcare professionals, patients, and other types of individuals. Pre- and postoperative monitoring system 112 may assist with monitoring patients before and after the patients undergo surgery. Predictive analytics system 114 may assist healthcare professionals with various types of predictions. For example, predictive analytics system 114 may apply artificial intelligence techniques to determine a classification of a condition of an orthopedic joint, e.g., a diagnosis, determine which type of surgery to perform on a patient and/or which type of implant to be used in the procedure, determine types of items that may be needed during the surgery, and so on.
The subsystems of orthopedic surgical system 100 (i.e., virtual planning system 102, planning support system 104, manufacturing and delivery system 106, intraoperative guidance system 108, medical education system 110, pre- and postoperative monitoring system 112, and predictive analytics system 114) may include various systems. The systems in the subsystems of orthopedic surgical system 100 may include various types of computing devices, including server computers, personal computers, tablet computers, smartphones, display devices, Internet of Things (IoT) devices, visualization devices (e.g., MR visualization devices, VR visualization devices, holographic projectors, or other devices for presenting XR visualizations), surgical tools, and so on. A holographic projector, in some examples, may project a hologram for general viewing by multiple users or a single user without a headset, rather than viewing only by a user wearing a headset. For example, virtual planning system 102 may include a MR visualization device and one or more server devices, planning support system 104 may include one or more personal computers and one or more server devices, and so on. A computing system is a set of one or more computing devices configured to operate as a system. In some examples, one or more devices may be shared between two or more of the subsystems of orthopedic surgical system 100. For instance, in the previous example, virtual planning system 102 and planning support system 104 may include the same server devices.
In the example of
Communications network 116 may include various types of communication networks including one or more wide-area networks, such as the Internet, local area networks, and so on. In some examples, communications network 116 may include wired and/or wireless communication links.
Many variations of orthopedic surgical system 100 are possible in accordance with techniques of this disclosure. Such variations may include more or fewer subsystems than the version of orthopedic surgical system 100 shown in
In the example of
In the example of
In some examples, multiple users can simultaneously use MR system 212. For example, MR system 212 can be used in a spectator mode in which multiple users each use their own visualization devices so that the users can view the same information at the same time and from the same point of view. In some examples, MR system 212 may be used in a mode in which multiple users each use their own visualization devices so that the users can view the same information from different points of view.
In some examples, processing device(s) 210 can provide a user interface to display data and receive input from users at healthcare facility 204. Processing device(s) 210 may be configured to control visualization device 213 to present a user interface. Furthermore, processing device(s) 210 may be configured to control visualization device 213 to present virtual images, such as 3D virtual models, 2D images, and so on. Processing device(s) 210 can include a variety of different processing or computing devices, such as servers, desktop computers, laptop computers, tablets, mobile phones and other electronic computing devices, or processors within such devices. In some examples, one or more of processing device(s) 210 can be located remote from healthcare facility 204. In some examples, processing device(s) 210 reside within visualization device 213. In some examples, at least one of processing device(s) 210 is external to visualization device 213. In some examples, one or more processing device(s) 210 reside within visualization device 213 and one or more of processing device(s) 210 are external to visualization device 213.
In the example of
Network 208 may be equivalent to network 116. Network 208 can include one or more wide area networks, local area networks, and/or global networks (e.g., the Internet) that connect preoperative surgical planning system 202 and MR system 212 to storage system 206. Storage system 206 can include one or more databases that can contain patient information, medical information, patient image data, and parameters that define the surgical plans. For example, medical images of the patient's diseased or damaged bone typically are generated preoperatively in preparation for an orthopedic surgical procedure. The medical images can include images of the relevant bone(s) taken along the sagittal plane and the coronal plane of the patient's body. The medical images can include X-ray images, magnetic resonance imaging (MRI) images, computerized tomography (CT) images, ultrasound images, and/or any other type of 2D or 3D image that provides information about the relevant surgical area. Storage system 206 also can include data identifying the implant components selected for a particular patient (e.g., type, size, etc.), surgical guides selected for a particular patient, and details of the surgical procedure, such as entry points, cutting planes, drilling axes, reaming depths, etc. Storage system 206 can be a cloud-based storage system (as shown) or can be located at healthcare facility 204 or at the location of preoperative surgical planning system 202 or can be part of MR system 212 or visualization device (VD) 213, as examples.
MR system 212 can be used by a surgeon before (e.g., preoperatively) or during the surgical procedure (e.g., intraoperatively) to create, review, verify, update, modify and/or implement a surgical plan. In some examples, MR system 212 may also be used after the surgical procedure (e.g., postoperatively) to review the results of the surgical procedure, assess whether revisions are required, or perform other postoperative tasks. To that end, MR system 212 may include a visualization device 213 that may be worn by the surgeon and (as will be explained in further detail below) is operable to display a variety of types of information, including: a 3D virtual image of the patient's diseased, damaged, or postsurgical joint; details of the surgical plan, such as a 3D virtual image of the prosthetic implant components selected for the surgical plan; 3D virtual images of entry points for positioning the prosthetic components; alignment axes and cutting planes for aligning cutting or reaming tools to shape the bone surfaces, or drilling tools to define one or more holes in the bone surfaces, in the surgical procedure to properly orient and position the prosthetic components; surgical guides and instruments and their placement on the damaged joint; and any other information that may be useful to the surgeon to implement the surgical plan. MR system 212 can generate images of this information that are perceptible to the user of the visualization device 213 before and/or during the surgical procedure.
In some examples, MR system 212 includes multiple visualization devices (e.g., multiple instances of visualization device 213) so that multiple users can simultaneously see the same images and share the same 3D scene. In some such examples, one of the visualization devices can be designated as the master device and the other visualization devices can be designated as observers or spectators. Any observer device can be re-designated as the master device at any time, as may be desired by the users of MR system 212. Moreover, in some situations, observers or spectators may assist in one or more aspects of a surgical procedure.
In this way,
The virtual surgical plan may include a 3D virtual model corresponding to the anatomy of interest of the particular patient and a 3D model of a prosthetic component matched to the particular patient to repair the anatomy of interest or selected to repair the anatomy of interest. Furthermore, in the example of
In some examples, visualization device 213 is configured such that the user can manipulate the user interface (which is visually perceptible to the user when the user is wearing or otherwise using visualization device 213) to request and view details of the virtual surgical plan for the particular patient, including a 3D virtual model of the anatomy of interest (e.g., a 3D virtual bone model of the anatomy of interest, such as a glenoid bone or a humeral bone) and/or a 3D model of the prosthetic component selected to repair an anatomy of interest. In some such examples, visualization device 213 is configured such that the user can manipulate the user interface so that the user can view the virtual surgical plan intraoperatively, including (at least in some examples) the 3D virtual model of the anatomy of interest (e.g., a 3D virtual bone model of the anatomy of interest). In some examples, MR system 212 can be operated in an augmented surgery mode in which the user can manipulate the user interface intraoperatively so that the user can visually perceive details of the virtual surgical plan projected in a real environment, e.g., on a real anatomy of interest of the particular patient. In this disclosure, the terms real and real world may be used in a similar manner. For example, MR system 212 may present one or more virtual objects that provide guidance for preparation of a bone surface and placement of a prosthetic implant on the bone surface. Visualization device 213 may present one or more virtual objects in a manner in which the virtual objects appear to be overlaid on an actual, real anatomical object of the patient, within a real-world environment, e.g., by displaying the virtual object(s) with actual, real-world patient anatomy viewed by the user through holographic lenses. For example, the virtual objects may be 3D virtual objects that appear to reside within the real-world environment with the actual, real anatomical object.
As described in this disclosure, orthopedic surgical system 100 (
Various workflows may exist within the surgical process of
Furthermore, the example of
The example of
Additionally, in the example of
Furthermore, in the example of
In accordance with one or more aspects of this disclosure, when performing the automatic processing step, a computing system (e.g., virtual planning system 102) may select, based on the values of the plurality of parameters, one or more ancillary steps of a plurality of ancillary steps for inclusion in the arthroplasty procedure. The plurality of ancillary steps may be different than a standard set of steps included in the arthroplasty procedure (e.g., the plurality of ancillary steps are not included in the standard set of steps).
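Selecting ancillary steps from parameter values can be sketched as a small rule table mapping patient parameters to steps. The parameter names, thresholds, and step names below are hypothetical examples; the disclosure does not specify the actual rules the computing system applies.

```python
# Hypothetical rules for illustration only: each entry pairs an ancillary
# step name with a predicate over the patient's parameter values.
ANCILLARY_RULES = [
    ("osteophyte_removal",    lambda p: p.get("osteophyte_volume_mm3", 0) > 100),
    ("bone_graft",            lambda p: p.get("glenoid_bone_loss_pct", 0) > 25),
    ("subscapularis_repair",  lambda p: p.get("rotator_cuff_tear", False)),
]

def select_ancillary_steps(params):
    """Return the ancillary steps whose rule fires for these parameter values."""
    return [name for name, rule in ANCILLARY_RULES if rule(params)]
```

The selected steps would then be added to the standard set of steps in the arthroplasty procedure plan.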
The example of
A virtual planning step (412) may follow the manual correction step in
Furthermore, in the example of
Additionally, in the example of
In the example of
Postoperative patient monitoring may occur after completion of the surgical procedure (420). During the postoperative patient monitoring step, healthcare outcomes of the patient may be monitored. Healthcare outcomes may include relief from symptoms, ranges of motion, complications, performance of implanted surgical items, and so on. Pre- and postoperative monitoring system 112 (
The medical consultation, case creation, preoperative patient monitoring, image acquisition, automatic processing, manual correction, and virtual planning steps of
As mentioned above, one or more of the subsystems of orthopedic surgical system 100 may include one or more MR systems, such as MR system 212 (
In some examples, screen 520 may include see-through holographic lenses, sometimes referred to as waveguides, that permit a user to see real-world objects through (e.g., beyond) the lenses and also see holographic imagery projected into the lenses and onto the user's retinas by displays, such as liquid crystal on silicon (LCoS) display devices, which are sometimes referred to as light engines or projectors, operating as an example of a holographic projection system 538 within visualization device 213. In other words, visualization device 213 may include one or more see-through holographic lenses to present virtual images to a user. Hence, in some examples, visualization device 213 can operate to project 3D images onto the user's retinas via screen 520, e.g., formed by holographic lenses. In this manner, visualization device 213 may be configured to present a 3D virtual image to a user within a real-world view observed through screen 520, e.g., such that the virtual image appears to form part of the real-world environment. In some examples, visualization device 213 may be a Microsoft HOLOLENS™ headset, available from Microsoft Corporation, of Redmond, Washington, USA, or a similar device, such as, for example, a similar MR visualization device that includes waveguides. The HOLOLENS™ device can be used to present 3D virtual objects via holographic lenses, or waveguides, while permitting a user to view actual objects in a real-world scene, i.e., in a real-world environment, through the holographic lenses.
Although the example of
Visualization device 213 can also generate a user interface (UI) 522 that is visible to the user, e.g., as holographic imagery projected into see-through holographic lenses as described above. For example, UI 522 can include a variety of selectable widgets 524 that allow the user to interact with an MR system, such as MR system 212 of
Visualization device 213 can also include a transceiver 528 to connect visualization device 213 to a processing device 510 and/or to network 208 and/or to a computing cloud, such as via a wired communication protocol or a wireless protocol, e.g., Wi-Fi, Bluetooth, etc. Visualization device 213 also includes a variety of sensors to collect sensor data, such as one or more optical camera(s) 530 (or other optical sensors) and one or more depth camera(s) 532 (or other depth sensors), mounted to, on or within frame 518. In some examples, the optical sensor(s) 530 are operable to scan the geometry of the physical environment in which a user of MR system 212 is located (e.g., an operating room) and collect two-dimensional (2D) optical image data (either monochrome or color). Depth sensor(s) 532 are operable to provide 3D image data, such as by employing time of flight, stereo or other known or future-developed techniques for determining depth and thereby generating image data in three dimensions. Other sensors can include motion sensors 533 (e.g., Inertial Measurement Unit (IMU) sensors, accelerometers, etc.) to assist with tracking movement.
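One common way a depth sensor's per-pixel depth readings become 3D points is pinhole-model back-projection. The sketch below assumes hypothetical camera intrinsics (fx, fy, cx, cy); the disclosure does not state which method depth camera(s) 532 actually use.

```python
def deproject(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with metric depth Z into a camera-frame
    3D point using the pinhole model:
        X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = depth.
    fx, fy are focal lengths in pixels; (cx, cy) is the principal point."""
    return ((u - cx) * depth / fx, (v - cy) * depth / fy, depth)
```

Applying this to every valid depth pixel yields the point cloud of the scene that downstream steps (landmark detection, registration) can consume.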
MR system 212 processes the sensor data so that geometric, environmental, textural, or other types of landmarks (e.g., corners, edges or other lines, walls, floors, objects) in the user's environment or “scene” can be defined and movements within the scene can be detected. As an example, the various types of sensor data can be combined or fused so that the user of visualization device 213 can perceive 3D images that can be positioned, or fixed and/or moved within the scene. When a 3D image is fixed in the scene, the user can walk around the 3D image, view the 3D image from different perspectives, and manipulate the 3D image within the scene using hand gestures, voice commands, gaze line (or direction) and/or other control inputs. As another example, the sensor data can be processed so that the user can position a 3D virtual object (e.g., a bone model) on an observed physical object in the scene (e.g., a surface, the patient's real bone, etc.) and/or orient the 3D virtual object with other virtual images displayed in the scene. In some examples, the sensor data can be processed so that the user can position and fix a virtual representation of the surgical plan (or other widget, image or information) onto a surface, such as a wall of the operating room. Yet further, in some examples, the sensor data can be used to recognize surgical instruments and the position and/or location of those instruments.
Visualization device 213 may include one or more processors 514 and memory 516, e.g., within frame 518 of the visualization device. In some examples, one or more external computing resources 536 process and store information, such as sensor data, instead of or in addition to in-frame processor(s) 514 and memory 516. In this way, data processing and storage may be performed by one or more processors 514 and memory 516 within visualization device 213 and/or some of the processing and storage requirements may be offloaded from visualization device 213. Hence, in some examples, one or more processors that control the operation of visualization device 213 may be within visualization device 213, e.g., as processor(s) 514. Alternatively, in some examples, at least one of the processors that controls the operation of visualization device 213 may be external to visualization device 213, e.g., as processor(s) 210. Likewise, operation of visualization device 213 may, in some examples, be controlled in part by a combination of one or more processors 514 within the visualization device and one or more processors 210 external to visualization device 213.
For instance, in some examples, when visualization device 213 is in the context of
In some examples, MR system 212 can also include user-operated control device(s) 534 that allow the user to operate MR system 212, use MR system 212 in spectator mode (either as master or observer), interact with UI 522 and/or otherwise provide commands or requests to processing device(s) 210 or other systems connected to network 208. As examples, control device(s) 534 can include a microphone, a touch pad, a control panel, a motion sensor or other types of control input devices with which the user can interact.
Speakers 604, in some examples, may form part of sensory devices 526 shown in
In some examples, a user may interact with and control visualization device 213 in a variety of ways. For example, microphones 606, and associated speech recognition processing circuitry or software, may recognize voice commands spoken by the user and, in response, perform any of a variety of operations, such as selection, activation, or deactivation of various functions associated with surgical planning, intraoperative guidance, or the like. As another example, one or more cameras or other optical sensors 530 of sensors 614 may detect and interpret gestures to perform operations as described above. As a further example, sensors 614 may sense gaze direction and perform various operations as described elsewhere in this disclosure. In some examples, input devices 608 may receive manual input from a user, e.g., via a handheld controller including one or more buttons, a keypad, a touchscreen, joystick, trackball, and/or other manual input media, and perform, in response to the manual user input, various operations as described above.
As discussed above, surgical lifecycle 300 may include a preoperative phase 302 (
Furthermore, in the example of
In the example of
Furthermore, in the example of
Thus, as described in the examples above, two or more of the individuals described above (e.g., the first surgeon, the patient, the nurse, and the second surgeon) can view the same or different MR preoperative planning content 702 at the same time. In examples where two or more of the individuals are viewing the same MR preoperative planning content 702 at the same time, the two or more individuals may concurrently view the same MR preoperative planning content 702 from the same or different perspectives. Moreover, in some examples, two or more of the individuals described above can view the same or different MR preoperative planning content 702 at different times. Preoperative planning content 702 may include an information model of a surgical plan, virtual 3D model information representing patient anatomy, such as bone and/or tissue, alone, or in combination with virtual 3D model information representing surgical procedure steps and/or implant placement and positioning. Examples of preoperative planning content 702 may include a surgical plan for a shoulder arthroplasty, virtual 3D model information representing scapula and/or glenoid bone, or representing humeral bone, with virtual 3D model information of instruments to be applied to the bone or implants to be positioned on or in the bone. In some examples, multiple users may be able to change and manipulate preoperative planning content 702.
In the example of
Additionally, a surgical plan may be selected based on the pathology (804). The surgical plan is a plan to address the pathology. For instance, in the example where the area of interest is the patient's shoulder, the surgical plan may be selected from an anatomical shoulder arthroplasty, a reverse shoulder arthroplasty, a post-trauma shoulder arthroplasty, or a revision to a previous shoulder arthroplasty. The surgical plan may then be tailored to the patient (806). As one example, tailoring the surgical plan may involve selecting and/or sizing surgical items needed to perform the selected surgical plan. As another example, tailoring the surgical plan may involve determining a location (e.g., a position and/or an orientation) at which to install an implant. Additionally, the surgical plan may be tailored to the patient in order to address issues specific to the patient, such as the presence of osteophytes. As described in detail elsewhere in this disclosure, one or more users may use mixed reality systems of orthopedic surgical system 100 to tailor the surgical plan to the patient, including comparing the surgical plan for the patient to surgical plans for other patients.
The surgical plan may then be reviewed (808). For instance, a consulting surgeon may review the surgical plan before the surgical plan is executed. As described in detail elsewhere in this disclosure, one or more users may use MR systems of orthopedic surgical system 100 to review the surgical plan. In some examples, a surgeon may modify the surgical plan using an MR system by interacting with a UI and displayed elements, e.g., to select a different procedure, change the sizing, shape or positioning of implants, or change the angle, depth or amount of cutting or reaming of the bone surface to accommodate an implant.
Additionally, in the example of
The user can also organize or customize UI 522 by manipulating, moving and orienting any of the displayed widgets according to the user's preferences, such as by visualization device 213 or other device detecting gaze direction, hand gestures and/or voice commands. Further, the location of widgets that are displayed to the user can be fixed relative to the scene. Thus, as the user's gaze (i.e., eye direction) moves to view other features of the user interface 522, other virtual images, and/or real objects physically present in the scene (e.g., the patient, an instrument set, etc.), the widgets may remain stationary and do not interfere with the user's view of the other features and objects. As yet another example, the user can control the opacity or transparency of the widgets or any other displayed images or information. The user also can navigate in any direction between the buttons 1002 on the workflow bar 1000 and can select any one of buttons 1002 at any time during use of MR system 212. Selection and manipulation of widgets, information, images or other displayed features can be implemented based on visualization device 213 or other device detecting user gaze direction, hand motions, voice commands or any combinations thereof.
In the example of
As shown
The surgical plan image 1006 may be a compilation of preoperative (and, optionally, postoperative) patient information and the surgical plan for the patient that are stored in a database in storage system 206. As such, surgical plan image 1006 may include at least some components of an atlas of the patient. In some examples, surgical plan image 1006 can correspond to a multi-page document through which the user can browse. For example, further images of pages can display patient information, information regarding the anatomy of interest, postoperative measurements, and various 2D images of the anatomy of interest. Yet further page images can include, as examples, planning information associated with an implant selected for the patient, such as anatomy measurements and implant size, type and dimensions; planar images of the anatomy of interest; images of a 3D model showing the positioning and orientation of a surgical guide selected for the patient to assist with execution of the surgical plan; etc.
It should be understood that the surgical plan image 1006 can be displayed in any suitable format and arrangement and that other implementations of the systems and techniques described herein can include different information depending upon the needs of the application in which the plan image 1006 is used.
Referring again
Returning to the example of
Workflow bar 1000 in
In the example shown, each workflow page that can be selected by the user (e.g., a surgeon) can include an Augment Surgery widget that, when selected, launches an operational mode of MR system 212 in which a user using (e.g., wearing) visualization device 213 (
In this example of a shoulder repair procedure, and with reference
In some examples, the images displayed on UI 522 of MR system 212 can be viewed outside or within the surgical operating environment and, in spectator mode, can be viewed by multiple users outside and within the operating environment at the same time. In some circumstances, such as in the operating environment, the surgeon may find it useful to use a control device 534 to direct visualization device 213 such that certain information should be locked into position on a wall or other surface of the operating room, as an example, so that the information does not impede the surgeon's view during the procedure. For example, relevant surgical steps of the surgical plan can be selectively displayed and used by the surgeon or other care providers to guide the surgical procedure.
In some examples, the display of surgical steps can be automatically controlled so that only the relevant steps are displayed at the appropriate times during the surgical procedure.
As discussed above, surgical lifecycle 300 may include an intraoperative phase 306 during which a surgical operation is performed. One or more users may use orthopedic surgical system 100 in intraoperative phase 306. In some examples, one or more users, including at least one surgeon, may use orthopedic surgical system 100 in an intraoperative setting to perform shoulder surgery.
In the example of
Furthermore, in the example of
As discussed above, the humerus preparation process may enable the surgeon to access the patient's glenoid. In the example of
The surgeon may perform a reaming axis drilling process (1906). During the reaming axis drilling process, the surgeon may drill a reaming axis guide pin hole in the patient's glenoid to receive a reaming guide pin. In some examples, at a later stage of the shoulder surgery, the surgeon may insert a reaming axis pin into the reaming axis guide pin hole. In some examples, the reaming axis pin may itself be the drill bit that is used to drill the reaming axis guide pin hole (e.g., the reaming axis pin may be self-tapping). Thus, in such examples, it may be unnecessary to perform a separate step of inserting the reaming axis pin. In some examples, an MR system (e.g., MR system 212, MR system 1800A, etc.) may present a virtual reaming axis to help the surgeon perform the drilling in alignment with the reaming axis and thereby place the reaming guide pin in the correct location and with the correct orientation.
The surgeon may perform the reaming axis drilling process in one of various ways. For example, the surgeon may perform a guide-based process to drill the reaming axis pin hole. In that case, a physical guide is placed on the glenoid to guide drilling of the reaming axis pin hole. In other examples, the surgeon may perform a guide-free process, e.g., with presentation of a virtual reaming axis that guides the surgeon to drill the reaming axis pin hole with proper alignment. An MR system (e.g., MR system 212, MR system 1800A, etc.) may help the surgeon perform either of these processes to drill the reaming axis pin hole.
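For the guide-free process, one plausible quantity an MR system could present is the angular deviation between the tracked tool axis and the planned reaming axis. The function below is a sketch of that computation only; the disclosure does not specify how the MR system measures or displays alignment.

```python
import math

def axis_deviation_deg(planned_axis, tool_axis):
    """Angle in degrees between the planned reaming axis and the tracked
    tool axis, both given as 3D direction vectors (any nonzero length)."""
    dot = sum(a * b for a, b in zip(planned_axis, tool_axis))
    na = math.sqrt(sum(a * a for a in planned_axis))
    nb = math.sqrt(sum(b * b for b in tool_axis))
    # Clamp to [-1, 1] to guard against floating-point overshoot in acos.
    cosang = max(-1.0, min(1.0, dot / (na * nb)))
    return math.degrees(math.acos(cosang))
```

A deviation near zero would indicate the drill is aligned with the virtual reaming axis.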
Furthermore, in the surgical process of
After performing the reaming axis insertion process, the surgeon may perform a glenoid reaming process (1910). During the glenoid reaming process, the surgeon reams the patient's glenoid. Reaming the patient's glenoid may result in an appropriate surface for installation of a glenoid implant. In some examples, to ream the patient's glenoid, the surgeon may affix a reaming bit to a surgical drill. The reaming bit defines an axial cavity along an axis of rotation of the reaming bit. The axial cavity has an inner diameter corresponding to an outer diameter of the reaming axis pin. After affixing the reaming bit to the surgical drill, the surgeon may position the reaming bit so that the reaming axis pin is in the axial cavity of the reaming bit. Thus, during the glenoid reaming process, the reaming bit may spin around the reaming axis pin. In this way, the reaming axis pin may prevent the reaming bit from wandering during the glenoid reaming process. In some examples, multiple tools may be used to ream the patient's glenoid. An MR system (e.g., MR system 212, MR system 1800A, etc.) may present virtual guidance to help the surgeon or other users to perform the glenoid reaming process. For example, the MR system may help a user, such as the surgeon, select a reaming bit to use in the glenoid reaming process. In some examples, the MR system presents virtual guidance to help the surgeon control the depth to which the surgeon reams the patient's glenoid. In some examples, the glenoid reaming process includes a paleo reaming step and a neo reaming step to ream different parts of the patient's glenoid.
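Depth control could, for instance, be derived from tracked tool positions by projecting the bit's advance onto the reaming axis and comparing it to the planned depth. This is a hedged sketch under that assumption; the disclosure does not state how the MR system computes depth guidance.

```python
def reaming_depth_remaining_mm(planned_depth_mm, start_pos, current_pos, axis):
    """Remaining ream depth: planned depth minus the bit's advance along the
    reaming axis. start_pos/current_pos are tracked 3D tip positions (mm);
    axis is a unit vector along the reaming axis."""
    advance = sum((c - s) * a for c, s, a in zip(current_pos, start_pos, axis))
    return planned_depth_mm - advance
```

When the returned value reaches zero, the planned reaming depth has been achieved and the MR system could warn the surgeon to stop.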
Additionally, in the surgical process of
In some examples, the glenoid implantation process includes a process to fix the glenoid implant to the patient's scapula (1914). In some examples, the process to fix the glenoid implant to the patient's scapula includes drilling one or more anchor holes or one or more screw holes into the patient's scapula and positioning an anchor such as one or more pegs or a keel of the implant in the anchor hole(s) and/or inserting screws through the glenoid implant and the screw holes, possibly with the use of cement or other adhesive. An MR system (e.g., MR system 212, MR system 1800A, etc.) may present virtual guidance to help the surgeon with the process of fixing the glenoid implant to the glenoid bone, e.g., including virtual guidance indicating anchor or screw holes to be drilled or otherwise formed in the glenoid, and the placement of anchors or screws in the holes.
Furthermore, in the example of
Furthermore, in the example surgical process of
After performing the humerus implant installation process, the surgeon may perform an implant alignment process that aligns the installed glenoid implant and the installed humerus implant (1920). For example, in instances where the surgeon is performing an anatomical shoulder arthroplasty, the surgeon may nest the convex surface of the humerus implant into the concave surface of the glenoid implant. In instances where the surgeon is performing a reverse shoulder arthroplasty, the surgeon may nest the convex surface of the glenoid implant into the concave surface of the humerus implant. Subsequently, the surgeon may perform a wound closure process (1922). During the wound closure process, the surgeon may reconnect tissues severed during the incision process in order to close the wound in the patient's shoulder.
As mentioned elsewhere in this disclosure, a user interface of MR system 212 may include workflow bar 1000. Workflow bar 1000 includes icons corresponding to workflow pages. In some examples, each workflow page that can be selected by the user (e.g., a surgeon) can include an Augment Surgery widget that, when selected, launches an operational mode of MR system 212 in which a user wearing or otherwise using visualization device 213 can see the details (e.g., virtual images of details) of the surgical plan projected and matched onto the patient bone and use the plan intraoperatively to assist with the surgical procedure. In general, the Augment Surgery mode allows the surgeon to register the virtual 3D model of the patient's anatomy of interest (e.g., glenoid) with the observed real anatomy so that the surgeon can use the virtual surgical planning to assist with implementation of the real surgical procedure, as will be explained in further detail below.
For a shoulder arthroplasty application, the registration process may start by visualization device 213 presenting the user with 3D virtual bone model 1008 of the patient's scapula and glenoid that was generated from preoperative images of the patient's anatomy, e.g., by surgical planning system 102. The user can then manipulate 3D virtual bone model 1008 in a manner that aligns and orients 3D virtual bone model 1008 with the patient's real scapula and glenoid that the user is observing in the operating environment. As such, in some examples, the MR system may receive user input to aid in the initialization and/or registration. However, as discussed above, in some examples, the MR system may perform the initialization and/or registration process automatically (e.g., without receiving user input to position the 3D bone model). For other types of arthroplasty procedures, such as for the knee, hip, foot, ankle or elbow, different relevant bone structures can be displayed as virtual 3D images and aligned and oriented in a similar manner with the patient's actual, real anatomy.
Regardless of the particular type of joint or anatomical structure involved, selection of the augment surgery mode initiates a procedure where 3D virtual bone model 1008 is registered with an observed bone structure. In general, the registration procedure can be considered as a classical optimization problem (e.g., either minimization or maximization). For a shoulder arthroplasty procedure, known inputs to the optimization (e.g., minimization) analysis are the 3D geometry of the observed patient's bone (derived from sensor data from the visualization device 213, including depth data from the depth camera(s) 532) and the geometry of the 3D virtual bone derived during the virtual surgical planning stage (such as by using the BLUEPRINT™ system). Other inputs include details of the surgical plan (also derived during the virtual surgical planning stage, such as by using the BLUEPRINT™ system), such as the position and orientation of entry points, cutting planes, reaming axes and/or drilling axes, as well as reaming or drilling depths for shaping the bone structure, the type, size and shape of the prosthetic components, and the position and orientation at which the prosthetic components will be placed or, in the case of a fracture, the manner in which the bone structure will be rebuilt.
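One classical way to pose this registration minimization is iterative closest point (ICP) over rigid transforms. The sketch below is an assumed, simplified point-to-point ICP using only NumPy; it is not the actual algorithm used by MR system 212, which the disclosure does not specify.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst
    (Kabsch algorithm over paired N x 3 point arrays)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(model_pts, observed_pts, iters=30):
    """Align virtual bone model points to the observed point cloud; returns
    the accumulated rotation and translation."""
    src = model_pts.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iters):
        # Nearest-neighbour correspondences (brute force, for clarity only).
        d = np.linalg.norm(src[:, None, :] - observed_pts[None, :, :], axis=2)
        matched = observed_pts[d.argmin(axis=1)]
        R, t = best_rigid_transform(src, matched)
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

Given a reasonable initialization (e.g., from the user roughly placing the virtual bone model as described above), the iteration refines the pose until the virtual model overlays the observed bone.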
Upon selection of a particular patient from the welcome page of UI 522 of MR system 212 (
On occasion, during a surgery, the surgeon may determine that there is a need to modify the preoperative surgical plan. MR system 212 allows for intraoperative modifications to the surgical plan that then can be executed in the Augmented Surgery Mode. For instance, in some examples, the user can manipulate the user interface so that the user can view the virtual surgical plan intraoperatively, including at least the 3D virtual bone anatomy of interest. In such examples, the user can manipulate the user interface so that the user can modify the virtual surgical plan intraoperatively. As an example, selection of the Planning page on the workflow bar 1000 of the UI 522 shown in
As discussed elsewhere in this disclosure, orthopedic surgical procedures may involve performing various work on a patient's anatomy. Some examples of work that may be performed include, but are not necessarily limited to, cutting, drilling, reaming, screwing, adhering, and impacting. In general, it may be desirable for a practitioner (e.g., surgeon, physician's assistant, nurse, etc.) to perform the work as accurately as possible. For instance, if a surgical plan for implanting a prosthetic in a particular patient specifies that a portion of the patient's anatomy is to be reamed at a particular diameter to a particular depth, it may be desirable for the surgeon to ream the portion of the patient's anatomy to as close as possible to the particular diameter and to the particular depth (e.g., to increase the likelihood that the prosthetic will fit and function as planned and thereby promote a good health outcome for the patient).
In some examples, a surgeon may perform one or more work operations “free hand” (i.e., by applying or otherwise using a tool without mechanical or visual guides/aids for the tool). In some examples, in the course of an orthopedic surgical procedure, a surgeon may perform one or more work operations, which also may be referred to as surgical steps, with the assistance of a mechanical guide. In some examples, a visualization system, such as MR visualization system 212, may be configured to display virtual guidance including one or more virtual guides for performing work on a portion of a patient's anatomy.
For instance, the visualization system may display a virtual cutting plane overlaid on an anatomic neck of the patient's humerus. In some examples, a user such as a surgeon may view real-world objects in a real-world scene. The real-world scene may be in a real-world environment such as a surgical operating room. In this disclosure, the terms real and real-world may be used in a similar manner. The real-world objects viewed by the user in the real-world scene may include the patient's actual, real anatomy, such as an actual glenoid or humerus, exposed during surgery. The user may view the real-world objects via a see-through (e.g., transparent) screen, such as see-through holographic lenses, of a head-mounted MR visualization device, such as visualization device 213, and also see virtual guidance such as virtual MR objects that appear to be projected on the screen or within the real-world scene, such that the MR guidance object(s) appear to be part of the real-world scene, e.g., with the virtual objects appearing to the user to be integrated with the actual, real-world scene. For example, the virtual cutting plane/line may be projected on the screen of an MR visualization device, such as visualization device 213, such that the cutting plane is overlaid on, and appears to be placed within, an actual, observed view of the patient's actual humerus viewed by the surgeon through the transparent screen, e.g., through see-through holographic lenses. Hence, in this example, the virtual cutting plane/line may be a virtual 3D object that appears to be part of the real-world environment, along with actual, real-world objects.
A screen through which the surgeon views the actual, real anatomy and also observes the virtual objects, such as virtual anatomy and/or virtual surgical guidance, may include one or more see-through holographic lenses. The holographic lenses, sometimes referred to as “waveguides,” may permit the user to view real-world objects through the lenses and display projected holographic objects for viewing by the user. As discussed above, an example of a suitable head-mounted MR device for visualization device 213 is the Microsoft HOLOLENS™ headset, available from Microsoft Corporation, of Redmond, Washington, USA. The HOLOLENS™ headset includes see-through, holographic lenses, also referred to as waveguides, in which projected images are presented to a user. The HOLOLENS™ headset also includes an internal computer, cameras and sensors, and a projection system to project the holographic content via the holographic lenses for viewing by the user. In general, the Microsoft HOLOLENS™ headset or a similar MR visualization device may include, as mentioned above, LCOS display devices that project images into holographic lenses, also referred to as waveguides, e.g., via optical components that couple light from the display devices to optical waveguides. The waveguides may permit a user to view a real-world scene through the waveguides while also viewing a 3D virtual image presented to the user via the waveguides. In some examples, the waveguides may be diffraction waveguides.
The visualization system (e.g., MR system 212/visualization device 213) may be configured to display different types of virtual guides. Examples of virtual guides include, but are not limited to, a virtual point, a virtual axis, a virtual angle, a virtual path, a virtual plane, and a virtual surface or contour. As discussed above, the visualization system (e.g., MR system 212/visualization device 213) may enable a user to directly view the patient's anatomy via a lens by which the virtual guides are displayed, e.g., projected. The virtual guides may guide or assist various aspects of the surgery. For instance, a virtual guide may guide at least one of preparation of anatomy for attachment of the prosthetic or attachment of the prosthetic to the anatomy.
The visualization system may obtain parameters for the virtual guides from a virtual surgical plan, such as the virtual surgical plan described herein. Example parameters for the virtual guides include, but are not necessarily limited to, a guide location, a guide orientation, a guide type, a guide color, etc.
The visualization system may display a virtual guide in a manner in which the virtual guide appears to be overlaid on an actual, real anatomical object of the patient, within a real-world environment, e.g., by displaying the virtual guide(s) with actual, real-world patient anatomy (e.g., at least a portion of the patient's anatomy) viewed by the user through holographic lenses. For example, the virtual guides may be 3D virtual objects that appear to reside within the real-world environment with the actual, real anatomical object. The visualization system may display virtual guidance for any combination of standard steps and ancillary steps.
The techniques of this disclosure are described below with respect to an ankle arthroplasty surgical procedure. However, the techniques are not so limited, and the visualization system may be used to provide virtual guidance information, including virtual guides, in any type of surgical procedure. Other example procedures in which a visualization system, such as MR system 212, may be used to provide virtual guides include, but are not limited to, other types of orthopedic surgeries; any type of procedure with the suffix “plasty,” “stomy,” “ectomy,” “clasia,” or “centesis”; orthopedic surgeries for other joints, such as elbow, wrist, finger, hip, knee, shoulder, or toe; or any other orthopedic surgical procedure in which precision guidance is desirable.
A typical shoulder arthroplasty includes performing various work on a patient's scapula and performing various work on the patient's humerus. The work on the scapula may generally be described as preparing the scapula (e.g., the glenoid cavity of the scapula) for attachment of a prosthesis and attaching the prosthesis to the prepared scapula. Similarly, the work on the humerus may generally be described as preparing the humerus for attachment of a prosthesis and attaching the prosthesis to the prepared humerus. As described herein, the visualization system may provide guidance for any or all work performed in such an arthroplasty procedure.
This disclosure describes techniques that use XR to assist users (e.g., surgeons or other types of persons) through the workflow steps of orthopedic surgeries in a way that may address challenges such as those mentioned above. As described elsewhere in this disclosure, XR may include VR, MR, and AR. In examples where XR is used to assist a user through the workflow steps of an orthopedic surgery and XR takes the form of VR, the user may be performing a simulation of the orthopedic surgery or may be performing the orthopedic surgery remotely. In examples where XR is used to assist a user through workflow steps of an orthopedic surgery and XR takes the form of MR or AR, the surgeon may concurrently perceive real-world objects and virtual objects during the orthopedic surgery.
As noted above, the techniques of this disclosure may be applicable to ankle surgery (e.g., total ankle arthroplasty). In the example of a total ankle arthroplasty, a surgeon may perform a distal tibial cut, a proximal calcaneus cut, and two other medial/lateral cuts. To do so, the surgeon may need to place a cutting guide on the ankle joint. The cutting guide is placed so that the cuts will be perpendicular to the mechanical axis of the tibia. The placement of the cutting guide is then refined by adjusting three angles relative to the three anatomical planes (axial, sagittal, and coronal). The surgeon can perform these cuts using a cut jig or can perform these cuts directly using an oscillating saw. Next, the surgeon performs the posterior and anterior talar chamfer cuts.
Many of the examples provided above with regards to cutting and drilling are applicable to the cutting and drilling operations performed during a total ankle arthroplasty. For example, during preoperative phase 302 (
Furthermore, during the intraoperative phase 306 (
In the example of
The surgeon may perform a registration process that registers a virtual tibia object with the patient's actual tibia bone (15004) in the field of view presented to the surgeon by visualization device 213. For instance, MR system 212 may obtain the virtual tibia object from storage system 206 of
The surgeon may perform various work steps to prepare the tibia bone (15006). Example work steps to prepare the tibia bone include, but are not limited to, installing one or more guide pins into the tibia bone, drilling one or more holes in the tibia bone, and/or attaching one or more guides to the tibia bone. MR system 212 may provide virtual guidance to assist the surgeon with the various work steps to prepare the tibia bone. As discussed above, MR system 212 may display an animation, video, or text to describe how a particular step or steps are to be performed. For instance, MR system 212 may cause visualization device 213 to display a diagram or animation showing how the tibia is to be prepared. As also discussed above, MR system 212 may display a virtual checklist, with each item on the checklist corresponding to a step of the orthopedic surgery. For instance, MR system 212 may display a virtual checklist having a checklist item specifying a current step, or sequence of steps, of preparing the tibia bone.
In some examples, such as the example of
In addition to, or in place of tibial guide 15112, MR system 212 may provide virtual guidance to assist the surgeon with the installation of guide pins 15104A, 15104B, 15106A, and 15106B. For instance, visualization device 213 may display a virtual marker that guides a surgeon in installing a guide pin. Visualization device 213 may display the virtual marker with an appearance that the virtual marker is overlaid on tibia 15102 (e.g., to indicate the position and/or orientation at which the guide pin is to be installed). The virtual marker may be a virtual axis at a point on tibia 15102 that guides a surgeon in installing a guide pin. For instance, as shown in
MR system 212 may utilize different types of virtual markers depending on whether or not a physical guide is also used. As one example, in the example of
In examples where multiple guide pins are to be installed, visualization device 213 may display a respective virtual marker for each guide pin. In the example of
MR system 212 may display the virtual markers with particular colors. For instance, in some examples, MR system 212 may preferably display the virtual markers in a color other than red, such as green, blue, yellow, etc. Displaying the virtual markers in a color or colors other than red may provide one or more benefits. For instance, as blood appears red and blood may be present on or around the anatomy of interest, a red colored virtual marker may not be clearly visible.
In some examples, such as where visualization device 213 displays multiple virtual markers at the same time, visualization device 213 may alter or otherwise modify the display of a virtual marker after the surgeon has completed a corresponding work step. Alterations of the display of virtual markers may include, but are not limited to, changing a color, changing a marker type, animating (e.g., blinking or flashing), displaying an additional element (e.g., an X or a checkmark on or near the virtual marker) or any other visually perceptible alteration. For instance, visualization device 213 may initially display a first virtual marker to guide installation of guide pin 15104A as a virtual axis and a second virtual marker to guide installation of guide pin 15104B as a virtual axis. After the surgeon installs guide pin 15104A, visualization device 213 may modify the first virtual marker displayed to guide installation of guide pin 15104A (e.g., changing from a virtual axis to a reticle) while maintaining the display of the second virtual marker as a virtual axis.
MR system 212 may provide other virtual guidance in addition to, or in place of, the virtual markers. For instance, MR system 212 may display depth guidance to enable the surgeon to install the guide pins to a target depth. As another example, MR system 212 may provide targeting guidance. For instance, MR system 212 may display one or both of a virtual marker that identifies a center point or prescribed axis of the pin installation and/or an indication of whether the guide pin is aligned with the prescribed axis. As discussed above, MR system 212 may determine whether the guide pin is aligned with the prescribed axis by monitoring a position/orientation of the guide pin and/or a drill driving the guide pin, and comparing the monitored position/orientation with the prescribed axis.
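The targeting guidance described above can be sketched as follows. This is a minimal illustration, not the disclosure's implementation: it assumes the tracking subsystem reports the monitored tool axis and the prescribed axis as 3D direction vectors, and the function names and the 2-degree tolerance are hypothetical.

```python
import numpy as np

def axis_deviation_deg(tool_axis, planned_axis):
    """Angle, in degrees, between the tracked tool axis and the prescribed axis."""
    a = np.asarray(tool_axis, dtype=float)
    b = np.asarray(planned_axis, dtype=float)
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    # Clip guards against floating-point values slightly outside [-1, 1].
    cos_theta = np.clip(np.dot(a, b), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_theta)))

def is_aligned(tool_axis, planned_axis, tol_deg=2.0):
    """True when the monitored axis is within an angular tolerance of the plan."""
    return axis_deviation_deg(tool_axis, planned_axis) <= tol_deg
```

A system could evaluate this check on each tracking update and use the boolean result to drive the displayed indication of whether the guide pin is aligned with the prescribed axis.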
The surgeon may install guide pins 15104A, 15104B, 15106A, and 15106B using the virtual guidance. In examples where tibial guide 15112 was used, the surgeon may remove tibial guide 15112 after installation of guide pins 15104A, 15104B, 15106A, and 15106B.
In addition to, or in place of drilling guide 15202, MR system 212 may provide virtual guidance to assist the surgeon with the drilling of the proximal corners of tibia 15102. For instance, visualization device 213 may display a virtual marker that guides a surgeon in drilling a hole in tibia 15102. Visualization device 213 may display the virtual marker overlaid on tibia 15102 (e.g., to indicate the position and/or orientation at which the hole is to be drilled). The virtual marker may be a virtual drilling axis at a point on tibia 15102 that guides a surgeon in performing the drilling. Similar to the virtual markers discussed above that guide installation of guide pins, visualization device 213 may display the virtual markers that guide the drilling of the proximal corners of tibia 15102 concurrently or sequentially, with a respective virtual marker guiding the drilling at each proximal corner of the tibia.
MR system 212 may provide other virtual guidance in addition to, or in place of, the virtual markers. For instance, MR system 212 may display depth guidance to enable the surgeon to drill the holes to a target depth. As another example, MR system 212 may provide targeting guidance. For instance, MR system 212 may display one or both of a virtual marker that identifies a center point or prescribed axis of the drilling, e.g., into the tibia or talus, and/or an indication of whether the drill bit is aligned with the prescribed axis. As discussed above, MR system 212 may determine whether the drill bit is aligned with the prescribed axis by monitoring a position/orientation of the drill bit and/or a drill driving the drill bit, and comparing the monitored position/orientation with the prescribed axis.
With continued reference to the stages of an ankle joint repair surgery of
In addition to, or in place of resection guide 15302, MR system 212 may provide virtual guidance to assist the surgeon with performing the resection of tibia 15102. For instance, visualization device 213 may display a virtual marker that guides a surgeon in performing a cut in tibia 15102. Visualization device 213 may display the marker overlaid on tibia 15102 (e.g., to indicate the position and/or orientation at which the cut is to be made). The virtual marker may be a virtual cutting line, a virtual cutting surface or a virtual cutting plane at a point on tibia 15102 that guides a surgeon in performing the cut. Similar to the virtual markers discussed above that guide installation of guide pins, visualization device 213 may display the virtual markers that guide the performance of the proximal, medial, and lateral cuts concurrently or sequentially. In this way, MR system 212 may display a plurality of virtual cutting surfaces each having parameters obtained from the virtual surgical plan, the plurality of virtual cutting surfaces configured to guide resection of the tibia.
MR system 212 may provide other virtual guidance in addition to, or in place of, the virtual markers. For instance, MR system 212 may display depth guidance to enable the surgeon to perform the cuts to a target depth. As another example, MR system 212 may provide targeting guidance. For instance, MR system 212 may display one or both of a virtual marker that identifies a prescribed plane of the cutting and/or an indication of whether the saw blade is aligned with the prescribed plane. As discussed above, MR system 212 may determine whether the saw blade is aligned with the prescribed plane by monitoring a position/orientation of the saw blade and/or a motor driving the saw blade, and comparing the monitored position/orientation with the prescribed plane.
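The saw-blade check can be sketched analogously to the axis check, but against a plane. This is a minimal illustration under stated assumptions, not the disclosure's implementation: the prescribed cutting plane is assumed to be given by a point and a unit normal, the tracked blade pose is assumed to yield a point on the blade and the blade-plane normal, and all names and units (millimeters, degrees) are hypothetical.

```python
import numpy as np

def plane_alignment(blade_point, blade_normal, plane_point, plane_normal):
    """Return (tilt_deg, offset) of the blade relative to the prescribed plane.

    tilt_deg: angle between the blade plane and the prescribed cutting plane.
    offset:   signed distance of the blade point from the prescribed plane,
              in the same length units as the input points.
    """
    n1 = np.asarray(blade_normal, dtype=float)
    n2 = np.asarray(plane_normal, dtype=float)
    n1 = n1 / np.linalg.norm(n1)
    n2 = n2 / np.linalg.norm(n2)
    # Normals may point either way, so compare via the absolute dot product.
    tilt = float(np.degrees(np.arccos(np.clip(abs(np.dot(n1, n2)), -1.0, 1.0))))
    # Signed point-to-plane distance along the prescribed normal.
    delta = np.asarray(blade_point, dtype=float) - np.asarray(plane_point, dtype=float)
    offset = float(np.dot(delta, n2))
    return tilt, offset
```

A system could display the tilt and offset continuously, or compare them against tolerances to indicate whether the blade is aligned with the prescribed plane.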
The surgeon may remove the resection (i.e., the portion of tibia 15102 separated via the cuts). Guide pins 15104A and 15104B may be attached to the resection and removed as a consequence of the resection removal.
As discussed above, MR system 212 may cause the second visualization device to display virtual information that identifies a surgical item selected for a current step of the ankle arthroplasty procedure. For instance, in the example of
Furthermore, with reference to the stages of the ankle joint repair surgery of
Additionally, in the example of
In some examples, such as the example of
In addition to, or in place of talar guide 15404, MR system 212 may provide virtual guidance to assist the surgeon with the installation of guide pins 15402A and 15402B. For instance, visualization device 213 may display one or more virtual markers that guide a surgeon in installing a guide pin of guide pins 15402A and 15402B. For instance, as shown in
The surgeon may install guide pins 15402A and 15402B using the virtual guidance. For example, the surgeon may align the longitudinal axes of guide pins 15402A and 15402B with respective virtual axes to place the pins in bone. In examples where talar guide 15404 was used, the surgeon may remove talar guide 15404 after installation of guide pins 15402A and 15402B.
As discussed above, MR system 212 may cause the second visualization device to display virtual information that identifies a surgical item selected for a current step of the ankle arthroplasty procedure. For instance, where the surgeon may use talar guide 15404 to install guide pins 15402A and 15402B, MR system 212 may select talar guide 15404 as the selected surgical item.
With continued reference to
In addition to, or in place of resection guide 15302, MR system 212 may provide virtual guidance to assist the surgeon with performing the resection of talus 15108. For instance, visualization device 213 may display a virtual marker that guides a surgeon in performing a cut in talus 15108. Visualization device 213 may display the marker overlaid on talus 15108 (e.g., to indicate the position and/or orientation at which the cut is to be made). The virtual marker may be a virtual cutting line, virtual cutting surface or virtual cutting plane at a point on talus 15108 that guides a surgeon in performing the cut. In this way, MR system 212 may display a virtual cutting surface having parameters obtained from the virtual surgical plan, the virtual cutting surface configured to guide primary resection of the talus.
MR system 212 may provide other virtual guidance in addition to, or in place of, the virtual markers. For instance, MR system 212 may display depth guidance to enable the surgeon to perform the cut to a target depth (e.g., depth guidance similar to the depth guidance discussed above). As another example, MR system 212 may provide targeting guidance. For instance, MR system 212 may display one or both of a virtual marker that identifies a prescribed plane of the cutting and/or an indication of whether the saw blade is aligned with the prescribed plane. As discussed above, in some examples, MR system 212 may determine whether the saw blade is aligned with the prescribed plane by registering the saw blade or something connected thereto (e.g., a saw motor body, a saw handle, a physical registration marker, etc.) with a corresponding virtual model, and comparing the position of the corresponding virtual model with the prescribed plane.
The surgeon may remove the resection (i.e., the portion of talus 15108 separated via the cuts). In some examples, the surgeon may use various tools (e.g., a reciprocating saw or bone rasp) to remove any excess bone left after the resection has been removed.
The surgeon may perform one or more additional work steps on one or both of tibia 15102 and/or talus 15108 to prepare tibia 15102 and/or talus 15108 to receive implants. Example additional work steps include, but are not necessarily limited to, tibial tray trialing, tibial peg broaching, talar chamfer resections, and talar peg drilling.
To perform tibial tray trialing, the surgeon may attach tibial tray trial 15702 to tibia 15102. As shown in
In some examples, the surgeon may utilize fluoroscopy to perform the tibial tray trialing. For instance, the surgeon may utilize fluoroscopy to determine the relative positions of tibial tray trial 15702 and tibia 15102.
MR system 212 may provide virtual guidance to assist with tibial tray trialing. As one example, visualization device 213 may display a synthesized view showing the relative positions of tibial tray trial 15702 and tibia 15102. For instance, MR system 212 may register tibial tray trial 15702 to a corresponding virtual model of tibial tray trial 15702 and utilize the registered virtual models of tibial tray trial 15702 and tibia 15102 to synthesize a view showing the relative positions of the virtual models of tibial tray trial 15702 and tibia 15102. As the virtual models of tibial tray trial 15702 and tibia 15102 are respectively registered to tibial tray trial 15702 and tibia 15102, the relative positions of the virtual models of tibial tray trial 15702 and tibia 15102 correspond to the relative positions of tibial tray trial 15702 and tibia 15102. The synthesized views may appear similar to the conceptual diagrams of
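The synthesized view relies on expressing both registered models in a common frame. Below is a minimal sketch of the underlying pose arithmetic, assuming each registration yields a 4x4 homogeneous transform from a model frame into a shared world frame; the function and argument names are hypothetical, not from this disclosure.

```python
import numpy as np

def relative_pose(T_world_tibia, T_world_tray):
    """Pose of the tray-trial model expressed in the tibia model's frame.

    Each argument is a 4x4 homogeneous transform mapping a virtual model's
    frame into a common world frame, as produced by registration. The
    returned transform is what a synthesized view of the two models would
    use to place the tray trial relative to the tibia.
    """
    return np.linalg.inv(T_world_tibia) @ T_world_tray
```

For example, if registration places the tray trial 3 mm posterior of the tibia frame's origin, the translation component of the returned transform reports that 3 mm offset directly, which is the kind of relationship (e.g., posterior edge overhang) the synthesized view makes visible.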
The surgeon may utilize the synthesized view to perform one or more adjustments on tibial tray trial 15702. For instance, if the synthesized view indicates that posterior edge 15704 of tibial tray trial 15702 extends past posterior edge 15706 of tibia 15102, the surgeon may adjust tibial tray trial 15702 to anteriorly advance posterior edge 15704 of tibial tray trial 15702. For instance, the surgeon may utilize tool 15708 to anteriorly translate tibial tray trial 15702.
The surgeon may utilize the synthesized view to determine which size tibial implant is to be utilized. For instance, if the synthesized view indicates that indicator 15710 (illustrated in
As described above, MR system 212 may enable the surgeon to perform tibial tray trialing using virtual guidance. In some examples, MR system 212 may enable the surgeon to perform tibial tray trialing without using fluoroscopy.
As discussed above, MR system 212 may cause the second visualization device to display virtual information that identifies a surgical item selected for a current step of the ankle arthroplasty procedure. For instance, in the example of
The surgeon may create anchorage points for the tibial implant. For instance, the surgeon may utilize a tibial tray trial to perform tibial peg broaching.
As discussed above, MR system 212 may cause the second visualization device to display virtual information that identifies a surgical item selected for a current step of the ankle arthroplasty procedure. For instance, in the example of
The surgeon may perform one or more talar chamfer resections to further prepare talus 15108 to receive the talar implant. In some examples, the surgeon may perform an anterior talar chamfer resection and a posterior talar chamfer resection. To perform the one or more talar resections, the surgeon may attach one or more guide pins to talus 15108.
In some examples, the surgeon may utilize a physical guide to assist with the installation of guide pins 15904A and 15904B to talus 15108. For instance, the surgeon may utilize fluoroscopy to position a talar dome trial component. When the talar dome trial component is positioned, the surgeon may utilize holes in the talar dome trial component to guide the installation of guide pins 15904A and 15904B.
The surgeon may perform the talar chamfer resections using guide pins 15904A and 15904B. For instance, as shown in
MR system 212 may provide virtual guidance to assist the surgeon with the installation of fixation screws 16102A and 16102B. As one example, visualization device 213 may display virtual markers that indicate the location and axis at which fixation screws 16102A and 16102B are to be installed. As another example, visualization device 213 may provide depth guidance to enable the surgeon to install fixation screws 16102A and 16102B to a target depth. In some examples, MR system 212 may utilize closed-loop tool control to positively control a drill used to attach fixation screws 16102A and 16102B.
The surgeon may utilize talar resection guide base 16002 to perform the posterior talar chamfer resection. For instance, as shown in
As discussed above, MR system 212 may cause the second visualization device to display virtual information that identifies a surgical item selected for a current step of the ankle arthroplasty procedure. For instance, in the example of
In addition to, or in place of talar resection guide base 16002, MR system 212 may provide virtual guidance to assist the surgeon with performing the posterior talar chamfer resection. For instance, visualization device 213 may display a virtual marker that guides a surgeon in performing the posterior talar chamfer resection. Visualization device 213 may display the marker overlaid on talus 15108 (e.g., to indicate the position and/or orientation at which the cut is to be made). The virtual marker may be a virtual surface or virtual cutting plane at a point on talus 15108 that guides a surgeon in performing the cut.
The surgeon may utilize talar resection guide base 16002 to perform the anterior talar chamfer resection. For instance, as shown in
In some examples, for one or both of the anterior flat and anterior chamfer preparation, the surgeon may perform plunge cuts (e.g., using talar reamer 16204) to prepare talus 15108 for reaming. For instance, the surgeon may attach a pilot guide with holes that guide performance of the plunge cuts. Depth stop 16206 of talar reamer 16204 may engage with a surface of the pilot guide to control the plunge depth.
In addition to, or in place of talar resection guide base 16002, MR system 212 may provide virtual guidance to assist the surgeon with performing the anterior talar chamfer resection. For instance, visualization device 213 may display one or more virtual markers that guide a surgeon in performing the plunge cuts and/or horizontal reaming. As one example, visualization device 213 may display a respective virtual axis for each of the plunge cuts. MR system 212 may provide other virtual guidance to assist with performing the plunge cuts and/or horizontal reaming in addition to, or in place of, the virtual markers. For instance, MR system 212 may provide any of the additional virtual guidance (e.g., depth guidance, targeting guidance, etc.) discussed above.
The surgeon may perform talar peg drilling to create anchorage points in talus 15108 for the talar implant. MR system 212 may provide virtual guidance to assist the surgeon with performing the talar peg drilling. For instance, visualization device 213 may display one or more virtual markers that guide a surgeon in drilling holes in talus 15108. As shown in
With continued reference to
The surgeon may install tibial implant 16602 such that posterior peg 16604A, and anterior pegs 16604B and 16604C of tibial implant 16602 engage with peg holes 16702A-16702C of tibia 15102. For instance, the surgeon may position tibial implant 16602 such that posterior peg 16604A lines up with peg hole 16702A, anterior peg 16604B lines up with peg hole 16702B, and anterior peg 16604C lines up with peg hole 16702C. Once the pegs are lined up with their corresponding peg holes, the surgeon may impact tibial implant 16602 into tibia 15102. As discussed above, MR system 212 may display an animation, video, or text to describe how a particular step or steps are to be performed. For instance, MR system 212 may cause visualization device 213 to display a diagram or animation showing how tibial implant 16602 is to be installed. As also discussed above, MR system 212 may display a virtual checklist, with each item on the checklist corresponding to a step of the orthopedic surgery. For instance, MR system 212 may display a virtual checklist having a checklist item specifying a current step, or sequence of steps, of installing the tibial implant.
With continued reference to
The surgeon may install talar implant 16902 such that first peg 16904A and second peg 16904B of talar implant 16902 engage with peg holes 16502A and 16502B of talus 15108. For instance, the surgeon may position talar implant 16902 such that first peg 16904A lines up with peg hole 16502A, and second peg 16904B of talar implant 16902 lines up with peg hole 16502B. Once the pegs are lined up with their corresponding peg holes, the surgeon may impact talar implant 16902 into talus 15108.
As discussed above, MR system 212 may cause the second visualization device to display virtual information that identifies a surgical item selected for a current step of the ankle arthroplasty procedure. For instance, in the example of
As discussed above, MR system 212 may cause the second visualization device to display virtual information that identifies a surgical item selected for a current step of the ankle arthroplasty procedure. For instance, in the example of
With continued reference to
Subsequently, in the example of
As discussed above, implants may be available in various sizes. For instance, one or both of tibial implant 16602 of
As also discussed above, implant alignment may also be an important aspect of surgical planning. Implant alignment may include one or both of implant position/location (e.g., in a Cartesian sense, such as a 3D coordinate) and orientation (e.g., in a rotational sense, such as a 3D rotation matrix).
In accordance with one or more aspects of this disclosure, virtual planning system 102 may provide automated alignment and sizing advice for a particular patient based on alignment and sizing of implants of other patients. In some examples, virtual planning system 102 may utilize patient atlases to provide the automated alignment and sizing advice. The atlas of a current patient (i.e., the patient for which virtual planning system 102 is providing the advice) may be referred to as a target atlas. The atlases of other patients may be referred to as reference atlases. A reference atlas of a patient may include a preoperative scan of the patient, 3D models of a bone of the patient, reference anatomical axes of each bone, and corresponding implant size and placement used to install an implant in the patient. A target atlas may include similar components (but does not include the implant size and alignment). An atlas, be it target or reference, may include other data points (e.g., cyst 3D models, age or weight of the patient). In the context of an ankle arthroplasty, an atlas may include one or more of the following: a CT scan, a 3D model of the distal tibia along with the corresponding anatomical axes of that bone (AP, AM, ML and mechanical axis), a 3D model of the talus along with the corresponding anatomical axes of that bone (AP, AM, ML and mechanical axis), an implant size and placement, a contour of the tibial cut, a contour of the talar cut, anatomical measures of the foot and ankle, a surgery strategy, a surgery type (e.g., primary, revision, or fusion take down), patient specificities (e.g., bone fusions, former fractures, presence of other hardware), and a fore foot condition (e.g., an angle between the first and second metatarsals). Atlases may be sanitized of patient personal identifying information (e.g., names and other such information may be removed such that the reference atlases are anonymized).
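The atlas contents listed above could be organized as a simple record type. The sketch below is illustrative only: the field names and types are assumptions, the mesh representation is left abstract, and only a subset of the listed data points is shown.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AnatomicalAxes:
    """Unit direction vectors (3-tuples) for a bone's reference axes."""
    ap: tuple          # antero-posterior axis
    ml: tuple          # medial-lateral axis
    mechanical: tuple  # superior / mechanical axis

@dataclass
class AnkleAtlas:
    """One patient's atlas; field names are hypothetical, not from the source."""
    ct_scan_id: str
    tibia_model: object            # 3D mesh of the distal tibia
    tibia_axes: AnatomicalAxes
    talus_model: object            # 3D mesh of the talus
    talus_axes: AnatomicalAxes
    surgery_type: str = "primary"  # primary, revision, or fusion take down
    # Populated only in reference atlases; a target atlas leaves these unset.
    implant_size: Optional[str] = None
    implant_placement: Optional[object] = None  # e.g., 3D coordinate + rotation
```

A target atlas and a reference atlas then share one schema, with the implant fields distinguishing the two cases.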
In some examples, the atlases may be pre-processed. For instance, virtual planning system 102 (or another component of orthopedic surgical system 100) may re-align the atlases such that the medial-lateral (ML), antero-posterior (AP), and superior axes correspond to the X, Y, and Z axes, respectively.
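The described pre-processing amounts to a change of basis. Below is a minimal sketch under stated assumptions: each atlas's ML, AP, and superior axes are assumed to be available as mutually orthogonal unit vectors, and the function and argument names are hypothetical.

```python
import numpy as np

def realign_to_canonical(vertices, ml_axis, ap_axis, sup_axis):
    """Rotate model vertices so the ML, AP, and superior axes map to X, Y, Z.

    Stacking the three orthonormal axis vectors as rows yields the rotation
    matrix that sends each anatomical axis to the corresponding canonical
    axis. vertices is an (N, 3) array of model points.
    """
    R = np.vstack([ml_axis, ap_axis, sup_axis]).astype(float)
    return (R @ np.asarray(vertices, dtype=float).T).T
```

After this step, every atlas model is expressed in the same canonical frame, which simplifies the comparisons described below.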
In operation, virtual planning system 102 may obtain a target atlas of a particular patient on which an arthroplasty procedure is to be performed, and obtain a plurality of reference atlases of other patients on which the arthroplasty procedure has been performed. Virtual planning system 102 may obtain the atlases from a central repository, such as a server of orthopedic surgical system 100. In some examples, to obtain the plurality of reference atlases, virtual planning system 102 may obtain an index of the plurality of reference atlases.
Virtual planning system 102 may, in some examples, generate the target atlas. For instance, virtual planning system 102 may segment an image (e.g., a CT scan) of the current patient to generate 3D models of the patient's bones (e.g., the proximal tibia and the distal talus). Virtual planning system 102 may estimate anatomical landmarks that may enable creation of an anatomical coordinate system. Such a coordinate system may define the ML, AP, and superior/mechanical axes of the bone (e.g., the tibia).
Virtual planning system 102 may select, based on a comparison of values of the target atlas and the plurality of reference atlases, at least one reference atlas of the plurality of reference atlases of the other patients. For instance, virtual planning system 102 may align/superimpose the origin and the reference axes of the target and the reference atlas models. Such an alignment may correspond to a registration of the target atlas and reference atlas axes. As such, virtual planning system 102 may align axes of a bone model of a target atlas and axes of bone models of a plurality of reference atlases.
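Such a registration of anatomical frames can be written as a rigid transform. The sketch below assumes each frame is given as an origin plus a 3x3 matrix whose rows are the unit ML/AP/superior axes; these conventions are illustrative:

```python
import numpy as np

def register_frames(ref_origin, ref_axes, tgt_origin, tgt_axes):
    """Rigid transform (R, t) superimposing the reference frame onto the
    target frame, i.e., x_target = R @ x_reference + t.
    """
    # A reference point's anatomical coordinates are ref_axes @ (x - ref_origin);
    # re-expressing those coordinates in the target's world frame yields:
    R = tgt_axes.T @ ref_axes
    t = tgt_origin - R @ ref_origin
    return R, t

R, t = register_frames(np.zeros(3), np.eye(3),
                       np.array([1.0, 2.0, 3.0]), np.eye(3))
# With identical axes, registration reduces to a translation: t == [1, 2, 3].
```

By construction, the transform maps the reference origin exactly onto the target origin and each reference anatomical axis onto the corresponding target axis.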
Virtual planning system 102 may select, as the at least one reference atlas, the reference atlas for which a distance between the distal and proximal tibia axes of the target and those of the reference atlas is minimal. However, a good match between the proximal tibia of the target and that of an atlas may not be relevant to planning. For example, a tibia in the target atlas may be very different in size from a tibia in a particular reference atlas yet have a very similar distal tibia to that in the particular reference atlas. In that case, the similarity between the target atlas and the particular reference atlas may be low, so the particular reference atlas may be excluded despite being potentially relevant. Similarly, a similarity measure computed only between the distal tibia of the target and that of an atlas may also not be relevant because such a similarity measure would ignore the mechanical axis of the tibia.
For each atlas registered to the target, virtual planning system 102 may cut the two distal tibia models (atlas and target) to create a 3D regional model of the tibia relevant for implant size selection. Virtual planning system 102 may measure the distance between both regional models and select the N (e.g., 1, 2, 3, 4, 5, etc.) atlases that yield the highest similarities. In this way, virtual planning system 102 may select the N reference atlases of the plurality of reference atlases that are most similar to the target atlas.
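The distance between regional models could, for example, be a symmetric Hausdorff distance between their surfaces, with the N smallest distances treated as the highest similarities. A sketch on point-cloud models follows; the metric choice is an assumption, as the disclosure does not mandate a particular distance:

```python
import numpy as np

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two (n, 3) point clouds."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)  # pairwise distances
    return max(d.min(axis=1).max(), d.min(axis=0).max())

def select_most_similar(target_model, reference_models, n):
    """Indices of the n reference models with the smallest distance to the target."""
    dists = [hausdorff(target_model, ref) for ref in reference_models]
    return [int(i) for i in np.argsort(dists)[:n]]

target = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
refs = [target + 5.0, target + 0.1, target + 1.0]
# select_most_similar(target, refs, 2) -> [1, 2]
```

The brute-force pairwise distance matrix is quadratic in the number of points; for dense surface models a k-d tree or a library routine such as SciPy's `directed_hausdorff` would scale better.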
Based on the selected reference atlases, virtual planning system 102 may determine candidate implant sizes for the current patient. For instance, virtual planning system 102 may select the implant sizes of the selected N reference atlases as the candidate implant sizes for the current patient. In this way, virtual planning system 102 may obtain a collection of reference atlases having distal tibias with a geometry very similar to the target atlas. However, the bone cut contour of the reference atlases may still be different because of local variation on the target, such as osteophytes.
As noted above, in addition to or in place of implant size selection, virtual planning system 102 may select an implant alignment. In some examples, virtual planning system 102 may select the implant alignment based on the size candidates. For instance, virtual planning system 102 may, for each of the selected N atlases, re-use the implant sizes and alignments from the reference atlases and place the implant on the target atlas as performed on the N atlases to generate N candidate placements on the target. The N candidate placements may represent N candidate implant alignments.
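Re-using a reference placement on the target amounts to pushing the implant pose through the reference-to-target registration. A minimal sketch, where the rigid transform (R, t) and all names are assumptions for illustration:

```python
import numpy as np

def transfer_placement(implant_pos, implant_rot, R, t):
    """Map an implant pose (3D emplacement and 3x3 orientation) from a
    reference atlas into the target frame via the rigid registration
    x_target = R @ x_reference + t.
    """
    return R @ implant_pos + t, R @ implant_rot

# 90-degree rotation about Z plus a translation along X.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([1.0, 0.0, 0.0])
pos, rot = transfer_placement(np.array([1.0, 0.0, 0.0]), np.eye(3), R, t)
# pos == [1, 1, 0]; rot is the same 90-degree rotation as R.
```

Applying this to each of the N selected atlases yields the N candidate placements (emplacements and orientations) on the target.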
Virtual planning system 102 may output, for display, a graphical representation of the determined implant size or implant alignment. User interface 1300 of
Virtual planning system 102 may provide a textual representation of the determined implant size or implant alignment. For instance, as shown in
Virtual planning system 102 may provide a representation of a result of the determined implant size and implant alignment. As one example, the aforementioned graphical representations may virtually depict impacts of the determined size and alignment. As another example, virtual planning system 102 may output text showing impact (e.g., resection height, anterior underhang, posterior underhang, and MM thickness). An example of such text for each of the three candidates is shown in
As discussed above, virtual planning system 102 may obtain a plurality of reference atlases. The plurality of reference atlases may be referred to as an atlas database and may be constructed using any suitable technique. As one example, the atlas database may be constructed using a statistical shape model (SSM) that represents a desired percentage of the patient population. As another example, the atlas database may be constructed of retrospective cases (e.g., retrospective total ankle replacement (TAR) cases).
The atlas database may include a quantity of atlases that is statistically representative of the patient population so that the atlas collection should be complete (any target case should be represented in the atlas collection). As such, the atlas database may correspond to a multi-atlas basis and should provide good spanning properties. The database may include distributions of cases over the following parameters: gender, implant type, and bone morphometry (related to bone size).
In some examples, the atlas database may be pruned or otherwise managed to avoid overlap of atlases. Overlap of patient morphology can be detected by measuring a Dice coefficient or a Hausdorff distance.
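For voxelized bone masks, overlap could be quantified with the Dice coefficient, with near-duplicate atlases (Dice close to 1) pruned. A short sketch; the masks and any pruning threshold are illustrative:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice coefficient between two boolean voxel masks of equal shape."""
    inter = np.logical_and(mask_a, mask_b).sum()
    total = mask_a.sum() + mask_b.sum()
    return 2.0 * inter / total if total else 1.0

a = np.zeros((4, 4), dtype=bool); a[:2] = True   # 8 voxels
b = np.zeros((4, 4), dtype=bool); b[:3] = True   # 12 voxels, 8 shared with a
# dice(a, b) == 2 * 8 / (8 + 12) == 0.8
```

Dice compares volumetric overlap, whereas the Hausdorff distance captures the worst-case surface deviation; the two measures flag different kinds of redundancy.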
In some examples, the atlases may be optimized for different types of patients and surgical preferences. As such, the search space (i.e., the quantity of reference atlases compared to the target atlas) can be reduced by using sub-atlas bases. Sub-atlas bases or additional atlas bases can be constructed by using surgical preferences, such as anatomical versus mechanical axis referencing, patient profiles (gender, if relevant, or bone size), or a preferred implant type (for example, a surgeon may prefer to employ a specific implant, such that the search should be performed in the corresponding sub-atlas basis).
As discussed above, virtual planning system 102 may select reference atlases based on a comparison between 3D bone models. In some examples, virtual planning system 102 may select reference atlases based on a state/density of the bone. As such, the similarity between an atlas and the target may not be exclusively based on the 3D bone models. The stability of the implant may be important and may also depend on the bone density/quality, the presence of cavities, etc., around the positioned implants. Using measures based on CT intensities around the implants can change the ranking or invalidate planning candidates.
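One way such an intensity-based check could look: score each candidate by the mean CT intensity (in Hounsfield units) of the voxels supporting the implant, then drop or re-rank candidates whose support looks weak. The threshold and all names below are illustrative assumptions, not clinically validated values:

```python
import numpy as np

def bone_quality_score(ct_volume, support_voxels):
    """Mean CT intensity over (n, 3) voxel indices around a positioned implant."""
    i, j, k = support_voxels.T
    return float(ct_volume[i, j, k].mean())

def rerank_candidates(candidates, scores, min_hu=200.0):
    """Drop candidates below the intensity threshold; best-supported first."""
    keep = [(s, c) for c, s in zip(candidates, scores) if s >= min_hu]
    return [c for s, c in sorted(keep, key=lambda sc: -sc[0])]

ct = np.full((3, 3, 3), 300.0)
ct[0, 0, 0] = 100.0
score = bone_quality_score(ct, np.array([[0, 0, 0], [1, 1, 1]]))  # (100 + 300) / 2
# rerank_candidates(["A", "B"], [score, 150.0]) -> ["A"]
```

In practice the supporting voxels would be derived from the candidate implant placement (e.g., a margin around the resection plane) rather than listed explicitly as here.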
Virtual planning system 102 may obtain a target atlas of a particular patient on which an arthroplasty procedure is to be performed (1702). For instance, one or more processors of virtual planning system 102 may generate the target atlas of the particular patient and obtain the plurality of reference atlases from an atlas database.
Virtual planning system 102 may select, based on a comparison of values of the target atlas and a plurality of reference atlases of other patients on which the arthroplasty procedure has been performed, at least one reference atlas of the plurality of reference atlases of the other patients (1704). For instance, the one or more processors of virtual planning system 102 may select, as the at least one reference atlas, a reference atlas of the plurality of reference atlases that is most similar to the target atlas.
Virtual planning system 102 may determine, based on the selected at least one reference atlas, one or both of an implant size and an implant alignment for the particular patient (1706). For instance, the one or more processors of virtual planning system 102 may select the implant size and the implant alignment for the particular patient based on the implant size and implant alignment of the at least one reference atlas.
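Steps (1702) through (1706) can be sketched end to end: pick the reference atlas closest to the target and reuse its implant size and alignment. The dictionary keys and the distance callable below are assumptions for illustration only:

```python
def plan_from_atlases(target, references, distance):
    """Select the reference atlas whose bone model has the smallest distance
    to the target's and return its implant size and alignment.
    """
    best = min(references, key=lambda ref: distance(target["model"], ref["model"]))
    return best["implant_size"], best["implant_alignment"]

# Toy scalar "models" stand in for 3D bone models here.
target = {"model": 0.0}
references = [
    {"model": 3.0, "implant_size": 2, "implant_alignment": "anterior"},
    {"model": 0.5, "implant_size": 3, "implant_alignment": "neutral"},
]
# plan_from_atlases(target, references, lambda a, b: abs(a - b)) -> (3, "neutral")
```

In a fuller pipeline the `distance` argument would be a shape metric over registered 3D regional models (such as a Hausdorff distance), and the top N atlases rather than a single best match could each contribute a candidate plan.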
Virtual planning system 102 may generate virtual guidance to guide a surgeon in preparing bone for an implant having the selected implant size at the selected implant alignment. For instance, virtual planning system 102 may generate virtual guidance to prepare a tibia and/or a talus as discussed above.
While the techniques have been disclosed with respect to a limited number of examples, those skilled in the art, having the benefit of this disclosure, will appreciate numerous modifications and variations therefrom. For instance, it is contemplated that any reasonable combination of the described examples may be performed. It is intended that the appended claims cover such modifications and variations as fall within the true spirit and scope of the invention.
It is to be recognized that depending on the example, certain acts or events of any of the techniques described herein can be performed in a different sequence, may be added, merged, or left out altogether (e.g., not all described acts or events are necessary for the practice of the techniques). Moreover, in certain examples, acts or events may be performed concurrently, e.g., through multi-threaded processing, interrupt processing, or multiple processors, rather than sequentially.
In one or more examples, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Computer-readable media may include computer-readable storage media, which corresponds to a tangible medium such as data storage media, or communication media including any medium that facilitates transfer of a computer program from one place to another, e.g., according to a communication protocol. In this manner, computer-readable media generally may correspond to (1) tangible computer-readable storage media which is non-transitory or (2) a communication medium such as a signal or carrier wave. Data storage media may be any available media that can be accessed by one or more computers or one or more processors to retrieve instructions, code and/or data structures for implementation of the techniques described in this disclosure. A computer program product may include a computer-readable medium.
By way of example, and not limitation, such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage, or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if instructions are transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. It should be understood, however, that computer-readable storage media and data storage media do not include connections, carrier waves, signals, or other transitory media, but are instead directed to non-transitory, tangible storage media. Disk and disc, as used herein, includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
Operations described in this disclosure may be performed by one or more processors, which may be implemented as fixed-function processing circuits, programmable circuits, or combinations thereof, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Fixed-function circuits refer to circuits that provide particular functionality and are preset on the operations that can be performed. Programmable circuits refer to circuits that can be programmed to perform various tasks and provide flexible functionality in the operations that can be performed. For instance, programmable circuits may execute instructions specified by software or firmware that cause the programmable circuits to operate in the manner defined by instructions of the software or firmware. Fixed-function circuits may execute software instructions (e.g., to receive parameters or output parameters), but the types of operations that the fixed-function circuits perform are generally immutable. Accordingly, the terms “processor” and “processing circuitry,” as used herein, may refer to any of the foregoing structures or any other structure suitable for implementation of the techniques described herein.
Various examples have been described. These and other examples are within the scope of the following claims.
This application claims the benefit of U.S. Provisional Patent Application No. 63/328,080, filed Apr. 6, 2022, the entire content of which is incorporated by reference.
Filing Document: PCT/US2023/063331; Filing Date: 2/27/2023; Country: WO.
Related Provisional Application: No. 63328080; Date: Apr 2022; Country: US.