The present invention relates generally to the field of computer-assisted medical systems and, more particularly, to a computer-assisted knee replacement apparatus and method.
Image-based surgical navigation systems display the positions of surgical tools with respect to preoperative (prior to surgery) or intraoperative (during surgery) image data sets. Two- and three-dimensional image data sets are used, as well as time-variant image data (i.e., multiple data sets taken at different times). The types of data sets primarily used include two-dimensional fluoroscopic images and three-dimensional data sets such as magnetic resonance imaging (MRI) scans, computed tomography (CT) scans, positron emission tomography (PET) scans, and angiographic data. Intraoperative images are typically fluoroscopic, as a C-arm fluoroscope is relatively easily positioned with respect to the patient and does not require that the patient be moved. Other types of imaging modalities require extensive patient movement and thus are typically used only for preoperative and post-operative imaging.
The most popular navigation systems make use of a tracking or localizing system to track tools, instruments and patients during surgery. These systems locate, in a predefined coordinate space, specially recognizable markers or elements that are attached or affixed to, or possibly inherently a part of, an object such as an instrument or a patient. The elements can take several forms, including those that can be located using optical (or visual), magnetic, or acoustical methods. Furthermore, at least in the case of optical or visual systems, an object's position may be determined based on intrinsic features or landmarks that, in effect, function as recognizable elements. The elements typically have a known geometrical arrangement with respect to an end point and/or axis of the instrument. Thus, objects can be recognized at least in part from the geometry of the elements (assuming that the geometry is unique), and the orientation of the axis and the location of the endpoint within a frame of reference can be deduced from the positions of the elements.
A typical optical tracking system functions primarily in the infrared range. It usually includes a stationary stereo camera pair that is focused on the area of interest and is sensitive to infrared radiation. Elements emit infrared radiation, either actively or passively. An example of an active element is a light emitting diode (LED). An example of a passive element is a reflective element, such as a ball-shaped element with a surface that reflects incident infrared radiation. Passive systems require an infrared radiation source to illuminate the area of focus. A magnetic system may have a stationary field generator that emits a magnetic field that is sensed by small coils integrated into the tracked tools.
Most computer-assisted surgery (CAS) systems are capable of continuously tracking, in effect, the position of tools (sometimes also called instruments). With knowledge of the relationship between the tool and the patient, and between the patient and an image data set, a system is able to continually superimpose a representation of the tool on the image in the same relationship to the anatomy in the image as the actual tool bears to the patient's anatomy. To obtain these relationships, the coordinate system of the image data set must be registered to the relevant portions of the patient's actual anatomy in the coordinate system of the tracking system. There are several known registration methods.
In CAS systems that are capable of using two-dimensional image data sets, multiple images are usually taken from different angles and registered to each other so that a representation of the tool or other object (which can be real or virtual) can be, in effect, projected into each image. As the position of the object changes in three-dimensional space, its projection into each image is simultaneously updated. In order to register two or more two-dimensional images together, the images are acquired with what is called a registration phantom in the field of view of the imaging device. In the case of two-dimensional fluoroscopic images, the phantom is a radio-translucent body holding radio-opaque fiducials having a known geometric relationship. Knowing the actual position of the fiducials in three-dimensional space when each of the images is taken permits determination of a relationship between the position of the fiducials and their respective shadows in each of the images. This relationship can then be used to create a transform for mapping between points in three-dimensional space and each of the images. By knowing the positions of the fiducials with respect to the tracking system's frame of reference, the relative positions of tracked tools with respect to the patient's anatomy can be accurately indicated in each of the images, presuming the patient does not move after the image is acquired, or that the relevant portions of the patient's anatomy are tracked. A more detailed explanation of registration of fluoroscopic images and coordination of representations of objects in patient space superimposed on the images is found in U.S. Pat. No. 6,198,794 of Peshkin, et al., entitled “Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy.”
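By way of illustration only, the sketch below shows one common way such a transform can be computed and used: a 3×4 projection matrix is estimated from the known three-dimensional fiducial positions and their two-dimensional shadows by a direct linear transform, and the same matrix then maps any tracked point into the image. The function names and the least-squares formulation are assumptions for this example and are not taken from the referenced patent.

```python
import numpy as np

def estimate_projection(fiducials_3d, shadows_2d):
    """Estimate a 3x4 projection matrix P mapping 3-D fiducial positions to
    their 2-D image shadows (direct linear transform). At least six
    well-distributed correspondences are assumed."""
    rows = []
    for (X, Y, Z), (u, v) in zip(fiducials_3d, shadows_2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The solution is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 4)

def project_point(P, point_3d):
    """Map a point given in tracking-system coordinates into image pixels."""
    ph = P @ np.append(np.asarray(point_3d, dtype=float), 1.0)
    return ph[:2] / ph[2]

# Usage sketch: compute P once per fluoroscopic image, then project the
# tracked tool tip into that image on every tracking update.
# P = estimate_projection(fiducials_3d, shadows_2d)
# u, v = project_point(P, tracked_tool_tip)
```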
The invention is generally directed to improved computer-implemented methods and apparatus for further reducing the invasiveness of surgical procedures, eliminating or reducing the need for external fixtures in certain surgical procedures, and/or improving the precision and/or consistency of surgical procedures. The invention finds particular advantage in orthopedic procedures involving implantation of devices, though it may also be used in connection with other types of surgical procedures.
The computer-assisted knee replacement apparatus and method of the present invention assists with performing a total knee replacement procedure. In this embodiment, the knee replacement application provides implant sizing corresponding to the subject. The knee replacement application also provides planning and guiding of femoral and/or tibial resection preparation.
For a more complete understanding of the present invention and the advantages thereof, reference is now made to the following descriptions taken in connection with the accompanying drawings in which:
The preferred embodiments of the present invention and the advantages thereof are best understood by referring to the accompanying drawings.
Tracking system 22 continuously determines, or tracks, the position of one or more trackable elements disposed on, incorporated into, or inherently a part of surgical instruments or tools 20 with respect to a three-dimensional coordinate frame of reference. With information from the tracking system 22 on the location of the trackable elements, CAS system 10 is programmed to be able to determine the three-dimensional coordinates of an endpoint or tip of a tool 20 and, optionally, its primary axis using predefined or known (e.g. from calibration) geometrical relationships between trackable elements on the tool and the endpoint and/or axis of the tool 20. A patient, or portions of the patient's anatomy, can also be tracked by attachment of arrays of trackable elements.
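As an illustrative sketch only (the specification does not give this computation explicitly), the endpoint and primary axis of a tool can be recovered from the tracked pose of its element array and a calibrated, tool-local offset. The rotation-matrix representation and the example calibration values below are assumptions.

```python
import numpy as np

def tool_tip_and_axis(R_array, t_array, tip_local, axis_local):
    """Given the tracked pose of a tool's element array (rotation R_array and
    translation t_array in the tracking system's frame of reference) and the
    calibrated tip position and primary axis expressed in the array's local
    frame, return the tip and axis in tracking-system coordinates."""
    tip_world = R_array @ tip_local + t_array
    axis_world = R_array @ axis_local          # directions are not translated
    return tip_world, axis_world / np.linalg.norm(axis_world)

# Example with an assumed calibration: tip 120 mm along the array's local z axis.
R = np.eye(3)
t = np.array([10.0, 25.0, 300.0])
tip, axis = tool_tip_and_axis(R, t, np.array([0.0, 0.0, 120.0]),
                              np.array([0.0, 0.0, 1.0]))
```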
The CAS system 10 can be used for both planning surgical procedures (including planning during surgery) and for navigation. It is therefore preferably programmed with software for providing basic image guided surgery functions, including those necessary for determining the position of the tip and axis of instruments and for registering a patient and preoperative and/or intraoperative diagnostic image data sets to the coordinate system of the tracking system. The programmed instructions for these functions are indicated as core CAS utilities 24. These capabilities allow the relationship of a tracked instrument to a patient to be displayed and constantly updated in real time by the CAS system 10 overlaying a representation of the tracked instrument on one or more graphical images of the patient's anatomy on display device 12. The graphical images may be a virtual representation of the patient's anatomy or may be constructed from one or more stored image data sets 26 acquired from a diagnostic imaging device 28. The imaging device may be a fluoroscope, such as a C-arm fluoroscope, capable of being positioned around a patient lying on an operating table. It may also be an MR, CT or other type of imaging device in the room or permanently located elsewhere. Where more than one image is shown, as when multiple fluoroscopic images are simultaneously displayed on display device 12, the representation of the tracked instrument or tool is coordinated between the different images. However, CAS system 10 can be used in some procedures without the diagnostic image data sets, with only the patient being registered. Thus, the CAS system 10 need not support the use of diagnostic images in some applications, i.e., an imageless application.
Furthermore, as disclosed herein, the CAS system 10 may be used to run application-specific programs that are directed to assisting a surgeon with planning and/or navigation during specific types of procedures. For example, the application programs may display predefined pages or images corresponding to specific steps or stages of a surgical procedure. At a particular stage or part of a program, a surgeon may be automatically prompted to perform certain tasks or to define or enter specific data that will permit, for example, the program to determine and display appropriate placement and alignment of instrumentation or implants or provide feedback to the surgeon. Other pages may be set up to display diagnostic images for navigation and to provide certain data that is calculated by the system for feedback to the surgeon. Instead of or in addition to using visual means, the CAS system 10 could also communicate information in other ways, including audibly (e.g., using voice synthesis) and tactilely, such as by using a haptic interface type of device. For example, in addition to indicating visually a trajectory for a drill or saw on the screen, the CAS system 10 may provide feedback to the surgeon on whether he is nearing some object or is on course, with an audible sound or by application of a force or other tactile sensation to the surgeon's hand.
To further reduce the burden on the surgeon, the program may automatically detect the stage of the procedure by recognizing the instrument picked up by a surgeon and move immediately to the part of the program in which that tool is used. Application data generated or used by the application may also be stored in processor-based system 16.
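A minimal sketch of this behavior follows, under the assumption that each recognized tool identifier can be associated with the application page for the stage in which that tool is used; the identifiers and stage names are hypothetical.

```python
# Hypothetical mapping from a recognized tool identifier to the application
# stage in which that tool is used; names are illustrative only.
TOOL_TO_STAGE = {
    "distal_resection_guide": "distal_femoral_resection",
    "tibial_resection_guide": "tibial_resection",
    "drill_guide": "guide_pin_placement",
}

def advance_to_stage(recognized_tool_id, current_stage):
    """Jump to the procedure stage associated with the tool the surgeon
    picked up, or stay at the current stage if the tool is not stage-specific."""
    return TOOL_TO_STAGE.get(recognized_tool_id, current_stage)
```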
Various types of user input methods can be used to improve ease of use of the CAS system 10 during surgery. One example is the use of speech recognition to permit a doctor to speak a command. Another example is the use of a tracked object to sense a gesture by a surgeon, which is interpreted as an input to the CAS system 10. The meaning of the gesture could further depend on the state of the CAS system 10 or the current step in an application process executing on the CAS system 10. Again, as an example, a gesture may instruct the CAS system 10 to capture the current position of the object. One way of detecting a gesture is to temporarily occlude one or more of the trackable elements on the tracked object (e.g. a probe) for a period of time, temporarily causing the CAS system 10 to lose its ability to track the object. A temporary visual occlusion of a certain length (or within a certain range of time), coupled with the tracked object being in the same position before the occlusion and after the occlusion, would be interpreted as an input gesture. A visual or audible indicator that a gesture has been recognized could be used to provide feedback to the surgeon.
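A minimal sketch of such an occlusion gesture detector is shown below; the duration window and position tolerance are illustrative assumptions, not values from the specification.

```python
import numpy as np

class OcclusionGestureDetector:
    """If a tracked object disappears for a duration within [min_s, max_s]
    seconds and reappears at approximately the same position, report a
    gesture. Thresholds are illustrative assumptions."""

    def __init__(self, min_s=0.5, max_s=2.0, tol_mm=2.0):
        self.min_s, self.max_s, self.tol_mm = min_s, max_s, tol_mm
        self.last_visible_pos = None
        self.occluded_since = None

    def update(self, timestamp, position):
        """position is a 3-vector in mm, or None while the object is occluded.
        Returns True exactly once when a gesture is recognized."""
        if position is None:
            if self.occluded_since is None and self.last_visible_pos is not None:
                self.occluded_since = timestamp
            return False
        position = np.asarray(position, dtype=float)
        gesture = False
        if self.occluded_since is not None:
            duration = timestamp - self.occluded_since
            moved = np.linalg.norm(position - self.last_visible_pos)
            gesture = self.min_s <= duration <= self.max_s and moved <= self.tol_mm
            self.occluded_since = None
        self.last_visible_pos = position
        return gesture
```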
Yet another example of such an input method is the use of tracking system 22 in combination with one or more trackable data input devices 30. Defined with respect to the trackable input device 30 are one or more defined input areas, which can be two-dimensional or three-dimensional. These defined input areas are visually indicated on the trackable input device 30 so that a surgeon can see them. For example, the input areas may be visually defined on an object by representations of buttons, numbers, letters, words, slides and/or other conventional input devices. The geometric relationship between each defined input area and the trackable input device 30 is known and stored in processor-based system 16. Thus, the processor 17 can determine when another trackable object touches or is in close proximity to a defined input area and recognize it as an indication of a user input to the processor-based system 16. For example, when a tip of a tracked pointer is brought into close proximity to one of the defined input areas, the processor-based system 16 will recognize the tool near the defined input area and treat it as a user input associated with that defined input area. Preferably, representations on the trackable user input correspond to user input selections (e.g. buttons) on a graphical user interface on display device 12. The trackable input device 30 may be formed on the surface of any type of trackable device, including devices used for other purposes. In a preferred embodiment, representations of user input functions for the graphical user interface are visually defined on a rear, flat surface of a base of a tool calibrator.
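The proximity test could be implemented as sketched below, assuming the stored geometric relationship is modeled as a set of input-area centers in the trackable input device's local frame; the tracked device pose maps them into the tracking frame, and a touch is reported when the tracked pointer tip falls within a small radius of a center. The radius and data layout are assumptions for illustration.

```python
import numpy as np

def touched_input_area(R_dev, t_dev, area_centers_local, pointer_tip, radius_mm=5.0):
    """Return the name of the defined input area whose center (transformed
    from the device-local frame into the tracking frame by R_dev, t_dev) lies
    within radius_mm of the tracked pointer tip, or None if no area is touched."""
    for name, center_local in area_centers_local.items():
        center_world = R_dev @ center_local + t_dev
        if np.linalg.norm(center_world - pointer_tip) <= radius_mm:
            return name    # e.g. forwarded as a button press to the GUI
    return None
```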
Processor-based system 16 is, in one example, a programmable computer that is programmed to execute only when single-use or multiple-use software is loaded from, for example, removable media 18. The software would include, for example, the application program for use with a specific type of procedure. The application program can be sold bundled with disposable instruments specifically intended for the procedure. The application program would be loaded into the processor-based system 16 and stored there for use during one (or a defined number) of procedures before being disabled. Thus, the application program need not be distributed with the CAS system 10. Furthermore, application programs can be designed to work with specific tools and implants and distributed with those tools and implants. Preferably, the most current core CAS utilities 24 are also stored with the application program. If the core CAS utilities 24 on the processor-based system 16 are outdated, they can be replaced with the most current utilities.
In
Briefly, in one embodiment, the knee replacement application 40 cooperates with a tracking system 22 to plan femoral and/or tibial implant sizes for the subject. The knee replacement application 40 also cooperates with tracking system 22 to assist with planning, selecting and preparing femoral and/or tibial resections by locating and positioning cutting guides and other tools relative to the subject's knee to minimize the invasiveness of the procedure and increase the accuracy of knee implant placement. Application 40 also enables a user to review kinematic parameters of the subject's knee.
At step 104, the knee replacement application 40 displays a calibration grid 304, as best illustrated in
At step 122, the knee replacement application 40 requests registration of each of the anterior/posterior and medial/lateral images of the knee, ankle, and hip joints relative to the subject reference frame. For example, trackable element arrays may be secured or otherwise coupled to the subject, such as secured to the femur and tibia of the subject, such that tracking system 22 may register each image to the subject reference frame. Thus, at step 124, tracking system 22 registers the image data 26 of the ankle, knee, and hip joints to the subject reference frame. As illustrated in
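The specification does not detail the registration computation; as one common, illustrative approach, a paired-point rigid registration (a least-squares fit via singular value decomposition) between corresponding points in the image coordinate system and the subject reference frame could be used, as sketched below.

```python
import numpy as np

def rigid_registration(points_image, points_subject):
    """Paired-point rigid registration: return R, t such that
    R @ p_image + t best matches the corresponding point measured in the
    subject reference frame (Kabsch/SVD method). Shown only as an
    illustrative assumption about how step 124 could be carried out."""
    P = np.asarray(points_image, dtype=float)
    Q = np.asarray(points_subject, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```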
At step 126, the knee replacement application 40 displays on display device 12 anterior/posterior image data 26, as shown in the left portion of
At step 132, the knee replacement application 40 displays image data 26 of the ankle joint of the subject for selection of the ankle center, as best illustrated in
At step 138, the knee replacement application 40 displays the image data 26 of the knee joint on display device 12. At step 140, the knee replacement application 40 requests and receives a selection of the knee center. For example, as best illustrated in
At step 148, the knee replacement application 40 displays a femoral implant sizing guide 322 on display device 12 relative to the knee image data 26, as best illustrated in
At step 154, the knee replacement application 40 automatically determines the distal femoral resection plane based on the determined femoral mechanical axis and the location of the posterior condyle and anterior cortex. At step 156, the knee replacement application 40 displays the distal femoral resection plane on the displayed knee image data 26. For example, referring to
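A heavily simplified sketch of how such a plane might be expressed follows: a plane perpendicular to the femoral mechanical axis (hip center to knee center), offset a chosen resection depth proximally from a distal reference point. The resection depth value and the choice of reference point are assumptions for illustration only.

```python
import numpy as np

def distal_resection_plane(hip_center, knee_center, distal_ref_point, depth_mm=9.0):
    """Return (plane_point, plane_normal) for a cut perpendicular to the
    femoral mechanical axis, offset depth_mm proximally from a distal
    reference point. depth_mm is an illustrative assumption, not a value
    from the specification."""
    axis = knee_center - hip_center
    axis = axis / np.linalg.norm(axis)          # unit vector pointing distally
    plane_point = distal_ref_point - depth_mm * axis
    return plane_point, axis
```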
At step 158, the knee replacement application 40 retrieves implant data 60 corresponding to available femoral implants. For example, the femoral implant data 60 may comprise information associated with various sizes of femoral implants such that the geometric characteristics of the various femoral implants may be displayed relative to the displayed knee image data 26. At step 160, the knee replacement application 40 automatically determines a suggested femoral implant size corresponding to the determined distal femoral resection plane and the locations of the posterior condyle and anterior cortex. At step 162, the knee replacement application 40 displays the femoral resection surfaces for the femoral implant relative to the displayed knee image data 26. For example, referring to
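As an illustrative assumption only (the actual selection logic and the contents of implant data 60 are not given here), a suggested size could be chosen as the smallest implant whose anterior/posterior dimension covers the measured extent between the anterior cortex and posterior condyle; the size table below is hypothetical.

```python
import numpy as np

# Hypothetical femoral implant table: size label -> anterior/posterior
# dimension in mm. The numbers are placeholders, not actual implant data 60.
FEMORAL_SIZES_AP_MM = {"1": 50.0, "2": 54.0, "3": 58.0, "4": 62.0, "5": 66.0}

def suggest_femoral_size(anterior_cortex, posterior_condyle, ap_axis):
    """Suggest the smallest implant whose A/P dimension covers the distance
    between the anterior cortex and posterior condyle points measured along
    the anterior/posterior unit vector ap_axis. This selection rule is an
    assumption for illustration."""
    ap_extent = abs(np.dot(anterior_cortex - posterior_condyle, ap_axis))
    for size, ap in sorted(FEMORAL_SIZES_AP_MM.items(), key=lambda kv: kv[1]):
        if ap >= ap_extent:
            return size
    return max(FEMORAL_SIZES_AP_MM, key=FEMORAL_SIZES_AP_MM.get)
```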
At decisional step 170, a determination is made whether a distal shift of the femoral implant is desired. If a distal shift of the femoral implant is not desired, the method proceeds from step 170 to decisional step 176. If a distal shift of the femoral implant is desired, the method proceeds from step 170 to step 172, where the knee replacement application 40 receives a desired or requested distal shift of the femoral implant guide. For example, as best illustrated in
At decisional step 176, a determination is made whether an anterior/posterior shift of the femoral implant guide is desired. If an anterior/posterior shift of the femoral implant guide is not desired, the method proceeds from step 176 to step 182. If an anterior/posterior shift of the femoral implant guide is desired, the method proceeds from step 176 to step 178, where the knee replacement application 40 receives a desired or requested anterior/posterior shift of the femoral implant guide. For example, as best illustrated in
Upon completion of femoral implant sizing, the knee replacement application 40 stores the femoral implant size and location data as data 62. In
At step 184, the knee replacement application 40 displays a tibial resection planning guide 348 relative to the knee image data, as best illustrated in
At step 192, the knee replacement application 40 retrieves implant data 60 corresponding to available tibial implants. For example, the tibial implant data 60 may comprise information corresponding to the geometric characteristics of available tibial implant sizes. At step 194, the knee replacement application 40 requests and receives a desired tibial implant size. For example, as illustrated in
At step 202, the knee replacement application 40 requests and receives a desired tibial implant posterior slope. For example, as illustrated in
At step 208, the knee replacement application 40 retrieves femoral distal resection guide data 66. For example, the femoral distal resection guide data 66 may comprise information associated with geometric characteristics of a femoral distal resection guide such that the femoral distal resection guide may be located relative to the subject's femur corresponding to a desired distal femoral resection plane. At step 210, the knee replacement application 40 determines the resection guide pin locations and trajectories corresponding to the desired femoral resection plane. For example, based on a desired location of the femoral resection plane, the knee replacement application 40 automatically determines the locations and trajectories of the attachment pins of the resection guide for placement relative to the femur of the subject to accurately locate and guide distal femoral resection. At step 212, the knee replacement application 40 displays the distal femoral resection guide pin locations and trajectories relative to the knee image data 26, as best illustrated in
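One way such guide data could be used is sketched below, assuming the femoral distal resection guide data 66 is modeled as pin entry points and directions expressed in a guide-local frame aligned with the cutting slot; a pose aligning that frame with the planned resection plane then maps the pins into subject coordinates. The conventions and parameter names are assumptions for illustration.

```python
import numpy as np

def guide_pin_world(plane_point, plane_normal, anterior_dir,
                    pin_points_local, pin_dirs_local):
    """Build a guide pose whose z axis is the resection-plane normal and whose
    x axis is the projection of an anterior reference direction into the
    plane, then map the guide's pin entry points and trajectories from the
    guide-local frame into subject coordinates."""
    z = plane_normal / np.linalg.norm(plane_normal)
    x = anterior_dir - np.dot(anterior_dir, z) * z
    x = x / np.linalg.norm(x)
    y = np.cross(z, x)
    R = np.column_stack([x, y, z])               # guide-local -> subject frame
    points = [R @ p + plane_point for p in pin_points_local]
    dirs = [(R @ d) / np.linalg.norm(d) for d in pin_dirs_local]
    return points, dirs
```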
At step 222, the knee replacement application 40 requests placement of a trackable tool 20 along the epicondylar axis of the resected femur, as best illustrated in
At step 228, the knee replacement application 40 retrieves femoral anterior/posterior/chamfer resection guide data 70. For example, the femoral anterior/posterior/chamfer resection guide data 70 may comprise information associated with the geometric characteristics of the resection guide such that knee replacement application 40 may locate the resection guide relative to the femur of the subject to accommodate locating the femoral anterior, posterior, and chamfer resections corresponding to a desired femoral implant. Thus, at step 230, the knee replacement application 40 determines pin trajectories for the femoral anterior/posterior/chamfer resection guide based on the epicondylar axis data 68. At step 232, the knee replacement application 40 displays the pin trajectories for the resection guide on the knee image data 26, as best illustrated in
At step 242, the knee replacement application 40 retrieves tibial resection guide data 72. For example, the tibial resection guide data 72 may comprise information associated with the geometric characteristics of a tibial resection guide such that the knee replacement application 40 may locate the tibial resection guide relative to the tibia corresponding to a desired tibial resection plane. At step 246, the knee replacement application 40 determines tibial resection guide pin locations and trajectories corresponding to the desired tibial resection plane using the tibial resection guide data 72. At step 248, the knee replacement application 40 displays the tibial resection guide pin locations and trajectories on the knee image data 26, as best illustrated in
At step 250, the tracking system 22 acquires tracking data for a drill guide. At step 252, the knee replacement application 40 and tracking system 22 monitor and display the location and orientation of the drill guide relative to the pin locations and trajectories for the tibial resection guide on the displayed knee image data 26. At decisional step 254, a determination is made whether the drill guide is aligned with a pin location and trajectory for the tibial resection guide. If the drill guide is not aligned with a pin location and trajectory for the tibial resection guide, the method returns to step 252. If the drill guide is aligned with the pin location and trajectory for the tibial resection guide, the method proceeds from step 254 to step 256, where the knee replacement application 40 signals drill guide alignment. For example, as described above, the knee replacement application 40 may provide an audible, visual, or other type of signal indicating alignment. The knee replacement application 40 may also display alignment of the drill guide with a pin location and trajectory of the tibial resection guide with a crosshair and bullseye, in a manner similar to that described above in connection with
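The alignment test of step 254 could be implemented, for example, as a combined position and angle tolerance check between the tracked drill guide and the planned pin trajectory; the tolerance values below are illustrative assumptions.

```python
import numpy as np

def drill_guide_aligned(guide_tip, guide_axis, pin_point, pin_dir,
                        pos_tol_mm=1.0, ang_tol_deg=2.0):
    """The drill guide is considered aligned when its tip lies within
    pos_tol_mm of the planned pin entry point and its axis is within
    ang_tol_deg of the planned pin trajectory."""
    g = guide_axis / np.linalg.norm(guide_axis)
    p = pin_dir / np.linalg.norm(pin_dir)
    ang = np.degrees(np.arccos(np.clip(abs(np.dot(g, p)), -1.0, 1.0)))
    dist = np.linalg.norm(guide_tip - pin_point)
    return dist <= pos_tol_mm and ang <= ang_tol_deg

# if drill_guide_aligned(...): trigger the audible/visual alignment signal (step 256)
```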
At step 258, the knee replacement application 40 requests probe placement along a tibial template axis. For example, after completion of the tibial resection, trial femoral and tibial implants may be located on the subject and the trackable probe placed at a particular orientation or position relative to one of the trial implants, indicated generally by 402, to obtain tibial and/or femoral rotation angle information. At step 260, the knee replacement application 40 in cooperation with the tracking system 22 acquires tracking data for the probe and displays the tracking data of the position and orientation of the probe relative to the knee image data 26, as best illustrated in
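As one illustrative assumption about how such rotation angle information could be derived, the angle between the tracked probe axis and a reference axis can be measured in the plane perpendicular to the limb's long axis, as sketched below.

```python
import numpy as np

def rotation_angle_deg(probe_axis, reference_axis, long_axis):
    """Signed angle between the tracked probe axis (e.g. along the tibial
    template) and a reference axis, measured in the plane perpendicular to
    the limb's long axis."""
    n = long_axis / np.linalg.norm(long_axis)
    a = probe_axis - np.dot(probe_axis, n) * n        # project into the plane
    b = reference_axis - np.dot(reference_axis, n) * n
    a, b = a / np.linalg.norm(a), b / np.linalg.norm(b)
    return np.degrees(np.arctan2(np.dot(np.cross(a, b), n), np.dot(a, b)))
```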
Additionally, as best illustrated in
This patent application is a continuation of application Ser. No. 10/772,085, entitled “Computer-Assisted Knee Replacement Apparatus and Method,” filed Feb. 4, 2004; and claims the benefit of U.S. provisional patent application Ser. No. 60/444,988, entitled “Computer-Assisted Knee Replacement Apparatus and Method”, filed Feb. 4, 2003, the disclosures of which are incorporated herein by reference. This application relates to the following U.S. provisional patent applications: Ser. No. 60/444,824, entitled “Interactive Computer-Assisted Surgery System and Method”; Ser. No. 60/444,975, entitled “System and Method for Providing Computer Assistance With Spinal Fixation Procedures”; Ser. No. 60/445,078, entitled “Computer-Assisted Knee Replacement Apparatus and Method”; Ser. No. 60/444,989, entitled “Computer-Assisted External Fixation Apparatus and Method”; Ser. No. 60/445,002, entitled “Method and Apparatus for Computer Assistance With Total Hip Replacement Procedure”; Ser. No. 60/445,001, entitled “Method and Apparatus for Computer Assistance With Intramedullary Nail Procedure”; and Ser. No. 60/319,924, entitled “Portable, Low-Profile Integrated Computer, Screen and Keyboard for Computer Surgery Applications”; each of which was filed on Feb. 4, 2003 and is incorporated herein by reference. This application also relates to the following applications: U.S. patent application Ser. No. 10/772,083, entitled “Interactive Computer-Assisted Surgery System and Method”; U.S. patent application Ser. No. 10/771,850, entitled “System and Method for Providing Computer Assistance With Spinal Fixation Procedures”; U.S. patent application Ser. No. 10/772,139, entitled “Computer-Assisted Knee Replacement Apparatus and Method”; U.S. patent application Ser. No. 10/772,142, entitled “Computer-Assisted External Fixation Apparatus and Method”; U.S. patent application Ser. No. 10/772,092, entitled “Method and Apparatus for Computer Assistance With Total Hip Replacement Procedure”; U.S. patent application Ser. No. 10/771,851, entitled “Method and Apparatus for Computer Assistance With Intramedullary Nail Procedure”; and U.S. patent application Ser. No. 10/772,137, entitled “Portable Low-Profile Integrated Computer, Screen and Keyboard for Computer Surgery Applications”; each of which was filed on Feb. 4, 2004 and is incorporated herein by reference.
Relation | Number | Date | Country
---|---|---|---
Parent | 10772085 | Feb 2004 | US
Child | 11006494 | Dec 2004 | US