The present invention relates generally to computer-assisted surgery systems and surgical navigation systems.
Image-based surgical navigation systems display the positions of surgical tools with respect to preoperative (prior to surgery) or intraoperative (during surgery) image data sets. Two- and three-dimensional image data sets are used, as well as time-variant image data (i.e. multiple data sets taken at different times). The two-dimensional data sets most commonly used are fluoroscopic images; three-dimensional data sets include magnetic resonance imaging (MRI) scans, computed tomography (CT) scans, positron emission tomography (PET) scans, and angiographic data. Intraoperative images are typically fluoroscopic, as a C-arm fluoroscope is relatively easily positioned with respect to patients and does not require that a patient be moved. Other imaging modalities require extensive patient movement and are thus typically used only for preoperative and post-operative imaging.
The most popular navigation systems make use of a tracking or localizing system to track tools, instruments and patients during surgery. These systems locate, in a predefined coordinate space, specially recognizable markers that are attached or affixed to, or possibly inherently a part of, an object such as an instrument or a patient. Markers can take several forms, including those that can be located using optical (or visual), electromagnetic, radio or acoustic methods. Furthermore, at least in the case of optical or visual systems, location of an object's position may be based on intrinsic features or landmarks that, in effect, function as recognizable markers. Markers typically have a known geometrical arrangement with respect to an end point and/or axis of the instrument. Thus, objects can be recognized at least in part from the geometry of the markers (assuming that the geometry is unique), and the orientation of the axis and location of the endpoint within a frame of reference can be deduced from the positions of the markers.
Present-day tracking systems are typically optical, functioning primarily in the infrared range. They usually include a stationary stereo camera pair that is focused around the area of interest and sensitive to infrared radiation. Markers emit infrared radiation, either actively or passively. An example of an active marker is a light-emitting diode (LED). An example of a passive marker is a reflective marker, such as a ball-shaped marker with a surface that reflects incident infrared radiation. Passive systems require an infrared radiation source to illuminate the area of focus. A magnetic system may have a stationary field generator that emits a magnetic field that is sensed by small coils integrated into the tracked tools.
Most CAS systems are capable of continuously tracking, in effect, the position of tools (sometimes also called instruments). With knowledge of the relationship between the tool and the patient, and between the patient and the image data sets, a system is able to continually superimpose a representation of the tool on the image in the same relationship to the anatomy in the image as the relationship of the actual tool to the patient's anatomy. To obtain these relationships, the coordinate system of the image data set must be registered to the relevant portions of the patient's actual anatomy in the coordinate system of the tracking system. There are several known registration methods.
In CAS systems that are capable of using two-dimensional image data sets, multiple images are usually taken from different angles and registered to each other so that a representation of the tool or other object (which can be real or virtual) can be, in effect, projected into each image. As the position of the object changes in three-dimensional space, its projection into each image is simultaneously updated. In order to register two or more two-dimensional images together, the images are acquired with what is called a registration phantom in the field of view of the imaging device. In the case of two-dimensional fluoroscopic images, the phantom is a radio-translucent body holding radio-opaque fiducials having a known geometric relationship. Knowing the actual position of the fiducials in three-dimensional space when each of the images is taken permits determination of a relationship between the position of the fiducials and their respective shadows in each of the images. This relationship can then be used to create a transform for mapping between points in three-dimensional space and each of the images. By knowing the positions of the fiducials with respect to the tracking system's frame of reference, the relative positions of tracked tools with respect to the patient's anatomy can be accurately indicated in each of the images, presuming the patient does not move after the images are acquired, or that the relevant portions of the patient's anatomy are tracked. A more detailed explanation of registration of fluoroscopic images and coordination of representations of objects in patient space superimposed in the images is found in U.S. Pat. No. 6,198,794 of Peshkin, et al., entitled “Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy”.
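The mapping from three-dimensional space into each fluoroscopic image can be illustrated with a short sketch. The following Python code is a minimal, hedged example (not the registration method of the cited patent): it assumes at least six fiducial correspondences between known three-dimensional positions in the tracker frame and their shadows in the image, estimates a projective transform with a standard direct linear transform, and then projects an arbitrary tracked point (such as a tool tip) into the image. Array shapes and coordinates are illustrative assumptions.

```python
import numpy as np

def estimate_projection(points_3d, points_2d):
    """Estimate a 3x4 projection matrix P such that [u, v, 1]^T ~ P [x, y, z, 1]^T.

    points_3d: (N, 3) fiducial positions in the tracker frame (N >= 6).
    points_2d: (N, 2) corresponding fiducial shadows in image pixels.
    """
    rows = []
    for (x, y, z), (u, v) in zip(points_3d, points_2d):
        rows.append([x, y, z, 1, 0, 0, 0, 0, -u * x, -u * y, -u * z, -u])
        rows.append([0, 0, 0, 0, x, y, z, 1, -v * x, -v * y, -v * z, -v])
    # The projection matrix is the null vector of the stacked constraints,
    # taken from the right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 4)

def project(P, point_3d):
    """Project a tracked 3-D point (e.g. a tool tip) into the image."""
    u, v, w = P @ np.append(np.asarray(point_3d, dtype=float), 1.0)
    return np.array([u / w, v / w])
```

With one such matrix estimated per acquired image, the same tracked point can be projected into each image simultaneously, which is what allows the tool representation to be coordinated between multiple fluoroscopic views.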
The invention is generally directed to improved computer-implemented methods and apparatus for further reducing the invasiveness of surgical procedures, eliminating or reducing the need for external fixtures in certain surgical procedures, and/or improving the precision and/or consistency of surgical procedures. The invention finds particular advantage in orthopedic procedures involving implantation of devices, though it may also be used in connection with other types of surgical procedures.
For example, a surgeon encounters or has to overcome several problems during insertion of an intramedullary nail (“IM nail”), an elongated rod-shaped prosthetic device, into the canal of a fractured femur. These problems include matching the leg length of the injured leg with the well leg of the patient, improper rotation of the injured leg, and unpredictable flexing of the distal end of the nail. To reduce the incidence of malrotation of the leg, fluoroscopic images are taken frequently during the procedure, thus exposing the patient and operating room personnel to radiation. Furthermore, implantation of the IM nail using traditional methods requires use of an extra pin for determining the version of the leg for proper alignment of the rod, as well as use of a special, radio-translucent drill so that fluoroscopic images can be captured during insertion of screws into the distal end of the femur to secure the distal end of the nail.
To address one or more of these problems, various aspects of a specially programmed, computer-assisted surgery system are used to reduce the number of fluoroscopic images required to be taken, especially during the course of the procedure, eliminate the need for a Steinmann pin, and assist the surgeon in properly aligning and securing the nail during insertion. A preferred embodiment of such an application for programming a computer-assisted surgery system is described below.
For a more complete understanding of the present invention, the objects and advantages thereof, reference is now made to the following descriptions taken in connection with the accompanying drawings in which:
In the following description, like numbers refer to like elements. References to “surgeon” include any user of a computer-assisted surgical system, a surgeon being typically a primary user.
Tracking system 22 continuously determines, or tracks, the position of one or more trackable markers disposed on, incorporated into, or inherently a part of surgical tools or instruments 20 with respect to a three-dimensional coordinate frame of reference. With information from the tracking system on the location of the trackable markers, CAS system 10 is programmed to be able to determine the three-dimensional coordinates of an endpoint or tip of a tool and, optionally, its primary axis using predefined or known (e.g. from calibration) geometrical relationships between trackable markers on the tool and the end point and/or axis of the tool. A patient, or portions of the patient's anatomy, can also be tracked by attachment of arrays of trackable markers.
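As a minimal sketch of the geometric relationship described above, assuming the tracking system reports each marker array's pose as a rotation matrix and translation vector, and that the tool's tip offset and axis direction in the marker-array frame are known from calibration:

```python
import numpy as np

def tool_tip_and_axis(R, t, tip_offset, axis_dir):
    """Return the tool tip position and primary axis in tracker coordinates.

    R, t:       rotation (3x3) and translation (3,) of the marker array
                reported by the tracking system.
    tip_offset: tip position in the marker-array frame (from calibration).
    axis_dir:   unit vector of the tool axis in the marker-array frame.
    """
    tip_world = R @ np.asarray(tip_offset, dtype=float) + np.asarray(t, dtype=float)
    axis_world = R @ np.asarray(axis_dir, dtype=float)
    return tip_world, axis_world
```

The same transformation applies to a marker array attached to the patient, which is how tracked anatomy is kept in the same frame of reference as the tracked tools.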
The CAS system can be used for both planning surgical procedures (including planning during surgery) and for navigation. It is therefore preferably programmed with software for providing basic image-guided surgery functions, including those necessary for determining the position of the tip and axis of instruments and for registering a patient and preoperative and/or intraoperative diagnostic image data sets to the coordinate system of the tracking system. The programmed instructions for these functions are indicated as core CAS utilities 24. These capabilities allow the relationship of a tracked instrument to a patient to be displayed and constantly updated in real time by the CAS system overlaying a representation of the tracked instrument on one or more graphical images of the patient's internal anatomy on display device 12. The graphical images are constructed from one or more stored image data sets 26 acquired from diagnostic imaging device 28. The imaging device may be a fluoroscope, such as a C-arm fluoroscope, capable of being positioned around a patient lying on an operating table. It may also be an MR, CT or other type of imaging device in the room or permanently located elsewhere. Where more than one image is shown, as when multiple fluoroscopic images are simultaneously displayed on display device 12, the representation of the tracked instrument or tool is coordinated between the different images. However, the CAS system can be used in some procedures without the diagnostic image data sets, with only the patient being registered. Thus, the CAS system need not support the use of diagnostic images in some applications—i.e. an imageless application.
Furthermore, as disclosed herein, the CAS system may be used to run application-specific programs 30 that are directed to assisting a surgeon with planning and/or navigation during specific types of procedures. For example, the application programs may display predefined pages or images corresponding to specific steps or stages of a surgical procedure. At a particular stage or part of a program, a surgeon may be automatically prompted to perform certain tasks or to define or enter specific data that will permit, for example, the program to determine and display appropriate placement and alignment of instrumentation or implants or provide feedback to the surgeon. Other pages may be set up to display diagnostic images for navigation and to provide certain data that is calculated by the system for feedback to the surgeon. Instead of or in addition to using visual means, the CAS system could also communicate information in other ways, including audibly (e.g. using voice synthesis) and tactilely, such as by using a haptic interface or device. For example, in addition to indicating visually a trajectory for a drill or saw on the screen, the CAS system may feed back to the surgeon information on whether he is nearing some object or is on course, with an audible sound or by application of a force or other tactile sensation to the surgeon's hand.
To further reduce the burden on the surgeon, the program may automatically detect the stage of the procedure by recognizing/identifying the instrument picked up by the surgeon and move immediately to the part of the program in which that tool is used. Application data 32—data generated or used by the application—may also be stored on the processor-based system.
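One way such automatic stage detection could be implemented is a simple lookup from recognized tool identifiers to workflow stages; the tool names and stage names below are hypothetical placeholders, not those of the disclosed application:

```python
# Hypothetical mapping of recognized tool identifiers to workflow stages.
TOOL_TO_STAGE = {
    "awl": "entry_point_navigation",
    "nail_inserter": "nail_insertion",
    "distal_drill": "distal_locking",
}

def on_tool_recognized(tool_id, current_stage, goto_stage):
    """Jump to the stage associated with a newly recognized tool, if any."""
    stage = TOOL_TO_STAGE.get(tool_id)
    if stage is not None and stage != current_stage:
        goto_stage(stage)
```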
Various types of user input methods can be used to improve ease of use of the CAS system during surgery. One example is the use of speech recognition to permit a doctor to speak a command. Another example is the use of a tracked object to sense a gesture by a surgeon, which is interpreted as an input to the CAS system. The meaning of the gesture could further depend on the state of the CAS system or the current step in an application process executing on the CAS system. Again, as an example, a gesture may instruct the CAS system to capture the current position of the object. One way of detecting a gesture is to occlude temporarily one or more of the trackable markers on the tracked object (e.g. a probe) for a period of time, causing loss of the CAS system's ability to track the object. A temporary visual occlusion of a certain length (or within a certain range of time), coupled with the tracked object being in the same position before the occlusion and after the occlusion, would be interpreted as an input gesture. A visual or audible indicator that a gesture has been recognized could be used to provide feedback to the surgeon.
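A minimal sketch of how such an occlusion gesture might be recognized, assuming the tracker delivers timestamped samples and reports no position while the probe is occluded; the timing window and position tolerance below are illustrative assumptions:

```python
import numpy as np

class OcclusionGestureDetector:
    """Recognize an input gesture made by briefly occluding a tracked probe.

    The gesture fires when the probe disappears for a duration within
    [min_s, max_s] seconds and reappears at approximately the same position.
    """

    def __init__(self, min_s=0.5, max_s=2.0, tol_mm=2.0):
        self.min_s, self.max_s, self.tol_mm = min_s, max_s, tol_mm
        self.last_visible_pos = None   # last position seen before occlusion
        self.occluded_since = None     # timestamp when occlusion began

    def update(self, timestamp, position):
        """Feed one tracking sample; position is None when the probe is occluded.

        Returns the captured position when a gesture is recognized, else None.
        """
        if position is None:
            if self.occluded_since is None and self.last_visible_pos is not None:
                self.occluded_since = timestamp
            return None

        gesture_pos = None
        if self.occluded_since is not None:
            duration = timestamp - self.occluded_since
            moved = np.linalg.norm(np.asarray(position, dtype=float) - self.last_visible_pos)
            if self.min_s <= duration <= self.max_s and moved <= self.tol_mm:
                gesture_pos = self.last_visible_pos  # e.g. "capture this point"
            self.occluded_since = None

        self.last_visible_pos = np.asarray(position, dtype=float)
        return gesture_pos
```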
Yet another example of such an input method is the use of tracking system 22 in combination with one or more trackable data input devices 34. Defined with respect to the trackable input device 34 are one or more defined input areas, which can be two-dimensional or three-dimensional. These defined input areas are visually indicated on the trackable input device so that a surgeon can see them. For example, the input areas may be visually defined on an object by representations of buttons, numbers, letters, words, slides and/or other conventional input devices. The geometric relationship between each defined input area and the trackable input device is known and stored in processor-based system 16. Thus, the processor can determine when another trackable object touches or is in close proximity to a defined input area and recognize it as an indication of a user input to the processor-based system. For example, when a tip of a tracked pointer is brought into close proximity to one of the defined input areas, the processor-based system will recognize the tool near the defined input area and treat it as a user input associated with that defined input area. Preferably, representations on the trackable user input device correspond with user input selections (e.g. buttons) on a graphical user interface on display device 12. The trackable input device may be formed on the surface of any type of trackable device, including devices used for other purposes. In a preferred embodiment, representations of user input functions for the graphical user interface are visually defined on a rear, flat surface of a base of a tool calibrator.
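A hedged sketch of the proximity test described above: the tracked pointer tip is transformed into the input device's coordinate frame and compared against two-dimensional regions defined on the device surface. The area names, dimensions and height threshold are hypothetical:

```python
import numpy as np

# Hypothetical input areas defined in the input device's own frame:
# each is a named rectangle on the device surface (z ~ 0), in millimetres.
INPUT_AREAS = {
    "next_page": {"xmin": 0, "xmax": 20, "ymin": 0, "ymax": 10},
    "prev_page": {"xmin": 0, "xmax": 20, "ymin": 15, "ymax": 25},
}

def hit_test(tip_world, R_dev, t_dev, max_height_mm=5.0):
    """Return the name of the input area the pointer tip is touching, if any.

    tip_world:    tracked pointer tip in tracker coordinates.
    R_dev, t_dev: pose of the trackable input device in the tracker frame.
    """
    # Express the tip in the input device's coordinate frame.
    tip_local = R_dev.T @ (np.asarray(tip_world, dtype=float) - np.asarray(t_dev, dtype=float))
    if abs(tip_local[2]) > max_height_mm:  # too far above the device surface
        return None
    for name, a in INPUT_AREAS.items():
        if a["xmin"] <= tip_local[0] <= a["xmax"] and a["ymin"] <= tip_local[1] <= a["ymax"]:
            return name
    return None
```

When a hit is detected, the corresponding on-screen button of the graphical user interface would be activated, so the surgeon can operate the interface without touching a keyboard or mouse.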
Processor-based system 16 is, in one example, a programmable computer that is programmed to execute only when single-use or multiple-use software is loaded from, for example, removable media. The software would include, for example, the application program 30 for use with a specific type of procedure. Media storing the application program can be sold bundled with disposable instruments specifically intended for the procedure. The application program would be loaded into the processor-based system and stored there for use during one (or a defined number) of procedures before being disabled. Thus, the application program need not be distributed with the CAS system. Furthermore, application programs can be designed to work with specific tools and implants and distributed with those tools and implants. Preferably, the most current core CAS utilities are also stored with the application program. If the core CAS utilities on the processor-based system are outdated, they can be replaced with the most current utilities.
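One plausible way to enforce use for only one (or a defined number) of procedures is a small use counter stored with the application media; the file format and field names below are assumptions for illustration, not the disclosed mechanism:

```python
import json
from pathlib import Path

def check_and_consume_use(license_path):
    """Allow the application to run only while uses remain, then disable it.

    A minimal sketch: the distributed medium carries a small license file
    with a use counter (format assumed for illustration only).
    """
    path = Path(license_path)
    lic = json.loads(path.read_text())
    if lic.get("uses_remaining", 0) <= 0:
        raise RuntimeError("Application disabled: no procedures remaining.")
    lic["uses_remaining"] -= 1          # consume one procedure
    path.write_text(json.dumps(lic))
    return lic["uses_remaining"]
```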
Referring now to
Process 300, or parts thereof, preferably displays a series of pages corresponding to stages or sub-procedures, each page being set up to display directions and information (including images) relevant to that stage of the procedure. However, as previously mentioned, the CAS system may, in addition to the pages or in place of them, communicate some or all of this information by other means, including audible and haptic means. Although the process may constrain what a surgeon does in terms of the ordering of certain steps, the process preferably follows the surgeon, rather than requiring the surgeon to follow the process. This is particularly useful during the planning and navigation or execution phases of the process, where the surgeon may need to go back and change a plan or repeat steps. Thus, in the following explanation of process 300, some steps may be performed out of sequence or repeated. The surgeon may indicate to the process the stage he or she is in or wants to go to. This may be done through user input or by the process automatically recognizing when the surgeon has either finished a stage or is preparing to go to another stage (not necessarily the next stage), for example by the surgeon picking up an instrument used in a particular stage and showing it to the cameras of the tracking system. Details of the process 300 will be described with reference to representative examples of screens from such pages, shown in
Referring to
At step 410 the surgeon is asked to specify application-specific tools that he will use during the procedure and that can be or will be tracked. Surgeons may prefer to use different tools for a given step, and this step permits the surgeon to select the tool of choice so that the CAS system can properly track it. The application may display a different page at a given step, or display pages in a different order, based on the selection of the tool. Furthermore, a surgeon may, for example, elect not to use a tool during a given step, or not have it tracked. The process will adjust as necessary to accommodate these preferences, avoiding forcing the surgeon to find ways to bypass steps or alter the presentation of the pages. The CAS system is typically programmed or set up to operate with a probe and other basic tools that a surgeon may use.
Preferably, the surgeon is given a list of the tool or tools that the application can track, from which he may select.
At step 412, the CAS system calibrates the selected fluoroscope using known methods. The interface for this step is illustrated in
Steps 414, 416, 418 and 420 direct the acquisition of certain fluoroscopic images during the procedure, followed by registration of those images using known methods. If the surgeon specified that the well leg would be used for reference, images of the well leg are acquired in addition to images of the injured leg. Exemplary screen shots of the pages corresponding to the acquisition and registration of the well leg and injured leg are shown in
Referring now to
At steps 601 and 602 the surgeon is prompted to indicate, and the process receives, an estimated nail diameter at the isthmus of the uninjured leg, and the center of the femoral head and the axis of its neck, with reference to displayed A/P and M/L images of the proximal end of the femur. As illustrated in the representative page or interface of
At step 604, the process displays A/P and lateral images of the distal end of the femur. The surgeon indicates on the images a marker that serves as a reference point for determining a reference length for the femur. The program stores this information. At step 606, the reference length is calculated using the references marked on the proximal end of the femur and the reference marked on the distal end of the femur. The program also prompts, using, for example, directions displayed on the page, and receives from the surgeon at step 406 the position and orientation of the trans-epicondylar axis of the femur.
Steps 608, 610, 611, 612, 614 and 616 assist the surgeon with selecting a nail of appropriate length and screw dimensions using the well leg. At step 608, the surgeon indicates, with respect to A/P and M/L images of the distal and proximal ends of the femur, end points for the nail. The process automatically determines the distance between the end points and then selects and displays on the images a representation of the closest standard-length nail. As indicated by steps 610 and 611, screw placement and dimensions for the proximal end of the nail and the placement of the nail end are indicated with respect to the uninjured leg. A representation of the closest standard nail to these indications is then displayed at step 612. The surgeon is then permitted to change, shift, rotate and move the representation in order to check its fit. If the fit is not correct, the surgeon can change the end points and/or select a different nail, as indicated by steps 614 and 616.
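A minimal sketch of the closest-standard-nail selection described above; the catalogue of standard lengths is an illustrative placeholder, with real values coming from the implant vendor:

```python
import numpy as np

# Illustrative standard nail lengths in millimetres (placeholder catalogue).
STANDARD_NAIL_LENGTHS_MM = [300, 320, 340, 360, 380, 400, 420, 440, 460, 480]

def closest_standard_nail(proximal_pt, distal_pt, lengths=STANDARD_NAIL_LENGTHS_MM):
    """Pick the standard nail length closest to the distance between the
    surgeon-indicated end points (both in registered 3-D coordinates)."""
    measured = float(np.linalg.norm(np.asarray(proximal_pt, dtype=float)
                                    - np.asarray(distal_pt, dtype=float)))
    return min(lengths, key=lambda length: abs(length - measured)), measured
```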
Referring now to
At step 620, the surgeon is prompted to mark in the images the edge of the fracture at the canal of the femur. A representative screen of the page displayed for this step is shown in
Injured-leg planning continues at step 622 at the proximal end of the injured femur, with the surgeon marking in the images the center of the femoral head and the axis of the femoral neck substantially in the same manner as discussed in connection with step 602. This information will be used to calculate reference length and version for the injured leg.
In a manner similar to step 606, the same landmarks used in marking the distal end of the well femur are marked at step 624 by the surgeon and stored for use in calculating reference length and version for comparison to the well leg. A representative screen of a display page for this step is shown in
Once the reference points are marked, the process proceeds to steps 628 and 630, where the surgeon indicates to the process the entry point for the nail and the desired position of the nail head and the screws that lock the nail head. As shown in
As a final step before execution, tools previously selected for use in the procedure, if not already calibrated, are calibrated at step 632. A representative screen of an exemplary page that may be displayed at this step is shown in
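The application does not require a particular calibration technique; one well-known approach for finding a tool-tip offset is pivot calibration, sketched below under the assumption that the tracker records a series of marker-array poses while the tool tip is pivoted about a fixed point:

```python
import numpy as np

def pivot_calibrate(rotations, translations):
    """Estimate the tool-tip offset in the marker-array frame by pivoting the
    tip about a fixed point while the tracker records the array pose.

    rotations:    list of 3x3 rotation matrices of the marker array.
    translations: list of 3-vectors, in the same order as rotations.
    Returns (tip_offset_local, pivot_point_world).
    """
    A_rows, b_rows = [], []
    for R, t in zip(rotations, translations):
        # R @ tip_local + t = pivot_world  =>  [R  -I] [tip_local; pivot_world] = -t
        A_rows.append(np.hstack([np.asarray(R, dtype=float), -np.eye(3)]))
        b_rows.append(-np.asarray(t, dtype=float))
    A = np.vstack(A_rows)
    b = np.concatenate(b_rows)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]
```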
Referring now to
In window or area 2407, the relative positions and orientations of the proximal and distal fragments of the fractured femur are indicated by representations 2410 and 2408. This window is preferably displayed during steps 708 and 714. Displayed in area 2412 is reference length and version information that is continuously calculated based on the relative positions of the fragments. This tracking is possible due to the known relationship between each trackable marker array and the reference landmarks specified on the fragment to which it is attached. At the time the landmarks on each fragment were specified, the positions of the trackable markers were also stored, thereby permitting the relative relationship to be determined. Using the relative relationship between each trackable marker 224 and the landmarks on the fragment to which it is attached, the reference length and version are calculated based on the relative positions of the two trackable markers.
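A hedged sketch of this continuous calculation: each stored landmark, expressed in its fragment's marker-array frame at planning time, is transformed into tracker space using the current array pose, and length and version are recomputed from the transformed landmarks. The landmark names and the version convention (angle between the femoral-neck axis and the trans-epicondylar axis in the plane perpendicular to the shaft axis) are assumptions for illustration:

```python
import numpy as np

def to_world(R, t, p_local):
    """Transform a stored landmark from a fragment's marker-array frame to tracker space."""
    return np.asarray(R, dtype=float) @ np.asarray(p_local, dtype=float) + np.asarray(t, dtype=float)

def length_and_version(Rp, tp, Rd, td, lm):
    """Recompute reference length and version from the two tracked fragments.

    Rp, tp / Rd, td: current poses of the proximal / distal fragment marker arrays.
    lm: landmarks stored in each fragment's local frame at planning time
        (dictionary keys below are assumptions for illustration).
    """
    head   = to_world(Rp, tp, lm["femoral_head_center"])     # proximal fragment
    neck   = to_world(Rp, tp, lm["femoral_neck_axis_point"])
    distal = to_world(Rd, td, lm["distal_reference"])         # distal fragment
    epi_a  = to_world(Rd, td, lm["epicondyle_medial"])
    epi_b  = to_world(Rd, td, lm["epicondyle_lateral"])

    ref_length = float(np.linalg.norm(head - distal))

    # Version: angle between the femoral-neck axis and the trans-epicondylar
    # axis, measured in the plane perpendicular to the head-to-distal (shaft) axis.
    shaft = (distal - head) / np.linalg.norm(distal - head)
    def project_onto_plane(v):
        v = v - np.dot(v, shaft) * shaft
        return v / np.linalg.norm(v)
    neck_dir = project_onto_plane(neck - head)
    epi_dir = project_onto_plane(epi_b - epi_a)
    version_deg = float(np.degrees(np.arccos(np.clip(np.dot(neck_dir, epi_dir), -1.0, 1.0))))
    return ref_length, version_deg
```

Because only the two marker arrays need to be observed after the landmarks are captured, the values can be updated continuously during reduction and nail insertion without additional fluoroscopic images.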
Referring now to FIGS. 7 and
In
Once the surgeon inserts the nail and the proximal locking screws, the distal locking screws must be inserted. The nail typically does not incorporate an external guide for the distal screws, due at least in part to the possibility of the nail bending during insertion. In order to locate the screw openings in the nail and determine the trajectory of the screws, another set of A/P and M/L images of the distal end of the femur is required. Therefore, at step 712, the surgeon is prompted to acquire the additional images.
The second set of stored A/P and M/L images of the distal end of the femur should clearly show the screw holes in the distal end of the nail. In order to clearly see the holes, the lateral image needs to be a true lateral image relative to the nail. When a surgeon brings the instrument previously specified as being used for distal screw insertion into the area of focus of the tracking system, the CAS system preferably automatically displays a screen or page similar to the one of
At the conclusion of the procedure, the surgeon is prompted to specify whether to archive data generated by the procedure for later reference. The CAS system archives the data as directed, such as to a disk drive or removable media. This step is not illustrated.
If desired, the different steps discussed herein may be performed in any order and/or concurrently with each other. Furthermore, if desired, one or more of the above described steps may be optional or may be combined without departing from the scope of the present invention.
Embodiments of the present invention may be implemented in software, hardware, application logic or a combination of software, hardware and application logic. The software, application logic and/or hardware may reside on processor-based system 16 or on a removable storage medium. If desired, part of the software, application logic and/or hardware may reside on processor-based system 16 and part of the software, application logic and/or hardware may reside on the removable storage medium.
This patent application is a continuation of patent application Ser. No. 11/006,513, entitled “Method and Apparatus for Computer Assistance with Intramedullary Nail Procedure,” filed Dec. 6, 2004, which is a continuation of patent application Ser. No. 10/771,851, entitled “Method and Apparatus for Computer Assistance With Intramedullary Nail Procedure,” filed Feb. 4, 2004; and claims the benefit of U.S. provisional patent application Ser. No. 60/445,001, entitled “Method and Apparatus for Computer Assistance With Intramedullary Nail Procedure,” filed Feb. 4, 2003, the disclosure of which is incorporated herein by reference. This application relates to the following U.S. provisional patent applications: Ser. No. 60/444,824, entitled “Interactive Computer-Assisted Surgery System and Method”; Ser. No. 60/444,975, entitled “System and Method for Providing Computer Assistance With Spinal Fixation Procedures”; Ser. No. 60/445,078, entitled “Computer-Assisted Knee Replacement Apparatus and Method”; Ser. No. 60/444,989, entitled “Computer-Assisted External Fixation Apparatus and Method”; Ser. No. 60/444,988, entitled “Computer-Assisted Knee Replacement Apparatus and Method”; Ser. No. 60/445,202, entitled “Method and Apparatus for Computer Assistance With Total Hip Replacement Procedure”; and Ser. No. 60/319,924, entitled “Portable, Low-Profile Integrated Computer, Screen and Keyboard for Computer Surgery Applications”; each of which was filed on Feb. 4, 2003 and is incorporated herein by reference. This application also relates to the following applications: U.S. patent application Ser. No. 10/772,083, entitled “Interactive Computer-Assisted Surgery System and Method”; U.S. patent application Ser. No. 10/771,850, entitled “System and Method for Providing Computer Assistance With Spinal Fixation Procedures”; U.S. patent application Ser. No. 10/772,139, entitled “Computer-Assisted Knee Replacement Apparatus and Method”; U.S. patent application Ser. No. 10/772,142, entitled “Computer-Assisted External Fixation Apparatus and Method”; U.S. patent application Ser. No. 10/772,085, entitled “Computer-Assisted Knee Replacement Apparatus and Method”; U.S. patent application Ser. No. 10/772,092, entitled “Method and Apparatus for Computer Assistance With Total Hip Replacement Procedure”; and U.S. patent application Ser. No. 10/772,137, entitled “Portable Low-Profile Integrated Computer, Screen and Keyboard for Computer Surgery Applications”; each of which was filed on Feb. 4, 2004 and is incorporated herein by reference.