1. Field of the Invention
The present invention concerns a method for superimposing or fusing a 2D image obtained with a C-arm x-ray system with a preoperative 3D image. The invention particularly concerns the display, in the 3D image, of a medical instrument that is located in the examination region of a patient and appears in the 2D image.
2. Description of the Prior Art
An increasing number of examinations or treatments of patients are performed minimally invasively, that is, with the least possible surgical trauma. Examples are treatments with endoscopes, laparoscopes, or catheters, all of which are inserted into the examination zone of a patient through a small opening in the body. Catheters, for example, are often used in the course of cardiological examinations.
A problem from a medical-technical point of view is that, although the medical instrument (in the following, a catheter is referred to as a non-limiting example) can be visualized very precisely and with high resolution during the procedure (operation or examination) by intraoperative X-ray monitoring with the C-arm system in one or more transirradiation images, also known as 2D fluoroscopic images, the anatomy of the patient can be only insufficiently visualized in these 2D fluoroscopic images during the intervention. Moreover, within the scope of operation planning, the physician often wishes to display the medical instrument in a 3D image (3D dataset) obtained before the intervention (preoperatively).
An object of the present invention is to merge, in a simple way, an intraoperatively obtained 2D fluoroscopic image showing the medical instrument with a preoperatively obtained 3D image.
This object is achieved in accordance with the invention by a method for automatically merging a 2D fluoroscopic C-arm image with a preoperative 3D image using navigation markers, wherein markers in a marker-displaying preoperative 3D image are registered relative to a navigation system, a tool plate fixed to a C-arm system is registered in a reference position relative to the navigation system, a 2D C-arm image (2D fluoroscopic image) that contains the image of at least one medical instrument is obtained in an arbitrary C-arm position, a projection matrix for a 2D/3D merge is determined on the basis of the tool plate position and the reference position relative to the navigation system, and the 2D fluoroscopic image is superimposed onto the 3D image on the basis of the projection matrix.
The preoperative 3D image containing the markers can be a pre-existing image that is stored and made available to a computer wherein the automatic merging takes place.
In a first alternative embodiment, artificial markers are used and the preoperative 3D image containing the artificial markers is obtained after the artificial markers have been set relative to the patient. This can be done, if necessary, by surgically opening the patient or, if suitable, by fixing the artificial markers on the surface of the body. After the artificial markers have been set in one of these ways, registration of the set of artificial markers is undertaken.
In a second alternative embodiment of the method of the invention, anatomical markers are used, which are identified and registered.
Ideally, the reference position is measured with a fixed chassis, 0° angulation, and 0° orbital angle of the C-arm.
The preoperative 3D image can be obtained in different ways, for instance using magnetic resonance tomography, computed tomography, ultrasound, positron emission tomography, or nuclear medicine.
The above object also is achieved in accordance with the principles of the present invention in a C-arm x-ray imaging device for implementing the above-described method.
In the immediate vicinity of the imaging system 2 there is a navigation sensor S, by means of which the current position of the tool plate TP, and thus of the C-arm 3, can be recorded, as well as the position and orientation of a medical instrument 11 used for the procedure and of the patient.
The system 1 is operated using a control and processing unit 8, which among other things controls the image data acquisition. It also includes an image processing unit, not shown in detail, in which, among other things, a 3D image data set E is stored, which ideally is recorded preoperatively. This preoperative data set E can be recorded with any arbitrary imaging modality, for example with a computed tomography device CT, a magnetic resonance tomography device MRT, an ultrasound device US, a nuclear medicine device NM, a positron emission tomography device PET, etc. The data set E alternatively can be recorded as a quasi-intraoperative data set with the imaging system 2 itself, thus directly before the actual intervention, in which case the imaging system 2 is operated in a 3D angiography mode.
In the example shown, a catheter 11 is introduced into the examination zone 6, here the heart. The position and orientation of this catheter 11 can first be detected using the navigation system S, and then visualized with an intraoperative C-arm image (2D fluoroscopic image) 10. Such an image is shown in
The present invention provides a method in which an intraoperative 2D fluoroscopic image 10 recorded in an arbitrary C-arm position, which includes the medical instrument 11 (here a catheter), is automatically, that is, using a computer such as the control and processing unit 8, overlaid (merged) with the preoperative 3D image E, so that visualization and navigation of the instrument in the 3D data set E are possible. The result of such a merge is shown in
In order to obtain a correct (correctly oriented) overlay of intraoperative 2D fluoroscopic images with the preoperative 3D data set E, it is necessary to register both images relative to one another, or each relative to the navigation sensor S. Registration of two image data sets (of three-dimensional and/or two-dimensional nature) means correlating their coordinate systems with one another, or deriving a mapping that converts one image data set into the other. In general, such a mapping or registration is specified by a matrix. The term “matching” is often used for such a registration; other terms are “merging” or “correlation”. Such a registration can, for instance, be performed interactively by the user on a display screen.
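The following is a minimal sketch, in Python with invented numerical values, of how such a registration matrix acts: a 4×4 homogeneous matrix maps points given in the coordinate system of one data set into the coordinate system of the other.

```python
import numpy as np

def apply_registration(T, points):
    """Map Nx3 points from one coordinate system into another
    using a 4x4 homogeneous registration matrix T."""
    pts_h = np.hstack([points, np.ones((points.shape[0], 1))])  # to homogeneous coordinates
    mapped = (T @ pts_h.T).T                                    # apply the transform
    return mapped[:, :3] / mapped[:, 3:4]                       # back to Cartesian coordinates

# Placeholder example: a rotation of 10 degrees about the z-axis plus a translation.
theta = np.deg2rad(10.0)
T = np.array([[np.cos(theta), -np.sin(theta), 0.0,  5.0],
              [np.sin(theta),  np.cos(theta), 0.0, -2.0],
              [0.0,            0.0,           1.0, 12.0],
              [0.0,            0.0,           0.0,  1.0]])

markers_in_A = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
print(apply_registration(T, markers_in_A))  # the same markers expressed in system B
```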
There are different possibilities for the registration of the two images:
1. Markers of anatomical origin—such as, for instance, blood vessel branching points or small sections of coronary artery, but also the corner of the mouth or the tip of the nose—are called “anatomical markers”. Artificially inserted or attached marking points are called “artificial markers”. Artificial markers are, for instance, screws which are set in a preoperative procedure, or simply objects which are attached to the surface of the body (for instance, glued in place).
Anatomic or artificial markers can be determined interactively by the user in the 2D fluoroscopic image (for instance, by clicking on the display) and then searched for and identified in the 3D image using suitable analysis algorithms. Such a registration is called “marker-based registration”.
2. A second possibility is so-called “image-based registration”. Here, a 2D projection image is created from the 3D image in the form of a digitally reconstructed radiograph (DRR), which is compared to the 2D fluoroscopic image with regard to matching features, whereby, to optimize the match, the DRR image is changed by translation and/or rotation and/or stretching relative to the 2D fluoroscopic image until the deviation between the two images has reached a given minimum. It is practical for the user to move the DRR image after its creation into a position in which it is as similar as possible to the 2D fluoroscopic image and only then to initiate the optimization cycle, in order to minimize the processing time for the registration.
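As a schematic illustration of such an image-based optimization cycle, the following Python sketch minimizes the negative normalized cross-correlation between the fluoroscopic image and a DRR rendered at a trial pose. The DRR renderer render_drr and the six-parameter pose convention are assumptions of the sketch, not part of the description above; the user-supplied starting pose pose0 plays the role of the manual pre-positioning mentioned in the text.

```python
import numpy as np
from scipy.optimize import minimize

def ncc(a, b):
    """Normalized cross-correlation of two equally shaped images."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

def register_image_based(volume, fluoro, render_drr, pose0):
    """Vary a 6-parameter rigid pose (3 rotations, 3 translations) until the DRR
    rendered from 'volume' matches the 2D fluoroscopic image as well as possible.
    'render_drr(volume, pose)' is a placeholder for whatever DRR renderer is
    available; it must return an image with the same shape as 'fluoro'."""
    cost = lambda pose: -ncc(render_drr(volume, pose), fluoro)  # maximize similarity
    result = minimize(cost, np.asarray(pose0, dtype=float), method="Powell")
    return result.x  # optimized pose parameters
```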
Also shown is the original 3D image E′ immediately after it is obtained, without it being registered relative to the 2D fluoroscopic image 10′.
For registration, several markers are identified or defined—in the example shown, three spherical artificial markers 16a′, 16b′, and 16c′. These markers are also identified in the original 3D image E′. As can be seen from
For registration, the 3D image E′ is now moved by translation and rotation (in this example, no scaling is necessary) until the markers 17a″, 17b″, 17c″ of the repositioned 3D image E″ can be projected onto the markers 16a′, 16b′, 16c′, whereupon the registration is complete.
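The quality of such an alignment can be checked numerically, for example as the residual distance between the 2D markers and the projections of the corresponding 3D markers. The following check is generic and not taken from the description above; it assumes that the completed registration is available as a 3×4 projection matrix P and that the markers are given as correspondingly ordered coordinate arrays.

```python
import numpy as np

def reprojection_error(P, markers_3d, markers_2d):
    """Root-mean-square distance (in pixels) between the Nx2 image markers and the
    projections of the corresponding Nx3 markers through the 3x4 matrix P."""
    Xh = np.hstack([np.asarray(markers_3d, float), np.ones((len(markers_3d), 1))])  # homogeneous 3D points
    proj = (P @ Xh.T).T
    proj = proj[:, :2] / proj[:, 2:3]                                                # perspective division
    return float(np.sqrt(np.mean(np.sum((proj - np.asarray(markers_2d, float)) ** 2, axis=1))))
```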
Both image-based and marker-based registration have significant disadvantages. A marker-based registration often makes an additional operative procedure necessary in order to set artificial markers. Anatomic markers are often difficult to locate uniquely, which makes a marker-based registration that relies on them error-prone. Image-based registration requires very long processing times and, due to numerical instabilities, is a very unreliable procedure and is therefore seldom used.
The identification of markers in marker-based registration need not necessarily be performed on the display screen. If a navigation system (navigation sensor S) is present, the markers can instead be selected directly on the patient with a navigation pointer, so that their positions are known in the coordinate system of the navigation system.
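If the positions of the markers are known both in the 3D image and, via the navigation pointer, in the coordinate system of the navigation sensor S, the registration reduces to a rigid 3D-3D point fit. The following is a generic least-squares sketch (the Kabsch/SVD solution, which the description above does not prescribe); it assumes that the two marker lists are given in corresponding order.

```python
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rigid transform (R, t) with dst ≈ R @ src + t, where 'src'
    holds the Nx3 marker positions in the 3D data set and 'dst' the same markers
    as measured with the navigation pointer (Kabsch/SVD method)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)                          # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t
```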
However, navigation-supported registration still presents significant disadvantages: if it is desired to register intraoperatively recorded 2D fluoroscopic images with the preoperative 3D image on a navigation-supported basis, then in a navigation-supported marker-based registration the markers would have to be manually selected for each C-arm position in which a 2D fluoroscopic image is to be recorded. Such a procedure is, in practice, very error-prone and tedious: if the markers are selected in the image in a different order than in the patient, if anatomic markers cannot be found in a reproducible way, or if the relative orientation of the markers has changed, erroneous positioning results. In addition, if the navigation is misadjusted at any point during the intervention, the registration must be repeated each time. In a conventional marker-based or image-based registration, the corresponding disadvantages described above apply.
The method of the invention still uses navigation markers (navigation-supported or computer-based). However, to avoid or significantly decrease the disadvantages of a marker-based merge, in the method of the invention the problematic marker-based registration must be performed only for the first 2D fluoroscopic image to be merged, or an already existing marker-based registration from the navigation procedure for the medical instrument can be used. For all further 2D-3D merges required during the intervention or examination, no additional interactive registration is necessary, as will be shown using the process flowcharts in
In a first step S1, artificial markers are set in a preoperative intervention. A preoperative intervention is not necessary if the artificial markers can, for example, be glued to the patient's skin. In a second step S2, a preoperative 3D data set E is recorded, in which all artificial markers are included and can be displayed. The 3D data set can be recorded with any arbitrary image acquisition modality (MRT, CT, PET, US, etc.). In a third step S3, a first operative intervention is performed in which the patient is opened, in order to register the artificial markers in E relative to a navigation system S in a fourth step S4. The registration is performed by manual selection of the markers with a navigation pointer. An operative intervention as in step S3 is not necessary if the markers are attached to the surface of the body (for instance, glued). In a fifth step S5, a second operative intervention is performed, in which a surgical instrument registered in S can be introduced with navigational support into E. In order to be able to merge arbitrary intraoperative 2D fluoroscopic images with E during such a navigation-supported operation, in a sixth step S6 a tool plate fixed on the C-arm is registered in the navigation system S in a reference position of the C-arm. If a 2D fluoroscopic image is then recorded in a seventh step S7 in an arbitrary C-arm position, it can be registered (merged) relative to E on the basis of knowledge of the current C-arm position during the recording. Thus, in an eighth step S8, a projection matrix L is determined with which a 2D-3D image merge can be performed. In a final step S9, the 2D fluoroscopic image is merged with the 3D image on the basis of L.
The projection matrix L is derived by measuring the position of the tool plate fixed on the C-arm in a defined C-arm position. This results in a tool plate reference position TPRef, which is, for example, measured with a fixed chassis, 0° angulation, and 0° orbital angle. Since both TPRef and E are known in S, the new position of the tool plate TP in any arbitrary C-arm position (which is defined relative to S through TP) can be determined relative to S. The registration characterized by L is thus given by the determination of TP relative to S, and thus to E. L then directly yields the desired merge of the 2D fluoroscopic image with the preoperative 3D data.
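Expressed in homogeneous coordinates, one plausible reading of this composition is sketched below. It assumes that the reference calibration provides a 3×4 projection P_ref from navigation coordinates S into image coordinates while the C-arm is in the reference position, that the marker-based registration of E into S is available as a 4×4 matrix T_E_to_S, and that the tool plate poses at the reference position and at the current C-arm position are reported by S as 4×4 matrices T_TPref_to_S and T_TP_to_S. These names and the exact factorization are assumptions of the sketch rather than a specification taken from the text; they exploit only the fact that the imaging geometry is rigid with respect to the tool plate.

```python
import numpy as np

def projection_matrix_L(P_ref, T_TPref_to_S, T_TP_to_S, T_E_to_S):
    """Compose the 3x4 matrix L that maps homogeneous points of the preoperative
    3D data set E into the current 2D fluoroscopic image.

    P_ref        : 3x4 projection, navigation coordinates -> image coordinates,
                   calibrated with the tool plate in the reference position.
    T_TPref_to_S : 4x4 pose of the tool plate in S at the reference position TPRef.
    T_TP_to_S    : 4x4 pose of the tool plate in S at the current C-arm position.
    T_E_to_S     : 4x4 marker-based registration of E into S (step S4)."""
    # Carry the reference calibration along with the tool plate motion:
    # image <- (reference geometry) <- (relative tool plate motion) <- (E -> S).
    return P_ref @ T_TPref_to_S @ np.linalg.inv(T_TP_to_S) @ T_E_to_S

def project_point(L, X_E):
    """Project a single point of E (length-3 array) into 2D image coordinates."""
    x = L @ np.append(np.asarray(X_E, float), 1.0)
    return x[:2] / x[2]
```

With L known in this sense, the geometry of the merge is fixed in both directions, whether structures of E are drawn into the 2D fluoroscopic image or the 2D fluoroscopic image is back-projected into E.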
Using the method of the invention, the problems of marker-based registration (merging) are minimized. The method utilizes the navigation procedure required for a navigation-supported intervention, so that the problematic registration is performed only for the first image to be merged.
It should also be noted that, for the determination of L at an angulation ≠ 0°, C-arm distortion can occur, which can be corrected using look-up tables. The determination of a projection matrix for C-arm devices is sufficiently well known and need not be explained in further detail.
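The form of these look-up tables is not specified above; a common generic approach, sketched here under that assumption, stores for every pixel of the corrected image the displacement at which the corresponding intensity must be sampled in the distorted image, with the sampling itself done by bilinear interpolation.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def undistort(image, lut_dr, lut_dc):
    """Correct a distorted 2D image using per-pixel displacement look-up tables.
    lut_dr and lut_dc have the same shape as 'image' and give, for every output
    pixel, the row/column offset at which the intensity is to be sampled in the
    distorted input image (offsets that would be determined once per angulation,
    for example with a calibration phantom)."""
    rows, cols = np.indices(image.shape)
    sample_at = np.array([rows + lut_dr, cols + lut_dc])                # fractional sampling positions
    return map_coordinates(image, sample_at, order=1, mode="nearest")   # bilinear sampling
```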
Although modifications and changes may be suggested by those skilled in the art, it is the intention of the inventors to embody within the patent warranted hereon all changes and modifications as reasonably and properly come within the scope of their contribution to the art.
Number          Date        Country    Kind
103 23 008.4    May 2003    DE         national