This application claims priority to the German application No. 10 2004 011 154.5, filed Mar. 8, 2004 which is incorporated by reference herein in its entirety.
The present invention relates to a method for registering a sequence of 2D image data of a hollow channel, in particular of a vessel, which was recorded with an imaging endoluminal instrument for which the relative displacement positions of the instrument in the hollow channel are known, with 3D image data of the hollow channel from 3D imaging processes such as computed tomography (CT), magnetic resonance tomography (MR), 3D angiography or 3D ultrasound.
Imaging endoluminal instruments can be used to record two-dimensional images of the interior of a hollow channel, in particular of a vessel or of a hollow organ. Use is made here of imaging methods such as intravascular ultrasound (IVUS), optical coherence tomography (OCT) or fluorescent imaging. The images are recorded during the continuous or gradual, controlled movement of the instrument in the hollow channel. Thus, for example, imaging intravascular catheters can provide two-dimensional sectional images from the interior of vessels, e.g. from the vascular system of the heart.
Furthermore DE 199 19 907 A1 discloses a method for catheter navigation in three-dimensional vascular tree exposures, in which the spatial position of the catheter is detected and blended into a 3D view of a preoperatively recorded vascular tree. For this, use is made of a catheter with an integrated position sensor, via which the respective current spatial position of the catheter tip is detected. This position sensor is registered with the 3D image data before the intervention takes place, using special markers which are visible in the 3D image and which are approached with the catheter. This type of registration is required for all applications in which the recorded 2D image data is to be combined with 3D image data.
Based on this prior art, an object of the present invention is to specify a method for registering 2D image data of a hollow channel, recorded with an imaging endoluminal instrument, with 3D image data, said method dispensing with the use of a position sensor.
This object is achieved by the method as claimed in the claims. Advantageous embodiments of the method are the subject of the dependent claims or can be taken from the following description and the exemplary embodiments.
In the present method for registering a sequence of 2D image data of a hollow channel, in particular of a vessel, which is recorded using an imaging endoluminal instrument for which the relative displacement positions of the instrument in the hollow channel are known, with 3D image data of the hollow channel, a three-dimensional path of a central axis of a definable section of the hollow channel is first determined from the 3D image data. This determination of the three-dimensional path of the central axis can, for example, be undertaken by segmenting the wall of the hollow channel identifiable in the 3D image data and then determining the central axis geometrically from the segmented data. Other image processing algorithms for determining the path of the axis are of course possible. Next, this three-dimensional path of the central axis is converted into a rectilinear path by an isometric first transformation of the 3D image data of the definable section of the hollow channel, and the transformation parameters required for this transformation are stored for a subsequent reverse transformation.

A combined 3D image data record is generated from the sequence of 2D image data by means of a parallel, congruent side-by-side arrangement on a central straight line in accordance with the known relative displacement positions, as is already known from the prior art. The combined 3D image data record is finally registered with the transformed 3D image data by equating the central straight line with the straight path of the central axis of the transformed 3D image data record and by a suitable translation that superposes a shared reference point. Any differences in resolution between the combined 3D image data record and the transformed 3D image data can be eliminated by a voxel interpolation in all three dimensions. A branch of the hollow channel or another prominent or known point can, for example, be used as the reference point. In this way registration is achieved by a simple translation of the two 3D image data records relative to one another.

After this registration the combined 3D image data record, or the 2D image data contained therein, is transformed back into the position of the original 3D image data, taking into account the stored transformation parameters. This is preferably achieved by a direct second transformation of the combined 3D image data record or of the 2D image data contained therein, which represents a reverse transformation of the first transformation with the help of the stored transformation parameters. If the original 3D data record was not saved at the time of the first transformation, it is likewise obtained again by a reverse transformation of the transformed 3D image data. As a result of the present method, the 2D image data of the investigated section of the hollow channel is now registered with the 3D image data record. In addition, this 2D image data can be arranged or reconstructed in accordance with the true 3D anatomy of the hollow channel; this reconstruction occurs automatically at the time of the reverse transformation of the combined 3D image data record.
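Purely by way of illustration, the voxel interpolation in all three dimensions mentioned above could be sketched in Python as follows. This is only an explanatory sketch and not part of the claimed method; the use of scipy.ndimage.zoom, the function name resample_to_spacing and all spacing values are assumptions made for the example.

```python
import numpy as np
from scipy.ndimage import zoom

def resample_to_spacing(volume, spacing_mm, target_spacing_mm):
    """Resample a 3D volume so that its voxel spacing matches target_spacing_mm.

    Illustrative sketch only: trilinear interpolation removes resolution
    differences between the combined 2D stack and the transformed 3D image
    data in all three dimensions.
    """
    factors = [s / t for s, t in zip(spacing_mm, target_spacing_mm)]
    return zoom(volume, zoom=factors, order=1)  # order=1: trilinear interpolation

# Example with assumed spacings: an IVUS stack with 0.1 mm in-plane resolution
# and 0.5 mm frame spacing is brought to a 0.4 mm isotropic grid.
ivus_stack = np.random.rand(40, 64, 64).astype(np.float32)
resampled = resample_to_spacing(ivus_stack, (0.5, 0.1, 0.1), (0.4, 0.4, 0.4))
print(resampled.shape)
```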
Thus a method is provided for registering a sequence of 2D image data, recorded continuously while the instrument is being guided through a hollow channel, with an anatomical 3D image data record, which dispenses with the use of a position sensor. The registration step is greatly simplified by the proposed transformation of the 3D image data, since all that is required is a translation of the two 3D image data records relative to one another. The present method is particularly suitable for use with imaging methods such as IVUS, OCT or fluorescent imaging with the help of a catheter or endoscope, for registering the image data of vessels or hollow organs with image data from 3D imaging procedures such as CT, MR, 3D angiography or 3D ultrasound. Thanks to the simple registration, the image data can then be displayed to the user in any combined form, e.g. superposed. The present method can of course also be applied to image data from examinations of other tubular parts of the body, such as intestinal examinations or bronchoscopies.
The present method is explained in greater detail below on the basis of an exemplary embodiment in conjunction with the drawings.
In the present example the registration of 2D image data with 3D image data of a vascular section is illustrated. The recording of a series of 2D image data with a catheter has already been briefly explained in the introductory part of the description with reference to the drawings.
First, a section of the vessel 2 to be merged is defined in the 3D image data displayed to a user by identifying a three-dimensional start point and a three-dimensional end point. Once the vascular section has been specified, the three-dimensional path of the central axis 10 of the vascular section is extracted from this anatomical 3D image data 8. For this purpose, the vascular section is first segmented, and the segmented data is used to geometrically determine the path of the central axis 10.
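A highly simplified Python sketch of such a geometric determination of the central axis is given below. It assumes the segmentation is available as a binary mask whose vessel section runs roughly along the last array axis and simply takes the in-plane centroid of each slice; the function name centerline_from_mask and the synthetic mask are illustrative assumptions, not the segmentation actually used in the exemplary embodiment.

```python
import numpy as np

def centerline_from_mask(vessel_mask):
    """Approximate the central axis of a segmented vessel section.

    Simplified geometric approach: for every slice along the last axis that
    contains vessel voxels, the in-plane centroid of the binary mask is taken
    as a point of the central axis.
    """
    points = []
    for k in range(vessel_mask.shape[2]):
        ys, xs = np.nonzero(vessel_mask[:, :, k])
        if xs.size == 0:                # slice outside the selected start/end points
            continue
        points.append((ys.mean(), xs.mean(), float(k)))
    return np.asarray(points)           # (N, 3) polyline approximating the axis

# Synthetic example: a slightly curved tube inside a 64 x 64 x 50 mask.
mask = np.zeros((64, 64, 50), dtype=bool)
yy, xx = np.ogrid[:64, :64]
for k in range(50):
    cy = 32 + int(8 * np.sin(k / 10.0))
    mask[:, :, k] = (yy - cy) ** 2 + (xx - 32) ** 2 <= 5 ** 2
centerline = centerline_from_mask(mask)
print(centerline.shape)                 # (50, 3)
```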
On the basis of this three-dimensional path 10 of the central axis, the 3D vascular structure of this vascular section is transformed into a rectilinear or tubular form; the transformation parameters required for this transformation are stored for the subsequent reverse transformation. This is illustrated in the drawings.
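One conceivable way to implement such a straightening transformation is sketched below in Python: cross-sectional planes perpendicular to the central axis are resampled and stacked along a straight line, and the sampling coordinates are kept as the stored transformation parameters. The local frame construction, the helper name straighten and the use of scipy.ndimage.map_coordinates are assumptions for illustration; the patent does not prescribe a particular implementation.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def straighten(volume, centerline, half_width=16):
    """Resample planes perpendicular to the central axis to obtain a
    straightened ('rectilinear') 3D data record.

    Returns the straightened volume and the grid of source coordinates, which
    here plays the role of the stored transformation parameters needed for the
    subsequent reverse transformation.
    """
    pts = np.asarray(centerline, dtype=float)                 # (N, 3) index coordinates
    tangents = np.gradient(pts, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)

    side = 2 * half_width + 1
    u, v = np.meshgrid(np.arange(-half_width, half_width + 1),
                       np.arange(-half_width, half_width + 1), indexing="ij")
    coords = np.empty((len(pts), side, side, 3))
    ref = np.array([0.0, 0.0, 1.0])
    for i, (p, t) in enumerate(zip(pts, tangents)):
        n1 = np.cross(t, ref)                                  # first in-plane direction
        if np.linalg.norm(n1) < 1e-6:                          # tangent parallel to ref
            n1 = np.cross(t, np.array([0.0, 1.0, 0.0]))
        n1 /= np.linalg.norm(n1)
        n2 = np.cross(t, n1)                                   # second in-plane direction
        coords[i] = p + u[..., None] * n1 + v[..., None] * n2

    straightened = map_coordinates(volume, coords.reshape(-1, 3).T,
                                   order=1, mode="nearest").reshape(len(pts), side, side)
    return straightened, coords

# Synthetic example: straight centerline through a random test volume.
vol = np.random.rand(64, 64, 50).astype(np.float32)
line = np.stack([np.full(50, 32.0), np.full(50, 32.0),
                 np.arange(50, dtype=float)], axis=1)
straight, params = straighten(vol, line, half_width=8)
print(straight.shape)                                          # (50, 17, 17)
```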
As illustrated in the drawings, the sequence of 2D image data recorded with the catheter is likewise combined into a 3D image data record by arranging the individual images parallel and congruently side by side on a central straight line in accordance with the known relative displacement positions.
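This side-by-side arrangement according to the known displacement positions could, purely for illustration, look like the following Python sketch; the function name stack_frames, the nearest-frame resampling onto a regular grid and the pullback values in the example are assumptions, not details taken from the patent.

```python
import numpy as np

def stack_frames(frames, displacements_mm, slice_spacing_mm=0.5):
    """Arrange a pullback sequence of congruent 2D frames side by side along a
    straight central line according to the known relative displacement positions.

    Simplified sketch: each slice of the regular output grid takes the frame
    whose recorded displacement lies closest to that slice position.
    """
    frames = np.asarray(frames, dtype=np.float32)              # (N, H, W)
    d = np.asarray(displacements_mm, dtype=float)
    d = d - d[0]                                               # relative to the first frame
    grid = np.arange(0.0, d[-1] + 1e-9, slice_spacing_mm)      # positions of output slices
    nearest = np.abs(grid[:, None] - d[None, :]).argmin(axis=1)
    return frames[nearest]                                     # combined 3D image data record

# Example: 40 synthetic frames recorded at slightly irregular pullback positions.
frames = np.random.rand(40, 64, 64)
positions = np.cumsum(0.5 + np.random.normal(0.0, 0.02, 40))
combined = stack_frames(frames, positions, slice_spacing_mm=0.5)
print(combined.shape)
```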
The last method steps result in two 3D image data records 9, 11, each of which shows the vascular section of interest in a rectilinear orientation. These two data records 9, 11 can now be assigned to one another, i.e. registered, by a simple superposition of the central axes and a reciprocal translation on the basis of a shared known point. A vascular branch, for example, can be marked by the user in the two 3D images as a shared known point and brought into agreement.
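Assuming both rectilinearly oriented data records already share the same voxel spacing (e.g. after a resampling as sketched further above) and have their central axes aligned with the same array axis, the translation onto a shared known point could be sketched as follows; the function name register_by_shared_point and the landmark coordinates in the example are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import shift

def register_by_shared_point(moving, ref_point_moving, ref_point_fixed):
    """Translate the combined 2D stack so that a shared landmark, e.g. a vascular
    branch marked by the user in both straightened data records, coincides with
    its position in the transformed anatomical 3D data.
    """
    offset = np.asarray(ref_point_fixed, float) - np.asarray(ref_point_moving, float)
    registered = shift(moving, offset, order=1, mode="constant", cval=0.0)
    return registered, offset

# Example with synthetic volumes and assumed landmark coordinates (axis order z, y, x).
straightened_ct = np.random.rand(50, 17, 17)
combined_ivus = np.random.rand(48, 17, 17)
registered_ivus, t = register_by_shared_point(combined_ivus,
                                              ref_point_moving=(12, 8, 8),
                                              ref_point_fixed=(15, 8, 8))
print(t)   # translation vector, here a pure shift along the shared central axis
```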
After this registration of the two rectilinearly oriented 3D image data records 9, 11, a reverse transformation into the original 3D form takes place. This is illustrated in the drawings: after the reverse transformation, which uses the stored transformation parameters, the 2D image data is arranged in accordance with the true three-dimensional anatomy of the vessel and is registered with the original 3D image data.
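For completeness, a possible reverse transformation of the registered, straightened record back into the geometry of the original 3D image data is sketched below, reusing the coordinate grid that was stored as transformation parameters in the straightening sketch above. The nearest-voxel scattering and the function name reverse_transform are illustrative choices; in practice the gaps between scattered voxels would additionally be interpolated.

```python
import numpy as np

def reverse_transform(straightened_values, stored_coords, out_shape):
    """Scatter values from the registered straightened/combined record back into
    the coordinate frame of the original 3D image data.

    stored_coords is the source-coordinate grid kept during the first
    (straightening) transformation; nearest-voxel scattering is used for
    simplicity, and repeated hits keep the value written last.
    """
    out = np.zeros(out_shape, dtype=np.asarray(straightened_values).dtype)
    idx = np.rint(stored_coords.reshape(-1, 3)).astype(int)    # original-volume indices
    vals = np.asarray(straightened_values).reshape(-1)
    inside = np.all((idx >= 0) & (idx < np.array(out_shape)), axis=1)
    a0, a1, a2 = idx[inside].T
    out[a0, a1, a2] = vals[inside]
    return out

# Self-contained example with a synthetic coordinate grid of shape (N, S, S, 3).
values = np.random.rand(50, 17, 17).astype(np.float32)
coords = np.stack(np.meshgrid(np.arange(10, 60), np.arange(20, 37), np.arange(20, 37),
                              indexing="ij"), axis=-1).astype(float)
restored = reverse_transform(values, coords, out_shape=(64, 64, 64))
print(restored.shape)
```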