1. Field of the Invention
The invention concerns an imaging method for medical diagnostics as well as an imaging device operating according to this method.
2. Description of the Prior Art
In medical diagnostics, different imaging methods are used simultaneously or successively in a number of applications in order to facilitate the diagnosis or to avoid misdiagnoses. For example, in urology it is known to implement both an endoscopic examination and an x-ray examination. In an endoscopic method, high-resolution, color, optical images are generated in real time in the viewing direction of the endoscope. Each image, however, offers only two-dimensional information about the state of a section of the inner surface of the cavity, and no information from deeper slices or in other viewing directions. In an x-ray method, by contrast, information is also acquired from regions that are not visible in an endoscopy image. Linking individual or multiple endoscopy images with one or more x-ray images of the same body region requires of the observer not only a precise knowledge of the anatomy but also a well-developed three-dimensional spatial sense that must first be learned, and it frequently leads to misinterpretations. Moreover, pathological structures often deviate in a complex manner from the norm, so that linking endoscopy images and x-ray images is made even more difficult.
One cause of these difficulties is that the methods normally used in addition to endoscopy (for example, magnetic resonance or ultrasound methods, in addition to the aforementioned x-ray methods) provide slice images of the examination subject, while an endoscopy image presents only a boundary surface between an optically permeable medium and an optically impermeable medium on a two-dimensional image plane.
An object of the present invention is to provide an imaging method for medical diagnostics that makes the interpretation of image data acquired with endoscopic methods and non-endoscopic methods easier for the user. A further object of the invention is to provide a device operating according to such a method.
This object is achieved according to the invention by an imaging method wherein, during an endoscopic examination of a body region of a patient, an image of the body region is generated with a non-endoscopic imaging method, and the image field of the endoscope is determined and rendered in the image with accurate position and orientation.
This measure makes it possible for the user to associate the structures identified in the endoscopy image or in the image with one another and to assess them with regard to their diagnostic relevance.
In the present specification the term “image” is used exclusively for images that have been generated with a non-endoscopic imaging method, for example x-ray images, magnetic resonance images or ultrasound images.
A particularly simple determination of the position and orientation of the image field of the endoscope is possible by comparing the image of the endoscope generated with the non-endoscopic imaging method (i.e. the endoscope rendered in the image) with a stored three-dimensional model of the endoscope for the non-endoscopic method. This model is a three-dimensional image of the endoscope that has been generated with this method in a calibration. In such a model comparison, the model is rotated and displaced (shifted) with corresponding coordinate transformations until the image of the endoscope and the model projected onto the image plane after the coordinate transformations show maximal correlation. The image field of the endoscope (which is projected into the image plane just like the model) is also linked with the model.
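By way of illustration, such a model comparison can be sketched as an optimization over the pose parameters. The following sketch assumes the stored model is available as a 3D point cloud, uses a simple orthographic projection onto the image plane, and takes the normalized cross-correlation as the similarity measure; these choices, like all function names, are illustrative and not prescribed by the method.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def project_model(model_points, pose, shape):
    """Rotate and translate the stored 3D endoscope model, then project
    it orthographically onto the image plane, returning a binary
    silhouette image of the projected model."""
    rx, ry, rz, tx, ty, tz = pose
    R = Rotation.from_euler("xyz", [rx, ry, rz]).as_matrix()
    p = model_points @ R.T + np.array([tx, ty, tz])
    silhouette = np.zeros(shape)
    rows = np.clip(np.round(p[:, 1]).astype(int), 0, shape[0] - 1)
    cols = np.clip(np.round(p[:, 0]).astype(int), 0, shape[1] - 1)
    silhouette[rows, cols] = 1.0
    return silhouette

def neg_correlation(pose, model_points, image):
    """Negative normalized cross-correlation between the projected model
    and the endoscope as rendered in the non-endoscopic image."""
    proj = project_model(model_points, pose, image.shape)
    a = proj - proj.mean()
    b = image - image.mean()
    return -float((a * b).sum() /
                  (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12))

def register_endoscope(model_points, image, pose0):
    """Search the rotation/translation (coordinate transformations of the
    model comparison) that maximizes the correlation."""
    result = minimize(neg_correlation, pose0, args=(model_points, image),
                      method="Nelder-Mead")
    return result.x  # (rx, ry, rz, tx, ty, tz)
```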
As an alternative, the image coordinates of the image are determined in a fixed coordinate system, and the position and orientation of the image field of the endoscope are measured in this fixed coordinate system with a position detection device. This makes a model comparison unnecessary, and the image field can be projected into the image purely by calculation, even if the endoscope itself is not visible in the image.
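A sketch of this purely computational variant, assuming the position detection device delivers the tip position and viewing direction of the endoscope in the fixed coordinate system and that the relationship to the image coordinate system is known as a 4x4 homogeneous matrix (the name T_image_from_fixed is assumed); for a 2D slice image, the conical image field is reduced here to its two lateral edge rays:

```python
import numpy as np

def image_field_in_image(tip, view_dir, half_angle_deg,
                         T_image_from_fixed, ray_length=50.0):
    """Compute the endoscope tip and the end points of the two lateral
    edge rays of the image field in the fixed coordinate system, then
    map all three points into the image coordinate system with the
    known homogeneous transform."""
    d = np.asarray(view_dir, float)
    d /= np.linalg.norm(d)
    perp = np.cross(d, [0.0, 0.0, 1.0])   # any direction across the cone
    if np.linalg.norm(perp) < 1e-9:       # view_dir was parallel to z
        perp = np.cross(d, [0.0, 1.0, 0.0])
    perp /= np.linalg.norm(perp)
    spread = np.tan(np.radians(half_angle_deg))
    ends = [np.asarray(tip, float) + ray_length * (d + s * spread * perp)
            for s in (-1.0, 1.0)]

    def to_image(p):                      # homogeneous map into xB, yB
        q = T_image_from_fixed @ np.append(p, 1.0)
        return q[:3] / q[3]

    return to_image(np.asarray(tip, float)), [to_image(e) for e in ends]
```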
The association of a structure rendered in the endoscopy image with a structure rendered in the image is made easier for the observer if the rendered image field is bounded by optically impermeable structures recognizable in the image.
An additional facilitation for the user is achieved when a marker placed by the user in the endoscopy image is displayed in the image as a ray emanating from the endoscope, or as an end point of that ray at an optically impermeable structure.
As an alternative or in addition, an area selected by the user in the endoscopy image is displayed in the image as a beam emanating from the endoscope or as an end surface at an optically impermeable structure.
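One conceivable realization of such a ray and its end point is to march along the sight line belonging to the marked pixel until the non-endoscopic data set indicates an optically impermeable structure. The sketch below assumes a 3D intensity volume and a simple intensity threshold (for example a CT value) as the impermeability criterion; both assumptions are made only for illustration:

```python
import numpy as np

def cast_marker_ray(volume, origin, direction, threshold=300.0,
                    step=0.5, max_len=200.0):
    """March along the sight line belonging to a marker placed in the
    endoscopy image and return the first point at which the volume
    exceeds an intensity threshold taken (as an assumption) to indicate
    an optically impermeable structure such as the cavity wall."""
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    p = np.asarray(origin, float)
    for t in np.arange(0.0, max_len, step):
        q = p + t * d
        idx = tuple(np.round(q).astype(int))
        if any(i < 0 or i >= s for i, s in zip(idx, volume.shape)):
            break                  # left the volume without a hit
        if volume[idx] >= threshold:
            return q               # end point of the ray at the wall
    return None
```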
In an embodiment of the invention, the marker or surface is automatically segmented with methods of image processing.
If the image generated with a non-endoscopic imaging method is a 3D image, the image field of the endoscope can be visualized particularly vividly in spatial representation.
A device according to the invention implements the method and achieves advantages that correspond to the advantages specified with regard to the method.
As shown in the figure, an endoscope 20, with which it is possible to optically observe a section of the internal surface of a wall 30 of a cavity 18, is inserted via a bodily orifice into this cavity 18 of the body 12. The lateral edge 19 of the image field 22 acquired by the endoscope 20 is drawn in dashed lines in the figure.
Starting from the wall 30 of the cavity 18, a pathological tissue zone 24 (for example a tumor) extends in this example into a region lying behind the wall 30, i.e. outside of the cavity 18. In the example shown, this pathological tissue zone 24 lies in the image field 22 of the endoscope 20 and is recognizable, in an endoscopy image 26 acquired in this position of the endoscope 20, as a planar structure 240 of the inner surface; this structure 240 does stand out relative to its surroundings, but its unambiguous assessment is not possible without further measures.
Among other things, the contour of the wall 30 of the cavity 18 (which contour forms an optically impermeable structure), as well as further structures 32 situated in the slice plane, are recognizable in a non-endoscopic image 28 (in the example, a two-dimensional x-ray slice image) generated with the image generation system 4. A structure 242 that extends into the depth of the tissue and that reflects the pathological tissue zone 24 is now recognizable for the user in this image 28. Moreover, in the exemplary embodiment shown, the endoscope 20 is visible in the image 28.
Using a three-dimensional model of the endoscope 20 stored in a memory 40 of the image generation system 4, the position of the endoscope 20 (and therefore of the image field 22) relative to an image coordinate system xB, yB associated with the image generation system 4 is now determined by comparing the image of the endoscope 20 with this model. The image field 22 is mixed into the image 28 with accurate position and orientation (i.e. is visibly emphasized for the user) by image processing software implemented in a control and evaluation device 42 of the image generation system 4. For example, the entire image field 22 can be colored for this purpose. Alternatively, it is also possible to display only the edge 19 of the image field 22 as boundary rays in the image 28.
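The mixing step itself can be sketched as follows for a 2D slice image, assuming that the projected image field is the triangle spanned by the endoscope tip and the two edge-ray end points, all given in pixel coordinates of the image; tinting the filled field and drawing only the boundary rays are the two alternatives mentioned above (names and parameters are illustrative):

```python
import numpy as np

def overlay_image_field(image, tip, edge_a, edge_b, alpha=0.35,
                        fill=True):
    """Blend the projected (triangular) image field of the endoscope
    into a 2D slice image: either tint every pixel inside the field or,
    with fill=False, mark only the two boundary rays.
    Points are (x, y) pixel coordinates."""
    out = np.stack([image] * 3, axis=-1).astype(float)  # gray -> RGB
    ys, xs = np.mgrid[0:image.shape[0], 0:image.shape[1]]
    if fill:
        # barycentric sign test: pixel inside triangle (tip, a, b)
        def edge(p0, p1):
            return ((xs - p0[0]) * (p1[1] - p0[1])
                    - (ys - p0[1]) * (p1[0] - p0[0]))
        s0, s1, s2 = edge(tip, edge_a), edge(edge_a, edge_b), edge(edge_b, tip)
        inside = (((s0 >= 0) & (s1 >= 0) & (s2 >= 0))
                  | ((s0 <= 0) & (s1 <= 0) & (s2 <= 0)))
        out[inside, 0] = (1 - alpha) * out[inside, 0] + alpha * 255.0
    else:
        for end in (edge_a, edge_b):      # draw the two boundary rays
            n = int(np.hypot(end[0] - tip[0], end[1] - tip[1]))
            for t in np.linspace(0.0, 1.0, max(n, 2)):
                x = int(round(tip[0] + t * (end[0] - tip[0])))
                y = int(round(tip[1] + t * (end[1] - tip[1])))
                if 0 <= y < image.shape[0] and 0 <= x < image.shape[1]:
                    out[y, x] = (255.0, 0.0, 0.0)
    return out.astype(np.uint8)
```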
Moreover, in the shown example both the edge 19 and the image field 22 end at the contour (recognizable in the image 28) of the wall 30 of the cavity 18 in order to visualize to the observer that only a surface region corresponding to this contour is visible in the endoscopy image 26. Alternatively or additionally, it is possible to emphasize the end surface of the image field 22 at the optically impermeable structure (wall 30).
By comparing the endoscopy image 26 and the image 28, the observer now recognizes that the planar structure 240 visible in the endoscopy image 26 belongs to the structure 242 extending deep into the tissue in the image 28. The observer can thus clearly associate the two structures 240, 242 with one another and, for example, better assess their volume extent, since the endoscopy image 26 and the image 28 impart size impressions in slice planes perpendicular to one another.
Moreover, in the shown example a position detection device 50 is associated with the endoscope 20, with which the position and orientation of the endoscope 20 (and therefore also the position and orientation of the image field 22) can be determined in a fixed coordinate system x, y, z using sensors (not shown in the figure) arranged in the region of the endoscope tip. Given a known relationship between the image coordinate system xB, yB of the image 28 and the fixed coordinate system x, y, z, it is possible to enter the image field 22 into the image 28 in the correct position without a model of the endoscope 20 having to be stored.
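Computationally, this amounts to chaining two homogeneous transforms, sketched here under the assumption that both relationships are available as 4x4 matrices (the names are assumed for illustration):

```python
import numpy as np

def field_pose_in_image(T_image_from_fixed, T_fixed_from_endoscope):
    """Chain the calibrated relationship between the fixed coordinate
    system x, y, z and the image coordinate system xB, yB with the pose
    measured by the position detection device; the result places the
    image field in the image without any stored endoscope model."""
    return T_image_from_fixed @ T_fixed_from_endoscope

# e.g. map the endoscope tip (origin of its own coordinate system):
# tip_in_image = field_pose_in_image(T_img_fix, T_fix_scope) @ [0, 0, 0, 1]
```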
The observer now has the possibility of marking an area of interest in the endoscopy image; in the example, this area encompasses the structure 240. This area 52 is relayed from a control and evaluation device 54 of the endoscopy apparatus 2 to the control and evaluation device 42 of the image generation system 4 and is mixed into the image 28 by the image processing software, for example in the form of a limited image field 56 shown hatched or in the form of boundary rays 58 drawn in dashed lines, as illustrated in the figure.
As an alternative to this, the observer can also place a marker 60 at a single point of interest; this marker 60 is then shown either as a sight line or ray 62 (drawn in dashed lines) ending at the wall 30, or as an end point 61 of this ray 62 at the wall 30, i.e. at the optically impermeable structure in the image 28.
In principle, however, it is also possible for the area 52 of interest or the marker to be automatically segmented with methods of image processing.
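As one possible image processing method for this, a simple region growing from the marker position can serve; the sketch assumes a single-channel endoscopy image and a gray-value tolerance as the homogeneity criterion, and is only one of many conceivable segmentation approaches:

```python
import numpy as np
from collections import deque

def grow_region(endo_image, seed, tol=12):
    """Region growing around a user-placed marker: collect the
    4-connected pixels whose gray value stays within `tol` of the seed
    value, yielding a mask of the area of interest."""
    h, w = endo_image.shape
    seed_val = float(endo_image[seed])
    mask = np.zeros((h, w), dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(float(endo_image[ny, nx]) - seed_val) <= tol):
                mask[ny, nx] = True
                queue.append((ny, nx))
    return mask
```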
The figures present situations in which the endoscope 20 is visible in the image (slice image). In principle, however, a situation is also possible in which the tip of the endoscope is arranged outside of the plane of the drawing (the image plane of the image 28); in this case the image field 22 can nevertheless be entered into the image 28 purely by calculation, using the position and orientation measured with the position detection device 50.
In the exemplary embodiment, two-dimensional slice images are generated by the image generation system 4. However, the method can be used to particularly great advantage when the image generation system 4 generates a three-dimensional image data set of the examination subject 12. The normally conical image field 22 of the endoscope can then be spatially mixed into the 3D data set, which significantly facilitates the orientation of the user.
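For this 3D case, the conical image field can be represented as a Boolean voxel mask that is then blended into the 3D data set; the sketch assumes the data set is an axis-aligned voxel array and that the cone is given by its apex, axis, and half opening angle in voxel index coordinates (names are illustrative):

```python
import numpy as np

def cone_mask(shape, apex, axis, half_angle_deg, max_len):
    """Boolean 3D mask of the endoscope's conical image field, suitable
    for blending the field into a 3D data set (e.g. as a colored
    overlay in a volume rendering). apex and axis are given in voxel
    index order (z, y, x)."""
    a = np.asarray(axis, float)
    a /= np.linalg.norm(a)
    zz, yy, xx = np.mgrid[0:shape[0], 0:shape[1], 0:shape[2]]
    v = np.stack([zz, yy, xx], axis=-1).astype(float) - np.asarray(apex, float)
    dist_along = v @ a                    # distance along the cone axis
    norm = np.linalg.norm(v, axis=-1) + 1e-12
    cos_angle = dist_along / norm         # cosine of angle to the axis
    return ((dist_along > 0) & (dist_along <= max_len)
            & (cos_angle >= np.cos(np.radians(half_angle_deg))))
```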
The invention is also not limited to the flexible endoscope depicted in the exemplary embodiment. In principle, the endoscope can also be of rigid design or take the form of an endoscopy capsule.
Although modifications and changes may be suggested by those skilled in the art, it is the intention of the inventors to embody within the patent warranted hereon all changes and modifications as reasonably and properly come within the scope of their contribution to the art.
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
10 2007 029 888 | Jun 2007 | DE | national
Prior Publication Data

Number | Date | Country
---|---|---
20090005641 A1 | Jan 2009 | US