This application claims priority of German patent application no. 10 2014 107 443.2, filed May 27, 2014, the entire content of which is incorporated herein by reference.
The present invention relates to a microscope system, for example a surgical microscope system especially for neurosurgical applications. The invention furthermore relates to a microscopy method, for example for surgical microscopes especially for neurosurgical applications.
A tumor resection by means of a surgical microscope constitutes a general challenge in surgery. Particularly in the case of resection in functional regions, the exact three-dimensional morphology of the tumor has to be known to the surgeon in order to be able to choose the most exact possible cut boundary between functional tissue and malignant tissue during an intervention.
With currently available navigation solutions, the tumor margin calculated on the basis of pre-operative data, for example from MRI examinations, is displayed only for the current focal plane or focal slice or individual adjacent slices through the eyepiece in the surgical microscope.
For a better representation and identification of the three-dimensional tumor morphology, the surgeon must either call up a corresponding morphology from memory or, during the intervention, abandon the surgeon's customary view through the eyepiece of the surgical microscope in order to view an external visualization unit. Both relying on a morphology learned beforehand and momentarily looking at an external visualization unit pose risks for the patient, since the operation flow is interrupted.
Therefore, it is an object of the present invention to provide an advantageous microscope system and an advantageous microscopy method.
The microscope system according to the invention includes a microscope, for example a surgical microscope, for generating a microscopic image of an observation region to be examined. It furthermore includes a display unit, for example a visualization unit or a display, for visualizing the microscopic image. The microscope system additionally includes a registration unit and an evaluation unit. The microscope, the display unit, the registration unit and the evaluation unit are connected to one another, especially for data transfer.
The microscope system according to the invention is distinguished by the fact that the registration unit is configured to transfer the three-dimensional structure of an observation object from available data, for example data obtained at an earlier point in time, to the position of the observation object in the observation region. The observation object can be tissue, for example, especially tissue of a tumor. The registration unit can therefore be configured to register the morphology of the tissue, for example the tumor morphology, from pre-operative data to the current observation region or the current operation scene.
The microscope system according to the invention is furthermore distinguished by the fact that the evaluation unit is configured to calculate a depth preview map of the three-dimensional structure of the observation object from available data, for example data obtained at an earlier point in time, and to transmit the depth preview map to the display unit for visualizing the three-dimensional structure in relation to the position of the observation object in the observation region. By way of example, the evaluation unit can be configured to calculate the morphology of tissue, especially of a tumor, from available data and to transmit it to the display unit for visualizing the three-dimensional structure in relation to the position of the observation object in the observation region. The evaluation unit can be a PC, for example.
The problem described above in connection with tumor resection, namely the inadequate possibility of visualizing a tumor during an operative intervention, can be solved with the aid of the microscope system according to the invention by means of a tumor depth preview map which visualizes the entire morphology of the tumor, for example as iso-depth lines (also known as isobaths), which are inserted directly into the eyepiece of the surgical microscope. For better differentiation, individual characteristic depth regions can optionally also be provided with color coding. The surgeon thus need not abandon the surgeon's customary view through the eyepiece of the surgical microscope. The surgeon need not rely on a morphology learned beforehand either. This improves the operation flow and reduces the abovementioned risks for the patient.
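The color-coded iso-depth representation can be sketched in a few lines. The following Python fragment is a purely illustrative example and not part of the claimed subject matter; the 2 mm band width and the synthetic dome-shaped "tumor" are arbitrary assumptions:

```python
import numpy as np

def depth_bands(depth_map, band_mm=2.0):
    """Quantize a 2-D depth map (mm below the current focal plane)
    into discrete bands; the band index can drive a color coding,
    and band boundaries approximate the iso-depth lines."""
    bands = np.floor(depth_map / band_mm).astype(int)
    # An iso-depth line runs wherever the band index changes between
    # horizontally or vertically adjacent pixels.
    edges = np.zeros_like(bands, dtype=bool)
    edges[:, 1:] |= bands[:, 1:] != bands[:, :-1]
    edges[1:, :] |= bands[1:, :] != bands[:-1, :]
    return bands, edges

# Example: a radially symmetric dome reaching 10 mm depth at the center.
yy, xx = np.mgrid[-20:21, -20:21]
depth = np.clip(10.0 - 0.5 * np.hypot(xx, yy), 0.0, None)
bands, iso = depth_bands(depth)
```

The band index array can then be mapped to a color table for the optional color coding of characteristic depth regions.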
In principle, in the context of the microscope system according to the invention, techniques, such as the fusion of image data, for example, from the field of augmented reality can be used in order to enable a seamless and realistic insertion into the operation scene.
The microscope system advantageously comprises a measuring system configured for detecting the topography of the observation region, in particular of the current operation scene. The measuring system can comprise a stereoscopic sensor and/or a laser scanner and/or a sensor for time-of-flight measurement, for example a time-of-flight camera (TOF-camera), and/or an apparatus for structured illumination. The stereoscopic sensor can comprise for example two cameras integrated in the surgical microscope.
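For the stereoscopic variant of the measuring system, the topography follows from the disparity between the two integrated cameras via the classic pinhole stereo relation. The following sketch is illustrative only; the focal length and baseline values are assumptions, not values taken from the application:

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_mm):
    """Pinhole stereo relation: depth = f * B / d, where d is the
    per-pixel disparity between the two integrated cameras.
    Zero disparity maps to infinity."""
    d = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        return focal_px * baseline_mm / d

# Illustrative values: f = 1200 px, stereo baseline B = 24 mm.
depth_mm = disparity_to_depth([96.0, 48.0], focal_px=1200.0, baseline_mm=24.0)
```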
Optionally, the microscope system comprises a visualization system, for example in the form of a video camera, configured to detect images, for example current images, of the observation region, in particular of the operation scene, in order to combine them with the depth preview map. In principle, the visualization of the depth preview map or depth map can be effected as an opaque superimposition.
An alternative form of visualization is linear display of the iso-depth lines, wherein the lines can be shown as solid, dashed, dotted or in an arbitrary combination thereof. Any form of superimposition of iso-depth lines or opaque representation with the current operation scene or the observation region is also conceivable in order to enable realistic “look-and-feel”. Different visualization parameters can advantageously be chosen for visible and for non-visible regions of the malignant tissue.
In a further embodiment, the depth information can also be displayed in a locally delimited fashion, in order not to restrict the field of view of the examining person, for example the operating surgeon.
Preferably, the visualization is primarily effected in the eyepiece of the surgical microscope. Alternatively or additionally, the visualization can also be effected on an external display unit, for example a monitor, data spectacles or the like.
The registration unit can comprise a navigation device or can itself be embodied in the form of a navigation device. It can be configured, in principle, for rigid or for non-rigid transfer or registration.
As already mentioned, the display unit can comprise an eyepiece display or external display. Furthermore, the display unit can be configured for visualizing the depth preview map or depth map as an opaque superimposition and/or for visualizing the depth preview map or depth map in the form of iso-depth lines. The externally embodied display unit can be a monitor or data spectacles, for example.
Overall, the microscope system according to the invention has the advantage that it enables an improved representation of the overall morphology of an observation object within an observation region, for example an improved representation of the overall morphology of malignant tissue within an operation scene.
In the context of the microscopy method according to the invention, a microscopic image of an observation region to be examined is generated by means of a microscope and visualized by means of a display unit. The three-dimensional structure of an observation object, for example of a tissue region, is transferred or registered from available data, in particular pre-operative data obtained at an earlier point in time, to the position of the observation object in the observation region, in particular to the current observation region or the current operation scene. Furthermore, a depth preview map of a three-dimensional structure of the observation object, for example of the tissue region, is calculated from available data, for example pre-operative data obtained at an earlier point in time. The calculated three-dimensional structure is visualized in relation to the position of the observation object in the observation region with the aid of the display unit.
The method according to the invention can be carried out for example with the aid of the above-described microscope system according to the invention. In principle, it has the same advantages as the above-described microscope system according to the invention.
A registration unit described in the context of the microscope system according to the invention and/or an evaluation unit described in that context can advantageously be used. The observation region is preferably a surgical, for example neurosurgical, operation scene.
The topography of the observation region, for example the current operation scene, can advantageously be detected. A corresponding measuring system configured for detecting the topography of the observation region can be used for this purpose. The detection can be effected, in principle, stereoscopically, for example with the aid of a stereoscopic sensor, in particular with the aid of two video cameras integrated in the surgical microscope, and/or with the aid of a laser scanner and/or with the aid of a method for time-of-flight measurement, for example with the aid of a sensor for time-of-flight measurement such as preferably a TOF camera. Alternatively or additionally, the topography detection can be effected by means of structured illumination, for example with the aid of an apparatus for structured illumination.
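For the time-of-flight variant, the working distance follows directly from the round-trip time of the emitted light. A minimal illustrative sketch (the numerical example is an assumption, not a value from the application):

```python
def tof_distance(round_trip_ns, c_mm_per_ns=299.792458):
    """Time-of-flight ranging: the light travels to the scene and
    back, so the distance is half the round-trip path length
    (speed of light c given in mm per nanosecond)."""
    return 0.5 * c_mm_per_ns * round_trip_ns

# A 2 ns round trip corresponds to roughly 300 mm working distance.
d = tof_distance(2.0)
```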
As a result of the inclusion of current topography information, the depth information and the visualization thereof can be adapted to the current conditions. In a further extension, the intraoperative image data are used to compensate for a possible geometric deviation between pre-operative and intraoperative data for the visualization of the depth information. Various image processing algorithms and/or various illumination modes and/or markers, for example contrast agents, can be used for this purpose.
In principle, the depth information can be detected with the aid of external navigation solutions. Specifically, the depth information can be detected by means of MRI (magnetic resonance imaging), CT (computed tomography) or the like. The navigation system can segment the data, that is, assign them to bone or tumor tissue, for example, and can supply them either as raw data or in a manner already corrected computationally to the optical axis of the surgical microscope.
In one embodiment thereof, the navigation system provides the entire depth information via an interface. In another embodiment, the surgical microscope, on the basis of the current focal plane, transmits “virtual depths” (distance of the adjustable image focus relative to the surgical microscope) in a defined range of values to an external navigation solution in order to obtain the respective contour of the malignant tissue in the pre-operative state for the transmitted depth. These contours are then combined to form the depth map in the surgical microscope.
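The combination of the per-depth contours into a depth map can be sketched as follows. This is an illustrative Python fragment with a stubbed navigation interface (`query_contour` and the dome-shaped test geometry are hypothetical, not part of the application):

```python
import numpy as np

def build_depth_map(query_contour, depths_mm, shape):
    """Combine per-depth tissue contours, as returned by an external
    navigation interface, into a single depth preview map.
    query_contour(depth) must return a boolean mask of the tissue
    cross-section at that virtual depth; contours are written from
    deep to shallow so that each pixel keeps the shallowest depth
    at which tissue is present (the tissue surface)."""
    depth_map = np.full(shape, np.nan)
    for depth in sorted(depths_mm, reverse=True):
        mask = query_contour(depth)
        depth_map[mask] = depth
    return depth_map

# Stubbed navigation interface: cross-sections widening with depth.
def fake_contour(depth):
    yy, xx = np.mgrid[-10:11, -10:11]
    return np.hypot(xx, yy) <= 1.5 * depth

dm = build_depth_map(fake_contour, [2.0, 4.0, 6.0], (21, 21))
```

Pixels outside every contour remain NaN and can be left untouched in the visualization.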
Optionally, the microscopically generated images of the observation region to be examined, for example of the operation scene, are detected. They are subsequently combined with the depth preview map. In particular, a correspondingly configured visualization system, for example a video camera, can be used for this purpose.
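The combination of the camera image with the color-coded depth preview map can be sketched as a simple blend, where an alpha value of 1.0 yields the opaque superimposition mentioned above. A purely illustrative fragment with assumed image dimensions:

```python
import numpy as np

def overlay(image, color_map, mask, alpha=1.0):
    """Superimpose a color-coded depth map onto the current camera
    image; alpha = 1.0 gives an opaque superimposition, smaller
    values a semi-transparent blend restricted to the masked region."""
    out = image.astype(float).copy()
    out[mask] = (1.0 - alpha) * out[mask] + alpha * color_map[mask]
    return out

img = np.full((4, 4, 3), 100.0)           # gray camera image
col = np.zeros((4, 4, 3))
col[..., 0] = 255.0                        # red depth-map coloring
m = np.zeros((4, 4), dtype=bool)
m[1:3, 1:3] = True                         # region covered by the map
blended = overlay(img, col, m, alpha=0.5)
```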
In the context of the microscopy method, the depth preview map or depth map can be visualized as an opaque superimposition and/or in the form of iso-depth lines, for example with the aid of a display unit. The display unit used can comprise an eyepiece display or external display. It can be configured in particular for visualizing the depth map as an opaque superimposition and/or for visualization in the form of iso-depth lines. In particular, a monitor or data spectacles can be used as external display unit.
The transfer or registration of the three-dimensional structure of the observation object on the basis of available data to the position of the observation object in the observation region can be effected rigidly or non-rigidly, in principle. A registration unit configured for rigid or for non-rigid transfer or registration can be used for this purpose.
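A rigid transfer of this kind can, for instance, be estimated from corresponding point pairs by a least-squares fit. The following sketch uses the Kabsch algorithm with hypothetical fiducial points; it illustrates one possible rigid registration, not the specific method of the application:

```python
import numpy as np

def rigid_register(src, dst):
    """Estimate the rigid transform (rotation R, translation t) that
    maps pre-operative points src onto intra-operative points dst in
    the least-squares sense (Kabsch algorithm)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # Correct a possible reflection so that R is a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t

# Hypothetical fiducials: the intra-operative cloud is the
# pre-operative one rotated 90 degrees about z and shifted by (1, 2, 3).
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
dst = src @ Rz.T + np.array([1.0, 2.0, 3.0])
R, t = rigid_register(src, dst)
```

A non-rigid registration would additionally model local deformations and is not covered by this sketch.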
The invention will now be described with reference to the drawings wherein:
The fundamental construction of the surgical microscope 2 is explained below with reference to
The surgical microscope 2 shown in
A magnification changer 11 is arranged at the observer side of the objective 5. The magnification changer can be embodied either, as in the embodiment shown, as a zoom system for changing the magnification factor in a continuously variable fashion or as a so-called Galilean changer for changing the magnification factor step by step. In a zoom system constructed, for example, from a lens combination including three lenses, the two object-side lenses can be displaced in order to vary the magnification factor. In practice, however, the zoom system can also comprise more than three lenses, for example, four or more lenses, wherein the outer lenses can then also be arranged in a fixed fashion.
In a Galilean changer, by contrast, there are a plurality of fixed lens combinations which represent different magnification factors and can be introduced into the beam path alternately. Both a zoom system and a Galilean changer convert an object-side parallel beam into an observer-side parallel beam having a different beam diameter. In the embodiment, the magnification changer 11 is already part of the binocular beam path of the surgical microscope 2, that is, it has a dedicated lens combination for each stereoscopic partial beam path (9A, 9B) of the surgical microscope 2. In the present embodiment, the setting of a magnification factor by the magnification changer 11 is effected by a motor-driven actuator which, together with the magnification changer 11, is part of a magnification changing unit for setting the magnification factor.
An interface arrangement (13A, 13B) is adjacent to the magnification changer 11 on the observer side. Via the interface arrangement, external devices can be connected to the surgical microscope 2; in the present embodiment, the interface arrangement comprises beam splitter prisms (15A, 15B). In principle, however, other types of beam splitters can also be used, for example, partly transmissive mirrors. In the present embodiment, the interfaces (13A, 13B) serve for coupling a beam out of the beam path of the surgical microscope 2 (beam splitter prism 15B) and for coupling a beam into the beam path of the surgical microscope 2 (beam splitter prism 15A).
In the present embodiment, the beam splitter prism 15A in the component beam path 9A serves, with the aid of a display 37 (for example, a digital mirror device (DMD) or an LCD display) and an associated optical unit 39, to reflect information or data for an observer into the component beam path 9A of the surgical microscope 2. In the other component beam path 9B, a camera adapter 19 with a camera 21 fixed thereto is arranged at the interface 13B. The camera is equipped with an electronic image sensor 23, for example, with a CCD sensor or a CMOS sensor. An electronic and especially a digital image of the tissue region 3 can be recorded by the camera 21. In particular, a hyperspectral sensor containing not just three spectral channels (for example, red, green and blue) but rather a multiplicity of spectral channels can also be used as the image sensor.
A binocular tube 27 is adjacent to the interface (13A, 13B) on the observer side. The binocular tube includes two tube objectives (29A, 29B) which focus the respective parallel component beams (9A, 9B) onto intermediate image planes (31A, 31B), that is, image the observation object 3 onto the respective intermediate image planes (31A, 31B). The intermediate images situated in the intermediate image planes (31A, 31B) are finally, in turn, imaged toward infinity by eyepiece lenses (35A, 35B) such that an observer can observe the intermediate image with a relaxed eye. Moreover, in the binocular tube, the distance between the two component beams (9A, 9B) is magnified by a mirror system or by prisms (33A, 33B) in order to adapt the distance to the intraocular distance of the observer. Image erection is additionally carried out by the mirror system or the prisms (33A, 33B).
The surgical microscope 2 is additionally equipped with an illumination apparatus that can be used to illuminate the object field 3 with broadband illumination light. For this purpose, in the present embodiment, the illumination apparatus includes a white light source 41, for instance a halogen incandescent lamp or a gas discharge lamp. The light emerging from the white light source 41 is directed via a deflection mirror 43 or a deflection prism in the direction of the object field 3 in order to illuminate the latter. Furthermore, an illumination optical unit 45 is present in the illumination apparatus and provides for uniform illumination of the entire observed object field 3.
It is noted that the illumination beam path shown in
The illumination can be influenced in the surgical microscope shown in
The illumination apparatus can additionally be equipped with a unit for changing the illumination light source. The latter is indicated in
In the embodiment variant of the surgical microscope 2 shown in
One example of a varifocal objective is shown schematically in
Although the positive element 51 is embodied in displaceable fashion in
In principle, it is possible here to use techniques from the field of augmented reality for the fusion of image data. This enables seamless and realistic insertion into the operation scene.
With the aid of the tumor depth preview map 10, the entire morphology of the tumor 12 is visualized as iso-depth lines, as shown in
The depth information is detected for example with the aid of external navigation solutions. In this case, the navigation system can provide the entire depth information via an interface. In another embodiment, in addition or as an alternative thereto, on the basis of the current focal plane, the surgical microscope 2 can transmit “virtual depths” in a defined range of values to an external navigation solution in order to obtain each contour of the malignant tissue 12 in the pre-operative state for the corresponding transmitted depth.
In an extended variant, the topography information of the operation scene is detected by a suitable sensor, for example a stereoscopic sensor, a laser scanner, a time-of-flight sensor or with the aid of structured illumination. The stereoscopic sensor can comprise for example two video cameras integrated in the surgical microscope. As a result of the inclusion of current topography information, the depth information and the visualization thereof can be adapted to the current circumstances. In particular, deformations that occur can be taken into account.
As a further variant, the intra-operative image data are used to compensate for a possible geometric deviation between pre-operative and intra-operative data for the visualization of the depth information. Various image processing algorithms, illumination modes and markers, for example contrast agents, can be used for this purpose.
In principle, the visualization of the depth preview map or depth map can be effected as an opaque superimposition, as shown for example in
Preferably, the visualization is primarily effected in the eyepiece of the surgical microscope, for example as perspectively correct superimposition or as picture-in-picture (PiP) at the edge of the field of view in order to minimize the concealment of relevant image information. However, the visualization can optionally also be effected on an external display unit.
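The picture-in-picture placement at the edge of the field of view can be sketched as a simple array composition. This is an illustrative fragment with assumed image sizes, not part of the claimed subject matter:

```python
import numpy as np

def picture_in_picture(view, depth_map_img, margin=2):
    """Insert the (small) depth preview map at the lower-right edge
    of the field of view, so that it conceals as little relevant
    image information as possible."""
    out = view.copy()
    h, w = depth_map_img.shape[:2]
    out[-h - margin:-margin, -w - margin:-margin] = depth_map_img
    return out

view = np.zeros((10, 10))   # eyepiece field of view (grayscale stand-in)
pip = np.ones((3, 3))       # small rendered depth preview map
composed = picture_in_picture(view, pip)
```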
The surgical system can optionally include a measuring system 63 configured to detect the topography of the current operation scene. The microscope system can likewise optionally comprise a system, for example in the form of a video camera, configured to detect current images of the operation scene and to combine them with the depth preview map.
It is understood that the foregoing description is that of the preferred embodiments of the invention and that various changes and modifications may be made thereto without departing from the spirit and scope of the invention as defined in the appended claims.