The present disclosure relates to a surgical navigation system for navigation during a surgical intervention on a patient for tracking at least one medical instrument used during the intervention. In addition, the present disclosure relates to a navigation tower, a navigation method and a computer-readable storage medium.
Surgical navigation can usually only be performed with special instruments that have special marking systems with markers, such as infrared reference points, or an electromagnetic tracking system. These instruments are known as indicators or pointers and are specially adapted to mark a position in three-dimensional space with their tip, which is detected by the navigation system.
In order to obtain the required navigation information during an intervention, a user, such as a surgeon, has to interrupt his workflow, take the specially adapted (pointer) instrument in his hand and guide it to the desired location, the only function of the (pointer) instrument being to provide navigation information. Such constant changing between the individual instruments disadvantageously prolongs an intervention on a patient and also fatigues the surgeon. In addition, a large number of medical instruments with different functions always have to be kept available.
In addition, there is currently the problem that navigation systems usually use an external camera system, such as a stereo camera, which is positioned at a distance from the intervention region. In a surgical navigation system, for example an infrared-based navigation system, the field of view along the required lines of sight is therefore severely limited in the complex environment of an intervention region, in particular when a navigation system and a surgical microscope are used simultaneously. The tracing of the instrument is then temporarily interrupted and its precision decreases. In addition, restrictions regarding the lines of sight to be kept free for the camera system have to be observed, which further complicates the intervention on the patient.
Therefore, the objects of the present disclosure are to provide a surgical navigation system, a navigation tower, a navigation method and a computer-readable storage medium that avoid or at least reduce the disadvantages of the prior art. In particular, a surgical navigation system and a navigation method are to be provided that offer a higher precision of tracking of a medical instrument and allow continuous and uninterrupted tracing as well as an even better detection of an intervention region. A user, in particular a surgeon, is to receive a better and safer navigation modality. A partial object is to support a surgical process sequence even better and to allow it to be performed faster. Another partial object is to use existing surgical instruments as pointers, or at least to have to change them only slightly, in particular only to supplement them. A further partial object is to increase the navigation time available during an intervention.
These objects are solved by a surgical navigation system, a navigation tower, a navigation method and a computer-readable storage medium according to the present disclosure.
A basic idea of the present disclosure can thus be seen in that the surgical navigation system and the navigation method do not detect a medical instrument directly and absolutely via, for example, a camera system, but indirectly: on the one hand via the tracing of the imaging head of the medical imaging device, which is arranged closer to the intervention region than, for example, the camera system, and on the other hand via the tracing of the medical instrument by the imaging head itself, so that the position and/or orientation of the instrument is computable or determinable via a serial linking of the two trackings. If the patient is registered to the navigation system, 3D imaging data/3D image data, in particular preoperative 3D imaging data such as MRI images and/or CT images, are also registered to the patient, and the determined position and/or orientation of the instrument can be determined in the 3D imaging data for navigation and can then be displayed to the user.
In a sense, the direct tracing of the instrument is decoupled into an at least two-part, sequential tracing, i.e. the tracing of the imaging head and an independent tracing of the instrument by the imaging head. The two tracings provide two individual (coordinate) transformations that are linked together in order to obtain a transformation for the instrument to be tracked. Due to this independence, different tracing systems or methods may also be used: for tracing the imaging head, a tracking system configured for it and adapted for greater distances may be used, while a different tracing, specially adapted for the shorter distance in the area of the intervention, may be used for tracing the instrument. In this way, even common, e.g. standardized medical, in particular surgical, instruments can be detected, which is not possible in the prior art.
This at least two-part or two-stage, serially linked tracing can achieve a particularly high precision of tracking/a high tracing accuracy due to the small distance between the imaging head of the medical imaging device, in particular a stereomicroscope, and the instrument used. In this way, the problems with a line of sight to an external navigation camera, for example, are also avoided, since the navigation camera does not need to see the instrument to be tracked itself, but only the imaging head of the medical imaging device, in particular a microscope head with an optical system of a surgical microscope. It is also advantageous that the surgical process sequence is not interrupted by the use of the navigation system, since the detection and tracing of the instrument by the imaging head of the imaging device means that standard instruments can also be followed and do not have to be replaced by navigated instruments such as a pointer.
An intervention using the navigation method is furthermore even safer, since the net navigation time available to the surgeon is further increased: standard surgical instruments are navigated throughout their entire use.
In other words, a surgical navigation system for navigation in a surgical intervention on a patient for tracing of at least one object, in particular a medical instrument, is provided, comprising: a display device, in particular a monitor, for displaying a visual content; at least one object to be tracked, in particular a medical instrument; a medical imaging device having an imaging head, which is adapted to create an optical/visual image of an intervention region in a patient as well as to detect and to trace/to track the object to be tracked, in particular a medical instrument, in the surgical intervention region of the patient with respect to the imaging head; an (external) tracking system, which is adapted to detect at least the imaging head of the imaging device and to track it with respect to the tracking system, as well as to detect and in particular to track at least a partial portion of the patient with the intervention region for registration; a data provision unit, in particular a storage unit, which is adapted to provide digital 3D imaging data, in particular preoperative 3D imaging data, of the patient; and a control unit, which is adapted to process the data of the imaging device, the data of the tracking system as well as the provided 3D imaging data and to determine a position and/or orientation of the object to be tracked, in particular an instrument, in particular an instrument tip of the instrument, by linking (a transformation or transformation matrix of) the tracing from the tracking system to the imaging head as well as the tracing from the imaging head to the object, in particular instrument, and to transfer this to the 3D imaging data of the patient registered to the tracking system and to output this visually by the display device. The control unit can create a correlation representation with the 3D imaging data registered for the patient and this determined/calculated position and/or orientation of the instrument and can output this visually by the display device. In this way, the surgeon can display the (virtual) instrument, in particular the exact position of the instrument tip, in the 3D imaging data on an OR monitor, for example.
Based on the detected patient, the detected imaging head and the instrument detected by the imaging head, a position and/or orientation of the instrument, in particular of an instrument tip, in relation to the 3D imaging data can be determined. In this way, it is possible to determine the position and/or orientation of surgical instruments relative to 3D imaging data (3D image) of the patient.
Thus, a navigation system is provided for following surgical instruments and indicating their position, in particular their pose, relative to 3D imaging data (3D image set) of the patient during surgery, wherein the instrument is followed by an optical system, the optical system is in turn followed by a tracking system of the navigation system, and the patient is also followed by the navigation system (for registration to the 3D imaging data).
The term ‘position’ means a geometric position in three-dimensional space, which is specified in particular via coordinates of a Cartesian coordinate system. In particular, the position can be specified by the three coordinates X, Y and Z.
The term ‘orientation’ in turn indicates an alignment (such as at the position) in space. It can also be said that the orientation specifies an alignment with an indication of a direction or rotation in three-dimensional space. In particular, the orientation can be specified using three angles.
The term ‘pose’ includes both a position and an orientation. In particular, the pose can be specified using six coordinates, three position coordinates X, Y and Z and three angular coordinates for the orientation.
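Purely by way of illustration, such a six-coordinate pose may be expressed as a 4 × 4 homogeneous transformation matrix, a representation also used for the transformation matrices discussed further below. The following sketch in Python is only one common convention and is not prescribed by the disclosure; all names are illustrative.

```python
import numpy as np

def pose_matrix(x, y, z, rx, ry, rz):
    """Build a 4x4 homogeneous pose from a position (x, y, z) and three
    angles rx, ry, rz (radians) about the X, Y and Z axes."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx          # orientation (three angular coordinates)
    T[:3, 3] = [x, y, z]              # position (three position coordinates)
    return T
```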
The term 3D defines that the image data is spatial, i.e. three-dimensional. The patient's body or at least a partial region of the body with spatial extension can be digitally available as image data in a three-dimensional space with a Cartesian coordinate system (X, Y, Z).
The medical instrument to be tracked may be a suction tube or forceps, for example, whose distal instrument tip can be used to define a point in space with particular precision. The surgical navigation system may of course also be used to track (smaller) medical devices.
Advantageous embodiments are explained in particular below.
According to a preferred embodiment, the medical imaging device may be a surgical microscope for surgery and/or a medical endoscope/surgical endoscope adapted to perform spatial detection for tracing, and in particular comprising a 3D-camera system for detecting depth information. In craniotomy, for example, a surgical microscope supports the surgeon during the intervention. The position of the microscope head of the surgical microscope is followed with the tracking system of the surgical navigation system. In particular, a surgical stereo microscope is used during the intervention in order to recognize surgical instruments and to calculate their relative position with respect to the microscope's image sensors. The imaging device (vision system) may thus be a surgical microscope or a surgical endoscope. In particular, the surgical navigation system may therefore be used in conjunction with a surgical microscope, in particular a stereo microscope, wherein the control unit is specially adapted to provide corresponding navigation data. With a (conventional) surgical microscope, a part of the intervention region of interest can be targeted down to a distance of a few centimeters and the instrument can be localized and tracked with particular precision.
According to a further preferred embodiment, the imaging head of the medical imaging device may comprise a stereo camera for a stereo image and, in particular, the control unit may be adapted to detect a spatial, three-dimensional position and/or orientation of the instrument, in particular of the instrument tip, relative to the imaging head from the stereo image via machine vision. In other words, the imaging device may have a stereo camera system and/or a 3D camera system with depth information. This means that only the control unit for the evaluation needs to be adapted accordingly and standardized devices such as stereo microscopes or endoscopes with a stereo camera on the end face can be used.
According to a further configuration example of the disclosure, the navigation system may comprise an image analysis device, which is adapted to perform a spatial three-dimensional detection of an instrument from at least two image perspectives, in particular a stereo image, via machine vision. In other words, the navigation system may have a camera system with at least one camera and perform three-dimensional detection and tracing of the instrument on the basis of machine vision.
Preferably, the control unit may be adapted to determine a three-dimensional position and/or orientation of the instrument via image processing (analysis) techniques and/or via triangulation of the stereo image/a stereo picture and/or via reconstruction of a disparity overlap of a stereo image. In particular, the control unit may also be adapted to perform navigation using principles of 3D reconstruction and position determination from stereo images. In particular, for each pixel in the left image of the stereo image, a corresponding pixel is searched for in the right image. The positions of these two pixels are used to calculate the apparent depth of the pixel. Alternatively or additionally, the pose (3D position) of a surgical instrument in relation to the imaging head can be calculated using image processing techniques. In particular, the pose (3D position) of the instrument can be determined by recognizing the instrument in the left and right image according to the principle of triangulation. Alternatively or additionally, the position and/or orientation can be determined by reconstructing a disparity overlap of the stereo images. In such a depth map/disparity map, an instrument is recognized and its pose (3D position) is calculated directly from the corresponding depth values. The recognition of instruments may be performed using the following methods: image processing methods for each individual color image of the left image or of the right image; image processing methods that use a pair of the left image and of the right image (simultaneously); image processing methods that use a single disparity map/depth map; or image processing methods that use a combination of the aforementioned image processing methods. In particular, methods of machine vision include deep learning methods using neural networks or image transformations (vision transformers); hand-crafted design features such as line detection or color detection; or an adaptation of 3D models to images. In particular, the at least one instrument is tracked, preferably by machine vision, by a surgical (surgical stereo) microscope and the surgical microscope itself is tracked by a tracking system of the navigation system.
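As a minimal sketch of the triangulation principle mentioned above, assuming an ideal rectified stereo pair with known focal length f (pixels), baseline b and principal point (cx, cy), the apparent depth of a corresponding pixel pair follows directly from its disparity; all variable names and numerical values below are illustrative only.

```python
import numpy as np

def triangulate_point(u_left, u_right, v, f, b, cx, cy):
    """3D position (in the left camera frame) of a feature seen at column
    u_left in the left image and u_right in the right image, row v, for a
    rectified stereo rig with focal length f, baseline b and principal
    point (cx, cy)."""
    disparity = u_left - u_right            # horizontal offset between the two pixels
    z = f * b / disparity                   # apparent depth from the disparity
    x = (u_left - cx) * z / f
    y = (v - cy) * z / f
    return np.array([x, y, z])

# Example: instrument tip detected at column 640 (left) and 500 (right), row 512
tip_in_camera = triangulate_point(640, 500, 512, f=1200.0, b=0.024, cx=640, cy=512)
```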
In particular, a pre-determined/defined optical pattern, which is detected by the imaging head, may be provided, in particular integrated, on an optically visible outer side of the medical instrument to be tracked, in particular on a distal end or end portion of the instrument. The control unit is in turn adapted to recognize and decode the optical pattern or to compare it with a reference stored in a storage unit and to determine a position of an instrument tip relative to the optical pattern or a geometry of the instrument on the basis of the detected optical pattern. The control unit can therefore use the optical pattern to obtain information on a position of the instrument tip or on a geometry of the instrument. The at least one instrument therefore has a predefined optical pattern/optical marking, which is integrated in particular in the distal end or end portion and is detected by the imaging head. The information conveyed via the optical pattern can be useful, since the recognition of the exact position of the instrument tip is difficult due to covering of the instrument tip or a low contrast of the instrument. The recognition of a determined optical pattern on a main body portion of the instrument, on the other hand, is much easier and the control unit can use the information to deduce the position of the instrument tip. One advantage is that even better navigation ergonomics are provided. In particular, the surgical instruments used in the navigation system are not or only slightly modified, in particular by a marking, preferably on or in the area of the tip. This relatively simple and small modification is carried out in order to follow the instrument even more precisely compared to classic navigated instruments with large rigid bodies. For example, direct optical marking can be achieved by simply engraving specific patterns on the instrument, such as QR codes, data matrix codes, barcodes, lines, dots or textures. The optical markings may be QR codes, at least two rings with a predefined distance between them or another unique pattern that the control unit can decode and, in particular, link to stored information. In particular, these optical markings may be used to encode the position of the instrument tip and/or to indicate the geometry of the instrument. In particular, the pre-determined optical pattern may be a QR code and a distance from the QR code to the instrument tip may be encoded in the QR code so that the position of the instrument tip is determinable.
According to a further embodiment, the optical pattern may be a QR code, wherein a distance from the QR code to the instrument tip is encoded in the QR code, so that the position of the instrument tip is determinable.
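One conceivable payload layout is sketched below, purely for illustration; the disclosure does not prescribe any particular encoding format, and the field names are hypothetical.

```python
# Hypothetical QR payload format: "<instrument id>;TIP_DIST_MM=<value>"
def parse_tip_distance(qr_text):
    """Extract the encoded distance (millimetres) from the QR code to the
    instrument tip from a decoded payload string."""
    for field in qr_text.split(";"):
        if field.startswith("TIP_DIST_MM="):
            return float(field.split("=", 1)[1])
    raise ValueError("no tip distance encoded in QR payload")

assert parse_tip_distance("FORCEPS-07;TIP_DIST_MM=42.5") == 42.5
```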
In particular, image processing techniques may be used to learn the visual appearance of common surgical instruments in order to deliver optimal recognition accuracy.
According to one embodiment, a geometric shape of the at least one medical instrument, in particular of several medical instruments, may be stored in a storage unit, in particular by an initial three-dimensional detection by the imaging head and/or by the tracking system, and the control unit may determine the position, in particular pose, of the distal tip on the basis of a partial portion of the instrument to be tracked detected by the imaging head and the stored geometric structure. In particular, the recognition of instruments may be simplified by standardizing the visual or geometric appearance of the at least one instrument. In this way, the instruments do not have to be changed and the known, stored geometric shape information is used to recognize the 3D shape of the instrument and finally to determine the pose of the instrument.
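A sketch of this variant follows, assuming, purely for illustration, that the storage unit holds the tip position of each instrument in the instrument's own coordinate system; the detected pose of the visible partial portion is then combined with the stored geometry to recover the tip.

```python
import numpy as np

# Hypothetical stored geometry: tip position in the instrument's own frame (homogeneous)
INSTRUMENT_GEOMETRY = {
    "suction_tube_3mm": np.array([0.0, 0.0, 0.110, 1.0]),
    "bipolar_forceps":  np.array([0.0, 0.0, 0.095, 1.0]),
}

def tip_from_stored_geometry(instrument_id, T_head_instrument):
    """Return the tip position in the imaging-head frame, given the pose of
    the detected partial portion of the instrument relative to the imaging
    head (4x4 matrix) and the stored geometric shape."""
    tip_local = INSTRUMENT_GEOMETRY[instrument_id]
    return (T_head_instrument @ tip_local)[:3]
```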
Preferably, the tracking system may comprise an infrared-based camera system and/or electromagnetic-based system and/or an IMU (Inertial Measurement Unit)-based tracking system. In the case of an IMU (Inertial Measurement Unit), this may be arranged in particular in the imaging head in order to track the imaging head.
The objects of the present disclosure are solved with respect to a mobile medical navigation tower in that it comprises: a navigation system according to the present disclosure; and a mobile base/mobile cart with wheels for mobile placement of the navigation tower. Due to the design as a compact mobile unit, the navigation tower can be placed flexibly at different locations in an operating room. Thus, a mobile, wheeled medical cart with the navigation system according to the present disclosure is proposed, which in particular comprises a computer implementing the control unit, and a monitor.
With regard to a navigation method for navigation in a surgical intervention on a patient for tracing/tracking of at least one medical instrument, in particular with a surgical navigation system of the present disclosure, the objects are solved by the following steps:
According to an embodiment, the navigation method may further comprise the steps of: creating a stereo image by the medical imaging device; creating, based on the stereo image, a depth map with depth information by an image processing technique and/or by a triangulation and/or by a reconstruction of a disparity overlap; and determining the position and/or orientation of the instrument based on the stereo image and the depth map.
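A minimal sketch of the depth-map variant of this embodiment is given below, assuming rectified left/right grayscale images and using OpenCV's semi-global block matching; the parameter values are illustrative and not prescribed by the disclosure.

```python
import cv2
import numpy as np

def depth_map(left_gray, right_gray, f, b):
    """Dense depth map (metres) from a rectified stereo pair via semi-global
    block matching; f is the focal length in pixels, b the baseline in metres."""
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
    # OpenCV returns fixed-point disparities scaled by 16
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    depth = np.where(disparity > 0, f * b / disparity, 0.0)
    return depth
```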
In particular, if the pre-determined optical pattern is a QR code and a distance from the QR code to the instrument tip is encoded in the QR code, the navigation method may further comprise the steps of: decoding the QR code; reading the distance to the instrument tip; determining the position of the instrument tip relative to the imaging head and via the tracked imaging head relative to the 3D imaging data.
With respect to a computer-readable storage medium, the objects are solved in that it comprises instructions which, when executed by a computer, cause the computer to perform the method steps of the navigation method according to the present disclosure.
Any disclosure related to the navigation system of the present disclosure also applies to the navigation method of the present disclosure and vice versa.
The present invention is explained in more detail below with reference to the accompanying Figures via preferred configuration examples.
The Figures are schematic in nature and are only intended to aid understanding of the invention. Identical elements are marked with the same reference signs. The features of the various embodiments can be interchanged.
In an intervention region of the patient, here on the patient's head, the surgeon acts with a medical instrument 4, for example a pair of forceps, a suction tube or a sealing instrument, which he wishes to track with the navigation system 1, and performs the intervention accordingly.
A surgical stereo microscope (hereinafter referred to only as microscope) 6 as a medical imaging device has a (stereo) microscope head as imaging head 8, which has an optical system (not shown) with downstream sensors such as CMOS sensors or CCD sensors in order to create a visual stereo image A (two video recordings at a distance from each other from different positions) of a part or portion of the intervention region E in the patient P. In addition to the intervention region itself, the microscope 6 also detects the medical instrument 4 to be tracked, which is used in the intervention region E by the surgeon. As will be explained in more detail later, the navigation system 1 can use the stereo image A to spatially detect and trace or respectively track the position and orientation of the instrument, i.e. its pose in relation to the imaging head 8.
An external tracking system 10 of the navigation system 1, in particular an infrared-based tracking system 10, in the form of an external (stereo) camera, which is arranged in the area of the patient's legs, in turn detects the imaging head 8 of the microscope 6 and can detect and track it in three dimensions, in particular if infrared markers are attached to it.
A data provision unit in the form of a storage unit 12, such as an SSD memory, provides preoperative, digital 3D imaging data 3DA in the form of MRI images and/or CT images of the patient P for navigation. The patient P is detected by the tracking system 10, for example via infrared markers placed on the patient's head, and is registered to the 3D imaging data 3DA. In this way, the 'real' current data, in particular the stereo image A, are correlated with the 'virtual' 3D imaging data 3DA. The tracking system 10 also detects a possible movement of the patient's body during the intervention and re-registers the patient to the 3D imaging data 3DA.
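The registration of the detected patient markers to the 3D imaging data 3DA may, for example, be performed as a point-based rigid registration. The following sketch assumes that corresponding fiducial positions are available both in the tracking-system frame and in the image frame and uses the standard SVD-based (Kabsch) solution; it is one possible implementation, not the method prescribed by the disclosure.

```python
import numpy as np

def register_points(p_tracking, p_image):
    """Rigid registration (4x4 matrix) mapping fiducial points measured by
    the tracking system (Nx3) onto the corresponding points in the 3D
    imaging data (Nx3), via the SVD/Kabsch method."""
    c_t, c_i = p_tracking.mean(axis=0), p_image.mean(axis=0)
    H = (p_tracking - c_t).T @ (p_image - c_i)       # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = c_i - R @ c_t
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T   # maps tracking-system coordinates into image coordinates
```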
In the present case, a control unit 14 of the navigation system 1 is specially adapted to process the data of the imaging device 6, the data of the tracking system 10 and the 3D imaging data 3DA provided and, by linking the tracing of the tracking system 10 to the imaging head 8 and the tracing of the imaging head 8 to the instrument 4, to determine a pose of the instrument 4 to be tracked and also a position of an instrument tip 16 of the instrument 4. Specifically, the tracing of the imaging head 8 (microscope head) by the tracking system 10 provides a first transformation matrix in order to infer the current local coordinate system of the imaging head 8 from the local coordinate system of the tracking system 10, in this case a local coordinate system of the external (stereo) camera. The tracing (tracking) of the instrument 4 by the imaging head 8 in turn provides a second transformation matrix from the local coordinate system of the imaging head 8 to the local coordinate system of the instrument 4. The control unit 14 processes this first and second transformation matrix into a total transformation matrix from the local coordinate system of the external (stereo) camera to the instrument 4 itself.
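Expressed as homogeneous 4 × 4 matrices, this serial linking is a simple matrix product; the short sketch below (variable names illustrative) composes the first and second transformation matrix into the total transformation matrix.

```python
import numpy as np

# T_tracker_head:     first transformation matrix  (tracking system 10 -> imaging head 8)
# T_head_instrument:  second transformation matrix (imaging head 8 -> instrument 4)
def total_transform(T_tracker_head, T_head_instrument):
    """Serially link the two independent tracings into the total
    transformation from the tracking-system frame to the instrument frame."""
    return T_tracker_head @ T_head_instrument
```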
In addition, the patient P was registered by the tracking system 10, i.e. a relation of a local coordinate system of the patient P to the local coordinate system of the external (stereo) camera was detected, which is provided to the control unit 14 as a patient-side transformation matrix. In addition, the patient P is registered to the 3D imaging data 3DA, so that (ideally) the real patient P corresponds to the 3D imaging data 3DA.
The total transformation matrix from the instrument 4 (via imaging head 8) to the external camera, and further from the external camera to the patient P or to the 3D imaging data 3DA, allows the pose (position and orientation) of the instrument 4 to be tracked to be transferred to the 3D imaging data 3DA and displayed accordingly for navigation. The control unit creates a correlation representation which, on the one hand, has the 3D imaging data 3DA registered for the patient P and, on the other hand, also displays the position and/or orientation of the instrument 4, in particular a virtual geometric model of the instrument 4, in the 3D imaging data 3DA (position-correct or orientation-correct, in particular pose-correct). This correlation representation is then output by the OR monitor and the surgeon can see at any time where the instrument 4 with its instrument tip 16 is located in the patient P, even if the instrument tip 16 is not visible.
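Assuming the patient-side registration has been combined into a single transformation from the tracking-system frame into the coordinate system of the 3D imaging data 3DA (for example as sketched after the registration step above), the instrument pose and its tip can then be expressed in the image coordinates; the following chaining is a minimal, illustrative sketch.

```python
import numpy as np

def instrument_in_image(T_image_tracker, T_tracker_head, T_head_instrument,
                        tip_in_instrument=np.array([0.0, 0.0, 0.0, 1.0])):
    """Map the tracked instrument (and optionally its tip, given in the
    instrument's own frame) into the coordinate system of the registered
    3D imaging data."""
    T_image_instrument = T_image_tracker @ T_tracker_head @ T_head_instrument
    tip_in_image = T_image_instrument @ tip_in_instrument
    return T_image_instrument, tip_in_image[:3]
```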
By combining two tracings, the surgeon can be provided with particularly flexible and safe navigation. Usually, a field of view of the microscope 6 is perpendicular to an intervention region E with the tissue, so that there is virtually no occlusion by other objects. Standardized instruments 4 to be tracked may also be used for navigation during the intervention, since no special pointer instruments are required due to the good visual detection by the microscope 6 with the associated possibility of tracing.
The tracing of the instrument 4 by the imaging head 8 is explained in more detail below.
Thus, by linking the tracing of the instrument 4 by the imaging head 8 on the basis of an image analysis and the tracing of the imaging head 8 by the tracking system 10 and the 3D imaging data 3DA registered for the patient P, the pose of the instrument 4, in particular the position of the instrument tip 16, can be displayed in the 3D imaging data in a particularly simple and reliable manner.
Specifically, a QR code 24 is engraved in a distal end region 22 of the instrument 4 on a lateral surface/side surface 26 of the instrument 4. This allows a standardized instrument 4 to be subsequently modified and adapted, for example, whereby only an engraving that is particularly simple and quick to produce, for example via a laser, needs to be made. The QR code 24 contains encoded information that the control unit can decode and interpret. On the one hand, a distance in cm can be encoded directly in the QR code 24, so that, independently of an evaluation program, the position of the instrument tip 16 can be inferred directly from the position of the QR code 24, offset by that distance along a longitudinal axis 28 of the instrument 4; alternatively, the QR code can contain an ID as a reference in order to infer the position of the instrument tip 16 via data stored in the storage unit 12, which also contain the distance from the QR code 24 to the instrument tip 16. In this way, the instrument tip 16 can also be displayed in the 3D imaging data 3DA without direct visual contact, in order to support the surgeon in navigation.
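Assuming the position of the QR code 24 and the direction of the longitudinal axis 28 have been detected in the imaging-head frame, the tip follows from the decoded distance by a simple offset along that axis; the sketch below is illustrative only, and all values are examples.

```python
import numpy as np

def tip_from_qr(qr_position, axis_direction, distance_to_tip):
    """Instrument tip position: detected QR code position plus the decoded
    distance along the (unit-normalised) longitudinal axis of the instrument."""
    axis = axis_direction / np.linalg.norm(axis_direction)
    return qr_position + distance_to_tip * axis

tip = tip_from_qr(np.array([0.02, -0.01, 0.15]),   # QR centre in the imaging-head frame (m)
                  np.array([0.0, 0.0, 1.0]),        # longitudinal axis 28
                  0.045)                             # decoded distance: 4.5 cm
```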
Via the optical markings 20 on portions of the instrument 4 that are particularly easy to see, a position, in particular a pose, of the associated instrument tip 16 can be encoded with respect to the optical marking; in addition, characteristic design features or even geometry information of the instrument 4 can be encoded directly or indirectly via a database.
In a first step S1, preoperative 3D imaging data of the patient P is detected.
In a second step S2, the patient P is registered to the 3D imaging data 3DA.
Preferably, in a step S3, the imaging device 6 (visualization system) or the imaging head 8 is oriented to the intervention region E on the patient P.
In a step S4, the imaging head of the imaging device is localized and tracked by the tracking system.
In a step S5, a (three-dimensional) pose (position and orientation) of the instrument relative to the imaging head 8 is determined via the imaging device, in particular via the stereo image.
Preferably, in a further step, the surgical instrument 4 can be guided by the surgeon in the intervention region E.
Preferably, in a step S6, the patient is detected and localized three-dimensionally by the tracking system 10.
In a subsequent step S7, the position and/or orientation of the instrument 4 relative to the patient P and thus to the previously registered 3D imaging data 3DA is calculated.
Finally, in a step S8, a correlation representation is generated and the position and/or orientation of the instrument 4 is displayed in the 3D imaging data 3DA and output via the monitor.
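The steps S4 to S8 may be summarised as the following schematic per-frame processing loop; this is a non-binding sketch, and all object and method names are placeholders for the operations described above.

```python
import numpy as np

def navigation_loop(tracking_system, imaging_head, monitor, volume_3d, T_image_patient):
    """Schematic per-frame navigation update corresponding to steps S4-S8.
    Steps S1-S3 (acquiring the 3D imaging data, registering the patient P,
    orienting the imaging head 8) are assumed to have been performed,
    yielding T_image_patient from the registration."""
    while True:
        T_tracker_head = tracking_system.track(imaging_head)          # step S4
        T_head_instrument = imaging_head.track_instrument()           # step S5
        T_tracker_patient = tracking_system.track_patient()           # step S6
        # Step S7: instrument pose relative to the patient and thus to the
        # previously registered 3D imaging data
        T_patient_instrument = (np.linalg.inv(T_tracker_patient)
                                @ T_tracker_head @ T_head_instrument)
        T_image_instrument = T_image_patient @ T_patient_instrument
        # Step S8: correlation representation output on the monitor
        monitor.render(volume_3d, T_image_instrument)
```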
This application is the United States national stage entry of International Application No. PCT/EP2022/080276, filed on Oct. 28, 2022, and claims priority to German Application No. 10 2021 128 478.3, filed on Nov. 2, 2021. The contents of International Application No. PCT/EP2022/080276 and German Application No. 10 2021 128 478.3 are incorporated by reference herein in their entireties.