This application claims priority to Korean Patent Application No. 10-2017-0028217, filed on Mar. 6, 2017, and all the benefits accruing therefrom under 35 U.S.C. § 119, the contents of which are incorporated herein by reference in their entirety.
The present disclosure relates to an apparatus and method for tracking a location of a surgical tool, and more particularly, to an apparatus and method for tracking a three-dimensional location of a surgical tool based on a location of a marker attached to the surgical tool in a two-dimensional image. In particular, the present disclosure relates to estimating location and shape information of a catheter from a single X-ray image, rather than from several X-ray images, obtained by attaching a marker band to the catheter and photographing the marker band during epidural endoscopy.
This study was supported by Project No. 1415143890 of the Korea Institute for Advancement of Technology and Project No. 10052980 of the Korea Evaluation Institute of Industrial Technology.
These days, back diseases are frequently found not only in elderly people but also in young people. Among back diseases, herniation of an intervertebral disc, commonly known simply as a "disc", demands a surgical operation when it is severe.
Such a surgical operation may impose a serious burden on the patient due to a large incision. Thus, these days, non-invasive epidural endoscopy is more frequently performed. However, in epidural endoscopy, a catheter is inserted into a diseased area by relying on anatomical knowledge obtained from medical images taken before the surgical operation and on two-dimensional X-ray images taken during the surgical operation, and thus surrounding tissues such as major blood vessels, nerves and fasciae may be damaged when the catheter is inserted.
In addition, in order to insert the catheter accurately, X-ray images are taken in several directions during the surgical operation, and thus it is inevitable that both the patient and the doctor are exposed to X-rays for a long time.
To solve this problem, three-dimensional location tracking techniques based on computer vision technology have been introduced for surgical instruments, and Non-patent Literatures 1 and 2 below propose an algorithm for estimating a three-dimensional pose of a surgical instrument by using markers arranged on the same line. However, the proposed method is applied to a rigid endoscope, which is not suitable for various surgical operations because its front end has a limited operating angle.
In order to solve the above problems, embodiments of the present disclosure propose an apparatus and method for estimating a three-dimensional location of a surgical tool from a photographed image while the surgical tool, which has a front end whose operating angle is freely adjustable, is inserted into a surgical site. In more detail, embodiments of the present disclosure propose an apparatus and method for determining a reference point (for example, a center point of a marker band) of a surgical tool in a photographed image.
In one aspect of the present disclosure, there is provided an apparatus for tracking a location of a surgical tool based on a radiographic image, comprising: a photography system configured to photograph a surgical tool having a physical marker frame; and an information processor configured to estimate a three-dimensional location of the surgical tool based on the physical marker frame in the photographed image, wherein the physical marker frame includes three or more marker bands which surround a part of the surgical tool.
In an embodiment, the information processor may detect a center point of each marker band in the image, and the information processor may estimate a three-dimensional location of the surgical tool based on a distance between the detected center point in the image and a center point of a true marker band.
In an embodiment, the surgical tool may include two or more physical marker frames, and the two or more physical marker frames may have axes different from each other.
In an embodiment, an interval between the marker bands may be greater than 1.5 times a width of each marker band.
In an embodiment, the photography system may be a radiography system, and the marker bands may be made of a conductor, and a non-conductor may be provided between the marker bands.
In an embodiment, the radiography system may be an X-ray photography system.
In an embodiment, the surgical tool may be a bendable catheter.
In an embodiment, the apparatus may further include the surgical tool.
In an embodiment, the information processor may generate a surgical tool model corresponding to the surgical tool on a three-dimensional virtual space, based on the estimated three-dimensional location of the surgical tool, and the information processor may display the generated surgical tool model on a display together with the image.
In an embodiment, the information processor may resample the photographed image and determine a center point of each marker band in the resampled image.
In an embodiment, the information processor may generate a virtual marker frame corresponding to the physical marker frame on a three-dimensional virtual space, project the generated virtual marker frame to the photographed image, adjust a location of the virtual marker frame on the three-dimensional space so that the virtual marker frame projected to the image is matched with the physical marker frame, and when the projected virtual marker frame is matched with the physical marker frame, determine a center point of the marker band of the virtual marker frame in the image as a center point of the marker band of the physical marker frame.
In an embodiment, the information processor may adjust the location of the virtual marker frame along a line connecting the photography system and the physical marker frame, on the three-dimensional virtual space.
In another aspect of the present disclosure, there is provided a method for tracking a location of a surgical tool based on a radiographic image, comprising: by a photography system, photographing a surgical tool having a physical marker frame composed of three or more marker bands; by an information processor, detecting a center point of each marker band in the photographed image; and by the information processor, estimating a three-dimensional location of the surgical tool based on a distance between the detected center point and a center point of a true marker band.
In another aspect of the present disclosure, there is provided a method for tracking a location of a surgical tool based on a radiographic image, comprising: by a photography system, photographing a surgical tool having a physical marker frame composed of three or more marker bands; by an information processor, detecting a center point of each marker band in the photographed image; by the information processor, estimating a three-dimensional location of the surgical tool based on a distance between the detected center point in the image and a center point of a true marker band; and correcting the estimated three-dimensional location by using a contour of the surgical tool in the image.
According to an embodiment of the present disclosure, the amount of X-ray exposure may be reduced in comparison with existing approaches, since the three-dimensional location of the surgical tool is tracked using a single X-ray image photographed during the surgical operation. In addition, by using the three-dimensional shape information of the inserted surgical tool, the procedure may be made less invasive and its accuracy and dexterity may be improved.
Hereinafter, embodiments are described in detail with reference to the accompanying drawings and the contents recited therein, but the scope of the present disclosure is not limited to the embodiments.
The terms used herein have been selected from among general terms widely used in the art at present in consideration of their functions, but they may change according to the intention of those skilled in the art, customs, the appearance of new technologies or the like. In addition, in specific cases, the applicant has selected terms at his own discretion, and in such cases their meanings will be described herein. Thus, the terms used herein should be interpreted based on their true meanings and the overall disclosure of this specification, rather than on their simple names.
To this end, the apparatus 1000 for tracking a location of a surgical tool based on a radiographic image according to an embodiment may include a photography system 300 configured to photograph a surgical tool having a physical marker frame, and an information processor 100 configured to estimate a three-dimensional location of the surgical tool 200 based on the physical marker frame in the photographed image. Here, the physical marker frame may include three or more marker bands which surround a part of the surgical tool 200 or are inserted into the surgical tool.
In an embodiment, the surgical tool 200 may have a rod shape with a circular section or a polygonal section having a triangular, rectangular or pentagonal shape.
Moreover, the surgical tool 200 may include one or more physical marker frames 210, 220, and at least two physical marker frames 210, 220 may be disposed so that their axes are oriented in different directions.
In another embodiment, the physical marker frame may surround an outer surface of the surgical tool 200, as shown in the drawings.
In an embodiment, an interval (b) between the marker bands 211 may be greater than 1.5 times the width (a) of each marker band, without being limited thereto. In addition, a length (c) of the physical marker frame 210 may be in the range of several millimeters to several tens of millimeters. The width (a) of the marker band, the interval (b) between the marker bands and the length (c) of the physical marker frame are just examples and may be suitably selected as necessary.
In addition, the marker band 211 may be made of a conductor, for example gold or copper. A region 212 between the marker bands 211 may be made of a non-conductor, for example aluminum, without being limited thereto.
In an embodiment of the present disclosure, the photography system 300 may be a radiography system, preferably an X-ray photography system. The photography system 300 may photograph the surgical tool 200 inserted into a surgical site B and transfer the photographed image to the information processor 100.
In an embodiment, the information processor 100 may estimate a three-dimensional location of the surgical tool 200 based on the physical marker frame in the photographed image. For example, the information processor 100 may detect the center points x0-x2 of the marker bands 211 of the physical marker frame in the image, and estimate a three-dimensional location of the surgical tool based on the detected center points x0-x2 in the image, the distances between the center points C1-C3 of the true marker bands, and a focal distance f of the photography system 300 (a distance from the source of the photography system (e.g., an X-ray source) to the detector). As a premise for this, the marker bands in a marker frame must lie on the same line, and the distances between the marker bands must be known in advance.
In detail, since the center points of the true marker bands lie on a single straight line, the information processor 100 may express the center point Cn of the n-th true marker band as in Equation 1 below, using the center point C1 of the first marker band, the known distance δn between the first and the n-th marker bands, and a direction vector b of the line.
$C_n = C_1 + \delta_n b$  [Equation 1]
In an embodiment, since the three true marker bands lie on a straight line, the information processor 100 may define the two-dimensional image coordinates $(u_n, v_n)$ of each marker displayed on the projective plane 50 as in Equation 2 below, by using the previously measured distances $\delta_1, \delta_2$ between the marker bands (here, each distance may be a distance between the center points of the true marker bands) and the focal distance.
Here, f represents the focal distance of the photography system (for example, in the case of an X-ray photography system, the distance from the X-ray source to the detector).
Equation 2 may be expressed as in Equation 3 below by using three-dimensional location vectors $p_1, p_2, p_3$ and direction vectors $b_1, b_2, b_3$.
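Equations 2 and 3 themselves are not reproduced in this text. Purely as an illustrative sketch under a standard pinhole projection model, writing $p = C_1$ for the location of the first marker center and $b$ for the unit direction vector of the line (the exact grouping into the vectors $p_1, p_2, p_3$ and $b_1, b_2, b_3$ of Equation 3 may differ from this sketch), the projection of each marker center and its linear rearrangement may take the following form:

$u_n = f\,\dfrac{p_x + \delta_n b_x}{p_z + \delta_n b_z}$,  $v_n = f\,\dfrac{p_y + \delta_n b_y}{p_z + \delta_n b_z}$,  for $n = 1, 2, 3$

$f \delta_n b_x - u_n \delta_n b_z + f p_x - u_n p_z = 0$,  $f \delta_n b_y - v_n \delta_n b_z + f p_y - v_n p_z = 0$

Stacking the six linear equations of the three markers gives the matrix form $Ab + Bp = 0$, whose constrained least-squares formulation is Equation 4 below.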
Equation 3 may be expressed as the optimization problem of Equation 4 below. The information processor 100 may perform singular value decomposition to solve Equation 4 and, as a result, obtain a symmetric matrix E as in Equation 5 below. Here, the eigenvector corresponding to the minimum eigenvalue of the obtained E represents the relative direction vector b.
After that, the information processor 100 may calculate a three-dimensional location vector p by means of the optimized eigenvector b and Equation 6 below.
$\min \lVert Ab + Bp \rVert$ subject to $b^{T}b = 1$  [Equation 4]
$E = A^{T}\left(I - B(B^{T}B)^{-1}B^{T}\right)A$  [Equation 5]
$p = -(B^{T}B)^{-1}B^{T}Ab$  [Equation 6]
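As a minimal computational sketch (not part of the original disclosure), and assuming the matrices A and B are assembled from the detected image coordinates and the known inter-band distances as in the sketch above, Equations 4 to 6 may be solved by an eigendecomposition of E as follows; the function name and interface are illustrative.

```python
import numpy as np

def estimate_marker_line(uv, deltas, f):
    """Estimate the direction vector b and location vector p of a line of
    collinear marker-band centers from their detected 2D image coordinates.

    uv     : (N, 2) array of detected center points (u_n, v_n) in the image
    deltas : (N,) known distances delta_n of each band center from the first (delta_1 = 0)
    f      : focal distance of the projection (source-to-detector distance)

    Illustrative sketch only; variable names follow Equations 1 to 6.
    """
    uv = np.asarray(uv, dtype=float)
    deltas = np.asarray(deltas, dtype=float)

    # Two linear equations per marker, stacked as A b + B p = 0
    A_rows, B_rows = [], []
    for (u, v), d in zip(uv, deltas):
        A_rows.append([f * d, 0.0, -u * d])
        A_rows.append([0.0, f * d, -v * d])
        B_rows.append([f, 0.0, -u])
        B_rows.append([0.0, f, -v])
    A, B = np.array(A_rows), np.array(B_rows)

    # Equation 5: E = A^T (I - B (B^T B)^-1 B^T) A
    BtB_inv = np.linalg.inv(B.T @ B)
    P = np.eye(B.shape[0]) - B @ BtB_inv @ B.T
    E = A.T @ P @ A

    # Equation 4: b is the unit eigenvector of E with the smallest eigenvalue
    eigvals, eigvecs = np.linalg.eigh(E)
    b = eigvecs[:, np.argmin(eigvals)]

    # Equation 6: p = -(B^T B)^-1 B^T A b
    p = -BtB_inv @ B.T @ A @ b

    # Resolve the overall sign ambiguity so the markers lie in front of the source
    if p[2] < 0:
        b, p = -b, -p
    return b, p
```

With three marker bands, A and B are 6x3 matrices and the problem is determined up to an overall sign, which is resolved in the sketch by requiring a positive depth.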
Through the above process, the information processor 100 may estimate a three-dimensional location of a marker band from the two-dimensional projection image 50 and estimate a three-dimensional location of a surgical tool therefrom. In addition to the above method, the information processor 100 may use various methods in order to estimate three-dimensional locations of true points by using a plurality of points in a two-dimensional image and distances among the true points.
Meanwhile, in the method for estimating a three-dimensional location of a marker band as described above, the estimated three-dimensional location may be accurate when the location of the "photographed center point of the marker band in the projection image" (namely, the locations x0-x2) accurately coincides with the point onto which the center point of the true marker band is projected.
In the present disclosure, center points of the marker bands are determined from the two-dimensional projection image 50 in which the marker bands are photographed, and three-dimensional locations of the marker bands are then determined based on the coordinates of, or intervals between, the determined center points. However, since a marker band has a volume, when the marker band is projected onto the projection image, it appears as a region having a certain area rather than as a point, and the center of this region does not necessarily coincide with the point onto which the center point of the true marker band is projected, particularly when the marker band is oblique to the direction of projection.
Therefore, in order to reduce the error, it is required to find an accurate center point of the marker band in the two-dimensional projection image.
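The following numerical sketch (illustrative only, and not taken from the original disclosure) makes this point concrete: points sampled on the surface of a tilted cylindrical band are projected through an assumed pinhole model, and the centroid of the projected points is compared with the projection of the band's true center point; the two generally differ when the band is oblique to the line of sight.

```python
import numpy as np

def project(points, f):
    """Pinhole projection of 3D points (x, y, z) onto the image plane at distance f."""
    points = np.asarray(points, dtype=float)
    return f * points[:, :2] / points[:, 2:3]

def band_surface_points(center, axis, radius, width, n=2000, seed=0):
    """Sample points on the outer surface of a cylindrical marker band."""
    rng = np.random.default_rng(seed)
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    # Build an orthonormal basis (axis, e1, e2) around the band axis
    tmp = np.array([1.0, 0.0, 0.0]) if abs(axis[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    e1 = np.cross(axis, tmp); e1 /= np.linalg.norm(e1)
    e2 = np.cross(axis, e1)
    t = rng.uniform(-width / 2, width / 2, n)      # position along the band axis
    phi = rng.uniform(0.0, 2.0 * np.pi, n)         # angle around the band axis
    return (np.asarray(center, dtype=float)
            + np.outer(t, axis)
            + radius * (np.outer(np.cos(phi), e1) + np.outer(np.sin(phi), e2)))

f = 1000.0                                  # assumed focal distance (arbitrary units)
center = np.array([20.0, 5.0, 800.0])       # band center placed off the optical axis
axis = np.array([0.3, 0.1, 1.0])            # band axis tilted toward the source

pts = band_surface_points(center, axis, radius=1.0, width=2.0)
centroid_of_silhouette = project(pts, f).mean(axis=0)      # center of the projected area
projection_of_center = project(center[None, :], f)[0]      # projection of the true center
print(centroid_of_silhouette - projection_of_center)       # generally a non-zero offset
```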
In an embodiment, the information processor 100 may resample the photographed image and determine a center point of each marker band in the resampled image. According to an embodiment of the present disclosure, the information processor 100 may process the projection image as follows in order to determine the center point of the marker band more accurately in the projection image photographed by the photography system 300.
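The details of the resampling are not reproduced in this text. As one possible sketch, under the assumption that resampling means interpolating the image intensities along the projected axis of the marker frame and locating the bands as dark segments of the resulting profile, the center points could be obtained roughly as follows; the function name, thresholding rule and parameters are illustrative.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def band_centers_along_axis(image, p0, p1, n_samples=512):
    """Resample a projection image along the projected marker-frame axis and
    return candidate band-center positions (illustrative sketch only).

    image  : 2D array of X-ray intensities (marker bands appear as dark dips)
    p0, p1 : (row, col) image coordinates of the two ends of the projected axis
    """
    p0, p1 = np.asarray(p0, dtype=float), np.asarray(p1, dtype=float)
    ts = np.linspace(0.0, 1.0, n_samples)
    coords = p0[:, None] * (1 - ts) + p1[:, None] * ts       # shape (2, n_samples)

    # Interpolated intensity profile along the axis
    profile = map_coordinates(image, coords, order=1, mode="nearest")

    # Segment the dark dips with a simple threshold; take the weighted centroid
    # of each dip as the band center and map it back to image coordinates.
    mask = profile < profile.mean() - 0.5 * profile.std()
    centers, start = [], None
    for i, inside in enumerate(np.append(mask, False)):
        if inside and start is None:
            start = i
        elif not inside and start is not None:
            seg = np.arange(start, i)
            weights = np.maximum(profile.mean() - profile[seg], 1e-9)
            t_c = np.average(ts[seg], weights=weights)
            centers.append(p0 * (1 - t_c) + p1 * t_c)
            start = None
    return np.array(centers)
```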
In an embodiment, the information processor 100 may generate a virtual marker frame corresponding to the physical marker frame on a three-dimensional virtual space, project the generated virtual marker frame onto the photographed projection image, adjust a location of the virtual marker frame on the three-dimensional space so that the virtual marker frame projected onto the projection image is matched with the projected physical marker frame, and then, when the two are matched in the projection image, determine the center point of the marker band of the virtual marker frame in the image as the center point of the marker band of the physical marker frame.
Also, in an embodiment, the information processor 100 may adjust the location of the virtual marker frame along a line (I1-I3) connecting the photography system 300 and the physical marker frame, on the three-dimensional virtual space.
In an embodiment, the information processor 100 may generate a virtual marker frame (vpm) and determine (or correct) a center point of the marker band in the projection image based on the generated virtual marker frame and the image to which the true marker band is projected.
In an embodiment, the information processor 100 may adjust a location of the virtual marker frame on the three-dimensional space so that the virtual marker frame (vpm) projected in the projection image is matched with the projected marker band. If the virtual marker frame (vpm) projected in the projection image is matched with the projected marker band, locations of the virtual marker frame and the true marker band will also correspond to each other on the three-dimensional space.
Therefore, when the virtual marker frame (vpm) projected in the projection image is matched with the projected marker band, the information processor 100 may determine a location of the virtual marker frame as a location of the true marker band.
In order to perform the above calculation efficiently, the information processor 100 may compare a contour (ct) of the projected marker band with the projected virtual marker frame and determine whether they are matched.
The information processor 100 may match the virtual marker frame (vpm) projected in the projection image and the projected marker band by using the following method.
The information processor 100 may move the virtual marker frame along at least one of the lines I1-I3 between the center point of each marker band and the photography system 300, until the projected virtual marker frame is matched with the projected marker band.
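As an illustrative sketch of this matching step (the similarity measure, the sampling of candidate depths and all names below are assumptions rather than details of the original disclosure), the virtual band or frame can be slid along the source-to-marker ray and scored against a binary mask of the extracted contour, keeping the position whose projection overlaps the mask best.

```python
import numpy as np

def project_point(point, f):
    """Pinhole projection of a 3D point (x, y, z) onto the detector plane at distance f."""
    point = np.asarray(point, dtype=float)
    return f * point[:2] / point[2]

def match_virtual_band(contour_mask, band_model_points, ray_dir, f,
                       depths=np.linspace(400.0, 1200.0, 200)):
    """Slide a virtual band model along the source-to-marker ray and return the
    3D center whose projection best overlaps the extracted band contour.
    Illustrative sketch only; the principal point is assumed at the image center.

    contour_mask      : boolean image, True inside the contour of the projected band
    band_model_points : (N, 3) points describing the virtual band, centered at the origin
    ray_dir           : unit vector from the X-ray source toward the detected band
    """
    h, w = contour_mask.shape
    ray_dir = np.asarray(ray_dir, dtype=float)
    best_center, best_score = None, -1.0
    for depth in depths:
        center = depth * ray_dir                        # candidate 3D band center on the ray
        pts2d = np.array([project_point(center + q, f) for q in band_model_points])
        rows = np.clip(np.round(pts2d[:, 1] + h / 2).astype(int), 0, h - 1)
        cols = np.clip(np.round(pts2d[:, 0] + w / 2).astype(int), 0, w - 1)
        score = contour_mask[rows, cols].mean()         # fraction of projected model points
        if score > best_score:                          # falling inside the band contour
            best_center, best_score = center, score
    return best_center, best_score
```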
Meanwhile, for convenience, it has been described that a virtual marker frame is generated and projected onto the projection image so as to be matched with the marker bands in the projection image. However, in another embodiment, an individual virtual marker band may be generated and projected onto the projection image, and its location may be adjusted so that it is matched with the corresponding marker band in the projection image.
The method for tracking a location of a surgical tool based on an image according to an embodiment of the present disclosure may be implemented using the components of the apparatus 1000 for tracking a location of a surgical tool based on an image as described above. In an example, the method for tracking a location of a surgical tool based on an image may include: by a photography system, photographing a surgical tool having a physical marker frame composed of three or more marker bands; by an information processor, detecting a center point of each marker band in the photographed image; and by the information processor, estimating a three-dimensional location of the surgical tool based on a distance between the detected center point and a center point of an actual marker band.
The above method may be implemented as an application or as program commands executable by various kinds of computer means and recorded on a computer-readable recording medium. The computer-readable recording medium may include program commands, data files, data structures or the like, solely or in combination. The program commands recorded on the medium may be specially designed and configured for the present disclosure or may be known to and available to those skilled in computer software.
The computer-readable recording medium includes, for example, magnetic media such as a hard disk, a floppy disk and a magnetic tape, optical media such as CD-ROM and DVD, magneto-optical media such as a floptical disk, and hardware devices specially configured to store and execute program commands, such as ROM, RAM and a flash memory. The program commands include not only machine code produced by a compiler but also high-level language code executable by a computer using an interpreter. The hardware device may be configured to operate as at least one software module to perform the operations of the present disclosure, or vice versa.
In addition, even though specific embodiments have been illustrated and explained, the present disclosure is not limited to the embodiments described above, and various modifications can be made by those having ordinary skill in the art without departing from the scope of the claims; such modifications should not be understood as separate from the technical features of the present disclosure.
In addition, both an apparatus invention and a method invention have been described in this specification, and the descriptions of both inventions may supplement each other.
Non-patent Literature 1: Christophe Doignon, "An Introduction to Model-Based Pose Estimation and 3-D Tracking Techniques", Scene Reconstruction, Pose Estimation and Tracking, Jun. 2007, pp. 359-382.
Non-patent Literature 2: Christophe Doignon et al., "Pose estimation and feature tracking for robot assisted surgery with medical imaging", Unifying Perspectives in Computational and Robot Vision, 2008, vol. 8, pp. 79-101.