This invention relates generally to image-guided surgical interventions. More specifically, the invention relates to ultrasound guidance of surgical interventions and a tracked reference device therefor.
A significant drawback to use of ultrasound images in guiding medical interventions is the general difficulty in recognizing target structures in the images. Moreover, the simultaneous manipulation of the ultrasound transducer and the interventional tool (e.g., a needle) requires considerable skill and experience.
Some interventions (e.g., spinal) are performed under X-ray fluoroscopic or computed tomography (CT) guidance, because the interpretation of X-ray based images is not hampered by muscle and ligament layers between the skin and the target. CT and X-ray-based imaging modalities visualize the target anatomy and the needle much better than ultrasound does, but they involve significantly larger and more expensive equipment than ultrasound, and they introduce ionizing radiation to the patient and, to a larger extent, to the operator who performs these procedures on a regular basis.
Using electromagnetically tracked ultrasound transducers and interventional tools to enhance ultrasound guided interventions with computer navigation has made some procedures accessible for less experienced physicians. Nevertheless, applying electromagnetic tracking in certain procedures, such as spinal interventions, has been hampered because of the difficulty in interpreting spine anatomy in ultrasound images, and in locating relatively small and deep targets under the skin surface. Electromagnetic tracking also suffers from poor accuracy and interference with metal parts in the vicinity of the operating space.
Provided herein is a reference device for surgery, comprising: a base portion, including: a socket that accepts a tracking sensor in a pre-defined orientation; one or more reference divots that accept at least a portion of a surgical intervention tool, the one or more reference divots being substantially transparent to one or more imaging modalities; and a plurality of anatomical direction markers that provide alignment of the reference device with a patient's anatomy.
In one embodiment, the base portion interfaces with a patient's anatomy substantially non-invasively. In another embodiment, the base portion interfaces with an object fixed to the patient's anatomy. In another embodiment, the base portion interfaces with a surface in proximity to a surgical intervention site.
In one embodiment, the socket accepts an electromagnetic tracking sensor that is used as a reference point in tracking at least one of position, orientation, and trajectory of the surgical intervention tool in three-dimensional space. In such embodiments, locations of the one or more reference divots are known with respect to the orientation of the tracking sensor.
Also provided is a method of medical imaging, comprising: disposing a reference device in a selected orientation with respect to an intervention space of a subject, the reference device providing anatomical orientation of tracked medical images within the intervention space; using an ultrasound imaging system to obtain tracked medical images of the intervention space; and using the anatomical orientation provided by the reference device to display the tracked medical images in the intervention space in a perspective that corresponds to an operator's perspective.
The method may further comprise displaying one or more of position, orientation, and trajectory of a tracked intervention tool with respect to the tracked medical images in the intervention space. The method may further comprise verifying at least one of position, orientation, and trajectory of the tracked intervention tool with respect to the tracked medical images in the intervention space, by placing the tracked intervention tool at one or more locations on the reference device, wherein the locations are known with respect to the position of a sensor associated with the reference device.
In one embodiment, verifying further comprises providing an indication to the system when the tracked intervention tool is disposed at each of the one or more locations.
The method may further comprise disposing an electromagnetic sensor in a known position and orientation with respect to the reference device. The method may further comprise aligning a tracked medical image with a volumetric medical image. The method may further comprise displaying the tracked medical images substantially in real time.
In one embodiment, the medical imaging system is an ultrasound imaging system or a tomographic imaging system. In one embodiment, the tracked medical image is an ultrasound image.
Also provided is programmed media for use with a computer, comprising: a computer program stored on non-transitory storage media compatible with the computer, the computer program containing instructions to direct the computer to perform the following steps: obtain tracked medical images of an intervention space from a medical imaging system; and use anatomical orientation provided by a tracked reference device to display the tracked medical images in the intervention space in a perspective that corresponds to an operator's perspective.
For a greater understanding of the invention and to show more clearly how it may be carried into effect, embodiments are described below, by way of example, with reference to the accompanying drawings, wherein:
Embodiments described herein provide rapid (e.g., substantially instantaneous or real-time) tracking at the intervention site of an intervention tool, thereby improving the accuracy of surgical interventions and helping physicians avoid adverse events.
One aspect of the invention provides a hardware reference device that enhances image-guided interventions. The reference device is tracked by the system and used to verify the accuracy of the intervention tool (i.e., a surgical tool) placement before and during the intervention. The reference device holds a reference sensor (e.g., electromagnetic (EM) sensor) in a position aligned with patient anatomy. This is used to show the ultrasound images in the correct orientation to the operator, aiding in target recognition and better navigation.
An embodiment of the tracked reference device is shown in
In one embodiment the base portion 30 may non-invasively interface with the patient's anatomy. The base portion 30 may have a surface that is generally shaped to fit on the exterior anatomy of the patient in the vicinity or region of the patient where the intervention is to take place. For example, the base portion 30 may have a curved surface, for use on a patient's skull. In the embodiment of
Features of the tracked reference device include one or more anatomical direction markers, a socket that accepts or accommodates a tracking sensor in a pre-defined orientation, and one or more reference divots that accept at least a portion of the intervention tool during verification. In general, these features are disposed in or on the base portion. The divots may be sized or shaped to accept a specific tool, such as, e.g., a needle. The divots may be sized or shaped to accept a specific position and/or orientation of a tool. In one embodiment the divots are transparent or substantially transparent to one or more imaging modalities such as ultrasound and tomography. The embodiment of
The tracked reference device may be used with an imaging system, an embodiment of which is shown in
Accurate navigation of the intervention tool 16 ensures that the tool is close to a target when the virtual tool tip is at the target point on the navigation computer display. The system helps prevent loss of navigation accuracy and mitigates the risk of tool misplacement. The system may be configured to warn the operator in case of insufficient accuracy before the needle insertion.
In one embodiment, virtual camera alignment in the navigation display is achieved by a series of coordinate transforms, an embodiment of which is illustrated in
The navigation system uses the sensed positions in the reference sensor coordinate system to present virtual models of the ultrasound image, the intervention tool, and optionally additional patient images to serve tool navigation needs. Assessment of tool tracking accuracy before insertion into the patient is performed using the reference divots 34 on the reference device 12. Known (P) and tracked (P′) positions of the tool relative to the reference sensor are compared (
The maximum acceptable difference between known and tracked tool tip positions depends on the size of the target. For example, typical needle targets in the spine require an accuracy of 1-3 mm.
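The divot-based verification described above reduces to a simple distance test between the known (P) and tracked (P′) tip positions. A minimal sketch follows; the divot coordinates and the `verify_tool` helper are hypothetical illustrations, with the 3 mm tolerance taken from the spinal targets discussed in the text:

```python
import math

# Hypothetical divot coordinates (mm) in the reference-sensor frame;
# actual values depend on the reference device geometry.
KNOWN_DIVOTS = {
    "divot_1": (10.0, 0.0, 0.0),
    "divot_2": (0.0, 10.0, 0.0),
}

# Maximum acceptable tip error for typical spinal targets (1-3 mm range).
MAX_ERROR_MM = 3.0

def tip_error_mm(known, tracked):
    """Euclidean distance between known (P) and tracked (P') tip positions."""
    return math.dist(known, tracked)

def verify_tool(tracked_positions):
    """Return True if every divot check is within tolerance."""
    for name, known in KNOWN_DIVOTS.items():
        err = tip_error_mm(known, tracked_positions[name])
        if err > MAX_ERROR_MM:
            print(f"WARNING: {name} error {err:.2f} mm exceeds {MAX_ERROR_MM} mm")
            return False
    return True
```

In practice, the operator would touch each divot with the tracked tool and the system would run this check before permitting insertion.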
Another aspect of the invention comprises a method that enhances ultrasound-guided interventions. The method works with an ultrasound scanner and a surgical intervention tool, both electromagnetically tracked in 3-dimensional space in real-time. The method may be used in conjunction with the tracked reference device described herein to perform verification before and during the surgical procedure. The method may also create a 3-dimensional augmented reality computer scene with the ultrasound image and the 3-dimensional model of the intervention tool. A feature of the method is that the tracked medical images in the intervention space are displayed in a perspective that corresponds to an operator's perspective.
At least a portion of the method may be implemented in software, including, for example, an algorithm, and stored on non-volatile computer storage media, and run on a suitable computer. The computer may be part of an imaging system. In one embodiment, the imaging system is part of a tracked ultrasound-guided intervention tool navigation system.
As described herein, a target (i.e., an intervention site) is identified in the computer guidance scene, and therefore the intervention tool can be introduced to the target using the computer scene, rather than via direct, live ultrasound imaging. This focuses the attention of the operator to the tool insertion, and ensures higher accuracy even at an early stage of the operator learning curve.
When a pre-operative tomographic image is available for the patient, the reference device allows alignment of the tomographic image with the ultrasound tracking coordinate system, which results in fusion of tomographic and ultrasound images. The tracked reference device ensures correct orientation of the ultrasound image; therefore the dimensionality of the alignment space is reduced to four degrees of freedom (translation+rotation around the left-right axis) from the original six degrees of freedom (including two other rotation axes). 3-D translation alignment with one rotation can be performed robustly and quickly. In such a way, fused ultrasound-tomography images may be made available for insertion planning in a routine procedure.
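Because the reference device fixes two of the three rotation axes, the alignment problem reduces to one rotation angle about the left-right axis plus a 3-D translation. One way such a 4-DOF fit might be computed is a grid search over the single angle, with the translation following in closed form from the landmark centroids; this is an illustrative sketch, not the implementation described in the text:

```python
import numpy as np

def rot_x(theta):
    """Rotation about the left-right (x) axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def align_4dof(ct_pts, us_pts, n_steps=3600):
    """Fit theta (rotation about x) and translation t mapping CT landmarks
    onto US-tracked landmarks; returns (theta, t, rms_error)."""
    ct, us = np.asarray(ct_pts, float), np.asarray(us_pts, float)
    best = None
    for theta in np.linspace(-np.pi, np.pi, n_steps, endpoint=False):
        R = rot_x(theta)
        # For a fixed rotation, the least-squares translation matches centroids.
        t = us.mean(axis=0) - R @ ct.mean(axis=0)
        resid = (ct @ R.T + t) - us
        rms = np.sqrt((resid ** 2).sum(axis=1).mean())
        if best is None or rms < best[2]:
            best = (theta, t, rms)
    return best
```

A one-dimensional search like this is robust and fast, which reflects the text's point that reducing the problem from six to four degrees of freedom makes routine alignment practical.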
The invention is further described by way of the following non-limiting examples.
Pedicle screw placement is considered the standard of care in many spinal deformation diseases. Registration of a preoperative CT with an intraoperative stereotactic guidance system can completely eliminate ionizing radiation during pedicle screw placement, while the accuracy and success of pedicle screw placement remain excellent. This registration method requires landmark localization in both the CT and the intraoperative tracking coordinate systems. These landmarks determine the transformation that fuses the preoperative CT with the intraoperative virtual reality navigation scene. In this study, a tracked ultrasound snapshot (TUSS) technique was used with a tracked reference device to find these landmarks through non-invasive ultrasound (US) imaging. The tracked reference device may be a device as described above and shown in
Automatic CT to US image registration methods are promising alternatives to manual landmarking of US images. However, a method to compute a reliable registration transform on all reported experimental test cases at a satisfactory accuracy is not known. Since intraoperative conditions could further reduce the success rate of automatic methods, manually defined landmarks were considered the most accurate available CT registration method for this procedure.
Pedicle screw positions were planned using a preoperative CT scan. The plans were later registered to the surgical navigation coordinate system using TUSS landmarks. The registration was evaluated based on clinical safety parameters of the registered pedicle screw plans in two patient-based phantom models.
The surgical workflow is shown in
The intraoperative navigation system was as shown in
The registration workflow was carried out in two patient-based lumbar spine models. One model was based on healthy anatomy and the other on degenerative spine disease. The tests involved L2-L5 segments in each spine model, with two pedicle screw plans in each vertebra.
Two rapid prototyped spine segments of L2-L5 were used for the evaluation of the TUSS-based pedicle screw plan registration. The spine models were generated by manually contouring healthy and degenerative spine CT scans. Planning of the pedicle screws was done using four points in the CT image of each pedicle (
Registration from the CT image to the surgical navigation scene was done using anatomical landmark points on vertebrae. For this, landmarks (e.g., articular processes of vertebrae) were identified that were visible in both CT and intraoperative US images.
Lumbar spine images of 10 human subjects were examined to verify visibility of anatomical landmarks on US images. The study protocol was approved by the Health Sciences Research Ethics Board at Queen's University. Written informed consent was obtained from subjects prior to participation in the study. The clinical parameters of the examined population are shown in Table 1. Registration landmarks were defined as the most posterior points of the four articular processes of each vertebra.
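Landmark-based rigid registration of this kind is commonly computed with the Kabsch/Horn SVD method, which gives the least-squares rotation and translation between two paired point sets. The study's exact algorithm is not stated; the following is a standard sketch:

```python
import numpy as np

def register_landmarks(ct_pts, trk_pts):
    """Least-squares rigid transform (R, t) mapping CT landmark points
    onto tracker-space landmark points (Kabsch/Horn SVD method)."""
    A, B = np.asarray(ct_pts, float), np.asarray(trk_pts, float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)               # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t
```

With the four articular-process landmarks per vertebra defined in both CT and tracked US, this yields the transform that registers the preoperative plan to the navigation scene.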
Finding the articular processes with US imaging can be a difficult task. Therefore, an axial tracked US snapshot was taken to help find the intersecting sagittal US planes that correspond to the facet joint regions, as shown in
The selected four registration landmarks were visible in all 10 human subjects, and in all patient-based simulation phantoms. All vertebrae in the two phantom models were successfully registered using US landmark points.
Translational errors were measured at the center of the screw plan, which was positioned near the center of the pedicles during the planning phase. Orientation errors were decomposed into three Euler angles using the left-right, posterior-anterior, and inferior-superior anatomical axes.
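Decomposing an orientation error into three Euler angles about the anatomical axes might look like the following, assuming an R = Rz·Ry·Rx convention with x = left-right, y = posterior-anterior, z = inferior-superior (the convention actually used in the study is not stated):

```python
import numpy as np

def euler_xyz(R):
    """Decompose a rotation matrix into Euler angles about the left-right (x),
    posterior-anterior (y), and inferior-superior (z) anatomical axes,
    assuming the composition R = Rz @ Ry @ Rx."""
    ry = np.arcsin(-R[2, 0])          # valid away from gimbal lock
    rx = np.arctan2(R[2, 1], R[2, 2])
    rz = np.arctan2(R[1, 0], R[0, 0])
    return rx, ry, rz
```

Applied to the rotational part of the registration error, this yields the per-axis angular errors reported for each screw plan.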
Translational error in the coronal plane of individual screw centers was plotted (
The results confirm that TUSS is a useful tool in pedicle screw navigation, potentially improving the safety and reducing ionizing radiation in spinal fusion surgeries. Landmarks on TUSS images provide sufficient information to register the preoperative screw plans with the surgical navigation system. The translational errors were not uniform in all directions, and the deviation of positions was largest in the inferior-superior anatomical direction. This may be attributed to the elongated shape of the facet joints in the same direction, because facet joints were used as landmarks for US-CT registration. However, the errors were minor and would not detrimentally affect the intervention outcome in a patient. Moreover, the method avoids or substantially reduces the requirement for X-ray, thereby reducing radiation burden on operators and costs.
This example provides a spinal needle insertion navigation system using tracked US snapshots (TUSS) that allows US-guided needle insertions without holding the US probe at the insertion site. The TUSS navigation software platform enables rapid development of image-guided needle placement applications, as well as other interventions, using tracked US for various anatomical targets and clinical indications. TUSS navigation was tested by five orthopedic surgeon residents in this study, guiding facet joint injections in cadaveric lamb and synthetic human spine models. Also reported is the targeting accuracy of the navigation system and a comparison with freehand US-guided needle placement.
The navigation system consisted of a data acquisition and a visualization component. These components used network communication, and were run on two separate computers: the US machine collected image and tracking data, and the navigation computer was responsible for visualization. The system is as shown in
Images were acquired using a SonixTouch (Ultrasonix, Richmond, BC, Canada) US machine with a GPS extension. The GPS extension used the DriveBay EM position tracker (Ascension Technology Corporation, Milton, Vt., USA) with an adjustable arm to conveniently hold the EM transmitter close to the target area. An L14-5GPS linear array US transducer (Ultrasonix) and a 19-gauge nerve block needle (Ultrasonix) were tracked using built-in pose sensors. An additional Model 800 EM tracking sensor (Ascension Technology Corporation) attached to the target phantom or specimen served as the coordinate reference. Alternatively, a tracked reference device as described above with respect to
The software components of the navigation system included the PLUS (Public Library for Ultrasound) open-source software package to operate the US machine and the electromagnetic tracker. PLUS provides an abstraction layer for specific hardware programming interfaces and, importantly, it synchronizes the image and tracker data streams. The OpenIGTLink broadcaster application of the PLUS package was used to send the tracked US image frames to the navigation computer through the OpenIGTLink communication protocol (Tokuda, J., et al. “OpenIGTLink: an open network protocol for image-guided therapy environment”, Int. J. Med. Robot. 5, No. 4 (Dec 2009):423-434).
The navigation computer received the tracked US images, and provided the graphical user interface for needle guidance. The navigation software was implemented as an interactive module for the 3D Slicer application framework. This module, named LiveUltrasound, is shared under the open-source license of 3D Slicer. It provides real-time visualization of the tracked US images and the tracked needle in the three-dimensional graphical views of 3D Slicer, as well as the ability to take tracked US snapshots for TUSS guidance.
The navigation software provided needle guidance along an insertion plan. The plan was defined in 3D Slicer by the entry point and target point, i.e., the planned location of the needle piercing the skin and the planned final needle tip position, relative to the tracked US image. The dual 3-D view layout with an insertion plan is shown in
The orientations of the bull's-eye and progress views were aligned with the position of the operator, with respect to the patient (
A total of five orthopedic surgery residents participated in this study as operators to test the TUSS-guided needle navigation. None of the operators had used any form of tracked US needle guidance before performing the experiments. This study was approved by the Queen's University Health Sciences and Affiliated Teaching Hospitals Research Ethics Board.
Ultrasound-guided facet joint injection was not performed routinely by the operators; therefore, they had to learn how to identify the facet joint in the synthetic human spine and cadaveric lamb model. The phantom and the lamb cadavers were scanned using a GE LightSpeed CT scanner (GE Healthcare, Chalfont St. Giles, UK), at an image resolution of 512×512 pixels and 0.625 mm slice distance. Bone surface models were extracted from the CT volumes using an intensity threshold. The surface model was registered and visualized together with the tracked US during the training. Surface markers on the synthetic human spine phantom, and nonferromagnetic metal screws in the cadaveric lamb models, were used as landmarks for rigid registration between the CT image and the EM position tracking system. During deliberate practice, the 3-D bone surface models were overlaid on the tracked US image in the navigation scene for the operators to learn the position of the facet joints in US with respect to the 3D anatomy. The training session did not involve handling of the tracked needle.
Each needle insertion procedure consisted of three main phases (
In the verification phase, two orthogonal X-ray images were acquired using a GE OEC 9800 fluoroscopy system (GE Healthcare, Chalfont St. Giles, UK) to assess the true needle tip position relative to the planned target. This phase is expected to be eliminated from the workflow, once sufficient evidence proves the reliability of TUSS guidance.
Tracked US snapshot navigation of needle insertion was studied in three experimental setups. Each experiment focused on different aspects of the navigation method. Table 3 summarizes major features of the experiments.
First, targeting accuracy was studied using small artificial targets without anatomical landmarks. Copper spheres of 1.6 mm diameter were placed in acoustically clear Plastisol gel (M-F Manufacturing Company, Inc., Fort Worth Tex.). The needle tip was navigated to these targets using TUSS, and its distance from the surface of the copper spheres was measured using orthogonal X-ray projection images. Second, feasibility in human anatomy was tested using a synthetic, rapid prototyped spine model, placed in Plastisol gel. Cellulose (15 g/l) was mixed into the gel to simulate acoustic speckle of real soft tissue. The spine model was painted with X-ray contrast material (barium-sulphate) to show contrast on fluoroscopic images. The needle was navigated to the facet joints of this spine model using TUSS. Success or failure of needle placement was assessed using two X-ray projection images by a radiologist, blinded to the identity of operators. The registered bone surface model with tracked needle positions was also available during verification of insertion outcomes. For example,
Third, feasibility in biological tissue was tested using two fresh cut lamb lumbar spine regions. Tracked needles were navigated to the facet joints of the spine using TUSS. In order to assess the difference between TUSS-based navigation and freehand US-guided needle placement without position tracking, the cadaveric lamb model facet joint needle insertions were repeated in the same model without TUSS by all operators. Success of each insertion was assessed in the same way as in the synthetic human spine model. Needle insertions in the synthetic human spine phantom and the lamb model were carried out in groups to reduce experiment time. TUSS images were taken from the tracked live US stream for facet joints of five consecutive anatomical segments. Single mouse pointer clicks on these snapshots in the 3-D views were used to define target and entry points for the needle insertion plans (
Targeting error in the accuracy tests was defined as distance of the needle tip from the surface of targeted copper spheres. Insertion time was defined as time from the definition of the insertion plan in the navigation software until the final placement of the needle. Success in facet joint needle placement was defined as the radiographic image of the needle tip being between the articular processes in the postero-anterior fluoroscopic view, and overlapping the articular processes in the lateral view.
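The targeting-error definition above (distance from the needle tip to the surface, not the center, of the targeted sphere) is straightforward to express; the helper below is an illustration of that definition:

```python
import math

SPHERE_DIAMETER_MM = 1.6  # copper target spheres used in the accuracy test

def targeting_error_mm(tip, center):
    """Distance from the needle tip to the surface of the targeted sphere
    (zero when the tip touches the surface exactly)."""
    return abs(math.dist(tip, center) - SPHERE_DIAMETER_MM / 2)
```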
Targeting error and insertion time were expressed as mean±standard deviation. The success rates of needle insertions were expressed as percentages. Linear regression was used to analyze trends in targeting error and procedure time with repeated needle insertions. Success rate between TUSS navigation and the freehand US-guided method was compared using a Chi-square test. Significance was defined as p<0.05 in all statistical tests.
System accuracy and the human anatomy feasibility tests were executed by three operators. Thirty needles were successfully positioned for accuracy testing. Targeting error was 1.03±0.48 mm. Maximum targeting error was 1.93 mm. Time from needle plan definition until final needle placement was 42.0±9.17 s. Maximum insertion time was 60 s. Targeting error did not change significantly as the number of needle insertions increased within operators (
Facet joint needle placements in the synthetic human spine phantom were successful at first attempt in 29 insertions out of the total 30 insertions (96.7%) by three operators (10 facet joints each). In the case of the single missed facet joint, post-procedure analysis confirmed that the needle was placed at the planned position; however, the operator confused the facet joint with the gap between the vertebral lamina and the transverse process.
Cadaveric lamb facet joint needle placements were completed by all the five operators. TUSS guidance resulted in a success rate of 47 out of 50 cases (94%) as confirmed by post-insertion orthogonal fluoroscopic images. With freehand US-guided needle placement, success rate was 44% (22 of 50), which is significantly lower (p<0.001) compared to TUSS-guided insertions. Furthermore, the insertion time was significantly less (36.1±28.7 s) with TUSS guidance compared to freehand US guidance (47.9±34.2 s).
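The reported chi-square comparison can be reproduced from the success counts (47/50 for TUSS vs. 22/50 freehand). A sketch using the standard 2×2 shortcut formula, without continuity correction (the study's exact test variant is not stated); the statistic far exceeds the df = 1 critical value of 10.83, consistent with p < 0.001:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]
    (success/failure counts by guidance method)."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Reported counts: TUSS 47 successes / 3 failures, freehand 22 / 28.
stat = chi_square_2x2(47, 3, 22, 28)
# stat is about 29.2, well above the 10.83 critical value for p = 0.001.
```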
The results show that TUSS-navigated facet joint needle insertion achieved a high success rate in a patient-based synthetic human spine phantom, and a significantly higher success rate than freehand needle insertion in a cadaveric lamb model. These results suggest that EM-tracked facet joint injections may be routinely performed without ionizing radiation imaging. Post-insertion fluoroscopic analysis and registration with CT-based bone surface models revealed that all of the few missed needle placements were due to inaccurate US localization of the facet joint by the operators. This indicates the importance of training before the procedure is introduced in clinical practice. Identification of the facet joint by US is not a straightforward task even with a profound knowledge of the spinal anatomy. Operators in this study had no prior experience in US-guided facet joint injections and did not practice other forms of US-guided needle insertions on a daily basis.
Ultrasound guidance methods use landmarks on the images that can be identified with high confidence. Since US provides only a limited view of the underlying structures, the needle path is planned relative to the landmarks. Selection of the landmarks is not limited to one US slice. Landmark points (e.g., fiducials) in the 3D Slicer software can be placed, named, and highlighted in US slices of different orientations. These landmarks can be observed for needle navigation in different 3-D views of the virtual scene, as in the methods described herein. It is expected that these methods are applicable to a broad range of clinical procedures, in addition to the facet joint injections of this example, using anatomical landmarks. For example, for spinal nerve blocks, US guidance has an advantage over more frequently used imaging modalities. That is, US may directly visualize the target nerve, while conventionally used fluoroscopy does not show sufficient soft tissue contrast.
In conclusion, TUSS navigation allows for significantly better success rate and lower insertion time in facet joint injections by medical residents than freehand US needle guidance. Operators achieved good needle placement accuracy immediately as they started to use this guidance technique, which can be attributed to the intuitive user interface. This method may enable US guidance to be routinely used in facet joint injections, improving the safety and accessibility of treatment in patient populations with spine diseases.
In procedures such as the foregoing, use of a reference device in accordance with the described embodiments ensures that the electromagnetic field used for tracking is not distorted, therefore indicating that the needle guidance is accurate. Also, the reference device ensures that the ultrasound image and the tracked tools appear in the navigation computer display aligned with the point of view of the operator. This is essential to make the navigated intervention intuitive for the operator.
The contents of all references, pending patent applications, and published patents cited throughout this application are hereby expressly incorporated by reference.
Those skilled in the art will recognize or be able to ascertain variants of the embodiments described herein. Such variants are within the scope of the invention and are covered by the appended claims.
This application claims the benefit of the filing date of U.S. Patent Application No. 61/791,742, filed on 15 Mar. 2013, the contents of which are incorporated herein by reference in their entirety.
Number | Date | Country
---|---|---
61791742 | Mar 2013 | US

Relation | Number | Date | Country
---|---|---|---
Parent | 14209232 | Mar 2014 | US
Child | 15235392 | | US