“Not Applicable”
This invention relates generally to instruments for image guided diagnostic and therapeutic procedures within the body of a patient, and more particularly to guides that provide the user with intuitive visual directions to facilitate precise movement of an instrument to a desired target within the body of the patient.
Image guided diagnostic and therapeutic procedures use intermittent or constant (real-time) imaging to guide an instrument to a target lesion. Typically, images are acquired as 3-D volume digital data, while the operator views the images in multi-planar 2-D. This has proven to be the best way for the human brain to deal with the 3-D constructs required for accurate needle placement into a target lesion within a human body. These “live” (either real-time or very near-term, “current”) image data may be enhanced by fusion with previously acquired volume image data if there is a valid method to reference and accurately register the data sets. Electromagnetic (EM) fields and sensors have become accepted as one method for establishing a valid frame of reference for volume image registration and instrument/needle tracking. For example, a small lesion in the prostate may be recognized on an MRI scan using an endo-rectal coil, but may not be visible on ultrasound of the same prostate using an endo-rectal ultrasound transducer for ultrasound guided needle biopsy. If an EM field generator is placed adjacent to the patient's pelvis and a 6 degree of freedom (DOF) EM sensor is affixed to the transducer, fusion of the prior MRI scan volume data to the real-time ultrasound image(s) can be achieved as follows: software in the ultrasound machine allows input of the MRI volume images in DICOM format. This volume MR data is then registered to the transducer and the real-time ultrasound image(s) via a series of anatomic reference points common to both imaging modalities that are entered into the software to achieve anatomic registration (typically obtained from real-time 2-D ultrasound images with the electromagnetic sensor attached to the imaging transducer in a known position and orientation, or by an implanted or externally placed 3-D reference marker).
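The anatomic-point registration just described is, in essence, a least-squares rigid alignment of corresponding landmark points in the two modalities. By way of a non-limiting sketch (the well-known Kabsch/Procrustes method; the function and variable names are illustrative assumptions, not taken from any particular tracking system's software):

```python
import numpy as np

def register_points(mri_pts, em_pts):
    """Least-squares rigid transform (R, t) mapping MRI-space landmark
    points onto their EM/ultrasound-space counterparts (Kabsch method)."""
    mri_pts = np.asarray(mri_pts, dtype=float)
    em_pts = np.asarray(em_pts, dtype=float)
    mri_c = mri_pts.mean(axis=0)   # centroid of each point set
    em_c = em_pts.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (mri_pts - mri_c).T @ (em_pts - em_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection appearing in the SVD solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = em_c - R @ mri_c
    return R, t
```

A marked MRI point p is then carried into the EM frame as R @ p + t; in practice at least three non-collinear landmark pairs are required for a unique solution.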
The combination of the EM field and sensor becomes the 3-D frame of reference common to both the prior MRI volume image data and the current real time ultrasound images because the position of the ultrasound transducer is being tracked in real-time by the EM sensor. Thus, the 2-D ultrasound images may be fused in real time with the fully registered 3-D volume data of the MRI scan. In this way, ultrasound may be used to “see” the otherwise invisible target area because it is visible on the overlying MR images. Most importantly for purposes herein, additional EM sensors may be attached to other instruments such as needles for 3-D image guided navigation within this common frame of reference.
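The fusion itself amounts to resampling the registered MR volume on the plane of the current, EM-tracked ultrasound frame. A simplified nearest-neighbor sketch follows; the transform name, pixel spacing, and nearest-neighbor sampling are assumptions for illustration only:

```python
import numpy as np

def mr_slice_for_us_plane(mr_volume, T_us_to_voxel, h, w, pixel_mm=1.0):
    """Resample the MR volume on the plane of the current ultrasound
    frame.  T_us_to_voxel is the 4x4 homogeneous transform, obtained
    from the EM registration, taking in-plane ultrasound coordinates
    (in mm, with z = 0) to MR voxel indices.  Returns an (h, w) image."""
    rows, cols = np.mgrid[0:h, 0:w]
    # Ultrasound-plane sample points in homogeneous coordinates (z = 0)
    pts = np.stack([cols.ravel() * pixel_mm,
                    rows.ravel() * pixel_mm,
                    np.zeros(h * w),
                    np.ones(h * w)])
    vox = np.rint((T_us_to_voxel @ pts)[:3]).astype(int)
    # Clamp to the volume bounds (nearest-neighbor sampling)
    for ax, size in enumerate(mr_volume.shape):
        np.clip(vox[ax], 0, size - 1, out=vox[ax])
    return mr_volume[vox[0], vox[1], vox[2]].reshape(h, w)
```

The resulting image can be overlaid on, or blended with, the live ultrasound frame, which is how the otherwise invisible target becomes visible to the operator.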
Instrument/needle guidance in this setting is the subject of the present invention. To date, needle guidance using either real-time or “current” images is accomplished by the operator watching the digital images on a monitor while manipulating the needle blindly with his/her hands. Using prior art EM navigation and instrument tracking, there is additional iconography on the monitor that uses the sensor position in the needle to create a virtual instrument. Various on-screen software constructs give indications of needle position and trajectory in relation to the screen image. And, if a target lesion is marked, the software will have screen icons and/or coloration that indicate the corrections required to put the needle on the correct target trajectory and provide a distance to the target. This is, at least potentially, a major improvement over currently established practice standards: namely, completely blind freehand needle positioning with frequent re-imaging to see the effects, or trying to follow the needle by direct visualization using real-time imaging. However, several problems have slowed adoption of EM guidance systems. First, there is added cost. This can be addressed by data showing reduced procedure times and better outcomes. Second, there are issues related to the EM field and accuracy which are being resolved with better technology. But finally, the main problem with existing systems, which all use this standard software overlay approach on the imaging monitors and employ sensors in the needles as described above, is that they are not simple or intuitive.
When using EM instrument tracking, what the practitioner sees on the monitor (prior art) are current relationships in flashing multi-planar 2-D, or scrolling 2-D sequences, combined with the desired direction of corrective movement in some 2-D rendering and/or an indicator of the 3-D movement required. Furthermore, as the operator makes corrective movements, his/her hand movements are essentially blind trial and error, corrected only by recognition of changes in previously learned software indicators and a 3-D construct in his/her mind. Correctly moving an instrument/needle in three dimensions based on 2-D images seen on a screen that has no particular orientation to the operative field, while not watching the hands, requires both skill and practice. In addition, an operator must learn the specifics of the iconography designed into the software that indicates the required needle re-direction, as well as interpret the image(s). The result is a steep learning curve that requires expensive education (skilled teachers plus time) and results in a spectrum of skill even amongst experienced users. These non-intuitive requirements have slowed adoption and delayed realization of the full benefits of this otherwise highly developed and beneficial technology.
The subject invention addresses those needs by providing a new guidance device and method of guidance that allows all operators to bypass the skill requirements of prior art systems, flattens the learning curve and makes image guided instrument/needle placement completely intuitive.
In accordance with one aspect of this invention an indicator guide is provided for guiding the movement of an instrument by a user in an image guided medical procedure on a patient. The instrument is arranged to be introduced and guided in a surgical field to a target site within the body of the patient by an electronic tracking system. At least one of the instrument and the indicator guide comprises a sensor for providing a signal to the tracking system. The tracking system is arranged to determine the position and orientation of the instrument with respect to the target site in response to the signal from the sensor and to provide output signals to the indicator guide. The indicator guide comprises a display responsive to the output signals from the tracking system for providing a visual indication of the path along which the instrument should be oriented and directed to reach the target site and for providing a perceptible indication of the distance of the instrument to the target site. The display is located on or immediately adjacent the instrument and within the surgical field, whereupon the user can readily see the display while directly viewing and moving the instrument along that path.
In accordance with another aspect of this invention there is provided a method for guiding the movement of an instrument by a user in an image guided medical procedure on a patient. The instrument is arranged to be introduced and guided to a target site within the body of the patient by an electronic tracking system. The tracking system includes a sensor and is arranged to determine the position and orientation of the instrument with respect to the target site in response to a signal from the sensor and to provide output signals indicative thereof. The method entails providing an instrument guide comprising a display. The display is responsive to the output signals from the tracking system for providing a visual indication of the path along which the instrument should be oriented and directed to reach the target site and for providing a perceptible indication of the distance of the instrument to the target site. The sensor is coupled to at least one of the instrument and the indicator guide. The display is disposed on or immediately adjacent the instrument to be movable with the instrument, whereupon the user can readily see the display while directly viewing and moving the instrument along that path.
Referring now to the various figures of the drawing wherein like reference characters refer to like parts, there is shown at 20 in
The tracking system 12 can be of any suitable construction, such as those commercially available today. In the exemplary embodiment shown in
It should be noted at this juncture that the sensors 14A and 14B can be located at different positions with respect to the instrument and indicator guide, respectively, so long as their respective positions with respect to those components are known by software making up the computerized tracking system 12.
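The fixed, known offsets just described can be applied in software by ordinary homogeneous-transform composition. As an illustrative sketch (assuming, for illustration only, that the tracking system reports each sensor's 6DOF pose as a rotation R and translation t in the EM field frame):

```python
import numpy as np

def pose_matrix(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation R and
    a translation vector t (a sensor's 6DOF pose in the EM field frame)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def point_in_em_frame(T_em_sensor, offset_in_sensor):
    """Carry a fixed, calibrated offset (e.g., the instrument tip
    expressed in the sensor's own frame) into the EM field frame."""
    p = np.append(np.asarray(offset_in_sensor, dtype=float), 1.0)
    return (T_em_sensor @ p)[:3]
```

So long as the sensor-to-component offset is known to the software, as the text requires, any point of interest on the instrument or guide is located from the sensor pose alone.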
The use of the indicator guide 20 of this invention can be achieved in various ways, as will be described later. Suffice it for now to state that each method requires a 6DOF EM sensor affixed to the indicator guide and/or a 6 or 5DOF sensor affixed to the instrument, e.g., located in the tip of the instrument/needle. Either arrangement may be sufficient depending upon the instrument/needle being used and with the correct software or guide setup. However, it is preferred that two such sensors be used for reasons to be described later.
The methods of use of the indicator guide of this invention also require a target that is well defined in the images and which can be marked and recognized in the 3-D data set of the tracking system software. Ideally, the target should remain in view of the tracked imaging transducer in the case of real-time ultrasound guidance, or the target needs to be recognized and clearly defined using image recognition software working in the digital image data set of the tracking system software. In either case, the target must be clearly located in the 3-D image volume for the already registered indicator guide to function as described. This is no different from what already exists in practice today. In this regard, target visualization and marking is already required for all current EM image guided targeting software programs. The paradigm shift for the operator (user) provided by the present invention is the transfer and translation of navigational information onto the instrument itself (or at least into the surgical field, e.g., onto the back of the user's hand, onto a drape, or onto another structure in the surgical field).
In order for the indicator guide 20 to work as designed, it must be in a known spatial relationship to the instrument, e.g., the tip 10A of the instrument, which will be guided by it. This relationship may be tracked in at least three different ways. For example, if as shown in
It is currently envisioned that the optimum clinical setup using the electronic indicator guide of this invention is to make use of an instrument that has a 5DOF or 6DOF EM sensor 14A in its tip and a 6DOF EM sensor 14B in the indicator guide 20, as shown in
As should be appreciated by those skilled in the art in the arrangements shown in
Turning now to
The on-target display portion 28 can take many forms and may be visual or audible or both. In the exemplary embodiment the on-target display portion 28 is visual to provide a visually perceptible signal to the user when the instrument is in a direct path and orientation to reach the target. In particular, it comprises a circular LED element that is located within the direction display array 26 and which illuminates to provide a visual signal to the user when the instrument is directly oriented in a path towards the target. As mentioned earlier, the on-target display can be audible. Thus, instead of providing a visual signal when the instrument is in the direct path to the target, the on-target display may provide an audible signal, e.g., a beep or series of beeps, etc., to indicate that fact. An audible on-target signal may be provided in conjunction with a visual on-target signal.
In order to provide the user with information regarding the distance of the instrument, e.g., its tip 10A, to the target, the indicator guide includes the heretofore identified distance display portion 30. In the exemplary embodiment the distance display portion 30 is in the form of a segmented digital numeric display (e.g., LED or liquid crystal) that is located within the center of the on-target display portion and provides a numeric readout of the distance of the instrument's tip 10A to the target. Like the on-target display 28, the distance display 30 may be in the form of an audible signal, e.g., a synthesized voice annunciating the distance to the target. That audible signal may be used in conjunction with the numeric visual display.
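The direction and distance outputs described above can be derived from the tracked poses alone. The following sketch assumes a dial with eight direction indicators arranged like clock positions; the names, thresholds, and layout are illustrative design choices, not fixed by the invention:

```python
import numpy as np

def guidance(tip, axis, target, up, on_target_deg=2.0):
    """Return (distance to target, indication) for an eight-position dial.
    `axis` is the instrument's unit pointing direction and `up` the
    dial's 12-o'clock reference, both unit vectors in EM-frame
    coordinates."""
    to_target = np.asarray(target, dtype=float) - np.asarray(tip, dtype=float)
    dist = float(np.linalg.norm(to_target))
    to_target /= dist
    angle = np.degrees(np.arccos(np.clip(np.dot(axis, to_target), -1.0, 1.0)))
    if angle <= on_target_deg:
        return dist, 'ON-TARGET'          # light the central on-target ring
    # Project the pointing error into the dial's plane
    err = to_target - np.dot(to_target, axis) * axis
    right = np.cross(up, axis)            # dial's 3-o'clock direction
    theta = np.degrees(np.arctan2(np.dot(err, right), np.dot(err, up))) % 360.0
    dial = ['12', '1:30', '3', '4:30', '6', '7:30', '9', '10:30']
    return dist, dial[int((theta + 22.5) // 45) % 8]
```

In use, the selected indicator lights the corresponding segment of the direction display array 26, while the returned distance drives the numeric readout of the distance display 30.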
The electrical power and control signals to and from the electronic components making up the indicator guide 20 and the sensor 14A are provided via a strain relief-reinforced cable 32 extending out of the indicator guide's housing. The indicator guide 20 is connected to the tracking system 12 via the cable 32. It is also contemplated that the indicator guide be self-powered, e.g., battery powered, and wirelessly coupled to the tracking system.
Turning now to
Turning now to
As should be appreciated by those skilled in the art from the foregoing, the indicator guide 20, in effect, is a simple electronic pointer that visually shows the direction the instrument/needle must be moved to reach the target, e.g., lesion, within the patient's body and also provides (e.g., shows) the distance of the tip of the instrument from the target. Thus, the indicator guide clearly indicates the required movement of the handled end of the instrument to direct the instrument tip into the target lesion and to stop at that point. In typical use, the guide 20 is used to get the instrument/needle pointed exactly at the target and then advance it. The guide, by being on or near the instrument/needle, allows the user to intuitively coordinate his/her hand movement with the electronic signal indicating direction because it is in the same direct field of view of the user. The guidance of the instrument becomes primarily directed by the indicator guide, not by some remotely located monitor (as has characterized the prior art). In particular, using the instrument guide 20 of this invention makes checking the “live” images on a remotely located monitor elective and confirmatory, rather than mandatory and directive (as has characterized the prior art).
The methodology of this invention may make use of virtually the same software that currently is in common use today with conventional EM tracking systems. However, with the subject invention that software will be calculating the difficult spatial translations to lead the operator's movement of the instrument rather than the operator having to rely on his/her brain to reconstruct a mental 3-D image of the operative field to make those hand movements while watching the remote monitor to follow the moves suggested by the monitor's on-screen icons.
As with any image guided therapeutic or diagnostic procedure, maintenance of a sterile operative field is necessary. Thus, to that end, the use of the subject invention contemplates using a soft, clear plastic sterile cover (not shown) for the indicator guide, the associated EM sensor and the cabling. Alternatively, the core electronic components of the indicator guide and the 6DOF EM sensor could be potted together, have a single cable for re-use, and have a sterile injection molded plastic housing that forms the indicator dial face with an attached sleeve cover for the cable. It is also contemplated that the indicator guide can be made as a disposable device, although at this time such an arrangement is unlikely due to the inherent costs involved.
Without further elaboration the foregoing will so fully illustrate my invention that others may, by applying current or future knowledge, adopt the same for use under various conditions of service.
This PCT application claims the benefit under 35 U.S.C. §119(e) of Provisional application Ser. No. 61/588,905 filed on Jan. 20, 2012 entitled INDICATOR GUIDE FOR IMPROVED INSTRUMENT NAVIGATION IN IMAGE-GUIDED MEDICAL PROCEDURES whose entire disclosure is incorporated by reference herein.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/US2013/021550 | 1/15/2013 | WO | 00 |
Number | Date | Country | |
---|---|---|---|
61588905 | Jan 2012 | US |