This disclosure generally relates to imaging systems for navigating surgical medical devices. More particularly, this disclosure relates to systems and methods for navigating interventional instrumentation, such as surgical needles, and refining registration between image modalities.
In the course of performing a surgical operation or intervention, a medical practitioner (e.g., a surgeon) may use various operating instruments to perform procedures such as needle biopsies, tumor ablations, catheter insertions, orthopedic interventions, etc. These instruments may rely on several types of image data to aid the medical practitioner or operator in inserting the instrument into a desired location within a patient's body. Typically, the medical practitioner inserts these instruments into a patient's body at very specific locations, orientations, and depths to reach predetermined target areas in order to perform an instrument-specific action or function, which may include tissue sampling, heating, cooling, liquid deposition, suction, or serving as a channel for other objects.
Medical navigation systems based on a variety of tracking technologies (mechanical, infrared-optical, electromagnetic, etc.) have long existed and help align the patient, imaging data, targets, and instruments. However, known medical navigation systems remain inaccurate, inconvenient, and ineffective at providing real-time data and guidance for inserting and moving interventional instruments.
Therefore, a need remains for improved systems and methods for interventional image navigation and image registration refinement.
An aspect of the present disclosure is to provide a system for surgical image guidance. The system includes a first imaging device, and an image processing system operatively connected to the first imaging device. The image processing system is configured to: 1) receive real-time image data of a patient from the first imaging device; 2) receive secondary image data from a second imaging device; 3) produce composite image data based on the real-time image data and the secondary image data; and 4) produce enhanced composite image data by improving an alignment of physical structures in a real-time image based on the real-time image data with corresponding physical structures in a secondary image based on the secondary image data. The image processing system is further configured to operate in an unlocked mode in which the real-time image is free to move relative to the secondary image and a locked mode in which the real-time image and the secondary image are locked relative to each other to prevent relative movement therebetween. The first imaging device is configured to provide information to the image processing system that causes the image processing system to operate in the unlocked mode, enabling movement of the real-time image relative to the secondary image.
Another aspect of the present disclosure is to provide a method for surgical image guidance. The method includes receiving, by an image processing system, real-time image data of a patient from a first imaging device; receiving, by the image processing system, secondary image data from a second imaging device; producing, by the image processing system, composite image data based on the real-time image data and the secondary image data; and producing, by the image processing system, enhanced composite image data by improving an alignment of physical structures in a real-time image based on the real-time image data with corresponding physical structures in a secondary image based on the secondary image data. The method further includes receiving, by the image processing system, a command to permit moving the real-time image relative to the secondary image when the image processing system is operating in an unlocked mode in which the real-time image is free to move relative to the secondary image, and receiving, by the image processing system, a command to put the image processing system in a locked operating mode so as to prevent relative movement between the real-time image and the secondary image.
A further aspect of the present disclosure is to provide a non-transitory processor-readable medium having instructions stored thereon, which when executed by one or more processors, cause the one or more processors to implement the above method.
The present disclosure, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification. Like reference numerals designate corresponding parts in the various figures. The drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention.
The present disclosure describes medical-device systems and methods that allow operators to perform navigation-assisted interventions using certain types of interventional instruments, such as needle-like instruments, and corresponding methods of optimizing registration between image modalities using visual or automatic termination criteria. Example applications may include needle biopsies, tumor ablations, catheter insertions, orthopedic interventions, and other procedures, all of which may use several types of image data. Typically, these instruments are inserted into a patient's body at very specific locations, orientations, and depths to reach predetermined target areas, where they perform an instrument-specific action or function, which may include tissue sampling, heating, cooling, liquid deposition, suction, or serving as a channel for other objects.
Clear Guide Medical has previously developed a novel visual tracking technology platform based on real-time camera-based computer vision, and has embodied the tracking platform in products for ultrasound-based systems and computerized tomography (CT)-based systems for image/instrument guidance and multi-modality fusion. Certain aspects of this technology have already been described in U.S. patent application Ser. Nos. 13/648,245, 14/092,755, 14/092,843, 14/508,223, 14/524,468, 14/524,570, and 14/689,849, which are incorporated herein by reference in their entireties for all purposes.
The system can then continuously extract slices from the patient data that spatially correspond to the current ultrasound slice, and show both overlaid on screen in a fusion view, as shown in the accompanying figures.
Manual Registration Refinement: However, the above process for performing a “Visual Sweep” registration based on the position of the markers 30 is subject to noise sources, including patient breathing artifacts during initial imaging or the Visual Sweep, marker drift or skin shift between imaging and registration, imperfect marker observations, internal tissue shift, etc. These noise sources may degrade the registration quality, leading to noticeable displacement between the registered image data and the actual relevant anatomical features as viewed by real-time imaging such as ultrasound or fluoroscopy.
The term “real-time” imaging is used herein to mean (a) that the images are captured while the patient is being imaged with the imaging system(s); (b) that the images are available within the time the patient is willing to wait on the imaging table; (c) that the images are viewed within approximately one minute (e.g., less than 30 seconds) of being captured by a probe (e.g., an ultrasound probe); or (d) that the images are viewed within less than one second of being captured (i.e., substantially instantaneously).
Therefore, it may be desirable to adjust the initial, automatically determined registration matrix M to achieve better feature alignment. In an embodiment, one possible procedure to correct for misalignment is to allow an operator (e.g., a medical practitioner) to perform a “manual registration refinement”, as follows:
By using the above procedure, the operator can guide image registration optimization both by manually searching for the optimal alignment pose, and by visually assessing alignment quality.
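To make this concrete, the following is a minimal sketch of how such a refinement might be represented internally, assuming the registration is maintained as a 4×4 homogeneous matrix M (as described above) and that each operator adjustment in unlocked mode arrives as a small rigid delta transform. The function and variable names, and the composition order, are illustrative assumptions, not the system's actual implementation.

```python
import numpy as np

def apply_manual_refinement(M_initial: np.ndarray, delta: np.ndarray) -> np.ndarray:
    """Compose an operator-supplied rigid adjustment with the initial
    registration matrix (both 4x4 homogeneous transforms).

    In unlocked mode each incremental adjustment (e.g., from nudging the
    overlay) is folded into the running registration; locking simply stops
    accepting further deltas. Pre-multiplication (delta applied in the
    fixed frame) is an illustrative assumption.
    """
    return delta @ M_initial

# Example: the operator nudges the overlay 2 mm along x.
M = np.eye(4)          # initial registration, e.g., from the Visual Sweep
delta = np.eye(4)
delta[0, 3] = 2.0      # 2 mm translation in x
M_refined = apply_manual_refinement(M, delta)
```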
Automatic Termination: The above-described manual refinement procedure may suffer from subjectivity, as it depends on the operator's visual assessment of alignment. Furthermore, the manual alignment may also have usability shortcomings in that it may require multiple interaction types during dexterous, high-accuracy actions. As a result, in order to remedy the above deficiencies and further improve upon the manual/visual refinement procedure, another embodiment employs an alignment procedure that removes the operator's subjective visual assessment. In an embodiment, the alignment procedure includes:
In the above paragraphs, it is described that the maximum value of the IRQ metric indicates the best or optimum alignment between the image modalities. However, as can be appreciated, in another embodiment, another type of IRQ metric can be selected or generated such that the smaller the achieved value of the IRQ, the better the alignment between the two image modalities (e.g., CT image modality 60 and ultrasound image modality 62). In this case, instead of tracking the maximum value of the IRQ metric, the minimum value of the IRQ metric indicates when the optimum alignment between the image modalities is achieved.
This approach has the benefit of relieving the operator of the alignment assessment burden. The system computes the image registration quality (IRQ) metric by correlating the respective current modality images (real-time and “frozen” static slices). Depending on the combination of image modalities, the IRQ function may be a straightforward image correlation for identical modalities, but may require pre-processing in other cases. For example, ultrasound is a boundary-based modality, whereas CT is a volume-based modality. Initially, the CT scan's vertical gradient image may be computed (to approximate the ultrasound's top-down insonification and the resultant highlighting of horizontal tissue interfaces with large acoustic impedance jumps). Then, a correlation-based or, better, a mutual-information-based metric (see Wells et al., “Multi-Modal Volume Registration by Maximization of Mutual Information,” Medical Image Analysis, 1996, the entire content of which is incorporated herein by reference) may be used to estimate alignment quality.
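The following is a minimal sketch of one way such an IRQ function might be assembled, assuming the real-time ultrasound image and the extracted CT slice are same-sized 2-D numpy arrays; the gradient sign convention and histogram bin count are illustrative choices not specified in this disclosure.

```python
import numpy as np

def vertical_gradient(ct_slice: np.ndarray) -> np.ndarray:
    """Top-down intensity gradient of a CT slice, approximating the
    horizontal tissue interfaces highlighted by ultrasound insonification."""
    grad = np.gradient(ct_slice.astype(float), axis=0)
    return np.clip(grad, 0, None)  # keep one edge polarity (an assumption)

def mutual_information(a: np.ndarray, b: np.ndarray, bins: int = 32) -> float:
    """Mutual information between two same-sized images, estimated from
    their joint intensity histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of a
    py = pxy.sum(axis=0, keepdims=True)   # marginal of b
    nz = pxy > 0                          # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def irq(us_image: np.ndarray, ct_slice: np.ndarray) -> float:
    """IRQ at the current pose: MI between the ultrasound image and the
    gradient-preprocessed CT slice."""
    return mutual_information(us_image, vertical_gradient(ct_slice))
```

Higher mutual information indicates that knowing the intensity in one image says more about the other, which is why it serves as a cross-modality alignment score where plain correlation may fail.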
Since an initial Visual Sweep registration is almost guaranteed to result in a registration very close to the global optimum of the IRQ function, one may assume that the operator's manual sampling of the pose space consistently yields higher values closer to the global optimum and lower values further away. This safely allows the system to automatically terminate the optimization and return the optimum-to-date pose as the result.
This approach may also be applied in cases where the real-time imaging modality is replaced by, or combined with, one or more other image modalities or data sets (such as other CT, CBCT, or MRI volumes). The manual refinement may be performed with real-time imaging against any other operator-selected or automatically selected static volumes (e.g., by automatically choosing the lowest-quality matched data set for unlocking), as well as with static volumes against other static volumes, using interaction and computation approaches similar to those outlined above.
As can be appreciated from the above paragraphs, in an embodiment of the present disclosure, there is provided a system for surgical image guidance. The system includes a first imaging device and an image processing system operatively connected to the first imaging device. The image processing system is configured to: 1) receive real-time image data of a patient from the first imaging device; 2) receive secondary image data from a second imaging device; 3) produce composite image data based on the real-time image data and the secondary image data; and 4) produce enhanced composite image data by improving an alignment of physical structures in a real-time image based on the real-time image data with corresponding physical structures in a secondary image based on the secondary image data. The image processing system is further configured to operate in an unlocked mode in which the real-time image is free to move relative to the secondary image and a locked mode in which the real-time image and the secondary image are locked relative to each other to prevent relative movement therebetween. The first imaging device is configured to provide information to the image processing system that causes the image processing system to operate in the unlocked mode, enabling movement of the real-time image relative to the secondary image.
In an embodiment, the system may further include a display device configured to receive and display the composite image data. In an embodiment, the first imaging device can be an ultrasound device and the second imaging device can be a CT scan device, an MRI device, or a three-dimensional medical imaging device.
In an embodiment, the image processing system can be configured to allow correction of a misalignment of the real-time image data relative to the secondary image data until an operator determines that the real-time image data coincides with the secondary image data.
In an embodiment, the image processing system may include an input device configured to receive an input from an operator to put the image processing system in the unlocked mode to allow the real-time image to update and move in position relative to the secondary image. In an embodiment, the input device can be further configured to receive an input from the operator to put the image processing system in the locked mode to prevent relative movement between the real-time image and the secondary image.
In an embodiment, the image processing system can be configured to compute an image registration quality (IRQ) metric continuously. The IRQ metric quantifies a degree of the alignment of the physical structures in the real-time image with the corresponding physical structures in the secondary image. In an embodiment, the image processing system can be configured to determine whether the IRQ metric meets a predetermined threshold value or a dynamically determined threshold value, and provide, based on the determination, a feedback signal to an operator, or automatically put the image processing system in the locked mode to prevent relative movement between the real-time image and the secondary image. In an embodiment, the image processing system can be configured to display the IRQ metric as feedback. In an embodiment, the dynamically determined threshold value is a maximum or a minimum IRQ value. In an embodiment, the image processing system can be configured to store the IRQ metric and a corresponding alignment pose that achieved the IRQ metric. In an embodiment, the image processing system can be configured to compare a first IRQ metric obtained in a first attempted alignment pose with a second IRQ metric obtained in a second attempted alignment pose, and to store as an alignment pose the first attempted alignment pose or the second attempted alignment pose that achieved a best IRQ value among the first IRQ metric and the second IRQ metric. For example, the best IRQ metric corresponds to a higher IRQ metric in the first and second IRQ metrics, a lower IRQ metric in the first and second IRQ metrics, or an IRQ metric among the first and second IRQ metrics that is closest to the predetermined threshold value.
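As a loose illustration of this compare-and-store behavior, the sketch below keeps the best IRQ value seen so far together with the pose that produced it; the class and parameter names are hypothetical, and the maximize flag reflects that either a maximum or a minimum of the IRQ metric may mark the best alignment, as noted above.

```python
import numpy as np

class BestPoseTracker:
    """Track the best IRQ value observed so far and its alignment pose."""

    def __init__(self, maximize: bool = True):
        self.maximize = maximize   # True if higher IRQ means better alignment
        self.best_irq = None
        self.best_pose = None

    def update(self, irq_value: float, pose: np.ndarray) -> bool:
        """Record (irq_value, pose) if it beats the best so far.

        Returns True when the new pose improved on the stored optimum.
        """
        improved = (
            self.best_irq is None
            or (self.maximize and irq_value > self.best_irq)
            or (not self.maximize and irq_value < self.best_irq)
        )
        if improved:
            self.best_irq = irq_value
            self.best_pose = pose.copy()
        return improved
```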
In an embodiment, the image processing system can be configured to automatically lock an alignment between the physical structures in the real-time image and the corresponding physical structures in the secondary image if, after a certain number of comparison iterations, no further improvement in alignment is possible, or no further improvement in alignment is expected based on a trend of the IRQ metric, or after the IRQ metric has reached the predetermined threshold value, and to select a previously stored alignment pose having a best achieved IRQ metric as an optimum alignment pose. For example, the IRQ metric can be determined by performing a two-dimensional correlation between the real-time image data and the corresponding secondary image data. The two-dimensional cross-correlation between the real-time image data and the corresponding secondary image data can be performed according to the following equation:

$$C(i, j) = \sum_{m} \sum_{n} f(m, n)\, g(m + i, n + j)$$

where f(m,n) and g(m,n) are respective image pixel values in the real-time image data and the corresponding secondary image data at locations (m,n), with i = j = 0 for determining said correlation at the current alignment pose.
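A direct numpy rendering of this correlation at i = j = 0 might look as follows, assuming f and g have already been resampled to a common pixel grid at the current pose; the normalized variant is a common practical refinement, not a formula stated in this disclosure.

```python
import numpy as np

def cross_correlation_at_pose(f: np.ndarray, g: np.ndarray) -> float:
    """Evaluate the correlation sum above at i = j = 0, i.e., for the
    current alignment pose, over the overlapping pixels of f and g."""
    return float(np.sum(f.astype(float) * g.astype(float)))

def normalized_correlation(f: np.ndarray, g: np.ndarray) -> float:
    """Zero-mean, unit-norm variant, insensitive to overall brightness."""
    f0 = f.astype(float) - f.mean()
    g0 = g.astype(float) - g.mean()
    denom = np.sqrt(np.sum(f0 ** 2) * np.sum(g0 ** 2))
    return float(np.sum(f0 * g0) / denom) if denom > 0 else 0.0
```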
As can be appreciated from the above paragraphs, in an embodiment of the present disclosure, there is also provided a method for surgical image guidance. The method includes 1) receiving, by an image processing system, real-time image data of a patient from a first imaging device; 2) receiving, by the image processing system, secondary image data from a second imaging device; 3) producing, by the image processing system, composite image data based on the real-time image data and the secondary image data; and 4) producing, by the image processing system, enhanced composite image data by improving an alignment of physical structures in a real-time image based on the real-time image data with corresponding physical structures in a secondary image based on the secondary image data. The method also includes receiving, by the image processing system, a command to permit moving the real-time image relative to the secondary image when the image processing system is operating in an unlocked mode in which the real-time image is free to move relative to the secondary image. The method further includes receiving, by the image processing system, a command to put the image processing system in a locked operating mode so as to prevent relative movement between the real-time image and the secondary image.
In an embodiment, the method further includes allowing, by the image processing system, correction of a misalignment of the real-time image data relative to the secondary image data until an operator determines that the real-time image data coincides with the secondary image data. In an embodiment, the method includes computing, by the image processing system, an image registration quality (IRQ) metric continuously, the IRQ metric quantifying a degree of the alignment of the physical structures in the real-time image with the corresponding physical structures in the secondary image. In an embodiment, the method further includes determining whether the IRQ metric meets a predetermined threshold value or a dynamically determined threshold value; and providing, based on the determination, a feedback signal to an operator, or automatically putting the image processing system in the locked mode to prevent relative movement between the real-time image and the secondary image. The method may also include providing the IRQ metric as feedback to the operator. In an embodiment, the dynamically determined threshold value can be a maximum or a minimum IRQ value. The method also includes storing the IRQ metric and a corresponding alignment pose that achieved the IRQ metric.
In an embodiment, the method may also include comparing a first IRQ metric obtained in a first attempted alignment pose with a second IRQ metric obtained in a second attempted alignment pose, and storing as an alignment pose the first attempted alignment pose or the second attempted alignment pose that achieved a best IRQ value among the first IRQ metric and the second IRQ metric. For example, the best IRQ metric can correspond to a higher IRQ metric in the first and second IRQ metrics, a lower IRQ metric in the first and second IRQ metrics, or an IRQ metric among the first and second IRQ metrics that is closest to the predetermined threshold value.
In an embodiment, the method may also include automatically locking an alignment between the physical structures in the real-time image and the corresponding physical structures in the secondary image if, after a certain number of comparison iterations, no further improvement in alignment is possible, or no further improvement in alignment is expected based on a trend of the IRQ metric, or after the IRQ metric has reached the predetermined threshold value, and selecting a previously stored alignment pose having a best achieved IRQ metric as an optimum alignment pose.
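One way such an automatic locking criterion might be sketched is shown below, using a fixed-size "no new optimum" window as a simple stand-in for the trend-based test; the patience and threshold parameters are illustrative assumptions, and the sketch assumes a higher IRQ is better (flip the comparisons for a minimizing metric).

```python
from typing import List, Optional

def should_auto_lock(irq_history: List[float],
                     patience: int = 10,
                     threshold: Optional[float] = None) -> bool:
    """Decide whether to lock the registration automatically.

    Locks when the latest IRQ meets the predetermined threshold, or when
    no new optimum has appeared within the last `patience` refinement
    iterations (i.e., no further improvement is expected).
    """
    if not irq_history:
        return False
    if threshold is not None and irq_history[-1] >= threshold:
        return True
    if len(irq_history) <= patience:
        return False        # not enough samples to judge the trend
    best_overall = max(irq_history)
    best_recent = max(irq_history[-patience:])
    return best_recent < best_overall   # the optimum predates the window
```

On locking, the system would then restore the previously stored best pose (e.g., from a tracker like the one sketched earlier) as the optimum alignment.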
According to an embodiment of the present disclosure, there is also provided a non-transitory processor-readable medium having instructions stored thereon, which when executed by one or more processors, cause the one or more processors to implement the above method.
The foregoing detailed description of embodiments includes references to the drawings or figures, which show illustrations in accordance with example embodiments. The embodiments described herein can be combined, other embodiments can be utilized, or structural, logical and operational changes can be made without departing from the scope of what is claimed. The foregoing detailed description is, therefore, not to be taken in a limiting sense, and the scope is defined by the appended claims and their equivalents. It should be evident that various modifications and changes can be made to these embodiments without departing from the broader spirit and scope of the present disclosure. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
Present teachings may be implemented using a variety of technologies. For example, certain aspects of this disclosure may be implemented using electronic hardware, computer software, or any combination thereof. Whether such elements are implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. By way of example, the electronic hardware, or any portion of electronic hardware may be implemented with a processing system that includes one or more processors. Examples of processors include microprocessors, microcontrollers, Central Processing Units (CPUs), digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), state machines, gated logic, discrete hardware circuits, and other suitable hardware configured to perform various functions described throughout this disclosure. One or more processors in the processing system may execute software, firmware, or middleware (collectively referred to as “software”). The term “software” shall be construed broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software components, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, etc., whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise. In certain embodiments, the electronic hardware can also include designed application-specific integrated circuits (ASICs), programmable logic devices, or various combinations thereof. The processing system can refer to a computer (e.g., a desktop computer, tablet computer, laptop computer), cellular phone, smart phone, and so forth. The processing system can also include one or more input devices, one or more output devices (e.g., a display), memory, network interface, and so forth.
If certain functions described herein are implemented in software, the functions may be stored on or encoded as one or more instructions or code on a non-transitory computer-readable medium. Computer-readable media includes computer storage media. Storage media may be any available media that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise a random-access memory (RAM), a read-only memory (ROM), an electrically erasable programmable ROM (EEPROM), compact disk ROM (CD-ROM) or other optical disk storage, magnetic disk storage, solid state memory, or any other data storage devices, combinations of the aforementioned types of computer-readable media, or any other medium that can be used to store computer executable code in the form of instructions or data structures that can be accessed by a computer.
For purposes of this patent document, the terms “or” and “and” shall mean “and/or” unless stated otherwise or clearly intended otherwise by the context of their use. The term “a” shall mean “one or more” unless stated otherwise or where the use of “one or more” is clearly inappropriate. The terms “comprise,” “comprising,” “include,” and “including” are interchangeable and not intended to be limiting. For example, the term “including” shall be interpreted to mean “including, but not limited to.”
Although embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes can be made to these example embodiments without departing from the broader spirit and scope of the present application. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
The embodiments illustrated and discussed in this specification are intended only to teach those skilled in the art how to make and use the invention. In describing embodiments of the disclosure, specific terminology is employed for the sake of clarity. However, the disclosure is not intended to be limited to the specific terminology so selected. The above-described embodiments of the disclosure may be modified or varied, without departing from the invention, as appreciated by those skilled in the art in light of the above teachings. It is therefore to be understood that, within the scope of the claims and their equivalents, the invention may be practiced otherwise than as specifically described. For example, it is to be understood that the present disclosure contemplates that, to the extent possible, one or more features of any embodiment can be combined with one or more features of any other embodiment.
The present patent application claims priority benefit from U.S. Provisional Patent Application No. 62/426,024 filed on Nov. 23, 2016, the entire content of which is incorporated herein by reference.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2017/062883 | 11/21/2017 | WO | 00