The invention relates to an ultrasound visualization system for tracking a position of an interventional device based on a stream of live ultrasound images of an ultrasound imaging probe. A computer program product is also disclosed. The invention may be used in the medical ultrasound imaging field, and more specifically in ultrasound-based cardiac imaging. The stream of live ultrasound images may exemplarily be provided by a cardiac imaging probe such as an intracardiac echocardiography, ICE, probe or a transesophageal echocardiography, TEE, probe. The interventional device may for example be an ablation catheter.
Interventional cardiology procedures such as the treatment of cardiac arrhythmias by catheter ablation are increasingly using tracking and mapping techniques to navigate interventional devices within the body. The treatment of atrial fibrillation by catheter ablation is an example of such a procedure. In this procedure, an ablation catheter is used to deliver energy to the cardiac wall to destroy living tissue in order to stop undesired conduction of electric pulses that cause the arrhythmia. Catheter navigation and mapping systems such as Carto, produced by Biosense-Webster, and Ensite, produced by St. Jude Medical, are routinely used in this procedure to guide the catheter to the treatment site. These known systems typically use magnetic fields and impedance sensing to respectively navigate the interventional device and to perform cardiac mapping.
A drawback of some cardiac navigation systems is their limited ability to visualize cardiac tissue, particularly the cardiac wall. The known systems typically create a virtual 3D model of the targeted anatomy in which the tracked ablation catheter is graphically visualized. The lack of visualization of the anatomy is often compensated for by using additional X-ray, typically fluoroscopic, imaging to verify the catheter's location. The poor visualization of soft tissue under X-ray typically requires the physician to undergo extensive training in order to be able to correctly determine the catheter position in the fluoroscopic image. In order to improve the visualization of cardiac tissue, ultrasound imaging systems such as 2D intracardiac echocardiography, ICE, are increasingly being used in these procedures. However, it can still be difficult to follow the treatment catheter's movements in the ultrasound image.
U.S. Pat. No. 8,303,505 B2 describes an apparatus for image guidance and documentation of medical procedures. One embodiment includes combining small field of view images into a recorded image with a large field of view and aligning the small field of view real time image with the recorded image through correlation of imaging data. A location and orientation determination system may be used to track the imaging system and provide a starting set of image alignment parameters and/or provide change updates to a set of image alignment parameters, which is then further improved through correlating imaging data. The recorded image may be selected according to real time measurement of a cardiac parameter during an image guided cardiac procedure.
However, a need exists for improved anatomical visualization during cardiac and other interventional medical procedures. There is also a need to reduce the X-ray imaging dose to a patient during such procedures.
The invention seeks to improve anatomical visualization during medical procedures. The invention also seeks to reduce the X-ray imaging dose to a patient during such procedures. Further advantages of the described invention will be apparent to the skilled person. Thereto, an ultrasound visualization system for tracking a position of an interventional device based on a stream of live ultrasound images of an ultrasound imaging probe is provided. The ultrasound visualization system comprises at least one processor configured to: i) receive the stream of live ultrasound images; ii) extract from the stream of live ultrasound images a reference image comprising reference image data, the reference image including an anatomical feature; iii) extract from the stream of live ultrasound images a current image comprising current image data, the current image being later in time than the reference image and including at least a portion of the anatomical feature and at least a portion of the interventional device at a current position; iv) match the at least a portion of the anatomical feature in the current image with the anatomical feature in the reference image to determine a spatial relationship between the current position of the interventional device and the anatomical feature in the reference image; and v) indicate, in the reference image, the current position of the interventional device, based on the determined spatial relationship.
The use of ultrasound imaging in the system improves the visualization of soft tissue. Moreover, since the reference image, or “map”, is generated from the stream of live ultrasound images, a benefit of the ultrasound visualization system over known tracking systems that generate an anatomical map in a pre-procedural imaging stage is that the need to perform two separate medical procedures is obviated. Also, since the map is generated from the live ultrasound image stream, it is more up-to-date than a map that is generated in a pre-procedural imaging stage. Moreover, the extracted reference image is inherently static, and so any movement of the ultrasound probe after the generation of this “map” is not translated into a movement of the map. This, in combination with indicating the live position of the interventional device in the static map, allows a user of the system to more easily direct the interventional device to an anatomical feature in the map. Finally, the X-ray dose to a patient during such a procedure may be reduced, or X-ray imaging may not be required at all, because the ultrasound reference image provides sufficient positioning information about the interventional device in relation to the anatomical feature.
According to one aspect the field of view of the reference image is extended. In this aspect the reference image corresponds to a first field of view, and the current image corresponds to a second field of view that has an overlapping portion that overlaps with the first field of view and a non-overlapping portion that does not overlap with the first field of view. The at least one processor is further configured to extend the field of view of the reference image by: i) matching the at least a portion of the anatomical feature in the current image with the anatomical feature in the reference image to establish a spatial relationship between the non-overlapping portion of the second field of view and the first field of view; and ii) adapting the reference image to include at least a portion of the non-overlapping portion of the second field of view based on the established spatial relationship. The extended field of view of the reference image therefore includes more anatomical information. During the medical procedure, movements of the ultrasound transducer, for example from heart contractions or from patient breathing, cause changes in the field of view of the current image as compared to that of the reference image. In this aspect the non-overlapping portion of the field of view is used to extend the map. Improved navigation of the interventional device is provided through the ability to track the interventional device over a larger anatomical region. Moreover, the ability to generate such an extended map from the live images means that a more extensive, up-to-date map can be generated during a medical procedure. This provides improved guidance because the interventional device can be more accurately guided to a treatment site.
According to another aspect the at least one processor is further configured to: i) receive from the interventional device a trigger signal indicative of an activation of the interventional device; and ii) provide a marker in the reference image corresponding to the position of the interventional device at the time at which the interventional device was activated. In so doing, the reference image, i.e. the map, may be used to record anatomical sites where a medical intervention, such as the delivery of a therapeutic treatment, the removal of tissue, or a measurement of tissue properties, has taken place within the anatomy. In one example this aspect may be used to record ablation positions of an ablation catheter that serves as the interventional device. Again, since the map is static, improved navigation is provided by a user's ability to more clearly see anatomical regions where an ablation has taken place.
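By way of a non-limiting illustration only, the following sketch shows one way such trigger-driven record-keeping might be implemented in software, assuming the spatial relationship from the matching step is available as a 2x3 affine matrix; the class and function names are illustrative and do not appear in the source.

```python
# Non-limiting sketch: when a trigger signal arrives, the device position
# in the current image is mapped into the static reference frame and
# stored. "AblationRecorder" and "on_trigger" are illustrative names.
import numpy as np

class AblationRecorder:
    def __init__(self):
        self.markers = []  # ablation sites in reference-image coordinates

    def on_trigger(self, device_pos_current, current_to_reference):
        """device_pos_current: (x, y) in the current image;
        current_to_reference: 2x3 affine from the matching step."""
        p = np.array([device_pos_current[0], device_pos_current[1], 1.0])
        x, y = current_to_reference @ p  # into reference coordinates
        self.markers.append((float(x), float(y)))
```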
According to another aspect the at least one processor is further configured to: i) receive, from a positioning system comprising at least one position sensor disposed on the interventional device and at least one position sensor disposed on the ultrasound imaging probe, position data corresponding to a current position of the position sensor disposed on the interventional device relative to the ultrasound imaging probe; and ii) indicate, in the reference image, the current position of the interventional device, based further on the current position of the position sensor disposed on the interventional device relative to the ultrasound imaging probe. This aspect may be useful in more accurately positioning the interventional device in the reference image, particularly when the interventional device is located toward the edge of the field of view of the ultrasound imaging probe, where image resolution may be degraded.
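By way of a non-limiting illustration only, the following sketch shows one possible way to blend an image-based position estimate with a sensor-based estimate, weighting the sensor more strongly near the image edge where resolution may degrade; the linear weighting scheme is an assumption and is not specified in the source.

```python
# Non-limiting sketch: blend the image-based and sensor-based position
# estimates, trusting the sensor more toward the image edge where
# resolution degrades. The linear weighting is an assumed scheme.
import numpy as np

def fuse_positions(image_pos, sensor_pos, image_shape):
    """Both positions as (x, y) in current-image coordinates."""
    h, w = image_shape[:2]
    center = np.array([w / 2.0, h / 2.0])
    # Normalized distance of the image-based estimate from the center.
    d = np.linalg.norm(np.asarray(image_pos, float) - center)
    d /= np.linalg.norm(center)
    alpha = float(np.clip(1.0 - d, 0.0, 1.0))  # weight of image estimate
    return (alpha * np.asarray(image_pos, float)
            + (1.0 - alpha) * np.asarray(sensor_pos, float))
```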
Further aspects of the invention are described with reference to the appended claims.
In order to illustrate the principles of the present invention, an ultrasound visualization system is described with particular reference to a cardiac ablation procedure in which an ICE imaging probe serves as the ultrasound imaging probe, and in which an ablation catheter serves as the interventional device. It is, however, to be appreciated that the invention also finds application in the wider ultrasound imaging field. The ultrasound imaging probe may alternatively be for example a TEE imaging probe, and the interventional device may for example be a catheter, an ablation catheter, an ablation support catheter, a biopsy device, a guidewire, a filter device, a balloon device, a stent, a mitral clip, a left atrial appendage closure device, an aortic valve, a pacemaker lead, an intravenous line, or a surgical tool.
Ultrasound visualization system 111 includes at least one processor 117 which, as described above, may perform some or all of the processing described in relation to the ultrasound imaging probe console 115. The at least one processor 117 is configured to: i) receive the stream of live ultrasound images 113; ii) extract from the stream of live ultrasound images 113 a reference image 118 comprising reference image data, the reference image including an anatomical feature 119; iii) extract from the stream of live ultrasound images 113 a current image 120 comprising current image data, the current image 120 being later in time than the reference image 118 and including at least a portion of the anatomical feature 119 and at least a portion of the interventional device 112 at a current position 121; iv) match the at least a portion of the anatomical feature in the current image 120 with the anatomical feature in the reference image 118 to determine a spatial relationship between the current position of the interventional device and the anatomical feature in the reference image 118; and v) indicate, in the reference image 118, the current position 121 of the interventional device 112, based on the determined spatial relationship. In so doing, the current, i.e. live, position of the interventional device is provided in the reference image.
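By way of a non-limiting illustration only, steps i) to v) might be sketched in software as follows, assuming 2D grayscale frames and a translation-only spatial relationship estimated by phase correlation; all names are illustrative, and a practical system would use a matching procedure such as those described below.

```python
# Non-limiting sketch of steps i)-v). Assumes 2D grayscale frames as
# numpy arrays; "UltrasoundVisualizer", "process" and the translation-only
# model are illustrative assumptions, not taken from the source.
import cv2
import numpy as np

class UltrasoundVisualizer:
    def __init__(self):
        self.reference = None  # step ii): the static reference "map"

    def process(self, frame: np.ndarray, device_pos_xy: tuple) -> np.ndarray:
        """frame: current live image (step iii)); device_pos_xy: device
        position in current-image coordinates, e.g. from edge detection."""
        if self.reference is None:
            self.reference = frame.copy()  # steps i)-ii)
        # Step iv): estimate the shift mapping the current frame onto the
        # reference (the sign convention of cv2.phaseCorrelate should be
        # verified for the OpenCV version in use).
        (dx, dy), _ = cv2.phaseCorrelate(frame.astype(np.float32),
                                         self.reference.astype(np.float32))
        x = int(round(device_pos_xy[0] + dx))
        y = int(round(device_pos_xy[1] + dy))
        # Step v): indicate the live device position in the static map.
        display = cv2.cvtColor(self.reference.astype(np.uint8),
                               cv2.COLOR_GRAY2BGR)
        cv2.drawMarker(display, (x, y), (0, 0, 255),
                       markerType=cv2.MARKER_CROSS, markerSize=15,
                       thickness=2)
        return display
```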
Stream of live ultrasound images 113 may comprise either 2D or 3D ultrasound images. In a particular example the images are 3D ultrasound images from an ICE imaging probe. Interventional device 112 may in general be any interventional device. In a particular example, interventional device 112 is an ablation catheter. Anatomical feature 119 in reference image 118 may be any anatomical feature that is discernible in an ultrasound image. In the specific example of cardiac ablation it may for example correspond to part of a pulmonary vein ostium. Current image 120 may in general be any image from the stream of live ultrasound images 113 that is later in time than the reference image. Owing to the live nature of the stream, current image 120 is continually replaced in time by the latest image. Optionally, reference image 218 may be periodically updated. This may be useful to generate a more up-to-date reference image that reflects recent anatomical changes, or, for example, when the ultrasound probe is moved to a new position within the anatomy in order to follow the interventional device.
In one implementation the step of extracting current image 120 from the stream of live ultrasound images 113 may include determining the current position of the interventional device by performing an edge detection method on the current image data to determine an outline of the interventional device. Suitable known edge detection methods include frequency domain filtering and Fourier transformations.
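By way of a non-limiting illustration only, the following sketch outlines the interventional device using a Canny edge detector; the source mentions frequency-domain filtering and Fourier transformations, and the Canny detector is substituted here merely as a widely available alternative.

```python
# Non-limiting sketch: outline a bright interventional device in an
# 8-bit current image with a Canny detector (substituted for the
# frequency-domain methods named in the text). Names are illustrative.
import cv2
import numpy as np

def device_outline(current: np.ndarray):
    """Return (edge map, estimated device position) or (edge map, None)."""
    blurred = cv2.GaussianBlur(current, (5, 5), 0)  # suppress speckle
    edges = cv2.Canny(blurred, 50, 150)             # binary edge map
    # The largest contour is taken as the device; this assumption holds
    # only when the device is the dominant bright edge in the image.
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return edges, None
    device = max(contours, key=cv2.contourArea)
    moments = cv2.moments(device)
    if moments["m00"] == 0:
        return edges, None
    cx = int(moments["m10"] / moments["m00"])  # contour centroid x
    cy = int(moments["m01"] / moments["m00"])  # contour centroid y
    return edges, (cx, cy)
```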
The step of matching the at least a portion of the anatomical feature in the current image 120 with the anatomical feature in the reference image 118 to determine a spatial relationship may be performed by one of many known image processing algorithms. Scale-invariant feature transform, SIFT, is one suitable example. Rigid or non-rigid image transforms known from the medical image processing field may also be used. The correlation technique described in U.S. Pat. No. 5,655,535, or the real-time image alignment procedure described in U.S. Pat. No. 8,303,505, may alternatively be used. The spatial relationship may be in 2D image space or in 3D image space and may, for example, be a translation, a rotation, or a combination of a rotation and a translation. The spatial relationship may for example be represented as a vector in Cartesian or polar coordinates.
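By way of a non-limiting illustration only, the SIFT-based matching mentioned above might be sketched as follows, estimating a rigid 2D transform (rotation plus translation, consistent with the spatial relationship described above); the function name is illustrative.

```python
# Non-limiting sketch: SIFT keypoint matching between the current image
# and the reference image, yielding a 2x3 rigid transform (rotation and
# translation, optionally uniform scale). "match_to_reference" is an
# illustrative name.
import cv2
import numpy as np

def match_to_reference(reference: np.ndarray, current: np.ndarray):
    """Both images 8-bit grayscale. Returns a 2x3 matrix mapping
    current-image points to reference-image coordinates, or None."""
    sift = cv2.SIFT_create()
    kp_ref, des_ref = sift.detectAndCompute(reference, None)
    kp_cur, des_cur = sift.detectAndCompute(current, None)
    if des_ref is None or des_cur is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pairs = matcher.knnMatch(des_cur, des_ref, k=2)
    good = []  # Lowe's ratio test rejects ambiguous matches
    for pair in pairs:
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
            good.append(pair[0])
    if len(good) < 4:
        return None
    src = np.float32([kp_cur[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    # RANSAC-robust estimate of rotation + translation (+ uniform scale)
    matrix, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return matrix
```

The resulting 2x3 matrix maps a device position (x, y) in the current image into reference-image coordinates via matrix @ [x, y, 1].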
The step of indicating, in the reference image 118, the current position 121 of the interventional device 112, may include, for example, positioning a marker such as a cross, a circle or other shape, a point, or an outline of the interventional device, or locally changing the color, saturation, or hue of a pixel in the reference image.
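By way of a non-limiting illustration only, placing a cross marker in the reference image might be sketched as follows, assuming an 8-bit grayscale reference image; the marker style and color are arbitrary choices.

```python
# Non-limiting sketch: overlay a cross marker at the device position in
# the static reference image. "indicate_position" is an illustrative name.
import cv2
import numpy as np

def indicate_position(reference: np.ndarray, pos_xy) -> np.ndarray:
    """reference: 8-bit grayscale map; pos_xy: (x, y) in map coordinates."""
    display = cv2.cvtColor(reference, cv2.COLOR_GRAY2BGR)  # allow color
    cv2.drawMarker(display, (int(pos_xy[0]), int(pos_xy[1])),
                   color=(0, 0, 255), markerType=cv2.MARKER_CROSS,
                   markerSize=20, thickness=2)
    return display
```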
The same matching procedures as described above may also be used in the embodiment in which the field of view of the reference image is extended, in order to establish the spatial relationship between the non-overlapping portion of the second field of view and the first field of view.
The step of adapting the reference image 118 may include using part or all of the image data from the non-overlapping portion of the second field of view 332b to update the reference image, thereby extending it. The adaptation therefore results in an effective field of view that is less than or equal to the combination of the fields of view 331, 332.
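By way of a non-limiting illustration only, the adaptation step might be sketched as follows, warping the current image into the reference frame on an enlarged canvas and filling only pixels not already covered by the reference; the fixed padding and the zero-valued coverage test are simplifying assumptions.

```python
# Non-limiting sketch: extend the reference ("map") by warping the current
# image into the reference frame on a padded canvas and filling only
# pixels the reference does not yet cover. Assumes 8-bit grayscale images.
import cv2
import numpy as np

def extend_reference(reference, current, current_to_reference, pad=100):
    """current_to_reference: 2x3 affine mapping current -> reference."""
    h, w = reference.shape[:2]
    canvas = np.zeros((h + 2 * pad, w + 2 * pad), dtype=reference.dtype)
    canvas[pad:pad + h, pad:pad + w] = reference
    shifted = current_to_reference.astype(np.float64).copy()
    shifted[:, 2] += pad  # account for the canvas offset
    warped = cv2.warpAffine(current, shifted,
                            (canvas.shape[1], canvas.shape[0]))
    mask = (canvas == 0) & (warped > 0)  # only previously uncovered pixels
    canvas[mask] = warped[mask]
    return canvas
```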
One or more of the method steps disclosed herein, particularly those described in relation to the processor of ultrasound visualization system 111, 211, may be recorded in the form of instructions which, when executed on a processor, cause the processor to carry out such method steps. The computer program product may be provided by dedicated hardware as well as by hardware capable of executing software in association with appropriate software. When provided by a processor, the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor “DSP” hardware, read only memory “ROM” for storing software, random access memory “RAM”, non-volatile storage, etc. Furthermore, embodiments of the present invention can take the form of a computer program product accessible from a computer-usable or computer-readable storage medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer-readable storage medium can be any apparatus that may include, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or a propagation medium. Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory “RAM”, a read-only memory “ROM”, a rigid magnetic disk, and an optical disk. Current examples of optical disks include compact disk read only memory “CD-ROM”, compact disk read/write “CD-R/W”, Blu-Ray™ and DVD.
In summary, an ultrasound visualization system has been described that tracks a position of an interventional device. The system includes a processor that i) receives a stream of live ultrasound images; ii) extracts from the stream of live ultrasound images a reference image comprising reference image data, the reference image including an anatomical feature; iii) extracts from the stream of live ultrasound images a current image comprising current image data, the current image being later in time than the reference image and including at least a portion of the anatomical feature and at least a portion of the interventional device at a current position; iv) matches the at least a portion of the anatomical feature in the current image with the anatomical feature in the reference image to determine a spatial relationship between the current position of the interventional device and the anatomical feature in the reference image; and v) indicates, in the reference image, the current position of the interventional device, based on the determined spatial relationship.
Priority application: EP 17201547.1, filed November 2017.
International filing: PCT/EP2018/081143, filed Nov. 14, 2018.