The present disclosure pertains to the field of medical imaging, and more particularly to the registration of arbitrarily aligned 2-D images to allow for the generation/reconstruction of a 3-D image/volume.
Medical imaging, including X-ray, magnetic resonance (MR), computed tomography (CT), ultrasound, and various combinations of these and other image acquisition modalities are utilized to provide images of internal patient structure for diagnostic purposes as well as for interventional procedures. Often, it is desirable to utilize multiple two-dimensional (i.e. 2-D) images to generate (e.g., reconstruct) a three-dimensional (i.e., 3-D) image of an internal structure of interest.
2-D image to 3-D image reconstruction has been used for a number of image acquisition modalities (such as MRI, CT, and ultrasound) and image based/guided procedures. These images may be acquired as a number of parallel 2-D image slices/planes or rotational slices/planes, which are then combined together to reconstruct a 3-D image volume. Generally, the movement of the imaging device has to be constrained such that only a single degree of freedom is allowed. This single degree of freedom may be rotation of the imaging device or a linear motion of the imaging device. During such a procedure, the presence of any other type of movement will typically cause the registration of 2-D images in 3-D space to be inaccurate.
This presents difficulties in handheld image acquisition where rigidly constraining movement of an imaging device to a single degree of freedom is difficult if not impossible. Further constraining an imaging device to a single degree of freedom may also limit the image information that may be acquired. This is true for handheld, automated and semi-automated image acquisition. Depending upon the constraints of the image acquisition methods, this may limit use or functionality of the acquisition system for 3-D image generation.
This application presents a new system and method for image acquisition of internal human tissue, including but not limited to the prostate, as well as a system and method for the guidance and positioning of medical devices relative to the internal tissue. In the presented systems and methods, ultrasound scanned data (e.g., 2-D B-mode images) are acquired freehand absent a mechanical armature that constrains an ultrasound acquisition probe in a known spatial framework. To allow for reconstruction of the scanned data into a 3-D image, multiple tracker sensors that provide position/location information are used with a freehand acquisition probe (e.g., handheld ultrasound probe). The position of such tracker sensors can be calculated when disposed in an electromagnetic field.
However, the orientation of the image plane of the acquisition probe relative to the tracker sensor must be calibrated. That is, the probe is calibrated so pixels in acquired 2D images can be mapped to their 3D coordinates (e.g., within an image cube). The validation of the calibration is performed to confirm the accuracy of the tracking. Additional tracker sensors may be used for the correction of target object movement.
A novel interpolation method is also utilized in the freehand tracking system to reconstruct the internal tissue object from the input data. In such an arrangement, the freehand tracking system takes an average of the tracking data and then corrects the data with information from one or multiple sensors to improve the accuracy of the tracking and of the target location inside the scanned image cube as displayed. The needle trajectory can also be monitored by the multiple-sensor strategy.
a illustrates an ultrasound probe incorporating a tracker sensor.
b illustrates an ultrasound probe incorporating a tracker sensor.
Reference will now be made to the accompanying drawings, which assist in illustrating the various pertinent features of the present disclosure. Although the present disclosure is described primarily in conjunction with transrectal ultrasound imaging for prostate imaging, it should be expressly understood that aspects of the present invention may be applicable to other medical imaging applications. In this regard, the following description is presented for purposes of illustration and description.
As presented, the invention is directed towards systems and methods for interpolation and reconstruction of a 3-D image from 2-D image planes/frames/slices obtained in arbitrary orientation during, for example, an unconstrained scan procedure. Also included is a method for improving interpolation.
The reconstruction method pertains to all types of 2-D image acquisition methods under various modalities and specifically for 2-D image acquisition methods used while performing an image-guided diagnostic or surgical procedure. It will be appreciated that such procedures include, but are not limited to, ultrasound guided biopsy of various organs, such as prostate (trans-rectal and trans-perineal), liver, kidney, breast, etc., brachytherapy, ultrasound guided laparoscopy, ultrasound guided surgery or an image-guided drug delivery procedures.
Most current methods for reconstructing a 3-D image from 2-D image planes assume some type of uniformity (e.g., constraint) in image acquisition. For example, most previous methods assume (or require) that the 2-D images be obtained as parallel slices or be displaced from each other through an angle while meeting at one fixed axis. The presented system and method alleviates the need for such constraints between 2-D images while permitting the images to be disposed in a common 3-D frame of reference and/or utilized to generate 3-D images.
In automated arrangements, the probe may be affixed to a positioning device (not shown) and a motor may sweep the transducer of the ultrasound probe 10 over a radial area of interest (e.g., around a fixed axis 70; see
Such handheld acquisition, however, often introduces multiple degrees of freedom into the acquired 2-D images. For example,
The imaging system 30 is operative to correlate the recorded 3-D position of the tracker 14 and a corresponding image acquired by the probe 10. As will be discussed herein, this allows for utilizing non-aligned/arbitrary images for 3-D image reconstruction. That is, the imaging system 30 utilizes the acquired 2-D images 80a-n to populate the 3-D image volume 12 or image cube as per their measured 3-D locations. See also
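The correlation of tracker pose and acquired image can be sketched as a simple homogeneous-coordinate mapping. This is a minimal illustration, not the disclosed implementation; the function name and the use of numpy 4×4 matrices are assumptions for the sketch. Each pixel of a B-mode frame lies in the z = 0 image plane, so it is represented as (u, v, 0, 1).

```python
import numpy as np

def pixel_to_world(T_tracker, T_c, u, v):
    """Map a 2-D image pixel (u, v) into 3-D world coordinates.

    T_tracker: 4x4 pose of the tracker sensor reported by the tracking system.
    T_c:       4x4 probe calibration matrix (image plane -> tracker frame).
    The pixel is expressed in homogeneous form (u, v, 0, 1) because every
    pixel of a 2-D frame lies in the z = 0 plane of the image.
    """
    p_image = np.array([u, v, 0.0, 1.0])
    return (T_tracker @ T_c @ p_image)[:3]

# With identity pose and calibration, pixel coordinates pass through unchanged:
identity = np.eye(4)
print(pixel_to_world(identity, identity, 12.0, 34.0))  # -> [12. 34.  0.]
```

Populating the image cube then amounts to evaluating this mapping for each pixel of each acquired frame using that frame's recorded tracker pose.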
The imaging system includes a computer or is interconnected to a computer system that runs application software and computer programs, which can be used to control the system components, provide user interface, and provide the features of the imaging system. The software may be originally provided on computer-readable media, such as compact disks (CDs), magnetic tape, or other mass storage medium. Alternatively, the software may be downloaded from electronic links such as a host or vendor website. The software is installed onto the computer system hard drive and/or electronic memory, and is accessed and controlled by the computer's operating system. Software updates are also electronically available on mass storage media or downloadable from the host or vendor website. The software, as provided on the computer-readable media or downloaded from electronic links, represents a computer program product usable with a programmable computer processor having computer-readable program code embodied therein. The software contains one or more programming modules, subroutines, computer links, and compilations of executable code, which perform the functions of the imaging system. The user interacts with the software via keyboard, mouse, voice recognition, and other user-interface devices (e.g., user I/O devices) connected to the computer system.
While use of a tracker 14 in conjunction with the probe 10 allows roughly aligning separate ultrasound 2-D images in a 3-D frame of reference or image cube, the orientation of the image plane 80 of the acquisition probe relative to the tracker sensor must be calibrated. That is, the image plane 80 of the probe and the tracker 14 must be calibrated so pixels in acquired 2D images can be accurately mapped to their 3D coordinates (e.g., global coordinates). The validation of the calibration is performed to confirm the accuracy of the tracking. Additional tracker sensors may be used for the correction of target object movement.
To calibrate the probe, the probe is used to image a known location, which in the embodiment of
P_tip/ref = T_ref^(-1)·P_tip = T_c·P_us = T_c·(u, v, 0, 1)^T  (1)
where P_tip/ref is a vector. For every measurement of the target point by the tracker needle/pointer, the measurement data is recorded and averaged. After taking n measurements, equation (1) becomes:
The calibration matrix is calculated by an SVD solution using:

T_c = (P_tip/ref,1, . . . , P_tip/ref,n)·T_us^T·(T_us·T_us^T)^(-1)  (3)
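Equation (3) is a right-pseudoinverse least-squares fit, which can be sketched numerically as follows. This is an illustrative sketch, not the disclosed code; the function name and matrix layout (points as homogeneous columns) are assumptions. `np.linalg.pinv` computes the pseudoinverse via SVD, matching the "SVD solution" in the text and tolerating the rank deficiency caused by the all-zero third row of T_us.

```python
import numpy as np

def solve_calibration(P_tip_ref, T_us):
    """Estimate the 4x4 calibration matrix T_c per equation (3):
        T_c = (P_tip/ref,1 ... P_tip/ref,n) · T_us^T · (T_us·T_us^T)^(-1)

    P_tip_ref: 4 x n matrix whose columns are the tracked target point
               positions in the reference frame (homogeneous coordinates).
    T_us:      4 x n matrix whose columns are the corresponding image
               coordinates (u, v, 0, 1)^T.
    """
    # pinv is the SVD-based pseudoinverse; for full-row-rank T_us it equals
    # T_us^T · (T_us·T_us^T)^(-1) exactly.
    return P_tip_ref @ np.linalg.pinv(T_us)
```

Given a set of paired measurements, the recovered matrix reproduces the measured tip positions from the image coordinates in the least-squares sense.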
Similar calibration can be done if a relative position of a feature is known. At this point, the relationship between the tracker 14 attached to the probe and the image plane is known, and the 2D images acquired by the probe may be inserted into a common frame of reference (e.g., an image cube).
It is important to validate the calibration, since validation confirms that the computed calibration matrix will accurately reconstruct the 2D plane in the tracking space. The setup of the validation is similar to that of the calibration. Again, a target point (e.g., a string phantom, bead, surface, volume, etc.) and the extra tracker pointer are used for the validation. The validation includes moving the ultrasound probe to the string phantom and making sure that the string crossing point is in the imaging plane. The probe is again fixed and the 2D coordinates (u, v) are saved. The location of the pixel is then calculated:
The tracker pointer is moved to a known point (e.g., a string crossing point in a phantom) and the readings from the tracker pointer Pact are saved. The error between the original calibration and the validation is then calculated:
E = |P_tip/ref − P_act|  (5)
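The validation error of equation (5) is simply the Euclidean distance between the calibrated prediction and the tracker-pointer reading. A minimal sketch (function name assumed for illustration):

```python
import numpy as np

def validation_error(p_tip_ref, p_act):
    """Equation (5): E = |P_tip/ref - P_act|, the Euclidean distance
    between the pixel location predicted by the calibration and the
    position actually reported by the tracker pointer."""
    return np.linalg.norm(np.asarray(p_tip_ref, float) - np.asarray(p_act, float))

# A predicted point 4 mm from the measured point yields an error of 4.0:
print(validation_error([1.0, 2.0, 3.0], [1.0, 2.0, 7.0]))  # -> 4.0
```

A small E across several validation points indicates the calibration matrix maps image pixels to tracking space accurately.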
The calibrated and validated probe may now be used to acquire images in its frame of reference. That is, the phantom may be removed and a patient may be located within the frame of reference for image acquisition.
During scanning, a user can select a patient region of interest to define the 3D image volume to be scanned. During the scanning, the series of 2D image planes 80 acquired by the probe is displayed in the 3D volume with a degree of transparency so that the user can see how the scanning is progressing. That is, if an area of the image volume is not covered, the user may reposition the probe to acquire data for that area. See.
There are three major scanning methods for 3D B-scan ultrasound systems. The first is rotary scanning in 2D planes at equal angles. The second is the linear scan. The third is freehand scanning with a 2D US transducer. In the first situation, the positions and values are measured in polar coordinates on planes at equal angles. In the second and third situations, the positions and values are measured in polar coordinates on planes with random gaps and directions. For the rotary scanning, if the angle between two scans is made small enough, e.g., 1 degree, the volume of interest (e.g., image area or cube) can be totally covered (See
For the freehand scanning, if the acquired 2D image planes 80 cover 90% of the 3D image area or cube, (See
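The freehand reconstruction step can be sketched as a pixel-splatting pass: each pixel of each arbitrarily oriented frame is mapped into the voxel grid via the tracker pose and calibration matrix, and overlapping samples are averaged. This is a simplified nearest-voxel sketch, not the disclosed interpolation method; the function name, voxel layout, and averaging scheme are assumptions.

```python
import numpy as np

def splat_frames_into_volume(frames, poses, T_c, shape, voxel_size):
    """Accumulate arbitrarily oriented 2-D frames into a 3-D voxel grid.

    frames:     list of 2-D arrays (B-mode images).
    poses:      list of 4x4 tracker poses, one per frame.
    T_c:        4x4 probe calibration matrix.
    shape:      (nx, ny, nz) of the output volume.
    voxel_size: edge length of a voxel in world units.
    Each pixel (u, v) maps to world space via pose @ T_c @ (u, v, 0, 1)
    and is deposited into the nearest voxel; overlaps are averaged.
    """
    acc = np.zeros(shape)
    cnt = np.zeros(shape)
    for img, pose in zip(frames, poses):
        M = pose @ T_c
        for (v, u), val in np.ndenumerate(img):
            x, y, z = (M @ np.array([u, v, 0.0, 1.0]))[:3] / voxel_size
            i, j, k = int(round(x)), int(round(y)), int(round(z))
            if 0 <= i < shape[0] and 0 <= j < shape[1] and 0 <= k < shape[2]:
                acc[i, j, k] += val
                cnt[i, j, k] += 1
    # Average overlapping contributions; untouched voxels remain zero
    # (a fuller method would interpolate these gaps from neighbors).
    return np.divide(acc, cnt, out=np.zeros(shape), where=cnt > 0)
```

Voxels left empty after splatting correspond to uncovered areas of the image cube, which is what prompts the user to reposition the probe.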
In many instances, it is desirable to track a desired position within an object of interest. For instance, performing a biopsy may require that a biopsy device be guided to a desired location within the object. Accordingly, the generated 3D image of such an object may be used for such tracking. If the object of interest is not moving, tracking relates the current live scanning from the ultrasound image to the scanned image so as to confirm that a certain/desired location inside the object is reached. To perform such tracking, the system must calculate the destination location using the transformation matrix and display the region for tracking in the scanned cube.
To provide improved real-time tracking, it may be necessary to synchronize the 2D input image and the reading of the tracker. The readings of the tracker and the images are graphically shown in
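One common way to synchronize the tracker stream with the image stream is to interpolate the tracker readings to each image timestamp. This is an assumed approach for illustration, not the disclosed synchronization method; the function name is hypothetical, and only the position component is interpolated here (a fuller treatment would also interpolate orientation, e.g., with quaternion slerp).

```python
import numpy as np

def position_at(image_time, tracker_times, tracker_positions):
    """Linearly interpolate tracker positions to an image timestamp.

    tracker_times:     sorted 1-D array of tracker reading timestamps.
    tracker_positions: n x 3 array of positions at those timestamps.
    Returns the estimated (x, y, z) position at image_time.
    """
    return np.array([
        np.interp(image_time, tracker_times, tracker_positions[:, k])
        for k in range(3)
    ])

# An image timestamp halfway between two readings yields the midpoint:
times = np.array([0.0, 1.0])
positions = np.array([[0.0, 0.0, 0.0], [2.0, 4.0, 6.0]])
print(position_at(0.5, times, positions))  # -> [1. 2. 3.]
```

Pairing each frame with an interpolated pose rather than the nearest raw reading reduces the misregistration caused by the two streams arriving at different rates.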
In real-world applications, it is very common that the patient moves during scanning or navigating, which could create significant error. The present system uses a novel approach to apply a movement correction. More specifically, an additional sensor(s) is provided that outputs patient movement information. That is, another tracker sensor 18 is attached to the patient/target object and reports the movement of the patient. See.
In the scanning phase, once the user begins scanning, the volume around the home position of the probe is filled by the contiguous 2D ultrasound images. The location of the tracker, which is attached to the patient, is Ppat, and the rotation matrix is Tpat. Since the location and rotation of the patient tracker sensor 18 are continuously read, if the patient moves during the procedure, the displacement of the tracker is determined and the transformation matrix Tpat can be obtained. In the reconstruction strategy, the location of the 2D image will be corrected as:
P_new = T_c·T_pat·P_us  (6)
where P_us represents the locations of the pixels in the live image, and T_c is the calibrated transformation matrix. That is, if the tracker position/rotation changes, the change can be detected by the system, and the self-correction will be applied to the whole volume. Similarly, if the movement happens in the tracking phase, the self-correction can be applied so the error is reduced. Furthermore, multiple sensors can be attached to the patient so the movement can be better defined.
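Equation (6) amounts to composing the patient-motion transform with the calibrated transform before mapping each live pixel. A minimal sketch (function name assumed; matrices are 4×4 homogeneous transforms):

```python
import numpy as np

def correct_for_patient_motion(T_c, T_pat, p_us):
    """Equation (6): P_new = T_c · T_pat · P_us.

    T_pat: 4x4 transform describing the patient displacement, derived
           from the patient-mounted tracker sensor readings.
    p_us:  homogeneous pixel location (u, v, 0, 1) in the live image.
    """
    return T_c @ T_pat @ np.asarray(p_us, float)

# A 5-unit patient shift along x moves the corrected pixel location by 5:
T_pat = np.eye(4)
T_pat[0, 3] = 5.0
print(correct_for_patient_motion(np.eye(4), T_pat, [10.0, 34.0, 0.0, 1.0]))
```

Because T_pat is re-read continuously, the same composition can be re-applied whenever the patient tracker reports a new displacement, correcting the whole volume in place.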
Another advantage of multiple sensors is that during a biopsy/therapy procedure, a sensor may be attached to the biopsy/therapy needle (e.g., at the base of the needle or introducer) so the needle trajectory is tracked during the operation process. The calibration of the needle with the sensor is done prior to the procedure and is similar to the calibration discussed above. Specifically, once a tracker is attached to the needle, the extra pointer sensor 22 marks points (such as the needle tip). That is, various needle locations are measured using the extra pointer (e.g., needle tracker/pointer 22; see
T_t-s can be different depending on how the tracker sensor is installed on the needle. Accordingly, by tracking the tracker on the needle (e.g., at the needle base), the tip position of the needle may be identified and displayed.
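Recovering the tip position from the base-mounted tracker is another transform composition: the fixed tracker-to-tip transform found during needle calibration is applied to the live tracker pose. This sketch is illustrative (function name assumed); the tip is taken as the image of the origin under the combined transform.

```python
import numpy as np

def needle_tip_position(T_tracker, T_t_s):
    """Estimate the needle tip from a tracker mounted at the needle base.

    T_tracker: 4x4 live pose of the needle-mounted tracker sensor.
    T_t_s:     fixed 4x4 tracker-to-tip transform determined during
               needle calibration; its value depends on how the sensor
               is installed on the needle.
    """
    origin = np.array([0.0, 0.0, 0.0, 1.0])
    return (T_tracker @ T_t_s @ origin)[:3]

# For a 100 mm offset along the needle axis, the tip sits 100 units
# from the tracker origin when the tracker pose is the identity:
T_t_s = np.eye(4)
T_t_s[2, 3] = 100.0
print(needle_tip_position(np.eye(4), T_t_s))  # -> [  0.   0. 100.]
```

Evaluating this at each tracker reading yields the needle trajectory that is displayed during the procedure.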
During the online portion of the procedure, two-dimensional ultrasound images/image planes 80 are acquired utilizing a two-dimensional imaging system 122. Such a scanning system utilizes, for example, a two-dimensional transrectal ultrasound system that incorporates the tracker/probe 10 as well as the predetermined calibration results 112. Such a system may be operative to generate a three-dimensional volumetric image where the freehand two-dimensional images 80 are arranged into a three-dimensional space and/or interpolated to generate a three-dimensional volumetric image 136. Once such a three-dimensional image is generated, various tracking and display processes 160 may be performed. In this regard, information within the three-dimensional image may be tracked in real time to provide an output of tracked locations on a display 168. In order to provide updated and real time tracking and display, the process may receive live ultrasound images, information from the tracker/probe, information from the tracker needle, and/or information from a tracker interconnected to a patient.
Generally, the above-noted system allows for acquiring multiple individual ultrasound image planes and reassembling those multiple individual image planes into a common frame of reference and subsequently utilizing the combined information of these images to generate a three-dimensional volumetric image in which one or more points of interest and/or needles may be tracked (e.g. in real time). Further, such a system may be applicable for use with existing two-dimensional ultrasound machines. In this regard, all that is required is that a tracker 14 be securely affixed to an ultrasound probe prior to calibration. However, it will be appreciated that various ultrasound probes may have built in trackers for use with the system without, for example, utilizing a separate tracker interconnected to the probe.
This application is a continuation in part of U.S. patent application Ser. No. 12/840,987 filed on Jul. 21, 2010 and which claims the benefit of filing date of U.S. Provisional Application No. 61/227,274 entitled: “3-D Self-Correcting Freehand Ultrasound Tracking System” and having a filing date of Jul. 21, 2009, the entire contents of both of which are incorporated herein by reference.
Number | Date | Country
---|---|---
61227274 | Jul 2009 | US

Relation | Number | Date | Country
---|---|---|---
Parent | 12840987 | Jul 2010 | US
Child | 13041990 | | US