The present invention generally relates to automatic location of specific zones of an anatomical structure for visual guidance during an ultrasound-guided procedure (e.g., a prostate biopsy). The present invention specifically relates to zone labeling of a three-dimensional (“3D”) model of the anatomical structure as a basis for visualizing guidance through the zones during an ultrasound-guided procedure.
Medical image registration of a preoperative anatomical image with an intraoperative anatomical image has been utilized to facilitate image-guided interventional/surgical/diagnostic procedures. The main goal of medical image registration is to calculate a geometrical transformation that aligns the same or different views of the same anatomical structure within the same or different imaging modalities.
More particularly, prostate cancer affects one in six men in the western world, and it is the second leading cause of cancer death in American men. Transrectal ultrasound ("TRUS")-guided systematic biopsy with different schemes (e.g., sextant, extended 12 core, etc.) is considered to be the standard of care in clinical practice. However, as two-dimensional ("2D") ultrasound imaging is usually used, the field of view is limited. Furthermore, due to the lack of landmarks inside the prostate under ultrasound imaging, ultrasound-based prostate imaging requires a significant amount of training and experience to precisely navigate to the desired location for biopsy.
Of importance, according to the urological literature on prostate biopsy, cancer locations are not uniformly distributed inside the prostate gland. Research shows that there are some high-risk areas inside the gland with a higher probability of harboring detectable cancer. Known systematic biopsy schemes are designed in special ways to cover those high-risk areas so as to achieve a maximal cancer detection rate with a certain number of biopsies. However, due to the limitations of current ultrasound imaging guidance, some zones (e.g., the horns of the prostate gland) tend to be missed in many cases. This may result in higher false-negative biopsy rates, and some cancer cases may be missed.
In summary, it is known that some high-risk areas of the prostate can be significantly undersampled during biopsy due to the limitations of the imaging guidance technique. While such limitations may be compensated by the experience of physicians, the present invention provides a systematic technical approach to improve the outcome of biopsy consistently by assisting physicians with automatic zone identification. More particularly, an ultrasound model of an anatomical structure is labeled with two (2) or more procedurally-defined zones derived from a scheme designed for optimizing an ultrasound-guided procedure on the anatomical structure. For example, an ultrasound model of the prostate may be labeled with procedurally-defined zones that are associated with a known scheme for ultrasound-guided biopsy sampling of the gland, particularly a scheme considered to be the standard of care in clinical practice (e.g., sextant, extended 12 core, saturation sampling, anterior sampling, etc.).
One form of the present invention is a system for automatic zone visualization employing an ultrasound probe and an ultrasound imaging workstation. In operation, the ultrasound probe scans an anatomical region, and the ultrasound imaging workstation tracks a generation of an ultrasound volume of an anatomical structure within a patient space responsive to the scan of the anatomical region by the ultrasound probe. The ultrasound imaging workstation further tracks a labeling of procedurally-defined zones of the anatomical structure within the ultrasound volume derived from an ultrasound volume model of the anatomical structure labeled with the procedurally-defined zones, to thereby facilitate an ultrasound-guided visualization of the anatomical structure.
The foregoing form and other forms of the present invention as well as various features and advantages of the present invention will become further apparent from the following detailed description of various embodiments of the present invention read in conjunction with the accompanying drawings. The detailed description and drawings are merely illustrative of the present invention rather than limiting, the scope of the present invention being defined by the appended claims and equivalents thereof.
For purposes of the present invention, the term "procedurally-defined zone" is broadly defined as a zone of an anatomical structure derived from a scheme designed for optimizing an ultrasound-guided procedure on the anatomical structure. For example, an ultrasound model of the prostate may be labeled with procedurally-defined zones that are associated with a known or proposed scheme for ultrasound-guided biopsy sampling of the gland, particularly a scheme considered to be the standard of care in clinical practice (e.g., sextant, extended 12 core, saturation sampling, anterior sampling, etc.).
Also, for purposes of the present invention, the terms “tracking”, “reconstruction”, “segmentation” and “registration” as well as related terms are to be broadly interpreted as known in the art of the present invention.
In practice, the present invention applies to any anatomical regions (e.g., head, thorax, pelvis, etc.) and anatomical structures (e.g., bones, organs, circulatory system, digestive system, etc.).
To facilitate an understanding of the present invention, exemplary embodiments of the present invention will be provided herein directed to automatic zone visualization of ultrasound imaging of a prostate. Nonetheless, those having ordinary skill in the art will appreciate how to execute automatic zone visualization of ultrasound imaging for all anatomical regions and structures therein.
Referring to
An intraoperative ultrasound system 30 employs a 2D ultrasound probe 31 and an ultrasound imaging workstation 32 to generate a stream of ultrasound images 33 of the anatomical tissue of a prostate 13 of a patient 12 as patient 12 is being scanned by 2D ultrasound probe 31. As will be further described in connection with
Please note that the images of the zone labeled prostate within model 41 and volume 42 are not intended to be anatomically correct, but serve only as a simplified example of a zone labeled prostate for purposes of facilitating a description of an automatic zone visualization of the present invention based on zone labeled model 41z and volume 42z, which will now be provided herein.
Referring to
Generally, a preoperative resource 60p includes a stream of scanned ultrasound images 23 of prostate 11 for each subject 10 (
Between inputs 60 and output 62, image registration 61 includes various processes for building prostate volume model 41z, tracking a reconstruction of a volume image 42 from ultrasound image stream 33, and mapping a segmented prostate from volume image 42 to volume model 41z. Through these processes, the zones can be mapped to the real-time ultrasound stream 33z by using the device tracking data TD.
More particularly, ultrasound system 30 (
Ultrasound system 30 also employs or cooperates with a tracking system to obtain the pose and position of 2D ultrasound probe 31 in real time for mapping ultrasound image stream 33 (or a 3D image) to the 3D patient space. The pose and position of ultrasound probe 31 may be obtained by using any known device tracking technique (e.g., electromagnetic tracking or optical sensor tracking).
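By way of a non-limiting illustration, the mapping of a tracked 2D ultrasound pixel into the 3D patient space may be sketched in Python as follows. The sketch assumes the tracking system supplies a 4x4 probe pose matrix and that a fixed image-to-probe calibration matrix is available from probe calibration; all function and variable names are hypothetical.

```python
import numpy as np

def pixel_to_patient(pixel_rc, spacing_mm, T_probe_to_patient, T_image_to_probe):
    """Map a 2D ultrasound pixel (row, col) into 3D patient space.

    T_probe_to_patient: 4x4 pose of the probe sensor from the tracker.
    T_image_to_probe: fixed 4x4 calibration between the image plane and
    the probe sensor. Both are assumptions of this sketch.
    """
    r, c = pixel_rc
    # Point in image-plane coordinates (mm); z = 0 lies on the scan plane.
    p_img = np.array([c * spacing_mm[0], r * spacing_mm[1], 0.0, 1.0])
    p_patient = T_probe_to_patient @ T_image_to_probe @ p_img
    return p_patient[:3]

# Toy check: identity calibration, pose that is a pure translation.
T_cal = np.eye(4)
T_pose = np.eye(4)
T_pose[:3, 3] = [10.0, 0.0, 5.0]
point = pixel_to_patient((100, 200), (0.2, 0.2), T_pose, T_cal)
```

In a real system the pose matrix is refreshed per frame, so every incoming 2D image is positioned in patient space at acquisition time.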
Flowchart 50 has four main stages S51-S54. A stage S51 of flowchart 50 encompasses workstation 20 (
In one embodiment, a preferred biopsy sampling scheme is added to prostate volume model 41 by identifying a number N of locations in the prostate (typically N ≥ 10). Each location can be identified as a geometrical object (e.g., a point, a small sphere, or another simple shape such as an ellipsoid) in prostate volume model 41. These locations can later serve to guide the systematic sampling of the prostate according to the desired scheme.
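A minimal sketch of such scheme locations, assuming each target is stored as a labeled sphere in model coordinates, could look like the following Python fragment. The two target labels, coordinates, and radii shown are purely illustrative placeholders, not clinical values; a full scheme would list all N cores.

```python
import numpy as np

# Hypothetical targets of a sampling scheme: each entry is a labeled
# sphere (center in model coordinates, mm; radius in mm). Only two of
# the N cores are shown for illustration.
SCHEME = {
    "left_base_lateral":  {"center": (-15.0, 20.0, 0.0), "radius": 2.0},
    "right_base_lateral": {"center": (15.0, 20.0, 0.0), "radius": 2.0},
}

def hit_target(tip, scheme):
    """Return the label of the first target sphere containing the
    needle tip position, or None if no target is hit."""
    for label, t in scheme.items():
        if np.linalg.norm(np.asarray(tip) - np.asarray(t["center"])) <= t["radius"]:
            return label
    return None
```

Representing each location as a simple geometric object makes it straightforward to both render the target and test whether a planned biopsy path intersects it.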
A stage S52 of flowchart 50 encompasses workstation 30 (
A stage S53 of flowchart 50 encompasses workstation 30 segmenting the prostate from ultrasound volume 42 and registering prostate volume model 41 with the segmented prostate. Specifically, a main purpose of performing segmentation is to assist registering the prostate volume model 41 to ultrasound volume 42, as it is very challenging to directly map prostate volume model 41 to ultrasound image stream 33. The segmentation may be performed in two (2) ways. The first option is to segment the ultrasound sweep data frame by frame. The obtained 2D segmentation sequences are then mapped to 3D space using the same transformations as in the reconstruction stage S52 to get the 3D segmentation. Alternatively, the reconstructed ultrasound volume 42 is segmented directly in 3D space by using a model based approach. If the prostate boundary in ultrasound volume 42 is not as clear as in the 2D ultrasound image stream 33, the two (2) options may be combined to achieve better segmentation performance.
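The first segmentation option, lifting per-frame 2D contours into 3D using the same per-frame transformations as the reconstruction, may be sketched as below. The sketch assumes each frame's image-to-patient transform is a 4x4 matrix from the tracker; names are illustrative.

```python
import numpy as np

def contours_to_3d(contours, transforms, spacing_mm):
    """Lift per-frame 2D prostate contours into one 3D point cloud.

    contours: list of (K_i, 2) arrays of (row, col) contour points,
    one array per ultrasound frame.
    transforms: list of 4x4 image-to-patient matrices, the same
    transforms used when compounding the frames into a volume.
    """
    cloud = []
    for pts, T in zip(contours, transforms):
        for r, c in pts:
            # Contour point on the scan plane (z = 0), in mm.
            p = np.array([c * spacing_mm[0], r * spacing_mm[1], 0.0, 1.0])
            cloud.append((T @ p)[:3])
    return np.array(cloud)
```

The resulting point cloud then serves as the 3D segmentation surface, or can be fused with a direct model-based 3D segmentation when the boundary is unclear in the reconstructed volume.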
Once the reconstructed ultrasound volume 42 is segmented, a surface based registration method (e.g., iterative closest points based registration) is applied to register prostate volume model 41 with the segmented prostate of reconstructed ultrasound volume 42 to yield a zone labeled ultrasound volume 42z. With this registration, the procedurally-defined zones labeled in prostate volume model 41z are mapped to the patient space. With the real time tracking information available, the zones may be transformed to ultrasound image stream 33.
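A minimal iterative-closest-points sketch of such a surface based registration is given below. It uses brute-force nearest neighbours and a closed-form (Kabsch/SVD) rigid fit per iteration; a production system would add a k-d tree, outlier rejection, and possibly a deformable refinement, none of which are shown here.

```python
import numpy as np

def icp_rigid(model_pts, target_pts, iters=20):
    """Rigidly align model_pts (N,3) onto target_pts (M,3) by ICP.

    Returns the accumulated rotation R (3x3) and translation t (3,)
    such that R @ p + t maps a model point into the target frame.
    """
    src = np.asarray(model_pts, float).copy()
    tgt = np.asarray(target_pts, float)
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iters):
        # Correspondences: nearest target point for every model point.
        d = np.linalg.norm(src[:, None, :] - tgt[None, :, :], axis=2)
        matched = tgt[np.argmin(d, axis=1)]
        # Best rigid transform for these correspondences (Kabsch).
        mu_s, mu_m = src.mean(0), matched.mean(0)
        H = (src - mu_s).T @ (matched - mu_m)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:  # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_m - R @ mu_s
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

Applying the recovered transform to the zone labels of the model carries those labels into the patient space of the reconstructed volume.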
A stage S54 of flowchart 50 encompasses workstation 30 displaying a zone visualization in real time. Specifically, once the zones are mapped to ultrasound image stream 33, procedurally-defined zone(s) can be visualized over an ultrasound image 33z when intersected. The intersected zone(s) are highlighted with a zone label displayed. In addition, different visualized zones may be differentiated with color coding, text labels, or audio feedback. For example, while a set of zones is being intersected by an ultrasound image 33z, the intersection areas are shown in each corresponding color or label, with or without audio feedback. As an additional or alternative approach, the locations of the biopsy sampling scheme are visualized jointly with ultrasound image 33z. This helps the user to adjust the look direction of ultrasound probe 31 until the biopsy path is aligned with the locations of the sampling scheme.
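One way to realize the color-coded overlay is to sample the zone-labeled volume along the current image plane, as sketched below. The sketch assumes the zone labels are stored as integer ids in a 3D array on a 1 mm grid and that a combined tracking-plus-registration transform maps image-plane millimetres into volume voxel indices; the colors and names are illustrative only.

```python
import numpy as np

ZONE_COLORS = {1: (255, 0, 0), 2: (0, 255, 0)}  # illustrative color coding

def zone_overlay(frame_shape, T_image_to_volume, label_volume, spacing_mm):
    """Build an RGB overlay marking which labeled zones the current
    2D ultrasound frame intersects.

    label_volume: 3D int array of zone ids (0 = background) on a 1 mm grid.
    T_image_to_volume: 4x4 matrix from image-plane mm coordinates to
    volume voxel coordinates (tracking composed with registration).
    """
    rows, cols = frame_shape
    overlay = np.zeros((rows, cols, 3), np.uint8)
    for r in range(rows):
        for c in range(cols):
            p = T_image_to_volume @ np.array(
                [c * spacing_mm[0], r * spacing_mm[1], 0.0, 1.0])
            i, j, k = np.round(p[:3]).astype(int)
            if (0 <= i < label_volume.shape[0]
                    and 0 <= j < label_volume.shape[1]
                    and 0 <= k < label_volume.shape[2]):
                zone = label_volume[i, j, k]
                if zone in ZONE_COLORS:
                    overlay[r, c] = ZONE_COLORS[zone]
    return overlay
```

The overlay is then alpha-blended onto the live frame; per-pixel Python loops are shown for clarity, whereas a real-time implementation would vectorize or run on the GPU.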
Flowchart 50 is terminated upon completion of the procedure.
In an alternative embodiment of flowchart 50, stage S51 and the registration of stage S53 may be omitted, whereby prostate volume model 41 may be defined using reconstructed ultrasound volume 42 in lieu of compounding a model using X number of prior subjects. Specifically, the procedurally-defined zones are created based on geometric sub-division of an intra-procedural segmentation of reconstructed ultrasound volume 42. Examples of such sub-division include, but are not limited to, (1) dividing the intra-procedural segmentation of reconstructed ultrasound volume 42 into two (2) halves along the mid-sagittal plane, thus creating a "left" and a "right" zone, and (2) dividing the intra-procedural segmentation of reconstructed ultrasound volume 42 into thirds using axial cut-planes, thus creating base/mid-gland/apex zones.
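Such geometric sub-division may be sketched as follows for a binary segmentation mask. The sketch combines a left/right split with base/mid-gland/apex thirds into six zone ids; splitting at the segmentation centroid and the particular axis conventions are assumptions of this illustration, not requirements of the method.

```python
import numpy as np

def subdivide_zones(mask, lr_axis=0, axial_axis=2):
    """Create procedurally-defined zones by geometric sub-division of a
    binary prostate segmentation.

    Returns an integer label volume: a left/right split at the
    segmentation centroid along lr_axis is combined with three equal
    bands along axial_axis, yielding zone ids 1..6; 0 stays background.
    """
    labels = np.zeros(mask.shape, np.int32)
    idx = np.argwhere(mask)
    if idx.size == 0:
        return labels
    mid = idx[:, lr_axis].mean()            # mid-sagittal split position
    lo, hi = idx[:, axial_axis].min(), idx[:, axial_axis].max()
    edges = np.linspace(lo, hi + 1, 4)      # three axial bands
    for voxel in idx:
        side = 0 if voxel[lr_axis] < mid else 1
        band = min(int(np.searchsorted(edges, voxel[axial_axis], 'right')) - 1, 2)
        labels[tuple(voxel)] = 1 + side * 3 + band
    return labels
```

Because the zones are derived directly from the intra-procedural segmentation, no population model or surface registration is needed in this embodiment.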
Referring to
While various embodiments of the present invention have been illustrated and described, it will be understood by those skilled in the art that the embodiments of the present invention as described herein are illustrative, and various changes and modifications may be made and equivalents may be substituted for elements thereof without departing from the true scope of the present invention. In addition, many modifications may be made to adapt the teachings of the present invention without departing from its central scope. Therefore, it is intended that the present invention not be limited to the particular embodiments disclosed as the best mode contemplated for carrying out the present invention, but that the present invention includes all embodiments falling within the scope of the appended claims.
| Filing Document | Filing Date | Country | Kind |
| --- | --- | --- | --- |
| PCT/IB2015/051135 | 2/16/2015 | WO | 00 |
| Number | Date | Country |
| --- | --- | --- |
| 61945897 | Feb 2014 | US |