The present invention relates generally to imaging systems. More particularly, the present invention relates to ultrasound imaging systems.
There are a number of disadvantages associated with various imaging systems that are currently in use, particularly when used for medical applications. For example, a number of imaging techniques, such as x-ray imaging, mammography, and computed tomographic (CT) scans, use ionizing radiation that presents a risk of cell mutation when used medically. Also, CT scans and magnetic resonance imaging (MRI) techniques both involve procedures that are relatively expensive, a factor that itself limits their use. A significant disadvantage of methods such as mammography is that they rely on two-dimensional images that may disguise three-dimensional structure information that can be critical for diagnosis.
As an alternative to these imaging technologies, the medical community has looked to ultrasound to provide a safe, low-cost, high-resolution imaging tool. There are, however, significant limitations to conventional ultrasound, which may be used in A or B scanning modes. An A scan is purely one-dimensional, while a B scan produces a two-dimensional image; as a result, imaging applications tend to use ultrasonic B scanning. In such conventional ultrasound analysis, a small array of elements is moved by hand in contact with the tissue under study. The array sends out waves that reflect from tissues back to the same array. This arrangement has two major drawbacks. First, ultrasonic B scans do not provide information on the properties of the materials themselves; rather, they provide information only on the reflectivity of the boundaries between different types of materials. Second, the array captures only the radiation reflected back to the hand-held sensing array. Considerable information exists in the transmitted waves, but this information is neither captured nor used diagnostically in conventional ultrasonic B scans.
An additional limitation of traditional ultrasound techniques is that, when an unknown object is examined, it is difficult to determine success criteria for the image construction. Thus, it would be useful to be able to benchmark the image-construction process in order to determine when sufficient accuracy or precision has been obtained. Moreover, in the past, it has been difficult to correlate the positions of features in an image with the position of the patient. It therefore is desirable to develop methods of judging the accuracy of an ultrasound image against objective criteria and adjusting the image to correlate to those criteria.
Another useful application for ultrasound imaging is analyzing changes in a tissue, for example, by creating multiple ultrasound representations of the tissue, perhaps over the course of several days, weeks, months or years. Such analysis is most beneficial, however, if it can be undertaken from a consistent frame of reference, such that the size and orientation of the tissue and any features therein are consistently depicted in each representation. Thus, in creating an ultrasound representation, it would be desirable to develop a method of comparing multiple ultrasound scans from a consistent frame of reference.
There is thus a need for an apparatus and method that provides improved imaging, particularly as applied to medical applications.
Embodiments of the invention thus provide a method and system for examining tissue that solve certain deficiencies with the prior art. In one embodiment, a field, including the tissue to be examined as well as one or more registration fiducials, is insonified. The insonifying acoustic waves are scattered by the field so that scattered acoustic information, including in some embodiments a mix of reflected and transmitted acoustic waves, is received, producing a data set. A representation of a portion of the tissue is then generated from the data set. The representation includes a depiction of the registration fiducials. The representation may be three dimensional or may comprise a two-dimensional slice through the portion of the tissue. In one embodiment, the representation comprises an image.
In some embodiments, the registration fiducial comprises an object of known acoustic properties. In such an embodiment, the representation might be calibrated to correlate the depicted properties of the object with the object's known properties. In some embodiments, the field may be insonified a plurality of times, generating a plurality of data sets from which a plurality of representations may be generated. In such an embodiment, the registration fiducials might comprise a plurality of fiducial markers. The relative positions of the plurality of fiducial markers in the representations can be correlated, allowing a feature of the tissue to be localized in the representations. The methods described above may be implemented with a computer program embodied in a computer-readable storage medium.
Another embodiment of the invention is a system for examining tissue, including a sensor system. The sensor system includes a plurality of acoustic transmission elements and acoustic receiving elements disposed to surround a portion of the tissue being examined. The sensor system also includes one or more registration fiducials. The system may also have a control system, including a controller that controls the acoustic transmission elements and the acoustic receiving elements to insonify the field, receive scattered acoustic information and produce a data set from the received acoustic information. The system also might have a processing system with a processor that generates one or more representations of the field from the data sets produced by the control system, such representations depicting at least a portion of the tissue insonified as well as the registration fiducials. In some embodiments, the processor might also process the representations to correlate the depictions of the registration fiducials with the fiducials' actual properties.
A further understanding of the nature and advantages of the present invention may be realized by reference to the remaining portions of the specification and the drawings, wherein like reference numerals are used throughout the several drawings to refer to similar components. In some instances, a sublabel is associated with a reference numeral and appended to the reference numeral with a hyphen to denote one of multiple similar components. When reference is made to a reference numeral without specification to an existing sublabel, it is intended to refer to all such multiple similar components.
1. Overview
Embodiments of the invention are directed generally to a method and apparatus for examining an object under study, such as tissue.
The system includes a sensor system 104, a control system 108, and a processing system 112. Each of these systems is described in greater detail in the following commonly assigned patents and applications, the entire disclosures of all of which are herein incorporated by reference for all purposes: U.S. Pat. No. 6,385,474 entitled “METHOD AND APPARATUS FOR HIGH-RESOLUTION DETECTION AND CHARACTERIZATION OF MEDICAL PATHOLOGIES,” filed Mar. 19, 1999 by John D. Rather et al., which is a nonprovisional of U.S. Prov. Pat. Appl. No. 60/078,788 entitled “HIGH RESOLUTION ULTRASOUND ANATOMICAL IMAGING SYSTEM,” filed Mar. 20, 1998 by John D. Rather; U.S. patent application Ser. No. 10/323,354 entitled “COMPUTERIZED ULTRASOUND RISK EVALUATION SYSTEM,” filed concurrently with this application by Neb Duric et al.; and U.S. patent application Ser. No. 10/323,467 entitled “DIAGNOSTIC ANALYSIS OF ULTRASOUND DATA,” filed concurrently with this application by David H. Chambers et al.
A connection 116 is provided for the transfer of information between the sensor system 104 and the control system 108 and a connection (not shown in
In some embodiments, the sensor system 104 might comprise acoustic transmission and acoustic receiving elements: insonification may be achieved with the acoustic transmission elements, and scattered acoustic information may be received with the acoustic receiving elements. In one embodiment, the acoustic transmission elements and the acoustic receiving elements are configured as one or more arrays and are comprised by a paddle. The arrays may be configured for motion to allow the field to be scanned. In one embodiment, the paddle also comprises a pliable bladder configured for contacting the tissue to improve the level of comfort of the patient and to examine regions otherwise difficult to access. The pliable bladder may contain an acoustically transparent liquid. In some embodiments, a second similarly configured paddle may be provided so that scanning of the tissue may be performed by moving arrays of transmission and receiving elements in the separate paddles in parallel.
In the embodiment shown, the sensor system 104 includes a support 136, a source for power connections 120, and a sensor that includes a pair of paddles 128. The lower paddle 128-2 is fixed to the support 136, but the upper paddle 128-1 is configured to be moved with a handle 132 to compress the patient's breast between the two paddles 128. Each of the paddles 128 comprises arrays of ultrasonic transmission and receiving elements (“transducers”). In one embodiment, 512 transmission elements and 512 receiving elements are provided in each paddle. Included in the sensor system are one or more registration fiducials 130. It should be noted that registration fiducials can be any objects or marks within the insonification field that can be measured with or identified by acoustic radiation. For example, in the present embodiment, the registration fiducials 130 are markers implanted into the paddles 128, such that the registration fiducials are included in the insonification field produced by the transducers.
The control system 108 comprises hardware used to form and time ultrasonic transmission pulses and circuitry that records the received ultrasonic information. In operation, the patient 140 has an interaction 152 with the sensor system 104 by being positioned so that the paddles 128 contact the patient's breast. The operator 148 has an interaction 154 with the control system 108 to set up the operational parameters. In one embodiment, the control system 108 is configured to provide a graphical user interface from which operational parameters such as mode selection and timing initiation may be established. The control system 108 derives control information from the instructions provided by the operator 148.
Once the operation mode has been established, the control system 108 directs the sensor system 104 to begin acquiring data via interaction 160. The sensor, shown as paddles 128 in the illustrated embodiment, insonifies a field, the field including the tissue (not shown) and the registration fiducials 130, and receives acoustic information scattered from the field. Scattered acoustic information can be any acoustic radiation received by the sensor system that contains information about objects in the field. For example, in certain embodiments, scattered acoustic information is acoustic radiation reflected by, refracted by or transmitted through the objects in the field. In some embodiments, scattered acoustic information is some combination of transmitted and reflected acoustic information.
Transducers within the paddles 128 convert the received acoustic information into electrical signals that are communicated back to the control system 108 through interaction 158. The control system 108 performs an analysis of the electrical signals to create a data set that is transmitted via interaction 156 to the processing system 112. The processing system generates a representation of the field, the representation including a depiction of the registration fiducials 130 as well as the tissue, and processes the representation based on the depiction of the registration fiducials 130.
The processed representation may then be transmitted to the control system 108 through interaction 156. A professional evaluator 144, such as a radiologist, may have a direct interaction 164 with the processing system 112 to view the representation. In alternative embodiments, selected representations may be printed or stored for later viewing and analysis by the evaluator 144.
A tomographic “view” is defined by the data generated for transmission of acoustic radiation from a single transmission element and reception by a plurality of the receiving elements. A tomographic “slice” is defined by the data generated for a plurality of views, i.e., data derived from transmission of acoustic radiation from a plurality of transmission elements and reception by a plurality of receiving elements.
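The view/slice hierarchy above can be sketched as a simple data structure. This is an illustrative model only, not part of the disclosed system; the class and field names are assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class View:
    # data generated by one transmission element, received by many elements
    transmitter_id: int
    receiver_ids: List[int]
    waveforms: List[List[float]]   # one sampled waveform per receiving element

@dataclass
class Slice:
    # a slice aggregates the views from a plurality of transmission elements
    views: List[View] = field(default_factory=list)

# two views, each recorded by three receivers with four samples apiece
v0 = View(transmitter_id=0, receiver_ids=[0, 1, 2],
          waveforms=[[0.0] * 4 for _ in range(3)])
v1 = View(transmitter_id=1, receiver_ids=[0, 1, 2],
          waveforms=[[0.0] * 4 for _ in range(3)])
tomographic_slice = Slice(views=[v0, v1])
```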
In block 226, the processing system 112 processes the representations to correlate the depicted properties of the registration fiducials with the fiducials' actual properties. By way of example, as described below, in certain embodiments, the registration fiducials are objects with known acoustic properties. Such properties may include sound speed, attenuation, density, compressibility, acoustic impedance change, and the like. After the representation is derived from the data set, the representation can be calibrated with reference to the object of known acoustic properties. For instance, if the object has a known sound speed, the representation can be adjusted so that the represented sound speed of the object corresponds to the known sound speed for that object. Thus calibrated, the representation will more accurately depict the sound speed of the tissue insonified, enabling more accurate examination of the tissue. In other embodiments, the registration fiducials might comprise a plurality of objects with known acoustic properties.
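The sound-speed calibration described above can be illustrated with a minimal sketch, assuming the representation is a sound-speed map and the correction is a single multiplicative scale; the function name and the fiducial-mask interface are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np

def calibrate_sound_speed(representation, fiducial_mask, known_speed):
    # scale the whole sound-speed map so that the mean depicted speed of the
    # fiducial region matches the fiducial object's known sound speed
    depicted = representation[fiducial_mask].mean()
    return representation * (known_speed / depicted)

# toy map: the fiducial is depicted at 1450 m/s but is known to be 1500 m/s
rep = np.full((4, 4), 1450.0)
mask = np.zeros((4, 4), dtype=bool)
mask[0, 0] = True                      # pixel covered by the fiducial object
calibrated = calibrate_sound_speed(rep, mask, known_speed=1500.0)
```

Because the same scale is applied everywhere, the tissue pixels are corrected along with the fiducial, which is the point of calibrating against an object of known properties.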
Finally, in certain embodiments, the representations may be displayed to the operator or radiologist as shown in block 230, stored for later recall as shown in block 234, or printed as shown in block 238. The control system coordinates each function in communication with the sensor system and the processing system by defining setup conditions for each function and determining when each function is complete.
2. Fiducial Object for Iterative Processing
Referring back now to
In the present embodiment, if the depicted properties of the object do not match its known properties, the processing system calculates a correction factor at block 332 and uses that correction factor to generate a corrected representation at block 222. A correction factor is any set of one or more coefficients or algorithms that, when applied to a data set or representation, will allow the resulting representation to conform more closely with the actual objects represented. In this embodiment, for example, the correction factor might be derived with a set of curve-fitting algorithms that correlate a range of measured values of a fiducial object's acoustic properties with the object's known, actual properties. Thus, the process iterates until the depicted properties of the fiducial object match the known properties thereof to within a predetermined tolerance, as shown at block 328. At that point, as discussed above, the representation can be displayed to the operator at block 336, or in other embodiments, printed, stored for future reference or the like. Notably, the correction factor that produces an accurate depiction of the fiducial object's acoustic properties applies equally well to the rest of the representation and therefore can be used to correct the depiction of an unknown object, for instance, a patient's tissue.
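The iterate-until-tolerance loop above can be sketched as follows. This is a schematic of the control flow only, assuming a single multiplicative correction factor and a relative tolerance; the disclosed embodiment may instead use curve-fitting algorithms, and all names here are illustrative.

```python
def iterate_correction(reconstruct, measure_fiducial, known_value,
                       tolerance=0.001, max_iters=20):
    # refine a multiplicative correction factor until the depicted fiducial
    # property matches its known value to within the relative tolerance
    correction = 1.0
    for _ in range(max_iters):
        representation = reconstruct(correction)
        depicted = measure_fiducial(representation)
        if abs(depicted - known_value) / known_value <= tolerance:
            break
        correction *= known_value / depicted
    return representation, correction

# toy reconstruction whose fiducial is depicted 50 m/s too slow
rep, factor = iterate_correction(
    reconstruct=lambda c: c * 1450.0,   # "representation" is one number here
    measure_fiducial=lambda r: r,       # the fiducial fills the representation
    known_value=1500.0,
)
```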
In the embodiment illustrated by
3. Positional Correlation of Fiducial Markers
According to the present embodiment, after the relative positions of the fiducial markers have been correlated, the representations are displayed at block 436. In some embodiments, however, the representations might be printed, stored for future reference, or the like. In other embodiments, the representations might be superimposed upon one another to allow for comparative analysis of the plurality of representations.
One benefit of the illustrated embodiment is that, by correlating the relative positions of the fiducial markers, a feature within the tissue can be localized in each of the representations. One possible application of such an embodiment is to allow the tissue to be examined a plurality of times—perhaps over a span of multiple days, weeks, months or years—creating multiple representations. By correlating the depictions of the fiducial markers, a feature of the tissue can be localized in each representation. In this way, any change in the feature over time can be tracked and studied from a consistent frame of reference. Where, for instance, the feature is a cancerous tumor, such localized representations could allow medical practitioners to determine more precisely the growth of the tumor or, alternatively, the tumor's response to treatment.
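The positional correlation described above can be sketched in its simplest form: estimate the offset between two scans from the centroids of the same fiducial markers seen in both, then map a feature's position into the reference scan's frame. This sketch assumes the scans differ only by a rigid translation (handling rotation would require a full rigid alignment, e.g. a Procrustes fit); the function name is an illustrative assumption.

```python
import numpy as np

def localize_in_reference(fiducials_ref, fiducials_new, feature_new):
    # estimate the translation between scans from the centroids of the same
    # fiducial markers, then map the feature into the reference frame
    shift = fiducials_ref.mean(axis=0) - fiducials_new.mean(axis=0)
    return feature_new + shift

ref_markers = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
new_markers = ref_markers + np.array([2.0, -3.0])   # later scan is shifted
feature_in_new = np.array([7.0, 2.0])               # feature seen in later scan
feature_in_ref = localize_in_reference(ref_markers, new_markers, feature_in_new)
```

With the feature expressed in one consistent frame, its position in scans taken days or months apart can be compared directly.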
Having described several embodiments, it will be recognized by those of skill in the art that various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the invention. Accordingly, the above description should not be taken as limiting the scope of the invention, which is defined in the following claims.
The Government has rights in this invention pursuant to U.S. Dept. of Energy Work for Others Agreement L-8420.
Number | Date | Country
---|---|---
20040122313 A1 | Jun 2004 | US