Method and apparatus to produce ultrasonic images using multiple apertures

Information

  • Patent Grant
  • 10130333
  • Patent Number
    10,130,333
  • Date Filed
    Thursday, August 18, 2016
  • Date Issued
    Tuesday, November 20, 2018
Abstract
A combination of an ultrasonic scanner and an omnidirectional receive transducer for producing a two-dimensional image from received echoes is described. Two-dimensional images with different noise components can be constructed from the echoes received by additional transducers. These can be combined to produce images with better signal-to-noise ratios and lateral resolution. Also disclosed is a method, based on information content, to compensate for the different delays of different paths through intervening tissue. The disclosed techniques have broad application in medical imaging but are ideally suited to multi-aperture cardiac imaging using two or more intercostal spaces. Since lateral resolution is determined primarily by the aperture defined by the end elements, it is not necessary to fill the entire aperture with equally spaced elements. Multiple slices using these methods can be combined to form three-dimensional images.
Description
BACKGROUND OF THE INVENTION
1. Field of the Invention

The present invention relates generally to imaging techniques used in medicine, and more particularly to medical ultrasound, and still more particularly to an apparatus for producing ultrasonic images using multiple apertures.


2. Discussion of Related Art Including Information Disclosed Under 37 CFR §§ 1.97, 1.98

In conventional ultrasonic imaging, a focused beam of ultrasound energy is transmitted into body tissues to be examined and the returned echoes are detected and plotted to form an image. In echocardiography the beam is usually stepped in increments of angle from a center probe position, and the echoes are plotted along lines representing the paths of the transmitted beams. In abdominal ultrasonography the beam is usually stepped laterally, generating parallel beam paths, and the returned echoes are plotted along parallel lines representing these paths. The following description will relate to the angular scanning technique for echocardiography (commonly referred to as a sector scan). However, the same concept with modifications can be implemented in abdominal scanners.


The basic principles of conventional ultrasonic imaging are well described in the first chapter of Echocardiography, by Harvey Feigenbaum (Lippincott Williams & Wilkins, 5th ed., Philadelphia, 1993). These will not be repeated here except as necessary to illustrate the differences between the conventional techniques and the present invention.


It is well known that the average velocity v of ultrasound in human tissue is about 1540 m/sec, the range in soft tissue being 1440 to 1670 m/sec (see, for example, P. N. T. Wells, Biomedical Ultrasonics, Academic Press, London, New York, San Francisco, 1977). Therefore, the depth of an impedance discontinuity generating an echo can be estimated as the round-trip time for the echo multiplied by v/2, and the amplitude is plotted at that depth along a line representing the path of the beam. After this has been done for all echoes along all beam paths, an image is formed, such as the image 10 shown in FIG. 1, in which a circle has been imaged. The gaps between the scan lines are typically filled in by interpolation. One of the earliest interpolation algorithms applied to echocardiography was described in U.S. Pat. No. 4,271,842, to Specht et al.
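
As a brief illustration of the depth calculation described above (not part of the original disclosure), the following sketch converts a measured round-trip echo time into a depth estimate using the nominal average tissue velocity; the function name and example time are illustrative assumptions.

```python
V_TISSUE = 1540.0  # average velocity of ultrasound in human tissue, m/sec

def echo_depth(round_trip_time, v=V_TISSUE):
    """Estimated depth of the reflecting discontinuity: d = t * v / 2."""
    return round_trip_time * v / 2.0

# Example: an echo returning after 100 microseconds plots at about 7.7 cm depth.
print(echo_depth(100e-6))  # 0.077 m
```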


In order to insonify the body tissues, a beam formed either by a phased array or a shaped transducer is scanned over the tissues to be examined. Traditionally, the same transducer or array is used to detect the returning echoes. This design configuration lies at the heart of one of the most significant limitations in the use of ultrasonic imaging for medical purposes; namely, poor lateral resolution. Theoretically the lateral resolution could be improved by increasing the aperture of the ultrasonic probe, but the practical problems involved with aperture size increase have kept apertures small and lateral resolution correspondingly coarse. Unquestionably, ultrasonic imaging has been very useful even with this limitation, but it could be more effective with better resolution.


In the practice of cardiology, for example, the limitation on single aperture size is dictated by the space between the ribs (the intercostal spaces). For scanners intended for abdominal and other use, the limitation on aperture size is not so obvious, but it is a serious limitation nevertheless. The problem is that it is difficult to keep the elements of a large aperture array in phase because the speed of ultrasound transmission varies with the type of tissue between the probe and the area of interest. According to the book by Wells (cited above), the speed varies up to plus or minus 10% within the soft tissues. When the aperture is kept small, the intervening tissue is, to a first order of approximation, all the same and any variation is ignored. When the size of the aperture is increased to improve the lateral resolution, the additional elements of a phased array may be out of phase and may actually degrade the image rather than improving it. The instant disclosure teaches methods to maintain all of the information from an extended phased array “in phase” and thus to achieve sought-after improved lateral resolution.


In the case of cardiology, it has long been thought that extending the phased array into a second or third intercostal space would improve the lateral resolution, but this idea has met with two problems. First, elements over the ribs have to be eliminated, leaving a sparsely filled array. New theory is necessary to steer the beam emanating from such an array. Second, the tissue speed variation described above, but not adequately addressed until this time, needs to be compensated. The same solution taught in this disclosure is equally applicable for multi-aperture cardiac scanning, or for extended sparsely populated apertures for scans on other parts of the body.


BRIEF SUMMARY OF THE INVENTION

The present invention solves both the problem of using more than one intercostal space and the problem of accommodating unknown phase delays from using elements spread over a large sparse aperture. The solution involves separating the insonifying probe from the imaging elements. The separation can be a physical separation or simply a separation in concept wherein some of the elements of the array can be shared for the two functions.


A single omni-directional receive element (such as a receive transducer) can gather all of the information necessary to reproduce a two-dimensional section of the body. Each time a pulse of ultrasound energy is transmitted along a particular path, the signal received by the omnidirectional probe can be recorded into a line of memory. The terms “omni-directional probe,” “omni probe” and/or “omni,” are used synonymously herein to mean an omnidirectional probe. When this is done for all of the lines in a sector scan, the memory can be used to reconstruct the image. This can be accomplished in the same time as data is being collected for the next frame.


There are numerous advantages to this approach, and these comprise the objects and advantages of the present invention. They include, among others:


The dominance of specular reflections, so prominent in images reconstructed from returns to the main probe, is greatly attenuated.


More than one omnidirectional probe can be used. Each one can be used to reconstruct an entire sector image but with different point spread functions. These can be combined to produce an image with a sharper point spread function.


Compensations can be made for different delays in different paths through the tissue.


Many more scan lines can be reconstructed than the number of pulses generated by the main probe. This overcomes the traditional limit on the number of scan lines imposed by the speed of ultrasound in tissue, the tissue depth of interest, and the time allowed between frames, which is typically 1/30 second.


Artificial scan lines can be considered as overlapping, and each pixel on an output image can be imaged from information from more than one omni line of data. Therefore the output pixel can be averaged from multiple data, thus improving the signal-to-noise ratio.


Omnidirectional probes can be placed in multiple intercostal spaces, the suprasternal notch, the substernal window, multiple apertures along the abdomen and other parts of the body, and even on the end of a catheter. An advantage in using omnidirectional probes for catheter placement is that no steering is required of the probe.


Probes can be placed on the image plane, off of it, or in any combination. When placed away from the image plane, omni probe information can be used to narrow the thickness of the sector scanned.


There has thus been broadly outlined the more important features of the invention in order that the detailed description that follows may be better understood, and in order that the present contribution to the art may be better appreciated. Additional objects, advantages and novel features of the invention will be set forth in part in the description as follows, and in part will become apparent to those skilled in the art upon examination of the following. Furthermore, such objects, advantages and features may be learned by practice of the invention, or may be realized and attained by means of the instrumentalities and combinations particularly pointed out in the appended claims.


Still other objects and advantages of the present invention will become readily apparent to those skilled in this art from the following detailed description, which shows and describes only the preferred embodiments of the invention, simply by way of illustration of the best mode now contemplated of carrying out the invention. As will be realized, the invention is capable of modification in various obvious respects without departing from the invention. Accordingly, the drawings and description of the preferred embodiment are to be regarded as illustrative in nature, and not as restrictive.





BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS


FIG. 1 is a diagrammatic view of a simulation showing a circular object imaged by a conventional sector scanner.



FIG. 2 is a schematic diagram of the axes representing the relative positions of the insonifying and omni-directional probes.



FIG. 3 is a graph showing the orientation of the point spread function of an imaging system using a single omnidirectional probe as a function of the position of the omnidirectional probe.



FIG. 4 is a graph showing the orientation of the point spread function of an imaging system using a single omnidirectional probe as a function of the position of the omnidirectional probe when it is not in the scan plane.



FIG. 5 is a diagrammatic view of a simulation showing the same circular object of FIG. 1 imaged by data received by a single omni element located at x0=40 mm, y0=0 mm, z0=0 mm.



FIG. 6 is a schematic diagram illustrating the relative positions of probes showing, in addition, points A-A′, which have equal round trip distances and times.



FIG. 7 is a schematic diagram showing a possible fixture for positioning an omni-directional probe relative to the main (insonifying) probe.



FIG. 8 is a schematic diagram showing a non-instrumented linkage for two probes.



FIG. 9 is a schematic diagram showing variables for computation of x and y positions from received echoes.



FIG. 10A is a phantom image taken with a standard Acuson 128 XP-10 with a 3.5 MHz transducer and harmonic processing.



FIG. 10B is the same phantom image as that shown in FIG. 10A, taken with the same XP-10, wherein the center 64 elements were obscured but external processing was employed to show improved lateral resolution. The progression of anechoic areas on the phantom is 8 mm diameter, 6 mm, 4 mm, 3 mm, and 2 mm.



FIG. 11 is an image of the same phantom produced by the same transducer as the images in FIGS. 10A and 10B, with the center obscured, but with substantial image processing over multiple scans. Note that even though the total aperture is only 19 mm, the 2 mm diameter anechoic areas are now visible. Lateral resolution could be greatly improved if the two parts of the transducer were physically separated and the phase delays reprogrammed for the resulting geometry.



FIG. 12A is a schematic perspective view showing an adjustable, extendable hand held two-aperture probe (especially adapted for use in cardiology US imaging). This view shows the probe in a partially extended configuration.



FIG. 12B is a side view in elevation thereof showing the probe in a collapsed configuration.



FIG. 12C shows the probe extended so as to place the heads at a maximum separation distance permitted under the probe design, and poised for pushing the separated probe apertures into a collapsed configuration.



FIG. 12D is a side view in elevation again showing the probe in a collapsed configuration, with adjustment means shown (i.e., as scroll wheel).



FIG. 12E is a detailed perspective view showing surface features at the gripping portion of the probe.



FIG. 13 is a 3D image highlighting the anechoic tubes of the ATS Model 539 phantom.





DETAILED DESCRIPTION OF THE INVENTION

A key element of the present invention is that returned echoes in ultrasonography can be detected by a separate relatively non-directional receive transducer located away from the insonifying probe (transmit transducer), and the non-directional receive transducer can be placed in a different acoustic window from the insonifying probe. This probe will be called an omni-directional probe because it can be designed to be sensitive to a wide field of view.


If the echoes detected at the omni probe are stored separately for every pulse from the insonifying transducer, it is surprising to note that the entire two-dimensional image can be formed from the information received by the one omni. Additional copies of the image can be formed by additional omni-directional probes collecting data from the same set of insonifying pulses.


A large amount of straightforward computation is required to plot the amplitude of echoes received from the omni. Referring now to FIG. 2, there is shown the position of the omni-directional probe 100 relative to the position of the insonifying (main) probe 120 and the insonifying beam 130. The position of the omni-directional probe relative to the beam is indicated by x0, y0 and z0 140, where x0 and y0 are in the scan plane 150 scanned by the insonifying beam and z0 is the distance perpendicular to that plane. Instead of simply plotting the depth along the scan line as d = t v/2 (where t is the round-trip time), the echo is now plotted at that point d = sqrt(x^2 + y^2) for which t v = d + sqrt((x − x0)^2 + (y − y0)^2 + z0^2).
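
The delay relation above can be sketched as follows; this is an illustrative example rather than the patented implementation, and the coordinate conventions (main probe at the origin, positions in meters) are assumptions.

```python
import numpy as np

def total_travel_time(x, y, x0, y0, z0, v=1540.0):
    """Round-trip time for a pixel at (x, y) in the scan plane: the transmit
    path from the main probe at the origin plus the return path to the omni
    receiver at (x0, y0, z0)."""
    d_transmit = np.hypot(x, y)                            # main probe to pixel
    d_return = np.sqrt((x - x0)**2 + (y - y0)**2 + z0**2)  # pixel to omni probe
    return (d_transmit + d_return) / v

# The omni sample recorded at this time is plotted at (x, y).
```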


This procedure will produce a sector scan image similar to that using the conventional technique except that the point spread function will be rotated. FIG. 3 shows the orientation 200 of the psf as a function of the position of the omni probe. A single point at x=0, y=70 mm, z=0 is the point being imaged, and the groups of dots 210 each indicate the orientation of the psf if the omni-probe were placed at the location of the center of each group. The insonifying probe (main probe) is located at the center group 220 on the bottom of the figure. In this simulation the horizontal (x axis) 230 shown goes from −40 mm to +40 mm. The vertical (y axis) 240 goes from 0 to 80 mm depth.



FIG. 4 shows the orientation 300 of the psf as a function of the position of the omni probe if it is not in the scan plane. In this simulation the horizontal (x axis) 310 shown again goes from −40 mm to +40 mm, but the vertical axis 320 is the z axis (distance away from the scan plane) and goes from 0 to 80 mm away from the scan plane.



FIG. 5 shows a plot of the same circular object 20 as in FIG. 1, plotted, however, from data received by a single omni element. Other complete two dimensional reconstructions may be formed using data from additional omni elements, if desired.


Specular reflection is reduced using the omni probe compared to using the main probe for both insonification and detection. This is because all parts of a surface normal to the main beam are insonified with the same phase. When the phase is such that maximum echo is returned, all the echoes add to produce a specular echo. When the signals are normalized to accommodate the dynamic response of a particular display device, non-specular echoes will tend to drop out.


In contrast to this, and referring now to FIG. 6, the relative positions of the probes are again shown, together with points A-A′ 400, which have equal round-trip distances and times. Points A-A′ lie on a surface normal to the bisector line a1=a2 410; although they have equal round-trip distances and times, they are not insonified with the same phase and do not all reflect equally. This attenuation of the specular reflection is particularly important when visualizing circular structures, which frequently have surfaces normal to the main beam.


An algorithm to plot this data on a rectangular grid is: (a) for each point on the x,y grid, convert x,y to depth and angle; (b) then find closest angle k scanned by the insonifying beam; (c) if it is sufficiently close, then convert x,y to distance to the omni; (d) compute time t=(distance to insonifying beam+distance to omni)/v; and (e) plot amplitude recorded by the omni for the k scan at x,y.
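
A minimal sketch of steps (a) through (e) is shown below, assuming the omni echoes for the k-th transmitted angle have been digitized into `omni_lines[k]` at sample rate `fs`; the array names, tolerance, and units (meters, seconds) are assumptions for illustration.

```python
import numpy as np

def reconstruct_image(xs, ys, angles, omni_lines, x0, y0, z0=0.0,
                      v=1540.0, fs=20e6, angle_tol=np.radians(0.5)):
    image = np.zeros((len(ys), len(xs)))
    for iy, y in enumerate(ys):
        for ix, x in enumerate(xs):
            depth = np.hypot(x, y)                      # (a) x,y -> depth and angle
            theta = np.arctan2(x, y)                    #     angle from the centerline
            k = int(np.argmin(np.abs(angles - theta)))  # (b) closest insonified angle
            if abs(angles[k] - theta) > angle_tol:      # (c) skip if not close enough
                continue
            d_omni = np.sqrt((x - x0)**2 + (y - y0)**2 + z0**2)
            t = (depth + d_omni) / v                    # (d) total travel time
            sample = int(round(t * fs))
            if sample < len(omni_lines[k]):
                image[iy, ix] = omni_lines[k][sample]   # (e) plot recorded amplitude
    return image
```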


However, more information is available and should be used. It is possible to use the same technique to plot additional scan lines which were not explicitly insonified by the insonifying probe. Because of the inherently wide beam width of medical ultrasonic probes, much tissue between intentional scan lines is also insonified and returns echoes. Making use of this information is particularly important when capturing motion (especially in echocardiography) because the number of pulses that can be generated is strictly limited by the speed of ultrasound in tissue and the scan repetition rate desired.


The reconstructed image will get better as the angle between the main beam and the omni gets larger. However, it is not necessary to focus a narrow beam on every element of tissue to be imaged, as is true if the data is not stored and then processed before display. The lateral resolution can be reconstructed using a Wiener filter to be much better than the beam width if the noise spectrum is low enough. In one simulation of two circles of diameter 2.2 mm and 4.0 mm, both were imaged well enough that the center was clear even though the beam width was 4.4 mm, tapered from 1 to 0 by a cosine function. The Wiener filter is described in the next section.


There are four main sources of noise in ultrasonic imaging: (1) blur due to an array that is not wide enough; (2) shot noise; (3) reverberation from big interfaces; and (4) speckle.


Multiple probes give independent measures of shot noise, but using closely spaced elements in the main probe (if it is a phased array) will not give independent noise for the other three sources. Adding one or more omni probes will change the look angle, which will thereby change the speckle pattern and the reverberation pattern. These can be averaged out to lower the noise power spectrum. The Wiener filter can then be employed to cancel the blur.


Another way to eliminate speckle is to obtain a good sample of it for estimates of the noise spectrum to then be used in the Wiener filter.


De-blurring and de-noising by these techniques using only an external omni probe or probes will make it possible to visualize small and moving objects such as the coronary arteries. In such a case medical personnel could assess the degree of opening in the lumen or patency of bypass grafts without resort to invasive catheterization techniques.


When combining more than one image such as one from the main probe and another from one or more omni probes for the purpose of averaging out the various sources of noise, it is necessary to compensate for the variation in ultrasound velocity through different paths. Experiments have shown that small unaccounted errors in path velocity will displace the reconstructed image in both horizontal and vertical positions. Cross correlation techniques should be used to find the displacement with one image taken as reference before addition or other combination of images.
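
As an illustrative sketch of the cross-correlation step (not the patented method itself), the displacement between a reference image and an omni-derived image can be estimated from the peak of their FFT-based cross-correlation; subpixel refinement and windowing are omitted.

```python
import numpy as np

def register_shift(reference, image):
    """Return the (row, col) displacement of `image` relative to `reference`."""
    corr = np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(reference)))
    peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    # Displacements past half the image size wrap around to negative offsets.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

# Shifting `image` back by the returned offsets (e.g., with np.roll) aligns it
# with `reference` before averaging or other combination.
```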


Two possibilities exist with regard to the Wiener filter. In one, a Wiener filter can be applied separately to each image and the results then combined. Alternatively, one can first combine the images (yielding a more complex point spread function) and then employ Wiener filtering.


In order to perform the indicated computations, it is necessary to either measure or estimate the position (x0, y0, z0) of the omni relative to the main probe. Exact knowledge of these positions is not required because, as we have seen, the displacement of the image from the omni probe is also affected by the variation of the velocity of ultrasound in different types of tissue. For this reason it is necessary to use cross correlation or some other matching criterion to make a final correction of the position of the omni-generated image before combining with the reference image or images.


Determining the Position of the Omni Probe(s):


There are many ways to determine the position of the omni probe. Referring now to FIG. 7, one way is to provide apparatus 500 for pivotally and/or swivelingly mounting the omni probe 510 on a fixture 520 attached to the insonifying probe 530. The fixture preferably includes articulated joints 540 with sensors (not shown) to measure angles and distances 550 of the links. FIG. 7 illustrates a simplified version of such a fixture, wherein fixed hinges allow movement of only x0 and y0.


Another method is to have no mechanical connection between the omni probe and the main probe (except wires for signals and power). Instead, the omni probe can transmit a signal using radio frequencies, light, or a different frequency ultrasound to triangulation receivers mounted on the main probe or a separate platform.


A third method again has no mechanical connection between the omni probe and main probe. For this method the omni probe (or probes) can be attached to the patient with tape, and the ultrasonographer can manipulate the main probe to find the best image without regard to positioning the omni probe(s). As indicated in FIG. 4, a two-dimensional image can be formed separately from the echoes received from the omni probe and from the main probe. By adjusting four variables (x0, y0, z0 and D, the average difference in time for ultrasound to go through different tissue types instead of traveling through idealized tissue of constant ultrasound velocity), the images can be made to coincide. The four variables can be adjusted iteratively to maximize cross-correlation or another measure of similarity. Standard multi-dimensional search techniques that could be used include gradient ascent, second order (Newton's method), simulated annealing, and genetic algorithms. A fifth variable, the angle of the psf which is necessary for deconvolution, is implied from the first four variables. Misregistration of the images can be caused by inaccurate estimation of any of the four variables, but good registration can be achieved by simply adjusting D, which will tend to compensate for errors in estimates of the others.
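
The iterative adjustment can be sketched as follows. The text lists gradient ascent, Newton's method, simulated annealing, and genetic algorithms; this example substitutes a simple derivative-free Nelder-Mead search, and the `reconstruct(x0, y0, z0, D)` function that rebuilds the omni image for candidate parameters is a hypothetical placeholder.

```python
import numpy as np
from scipy.optimize import minimize

def negative_similarity(params, reference, reconstruct):
    """Negated normalized cross-correlation between the reference image and
    the omni image rebuilt with candidate (x0, y0, z0, D)."""
    image = reconstruct(*params)
    a = (reference - reference.mean()).ravel()
    b = (image - image.mean()).ravel()
    return -np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def fit_registration(initial_guess, reference, reconstruct):
    result = minimize(negative_similarity, np.asarray(initial_guess, dtype=float),
                      args=(reference, reconstruct), method="Nelder-Mead")
    return result.x  # estimated (x0, y0, z0, D)
```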


When the application requires the highest resolution compatible with capturing motion at a high frame rate, the four variables can be estimated over several frames of information. When the ultrasonographer has selected a good view angle, the frames can be combined at high rate holding x0, y0, z0 and D constant.


When the application requires the highest possible resolution, data can be captured (perhaps with EKG gating to capture separate images at systole and diastole) and the multi-dimensional search to optimize matching can be done more accurately, although not in real time. Two advantages of this approach are that different values of D can be found for systole and diastole, and that different psf's can be used for deconvolution at different depths in the image.


Determining the Position of the Omni Probe(s) Using Correlation of the Scan Line Data Rather than Complete 2D Sectors:


A fourth method for determining the position of the omni probe(s) entails replacing the omni probe or probes with a “semi-omni probe” or probes. The reason for this is to increase the signal to noise ratio by restricting the sensitive region of the receive transducer to a plane rather than a hemisphere. Because of this restriction it is necessary to have a mechanical linkage to ensure that both the transmit and receive transducers are focused on the same plane.


Two probes could be placed in any two acoustic windows. In the case of echocardiography, one would likely be in the normal parasternal window which typically gives the clearest view of the whole heart. A second window available in most patients is the apical view. Another window usually available is the subcostal. The two probes could even be placed on either side of the sternum or in parasternal windows in two intercostal spaces.


One probe could be the standard phased array cardiac probe. The second (and third, etc.) would be used as receive only. Theoretically it could be omnidirectional, but that would necessarily provide lower signals and therefore low signal to noise ratios (S/N). A better alternative is to use a probe which is ground to be sensitive to a plane of scan but omnidirectional within that plane. A single piece of PZT would work well, but to minimize the amount of new design required it is also possible to use a second probe head similar to the main probe and then use individual elements or small groups combined to act as single elements. The design goal is to use as many elements as possible to maximize signal to noise ratio while using few enough to minimize angle sensitivity.


In this embodiment 600 (see FIG. 8), the two probes 610, 620, may be linked together with an articulating mechanical linkage, which ensures that the plane of scan of each probe includes the other, but the distance between them is unconstrained. A slave servomechanism is also possible, but the mechanical linkage will be described here.


The procedure is to aim the main probe 620 at the target 630 (e.g., heart) and position the secondary probe 610 at a second window with maximum received signal strength. One possibility is that the main probe be positioned for a long axis view with the secondary probe over the apex of the heart. Some slight deviation of the long axis view may be necessary in order to maintain the secondary probe in its most sensitive spot.


The secondary probe would now be held on the patient with a mechanical housing which allows a fan or rocking motion. The disadvantage of having two probes in fixed positions on the body is that the plane of scan must include these two points. The only degree of freedom is the angle at which the scan plane enters the patient's body. For a conventional 2D examination this is a severe limitation, but if the goal is to gather three-dimensional information, this is not a limitation. The 3D information is obtained by rocking the main probe back and forth through a sufficient angle so that the entire heart is insonified. The secondary probe also rocks back and forth by virtue of the mechanical linkage between the probes. The instantaneous angle of rocking must be monitored, perhaps by reference to a gyroscope mounted with the main probe. The rocking could be actuated by the hand motion of the ultrasound technician, or it could be motorized for a more uniform angle rate. In an alternative preferred embodiment (for echocardiography), the main probe and an array of omni probes are placed in adjacent intercostal spaces using a mechanism as shown in FIG. 12.


Computer software could be provided such that the 2D slices would fill a 3D volume of voxels. After adjacent voxels are filled through interpolation, 3D information can be displayed as projections or as slices through the volume at arbitrary orientations.


The need for and one important use of the 3D information is covered in U.S. patent application Ser. No. 11/532,013, now U.S. Pat. No. 8,105,239, also by the present inventor, and which application is incorporated in its entirety by reference herein.


Yet another variation on this theme is to have the secondary transducer mechanically linked to the primary so that each plane of scan contains the other transducer (as above), but allow rotation of the main probe about its own axis. In this case the secondary probe would be allowed to move on the patient's body (properly prepared with ultrasound gel). It would have many elements, and an attached computer would scan them all to find those elements which have the strongest return signal.


Estimating Relative Probe Positions from Reflected Signals:


For image reconstruction it is essential to know the position of the secondary probe (x, y) relative to the main probe. This has to be evaluated separately for each frame of data because of the motion of the patient, technician, and/or motorized angle actuator. Since the linkage will prevent any difference in position (z) perpendicular to the scan plane, only x and y need be assessed.


Note that any tilt of the main probe will change the reference axes so that x and particularly y will change too.


When a pulse is transmitted from the main probe it insonifies a sequence of tissues in the path of the beam. The returns from the tissues will be received by both the main probe and the secondary, digitized and stored in the computer. Echoes from relatively proximate tissues will be different for the two probes, but echoes from mid- to far range will be similar. It is possible to use cross correlation to find similar small patches in the two stored returns. They will be similar except for the time delay relative to the launching of the pulse from the main probe. The time delay will be related to the offsets x and y. Values for x and y cannot be determined from one set of time delays, but can be determined by solving a set of simultaneous equations from two detected similar returns. These could be different patches of the same pulse return or from returns from differently directed main pulses.


Referring now to FIG. 9, if the main probe 700 transmits at angle 90°−a 710 relative to its centerline and an identifiable packet of returns occurs at time t1m at the main probe and at time t1s at the secondary (omni) probe 720, then:

    • tissue packet at (x1, y1) is received at time t1m, at distance m1 = sqrt(x1^2 + y1^2)
    • tissue packet at (x2, y2) is received at time t2m, at distance m2 = sqrt(x2^2 + y2^2)
    • t1m corresponds to the time of two trips of distance m1
    • t1m·s = 2·m1, where s = speed of ultrasound in the same units as m (approx. 1.54×10^6 mm/sec)
    • t1s·s = m1 + sqrt((x1 − x)^2 + (y1 − y)^2)


Similarly, t2m·s = 2·m2
t2s·s = m2 + sqrt((x2 − x)^2 + (y2 − y)^2)
(t1s·s − 0.5·t1m·s)^2 = (x1 − x)^2 + (y1 − y)^2
(t2s·s − 0.5·t2m·s)^2 = (x2 − x)^2 + (y2 − y)^2  (1)


Since x1, y1, x2, y2 and the times are known, one can solve the last two simultaneous equations for x and y. Similarly, if a z offset between the two probes is allowed, x, y, and z can be calculated by solving three simultaneous equations.
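
For illustration only, the two simultaneous equations in (1) can be solved geometrically: each defines a circle about a tissue packet (xi, yi) with radius ri = tis·s − 0.5·tim·s, and the secondary probe lies at one of the two circle intersections. The function below is a sketch under that interpretation.

```python
import math

def circle_intersections(x1, y1, r1, x2, y2, r2):
    """Intersections of circles centered at (x1, y1) and (x2, y2) with radii
    r1 and r2; the physically plausible solution is kept in practice."""
    d = math.hypot(x2 - x1, y2 - y1)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return []  # measurements inconsistent; no usable intersection
    a = (r1**2 - r2**2 + d**2) / (2 * d)
    h = math.sqrt(max(r1**2 - a**2, 0.0))
    xm, ym = x1 + a * (x2 - x1) / d, y1 + a * (y2 - y1) / d
    return [(xm + h * (y2 - y1) / d, ym - h * (x2 - x1) / d),
            (xm - h * (y2 - y1) / d, ym + h * (x2 - x1) / d)]
```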


Many more measurements from packet pairs are available. One could make a measurement on several or every scan line (angle) as measured from the main probe. Then we would have many equations in 2 unknowns which can be used to make more-accurate estimations of the 2 unknowns. Since these are nonlinear equations, a search technique can be utilized. One way to accomplish this is to compute error squared over a grid of (x, y) points using the equation:










E^2 = Σ (i = 1 to N) [ sqrt((xi − x)^2 + (yi − y)^2) − t_is·s + 0.5·t_im·s ]^2  (2)







The minimum E^2 will indicate the minimum squared error estimate of (x, y). The search should be conducted over the expected range of x and y to save time and to avoid spurious ambiguous minima.
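
A minimal sketch of this grid search over (x, y) follows, assuming `packets` is a list of matched-return tuples (xi, yi, tim, tis) with positions in mm and times in seconds; these names and units are assumptions.

```python
import numpy as np

def estimate_offset(packets, x_range, y_range, s=1.54e6):
    """Brute-force search for the (x, y) that minimizes E^2 of equation (2).
    s is the speed of ultrasound in mm/sec."""
    best_x, best_y, best_e2 = None, None, np.inf
    for x in x_range:
        for y in y_range:
            e2 = sum((np.hypot(xi - x, yi - y) - tis * s + 0.5 * tim * s) ** 2
                     for xi, yi, tim, tis in packets)
            if e2 < best_e2:
                best_x, best_y, best_e2 = x, y, e2
    return best_x, best_y

# Example: search only over the expected range, e.g.
# estimate_offset(packets, np.arange(20.0, 60.0, 0.5), np.arange(-10.0, 10.0, 0.5))
```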


When the z component of the relative position is not constrained to be zero, the comparable error squared equation is:







E^2 = Σ (i = 1 to N) [ sqrt((xi − x)^2 + (yi − y)^2 + (zi − z)^2) − t_is·s + 0.5·t_im·s ]^2






The minimum E^2 will indicate the minimum squared error estimate of (x, y, z).


If the speed of sound on the return path to the secondary (omni) transducer is different from s due to different types of tissues being traversed, the values of x and y (and z if used) will be different from the geometric values. However, use of these values in the image reconstruction algorithm will automatically compensate for the different speeds.


Obviously, the probes that have been described for imaging the heart would work equally well for imaging abdominal organs and other parts of the body such as legs, arms, and neck. In fact, use of receive-only transducers in conjunction with a transmit/receive probe would work better for abdominal organs because the orientation of the probe set is not limited by the intercostal spaces formed by the ribs. Whereas the locations of the acoustic windows to the heart limit the probe to only a few orientations, making it necessary to rock the probe to gather three-dimensional data, the probes can be used on the abdomen in any orientation presently used. Therefore the probes can be used for real-time 2D scans to duplicate presently accepted procedures, except with much higher lateral resolution. In fact, this application of the technology may be as important as the application to cardiology (which was our original motivation).


For abdominal scanning it is not necessary to have an elaborate spacing adjustment between the active transmit/receive elements and the receive-only elements. In fact they could all be mounted together in one rigid probe, either as a linear array or an array with known curvature. Some prior art wide linear arrays exist which insonify tissue by using a small subset of the total number of elements to transmit and receive a beam perpendicular to the array. Then another partially overlapping subset of elements is used to transmit and receive another line parallel to the first one, and so on until an entire scan is completed.


However, the same array could be partitioned into an active section plus one or more passive sections where all sections would be used for each pulse. The active section of elements would be used in transmit as a sector scanner sending out beams in a sequence of angular paths. On receive, all elements would be treated as independent relatively nondirectional receivers and their outputs would be combined to form a high resolution image by the methods taught in this patent. Cross-correlation image matching to account for the variations in ultrasound speeds could be done separately for each receive element or for groups of elements for which the speed corrections would be nearly the same.


The concept of mounting the active and receive-only elements on a rigid structure eliminates the necessity for articulating and instrumenting the spacing between elements thus making practical combined probes to be used for trans-esophageal (TEE), trans-vaginal, and trans-rectal imaging.


A final class of probes would involve putting a receive-only transducer or transducers on the end of a catheter to be inserted in an artery, vein, or urethra while a separate transmit transducer array is applied to the surface of the skin. The advantage of this approach is that the catheter could be positioned close to an organ of interest, thereby reducing the total transit distance from the transmit transducer to the receive element, so that higher frequencies could be used for better resolution. The receive element(s) on the catheter would not have to be steered as it (they) would be relatively omnidirectional.


The Wiener Filter:


The Wiener filter itself is not new, but since it is important for the de-convolution step it will be described briefly here in the context of the present invention. The Wiener filter is the mean squared error optimal stationary linear filter for images degraded by additive noise and blurring. Wiener filters are usually applied in the frequency domain. Given a degraded image I(n,m), one takes the discrete Fourier transform (DFT) or the fast Fourier transform (FFT) to obtain I(u,v). The true image spectrum is estimated by taking the product of I(u,v) with the Wiener filter G(u,v):

Ŝ=G(u,v)I(u,v)


The inverse DFT or FFT is then used to obtain the image estimate s(n,m) from its spectrum. The Wiener filter is defined in terms of the following spectra:

    • (a) H(u,v)—Fourier transform of the point spread function (psf);
    • (b) Ps(u,v)—Power spectrum of the signal process, obtained by taking the Fourier transform of the signal autocorrelation;
    • (c) Pn(u,v)—Power spectrum of the noise process, obtained by taking the Fourier transform of the noise autocorrelation;


The Wiener filter is:







G(u,v) = H*(u,v) Ps(u,v) / ( |H(u,v)|^2 Ps(u,v) + Pn(u,v) )








The ratio Ps/Pn can be interpreted as signal-to-noise ratio. At frequencies with high signal to noise ratio, the Wiener filter becomes H^−1(u,v), the inverse filter for the psf. At frequencies for which the signal to noise ratio is low, the Wiener filter tends to 0 and blocks them out.


Ps(u,v) + Pn(u,v) = |I(u,v)|^2. The right-hand side is easy to compute from the Fourier transform of the observed data. Pn(u,v) is often assumed to be constant over (u,v). It is then subtracted from the total to yield Ps(u,v).
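
A minimal sketch of this frequency-domain Wiener deconvolution is shown below, assuming the psf has been measured (for example from a wire phantom) and the noise power spectrum is approximated as a constant; the function name and the constant-noise assumption are illustrative.

```python
import numpy as np

def wiener_deconvolve(image, psf, noise_power):
    """Estimate the true image spectrum as S_hat = G * I and invert it."""
    shape = image.shape
    I = np.fft.fft2(image)
    H = np.fft.fft2(psf, s=shape)            # Fourier transform of the psf
    total_power = np.abs(I) ** 2             # Ps + Pn from the observed data
    Ps = np.maximum(total_power - noise_power, 1e-12)
    G = np.conj(H) * Ps / (np.abs(H) ** 2 * Ps + noise_power)
    return np.real(np.fft.ifft2(G * I))
```

For a psf stored with its peak at the array center, np.fft.ifftshift would normally be applied to the psf before the transform to avoid a shifted output.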


The psf can be measured by observing a wire phantom in a tank using the ultrasound instrument. The Fourier transform of the psf can then be stored for later use in the Wiener filter when examining patients.


Because the psf is not constant as a function of range, the Wiener filter will have to be applied separately for several range zones and the resulting images will have to be pieced together to form one image for display. A useful compromise might be to optimize the Wiener filter just for the range of the object of interest such as a coronary artery or valve. It will be necessary to store separate Wiener filters for each omni-directional probe and for the main probe when it is used as a receive transducer.


An alternative to the Wiener Filter for deconvolution is the least mean square (LMS) adaptive filter described in U.S. patent application Ser. No. 11/532,013, now U.S. Pat. No. 8,105,239. LMS Filtering is used in the spatial domain rather than the frequency domain, and can be applied to the radial scan line data, the lateral data at each depth, or both together.


Image sharpening can be accomplished by the use of unsharp masking. Because aperture blur is much more pronounced in the lateral dimension (perpendicular to the insonifying beam) than in the radial dimension, it is necessary to perform unsharp masking in only one dimension. When using a sector scanner, this masking should be performed before scan conversion. When using a linear phased array, the unsharp masking should be performed on each data set of constant range. Unsharp masking consists of intentionally blurring an image, subtracting the result from the original image, multiplying the difference by an arbitrary factor, and adding this to the original image. In one dimension this is the same as blurring a line of data, subtracting it from the original line, and adding a multiple of the difference to the original line.
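
The one-dimensional operation described above can be sketched as follows; the boxcar blur kernel and gain value are arbitrary illustrative choices.

```python
import numpy as np

def unsharp_mask_1d(line, kernel_width=9, gain=1.5):
    """Blur a line of data, subtract the blur from the original, and add a
    multiple of the difference back to the original line."""
    kernel = np.ones(kernel_width) / kernel_width   # simple boxcar blur
    blurred = np.convolve(line, kernel, mode="same")
    return line + gain * (line - blurred)

# Applied along the lateral dimension only: one call per data set of constant
# range (linear phased array) or before scan conversion (sector scanner).
```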


Multiple Active Transducers—Two Alternative Approaches:


It is possible to use more than one active transducer placed at multiple acoustic windows in order to achieve the same goals of increased lateral resolution and noise suppression. A practical method of providing multiple omni probes is to use a second phased array head in a second acoustic window and then to treat each element or group of elements of the second phased array as a separate omni. With this configuration of probes it would be possible to switch the functions of the two probe heads on alternate scans, thereby generating images with different speckle patterns which can be averaged out.


Multiple phased array heads can also be used together so that both are active on the same scan. When two (or more) phased array transducers are placed in the same scan plane, they can be programmed with delays such that they act as a single array with a gap in the array of transducer elements. The advantages of having a gap in the array include a) achieving the lateral resolution of a wide aperture without the expense of filling in acoustic elements through the gap, and b) allowing the gap in the probe or between probes to be fitted over ribs or the sternum. The first advantage applies equally to applications other than cardiac. The disadvantage of multiple active probes is that both the transmit and receive delays have to be recomputed for each new gap dimension and/or angular orientation of one probe relative to the others.


An active probe with a gap has been demonstrated to produce lateral resolution as good as the probe without the gap. This implies that larger gaps will achieve higher resolution since lateral resolution is determined primarily by the overall aperture. Referring now to FIG. 10A, there is illustrated the image 800 of an ATS Laboratories Model 539 phantom imaged using an Acuson 128 XP-10 ultrasonic scanner with a 4V2c probe. In FIG. 10B, the same probe was used to image the phantom with its center 64 elements totally obscured by aluminum foil and electrical tape. As can be seen, the lateral resolution in the image 900 is as good as the original, although the image quality is degraded by speckle and other noise.



FIG. 11 shows an image 1000 of the same phantom produced by the same transducer with the center obscured, but with substantial image processing over multiple scans.



FIGS. 12A-E are various views showing an adjustable, extendable hand held two-aperture probe 1100 adapted for use in cardiology US imaging. This apparatus embodies the inventive concept of separating the insonifying probe 1110 (a transmit transducer) from the imaging elements 1120 (receive transducer). This comfortable device includes adjustment means 1130, such as a scroll wheel, which selectively drives the elements closer together or further apart along either a medial telescoping portion 1140 or a medial insertable sleeve, and thereby provides a range of separation at predetermined distances. The gripping portion 1150 provides easy access to the scroll wheel and places the user's hand in the functional position to minimize overuse injury. FIG. 12A shows the probe in a partially extended configuration. FIG. 12B shows the probe in a collapsed configuration. FIG. 12C shows the probe extended so as to place the heads at a maximum separation distance permitted under the probe design. FIG. 12D shows the probe in a collapsed configuration, with adjustment means shown. And FIG. 12E is a detailed perspective view showing surface features at the gripping portion of the probe.



FIG. 13 shows a three-dimensional display 1200 of the anechoic tubes of the Model 539 phantom. This 3D display was formed from 13 parallel slices produced with the same transducer with the center obscured. When the total aperture is increased it will be possible to display smaller anechoic tubes such as the coronary arteries. The processing involved for this display is a combination of the techniques of the instant patent and those of U.S. patent application Ser. No. 11/532,013, now U.S. Pat. No. 8,105,239.


Having fully described several embodiments of the present invention, many other equivalents and alternative embodiments will be apparent to those skilled in the art. These and other equivalents and alternatives are intended to be included within the scope of the present invention.

Claims
  • 1. An ultrasound imaging method, comprising: transmitting, from a first transmit array, a first ultrasound signal along an intended line into a target object; receiving echoes with a first omni-directional receive transducer and storing the received echoes as data in a memory; defining a rectangular grid within the insonified region; selecting a point in the grid wherein the point is not on the intended line; determining a position of the transmit array relative to the omni-directional receive transducer; determining a first time for the transmitted scanline to travel from the transmit array to the point and from the point to the omni-directional receiver; retrieving from the memory an echo received at the first time by the omni-directional receive transducer; and plotting an amplitude of the retrieved echo at the point; and displaying an image of the point on a display.
  • 2. The method of claim 1, further comprising transmitting a plurality of ultrasound signals along a plurality of intended lines, and plotting amplitudes of a plurality of points in between the intended lines.
  • 3. The method of claim 2, wherein plotting and displaying amplitudes of a plurality of points in between the intended lines comprises combining amplitudes obtained from echoes of ultrasound signals transmitted along at least two intended lines.
  • 4. The method of claim 1, wherein the transmit array is positioned in a first acoustic window and the omni-directional receive transducer is positioned in a second acoustic window that does not overlap the first acoustic window.
  • 5. The method of claim 4, wherein the first acoustic window is separated from the second acoustic window by a distance.
  • 6. The method of claim 5, further comprising adjusting the distance.
  • 7. The method of claim 6, wherein the transmit array is located on a first probe and the omni-directional receive transducer is an element on a second array of a second probe joined to the first probe by a mechanical linkage.
  • 8. The method of claim 7, wherein mechanical linkage comprises a telescoping portion.
  • 9. The method of claim 7, wherein the second array is configured to only receive echoes.
  • 10. The method of claim 1, further comprising completing a sector scan of the target object by transmitting a plurality of successive ultrasound signals along a plurality of successive intended lines from the transmit array; receiving echoes of each of the plurality of successive ultrasound pulses with the first omnidirectional receive transducer; and storing the received echoes from each of the plurality of successive ultrasound pulses in the memory.
  • 11. The method of claim 10, further comprising retrieving, from the memory, the stored echoes of each of the successive ultrasound signals along the plurality of successive intended lines; plotting amplitudes for points along each intended line and between adjacent lines and displaying the plotted amplitudes on the display.
  • 12. The method of claim 1, further comprising: receiving echoes of the transmitted ultrasound signal with a second omnidirectional receive transducer; storing the echoes received by the second omnidirectional receive transducer in the memory; determining a position of the second omni-directional receive transducer relative to the transmit array; determining a second time for the transmitted scanline to travel from the transmit array to the point and from the point to the second omni-directional receiver; retrieving from the memory a second echo received at the second time by the second omni-directional receive transducer; plotting a second amplitude of the retrieved second echo at the point; combining the first amplitude for the point with the second amplitude for the point; and displaying an image of the point on a display based on the combined amplitudes.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/754,422, filed Jun. 29, 2015, now U.S. Pat. No. 9,420,994; which is a continuation of U.S. patent application Ser. No. 14/157,257, filed Jan. 16, 2014, now U.S. Pat. No. 9,072,495; which is a continuation of U.S. patent application Ser. No. 13/632,929, filed Oct. 1, 2012, now U.S. Pat. No. 8,684,936; which is a continuation of U.S. patent application Ser. No. 13/215,966, filed Aug. 23, 2011, now U.S. Pat. No. 8,277,383; which is a continuation of U.S. patent application Ser. No. 11/865,501, filed Oct. 1, 2007, now U.S. Pat. No. 8,007,439; which application claims the benefit of U.S. Provisional Patent Applications No. 60/862,951, filed Oct. 25, 2006, and No. 60/940,261, filed May 25, 2007; all of which are incorporated by reference in their entirety herein.

US Referenced Citations (336)
Number Name Date Kind
3174286 Erickson Mar 1965 A
3895381 Kock Jul 1975 A
3974692 Hassler Aug 1976 A
4055988 Dutton Nov 1977 A
4072922 Taner et al. Feb 1978 A
4097835 Green Jun 1978 A
4105018 Greenleaf et al. Aug 1978 A
4180792 Lederman et al. Dec 1979 A
4259733 Taner et al. Mar 1981 A
4265126 Papadofrangakis et al. May 1981 A
4271842 Specht et al. Jun 1981 A
4325257 Kino et al. Apr 1982 A
4327738 Green et al. May 1982 A
4452084 Taenzer Jun 1984 A
4501279 Seo Feb 1985 A
4511998 Kanda et al. Apr 1985 A
4539847 Paap Sep 1985 A
4566459 Umemura et al. Jan 1986 A
4567768 Satoh et al. Feb 1986 A
4604697 Luthra et al. Aug 1986 A
4662222 Johnson May 1987 A
4669482 Ophir Jun 1987 A
4682497 Sasaki Jul 1987 A
4781199 Hirama Nov 1988 A
4817434 Anderson Apr 1989 A
4831601 Breimesser et al. May 1989 A
4893284 Magrane Jan 1990 A
4893628 Angelsen Jan 1990 A
5050588 Grey et al. Sep 1991 A
5141738 Rasor et al. Aug 1992 A
5161536 Vilkomerson et al. Nov 1992 A
5197475 Antich et al. Mar 1993 A
5226019 Bahorich Jul 1993 A
5230339 Charlebois Jul 1993 A
5269309 Fort et al. Dec 1993 A
5278757 Hoctor et al. Jan 1994 A
5293871 Reinstein et al. Mar 1994 A
5299576 Shiba Apr 1994 A
5301674 Erikson et al. Apr 1994 A
5305756 Entrekin et al. Apr 1994 A
5339282 Kuhn et al. Aug 1994 A
5340510 Bowen Aug 1994 A
5345426 Lipschutz Sep 1994 A
5349960 Gondo Sep 1994 A
5355888 Kendall Oct 1994 A
5381794 Tei et al. Jan 1995 A
5398216 Hall et al. Mar 1995 A
5442462 Guissin Aug 1995 A
5454372 Banjanin et al. Oct 1995 A
5515853 Smith et al. May 1996 A
5515856 Olstad et al. May 1996 A
5522393 Phillips Jun 1996 A
5526815 Granz et al. Jun 1996 A
5544659 Banjanin Aug 1996 A
5558092 Unger Sep 1996 A
5564423 Mele et al. Oct 1996 A
5568812 Murashita et al. Oct 1996 A
5570691 Wright et al. Nov 1996 A
5581517 Gee et al. Dec 1996 A
5625149 Gururaja et al. Apr 1997 A
5628320 Teo May 1997 A
5673697 Bryan et al. Oct 1997 A
5675550 Ekhaus Oct 1997 A
5720291 Schwartz Feb 1998 A
5720708 Lu et al. Feb 1998 A
5744898 Smith et al. Apr 1998 A
5769079 Hossack Jun 1998 A
5784334 Sena et al. Jul 1998 A
5785654 Iinuma et al. Jul 1998 A
5795297 Daigle Aug 1998 A
5797845 Barabash et al. Aug 1998 A
5798459 Ohba et al. Aug 1998 A
5820561 Olstad et al. Oct 1998 A
5838564 Bahorich et al. Nov 1998 A
5850622 Vassiliou et al. Dec 1998 A
5862100 VerWest Jan 1999 A
5870691 Partyka et al. Feb 1999 A
5876342 Chen et al. Mar 1999 A
5891038 Seyed-Bolorforosh et al. Apr 1999 A
5892732 Gersztenkorn Apr 1999 A
5919139 Lin Jul 1999 A
5920285 Benjamin Jul 1999 A
5930730 Marfurt et al. Jul 1999 A
5940778 Marfurt et al. Aug 1999 A
5951479 Holm et al. Sep 1999 A
5964707 Fenster et al. Oct 1999 A
5969661 Benjamin Oct 1999 A
5999836 Nelson et al. Dec 1999 A
6007499 Martin et al. Dec 1999 A
6013032 Savord Jan 2000 A
6014473 Hossack et al. Jan 2000 A
6049509 Sonneland et al. Apr 2000 A
6050943 Slayton et al. Apr 2000 A
6056693 Haider May 2000 A
6058074 Swan et al. May 2000 A
6077224 Lang et al. Jun 2000 A
6092026 Bahorich et al. Jul 2000 A
6122538 Sliwa, Jr. et al. Sep 2000 A
6123670 Mo Sep 2000 A
6129672 Seward et al. Oct 2000 A
6135960 Holmberg Oct 2000 A
6138075 Yost Oct 2000 A
6148095 Prause et al. Nov 2000 A
6162175 Marian, Jr. et al. Dec 2000 A
6166384 Dentinger et al. Dec 2000 A
6166853 Sapia et al. Dec 2000 A
6193665 Hall et al. Feb 2001 B1
6196739 Silverbrook Mar 2001 B1
6200266 Shokrollahi et al. Mar 2001 B1
6210335 Miller Apr 2001 B1
6213958 Winder Apr 2001 B1
6221019 Kantorovich Apr 2001 B1
6231511 Bae May 2001 B1
6238342 Feleppa et al. May 2001 B1
6246901 Benaron Jun 2001 B1
6251073 Imran et al. Jun 2001 B1
6264609 Herrington et al. Jul 2001 B1
6266551 Osadchy et al. Jul 2001 B1
6278949 Alam Aug 2001 B1
6289230 Chaiken et al. Sep 2001 B1
6299580 Asafusa Oct 2001 B1
6304684 Niczyporuk et al. Oct 2001 B1
6309356 Ustuner et al. Oct 2001 B1
6324453 Breed et al. Nov 2001 B1
6345539 Rawes et al. Feb 2002 B1
6361500 Masters Mar 2002 B1
6363033 Cole et al. Mar 2002 B1
6370480 Gupta et al. Apr 2002 B1
6374185 Taner et al. Apr 2002 B1
6394955 Perlitz May 2002 B1
6423002 Hossack Jul 2002 B1
6436046 Napolitano et al. Aug 2002 B1
6449821 Sudol et al. Sep 2002 B1
6450965 Williams et al. Sep 2002 B2
6468216 Powers et al. Oct 2002 B1
6471650 Powers et al. Oct 2002 B2
6475150 Haddad Nov 2002 B2
6480790 Calvert et al. Nov 2002 B1
6487502 Taner Nov 2002 B1
6499536 Ellingsen Dec 2002 B1
6508768 Hall et al. Jan 2003 B1
6508770 Cai Jan 2003 B1
6517484 Wilk et al. Feb 2003 B1
6526163 Halmann et al. Feb 2003 B1
6543272 Vitek Apr 2003 B1
6547732 Jago Apr 2003 B2
6551246 Ustuner et al. Apr 2003 B1
6565510 Haider May 2003 B1
6585647 Winder Jul 2003 B1
6604421 Li Aug 2003 B1
6614560 Silverbrook Sep 2003 B1
6620101 Azzam et al. Sep 2003 B2
6668654 Dubois et al. Dec 2003 B2
6672165 Rather et al. Jan 2004 B2
6681185 Young et al. Jan 2004 B1
6690816 Aylward et al. Feb 2004 B2
6692450 Coleman Feb 2004 B1
6695778 Golland et al. Feb 2004 B2
6719693 Richard Apr 2004 B2
6728567 Rather et al. Apr 2004 B2
6755787 Hossack et al. Jun 2004 B2
6790182 Eck et al. Sep 2004 B2
6837853 Marian Jan 2005 B2
6843770 Sumanaweera Jan 2005 B2
6847737 Kouri et al. Jan 2005 B1
6865140 Thomenius et al. Mar 2005 B2
6932767 Landry et al. Aug 2005 B2
7033320 Von Behren et al. Apr 2006 B2
7087023 Daft et al. Aug 2006 B2
7104956 Christopher Sep 2006 B1
7221867 Silverbrook May 2007 B2
7231072 Yamano et al. Jun 2007 B2
7269299 Schroeder Sep 2007 B2
7283652 Mendonca et al. Oct 2007 B2
7285094 Nohara et al. Oct 2007 B2
7293462 Lee et al. Nov 2007 B2
7313053 Wodnicki Dec 2007 B2
7366704 Reading et al. Apr 2008 B2
7402136 Hossack et al. Jul 2008 B2
7410469 Talish et al. Aug 2008 B1
7443765 Thomenius et al. Oct 2008 B2
7444875 Wu et al. Nov 2008 B1
7447535 Lavi Nov 2008 B2
7448998 Robinson Nov 2008 B2
7466848 Metaxas et al. Dec 2008 B2
7469096 Silverbrook Dec 2008 B2
7474778 Shinomura et al. Jan 2009 B2
7497830 Li Mar 2009 B2
7510529 Chou et al. Mar 2009 B2
7514851 Wilser et al. Apr 2009 B2
7549962 Dreschel et al. Jun 2009 B2
7574026 Rasche et al. Aug 2009 B2
7625343 Cao et al. Dec 2009 B2
7668583 Fegert et al. Feb 2010 B2
7682311 Simopoulos et al. Mar 2010 B2
7750311 Daghighian Jul 2010 B2
7787680 Ahn et al. Aug 2010 B2
7822250 Yao et al. Oct 2010 B2
7862508 Davies et al. Jan 2011 B2
7914451 Davies Mar 2011 B2
7919906 Cerofolini Apr 2011 B2
7984637 Ao et al. Jul 2011 B2
8007439 Specht Aug 2011 B2
8105239 Specht Jan 2012 B2
8277383 Specht Oct 2012 B2
8412307 Willis et al. Apr 2013 B2
8419642 Sandrin et al. Apr 2013 B2
8473239 Specht et al. Jun 2013 B2
8602993 Specht et al. Dec 2013 B2
8672846 Napolitano et al. Mar 2014 B2
8684936 Specht Apr 2014 B2
9072495 Specht Jul 2015 B2
9146313 Specht et al. Sep 2015 B2
9192355 Specht et al. Nov 2015 B2
9220478 Smith et al. Dec 2015 B2
9247926 Smith et al. Feb 2016 B2
9265484 Brewer et al. Feb 2016 B2
9282945 Smith et al. Mar 2016 B2
9339256 Specht et al. May 2016 B2
9420994 Specht Aug 2016 B2
20020035864 Paltieli et al. Mar 2002 A1
20020087071 Schmitz et al. Jul 2002 A1
20020138003 Bukshpan Sep 2002 A1
20020161299 Prater et al. Oct 2002 A1
20030013962 Bjaerum et al. Jan 2003 A1
20030028111 Vaezy et al. Feb 2003 A1
20030040669 Grass et al. Feb 2003 A1
20030228053 Li et al. Dec 2003 A1
20040054283 Corey et al. Mar 2004 A1
20040068184 Trahey et al. Apr 2004 A1
20040100163 Baumgartner et al. May 2004 A1
20040111028 Abe et al. Jun 2004 A1
20040122313 Moore et al. Jun 2004 A1
20040122322 Moore et al. Jun 2004 A1
20040138565 Trucco Jul 2004 A1
20040144176 Yoden Jul 2004 A1
20040236217 Cerwin et al. Nov 2004 A1
20040236223 Barnes et al. Nov 2004 A1
20050004449 Mitschke et al. Jan 2005 A1
20050053305 Li et al. Mar 2005 A1
20050054910 Tremblay et al. Mar 2005 A1
20050090743 Kawashima et al. Apr 2005 A1
20050090745 Steen Apr 2005 A1
20050111846 Steinbacher et al. May 2005 A1
20050113689 Gritzky May 2005 A1
20050113694 Haugen et al. May 2005 A1
20050124883 Hunt Jun 2005 A1
20050131300 Bakircioglu et al. Jun 2005 A1
20050147297 McLaughlin et al. Jul 2005 A1
20050165312 Knowles et al. Jul 2005 A1
20050203404 Freiburger Sep 2005 A1
20050215883 Hundley et al. Sep 2005 A1
20050240125 Makin et al. Oct 2005 A1
20050252295 Fink et al. Nov 2005 A1
20050281447 Moreau-Gobard et al. Dec 2005 A1
20050288588 Weber et al. Dec 2005 A1
20060062447 Rinck et al. Mar 2006 A1
20060074313 Slayton et al. Apr 2006 A1
20060074315 Liang et al. Apr 2006 A1
20060074320 Yoo et al. Apr 2006 A1
20060079759 Vaillant et al. Apr 2006 A1
20060079778 Mo et al. Apr 2006 A1
20060079782 Beach et al. Apr 2006 A1
20060094962 Clark May 2006 A1
20060111634 Wu May 2006 A1
20060122506 Davies et al. Jun 2006 A1
20060173327 Kim Aug 2006 A1
20060262961 Holsing et al. Nov 2006 A1
20060270934 Savord et al. Nov 2006 A1
20070036414 Georgescu et al. Feb 2007 A1
20070055155 Owen et al. Mar 2007 A1
20070078345 Mo et al. Apr 2007 A1
20070088213 Poland Apr 2007 A1
20070138157 Dane et al. Jun 2007 A1
20070161898 Hao et al. Jul 2007 A1
20070167752 Proulx et al. Jul 2007 A1
20070167824 Lee et al. Jul 2007 A1
20070232914 Chen et al. Oct 2007 A1
20070238985 Smith et al. Oct 2007 A1
20080110261 Randall et al. May 2008 A1
20080114255 Schwartz et al. May 2008 A1
20080125659 Wilser et al. May 2008 A1
20080181479 Yang et al. Jul 2008 A1
20080183075 Govari et al. Jul 2008 A1
20080188747 Randall et al. Aug 2008 A1
20080188750 Randall et al. Aug 2008 A1
20080194957 Hoctor et al. Aug 2008 A1
20080194958 Lee et al. Aug 2008 A1
20080208061 Heimann Aug 2008 A1
20080242996 Hall et al. Oct 2008 A1
20080249408 Palmeri et al. Oct 2008 A1
20080255452 Entrekin Oct 2008 A1
20080269604 Boctor et al. Oct 2008 A1
20080269613 Summers et al. Oct 2008 A1
20080275344 Glide-Hurst et al. Nov 2008 A1
20080287787 Sauer et al. Nov 2008 A1
20080294045 Ellington et al. Nov 2008 A1
20080294050 Shinomura et al. Nov 2008 A1
20080294052 Wilser et al. Nov 2008 A1
20080306382 Guracar et al. Dec 2008 A1
20080306386 Baba et al. Dec 2008 A1
20080319317 Kamiyama et al. Dec 2008 A1
20090010459 Garbini et al. Jan 2009 A1
20090012393 Choi Jan 2009 A1
20090016163 Freeman et al. Jan 2009 A1
20090018445 Schers et al. Jan 2009 A1
20090036780 Abraham Feb 2009 A1
20090043206 Towfiq et al. Feb 2009 A1
20090069681 Lundberg et al. Mar 2009 A1
20090069686 Daft et al. Mar 2009 A1
20090182237 Angelsen et al. Jul 2009 A1
20090208080 Grau et al. Aug 2009 A1
20100121193 Fukukita et al. May 2010 A1
20100168566 Bercoff et al. Jul 2010 A1
20100168578 Garson, Jr. et al. Jul 2010 A1
20100217124 Cooley Aug 2010 A1
20100256488 Kim et al. Oct 2010 A1
20100262013 Smith et al. Oct 2010 A1
20120095347 Adam et al. Apr 2012 A1
20120116226 Specht May 2012 A1
20130144166 Specht et al. Jun 2013 A1
20130253325 Call et al. Sep 2013 A1
20130261463 Chiang et al. Oct 2013 A1
20140043933 Belevich et al. Feb 2014 A1
20140058266 Call et al. Feb 2014 A1
20140073921 Specht et al. Mar 2014 A1
20140269209 Smith et al. Sep 2014 A1
20150045668 Smith et al. Feb 2015 A1
20150080727 Specht et al. Mar 2015 A1
20150374345 Specht et al. Dec 2015 A1
20160095579 Smith et al. Apr 2016 A1
20160135783 Brewer et al. May 2016 A1
20160157833 Smith et al. Jun 2016 A1
20160256134 Specht et al. Sep 2016 A1
20170074982 Smith et al. Mar 2017 A1
20170079621 Specht et al. Mar 2017 A1
Foreign Referenced Citations (67)
Number Date Country
1781460 Jun 2006 CN
101116622 Feb 2008 CN
101190134 Jun 2008 CN
101453955 Jun 2009 CN
1949856 Jul 2008 EP
1757955 Nov 2010 EP
1850743 Dec 2012 EP
1594404 Sep 2013 EP
2026280 Oct 2013 EP
2851662 Aug 2004 FR
S49-11189 Jan 1974 JP
S54-44375 Apr 1979 JP
S55-103839 Aug 1980 JP
57-31848 Feb 1982 JP
58-223059 Dec 1983 JP
59-101143 Jun 1984 JP
S59-174151 Oct 1984 JP
S60-13109 Jan 1985 JP
S60-68836 Apr 1985 JP
2-501431 May 1990 JP
03015455 Jan 1991 JP
03126443 May 1991 JP
04017842 Jan 1992 JP
4-67856 Mar 1992 JP
05-042138 Feb 1993 JP
6-125908 May 1994 JP
7-051266 Feb 1995 JP
07204201 Aug 1995 JP
08154930 Jun 1996 JP
08-252253 Oct 1996 JP
9-103429 Apr 1997 JP
9-201361 Aug 1997 JP
2777197 May 1998 JP
10-216128 Aug 1998 JP
11-089833 Apr 1999 JP
11-239578 Sep 1999 JP
2001-507794 Jun 2001 JP
2001-245884 Sep 2001 JP
2002-209894 Jul 2002 JP
2002-253548 Sep 2002 JP
2002-253549 Sep 2002 JP
2004-167092 Jun 2004 JP
2004-215987 Aug 2004 JP
2004-337457 Dec 2004 JP
2004-351214 Dec 2004 JP
2005152187 Jun 2005 JP
2005-523792 Aug 2005 JP
2005-526539 Sep 2005 JP
2006051356 Feb 2006 JP
2006-61203 Mar 2006 JP
2006-122657 May 2006 JP
2006130313 May 2006 JP
2007-325937 Dec 2007 JP
2008-122209 May 2008 JP
2008-513763 May 2008 JP
2008132342 Jun 2008 JP
2008522642 Jul 2008 JP
2008-259541 Oct 2008 JP
2008279274 Nov 2008 JP
2010526626 Aug 2010 JP
100715132 Apr 2007 KR
WO9218054 Oct 1992 WO
WO9800719 Jan 1998 WO
WO0164109 Sep 2001 WO
WO02084594 Oct 2002 WO
WO2005009245 Feb 2005 WO
WO2006114735 Nov 2006 WO
Non-Patent Literature Citations (46)
Entry
Belevich et al.; U.S. Appl. No. 15/400,826 entitled “Calibration of multiple aperture ultrasound probes,” filed Jan. 6, 2017.
Davies et al.; U.S. Appl. No. 15/418,534 entitled “Ultrasound imaging with sparse array probes,” filed Jan. 27, 2017.
Call et al.; U.S. Appl. No. 15/500,933 entitled “Network-based ultrasound imaging system,” filed Feb. 1, 2017.
Call et al.; U.S. Appl. No. 15/495,591 entitled “Systems and methods for improving ultrasound image quality by applying weighting factors,” filed Apr. 24, 2017.
Chen et al.; Maximum-likelihood source localization and unknown sensor location estimation for wideband signals in the near-field; IEEE Transactions On Signal Processing; 50(8); pp. 1843-1854; Aug. 2002.
Chen et al.; Source localization and tracking of a wideband source using a randomly distributed beamforming sensor array; International Journal of High Performance Computing Applications; 16(3); pp. 259-272; Fall 2002.
Cristianini et al.; An Introduction to Support Vector Machines; Cambridge University Press; pp. 93-111; Mar. 2000.
Feigenbaum, Harvey, M.D.; Echocardiography; Lippincott Williams & Wilkins; Philadelphia; 5th Ed.; pp. 482, 484; Feb. 1994.
Fernandez et al.; High resolution ultrasound beamforming using synthetic and adaptive imaging techniques; Proceedings IEEE International Symposium on Biomedical Imaging; Washington, D.C.; pp. 433-436; Jul. 7-10, 2002.
Gazor et al.; Wideband multi-source beamforming with array location calibration and direction finding; Conference on Acoustics, Speech and Signal Processing ICASSP-95; Detroit, MI; vol. 3 IEEE; pp. 1904-1907; May 9-12, 1995.
Haykin, Simon; Neural Networks: A Comprehensive Foundation (2nd Ed.); Prentice Hall; pp. 156-187; Jul. 16, 1998.
Heikkila et al.; A four-step camera calibration procedure with implicit image correction; Proceedings IEEE Computer Society Conference on Computer Vision and Pattern Recognition; San Juan; pp. 1106-1112; Jun. 17-19, 1997.
Hendee et al.; Medical Imaging Physics; Wiley-Liss, Inc. 4th Edition; Chap. 19-22; pp. 303-353; (year of pub. sufficiently earlier than effective US filing date and any foreign priority date) © 2002.
Hsu et al.; Real-time freehand 3D ultrasound calibration; CUED/F-INFENG/TR 565; Department of Engineering, University of Cambridge, United Kingdom; 14 pages; Sep. 2006.
Jeffs; Beamforming: a brief introduction; Brigham Young University; 14 pages; retrieved from the internet (http://ens.ewi.tudelft.nl/Education/courses/et4235/Beamforming.pdf); Oct. 2004.
Khamene et al.; A novel phantom-less spatial and temporal ultrasound calibration method; Medical Image Computing and Computer-Assisted Intervention—MICCAI (Proceedings 8th Int. Conf.); Springer Berlin Heidelberg; Palm Springs, CA; pp. 65-72; Oct. 26-29, 2005.
Kramb et al.; Considerations for using phased array ultrasonics in a fully automated inspection system; Review of Quantitative Nondestructive Evaluation, vol. 23, ed. D. O. Thompson and D. E. Chimenti, pp. 817-825; (year of publication is sufficiently earlier than the effective U.S. filing date and any foreign priority date) 2004.
Ledesma-Carbayo et al.; Spatio-temporal nonrigid registration for ultrasound cardiac motion estimation; IEEE Trans. On Medical Imaging; vol. 24; No. 9; Sep. 2005.
Leotta et al.; Quantitative three-dimensional echocardiography by rapid imaging . . . ; J American Society of Echocardiography; vol. 10; No. 8; pp. 830-839; Oct. 1997.
Li et al.; An efficient speckle tracking algorithm for ultrasonic imaging; Ultrasonic Imaging; 24; pp. 215-228; Oct. 1, 2002.
Morrison et al.; A probabilistic neural network based image segmentation network for magnetic resonance images; Proc. Conf. Neural Networks; Baltimore, MD; vol. 3; pp. 60-65; Jun. 1992.
Nadkarni et al.; Cardiac motion synchronization for 3D cardiac ultrasound imaging; Ph.D. Dissertation, University of Western Ontario; Jun. 2002.
Press et al.; Cubic spline interpolation; §3.3 in “Numerical Recipes in FORTRAN: The Art of Scientific Computing”, 2nd Ed.; Cambridge, England; Cambridge University Press; pp. 107-110; Sep. 1992.
Sakas et al.; Preprocessing and volume rendering of 3D ultrasonic data; IEEE Computer Graphics and Applications; pp. 47-54; Jul. 1995.
Sapia et al.; Deconvolution of ultrasonic waveforms using an adaptive Wiener filter; Review of Progress in Quantitative Nondestructive Evaluation; vol. 13A; Plenum Press; pp. 855-862; (year of publication is sufficiently earlier than the effective U.S. filing date and any foreign priority date) 1994.
Sapia et al.; Ultrasound image deconvolution using adaptive inverse filtering; 12th IEEE Symposium on Computer-Based Medical Systems (CBMS); pp. 248-253; Jun. 1999.
Sapia, Mark Angelo; Multi-dimensional deconvolution of optical microscope and ultrasound imaging using adaptive least-mean-square (LMS) inverse filtering; Ph.D. Dissertation; University of Connecticut; Jan. 2000.
Smith et al.; High-speed ultrasound volumetric imaging system. 1. Transducer design and beam steering; IEEE Trans. Ultrason., Ferroelect., Freq. Contr.; vol. 38; pp. 100-108; Mar. 1991.
Specht et al.; Deconvolution techniques for digital longitudinal tomography; SPIE; vol. 454; presented at Application of Optical Instrumentation in Medicine XII; pp. 319-325; Jun. 1984.
Specht et al.; Experience with adaptive PNN and adaptive GRNN; Proc. IEEE International Joint Conf. on Neural Networks; vol. 2; pp. 1203-1208; Orlando, FL; Jun. 1994.
Specht, D.F.; A general regression neural network; IEEE Trans. On Neural Networks; vol. 2; No. 6; Nov. 1991.
Specht, D.F.; Blind deconvolution of motion blur using LMS inverse filtering; Lockheed Independent Research (unpublished); Jun. 23, 1975.
Specht, D.F.; Enhancements to probabilistic neural networks; Proc. IEEE International Joint Conf. on Neural Networks; Baltimore, MD; Jun. 1992.
Specht, D.F.; GRNN with double clustering; Proc. IEEE International Joint Conf. Neural Networks; Vancouver, Canada; Jul. 16-21, 2006.
Specht, D.F.; Probabilistic neural networks; Pergamon Press; Neural Networks; vol. 3; pp. 109-118; Feb. 1990.
UCLA Academic Technology; SPSS learning module: How can I analyze a subset of my data; 6 pages; retrieved from the internet (http://www.ats.ucla.edu/stat/spss/modules/subset_analyze.htm) Nov. 26, 2001.
Von Ramm et al.; High-speed ultrasound volumetric imaging system. 2. Parallel processing and image display; IEEE Trans. Ultrason., Ferroelect., Freq. Contr.; vol. 38; pp. 109-115; Mar. 1991.
Wang et al.; Photoacoustic tomography of biological tissues with high cross-section resolution: reconstruction and experiment; Medical Physics; 29(12); pp. 2799-2805; Dec. 2002.
Wells, P.N.T.; Biomedical ultrasonics; Academic Press; London, New York, San Francisco; pp. 124-125; Mar. 1977.
Widrow et al.; Adaptive signal processing; Prentice-Hall; Englewood Cliffs, NJ; pp. 99-116; Mar. 1985.
Wikipedia; Point cloud; 2 pages; retrieved Nov. 24, 2014 from the Internet (https://en.wikipedia.org/w/index.php?title=Point_cloud&oldid=472583138).
Wikipedia; Curve fitting; 5 pages; retrieved from the internet (http://en.wikipedia.org/wiki/Curve_fitting) Dec. 19, 2010.
Wikipedia; Speed of sound; 17 pages; retrieved from the internet (http://en.wikipedia.org/wiki/Speed_of_sound) Feb. 15, 2011.
Arigovindan et al.; Full motion and flow field recovery from echo doppler data; IEEE Transactions on Medical Imaging; 26(1); pp. 31-45; Jan. 2007.
Capineri et al.; A doppler system for dynamic vector velocity maps; Ultrasound in Medicine & Biology; 28(2); pp. 237-248; Feb. 28, 2002.
Dunmire et al.; A brief history of vector doppler; Medical Imaging 2001; International Society for Optics and Photonics; pp. 200-214; May 30, 2001.
Related Publications (1)
Number Date Country
20160354059 A1 Dec 2016 US
Provisional Applications (2)
Number Date Country
60862951 Oct 2006 US
60940261 May 2007 US
Continuations (5)
Number Date Country
Parent 14754422 Jun 2015 US
Child 15240884 US
Parent 14157257 Jan 2014 US
Child 14754422 US
Parent 13632929 Oct 2012 US
Child 14157257 US
Parent 13215966 Aug 2011 US
Child 13632929 US
Parent 11865501 Oct 2007 US
Child 13215966 US