1. Field of the Invention
The present invention is generally related to medical ultrasound, and more particularly to a non-invasive screening method and apparatus for visualizing coronary artery obstruction which solves artery curvature problems by projecting the three-dimensional volume onto a two-dimensional screen, highlighting the arteries and veins in contrast to other tissue and the heart chambers by using nonlinear techniques, storing many processed 2D slices into a 3D volume, and projecting the volume with varying view angles controlled by a pointing device.
2. Discussion of Related Art including information disclosed under 37 CFR §§1.97, 1.98
In his well-respected textbook, Echocardiography, Harvey Feigenbaum, M.D., describes analysis techniques for many cardiac conditions. Most of these are used in routine clinical practice. An exception is the visualization of the coronary arteries (CAs) and assessment of their condition. Considering the seriousness of coronary artery disease, it is obvious that a non-invasive screening technique to assess obstruction in these arteries would be of great importance.
At pages 482-490 of Echocardiography, 5th Edition, Feigenbaum shows 2D views of the coronary arteries. These studies demonstrate that clinical ultrasound machines circa 1993 already had sufficient resolution to image the larger parts of these arteries. However, as Feigenbaum states, the curved nature of the arteries usually does not permit them to be seen for any length in an individual frame. In addition, it takes great skill and knowledge to recognize the arteries when they do come into view. For these reasons, clinical ultrasound is rarely used to visualize the CAs.
Because of the curved nature of the CAs, it would be advantageous and desirable to visualize them in three dimensions. The current patent teaches the steps required to achieve this goal.
Several U.S. patents and patent applications teach or show methods of imaging coronary arteries using ultrasound or other medical imaging techniques. Notable among them is U.S. Pat. Appln. No. 2006/0079782, by Beach et al., which shows an ultrasonic technique for visualizing coronary arteries using a 2-D scan and a 3-D display.
U.S. Pat. Appln. No. 2006/0079759, by Vaillant et al., discloses a method and apparatus for registering 3-D models of the heart using ultrasound.
U.S. Pat. Appln. No. 2005/0281447, by Moreau-Gobard et al., teaches a method of producing a 3-D image of the coronary artery system using ultrasound.
U.S. Pat. Appln. No. 2005/0004449, by Mitschke et al., teaches the use of ultrasound to acquire preoperative 3-D images for marker-less navigation of a medical instrument.
U.S. Pat. Appln. No. 2002/0087071, by Schmitz et al., teaches a process for graphic visualization and diagnosis of thrombi as well as the use of particle suspensions for the production of contrast media for the visualization of thrombi (circumscribed blood solidification that forms in arteries or veins by intravascular clotting) through the use of nuclear spin tomography. This method produces 3-D images from a 2-D source.
U.S. Pat. No. 6,148,095, to Prause et al., shows a method of three-dimensional reconstruction of coronary arteries by fusing data between biplane angiography and IVUS frames of a pullback sequence. The 3D course of the tortuous vessel is first determined from the angiograms and then combined with the 2D representations of that course (e.g., segmented IVUS frames of a pullback sequence) using a data fusion apparatus and method; the determination of the 3D pullback path is based on the external energy of the tortuous vessel and the internal energy of a line object such as a catheter.
U.S. Pat. Appln. No. 2005/0288588, by Weber et al., discloses a method and apparatus for electronic volume data acquisition using ultrasound that generates image data through a coherent aperture combining beamforming (CAC-BF) scanning and imaging process.
U.S. Pat. No. 6,166,853, to Sapia et al., teaches use of an adaptive structure of a Wiener filter to deconvolve three-dimensional wide-field microscope images for the purposes of improving spatial resolution and removing out-of-focus light. The filter is a three-dimensional kernel representing a finite-impulse-response (FIR) structure requiring on the order of one thousand (1,000) taps or more to achieve an acceptable mean-square-error. Converging to a solution is done in the spatial-domain. Alternatively, a three-dimensional kernel representing an infinite-impulse-response (IIR) structure may be employed, as an IIR structure typically requires fewer taps to achieve the same or better performance, resulting in higher resolution images with less noise and faster computations.
The foregoing patents reflect the current state of the art of which the present inventor is aware. Reference to, and discussion of, these patents is intended to aid in discharging Applicant's acknowledged duty of candor in disclosing information that may be relevant to the examination of claims to the present invention. However, it is respectfully submitted that none of the above-indicated patents disclose, teach, suggest, show, or otherwise render obvious, either singly or when considered in combination, the invention described and claimed herein.
The present invention is a method and apparatus for providing three-dimensional images of the coronary arteries.
It is an object of the present invention to provide such three-dimensional images using ultrasound.
It is a further object of the present invention to provide a non-invasive screening test to assess the patency of the coronary arteries.
It is yet another object of the present invention to provide a new and improved means of evaluating the degree of obstruction in partially occluded arteries.
It is still another object of the present invention to provide a new and improved means to visualize coronary arteries which overcomes artery curvature problems by projecting the three-dimensional volume onto a two-dimensional screen.
It is still another object of the present invention to provide improved three-dimensional images of coronary arteries by utilizing linear filtering to reduce noise, using nonlinear techniques, such as neural networks, to highlight arteries and veins in contrast to other tissue and the heart chambers, storing numerous processed 2D slices into a 3D volume, and then projecting the volume with varying view angles controlled by a pointing device.
An even further object of the present invention is to provide gyroscopic stabilization to the ultrasound probe to minimize unwanted angulation during data collection.
The foregoing summary broadly sets out the more important features of the present invention so that the detailed description that follows may be better understood, and so that the present contributions to the art may be better appreciated. There are additional features of the invention that will be described in the detailed description of the preferred embodiments of the invention which will form the subject matter of the claims appended hereto.
Accordingly, before explaining the preferred embodiment of the disclosure in detail, it is to be understood that the disclosure is not limited in its application to the details of the construction and the arrangements set forth in the following description or illustrated in the drawings. The inventive apparatus described herein is capable of other embodiments and of being practiced and carried out in various ways.
Also, it is to be understood that the terminology and phraseology employed herein are for descriptive purposes only, and not limitation. Where specific dimensional and material specifications have been included or omitted from the specification or the claims, or both, it is to be understood that the same are not to be incorporated into the appended claims.
As such, those skilled in the art will appreciate that the conception upon which this disclosure is based may readily be used as a basis for designing other structures, methods, and systems for carrying out the several purposes of the present invention. It is important, therefore, that the claims be regarded as including such equivalent constructions insofar as they do not depart from the spirit and scope of the present invention. Rather, the fundamental aspects of the invention, along with the various features and structures that characterize the invention, are pointed out with particularity in the claims annexed to and forming a part of this disclosure. For a better understanding of the present invention, its advantages, and the specific objects attained by its uses, reference should be made to the accompanying drawings and descriptive matter in which there is illustrated the preferred embodiment.
The invention will be better understood and objects other than those set forth above will become apparent when consideration is given to the following detailed description thereof. Such description makes reference to the annexed drawings wherein:
The present invention is a method and apparatus that renders a projection of images of the coronary arteries in three dimensions using ultrasound. In its most essential aspect, this is accomplished by first producing a 3D array of voxels indicating the blood-filled areas of the heart. Next, a 2D image of the blood-filled areas is projected as a function of view angle and rotation, and this allows an observation and evaluation of the blood-filled areas from a number of view angles such that the coronary arteries and veins are seen unobscured by the major chambers of the heart. The objective is to provide a non-invasive screening test to assess the patency of the coronary arteries. It is hoped that in addition to detecting complete blockages of the arteries, it will also be possible to assess the degree of obstruction in partially occluded arteries.
Several methods are available to obtain the necessary three dimensional information using ultrasound. Two methods have been published concerning direct 3D. One is the RT3D method developed by Dr. Von Ramm and associates at Duke University (see, S. W. Smith, H. G. Pavy, and O. T. von Ramm, “High-speed ultrasound volumetric imaging-system. 1. Transducer design and beam steering,” IEEE Trans. Ultrason., Ferroelect., Freq. Contr., vol. 38, pp. 100-108, 1991; and O. T. von Ramm, S. W. Smith, and H. G. Pavy, “High-speed ultrasound volumetric imaging-System. 2. Parallel processing and image display,” IEEE Trans. Ultrason., Ferroelect., Freq. Contr., vol. 38, pp. 109-115, 1991.)
Another direct 3D method is the patent-pending CAC-BF method of Weber et al. (discussed supra).
Neither of the above-referenced direct 3D methods provides sufficient resolution to reliably image the coronary arteries. One reason for this is that resolution depends on focusing of both the transmitted beam and the received energy. To capture a 3D image of the heart fast enough to freeze motion at a particular part of the cardiac cycle, only a limited number of transmit pulses can be used, because the speed of ultrasound in tissue is finite. These pulses cannot be sharply focused if one is to cover the entire volume. Although no such limitation applies to the focusing of the received beam, the combination of transmit focusing and receive focusing is not as sharp as is possible with a 2D scanner. For this reason, the preferred implementation of the present invention is to utilize the superior resolution of 2D scanners and store a sufficient number of closely spaced 2D slices to capture the structural features of the coronary arteries and other portions of the heart H.
Referring now to
Several method steps are needed to capture the information and assemble it into a useful display. They are as follows:
First, an LMS adaptive filter or Wiener filter is used to remove aperture blur and speckle noise from each 2D image obtained. It is known that linear filtering can be very effective on 2D scans (see, in particular, Sapia, M. A., Fox, M. D., Loew, L. M., Schaff, J. C., “Ultrasound Image Deconvolution Using Adaptive Inverse Filtering”, 12th IEEE Symposium on Computer-Based Medical Systems, CBMS 1999, pp. 248-253; Sapia, M. A., “Deconvolution of Ultrasonic Waveforms Using an Adaptive Wiener Filter,” Review of Progress in Quantitative Nondestructive Evaluation, Volume 13, Edited by D. O. Thompson and D. E. Chimenti, New York, Plenum Press, pp. 855-862, 1994; Mark Angelo Sapia, “Multi-Dimensional Deconvolution of Optical Microscope and Ultrasound Imaging Using Adaptive Least-Mean-Square (LMS) Inverse Filtering,” Ph.D. Dissertation, University of Connecticut, 2000; Specht, D. F., Blind deconvolution of motion blur using LMS inverse filtering, Lockheed Independent Research, unpublished, 1976; U.S. Pat. No. 6,166,853, to Sapia et al.; and U.S. Pat. Appl. No. 2005/0053305, by Li et al.). This step may not be necessary, as “standard” 2D scanners are constantly improving and may correct sufficiently for aperture blur.
Next, after collecting a sufficient number of closely spaced 2D slices to represent the structures of the heart in the vicinity of the CAs, this information is used to fill the voxels of a 3D array. It is also necessary to compensate for non-uniform sampling and jitter in the positioning of adjacent slices due to motion of the transducer and of the heart. Techniques to accomplish this are discussed in depth in Nadkarni, Seemantini, “Cardiac Motion Synchronization for 3D Cardiac Ultrasound Imaging,” Ph.D. Dissertation, Univ. of Western Ontario, 2002, and Maria J. Ledesma-Carbayo et al., “Spatio-Temporal Nonrigid Registration for Ultrasound Cardiac Motion Estimation,” IEEE Trans. on Medical Imaging, v24, No. 9, September 2005, both of which are incorporated in their entirety by reference herein.
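By way of illustration only, the following sketch (in Python/NumPy) shows one simple way a set of angle-tagged 2D slices might be accumulated into a regular 3D voxel array, with redundant samples averaged. The rotation geometry, array sizes, and function name are assumptions made for the example and do not describe the preferred embodiment or the compensation techniques of the references cited above.

```python
import numpy as np

def slices_to_volume(slices, angles_deg, vol_shape=(256, 256, 256)):
    """Accumulate rotated 2D sector slices into a regular 3D voxel grid.

    slices     : list of 2D arrays (rows = depth samples, cols = lateral samples)
    angles_deg : rotation of each slice about the probe axis, as reported by
                 the angle sensor (illustrative interface)
    Returns the voxel volume with overlapping samples averaged.
    """
    acc = np.zeros(vol_shape, dtype=np.float32)   # summed intensities
    cnt = np.zeros(vol_shape, dtype=np.int32)     # number of samples per voxel
    nz, ny, nx = vol_shape
    for img, ang in zip(slices, angles_deg):
        rows, cols = img.shape
        theta = np.deg2rad(ang)
        # Pixel coordinates in the slice plane: depth index z, lateral offset u.
        z = np.repeat(np.arange(rows), cols)
        u = np.tile(np.arange(cols) - cols / 2.0, rows)
        # Rotate the slice plane about the probe (depth) axis.
        x = u * np.cos(theta) + nx / 2.0
        y = u * np.sin(theta) + ny / 2.0
        ix = np.round(x).astype(int)
        iy = np.round(y).astype(int)
        iz = z * nz // rows
        ok = (ix >= 0) & (ix < nx) & (iy >= 0) & (iy < ny) & (iz < nz)
        np.add.at(acc, (iz[ok], iy[ok], ix[ok]), img.ravel()[ok])
        np.add.at(cnt, (iz[ok], iy[ok], ix[ok]), 1)
    return acc / np.maximum(cnt, 1)               # average redundant samples
```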
Next, in order to accomplish the previous step it is necessary to record the position and angle of each 2D slice relative to the other slices. This can be accomplished by a fixture positioned rigidly above the patient's chest. However, the preferred embodiment of this invention allows the sonographer to scan manually in order to find the best view for a particular patient. The sonographer must take care to scan over the entire volume of interest. Redundant images are discarded or averaged. A sensor to record relative instantaneous positions of the probe may be attached to a standard handheld probe. A gyroscopic stabilizer attached to the probe is used to minimize angular variations except on the axis desired. The gyroscope also provides a reference for measuring the angular position of the probe. A sensor measures the position and provides this information to the computer.
Then, in order to display the coronary arteries with maximum lumen opening, EKG gating or image-based synchronization is employed to capture the images at only one part of the cardiac cycle. Thus the total scan will probably extend over many cardiac cycles. Nadkarni, cited above, shows that image-based synchronization is more reliable than traditional EKG synchronization, but it is more complicated. Therefore image-based synchronization is an alternate method that may be employed.
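As a minimal sketch of the gating described above, the fragment below keeps only the frames acquired near one chosen phase of the cardiac cycle, assuming R-wave trigger times are available from the EKG input; the target phase and tolerance values are illustrative, and image-based synchronization would simply replace the trigger source.

```python
import numpy as np

def gate_frames(frame_times, r_wave_times, phase=0.4, tolerance=0.05):
    """Keep only frames acquired near one chosen phase of the cardiac cycle.

    frame_times  : acquisition time of each 2D frame, in seconds
    r_wave_times : R-wave trigger times from the EKG input, in seconds
    phase        : target fraction of the local R-R interval (0..1)
    tolerance    : accepted deviation from the target phase
    Returns the indices of the accepted frames.
    """
    frame_times = np.asarray(frame_times, dtype=float)
    r = np.asarray(r_wave_times, dtype=float)
    # Index of the R wave preceding each frame.
    idx = np.searchsorted(r, frame_times, side='right') - 1
    keep = []
    for i in range(len(frame_times)):
        if idx[i] < 0 or idx[i] >= len(r) - 1:
            continue                                   # no bracketing R-R interval
        rr = r[idx[i] + 1] - r[idx[i]]                 # local R-R interval
        frac = (frame_times[i] - r[idx[i]]) / rr       # cardiac phase of this frame
        if abs(frac - phase) <= tolerance:
            keep.append(i)
    return np.array(keep, dtype=int)
```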
Next, the images are segmented. Linear filtering may not be effective in the third dimension because the time between slices is necessarily large and the beating heart cannot be considered a stationary target. There will also be jitter displacement between adjacent slices. It is not an object of this invention simply to perform deconvolution of a point spread function in three dimensions (although this could be done). Rather, as indicated above, it is an object of this invention to discriminate between blood-filled areas and other areas of tissue so as to display only the blood-filled areas. These will include, in addition to the main chambers of the heart, the coronary arteries and veins. It is a further object of this invention to separate the arteries and veins from everything else so that they can be studied in detail. The separation is accomplished by further discriminating between large blood-filled areas and narrow ones, such as arteries and veins, and displaying only the latter. Several feedforward neural networks can be used for this step. All require training on examples of the two categories (e.g., blood-filled areas and other tissue). The most popular neural networks for this step include: (1) the Multi-Layer Perceptron (discussed in Simon Haykin, Neural Networks: A Comprehensive Foundation, 2nd Edition, Prentice Hall, 1999); (2) the Probabilistic Neural Network (variously considered in D. F. Specht, “Probabilistic Neural Networks,” Neural Networks, vol. 3, pp. 109-118, 1990; D. F. Specht, “Enhancements to Probabilistic Neural Networks,” Proc. IEEE International Joint Conference on Neural Networks, Baltimore, Md., June 1992; and D. F. Specht and H. Romsdahl, “Experience with Adaptive PNN and Adaptive GRNN,” Proc. IEEE International Joint Conference on Neural Networks, vol. 2, pp. 1203-1208, Orlando, Fla., June 1994); and (3) the Support Vector Machine (discussed in Nello Cristianini and John Shawe-Taylor, An Introduction to Support Vector Machines, Cambridge University Press, 2000).
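For illustration, a minimal Probabilistic Neural Network classifier of the kind cited above is sketched below in Python/NumPy. The choice of feature vector (here, simply a voxel's neighborhood intensities), the kernel width, and the function name are assumptions made for the example; any of the other classifiers listed could be substituted.

```python
import numpy as np

def pnn_classify(train_x, train_y, test_x, sigma=0.15):
    """Minimal Probabilistic Neural Network (Parzen-window) classifier.

    train_x : (n, d) training feature vectors, e.g. neighborhood intensities
              around labeled voxels (illustrative feature choice)
    train_y : (n,) class labels, e.g. 1 = narrow blood-filled, 0 = other tissue
    test_x  : (m, d) feature vectors to classify
    sigma   : Gaussian kernel width (the PNN smoothing parameter)
    Returns the predicted label for each test vector.
    """
    classes = np.unique(train_y)
    scores = np.empty((len(test_x), len(classes)))
    for j, c in enumerate(classes):
        xc = train_x[train_y == c]                      # training patterns of class c
        # Squared Euclidean distance from every test vector to every pattern.
        d2 = ((test_x[:, None, :] - xc[None, :, :]) ** 2).sum(axis=2)
        # Class score = average Gaussian kernel response (Parzen density estimate).
        scores[:, j] = np.exp(-d2 / (2.0 * sigma ** 2)).mean(axis=1)
    return classes[np.argmax(scores, axis=1)]
```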
Alternatively, the broad categories of blood-filled areas and other tissue areas can be more specifically designated as tubular blood-filled areas vs. everything else, thereby letting the neural network suppress the large blood-filled areas of the chambers of the heart. It is important to note that neural networks are not limited to linear relationships. In addition to classifying each voxel as either tubular blood-filled or not, image segmentation can be further enhanced by the influence of neighboring pixels or voxels, as described in M. Morrison and Y. Attikiouzel, “A probabilistic neural network based image segmentation network for magnetic resonance images,” in Proc. Conf. Neural Networks, Baltimore, Md., 1992, vol. 3, pp. 60-65. This paper describes an image segmentation process in which only neighboring pixels are considered, but neighboring voxels in the 3D representation can be used in the presently inventive ultrasound application.
Alternatively, discrimination between blood-filled tissues and other tissues can be accomplished using techniques such as Classification and Regression Trees (CART), Hidden Markov Models (HMM) or Fuzzy Logic. An efficient classifier is found in the latest version of the General Regression Neural Network, described in a paper authored by the present inventor, D. F. Specht, “GRNN with Double Clustering,” Proc. IEEE International Joint Conference on Neural Networks, Vancouver, Canada, Jul. 16-21, 2006.
Finally, the point of view of the 3D image is rotated so that one group of coronary arteries at a time can be observed without being obscured by the others, by non-suppressed portions of the main chambers of the heart, or by other artifacts.
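One simple way to form such a display, sketched below for illustration, is a maximum-intensity projection of the segmented volume after rotating it to the selected view angles; the angles stand in for values that would be supplied by the pointing device, and the use of scipy.ndimage for the rotation is an implementation assumption.

```python
import numpy as np
from scipy.ndimage import rotate

def project_volume(volume, yaw_deg=0.0, pitch_deg=0.0):
    """Project a segmented 3D volume onto a 2D image for a given view angle.

    volume : 3D array of segmented voxel values (narrow blood-filled areas)
    yaw_deg, pitch_deg : view angles, e.g. driven by mouse movement
    Returns a 2D maximum-intensity projection along the viewing axis.
    """
    v = rotate(volume, yaw_deg, axes=(0, 2), reshape=False, order=1)
    v = rotate(v, pitch_deg, axes=(0, 1), reshape=False, order=1)
    return v.max(axis=0)   # maximum-intensity projection toward the viewer
```

In an interactive display, mouse motion would simply update yaw_deg and pitch_deg and the projection would be recomputed.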
Linear filtering such as LMS or Wiener filtering is very effective when the point spread function of the imaging system is known and the object is stationary. The point spread function can be measured using a known tissue phantom when the object is reasonably stationary during a single 2D scan with duration on the order of 1/30 second. However, the second condition does not apply for the time required to acquire a 3D volume. For this reason a linear deconvolution filter is not a good choice for the third dimension. An artificial neural network (“ANN”) has the capacity to select the surrounding voxels that contribute to a reliable classification, reject the voxels that detract, and weight those that are intermediate. Clearly the voxels within a given 2D slice will have more relevance than the others, but significant information can be extracted from adjacent slices.
A second reason for using a nonlinear filter is that small and large echoes from solid tissue, including specular reflections and speckle patterns, must all be displayed similarly as small values compared to those of the blood-filled areas. Although linear filtering with thresholding could accomplish this portion of the task, ANNs are inherently nonlinear.
The Wiener Filter: The Wiener filter is not new, but since it is important for the deblurring step, it will be described briefly here in the context of the present invention.
The Wiener filter is the mean squared error optimal stationary linear filter for images degraded by additive noise and blurring. Wiener filters are usually applied in the frequency domain. Given a degraded image i(n,m), one takes the Discrete Fourier Transform (DFT) or the Fast Fourier Transform (FFT) to obtain I(u,v). The true image spectrum is estimated by taking the product of I(u,v) with the Wiener filter G(u,v): Ŝ(u,v) = G(u,v)I(u,v)
The inverse DFT or FFT is then used to obtain the image estimate s(n,m) from its spectrum. The Wiener filter is defined in terms of these spectra:
H(u,v): Fourier transform of the point spread function (psf)
Ps(u,v): power spectrum of the signal process, obtained by taking the Fourier transform of the signal autocorrelation
Pn(u,v): power spectrum of the noise process, obtained by taking the Fourier transform of the noise autocorrelation
The Wiener filter is:
G(u,v) = H*(u,v)Ps(u,v) / [|H(u,v)|²Ps(u,v) + Pn(u,v)]
The ratio Ps/Pn can be interpreted as the signal-to-noise ratio. At frequencies with a high signal-to-noise ratio, the Wiener filter becomes H⁻¹(u,v), the inverse filter for the psf. At frequencies for which the signal-to-noise ratio is low, the Wiener filter tends to 0 and blocks them out.
Ps(u,v) + Pn(u,v) = |I(u,v)|². The right-hand side is easy to compute from the Fourier transform of the observed data. Pn(u,v) is often assumed to be constant over (u,v). It is then subtracted from the total to yield Ps(u,v).
The psf can be measured by observing a wire phantom in a tank using the ultrasound instrument. The Fourier transform of the psf can then be stored for later use in the Wiener filter when examining patients.
Because the psf is not constant as a function of range, the Wiener filter will have to be applied separately for several range zones and the resulting images will have to be pieced together to form one image for display. A useful compromise might be to optimize the Wiener filter just for the range of the object of interest such as a coronary artery or valve.
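For illustration, the frequency-domain Wiener deconvolution described above might be coded as follows, assuming the psf has already been measured from a phantom and the noise power spectrum is approximated by a constant as discussed; the constant and the function name are illustrative only, and in practice the filter would be applied per range zone as noted.

```python
import numpy as np

def wiener_deconvolve(image, psf, noise_power=1e-3):
    """Frequency-domain Wiener deconvolution of one 2D ultrasound image.

    image       : degraded 2D image i(n, m)
    psf         : measured point spread function, zero-padded to the image size
    noise_power : assumed constant noise power spectrum Pn(u, v)
    Returns the restored image estimate s(n, m).
    """
    I = np.fft.fft2(image)
    H = np.fft.fft2(psf, s=image.shape)            # transform of the psf
    # Estimate the signal spectrum: Ps = |I|^2 - Pn, clipped to stay positive.
    Ps = np.maximum(np.abs(I) ** 2 - noise_power, 1e-12)
    # Wiener filter G = conj(H) Ps / (|H|^2 Ps + Pn).
    G = np.conj(H) * Ps / (np.abs(H) ** 2 * Ps + noise_power)
    S_hat = G * I                                   # estimated true image spectrum
    return np.real(np.fft.ifft2(S_hat))
```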
An Adaptive Inverse Filter: As pointed out by Sapia and Fox (1999), an adaptive filter in the spatial domain is essentially equivalent to the Wiener filter implemented in the frequency domain and has some additional advantages. The main advantages are the simplicity of the algorithm and the fact that, being adaptive, it minimizes sources of noise such as edge effects in addition to deconvolving the blurring that results from the point spread function of the system.
In the spatial domain a transversal filter 100 as in
Let X_k be an N-dimensional vector of the inputs used for estimating y[k], and let W_k be the set of weights after training on k samples. The linear estimate when trained will be y[k] = X_k^T W.
For training it is necessary to know the desired value of the output pixels, d[k] 130. These can be obtained by imaging a phantom with known geometry. After each training iteration, the error can be evaluated as
ε_k = d[k] − X_k^T W_k.
The LMS algorithm in equation form is:
W_{k+1} = W_k + 2μ ε_k X_k
where μ is the convergence coefficient. [See B. Widrow and S. D. Stearns, Adaptive Signal Processing, Englewood Cliffs, N.J., Prentice-Hall, pp. 99-137, 1985.]
Because the psf is not constant as a function of range, the adaptive inverse filter also will have to be applied separately for several range zones and the resulting images will have to be pieced together to form one image for display.
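The LMS training and application steps defined by the equations above might be sketched as follows, assuming a blurred/true image pair obtained by imaging a phantom with known geometry; the kernel size, convergence coefficient, and number of passes are illustrative values only.

```python
import numpy as np
from scipy.signal import convolve2d

def lms_train(blurred, desired, taps=7, mu=1e-4, epochs=3):
    """Train a 2D LMS inverse (transversal) filter on a phantom image pair.

    blurred : observed (blurred) phantom image supplying the inputs X_k
    desired : known true phantom image supplying the desired outputs d[k]
    taps    : side length of the square weight kernel (taps x taps weights)
    mu      : convergence coefficient
    Returns the trained weight kernel W.
    """
    half = taps // 2
    W = np.zeros(taps * taps)
    rows, cols = blurred.shape
    for _ in range(epochs):
        for r in range(half, rows - half):
            for c in range(half, cols - half):
                X = blurred[r - half:r + half + 1,
                            c - half:c + half + 1].ravel()   # input vector X_k
                err = desired[r, c] - X @ W                   # eps_k = d[k] - X_k^T W_k
                W = W + 2.0 * mu * err * X                    # LMS weight update
    return W.reshape(taps, taps)

def lms_apply(image, W):
    """Apply the trained inverse filter to a patient image."""
    # Correlation with W is convolution with the flipped kernel.
    return convolve2d(image, W[::-1, ::-1], mode='same')
```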
Required Hardware: A typical hardware system 200 incorporating the invention is shown in
The standard scanner includes a phased array probe 220 shown separately. Mechanically attached to the probe is a gyroscopic stabilizer and position sensor 230 which must, at minimum, measure the relative angle of the probe as it is rotated to insonify the volume of the heart. For the purpose intended, it need not insonify the entire heart muscle, but only the main coronary arteries. The angle could be referenced to a stationary fixture. However, because only the angle is required, the reference can be a gyroscope attached to the probe. Small variations in the location of the probe from one scan to the next can be compensated by software using correlation or related techniques. A more-complex model could use integrating accelerometers to maintain exact position information.
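As an illustrative example of the correlation-based compensation mentioned above, small translations between overlapping slices could be estimated by phase correlation; the particular technique and the function name below are assumptions for the sketch, not a statement of the claimed method.

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the (row, col) translation between two overlapping 2D slices
    by phase correlation, one of the 'correlation or related techniques'
    mentioned above (implementation choice is illustrative)."""
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(img)
    cross = F1 * np.conj(F2)
    cross /= np.maximum(np.abs(cross), 1e-12)       # normalized cross-power spectrum
    corr = np.real(np.fft.ifft2(cross))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peaks beyond half the image size to negative offsets.
    shift = tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))
    return shift
```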
The output of the standard scanner is displayed on a monitor 240. The display typically has several modes including 2D sector, M mode, and Doppler.
A computer 250 is necessary to implement the software algorithms described. Inputs to the computer include the processed scan line returns 260 from the sector scanner, the position sensors 270 mentioned above, some sort of pointing device such as a mouse 280, and usually electrocardiogram sensor input 290 from an EKG sensor 300 for synchronization. The scan line information needed for LMS filtering or Wiener filtering is that of individual scan lines after beam formation but before scan conversion for display. The scan line information needed for discrimination, classification, and segmentation is the output 310 of the scan converter 320 because dimensions are constant after this conversion. After discrimination, classification, and segmentation, a 3D image is displayed on a monitor 330.
Simulated Display: Using the inventive method, a simulated representation of the coronary arteries and veins has been formed in a three-dimensional format. Two-dimensional projections have been formed under control of a mouse to select view angles that best highlight the areas of interest. Results from two particular view angles 400 and 500, respectively, are reproduced in
Gyroscopic Stabilization and Angle Measurement: When collecting data for a 3D presentation, it is important to keep the phased array transducers in the same position while the probe is being rotated. Otherwise the spatial position (x, y, and z) and the angular position (pitch, yaw, and roll) must be continuously measured and compensated for by software. The fixture mounted over the patient mentioned in the previous section can assure that, of the six degrees of freedom, all are fixed during the collection of data except either roll or pitch, which is intentionally changed and measured. Some compensation for spatial position will still have to be done (by correlation techniques) because of the motion of the heart. The fixture satisfies the technical requirements for the instrumentation, but sacrifices the convenience that comes with hand positioning of the probe to find the best view for a particular patient.
A preferred embodiment of the probe for this invention includes the following: First and second gyroscopes mechanically attached to a standard phased array probe designed for two dimensional scans. The first gyroscope is aligned with the center line of the sector scan (i.e., perpendicular to the surface of the skin and aimed at the tissue to be imaged). The second gyroscope is mounted perpendicular to the alignment of the first gyroscope with a sensor to measure the relative angle between it and the probe handle as it is mechanically rotated.
The first gyroscope is motorized and has sufficient mass and rotational speed to stabilize the handheld probe in pitch and yaw even though the technician's hand might apply slight pressure to twist it in pitch or yaw. The second gyroscope can be much smaller in mass (or it could be solid state, such as piezoelectric) as its only function is to measure the rotation angle of the probe.
In operation, the gyroscopes would initially be turned off while the technician positions the probe to find the cone which best contains the volume of interest. When he or she is satisfied with the orientation of the probe, he or she turns on the gyroscopes to stabilize that orientation and then scans through the range of rotation angles (180 degrees covers the entire cone, but a more limited range may be sufficient). The scanning through rotation angles may be done by hand or it could be automated.
An alternate embodiment of the probe for this invention includes the following: First and second gyroscopes mechanically attached to a standard phased array probe designed for two dimensional scans. The first gyroscope is aligned parallel to the line of transducer elements in the probe (or the closest mean-squared-error fit line in the case of a curved array). It is not instrumented for angle measurement. The second gyroscope is mounted perpendicular to the alignment of the first gyroscope with a sensor to measure the relative angle between it and the probe handle.
The first gyroscope is motorized and has sufficient mass and rotational speed to stabilize the handheld probe in yaw and roll, even though the technician's hand might apply slight pressure to twist it in yaw and roll. For these purposes, pitch angle is defined as angle of rotation about the axis of the first gyroscope. The second gyroscope can be much smaller in mass (or it could be solid state, such as piezoelectric) as its only function is to measure the pitch angle.
In operation, the gyroscopes would initially be turned off while the technician positions the probe to find the best range of views. When he or she is satisfied with the orientation of the probe, he or she turns on the gyroscopes to stabilize that orientation and then scans through the range of pitch angles.
Another implementation of the probe is to have a two dimensional array of phased array transducers so that the angles can be adjusted electronically.
The above disclosure is sufficient to enable one of ordinary skill in the art to practice the invention, and provides the best mode of practicing the invention presently contemplated by the inventor. While there is provided herein a full and complete disclosure of the preferred embodiments of this invention, it is not desired to limit the invention to the exact construction, dimensional relationships, and operation shown and described. Various modifications, alternative constructions, changes and equivalents will readily occur to those skilled in the art and may be employed, as suitable, without departing from the true spirit and scope of the invention. Such changes might involve alternative materials, components, structural arrangements, sizes, shapes, forms, functions, operational features or the like.
Therefore, the above description and illustrations should not be construed as limiting the scope of the invention, which is defined by the appended claims.
This application is a continuation of U.S. patent application Ser. No. 11/532,013 filed Sep. 14, 2006 now U.S. Pat. No. 8,105,239; which application claims the benefit of U.S. Provisional Patent Application No. 60/765,887, filed Feb. 6, 2006; which applications are incorporated by reference in their entirety herein.
Number | Name | Date | Kind |
---|---|---|---|
3174286 | Erickson | Mar 1965 | A |
3895381 | Kock | Jul 1975 | A |
3974692 | Hassler | Aug 1976 | A |
4055988 | Dutton | Nov 1977 | A |
4072922 | Taner et al. | Feb 1978 | A |
4097835 | Green | Jun 1978 | A |
4105018 | Greenleaf et al. | Aug 1978 | A |
4259733 | Taner et al. | Mar 1981 | A |
4265126 | Papadofrangakis et al. | May 1981 | A |
4271842 | Specht et al. | Jun 1981 | A |
4325257 | Kino et al. | Apr 1982 | A |
4327738 | Green et al. | May 1982 | A |
4452084 | Taenzer | Jun 1984 | A |
4501279 | Seo | Feb 1985 | A |
4539847 | Paap | Sep 1985 | A |
4566459 | Umemura et al. | Jan 1986 | A |
4567768 | Satoh et al. | Feb 1986 | A |
4604697 | Luthra et al. | Aug 1986 | A |
4662222 | Johnson | May 1987 | A |
4669482 | Ophir | Jun 1987 | A |
4682497 | Sasaki | Jul 1987 | A |
4781199 | Hirama et al. | Nov 1988 | A |
4817434 | Anderson | Apr 1989 | A |
4831601 | Breimesser et al. | May 1989 | A |
4893284 | Magrane | Jan 1990 | A |
4893628 | Angelsen | Jan 1990 | A |
5050588 | Grey et al. | Sep 1991 | A |
5141738 | Rasor et al. | Aug 1992 | A |
5161536 | Vilkomerson et al. | Nov 1992 | A |
5197475 | Antich et al. | Mar 1993 | A |
5226019 | Bahorich | Jul 1993 | A |
5230339 | Charlebois | Jul 1993 | A |
5269309 | Fort et al. | Dec 1993 | A |
5278757 | Hoctor et al. | Jan 1994 | A |
5293871 | Reinstein et al. | Mar 1994 | A |
5299576 | Shiba | Apr 1994 | A |
5301674 | Erikson et al. | Apr 1994 | A |
5305756 | Entrekin et al. | Apr 1994 | A |
5339282 | Kuhn et al. | Aug 1994 | A |
5340510 | Bowen | Aug 1994 | A |
5345426 | Lipschutz | Sep 1994 | A |
5349960 | Gondo | Sep 1994 | A |
5355888 | Kendall | Oct 1994 | A |
5398216 | Hall et al. | Mar 1995 | A |
5442462 | Guissin | Aug 1995 | A |
5454372 | Banjanin et al. | Oct 1995 | A |
5515853 | Smith et al. | May 1996 | A |
5515856 | Olstad et al. | May 1996 | A |
5522393 | Phillips et al. | Jun 1996 | A |
5526815 | Granz et al. | Jun 1996 | A |
5544659 | Banjanin | Aug 1996 | A |
5558092 | Unger et al. | Sep 1996 | A |
5564423 | Mele et al. | Oct 1996 | A |
5568812 | Murashita et al. | Oct 1996 | A |
5570691 | Wright et al. | Nov 1996 | A |
5581517 | Gee et al. | Dec 1996 | A |
5625149 | Gururaja et al. | Apr 1997 | A |
5628320 | Teo | May 1997 | A |
5673697 | Bryan et al. | Oct 1997 | A |
5675550 | Ekhaus | Oct 1997 | A |
5720291 | Schwartz | Feb 1998 | A |
5720708 | Lu et al. | Feb 1998 | A |
5744898 | Smith et al. | Apr 1998 | A |
5769079 | Hossack | Jun 1998 | A |
5784334 | Sena et al. | Jul 1998 | A |
5785654 | Iinuma et al. | Jul 1998 | A |
5795297 | Daigle | Aug 1998 | A |
5797845 | Barabash et al. | Aug 1998 | A |
5798459 | Ohba et al. | Aug 1998 | A |
5820561 | Olstad et al. | Oct 1998 | A |
5838564 | Bahorich et al. | Nov 1998 | A |
5850622 | Vassiliou et al. | Dec 1998 | A |
5862100 | VerWest | Jan 1999 | A |
5870691 | Partyka et al. | Feb 1999 | A |
5876342 | Chen et al. | Mar 1999 | A |
5891038 | Seyed-Bolorforosh et al. | Apr 1999 | A |
5892732 | Gersztenkorn | Apr 1999 | A |
5919139 | Lin | Jul 1999 | A |
5920285 | Benjamin | Jul 1999 | A |
5930730 | Marfurt et al. | Jul 1999 | A |
5940778 | Marfurt et al. | Aug 1999 | A |
5951479 | Holm et al. | Sep 1999 | A |
5964707 | Fenster et al. | Oct 1999 | A |
5969661 | Benjamin | Oct 1999 | A |
5999836 | Nelson et al. | Dec 1999 | A |
6007499 | Martin et al. | Dec 1999 | A |
6013032 | Savord | Jan 2000 | A |
6014473 | Hossack et al. | Jan 2000 | A |
6049509 | Sonneland et al. | Apr 2000 | A |
6050943 | Slayton et al. | Apr 2000 | A |
6056693 | Haider | May 2000 | A |
6058074 | Swan et al. | May 2000 | A |
6077224 | Lang et al. | Jun 2000 | A |
6092026 | Bahorich et al. | Jul 2000 | A |
6122538 | Sliwa, Jr. et al. | Sep 2000 | A |
6123670 | Mo | Sep 2000 | A |
6129672 | Seward et al. | Oct 2000 | A |
6135960 | Holmberg | Oct 2000 | A |
6138075 | Yost | Oct 2000 | A |
6148095 | Prause et al. | Nov 2000 | A |
6162175 | Marian, Jr. et al. | Dec 2000 | A |
6166384 | Dentinger et al. | Dec 2000 | A |
6166853 | Sapia et al. | Dec 2000 | A |
6193665 | Hall et al. | Feb 2001 | B1 |
6196739 | Silverbrook | Mar 2001 | B1 |
6200266 | Shokrollahi et al. | Mar 2001 | B1 |
6210335 | Miller | Apr 2001 | B1 |
6213958 | Winder | Apr 2001 | B1 |
6221019 | Kantorovich | Apr 2001 | B1 |
6231511 | Bae | May 2001 | B1 |
6238342 | Feleppa et al. | May 2001 | B1 |
6246901 | Benaron | Jun 2001 | B1 |
6251073 | Imran et al. | Jun 2001 | B1 |
6264609 | Herrington et al. | Jul 2001 | B1 |
6266551 | Osadchy et al. | Jul 2001 | B1 |
6278949 | Alam | Aug 2001 | B1 |
6289230 | Chaiken et al. | Sep 2001 | B1 |
6304684 | Niczyporuk et al. | Oct 2001 | B1 |
6309356 | Ustuner et al. | Oct 2001 | B1 |
6324453 | Breed et al. | Nov 2001 | B1 |
6345539 | Rawes et al. | Feb 2002 | B1 |
6361500 | Masters | Mar 2002 | B1 |
6363033 | Cole et al. | Mar 2002 | B1 |
6370480 | Gupta et al. | Apr 2002 | B1 |
6374185 | Taner et al. | Apr 2002 | B1 |
6394955 | Perlitz | May 2002 | B1 |
6423002 | Hossack | Jul 2002 | B1 |
6436046 | Napolitano et al. | Aug 2002 | B1 |
6449821 | Sudol et al. | Sep 2002 | B1 |
6450965 | Williams et al. | Sep 2002 | B2 |
6468216 | Powers et al. | Oct 2002 | B1 |
6471650 | Powers et al. | Oct 2002 | B2 |
6475150 | Haddad | Nov 2002 | B2 |
6480790 | Calvert et al. | Nov 2002 | B1 |
6487502 | Taner | Nov 2002 | B1 |
6499536 | Ellingsen | Dec 2002 | B1 |
6508768 | Hall et al. | Jan 2003 | B1 |
6508770 | Cai | Jan 2003 | B1 |
6517484 | Wilk et al. | Feb 2003 | B1 |
6526163 | Halmann et al. | Feb 2003 | B1 |
6543272 | Vitek | Apr 2003 | B1 |
6547732 | Jago | Apr 2003 | B2 |
6551246 | Ustuner et al. | Apr 2003 | B1 |
6565510 | Haider | May 2003 | B1 |
6585647 | Winder | Jul 2003 | B1 |
6604421 | Li | Aug 2003 | B1 |
6614560 | Silverbrook | Sep 2003 | B1 |
6620101 | Azzam et al. | Sep 2003 | B2 |
6668654 | Dubois et al. | Dec 2003 | B2 |
6672165 | Rather et al. | Jan 2004 | B2 |
6681185 | Young et al. | Jan 2004 | B1 |
6690816 | Aylward et al. | Feb 2004 | B2 |
6692450 | Coleman | Feb 2004 | B1 |
6695778 | Golland et al. | Feb 2004 | B2 |
6719693 | Richard | Apr 2004 | B2 |
6728567 | Rather et al. | Apr 2004 | B2 |
6755787 | Hossack et al. | Jun 2004 | B2 |
6790182 | Eck et al. | Sep 2004 | B2 |
6837853 | Marian | Jan 2005 | B2 |
6843770 | Sumanaweera | Jan 2005 | B2 |
6847737 | Kouri et al. | Jan 2005 | B1 |
6932767 | Landry et al. | Aug 2005 | B2 |
7033320 | Von Behren et al. | Apr 2006 | B2 |
7087023 | Daft et al. | Aug 2006 | B2 |
7104956 | Christopher | Sep 2006 | B1 |
7221867 | Silverbrook | May 2007 | B2 |
7231072 | Yamano et al. | Jun 2007 | B2 |
7269299 | Schroeder | Sep 2007 | B2 |
7283652 | Mendonca et al. | Oct 2007 | B2 |
7285094 | Nohara et al. | Oct 2007 | B2 |
7313053 | Wodnicki | Dec 2007 | B2 |
7366704 | Reading et al. | Apr 2008 | B2 |
7402136 | Hossack et al. | Jul 2008 | B2 |
7410469 | Talish et al. | Aug 2008 | B1 |
7443765 | Thomenius et al. | Oct 2008 | B2 |
7444875 | Wu et al. | Nov 2008 | B1 |
7447535 | Lavi | Nov 2008 | B2 |
7448998 | Robinson | Nov 2008 | B2 |
7466848 | Metaxas et al. | Dec 2008 | B2 |
7469096 | Silverbrook | Dec 2008 | B2 |
7474778 | Shinomura et al. | Jan 2009 | B2 |
7497830 | Li | Mar 2009 | B2 |
7510529 | Chou et al. | Mar 2009 | B2 |
7514851 | Wilser et al. | Apr 2009 | B2 |
7549962 | Dreschel et al. | Jun 2009 | B2 |
7574026 | Rasche et al. | Aug 2009 | B2 |
7625343 | Cao et al. | Dec 2009 | B2 |
7668583 | Fegert et al. | Feb 2010 | B2 |
7682311 | Simopoulos et al. | Mar 2010 | B2 |
7750311 | Daghighian | Jul 2010 | B2 |
7787680 | Ahn et al. | Aug 2010 | B2 |
7822250 | Yao et al. | Oct 2010 | B2 |
7860290 | Gulsun et al. | Dec 2010 | B2 |
7862508 | Davies et al. | Jan 2011 | B2 |
7914451 | Davies | Mar 2011 | B2 |
7919906 | Cerofolini | Apr 2011 | B2 |
8007439 | Specht | Aug 2011 | B2 |
8105239 | Specht | Jan 2012 | B2 |
8412307 | Willis et al. | Apr 2013 | B2 |
8672846 | Napolitano et al. | Mar 2014 | B2 |
20020035864 | Paltieli et al. | Mar 2002 | A1 |
20020087071 | Schmitz et al. | Jul 2002 | A1 |
20020161299 | Prater et al. | Oct 2002 | A1 |
20030013962 | Bjaerum et al. | Jan 2003 | A1 |
20030028111 | Vaezy et al. | Feb 2003 | A1 |
20030040669 | Grass et al. | Feb 2003 | A1 |
20030228053 | Li et al. | Dec 2003 | A1 |
20040054283 | Corey et al. | Mar 2004 | A1 |
20040068184 | Trahey et al. | Apr 2004 | A1 |
20040100163 | Baumgartner et al. | May 2004 | A1 |
20040111028 | Abe et al. | Jun 2004 | A1 |
20040122313 | Moore et al. | Jun 2004 | A1 |
20040122322 | Moore et al. | Jun 2004 | A1 |
20040138565 | Trucco | Jul 2004 | A1 |
20040144176 | Yoden | Jul 2004 | A1 |
20040236217 | Cerwin et al. | Nov 2004 | A1 |
20040236223 | Barnes et al. | Nov 2004 | A1 |
20050004449 | Mitschke et al. | Jan 2005 | A1 |
20050053305 | Li et al. | Mar 2005 | A1 |
20050054910 | Tremblay et al. | Mar 2005 | A1 |
20050090743 | Kawashima et al. | Apr 2005 | A1 |
20050090745 | Steen | Apr 2005 | A1 |
20050111846 | Steinbacher et al. | May 2005 | A1 |
20050113689 | Gritzky | May 2005 | A1 |
20050113694 | Haugen et al. | May 2005 | A1 |
20050124883 | Hunt | Jun 2005 | A1 |
20050131300 | Bakircioglu et al. | Jun 2005 | A1 |
20050147297 | McLaughlin et al. | Jul 2005 | A1 |
20050165312 | Knowles et al. | Jul 2005 | A1 |
20050203404 | Freiburger | Sep 2005 | A1 |
20050215883 | Hundley et al. | Sep 2005 | A1 |
20050240125 | Makin et al. | Oct 2005 | A1 |
20050252295 | Fink et al. | Nov 2005 | A1 |
20050281447 | Moreau-Gobard et al. | Dec 2005 | A1 |
20050288588 | Weber et al. | Dec 2005 | A1 |
20060062447 | Rinck et al. | Mar 2006 | A1 |
20060074313 | Slayton et al. | Apr 2006 | A1 |
20060074315 | Liang et al. | Apr 2006 | A1 |
20060074320 | Yoo et al. | Apr 2006 | A1 |
20060079759 | Vaillant et al. | Apr 2006 | A1 |
20060079778 | Mo et al. | Apr 2006 | A1 |
20060079782 | Beach et al. | Apr 2006 | A1 |
20060094962 | Clark | May 2006 | A1 |
20060111634 | Wu | May 2006 | A1 |
20060122506 | Davies et al. | Jun 2006 | A1 |
20060173327 | Kim | Aug 2006 | A1 |
20060262291 | Hess et al. | Nov 2006 | A1 |
20060262961 | Holsing et al. | Nov 2006 | A1 |
20070036414 | Georgescu et al. | Feb 2007 | A1 |
20070055155 | Owen et al. | Mar 2007 | A1 |
20070078345 | Mo et al. | Apr 2007 | A1 |
20070088213 | Poland | Apr 2007 | A1 |
20070138157 | Dane et al. | Jun 2007 | A1 |
20070161898 | Hao et al. | Jul 2007 | A1 |
20070167752 | Proulx et al. | Jul 2007 | A1 |
20070167824 | Lee et al. | Jul 2007 | A1 |
20070232914 | Chen et al. | Oct 2007 | A1 |
20070238985 | Smith et al. | Oct 2007 | A1 |
20080181479 | Yang et al. | Jul 2008 | A1 |
20080194958 | Lee et al. | Aug 2008 | A1 |
20080255452 | Entrekin | Oct 2008 | A1 |
20080269604 | Boctor et al. | Oct 2008 | A1 |
20080269613 | Summers et al. | Oct 2008 | A1 |
20080287787 | Sauer et al. | Nov 2008 | A1 |
20080294045 | Ellington et al. | Nov 2008 | A1 |
20080294050 | Shinomura et al. | Nov 2008 | A1 |
20080319317 | Kamiyama et al. | Dec 2008 | A1 |
20090012400 | Guracar et al. | Jan 2009 | A1 |
20090016163 | Freeman et al. | Jan 2009 | A1 |
20090082671 | Guracar et al. | Mar 2009 | A1 |
20090082672 | Guracar et al. | Mar 2009 | A1 |
20090182237 | Angelsen et al. | Jul 2009 | A1 |
20090208080 | Grau et al. | Aug 2009 | A1 |
20090259128 | Stribling | Oct 2009 | A1 |
20100168566 | Bercoff et al. | Jul 2010 | A1 |
20100217124 | Cooley | Aug 2010 | A1 |
20100262013 | Smith et al. | Oct 2010 | A1 |
20100268503 | Specht et al. | Oct 2010 | A1 |
20110178400 | Specht et al. | Jul 2011 | A1 |
20110201933 | Specht et al. | Aug 2011 | A1 |
20110306885 | Specht | Dec 2011 | A1 |
20120057428 | Specht et al. | Mar 2012 | A1 |
20130035595 | Specht | Feb 2013 | A1 |
20130144166 | Specht et al. | Jun 2013 | A1 |
20140043933 | Belevich et al. | Feb 2014 | A1 |
Number | Date | Country |
---|---|---|
1636518 | Jul 2005 | CN |
1781460 | Jun 2006 | CN |
101116622 | Feb 2008 | CN |
101453955 | Jun 2009 | CN |
1949856 | Jul 2008 | EP |
1757955 | Nov 2010 | EP |
1850743 | Dec 2012 | EP |
1594404 | Sep 2013 | EP |
2851662 | Aug 2004 | FR |
S49-11189 | Jan 1974 | JP |
S54-44375 | Apr 1979 | JP |
S55-103839 | Aug 1980 | JP |
57-31848 | Feb 1982 | JP |
59-101143 | Jun 1984 | JP |
S59-174151 | Oct 1984 | JP |
S60-13109 | Jan 1985 | JP |
S60-68836 | Apr 1985 | JP |
2-501431 | May 1990 | JP |
4-67856 | Mar 1992 | JP |
05-042138 | Feb 1993 | JP |
6-125908 | May 1994 | JP |
7-051266 | Feb 1995 | JP |
08-252253 | Oct 1996 | JP |
9-103429 | Apr 1997 | JP |
9-201361 | Aug 1997 | JP |
10-216128 | Aug 1998 | JP |
11-089833 | Apr 1999 | JP |
11-239578 | Sep 1999 | JP |
2001-507794 | Jun 2001 | JP |
2001-245884 | Sep 2001 | JP |
2002-209894 | Jul 2002 | JP |
2002-253548 | Sep 2002 | JP |
2002-253549 | Sep 2002 | JP |
2004-167092 | Jun 2004 | JP |
2004-215987 | Aug 2004 | JP |
2004-337457 | Dec 2004 | JP |
2004-351214 | Dec 2004 | JP |
2005152187 | Jun 2005 | JP |
2005-523792 | Aug 2005 | JP |
2005-526539 | Sep 2005 | JP |
2006-61203 | Mar 2006 | JP |
2006-122657 | May 2006 | JP |
2007-325937 | Dec 2007 | JP |
2008-513763 | May 2008 | JP |
100715132 | Apr 2007 | KR |
WO 9218054 | Oct 1992 | WO |
WO 9800719 | Jan 1998 | WO |
WO 0164109 | Sep 2001 | WO |
WO02084594 | Oct 2002 | WO |
WO2005009245 | Feb 2005 | WO |
WO 2006114735 | Nov 2006 | WO |
Entry |
---|
Sapia et al.; Deconvolution of ultrasonic waveforms using an adaptive Wiener filter; Review of Progress in Quantitative Nondestructive Evaluation; vol. 13A; Plenum Press; pp. 855-862; 1994. |
Call et al.; U.S. Appl. No. 13/971,689 entitled “Ultrasound Imaging System Memory Architecture,” filed Aug. 20, 2013. |
Specht et al.; U.S. Appl. No. 14/078,311 entitled “Imaging with Multiple Aperture Medical Ultrasound and Synchronization of Add-On Systems,” filed Nov. 12, 2013. |
Specht, D. F.; U.S. Appl. No. 14/157,257 entitled “Method and Apparatus to Produce Ultrasonic Images Using Multiple Apertures,” filed Jan. 16, 2014. |
Li et al.; An efficient speckle tracking algorithm for ultrasonic imaging; 24; pp. 215-228; Oct. 1, 2002. |
UCLA Academic Technology; SPSS learning module: How can I analyze a subset of my data; 6 pages; retrieved from the internet (http://www.ats.ucla.edu/stat/spss/modules/subset_analyze.htm) Nov. 26, 2001. |
Wikipedia; Curve fitting; 5 pages; retrieved from the internet (http://en.wikipedia.org/wiki/Curve_fitting) Dec. 19, 2010. |
Kramb et al.; Considerations for using phased array ultrasonics in a fully automated inspection system. Review of Quantitative Nondestructive Evaluation, vol. 23, ed. D. O. Thompson and D. E. Chimenti, pp. 817-825, (month unavailable) 2004. |
Hendee et al.; Medical Imaging Physics; Wiley-Liss, Inc. 4th Edition; Chap. 19-22; pp. 303-353; © 2002. |
Wikipedia; Point cloud; 2 pages; Nov. 24, 2014; retrieved from the internet (https://en.wikipedia.org/w/index.php?title=Point_cloud&oldid=472583138). |
Smith et al.; U.S. Appl. No. 14/526,186 entitled “Universal multiple aperture medical ultrasound probe,” filed Oct. 28, 2014. |
Smith et al.; U.S. Appl. No. 14/595,083 entitled “Concave ultrasound transducers and 3D arrays,” filed Jan. 12, 2015. |
Specht et al.; U.S. Appl. No. 14/279,052 entitled “Ultrasound imaging using apparent point-source transmit transducer,” filed May 15, 2014. |
Specht et al.; U.S. Appl. No. 13/894,192 entitled “Multiple Aperture Ultrasound Array Alignment Fixture,” filed May 14, 2013. |
Cristianini et al.; An Introduction to Support Vector Machines; Cambridge University Press; pp. 93-111; Mar. 2000. |
Feigenbaum, Harvey, M.D.; Echocardiography; Lippincott Williams & Wilkins; Philadelphia; 5th Ed.; pp. 428, 484; Feb. 1994. |
Haykin, Simon; Neural Networks: A Comprehensive Foundation (2nd Ed.); Prentice Hall; pp. 156-187; Jul. 16, 1998. |
Ledesma-Carbayo et al.; Spatio-temporal nonrigid registration for ultrasound cardiac motion estimation; IEEE Trans. on Medical Imaging; vol. 24; No. 9; Sep. 2005. |
Leotta et al.; Quantitative three-dimensional echocardiography by rapid imaging . . . ; J American Society of Echocardiography; vol. 10; No. 8; pp. 830-839; Oct. 1997. |
Morrison et al.; A probabilistic neural network based image segmentation network for magnetic resonance images; Proc. Conf. Neural Networks; Baltimore, MD; vol. 3; pp. 60-65; Jun. 1992. |
Nadkarni et al.; Cardiac motion synchronization for 3D cardiac ultrasound imaging; Ph.D. Dissertation, University of Western Ontario; Jun. 2002. |
Pratt, William K.; Digital Image Processing; John Wiley & Sons; New York; Apr. 1991. |
Press et al.; Cubic spline interpolation; §3.3 in “Numerical Recipes in FORTRAN: The Art of Scientific Computing”, 2nd Ed.; Cambridge, England; Cambridge University Press; pp. 107-110; Sep. 1992. |
Sakas et al.; Preprocessing and volume rendering of 3D ultrasonic data; IEEE Computer Graphics and Applications; pp. 47-54, Jul. 1995. |
Sapia et al.; Ultrasound image deconvolution using adaptive inverse filtering; 12th IEEE Symposium on Computer-Based Medical Systems, CBMS, pp. 248-253; Jun. 1999. |
Sapia, Mark Angelo; Multi-dimensional deconvolution of optical microscope and ultrasound imaging using adaptive least-mean-square (LMS) inverse filtering; Ph.D. Dissertation; University of Connecticut; Jan. 2000. |
Smith et al.; High-speed ultrasound volumetric imaging system. 1. Transducer design and beam steering; IEEE Trans. Ultrason., Ferroelect., Freq. Contr.; vol. 38; pp. 100-108; Mar. 1991. |
Specht et al.; Deconvolution techniques for digital longitudinal tomography; SPIE; vol. 454; presented at Application of Optical Instrumentation in Medicine XII; pp. 319-325; Jun. 1984. |
Specht et al.; Experience with adaptive PNN and adaptive GRNN; Proc. IEEE International Joint Conf. on Neural Networks; vol. 2; pp. 1203-1208; Orlando, FL; Jun. 1994. |
Specht, D.F.; A general regression neural network; IEEE Trans. on Neural Networks; vol. 2.; No. 6; Nov. 1991. |
Specht, D.F.; Blind deconvolution of motion blur using LMS inverse filtering; Lockheed Independent Research (unpublished); Jun. 23, 1975. |
Specht, D.F.; Enhancements to probabilistic neural networks; Proc. IEEE International Joint Conf. on Neural Networks; Baltimore, MD; Jun. 1992. |
Specht, D.F.; GRNN with double clustering; Proc. IEEE International Joint Conf. Neural Networks; Vancouver, Canada; Jul. 16-21, 2006. |
Specht, D.F.; Probabilistic neural networks; Pergamon Press; Neural Networks; vol. 3; pp. 109-118; Feb. 1990. |
Von Ramm et al.; High-speed ultrasound volumetric imaging system. 2. Parallel processing and image display; IEEE Trans. Ultrason., Ferroelect., Freq. Contr.; vol. 38; pp. 109-115; Mar. 1991. |
Wells, P.N.T.; Biomedical ultrasonics; Academic Press; London, New York, San Francisco; pp. 124-125; Mar. 1977. |
Widrow et al.; Adaptive signal processing; Prentice-Hall; Englewood Cliffs, NJ; pp. 99-116; Mar. 1985. |
Adam et al.; U.S. Appl. No. 13/272,098 entitled “Multiple aperture probe internal apparatus and cable assemblies,” filed Oct. 12, 2011. |
Smith et al.; U.S. Appl. No. 13/272,105 entitled “Concave Ultrasound Transducers and 3D Arrays,” filed Oct. 12, 2011. |
Brewer et al.; U.S. Appl. No. 13/730,346 entitled “M-Mode Ultrasound Imaging of Arbitrary Paths,” filed Dec. 28, 2012. |
Specht et al.; U.S. Appl. No. 13/773,340 entitled “Determining Material Stiffness Using Multiple Aperture Ultrasound,” filed Feb. 21, 2013. |
Call et al.; U.S. Appl. No. 13/850,823 entitled “Systems and methods for improving ultrasound image quality by applying weighting factors,” filed Mar. 26, 2013. |
Specht; U.S. Appl. No. 14/754,422 entitled “Method and apparatus to produce ultrasonic images using multiple apertures,” filed Jun. 29, 2015. |
Chen et al.; Maximum-likelihood source localization and unknown sensor location estimation for wideband signals in the near-field; IEEE Transactions on Signal Processing; 50(8); pp. 1843-1854; Aug. 2002. |
Chen et al.; Source localization and tracking of a wideband source using a randomly distributed beamforming sensor array; International Journal of High Performance Computing Applications; 16(3); pp. 259-272; Fall 2002. |
Fernandez et al.; High resolution ultrasound beamforming using synthetic and adaptive imaging techniques; Proceedings IEEE International Symposium on Biomedical Imaging; Washington, D.C.; pp. 433-436; Jul. 7-10, 2002. |
Gazor et al.; Wideband multi-source beamforming with array location calibration and direction finding; Conference on Acoustics, Speech and Signal Processing ICASSP-95; Detroit, MI; vol. 3 IEEE; pp. 1904-1907; May 9-12, 1995. |
Heikkila et al.; A four-step camera calibration procedure with implicit image correction; Proceedings IEEE Computer Society Conference on Computer Vision and Pattern Recognition; San Juan; pp. 1106-1112; Jun. 17-19, 1997. |
Hsu et al.; Real-time freehand 3D ultrasound calibration; CUED/F-INFENG/TR 565; Department of Engineering, University of Cambridge, United Kingdom; 14 pages; Sep. 2006. |
Khamene et al.; A novel phantom-less spatial and temporal ultrasound calibration method; Medical Image Computing and Computer-Assisted Intervention—MICCAI (Proceedings 8th Int. Conf.); Springer Berlin Heidelberg; Palm Springs, CA; pp. 65-72; Oct. 26-29, 2005. |
Wang et al.; Photoacoustic tomography of biological tissues with high cross-section resolution: reconstruction and experiment; Medical Physics; 29(12); pp. 2799-2805; Dec. 2002. |
Wikipedia; Speed of sound; 17 pages; retrieved from the internet (http://en.wikipedia.org/wiki/Speed_of_sound) Feb. 15, 2011. |
Jeffs; Beamforming: a brief introduction; Brigham Young University; 14 pages; retrieved from the internet (http://ens.ewi.tudelft.nl/Education/courses/et4235/Beamforming.pdf); Oct. 2004. |
Smith et al.; U.S. Appl. No. 14/210,015 entitled “Alignment of ultrasound transducer arrays and multiple aperture probe assembly,” filed Mar. 13, 2014. |
Capineri et al., A doppler system for dynamic vector velocity maps; Ultrasound in Medicine & Biology; 28(2); pp. 237-248; Feb. 28, 2002. |
Dunmire et al.; A brief history of vector doppler; Medical Imaging 2001; International Society for Optics and Photonics; pp. 200-214; May 30, 2001. |
Number | Date | Country | |
---|---|---|---|
20120116226 A1 | May 2012 | US |
Number | Date | Country | |
---|---|---|---|
60765887 | Feb 2006 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 11532013 | Sep 2006 | US |
Child | 13333611 | US |