M-mode ultrasound imaging of arbitrary paths

Abstract
Systems and methods of M-mode ultrasound imaging allow for M-mode imaging along user-defined paths. In various embodiments, the user-defined path can be a non-linear path or a curved path. In some embodiments, a system for M-mode ultrasound imaging can comprise a multi-aperture probe with at least a first transmitting aperture and a second receiving aperture. The receiving aperture can be separate from the transmitting aperture. In some embodiments, the transmitting aperture can be configured to transmit an unfocused, spherical ultrasound ping signal into a region of interest. The user-defined path can define a structure of interest within the region of interest.
Description
INCORPORATION BY REFERENCE

All publications and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication or patent application was specifically and individually indicated to be incorporated by reference.


FIELD

This invention generally relates to ultrasound imaging, and more particularly to M-mode imaging of arbitrary paths.


BACKGROUND

Conventional ultrasound (or “scanline based” ultrasound as used herein) utilizes a phased array controller to produce and steer a substantially linear transmit waveform. In order to produce a B-mode image, a sequence of such linear waveforms (or “scanlines”) may be produced and steered so as to scan across a region of interest. Echoes are received along each respective scanline. The individual scanlines from a complete scan may then be combined to form a complete image (sometimes referred to as a “sector scan” image).


A display method known as M-mode (or motion mode) imaging is commonly used in cardiology and other fields where it is desirable to view the motion of imaged objects. In some forms of M-mode imaging, echoes from a one-dimensional line are displayed over time relative to a static reference point in order to allow a clinician to evaluate movement of a particular structure (such as a cardiac wall or valve) over time. Because a traditional scanline-based ultrasound path is directional (along the scanline axis), available M-mode lines tend to be limited to paths along a scanline.


Generally, M-mode imaging provides a graphic indication of positions and movements of structures within a body over time. In some cases, a single stationary focused acoustic beam is fired at a high frame rate and the resulting M-mode images or lines are displayed side-by-side, providing an indication of the function of a heart over multiple heart cycles.


SUMMARY OF THE DISCLOSURE

A method of defining and displaying an m-mode path for display in an ultrasound imaging system, the method comprising transmitting an ultrasound signal from a transmitting transducer element into a region of interest including a structure of interest, receiving echoes with at least one receiving transducer element, producing an image of the region of interest from the received echoes, displaying the image of the region of interest including the structure of interest to a user, defining a one-pixel-wide path through the structure of interest, where the path does not lie along a line that intersects the transmitting transducer element or the receiving transducer element, and displaying a graph of a magnitude of pixels along the path over time.


In some embodiments, the path is non-linear. In other embodiments, the path has at least one curved segment. In one embodiment, the path has at least one linear segment and at least one curved segment. In another embodiment, the path has at least two linear segments that intersect at an angle other than 180 degrees. In some embodiments, the path has at least two discontinuous segments.


In one embodiment, the transmitting transducer element lies on a separate physical transducer array from an array containing the at least one receiving transducer element.


In another embodiment, the transmitting transducer is configured to transmit an unfocused ping ultrasound signal into the region of interest.


In some embodiments, the method further comprises receiving echoes from the entire region of interest with the at least one receiving transducer element, receiving echoes from the entire region of interest with a second receiving transducer element, and producing an image of the region of interest by combining echoes received at the first and second transducer elements.


In some embodiments, defining a path through the structure of interest is performed substantially concurrently with said transmitting and receiving.


In another embodiment, the transmitting transducer is configured to insonify a phased array scan line.


A method of ultrasound imaging is also provided, comprising transmitting ultrasound signals into a region of interest and receiving echoes of the transmitted ultrasound signals with an ultrasound probe, defining a first image window as a portion of the region of interest, identifying an M-mode path intersecting a feature visible in the first image window, displaying data representing the M-mode path on a common display with a B-mode image of the first image window, defining a second image window as a portion of the region of interest that is different than the first image window, and displaying the data representing the M-mode path on a common display with a B-mode image of the second image window.


In one embodiment, all of the method steps are performed during a live real-time imaging session.


In another embodiment, the M-mode path includes at least one non-linear segment. In one embodiment, the M-mode path is not a line intersecting the probe.


In another embodiment, all of the method steps are performed using stored raw echo data retrieved from a raw data memory device.


In some embodiments, the first image window is smaller than and lies entirely within the second image window. In another embodiment, the second image window does not overlap the first image window.


In an additional embodiment, the method further comprises simultaneously displaying the data of the M-mode path on a common display with B-mode images of both the first image window and the second window.


In some embodiments, the M-mode path has at least two discontinuous segments.


A multi-aperture M-mode ultrasound imaging system is also provided, comprising a transmitting transducer element configured to transmit an ultrasound signal into a region of interest including a structure of interest, a receiving transducer element separate from the transmitting transducer element, the receiving transducer element configured to receive echoes from the ultrasound signal, a controller configured to produce an image of the region of interest from the received echoes, an input mechanism configured to receive a user input defining a one-pixel-wide path through the structure of interest, where the path does not lie along a line that intersects the transmitting transducer element or the receiving transducer element, and a display configured to display the region of interest including the structure of interest, the display also configured to display a graph of a magnitude of pixels along the path over time.


In some embodiments, the transmitting transducer is configured to transmit an unfocused ping ultrasound signal into the region of interest.


In another embodiment, the transmitting transducer is configured to transmit an unfocused spherical ping ultrasound signal into the region of interest. In some embodiments, the transmitting transducer is configured to insonify a phased array scan line.





BRIEF DESCRIPTION OF THE DRAWINGS

The novel features of the invention are set forth with particularity in the claims that follow. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:


Having thus summarized the general nature of the invention, embodiments and modifications thereof will become apparent to those skilled in the art from the detailed description below with reference to the attached figures.



FIG. 1A is a block diagram illustrating components of an ultrasound imaging system.



FIG. 1B is a block diagram illustrating another embodiment of an ultrasound imaging system.



FIG. 2 is a section view of a multiple aperture ultrasound imaging probe.



FIG. 3 is a schematic illustration of a multiple aperture ultrasound imaging process using a point-source transmit signal.



FIG. 4A is an illustration of a B-mode ultrasound image with an M-mode path defined through a portion of an imaged object.



FIG. 4B is an illustration of an M-mode graph of the data along the M-mode path of FIG. 4A.



FIG. 5A is an illustration of a B-mode ultrasound image with multiple M-mode paths defined through a portion of an imaged object.



FIG. 5B is an illustration of an M-mode graph of the data along the multiple m-mode paths of FIG. 5A.





DETAILED DESCRIPTION

In traditional ultrasound systems, images are generated by combining echoes from a series of pulses transmitted as phased array scan lines. In such scanline-based ultrasound imaging systems, the coordinate system used by the user interface usually lies along the scan lines. As a result, in such systems, a user interface for selecting an M-mode line typically involves selecting a desired segment of one of the scan lines. However, requiring the use of scan lines as M-mode lines means that the sonographer must position and hold the probe such that at least one of the scanlines intersects an anatomical feature through which an M-mode line is desired. In practice, this may be difficult and/or time consuming, and may limit the field of view.


Embodiments below provide systems and methods for obtaining M-mode data substantially in real-time along an arbitrary and/or user-defined path that does not necessarily lie along an ultrasound scan line. In some embodiments, the path may be a one-dimensional straight line. In other embodiments, the path may comprise a zig-zag pattern, a curved path, or any other non-linear path. As used herein the term “one-dimensional” may refer to a narrow path, whether linear, curved, or otherwise shaped. In some embodiments, a one-dimensional path may have a width of a single display pixel. In other embodiments, a one-dimensional path may have a width greater than one display pixel (e.g., 2 or 3 pixels), but may still have a length that is substantially greater than its width. As will be clear to the skilled artisan, the relationship between actual dimensions of represented objects and image pixels may be any value defined by the imaging system. In some embodiments, the M-mode path is not necessarily a straight line, and may include components at any orientation within the scan plane.


In some embodiments, an ultrasound imaging system may be configured to obtain three-dimensional (3D) image data, in which case an M-mode path may be selected from a displayed 3D volume. For example, an M-mode path may be defined in a 3D volume by selecting a desired plane through the 3D volume, and then defining an M-mode path within the selected 2D plane using any of the systems and methods described herein.


Some embodiments of systems and methods for specifying and displaying arbitrary M-mode lines may be used in conjunction with ping-based and/or multiple aperture ultrasound imaging systems. In other embodiments, systems and methods for specifying and displaying arbitrary M-mode lines as shown and described herein may also be used in conjunction with scanline-based imaging systems.


Ultrasound Imaging System Components



FIG. 1A is a block diagram illustrating components of an ultrasound imaging system that may be used with some embodiments of M-mode imaging systems and methods. The ultrasound system 10 of FIG. 1A may be particularly suited for scanline-based imaging and may be configured for acquiring real-time cardiac images either as 2D tomographic slices or as volumetric image data. The system may include a central controller/processor configured to control the other system components, including the probe 12, which includes one or more transducer arrays, elements of which may transmit and/or receive ultrasound signals. In some embodiments, the transducer array(s) may include 1D, 2D or other dimensional arrays formed from any suitable transducer material. The probe may generally be configured to transmit ultrasonic waves and to receive ultrasonic echo signals. In some embodiments, such transmission and reception may be controlled by a controller which may include a beamformer 14. The echo information from the beamformer 14 may then be processed by a B-mode processor 20 and/or other application-specific processors as needed (e.g., Doppler processors, contrast signal processors, elastography processors, etc.).


The B-Mode processor 20 may be configured to perform functions that include but are not limited to filtering, frequency and spatial compounding, harmonic data processing and other B-Mode functions. In some embodiments, the processed data may then be passed through a scan converter 24 configured to geometrically correct the data from a linear or polar geometry used by a phased-array scanning probe into a Cartesian format (x,y or x,y,z) with appropriate scaling in each dimension. In some embodiments, such as the embodiment described below with reference to FIGS. 2 and 3, a scan converter 24 may be omitted from the system.
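

By way of illustration only, the geometric correction performed by a scan converter may be sketched in code. The following minimal example assumes samples acquired on a polar (range, angle) grid and uses nearest-neighbor lookup; the function and parameter names are hypothetical and are not intended to describe any particular scan converter implementation.

/* Illustrative sketch of polar-to-Cartesian scan conversion
 * (nearest-neighbor). Assumes samples are stored as [angle][range] with the
 * phased-array apex at the origin. All names are hypothetical; a practical
 * scan converter would typically interpolate between samples. */
#include <math.h>

void scan_convert(const float *samples, int n_angles, int n_ranges,
                  float angle_min, float angle_step, float range_step,
                  float *image, int width, int height, float pixel_size)
{
    for (int row = 0; row < height; row++) {
        for (int col = 0; col < width; col++) {
            float x = (col - width / 2) * pixel_size;   /* lateral position */
            float z = row * pixel_size;                 /* depth */
            float r = sqrtf(x * x + z * z);             /* range from apex */
            float th = atan2f(x, z);                    /* steering angle */
            int ri = (int)(r / range_step + 0.5f);
            int ai = (int)((th - angle_min) / angle_step + 0.5f);
            if (ri >= 0 && ri < n_ranges && ai >= 0 && ai < n_angles)
                image[row * width + col] = samples[ai * n_ranges + ri];
            else
                image[row * width + col] = 0.0f;        /* outside the sector */
        }
    }
}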


Data for each 2D image or 3D volume may then be stored in a memory 28. The memory 28 may be volatile and/or non-volatile memory configured to store a few seconds up to several minutes or more of 2D or 3D echo image data. The video processor 26 may be configured to take the echo data stored in memory 28 and instructions from the central controller 16 to form video images, including any added graphic overlays and/or text annotation (e.g., patient information). Processed video data may then be passed on to the display 30 for presentation to the operator. The central controller 16 can direct the video processor 26 to display the most recently acquired data in memory as a real-time display, or it can replay sequences of older stored 2D slice or 3D volume data.


An M-mode processor 235 may also be provided to receive a definition of an M-mode path from a user interface and to form the images displaying the selected M-mode data in a desired output format. In some embodiments, an M-mode processor 235 may also include a (volatile or non-volatile) memory device for storing the defined M-mode path. In some embodiments, an M-mode processor 235 may be logically positioned between the video processor 26 and the display 30 in the diagram of FIG. 1A. In other embodiments, an M-mode processor 235 may be a set of functions built into the video processor 26 or another component of the system.



FIG. 1B illustrates another embodiment of an ultrasound imaging system 200 comprising an ultrasound probe 202 which may include a plurality of individual ultrasound transducer elements, some of which may be designated as transmit elements, and others of which may be designated as receive elements. In some embodiments, each probe transducer element may convert ultrasound vibrations into time-varying electrical signals and vice versa. In some embodiments, the probe 202 may include any number of ultrasound transducer arrays in any desired configuration. A probe 202 used in connection with the systems and methods described herein may be of any configuration as desired, including single aperture and multiple aperture probes.


The transmission of ultrasound signals from elements of the probe 202 may be controlled by a transmit controller 204. Upon receiving echoes of transmit signals, the probe elements may generate time-varying electric signals corresponding to the received ultrasound vibrations. Signals representing the received echoes may be output from the probe 202 and sent to a receive subsystem 210. In some embodiments, the receive subsystem may include multiple channels, each of which may include an analog front-end device (“AFE”) 212 and an analog-to-digital conversion device (ADC) 214. In some embodiments, each channel of the receive subsystem 210 may also include digital filters and data conditioners (not shown) after the ADC 214. In some embodiments, analog filters prior to the ADC 214 may also be provided. The output of each ADC 214 may be directed into a raw data memory device 220. In some embodiments, an independent channel of the receive subsystem 210 may be provided for each receive transducer element of the probe 202. In other embodiments, two or more transducer elements may share a common receive channel.


In some embodiments, an analog front-end device 212 (AFE) may perform certain filtering processes before passing the signal to an analog-to-digital conversion device 214 (ADC). The ADC 214 may be configured to convert received analog signals into a series of digital data points at some pre-determined sampling rate. Unlike most ultrasound systems, some embodiments of the ultrasound imaging system of FIG. 1B may then store digital data representing the timing, phase, magnitude and/or the frequency of ultrasound echo signals received by each individual receive element in a raw data memory device 220 before performing any further beamforming, filtering, image layer combining or other image processing.
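

By way of illustration only, the raw, per-element data capture described above may be pictured as a simple indexed store of digitized echo records. The structure below is a hypothetical sketch; the names, fields and layout are assumptions rather than a description of the actual memory architecture.

/* Hypothetical layout for raw, per-element echo capture. Each receive
 * channel stores the full digitized echo train for every transmitted ping,
 * before any beamforming, filtering, image layer combining or other image
 * processing is performed. */
#include <stdint.h>

typedef struct {
    uint32_t ping_index;      /* which transmit ping produced these echoes */
    uint32_t channel_index;   /* which receive element/channel captured them */
    uint32_t num_samples;     /* number of samples captured at the ADC rate */
    float    sample_rate_hz;  /* ADC sampling rate */
    int16_t *samples;         /* raw digitized echo samples */
} raw_echo_record;

typedef struct {
    uint32_t num_records;
    raw_echo_record *records; /* one record per (ping, channel) pair */
} raw_data_memory;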


In order to convert the captured digital samples into an image, the data may be retrieved from the raw data memory 220 by an image generation subsystem 230. As shown, the image generation subsystem 230 may include a beamforming block 232 and an image layer combining (“ILC”) block 234. In some embodiments, a beamformer 232 may be in communication with a calibration memory 238 that contains probe calibration data. Probe calibration data may include information about the precise acoustic position, operational quality, and/or other information about individual probe transducer elements. The calibration memory 238 may be physically located within the probe, within the imaging system, or in a location external to both the probe and the imaging system.


In some embodiments, after passing through the image generation block 230, image data may then be stored in an image buffer memory 236 which may store beamformed and (in some embodiments) layer-combined image frames. A video processor 242 within a video subsystem 240 may then retrieve image frames from the image buffer, and may process the images into a video stream that may be displayed on a video display 244 and/or stored in a video memory 246 as a digital video clip, referred to in the art as a “cine loop”.


An M-mode processor 235 may also be provided to receive a definition of an M-mode path from a user interface and to form the images displaying the selected M-mode data in a desired output format. In some embodiments, an M-mode processor 235 may also include a (volatile or non-volatile) memory device for storing the defined M-mode path. In some embodiments, an M-mode processor 235 may be logically positioned between the image buffer 236 and the video processor 242 in the diagram of FIG. 1B. In other embodiments, an M-mode processor 235 may be a set of functions built into the image generation subsystem 230 or the video processor 242 or any other suitable component of the system.


In some embodiments, raw echo data stored in a memory device may be retrieved, beamformed, processed into images, and displayed on a display using a device other than an ultrasound imaging system. For example, such a system may omit the probe 202, the transmit controller 204 and the receive sub-system 210 of FIG. 1B, while including the remaining components. Such a system may be implemented predominantly in software running on general purpose computing hardware. Such alternative processing hardware may comprise a desktop computer, a tablet computer, a laptop computer, a smartphone, a server or any other general purpose data processing hardware.


Introduction to Ping-Based Imaging


Some embodiments of ultrasound imaging systems to be used in combination with the systems and methods described herein may use point source transmission of ultrasound signals during the transmit pulse. An ultrasound wavefront transmitted from a point source (also referred to herein as a “ping”) illuminates the entire region of interest with each circular or spherical wavefront. Echoes from a single ping received by a single receive transducer element may be beamformed to form a complete image of the insonified region of interest. By combining data and images from multiple receive transducers across a wide probe, and by combining data from multiple pings, very high resolution images may be obtained.


As used herein the terms “point source transmission” and “ping” may refer to an introduction of transmitted ultrasound energy into a medium from a single spatial location. This may be accomplished using a single ultrasound transducer element or combination of adjacent transducer elements transmitting together. A single transmission from one or more element(s) may approximate a uniform spherical wave front, or in the case of imaging a 2D slice, may create a uniform circular wavefront within the 2D slice. In some cases, a single transmission of a circular or spherical wavefront from a point source transmit aperture may be referred to herein as a “ping” or a “point source pulse” or an “unfocused pulse.”


Point source transmission differs in its spatial characteristics from a scanline-based “phased array transmission” or a “directed pulse transmission” which focuses energy in a particular direction (along a scanline) from the transducer element array. Phased array transmission manipulates the phase of a group of transducer elements in sequence so as to strengthen or steer an insonifying wave to a specific region of interest.


Images may be formed from such ultrasound pings by beamforming the echoes received by one or more receive transducer elements. In some embodiments, such receive elements may be arranged into a plurality of apertures in a process referred to as multiple aperture ultrasound imaging.


Beamforming is generally understood to be a process by which imaging signals received at multiple discrete receptors are combined to form a complete coherent image. The process of ping-based beamforming is consistent with this understanding. Embodiments of ping-based beamforming generally involve determining the position of reflectors corresponding to portions of received echo data based on the path along which an ultrasound signal may have traveled, an assumed-constant speed of sound and the elapsed time between a transmit ping and the time at which an echo is received. In other words, ping-based imaging involves a calculation of distance based on an assumed speed and a measured time. Once such a distance has been calculated, it is possible to triangulate the possible positions of any given reflector. This distance calculation is made possible with accurate information about the relative positions of transmit and receive transducer elements. (As discussed in Applicants' previous applications referenced above, a multiple aperture probe may be calibrated to determine the acoustic position of each transducer element to at least a desired degree of accuracy.) In some embodiments, ping-based beamforming may be referred to as “dynamic beamforming.”
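

To make the geometric reasoning explicit, for a 2D slice a single echo sample received at time t after a ping constrains its reflector to an ellipse whose foci are the transmit and receive element positions. With an assumed-constant speed of sound c, this relationship may be written as:

\sqrt{(x - x_T)^2 + (z - z_T)^2} + \sqrt{(x - x_R)^2 + (z - z_R)^2} = c\,t

where (x_T, z_T) and (x_R, z_R) are the acoustic positions of the transmit and receive elements, respectively. Echoes of the same ping received at different elements define different ellipses, and combining them localizes the reflector; this is the triangulation referred to above.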


A dynamic beamformer may be used to determine a location and an intensity for an image pixel corresponding to each of the echoes resulting from each transmitted ping. When transmitting a ping signal, no beamforming need be applied to the transmitted waveform, but dynamic beamforming may be used to combine the echoes received with the plurality of receive transducers to form pixel data.


The image quality may be further improved by combining images formed by the beamformer from one or more subsequent transmitted pings. Still further improvements to image quality may be obtained by combining images formed by more than one receive aperture. An important consideration is whether the summation of images from different pings or receive apertures should be coherent summation (phase sensitive) or incoherent summation (summing magnitude of the signals without phase information). In some embodiments, coherent (phase sensitive) summation may be used to combine echo data received by transducer elements located on a common receive aperture resulting from one or more pings. In some embodiments, incoherent summation may be used to combine echo data or image data received by receive apertures that could possibly contain cancelling phase data. Such may be the case with receive apertures that have a combined total aperture that is greater than a maximum coherent aperture width for a given imaging target.


As used herein the terms “ultrasound transducer” and “transducer” may carry their ordinary meanings as understood by those skilled in the art of ultrasound imaging technologies, and may refer without limitation to any single component capable of converting an electrical signal into an ultrasonic signal and/or vice versa. For example, in some embodiments, an ultrasound transducer may comprise a piezoelectric device. In some alternative embodiments, ultrasound transducers may comprise capacitive micromachined ultrasound transducers (CMUT). Transducers are often configured in arrays of multiple elements. An element of a transducer array may be the smallest discrete component of an array. For example, in the case of an array of piezoelectric transducer elements, each element may be a single piezoelectric crystal.


As used herein, the terms “transmit element” and “receive element” may carry their ordinary meanings as understood by those skilled in the art of ultrasound imaging technologies. The term “transmit element” may refer without limitation to an ultrasound transducer element which at least momentarily performs a transmit function in which an electrical signal is converted into an ultrasound signal. Similarly, the term “receive element” may refer without limitation to an ultrasound transducer element which at least momentarily performs a receive function in which an ultrasound signal impinging on the element is converted into an electrical signal. Transmission of ultrasound into a medium may also be referred to herein as “insonifying.” An object or structure which reflects ultrasound waves may be referred to as a “reflector” or a “scatterer.”


As used herein the term “aperture” refers without limitation to one or more ultrasound transducer elements collectively performing a common function at a given instant of time. For example, in some embodiments, the term aperture may refer to a group of transducer elements performing a transmit function. In alternative embodiments, the term aperture may refer to a plurality of transducer elements performing a receive function. In some embodiments, a group of transducer elements forming an aperture may be redefined at different points in time.


Generating ultrasound images using a ping-based ultrasound imaging process means that images from an entire region of interest are “in focus” at all times. This is true because each transmitted ping illuminates the entire region, receive apertures receive echoes from the entire region, and the dynamic multiple aperture beamforming process may form an image of any part or all of the insonified region. In such cases, the maximum extent of the image may be primarily limited by attenuation and signal-to-noise factors rather than by the confined focus of a transmit or receive beamforming apparatus. As a result, a full-resolution image may be formed from any portion of a region of interest using the same set of raw echo data. As used herein, the term “image window” will be used to refer to a selected portion of an entire insonified region of interest that is being displayed at any given time. For example, a first image window may be selected to include an entire insonified area, and then a user may choose to “zoom in” on a smaller selected area, thereby defining a new image window. The user may then choose to zoom out or pan the image window vertically and/or horizontally, thereby selecting yet another image window. In some embodiments, separate simultaneous images may be formed of multiple overlapping or non-overlapping image windows within a single insonified region.
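

By way of illustration only, because any sub-region can be beamformed from the same raw echo data, an image window may be represented simply as a rectangle in the coordinate system of the insonified region that is handed to the beamformer and display pipeline; zooming or panning just changes these values. The structure below is a hypothetical sketch with illustrative names:

/* Hypothetical image-window description passed to a ping-based beamformer.
 * The same stored raw echo data can be re-beamformed for any window. */
typedef struct {
    float x_min, z_min;        /* upper-left corner in the region of interest (mm) */
    float width, height;       /* physical extent of the window (mm) */
    int   pixels_x, pixels_z;  /* display resolution of the window */
} image_window;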


Embodiments of Multiple Aperture Ultrasound Imaging Systems and Methods


Applicant's prior U.S. patent application Ser. No. 11/865,501 filed Oct. 1, 2007, now U.S. Pat. No. 8,007,439, and U.S. patent application Ser. No. 13/029,907 (“the '907 application”), now U.S. Pat. No. 9,146,313, describe embodiments of ultrasound imaging techniques using probes with multiple apertures to provide substantially increased resolution over a wide field of view.


In some embodiments, a probe may include one, two, three or more apertures for ultrasound imaging. FIG. 2 illustrates one embodiment of a multiple aperture ultrasound probe which may be used for ultrasound imaging with a point source transmit signal. The probe of FIG. 2 comprises three transducer arrays 60, 62, 64, each one of which may be a 1D, 2D, CMUT or other ultrasound transducer array. In alternative embodiments, a single curved array may also be used, with each aperture being defined logically and electronically as needed. In still further embodiments, any single-aperture or multiple-aperture ultrasound imaging probe may also be used. As shown, the lateral arrays 60 and 64 may be mounted in a probe housing 70 at angles relative to the center array 62. In some embodiments, the angle Θ of the lateral arrays relative to the central array may be between zero and 45 degrees or more. In one embodiment, the angle Θ is about 30 degrees. In some embodiments, the right and left lateral arrays 60, 64 may be mounted at different angles relative to the center array 62. In some embodiments, the probe 50 of FIG. 2 may have a total width 74 substantially wider than 2 cm, and in some embodiments 10 cm or greater.


In some embodiments as shown in FIG. 2, separate apertures of the probe may comprise separate transducer arrays which may be physically separated from one another. For example, in FIG. 2, a distance 72 physically separates the center aperture 62 from the right lateral aperture 64. The distance 72 can be the minimum distance between transducer elements on aperture 62 and transducer elements on aperture 64. In some embodiments, the distance 72 may be equal to at least twice the minimum wavelength of transmission from the transmit aperture. In some embodiments of a multiple aperture ultrasound imaging system, a distance between adjacent apertures may be at least a width of one transducer element. In alternative embodiments, a distance between apertures may be as large as possible within the constraints of a particular application and probe design.


In some embodiments, a probe such as that illustrated in FIG. 2 may be used with an ultrasound imaging system such as that illustrated in FIG. 1A but omitting the scan converter. As will be described in more detail below, some embodiments of a point-source imaging method negate the need for a scan converter. The probe 50 may also include one or more sensors 52 and/or controllers 54 joined to an ultrasound imaging system and/or to the transducer arrays by cables 56, 57, 58. Embodiments of similar multiple aperture probes 50 are also shown and described in US Patent Publication No. 2010/0262013 and U.S. patent application Ser. No. 13/029,907, filed Feb. 17, 2011, now U.S. Pat. No. 9,146,313, both of which are incorporated herein by reference.


Embodiments of multiple aperture ultrasound imaging methods using a point-source transmit signal will now be described with reference to FIG. 3. FIG. 3 illustrates a probe 300 with a first aperture 302 and a second aperture 304 directed toward a region of interest represented by the grid below the probe. In the illustrated embodiment, the first aperture is used as a transmit aperture 302, and the second aperture 304 is used for receiving echoes. In some embodiments, an ultrasound image may be produced by insonifying an entire region of interest to be imaged with a point-source transmitting element in a transmit aperture 302, and then receiving echoes from the entire imaged plane on one or more receive elements (e.g., R1-Rm) in one or more receive apertures 304.


In some embodiments, subsequent insonifying pulses may be transmitted from each of the elements T1-Tn on the transmitting aperture 302 in a similar point-source fashion. Echoes may then be received by elements on the receive aperture(s) 304 after each insonifying pulse. An image may be formed by processing echoes from each transmit pulse. Although each individual image obtained from a transmit pulse may have a relatively low resolution, combining these images may provide a high resolution image.


In some embodiments, transmit elements may be operated in any desired sequential order, and need not follow a prescribed pattern. In some embodiments, receive functions may be performed by all elements in a receive array 304. In alternative embodiments, echoes may be received on only one or a select few elements of a receive array 304.


The data received by the receiving elements is a series of echoes reflected by objects within the target region. In order to generate an image, each received echo must be evaluated to determine the location of the object within the target region that reflected it (each reflected point may be referred to herein as a scatterer). For a scatterer point represented by coordinates (i,j) in FIG. 3, it is a simple matter to calculate the total distance “a” from a particular transmit element Tx to an element of internal tissue or target object T at (i,j), and the distance “b” from that point to a particular receive element. These calculations may be performed using basic trigonometry. The sum of these distances is the total distance traveled by one ultrasound wave.


Assuming the speed of the ultrasound waves traveling through the target object is known, these distances can be translated into time delays which may be used to identify a location within the image corresponding to each received echo. When the speed of ultrasound in tissue is assumed to be uniform throughout the target object, it is possible to calculate the time delay from the onset of the transmit pulse to the time that an echo is received at the receive element. Thus, a given scatterer in the target object lies on the locus of points for which the total path length a+b corresponds to the given time delay. The same method can be used to calculate delays for all points in the desired target to be imaged. As discussed in more detail in the '907 application, adjustments to time delays may be made in order to account for variations in the speed of sound through varying tissue paths.


A method of rendering the location of all of the scatterers in the target object, and thus forming a two dimensional cross section of the target object, will now be described with reference to FIG. 3 which illustrates a grid of points to be imaged by apertures 302 and 304. A point on the grid is given the rectangular coordinates (i,j). The complete image will be a two dimensional array of points provided to a video processing system to be displayed as a corresponding array of pixels. In the grid of FIG. 3, ‘mh’ is the maximum horizontal dimension of the array and ‘mv’ is the maximum vertical dimension. FIG. 3 also illustrates MAUI electronics, which can comprise any hardware and/or software elements as needed, such as those described above with reference to FIG. 1.


In some embodiments, the following pseudo code may be used to accumulate all of the information to be gathered from a transmit pulse from one transmit element (e.g., one element of T1 . . . Tn from aperture 302), and the consequent echoes received by one receive element (e.g., one element of R1 . . . Rm from aperture 304) in the arrangement of FIG. 3.


for (i = 0; i < mh; i++){
  for (j = 0; j < mv; j++){
    compute distance a   /* from the transmit element Tx to the point (i,j) */
    compute distance b   /* from the point (i,j) back to the receive element */
    compute time equivalent of a+b
    echo[i][j] = echo[i][j] + stored received echo at the computed time
  }
}


A complete two dimensional image may be formed by repeating this process for every receive element in a receive aperture 304 (e.g., R1 . . . Rm). In some embodiments, it is possible to implement this code in parallel hardware resulting in real time image formation.
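

By way of illustration only, the following is a minimal C sketch of the accumulation described by the pseudo code, for one transmit ping and one receive element, assuming a constant speed of sound and echo samples indexed by arrival time. The names, units and sampling model are illustrative assumptions rather than a description of any particular implementation; repeating the call over all receive elements (and over multiple pings) and summing the results builds the complete image.

/* Illustrative ping-based accumulation for one transmit element and one
 * receive element; repeat over all receive elements (and pings) and sum
 * the results to build the complete image. All names are hypothetical. */
#include <math.h>

void accumulate_echoes(const float *rx_samples, int num_samples,
                       float sample_rate, float speed_of_sound,
                       float tx_x, float tx_z,      /* transmit element position */
                       float rx_x, float rx_z,      /* receive element position */
                       float *image, int mh, int mv, float pixel_size)
{
    for (int i = 0; i < mh; i++) {
        for (int j = 0; j < mv; j++) {
            float px = i * pixel_size;   /* lateral position of pixel (i,j) */
            float pz = j * pixel_size;   /* depth of pixel (i,j) */
            /* distance a: transmit element to the point (i,j) */
            float a = sqrtf((px - tx_x) * (px - tx_x) + (pz - tx_z) * (pz - tx_z));
            /* distance b: the point (i,j) back to the receive element */
            float b = sqrtf((px - rx_x) * (px - rx_x) + (pz - rx_z) * (pz - rx_z));
            /* time equivalent of a+b, converted to a sample index */
            int sample = (int)(((a + b) / speed_of_sound) * sample_rate + 0.5f);
            if (sample >= 0 && sample < num_samples)
                image[i * mv + j] += rx_samples[sample];
        }
    }
}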


In some embodiments, image quality may be further improved by combining similar images resulting from pulses from other transmit elements. In some embodiments, the combination of images may be performed by a simple summation of the single point source pulse images (e.g., coherent addition). Alternatively, the combination may involve taking the absolute value of each element of the single point source pulse images first before summation (e.g., incoherent addition). Further details of such combinations, including corrections for variations in speed-of-sound through different ultrasound paths, are described in Applicant's prior US Patent Applications referenced above.
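

By way of illustration only, the difference between the two combination strategies can be expressed in a few lines of code (hypothetical names): coherent addition sums signed per-image pixel values directly, while incoherent addition sums their magnitudes and thereby discards phase information.

#include <math.h>

/* Coherent (phase-sensitive) combination: sum signed pixel values. */
void combine_coherent(float *accum, const float *img, int num_pixels)
{
    for (int p = 0; p < num_pixels; p++)
        accum[p] += img[p];
}

/* Incoherent combination: sum magnitudes, discarding phase information. */
void combine_incoherent(float *accum, const float *img, int num_pixels)
{
    for (int p = 0; p < num_pixels; p++)
        accum[p] += fabsf(img[p]);
}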


As discussed above, because embodiments of an imaging system using a point source transmit signal and a multiple-aperture receive probe are capable of receiving an entire scan-plane image in response to a single insonifying pulse, a scan converter is not needed, and may therefore be omitted from an ultrasound imaging system. Having received a series of image frames in a similar manner, the image data may be processed and sent to a display for viewing by an operator. In addition to ultrasound imaging systems using point-source transmit signals, the following methods of selecting and displaying arbitrary M-mode paths may also be used with any other ultrasound imaging system, including phased array transmit systems, single-aperture probes, 3D probes, and probes in systems using synthetic aperture techniques.


Embodiments for Defining and Displaying Arbitrary M-Mode Paths



FIG. 4A illustrates an example of an ultrasound image with a specified m-mode path 100 drawn through an imaged object 110. The amplitude of each pixel along the m-mode path may be displayed in a graph (e.g., a bar graph, line graph or any other desired format). Changing pixel amplitude values may be illustrated over time. FIG. 4B illustrates an example of a graph of data taken along the m-mode path 100 of FIG. 4A.


In some embodiments, a sonographer may wish to simultaneously view changes along two or more separate M-mode paths. Thus in some embodiments, a user may define a plurality of M-mode paths 110, 112 as shown in FIG. 5A. The change in pixel values lying along the first and second paths 110, 112 may be displayed simultaneously in a pair of amplitude/time charts as shown for example in FIG. 5B. FIG. 5A also shows an example of a non-linear path 112. As discussed in further detail below, a non-linear M-mode path may have any length and shape as desired.


Multiple discontinuous M-mode paths and/or non-linear M-mode paths may be beneficial in viewing movement of multiple structures simultaneously. For example, a curved M-mode path may be beneficial when imaging a moving anatomic structure such as a tricuspid valve, an aortic valve or a mitral valve. In other embodiments, multiple simultaneous but discontinuous M-mode lines may be used to simultaneously view the movement of multiple structures. For example, a first M-mode path may be drawn to view operation of a tricuspid valve, and a second M-mode path may be drawn to view operation of a mitral valve. Viewing the function of both valves simultaneously may provide substantial diagnostic benefits, such as allowing for precise calibration of a pacemaker.


Selection of an M-mode path generally involves identifying a group of image pixel locations which are to be presented over time as an M-mode graph. Identifying a group of pixels for an m-mode path may comprise identifying the coordinates of selected pixels in a coordinate system used by the video processing system. In some embodiments, M-mode selection and display methods as described herein may be performed in real-time using an ultrasound imaging system such as those illustrated in FIGS. 1A and 1B. With reference to FIGS. 1A and 1B, selection of an M-mode path may be performed by a user via a suitable user interface interaction performed in communication with the M-mode processor 235. The identification of selected pixels may be at least temporarily stored in a memory device associated with the M-mode processor 235. The selected pixels defining the M-mode path may then be retrieved from image frames in the image buffer and/or in the video processor, and an M-mode graph or image illustrating the values of the selected pixels may be formed by the M-mode processor 235 and transmitted to the display to be displayed along with the B-mode image. In alternative embodiments, M-mode selection and display methods as described herein may be performed on a workstation playing back stored 2D or 3D image data.
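

By way of illustration only, once the M-mode path has been reduced to a list of pixel coordinates, building the M-mode display amounts to sampling those pixels in each successive image frame and appending the result as a new column of the M-mode graph. The following sketch assumes such a pixel list and uses hypothetical names; it is not intended to describe the actual interface of the M-mode processor 235.

/* Hypothetical M-mode column extraction: copy the value of each pixel on
 * the defined path out of the current B-mode frame. Calling this once per
 * frame and placing the resulting columns side by side over time produces
 * the M-mode graph. */
typedef struct { int row; int col; } path_pixel;

void extract_mmode_column(const float *frame, int frame_width,
                          const path_pixel *path, int path_length,
                          float *column /* path_length values, one per path pixel */)
{
    for (int k = 0; k < path_length; k++)
        column[k] = frame[path[k].row * frame_width + path[k].col];
}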


In some embodiments, selection of a group of pixel locations for presentation as an M-mode path may be assisted by or entirely performed automatically, such as by using a computer aided detection (CAD) system configured to identify a desired anatomical or other feature through which an m-mode path may be desired. For example, US Publication No. 2011/0021915 describes a system for automatic detection of structures in M-mode ultrasound imaging. In other embodiments, a desired M-mode path may be chosen by a user through any of several possible user interface interactions, several examples of which are provided below.


As will be clear to the skilled artisan, an imaging system or an image display system may include a variety of user interface devices through which a user may input information to or modify information or objects in a displayed image. Such user interface devices may comprise any of the following: trackballs, buttons, keys, keypads, sliders, dials, voice commands, touch screens, joysticks, mice, etc. The use of these and other user input devices will be clear to the skilled artisan.


In some embodiments, any arbitrary line or path in the image plane may be selected by a user as a line for M-mode display. In some embodiments, a linear path of defined length may be selected as an m-mode path. This may be facilitated through a number of user interface interactions, some examples of which are provided below.


In some embodiments, the ultrasound display may include a touch screen, and a user may define an M-mode path by simply drawing the desired path with a finger or stylus directly on the display screen. In other embodiments, a user may draw a freehand path using a separate user interface device such as a mouse or a drawing tablet. In some embodiments, after drawing a path of a desired shape, an M-mode path of the desired shape may be dragged across a display and/or rotated to a desired position.


In one embodiment of a user interface interaction, a linear m-mode path segment may be selected by first defining a line length, then defining a rotation angle, and then translating the line into a desired position. In some embodiments, further adjustments to the line length, rotation angle, and position may be made as needed. In some embodiments, defining a line length may comprise entering a numeric value with a numeric keypad or increasing/decreasing a numeric line length value with a scroll wheel, track ball, dial, slider, arrow keys or other input device. Similarly, in some embodiments, a rotation angle may be defined by entering a numeric value with a numeric keypad or any other input device. A rotation angle may be defined relative to any suitable coordinate system. For example, in some embodiments, a rotation angle of zero degrees may correspond to a three o'clock position (e.g., assuming the top of the image is 12 o'clock).
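

By way of illustration only, a linear segment defined by a length and rotation angle as described above may be reduced to a list of path pixels by computing points along the segment. The sketch below assumes the zero-degrees-at-three-o'clock convention mentioned above and uses hypothetical names; it is not intended to describe any particular user interface implementation.

#include <math.h>

typedef struct { int row; int col; } path_pixel;

/* Rasterize a linear M-mode segment defined by a start point, a length in
 * pixels, and a rotation angle in degrees (0 degrees = three o'clock,
 * positive angles rotate counter-clockwise), sampling one point per pixel
 * of length. */
int rasterize_segment(float start_col, float start_row,
                      float length_pixels, float angle_deg,
                      path_pixel *out, int max_pixels)
{
    const float pi = 3.14159265f;
    float angle = angle_deg * pi / 180.0f;
    int n = (int)length_pixels;
    if (n > max_pixels) n = max_pixels;
    for (int k = 0; k < n; k++) {
        out[k].col = (int)(start_col + k * cosf(angle) + 0.5f);
        out[k].row = (int)(start_row - k * sinf(angle) + 0.5f); /* rows grow downward */
    }
    return n; /* number of path pixels produced */
}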


In some embodiments, numeric values of line length or rotation angle may not be displayed, instead only changes to a line length or rotation angle of the line may be shown on the display screen. In some embodiments, translating the line up, down, left or right within the image plane may be performed using arrow keys, a track ball, a mouse, touch screen, voice commands or other input devices.


In another embodiment of a user interface interaction, a desired linear m-mode path segment may be selected by defining or adjusting a line length, translating the line until a first end point is in a desired position, fixing the first end point and rotating the second end point until the line is rotated to the desired orientation and position.


In another embodiment of a user interface interaction, a desired linear m-mode path segment may be selected by first selecting a first end point, such as by positioning a cursor at a desired position on the image. A line length and rotation angle may then be defined and adjusted as needed. In some embodiments, a rotation angle may be defined by directing the system to pivot the line about the selected first end point. Alternatively, a user may select the second end point or another point along the line about which to pivot the line in order to define a desired rotation angle.


In another embodiment of a user interface interaction, a desired linear M-mode path segment may be selected by selecting a first end point with a cursor and then dragging the cursor in a desired direction to draw a line. In other embodiments, a line may be defined by selecting first and second end points and joining the two points.


In any case, once a line is defined, either automatically or through a user interface interaction such as those described above, the length and rotation angle may be adjustable through further user interface interactions. For example, a user may define a pivot point about which to pivot the line in order to adjust a rotation angle. Similarly, a user may select a fixed point from which to increase or decrease the length of the line. Such fixed points and pivot points may be either one of the end points, or any other point along the line.


In some embodiments, a non-linear M-mode path may be defined through any of the above user interface interactions by joining linear segments to form any desired non-linear path made up of linear segments. In some embodiments, a user may choose to apply a radius to the M-mode path in areas adjacent intersections of linear segments. In some embodiments, such a radius may be applied automatically, or may be increased or decreased through a user interface interaction.


In other embodiments, a non-linear M-mode path may be defined by providing a user with a free-form drawing cursor with which the user may draw any non-linear path as desired. Further adjustments may then be made to the path, such as by selecting and dragging one or more individual points along the path to obtain a desired M-mode path.


As described above, multiple images may be formed for two or more separate simultaneous image windows showing different overlapping or non-overlapping portions of an insonified region of interest. Thus, in some embodiments, an M-mode path may be defined while a first image window is displayed, and a user may then zoom or pan the image to a second image window. In some embodiments, the system may be configured to continue displaying the data along the defined M-mode path even when the displayed B-mode image is changed to a different image window than the one in which the M-mode path was defined. For example, a user may zoom in to view a heart valve, and may define an M-mode path intersecting the valve in the zoomed-in image window. The user may then choose to zoom out to view the movement of the whole heart (or a different region of the heart) while continuing to monitor data along the M-mode line intersecting the heart valve.
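

By way of illustration only, one way to achieve this behavior is to store the M-mode path in the physical coordinates of the insonified region rather than in display pixels, so that the path can be re-mapped to whatever image window is currently displayed and can still be sampled even when it lies outside that window. The sketch below repeats the hypothetical image window structure so that it is self-contained; all names are illustrative assumptions rather than a description of the system's actual behavior.

typedef struct {
    float x_min, z_min;        /* upper-left corner in the region of interest (mm) */
    float width, height;       /* physical extent of the window (mm) */
    int   pixels_x, pixels_z;  /* display resolution of the window */
} image_window;

typedef struct { float x_mm; float z_mm; } path_point;

/* Map an M-mode path point stored in region-of-interest coordinates to
 * pixel coordinates of the currently displayed image window. Points that
 * fall outside the window can still be beamformed and graphed even though
 * they are not visible in the current B-mode display. */
int map_to_window(const path_point *pt, const image_window *win,
                  int *out_col, int *out_row)
{
    float u = (pt->x_mm - win->x_min) / win->width;   /* 0..1 across the window */
    float v = (pt->z_mm - win->z_min) / win->height;  /* 0..1 down the window */
    *out_col = (int)(u * (float)win->pixels_x);
    *out_row = (int)(v * (float)win->pixels_z);
    return (u >= 0.0f && u < 1.0f && v >= 0.0f && v < 1.0f); /* 1 if visible */
}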


In some embodiments, the system may store a definition of the image window in which the M-mode line was defined, and may allow a user to toggle between a B-mode image of the M-mode defining image window and a B-mode image of at least one other image window. In still further embodiments, the system may be configured to simultaneously display B-mode images of both the M-mode defining window and another image window (e.g., in a picture-in-picture mode or in a side-by-side view).


Any of the above user interface interactions may also be used to define an M-mode path through a displayed 3D volume. In some embodiments, defining an M-mode path from a 3D volume may also involve a step of rotating an image of a 3D volume before, after, or during any of the M-mode path defining user interface steps described above.


Although various embodiments are described herein with reference to ultrasound imaging of various anatomic structures, it will be understood that many of the methods and devices shown and described herein may also be used in other applications, such as imaging and evaluating non-anatomic structures and objects. For example, the ultrasound probes, systems and methods described herein may be used in non-destructive testing or evaluation of various mechanical objects, structural objects or materials, such as welds, pipes, beams, plates, pressure vessels, layered structures, etc. Therefore, references herein to medical or anatomic imaging targets such as blood, blood vessels, heart or other organs are provided merely as non-limiting examples of the nearly infinite variety of targets that may be imaged or evaluated using the various apparatus and techniques described herein.


Although this invention has been disclosed in the context of certain preferred embodiments and examples, it will be understood by those skilled in the art that the present invention extends beyond the specifically disclosed embodiments to other alternative embodiments and/or uses of the invention and obvious modifications and equivalents thereof. Thus, it is intended that the scope of the present invention herein disclosed should not be limited by the particular disclosed embodiments described above, but should be determined only by a fair reading of the claims that follow. In particular, materials and manufacturing techniques may be employed as within the level of those with skill in the relevant art. Furthermore, reference to a singular item, includes the possibility that there are plural of the same items present. More specifically, as used herein and in the appended claims, the singular forms “a,” “and,” “said,” and “the” include plural referents unless the context clearly dictates otherwise. It is further noted that the claims may be drafted to exclude any optional element. As such, this statement is intended to serve as antecedent basis for use of such exclusive terminology as “solely,” “only” and the like in connection with the recitation of claim elements, or use of a “negative” limitation. Unless defined otherwise herein, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.

Claims
  • 1. A method of defining and displaying an m-mode path for display in an ultrasound imaging system, the method comprising: transmitting a first unfocused ultrasound signal from a single transmitting transducer element into a region of interest including a structure of interest;receiving first echoes of the first unfocused ultrasound signal with a first group of receiving transducer elements;receiving second echoes of the first unfocused ultrasound signal with a second group of receiving transducer elements;retrieving position data describing an acoustic position of the single transmitting transducer element, each element of the first group of receiving transducer elements, and each element of the second group of receiving transducer elements;forming three-dimensional volumetric data from the first received echoes, the second received echoes, and the position data;displaying a volumetric image representing the three-dimensional volumetric data;selecting a first plane through the three-dimensional volumetric data and intersecting the structure of interest, and simultaneously displaying the selected first plane;defining an arbitrary M-mode path through the structure of interest within the selected first plane; andsimultaneously displaying a graph of a magnitude of pixels along the selected M-mode path over time.
  • 2. The method of claim 1, wherein the arbitrary M-mode path is non-linear.
  • 3. The method of claim 2, wherein the arbitrary M-mode path has at least one curved segment.
  • 4. The method of claim 2, wherein the arbitrary M-mode path has at least two linear segments that intersect at an angle other than 180 degrees.
  • 5. The method of claim 1, wherein the arbitrary M-mode path has at least one linear segment and at least one curved segment.
  • 6. The method of claim 1, wherein the arbitrary M-mode path has at least two dis-continuous segments.
  • 7. The method of claim 1, further comprising rotating the three-dimensional volumetric image prior to selecting the first plane.
  • 8. The method of claim 1, further comprising selecting a second plane through the three dimensional volume and displaying the selected second plane.
  • 9. The method of claim 1, wherein defining a path through the structure of interest is performed substantially concurrently with transmitting first unfocused ultrasound signal, receiving first echoes, and receiving second echoes.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 13/730,346, filed Dec. 28, 2012, which application claims the benefit of US Provisional Application No. 61/581,583, titled “M-Mode Ultrasound Imaging Of Arbitrary Paths,” filed Dec. 29, 2011, and U.S. Provisional Application No. 61/691,717, titled “Ultrasound Imaging System Memory Architecture,” filed Aug. 21, 2012, all of which are incorporated herein by reference.

Related Publications (1)
Number Date Country
20160135783 A1 May 2016 US
Provisional Applications (2)
Number Date Country
61581583 Dec 2011 US
61691717 Aug 2012 US
Continuations (1)
Number Date Country
Parent 13730346 Dec 2012 US
Child 15005866 US