Claims
- 1. A method for reducing speckle associated with three-dimensional ultrasound images, the method comprising the steps of:
(a) acquiring first and second pluralities of two-dimensional frames of data, each two-dimensional frame of data associated with at least one point in an elevation position different than each other two-dimensional frame of data; (b) compounding two-dimensional frames of data within the first plurality; (c) compounding two-dimensional frames of data within the second plurality; (d) generating a three-dimensional reconstruction from the compounded two-dimensional frames of data associated with the first and second pluralities; and (e) displaying a representation of the three-dimensional reconstruction.
- 2. The method of claim 1 further comprising a step (f) of filtering the three-dimensional reconstruction at least along an elevation dimension with an anisotropic filter.
- 3. The method of claim 1 wherein step (a) comprises detecting envelopes associated with each of a plurality of samples comprising each two-dimensional frame of data.
- 4. The method of claim 1 wherein steps (b) and (c) comprise averaging the frames of data.
- 5. The method of claim 1 wherein steps (b) and (c) comprise the steps of:
(i) weighting the frames of data; and (ii) adding the weighted frames of data.
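The weighting and adding recited in claim 5 reduces to a normalized weighted sum over co-registered frames. A minimal sketch in Python/NumPy; the function name, the normalization of the weights, and the array layout are assumptions of this illustration, not limitations from the claims:

```python
import numpy as np

def compound_frames(frames, weights):
    """Weight each 2-D frame of data, then sum the weighted frames
    (claim 5, steps (i) and (ii)). Weights are normalized here so the
    compounded frame stays in the original intensity range -- an
    illustrative choice, not a claim requirement."""
    frames = np.asarray(frames, dtype=float)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    # Contract the weight vector against the frame axis:
    # sum_k w[k] * frames[k].
    return np.tensordot(weights, frames, axes=1)
```

With equal weights this degenerates to the plain averaging of claim 4.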
- 6. The method of claim 1 wherein step (a) comprises acquiring each two-dimensional frame of data off-set in elevation at an angle relative to a range dimension from each of the other two-dimensional frames of data.
- 7. The method of claim 1 wherein step (a) comprises acquiring each two-dimensional frame of data associated with parallel scan planes.
- 8. The method of claim 1 wherein step (a) comprises acquiring each frame of data spaced in the elevation dimension from another of the frames of data less than an elevation beam width associated with a transmit beam.
- 9. The method of claim 1 further comprising step (f) of generating at least first and second three-dimensional representations from the first and second pluralities of two-dimensional frames of data, each of the first and second three-dimensional representations comprising frames of data from both the first and second pluralities; and wherein steps (b), (c), and (d) comprise compounding frames of data from the first three-dimensional representation with frames of data from the second three-dimensional representation.
- 10. The method of claim 1 further comprising the step (f) of aligning the frames of data in range and azimuth dimensions prior to steps (b) and (c).
- 11. The method of claim 10 wherein step (f) comprises obtaining frame alignment information.
- 12. The method of claim 1 wherein step (e) comprises displaying a projection rendering of the three-dimensional reconstruction.
- 13. A method for reducing speckle associated with three-dimensional ultrasound images, the method comprising the steps of:
(a) acquiring a plurality of two-dimensional frames of data, each two-dimensional frame of data associated with a respective unique elevation position; (b) generating a three-dimensional reconstruction from the plurality of two-dimensional frames of data; (c) filtering at least along a substantially elevation dimension of the three-dimensional reconstruction with an anisotropic filter; and (d) displaying a representation of the three-dimensional reconstruction responsive to step (c).
- 14. The method of claim 13 further comprising a step (e) of compounding at least first and second sub-sets of the plurality of frames of data; and wherein step (b) comprises generating the three-dimensional reconstruction from compounded frames of data.
- 15. The method of claim 13 wherein step (a) comprises detecting envelopes associated with each of a plurality of samples comprising each two-dimensional frame of data.
- 16. The method of claim 13 wherein step (c) comprises averaging planes of data within the three-dimensional reconstruction.
- 17. The method of claim 13 wherein step (c) comprises the steps of:
(c1) weighting planes of data within the three-dimensional reconstruction; and (c2) adding the weighted planes of data.
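Claims 2 and 13, and the plane averaging of claims 16-17, describe filtering the reconstruction only along (substantially) the elevation dimension, i.e. an anisotropic filter. A hedged sketch under the assumptions that the volume is indexed (elevation, range, azimuth) and that edge-replication padding and a normalized kernel are acceptable choices; none of these is fixed by the claims:

```python
import numpy as np

def elevation_filter(volume, kernel):
    """Apply a 1-D low-pass kernel along the elevation axis only,
    leaving the range and azimuth dimensions unfiltered."""
    kernel = np.asarray(kernel, dtype=float)
    kernel = kernel / kernel.sum()          # normalized weighting coefficients
    half = len(kernel) // 2
    # Replicate edge planes so the output keeps the input extent.
    padded = np.pad(volume.astype(float), ((half, half), (0, 0), (0, 0)),
                    mode="edge")
    out = np.zeros_like(volume, dtype=float)
    # Weight shifted stacks of planes and add them (claim 17, (c1)-(c2)).
    for i, w in enumerate(kernel):
        out += w * padded[i:i + volume.shape[0]]
    return out
```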
- 18. The method of claim 13 wherein step (a) comprises acquiring each two-dimensional frame of data off-set in elevation at an angle relative to a range dimension from each of the other two-dimensional frames of data.
- 19. The method of claim 13 wherein step (a) comprises acquiring each two-dimensional frame of data associated with parallel scan planes.
- 20. The method of claim 13 wherein step (a) comprises acquiring each frame of data spaced in the elevation dimension from another of the frames of data less than an elevation beam width associated with a transmit beam.
- 21. The method of claim 13 wherein step (b) comprises aligning the plurality of frames of data in elevation, range and azimuth dimensions.
- 22. The method of claim 21 wherein step (b) comprises obtaining frame alignment information.
- 23. The method of claim 13 wherein step (d) comprises displaying a projection rendering of the three-dimensional reconstruction.
- 24. A method for generating three-dimensional Doppler representations for ultrasound imaging, the method comprising the steps of:
(a) obtaining a first type of Doppler data representing a region; (b) obtaining a second type of Doppler data representing the region; (c) generating a first three-dimensional data set from the first type of Doppler data; (d) generating a second three-dimensional data set from the second type of Doppler data; and (e) combining the first and second three-dimensional data sets.
- 25. The method of claim 24 wherein step (e) comprises:
(e1) assigning a relationship between the first and second three-dimensional data sets in response to user input after steps (c) and (d); and (e2) combining the first and second three-dimensional data sets in response to the relationship.
- 26. The method of claim 25 wherein the relationship of step (e1) is selected from the group consisting of: relative hues, relative opacities, weights, thresholds, mixing function, and combinations thereof.
- 27. The method of claim 26 wherein the mixing function of step (e1) is selected from the group consisting of: dividing, multiplying, summing, weighted summing, differencing, selecting look up table combinations of the first and second sets of data, and combinations thereof.
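One way to read claims 25-27 is a user-selectable mixing function applied voxel-wise to two co-registered three-dimensional data sets. The registry below is purely illustrative: the claims list the candidate operations but prescribe no names, signatures, or default weights.

```python
import numpy as np

# Hypothetical registry of mixing functions drawn from claim 27's list;
# the keys and default weights are assumptions of this example.
MIXING_FUNCTIONS = {
    "sum": lambda a, b: a + b,
    "difference": lambda a, b: a - b,
    "multiply": lambda a, b: a * b,
    "weighted_sum": lambda a, b, wa=0.5, wb=0.5: wa * a + wb * b,
}

def combine(set_a, set_b, mixing="sum", **kwargs):
    """Combine two co-registered 3-D data sets (e.g. Doppler energy and
    Doppler velocity) with the relationship assigned from user input
    (claim 25, steps (e1)-(e2))."""
    a = np.asarray(set_a, dtype=float)
    b = np.asarray(set_b, dtype=float)
    return MIXING_FUNCTIONS[mixing](a, b, **kwargs)
```

Reassigning the relationship and re-combining, as in claim 28, is just a second call with different arguments against the stored data sets.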
- 28. The method of claim 25 further comprising the steps of:
(f) assigning a second relationship between the first and second three-dimensional data sets in response to a subsequent user input; and (g) combining the first and second three-dimensional data sets in response to the second relationship.
- 29. The method of claim 24 further comprising the step of:
(f) rendering an image from the combined first and second three-dimensional data set.
- 30. The method of claim 24 further comprising the step of:
(f) calculating a quantity from the combined first and second three-dimensional data sets.
- 31. The method of claim 24 further comprising the step of:
(f) filtering in three dimensions data selected from the group consisting of: the first three-dimensional data set, the second three-dimensional data set, the combined first and second three-dimensional data sets, and combinations thereof.
- 32. The method of claim 24 wherein:
step (c) comprises the step of applying two-dimension to three-dimension interpolation factors to the first type of Doppler data; and step (d) comprises the step of applying the same two-dimension to three-dimension interpolation factors to the second type of Doppler data.
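Claim 32's shared two-dimension to three-dimension interpolation factors can be illustrated with linear interpolation between adjacent frames: running the same routine with the same factor over both Doppler types keeps the two reconstructions co-registered. The routine below is a sketch under that linear-interpolation assumption; the claims do not fix the interpolation scheme.

```python
import numpy as np

def interp_2d_to_3d(frames, factor):
    """Build a 3-D data set from 2-D frames by inserting linearly
    interpolated planes between each pair of adjacent frames.
    'factor' plays the role of the two-dimension to three-dimension
    interpolation factor of claim 32."""
    frames = np.asarray(frames, dtype=float)
    planes = [frames[0]]
    for a, b in zip(frames[:-1], frames[1:]):
        for k in range(1, factor + 1):
            t = k / factor                      # same factors for every data set
            planes.append((1 - t) * a + t * b)  # linear blend of neighbors
    return np.stack(planes)
```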
- 33. The method of claim 24 wherein the first and second types of Doppler data of steps (a) and (b) are selected from the group consisting of: Doppler velocity, Doppler energy, Doppler variance, Doppler Tissue energy, Doppler Tissue velocity and Doppler Tissue variance.
- 34. The method of claim 24 further comprising the step of:
(f) storing the first and second three-dimensional data sets as bit values, each bit value comprising at least eight bits.
- 35. An apparatus for generating three-dimensional representations for ultrasound imaging, the apparatus comprising:
a Doppler processor for obtaining at least first and second types of Doppler data; a processor for generating first and second three-dimensional data sets from the first and second types of Doppler data, respectively; a memory for storing the first and second three-dimensional data sets; a user input; and a processor for combining the first and second three-dimensional data sets in response to input from the user input.
- 36. The apparatus of claim 35 further comprising a display for displaying an image rendered from the combined first and second three-dimensional data set.
- 37. A method for generating three-dimensional Doppler representations for ultrasound imaging, the method comprising the steps of:
(a) obtaining first and second independent types of Doppler data; (b) generating first and second three-dimensional data sets from the first and second types of Doppler data, respectively; (c) storing the first and second three-dimensional data sets; (d) receiving first user input relationship information; (e) combining the first and second three-dimensional data sets in response to the first relationship information; (f) receiving second user input relationship information; and (g) combining the first and second three-dimensional data sets in response to the second relationship information.
- 38. The method of claim 37 wherein the first and second relationship information are selected from the group consisting of: relative hues, relative opacities, weights, thresholds, mixing function, and combinations thereof.
- 39. A method for generating three-dimensional Doppler representations for ultrasound imaging, the method comprising the steps of:
(a) generating a first three-dimensional data set from a first set of B-mode data, the first set of B-mode data indicative of a first type of B-mode intensity values; (b) generating a second three-dimensional data set from a second set of B-mode data, the second set of B-mode data indicative of a second type of B-mode intensity values and representing a substantially same region as the first set; (c) storing the first and second three-dimensional data sets; (d) assigning a relationship between the first and second three-dimensional data sets in response to user input; and (e) combining the first and second three-dimensional data sets in response to the relationship.
- 40. The method of claim 39 wherein the relationship of step (d) is selected from the group consisting of: relative hues, relative opacities, weights, thresholds, mixing function, and combinations thereof.
- 41. The method of claim 39 wherein the first and second sets of B-mode data correspond to harmonic and fundamental frequency, respectively, types of B-mode intensity values.
- 42. A method for generating three-dimensional representations for ultrasound imaging, the method comprising the steps of:
(a) generating a B-mode three-dimensional data set; (b) generating a Doppler three-dimensional data set, the Doppler three-dimensional data set representing a substantially same region as the B-mode three-dimensional data set; (c) separately storing the B-mode and Doppler three-dimensional data sets; and (d) displaying an image comprising a three-dimensional representation responsive to the Doppler three-dimensional data set and a two-dimensional representation responsive to the B-mode three-dimensional data set.
- 43. The method of claim 42 further comprising step (e) of generating the three-dimensional representation as a surface rendering.
- 44. The method of claim 42 further comprising step (e) of generating the three-dimensional representation as a volume rendering.
- 45. The method of claim 42 further comprising step (e) of generating the two-dimensional representation as a plane.
- 46. The method of claim 45 further comprising step (f) of varying a position of the plane relative to the three-dimensional representation.
- 47. The method of claim 42 further comprising step (e) of selecting an opacity relationship of the two-dimensional representation relative to the three-dimensional representation.
- 48. The method of claim 42 further comprising step (e) of selectively removing one of the three-dimensional and two-dimensional representations from the image.
- 49. The method of claim 42 further comprising step (e) of changing a viewing angle associated with the three-dimensional representation.
- 50. The method of claim 42 further comprising step (e) of displaying at least a second two-dimensional representation in the image.
- 51. A method for generating a compounded image with an ultrasound system, the method comprising the steps of:
(a) acquiring a sequence of at least three frames of ultrasound data; (b) storing each of the at least three frames of ultrasound data separately; (c) inputting compounding information from a user; (d) compounding at least two of the at least three frames of ultrasound data in response to the compounding information, with a finite impulse response, and after step (b); and (e) displaying a compounded image responsive to step (d).
- 52. The method of claim 51 wherein step (a) comprises acquiring the ultrasound data with a one-dimensional transducer array.
- 53. The method of claim 51 wherein step (d) comprises compounding with a finite impulse response filter.
- 54. The method of claim 53 wherein step (d) comprises compounding the at least three frames of ultrasound data.
- 55. The method of claim 54 wherein step (d) comprises:
(d1) multiplying the at least three frames of ultrasound data by at least three respective weights with a finite impulse response filter; and (d2) summing the weighted at least three frames of ultrasound data with the finite impulse response filter.
- 56. The method of claim 53 wherein step (e) comprises displaying a series of compound images, wherein each compound image in the series is responsive to a different set of the separately stored sequence of at least three frames of ultrasound data.
- 57. The method of claim 51 wherein step (c) comprises inputting a temporal persistence value.
- 58. The method of claim 57 wherein step (c) comprises determining a filter size and weighting coefficients in response to the temporal persistence value.
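Claims 57-58 map a user-entered temporal persistence value to a filter size and weighting coefficients, but leave the mapping itself open. One plausible mapping, linear in the number of taps with uniform weights, is sketched below; the function name, the [0, 1] persistence range, and the tap cap are all assumptions of this example:

```python
def persistence_to_fir(persistence, max_taps=8):
    """Map a persistence value in [0, 1] to an FIR filter size and
    weighting coefficients (claim 58). Zero persistence yields a
    single-tap (pass-through) filter; full persistence yields
    'max_taps' evenly weighted taps."""
    taps = max(1, round(1 + persistence * (max_taps - 1)))
    weights = [1.0 / taps] * taps   # uniform weights summing to 1
    return taps, weights
```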
- 59. The method of claim 57 further comprising the steps:
(f) inputting additional compounding information after step (e); (g) compounding the at least two of the at least three frames of ultrasound data in response to the additional compounding information; and (h) displaying an additional compounded image responsive to step (g).
- 60. The method of claim 51 wherein steps (b), (d) and (e) are performed by a processor remote from the ultrasound system.
- 61. The method of claim 51 further comprising:
(f) transforming the at least three frames of ultrasound data prior to step (d); and wherein step (d) comprises compounding transformed ultrasound data responsive to step (f).
- 62. The method of claim 61 wherein step (f) comprises compressing the at least three frames of ultrasound data; and
wherein step (d) comprises compounding compressed ultrasound data.
- 63. The method of claim 61 further comprising step (g) of accounting for non-linear processes in step (f) prior to step (d).
- 64. The method of claim 51 further comprising (f) of aligning the at least two of the at least three frames of ultrasound data in a spatial parameter selected from the group consisting of: range, azimuth, rotation and combinations thereof prior to step (d).
- 65. The method of claim 64 further comprising determining alignment information for step (f) as a function of a region of interest.
- 66. The method of claim 51 wherein step (a) comprises acquiring the sequence, wherein each of the at least three frames of ultrasound image data is associated with at least one point in an elevation position different than an equivalent point in another of the at least three frames of ultrasound data.
- 67. The method of claim 51 wherein:
step (a) comprises acquiring the sequence wherein each of the at least three frames of ultrasound data represent a substantially same region; and step (d) comprises temporally persisting the at least two of the at least three frames of ultrasound data.
- 68. The method of claim 51 wherein at least 200 milliseconds passes between steps (b) and (d).
- 69. The method of claim 51 wherein step (a) comprises acquiring the sequence wherein each of the at least three frames of ultrasound data comprises persisted frames of ultrasound data.
- 70. The method of claim 51 wherein steps (c), (d), and (e) are performed during a non-real time review.
- 71. The method of claim 51 wherein step (d) comprises compounding in response to a non-linear function.
- 72. The method of claim 51 wherein step (c) comprises inputting user selected alteration information; further comprising (f) of altering the compounded ultrasound data responsive to step (d); and wherein step (e) is responsive to step (f).
- 73. An ultrasound system for generating a compounded image, the system comprising:
a beamformer for acquiring a sequence of at least three frames of ultrasound data; a memory for storing each of the at least three frames of ultrasound data separately; a user interface for inputting compounding information; a finite impulse response compounding processor for compounding at least two of the at least three frames of ultrasound data in response to the compounding information and after storage of each of the at least three frames of ultrasound data; and a display for displaying a compounded image responsive to an output of the compounding processor.
- 74. The system of claim 73 wherein the compounding processor comprises a finite impulse response filter.
- 75. The system of claim 73 wherein the display is operable to display a series of compound images, wherein each compound image in the series is responsive to a different set of the separately stored sequence of at least three frames of ultrasound data.
- 76. The system of claim 73 wherein the compounding information comprises a number of frames to compound.
- 77. The system of claim 73 wherein the user interface is operable to receive additional compounding information after the display of the compounded image;
the compounding processor is operable to compound the at least two of the at least three frames of ultrasound data in response to the additional compounding information; and the display is operable to display an additional compounded image responsive to the additional compounding information.
- 78. The system of claim 73 wherein the memory and the compounding processor comprise a workstation remote from the ultrasound system.
- 79. The system of claim 73 wherein at least 200 milliseconds passes between storage and compounding.
- 80. A method for generating a compounded image with an ultrasound system, the method comprising the steps of:
(a) acquiring a sequence of at least two frames of ultrasound data; (b) storing the sequence of at least two frames of ultrasound data representing a substantially same two-dimensional region; (c) compounding the at least two frames of ultrasound data with a finite impulse response filter; and (d) displaying a compounded image responsive to step (c);
wherein at least 200 milliseconds passes after step (b) and before performing step (c).
- 81. The method of claim 80 further comprising step (e) of acquiring the sequence of at least two frames of ultrasound data with a one-dimensional transducer array.
- 82. The method of claim 80 wherein step (c) comprises compounding at least three frames of ultrasound data.
- 83. The method of claim 80 wherein step (c) comprises:
(c1) multiplying the at least two frames of ultrasound data by at least two respective weights; and (c2) summing the weighted at least two frames of ultrasound data.
- 84. The method of claim 80 wherein:
step (b) comprises storing the at least two frames of ultrasound data separately; and step (d) comprises displaying a series of compound images, wherein each compound image in the series is responsive to a different set of the separately stored sequence of at least two frames of ultrasound data, where each different set is independent of at least one of the separately stored sequence of at least two frames of ultrasound data of at least one of the other sets.
- 85. The method of claim 80 further comprising:
(e) inputting persistence information from a user.
- 86. The method of claim 85 wherein step (e) comprises determining a filter size and weighting coefficients in response to the persistence information.
- 87. The method of claim 86 further comprising the steps:
(f) inputting additional persistence information after step (d); (g) compounding the at least two frames of ultrasound data in response to the additional persistence information; and (h) displaying an additional compounded image responsive to step (g).
- 88. The method of claim 80 wherein steps (b), (c) and (d) are performed by a processor remote from the ultrasound system.
- 89. The method of claim 80 further comprising:
(e) transforming the at least two frames of ultrasound data; and wherein step (c) comprises compounding transformed ultrasound data.
- 90. The method of claim 89 wherein step (e) comprises compressing the at least two frames of ultrasound data.
- 91. The method of claim 80 further comprising (e) of aligning the at least two frames of ultrasound data in a spatial parameter selected from the group consisting of: range, azimuth, rotation and combinations thereof prior to step (c).
- 92. The method of claim 91 further comprising determining alignment information for step (e) as a function of a region of interest.
- 93. The method of claim 91 wherein step (a) comprises acquiring the sequence, wherein each of the at least two frames of ultrasound image data is associated with at least one point in an elevation position different than an equivalent point in another of the at least two frames of ultrasound data.
- 94. The method of claim 80 wherein step (a) comprises acquiring the sequence wherein each of the at least two frames of ultrasound data comprises persisted frames of ultrasound data.
- 95. The method of claim 80 wherein steps (c) and (d) are performed during a non-real time review.
- 96. The method of claim 95 wherein step (b) comprises storing the sequence in a CINE memory.
- 97. The method of claim 80 wherein step (c) comprises compounding in response to a non-linear function.
- 98. The method of claim 80 further comprising:
(e) of inputting user selected alteration information selected from the group consisting of: histogram equalization, contrast and resolution mapping function, and combinations thereof; and (f) of altering the compounded ultrasound data resulting from step (c) in response to the user selected alteration information; and wherein step (d) is responsive to step (f).
- 99. The method of claim 80 further comprising:
(e) determining a degree of correlation between the at least two frames of ultrasound data; and wherein step (c) is responsive to the correlation.
- 100. An ultrasound system for generating a compounded image, the system comprising:
a beamformer for acquiring a sequence of at least two frames of ultrasound data; a memory for storing the sequence of at least two frames of ultrasound data representing a substantially same two-dimensional region; a finite impulse response filter for compounding the at least two frames of ultrasound data; and a display for displaying a compounded image responsive to an output of the finite impulse response filter; wherein at least 200 milliseconds passes between storage of the sequence and compounding.
- 101. The system of claim 100 wherein:
the memory is operable to store the at least two frames of ultrasound data separately; and wherein the display is operable to display a series of compound images, wherein each compound image in the series is responsive to a different set of the separately stored sequence of at least two frames of ultrasound data, where each different set is independent of at least one of the separately stored sequence of at least two frames of ultrasound data of at least one of the other sets.
- 102. The system of claim 100 further comprising a user interface operable to receive compounding input information selected from the group consisting of: a filter size and weighting coefficients.
- 103. The system of claim 102 wherein:
the user interface is operable to receive additional compounding input information after the display of the compounded image; the filter is operable to compound the at least two frames of ultrasound data in response to the additional compounding input information; and the display is operable to display an additional compounded image responsive to the additional compounding input information.
- 104. A method for generating a compounded image with an ultrasound system, the method comprising the steps of:
(a) acquiring first and second two-dimensional frames of ultrasound data, each two-dimensional frame of ultrasound data associated with a different elevation position than each other two-dimensional frame of data; (b) determining a degree of correlation between the first and second frames of ultrasound data; (c) compounding the first and second two-dimensional frames of ultrasound data less if the degree of correlation is low and more if the degree of correlation is high; and (d) displaying a compounded image responsive to step (c).
- 105. The method of claim 104 further comprising:
(e) transforming the first and second frames of ultrasound data; and wherein step (c) comprises compounding the transformed ultrasound data.
- 106. The method of claim 105 wherein step (e) comprises compressing the first and second frames of ultrasound data.
- 107. The method of claim 104 further comprising (e) of aligning the first and second frames of ultrasound data in a spatial parameter selected from the group consisting of: range, azimuth, rotation and combinations thereof prior to step (c).
- 108. The method of claim 107, further comprising determining an alignment correlation;
wherein step (e) is a function of the alignment correlation.
- 109. The method of claim 108 further comprising determining the alignment correlation as a function of a region of interest.
- 110. The method of claim 104 wherein step (c) comprises compounding with a finite impulse response filter.
- 111. The method of claim 104 further comprising:
(e) storing the first and second two-dimensional frames of ultrasound data; (f) inputting user persistence information; wherein step (c) is responsive to the user persistence information and occurs at least 200 milliseconds after step (a).
- 112. The method of claim 104 wherein step (b) comprises calculating a minimum sum of absolute differences.
- 113. The method of claim 104 wherein step (c) comprises:
(c1) compounding more as selected from the group consisting of: using more evenly distributed weights, including additional two-dimensional frames of ultrasound data, and combinations thereof; and (c2) compounding less as selected from the group consisting of: weighting the second two-dimensional frame of ultrasound data more than other frames of two-dimensional ultrasound data, including fewer two-dimensional frames of ultrasound data, and combinations thereof.
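Claims 112-113 pair a correlation measure (minimum sum of absolute differences) with compounding that adapts to it: evenly distributed weights when frames correlate strongly, weight skewed toward the newest frame when they do not. Both the SAD normalization and the linear weight schedule below are illustrative assumptions; the claims require only the qualitative behavior.

```python
import numpy as np

def sad_correlation(frame_a, frame_b):
    """Degree of correlation from the sum of absolute differences
    (claim 112): identical frames score 1.0, strongly differing
    frames approach 0. The normalization by intensity range is an
    assumption of this sketch."""
    a = np.asarray(frame_a, dtype=float)
    b = np.asarray(frame_b, dtype=float)
    sad = np.abs(a - b).mean()
    scale = max(np.ptp(a), np.ptp(b), 1e-9)
    return 1.0 - min(sad / scale, 1.0)

def adaptive_weights(correlation):
    """Compound more (even weights) when correlation is high and less
    (favor the newest frame) when it is low (claim 113).
    Returns [weight_old, weight_new]."""
    w_new = 1.0 - 0.5 * correlation   # even split only at correlation 1.0
    return [1.0 - w_new, w_new]
```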
- 114. The method of claim 104 further comprising:
(e) repeating steps (a), (b), (c), and (d) for at least three sets of frames of ultrasound data; and wherein step (c) comprises interpolating the degree of correlation for one of the at least three sets from the degree of correlation from another one of the at least three sets.
- 115. The method of claim 104 further comprising:
(e) inputting a user selected correlation coefficient threshold.
- 116. The method of claim 104 wherein step (b) comprises determining the degree of correlation from data selected from the group consisting of: data associated with a near field image region and data prior to focal gain compensation.
- 117. An ultrasound system for generating a compounded image, the system comprising:
first and second two-dimensional frames of ultrasound data, each two-dimensional frame of ultrasound data associated with a different elevation position than each other two-dimensional frame of data; a compounding processor for determining a degree of correlation between the first and second frames of ultrasound data and for compounding the first and second two-dimensional frames of ultrasound data less if the degree of correlation is low and more if the degree of correlation is high; and a display for displaying a compounded image responsive to the output of the compounding processor.
- 118. The system of claim 117 wherein the compounding processor is operable to compound compressed ultrasound data.
- 119. The system of claim 117 wherein the compounding processor comprises a finite impulse response filter.
- 120. The system of claim 117 further comprising:
a memory operable to store the first and second two-dimensional frames of ultrasound data; and a user interface operable to receive user persistence information; wherein the compounding processor is responsive to the user persistence information and compounds at least 200 milliseconds after storage in the memory.
- 121. A method for generating a compounded image with an ultrasound system, the method comprising the steps of:
(a) transforming first and second frames of ultrasound data; (b) compounding the first and second frames of ultrasound data; (c) decompressing a compounded frame of ultrasound data responsive to step (b); and (d) displaying an image responsive to the compounded frame of ultrasound data.
- 122. The method of claim 121 wherein step (a) comprises transforming pursuant to at least one step of JPEG compression steps and step (c) comprises decompressing pursuant to at least one step of JPEG decompression steps.
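Claims 121-122 compound in a transformed (e.g. JPEG-style compressed) domain and decompress only the compounded result. In the sketch below a log compression stands in for the claimed JPEG transform steps, purely to keep the example self-contained; the weighted sum is the FIR-style compounding of claims 53 and 126.

```python
import numpy as np

def compound_in_transform_domain(frame_a, frame_b, w=0.5):
    """Transform two frames, compound the transformed data, then
    invert the transform before display (claim 121, steps (a)-(c)).
    log1p/expm1 is an illustrative stand-in for a real JPEG-style
    compression/decompression pair."""
    ta = np.log1p(np.asarray(frame_a, dtype=float))   # "compress"
    tb = np.log1p(np.asarray(frame_b, dtype=float))
    compounded = w * ta + (1.0 - w) * tb              # FIR-style compounding
    return np.expm1(compounded)                       # "decompress" for display
```

Compounding before decompression means only one inverse transform is needed per displayed frame instead of one per stored frame, which is one practical motivation for this ordering.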
- 123. The method of claim 121 further comprising:
(e) transforming the compressed first and second frames of ultrasound data as a function of a pure transform operation in step (a) prior to step (b).
- 124. The method of claim 121 further comprising:
(e) inputting persistence information from a user; wherein step (b) is responsive to the persistence information.
- 125. The method of claim 121 further comprising:
(e) determining a filter size and at least one weighting coefficient as a function of user provided compounding information; wherein step (b) is responsive to the filter size and the at least one weighting coefficient.
- 126. The method of claim 121 wherein step (b) comprises compounding with a finite impulse response filter.
- 127. The method of claim 121 further comprising:
(e) determining a degree of correlation relating the first and second frames of ultrasound data; and wherein step (b) is responsive to the degree of correlation.
- 128. The method of claim 121 further comprising:
(e) acquiring the first and second frames of ultrasound data; (f) storing the first and second frames of ultrasound data separately; wherein at least 200 milliseconds passes between steps (f) and (d).
- 129. The method of claim 127 wherein steps (b), (c), and (d) are performed during a non-real time review.
- 130. An ultrasound system for generating a compounded image, the system comprising:
first and second frames of transformed ultrasound data; a compounding processor for compounding the first and second frames of ultrasound data and for decompressing a compounded frame of ultrasound data; and a display for displaying an image responsive to the compounded frame of ultrasound data.
- 131. The system of claim 130 wherein the compounding processor is operable to perform JPEG decompression.
- 132. The system of claim 130 further comprising:
a user interface for receiving input persistence information; wherein the compounded frame of ultrasound data is responsive to the persistence information.
- 133. The system of claim 130 further comprising:
a beamformer for acquiring the first and second frames of ultrasound data; and a memory for storing the first and second frames of ultrasound data separately; wherein at least 200 milliseconds passes between storing and compounding the first and second frames of ultrasound data.
- 134. The method of claim 51 further comprising:
(f) alternatively displaying another image responsive to one of the at least three frames of ultrasound data and different compounding after step (e).
- 135. The method of claim 134 wherein steps (e) and (f) are performed during the display of the sequence.
- 136. The method of claim 134 further comprising:
(g) applying different processing selected from the group consisting of: brightness, contrast, and combinations thereof to the compounded image than to the other image.
- 137. The method of claim 51 wherein step (c) comprises selecting preset persistence values.
- 138. The method of claim 137 further comprising:
(f) automatically adjusting one or both of brightness and contrast in response to the input of the compounding information.
- 139. The method of claim 51 wherein step (c) comprises selecting incremental values.
- 140. The method of claim 51 further comprising:
(f) adjusting one or both of brightness and contrast of the compounded image in response to a user selectable value.
- 141. A method for generating an ultrasound image, the method comprising the steps of:
(a) acquiring a frame of ultrasound data; (b) ultrasound image processing the frame of ultrasound data in response to at least one ultrasound image processing parameter; (c) generating a first image in response to the ultrasound image processed frame of ultrasound data of step (b); (d) obtaining the frame of ultrasound data and the ultrasound image processing parameter; (e) ultrasound image processing the frame of ultrasound data in response to the ultrasound image processing parameter; and (f) generating a second image in response to the ultrasound image processed frame of ultrasound data of step (e).
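The point of claims 141 and 154 is that a displayed image can be regenerated exactly, during or after the imaging session, if both the raw frame and the processing parameter are retained. A minimal sketch, assuming a deterministic processing step; `gamma_map` is a hypothetical post-process mapping, one of the types listed in claim 147, and the frame values and parameter are invented.

```python
import numpy as np

def gamma_map(frame, gamma):
    """Hypothetical post-process mapping: power-law grey-level map
    applied to data normalized to [0, 1]."""
    return np.clip(frame, 0.0, 1.0) ** gamma

frame = np.linspace(0.0, 1.0, 16).reshape(4, 4)   # acquired frame (step a)
param = 0.7                                        # processing parameter

first_image = gamma_map(frame, param)              # steps (b)-(c)
stored = {"frame": frame.copy(), "param": param}   # store both (claim 142)

# Later, possibly on a remote processor (steps (d)-(f)): because the
# processing is deterministic, the regenerated image is identical.
second_image = gamma_map(stored["frame"], stored["param"])
```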
- 142. The method of claim 141 further comprising (g) storing the frame of ultrasound data and the ultrasound image processing parameter; and wherein step (d) comprises obtaining the frame of ultrasound data and the ultrasound image processing parameter from the storage.
- 143. The method of claim 141 wherein step (d) comprises receiving the frame of ultrasound data and the ultrasound image processing parameter at a remote processor.
- 144. The method of claim 141 wherein steps (a), (b) and (c) are performed during an imaging session and steps (e) and (f) are performed after the imaging session.
- 145. The method of claim 141 wherein the frame of ultrasound data comprises raw acoustic beam data.
- 146. The method of claim 141 wherein the frame of ultrasound data comprises data responsive to a first type of ultrasound image processing step;
step (b) comprises applying a second type of ultrasound image processing; and step (e) comprises applying the second type of ultrasound image processing.
- 147. The method of claim 141 wherein steps (b) and (e) comprise a type of ultrasound image processing selected from the group consisting of: persistence compounding, spatial compounding, depth gain compensation, focal gain compensation, post-process mapping, dynamic range alteration, histogram equalization and combinations thereof.
- 148. A system for generating an ultrasound image, the system comprising:
a beamformer for acquiring a frame of ultrasound data; a first ultrasound image device operable to ultrasound image process the frame of ultrasound data in response to at least one ultrasound image processing parameter; a first display operable to generate a first image in response to the ultrasound image processed frame of ultrasound data output from the ultrasound image device; a device operable to obtain the frame of ultrasound data and the ultrasound image processing parameter; one of the first and a second ultrasound image device operable to ultrasound image process the frame of ultrasound data in response to the ultrasound image processing parameter; and one of the first and a second display operable to generate a second image in response to the ultrasound image processed frame of ultrasound data output by the one of the first and the second ultrasound image device.
- 149. The system of claim 148 wherein the device operable to obtain comprises a memory operable to store the frame of ultrasound data and the ultrasound image processing parameter.
- 150. The system of claim 148 wherein one of the first and the second ultrasound image devices comprises the second ultrasound image device at a location remote to the first ultrasound image device.
- 151. The system of claim 148 wherein the frame of ultrasound data comprises raw beam data.
- 152. The system of claim 148 wherein the frame of ultrasound data comprises data responsive to a first type of ultrasound image process;
the first ultrasound image device is operable to apply a second type of ultrasound image processing; and the one of the first and second ultrasound image device is operable to apply the second type of ultrasound image processing.
- 153. The system of claim 148 wherein the first ultrasound image device and the one of the first and second ultrasound image devices is operable to apply a type of ultrasound image processing selected from the group consisting of: persistence compounding, spatial compounding, depth gain compensation, focal gain compensation, post-process mapping, dynamic range alteration, histogram equalization and combinations thereof.
- 154. A method for generating an ultrasound image, the method comprising the steps of:
(a) generating an image in response to an ultrasound image processing parameter and a frame of ultrasound data; and (b) re-generating the image in response to the ultrasound image processing parameter and the frame of ultrasound data.
- 155. The method of claim 154 further comprising (c) storing the frame of ultrasound data and the ultrasound image processing parameter; and wherein step (b) comprises obtaining the frame of ultrasound data and the ultrasound image processing parameter from the storage.
- 156. The method of claim 154 further comprising step (c) of transmitting the frame of ultrasound data and the ultrasound image processing parameter to a remote processor, wherein step (b) comprises re-generating the image with the remote processor.
- 157. The method of claim 154 wherein step (a) is performed during an imaging session and step (b) is performed after the imaging session.
- 158. The method of claim 154 wherein the frame of ultrasound data comprises raw beam data.
- 159. The method of claim 154 wherein the frame of ultrasound data comprises data responsive to a first type of ultrasound image process;
step (a) comprises applying a second type of ultrasound image process; and step (b) comprises applying the second type of ultrasound image process.
- 160. The method of claim 154 wherein steps (a) and (b) comprise applying a type of ultrasound image processing selected from the group consisting of: persistence compounding, spatial compounding, depth gain compensation, focal gain compensation, post-process mapping, dynamic range alteration, histogram equalization and combinations thereof.
- 161. A method for generating an ultrasound image, the method comprising the steps of:
(a) generating an image in response to an ultrasound image processing parameter and a frame of ultrasound data; (b) transmitting the frame of ultrasound data and the ultrasound image processing parameter; and (c) re-generating the image in response to the ultrasound image processing parameter and the transmitted frame of ultrasound data.
- 162. The method of claim 161 further comprising:
(d) storing the frame of ultrasound data and the ultrasound image processing parameter.
- 163. The method of claim 161 wherein step (b) comprises transmitting the frame of ultrasound data and the ultrasound image processing parameter to a remote processor; and
wherein step (c) comprises re-generating the image with the remote processor.
- 164. The method of claim 161 wherein step (a) is performed during an imaging session and step (c) is performed after the imaging session.
- 165. The method of claim 161 wherein the frame of ultrasound data comprises raw beam data.
- 166. The method of claim 161 wherein the frame of ultrasound data comprises data responsive to a first type of ultrasound image process;
step (a) comprises applying a second type of ultrasound image process; and step (c) comprises applying the second type of ultrasound image process.
- 167. The method of claim 161 wherein steps (a) and (c) comprise applying a type of ultrasound image processing selected from the group consisting of: persistence compounding, spatial compounding, depth gain compensation, focal gain compensation, post-process mapping, dynamic range alteration, histogram equalization and combinations thereof.
- 168. A network for generating an ultrasound image, the network comprising:
an ultrasound system operable to generate an image in response to an ultrasound image processing parameter and a frame of ultrasound data; a remote system operable to re-generate the image in response to the ultrasound image processing parameter and the transmitted frame of ultrasound data; and a network for transmitting the frame of ultrasound data and the ultrasound image processing parameter from the ultrasound system to the remote system.
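A sketch of the arrangement in claims 161 and 168: the ultrasound system transmits both the frame and the processing parameter, and the remote system re-generates the same image by reapplying the same deterministic processing. The `process` function is a hypothetical dynamic range alteration (one of the types listed in claim 167), and a JSON string stands in for the network payload; none of these names come from the claims.

```python
import json
import numpy as np

def process(frame, dynamic_range):
    """Hypothetical dynamic range alteration: clip to the selected
    range, then normalize for display."""
    return np.clip(frame, 0.0, dynamic_range) / dynamic_range

# Ultrasound system side: generate the image, then transmit the frame
# of ultrasound data together with the processing parameter.
frame = np.arange(16, dtype=float).reshape(4, 4)
param = 10.0
local_image = process(frame, param)
payload = json.dumps({"frame": frame.tolist(), "param": param})  # stand-in for the network

# Remote system side: recover the frame and parameter, then
# re-generate the identical image.
msg = json.loads(payload)
remote_image = process(np.array(msg["frame"]), msg["param"])
```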
RELATED APPLICATIONS
[0001] This application is a continuation-in-part of U.S. patent application Ser. No. 09/199,945, filed Nov. 25, 1998, which is a continuation-in-part of Ser. Nos. 09/089,060 and 09/089,467, both filed on Jun. 2, 1998.
Divisions (1)

| | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 09328113 | Jun 1999 | US |
| Child | 10299179 | Nov 2002 | US |
Continuation in Parts (3)

| | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 09199945 | Nov 1998 | US |
| Child | 09328113 | Jun 1999 | US |
| Parent | 09089060 | Jun 1998 | US |
| Child | 09199945 | Nov 1998 | US |
| Parent | 09089467 | Jun 1998 | US |
| Child | 09089060 | Jun 1998 | US |