An embodiment of the invention relates generally to ultrasound-based diagnostic systems and procedures employing image acquisition, processing, and image presentation systems and methods.
Computer-based analysis of medical images pertaining to ascertaining organ structures allows for the diagnosis of organ diseases and function. Identifying and measuring organ boundaries allows a medical expert to assess disease states and prescribe therapeutic regimens. The true shape of a cavity or structure within body tissue requires accurate detection for the medical expert to assess organ normalcy or pathological condition. However, inaccurate organ boundary detection can prevent an accurate assessment of a true medical condition, since the cavity area and volume of a structure such as the bladder are either underestimated or overestimated. Traditional ultrasound technology employs the intensity information from B-mode images for segmentation. However, due to the complexity of human anatomy and the artifacts of ultrasound imaging, this B-mode information is insufficient. There is a need to non-invasively and rapidly identify and accurately measure cavity boundaries within an ultrasound-probed region-of-interest (ROI) so as to enable accurate assessment of a medical condition.
In an embodiment, a system includes at least one transducer configured to transmit at least one ultrasound pulse into a region of interest (ROI) of a patient. The pulse has at least a first frequency and propagates through a bodily structure in the ROI. The system further includes at least one receiver configured to receive at least one echo signal corresponding to the pulse. The echo signal includes the first frequency and at least one harmonic multiple of the first frequency. The system further includes a processor configured to automatically determine, from the at least one harmonic multiple, at least one boundary of the bodily structure. In an embodiment, the processor is configured to automatically determine, from the at least one harmonic multiple, an amount of fluid within the bodily structure.
The file of this patent contains at least one drawing executed in color. Copies of this patent with color drawing(s) will be provided by the Patent and Trademark Office upon request and payment of the necessary fee. Preferred and alternative embodiments of the present invention are described in detail below with reference to the following drawings.
In at least one embodiment, ultrasound systems and methods employ harmonic theory to improve bladder segmentation. The reflected harmonic content associated with tissue regions beyond a volume of liquid, such as urine or amniotic fluid, to be measured is used to make a processing device, such as a computer, aware of the presence of the liquid. A color-coded image in pseudo C-mode view may be constructed based upon the strength of harmonic ratios from structures of a region-of-interest having structural components that increase the harmonic content of the ultrasound waveform. The color-coded image can be utilized as useful guidance for the task of aiming an ultrasound transceiver. In addition, the harmonic ratio profile on each scan plane can be used to rectify the segmentation of the bladder (or, in a non-pregnant female, the uterus) region and the fluid volume measurement.
In at least one embodiment, ultrasound systems and methods develop, present, and use harmonic theory, applied only to voxels containing harmonic information, to improve bladder segmentation. A goal is to assign regions preceding the bladder back wall tissue to the urine structure, rather than to improve image quality for visualization or image processing. Although there is little echo from a fluid such as urine or amniotic fluid, the propagation history through this type of liquid gives rise to additional decision-making capability. The Goldberg number of the liquid is an advantageous indicator in an embodiment. The harmonic content from bladder back wall tissue beyond the fluid is impacted by the presence of the fluid in the ultrasound path in front of the tissue.
In at least one embodiment, ultrasound systems and methods develop, present, and use a color-coded image of a structure within a region-of-interest. A color-coded image of the structure or region-of-interest may be obtained based upon determining the optimal ultrasound harmonic frequency exhibited by the structure within the region-of-interest.
The harmonic distortion due to non-linear effects that prevail in urine is an advantageous element of an embodiment. The concept of the harmonic is not new. For example, many methods have been proposed for using harmonic information to improve ultrasound image quality. In general, these methods use the reflected sound wave at all voxels and enhance the image quality at the corresponding location using its harmonic content. The harmonic information utilized in these applications is from all kinds of tissues. However, instead of improving the image quality for visualization or image processing, an embodiment models fluid in a bodily structure, such as urine inside a bladder, and tissue, such as the bladder back wall and tissues behind it, as two different media for harmonic generation and absorption, so as to provide very useful information, such as the length of the path through the urine relative to the current scan path length.
The harmonic information is processed and utilized in a novel way. The entire propagation history of each scan line is processed to provide a corresponding indicator. Urine in front of tissue will influence the harmonic information reflected from the tissue behind the urine. Hence, regions composed of urine along the scan line contribute to the harmonic accumulation that appears on the structure behind the region. The urine itself is anechoic and generally does not present any image signal. Regions devoid of urine do not contribute to the harmonic accumulation. Without considering this accumulation process, looking at the harmonic information at each voxel independently will not provide information such as how much urine is present in the current scan line. In short, we are not using harmonic image information; we are using harmonic propagation history information.
Another feature of an embodiment is the ultrasound propagation medium model employed. Instead of using harmonic information to differentiate different tissues as suggested by other approaches, we treat all the tissues with a single model. A focus may be the significant difference in harmonic propagation between tissue and urine, which is very clear from harmonic propagation theory. This treatment of the harmonic information provides the opportunity to make fully or partially automatic determinations of how much urine is under examination, without relying on human estimation of the same.
In at least one embodiment, due to the above harmonic processing features, the transmitted signal we choose is narrowband, which is different from the wideband signals used for harmonic imaging. This is because we process the harmonic propagation history; hence the spatial resolution can be sacrificed and traded for better harmonic amplitude ratio estimation.
In at least one embodiment, for each imaging direction, an ultrasound transceiver transmits two pulses. The first one is a traditional B-mode pulse, while the second one is the narrowband pulse explained above for harmonic ratio estimation. The information obtained from the harmonic (second) pulse is merged with the B-mode information from the first pulse to provide a comprehensive view of the medium under examination. The successful fusion of these two pieces of information is another feature of an embodiment.
Quantitative harmonic amplitude estimation is a very challenging task due to the noisy nature of the spectrum and the nonhomogeneous properties of the signal. Many advanced spectral estimation algorithms have been developed in the literature to provide improved spectral estimation results for various engineering applications. Based on their principles, these algorithms can be divided into two approaches: parametric and nonparametric. Since the parametric approach is more sensitive to data modeling errors, the nonparametric approach is adopted in an embodiment to build a robust spectral estimator. Careful study of ultrasound propagation can lead to a good choice for this spectral estimator.
Other harmonic approaches use the absolute value of the second or higher harmonics as an indicator for volume rendering or threshold choosing. An embodiment uses the ratio between the second and the first harmonic to give us a better indicator, which is independent of the various echo generating capabilities of the tissues under examination. The conventional harmonic imaging approach cannot provide tissue harmonic absorbing information since the echo generating capability of the tissue will dominate the received signal.
There is a fundamental difference between an embodiment and a known alternative approach: an embodiment is concerned with the tissue harmonic absorption (this is why the harmonic propagation history of one scan line is processed herein), while the harmonic imaging technology from the alternative approach is concerned with the tissue harmonic generation. As discussed above in connection with our ultrasound propagation medium model, the model we selected for urine and tissue are based on their dramatically different harmonic absorption capabilities.
In at least one embodiment, systems and methods are described for acquiring, processing, and presenting a color-coded image in the pseudo C-mode view, based upon the strength of harmonic ratios from structures of a region-of-interest having structural components that increase the harmonic content of the ultrasound waveform. Optimization of image acquisition by providing systems and methods to direct transceiver placement or repositioning is described. When the structure or organ of interest, or region of interest (ROI), is a bladder, harmonic ratio classification results may be applied to alert the computer-executable programs, in any combination, to check the volume measurement so as to properly determine a small or large bladder, to check the volume measurement of the bladder, and to adjust segmentation algorithms to prevent overestimation of the bladder size.
The result can also be combined with the pseudo C-mode view display for transceiver aiming or final bladder shape determination. The simplest use of the result may be as follows: if the bladder size is large compared with the harmonic ratio classification, the dimensions of the current shape can be checked for overestimation; if the bladder size is too small, an appropriate compensation can be made to enlarge the shape for display, or the shape can otherwise be appropriately modified. In general, the harmonic ratio is extra information extracted from the received ultrasound signal, which can be utilized to improve quantitative measurement of the bladder and/or fluid volume.
Alternate embodiments include systems and/or methods of image processing for automatically segmenting (i.e., automatically detecting the boundaries of) bodily structures within a region of interest (ROI) of a single image or a series of images undergoing dynamic change. Particular and alternate embodiments provide for the subsequent measurement of areas and/or volumes of the automatically segmented shapes within the image ROI of a singular image or multiple images of an image series undergoing dynamic change.
A directional indicator panel 22 includes a plurality of arrows that may be illuminated for initial targeting and for guiding a user in targeting an organ or structure within an ROI. In particular embodiments, if the organ or structure is centered upon placement of the transceiver 10A acoustically coupled against the dermal surface at a first location of the subject, the directional arrows may not be illuminated. If the organ is off-center, an arrow or set of arrows may be illuminated to direct the user to reposition the transceiver 10A acoustically at a second or subsequent dermal location of the subject. The acoustic coupling may be achieved by liquid sonic gel applied to the skin of the patient or by sonic gel pads against which the transceiver dome 20 is placed. The directional indicator panel 22 may be presented on the display 54 of computer 52 in harmonic imaging subsystems described in
Transceiver 10A includes an inertial reference unit that includes an accelerometer and/or gyroscope (not shown) positioned preferably within or adjacent to housing 18. In case the ROI (region of interest) of one transceiver is not large enough to contain the organ of interest, such as when measuring amniotic fluid, the accelerometer and/or gyroscope can be used to merge several scans at different locations into one reference frame. The accelerometer may be operable to sense an acceleration of the transceiver 10A, preferably relative to a coordinate system, while the gyroscope may be operable to sense an angular velocity of the transceiver 10A relative to the same or another coordinate system. Accordingly, the gyroscope may be of conventional configuration that employs dynamic elements, or it may be an optoelectronic device, such as the known optical ring gyroscope. In one embodiment, the accelerometer and the gyroscope may include a commonly packaged and/or solid-state device. One suitable commonly packaged device is the MT6 miniature inertial measurement unit, available from Omni Instruments, Incorporated, although other suitable alternatives exist. In other embodiments, the accelerometer and/or the gyroscope may include commonly packaged micro-electromechanical system (MEMS) devices, which are commercially available from MEMSense, Incorporated. As described in greater detail below, the accelerometer and the gyroscope cooperatively permit the determination of positional and/or angular changes relative to a known position that is proximate to an anatomical region of interest in the patient.
The transceiver 10A includes (or is capable of being in signal communication with) a display (not shown) operable to view processed results from an ultrasound scan, and/or to allow an operational interaction between the user and the transceiver 10A. For example, the display may be configured to display alphanumeric data that indicates a proper and/or an optimal position of the transceiver 10A relative to the selected anatomical portion. The display may be used to view two- or three-dimensional images of the selected anatomical region. Accordingly, the display may be a liquid crystal display (LCD), a light emitting diode (LED) display, a cathode ray tube (CRT) display, or another suitable display device operable to present alphanumeric data and/or graphical images to a user.
Still referring to
To scan a selected anatomical portion of a patient, the transceiver dome 20 of the transceiver 10A may be positioned against a surface portion of a patient that is proximate to the anatomical portion to be scanned. The user actuates the transceiver 10A by depressing the trigger 14. In response, the transceiver 10A transmits ultrasound signals into the body, and receives corresponding return echo signals that may be at least partially processed by the transceiver 10A to generate an ultrasound image of the selected anatomical portion. In a particular embodiment, the transceiver 10A transmits ultrasound signals in a range that extends from approximately two megahertz (MHz) to approximately ten MHz.
In one embodiment, the transceiver 10A may be operably coupled to an ultrasound system that may be configured to generate ultrasound energy at a predetermined frequency and/or pulse repetition rate and to transfer the ultrasound energy to the transceiver 10A. The system also includes a processor that may be configured to process reflected ultrasound energy that is received by the transceiver 10A to produce an image of the scanned anatomical region. Accordingly, the system generally includes a viewing device, such as a cathode ray tube (CRT), a liquid crystal display (LCD), a plasma display device, or other similar display devices, that may be used to view the generated image. The system may also include one or more peripheral devices that cooperatively assist the processor to control the operation of the transceiver 10A, such as a keyboard, a pointing device, or other similar devices. In still another particular embodiment, the transceiver 10A may be a self-contained device that includes a microprocessor positioned within the housing 18 and software associated with the microprocessor to operably control the transceiver 10A, and to process the reflected ultrasound energy to generate the ultrasound image. Accordingly, the display 24 may be used to display the generated image and/or to view other information associated with the operation of the transceiver 10A. For example, the information may include alphanumeric data that indicates a preferred position of the transceiver 10A prior to performing a series of scans. In yet another particular embodiment, the transceiver 10A may be operably coupled to a general-purpose computer, such as a laptop or a desktop computer that includes software that at least partially controls the operation of the transceiver 10A, and also includes software to process information transferred from the transceiver 10A, so that an image of the scanned anatomical region may be generated. The transceiver 10A may also be optionally equipped with electrical contacts to make communication with receiving cradles 50 as discussed in
As described above, the angular movement of the transducer may be mechanically effected and/or it may be electronically or otherwise generated. In either case, the number of lines 48 and the length of the lines may vary, so that the tilt angle φ sweeps through angles approximately between −60° and +60° for a total arc of approximately 120°. In one particular embodiment, the transceiver 10 may be configured to generate approximately seventy-seven scan lines between the first limiting scan line 44 and a second limiting scan line 46. In another particular embodiment, each of the scan lines has a length of approximately 18 to 20 centimeters (cm). The angular separation between adjacent scan lines 48 (
The locations of the internal and peripheral scan lines may be further defined by an angular spacing from the center scan line 34B and between internal and peripheral scan lines. The angular spacing between scan line 34B and peripheral or internal scan lines may be designated by angle Φ and angular spacings between internal or peripheral scan lines may be designated by angle Ø. The angles Φ1, Φ2, and Φ3 respectively define the angular spacings from scan line 34B to scan lines 34A, 34C, and 31D. Similarly, angles Ø1, Ø2, and Ø3 respectively define the angular spacings between scan line 31B and 31C, 31C and 34A, and 31D and 31E.
With continued reference to
One of the parameters that expresses the balance between attenuation and harmonic generation for an ultrasound wave is the Goldberg number, G, which represents a measure of the attenuation or harmonic distortion likely to prevail. When G=1, nonlinear effects become comparable to attenuation effects. If the Goldberg number is higher than 1, nonlinear processes dominate the wave propagation behavior. For values of the Goldberg number below 1, attenuation is more significant in governing the amplitude of the harmonic components than the energy transfer due to nonlinear distortion. Fat has a Goldberg number below 1 (0.27). Muscle, liver, and blood have a Goldberg number above but near 1. Urine and amniotic fluid have a Goldberg number of 104. This difference is caused primarily by attenuation, which is very low for urine and very high for fat, although the nonlinearity coefficient of fat is higher than that of urine. These simple calculations demonstrate the difference between different media in causing waveform distortion. Urine and amniotic fluid have a higher ability to provoke strong nonlinear distortion compared with other body tissues. In an embodiment, the large Goldberg number value of urine and amniotic fluid is utilized to distinguish the bladder or umbilical region from other tissue regions.
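For reference, and using standard nonlinear-acoustics notation that does not appear in the original text, the Goldberg number can be written as the ratio of the absorption length to the shock-formation distance:

$$ G \;=\; \frac{\ell_a}{\bar{x}} \;=\; \frac{1/\alpha}{\rho_0 c_0^3/(\beta\,\omega\,p_0)} \;=\; \frac{\beta\,\omega\,p_0}{\alpha\,\rho_0\,c_0^3}, $$

where α is the attenuation coefficient at the transmit angular frequency ω, β is the coefficient of nonlinearity, p₀ is the source pressure amplitude, and ρ₀ and c₀ are the density and small-signal sound speed of the medium. Because α is orders of magnitude smaller for urine or amniotic fluid than for soft tissue or fat, G is correspondingly large in those fluids, consistent with the values quoted above.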
Color-coded images using the harmonic ratio, specially designed for bladder aiming/targeting, are employed in an embodiment. The method is based on the special properties of the non-linear propagation of the ultrasound wave. A fast interpolation and an efficient color map may be explored, and a 2D pseudo-color image can be generated for each bladder scan. An operator can easily find the urine-filled bladder in the image and adjust the scanning direction for best aiming.
Based on the initial design and implementation of 30 tests on human subjects, and using the interpolated shape as reference, the color harmonic imaging method provides accurate feedback about, for example, bladder location, bladder shape and bladder volume. This technique can be easily applied for clinical usage for more accurate data collection and analysis. A process according to an embodiment is illustrated in
At step 1.3 illustrated in
Echo signals received from structures in the body carry not only the frequencies of the original transmit pulse, but also multiples, or harmonics, of these frequencies. The linear (fundamental) components are used in conventional, fundamental B-mode imaging. Harmonic echo frequencies are caused by non-linear effects during the propagation of ultrasound.
For example, THI (tissue harmonic imaging) is based on the phenomenon wherein ultrasound signals are distorted while propagating through tissue with varying acoustic properties. However, THI is merely an imaging method that does not solve the bladder detection problem.
Harmonic information is hidden in the frequency domain and is an effective indicator of harmonic build-up on each scan line at different depths, based on which bladder lines and tissue lines can be separated. For example, inside a bladder region there is not enough reflection, so the attenuations of the first and second harmonics are low. Deep behind the bladder wall, both the first and the second harmonics are attenuated, with the second harmonic attenuated much faster than the first. As a result, harmonic information will be higher for a scan line that passes through a bladder than for a scan line that penetrates tissue only.
One way to use the harmonic information is to use the relative change of the response around the 2nd harmonic frequency compared with the response at the fundamental frequency. The ratio of the peak value around the 2nd harmonic to the peak value around the fundamental frequency (the harmonic ratio) is a suitable indicator of such change.
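As a rough, hedged illustration (not the device's actual firmware), the following Python sketch estimates that ratio for one depth window of a received scan line using a nonparametric (Welch) spectral estimate; the function name, window length, and search bandwidth are assumptions not taken from the text.

```python
import numpy as np
from scipy.signal import welch

def harmonic_ratio_db(segment, fs, f0, search_bw=0.2e6):
    """Ratio (dB) of the spectral peak near 2*f0 to the peak near f0.

    segment   : 1-D array of RF samples from one depth window of a scan line
    fs        : sampling frequency in Hz
    f0        : fundamental (transmit) frequency in Hz
    search_bw : half-width in Hz of the band searched around each harmonic
    """
    # Nonparametric (Welch) spectral estimate; robust to the noisy,
    # non-stationary character of the RF data.
    freqs, psd = welch(segment, fs=fs, nperseg=min(256, len(segment)))

    def peak_in_band(fc):
        band = (freqs >= fc - search_bw) & (freqs <= fc + search_bw)
        return psd[band].max() if band.any() else np.nan

    p1 = peak_in_band(f0)        # peak power around the fundamental
    p2 = peak_in_band(2 * f0)    # peak power around the second harmonic
    return 10.0 * np.log10(p2 / p1)
```

In practice, a scan line would be divided into consecutive depth windows and this ratio tracked versus depth to expose the harmonic build-up described above.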
From the clinical data collected from an ultrasound device, it can be observed that the spectrum is very noisy. This holds true even when there is little or no noise present within the data. Convolution theory indicates that it is hard to use the conventional FFT method to get a good spectral estimate, not to mention that the stationarity assumption does not hold for this data. A robust harmonic processing algorithm enables such a device to produce good harmonic estimation results.
Many advanced spectral estimation algorithms have been developed in the literature to provide improved spectral estimation results for various applications. Based on their principles, these algorithms can be divided into two approaches: parametric and nonparametric. Since the parametric approach is more sensitive to data modeling errors, an embodiment includes the nonparametric approach to build a robust spectral estimator.
A block diagram of the Harmonic Analysis Kernel is illustrated in
In the above sub-algorithm, ‘Att_Comp’ is an attenuation compensation parameter (2.5 dB/cm is used, estimated from clinical data). The ‘threshold’ is a parameter used to reject data segments when they are too weak. Ratio_low=−35 dB. In summary, the ‘normalization’ step removes the data segments that are too weak, the compensation step compensates the harmonic ratio loss in tissue, and the averaging step provides a more robust ratio estimator.
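A minimal sketch of those three steps is given below; only the 2.5 dB/cm compensation value and the −35 dB floor come from the text, while the per-segment depth weighting, the rejection rule, and all names are assumptions.

```python
import numpy as np

ATT_COMP = 2.5     # dB/cm, attenuation compensation (from clinical data)
RATIO_LOW = -35.0  # dB, floor assigned to rejected (too-weak) segments

def compensated_line_ratio(ratios_db, amplitudes, depths_cm, threshold):
    """Combine per-segment harmonic ratios into one robust per-line value.

    ratios_db  : harmonic ratio (dB) of each depth segment on the scan line
    amplitudes : mean echo amplitude of each segment (used for rejection)
    depths_cm  : depth (cm) of the centre of each segment
    threshold  : amplitude below which a segment is considered too weak
    """
    ratios = np.asarray(ratios_db, dtype=float).copy()

    # 'Normalization': reject segments whose echo is too weak to trust.
    weak = np.asarray(amplitudes) < threshold
    ratios[weak] = RATIO_LOW

    # Compensation: restore the harmonic-ratio loss accumulated in tissue.
    ratios = ratios + ATT_COMP * np.asarray(depths_cm)

    # Averaging: a single, more robust ratio estimate for the whole line.
    return float(np.mean(ratios[~weak])) if (~weak).any() else RATIO_LOW
```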
The final step may be spatially smoothing the harmonic ratios across the scan lines within a plane.
Data collected from a clinical test has been used to validate the model: more urine leads to stronger harmonic content.
Previous bladder detection methods focus only on the gradient information from B-mode images. As discussed elsewhere herein, artifacts in ultrasound images create difficulties. Harmonic information provides extra features from the frequency domain, and the combination improves application accuracy.
An embodiment includes combining harmonic features with B-mode image properties. Such an approach may include a pre-trained 5 by 5 by 1 Neural Network [
The neural network may require exponential calculation in a logistic function [logistic(x)=1.0/(1+exp(−x))]. In the DSP processing, an embodiment uses a lookup table to give a fast implementation.
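A minimal sketch of such a lookup table follows, assuming a clamped input range of [−8, 8] and linear spacing, neither of which is specified in the text.

```python
import numpy as np

# Precomputed table: logistic(x) sampled on [-8, 8]; outside this range the
# function is effectively 0 or 1, so the table ends are reused via clamping.
X_MIN, X_MAX, N_ENTRIES = -8.0, 8.0, 1024
_table = 1.0 / (1.0 + np.exp(-np.linspace(X_MIN, X_MAX, N_ENTRIES)))
_step = (X_MAX - X_MIN) / (N_ENTRIES - 1)

def logistic_lut(x):
    """Fast logistic(x) via table lookup, avoiding exp() on the DSP."""
    idx = int(round((x - X_MIN) / _step))
    idx = max(0, min(N_ENTRIES - 1, idx))   # clamp to the table range
    return _table[idx]
```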
The trained network may be in the following configuration:
An embodiment may use harmonic information for bladder detection (Grading on Walls). The goal of using harmonic information is to improve liquid-volume measurement accuracy and help a user locate a bladder region faster. The goal is directly related to the segmentation accuracy of the bladder region. With the harmonic information, we can check if the segmentation (detection of bladder walls) on each scan line is valid. The grading from the neural network provides more robust information to fix the initial bladder walls.
The basic idea is to use the grading value:
In an embodiment, a region G is defined in which all lines have grading higher than the threshold. Additionally, a region W is defined based on the cuts from the fixed walls.
For Region G and region W, there may be five different cases to address:
(1) G and W are not Overlapped (Including Empty G or Empty W):
(2) G Inside W:
(3) W Inside G:
(4) G and W are Partly Overlapped:
(5) G and W are Exactly the Same:
It is easy to remove a wrong segmentation line, but it is difficult to add new lines. An embodiment determines the average of the non-zero initial wall on the current line and the non-zero fixed wall from its neighbor.
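A hedged sketch of how regions G and W might be compared, and how a missing wall might be added, is shown below; the interval representation, function names, and the action taken in each case are assumptions, with only the five case labels and the neighbor-averaging rule taken from the text.

```python
def classify_regions(G, W):
    """Classify the relationship between region G (lines graded above the
    threshold) and region W (cuts from the fixed walls). Each region is a
    (first_line, last_line) index pair, or None if empty."""
    if G is None or W is None or G[1] < W[0] or W[1] < G[0]:
        return "not_overlapped"            # case (1), including empty G or W
    if W[0] <= G[0] and G[1] <= W[1]:
        if G == W:
            return "exactly_same"          # case (5)
        return "G_inside_W"                # case (2)
    if G[0] <= W[0] and W[1] <= G[1]:
        return "W_inside_G"                # case (3)
    return "partly_overlapped"             # case (4)

def add_missing_wall(initial_wall, neighbor_fixed_wall):
    """When a new line must be added, average the non-zero initial wall on the
    current line with the non-zero fixed wall from its neighbor."""
    if initial_wall and neighbor_fixed_wall:
        return 0.5 * (initial_wall + neighbor_fixed_wall)
    return initial_wall or neighbor_fixed_wall
```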
The bladder detection task is more challenging for a female patient due to the presence therein of a uterus. In general, the uterus is adjacent to the bladder region and has a very similar pattern in the B-mode image.
It is optionally advantageous to exclude the uterus region from the final segmentation, so that the computed volume reflects the actual urine inside the bladder. A previous uterus detection method addresses the whole segmentation after wall detection using volume; in other words, it tries to determine whether the segmentation is bladder or uterus. However, it is sometimes not so simple to refine the result, because the segmentation includes both bladder and uterus. An embodiment may determine which part of the segmentation belongs to the bladder and which part is the uterus. This may be a difficult task, especially when the bladder is small.
The uterus can be located side by side with the bladder, and it can also be located under the bladder. For the first case, a method previously described herein can be used to distinguish the scan lines passing through the uterus only from the scan lines passing through the bladder. However, such a method may not be able to solve the second problem. When a scan line propagates through both the bladder region and the uterus region, further processing has to be performed to find which part of the line belongs to the bladder.
An embodiment is based on the following observation: if the scan is on a female patient, there must be a boundary between uterus and bladder region and the uterus is always under the bladder if both regions appear on a scan line. In the B-mode image, for each scan line passing through both regions, a small ridge exists. If the ridge can be located, an embodiment can tell the two structures apart.
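As a purely illustrative sketch (the detailed design is in the referenced figure), one way to locate such a ridge is to search for the strongest interior intensity peak between the detected walls; the smoothing width and the peak criterion here are assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d

def find_bladder_uterus_ridge(line_intensity, front_wall, back_wall, smooth=9):
    """Locate the small intensity ridge between bladder and uterus on one
    scan line; returns the sample index of the strongest interior peak
    between the walls, or None if no interior peak exists."""
    smoothed = uniform_filter1d(np.asarray(line_intensity, float), smooth)
    inside = smoothed[front_wall:back_wall]
    if inside.size < 3:
        return None
    interior = inside[1:-1]
    peaks = (interior > inside[:-2]) & (interior > inside[2:])
    if not peaks.any():
        return None
    # Strongest local maximum; samples above it would belong to the bladder,
    # samples below it to the uterus.
    rel = int(np.argmax(np.where(peaks, interior, -np.inf)))
    return front_wall + 1 + rel
```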
A detailed design of an embodiment of this procedure is illustrated in
Referring again to
A C-mode image may be defined as a plane parallel to the face of the transducer. As illustrated in
An algorithm according to an embodiment to generate the final C-mode view shape is illustrated in
1) Cuts based on segmentation on all planes
In this step, the leftmost and rightmost cuts on each plane are extracted based on the segmentation.
2) Check the consistency of the segmentation results
Theoretically, a bladder in the bladder scan is a single connected 3D volume. For various reasons (one of which is that the segmentation algorithm searches for the bladder wall blindly, plane by plane), there may be more than one 3D region, and the corresponding bladder walls are also stored in the segmentation results. This step may perform a topological consistency check to guarantee that there is only one connected region in the C-mode view.
3) Compute the mass center of all the valid cuts. Re-compute the corresponding radius and angle of every valid cut. Then smooth the radius.
Compute the Cartesian coordinates for each valid cut and obtain the mass center. Based on this mass center, compute the corresponding radius and angle of every valid cut. Sort the new angles in ascending order and, at the same time, align the corresponding radii. In order to smooth the final interpolated shape, an embodiment averages the radii from the above result within a pre-defined neighborhood.
4) Linear interpolation between the smoothed cuts is performed.
5) Output the walls of the interpolated shape
In an embodiment, the final output used to represent the interpolated shape is stored in two arrays, each of size 250. The final display is a 2D matrix of 250 by 250. The two arrays store the upper wall and lower wall locations in each column, respectively.
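A compact, hedged sketch of steps 3 through 5 above follows, assuming each valid cut is available as an (x, y) point in the C-mode plane and that the 250-by-250 display grid is the target; the smoothing window and rasterization details are assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d

def interpolate_cmode_shape(cuts_xy, grid=250, smooth_win=5):
    """Build the interpolated C-mode shape from the valid cuts.

    cuts_xy : (N, 2) array of valid cut locations in the C-mode plane
    Returns (upper_wall, lower_wall), each of length `grid`, holding the wall
    row coordinate for every column (NaN where the shape is absent).
    """
    cuts = np.asarray(cuts_xy, float)
    center = cuts.mean(axis=0)                       # step 3: mass centre
    d = cuts - center
    angles = np.arctan2(d[:, 1], d[:, 0])
    radii = np.hypot(d[:, 0], d[:, 1])

    order = np.argsort(angles)                       # ascending angle order
    angles, radii = angles[order], radii[order]
    radii = uniform_filter1d(radii, smooth_win, mode="wrap")  # smooth radii

    # step 4: linear interpolation of radius over a dense set of angles
    dense_a = np.linspace(-np.pi, np.pi, 4 * grid)
    dense_r = np.interp(dense_a, angles, radii, period=2 * np.pi)

    # step 5: rasterise the closed contour into upper/lower wall arrays
    x = center[0] + dense_r * np.cos(dense_a)
    y = center[1] + dense_r * np.sin(dense_a)
    col = np.clip(np.round(x).astype(int), 0, grid - 1)
    upper = np.full(grid, np.nan)
    lower = np.full(grid, np.nan)
    for c, yy in zip(col, y):
        upper[c] = yy if np.isnan(upper[c]) else min(upper[c], yy)
        lower[c] = yy if np.isnan(lower[c]) else max(lower[c], yy)
    return upper, lower
```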
As discussed elsewhere herein, we introduced a pseudo C-mode view of the interpolated shape. An advantageous application is to provide guidance for the users to find the best scanning location and angle. This task may be called aiming.
Basically, the aiming is based on the segmentation results and is similar to the C-mode shape functionality. In an embodiment, there are two kinds of aiming information: the arrow on the probe and the intermediate shapes.
1) Arrow Feedback
Using an extra display panel on the scanner, an embodiment also provides arrow feedback after a full scan. The arrow feedback may be based on the C-mode view shape. There may be four different arrow feedback modes as illustrated in
Eight arrows may be used. The arrow to be used is determined by the location of the mass center of the interpolated shape in the C-mode view. Based on the vector between the ultrasound cone center and the mass center, the corresponding angle can be computed in a range from −180 degrees to +180 degrees. The [−180, 180] range is divided into eight parts, and each part corresponds to one arrow.
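A minimal sketch of that mapping is shown below, assuming arrow index 0 corresponds to the sector starting at −180° (the actual arrow assignment is not specified in the text).

```python
import math

def arrow_index(cone_center, mass_center, n_arrows=8):
    """Map the vector from the ultrasound cone centre to the shape's mass
    centre onto one of eight direction arrows."""
    dx = mass_center[0] - cone_center[0]
    dy = mass_center[1] - cone_center[1]
    angle = math.degrees(math.atan2(dy, dx))          # in [-180, 180]
    # divide [-180, 180) into n_arrows equal sectors
    sector = int((angle + 180.0) // (360.0 / n_arrows))
    return min(sector, n_arrows - 1)                  # angle == +180 wraps
```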
2) Pubic Bone Detection
In order to provide accurate aiming feedback information, the shadow caused by the pubic bone should also be considered. In the ultrasound image, the only feature associated with the pubic bone is the big and deep shadow. If the shadow is far from the bladder region we are interested in for volume calculation, there is no need to use this information. However, if the shadow is too close to the bladder region, or the bladder is partly inside the shadow caused by pubic bone, the corresponding volume determination will be greatly influenced. If the bladder walls are incomplete due to the shadow, we will underestimate the bladder volume.
Therefore, if the user is provided with the pubic bone information, a better scanning location can be chosen and a more accurate liquid volume measurement can be made.
An embodiment includes the following method to effect pubic bone detection based on the special shadow behind it.
In the above procedure, one factor is determining the KI_threshold based on the B-mode images. An embodiment utilizes an automated thresholding technique from image processing, the Kittler & Illingworth thresholding method. See Kittler, J., Illingworth, J., 1986, Minimum Error Thresholding, Pattern Recognition, 19, 41-47.
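For illustration, a straightforward (not device-specific) implementation of the Kittler & Illingworth minimum-error criterion applied to a grey-level histogram might look as follows; the bin count is an assumption.

```python
import numpy as np

def kittler_illingworth_threshold(image, bins=256):
    """Minimum-error threshold (Kittler & Illingworth, 1986) of a grey image."""
    hist, edges = np.histogram(image.ravel(), bins=bins)
    h = hist.astype(float) / hist.sum()
    g = 0.5 * (edges[:-1] + edges[1:])               # bin centres

    best_t, best_j = edges[0], np.inf
    for t in range(1, bins - 1):
        p1, p2 = h[:t].sum(), h[t:].sum()
        if p1 <= 0 or p2 <= 0:
            continue
        mu1 = (g[:t] * h[:t]).sum() / p1
        mu2 = (g[t:] * h[t:]).sum() / p2
        var1 = ((g[:t] - mu1) ** 2 * h[:t]).sum() / p1
        var2 = ((g[t:] - mu2) ** 2 * h[t:]).sum() / p2
        if var1 <= 0 or var2 <= 0:
            continue
        # minimum-error criterion J(T)
        j = 1.0 + 2.0 * (p1 * np.log(np.sqrt(var1)) + p2 * np.log(np.sqrt(var2))) \
                - 2.0 * (p1 * np.log(p1) + p2 * np.log(p2))
        if j < best_j:
            best_j, best_t = j, edges[t]
    return best_t
```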
In one instance, the shadow does not affect the volume measurement since the pubic bone is far from the bladder region; in a second case, the influence is strong since the pubic bone blocks the bladder region partly. Using a pubic icon (not shown) on the feedback screen, operators are trained to recognize when a new scanning location should be chosen and when not.
3) Intermediate Shape
Referring to
The first step is to use the grading values to find the cuts on the current plane:
The second step is to generate a virtual painting board and draw lines between the cuts on the current plane and the cuts from the previous plane. We show a series of intermediate C-mode shapes on the exemplary data set in
Reverberation Noise Control
Before an embodiment calculates the bladder volume based on the detected front and back walls, another step may be performed to remove wrong segmentation due to strong reverberation noise.
An embodiment has an advantage over previous approaches in that the grading information helps find the bladder lines as completely as possible. In previous approaches, bladder wall detection stops early when strong reverberation noise is present.
However, even the above improvement is still not able to fix the wrong segmentation on some lines due to reverberation noise. Therefore, an embodiment includes the following method to remove small wedges on the bladder walls using shape information:
An embodiment includes an interpolation approach using the adjacent bladder wall shape. We have already considered the cases in which the bladder shape indeed has a large convex part on the front or back wall by defining two parameters (valid_FW_change and valid_BW_change).
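A hedged sketch of the wedge-removal idea on one wall is given below; treating a single-line jump larger than the valid-change parameter as a reverberation wedge is an assumption about how valid_FW_change and valid_BW_change are used.

```python
import numpy as np

def remove_wall_wedges(wall_depths, valid_change):
    """Replace small spurious wedges (reverberation artefacts) on one wall.

    wall_depths  : per-scan-line wall depth (front or back wall)
    valid_change : largest depth jump accepted as a genuine convex bladder
                   part (valid_FW_change or valid_BW_change in the text)
    """
    w = np.asarray(wall_depths, float).copy()
    for i in range(1, len(w) - 1):
        neighbor_avg = 0.5 * (w[i - 1] + w[i + 1])
        # a jump larger than valid_change on a single line is treated as a
        # wedge caused by reverberation and is interpolated away
        if abs(w[i] - neighbor_avg) > valid_change:
            w[i] = neighbor_avg
    return w
```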
Volume Measurement
Referring now to
For every scan line except the broadside scan line (phi=0), a spherical wedge shape is defined, with the physical scan line passing through the center of the wedge. The spherical wedge is bounded on top by the front wall, on the bottom by the back wall, and on the sides by the averages of the current scan line's spherical angles and the next closest spherical angles. [The left-side image in FIG. 28.]
For the broadside scan line, a truncated cone is used. [The right-side image in FIG. 28.]
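As a hedged sketch of the two volume elements (not the exact angular bounds used in FIG. 28), the spherical wedge can be approximated with a solid-angle element and the broadside contribution with a truncated cone; the small-angle approximation and parameter names are assumptions.

```python
import numpy as np

def wedge_volume(r_front, r_back, phi, d_phi, d_theta):
    """Volume assigned to one non-broadside scan line: a spherical wedge
    between the front and back wall, spanning d_phi in tilt and d_theta in
    rotation around the broadside axis (angles in radians)."""
    d_omega = abs(np.sin(phi)) * d_phi * d_theta      # solid-angle element
    return (r_back**3 - r_front**3) / 3.0 * d_omega

def broadside_cone_volume(r_front, r_back, half_angle):
    """Volume assigned to the broadside line (phi = 0): a truncated cone of
    the given half-angle between the front and back wall."""
    return np.pi / 3.0 * np.tan(half_angle)**2 * (r_back**3 - r_front**3)
```

Summing the per-line contributions over all scan lines and planes yields the fluid volume estimate.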
While the particular embodiments have been illustrated and described for presenting color-coded ultrasound images based upon ultrasound harmonic frequencies exhibiting optimal signal-to-noise ratios for sub-structures, many changes can be made without departing from the spirit and scope of the invention. For example, harmonics may be employed in imaging applications other than ultrasound. Additionally, although estimations applied to urine content have been emphasized herein throughout, embodiments of the invention apply to analysis of other bodily fluids, such as amniotic fluid and blood, as well. For example, amniotic fluid volume in a pregnant female can be measured by employing at least one embodiment of the invention. The non-pregnant female's uterus can be distinguished from a bladder by employing at least one embodiment of the invention, inasmuch as blood occasionally present within the uterus of the non-pregnant female does not have as high a Goldberg number as either amniotic fluid in the pregnant female or urine within the female bladder. As such, for example, blood in an engorged umbilical cord may be distinguished from amniotic fluid by employing at least one embodiment of the invention. Accordingly, the scope of embodiments of the invention is not limited by the disclosure of the particular embodiments. Instead, embodiments of the invention should be determined entirely by reference to the claims that follow.
This application incorporates by reference and claims priority to U.S. provisional patent application Ser. No. 60/882,888 filed Dec. 29, 2006. This application incorporates by reference and claims priority to U.S. provisional patent application Ser. No. 60/703,201 filed Jul. 28, 2005. This application is a continuation-in-part of and claims priority to U.S. patent application Ser. No. 11/213,284 filed Aug. 26, 2005. This application is a continuation-in-part of and claims priority to U.S. patent application Ser. No. 11/010,539 filed Dec. 13, 2004. This application is a continuation-in-part of and claims priority to U.S. patent application Ser. No. 10/523,681 filed Feb. 3, 2005. This application is a continuation-in-part of and claims priority to U.S. patent application Ser. No. 11/625,802 filed Jan. 22, 2007. This application incorporates by reference and claims priority to U.S. provisional patent application Ser. No. 60/938,446 filed May 16, 2007. This application incorporates by reference and claims priority to U.S. provisional patent application Ser. No. 60/938,359 filed May 16, 2007. This application is a continuation-in-part of and claims priority to U.S. patent application Ser. No. 11/925,843 filed Oct. 27, 2007. This application is a continuation-in-part of and claims priority to U.S. patent application Ser. No. 11/926,522 filed Oct. 27, 2007. This application is a continuation-in-part of and claims priority to U.S. patent application Ser. No. 10/704,996 filed Nov. 10, 2003. This application is a continuation-in-part of and claims priority to U.S. patent application Ser. No. 11/295,043 filed Dec. 6, 2005. This application is a continuation-in-part of and claims priority to U.S. patent application Ser. No. 11/925,850 filed Oct. 27, 2007. This application claims priority to and is a continuation-in-part of U.S. patent application Ser. No. 11/119,355 filed Apr. 29, 2005, which claims priority to U.S. provisional patent application Ser. No. 60/566,127 filed Apr. 30, 2004. This application also claims priority to and is a continuation-in-part of U.S. patent application Ser. No. 10/701,955 filed Nov. 5, 2003, which in turn claims priority to and is a continuation-in-part of U.S. patent application Ser. No. 10/443,126 filed May 20, 2003. This application claims priority to and is a continuation-in-part of U.S. patent application Ser. No. 11/061,867 filed Feb. 17, 2005, which claims priority to U.S. provisional patent application Ser. No. 60/545,576 filed Feb. 17, 2004 and U.S. provisional patent application Ser. No. 60/566,818 filed Apr. 30, 2004. This application is also a continuation-in-part of and claims priority to U.S. patent application Ser. No. 10/704,966 filed Nov. 10, 2004. This application claims priority to and is a continuation-in-part of U.S. patent application Ser. No. 10/607,919 filed Jun. 27, 2005. This application is a continuation-in-part of and claims priority to PCT application serial number PCT/US03/24368 filed Aug. 1, 2003, which claims priority to U.S. provisional patent application Ser. No. 60/423,881 filed Nov. 5, 2002 and U.S. provisional patent application Ser. No. 60/400,624 filed Aug. 2, 2002. This application is also a continuation-in-part of and claims priority to PCT Application Serial No. PCT/US03/14785 filed May 9, 2003, which is a continuation of U.S. patent application Ser. No. 10/165,556 filed Jun. 7, 2002. This application is also a continuation-in-part of and claims priority to U.S. patent application Ser. No. 10/888,735 filed Jul. 9, 2004. 
This application is also a continuation-in-part of and claims priority to U.S. patent application Ser. No. 10/633,186 filed Jul. 31, 2003 which claims priority to U.S. provisional patent application Ser. No. 60/423,881 filed Nov. 5, 2002 and to U.S. patent application Ser. No. 10/443,126 filed May 20, 2003 which claims priority to U.S. provisional patent application Ser. No. 60/423,881 filed Nov. 5, 2002 and to U.S. provisional application 60/400,624 filed Aug. 2, 2002. All of the above applications are herein incorporated by reference in their entirety as if fully set forth herein.