The present application claims priority from Korean Patent Application No. 10-2008-0094567 filed on Sep. 26, 2008, the entire subject matter of which is incorporated herein by reference.
The present disclosure generally relates to ultrasound imaging, and more particularly to ultrasound volume data processing to visualize a moving object in a 3-dimensional ultrasound image.
An ultrasound diagnostic system has become an important and popular diagnostic tool because of its wide range of applications. Specifically, due to its non-invasive and non-destructive nature, the ultrasound diagnostic system has been extensively used in the medical profession. Modern high-performance ultrasound diagnostic systems and techniques are commonly used to produce two- or three-dimensional diagnostic images of internal features of an object (e.g., human organs).
Recently, the ultrasound diagnostic system has been improved to provide 3-dimensional ultrasound images. Among these, a static 3-dimensional ultrasound image is often used for ultrasound diagnostic purposes. By using the static 3-dimensional ultrasound image, it is possible to perform accurate observations, diagnoses or treatments of the human body without conducting complicated procedures such as invasive operations. However, the static 3-dimensional image may not be useful in certain cases, for example, in observing a moving target object, such as a fetus in the uterus, in real time.
To overcome this shortcoming, a live 3-dimensional imaging method and apparatus for providing a 3-dimensional moving image (rather than the static 3-dimensional image) has been developed. The live 3-dimensional image can show the movement of a moving target object more smoothly than the static 3-dimensional image.
Further, there has been increased interest in the heart condition of a fetus, since early diagnosis of the fetus's status is increasingly needed. However, because the systole and diastole of the heart repeat rapidly, it is impossible to scan all the movements of the heart by using a 3-dimensional probe alone. Thus, there is a problem in providing an image of the real heartbeat.
Embodiments for processing volume data are disclosed herein. In one embodiment, by way of non-limiting example, a volume data processing device comprises: a volume data acquisition unit operable to acquire ultrasound volume data consisting of a plurality of image frames representing a periodically moving target object, wherein each of the frames includes a plurality of pixels; a period setting unit operable to set a feature point for each of the frames and set a moving period of the target object based on the feature points set for the image frames; and a volume data reconstructing unit operable to interpolate the ultrasound volume data to have the same number of image frames within each moving period and reconstruct the interpolated ultrasound volume data into a plurality of sub volumes based on the moving period.
In another embodiment, a volume data processing method comprises: a) acquiring ultrasound volume data having a plurality of frames from a periodically moving target object, wherein each of the frames includes a plurality of pixels; b) setting a feature point at each of the frames based on values of the pixels included therein; c) setting a moving period of the target object based on the feature points set at the frames; d) interpolating the ultrasound volume data to have the same number of image frames within each moving period; and e) reconstructing the interpolated ultrasound volume data into a plurality of sub volumes based on the moving period.
The Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used in determining the scope of the claimed subject matter.
A detailed description may be provided with reference to the accompanying drawings. One of ordinary skill in the art may realize that the following description is illustrative only and is not in any way limiting. Other embodiments of the present invention may readily suggest themselves to such skilled persons having the benefit of this disclosure.
The volume data acquisition unit 110 may include a probe (not shown) that may be operable to transmit ultrasound signals into a target object and receive echo signals reflected from the target object. The probe may further be operable to convert the received echo signals into electrical receive signals. The volume data acquisition unit 110 may further include a beam former (not shown) that may be operable to form a receive-focused beam based on the electrical receive signals, and a signal processor (not shown) that may be operable to perform signal processing upon the receive-focused beam to thereby form a plurality of frames constituting volume data.
The scan conversion unit 120 may be coupled to the volume data acquisition unit 110 to receive the plurality of frames. The scan conversion unit 120 may be operable to perform scan conversion upon the plurality of frames to convert them into a data format suitable for display on the display unit 150.
The period detection unit 130 may include a feature point setting section 131, a feature point curve forming section 132 and a period setting section 133, as illustrated in
Also, the feature point setting section 131 may be operable to horizontally sum pixel values at each of the Y coordinates 1-N in the frame. That is, assuming that pixel values in the frame are represented by PXY, the feature point setting section 131 may be operable to sum P1Y, P2Y, . . . and PMY to thereby output fourth sums Sy1-SyN corresponding to the respective Y coordinates. Subsequently, the feature point setting section 131 may further be operable to multiply the fourth sums Sy1-SyN by weights Wy1-WyN, respectively, to thereby output second weighted sums SMy1-SMyN. In one embodiment, the weights Wy1-WyN may be arbitrary values that increase or decrease at a constant interval. For example, the numbers 1-N may be used as the weight values Wy1-WyN. The feature point setting section 131 may further be operable to sum all of the fourth sums Sy1-SyN to thereby output a fifth sum, and sum all of the second weighted sums SMy1-SMyN to thereby output a sixth sum. The feature point setting section 131 may further be operable to divide the sixth sum by the fifth sum, and then set the division result as the centroid on the Y axis.
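By way of illustration, the centroid computation described above can be sketched as follows; the function name frame_centroid, the use of numpy, and the choice of the coordinate indices as the weights are merely illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def frame_centroid(frame):
    """Compute the intensity-weighted centroid (feature point) of one frame.

    frame: 2-D array of pixel values P[x, y] with shape (M, N).
    Returns (cx, cy), the centroid on the X and Y axes.
    """
    M, N = frame.shape
    # Sums of pixel values along each X coordinate and each Y coordinate.
    sums_x = frame.sum(axis=1)                # analogous to the per-X sums
    sums_y = frame.sum(axis=0)                # fourth sums Sy1-SyN
    # Weights taken as the coordinate indices 1..M and 1..N.
    wx = np.arange(1, M + 1)
    wy = np.arange(1, N + 1)
    cx = (wx * sums_x).sum() / sums_x.sum()   # weighted sum divided by plain sum
    cy = (wy * sums_y).sum() / sums_y.sum()   # sixth sum divided by fifth sum
    return cx, cy
```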
Although the feature point is described as being set by using the centroid of the pixel values (intensities) constituting each of the frames, the feature point setting is certainly not limited thereto. For example, the feature point of each frame may instead be set through singular value decomposition of the frame.
Once the setting of the centroid is complete for all of the frames, the feature point curve forming section 132 may be operable to plot the centroids on the X-Y coordinate system, and then set a principal axis 300 thereon, as illustrated in
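The manner in which the principal axis 300 is set is not detailed here; the following sketch assumes it is fitted to the centroid cloud in a least-squares (PCA) sense, and that the centroids are then projected onto it to obtain a one-dimensional feature point curve whose periodicity reflects the target object's motion. The name feature_point_curve is hypothetical.

```python
import numpy as np

def feature_point_curve(centroids):
    """Project per-frame centroids onto a principal axis.

    centroids: array of shape (num_frames, 2) holding (cx, cy) for each frame.
    Returns one value per frame: the signed position of the centroid along
    the principal axis.
    """
    pts = np.asarray(centroids, dtype=float)
    mean = pts.mean(axis=0)
    # Principal axis as the dominant eigenvector of the centroid covariance
    # (a least-squares fit through the centroid cloud).
    cov = np.cov((pts - mean).T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    axis = eigvecs[:, np.argmax(eigvals)]
    # Signed distance of each centroid along the principal axis.
    return (pts - mean) @ axis
```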
In one embodiment, the period detection unit 130 may further include a region of interest (ROI) setting section (not shown) that may be operable to set a region of interest in each of the image frames to reduce computation. The ROI setting section may be operable to perform horizontal projection, in which the brightness of all pixels along each horizontal pixel line in the image frame is summed to obtain a projected value for that line. The boundaries nT and nB of the ROI can be calculated by using equation (1) shown below.
wherein fn represents the horizontally projected signal, Mean represents the mean of the projected values, nT represents the vertical position of the leftmost projected value among those smaller than the mean value, and nB represents the vertical position of the rightmost projected value among those smaller than the mean value. nT and nB are used as the boundaries of the ROI. The ROI setting section may further be operable to mask the image frame by using the boundaries nT and nB of the ROI, thereby removing regions located outside the boundaries nT and nB from the image.
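As an illustration of the boundary rule described for equation (1), the following sketch takes nT and nB as the first and last horizontal lines whose projected value falls below the mean of all projected values; the helper names roi_boundaries and mask_roi are illustrative, and this is a sketch of the described rule rather than the exact formula.

```python
import numpy as np

def roi_boundaries(frame):
    """Estimate the ROI boundaries n_T and n_B of one frame.

    Each horizontal pixel line (row) is summed into a projected value f(n);
    the boundaries are the first and last line positions whose projected
    value falls below the mean of all projected values.
    """
    f = frame.sum(axis=1).astype(float)   # horizontal projection, one value per line
    below = np.where(f < f.mean())[0]     # line positions below the mean
    if below.size == 0:                   # degenerate case: keep the whole frame
        return 0, frame.shape[0] - 1
    return below[0], below[-1]

def mask_roi(frame, n_t, n_b):
    """Zero out the rows located outside the ROI boundaries."""
    masked = np.zeros_like(frame)
    masked[n_t:n_b + 1, :] = frame[n_t:n_b + 1, :]
    return masked
```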
The volume data reconstructing unit 140 may be operable to perform interpolation upon the volume data so that the same number of frames is included within each period. After completing the interpolation, the volume data reconstructing unit 140 reconstructs the interpolated volume data to provide a 3-dimensional ultrasound image showing the heartbeat in accordance with the present invention.
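The interpolation scheme is not specified above; the following sketch assumes simple linear interpolation along the time axis to bring every period to a common frame count. The helper name resample_period is hypothetical.

```python
import numpy as np

def resample_period(frames, num_target):
    """Interpolate the frames of one moving period to a fixed frame count.

    frames: array of shape (num_frames, H, W) covering a single period.
    num_target: the common number of frames every period should contain.
    Linear interpolation along the time axis is an assumption of this sketch.
    """
    frames = np.asarray(frames, dtype=float)
    n = frames.shape[0]
    src = np.linspace(0.0, 1.0, n)           # original frame positions in the period
    dst = np.linspace(0.0, 1.0, num_target)  # desired frame positions
    idx = np.searchsorted(src, dst, side='right').clip(1, n - 1)
    t = (dst - src[idx - 1]) / (src[idx] - src[idx - 1])
    return (1 - t)[:, None, None] * frames[idx - 1] + t[:, None, None] * frames[idx]
```

One way to then form the sub volumes is to group the frames that occupy the same position within their respective resampled periods, so that successive sub volumes replay one full cycle of the target object's motion.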
Further, when the 3-dimensional volume data are acquired by scanning the target object, the object (e.g., the expectant mother or the fetus) may move. This makes it difficult to accurately detect the heartbeat of the fetus. Accordingly, the ultrasound image processing device may further include a motion compensating unit (not shown). The motion compensating unit may be operable to compensate for the motion of the expectant mother or the fetus by matching the brightness of pixels between a previously set VOI and a currently set VOI. The motion compensating unit calculates the motion vectors by summing the absolute differences of the brightness of pixels between the previously set VOI and the currently set VOI. For example, assuming that the VOI at an n-th frame is expressed as Vn(m), the VOI at the next frame can be expressed as Vn(m+1). In such a case, the variable m takes the values n−1, n and n+1. The motion compensating unit moves Vn(m) up, down, right and left by displacements (i, j), and then calculates the absolute differences of the brightness of pixels between Vn(m) and Vn(m+1) at each position. A motion vector is estimated at the position where the absolute difference is minimal. The sum of the absolute differences is calculated by the following equation (2).
wherein W represents a predefined motion estimation range, K represents the total number of frames, i and j represent the motion displacements, k and l represent the position of a pixel in the frame included in the VOI, and m represents the frame number.
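As an illustration of the search described by equation (2), the following sketch displaces the previously set VOI over a window of plus or minus W and keeps the displacement with the smallest absolute brightness difference. The function estimate_motion_vector and its parameters are illustrative, and the mean absolute difference over the overlapping region is used instead of the raw sum as a simplifying assumption so that different overlap sizes stay comparable.

```python
import numpy as np

def estimate_motion_vector(voi_prev, voi_curr, search_range):
    """Estimate a motion vector between two VOIs by minimizing the absolute
    pixel-brightness difference over displacements within +/- search_range.

    voi_prev, voi_curr: arrays of shape (frames, H, W) for the previously set
    and currently set VOI.
    """
    best_cost, best_vec = np.inf, (0, 0)
    _, h, w = voi_prev.shape
    for i in range(-search_range, search_range + 1):        # vertical displacement
        for j in range(-search_range, search_range + 1):    # horizontal displacement
            # Overlapping region of the displaced previous VOI and the current VOI.
            prev = voi_prev[:, max(i, 0):h + min(i, 0), max(j, 0):w + min(j, 0)]
            curr = voi_curr[:, max(-i, 0):h - max(i, 0), max(-j, 0):w - max(j, 0)]
            cost = np.abs(prev.astype(float) - curr.astype(float)).mean()
            if cost < best_cost:
                best_cost, best_vec = cost, (i, j)
    return best_vec
```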
Since the volume data are reconstructed in accordance with the moving period, an improved ultrasound image of the target object can be provided. Also, since the motion of the expectant mother or the fetus is compensated, the ultrasound image can be provided more accurately and clearly.
Although embodiments have been described with reference to a number of illustrative embodiments thereof, it should be understood that numerous other modifications and embodiments can be devised by those skilled in the art that will fall within the spirit and scope of the principles of this disclosure. More particularly, numerous variations and modifications are possible in the component parts and/or arrangements of the subject combination arrangement within the scope of the disclosure, the drawings and the appended claims. In addition to variations and modifications in the component parts and/or arrangements, alternative uses will also be apparent to those skilled in the art.