Apparatuses consistent with one or more exemplary embodiments of the present disclosure relate to ultrasound diagnostic apparatuses, and particularly to an ultrasound diagnostic apparatus used for examining the growth of a fetus.
Ultrasound-based diagnostic imaging, because it uses sound waves, has little effect on the human body. It is therefore often used for prenatal checkups, and the condition in which a fetus grows is examined with reference to ultrasound images of the fetus obtained during a checkup.
A well-known method for examining the condition of a growing fetus is to calculate an estimated weight of the fetus based on the ultrasound images. More specifically, the estimated fetal weight is calculated by measuring the lengths of specific regions (the head, abdomen, and thigh) of the fetus in the mother's uterus and substituting the measured values into a formula used for estimating the fetal weight.
In a typical ultrasound examination, the examiner first operates a probe so that the specific regions of the fetus are delineated. The examiner then adjusts the probe so that cross-sectional images appropriate for the measurements can be obtained, and displays the measurement images of the specific regions. On the respective measurement images, the examiner measures the biparietal diameter (BPD) of the head, the abdominal circumference (AC) of the abdomen, and the femoral length (FL) of the thigh of the fetus. The estimated fetal weight is obtained by substituting the measured values into the estimated fetal weight calculation formula shown as Formula 1 below.
Estimated weight (g) = 1.07 × BPD³ + 3.00 × 10⁻¹ × AC² × FL  (Formula 1)
Here, BPD (biparietal diameter/cm), AC (abdominal circumference/cm), and FL (femoral length/cm) are the lengths of the regions respectively shown in
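The following is a minimal sketch, for illustration only, of how Formula 1 may be evaluated in software; the function name and the example measured values are hypothetical and do not form part of the apparatus described herein.

```python
def estimated_fetal_weight(bpd_cm: float, ac_cm: float, fl_cm: float) -> float:
    """Estimated fetal weight (g) according to Formula 1.

    bpd_cm: biparietal diameter (BPD) in cm
    ac_cm:  abdominal circumference (AC) in cm
    fl_cm:  femoral length (FL) in cm
    """
    return 1.07 * bpd_cm ** 3 + 3.00e-1 * ac_cm ** 2 * fl_cm


# Hypothetical measured values, for illustration only.
print(estimated_fetal_weight(bpd_cm=8.8, ac_cm=30.0, fl_cm=6.8))  # roughly 2565 g
```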
According to such a conventional method, an estimated fetal weight can be obtained by measuring the BPD, the AC, and the FL after the respective appropriate measurement images (hereafter referred to as "measurement reference images") have been displayed. Then, by comparing the estimated fetal weight thus obtained with statistical data on estimated fetal weight, it is possible to examine the condition of a growing fetus.
With the conventional method, however, when the measurement reference images are inappropriate, that is, when they are not displayed in a manner suitable for measuring the BPD, the AC, and the FL, these lengths cannot be measured accurately. For example, when the thighbone in the thigh is displayed, it may appear shorter than its actual length on the measurement reference image if the angle between the probe and the thighbone is not appropriate. The same applies to the head and the abdomen: the biparietal diameter and the abdominal circumference may appear longer than their actual lengths, depending on the angle each region makes with the probe.
Therefore, in order to properly obtain an estimated fetal weight, the examiner has to operate the probe carefully so as to obtain and determine appropriate measurement reference images. In other words, whether an estimated fetal weight can be properly obtained (whether the measurement reference images determined by the examiner enable accurate measurement of the BPD, the AC, and the FL) depends on the skills and knowledge of the examiner. This is because the location and position of the fetus change constantly during the examination.
In response to this problem, a technique has been disclosed in which voxel data composing a three-dimensional region is obtained through the transmission and reception of ultrasound waves, and a cut plane is set for the voxel data so as to obtain cross-sectional images at arbitrary angles (see PTL 1). By using the method suggested in PTL 1 to obtain the measurement reference images described above, the examiner can set appropriate cut planes after obtaining the voxel data of a fetus during operation of the probe. In other words, it is possible to set appropriate measurement reference images regardless of the skills of the examiner.
However, with the conventional configuration using the technique disclosed in PTL 1, although the dependence on the examiner's skills is reduced, the examiner still needs to set the cut planes, so whether appropriate measurement reference images can be obtained still depends on the examiner's judgment. That is to say, the problem that the examiner has to judge whether the respective measurement reference images are appropriate for the measurements and has to give instructions based on those judgments remains unsolved.
One or more exemplary embodiments of the present disclosure may overcome the aforementioned conventional problem and other problems not described herein. However, it is understood that one or more exemplary embodiments of the present disclosure are not required to overcome or may not overcome the problem described above and other problems not described herein. One or more exemplary embodiments of the present disclosure provide an ultrasound diagnostic apparatus capable of reducing the dependence on the examiner and calculating an estimated fetal weight with high accuracy and easy operation.
According to an exemplary embodiment of the present disclosure, the ultrasound diagnostic apparatus includes: a three-dimensional data generation unit configured to generate three-dimensional data for one or more regions in a body of a subject based on reflected waves reflecting back from the body of the subject after ultrasound waves have been transmitted towards the body of the subject; a measurement image selection unit configured to select, based on an intensity of the reflected waves, one of two-dimensional cross-sections that compose the three-dimensional data, as a measurement reference image used for measuring a length of each region in the body of the subject; a measurement and calculation unit configured to measure the length of each region in the body of the subject using the selected measurement reference image, and to calculate an estimated weight of the subject using the measured lengths; and an output unit configured to output the calculated estimated weight.
With this configuration, it is possible to realize the ultrasound diagnostic apparatus capable of reducing the dependence on the examiner and calculating an estimated fetal weight with high accuracy and easy operation.
Here, the measurement image selection unit may include: a hyperechoic region extraction unit configured to extract, from the three-dimensional data, a hyperechoic region which is a region corresponding to the reflected waves having a reflection intensity that is greater than a threshold value; a cut plane obtainment unit configured to obtain two-dimensional cross-sections that compose the three-dimensional data, by cutting the three-dimensional data based on a three-dimensional feature of the extracted hyperechoic region; and a reference image selection unit configured to select one of the two-dimensional cross-sections as the measurement reference image used for measuring the length of the region in the body of the subject.
With this configuration, an appropriate cut plane can be obtained by narrowing down the candidate cut planes based on the three-dimensional features of the hyperechoic region, so that a cross-section appropriate for measurement can be selected with high accuracy.
It should be noted that the present inventive concept may be implemented not only as an ultrasound diagnostic apparatus such as that described herein, but also as a method having, as its steps, the processes performed by the processing units included in the ultrasound diagnostic apparatus, as a program which causes a computer to execute such characteristic steps, and even as information, data, or a signal which represents the program. In addition, such a program, information, data, and signal can be distributed via a recording medium such as a CD-ROM or via a transmission medium such as the Internet.
According to one or more exemplary embodiments of the present disclosure, it is possible to realize an ultrasound diagnostic apparatus capable of reducing the dependency on the examiner and calculating an estimated fetal weight with high accuracy and easy operation.
These and other advantages and features of exemplary embodiments of the present disclosure will become apparent from the following description thereof taken in conjunction with the accompanying Drawings that illustrate general and specific exemplary embodiments of the present disclosure. In the Drawings:
Hereinafter, certain exemplary embodiments of the present disclosure shall be described in greater detail with reference to the accompanying Drawings.
Each of the exemplary embodiments described below shows a general or specific example. The numerical values, shapes, materials, structural elements, the arrangement and connection of the structural elements, steps, the processing order of the steps etc. shown in the following exemplary embodiments are mere examples, and therefore do not limit the inventive concept, the scope of which is defined in the appended Claims and their equivalents. Therefore, among the structural elements in the following exemplary embodiments, structural elements not recited in any one of the independent claims defining the most generic part of the inventive concept are not necessarily required to overcome conventional disadvantage(s).
An ultrasound diagnostic apparatus 1 shown in
The ultrasound diagnostic apparatus main body 100 includes a control unit 102, a transmission and reception unit 103, a B-mode image generation unit 104, a three-dimensional data generation unit 105, a measurement image selection unit 106a which includes a hyperechoic region extraction unit 106, a cut plane obtainment unit 107, and a measurement reference image selection unit 108, a data storage unit 109, a measurement and calculation unit 112, and an output unit 113.
The probe 101 is connected to the ultrasound diagnostic apparatus main body 100, and ultrasound transducers for transmitting and receiving ultrasound waves are arranged in the probe 101. The probe 101 transmits ultrasound waves according to an instruction from the transmission and reception unit 103, and receives, as echo signals, reflected waves (ultrasound reflected signals) from the body of the subject. The probe 101 also includes a motor which allows the ultrasound transducers to vibrate in a direction perpendicular to the scanning direction. Therefore, when the body of the subject is scanned using the probe 101, the ultrasound transducers scan the body while vibrating, and thus cross-sectional data in the direction perpendicular to the scanning direction can be obtained based on the echo signals. It should be noted that the probe 101 is not limited to a probe that has a vibration mechanism. For instance, a two-dimensional array probe in which the ultrasound transducers are arranged in a matrix may be driven, or a mechanism which moves the probe 101 in parallel at a constant speed may be used. All that is required of the probe 101 is a means to transmit and receive ultrasound waves three-dimensionally.
The control unit 102 controls the respective units in the ultrasound diagnostic apparatus main body 100. Note that although it is not specifically stated hereafter, the control unit 102 governs the respective units and operates these units while controlling the operation timings and others.
The transmission and reception unit 103 transmits, to the probe 101, an instruction signal for generating ultrasound waves by driving the ultrasound transducers of the probe 101, and also receives the ultrasound reflected signals from the probe 101.
The B-mode image generation unit 104 generates B-mode images based on the ultrasound reflected signals received by the transmission and reception unit 103. Specifically, the B-mode image generation unit 104 performs filtering processing on the ultrasound reflected signals, followed by envelope detection. In addition, the B-mode image generation unit 104 performs logarithmic conversion and gain adjustment on the detected signals and outputs the converted and adjusted signals. It should be noted that B-mode is a method of displaying images by changing the brightness according to the intensity of the ultrasound reflected signals. A B-mode image is a cross-sectional image obtained by converting the intensity of the ultrasound reflected signals into brightness while the ultrasound wave transmission and reception directions are changed sequentially along the scanning direction of the probe.
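The signal chain described above (filtering, envelope detection, logarithmic conversion, and gain adjustment) can be sketched as follows. This is a minimal illustration using a band-pass filter and a Hilbert-transform envelope detector; the filter order, band edges, and display range are assumptions and are not the actual characteristics of the B-mode image generation unit 104.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert


def bmode_line(rf_line: np.ndarray, fs: float, f_lo: float, f_hi: float,
               gain_db: float = 0.0, dynamic_range_db: float = 60.0) -> np.ndarray:
    """Convert one RF scan line into B-mode brightness values in [0, 1].

    Steps: band-pass filtering -> envelope detection -> log conversion -> gain adjustment.
    """
    # Band-pass filtering around the transducer frequency band (assumed 4th-order Butterworth).
    b, a = butter(4, [f_lo / (fs / 2), f_hi / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, rf_line)

    # Envelope detection via the analytic signal.
    envelope = np.abs(hilbert(filtered))

    # Logarithmic conversion and gain adjustment, mapped to a display brightness range.
    log_env = 20.0 * np.log10(envelope / (envelope.max() + 1e-12) + 1e-12) + gain_db
    return np.clip((log_env + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)
```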
The three-dimensional data generation unit 105 generates three-dimensional data representing an object, which is a region in the body of the subject, based on reflected waves reflecting back from the body of the subject after the ultrasound waves have been transmitted towards the body of the subject. Specifically, the three-dimensional data generation unit 105 generates three-dimensional data based on the plural B-mode image data generated by the B-mode image generation unit 104. To be more specific, the three-dimensional data generation unit 105 generates three-dimensional data by resampling the pixel values of the B-mode images into three-dimensional coordinate positions. The three-dimensional data generation unit 105 thus reconstitutes the B-mode image data into data that represents an object having a three-dimensional volume, although the details may differ depending on the method used for changing the ultrasound wave transmission and reception directions.
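A minimal sketch of the resampling step is given below. It assumes, purely for illustration, a probe whose vibration tilts successive B-mode frames about the scanning axis by known angles and uses nearest-neighbour accumulation; the frame geometry, array shapes, and function name are hypothetical and would differ for a two-dimensional array probe or a parallel-moving probe.

```python
import numpy as np


def resample_to_volume(frames: np.ndarray, tilt_angles_rad: np.ndarray,
                       grid_shape: tuple) -> np.ndarray:
    """Nearest-neighbour resampling of tilted B-mode frames into a voxel grid.

    frames:          (n_frames, depth, width) B-mode pixel values
    tilt_angles_rad: tilt of each frame about the probe's scanning (x) axis
    """
    volume = np.zeros(grid_shape, dtype=np.float64)
    counts = np.zeros(grid_shape, dtype=np.int64)
    n_frames, depth, width = frames.shape
    cz, cy, cx = (s // 2 for s in grid_shape)

    for f in range(n_frames):
        ang = tilt_angles_rad[f]
        for d in range(depth):        # distance from the transducer face
            for w in range(width):    # position along the scanning direction
                # Pixel (d, w) of a frame tilted by ang about the scanning axis.
                iy = int(round(d * np.cos(ang))) + cy
                iz = int(round(d * np.sin(ang))) + cz
                ix = int(round(w - width / 2)) + cx
                if 0 <= iz < grid_shape[0] and 0 <= iy < grid_shape[1] and 0 <= ix < grid_shape[2]:
                    volume[iz, iy, ix] += frames[f, d, w]
                    counts[iz, iy, ix] += 1

    # Average overlapping contributions; untouched voxels remain zero.
    return volume / np.maximum(counts, 1)
```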
The measurement image selection unit 106a selects, based on the intensity of the reflected waves, one of the two-dimensional cross-sections that compose the three-dimensional data, as a measurement reference image used for measuring a length of the region in the body of the subject. The measurement image selection unit 106a includes the hyperechoic region extraction unit 106, the cut plane obtainment unit 107, and the measurement reference image selection unit 108, as has already been mentioned above. These processing units are described in more detail below.
The hyperechoic region extraction unit 106 extracts, from the three-dimensional data, a hyperechoic region, which is a region corresponding to the ultrasound reflected signals having a reflection intensity greater than a threshold value. Specifically, the hyperechoic region extraction unit 106 extracts only the data that represents such a hyperechoic region from the three-dimensional data generated by the three-dimensional data generation unit 105. Here, a hyperechoic region is a region in which the reflection is stronger than that of the neighboring regions, whereas a hypoechoic region is a region in which the reflection is weaker than that of the neighboring regions. Thus, with an appropriate threshold value set, the hyperechoic region extraction unit 106 can extract only the data that represents the hyperechoic region by comparing each three-dimensional data value with the threshold value. In this case, since the subject is a fetus, bone regions are mainly extracted as the hyperechoic region.
It should be noted that, in order to prevent the extraction result from being affected by data conditions such as gain variation, it is desirable to first obtain the threshold value using a discriminant analysis method and then perform the binarization by comparison with that threshold value.
In this manner, the hyperechoic region extraction unit 106 extracts the three-dimensional features of the hyperechoic region (mainly bone region) as a result of extracting, from the three-dimensional data, the data that represents the hyperechoic region.
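A minimal sketch of this extraction is given below. It assumes that the discriminant analysis method mentioned above is Otsu's method (a common choice, although the description does not name a specific method) and that the three-dimensional data is available as a voxel array of reflection intensities.

```python
import numpy as np


def otsu_threshold(values: np.ndarray, n_bins: int = 256) -> float:
    """Threshold by discriminant analysis: maximise the between-class variance."""
    hist, edges = np.histogram(values, bins=n_bins)
    centers = (edges[:-1] + edges[1:]) / 2
    total = hist.sum()
    sum_all = float((hist * centers).sum())
    w0, sum0 = 0.0, 0.0
    best_t, best_var = centers[0], -1.0
    for i in range(n_bins - 1):
        w0 += hist[i]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += hist[i] * centers[i]
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        between = w0 * w1 * (mu0 - mu1) ** 2
        if between > best_var:
            best_var, best_t = between, centers[i]
    return best_t


def extract_hyperechoic(volume: np.ndarray) -> np.ndarray:
    """Binary mask of the hyperechoic (mainly bone) region in the three-dimensional data."""
    return volume > otsu_threshold(volume.ravel())
```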
The cut plane obtainment unit 107 obtains two-dimensional images which compose the three-dimensional data, by cutting the object represented by the three-dimensional data, based on the three-dimensional features of the extracted hyperechoic region. Specifically, the cut plane obtainment unit 107 obtains two-dimensional images (cut planes) by cutting, at a plane, the object represented by the three-dimensional data generated by the three-dimensional data generation unit 105, based on the three-dimensional features of the hyperechoic region extracted by the hyperechoic region extraction unit 106.
More specifically, the cut plane obtainment unit 107 first determines, based on the three-dimensional features of the hyperechoic region extracted by the hyperechoic region extraction unit 106, the orientation of a cut plane, that is, a plane at which the object represented by the three-dimensional data is cut, and then determines a cutting region, which is the region of the object represented by the three-dimensional data that is to be cut. In other words, the cut plane obtainment unit 107 compares (matches) the three-dimensional data generated by the three-dimensional data generation unit 105 with plural previously-prepared template data which respectively represent the three-dimensional features of the respective specific regions. In the case where the three-dimensional data matches one of the template data, the cut plane obtainment unit 107 determines the three-dimensional region (the object represented by the three-dimensional data) which corresponds to the template data to be the cutting region, and also determines the orientation of the cut plane (the orientation of the surface normal of the cut plane) based on the template data. Then, the cut plane obtainment unit 107 obtains cut planes in the determined cutting region using the determined orientation. In other words, the cut plane obtainment unit 107 obtains the cut planes (two-dimensional images) whose surface normal has the determined orientation.
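The comparison (matching) against template data can be sketched, for instance, with a normalized correlation score, as shown below. The template representation (a binary mask of the same shape as the extracted hyperechoic region, stored together with a cut-plane normal) is an assumption made for illustration and is not necessarily the template format used by the apparatus.

```python
import numpy as np


def match_score(region_mask: np.ndarray, template_mask: np.ndarray) -> float:
    """Normalized correlation between two binary 3-D masks of the same shape."""
    a = region_mask.astype(np.float64) - region_mask.mean()
    b = template_mask.astype(np.float64) - template_mask.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0


def select_template(hyperechoic_mask: np.ndarray, templates: dict) -> tuple:
    """Pick the best-matching template; each template carries a cut-plane normal.

    templates: {"head": (mask, normal), "abdomen": (mask, normal), "thigh": (mask, normal)}
    Returns (region_name, cut_plane_normal, score).
    """
    best = ("", None, -1.0)
    for name, (mask, normal) in templates.items():
        score = match_score(hyperechoic_mask, mask)
        if score > best[2]:
            best = (name, normal, score)
    return best
```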
Here, it is assumed that the cut plane obtainment unit 107 compares (matches) the three-dimensional data generated by the three-dimensional data generation unit 105 with the respective previously-prepared template data, and the three-dimensional data matches most closely the template data representing the head of a fetus. In such a case, the cut plane obtainment unit 107 determines an area that longitudinally traverses the septum pellucidum to be the cutting region, and determines a plane that is perpendicular to the data representing the septum pellucidum as the orientation of the cut plane. Specifically, in the case where the three-dimensional data matches most closely the template data representing the head of a fetus, the cut plane obtainment unit 107 first extracts the median plane of the skull (dura mater) based on the three-dimensional features of the hyperechoic region, and then extracts the septum pellucidum (a hypoechoic region) that is longitudinally traversed by the extracted median plane. Then, the cut plane obtainment unit 107 determines the plane that is perpendicular to the median plane of the skull (dura mater) as the orientation of the cut plane, and determines the area that longitudinally traverses the septum pellucidum (hypoechoic region) to be the cutting region. In this way, the cut plane obtainment unit 107 obtains the cut plane of the head of a fetus based on the bone and the dura mater, which are hyperechoic regions.
Here, it is assumed that the cut plane obtainment unit 107 compares (matches) the three-dimensional data generated by the three-dimensional data generation unit 105 with the respective previously-prepared template data, and the three-dimensional data matches most closely the template data representing the abdomen of a fetus. In such a case, the cut plane obtainment unit 107 determines, as the orientation of the cut plane, a plane that is perpendicular to the data representing the spine, and determines an area that traverses only the spine to be the cutting region. Specifically, in the case where the three-dimensional data matches most closely the template data representing the abdomen of a fetus, the cut plane obtainment unit 107 first extracts a columnar region (hyperechoic region), which is the spine, based on the three-dimensional features of the hyperechoic region. Then, the cut plane obtainment unit 107 determines the plane that is perpendicular to the extracted columnar region (hyperechoic region) as the orientation of the cut plane, and determines the area that traverses only the spine to be the cutting region. In this way, the cut plane obtainment unit 107 obtains the cut plane of the abdomen of a fetus based on the bone, which is a hyperechoic region.
Here, it is assumed that the cut plane obtainment unit 107 compares (matches) the three-dimensional data generated by the three-dimensional data generation unit 105 with the respective previously-prepared template data, and the three-dimensional data matches most closely the template data representing the thigh of a fetus. In such a case, the cut plane obtainment unit 107 determines, as the orientation of the cut plane, a plane that traverses the data representing the thighbone, and determines, as the cutting region, the range swept from 0 to 180 degrees around the data representing the thighbone located at its center. Specifically, in the case where the three-dimensional data matches most closely the template data representing the thigh of a fetus, the cut plane obtainment unit 107 first extracts a bar-shaped region (hyperechoic region), which is the thighbone, based on the three-dimensional features of the hyperechoic region. Then, the cut plane obtainment unit 107 determines, as the orientation of the cut plane, the plane that traverses the extracted bar-shaped region (hyperechoic region), and determines, as the cutting region, the range obtained by rotating that plane from 0 to 180 degrees about the bar-shaped region (hyperechoic region). In this way, the cut plane obtainment unit 107 obtains the cut plane of the thigh of a fetus based on the bone, which is a hyperechoic region.
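For the thigh, one possible realization of the processing described above is to estimate the axis of the bar-shaped hyperechoic region by principal component analysis of its voxel coordinates and then to generate candidate cut planes, each containing that axis, rotated from 0 to 180 degrees about it. The sketch below follows this approach; the 10-degree step is an arbitrary illustrative choice.

```python
import numpy as np


def bone_axis(hyperechoic_mask: np.ndarray) -> tuple:
    """Centroid and unit principal axis of the bar-shaped hyperechoic region."""
    coords = np.argwhere(hyperechoic_mask).astype(np.float64)
    centroid = coords.mean(axis=0)
    # The first principal component of the voxel coordinates approximates the bone axis.
    _, _, vt = np.linalg.svd(coords - centroid, full_matrices=False)
    return centroid, vt[0]


def candidate_plane_normals(axis: np.ndarray, step_deg: float = 10.0) -> list:
    """Normals of planes that contain the bone axis, swept from 0 to 180 degrees about it."""
    # Build two unit vectors perpendicular to the axis; every normal lies in their span,
    # so every corresponding plane contains the bone axis.
    ref = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(ref, axis)) > 0.9:
        ref = np.array([0.0, 1.0, 0.0])
    n0 = np.cross(axis, ref)
    n0 /= np.linalg.norm(n0)
    n1 = np.cross(axis, n0)
    return [np.cos(np.deg2rad(t)) * n0 + np.sin(np.deg2rad(t)) * n1
            for t in np.arange(0.0, 180.0, step_deg)]
```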
As has been described above, the cut plane obtainment unit 107 determines the cutting region and the orientation, and obtains plural cut planes in the determined cutting region using the determined orientation. In other words, the cut plane obtainment unit 107 determines the orientation of the two-dimensional images in which the object represented by the three-dimensional data is cut, based on the three-dimensional form and location of the extracted hyperechoic region, and thus obtains two-dimensional images in the determined orientation.
The measurement reference image selection unit 108 selects one of the two-dimensional images to be a measurement reference image used for measuring a length of a region in the body of the subject. Specifically, the measurement reference image selection unit 108 selects one of the two-dimensional images to be such a measurement reference image by evaluating the degree of similarity between the spatial distribution feature of brightness information represented by each of the two-dimensional images and a spatial distribution feature of brightness information that characterizes a measurement reference image. That is, the measurement reference image selection unit 108 evaluates the cross-sectional images obtained by the cut plane obtainment unit 107, and selects the image that is the most appropriate for measurement as the measurement reference image. It is desirable to use the brightness spatial distribution for this evaluation.
To be more specific, the measurement reference image selection unit 108 learns beforehand a brightness spatial distribution feature that statistically characterizes a measurement reference image, and selects, as the measurement reference image, the cross-sectional image whose brightness spatial distribution feature is the closest, among the plural cross-sectional images, to the previously-learned brightness spatial distribution feature. In the present embodiment, the degree of similarity with respect to the measurement reference image is measured by comparing the learned result, prepared based on Haar-like features, with the feature values calculated for the respective cut planes obtained by the cut plane obtainment unit 107.
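A minimal sketch of this evaluation is given below. It computes two simple Haar-like contrast features from an integral image and scores a candidate cut plane by its distance to a previously learned feature vector; the particular rectangles and the learned vector are placeholders, and an actual implementation would typically use many more features and a trained classifier.

```python
import numpy as np


def integral_image(img: np.ndarray) -> np.ndarray:
    """Summed-area table padded with a zero first row and column."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return ii


def rect_sum(ii: np.ndarray, r0: int, c0: int, r1: int, c1: int) -> float:
    """Sum of img[r0:r1, c0:c1] computed from the integral image."""
    return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]


def haar_features(img: np.ndarray) -> np.ndarray:
    """Two Haar-like features: left-right and top-bottom brightness contrast."""
    ii = integral_image(img)
    h, w = img.shape
    left_right = rect_sum(ii, 0, 0, h, w // 2) - rect_sum(ii, 0, w // 2, h, w)
    top_bottom = rect_sum(ii, 0, 0, h // 2, w) - rect_sum(ii, h // 2, 0, h, w)
    return np.array([left_right, top_bottom]) / (h * w)


def similarity_to_reference(cut_plane: np.ndarray, learned_feature: np.ndarray) -> float:
    """Higher score means the cut plane is closer to the learned measurement reference image."""
    return -float(np.linalg.norm(haar_features(cut_plane) - learned_feature))
```

Among the candidate cut planes obtained by the cut plane obtainment unit 107, the plane with the highest such score would then be selected as the measurement reference image.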
The following describes the method for determining the measurement reference images for the specific regions of a fetus, namely the head, abdomen, and thigh, which are used in the estimated fetal weight calculation formula.
In order to accurately measure the BPD (biparietal diameter) of a fetus, it is preferable to measure it using a cross-section of the skull, in which the dura mater and the septum pellucidum are located as shown in
Thus, the measurement reference image selection unit 108 evaluates the cross-sectional images obtained by the cut plane obtainment unit 107, and selects, as a measurement reference image, the measurement cross-section whose brightness spatial distribution feature best corresponds to the feature shown in
In this manner, the measurement reference image selection unit 108 selects a measurement reference image based on the bone and the dura mater which are hyperechoic regions.
Note here that the measurement reference image may be a cross-sectional image in which the depicted median line further traverses the cisterna magna, as shown in
In order to accurately measure the AC (abdominal circumference) of a fetus, it is preferable to measure it using a cross-section of the abdomen, in which the spine, the umbilical vein, and the gastric vesicle are located as shown in
Thus, the measurement reference image selection unit 108 evaluates the cross-sectional images obtained by the cut plane obtainment unit 107, and selects, as a measurement reference image, the measurement cross-section whose brightness spatial distribution feature best corresponds to the feature shown in
In this manner, the measurement reference image selection unit 108 selects a measurement reference image based on the bone, which is a hyperechoic region, as well as the blood vessels, the stomach, and other structures, which are hypoechoic regions.
It should be noted that although it is desirable to select a cut plane based on the spine, which can be extracted as a hyperechoic region, a cut plane may instead be selected based on a cross-section of the abdominal aorta, which is extracted as a hypoechoic region.
In order to accurately measure the FL (femoral length) of a fetus, it is preferable to measure the length of the thighbone as shown in
Thus, the measurement reference image selection unit 108 evaluates the cross-sectional images obtained by the cut plane obtainment unit 107, and selects, as a measurement reference image, the measurement cross-section whose brightness spatial distribution feature best corresponds to the feature shown in
In this manner, the measurement reference image selection unit 108 selects a measurement reference image based on the bone, which is a hyperechoic region. As in the other cases, the measurement reference image is determined by evaluating cut planes based on three-dimensional data, not a two-dimensional image (B-mode image). Therefore, it is possible to select, as a measurement reference image, the cross-section with which the length can be accurately measured, as shown in
The data storage unit 109 stores the B-mode images generated by the B-mode image generation unit 104, the three-dimensional data generated by the three-dimensional data generation unit 105, the hyperechoic region data extracted by the hyperechoic region extraction unit 106, and the measurement reference images selected by the measurement reference image selection unit 108.
The operator's instructions are inputted into the operation receiving unit 110. Specifically, the operation receiving unit 110 is configured of buttons, a keyboard, a mouse, and the like, and the examiner's instructions are inputted using these.
The display unit 111 is configured of a display device such as an LCD, and displays B-mode images, an object represented by three-dimensional data, and cut planes.
The measurement and calculation unit 112 measures the lengths of the respective regions in the body of the subject using the respectively selected measurement reference images, and calculates an estimated weight of the subject using the lengths that have been measured. Specifically, the measurement and calculation unit 112 measures the lengths of the respective regions in the body of the subject using the measurement reference images respectively selected by the measurement reference image selection unit 108. The measurement and calculation unit 112 then calculates an estimated weight of the subject based on the lengths of the respective regions in the body of the subject which have thus been measured.
The output unit 113 outputs an estimated weight that has been calculated. Specifically, by outputting the estimated weight calculated by the measurement and calculation unit 112, the output unit 113 causes the display unit 111 to display the calculated estimated weight.
The ultrasound diagnostic apparatus 1 according to Embodiment 1 is configured as has been described above.
Next, the measurement reference image selection process performed by the ultrasound diagnostic apparatus 1 shall be described with reference to
First, the B-mode image generation unit 104 generates B-mode images (step S10).
Specifically, the transmission and reception unit 103 emits ultrasound waves into the body of the subject via the probe 101 and receives the reflected waves via the probe 101. Then, the B-mode image generation unit 104 generates a B-mode image by performing data processing on the ultrasound reflected signals received by the transmission and reception unit 103, and stores the generated B-mode image into the data storage unit 109. By performing this process while changing the ultrasound wave transmission and reception directions, B-mode images are generated and stored into the data storage unit 109. It should be noted that, among the methods of changing the ultrasound wave transmission and reception directions, some use the vibration mechanism of the probe 101, others drive the ultrasound transducers of a two-dimensional array probe, and still others use a mechanism that moves the probe 101 in parallel at a constant speed, as has already been mentioned above.
Next, the three-dimensional data generation unit 105 generates three-dimensional data based on the B-mode images (step S20). Specifically, the three-dimensional data generation unit 105 generates three-dimensional data by performing resampling of the pixel values of the B-mode images into three-dimensional coordinate positions. The three-dimensional data generation unit 105 thus reconstitutes the B-mode image data into data representing an object that has a three-dimensional volume, although the details may differ depending on the method of changing the ultrasound wave transmission and reception directions.
Then, the hyperechoic region extraction unit 106 extracts a hyperechoic region from the three-dimensional data generated by the three-dimensional data generation unit 105. As a result, the hyperechoic region extraction unit 106 extracts three-dimensional features of the hyperechoic region from the three-dimensional data (step S30).
Then, the cut plane obtainment unit 107 obtains cut planes based on the three-dimensional features of the hyperechoic region (step S40). Specifically, the cut plane obtainment unit 107 compares (matches) the three-dimensional data generated by the three-dimensional data generation unit 105 and each previously-prepared template data which represents the three-dimensional features of the respective specific regions. In the case where the three-dimensional data matches (the degree of similarity is high) one of the template data, the cut plane obtainment unit 107 determines, as the cutting region, the region represented by the three-dimensional data (the object indicated by the three-dimensional data) which corresponds to the template data, and also determines the orientation of a cut plane (the normal orientation of the cut plane) based on the template data. The cut plane obtainment unit 107 then obtains cut planes (two-dimensional images) in the determined cutting region using the determined orientation.
Next, the measurement reference image selection unit 108 evaluates the cut planes obtained by the cut plane obtainment unit 107 (step S50). After having evaluated all the cut planes obtained by the cut plane obtainment unit 107 (step S60), the measurement reference image selection unit 108 then selects, as a measurement reference image, the cut plane that has received the highest evaluation (step S70).
Specifically, the measurement reference image selection unit 108 measures the degree of similarity with respect to the measurement reference image by comparing the previously-learned brightness spatial distribution feature, which statistically characterizes a measurement reference image, with the brightness spatial distribution feature of each of the cut planes obtained by the cut plane obtainment unit 107. The measurement reference image selection unit 108 then selects, as the measurement reference image, the cross-sectional image whose brightness spatial distribution feature is the closest to the previously-learned feature, among the cut planes obtained by the cut plane obtainment unit 107.
It should be noted that, in the case where the degree of similarity between the features of the cut planes obtained by the cut plane obtainment unit 107 and the feature of the measurement reference image is low, the measurement reference image selection unit 108 returns to step S40. Then, the cut plane obtainment unit 107 obtains plural cut planes again, and the process proceeds to step S50.
Lastly, the measurement reference image selection unit 108 stores the selected measurement reference image into the data storage unit 109 (step S80).
Thus, the ultrasound diagnostic apparatus 1 performs the measurement reference image selection process. Specifically, the ultrasound diagnostic apparatus 1 accurately determines a cross-section that is appropriate for measurement by narrowing down the candidate cut planes based on the three-dimensional features of the bone region, which is a hyperechoic region, so as to obtain an appropriate cut plane.
It should be noted that, in step S30, the examiner may judge the region in the body of the subject based on the three-dimensional features of the hyperechoic region (the three-dimensional form and location information of the hyperechoic region) extracted by the hyperechoic region extraction unit 106. In such a case, the examiner may notify the cut plane obtainment unit 107, via the operation receiving unit 110, that the three-dimensional data generated by the three-dimensional data generation unit 105 is data representing a specific region such as a thigh, for instance, and may thus narrow down in advance the template data which represents that specific region and is to be compared (matched) with the three-dimensional data generated by the three-dimensional data generation unit 105. In this way, it is possible to improve the efficiency of the process performed by the cut plane obtainment unit 107 in step S40. In addition, it is also possible to improve the efficiency of the evaluation performed by the measurement reference image selection unit 108 in step S50, and thus to reduce the risk of false evaluation.
Thus, the ultrasound diagnostic apparatus 1 performs the measurement reference image selection process. This enables even those who are not skilled in operating an ultrasound diagnostic apparatus to reliably obtain an appropriate measurement reference image, and to accurately measure the length of a specific region based on such a measurement reference image.
The following describes the whole processing performed by the ultrasound diagnostic apparatus 1, that is, the processing from the measurement reference image selection process up to the calculation of an estimated weight of the subject by the ultrasound diagnostic apparatus 1.
The ultrasound diagnostic apparatus 1 first generates three-dimensional data for a region in the body of the subject based on the reflected waves of the ultrasound waves which have been transmitted towards the body of the subject and reflected back from the body of the subject (S110). Specifically, the ultrasound diagnostic apparatus 1 performs the processing in steps S10 and S20 described in
Then, the ultrasound diagnostic apparatus 1 selects, based on the intensity of the reflected waves from the body of the subject, one of the two-dimensional images that compose the three-dimensional data, as a measurement reference image to be used for measuring a length of the region in the body of the subject (S130). Specifically, the ultrasound diagnostic apparatus 1 performs the processing from steps S30 to S80 described in
It should be noted that, in steps S110 and S130, more precisely, the three-dimensional data is generated and the measurement reference image is selected for each of the respective regions in the body of the subject, namely, the head, abdomen, and thigh of a fetus.
As shown in
Next, the ultrasound diagnostic apparatus 1 measures the lengths of the respective regions in the body of the subject using the measurement reference images respectively selected in S130, and calculates an estimated weight of the subject based on the measured lengths (S150).
Specifically, the measurement and calculation unit 112 measures the lengths of the respective regions in the body of the subject using the respectively selected measurement reference images, and calculates an estimated weight of the subject using the measured lengths.
Then, the ultrasound diagnostic apparatus 1 outputs the calculated estimated weight (S170).
Thus, the ultrasound diagnostic apparatus 1 calculates an estimated weight of the subject.
According to the present embodiment, it is possible to achieve the ultrasound diagnostic apparatus capable of reducing the dependence on the examiner and calculating an estimated fetal weight with high accuracy and easy operation.
The ultrasound diagnostic apparatus 2 shown in
The subject's body region specification unit 212 specifies the region, in the body of the subject, which is the object represented by the three-dimensional data. Specifically, the subject's body region specification unit 212 judges which region the object represented by the three-dimensional data generated by the three-dimensional data generation unit 105 corresponds to, for instance, a head, an abdomen, or a thigh. This judgment is based on the three-dimensional features of the hyperechoic region (the three-dimensional form and location information of the hyperechoic region) extracted by the hyperechoic region extraction unit 106. The subject's body region specification unit 212 thus specifies the region in the body of the subject (three-dimensional data) which is being observed.
For example, the subject's body region specification unit 212 compares the three-dimensional data generated by the three-dimensional data generation unit 105 and the template data (e.g.,
The ultrasound diagnostic apparatus 2 according to Embodiment 2 is configured as has been described above.
The difference between
In step S35, the subject's body region specification unit 212 judges which region the object represented by the three-dimensional data generated by the three-dimensional data generation unit 105 corresponds to, for instance, a head, an abdomen, or a thigh. This judgment is based on the three-dimensional features of the hyperechoic region (the three-dimensional form and location information of the hyperechoic region) extracted by the hyperechoic region extraction unit 106. The subject's body region specification unit 212 thus specifies the region in the body of the subject (three-dimensional data) which is being observed.
Next, the ultrasound diagnostic apparatus 2 proceeds to step S40, and the cut plane obtainment unit 107 obtains two-dimensional images based on the information indicating the three-dimensional form and location of the region specified by the subject's body region specification unit 212 and the three-dimensional form and location of the extracted hyperechoic region.
For example, in the case where the subject's body region specification unit 212 specifies that the region of the fetus, which is the object represented by the three-dimensional data, is a head, the cut plane obtainment unit 107 extracts a region that corresponds to the septum pellucidum based on the three-dimensional features of the extracted hyperechoic region, determines, based on the extracted region, the orientation of a two-dimensional image in which the object represented by the three-dimensional data is cut, and obtains two-dimensional images in the determined orientation.
In addition, in the case where the subject's body region specification unit 212 specifies that the region of the fetus, which is the object represented by the three-dimensional data, is an abdomen, the cut plane obtainment unit 107 extracts a region that corresponds to the spine based on the three-dimensional features of the extracted hyperechoic region, determines, based on the extracted region, the orientation of a two-dimensional image in which the object represented by the three-dimensional data is cut, and obtains two-dimensional images in the determined orientation.
Furthermore, in the case where the subject's body region specification unit 212 specifies that the region of the fetus, which is the object represented by the three-dimensional data, is a thigh, the cut plane obtainment unit 107 extracts a region that corresponds to the thighbone based on the three-dimensional features of the extracted hyperechoic region, determines, based on the extracted region, the orientation of a two-dimensional image in which the object represented by the three-dimensional data is cut, and obtains two-dimensional images in the determined orientation.
Thus, the ultrasound diagnostic apparatus 2 performs the measurement reference image selection process.
As described above, the ultrasound diagnostic apparatus 2 according to the present embodiment thus performs efficient evaluation and reduces the risk of false evaluation. With this, the ultrasound diagnostic apparatus 2 can further select, with high accuracy, a cross-section (measurement reference image) that is appropriate for measurement.
It should be noted that, in the present embodiment, the subject's body region specification unit 212 is configured to judge based on the features of a hyperechoic region, however, the examiner may give an instruction via the operation receiving unit 110. In other words, the subject's body region specification unit 212 may specify a region, in the body of the subject, which is the object represented by the three-dimensional data, according to the examiner's (operator's) instruction received by the operation receiving unit 110. In such case, although such examiner's instruction is a step added to the process, a region in the body of the subject can be precisely determined, which enables more stable obtainment of the measurement reference image that is appropriate for measurement.
According to one or more exemplary embodiments of the present disclosure, it is possible to realize the ultrasound diagnostic apparatus capable of reducing the dependence on the examiner and calculating an estimated fetal weight with high accuracy and easy operation.
It should be noted that although it has been described in the embodiments that the probe 101 and the ultrasound diagnostic apparatus main body 100 are separately configured, the present inventive concept is not limited to these embodiments. The probe 101 may include part or all of the processing units included in the ultrasound diagnostic apparatus main body 100.
In the above description, the ultrasound diagnostic apparatus main body 100 includes the control unit 102, the transmission and reception unit 103, the B-mode image generation unit 104, the three-dimensional data generation unit 105, the hyperechoic region extraction unit 106, the measurement image selection unit 106a, the data storage unit 109, the measurement and calculation unit 112, and the output unit 113. However, the present inventive concept is not limited to such configuration. As shown in
With the configuration of the ultrasound diagnostic apparatus 1 which includes at least such minimum configuration 100a, it is possible to realize the ultrasound diagnostic apparatus capable of reducing the dependence on the examiner and calculating an estimated fetal weight with high accuracy and easy operation.
Furthermore, in the above description, the measurement and calculation unit 112 performs measurements using the measurement reference images determined by the measurement reference image selection unit 108, and calculates an estimated weight of a fetus being the subject, based on the measured lengths of the regions in the body of the subject. However, the present inventive concept is not limited to this. The ultrasound diagnostic apparatus main body 100 may include neither the measurement and calculation unit 112 nor the output unit 113, and the examiner may calculate an estimated fetal weight based on the lengths of the regions in the body of the subject, which have been measured using the measurement reference images determined by the measurement reference image selection unit 108.
Although the ultrasound diagnostic apparatuses according to the embodiments of the present disclosure have been described up to this point, the present inventive concept is not limited to these embodiments. Various modifications to the respective embodiments that may be conceived by a person of ordinary skill in the art, as well as embodiments composed by combining the constituent elements of different embodiments, are intended to be included within the present inventive concept, as long as they do not depart from the essence of the present inventive concept.
For example, an exemplary embodiment of the present disclosure may be the method as described herein, or a computer program for achieving such method by a computer, or a digital signal composed of such computer program.
Furthermore, an exemplary embodiment of the present disclosure may be the aforementioned computer program or digital signal which is recorded in a computer-readable recording medium, such as a flexible disc, a hard disc, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD (Blu-Ray Disc), a semiconductor memory or the like. An exemplary embodiment of the present disclosure may also be the digital signal recorded in such recording medium.
Furthermore, according to an exemplary embodiment of the present disclosure, the aforementioned computer program or digital signal may be transferred via an electric communication line, a wireless or wired communication line, or a network as represented by the Internet, a data broadcasting, etc.
An exemplary embodiment of the present disclosure may be a computer system comprised of a microprocessor and a memory, in which the memory stores the aforementioned computer program and the microprocessor is operated according to such computer program.
The present inventive concept may also be implemented by another independent computer system by transferring the aforementioned program or digital signal recorded in the aforementioned recording medium, or by transferring such a program or digital signal via the aforementioned network.
Although only some exemplary embodiments have been described in detail above, those skilled in the art will readily appreciate that various modifications may be made in these exemplary embodiments without materially departing from the principles and spirit of the inventive concept, the scope of which is defined in the appended Claims and their equivalents.
One or more exemplary embodiments of the present disclosure are applicable to ultrasound diagnostic apparatuses, and can be applied, in particular, to an ultrasound diagnostic apparatus capable of easily and properly obtaining measurement reference images for the thorough examination of the growth of a fetus.
Priority application: 2010-222568 (Sep. 2010, JP, national).
This is a continuation application of PCT Patent Application No. PCT/JP2011/005365 filed on Sep. 26, 2011, designating the United States of America, which is based on and claims priority of Japanese Patent Application No. 2010-222568 filed on Sep. 30, 2010. The entire disclosures of the above-identified applications, including the specifications, drawings and claims are incorporated herein by reference in their entirety.
Related applications: parent PCT/JP2011/005365 (Sep. 2011, US); child 13479905 (US).