This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2014-091878, filed on Apr. 25, 2014 and Japanese Patent Application No. 2015-063071, filed on Mar. 25, 2015; the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to an ultrasonic diagnostic apparatus, an image processing apparatus, and an image processing method.
Conventionally, as a method of inputting a 3D boundary of an object included in a 3D medical image, there is known a technique of tracing the boundary of the object on a plurality of cross-sectional images and generating the 3D boundary by interpolating between the cross-sections.
For example, in the case of inputting a 3D boundary of the myocardium (the muscle that makes up the heart) of the left ventricle of the heart included in an ultrasonic medical image, there is known a technique of tracing the myocardial boundary on a plurality of short-axis cross-sections of the left ventricle and generating the 3D myocardial boundary by interpolating between the cross-sections.
For example, if the target of analysis is only the main portion (an atrium or a ventricle) and the inflow portion through which blood flows into the main portion (for example, the mitral valve in the case of the left ventricle, and the tricuspid valve in the case of the right ventricle), the position of an ultrasonic probe is set in accordance with the axis passing through the main portion and the inflow portion, and both portions are displayed relatively clearly using only short-axis cross-sections, as in the conventional technique.
However, in the case where the target of analysis also includes an outflow portion through which blood flows out of the main portion (for example, the pulmonary valve in the case of the right ventricle), setting the position of the ultrasonic probe in accordance with the axis passing through the main portion and one of the inflow portion and the outflow portion may sufficiently secure the visibility of that portion but not the visibility of the other. As a result, the time taken for the analysis and the diagnosis increases. In addition, it becomes difficult to appropriately set the myocardial boundary, and thus the accuracy of the analysis and the diagnosis may not be sufficiently secured.
According to an embodiment, an ultrasonic diagnostic apparatus includes a first acquirer, a first generator, a second generator, and a display controller. The first acquirer acquires a 3D ultrasonic image including a heart for one or more phases. The first generator generates a first cross-sectional image showing a cross-section of the 3D ultrasonic image which includes a first axis passing through a second portion in the 3D ultrasonic image and which includes a third portion. The second portion is a portion by which one of blood inflow into a first portion which is an atrium or a ventricle and blood outflow from the first portion is performed. The third portion is a portion by which another of the blood inflow into the first portion and the blood outflow from the first portion is performed. The second generator generates a second cross-sectional image showing a cross-section of the 3D ultrasonic image which includes a second axis passing through the third portion in the first cross-sectional image and which intersects with the first cross-sectional image. The display controller performs control of displaying the first cross-sectional image and the second cross-sectional image.
Hereinafter, various embodiments will be described in detail with reference to the appended drawings.
The ultrasonic probe 11 includes a plurality of piezoelectric vibrators. The plurality of piezoelectric vibrators generate ultrasonic waves based on drive signals supplied from a transmitter/receiver 101 provided in the apparatus main body 14 described later, receive reflected waves from the subject P, and convert them into electrical signals. The ultrasonic probe 11 also includes a matching layer provided for the piezoelectric vibrators, a backing member that prevents ultrasonic waves from propagating rearward from the piezoelectric vibrators, and the like.
When an ultrasonic wave is transmitted from the ultrasonic probe 11 to the subject P, the transmitted ultrasonic wave is sequentially reflected at discontinuities of acoustic impedance in the body tissues of the subject P, and the reflected wave signals are received by the plurality of piezoelectric vibrators provided in the ultrasonic probe 11. The amplitude of a received reflected wave signal depends on the difference in acoustic impedance at the discontinuity where the ultrasonic wave is reflected. When a transmitted ultrasonic pulse is reflected by a moving blood flow, the surface of the heart wall, or the like, the reflected wave signal undergoes a frequency shift that, due to the Doppler effect, depends on the velocity component of the moving object in the ultrasonic wave transmission direction.
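For reference, the magnitude of this frequency shift follows the standard Doppler relation (general background rather than anything specific to the embodiments), where $f_0$ is the transmission frequency, $v$ is the speed of the moving object, $\theta$ is the angle between the direction of motion and the ultrasonic beam, and $c$ is the speed of sound in tissue:

$$ f_d = \frac{2\,v\cos\theta}{c}\, f_0 $$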
In the first embodiment, a mechanical 4D probe, for example, is connected to the apparatus main body 14 as the ultrasonic probe 11 for 3D scanning of the subject P. The mechanical 4D probe performs 3D scanning by causing a plurality of piezoelectric vibrators arranged in a line to oscillate at a predetermined angle (angle of oscillation). Alternatively, a 2D array probe including a plurality of piezoelectric vibrators arranged in a matrix may be used as the ultrasonic probe 11 for 3D scanning.
The input device 12 is a device used by an operator (user) of the ultrasonic diagnostic apparatus 1 to input various instructions and various settings, and may be configured by a mouse, a keyboard and the like, for example. The monitor 13 is a display device for displaying various images, and may be configured by a liquid crystal panel display device, for example. The monitor 13 is capable of displaying a GUI (Graphical User Interface) for the operator of the ultrasonic diagnostic apparatus 1 to input various instructions and various settings by using the input device 12, and of displaying an ultrasonic image and the like generated by the apparatus main body 14.
The apparatus main body 14 is a device capable of generating a 3D ultrasonic image based on 3D reflected wave data received by the ultrasonic probe 11. In the following description, the 3D ultrasonic image may be referred to as “volume data”.
As illustrated in the drawings, the apparatus main body 14 includes the transmitter/receiver 101, a B-mode processor 102, a Doppler processor 103, an image generator 104, and an image processor 105.
In the case of performing 3D scanning of the subject P, the transmitter/receiver 101 causes a 3D ultrasonic beam to be transmitted from the ultrasonic probe 11. Then, the transmitter/receiver 101 generates 3D reflected wave data from a 3D reflected wave signal received from the ultrasonic probe 11.
The B-mode processor 102 receives the reflected wave data from the transmitter/receiver 101 and, by performing logarithmic amplification, envelope detection processing, and the like, generates data (B-mode data) in which the signal intensity is expressed as luminance. The B-mode processor 102 of the first embodiment generates 3D B-mode data from the 3D reflected wave data.
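As a rough illustration of this processing chain, the sketch below performs envelope detection and logarithmic compression on radio-frequency (RF) lines. It is a generic textbook pipeline under assumed parameter names, not the B-mode processor's actual implementation.

```python
import numpy as np
from scipy.signal import hilbert

def to_b_mode(rf_lines, dynamic_range_db=60.0):
    """Generic B-mode conversion: envelope detection, then log compression.

    rf_lines: real-valued RF samples, shape (n_lines, n_samples).
    """
    envelope = np.abs(hilbert(rf_lines, axis=-1))   # envelope detection
    envelope /= envelope.max() + 1e-12              # normalize to [0, 1]
    db = 20.0 * np.log10(envelope + 1e-12)          # logarithmic amplification
    db = np.clip(db, -dynamic_range_db, 0.0)        # limit dynamic range
    return ((db + dynamic_range_db) / dynamic_range_db * 255.0).astype(np.uint8)
```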
The Doppler processor 103 performs frequency analysis on velocity information in the reflected wave data received from the transmitter/receiver 101, extracts blood flow, tissue, and contrast agent echo components attributable to the Doppler effect, and generates data (Doppler data) in which moving-object information such as average velocity, variance, and power is extracted at multiple points. The Doppler processor 103 of the first embodiment generates 3D Doppler data from the 3D reflected wave data.
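One widely used way to obtain such moving-object information from an ensemble of echoes at each point is the lag-one autocorrelation (Kasai) estimator, sketched below under assumed inputs; the text does not specify the processor's internal algorithm.

```python
import numpy as np

def doppler_estimates(iq, prf, f0, c=1540.0):
    """Lag-one autocorrelation estimates per spatial point.

    iq: complex baseband ensemble, shape (..., n_pulses).
    prf: pulse repetition frequency [Hz]; f0: transmit frequency [Hz];
    c: assumed speed of sound in tissue [m/s].
    """
    r1 = np.sum(iq[..., 1:] * np.conj(iq[..., :-1]), axis=-1)  # lag-1 autocorrelation
    velocity = c * prf * np.angle(r1) / (4.0 * np.pi * f0)     # mean velocity
    power = np.mean(np.abs(iq) ** 2, axis=-1)                  # echo power
    return velocity, power
```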
The image generator 104 generates a 3D ultrasonic image from the B-mode data generated by the B-mode processor 102 or the Doppler data generated by the Doppler processor 103. Specifically, the image generator 104 generates 3D B-mode image data by performing coordinate transformation on the 3D B-mode data generated by the B-mode processor 102. Further, the image generator 104 generates 3D Doppler image data by performing coordinate transformation on the 3D Doppler data generated by the Doppler processor 103. That is, the image generator 104 generates “3D B-mode image data or 3D Doppler image data” as “3D ultrasonic image (volume data)”.
The image processor 105 performs image processing on the volume data generated by the image generator 104, and performs control of displaying an image subjected to the image processing on the monitor 13.
The first acquirer 110 acquires the volume data generated by the image generator 104. In the first embodiment, a case where the volume data to be acquired by the first acquirer 110 is a still image is described as an example, but this is not restrictive. In short, the first acquirer 110 may take any mode as long as it acquires a 3D ultrasonic image including the heart at one or more phases. In the present specification, "one phase" refers to any one time point (timing) in the periodic motion of the heart. In the first embodiment, the first acquirer 110 may acquire volume data at one phase corresponding to the end-diastole or the end-systole, for example.
The first setter 111 sets, in the volume data acquired by the first acquirer 110, a first axis passing through a second portion by which one of blood inflow into a first portion, which is an atrium or a ventricle, and blood outflow from the first portion is performed. In the first embodiment, description is given of a case where the first portion is the "right ventricle" and the second portion is the "tricuspid valve (inflow portion)" through which blood flows into the right ventricle, but this is not restrictive. Moreover, the second portion may be a tubular region, but is not limited to a tubular region. In the first embodiment, the first setter 111 sets the first axis according to an input (operation) of a user. Details are given below.
When the volume data is acquired by the first acquirer 110, a cross-sectional image that passes through an axis 200 of the volume data is generated and displayed on the monitor 13.
A user performs an operation of rotating the axis 200 set as the first axis by the first setter 111, and searches for a cross-sectional image showing the pulmonary valve by switching the cross-sectional images displayed on the monitor 13. Then, when a cross-sectional image showing the pulmonary valve is found, the user inputs an instruction for setting the current cross-sectional image as the first cross-sectional image. The first generator 112, having received this input, generates (sets) the current cross-sectional image as the first cross-sectional image. The first cross-sectional image represents a cross-section of a part of the heart.
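To make this rotate-and-search step concrete, the sketch below samples from the volume the cross-section that contains a fixed axis, at a given rotation angle about that axis. The function and parameter names (and the use of SciPy) are illustrative assumptions, not the apparatus's actual interface; sweeping theta and redisplaying the result reproduces the user's search for the cross-section showing the pulmonary valve.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def plane_through_axis(volume, origin, axis_dir, theta, size=256, spacing=1.0):
    """Sample the cross-section of `volume` containing the line
    (origin + t * axis_dir), rotated by angle `theta` about that line."""
    axis_dir = axis_dir / np.linalg.norm(axis_dir)
    # Build a unit vector perpendicular to the axis, rotated by theta.
    ref = np.array([1.0, 0.0, 0.0])
    if abs(axis_dir @ ref) > 0.9:                   # avoid near-parallel reference
        ref = np.array([0.0, 1.0, 0.0])
    u = np.cross(axis_dir, ref); u /= np.linalg.norm(u)
    v = np.cross(axis_dir, u)
    in_plane = np.cos(theta) * u + np.sin(theta) * v
    # Sample a size x size grid spanned by axis_dir and in_plane.
    ts = (np.arange(size) - size / 2) * spacing
    grid = origin + ts[:, None, None] * axis_dir + ts[None, :, None] * in_plane
    return map_coordinates(volume, grid.reshape(-1, 3).T, order=1).reshape(size, size)
```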
The second generator 114 generates a second cross-sectional image showing a cross-section (a cross-section of the volume data) which includes the second axis and which crosses the first cross-sectional image. For example, the second generator 114 may generate, as the second cross-sectional image, a cross-section of the volume data which includes the second axis set by the second setter 113 and which is orthogonal to the first cross-sectional image generated by the first generator 112. The second cross-sectional image represents a cross-section of another part of the heart.
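Given the helper above, the orthogonality condition in this example pins down the second cross-section directly: a plane is orthogonal to the first cross-section exactly when it contains the first plane's normal. A minimal sketch under the same illustrative names:

```python
import numpy as np

def second_plane_directions(axis1_dir, in_plane1, axis2_dir):
    """Spanning directions for a plane that contains the second axis and is
    orthogonal to the first cross-sectional plane (illustrative sketch)."""
    n1 = np.cross(axis1_dir, in_plane1)        # normal of the first plane
    n1 /= np.linalg.norm(n1)
    d2 = axis2_dir / np.linalg.norm(axis2_dir)
    # The plane spanned by d2 and n1 contains the second axis; because it
    # contains n1, its own normal is perpendicular to n1 -> orthogonal planes.
    return d2, n1
```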
In the case of capturing an image of the subject P with the position of the ultrasonic probe 11 set according to the axis that passes through the right ventricle and the tricuspid valve (inflow portion), the pulmonary valve (outflow portion) of the right ventricle is drawn unclearly due to the restriction of the acoustic window (an intercostal region, not overlapping the lungs, through which an ultrasonic wave can pass), and it is therefore difficult to visually check the myocardial boundary using only the short-axis cross-section, as with the conventional technique. As exemplified by an apical four-chamber view, an acoustic window by an apical approach is used to cover all of the left ventricle and the right ventricle. As in an apical two-chamber view or an apical long-axis view, which are also obtained by this approach, the left ventricle can be drawn as a 2D tomographic image. With the right ventricle, however, the inflow side and the outflow side cannot be drawn at the same time in a 2D tomographic image. Thus, to obtain a cross-section of the right ventricle in the manner of the first cross-sectional image, volume data has to be collected and reconstructed. In this arrangement, the aorta on the left side is located deeper in the body than the pulmonary artery on the right side, and the pulmonary artery (through whose blood an ultrasonic wave easily passes) lies between the aorta and the bones and lungs near the pulmonary valve; the aorta is therefore not easily affected by their side lobes. The pulmonary artery, on the other hand, is close to the body surface and thus close to the bones and the lungs; in the arrangement of the first cross-sectional image, it is easily affected by side lobes in the azimuth direction, and the image quality is reduced. Accordingly, it is often difficult to set the second axis using the first cross-sectional image.
Accordingly, in the first embodiment, the first cross-sectional image, showing a cross-section of the volume data which passes through the first axis along the center line of the tricuspid valve and which includes the pulmonary valve, and the second cross-sectional image, which passes through the second axis along the center line of the pulmonary valve shown in the first cross-sectional image and which crosses the first cross-sectional image, are generated and displayed on the monitor 13. For the second cross-sectional image, an arrangement is selected with respect to the outflow portion such that the cardiac muscle tissue on the anterior wall side of the left ventricle or the blood in the left heart chambers is present between the outflow portion and the lungs, or such that the outflow portion is located between the bones and the lungs without being in direct contact with them; the influence of side lobes caused by the lungs and bones is thereby relatively reduced, and the image quality is improved. By reconstructing and drawing a cross-section with such a highly visible arrangement from the volume data, the visibility of the pulmonary valve, which was conventionally difficult to see due to the restriction of the acoustic window, can be increased, and the user is enabled to set the myocardial boundary easily and with high accuracy. Because the setting of the myocardial boundary is facilitated, the time required for the analysis and the diagnosis can be reduced. Further, as the accuracy of setting the myocardial boundary increases, the accuracy of the analysis and the diagnosis also increases. Accordingly, with the first embodiment, both a reduction in the time required for the analysis and the diagnosis and an increase in the accuracy of the analysis and the diagnosis can be achieved.
Hardware Configuration and Program

The hardware configuration of the apparatus main body 14 in which the image processor 105 described above is mounted uses the hardware configuration of a computer device including a CPU (Central Processing Unit), a ROM, a RAM, a communication I/F device, and the like. The function of each unit (the transmitter/receiver 101, the B-mode processor 102, the Doppler processor 103, the image generator 104, and the image processor 105 (the first acquirer 110, the first setter 111, the first generator 112, the second setter 113, the second generator 114, and the display controller 115)) of the apparatus main body 14 described above is implemented by the CPU loading a program stored in the ROM into the RAM and executing it. Furthermore, at least a part of the functions of the units of the apparatus main body 14 described above may be implemented by a dedicated hardware circuit (for example, a semiconductor integrated circuit).
In the first embodiment, the apparatus main body 14 installed with the function of the image processor 105 described above is assumed to correspond to an “image processing apparatus” in the claims.
The programs to be executed by the CPU (computer) described above may be stored in an external device connected to a network such as the Internet and provided by being downloaded via the network. Furthermore, the programs may be provided or distributed via a network such as the Internet. Moreover, the programs may be provided embedded in advance in a non-volatile recording medium such as the ROM.
For example, the first axis, the first cross-sectional image, and the second axis may be automatically set by pattern recognition. In this modification, a determiner 116 evaluates combinations of candidates for the first axis, the first cross-sectional image, and the second axis, and determines the combination having, for example, the maximum degree of matching with a pattern of the heart (hereinafter, the maximum combination).
The first setter 111 arbitrarily sets a plurality of candidates for the first axis in the volume data acquired by the first acquirer 110. Then, the first setter 111 sets, as the first axis, the candidate for the first axis included in the maximum combination determined by the determiner 116.
The first generator 112 arbitrarily sets a plurality of candidates for the first cross-sectional image in the volume data acquired by the first acquirer 110. Then, the first generator 112 sets, as the first cross-sectional image, the candidate for the first cross-sectional image included in the maximum combination determined by the determiner 116.
Moreover, the second setter 113 arbitrarily sets a plurality of candidates for the second axis in the volume data acquired by the first acquirer 110. Then, the second setter 113 sets, as the second axis, the candidate for the second axis included in the maximum combination determined by the determiner 116.
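As one concrete (hypothetical) realization of this determination, the sketch below exhaustively scores every candidate combination and keeps the best one; score() stands in for whatever pattern-recognition measure is used, which the embodiment does not prescribe.

```python
from itertools import product

def select_maximum_combination(axis1_candidates, plane1_candidates,
                               axis2_candidates, score):
    """Exhaustively score every candidate combination and return the best one.

    `score` is a caller-supplied function (hypothetical here) rating how well
    a (first axis, first cross-section, second axis) combination matches a
    pattern-recognition model of the heart.
    """
    best, best_score = None, float("-inf")
    for combo in product(axis1_candidates, plane1_candidates, axis2_candidates):
        s = score(*combo)
        if s > best_score:
            best, best_score = combo, s
    return best  # (first_axis, first_cross_section, second_axis)
```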
For example, the first acquirer 110 may acquire volume data at two or more phases. In a second modification, after the first axis and the second axis are set for the volume data at a predetermined phase, the positions of the first axis and the second axis at one or more phases different from the predetermined phase are tracked by using a known 3D tracking technique on a volume data group along a time series, and the first cross-sectional image and the second cross-sectional image at each phase different from the predetermined phase are generated using the tracking result. Then, a plurality of first cross-sectional images and second cross-sectional images corresponding one-to-one with a plurality of phases are successively displayed (the first cross-sectional images and the second cross-sectional images are reproduced as a video). Specifics are given below.
For example, in the case where a one-heartbeat section from the first end-diastole to the next end-diastole is set as a tracking target section, a phase corresponding to the first end-diastole may be taken as the predetermined phase described above. In this case, the tracker 117 may estimate the positions of the first axis and the second axis in the volume data at each of a plurality of phases (remaining phases) in the one-heartbeat section other than the phase corresponding to the first end-diastole. Alternatively, for example, in the case where a section from the first end-systole to the next end-systole is set as the tracking target section, a phase corresponding to the first end-systole may be taken as the predetermined phase described above. Still alternatively, a plurality of heartbeat sections may be set as the tracking target section, for example.
For example, the tracker 117 may estimate the positions of the first axis and the second axis in the volume data at a phase temporally adjacent to the predetermined phase by estimating motion information between the volume data at the predetermined phase and the volume data at the temporally adjacent phase (an example of a phase different from the predetermined phase), and by moving the first axis and the second axis set for the volume data at the predetermined phase based on the estimated motion information. As the estimation method for the motion information, various known techniques such as local pattern matching processing and the optical flow method may be used.
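The phase-to-phase propagation can be sketched as follows, assuming each axis is represented by its two 3D endpoints and that estimate_motion() is a caller-supplied displacement-field estimator (e.g., block matching or optical flow). This is an illustrative sketch, not the tracker 117's actual code.

```python
import numpy as np

def track_axes(volumes, first_axis, second_axis, estimate_motion):
    """Propagate the first/second axes from the predetermined phase (index 0)
    through a time series of volumes.

    Each axis is a (2, 3) array of endpoint coordinates in voxel units.
    estimate_motion(prev, next) returns a displacement field of shape
    prev.shape + (3,) (hypothetical helper).
    """
    tracked = [(np.asarray(first_axis, float), np.asarray(second_axis, float))]
    for prev_vol, next_vol in zip(volumes[:-1], volumes[1:]):
        flow = estimate_motion(prev_vol, next_vol)
        moved = []
        for axis in tracked[-1]:
            # Displace each endpoint by the motion at its (rounded) location.
            moved.append(np.array([p + flow[tuple(np.round(p).astype(int))]
                                   for p in axis]))
        tracked.append(tuple(moved))
    return tracked  # one (first_axis, second_axis) pair per phase
```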
The first generator 112 generates the first cross-sectional image at a phase different from the predetermined phase based on the position of the first axis tracked by the tracker 117 (the first cross-sectional image is generated for each of the one or more phases). The second generator 114 generates the second cross-sectional image at a phase different from the predetermined phase based on the position of the second axis tracked by the tracker 117 (the second cross-sectional image is generated for each of the one or more phases).
Then, the display controller 115 performs control of successively displaying a plurality of first cross-sectional images and second cross-sectional images corresponding one-to-one with a plurality of phases.
For example, the display controller 115 may perform control of successively displaying a plurality of first cross-sectional images and second cross-sectional images corresponding one-to-one with all the phases included in the tracking target section. Alternatively, for example, the display controller 115 may alternately display the first and second cross-sectional images corresponding to the end-diastolic phase in a heartbeat section and the first and second cross-sectional images corresponding to the end-systolic phase in that heartbeat section. Moreover, for example, in the case where a plurality of heartbeat sections are set as the tracking target section, the display controller 115 may perform control of successively displaying the first and second cross-sectional images corresponding to the end-diastolic phase and then those corresponding to the end-systolic phase in the first heartbeat section, and subsequently displaying the first and second cross-sectional images corresponding to the end-diastolic phase and then those corresponding to the end-systolic phase in the second heartbeat section immediately following the first heartbeat section.
Furthermore, for example, the display controller 115 may perform control of alternately displaying the first cross-sectional image and the second cross-sectional image corresponding to a phase (in the following description, sometimes referred to as “target phase”) in a tracking target section and the first cross-sectional image and the second cross-sectional image corresponding to a phase preceding or following the target phase.
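These display orders amount to choosing the sequence of phase indices whose first and second cross-sectional images are rendered; the sketch below illustrates two of them with invented mode names, as a minimal illustration rather than the display controller's interface.

```python
def phase_display_order(n_phases, ed_phases, es_phases, mode="all"):
    """Return the order in which phases' cross-sectional images are shown.

    ed_phases / es_phases: end-diastolic / end-systolic phase indices,
    one per heartbeat section (illustrative representation).
    """
    if mode == "all":                      # every phase in the section, in order
        return list(range(n_phases))
    if mode == "ed_es":                    # ED then ES, beat by beat
        order = []
        for ed, es in zip(ed_phases, es_phases):
            order += [ed, es]
        return order
    raise ValueError(f"unknown mode: {mode}")
```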
A function may also be included for correcting the position of the first axis or the second axis according to an input of a user, and changing the first cross-sectional image or the second cross-sectional image according to the correction.
In the case where a user performs an operation of correcting the position of the first axis, the first generator 112 regenerates the first cross-sectional image based on the corrected first axis. Similarly, in the case where the position of the second axis is corrected, the second generator 114 regenerates the second cross-sectional image based on the corrected second axis.
For example, a function may also be included for setting the diameter of the third portion (in this example, the pulmonary valve), which is a tubular portion, by using the second cross-sectional image. Description is given here of a case where the third portion is a tubular region as an example, but the third portion is not limited to a tubular region.
Next, a second embodiment will be described. In the second embodiment, a function of generating a third cross-sectional image which is a cross-section in the short-axis direction of the heart included in volume data and which includes a first portion (in this example, the right ventricle) is further included. Specifics will be given below. Parts common with the first embodiment described above will be omitted from the description as appropriate.
Then, the display controller 115 performs control of displaying the generated third cross-sectional image on the monitor 13 together with the first cross-sectional image and the second cross-sectional image.
For example, the display controller 115 may also perform control of displaying, according to an input of a user, boundary information indicating the boundary of the first portion (in this example, the right ventricle) in the first cross-sectional image or the second cross-sectional image, and information indicating the position corresponding to the boundary information in the third cross-sectional image.
In this example, the boundary information is information indicating the myocardial boundary of the right ventricle. For example, the display controller 115 may generate the boundary information indicating the myocardial boundary by connecting a dot sequence input by the user in the first cross-sectional image, and may superimpose and display the generated boundary information on the first cross-sectional image; a simple way to connect the dots is sketched below.
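Connecting the input dot sequence into boundary information can be as simple as the closed linear contour below (a spline fit would serve equally well); the names are illustrative.

```python
import numpy as np

def boundary_polyline(points, samples_per_segment=10):
    """Connect a user-input dot sequence into a closed boundary polyline.

    points: list of (x, y) image coordinates clicked by the user.
    """
    pts = np.asarray(points, dtype=float)
    pts = np.vstack([pts, pts[:1]])                  # close the contour
    out = []
    for a, b in zip(pts[:-1], pts[1:]):
        t = np.linspace(0.0, 1.0, samples_per_segment, endpoint=False)
        out.append(a + t[:, None] * (b - a))         # sample each segment
    return np.concatenate(out)
```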
Then, the display controller 115 performs control of displaying, in the third cross-sectional image, information indicating the position corresponding to the boundary information set in the first cross-sectional image.
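The position corresponding to the boundary information in the third cross-sectional image can be computed as the points where the boundary curve, expressed in 3D volume coordinates, pierces the third cross-section's plane; a minimal sketch under assumed names:

```python
import numpy as np

def plane_crossings(curve_3d, plane_point, plane_normal):
    """Points where a closed 3D boundary curve crosses a plane.

    curve_3d: (n, 3) array of boundary points in volume coordinates.
    """
    pts = np.asarray(curve_3d, float)
    d = (pts - plane_point) @ plane_normal      # signed distances to the plane
    hits = []
    for i in range(len(pts)):
        j = (i + 1) % len(pts)                  # wrap around (closed contour)
        if d[i] * d[j] < 0:                     # sign change -> crossing
            t = d[i] / (d[i] - d[j])
            hits.append(pts[i] + t * (pts[j] - pts[i]))
    return np.array(hits)
```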
Next, a third embodiment will be described. In the third embodiment, a function of acquiring 3D-shape information indicating the 3D shape of a first portion (in this example, the right ventricle) is further included, and the display controller 115 performs control of displaying the 3D-shape information in each of the first cross-sectional image and the second cross-sectional image. Specifics will be given below. Parts common with the first embodiment described above will be omitted from the description as appropriate.
Then, the display controller 115 performs control of superimposing and displaying the 3D-shape information acquired by the second acquirer 121 on each of the first cross-sectional image and the second cross-sectional image. In this example, the display controller 115 may also perform control of displaying, in each of the first cross-sectional image and the second cross-sectional image, information (for example, a mark) indicating the positions at which the cross-section intersects the myocardial boundary of the right ventricle indicated by the 3D-shape information acquired by the second acquirer 121. As described above, according to the third embodiment, a user can check the myocardial boundary of the right ventricle in cross-sectional images (the first cross-sectional image and the second cross-sectional image) with increased visibility of the pulmonary valve (the outflow portion of the right ventricle).
For example, a function may further be included for generating a third cross-sectional image which is a cross-section in the short-axis direction of the heart included in volume data and which includes a first portion (in this example, the right ventricle), and the display controller 115 may display the 3D-shape information in the third cross-sectional image.
For example, the ultrasonic diagnostic apparatus 1 described above may be implemented by the hardware configuration described below.
The apparatus main body 310 includes a transmitting circuit 311, a receiving circuit 312, a storage circuit 313, and a processing circuit 314. The transmitting circuit 311 and the receiving circuit 312 correspond to the transmitter/receiver 101 described above.
The processing circuit 314 corresponds to the B-mode processor 102, the Doppler processor 103, the image generator 104, and the image processor 105 described above.
The processing circuit 314 performs a first acquiring function 314A, a first setting function 314B, a first generating function 314C, a second setting function 314D, a second generating function 314E, and a display controlling function 314F. The first acquiring function 314A is a function corresponding to the first acquirer 110 described above; likewise, the first setting function 314B, the first generating function 314C, the second setting function 314D, the second generating function 314E, and the display controlling function 314F correspond to the first setter 111, the first generator 112, the second setter 113, the second generator 114, and the display controller 115, respectively.
For example, each of the respective processing functions performed by the first acquiring function 314A, the first setting function 314B, the first generating function 314C, the second setting function 314D, the second generating function 314E, and the display controlling function 314F, which are components of the processing circuit 314, is recorded in the storage circuit 313 in the form of a computer-executable program. The processing circuit 314 reads each program from the storage circuit 313 and executes the read program, thereby implementing the function corresponding to the program.
For example, Step S1 is a step implemented by the processing circuit 314 reading the program corresponding to the first acquiring function 314A from the storage circuit 313 and executing it; the subsequent steps are similarly implemented by the processing circuit 314 reading and executing the programs corresponding to the respective functions.
The term "processor" used in the above description refers, for example, to a central processing unit (CPU), a graphics processing unit (GPU), or a circuit such as an application specific integrated circuit (ASIC) or a programmable logic device (for example, a simple programmable logic device (SPLD), a complex programmable logic device (CPLD), or a field programmable gate array (FPGA)). The processor implements a function by reading and executing a program stored in a storage circuit. Instead of being stored in a storage circuit, the program may be built directly into the circuit of the processor; in this case, the processor implements the function by reading and executing the program built into its circuit. The configuration of the processors in the present embodiment is not limited to a case in which each processor is configured as a single circuit; a plurality of separate circuits may be combined into one processor that implements the respective functions. Furthermore, the components illustrated in the drawings may be integrated into one processor to implement their respective functions.
The circuits exemplified above are examples, and the configuration is not limited thereto.
The embodiments and the modifications described above may be arbitrarily combined.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
Number | Date | Country | Kind |
---|---|---|---
2014-091878 | Apr 2014 | JP | national |
2015-063071 | Mar 2015 | JP | national |