The present disclosure relates to an ultrasonic diagnostic apparatus, a learning apparatus, an image processing method, and a program and, in particular, to a technique for improving image quality of an ultrasonic image.
Ultrasonic diagnostic apparatuses are widely used in clinical practice as image diagnostic apparatuses due to, for example, their simplicity, high resolution, and real-time performance. As a method of image formation in such ultrasonic diagnostic apparatuses, a method of generating an image by beamforming processing of a transmit beam and phasing addition processing of received signals is commonly used. Beamforming of a transmit beam is achieved by inputting voltage waveforms provided with time delays to a plurality of conversion elements so that the ultrasonic waves converge inside a living organism. Phasing addition of received signals is achieved by receiving, with a plurality of conversion elements, ultrasonic waves reflected by structures inside the living organism, providing the obtained received voltage signals with time delays that take into account the path lengths to a point of interest, and then adding up the received voltage signals. By the beamforming processing of the transmit beam and the phasing addition processing, reflected signals from the point of interest are selectively extracted to perform imaging. In addition, by controlling the transmit beam so that the transmit beam scans the inside of an imaging region, an image of a region desired to be observed can be obtained.
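For illustration only (not part of the disclosure), the phasing addition described above can be sketched as follows in Python for a single point of interest, assuming a linear array with known element positions and sampled received signals; all names and parameter values are hypothetical:

    import numpy as np

    def phasing_addition(rf, elem_x, point, c=1540.0, fs=40e6, t_tx=0.0):
        """Delay-and-sum (phasing addition) for one point of interest.

        rf      : (n_elements, n_samples) received voltage signals
        elem_x  : (n_elements,) lateral element positions [m]
        point   : (x, z) point of interest [m]
        c       : assumed sound velocity [m/s]
        fs      : sampling frequency [Hz]
        t_tx    : time at which the transmit wavefront reaches the point [s]
        """
        x, z = point
        # Receive path length from the point of interest back to each element
        rx_path = np.sqrt((elem_x - x) ** 2 + z ** 2)
        delays = t_tx + rx_path / c                    # total time of flight [s]
        idx = np.round(delays * fs).astype(int)        # nearest-sample delay
        idx = np.clip(idx, 0, rf.shape[1] - 1)
        # Apply the per-element delay and add up the received signals
        return rf[np.arange(rf.shape[0]), idx].sum()

Scanning the point of interest over a grid and repeating this calculation yields a beamformed image; a practical implementation would additionally interpolate between samples and apply apodization.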
In addition, a method called plane wave transmission is sometimes used as an image generation method. In this method, beamforming of a transmit beam is hardly performed; instead, ultrasonic waves are transmitted with voltage waveforms provided with time delays so as to form an approximate plane wave or a diffuse wave. Imaging is then performed by subjecting the received signals of reflected waves from plane waves or diffuse waves, transmitted in a plurality of directions or a plurality of times, to phasing addition (aperture synthesis).
In image generation by plane wave transmission, since reflected waves over a wider range can be obtained by one transmission of a plane wave as compared to the transmit beam described above, in a case of imaging a region with a same size, a received signal can be obtained by a smaller number of transmissions/receptions by using a plane wave than by using a transmit beam. In other words, image generation by plane wave transmission enables imaging to be performed at a higher frame rate.
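As a rough, hypothetical illustration of the frame-rate difference (the numbers below are assumptions, not values from the disclosure), the achievable frame rate is bounded by the round-trip propagation time per transmission:

    c = 1540.0          # assumed sound velocity [m/s]
    depth = 0.05        # imaging depth [m]
    t_round_trip = 2 * depth / c        # about 65 microseconds per transmission

    n_tx_focused = 128  # e.g. one focused transmission per scan line
    n_tx_plane = 8      # e.g. eight plane-wave transmissions for one frame

    fps_focused = 1.0 / (n_tx_focused * t_round_trip)   # roughly 120 frames/s
    fps_plane = 1.0 / (n_tx_plane * t_round_trip)       # roughly 1900 frames/s
    print(fps_focused, fps_plane)

Under these assumed numbers the plane-wave acquisition supports a frame rate over ten times higher, which is the motivation stated above.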
Japanese Patent Application Laid-open No. 2009-219876 discloses an ultrasonic diagnostic apparatus using a plane wave. Japanese Patent Application Laid-open No. 2019-25044 discloses a medical imaging apparatus using a restorer constituted by a neural network.
In addition, IEEE Transactions on Ultrasonics Ferroelectrics and Frequency Control PP (99):1-1, August 2017 describes imaging using plane wave transmission and convolutional neural networks.
Compared to focused transmission of ultrasonic waves, with plane wave transmission of ultrasonic waves a convergence region for the ultrasonic wave transmission cannot be set, and therefore the proportion of reflected signals from regions other than the point of interest that are mixed into an image increases.
Therefore, there is a problem in that an image acquired using plane wave transmission of ultrasonic waves has a lower image quality than an image acquired using focused transmission of ultrasonic waves.
The present disclosure has been made in consideration of the problem described above and an object thereof is to provide an ultrasonic diagnostic apparatus that enables an image with favorable image quality to be obtained while realizing a high frame rate.
According to an aspect, there is provided an ultrasonic diagnostic apparatus including an ultrasonic probe configured to transmit and receive ultrasonic waves to and from an observation region of an object, and an estimated image generating unit configured to generate, from image data obtained by transmission of an ultrasonic plane-wave beam, estimated image data corresponding to image data based on an ultrasonic focused beam, by using a model having been machine-learned from learning data including image data obtained by the transmission of the ultrasonic plane-wave beam and image data obtained by the transmission of the ultrasonic focused beam.
According to another aspect, there is provided a learning apparatus including a learning unit configured to perform machine learning of a model by using learning data that includes image data obtained by transmission of an ultrasonic plane-wave beam as input data and image data obtained by transmission of an ultrasonic focused beam as ground truth data.
According to another aspect, there is provided an image processing method including transmitting and receiving ultrasonic waves to and from an observation region of an object by an ultrasonic probe, and generating, from image data obtained by transmission of an ultrasonic plane-wave beam, estimated image data corresponding to image data based on an ultrasonic focused beam, by using a model having been machine-learned from learning data including image data obtained by the transmission of the ultrasonic plane-wave beam and image data obtained by the transmission of the ultrasonic focused beam.
According to another aspect, there is provided a non-transitory computer readable medium storing a program causing a computer to execute the image processing method described above.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
A first embodiment of the present invention will be described.
The ultrasonic probe 102 is a probe adopting an electronic scan system and has a plurality of transducers 101 arranged one-dimensionally or two-dimensionally at a tip thereof. Each transducer 101 is an electromechanical conversion element that performs mutual conversion between an electric signal (a voltage pulse signal) and an ultrasonic wave (an acoustic wave). The ultrasonic probe 102 transmits ultrasonic waves from the plurality of transducers 101 to the object 100 and receives, by the plurality of transducers 101, reflected ultrasonic waves that reflect differences in acoustic impedance inside the object 100.
The transmission electrical circuit 104 is a transmitting unit that outputs a pulse signal (a drive signal) to the plurality of transducers 101. By applying the pulse signal to the plurality of transducers 101 with time differences, ultrasonic waves with different delay times are transmitted from the plurality of transducers 101 and a transmission ultrasonic beam is formed. By selectively changing the transducers 101 to which the pulse signal is applied (in other words, the transducers 101 to be driven) and changing the delay time (application timing) of the pulse signal, the direction and the focus of the transmission ultrasonic beam can be controlled. An observation region inside the object 100 is scanned by sequentially changing the direction and the focus of the transmission ultrasonic beam. In the following description, among the transmission ultrasonic beams formed by the transmission electrical circuit 104, a transmission ultrasonic beam whose spread is at least a threshold of approximately half of the transmission aperture size will be referred to as a plane-wave beam. The spread of a transmission ultrasonic wave refers to the beam width from the point of maximum sound pressure to the point at which the sound pressure is approximately half of the maximum sound pressure. In addition, among the transmission ultrasonic beams formed by the transmission electrical circuit 104, a transmission ultrasonic beam whose spread is smaller than this threshold will be referred to as a focused beam.
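For illustration only (not part of the disclosure), the difference between the delay patterns that form a focused beam and a plane-wave beam can be sketched as follows for a hypothetical 128-element linear array:

    import numpy as np

    def focused_delays(elem_x, focus, c=1540.0):
        """Delays that make all element wavefronts arrive at the focus together."""
        fx, fz = focus
        path = np.sqrt((elem_x - fx) ** 2 + fz ** 2)
        return (path.max() - path) / c        # elements closer to the focus fire later

    def plane_wave_delays(elem_x, angle_deg, c=1540.0):
        """Linear delay ramp that tilts an (approximately) plane wavefront."""
        proj = elem_x * np.sin(np.deg2rad(angle_deg))
        return (proj - proj.min()) / c

    elem_x = (np.arange(128) - 63.5) * 0.3e-3   # 128 elements, assumed 0.3 mm pitch
    tau_focus = focused_delays(elem_x, focus=(0.0, 0.03))   # converging beam
    tau_plane = plane_wave_delays(elem_x, angle_deg=10.0)   # steered plane wave

The focused pattern concentrates energy near the set focal depth, whereas the linear ramp spreads the transmitted energy over the full aperture, which is why the plane-wave beam spread exceeds the threshold defined above.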
The reception electrical circuit 105 is a receiving unit that inputs, as a received signal, an electric signal output from the transducer 101 having received a reflected ultrasonic wave. The received signal is input to the received signal processing block 106. Operations of the transmission electrical circuit 104 and the reception electrical circuit 105 or, in other words, transmission/reception of ultrasonic waves is controlled by the system control block 109. It should be noted that, in the present specification, both an analog signal output from the transducer 101 and digital data obtained by sampling (digitally converting) the analog signal will be referred to as a received signal without particular distinction. However, a received signal will sometimes be described as received data depending on the context in order to clearly indicate that the received signal is digital data.
The received signal processing block 106 generates image data based on a received signal obtained from the reception electrical circuit 105. The image processing block 107 applies image processing such as brightness adjustment, interpolation, and filter processing on the image data generated by the received signal processing block 106. The display apparatus 108 is a display unit for displaying image data and various kinds of information and is constituted by, for example, a liquid crystal display or an organic EL display. The system control block 109 is a control unit that integrally controls the transmission electrical circuit 104, the reception electrical circuit 105, the received signal processing block 106, the image processing block 107, the display apparatus 108, and the like.
Configuration of Received Signal Processing Block
The plane-wave beam image generating block 201 generates an image from a received signal obtained from the reception electrical circuit 105 based on an element arrangement and various conditions of image generation (sound velocity, aperture control, and signal filtering) supplied from the system control block 109. The image generated by the plane-wave beam image generating block 201 is sent to the estimation calculating block 202.
The estimation calculating block 202 generates a focused beam-equivalent image from the image sent from the plane-wave beam image generating block 201 by using a learned model obtained by machine learning. A “focused beam-equivalent image” refers to an image whose image quality has been improved, by applying image processing (estimation calculation processing) to a single plane-wave beam image, to a level equivalent to that of an image obtained by transmitting a focused beam (referred to as a focused beam image). In the following description, a focused beam-equivalent image may also be referred to as an “estimated image”. The image output from the estimation calculating block 202 is subjected to prescribed processing by the image processing block 107 and subsequently displayed by the display apparatus 108.
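As an illustrative sketch only (the disclosure does not fix the model form or the framework), applying a trained image-to-image model to a single plane-wave beam image could look as follows in Python/PyTorch; all names and the normalization are hypothetical:

    import torch

    def estimate_focused_equivalent(model, pw_image):
        """Apply a trained image-to-image model to one plane-wave beam image.

        model    : trained torch.nn.Module mapping (1, 1, H, W) -> (1, 1, H, W)
        pw_image : 2-D numpy array (plane-wave beam image)
        """
        x = torch.as_tensor(pw_image, dtype=torch.float32)[None, None]
        x = (x - x.mean()) / (x.std() + 1e-8)     # simple normalization (assumed)
        with torch.no_grad():
            y = model(x)
        return y[0, 0].numpy()                    # focused beam-equivalent image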
The received signal processing block 106 may be constituted by at least one processor and a memory. In this case, functions of the respective blocks 201 and 202 shown in
Estimation Calculating Block
The estimation calculating block 202 will now be described. The estimation calculating block 202 performs processing for generating (estimating) a focused beam-equivalent image using a learned model.
Machine learning is favorably used to train the model. Examples of specific algorithms for machine learning include a nearest neighbor method, a naive Bayes method, and a support vector machine. Another example is deep learning, which uses a neural network to autonomously generate feature amounts and coupling weight coefficients for learning. Any usable algorithm among those described above can be appropriately selected and applied to the present embodiment.
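For concreteness only, one hypothetical form of such a neural network is a small convolutional network that learns a residual correction to the input image; the embodiment is not limited to this architecture, and the names below are illustrative:

    import torch.nn as nn

    class PlaneToFocusedNet(nn.Module):
        """Hypothetical image-to-image CNN for the estimation calculating block."""
        def __init__(self, ch=32):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(1, ch, 3, padding=1), nn.ReLU(),
                nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(),
                nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(),
                nn.Conv2d(ch, 1, 3, padding=1),
            )

        def forward(self, x):
            return x + self.net(x)   # learn a residual correction to the input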
Learning by the estimation calculating block 202 according to the present embodiment will now be described in greater detail. As shown in
Furthermore, as the input data of learning data ID5, a plane-wave beam image PWI2-1 obtained by transmitting a plane-wave beam to an object different from the imaging object of the plane-wave beam image PWI1-1 is used. In addition, as the ground truth data of the learning data ID5, a focused beam image FB1 obtained by transmitting a focused beam to the same object as that of the plane-wave beam image PWI2-1 is used. It should be noted that the plane-wave beam image input to the estimation calculating block 202 may be image data obtained using a signal that combines a plurality of received signals respectively obtained by a plurality of transmissions of the plane-wave beam. As described above, in the present embodiment, learning is performed by using sets of a plane-wave beam image and a focused beam image obtained by transmitting beams under different conditions and with respect to different objects.
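Under the same assumptions, training such a model on pairs of plane-wave input data and focused-beam ground truth data could be sketched as follows; the L1 loss, batch size, and array names are assumptions and not part of the disclosure:

    import torch
    from torch.utils.data import DataLoader, TensorDataset

    def train(model, pw_images, fb_images, epochs=10, lr=1e-3):
        """pw_images, fb_images: float32 tensors of shape (N, 1, H, W), paired."""
        data = DataLoader(TensorDataset(pw_images, fb_images),
                          batch_size=8, shuffle=True)
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = torch.nn.L1Loss()               # assumed loss; not specified
        for _ in range(epochs):
            for pw, fb in data:                   # input data / ground truth data
                opt.zero_grad()
                loss = loss_fn(model(pw), fb)
                loss.backward()
                opt.step()
        return model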
As shown in
In addition, preprocessing of the learning data may be performed. For example, learning efficiency may be improved by correcting non-uniformity of brightness values due to attenuation of ultrasonic waves. For a focused beam image, the portion where the ultrasonic beam converges in a favorable manner or, in other words, the vicinity of the depth at which the transmission focus has been set may be extracted and used. Accordingly, an improvement in the resolution of the estimated image can be expected. Processing for removing, from the input data, a shadow caused by separation of the ultrasonic probe from the object during imaging or the like may also be performed. Accordingly, the stability of the estimation accuracy can be improved. Alternatively, by using learning data in which both the input data and the ground truth data include a shadow caused by separation of the ultrasonic probe or the like, it can be expected that, when a separation of the probe actually occurs, the estimated image will enable the separation to be recognized.
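As a hypothetical illustration of the attenuation correction mentioned above (the attenuation coefficient and data layout are assumptions):

    import numpy as np

    def compensate_attenuation(img, depths_cm, f_mhz=5.0, alpha_db=0.5):
        """Correct depth-dependent brightness loss before learning.

        img       : (n_depth, n_lateral) image in linear amplitude
        depths_cm : (n_depth,) depth of each image row [cm]
        alpha_db  : assumed attenuation [dB / (cm * MHz)]; two-way path applied below
        """
        gain_db = 2.0 * alpha_db * f_mhz * depths_cm
        return img * (10.0 ** (gain_db[:, None] / 20.0))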
In addition, in learning, preprocessing of input data and ground truth data may be further performed using a GUI such as that shown in
While a plane-wave beam image and the transmission angle and/or the number of transmissions of a plane-wave beam are exemplified as input data in the present embodiment, a similar advantageous effect to that of the present embodiment can be obtained even when only a plane-wave beam image is used as input data. In addition, related information other than the plane-wave beam image and the transmission angle and/or the number of transmissions of the plane-wave beam may be added to the input data. For example, adding information such as the transmission frequency and the band of a bandpass filter used when acquiring the plane-wave beam image to the input data increases the possibility that accurate estimation can be performed in accordance with the state of the input data. In addition, information describing which portion of a living organism the object represents, in which orientation the ultrasonic probe is in contact with the object relative to the body axis, and the like may be added to the input data. It is expected that estimation accuracy will further increase in correspondence with the features of each site (for example, the presence of a surface fat layer, the presence of a high-brightness region created by a fascial structure, or the presence of a low-brightness region due to a thick blood vessel). Furthermore, by adding information such as the field of medicine, gender, BMI, age, and pathological condition to the input data, there is a possibility that a learned model corresponding in greater detail to the features of each site described earlier can be obtained, and a further increase in estimation accuracy is expected.
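One simple, hypothetical way to supply such related information to the model is to broadcast each scalar as an additional constant input channel; the embodiment does not prescribe this encoding:

    import numpy as np

    def with_metadata_channels(pw_image, angle_deg, n_transmissions, tx_freq_mhz):
        """Stack scalar acquisition conditions as constant channels on the image."""
        h, w = pw_image.shape
        chans = [pw_image.astype(np.float32)]
        for value in (angle_deg, float(n_transmissions), tx_freq_mhz):
            chans.append(np.full((h, w), value, dtype=np.float32))
        return np.stack(chans)       # shape (4, H, W); model input channels = 4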
In addition, the learned model 305 of the estimation calculating block 202 mounted to the ultrasonic diagnostic apparatus 1 may be a model having learned image data of all fields of medicine or a model having learned image data of each field of medicine. When a model having learned image data of each field of medicine is mounted, the system control block 109 may cause the user of the ultrasonic diagnostic apparatus 1 to input or select information regarding a field of medicine to change the learned model to be used in accordance with the field of medicine. It is expected that estimation accuracy will further increase by selectively using a model for each field of medicine in which imaging sites are limited to a certain degree.
The learned model 305, obtained by performing learning using a variety of such imaging conditions and a plane-wave beam image as input data and a focused beam image as ground truth data, operates in the estimation calculating block 202. As a result, it is expected that the estimation calculating block 202 will estimate, from the input plane-wave beam image, an image with high resolution or contrast corresponding to a focused beam image and output the estimated image as an estimation result.
Image Generation Method
Next, details of processing for image generation according to the present embodiment will be described with reference to
The ultrasonic waves transmitted from the plurality of transducers 101 propagate inside the object 100 and are reflected at boundaries of acoustic impedance inside the object 100. The plurality of transducers 101 receive the reflected ultrasonic waves, which reflect differences in acoustic impedance, and convert them into voltage waveforms. The voltage waveforms are input to the reception electrical circuit 105 through the probe connecting unit 103. The reception electrical circuit 105 amplifies and digitally samples the voltage waveforms as necessary and outputs them as received signals to the received signal processing block 106.
In step S61, the estimation calculating block 202 executes an estimation calculation using a transmission angle, the number of transmissions, and the like of the plane-wave beam input from the system control block 109 and outputs a focused beam-equivalent image (an estimated image).
The plane-wave beam image input to the estimation calculating block 202 may be an image created from a signal obtained by combining a plurality of received signals respectively obtained by a plurality of transmissions of the plane-wave beam. When creating an image based on the result of a plurality of plane-wave beam transmissions in this manner, the plane-wave beam image output from the plane-wave beam image generating block 201 is saved in a memory. Subsequently output plane-wave beam images and the image saved in the memory are then combined with each other, and the combination result is output to the estimation calculating block 202 together with the plurality of transmission angles. The combining of plane-wave beam images may be either coherent or incoherent, and the advantageous effect of the present embodiment is obtained as long as plane-wave beam images produced by the same combining method are used as input data during learning.
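For illustration, the coherent and incoherent combining options could be sketched as follows, assuming per-angle complex beamformed data (the data layout is an assumption):

    import numpy as np

    def compound(beamformed, coherent=True):
        """Combine per-angle plane-wave images into one image.

        beamformed : (n_angles, H, W) complex array of beamformed data,
                     one entry per plane-wave transmission angle
        """
        if coherent:
            # Sum the complex data first, then take the envelope
            return np.abs(beamformed.sum(axis=0))
        # Incoherent: take per-angle envelopes, then average them
        return np.abs(beamformed).mean(axis=0)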
In step S62, data of the estimated image output from the estimation calculating block 202 is input to the image processing block 107. The image processing block 107 applies brightness adjustment, interpolation, and other filtering with respect to the input data of the estimated image and outputs image data obtained as a result thereof to the display apparatus 108. The display apparatus 108 displays the image data output from the image processing block 107 and the ultrasonic diagnostic apparatus 1 ends processing of the present flow.
According to the present embodiment, by collecting received data through plane-wave beam transmission, an image can be provided based on high-speed data acquisition over a wide range; in other words, an image can be provided at a higher frame rate than in the case of focused beam transmission. Furthermore, through the learning of the estimation calculating block 202, an image with high image quality resembling that of an image obtained by focused beam transmission can be provided. Therefore, the ultrasonic diagnostic apparatus 1 is capable of providing an image with a higher frame rate and higher contrast than conventional apparatuses.
Next, an ultrasonic diagnostic apparatus according to a second embodiment will be described. An overall configuration of the ultrasonic diagnostic apparatus 1 is the same as that of the first embodiment (
Hereinafter, a description will be given with reference to the flow chart shown in
In step S100, a focused beam image is generated and displayed. Specifically, an observation region is scanned by a focused beam, a frame's worth of an entire image is generated, and the generated image is displayed on the display apparatus 108. A time required by the operation is denoted by B1 in
In step S101, an estimated image is generated by performing processing by the plane-wave beam image generating block 803 and the estimation calculating block 804 with respect to a signal received due to transmission of a plane-wave beam. A time required by the operation is denoted by P1 in
In step S102, the system control block 109 evaluates whether or not the estimated image generated by the estimation calculating block 804 satisfies a prescribed condition. The purpose of the evaluation is to determine whether or not the reliability of the estimated image (the accuracy of the estimation) is high; in the present embodiment, the higher the correlation with the last focused beam image stored in the frame memory, the higher the reliability is determined to be. Metrics for evaluating the correlation may be designed in any way. In the example shown in
In step S105, the system control block 109 checks whether or not the number of times the estimated image was consecutively used to update the display image has reached a prescribed number of times N (in the present example, it is assumed that N=10). When the number of times is smaller than N, a return is made to step S101 and an estimated image using the plane-wave beam image is generated (B2 in
When the correlation between the estimated image and the last display image falls below the prescribed threshold while the processing is being repeated, the system control block 109 does not use the estimated image for display and switches control to generating and displaying a new focused beam image (step S100).
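The embodiment does not fix the correlation metric; as one hypothetical choice, a zero-mean normalized cross-correlation between the estimated image and the reference image could be used (a sketch in Python):

    import numpy as np

    def normalized_correlation(est, ref):
        """Zero-mean normalized cross-correlation of two images, in [-1, 1]."""
        a = est - est.mean()
        b = ref - ref.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
        return float((a * b).sum() / denom)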
In addition, in step S105, when it is determined that the number of times the estimated image was consecutively used to update the display image has reached N times, the system control block 109 stops generation of the estimated image and switches control to generating and displaying a new focused beam image (step S100).
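Putting steps S100 to S105 together, the switching control could be sketched as follows; acquire_focused_image, acquire_estimated_image, and display are hypothetical helpers, normalized_correlation is the sketch above, and the threshold and N are assumed parameters:

    def display_loop(display, threshold=0.9, n_max=10):
        """Alternate between focused frames and estimated frames (sketch)."""
        while True:                                   # runs until imaging is stopped
            reference = acquire_focused_image()       # step S100: scan, store, display
            display(reference)
            consecutive = 0
            while consecutive < n_max:                # step S105: cap at N frames
                estimated = acquire_estimated_image() # step S101: plane wave + model
                # Step S102: reliability check; the reference could alternatively
                # be the last displayed image, as also described above
                if normalized_correlation(estimated, reference) < threshold:
                    break                             # low reliability: rescan
                display(estimated)
                consecutive += 1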
According to the control described above, since estimated images generated from plane-wave beam images are used to update the display image in addition to the focused beam image obtained in one scan, image display can be realized at a higher frame rate than when updating the display image using only focused beam images. As is apparent from a comparison between
When calculating a correlation, the correlation over the entire observation region need not be used; instead, the observation region may be divided, respective correlations of the divided regions may be calculated, and a determination may be made based on whether or not the correlation is at least a certain level in a certain percentage of the divided regions. By performing such control, for example, when imaging a heart, display at a high frame rate using the estimated image can be continued because the correlation of other regions remains high even though the correlation of a region containing a moving valve declines.
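A sketch of the divided-region variant described above; the block size, per-block correlation level, and required percentage are assumptions:

    import numpy as np

    def ncc(a, b):
        a = a - a.mean()
        b = b - b.mean()
        return float((a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12))

    def blockwise_ok(est, ref, block=32, corr_min=0.8, frac_min=0.7):
        """True if enough divided regions keep at least a certain correlation."""
        h, w = est.shape
        scores = [ncc(est[i:i + block, j:j + block], ref[i:i + block, j:j + block])
                  for i in range(0, h - block + 1, block)
                  for j in range(0, w - block + 1, block)]
        return np.mean(np.array(scores) >= corr_min) >= frac_min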
In addition, while an image used to evaluate a correlation and an image used for display are the same in the processing shown in
Next, control in a case where an instruction to save a still image or a moving image is issued by the user during an imaging operation will be described. When receiving an instruction to save a still image, the system control block 109 may save a focused beam image and/or an estimated image acquired at a time point closest to the timing at which the instruction was received. At this point, images that have been acquired but not used for display may be excluded from the objects to be saved. For example, when an instruction to save a still image is input to the system control block 109 through a GUI or the like at a timing t1 shown in
In addition, with respect to saving a moving image, focused beam images and estimated images may be saved separately or saved in a mixed manner. Switching between these save methods can also be set as an option of the system. Furthermore, since the frame rate of the image changes depending on control in the image generation method according to the present embodiment, when saving a moving image, interpolation processing may be applied so as to create data at constant time intervals, and a moving image with a constant frame rate may be subsequently saved.
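As a hypothetical illustration of such interpolation to constant time intervals, assuming each saved frame carries an acquisition timestamp (names are illustrative):

    import numpy as np

    def resample_to_constant_rate(frames, timestamps, fps_out=30.0):
        """Linearly interpolate irregularly timed frames to a constant frame rate.

        frames     : (N, H, W) numpy array of display images
        timestamps : (N,) numpy array of acquisition times [s], increasing
        """
        t_out = np.arange(timestamps[0], timestamps[-1], 1.0 / fps_out)
        idx = np.searchsorted(timestamps, t_out, side="right") - 1
        idx = np.clip(idx, 0, len(timestamps) - 2)
        w = (t_out - timestamps[idx]) / (timestamps[idx + 1] - timestamps[idx])
        return ((1.0 - w)[:, None, None] * frames[idx]
                + w[:, None, None] * frames[idx + 1])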
While control for adaptively switching between a focused beam image and an estimated image based on a correlation of images has been described in the present embodiment, the ratio of the images may be fixed, or the system control block 109 may perform control so that the ratio can be interactively changed by the user from a GUI. In addition, when there are consecutive estimated images and the correlation between estimated images that are separated from each other by at least one estimated image is high, a determination may be made that the object has hardly moved, and a switch to a focused beam image may be automatically made. Accordingly, an image by transmission of a focused beam can be obtained for the object that has hardly moved.
The embodiments described above merely represent specific examples of the present invention. A scope of the present invention is not limited to the configurations of the embodiments described above and various embodiments can be adopted without departing from the spirit of the invention.
For example, while a model using a plane-wave beam image as input data and an estimated image as output data has been used in the first and second embodiments, the input and output of the model need not be images. For example, received data obtained by transmission of a plane-wave beam may be used as-is as input data, or received data after being subjected to phasing addition processing may be used as input data. In addition, as ground truth data, received data obtained by transmission of a focused beam may be used as-is, or received data after being subjected to phasing addition processing may be used. A similar operational effect to that of the embodiments described above can be produced even when using such models.
Furthermore, the disclosed technique can take the form of an embodiment of, for example, a system, an apparatus, a method, a program, or a recording medium (a storage medium). Specifically, the disclosed technique may be applied to a system constituted by a plurality of devices (for example, a host computer, an interface device, an imaging apparatus, and a web application) or to an apparatus constituted by a single device.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
According to the present invention, an ultrasonic diagnostic apparatus that enables an image with good image quality to be obtained while realizing a high frame rate can be provided.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2020-009941, filed on Jan. 24, 2020, which is hereby incorporated by reference herein in its entirety.