The present invention concerns an ultrasound imaging method for generating a visualization image of a region inside a medium, and an ultrasound imaging apparatus implementing said method.
Known ultrasound methods and apparatus generate a visualization image that combines two images: a first B-mode image that reveals the morphological structure of the medium, and a second flow image that reveals the vascularization in the medium. There are also two sorts of flow images, called "color flow" and "PW mode", depending on the method employed.
Elastography imaging now provides images of medium elasticity, such color images giving quantitative information on medium stiffness that can be relevant for cancer diagnosis.
However, color flow images and color elastography images are provided separately, during two different ultrasound examinations, so that relating tissue vascularization to tissue elasticity is impossible.
One object of the present invention is to provide an ultrasound imaging method for generating a visualization image of a region inside a medium, wherein the method comprises:
Thanks to these features, the method combines three types of images into a visualization image, the three images being taken quasi-simultaneously inside the medium. The user can then identify the relationships between the three types of images.
Such method reduces the examination time and improves the diagnostic accuracy.
Moreover, some of the taken images can be processed from the same data of the plurality of received sequences, which improves the accuracy of these relationships.
Furthermore, the visualization image can be improved by various tunings of the image combination.
In various embodiments of the method, one and/or other of the following features may optionally be incorporated:
According to an aspect, the second and third images are superposed over the first image.
According to an aspect, the first image is in grey scale, and the second and third images are in color scale with different color ranges.
According to an aspect, the second and/or third image comprises an outline with a predetermined and unique line property.
According to an aspect, the first process is B-mode ultrasound imaging, the second process is elastography ultrasound imaging, and the third process is flow ultrasound imaging.
According to an aspect, the first, second and third processes have different time periodicities.
According to an aspect, the steps are repeated for periodically generating a visualization image that is updated over time.
According to an aspect, at least one of the received sequences is used by the second and third processes to process the corresponding second and third images.
According to an aspect, at least one of the emitted sequences is a sequence generating an unfocussed ultrasound wave inside the medium.
According to an aspect, the unfocussed ultrasound wave is a plane wave.
According to an aspect, the visualization image comprises:
the first image fills said one view,
the second image is superposed over the first image inside the box, and
the third image is superposed over the second image inside the box.
According to an aspect, the second image is superposed with a first opacity property, and the third image is superposed with a second opacity property, the second opacity property being higher than the first opacity property.
According to an aspect, the visualization image comprises:
the first image fills each one of the first and second views,
the second image is superposed over the first image inside the first box, and
the third image is superposed over the first image inside the second box.
According to an aspect, the first and second views are organized vertically or horizontally inside the visualization image.
Another object of the invention is to provide an ultrasound imaging apparatus implementing the above method, said apparatus comprising:
Other features and advantages of the invention will be apparent from the following detailed description of some of its embodiments given by way of non-limiting example, with reference to the accompanying drawings. In the drawings:
The medium 11 is for instance a living body, in particular a human or animal body, or can be any other biological or physico-chemical medium (e.g. an in vitro medium). The volume of the medium comprises variations in its physical properties. For example, the medium may comprise tissues and blood vessels, each having various physical properties. For example, the tissue may comprise an area suffering from an illness (e.g. cancerous cells), or any other singular area, having physical properties that differ from those of other areas of the medium. Some portions of the medium 11 may include an added contrast agent (e.g. microbubbles) for improving the contrast of the physical properties of these portions.
The apparatus 10 may include:
In a variant, a single electronic device could fulfil all the functionalities of the electronic unit 13 and of the processing unit 14. The processing unit 14 may be a computer.
The probe 12 can comprise a curved transducer so as to perform ultrasound focussing at a predetermined position in front of the probe. The probe 12 can comprise a linear array of transducers, for instance a few hundred transducers (e.g. 100 to 300) juxtaposed along an axis X, so as to perform ultrasound focussing in a bi-dimensional (2D) plane. The probe 12 can comprise a bi-dimensional array so as to perform ultrasound focussing in a tri-dimensional (3D) volume.
The processing unit 14 comprises a processor 14a, a memory 14b containing instruction codes for implementation of the method and containing data concerning the method, a keyboard 14c and a display 14d for displaying images and/or visualization images.
The apparatus 10 can determine images of a region R inside the medium 11 and of a sub-region SR, said sub-region being included inside the region R, as will be explained later.
The method 100 for generating a visualization image, according to the invention, is illustrated on
The method is now more detailed.
During the emission and reception step 101, the processing unit 14 controls the electronic unit 13 so that a plurality of emitted sequences of ultrasound waves is emitted by the probe 12 into the medium 11. The medium 11 then diffuses and reflects said ultrasound waves according to its content, and echo ultrasound waves propagate back to the probe 12. A plurality of received sequences of ultrasound waves (echoes) is then received by the probe 12.
The emitted and received sequences are temporally interleaved, and each received sequence corresponds to a (known) emitted sequence.
During the processing step 102, the processing unit 14 processes the received sequences for generating:
In the present case, the first, second and third processes differ from one another, so as to generate three different images from one set of data (the received sequences).
In a preferred example:
Various B-mode, elastography and flow processes are well known in ultrasound imaging. For example, one can refer to patent application US 2009/234230 for a fast elastography method.
The sequences of ultrasound waves emitted during the emission and reception step 101 must correspond to those required by the three image processes (B-mode, elastography and flow).
The first image I1 may be in grey scale (as such a scale is usually used for a B-mode image).
The second and third images I2, I3 may be in color scales, i.e. ranges of predetermined colors. The color scales of the second and third images have different colors: they do not overlap, i.e. they have no common color, so that the second and third images I2, I3 can be easily distinguished from each other, and from the grey scale of the first image I1.
These scales can be determined by the user and displayed in the visualization image, optionally together with the scales' values for user understanding (e.g. rigidity values for the elastography image scale, and flow speed values for the flow image scale).
The second and/or third images I2, I3 may be limited to a predetermined range or threshold: a minimum and/or a maximum value (i.e. a physical value of rigidity or speed). The image is then not a full image: the eliminated pixels are not significant and are not displayed, being given a transparent color.
An outline can therefore be added inside such an image, the outline surrounding the significant pixels of the image. The outline of each image has a line property: for example, a color and/or a thickness and/or a pattern. The outline property of the second image I2 is preferably different from the outline property of the third image I3, so that these images differ and can be identified.
This creates outlined image shapes filled with a predetermined color scale. The image shapes of the second and third images I2, I3 can be identified: the pixels belonging to the second or third image I2, I3 are easily recognized thanks to the different color scales and/or outlines.
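By way of a non-limiting illustration, the following Python sketch (using NumPy, SciPy and Matplotlib; all function and variable names are hypothetical and not part of the present disclosure) shows one possible way of building such a thresholded color overlay with a transparent background and an outline around the significant pixels:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import ndimage

def make_overlay(param_map, vmin, vmax, cmap="hot", outline_rgb=(0.0, 1.0, 1.0)):
    """Map a physical-parameter image (e.g. rigidity or flow speed) to an
    RGBA overlay: pixels outside [vmin, vmax] are made fully transparent,
    and a one-pixel outline is drawn around the significant region."""
    significant = (param_map >= vmin) & (param_map <= vmax)
    norm = np.clip((param_map - vmin) / (vmax - vmin), 0.0, 1.0)
    rgba = plt.get_cmap(cmap)(norm)                  # (H, W, 4) floats in [0, 1]
    rgba[..., 3] = np.where(significant, 1.0, 0.0)   # transparent where eliminated
    ring = ndimage.binary_dilation(significant) & ~significant
    rgba[ring] = (*outline_rgb, 1.0)                 # the outline line property
    return rgba
```

Using different `cmap` and `outline_rgb` values for the second and third images I2, I3 yields the non-overlapping color scales and distinct outline properties described above.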
During the image combining step 103, the processing unit 14 combines the first image I1, the second image I2 and the third image I3 into a visualization image Vi and displays this visualization image on the display 14d, so as to simultaneously present the results of the first, second and third processes to the user of the ultrasound imaging apparatus 10.
The second and third images I2, I3 are for example superposed over the first image I1: the second image I2 overlays the first image I1, and the third image I3 overlays the first image I1.
The superposition of all the images is coherent with respect to the positions in the medium 11 corresponding to the pixels: superposed pixels correspond to information for the same position inside the medium.
Advantageously, the first image I1 is determined for a wide area inside the medium, corresponding to the region R represented on
Therefore, a box Bx is defined inside the first image I1, its area corresponding to the pixels that are processed for the second and third images I2, I3, and corresponding to the real points inside the sub-region SR of the medium 11. The borders B2 and B3 of the second and third images are positioned on the outline of the box Bx during superposition.
Then, the first image I1, which represents a general view of the medium and in which the user can recognize the organs, surrounds the second and third images I2, I3. This helps to understand the second and third images; notably, it helps to link a specific zone in the second and/or third image to the position and type of organ inside the medium 11.
The box Bx and the second and third images I2, I3 have for example a rectangular shape, but they may have any shape, provided it is identical for the box and the images.
According to a variant of this superposition, a first opacity property is used to overlay the second image on the first image, so that the first image is viewed under the second image. A second opacity property is used to overlay the third image on the first image. An opacity property is the blending percentage of the overlaid image over the underlying image: if the opacity property is 0%, the combination result shows only the first image; if the opacity property is 100%, the combination result shows only the overlaid image.
The first and second opacity properties may be different. The second opacity property may be higher than the first opacity property.
For example, the first opacity property is comprised between 40% and 60%, so that the first image can be seen under the second image, and the second opacity property is comprised between 80% and 100% (inclusive). For example, the first opacity property is 50% and the second opacity property is 100%. In this way, the first image I1 can be seen under the second image I2, and the third image I3 is clearly seen on top, with a correct contrast.
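By way of a non-limiting illustration, such an opacity-based combination may be sketched as follows in Python (NumPy; images assumed to be float arrays in [0, 1]; all names are hypothetical):

```python
import numpy as np

def blend(base_rgb, overlay_rgba, opacity):
    """Overlay an RGBA image on an RGB image with a global opacity property:
    0.0 shows only the underlying image, 1.0 shows only the overlay
    (where the overlay is not itself transparent)."""
    alpha = overlay_rgba[..., 3:4] * opacity         # effective per-pixel alpha
    return overlay_rgba[..., :3] * alpha + base_rgb * (1.0 - alpha)

# E.g. second image at a 50% opacity property, third image at 100%:
# vi = blend(blend(i1_rgb, i2_rgba, 0.5), i3_rgba, 1.0)
```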
Thanks to these features, the first, second and third images can be easily distinguished from one another while being superposed, so that the link between the various pieces of information in these images can be understood.
The visualization image Vi can have various layouts and can include various additional elements that are now described by way of some examples.
In
The first image I1 comprises a box Bx wherein the second image I2 and the third image I3 are superposed (overlaid) as described above. The borders B2, B3 of the second and third images are also superposed over the box Bx, i.e. positioned on the outline of box Bx.
In the first view V1, the first, second and third images I1, I2, I3 are all superposed.
In
In this example, the first and second views V1, V2 are side by side in a right-left configuration: The first view is on the left side of the visualization image Vi, and the second view is on the right side of the visualization image Vi.
The first image I1 in first view V1 comprises a box Bx1 wherein the second image I2 is superposed (overlaid) over the first image I1 of said view, as described above. The border B2 of the second image is also superposed over the box Bx1, i.e. positioned on the outline of box Bx1.
The first image I1 in second view V2 comprises a box Bx2 (preferably identical to the box Bx1 in the first view V1) wherein the third image I3 is superposed (overlaid) over the first image I1 of said view, as described above. The border B3 of the third image is also superposed over the box Bx2, i.e. positioned on the outline of box Bx2.
In the first view V1, the first and second images I1, I2 are superposed. In the second view V2, the first and third images I1, I3 are superposed. In some cases, such layout may be easier to understand for the user of the ultrasound device.
According to a third example (not represented), the visualization image Vi includes the same elements as in the second example of
In the first view V1, the first and second images I1, I2 are superposed. In the second view V2, the first and third images I1, I3 are superposed. In some cases, such layout may be more comfortable, depending on the display sizes.
Moreover, according to a preferred embodiment, the image sequences adapted for generating the first, second and third images are also interleaved, so as to reduce the time shift between these images. However, as each image type requires a different frame rate FR (the time interval between two consecutive image sequences generating two consecutive images in the time domain), the interleaving is predetermined taking these constraints into account for each image generation.
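By way of a non-limiting illustration (Python; all names are hypothetical, and the actual interleave pattern is predetermined as explained above), an earliest-due-first scheduler can order image sequences with different frame rates FR on a single timeline:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Stream:
    next_due: float                       # time the next image sequence is due (s)
    period: float = field(compare=False)  # 1 / FR (s)
    name: str = field(compare=False)      # e.g. "B" (B-mode), "E" (elasto), "F" (flow)

def interleave(streams, horizon):
    """Order the image sequences of several streams on one timeline,
    each stream firing at its own frame rate (earliest-due-first)."""
    heap = list(streams)
    heapq.heapify(heap)
    schedule = []
    while heap[0].next_due < horizon:
        s = heapq.heappop(heap)
        schedule.append((round(s.next_due, 4), s.name))
        s.next_due += s.period
        heapq.heappush(heap, s)
    return schedule

# E.g. B-mode at 30 Hz, flow at 10 Hz, elastography at 2 Hz:
print(interleave([Stream(0.0, 1/30, "B"), Stream(0.0, 1/10, "F"),
                  Stream(0.0, 1/2, "E")], horizon=0.2))
```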
The
In the represented
The
As known, e.g. from patent application US 2009/234230, the emitted and received sequences included inside an image sequence for the second image (elastography image) can be composed of:
For ultrafast imaging of the low-frequency elastic wave, the plurality of emitted sequences of unfocussed waves may be a plurality of plane waves having a plurality of angles of inclination: there are a number N of tilted plane waves.
The second image process (elastography process) coherently sums the received sequences, as explained in the above-referenced patent application US 2009/234230.
Such a method can apply to any image sequence interleaving, such as presented on
As proposed, the second and third processes (elastography and flow imaging processes) can be combined and can use the same emitted and received ultrasound waves, saved in memory as raw data.
A first step (a beamforming step) consists in reconstructing images (depth × width × frames) from per-channel data (time samples × channels × number of acquisitions). The number of frames does not necessarily equal the number of acquisitions, as a single frame can be reconstructed from a set of transmit-receive events.
The beamformed image is denoted img(x,n), where x denotes the spatial coordinates and n the index of the reconstructed image.
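By way of a non-limiting illustration of this first step (a simplified nearest-sample delay-and-sum in Python/NumPy, without apodization or sub-sample interpolation; all names are hypothetical, and this is not the patented implementation), one frame may be reconstructed by coherently summing the contributions of the N tilted plane-wave acquisitions:

```python
import numpy as np

def beamform_frame(rf, angles, x_el, x_grid, z_grid, c=1540.0, fs=50e6):
    """Reconstruct one frame (depth x width) from per-channel data rf of
    shape (time samples, channels, acquisitions), one acquisition per
    tilted plane wave, by coherent delay-and-sum over the acquisitions."""
    n_t = rf.shape[0]
    X, Z = np.meshgrid(x_grid, z_grid)                      # imaging grid
    img = np.zeros(X.shape)
    for a, theta in enumerate(angles):
        t_tx = (Z * np.cos(theta) + X * np.sin(theta)) / c  # transmit delay
        for e, xe in enumerate(x_el):
            t_rx = np.sqrt(Z**2 + (X - xe)**2) / c          # receive delay
            idx = np.rint((t_tx + t_rx) * fs).astype(int)   # nearest RF sample
            valid = (idx >= 0) & (idx < n_t)
            img[valid] += rf[idx[valid], e, a]              # coherent sum
    return img
```

Repeating this reconstruction over successive sets of transmit-receive events yields the sequence img(x, n).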
A second step combines the reconstructed images.
For the flow process using the above unfocussed waves, the method may implement a spatio-temporal filtering step after beamforming, so as to differentiate tissue motion from flow motion. The spatio-temporal filtering step may be performed by a singular value decomposition (SVD) technique.
The spatio-temporal filtering step then comprises the following sub-steps:
Where List corresponds to the selected vectors.
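By way of a non-limiting illustration (Python with NumPy; all names are hypothetical), such an SVD-based spatio-temporal filter can be sketched as follows, the index set `keep` playing the role of the selected vectors List:

```python
import numpy as np

def svd_flow_filter(frames, threshold):
    """Spatio-temporal SVD filtering of the beamformed frames img(x, n):
    stack them into a Casorati matrix (space x time), decompose it, and
    rebuild the signal from the singular vectors above the threshold
    index. The low-order vectors capture the slow, spatially coherent
    tissue motion; the retained ones capture the flow."""
    nz, nx, n = frames.shape
    casorati = frames.reshape(nz * nx, n)            # space x time
    U, s, Vh = np.linalg.svd(casorati, full_matrices=False)
    keep = np.arange(threshold, s.size)              # selected vectors ("List")
    flow = (U[:, keep] * s[keep]) @ Vh[keep, :]
    return flow.reshape(nz, nx, n)
```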
The singular value threshold can be determined by different parameters. For example:
Number | Date | Country | Kind |
---|---|---|---|
16306146 | Sep 2016 | EP | regional |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/EP2017/072784 | Sep 11, 2017 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2018/046740 | Mar 15, 2018 | WO | A
Number | Name | Date | Kind |
---|---|---|---|
20030083578 | Abe | May 2003 | A1 |
20060084870 | Kim | Apr 2006 | A1 |
20080137921 | Simon | Jun 2008 | A1 |
20090149750 | Matsumura | Jun 2009 | A1 |
20090177089 | Govari | Jul 2009 | A1 |
20090234230 | Bercoff et al. | Sep 2009 | A1 |
20100130861 | Shimazaki | May 2010 | A1 |
20100179413 | Kadour | Jul 2010 | A1 |
20140039317 | Sato | Feb 2014 | A1 |
20150005630 | Jung | Jan 2015 | A1 |
20150164476 | Kong | Jun 2015 | A1 |
20150209012 | Oh | Jul 2015 | A1 |
20160249884 | Hashimoto | Sep 2016 | A1 |
20170055956 | Osumi | Mar 2017 | A1 |
20180172811 | Mosegaard | Jun 2018 | A1 |
Number | Date | Country |
---|---|---|
2 138 103 | Dec 2009 | EP |
3 034 004 | Jun 2016 | EP
2008284287 | Nov 2008 | JP |
2012061075 | Mar 2012 | JP |
Entry |
---|
Shaaban, “Real-time ultrasound elastography: Does it improve B-mode ultrasound characterization of solid breast lesions?”, Mar. 2012 (Year: 2012). |
English translation of Kato (JP 2012061075) (Year: 2012). |
English translation of Waki (JP 2008284287) (Year: 2008). |
Dumont, Douglas M. et al.: "Feasibility of an ARFI/B-mode/Doppler system for real-time, freehand scanning of the cardiovascular system", Medical Imaging 2011: Ultrasonic Imaging, Tomography, and Therapy, SPIE, 1000 20th St. Bellingham WA 98225-6705 USA, vol. 7968, No. 1, Mar. 3, 2011, pp. 1-12, XP060009692, DOI: 10.1117/12.877841. |
Marwa A. Shaaban et al.: "Real-time ultrasound elastography: Does it improve B-mode ultrasound characterization of solid breast lesions?", The Egyptian Journal of Radiology and Nuclear Medicine, Elsevier, Amsterdam, NL, vol. 43, No. 2, Feb. 11, 2012, pp. 301-309, XP028509527, ISSN: 0378-603X, [retrieved on Feb. 21, 2012], DOI: 10.1016/J.EJRNM.2012.02.002. |
International Search Report, dated Dec. 12, 2017, from corresponding PCT/EP2017/072784 application. |
Holländer et al., "Plane-Wave Compounding in Automated Breast Volume Scanning: A Phantom-Based Study", Ultrasound in Medicine & Biology, 2016, vol. 42, No. 10, pp. 2493-2503. |
Number | Date | Country | |
---|---|---|---|
20190200965 A1 | Jul 2019 | US |