This invention relates to medical ultrasound systems and, in particular, to ultrasound systems which perform two-dimensional (2D) and three-dimensional (3D) imaging.
Ultrasound probes are used to transmit ultrasound waves into the body and to receive the reflected waves. The reflected echoes are transferred to the ultrasound system for additional signal processing and final generation of an image that is displayed on the screen. With recent advances in technology, compact, high-density electronics can fit inside modern transducer probes, allowing much of the transmit/receive signal processing to be performed inside the probe itself. This has led to the development of matrix probes that can operate thousands of transducer elements of an ultrasound array, opening up the possibility of new probe geometries and imaging modes. One of the interesting new modes is 3D/4D (live 3D) imaging in B-mode as well as in color flow imaging.
For the last thirty years, two-dimensional planar imaging has been the conventional way to view and diagnose pathology and the standard for live ultrasound imaging. Three-dimensional imaging provides the ability to visualize anatomy in three dimensions, enabling clinicians to understand details of pathology from a perspective not previously possible with ultrasound. But 3D imaging also presents new challenges in image acquisition. While 3D has gained rapid adoption in specific scenarios such as rendering baby faces in fetal exams, it has not gained widespread acceptance in general abdominal and vascular imaging. Part of the challenge is the relative unfamiliarity of sonographers with manipulating the system to acquire views of the desired 3D/4D slices and planes. While 3D ultrasound imaging is a very powerful tool, it remains under-utilized primarily for two reasons. First, clinicians are frequently unfamiliar with the appearance of anatomy in 3D and its use in ultrasound diagnosis. Second, many of the system controls for 3D imaging and their interplay are complicated to use. Accordingly, it is desirable to simplify the controls for 3D ultrasound imaging so that the images needed for a diagnosis can be readily obtained by those who are unfamiliar with the use of 3D ultrasound.
It is an object of the present invention to provide a 3D ultrasound system which is simple to operate and control, preferably through automation of system setup and controls.
It is a further object of the present invention to simplify the manipulation of controls needed to acquire diagnostically useful images in the 3D mode.
In one aspect, the present invention includes an ultrasound system for an exam using 3D imaging. The ultrasound system can include a transducer array configured to transmit ultrasound waves to a target anatomy and receive ultrasonic echoes in response. The system can also include a processor configured to, based on the received ultrasonic echoes, produce a 2D ultrasound image including the target anatomy, and based on a system input, select a plurality of graphical icons stored in memory on the system, wherein each graphical icon comprises a different 3D thumbnail view of the target anatomy. The system can further include a display that can display the graphical icons and other user interface features.
In some aspects, an ultrasound system, based on a system input, can provide a user with graphical icons showing 3D thumbnail views of a particular anatomy of interest. In one example, a user input can be used to identify the particular anatomy being viewed in the 2D ultrasound image, and thereby cause the system to generate graphical icons representing different 3D views of the anatomy. In another example, a reference 2D image can be acquired by the system and used directly, or a model can be applied to the reference 2D image to identify the anatomy being imaged. With the reference 2D image or model data identifying the orientation of the probe relative to the target anatomy, an automated process invokes 3D graphical icons showing views appropriate for the intended exam. In an implementation of an ultrasound system of the present invention described below, the automated process comprises selection and execution of macro instructions which invoke appropriate 3D system imaging and control setups associated with the particular 3D icons showing the 3D views.
In the drawings:
Referring first to
As described herein, the present invention improves 3D workflows for ultrasound users by, for example, providing graphical icons corresponding to different 3D views of a target anatomy. The graphical icons are generated depending, for example, on the specific target anatomy being imaged and/or on the position and orientation of a transducer array in relation to the target anatomy. Different regions of anatomy have different standard views upon which a clinician will rely when making a diagnosis. For instance, if a carotid is being imaged in 2D, then upon activation of 3D imaging a select set of graphical icons showing 3D thumbnails of the target anatomy will appear for the user to choose from in order to more easily generate a 3D ultrasound image of the target anatomy. Similarly, if the user indicates the orientation of the carotid or the system identifies the orientation of the carotid (e.g., via segmentation modeling), then upon activation of 3D imaging a select set of graphical icons showing 3D thumbnails of the target anatomy will appear for the user to choose from in order to more easily generate a 3D ultrasound image.
In certain aspects, the graphical icons will be generated based on a system input that indicates which target anatomy is being imaged and/or position and orientation information regarding the transducer array in relation to the target anatomy. In some instances, the system input can include a user input that indicates orientation information of the target anatomy with respect to the transducer array. For example, a button indicating a specific target anatomy (e.g., a carotid) and/or orientation (e.g., long axis) can be selected by the user.
Alternatively, the system input can include a text-based or touch-based input configured to identify a viewing orientation of the target anatomy or a protocol configured to view the target anatomy.
In some aspects, the system input can be generated by a 2D ultrasound image in which the system uses the 2D ultrasound as a reference image. The reference image, for example, can be a long axis view of a carotid or a short axis view of the carotid as acquired in a 2D ultrasound image by the ultrasound system. The system can, for example, automatically identify the orientation of the target anatomy in the ultrasound image, or the system can be configured to display the different reference images for user selection. A user input can then include selection of a reference image (e.g., a long axis ultrasound image or a short axis ultrasound image) displayed on the display, and the reference image indicates orientation information of the target anatomy with respect to the transducer array.
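The selection of 3D thumbnail icons from an anatomy and orientation identification can be sketched as a simple lookup. This is an illustrative assumption only; the names `ICON_LIBRARY` and `select_icons`, and the icon identifiers, do not come from the description above.

```python
# Hypothetical sketch: mapping an identified anatomy/orientation pair to the
# set of 3D thumbnail icons presented to the user. All names are illustrative.
ICON_LIBRARY = {
    ("carotid", "long_axis"): ["volume_render", "mpr_a_plane", "cut_plane"],
    ("carotid", "short_axis"): ["volume_render", "mpr_b_plane"],
}

def select_icons(anatomy: str, orientation: str) -> list[str]:
    """Return the 3D thumbnail icon identifiers for a given reference view."""
    return ICON_LIBRARY.get((anatomy, orientation), [])
```

An unrecognized combination simply yields no icons, leaving the user in conventional 2D operation.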
If the user had set up the ultrasound system previously for the conduct of a specific exam type in order to acquire the reference image, as by initiating a carotid exam protocol on the system, for instance, the system would know to launch 3D imaging setup for a carotid exam. Features which automatically recognize probe orientation or the anatomical view of an image may be used to automatically enter orientation information in the Reference Image area 104. For instance, an EM (electromagnetic) tracking feature such as the Percunav™ option available for Philips Healthcare ultrasound systems, which automatically follows the orientation of a probe relative to a subject, can be accessed to automatically enter probe orientation information. Image recognition processors such as the “Heart Model” feature available for Philips Healthcare ultrasound systems can also provide orientation information from image recognition processing. See, for instance, the image recognition processors and functions described in US pat. pub. no. 2013/0231564 (Zagorchev et al.) and in US pat. pub. no. 2015/0011886 (Radulescu et al.). With the ultrasound system now knowing that the probe is visualizing a carotid bulb in a long-axis view, one or more display options are presented in the “Display” area. In this example the Display area is showing, on the left, a thumbnail of a 3D volume image acquired by the probe, and, on the right, a thumbnail of an MPR slice image of the carotid artery showing the plaque deposit which has been reconstructed from a 3D data set acquired by the probe.
As the user interface 100 of
Referring to
The beamformed receive signals are coupled to a fundamental/harmonic signal separator 22. The separator 22 acts to separate linear and nonlinear signals so as to enable the identification of the strongly nonlinear echo signals returned from microbubbles or tissue and fundamental frequency signals, both for image formation. The separator 22 may operate in a variety of ways such as by bandpass filtering the received signals in fundamental frequency and harmonic frequency bands (including super-, sub-, and/or ultra-harmonic signal bands), or by a process for fundamental frequency cancellation such as pulse inversion or amplitude modulated harmonic separation. Other pulse sequences with various amplitudes and pulse lengths may also be used for both linear signal separation and nonlinear signal enhancement. A suitable fundamental/harmonic signal separator is shown and described in international patent publication WO 2005/074805 (Bruce et al.). The separated fundamental and/or nonlinear (harmonic) signals are coupled to a signal processor 24 where they may undergo additional enhancement such as speckle removal, signal compounding, and noise elimination.
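The pulse inversion technique mentioned above can be illustrated with a toy numerical model. The quadratic response `r(x) = x + a*x**2` is an assumed stand-in for the medium's nonlinearity, not a model from the description: summing the echoes from a pulse and its inverted copy cancels the fundamental and leaves only the nonlinear component.

```python
import math

def echo(pulse, a=0.2):
    # Assumed toy nonlinearity r(x) = x + a*x**2 standing in for the
    # medium's (tissue or microbubble) response to the transmit pulse.
    return [x + a * x * x for x in pulse]

n = 64
f = 4  # cycles of the fundamental across the sample window
pulse = [math.sin(2 * math.pi * f * t / n) for t in range(n)]
inverted = [-x for x in pulse]

# Pulse inversion: summing the echoes of the pulse and its inverted copy
# cancels the linear (fundamental) term and retains twice the quadratic
# (second-harmonic) component, 2*a*x**2.
summed = [e1 + e2 for e1, e2 in zip(echo(pulse), echo(inverted))]
```

In this model the summed signal equals `2*a*pulse**2` at every sample, which oscillates at twice the fundamental frequency.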
The processed signals are coupled to a B mode processor 26 and a Doppler processor 28. The B mode processor 26 employs amplitude detection for the imaging of structures in the body such as muscle, tissue, and the walls of blood vessels. B mode images of structures of the body may be formed in either the harmonic mode or the fundamental mode. Tissue in the body and microbubbles both return these types of signals, and the stronger harmonic returns of microbubbles enable them to be clearly segmented in an image in most applications. The Doppler processor 28 processes temporally distinct signals from tissue and blood flow by fast Fourier transformation (FFT) or other Doppler detection techniques for the detection of motion of substances in the image field including blood cells and microbubbles. The Doppler processor may also include a wall filter to eliminate unwanted strong signal returns from tissue in the vicinity of flow such as vessel walls. The anatomic and Doppler flow signals produced by these processors are coupled to a scan converter 32 and a volume renderer 34, which produce image data of tissue structure, flow, or a combined image of both of these characteristics such as a color flow or power Doppler image. The scan converter converts echo signals with polar coordinates into image signals of the desired image format such as a sector image in Cartesian coordinates. The volume renderer 34 converts a 3D data set into a projected 3D image as viewed from a given reference point (look direction) as described in U.S. Pat. No. 6,530,885 (Entrekin et al.). As described therein, when the reference point of the rendering is changed the 3D image can appear to rotate in what is known as kinetic parallax. Also described in the Entrekin et al. patent is the representation of a 3D volume by planar images of different image planes reconstructed from a 3D image data set, a technique known as multiplanar reformatting.
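The coordinate mapping at the heart of scan conversion can be sketched as follows. This is a minimal illustration of the polar-to-Cartesian geometry only; a real scan converter also interpolates the samples onto a regular pixel grid.

```python
import math

def polar_to_cartesian(depth, angle_rad):
    """Map an echo sample at a given depth along a beam steered by
    angle_rad (0 = straight down) to Cartesian image coordinates."""
    x = depth * math.sin(angle_rad)  # lateral position
    z = depth * math.cos(angle_rad)  # axial position (into the body)
    return x, z
```

A sample at 10 cm depth on an unsteered beam lands directly below the transducer face, while the same depth on a beam steered to 90 degrees lands at the lateral edge of the sector.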
The volume renderer 34 can operate on image data in either rectilinear or polar coordinates as described in U.S. Pat. No. 6,723,050 (Dow et al.) The 2D or 3D images are coupled from the scan converter and volume renderer to an image processor 30 for further enhancement, buffering and temporary storage for display on an image display 40.
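The multiplanar reformatting mentioned above can be illustrated by slice extraction from a rectilinear 3D data set. The A/B/C plane naming follows a common ultrasound convention and is an assumption here, as is the nested-list storage layout.

```python
def mpr_slice(volume, plane, index):
    """Extract an axis-aligned 2D slice from a 3D data set stored as
    nested lists indexed volume[z][y][x]. Plane names are assumed:
    'a' = fixed z, 'b' = fixed y, 'c' = fixed x."""
    if plane == "a":                      # a y-x image
        return volume[index]
    if plane == "b":                      # a z-x image
        return [zslab[index] for zslab in volume]
    if plane == "c":                      # a z-y image
        return [[row[index] for row in zslab] for zslab in volume]
    raise ValueError(plane)
```

Reconstructing oblique planes through the volume requires interpolation between voxels, which this axis-aligned sketch omits.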
In accordance with the principles of the present invention, the user interface 38, which can be embodied, e.g., in both hard-key and touchpanel form, such as the touchpanel user interfaces 100 and 112 shown and described above, includes a user control by which the system user can identify one or more characteristics of a reference image, such as probe orientation, to the ultrasound system. An example of this was described in conjunction with
Alternatively, the target anatomy and/or position and orientation information can be provided by the user, as described above.
Depending on the system input (e.g., the user touching a button or identification of a reference image), a system controller configured to control the ultrasound system based on the system input can be coupled to a macro storage/processor 42. The macro storage/processor stores a number of macro instructions for image acquisition, formation and manipulation which are selected and arranged by the processor in a given number and sequence of macro instructions. The formulated sequence of macros is forwarded to the system controller, which sets up the ultrasound system for 3D imaging as commanded by the macros. A macro, as the term is used herein, comprises a single instruction that expands automatically into a set of instructions that perform a specific task. For example, a particular macro set can, when given a particular user input or a particular reference image orientation, acquire a 3D image data set in relation to the reference image, and form one or more MPR images from the 3D image data which are orthogonal to the orientation of the reference image. The macros can also turn on the user controls necessary to manipulate the 3D images.
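The macro expansion described above can be sketched as a table lookup that flattens macro names into low-level setup instructions. The macro names and instruction strings here are illustrative assumptions, not taken from the description.

```python
# Hypothetical macro table: each single macro name expands into the set of
# low-level instructions that performs its task. All names are illustrative.
MACROS = {
    "acquire_3d_from_reference": [
        "set_acquisition_mode 3d",
        "align_volume_to reference_plane",
    ],
    "show_orthogonal_mprs": [
        "form_mpr a_plane",
        "form_mpr b_plane",
        "form_mpr c_plane",
    ],
}

def expand(sequence):
    """Expand a sequence of macro names into the flat instruction list
    that a system controller would then carry out in order."""
    instructions = []
    for name in sequence:
        instructions.extend(MACROS[name])
    return instructions
```

The controller thus sees only the expanded instruction stream; the user-facing selection is a single icon choice.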
The sequence of macros produced by the macro storage/processor 42 is coupled to the system controller 12, which causes the instructions of image acquisition macros to be carried out by the beamformer controller 18 to acquire desired image data in relation to the reference image. The system controller causes image formation macros to be carried out by the volume renderer and scan converter for the formation of desired volume and planar images. The macros are also applied to a 3D display processor 36 to direct it to process 3D data sets from the volume renderer 34 into desired 3D images such as MPR images. For instance, a particular set of macros can cause the beamformer controller 18 to command acquisition of a 3D data set in relation to a reference image plane, the volume renderer 34 to render a volume image as viewed from a particular look direction, and the 3D display processor to form three orthogonal MPR images in relation to the center of the volume image. Such a set of macro instructions would cause the ultrasound system to produce the four images shown in the lower-left of the Display area of the user interface 100 of
A typical set of macro instructions assembled by the macro storage/processor 42 for a reference image which is a long-axis view of the carotid artery is:
This sequence of macros will carry out the following actions. The display format will be set for the display of two images (“2-up”). The images will be a 3D (volume) and a 2D planar (A-plane) image in the “A” orientation. The volume image will be a live image and the controls will operate to manipulate the volume image. The volume controls are listed in the Controls area of the user interface touchpanel display. When the user manipulates the trackball, it will cause the volume to rotate in the direction of the trackball motion. Cursors in the image which can be manipulated to rotate the volume about specific axes will all be operational and listed in the Controls area of the user interface. A circle will be detected in each MPR slice image and its diameter will be recorded. An ROI box will be shown on the center line of each circle, and its size set to 1.2 cm. The volume image will be initially viewed from the top, and the rotation of three orthogonal planes through the volume is set to zero. As is seen by the above, this set of macros not only commands the acquisition and formation of specific images, but also lists and activates user controls needed for manipulating and measuring them.
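The setup state established by the actions just listed can be gathered into a single configuration, sketched here with field names that are illustrative assumptions rather than system parameters from the description.

```python
def setup_long_axis_carotid():
    """Return a sketch of the 3D setup state that the macro set described
    above would establish; all field names are illustrative assumptions."""
    return {
        "layout": "2-up",                         # volume plus A-plane image
        "images": ["3d_volume_live", "a_plane"],
        "trackball": "rotate_volume",             # trackball rotates the volume
        "roi_box_cm": 1.2,                        # ROI box size on each circle
        "initial_view": "top",                    # volume first viewed from top
        "plane_rotation_deg": (0, 0, 0),          # three orthogonal planes
    }
```

Expressing the setup as data of this kind is what lets a stored macro sequence both configure the images and activate the matching controls in one step.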
Another example of a set of macros which may be assembled by the macro storage/processor 42 is the following:
This sequence of macros will cause the 2D reference plane to be used as the starting plane for 3D data acquisition. A single image will be displayed and the plane in which image data is acquired is set as a cut-plane through a region of interest and a 3D image is initially viewed from above. Of the three orthogonal planes through a 3D image, only the B plane is rotated, and by 90°. When the trackball is manipulated it will cause the cut-plane through the 3D image to change position.
It is seen from the foregoing that operation of an ultrasound system in the 3D mode is made much simpler for those who are unfamiliar with 3D ultrasound. A user can apply her expertise in standard 2D imaging to acquire a reference image, which is used by the system as the starting point for automatically setting up the system for 3D operation. The user informs the system of the characteristics of the reference image and the system sets up 3D operation for the desired exam type. The system can not only command the acquisition and formation of the appropriate 3D images for the exam, but can also initialize controls and measurement tools needed for the manipulation and assessment of the 3D images.
It should be noted that an ultrasound system suitable for use in an implementation of the present invention, and in particular the component structure of the ultrasound system described in
As used herein, the term “computer” or “module” or “processor” or “workstation” may include any processor-based or microprocessor-based system including systems using microcontrollers, reduced instruction set computers (RISC), ASICs, logic circuits, and any other circuit or processor capable of executing the functions described herein. The above examples are exemplary only, and are thus not intended to limit in any way the definition and/or meaning of these terms.
The computer or processor executes a set of instructions that are stored in one or more storage elements, in order to process input data. The storage elements may also store data or other information as desired or needed. The storage element may be in the form of an information source or a physical memory element within a processing machine. For example, the macro storage/processor described above comprises a digital memory device which stores digital macro instructions, and a processor that executes instructions which select the appropriate macros which, when executed, set up the ultrasound system for 3D imaging in accordance with the characteristics of the 2D reference image.
The set of instructions of an ultrasound system including those controlling the acquisition, processing, and transmission of ultrasound images as described above may include various commands that instruct a computer or processor as a processing machine to perform specific operations such as the methods and processes of the various embodiments of the invention. The set of instructions may be in the form of a software program. For instance, the ultrasound system of
Furthermore, the limitations of the following claims are not written in means-plus-function format and are not intended to be interpreted based on 35 U.S.C. 112, sixth paragraph, unless and until such claim limitations expressly use the phrase “means for” followed by a statement of function devoid of further structure.
The present application claims priority to U.S. Provisional Appl. No. 62/332,687, filed on May 6, 2016, the entirety of which is incorporated by reference herein.
| Filing Document | Filing Date | Country | Kind |
|---|---|---|---|
| PCT/IB2017/052559 | 5/3/2017 | WO | 00 |
| Number | Date | Country |
|---|---|---|
| 62332687 | May 2016 | US |