ULTRASONIC IMAGING SYSTEM WITH SIMPLIFIED 3D IMAGING CONTROLS

Abstract
An ultrasound system is quickly set up for 3D imaging of target anatomy by clicking on a quick-launch key. The system uses a system input, such as characteristics of a 2D reference image, to determine the 3D setup configuration. Based upon the system input, macro instructions can be selected and executed to set up the system for a 3D exam of the target anatomy in a selected mode, with clinically useful 3D images and the appropriate 3D controls enabled.
Description

This invention relates to medical ultrasound systems and, in particular, to ultrasound systems which perform two-dimensional (2D) and three-dimensional (3D) imaging.


Ultrasound probes are used to transmit ultrasound waves into the body as well as receive the reflected waves. The reflected echoes are transferred to the ultrasound system for additional signal processing and final generation of an image that is displayed on the screen. With recent advances in technology, compact, high-density electronics can fit inside modern-day transducer probes, allowing much of the transmit/receive signal processing to be performed inside the probe itself. This has led to the development of matrix probes that can operate thousands of transducer elements of an ultrasound array, opening up the possibility of new probe geometries and imaging modes. One of the interesting new modes is 3D/4D (live 3D) imaging in B mode as well as color flow imaging.


For the last thirty years, two-dimensional planar imaging has been the conventional way to view and diagnose pathology and the standard for live ultrasound imaging. Three-dimensional imaging provides the ability to visualize anatomy in three dimensions, enabling clinicians to understand details of pathology from a perspective not previously possible with ultrasound. But 3D imaging also presents new challenges in image acquisition. While 3D has gained rapid adoption in specific scenarios such as rendering baby faces in fetal exams, it has not gained widespread acceptance in general abdominal and vascular imaging. Part of the challenge is the relative unfamiliarity of sonographers with manipulating the system to acquire views of the desired 3D/4D slices and planes. While 3D ultrasound imaging is a very powerful tool, it remains under-utilized primarily due to two reasons. First, clinicians are frequently unfamiliar with the appearance of anatomy in 3D and its use in ultrasound diagnosis. Secondly, many of the system controls for 3D imaging and their interplay are complicated to use. Accordingly, it is desirable to simplify the controls for 3D ultrasound imaging so that the images needed for a diagnosis can be readily obtained by those who are unfamiliar with the use of 3D ultrasound.


It is an object of the present invention to provide a 3D ultrasound system which is simple to operate and control, preferably through automation of system setup and controls.


It is a further object of the present invention to simplify the manipulation of controls needed to acquire diagnostically useful images in the 3D mode.


In one aspect, the present invention includes an ultrasound system for an exam using 3D imaging. The ultrasound system can include a transducer array configured to transmit ultrasound waves to a target anatomy and receive ultrasonic echoes in response. The system can also include a processor configured to, based on the received ultrasonic echoes, produce a 2D ultrasound image including the target anatomy, and based on a system input, select a plurality of graphical icons stored in memory on the system, wherein each graphical icon comprises a different 3D thumbnail view of the target anatomy. The system can further include a display configured to present the graphical icons and other user interface features.


In some aspects, an ultrasound system can, based on a system input, provide a user with graphical icons showing 3D thumbnail views of a particular anatomy of interest. In one example, a user input can be used to identify the particular anatomy being viewed in the 2D ultrasound image, and thereby cause the system to generate graphical icons representing different 3D views of the anatomy. In another example, a reference 2D image can be acquired by the system and used directly, or a model can be applied to the reference 2D image to identify the anatomy being imaged. With the reference 2D image or model data identifying the orientation of the probe relative to the target anatomy, an automated process invokes 3D graphical icons showing views appropriate for the intended exam. In an implementation of an ultrasound system of the present invention described below, the automated process comprises selection and execution of macro instructions which invoke appropriate 3D system imaging and control setups associated with the particular 3D icons showing the 3D views.
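As a purely illustrative sketch of the automated process just described (in Python, with hypothetical table contents and function names that are not part of the actual system), a system input identifying the anatomy and probe orientation could be mapped to the set of 3D display-option icons offered to the user:

```python
# Purely illustrative sketch (hypothetical table contents and function names):
# a system input identifying the anatomy and probe orientation selects the 3D
# thumbnail icons (display options) offered to the user.
ICON_TABLE = {
    ("carotid", "long axis"): ["volume + A-plane MPR", "volume + 3 orthogonal MPRs"],
    ("carotid", "short axis"): ["short-axis volume + long-axis MPR"],
}

def icons_for_view(anatomy: str, orientation: str) -> list:
    """Return the 3D display-option icons appropriate for the identified view."""
    return ICON_TABLE.get((anatomy, orientation), [])

# Example: the reference image has been identified as a long-axis carotid view
for icon in icons_for_view("carotid", "long axis"):
    print("offer icon:", icon)
```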


In the drawings:






FIG. 1 illustrates a two dimensional image of target anatomy which is used as a reference image to set up the 3D controls of an ultrasound system in accordance with the principles of the present invention.



FIG. 2 illustrates an ultrasound system 2D imaging control panel with a quick-launch key to switch to a 3D imaging mode.



FIG. 3 illustrates a simplified 3D imaging control panel which is set up in accordance with the present invention.



FIG. 4 illustrates the control panel of FIG. 3 when partially set up for a carotid exam in accordance with the present invention.



FIG. 5 illustrates the control panel of FIG. 3 when fully set up for a carotid exam in B mode in accordance with the present invention.



FIG. 6 illustrates a number of 3D view options of the carotid which are based upon a given reference image.



FIG. 7 illustrates the control panel of FIG. 5 when switched to the color flow mode for a 3D exam of the carotid in accordance with the present invention.



FIG. 8 illustrates the control panel of FIG. 5 when switched to the vessel cast (power Doppler) mode for a 3D exam of the carotid in accordance with the present invention.



FIG. 9 illustrates in block diagram form a 2D/3D ultrasound system constructed in accordance with the principles of the present invention.





Referring first to FIG. 1, a 2D ultrasound image 110 of a carotid artery is shown. This image is a typical B mode image of the carotid bulb in a long-axis view as may be acquired during an examination of the carotid artery for signs of plaque. In the image 110, the lumen 70 of the vessel appears black, since this is the way blood flow appears in a B mode image. Also shown in this image is a deposit 72 of plaque in the artery. A clinician seeing this 2D image may want to know more about the plaque deposit 72, such as its shape, surface texture, and extent along or across the vessel, information which is not revealed by a long-axis 2D view. This information may be gleaned from a 3D image of the carotid, which is facilitated in an ultrasound system of the present invention by actuating a quick-launch key to switch to the 3D/4D mode. FIG. 2 shows a touchscreen user interface 112 for the 2D mode, which a clinician has used to acquire the 2D image of FIG. 1. At the right side of this 2D user interface are several 3D quick-launch keys indicated by the oval. In this example the clinician actuates the 4D quick-launch key 116, which switches the user interface to the 3D user interface shown in FIG. 3. The clinician has thereby switched the ultrasound system to the 3D mode with a single click of a control key.


As described herein, the present invention improves 3D workflows for ultrasound users by, for example, providing graphical icons corresponding to different 3D views of a target anatomy. The graphical icons are generated depending, for example, on the specific target anatomy being imaged and/or on the position and orientation of a transducer array in relation to the target anatomy. Different regions of anatomy have different standard views upon which a clinician will rely when making a diagnosis. For instance, if a carotid is being imaged in 2D, then upon activation of 3D imaging a select set of graphical icons showing 3D thumbnails of the target anatomy will appear for the user to choose from in order to more easily generate a 3D ultrasound image of the target anatomy. Similarly, if the user indicates the orientation of the carotid or the system identifies the orientation of the carotid (e.g., via segmentation modeling), then upon activation of 3D imaging a select set of graphical icons showing 3D thumbnails of the target anatomy will appear for the user to choose from in order to more easily generate a 3D ultrasound image.


In certain aspects, the graphical icons will be generated based on a system input that indicates which target anatomy is being imaged and/or position and orientation information regarding the transducer array in relation to the target anatomy. In some instances, the system input can include a user input that indicates orientation information of the target anatomy with respect to the transducer array. For example, a button indicating a specific target anatomy (e.g., a carotid) and/or orientation (e.g., long axis) can be selected by the user.


Alternatively, the system input can include a text-based or touch-based input configured to identify a viewing orientation of the target anatomy or protocol configured to view the target anatomy.


In some aspects, the system input can be generated from a 2D ultrasound image, in which case the system uses the 2D ultrasound image as a reference image. The reference image, for example, can be a long axis view of a carotid or a short axis view of the carotid as acquired in a 2D ultrasound image by the ultrasound system. The system can, for example, automatically identify the orientation of the target anatomy in the ultrasound image, or the system can be configured to display the different reference images for user selection. A user input can then include selection of a reference image (e.g., a long axis ultrasound image or a short axis ultrasound image) displayed on the display, and the reference image indicates orientation information of the target anatomy with respect to the transducer array.
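A minimal sketch of how such a system input might be represented and resolved is given below; the data fields, the fallback logic, and the placeholder classifier are assumptions for exposition only, not the system's actual data model:

```python
# Illustrative sketch only: one possible representation of a "system input".
# The field names, fallback logic, and placeholder classifier are assumptions
# for exposition, not the patent's data model.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SystemInput:
    anatomy: Optional[str] = None            # e.g. "carotid", from a button or text entry
    orientation: Optional[str] = None        # e.g. "long axis" or "short axis"
    reference_image: Optional[bytes] = None  # pixels of a 2D reference image, if supplied

def classify_orientation(image: bytes) -> str:
    """Stand-in for a segmentation/recognition model; always answers long axis here."""
    return "long axis"

def resolve_orientation(inp: SystemInput) -> str:
    """Prefer explicit user input; otherwise fall back to analyzing the reference image."""
    if inp.orientation:
        return inp.orientation
    if inp.reference_image is not None:
        return classify_orientation(inp.reference_image)
    raise ValueError("no orientation information available")

# Example: no orientation button was pressed, so the reference image determines it
print(resolve_orientation(SystemInput(anatomy="carotid", reference_image=b"\x00\x01")))
```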



FIG. 3 illustrates one layout of a 3D user interface 100 for an ultrasound system of the present invention. This panel has four areas for 3D/4D imaging. The first area 102 is the “Quick Launch” area, which allows a user to launch specific display modes, such as B mode, color flow and color flow Doppler modes. The second area 104, the “Reference Image” area, is an area in which the user is able to specify the probe orientation relative to the anatomy of interest. Once the probe orientation is defined to the ultrasound system, the system knows the anatomy being imaged and can set up the 3D image views most beneficial for a user who is diagnosing that anatomy. Two typical transducer orientations are a long-axis view and a short-axis view, for instance. The third area 106 is the “Display” area. For each quick-launch key and transducer orientation a number of different display options will be available. Each display option is characterized by one or more specific display parameters. Such parameters may include the number of MPRs (multiplanar reformatted slice images) and an associated volume display; ROI (region of interest) box size and location for each MPR image; volume image look direction; and rotation values for A, B, and C planes relative to a reference axis or plane. These display options may be shown in the area 106 in text, but preferably they are shown as thumbnails of the available images as illustrated below. The fourth area 108 is the “Controls” area which presents a list of 3D controls available to the user for each “Display” option.
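For illustration, the display parameters enumerated above (number of MPRs, ROI box size and location, look direction, and A/B/C plane rotations) might be grouped into a display-option record such as the following sketch; the field names and example values are assumptions, not the system's actual parameter set:

```python
# Illustrative sketch of a display-option record grouping the parameters named
# above; field names and example values are assumptions, not the actual system.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class RoiBox:
    center: Tuple[float, float, float]  # ROI box location within the volume, in mm
    size: Tuple[float, float, float]    # ROI box extent, in mm

@dataclass
class DisplayOption:
    num_mprs: int                       # number of MPR slice images shown
    show_volume: bool                   # whether an associated volume display is shown
    roi: RoiBox                         # ROI box size and location
    look_direction: str = "top"         # volume image look direction
    rotation_abc: Tuple[float, float, float] = (0.0, 0.0, 0.0)  # A, B, C plane rotations (degrees)

# Example: a long-axis carotid option with a volume and three orthogonal MPRs
carotid_long_axis = DisplayOption(
    num_mprs=3,
    show_volume=True,
    roi=RoiBox(center=(0.0, 0.0, 30.0), size=(20.0, 20.0, 20.0)),
)
print(carotid_long_axis)
```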



FIG. 4 illustrates the 3D touchscreen user interface of FIG. 3 when the carotid bulb image of FIG. 1 is specified to the system as the reference image for 3D setup. Specifying a reference image informs the system as to the orientation of the ultrasound probe relative to a target anatomy, in this case, the carotid artery. When the user clicks on the 2D image of the carotid bulb image 110 of FIG. 1 and then clicks in the “Reference Image” area 114, that image is identified to the ultrasound system as the reference image and a thumbnail of the image appears in area 114 of the user interface. In this example the user has further specified the orientation of the probe relative to the carotid bulb to the system by typing “Carotid” and “long axis” in the entry box that pops up below the defined reference image 114. In a given ultrasound system these manual user selection and entry steps may be performed automatically. For instance, once the 3D user interface 100 is launched, a thumbnail of the ultrasound image in the active image display area, the screen which is displaying the carotid bulb image 110 in this example, may automatically be caused to appear as the reference image thumbnail in area 114.


If the user had set up the ultrasound system previously for the conduct of a specific exam type in order to acquire the reference image, as by initiating a carotid exam protocol on the system, for instance, the system would know to launch 3D imaging setup for a carotid exam. Features which automatically recognize probe orientation or the anatomical view of an image may be used to automatically enter orientation information in the Reference Image area 104. For instance, an EM (electromagnetic) tracking feature such as the Percunav™ option available for Philips Healthcare ultrasound systems, which automatically follows the orientation of a probe relative to a subject, can be accessed to automatically enter probe orientation information. Image recognition processors such as the “Heart Model” feature available for Philips Healthcare ultrasound systems can also provide orientation information from image recognition processing. See, for instance, the image recognition processors and functions described in US pat. pub. no. 2013/0231564 (Zagorchev et al.) and in US pat. pub. no. 2015/0011886 (Radulescu et al.). With the ultrasound system now knowing that the probe is visualizing a carotid bulb in a long-axis view, one or more display options are presented in the “Display” area. In this example the Display area is showing, on the left, a thumbnail of a 3D volume image acquired by the probe, and, on the right, a thumbnail of an MPR slice image of the carotid artery showing the plaque deposit which has been reconstructed from a 3D data set acquired by the probe.



FIG. 5 illustrates another example of a quick-launched 3D user interface 100 for a carotid artery reference image. In this example two reference images have been specified to the system, the top thumbnail being a long-axis view of the carotid and the bottom thumbnail being a short-axis view of the carotid in an image plane normal to that of the long-axis view. The Display area in this example presents five display options: the top thumbnails are for a 3D volume image and an MPR slice of the long-axis view; the middle-left thumbnails are short-axis volume and long-axis MPR images; the middle-right thumbnails are a rotated long-axis volume view and a short-axis MPR image; the lower-left thumbnails are a long-axis 3D volume image and three orthogonal MPR slice views; and the lower-right thumbnails are a rotated 3D volume image and long-axis and short-axis MPR slices. The “Controls” area 108 is populated with a number of image generation and manipulation controls available to the user when working with any one of the display options, including volume and MPR controls, look direction selection, color control, and a volume rotation control. When the user clicks on one of the thumbnail Display options, the active image display screen of the ultrasound system starts displaying live images in accordance with the selected display option, and with the necessary imaging controls of the Control area activated. As illustrated in FIG. 6, an identified 2D reference image 110 can thus result in live 3D imaging in any of the available 3D imaging options 111-119.


As the user interface 100 of FIG. 5 illustrates, the “Quick Launch” area 102 may also enable the quick launch of 3D imaging in a specific imaging mode. The example user interface of FIG. 5 provides the user with mode choices of color flow and vessel cast (color power Doppler), in addition to the base B mode. When the user clicks on the “Color Flow” key in the user interface 100 of FIG. 5, the ultrasound system switches the imaging of the reference image(s) and the display options to the color flow mode as shown in FIG. 7. As this illustration shows, the lumen of the carotid artery in the images is now filled with color depicting the presence and direction of blood flow in the artery. Again, the user may perform detailed 3D imaging with any of the Display options shown in the center panel of the touchscreen user interface 100, now in the color flow mode. The other mode option which is available in the example of FIG. 5 is the “Vessel Cast” (color power Doppler) imaging mode. When the user clicks on this key, the images switch to display in the power Doppler mode as illustrated by the thumbnail images in the user interface 100 of FIG. 8. Other mode selection options may also be made available to a user in a particular implementation of the present invention.


Referring to FIG. 9, an ultrasound system constructed in accordance with the principles of the present invention for ease in 3D imaging is shown in block diagram form. A two-dimensional transducer array 10 is provided for transmitting ultrasonic waves for imaging and receiving echo signals for image formation. The array is located in an ultrasound probe and is typically mounted in the probe with an integral microbeamformer which controls the transmission of beams in two or three dimensions and the partial beamforming of received echo signals. The array and its microbeamformer are coupled to the mainframe ultrasound system by a transmit/receive (T/R) switch 16 which switches between transmission and reception and protects the receive channels of the system beamformer 20 from high energy transmit signals. The transmission of ultrasonic energy from the transducer array 10 and the formation of coherent echo signals by the microbeamformer and the system beamformer 20 are controlled by a transmit controller 18 coupled to the beamformer 20, which receives input from a system controller 12 in response to the user's operation of the user interface or control panel 38, such as selection of a particular imaging mode. The echo signals received by elements of the array 10 are partially beamformed by the probe microbeamformer, and the resultant partial sum signals are coupled to the system beamformer 20 where the beamforming process is completed to form coherent beamformed signals.
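The two-stage beamforming described above (partial sums formed in the probe microbeamformer, completed in the system beamformer) can be illustrated with a toy delay-and-sum sketch; the integer-sample delays, patch sizes, and random echo data are illustrative assumptions only:

```python
# Toy delay-and-sum sketch of the two-stage beamforming described above; the
# integer-sample delays, patch size, and random echo data are illustrative only.
import numpy as np

def partial_beamform(patch_signals: np.ndarray, delays: np.ndarray) -> np.ndarray:
    """Microbeamformer stage: delay and sum the elements of one small patch."""
    out = np.zeros(patch_signals.shape[1])
    for element, delay in zip(patch_signals, delays):
        out += np.roll(element, int(delay))   # crude integer-sample delay
    return out

def system_beamform(partial_sums: list) -> np.ndarray:
    """System beamformer stage: coherently sum the partial sums from all patches."""
    return np.sum(partial_sums, axis=0)

# Example: 4 patches of 8 elements, 256 samples of simulated echo data per element
rng = np.random.default_rng(0)
patches = [rng.standard_normal((8, 256)) for _ in range(4)]
partials = [partial_beamform(p, np.arange(8)) for p in patches]
beamformed_line = system_beamform(partials)
print(beamformed_line.shape)   # (256,)
```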


The beamformed receive signals are coupled to a fundamental/harmonic signal separator 22. The separator 22 acts to separate linear and nonlinear signals so as to enable the identification of both the strongly nonlinear echo signals returned from microbubbles or tissue and the fundamental frequency signals, each of which can be used for image formation. The separator 22 may operate in a variety of ways such as by bandpass filtering the received signals in fundamental frequency and harmonic frequency bands (including super-, sub-, and/or ultra-harmonic signal bands), or by a process for fundamental frequency cancellation such as pulse inversion or amplitude modulated harmonic separation. Other pulse sequences with various amplitudes and pulse lengths may also be used for both linear signal separation and nonlinear signal enhancement. A suitable fundamental/harmonic signal separator is shown and described in international patent publication WO 2005/074805 (Bruce et al.). The separated fundamental and/or nonlinear (harmonic) signals are coupled to a signal processor 24 where they may undergo additional enhancement such as speckle removal, signal compounding, and noise elimination.
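As a toy illustration of one of the fundamental-cancellation techniques mentioned above, pulse inversion, the following sketch sums and differences echoes from a pulse and its polarity-inverted copy on synthetic signals; a real separator operates on beamformed RF data, and the frequencies and amplitudes here are arbitrary:

```python
# Toy pulse-inversion sketch on synthetic signals: summing echoes from a pulse
# and its inverted copy cancels the fundamental and keeps the (even) harmonic,
# while differencing does the opposite. Frequencies and amplitudes are arbitrary.
import numpy as np

t = np.linspace(0.0, 1.0, 1000)
f0 = 20.0   # "fundamental" transmit frequency in arbitrary units

def echo(polarity: float) -> np.ndarray:
    linear = polarity * np.sin(2 * np.pi * f0 * t)     # flips with transmit polarity
    harmonic = 0.1 * np.sin(2 * np.pi * 2 * f0 * t)    # second harmonic does not flip
    return linear + harmonic

e_pos, e_neg = echo(+1.0), echo(-1.0)
harmonic_signal = 0.5 * (e_pos + e_neg)      # fundamental cancels, harmonic remains
fundamental_signal = 0.5 * (e_pos - e_neg)   # harmonic cancels, fundamental remains
```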


The processed signals are coupled to a B mode processor 26 and a Doppler processor 28. The B mode processor 26 employs amplitude detection for the imaging of structures in the body such as muscle, tissue, and the walls of blood vessels. B mode images of structures of the body may be formed in either the harmonic mode or the fundamental mode. Tissue in the body and microbubbles each return both types of signals, and the stronger harmonic returns of microbubbles enable microbubbles to be clearly segmented in an image in most applications. The Doppler processor 28 processes temporally distinct signals from tissue and blood flow by fast Fourier transformation (FFT) or other Doppler detection techniques for the detection of motion of substances in the image field including blood cells and microbubbles. The Doppler processor may also include a wall filter to eliminate unwanted strong signal returns from tissue in the vicinity of flow such as vessel walls. The anatomic and Doppler flow signals produced by these processors are coupled to a scan converter 32 and a volume renderer 34, which produce image data of tissue structure, flow, or a combined image of both of these characteristics such as a color flow or power Doppler image. The scan converter converts echo signals with polar coordinates into image signals of the desired image format such as a sector image in Cartesian coordinates. The volume renderer 34 converts a 3D data set into a projected 3D image as viewed from a given reference point (look direction) as described in U.S. Pat. No. 6,530,885 (Entrekin et al.). As described therein, when the reference point of the rendering is changed the 3D image can appear to rotate in what is known as kinetic parallax. Also described in the Entrekin et al. patent is the representation of a 3D volume by planar images of different image planes reconstructed from a 3D image data set, a technique known as multiplanar reformatting. The volume renderer 34 can operate on image data in either rectilinear or polar coordinates as described in U.S. Pat. No. 6,723,050 (Dow et al.). The 2D or 3D images are coupled from the scan converter and volume renderer to an image processor 30 for further enhancement, buffering and temporary storage for display on an image display 40.
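As a toy illustration of the scan conversion step described above, the following sketch resamples echo data from a polar (range, angle) grid onto a Cartesian pixel grid using nearest-neighbour lookup; real systems use interpolation, and all sizes here are illustrative assumptions:

```python
# Toy sketch of scan conversion: resampling echo data on a polar (range, angle)
# grid onto a Cartesian pixel grid for a sector image. Nearest-neighbour lookup
# is used for brevity; real scan converters interpolate.
import numpy as np

def scan_convert(polar: np.ndarray, angles: np.ndarray, max_depth: float,
                 nx: int = 200, nz: int = 200) -> np.ndarray:
    n_rays, n_samples = polar.shape
    x = np.linspace(-max_depth, max_depth, nx)
    z = np.linspace(0.0, max_depth, nz)
    xx, zz = np.meshgrid(x, z)
    r = np.hypot(xx, zz)                         # radial distance of each pixel
    th = np.arctan2(xx, zz)                      # angle of each pixel from the probe axis
    ri = np.clip((r / max_depth * (n_samples - 1)).astype(int), 0, n_samples - 1)
    ai = np.clip(np.searchsorted(angles, th), 0, n_rays - 1)
    img = polar[ai, ri]
    img[(th < angles[0]) | (th > angles[-1]) | (r > max_depth)] = 0.0  # outside sector
    return img

# Example: 64 rays over +/-30 degrees, 512 samples per ray of random echo data
angles = np.deg2rad(np.linspace(-30, 30, 64))
polar = np.random.default_rng(1).random((64, 512))
image = scan_convert(polar, angles, max_depth=60.0)
print(image.shape)   # (200, 200)
```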


In accordance with the principles of the present invention, the user interface 38, which can be embodied, e.g., in both hard key and touch panel form, such as the touchpanel user interfaces 100 and 112 shown and described above, includes a user control by which the system user can identify one or more characteristics of a reference image, such as probe orientation, to the ultrasound system. An example of this was described in conjunction with FIGS. 3 and 4.


Alternatively, the target anatomy and/or position and orientation information can be provided by the user, as described above.


Depending on the system input (e.g., the user touching a button or identification of a reference image), a system controller configured to control the ultrasound system based on the system input can be coupled to a macro storage/processor 42. The macro storage/processor stores a number of macro instructions for image acquisition, formation and manipulation which are selected and arranged by the processor in a given number and sequence of macro instructions. The formulated sequence of macros is forwarded to the system controller, which sets up the ultrasound system for 3D imaging as commanded by the macros. A macro, as the term is used herein, comprises a single instruction that expands automatically into a set of instructions that perform a specific task. For example, a particular macro set can, when given a particular user input or a particular reference image orientation, acquire a 3D image data set in relation to the reference image, and form one or more MPR images from the 3D image data which are orthogonal to the orientation of the reference image. The macros can also turn on the user controls necessary to manipulate the 3D images.
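A hypothetical sketch of this macro mechanism, in which a single macro name expands into a sequence of setup steps that are executed in order, is shown below; the macro library contents and step strings are illustrative assumptions, not the system's actual macro instructions:

```python
# Hypothetical sketch of the macro mechanism: one macro name expands into a
# sequence of setup steps executed in order. Library contents are illustrative.
MACRO_LIBRARY = {
    "carotid_long_axis_b_mode": [
        "acquire 3D data set in relation to the reference image plane",
        "form A-plane MPR orthogonal to the reference image",
        "render volume with look direction set to top",
        "enable volume rotation and MPR manipulation controls",
    ],
}

def expand_macro(name: str) -> list:
    """Expand a single macro instruction into its constituent setup steps."""
    return MACRO_LIBRARY[name]

def run_macros(names: list) -> None:
    """Execute each macro's steps; a real system would drive its controllers here."""
    for name in names:
        for step in expand_macro(name):
            print("executing:", step)

run_macros(["carotid_long_axis_b_mode"])
```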


The sequence of macros produced by the macro storage/processor 42 is coupled to the system controller 12, which causes the instructions of image acquisition macros to be carried out by the beamformer controller 18 to acquire desired image data in relation to the reference image. The system controller causes image formation macros to be carried out by the volume renderer and scan converter for the formation of desired volume and planar images. The macros are also applied to a 3D display processor 36 to direct it to process 3D data sets from the volume renderer 34 into desired 3D images such as MPR images. For instance, a particular set of macros can cause the beamformer controller 18 to command acquisition of a 3D data set in relation to a reference image plane, the volume renderer 34 to render a volume image as viewed from a particular look direction, and the 3D display processor to form three orthogonal MPR images in relation to the center of the volume image. Such a set of macro instructions would cause the ultrasound system to produce the four images shown in the lower-left of the Display area of the user interface 100 of FIG. 5, for example. The 3D display processor 36 is also responsive to information about the 2D reference image to generate the images and graphics shown in the Display and Controls areas 106 and 108 of the touchscreen user interface 100, including the thumbnails of the display option images and the initial control settings.


A typical set of macro instructions assembled by the macro storage/processor 42 for a reference image which is a long-axis view of the carotid artery is:

TABLE 1
1. Select display format as “2-up”
2. Select the images to be displayed as “Volume and A-plane”
3. Make “volume” image the active image
4. Set “Vol” as the active control
5. Set trackball arbitration to “rotate volume”
6. Set all 3 rotate cursors as active
7. Detect circle in B-plane in MPRs and record diameter of detected circle
8. Place ROI box along center line of detected circle in B-plane
9. Select ROI box size as 1.2 × (diameter of detected circle in step 7)
10. Select look direction as “top”
11. Set A-, B-, and C-plane rotate values to 0

This sequence of macros will carry out the following actions. The display format will be set for the display of two images (“2-up”). The images will be a 3D (volume) image and a 2D planar (A-plane) image in the “A” orientation. The volume image will be a live image and the controls will operate to manipulate the volume image. The volume controls are listed in the Controls area of the user interface touchpanel display. When the user manipulates the trackball, it will cause the volume to rotate in the direction of the trackball motion. Cursors in the image which can be manipulated to rotate the volume about specific axes will all be operational and listed in the Controls area of the user interface. A circle will be detected in each MPR slice image and its diameter will be recorded. An ROI box will be shown on the center line of each circle, and its size set to 1.2 times the diameter of the detected circle. The volume image will be initially viewed from the top, and the rotation of three orthogonal planes through the volume is set to zero. As seen from the above, this set of macros not only commands the acquisition and formation of specific images, but also lists and activates user controls needed for manipulating and measuring them.
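For illustration only, the Table 1 sequence could be captured as structured macro data along the lines of the following sketch; the keys, values, and dictionary format are assumptions made for exposition, not the system's actual macro encoding:

```python
# Table 1 re-expressed as illustrative structured macro data; keys and values
# are assumptions made for exposition, not the system's actual macro encoding.
TABLE_1_MACROS = [
    {"action": "set_display_format", "value": "2-up"},
    {"action": "set_displayed_images", "value": ["volume", "A-plane"]},
    {"action": "set_active_image", "value": "volume"},
    {"action": "set_active_control", "value": "Vol"},
    {"action": "set_trackball_arbitration", "value": "rotate volume"},
    {"action": "enable_rotate_cursors", "value": ["A", "B", "C"]},
    {"action": "detect_circle", "plane": "B", "record": "diameter"},
    {"action": "place_roi_box", "plane": "B", "along": "center line of detected circle"},
    {"action": "set_roi_box_size", "value": "1.2 * detected_diameter"},
    {"action": "set_look_direction", "value": "top"},
    {"action": "set_plane_rotations", "value": {"A": 0, "B": 0, "C": 0}},
]

for macro in TABLE_1_MACROS:
    print(macro["action"])
```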


Another example of a set of macros which may be assembled by the macro storage/processor 42 is the following:

TABLE 2
1. Use the 2D reference plane as the start acquisition plane
2. Use display type as 1-up and acquisition plane as the ROI cut-plane
3. Set look direction to Top view
4. Set rotate values to (0, 90, 0) for A, B, and C planes
5. Set trackball arbitration to “volume slice”

This sequence of macros will cause the 2D reference plane to be used as the starting plane for 3D data acquisition. A single image will be displayed, the plane in which image data is acquired is set as a cut-plane through a region of interest, and the 3D image is initially viewed from above. Of the three orthogonal planes through the 3D image, only the B plane is rotated, and by 90°. When the trackball is manipulated it will cause the cut-plane through the 3D image to change position.


It is seen from the foregoing that operation of an ultrasound system in the 3D mode is made much simpler for those who are unfamiliar with 3D ultrasound. A user can apply her expertise in standard 2D imaging to acquire a reference image, which is used by the system as the starting point for automatically setting up the system for 3D operation. The user informs the system of the characteristics of the reference image and the system sets up 3D operation for the desired exam type. The system can not only command the acquisition and formation of the appropriate 3D images for the exam, but can also initialize controls and measurement tools needed for the manipulation and assessment of the 3D images.


It should be noted that an ultrasound system suitable for use in an implementation of the present invention, and in particular the component structure of the ultrasound system described in FIG. 9, may be implemented in hardware, software or a combination thereof. The various embodiments and/or components of an ultrasound system, for example, the modules, or components and controllers therein, also may be implemented as part of one or more computers or processors. The computer or processor may include a microprocessor. The microprocessor may be connected to a communication bus, for example, to access a PACS system or a data network. The computer or processor may also include a memory. The memory devices may include Random Access Memory (RAM) and Read Only Memory (ROM) or other digital or analog signal storage components. The computer or processor further may include a storage device, which may be a hard disk drive or a removable storage drive such as a floppy disk drive, optical disk drive, solid-state thumb drive, and the like. The storage device may also be other similar means for loading computer programs or other instructions into the computer or processor.


As used herein, the term “computer” or “module” or “processor” or “workstation” may include any processor-based or microprocessor-based system including systems using microcontrollers, reduced instruction set computers (RISC), ASICs, logic circuits, and any other circuit or processor capable of executing the functions described herein. The above examples are exemplary only, and are thus not intended to limit in any way the definition and/or meaning of these terms.


The computer or processor executes a set of instructions that are stored in one or more storage elements, in order to process input data. The storage elements may also store data or other information as desired or needed. The storage element may be in the form of an information source or a physical memory element within a processing machine. For example, the macro storage/processor described above comprises a digital memory device which stores digital macro instructions, and a processor that executes instructions which select the appropriate macros which, when executed, set up the ultrasound system for 3D imaging in accordance with the characteristics of the 2D reference image.


The set of instructions of an ultrasound system including those controlling the acquisition, processing, and transmission of ultrasound images as described above may include various commands that instruct a computer or processor as a processing machine to perform specific operations such as the methods and processes of the various embodiments of the invention. The set of instructions may be in the form of a software program. For instance, the ultrasound system of FIG. 9 may be programmed with instructions executing an algorithm which selects macro instructions from storage that are executed to set up the system for a desired exam type with 3D imaging. The software may be in various forms such as system software or application software, and may be embodied as a tangible and non-transitory computer readable medium. Further, the software may be in the form of a collection of separate programs or modules, a program module within a larger program or a portion of a program module. The software also may include modular programming in the form of object-oriented programming. The processing of input data by the processing machine may be in response to operator commands, or in response to results of previous processing, or in response to a request made by another processing machine.


Furthermore, the limitations of the following claims are not written in means-plus-function format and are not intended to be interpreted based on 35 U.S.C. 112, sixth paragraph, unless and until such claim limitations expressly use the phrase “means for” followed by a statement of function devoid of further structure.

Claims
  • 1. An ultrasound system for an exam using 3D imaging, comprising: a transducer array configured to transmit ultrasound waves to a target anatomy and receive ultrasonic echoes in response; a processor configured to: based on the received ultrasonic echoes, produce a 2D ultrasound image including the target anatomy; based on a system input, select a plurality of graphical icons stored in memory on the system, wherein each graphical icon comprises a different thumbnail view of the target anatomy and is configured, responsive to selection of the graphical icon, to cause the ultrasound system to generate an ultrasound image of a view corresponding to the thumbnail view of the selected graphical icon; and a display adapted to display the plurality of graphical icons.
  • 2. The ultrasound system of claim 1, wherein the system input comprises a user input that indicates orientation information of the target anatomy with respect to the transducer array.
  • 3. The ultrasound system of claim 2, wherein the user input comprises a text-based or touch-based input configured to identify a viewing orientation of the target anatomy or protocol configured to view the target anatomy.
  • 4. The ultrasound system of claim 2, wherein the 2D ultrasound image is a reference image, and the user input comprises a selection of the reference image displayed on the display, wherein the reference image indicates orientation information of the target anatomy with respect to the transducer array.
  • 5. The ultrasound system of claim 1, wherein the system input comprises orientation information of the target anatomy with respect to the transducer array generated by a segmentation model of the target anatomy.
  • 6. The ultrasound system of claim 1, further comprising a volume renderer adapted to produce at least one 3D image of the target anatomy that corresponds to at least one of the different thumbnail views in the graphical icons.
  • 7. The ultrasound system of claim 6, further configured to generate a multiplanar view of the target anatomy from 3D image data used to produce the at least one 3D image.
  • 8. The ultrasound system of claim 1, further comprising a system controller that, based on the system input, is adapted to modify 3D image and control settings of the system with a 3D display processor.
  • 9. The ultrasound system of claim 8, wherein the system controller is adapted to modify 3D image and control settings of the system according to a macro stored in memory.
  • 10. The ultrasound system of claim 9, wherein the system controller is further coupled to receive macros from a macro storage and controller, and outputs coupled to the image processor and the volume renderer.
  • 11. The ultrasound system of claim 10, further comprising: a beamformer having an input coupled to receive signals from the transducer array and an output coupled to the processor; anda beamformer controller having an output coupled to the beamformer,wherein an output of the system controller is further coupled to the beamformer controller.
  • 12. The ultrasound system of claim 11, wherein the system controller is further adapted to control the beamformer, the image processor, and the volume renderer in response to its receipt of the macro.
  • 13. The ultrasound system of claim 1, wherein the image processor further comprises a B mode processor and a Doppler processor.
  • 14. The ultrasound system of claim 1, further comprising a user interface adapted to launch a 3D imaging mode in response to actuation of a user control.
  • 15. The ultrasound system of claim 4, further comprising a user interface having an area adapted to contain characteristics of the reference image.
Parent Case Info

The present application claims priority to U.S. Provisional Appl. No. 62/332,687, filed on May 6, 2016, the entirety of which is incorporated by reference herein.

PCT Information
Filing Document Filing Date Country Kind
PCT/IB2017/052559 5/3/2017 WO 00
Provisional Applications (1)
Number Date Country
62332687 May 2016 US