The present invention relates to ultrasound imaging, and, more particularly, to an ultrasound imaging system that assists in the positioning of an ultrasound probe.
Correctly positioning an ultrasound probe such that a diagnostically relevant image is produced is a skill often only obtained after training and consistent ultrasound use. This initial “training period” necessary to become proficient in ultrasound imaging may be a contributing factor to the current underutilization of ultrasound by non-sonographers.
What is needed in the art is an ultrasound imaging system, as in the present invention, which assists a person not experienced in ultrasound imaging in successful image acquisition, via system assisted positioning of an ultrasound probe, such that an image of a location of interest under, i.e., in the imaging view of, the ultrasound probe can be displayed.
The present invention provides an ultrasound imaging system that assists in image acquisition, and in positioning of an ultrasound probe, such that an image of a location of interest under, i.e., in the imaging view of, the probe can be displayed. For example, the ultrasound imaging system assists in the positioning of an ultrasound probe such that a specific image containing a medical device and/or the surrounding area can automatically be presented to the user. The system may further be used to create three-dimensional (3D) images of underlying structures, which may convey additional information regarding the state of the underlying anatomy. This may assist one performing peripheral arterial disease (PAD) or other interventional procedures.
The invention in one form is directed to an ultrasound imaging system that includes an electromagnetic (EM) field generator configured to generate an EM locator field. An interventional medical device is defined by an elongate body having a distal tip and a distal end portion extending proximally from the distal tip. The interventional medical device has a first tracking element mounted at the distal end portion of the interventional medical device. The first tracking element is configured to generate tip location data based on the EM locator field. An ultrasound probe has a probe housing, an ultrasound transducer mechanism, and a second tracking element. The probe housing has a handle portion and a head portion. The ultrasound transducer mechanism and the second tracking element are mounted to the probe housing. The ultrasound transducer mechanism has an active ultrasound transducer array configured to generate two-dimensional ultrasound slice data at any of a plurality of discrete imaging locations within a three-dimensional imaging volume associated with the head portion. The second tracking element is configured to generate probe location data based on the EM locator field. A display screen is configured to display an ultrasound image. A processor circuit is communicatively coupled to the first tracking element, the second tracking element, the ultrasound transducer mechanism, and the display screen. The processor circuit is configured to execute program instructions to process the two-dimensional ultrasound slice data to generate the ultrasound image for display at the display screen. Also, the processor circuit is configured to generate a positioning signal based on the tip location data and the probe location data to dynamically position the active ultrasound transducer array at a desired imaging location of the plurality of discrete imaging locations so that the two-dimensional ultrasound slice data includes at least the distal tip of the interventional medical device so long as a location of the distal tip of the interventional medical device remains in the three-dimensional imaging volume.
A further version of the invention lies in the electromagnetic field generator adapted for use in such a system, the interventional medical device adapted for use in such a system, an ultrasound probe adapted for use in such a system, a display screen adapted for use in such a system, and a processor circuit adapted for use in such a system. An alternative version of the invention lies in a system comprising a combination of any of the objects recited in the previous sentence.
The invention in another form is directed to a method of operating an ultrasound imaging system, including acquiring a position of a first tracking element associated with an interventional medical device; acquiring a position of a second tracking element associated with an ultrasound probe; determining an ultrasound imaging plane position of the ultrasound probe based on the position of the second tracking element; determining an offset distance between the position of the first tracking element of the interventional medical device and the ultrasound imaging plane position; and driving an ultrasound transducer mechanism to position an active ultrasound transducer array of the ultrasound probe at a determined point of convergence as defined by the offset distance.
In accordance with another aspect of the invention, a motion indicator is located on at least one of the ultrasound probe and the display screen. The processor circuit is operably coupled to the motion indicator, wherein if the distal tip of the interventional medical device is presently located outside the three-dimensional imaging volume, a visual prompt is generated at the motion indicator to prompt the user to move the head portion of the ultrasound probe in a particular direction to a general location such that the distal tip of the interventional medical device resides in the three-dimensional imaging volume.
In accordance with another aspect of the invention, a third tracking element is attached to a patient, wherein, when the third tracking element is energized by the EM field generator, the third tracking element generates six-axis patient location data, which is supplied to the processor circuit. The processor circuit processes the six-axis patient location data and assigns location information for images captured by the active ultrasound transducer array to known positions within a 3D volume referenced from the third tracking element.
In accordance with another aspect of the invention, the ultrasound imaging system has a three-dimensional imaging mode, wherein with the ultrasound probe held in a fixed position over an area of interest, a scanning signal is supplied to the ultrasound transducer mechanism to scan the active ultrasound transducer array over at least a portion of the possible imaging volume located below the transducer array. The active transducer array is repeatedly actuated during the scan to generate a plurality of sequential two-dimensional ultrasound data slices which are combined to form three-dimensional ultrasound volumetric data from which a three-dimensional ultrasound image is generated.
In accordance with another aspect of the invention, the active ultrasound transducer array is operated to generate multiple sets of ultrasound image data that includes metadata describing the location of the scan within the three-dimensional volume. The multiple sets of ultrasound image data are summed to generate composite ultrasound image data.
In accordance with another aspect of the invention, a desired image plane is defined in the three-dimensional ultrasound volumetric data. At least one synthetic scan plane is generated corresponding to the desired image plane.
In accordance with another aspect of the invention, a first two-dimensional ultrasound image slice is generated from a series of two-dimensional B-scan ultrasound image slices acquired from the three-dimensional ultrasound volumetric data. The first two-dimensional ultrasound image slice includes a particular region of interest. The first two-dimensional ultrasound image slice lies in a first imaging plane different from that of the native B-scan imaging plane of the series of two-dimensional ultrasound image slices. At least one slice selection slider provides a sequential parallel variation from the first two-dimensional ultrasound image slice to manually select a second two-dimensional ultrasound image slice parallel to the first two-dimensional ultrasound image slice, wherein the second two-dimensional ultrasound image slice lies on either side of the first two-dimensional ultrasound image slice.
In accordance with another aspect of the invention, an orientation of the ultrasound image that is displayed on a display screen is adjusted such that a vertical top of the acquired ultrasound image data is always rendered as “up” on the display screen relative to the position of the patient, regardless of the actual orientation of the ultrasound probe relative to the patient.
Another aspect of the invention is directed to a method of operating an ultrasound imaging system, including acquiring a position of a first tracking element associated with an interventional medical device; acquiring a position of a second tracking element associated with an ultrasound probe; determining an ultrasound imaging plane position of the ultrasound probe based on the position of the second tracking element; determining an offset distance between the position of the first tracking element of the interventional medical device and the ultrasound imaging plane position; and using the offset distance to dynamically control at least one ultrasound imaging setting of the ultrasound imaging system in near real time. As used herein, the term “near real time” means real time as limited by data acquisition and processing speed of the processing system. The at least one ultrasound imaging setting may include ultrasound focus, such that a lateral resolution is optimized at a depth that contains the interventional medical device. Also, the at least one ultrasound imaging setting may include a depth setting, such that a depth of imaging is automatically adjusted to match a depth of the interventional medical device. Also, the at least one ultrasound imaging setting may include zoom, wherein an imaging window can be “zoomed” such that a larger view of an area of interest is automatically displayed to the user.
The above-mentioned and other features and advantages of this invention, and the manner of attaining them, will become more apparent and the invention will be better understood by reference to the following description of embodiments of the invention taken in conjunction with the accompanying drawings, wherein:
Corresponding reference characters indicate corresponding parts throughout the several views. The exemplifications set out herein illustrate embodiments of the invention, and such exemplifications are not to be construed as limiting the scope of the invention in any manner.
Referring now to the drawings, and more particularly to
Ultrasound imaging system 10 includes an electromagnetic (EM) field generator 12, an ultrasound console 14, and a handheld ultrasound probe 16. Ultrasound probe 16 is connected to ultrasound console 14 by a flexible electrical cable 17. Supplemental to ultrasound imaging system 10 is an interventional medical device 18.
As used herein, the term “interventional medical device” is an elongate intrusive medical device that is configured to be inserted into the tissue, vessel, or cavity of a patient. In the context of the present invention, interventional medical device 18 may be, for example, a catheter, a lesion crossing catheter such as the CROSSER® Catheter available from C. R. Bard, Inc., a guide wire, a sheath, an angioplasty balloon, a stent delivery catheter, or a needle. Interventional medical device 18 may be considered a part of the overall ultrasound imaging system 10 or, alternatively, an auxiliary part of ultrasound imaging system 10 provided as a separate item.
Ultrasound imaging system 10 is configured to track the location of the ultrasound probe 16 and interventional medical device 18, and in turn, to operate ultrasound probe 16 such that an active ultrasound transducer array of ultrasound probe 16 is dynamically positioned to image a desired portion of interventional medical device 18, as further described below.
In the present embodiment, ultrasound console 14 includes a mobile housing 20, to which is mounted a graphical user interface 22, and a processor circuit 24. Graphical user interface 22 may be in the form of a touch-screen display 26 having a display screen 28. Graphical user interface 22 is used in displaying information to the user, and accommodates user input via touch-screen display 26. For example, touch-screen display 26 is configured to display an ultrasound image formed from two-dimensional ultrasound slice data provided by ultrasound probe 16, to display virtual location information of tracked elements within a 3D volume, and to display prompts intended to guide the user in correctly positioning ultrasound probe 16 above the area of interest.
Processor circuit 24 is an electrical circuit that has data processing capability and command generating capability, and in the present embodiment has a microprocessor 24-1 and associated non-transitory electronic memory 24-2. Microprocessor 24-1 and associated non-transitory electronic memory 24-2 are commercially available components, as will be recognized by one skilled in the art. Microprocessor 24-1 may be in the form of a single microprocessor, or two or more parallel microprocessors, as is known in the art. Non-transitory electronic memory 24-2 may include multiple types of digital data memory, such as random access memory (RAM), non-volatile RAM (NVRAM), read only memory (ROM), and/or electrically erasable programmable read-only memory (EEPROM). Non-transitory electronic memory 24-2 may further include mass data storage in one or more of the electronic memory forms described above, or on a computer hard disk drive or optical disk. Alternatively, processor circuit 24 may be assembled as one or more Application Specific Integrated Circuits (ASICs).
Processor circuit 24 processes program instructions received from a program source, such as software or firmware, to which processor circuit 24 has electronic access. More particularly, processor circuit 24 is configured, as more fully described below, to process location signals received from ultrasound probe 16 and interventional medical device 18, and to generate a digital positioning signal that is conditioned and provided as a control output to ultrasound probe 16. More particularly, the digital positioning signal and control output correspond to a coordinate in the scan axis, e.g., the y-axis, of ultrasound probe 16 where the active ultrasound transducer array of ultrasound probe 16 is to be positioned.
Processor circuit 24 is communicatively coupled to a probe input/output (I/O) interface circuit 30, a probe position control circuit 31, and a device input/output (I/O) interface circuit 32 via an internal bus structure 30-1, 31-1, and 32-1, respectively. As used herein, the term “communicatively coupled” means connected for communication over a communication medium, wherein the communication medium may be a direct wired connection having electrical conductors and/or printed circuit electrical conduction paths, or a wireless connection, and may be an indirect wired or wireless connection having intervening electrical circuits, such as amplifiers or repeaters. Probe input/output (I/O) interface circuit 30 and probe position control circuit 31 are configured to connect to electrical cable 17, which in turn is connected to ultrasound probe 16. In the present embodiment, device input/output (I/O) interface circuit 32 is configured to connect to a flexible electrical cable 34, which in turn is connected to interventional medical device 18.
Referring again to
Referring also to
Tracking element 44 is configured to generate tip location data defining five degrees of freedom based on the EM locator field 36 generated by EM field generator 12. The five degrees of freedom are the X-axis, Y-axis, Z-axis, pitch, and yaw. A sixth degree of freedom, i.e., roll, may be also included, if desired. Tracking element 44 of interventional medical device 18 is communicatively coupled to processor circuit 24 of ultrasound console 14 via electrical cable 34, serving as a communication link 46 between processor circuit 24 and tracking element 44. As used herein, “communication link” refers to an electrical transmission of data, i.e., information, and/or electrical power signals, over a wired or wireless communication medium. In the present embodiment, the communication link 46 provided by electrical cable 34 is a multi-conductor electrical cable that physically connects tracking element 44 to the ultrasound console 14, and in turn to processor circuit 24.
Alternatively, as depicted in
Bluetooth dongle 48 may be disposable, and included with each interventional medical device 18. Alternatively, Bluetooth dongle 48 may be reusable. Sterility requirements for the reusable dongle are addressed by placing the sterilized dongle in a sterile bag through which a sterile connection to interventional medical device 18 is made.
As shown in
Ultrasound probe 16 is communicatively coupled to processor circuit 24 of ultrasound console 14 via electrical cable 17, which may be a wired or a wireless connection. In the present embodiment, with reference to
Referring to
Referring to
Active ultrasound transducer array 66 is communicatively coupled to processor circuit 24 via communication link 58, and supplies two-dimensional ultrasound data to processor circuit 24 via communication link 58. Automatically, or alternatively based on a user input at graphical user interface 22, processor circuit 24 executes program instructions to store the two-dimensional ultrasound data in mass storage provided in non-transitory electronic memory 24-2.
Referring also to
Referring again to
In accordance with the present invention, active ultrasound transducer array 66 of ultrasound transducer mechanism 62 of ultrasound probe 16 may incorporate a movable one-dimensional (1D) transducer array, as in the embodiment depicted in
In the embodiment depicted in
In the embodiment of
Carriage 72 is connected to one-dimensional ultrasound transducer array 70, such that one-dimensional ultrasound transducer array 70 moves in unison with carriage 72. Carriage 72 converts a rotation of a rotatable shaft 74-1 of stepper motor 74 into a linear translation of carriage 72, and in turn, into a linear translation of one-dimensional ultrasound transducer array 70 relative to head portion 54 of probe housing 50, in a determined one of two translation directions D1, D2.
Stepper motor 74 is operably connected (electrically and communicatively) to probe position control circuit 31 (see
Carriage 72 converts the rotation of rotatable shaft 74-1 of stepper motor 74 into a linear translation of carriage 72, and in turn, moves one-dimensional ultrasound transducer array 70 relative to head portion 54 of probe housing 50 in a determined one of two translation directions D1, D2, to a location thus dictated by the digital positioning signal generated by processor circuit 24. Thus, based on the positioning signal initiated by processor circuit 24, the one-dimensional ultrasound transducer array 70 may be moved to a desired position relative to head portion 54 of probe housing 50.
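By way of illustration only, the following minimal Python sketch shows how a positioning signal expressed as a target y-coordinate might be converted into a step count and translation direction for such a lead-screw carriage; the step angle, lead-screw pitch, and all names are assumptions for this sketch, not values taken from the specification.

```python
# Illustrative constants: a 1.8-degree stepper (200 steps/rev) driving a
# lead screw that translates the carriage 2.0 mm per shaft revolution.
# These values and names are assumptions, not from the specification.
STEPS_PER_REV = 200
LEADSCREW_PITCH_MM = 2.0

def steps_for_move(current_y_mm: float, target_y_mm: float) -> tuple[int, str]:
    """Return (step count, translation direction D1/D2) for the carriage."""
    delta_mm = target_y_mm - current_y_mm
    steps = round(abs(delta_mm) / LEADSCREW_PITCH_MM * STEPS_PER_REV)
    direction = "D1" if delta_mm >= 0 else "D2"
    return steps, direction

# Example: reposition the active array from y = 10.0 mm to y = 12.5 mm.
print(steps_for_move(10.0, 12.5))   # -> (250, 'D1')
```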
In the alternative embodiment depicted in
In the embodiment of
In the embodiment of
As such, the embodiment of
In accordance with the present invention, and in view of the embodiments discussed above, ultrasound imaging system 10 provides a “lock-on” functionality, wherein the position of each of the ultrasound probe 16 and interventional medical device 18 is tracked, and the active ultrasound transducer array 66 in ultrasound probe 16 is dynamically positioned at a convergence of the tracking information, which is further described with reference to the flowchart of
Referring to
At step S102, “WHILE” defines the entry into a continuous loop to virtually converge the position of the ultrasound imaging plane of active ultrasound transducer array 66 of ultrasound probe 16 with the position of tracking element 44, and in turn distal tip 40, of interventional medical device 18. Processor circuit 24 remains in this continuous loop until the program execution is stopped.
At step S104, the current position of tracking element 44 of interventional medical device 18 is determined in relation to the 3D detection volume 38 defined by EM field generator 12. In particular, tracking element 44 of interventional medical device 18 generates tip location data as physical coordinates based on the EM locator field 36 generated by EM field generator 12, and provides the tip location data associated with the physical coordinates to processor circuit 24.
At step S106, in parallel to step S104, the current position of tracking element 64 of ultrasound (US) probe 16 is determined in relation to the 3D detection volume 38 defined by EM field generator 12. In particular, tracking element 64 of ultrasound probe 16 generates probe location data as physical coordinates based on the EM locator field 36 generated by EM field generator 12, and provides the probe location data associated with the physical coordinates to processor circuit 24.
At step S108, an ultrasound plane position (B-scan position) is determined based on the probe location data. In particular, processor circuit 24 executes program instructions to define a unit vector, i.e., the Z-axis at origin point 71 (0,0,0) of
ultrasound plane position = Ax + By + Cz + D    (Equation 1)
where A, B, C are coefficients of the x, y, z position coordinates (of the probe location data) defining the plane of ultrasound probe 16, and D is the length of the distance vector from the origin point 71 to the Ax+By+Cz plane.
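As a numerical illustration of Equation 1, the sketch below forms the plane coefficients (A, B, C, D) from a unit normal and a point on the imaging plane; the function and variable names, and the example pose, are illustrative assumptions.

```python
import numpy as np

def plane_from_probe(normal, point_on_plane):
    """Return (A, B, C, D) for the plane A*x + B*y + C*z + D = 0.

    `normal` is the unit vector perpendicular to the current imaging
    plane (derived from the probe location data); `point_on_plane` is
    any point known to lie in that plane. Names are illustrative.
    """
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)                 # guarantee a unit normal
    A, B, C = n
    D = -float(n @ np.asarray(point_on_plane, dtype=float))
    return float(A), float(B), float(C), D

# Example: an imaging plane normal to the probe's y (scan) axis that
# passes through (0, 10, 0) in the tracking coordinate frame.
print(plane_from_probe([0, 1, 0], [0, 10, 0]))   # -> (0.0, 1.0, 0.0, -10.0)
```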
At step S110, processor circuit 24 executes program instructions to calculate an offset distance between the position of interventional medical device 18, as defined by the tip location data, and the ultrasound plane position (determined at step S108) of ultrasound probe 16, by using the equation:
OFFSET = (Ax₁ + By₁ + Cz₁ + D) / √(A² + B² + C²),    (Equation 2)
where A, B, C, and D are coefficients of the ultrasound plane position (see step S108), and x₁, y₁, z₁ are the position coordinates (of the tip location data) of interventional medical device 18.
The Equation 2 offset calculation gives the minimum, or perpendicular, distance from tracking element 44 of interventional medical device 18 to the ultrasound plane position, which is the distance (and direction) that ultrasound transducer mechanism 62 needs to move active ultrasound transducer array 66 so that there is a convergence (intersection) of the ultrasound position plane with the tracking element 44, and in turn distal tip 40, of interventional medical device 18. Thus, in essence, the calculation determines the offset used to achieve a convergence of the tip location data with the ultrasound plane position associated with the probe location data.
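The OFFSET computation of Equation 2 transcribes directly into code; the following minimal sketch (names illustrative) returns the signed perpendicular distance, whose sign distinguishes the two translation directions D1 and D2.

```python
import numpy as np

def signed_offset(plane, tip):
    """Equation 2: signed perpendicular distance from the device tip
    (x1, y1, z1) to the ultrasound plane (A, B, C, D). The sign tells
    which translation direction (D1 vs. D2) closes the gap."""
    A, B, C, D = plane
    x1, y1, z1 = tip
    return (A * x1 + B * y1 + C * z1 + D) / np.sqrt(A**2 + B**2 + C**2)

# Example: plane currently at y = 10 mm, tip at y = 14 mm -> offset 4 mm.
print(signed_offset((0.0, 1.0, 0.0, -10.0), (3.0, 14.0, -2.0)))   # -> 4.0
```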
At step S112, ultrasound transducer mechanism 62 is driven to position active ultrasound transducer array 66 at the determined point of convergence as defined by the OFFSET calculated at step S110. In particular, processor circuit 24 executes program instructions to process the OFFSET to generate the positioning signal corresponding to the point of convergence, and the positioning signal is communicatively coupled to ultrasound transducer mechanism 62 to dynamically position active ultrasound transducer array 66 at a desired imaging location of the plurality of discrete imaging locations, so that the two-dimensional ultrasound slice data captured by active ultrasound transducer array 66 includes an image of at least the distal tip 40 of interventional medical device 18, so long as distal tip 40 of the interventional medical device 18 remains in the three-dimensional imaging volume 68 under the surface of the head portion of ultrasound probe 16.
In the embodiment of
Thereafter, the process returns to step S102, “WHILE”, to continue in the continuous loop in maintaining a convergence of the position of the active ultrasound transducer array 66 of ultrasound probe 16 with tracking element 44, and in turn distal tip 40, of interventional medical device 18.
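Putting steps S102 through S112 together, a simplified, self-contained simulation of the lock-on loop is sketched below; the tracking and motor interfaces are replaced by stand-in functions, and all names and values are assumptions for illustration only.

```python
import numpy as np

def signed_offset(plane, tip):
    # Equation 2 (see step S110).
    A, B, C, D = plane
    return (A*tip[0] + B*tip[1] + C*tip[2] + D) / np.sqrt(A**2 + B**2 + C**2)

# Stand-ins for the tracking and motor interfaces (illustrative only).
array_y = 0.0                                               # current array position (mm)
def get_tip_position():  return (0.0, 4.0, 0.0)             # S104: tip location data
def get_plane():         return (0.0, 1.0, 0.0, -array_y)   # S106/S108: plane position

for _ in range(3):                            # S102: continuous loop (3 passes here)
    offset = signed_offset(get_plane(), get_tip_position())  # S110: Equation 2
    array_y += offset                         # S112: drive carriage toward convergence
    print(f"array at y = {array_y:.1f} mm (residual offset {offset:.1f} mm)")
```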
Referring to
At step S200, ultrasound probe 16 is configured for acquisition of ultrasound data. For example, parameters such as the desired resolution, and emission strength of active ultrasound transducer array 66 to achieve a desired depth of penetration, may be set. For two-dimensional image scanning, ultrasound imaging system 10 is configured to collect a series of two-dimensional ultrasound imaging slices (ultrasound B-scan) data. For volume scan imaging, ultrasound imaging system 10 is configured to collect a series of ultrasound B-scan data to form three-dimensional ultrasound volumetric data representing the three-dimensional imaging volume 68, from which C-scan data, or other plane oriented data, may be derived.
At step S202, “WHILE” defines the entry into a continuous loop for acquisition of ultrasound data with active ultrasound transducer array 66 of ultrasound probe 16.
At step S204, ultrasound image data is acquired. More particularly, with reference to
For two-dimensional image scanning, a series of two-dimensional ultrasound imaging slices (ultrasound B-scan) data is collected and stored in non-transitory electronic memory 24-2. For volume scan imaging, active ultrasound transducer array 66 is scanned along the Y-axis across all, or a selected portion, of the three-dimensional imaging volume 68 to take a detailed volumetric scan of the underlying area beneath head portion 54 of ultrasound probe 16, such that a series of ultrasound B-scan data representing the three-dimensional imaging volume is collected and stored in non-transitory electronic memory 24-2.
Thereafter, the process returns to step S202, “WHILE”, to continue in the acquisition and updating of the ultrasound data.
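A minimal sketch of the volume-scan branch of step S204: B-scan slices acquired at successive carriage positions along the Y-axis are stacked into a three-dimensional array. Here acquire_bscan() is a stand-in for the real transducer readout, and the array dimensions are arbitrary.

```python
import numpy as np

def acquire_bscan(y_mm, shape=(256, 128)):
    """Stand-in for a real B-scan readout at carriage position y_mm."""
    rng = np.random.default_rng(int(y_mm * 10))
    return rng.random(shape, dtype=np.float32)

y_positions = np.linspace(0.0, 40.0, 64)     # scan extent along the Y-axis
volume = np.stack([acquire_bscan(y) for y in y_positions], axis=0)
print(volume.shape)   # (64, 256, 128): y-slice x depth x lateral samples
```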
While relative movement of ultrasound probe 16 and distal tip 40 of interventional medical device 18 will change the location of distal tip 40 within the three-dimensional imaging volume 68, so long as tracking element 44, and thus distal tip 40, of interventional medical device 18 remains in the three-dimensional imaging volume 68 of ultrasound probe 16, ultrasound imaging system 10 is able to dynamically position active ultrasound transducer array 66 to converge at a desired imaging location of the plurality of discrete imaging locations in the three-dimensional imaging volume 68, so that the two-dimensional ultrasound slice data includes an image of at least distal tip 40 of interventional medical device 18 in generating the ultrasound image displayed on display screen 28.
However, referring again to
In particular, based on the tip location data provided by tracking element 44 of interventional medical device 18 and the probe location data provided by tracking element 64 of ultrasound probe 16, processor circuit 24 executes program logic to determine whether tracking element 44 of interventional medical device 18 is outside the three-dimensional imaging volume 68, and thus outside the imageable range of ultrasound probe 16.
For example, when ultrasound probe 16 having tracking element 64 and interventional medical device 18 having tracking element 44 are placed within detection volume 38 of the EM field generator 12, the location of both tracking element 44 and tracking element 64, and the relative distance between tracking element 44 and tracking element 64, are calculated by processor circuit 24. Using this location and distance information, processor circuit 24 executes program instructions to determine whether the distal tip 40 of the interventional medical device 18 is presently located outside the three-dimensional imaging volume 68. If so, processor circuit 24 of ultrasound imaging system 10 further executes program instructions to generate a visual prompt at motion indicator 88 to prompt the user to move head portion 54 of ultrasound probe 16 in a particular direction to a general location such that tracking element 44, and thus distal tip 40, of interventional medical device 18 resides in the three-dimensional imaging volume 68 under ultrasound probe 16, thereby permitting the active ultrasound transducer array 66 of ultrasound probe 16 to automatically capture ultrasound image data containing the tracking element 44 and distal tip 40 of interventional medical device 18 for display on display screen 28.
Thus, in practicing the “lock-on” functionality mode of action of the present invention, if the tracking element 44, and thus distal tip 40, of the interventional medical device 18 is outside the three-dimensional imaging volume 68 of ultrasound probe 16, manual probe positioning prompts will be generated, in the form of motion indicator 88, which is present on ultrasound probe 16 and/or on graphical user interface 22 to prompt the user to move ultrasound probe 16 to the general location that contains the interventional medical device 18 having tracking element 44, such that tracking element 44 and distal tip 40 of interventional medical device 18 lies within the three-dimensional imaging volume 68 of ultrasound probe 16.
Once the user has placed ultrasound probe 16 over the general area to be visualized, location information from ultrasound probe 16 and interventional medical device 18 is further used to move the position of the active ultrasound transducer array 66 of ultrasound probe 16, which allows ultrasound imaging system 10 to converge on a two-dimensional ultrasound image slice that includes the underlying interventional medical device 18, even if ultrasound probe 16 is not placed directly over tracking element 44/distal tip 40 of interventional medical device 18.
The position of the active ultrasound transducer array 66 of ultrasound probe 16 is dynamically adjusted in near real time, limited by data acquisition and processing speed, which allows ultrasound imaging system 10 to adapt to small changes in position of ultrasound probe 16, the position of the tracking element 44 of interventional medical device 18, and/or the patient position, such that an ultrasound image of the underlying interventional medical device 18 is maintained within view of ultrasound probe 16.
If the interventional medical device 18 to be imaged moves outside of the possible three-dimensional imaging volume 68 beneath ultrasound probe 16, positioning prompts in the form of motion indicator 88 are again generated and used to prompt the user to move ultrasound probe 16 in a direction that allows ultrasound imaging system 10 to again converge on, and display, an ultrasound image of the underlying interventional medical device 18.
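One simple way such a directional prompt could be derived is to compare the tip coordinates against the bounds of the imaging volume; the sketch below assumes an axis-aligned volume expressed in a common tracking frame, and the bounds and names are illustrative assumptions.

```python
import numpy as np

def motion_prompt(tip, volume_min, volume_max):
    """Return per-axis move prompts, or None if the tip is already inside.
    Assumes an axis-aligned imaging volume in a shared tracking frame."""
    tip = np.asarray(tip, float)
    lo = np.asarray(volume_min, float)
    hi = np.asarray(volume_max, float)
    if np.all((tip >= lo) & (tip <= hi)):
        return None                     # in volume: "lock-on" mode takes over
    prompts = []
    for axis_name, t, l, h in zip("xyz", tip, lo, hi):
        if t < l:
            prompts.append(f"move probe -{axis_name}")
        elif t > h:
            prompts.append(f"move probe +{axis_name}")
    return prompts

# Example: tip 15 mm beyond the +x face of a 40 x 40 x 60 mm volume.
print(motion_prompt((55.0, 10.0, 20.0), (0, 0, 0), (40, 40, 60)))
# -> ['move probe +x']
```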
Ultrasound imaging system 10 also may be operated in a three-dimensional (3D) high resolution scan imaging mode, with reference to step S204 of
In general, with further reference to
More particularly, in the 3D high resolution imaging mode, processor circuit 24 of ultrasound console 14 is configured to execute program instructions to generate a scanning signal that is supplied to ultrasound transducer mechanism 62 to scan active ultrasound transducer array 66 over at least a portion of the three-dimensional imaging volume 68. The active ultrasound transducer array 66 is repeatedly actuated during the scan to generate a plurality, i.e., a series, of sequential two-dimensional ultrasound slices, which are stored in memory 24-2, and combined to form the 3D ultrasound volumetric data from which a three-dimensional (3D) high resolution ultrasound image is formed and displayed on display screen 28 of graphical user interface 22 (see also
The quality of the high resolution 3D images may be improved by generating a composite ultrasound image of the location of interest. Because the location of ultrasound probe 16 is known to processor circuit 24, multiple sets of 2D or 3D ultrasound images of a particular location in the three-dimensional imaging volume 68 underlying, e.g., perpendicular to, the surface of head portion 54 of ultrasound probe 16 may be taken and stored in non-transitory electronic memory 24-2, from which a composite ultrasound image may be generated by summing together the multiple sets of ultrasound images of the same location.
In particular, processor circuit 24 is configured to execute program instructions to operate the active ultrasound transducer array 66 to generate multiple sets of ultrasound image data that includes metadata corresponding to a particular location, i.e., metadata describing the location of the scan within the three-dimensional volume 68, and save the multiple sets in non-transitory electronic memory 24-2. Processor circuit 24 is further configured to execute program instructions to sum the multiple sets of ultrasound image data to generate composite (compound) ultrasound image data, which is then stored in non-transitory memory 24-2 and/or is displayed on display screen 28 of graphical user interface 22.
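The compounding step can be illustrated with synthetic data: frames of the same location (matched via their location metadata) are summed and normalized, which reduces uncorrelated noise roughly by the square root of the number of frames. Everything below is an illustrative sketch, not the specification's implementation.

```python
import numpy as np

# Synthetic stand-in data: a square "anatomy" plus independent noise
# per frame; names and values are illustrative assumptions.
rng = np.random.default_rng(0)
truth = np.zeros((64, 64))
truth[20:40, 20:40] = 1.0
frames = [truth + 0.3 * rng.standard_normal(truth.shape) for _ in range(8)]

composite = np.mean(frames, axis=0)   # sum and normalize the co-located sets
print(f"single-frame noise std: {np.std(frames[0] - truth):.3f}")   # ~0.3
print(f"composite noise std:    {np.std(composite - truth):.3f}")   # ~0.3/sqrt(8)
```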
Referring also to
Ultrasound imaging system 10 also may be operated to render and display one or more synthetic (user chosen) scan planes.
Referring also to
In particular, the user may define, using user controls 96, a desired synthetic plane orientation with respect to the 3D ultrasound volumetric data associated with three-dimensional ultrasound image 94. From the plane orientation inputs provided at user controls 96, processor circuit 24 of ultrasound imaging system 10 executes program instructions to identify within the 3D ultrasound volumetric data of three-dimensional ultrasound image 94 the image data associated with the desired synthetic plane orientation. The desired synthetic plane may pass through multiple two-dimensional image data slices in the 3D ultrasound volumetric data. Once the image data associated with the desired synthetic plane orientation within the 3D ultrasound volumetric data is identified, the desired one or more synthetic (user chosen) scan planes may be rendered and displayed on display screen 28 of graphical user interface 22 within the generated three-dimensional ultrasound image 94 as shown in
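A minimal sketch of rendering a synthetic scan plane from the 3D volumetric data: the plane is parameterized by an origin and two in-plane direction vectors (an illustrative choice, not the specification's parameterization), and the volume is resampled along it by trilinear interpolation using SciPy's map_coordinates.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def synthetic_plane(volume, origin, u_dir, v_dir, shape=(128, 128)):
    """Resample `volume` on the plane origin + s*u_dir + t*v_dir
    (all in voxel coordinates) via trilinear interpolation.
    Parameterization and names are illustrative assumptions."""
    u = np.asarray(u_dir, float)
    v = np.asarray(v_dir, float)
    s, t = np.meshgrid(np.arange(shape[0]), np.arange(shape[1]), indexing="ij")
    coords = (np.asarray(origin, float)[:, None, None]
              + u[:, None, None] * s + v[:, None, None] * t)  # (3, H, W)
    return map_coordinates(volume, coords, order=1)           # trilinear sampling

vol = np.random.default_rng(1).random((64, 128, 128))
# An oblique plane tilted out of the native B-scan orientation.
img = synthetic_plane(vol, origin=(10, 0, 0), u_dir=(0.2, 1, 0), v_dir=(0, 0, 1))
print(img.shape)   # (128, 128)
```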
Various views, such as those associated with the sagittal plane, the transverse plane, and the coronal plane, may be visualized, and a slice from one or more, or all, of the planes, as defined by the location of the tracked device(s), e.g., tracking element 44 of interventional medical device 18 and/or tracking element 64 of ultrasound probe 16, can be displayed, individually or as a group. It is also envisioned that scan planes that do not exist at 90 degrees from each other could also be defined and selected by the user. Additionally, the user-defined scan planes need not be planar, and may follow a curved path.
Another aspect of the present invention provides for a focusing of the three-dimensional imaging volume around a determined region of interest, i.e., the region around the location of tracking element 44 of interventional medical device 18, by reducing the scan extent along the Y-axis (see
In particular, processor circuit 24 executes program instructions to determine a region of interest in the three-dimensional ultrasound volumetric data defining the three-dimensional imaging volume 68. Processor circuit 24 also executes program instructions to reduce the scan range of the active ultrasound transducer array 66 of the ultrasound transducer mechanism 62 along the Y-axis for acquisition of subsequent three-dimensional ultrasound volumetric data at the region of interest from that of the scan range of the previous scan, so as to reduce the amount of acquired three-dimensional ultrasound volumetric data from that of the prior scan.
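A minimal sketch of the scan-extent reduction: the Y scan range for the next sweep is clamped to a window about the tracked tip position, shrinking the data acquired per sweep. The margin, default range, and names are illustrative assumptions.

```python
def focused_scan_range(tip_y_mm, full_range=(0.0, 40.0), margin_mm=5.0):
    """Clamp the next sweep's Y extent to a window about the tracked tip.
    Margin and default range are illustrative assumptions."""
    lo, hi = full_range
    return (max(lo, tip_y_mm - margin_mm), min(hi, tip_y_mm + margin_mm))

print(focused_scan_range(12.0))   # -> (7.0, 17.0): acquire only near the tip
```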
Referring to
Referring also to
Thus, slice selection sliders 102 permit the user to select a slice in each of one or more imaging planes for display, if desired, wherein the selected two-dimensional ultrasound image slice may intersect, or lie on either side of, the two-dimensional ultrasound image slice that was automatically, or manually, selected. The slice selection sliders 102 are configured to provide a sequential parallel variation from the initially selected two-dimensional ultrasound image slice to manually select a second two-dimensional ultrasound image slice parallel to the initially selected two-dimensional ultrasound image slice, wherein the second two-dimensional ultrasound image slice lies on either side of the initially selected two-dimensional ultrasound image slice.
For example,
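In code form, the slider behavior might look like the following minimal sketch, in which a slider offset steps from an initially selected slice index to a parallel slice on either side; the function name and synthetic volume are illustrative assumptions.

```python
import numpy as np

def select_parallel_slice(volume, base_index, slider_offset, axis=0):
    """Return the slice `slider_offset` steps from `base_index` along `axis`,
    i.e., a parallel slice on either side of the initially selected one."""
    index = int(np.clip(base_index + slider_offset, 0, volume.shape[axis] - 1))
    return np.take(volume, index, axis=axis), index

vol = np.random.default_rng(2).random((64, 128, 128))
img, idx = select_parallel_slice(vol, base_index=30, slider_offset=-3)
print(idx, img.shape)   # 27 (128, 128): three steps to one side of slice 30
```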
Referring to
At step S300, ultrasound imaging system 10 is initialized for rendering a 3D ultrasound image as a set of three orthogonal images, such as setting up processor circuit 24 and graphical user interface 22 for construction of 3D models.
At step S302, “WHILE” defines the entry into a continuous loop for generation and updating of the displayed 3D ultrasound image.
At step S304, an ultrasound (US) volume transform node is updated based on the position of ultrasound probe 16, as determined at step S106 of
At step S306, using the calculated OFFSET from step S110 of
At step S308, processor circuit 24 executes program instructions to generate 3D display data representative of three orthogonal images in a virtual 3D environment associated with the three-dimensional imaging volume 68 matched to the current position of ultrasound probe 16. Processor circuit 24 sends the 3D display data to user interface 22 for display on display screen 28 as three orthogonal images that include the tracking element 44, and in turn the distal tip 40, of interventional medical device 18.
Thereafter, the process returns to step S302, “WHILE”, to continue updating the displayed 3D ultrasound image.
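A minimal sketch of the three-orthogonal-view rendering of step S308: the three axis-aligned slices of the volume that intersect at the tracked tip's voxel coordinates are extracted for display. The synthetic volume, coordinates, and names are illustrative assumptions.

```python
import numpy as np

vol = np.random.default_rng(3).random((64, 128, 128))   # (y, depth, lateral)
tip = (40, 60, 50)   # tip voxel coordinates derived from the tracking data

# Three mutually orthogonal slices through the tip location.
orthogonal_views = {
    "normal to y":       vol[tip[0], :, :],
    "normal to depth":   vol[:, tip[1], :],
    "normal to lateral": vol[:, :, tip[2]],
}
for name, img in orthogonal_views.items():
    print(name, img.shape)
```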
Referring now to
Referring also to
In comparison,
Advantageously, the patient oriented imaging window aspect of the present invention described above with respect to
More particularly,
At step S400, ultrasound imaging system 10 is initialized for rendering a 3D ultrasound image, such as setting up processor circuit 24 and graphical user interface 22 for construction of 3D models, initializing a camera video data transfer, and configuring appropriate patient lighting for video.
At step S402, “WHILE” defines the entry into a continuous loop for generation and updating of the displayed patient oriented imaging window 108 as depicted in
At step S404, an ultrasound (US) volume transform node is updated based on the position of ultrasound probe 16, as determined at step S106 of
At step S406, an ultrasound (US) image transform node is updated based on the calculated OFFSET from step S110 of
At step S408, based on 2D and/or 3D image data acquisition as described at step S204 of
Thereafter, the process returns to step S402, “WHILE”, to continue updating the patient oriented imaging window 108.
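A minimal sketch of the patient-oriented display step: the acquired frame is counter-rotated by the probe's in-plane orientation relative to the patient reference (e.g., an angle derived from the probe and patient tracking elements), so that patient “up” always renders as screen “up”. The angle source, function name, and data are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import rotate

def patient_oriented(frame, probe_angle_deg):
    """Counter-rotate `frame` by the probe's in-plane angle so the
    patient's vertical stays rendered as 'up' on the display.
    The angle is assumed to come from the tracking elements."""
    return rotate(frame, angle=-probe_angle_deg, reshape=False, order=1)

frame = np.random.default_rng(4).random((128, 128))
display = patient_oriented(frame, probe_angle_deg=30.0)  # probe yawed 30 degrees
print(display.shape)   # (128, 128)
```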
As an additional aspect, since the offset distance (z-axis) between the ultrasound probe 16 and the interventional medical device 18 can be calculated using Equations 1 and 2 (see steps S108 and S110, discussed above), this offset, or depth, information can further be used to dynamically control some of the ultrasound imaging settings in near real time, as identified below (an illustrative sketch follows the list). This allows the system to optimize the image quality settings such that the best image of the interventional medical device 18 is displayed to the user at display screen 28. The ultrasound imaging settings that may be dynamically controlled in this way include:
1) Ultrasound focus: using the z-axis offset between the ultrasound probe 16 and the interventional medical device 18, the focus can be automatically adjusted to the depth that contains the interventional medical device 18, such that the lateral resolution is optimized at that depth.
2) Depth setting: the depth setting can be dynamically controlled such that the depth of imaging is automatically adjusted to match the depth of the interventional medical device 18.
3) Zoom: the imaging window can be “zoomed” such that a larger view of the area of interest is automatically displayed to the user.
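A minimal sketch of this depth-driven setting control; the margins, limits, function name, and returned fields are illustrative assumptions rather than values from the specification:

```python
# Illustrative sketch: derive focus, depth, and zoom settings from the
# computed z-axis offset/depth of the device. All margins, limits, and
# names are assumptions, not values from the specification.

def settings_from_depth(device_depth_mm: float) -> dict:
    focus_mm = device_depth_mm                        # 1) focus at the device depth
    depth_mm = min(device_depth_mm + 10.0, 150.0)     # 2) image depth: device + margin
    zoom_roi_mm = (max(0.0, device_depth_mm - 15.0),  # 3) zoom window about the device
                   device_depth_mm + 15.0)
    return {"focus_mm": focus_mm, "depth_mm": depth_mm, "zoom_roi_mm": zoom_roi_mm}

print(settings_from_depth(42.0))
# {'focus_mm': 42.0, 'depth_mm': 52.0, 'zoom_roi_mm': (27.0, 57.0)}
```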
While this invention has been described with respect to at least one embodiment, the present invention can be further modified within the spirit and scope of this disclosure. This application is therefore intended to cover any variations, uses, or adaptations of the invention using its general principles. Further, this application is intended to cover such departures from the present disclosure as come within known or customary practice in the art to which this invention pertains and which fall within the limits of the appended claims.
This application is a U.S. national phase of International Application No. PCT/US2015/018068, filed Feb. 27, 2015, which claims priority to U.S. provisional patent application Ser. No. 62/081,275, filed Nov. 18, 2014, which is incorporated herein by reference in its entirety.