The following generally relates to ultrasound imaging and more particularly to visually presenting an ultrasound image with a three-dimensional (3-D) graphical representation of at least a portion of the ultrasound transducer used to generate the ultrasound image. The 3-D graphical representation is superimposed over a portion of the display in a spatial orientation corresponding to a current spatial orientation of the ultrasound transducer with respect to a user of the transducer, which is determined by a probe tracking system.
An ultrasound (US) imaging system has included an ultrasound probe with an array of transducer elements and interfaced with a console. The transducer elements transmit a pressure wave in response to being excited, sense echoes produced in response to the pressure wave interacting with structure, and generate a signal indicative thereof. The console includes a processor that processes the signal to generate an image. In B-mode, the signal is processed to produce a sequence of focused, coherent echo samples along focused scanlines of a scanplane. The scanlines are scan converted into a format of a display monitor and visually presented as an image via the display monitor.
The probe housing has included a small protrusion near one side of the transducer array. The protrusion indicates a left/right orientation of the transducer array. By convention, the protrusion should point toward the patient's right side in transverse views and toward the patient's head in longitudinal views. The displayed image has been overlaid with an on-screen marking that corresponds to the protrusion: the side of the image corresponding with the protrusion end of the transducer is shown on-screen with an orientation marker. In this manner, the sonographer is visually apprised of the image plane and orientation of the displayed image. An example is shown in
Aspects of the application address the above matters, and others.
In one aspect, an imaging system includes an ultrasound imaging probe, a display and a console. The console is electrically interfaced with the ultrasound imaging probe and the display. The console includes a rendering engine configured to visually present an ultrasound image generated with data acquired by the ultrasound imaging probe and a three-dimensional graphical representation of a portion of the probe superimposed over a predetermined region of the ultrasound image. The rendering engine is further configured to visually present the three-dimensional graphical representation in a spatial orientation of the probe with respect to a user.
In another aspect, a method includes acquiring scan data of a subject generated by an ultrasound imaging probe, processing the scan data to generate an ultrasound image, retrieving a three-dimensional representation including a 3-D graphical model of a probe, and visually presenting the ultrasound image with the three-dimensional graphical representation, including the 3-D graphical model of the probe and a scan plane, superimposed over the ultrasound image and in a spatial orientation of the ultrasound imaging probe with respect to a user of the ultrasound imaging probe.
In another aspect, a computer readable medium is encoded with computer executable instructions which when executed by a computer processor cause the computer processor to: acquire scan data of a subject generated by an ultrasound imaging probe, process the scan data to generate an ultrasound image, retrieve a three-dimensional representation including a 3-D graphical model of a probe, and visually present the ultrasound image with the three-dimensional graphical representation, including the 3-D graphical model of the probe and a scan plane, superimposed over the ultrasound image and in a spatial orientation of the ultrasound imaging probe with respect to a user of the ultrasound imaging probe.
Those skilled in the art will recognize still other aspects of the present application upon reading and understanding the attached description.
The application is illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
The following describes an approach that uses probe spatial tracking data to display a 3-D representation of at least part of an ultrasound probe in an orientation with respect to the sonographer, superimposed over part of a display with an ultrasound image of the scanned anatomy. The 3-D representation visually indicates the orientation of the ultrasound probe with respect to the sonographer.
Initially referring to
The imaging system 200 includes a probe 202 and a console 204, which are configured to interface over a communications path 206. In the illustrated example, the communications path 206 includes a hard-wired path 208 such as a cable or the like. In this instance, the cable 208 includes a connector 210 and the console 204 includes a complementary connector 212. In general, the console 204 may include multiple connectors, each configured to engage a complementary connector of a different probe. In another instance, the communications path 206 includes a wireless communications path, and the probe 202 and the console 204 include wireless interfaces.
In the illustrated example, the connectors 210 and 212 are electro-mechanical connectors. In one instance, the electro-mechanical connectors 210 and 212 are configured as plug and socket connectors, where the electro-mechanical connector 210 has a “male” configuration and is a plug with electrically conductive pins or prongs, and the complementary electro-mechanical connector 212 has a “female” configuration and is a mating receptacle with electrically conductive sockets configured to receive the pins or prongs. Mechanically engaging the electro-mechanical connectors 210 and 212 places the pins/prongs and sockets in electrical communication.
The probe 202 includes a housing 214, a transducer array 216 of transducer elements 218, a sensor(s) 220, and electronics 222. The housing 214 houses or encloses the transducer array 216, which is mechanically supported by and/or within the housing 214. The transducer array 216 includes one or more rows of the transducer elements 218, which are configured to transmit ultrasound signals and receive echo signals. The sensor(s) 220 includes one or more optical and/or electro-magnetic sensors and is used as discussed below for probe tracking purposes. The electronics 222 routes signals to and from the sensor(s) 220 and the array 216 and the communications path 206. In one instance, the probe 202 transmits information that identifies the type (e.g., model) of the probe 202.
In a variation, the housing 214 includes a probe orientation marker. In one instance, the probe orientation marker is disposed at a predetermined location on the housing and visually and/or haptically indicates information such as the image plane and/or an orientation of the probe 202. An example of a suitable marker is described in patent application Ser. No. 15/513,216, publication number US 2017/303,892 A1, filed Mar. 22, 2017, and entitled “Transducer Orientation Marker,” which is incorporated herein in its entirety by reference.
The console 204 includes transmit circuitry 224 configured to generate a set of radio frequency (RF) pulses that are conveyed to the transducer array 216 and selectively excite a set of the transducer elements 218, causing the set of elements to transmit ultrasound signals or pressure waves. The console 204 further includes receive circuitry 226 that receives electrical signals generated by the transducer elements 218 in response to the transducer elements 218 receiving echoes (RF signals) generated in response to the transmitted ultrasound signals or pressure waves interacting with structure (e.g., organ cells, blood cells, etc.).
The console 204 further includes a switch 228 that switches between the transmit circuitry 224 and the receive circuitry 226, depending on whether the transducer array 216 is in transmit mode or receive mode. In transmit mode, the switch 228 electrically connects the transmit circuitry 224 to the transducer array 216 and hence the transducer elements 218. In receive mode, the switch 228 electrically connects the receive circuitry 226 to the transducer array 216 and hence the transducer elements 218. In a variation, separate switches are used for transmit and receive operations.
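By way of non-limiting illustration, the transmit/receive switching behavior of the switch 228 can be sketched as follows. The class and mode names are illustrative assumptions, not part of the system described; the sketch only shows that the array is routed to exactly one circuit at a time.

```python
# Hypothetical sketch of switch 228: the transducer array is connected to
# either the transmit circuitry 224 or the receive circuitry 226, never both.
from enum import Enum

class Mode(Enum):
    TRANSMIT = "transmit"
    RECEIVE = "receive"

class TxRxSwitch:
    """Models switch 228: routes array 216 to exactly one circuit."""

    def __init__(self):
        self.mode = Mode.RECEIVE  # idle in receive mode

    def set_mode(self, mode: Mode) -> str:
        self.mode = mode
        circuit = ("transmit circuitry 224" if mode is Mode.TRANSMIT
                   else "receive circuitry 226")
        return f"array 216 connected to {circuit}"

switch = TxRxSwitch()
print(switch.set_mode(Mode.TRANSMIT))  # array 216 connected to transmit circuitry 224
print(switch.set_mode(Mode.RECEIVE))   # array 216 connected to receive circuitry 226
```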
The console 204 further includes an interface 230 to a complementary interface 232 of an electromagnetic tracking system 234, which includes a field generator 236. The interfaces 230 and 232 can be electro-mechanical connectors, e.g., similar to the connectors 210 and 212 described herein. The electromagnetic tracking system 234 and the sensor(s) 220 together track the spatial position of the probe 202. The electromagnetic tracking system 234 conveys a signal indicative of this position to the console 204 via the interfaces 230 and 232. A suitable example of the electromagnetic tracking system 234 is the Aurora tracking system, a product of NDI, which is headquartered in Ontario, Canada. In a variation, an optical tracking system is employed.
The console 204 further includes an echo processor 238 that processes the electrical signals from the receive circuitry 226. In one instance, such processing includes applying time delays, weighting on the channels, summing, and/or otherwise beamforming received electrical signals. In B-mode, the echo processor 238 produces a sequence of focused, coherent echo samples along focused scanlines of a scanplane. The echo processor 238 further synchronizes the tracking signal from the tracking system 234 with the beamformed image such that the spatial orientation of the probe 202 for each image is linked to the image.
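The delay/weight/sum processing attributed to the echo processor 238 can be sketched as minimal delay-and-sum beamforming for one focused scanline. The element count, sample rate, pulse shape, and apodization weights below are illustrative assumptions, not values from the system described.

```python
# Minimal delay-and-sum beamforming sketch: per-channel echoes are delayed,
# weighted, and summed to form one coherent scanline, as described for the
# echo processor 238. All numeric values are illustrative assumptions.
import numpy as np

def delay_and_sum(channel_data, delays_s, weights, fs):
    """channel_data: (n_elements, n_samples) per-channel echo samples.
    delays_s: per-element focusing delays in seconds.
    weights: per-element apodization weights.
    fs: sampling frequency in Hz.
    Returns the beamformed sample sequence for the scanline."""
    n_elem, n_samp = channel_data.shape
    out = np.zeros(n_samp)
    for e in range(n_elem):
        shift = int(round(delays_s[e] * fs))       # delay in whole samples
        shifted = np.roll(channel_data[e], -shift)  # undo the focusing delay
        out += weights[e] * shifted                 # weight, then sum channels
    return out

# Example: identical echoes offset by known delays realign and sum coherently.
fs = 40e6
t = np.arange(256) / fs
pulse = np.sin(2 * np.pi * 5e6 * t) * np.exp(-((t - 1e-6) ** 2) / (0.1e-6) ** 2)
delays = np.array([0, 5, 10, 15]) / fs   # per-element delays in seconds
channels = np.stack([np.roll(pulse, int(round(d * fs))) for d in delays])
line = delay_and_sum(channels, delays, np.ones(4), fs)
# After realignment, the four channels add in phase: line is about 4x pulse.
```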
The console 204 further includes a rendering engine 240 and a display monitor 242. The rendering engine 240 is configured to display images via the display monitor 242. In one instance, this includes displaying an ultrasound image with a graphical representation of the probe 202 and, optionally, a scan plane superimposed over the ultrasound image in a region outside of the scanned anatomy. In one instance, the probe 202 in the graphical representation visually resembles the probe 202. The type (e.g., model) of the probe 202 is ascertained by the identification signal from the probe 202 and/or user input identifying the type of the probe 202. In a variation, a same graphical representation is used for all probes. In this instance, the probe type is not provided and/or indicated.
A probe memory 244 stores models of each type of probe 202 that can be used with the console 204. The rendering engine 240 retrieves a model based on the identification of the type of probe 202, a default, a user preference, etc. Each model is a three-dimensional (3-D) model and is displayed as a 3-D graphical representation of the probe 202 oriented on the display 242 based on the tracking signal such that the displayed 3-D graphical representation of the probe 202 reflects a spatial orientation of the probe 202 with respect to the sonographer. The 3-D graphical representation moves (e.g., rotates, translates, etc.) with the probe 202 so that it represents a current spatial orientation of the probe 202.
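The model lookup and orientation step described above can be sketched as follows. The model names, the dictionary-based probe memory, and the rotation-matrix pose format are illustrative assumptions about how the probe memory 244 and tracking signal might be represented.

```python
# Hypothetical sketch: the rendering engine keys a 3-D model off the reported
# probe type (falling back to a default) and rotates the model's vertices by
# the tracker-reported orientation so the on-screen model mirrors the probe.
import numpy as np

PROBE_MODELS = {                      # stand-in for probe memory 244
    "linear-L12": "linear_probe.mesh",
    "curved-C5": "curved_probe.mesh",
}
DEFAULT_MODEL = "generic_probe.mesh"

def retrieve_model(probe_type=None):
    """Select a 3-D model by probe type, falling back to a default."""
    return PROBE_MODELS.get(probe_type, DEFAULT_MODEL)

def orient_vertices(vertices, rotation):
    """Rotate model vertices (n, 3) by a tracking rotation matrix (3, 3)."""
    return vertices @ rotation.T

# A 90-degree rotation about z, as might be reported while the probe is turned.
theta = np.pi / 2
rot_z = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0, 0.0, 1.0]])
tip = np.array([[1.0, 0.0, 0.0]])     # one illustrative model vertex
print(retrieve_model("curved-C5"))    # curved_probe.mesh
print(orient_vertices(tip, rot_z).round(6))
```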
In one instance, this provides for a more intuitive display, e.g., relative to the display shown in
The console 204 further includes a controller 246 (which includes a processor, etc.). The controller 246 controls one or more of the components 212-244. Such control, in one instance, is based on a selected and/or activated visualization mode. For example, when a first visualization mode is active, the rendering engine 240 is provided with the type of the probe 202 and uses this information, along with the probe tracking signal, to retrieve a suitable 3-D probe model and display the ultrasound image with the 3-D graphical representation of the probe 202 as described herein. In another visualization mode, the 3-D graphical representation is not displayed.
The console 204 also interfaces a user interface (UI) 248. The UI 248 includes one or more input devices (e.g., buttons, knobs, trackball, etc.) and/or one or more output devices (e.g., visual, audio, etc. indicators). The UI 248 can be used to select an imaging mode, a visualization mode (e.g., the first visualization mode), etc. As discussed herein, in one instance the user identifies the type of the probe 202 being used. In this instance, the user employs the UI 248 to make the selection. For example, the user can employ a pointing device (e.g., a mouse) of the UI 248 to select the probe type from a menu or list of available probe types, manually enter the probe type via a keyboard, etc.
It is to be appreciated that the echo processor 238 and/or the rendering engine 240 are implemented by a processor such as a central processing unit, a microprocessor, etc. In this instance, the console 204 further includes computer readable medium (which excludes transitory medium and includes physical memory) encoded with computer executable instructions. The instructions, when executed by the processor, cause the processor to perform one or more of the functions described herein.
In this example, the 3-D graphical representation 306 shows that the probe 202 is currently facing down with respect to the sonographer. The 3-D graphical representation 306 moves with the probe 202, with continuous tilt in the plane of the display 242 and three hundred and sixty degree (360°) rotation into and out of the plane, so that it always reflects a current 3-D orientation of the probe 202 with respect to the sonographer.
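The continuous tracking behavior above can be sketched as a per-frame pose update: each new tracking sample replaces the model's orientation, so the rendered probe tilts and rotates with the physical probe. The axis-angle helper and the simulated sample loop are illustrative assumptions about the tracker interface.

```python
# Hypothetical per-frame pose update: the latest tracker orientation always
# drives the on-screen 3-D model, giving continuous tilt and 360-degree
# rotation with the physical probe.
import numpy as np

def rotation_about_y(angle_rad):
    """Rotation matrix for turning the probe into/out of the display plane."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

class ProbeModelPose:
    """Holds the latest tracker orientation for the on-screen 3-D model."""
    def __init__(self):
        self.rotation = np.eye(3)   # start aligned with the display

    def on_tracking_sample(self, rotation):
        self.rotation = rotation    # always reflect the current orientation

# Simulate a full 360-degree turn delivered in 10-degree tracking samples.
pose = ProbeModelPose()
for step in range(36):
    pose.on_tracking_sample(rotation_about_y(np.radians(10 * (step + 1))))
# After a full 360-degree turn, the model is back at its starting orientation.
```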
For probes with more than one transducer array (e.g., a bi-plane probe), in one instance, the 3-D representation simultaneously shows both scan planes. In another instance, the 3-D representation shows only one scan plane at a time. In this instance, the user can toggle between the scan planes. In another instance, the user selects whether to show the scan planes simultaneously and/or individually.
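The bi-plane display choices above can be sketched as follows. The plane labels and class structure are illustrative assumptions; the sketch only shows the show-both versus toggle-one behavior.

```python
# Hypothetical sketch of bi-plane display modes: show both scan planes at
# once, or show a single plane and let the user toggle between them.

class BiPlaneView:
    def __init__(self):
        self.planes = ["sagittal", "transverse"]  # two arrays, two planes
        self.show_both = False                    # single-plane mode by default
        self.active = 0                           # index of the visible plane

    def toggle_plane(self):
        """Switch which single plane is drawn when not showing both."""
        self.active = 1 - self.active

    def visible_planes(self):
        return list(self.planes) if self.show_both else [self.planes[self.active]]

view = BiPlaneView()
print(view.visible_planes())  # ['sagittal']
view.toggle_plane()
print(view.visible_planes())  # ['transverse']
view.show_both = True
print(view.visible_planes())  # ['sagittal', 'transverse']
```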
At 1302, the probe 202 is connected to the console 204.
At 1304, the probe 202 transmits a signal indicating its type.
At 1306, the rendering engine 240 retrieves a 3-D representation of the probe 202 based on the signal.
At 1308, the probe 202 is used to scan a subject or object.
At 1310, the echo processor generates an ultrasound image from the acquired data.
At 1312, the rendering engine 240 displays the ultrasound image with the 3-D representation of the probe 202 showing a spatial orientation of the probe 202 with respect to the sonographer, as discussed herein and/or otherwise.
Acts 1308, 1310 and 1312 can be repeated one or more times.
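Acts 1302 through 1312 can be sketched in compact form as follows. All steps are placeholders standing in for the hardware and rendering operations described herein; the function and probe-type names are illustrative assumptions.

```python
# Hypothetical walk-through of acts 1302-1312: connect the probe, receive its
# type signal, retrieve the matching 3-D representation, then repeat the
# scan/process/display loop (acts 1308-1312) one or more times.

def run_session(probe_type, frames):
    log = []
    log.append("1302 probe connected")                   # probe 202 to console 204
    log.append(f"1304 probe reports type {probe_type}")  # identification signal
    log.append(f"1306 retrieved 3-D model for {probe_type}")
    for i in range(frames):                              # acts 1308-1312 repeat
        log.append(f"1308 scan frame {i}")
        log.append(f"1310 image generated for frame {i}")
        log.append(f"1312 image + oriented 3-D model displayed, frame {i}")
    return log

steps = run_session("curved-C5", frames=2)
print(len(steps))  # 9
```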
At 1402, the probe 202 is connected to the console 204.
At 1404, a user indicates the type of the probe 202 with the user interface 248.
At 1406, the rendering engine 240 retrieves a 3-D representation of the probe 202 based on the user input.
At 1408, the probe 202 is used to scan a subject or object.
At 1410, the echo processor generates an ultrasound image from the acquired data.
At 1412, the rendering engine 240 displays the ultrasound image with the 3-D representation of the probe 202 showing a spatial orientation of the probe 202 with respect to the sonographer, as discussed herein and/or otherwise.
Acts 1408, 1410 and 1412 can be repeated one or more times.
At 1502, the probe 202 is connected to the console 204.
At 1504, the rendering engine 240 retrieves a 3-D representation of a probe.
At 1506, the probe 202 is used to scan a subject or object.
At 1508, the echo processor generates an ultrasound image from the acquired data.
At 1510, the rendering engine 240 displays the ultrasound image with the 3-D representation of the probe showing a spatial orientation of the probe 202 with respect to the sonographer, as discussed herein and/or otherwise.
Acts 1506, 1508 and 1510 can be repeated one or more times.
At least a portion of the method(s) discussed herein may be implemented by way of computer readable instructions, encoded or embedded on computer readable storage medium (which excludes transitory medium), which, when executed by a computer processor(s), cause the processor(s) to carry out the described acts. Additionally, or alternatively, at least one of the computer readable instructions is carried by a signal, carrier wave or other transitory medium.
The application has been described with reference to various embodiments. Modifications and alterations will occur to others upon reading the application. It is intended that the invention be construed as including all such modifications and alterations, including insofar as they come within the scope of the appended claims and the equivalents thereof.