Embodiments disclosed herein relate to ultrasound systems. More specifically, embodiments disclosed herein relate to ultrasound devices that include and/or use a multi-dimensional and multi-frequency array of transducer elements.
In conventional ultrasound diagnostic imaging system applications, several separate ultrasound transducers (also referred to as ultrasound probes or ultrasound scanners) are used with the imaging system to meet various clinical needs in human body imaging. For example, a phased array is used for cardiac-related diagnostics, a high frequency linear array for nerve visualization, and a curved linear array for abdominal imaging.
Because transducers are designed for specific applications, it is common for a clinician to switch between different types of ultrasound transducers to obtain images with optimized resolution and penetration depth during an examination. However, switching transducers during diagnostic imaging can interrupt clinical workflow, increase clinical exam time, and increase the time spent cleaning the transducers.
Another approach used by some manufacturers is to place two different functional arrays at the two separate ends of an ultrasound scanner. For example, in one implementation, a linear array is mounted at one end of the scanner while a curved array is mounted at the opposite end. While this approach may work for a wireless scanner, it is difficult to manage the cable when a wired transducer is used. Further, switching the two ends of the scanner back and forth can complicate the workflow or require more time for array cleaning, especially when clean clinical environments are required.
Thus, the approaches mentioned above are either too expensive to develop or not easy to use and clean in clinical environments.
Ultrasound devices that include and/or use multi-dimensional and multi-frequency arrays of transducer elements, and methods for using the same, are disclosed. In some embodiments, an ultrasound device includes: a lens; an array coupled to the lens and having a center row of transducer elements of a first width that operate at a first frequency and two or more outer rows of transducer elements of a second or other widths that operate at a second or other frequencies different than the first frequency, where the first width is different than the second width and the center row is between the two or more outer rows. The ultrasound device also includes a controller coupled to the array and configured to control the center row of transducer elements and the two or more outer rows of transducer elements to operate at a same time or at different times.
In some embodiments, an ultrasound device comprises: a lens; a high frequency array comprising a center row of transducer elements that operate at a first frequency; and a low frequency array comprising two outer rows of transducer elements that operate at a second or other frequencies different than the first frequency, the center row of transducer elements being between the two outer rows of transducer elements. The ultrasound device also includes a controller coupled to the arrays and configured to control each of the high frequency array and the low frequency array in a plurality of modes to operate at a same time or at different times. In some embodiments, the modes include a first mode in which the controller controls the high frequency array to operate as a linear array while the low frequency array is not operating; a second mode in which the controller controls the low frequency array to operate as a phased array while the high frequency array is not operating; and a third mode in which the controller controls the high and low frequency arrays to operate at the same time to obtain signals for performing super broadband harmonic imaging.
In some embodiments, a method of controlling an ultrasound system includes controlling a multi-frequency and multi-dimensional array of transducer elements having a high frequency array comprising a center row of transducer elements that operate at a first frequency and a low frequency array comprising two outer rows of transducer elements that operate at a second or other frequencies different than the first frequency, where the center row of transducer elements is between the two outer rows of transducer elements. In some embodiments, controlling the multi-frequency and multi-dimensional array includes controlling both the high frequency array and low frequency array to operate at the same time; receiving, by both the high and low frequency arrays, reflected signals; and performing super broadband harmonic imaging based on the received signals.
The appended drawings illustrate exemplary embodiments and are, therefore, not to be considered limiting in scope.
In the following description, numerous details are set forth to provide a more thorough explanation of the present invention. It will be apparent, however, to one skilled in the art, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the present invention.
Embodiments disclosed herein include several array architectures and interconnection schemes for multi-functional and multi-frequency ultrasound transducers. Embodiments disclosed herein also include array-to-system electrical connection methods and several unique imaging modes for using the multi-functional and multi-frequency transducers with the imaging system.
In some embodiments, the ultrasound transducer has multiple functionalities (phased and linear), multiple elevation focal depths, multiple operation frequencies, and a super broad bandwidth, providing clinicians a powerful tool for full-body scans without having to exchange transducers during clinical procedures. The super broad bandwidth can be the result of an aggregation, or combining, of bandwidths (e.g., overlapping bandwidths, etc.) or resonances from multiple sub-arrays of transducer elements in a multi-functional transducer. Also, in some embodiments, the multi-functional transducer achieves broad bandwidths while maintaining or increasing sensitivity.
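As an illustration of the bandwidth aggregation described above, the following sketch computes the aggregate band that results when two overlapping sub-array passbands are combined. The center frequencies and fractional bandwidths are hypothetical placeholders, not values from this disclosure.

```python
# Illustrative sketch: combining the -6 dB passbands of two sub-arrays into one
# aggregate band. All numbers below are hypothetical placeholders.

def passband(center_mhz, fractional_bw):
    """Return (low, high) -6 dB band edges in MHz for a given fractional bandwidth."""
    half = 0.5 * fractional_bw * center_mhz
    return center_mhz - half, center_mhz + half

low_band = passband(2.5, 0.8)    # assumed low frequency sub-array: ~1.5-3.5 MHz
high_band = passband(5.0, 0.8)   # assumed high frequency sub-array: ~3.0-7.0 MHz

# If the bands overlap, the aggregate band spans from the lowest to the highest edge.
if low_band[1] >= high_band[0]:
    combined = (low_band[0], high_band[1])
    center = 0.5 * (combined[0] + combined[1])
    fractional = (combined[1] - combined[0]) / center
    print(f"combined band: {combined[0]:.1f}-{combined[1]:.1f} MHz "
          f"({100 * fractional:.0f}% fractional bandwidth)")
```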
When part of a wired or wireless handheld ultrasound scanner, the multi-function transducers described herein are capable of being used for full-body scanning, which offers substantial advantages in various environments. In clinical applications where multiple transducers with different operation frequencies would otherwise be required, a single multi-functional transducer such as disclosed herein can provide detailed images at multiple operation frequencies in clinical diagnostics.
The ultrasound system 100 also includes the ultrasound scanner 112, which can be referred to as an ultrasound probe, an ultrasound transducer (as disclosed in more detail below), and the like. In some embodiments, the ultrasound scanner 112 is operated by an operator 116 (e.g., clinician, nurse, sonographer, etc.) to transmit ultrasound at an anatomy of a patient 118 and receive reflections of the ultrasound from the patient anatomy as part of the ultrasound examination. The ultrasound system 100 can generate the ultrasound image 120 based on the reflections, and the computing device 102 can display the ultrasound image 120. In examples, the ultrasound scanner 112 is a multi-functional array transducer that includes multiple arrays, and the ultrasound system 100 can implement full aperture imaging, tissue harmonic imaging, super broadband imaging, combinations thereof, and the like, as further described below (e.g., with respect to
The ultraportable ultrasound system 200 also includes a docking station 208 that can be configured to support (e.g., physically hold in place) the mobile computing device 204 when the mobile computing device 204 is inserted into the docking station 208. This insertion is illustrated by the arrow 210. The docking station 208 can be of any suitable form factor to support the mobile computing device 204, and is illustrated as a plain rectangular box in
Hence, the docking station 208 can include any suitable hardware, software, and/or firmware to provide the additional ultrasound resources, including memory 208-1 and processors 208-2 that can execute instructions stored by the memory 208-1 to provide the additional ultrasound resources. The docking station 208 also includes a transceiver 208-3 for communicating with a cloud 212. Cloud 212 can include any network, network resources, server, database, and the like for providing resources, including the additional ultrasound resources, to the ultraportable ultrasound system 200. In some embodiments, the cloud 212 is maintained by a care facility (e.g., hospital, clinic, etc.) where the ultraportable ultrasound system 200 is used to perform ultrasound examinations.
In some embodiments, the docking station 208 is included as part of the mobile computing device 204. Hence, the mobile computing device 204 can include the memory 208-1, the processors 208-2, and the transceiver 208-3. In some embodiments, the docking station 208 can be attached to a patient bed, so that the mobile computing device 204 can be placed in a stationary position relative to the patient bed during an ultrasound examination. Additionally or alternatively, the docking station can be attached to an ultrasound cart to hold the mobile computing device 204 during transport and/or during an ultrasound examination.
The docking station 208 is coupled via a communication link 214 to a display device 216 of the ultraportable ultrasound system 200. The transceiver 208-3 can facilitate communication between the docking station 208 and the display device 216 over the communication link 214. The communication link 214 can include a wired connection (e.g., a cable) to propagate data between the docking station 208 and the display device 216. The display device 216 can include any suitable device for displaying ultrasound data, illustrated examples of which include a monitor 216-1, an ultrasound machine 216-2, and smart glasses 216-3. These examples of the display device 216 are exemplary and meant to be non-limiting. In embodiments, the display device 216, e.g., the glasses 216-3, display ultrasound data (e.g., an ultrasound image) in an augmented reality (AR) or a virtual reality (VR) environment. In some embodiments, the additional ultrasound resources described above are provided to the ultraportable ultrasound system 200 by the display device 216.
In some embodiments, the mobile computing device 204 is configured to display a first user interface for controlling the ultraportable ultrasound system 200 when supported by the docking station 208 and a second user interface for controlling the ultraportable ultrasound system 200 when the mobile computing device 204 is unsupported by the docking station 208. For instance, the second user interface can include only basic ultrasound controls, such as controls for B-mode and M-mode imaging, and the first user interface can include advanced ultrasound imaging controls, such as controls for color Doppler, power Doppler, spectral Doppler imaging, tissue harmonic imaging, full-aperture imaging (described below in more detail), and the like. In some embodiments, a user can transfer components of a user interface displayed on the display device 216 to a user interface displayed by the mobile computing device 204 (and vice versa). For instance, a user can select a portion of a user interface on the display device 216, such as by drawing a boundary container around the portion to select the portion, and then perform a swipe gesture to an edge of the display device 216 to transfer the portion of the user interface to the mobile computing device 204. Additionally or alternatively, the user can transfer components of a user interface displayed on the mobile computing device 204 to a user interface displayed by the display device 216, such as with a selection trace and a swiping gesture.
In some embodiments, when the mobile computing device 204 is supported by the docking station 208, the ultraportable ultrasound system 200 can disable the communication link 206 between the mobile computing device 204 and the ultrasound scanner 202, and enable a communication link 218 between the ultrasound scanner 202 and the display device 216. Hence, the ultrasound scanner 202 can be paired with the display device 216.
In some embodiments, the ultraportable ultrasound system 200 facilitates the simultaneous use of multiple ultrasound scanners for use by different clinicians (not shown in
By providing the additional ultrasound resources to the ultraportable ultrasound system 200, e.g., from the cloud 212, the docking station 208, and/or the display device 216, the components of the ultraportable ultrasound system 200, including the mobile computing device 204 and the ultrasound scanner 202, can remain small in form factor and light weight, allowing them to be used for their intended purpose of ultra-portability. Further, the mobile computing device 204 and the ultrasound scanner 202 can consume less power than if they were required to implement the additional ultrasound resources. Hence, the ultrasound scanner 202 generates less heat, and thus can be used for longer scan times with shorter wait times between scans, resulting in better patient care. In some embodiments, the docking station 208 can be removably attached to the ultrasound scanner 202. Hence, the docking station 208 and the ultrasound scanner 202 can be transported as a single unit, to make it less susceptible to loss and/or theft. At the point of care, the docking station 208 can be quickly removed from the ultrasound scanner 202 for use.
In some embodiments, a multi-functional array transducer is multi-dimensional and multi-frequency. In some embodiments, the multi-functional array transducer has a center row (sub-array) of high frequency transducer elements (e.g., PZT elements) that operate at one frequency and two outer rows (sub-arrays) of additional (e.g., low frequency) transducer elements (e.g., PZT elements) that operate at another frequency that is lower than the operating frequency of the high frequency transducer elements. In some embodiments, the two outer rows of low frequency transducer elements are side by side with the center row of high frequency transducer elements, with the center row being between the two outer rows. Furthermore, in some embodiments, the width of the transducer elements differs among the rows of elements. For example, in some embodiments, the width of the transducer elements in the center row is smaller than the width of the transducer elements in the outer rows. In some embodiments, the multi-functional array transducer has transducer elements whose element widths and operation frequencies are selected based on the clinical applications for which the multi-functional array transducer is to be used. In this way, the multi-functional array transducer is configurable based on the clinical application that is to be performed at the time.
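The row layout described above can be captured in a simple configuration structure. The sketch below is illustrative only; the element counts, widths, and frequencies are assumed values chosen to show a narrower, higher frequency center row between two wider, lower frequency outer rows.

```python
# Minimal configuration sketch of the multi-row geometry described above. All numeric
# values are hypothetical placeholders; the disclosure only requires that the center
# row be narrower and higher frequency than the outer rows.
from dataclasses import dataclass

@dataclass
class Row:
    name: str
    num_elements: int
    element_width_mm: float      # elevation width of each element in this row
    center_frequency_mhz: float

multi_row_array = [
    Row("outer_low_1", 64, 3.0, 2.5),   # low frequency outer row
    Row("center_high", 64, 1.5, 5.0),   # high frequency center row (narrower elements)
    Row("outer_low_2", 64, 3.0, 2.5),   # low frequency outer row
]

# The center row sits between the two outer rows, per the layout above.
assert multi_row_array[1].element_width_mm < multi_row_array[0].element_width_mm
```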
Some traditional multi-row array designs, such as the 1.25D, 1.5D or 1.75D, have the same sub-element area and operation frequency in each row of elements.
Referring to
3B. Referring to
In contrast to the 1.25D array transducer of
Referring to
In some embodiments, transducer elements in row 323, including transducer element 331, are driven at one frequency using driver 334, while transducer elements in rows 322 and 324, including transducer elements 330 and 332, are driven by driver 335, which is separate circuitry operating at a different frequency. In some embodiments, driver 335 drives rows 322 and 324 at a low frequency while driver 334 drives row 323 at a high frequency in comparison to the frequency that drives rows 322 and 324. Note that while rows 322 and 324 can be driven separately from row 323, all of rows 322-324 can be driven at the same time depending on the mode in which the array 321 is operating. A controller coupled to array 321 (not shown for clarity) causes drivers 334 and 335 to drive the transducer elements in rows 322-324 and is configured to control the center row of transducer elements and the two or more outer rows of transducer elements with separate circuitry to operate at a same time or at different times.
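A minimal sketch of this drive arrangement is shown below, assuming one driver for the center row and a separate driver for the two outer rows; the class names and frequencies mirror the description above but are otherwise illustrative, not part of the disclosure.

```python
# Hedged sketch of the drive arrangement above: one driver excites the center row at a
# high frequency and a separate driver excites the two outer rows at a lower frequency.
# Class and method names, and the frequencies, are illustrative assumptions.

class RowDriver:
    def __init__(self, rows, frequency_mhz):
        self.rows = rows
        self.frequency_mhz = frequency_mhz

    def fire(self):
        print(f"driving rows {self.rows} at {self.frequency_mhz} MHz")

class ArrayController:
    """Controls the center-row and outer-row drivers together or independently."""
    def __init__(self):
        self.high_driver = RowDriver(rows=[323], frequency_mhz=5.0)       # center row
        self.low_driver = RowDriver(rows=[322, 324], frequency_mhz=2.5)   # outer rows

    def transmit(self, use_high=True, use_low=True):
        # Depending on the mode, one or both drivers fire; when both fire, they can be
        # triggered at the same time.
        if use_low:
            self.low_driver.fire()
        if use_high:
            self.high_driver.fire()

ArrayController().transmit(use_high=True, use_low=False)   # center row only
ArrayController().transmit(use_high=True, use_low=True)    # both sections together
```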
Note that in
In some embodiments, the transducer is not limited to three rows of transducer elements. For example, in other symmetric designs, the transducer has five rows, such as shown in
Some embodiments of the transducer provide a solution to an interconnection problem from electrical to acoustic elements in multi-row transducers by embedding multiple signals and/or grounds inside a backing block.
Referring to
Each stack of the multi-row transducer 500 also includes three matching layers 560, 565, and 570. The first matching layer 560 includes a conductive surface that includes an electrode 561. In some embodiments, electrodes 561A and 561B are electrically connected to grounds 510A and 510B. In some embodiments, grounds 510A and 510B are coupled together, e.g., electrically connected.
Stacks of transducer 500 also include an acoustic layer, such as piezoelectric layer 540 (e.g., PZT), below the conductive surface that includes the electrode 561. The matching layers 560, 565, 570 are located between the piezoelectric layer 540 and acoustic lens 580. In some embodiments, the lens 580 depicted in
Each stack also includes an electrode 571 electrically connected to signals 530, which can include and/or represent one of channels 1-64 and 65-128. In some embodiments, the low frequency stack also includes a Tungsten carbide (WC) layer interfacing electrode 590 to signal lines 530. In some embodiments, the high frequency stack also includes a Tungsten carbide (WC) layer interfacing its electrode to signal lines. In some other embodiments, such a layer is not included in either the low or high frequency stacks. Alternatively, this layer can be tungsten (W) or other suitable materials with higher acoustic impedance than PZT.
The acoustic stacks inside the multi-function transducer could have different operation frequencies, element geometries, focal depths, and beam characteristics, such as beam width and lens focal depth. The acoustic stacks of
As discussed above with respect to symmetric and non-symmetric row configurations, the acoustic stacks can be built with a non-symmetric format with a combination of two, three, four, five, or more separated acoustic stacks operating at different operational frequencies (e.g., ultrasound frequencies) and with different imaging modes such as phased array, or linear array. In some embodiments, two or more of these sub-arrays have different imaging planes. In some other embodiments, all these sub-arrays have different imaging planes.
The acoustic stacks can also be built with a symmetric format with the combination of a number of 1+2n (n=1, 2, 3, . . . ) sub-arrays, e.g., three, five, seven, or more separated acoustic stacks operating at different operation frequencies and with different imaging modes (e.g., a phased array, a linear array, etc.). With the exception of the center sub-array, the symmetric sub-arrays on both sides of the center could have the same operation frequency. However, sub-arrays at different distances from the center can have different operation frequencies. In some embodiments, all of the sub-arrays have the same imaging plane in the middle of the transducer.
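The symmetric 1+2n arrangement can be illustrated with a short sketch that lays out one center sub-array and n mirrored pairs, where each mirrored pair shares an operation frequency and pairs at different distances from the center may differ; the frequencies shown are assumed values, not from the disclosure.

```python
# Illustrative sketch of the symmetric 1 + 2n layout: one center sub-array plus n
# mirrored pairs. Frequencies are hypothetical placeholders.

def symmetric_layout(n, center_freq_mhz, pair_freqs_mhz):
    """Return sub-array frequencies ordered from one edge of the transducer to the other."""
    assert len(pair_freqs_mhz) == n
    # pair_freqs_mhz lists pairs from innermost to outermost; mirror them about the center.
    left = list(reversed(pair_freqs_mhz))
    return left + [center_freq_mhz] + pair_freqs_mhz

# n = 2 gives 1 + 2*2 = 5 sub-arrays, e.g. frequencies (MHz): [1.5, 3.0, 7.0, 3.0, 1.5]
print(symmetric_layout(2, center_freq_mhz=7.0, pair_freqs_mhz=[3.0, 1.5]))
```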
Thus, embodiments of multi-row transducers are disclosed that provide unique acoustic designs and architectures to integrate multiple acoustic stacks with different functionalities into one array enclosure. In some embodiments, the array enclosure has similar dimensions as traditional one-dimensional arrays used in diagnostic ultrasound imaging applications.
Techniques disclosed herein include manufacturing processes to build cost-effective, high performance, multi-row transducers. In some embodiments, for example, in
Note that the techniques disclosed herein are not limited to λ/2. In some embodiments, based on the operation frequencies of the phased and linear arrays, a different pitch can be warranted (e.g., 0.8 λ, etc.).
The combination of phased and linear arrays with f0 and 2f0 as operation frequencies provides a simplification in the manufacturing process. For example, the multi-functional array could be diced at the same time with the same dicing pitch, which could greatly simplify the element dicing while keeping all the sections of the elements well aligned.
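A worked example, using an assumed tissue sound speed and an assumed phased-array frequency f0, shows why a single dicing pitch can serve both sections: the pitch equals half a wavelength at f0 (appropriate for a phased array) and a full wavelength at 2f0 (a common linear array pitch).

```python
# Worked example with assumed numbers: c ~ 1540 m/s in soft tissue and a hypothetical
# phased-array frequency f0 = 2.5 MHz. A single dicing pitch of lambda(f0)/2 is
# simultaneously lambda/2 for the phased section at f0 and one full wavelength for the
# linear section at 2*f0.
c = 1540.0            # m/s, typical soft-tissue sound speed
f0 = 2.5e6            # Hz, assumed phased-array center frequency

wavelength_f0 = c / f0            # ~0.616 mm
pitch = wavelength_f0 / 2.0       # ~0.308 mm dicing pitch shared by both sections

wavelength_2f0 = c / (2 * f0)     # ~0.308 mm
print(f"pitch = {pitch * 1e3:.3f} mm")
print(f"pitch / wavelength at f0  = {pitch / wavelength_f0:.2f}")    # 0.50 (lambda/2)
print(f"pitch / wavelength at 2f0 = {pitch / wavelength_2f0:.2f}")   # 1.00 (lambda)
```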
In
In some embodiments, the multi-functional transducer is built by combining individual sub-arrays together. The combining can be performed by gluing the sub-arrays together or by assembling them through mechanical structures. In some embodiments, each individual sub-array is built separately.
In some embodiments, the phased array section with 64 transducer elements 601 (e.g., PZT, etc.) operating at a lower frequency (ELF) is connected to imaging system channels 1-64 via an interface board 603, coax cables 610, and a transducer/system connector 605 containing tuning inductors of a first value L1. The linear array section with 64 transducer elements 602 (e.g., a piezo material (e.g., PZT), etc.) operating at a higher frequency (EHF) than the ELF is connected to imaging system channels 65-128 via an interface board 604, coax cables 610, and a transducer/system connector 606 containing tuning inductors of a second value L2, which is different from the inductor value of connector 605.
In some embodiments, the elements of the lower frequency and higher frequency arrays (ELF and EHF) not only have different acoustic architectures and designs, such as PZT and matching layer thicknesses, but are also tuned with different inductor values to obtain the best overall performance. Note that the sizes (e.g., values) of the tuning inductors can be different depending on the operating frequency of the transducer elements.
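One common way to arrive at different inductor values for the two connectors, offered here only as a hedged illustration, is to resonate each tuning inductor with the element's clamped capacitance at the section's center frequency. The capacitances and frequencies in the sketch below are hypothetical, and the actual values depend on the element impedance, cabling, and tuning topology used.

```python
# Hedged sketch of why the two connectors carry different tuning inductor values:
# resonate the inductor with the element's clamped capacitance C0 at the section's
# center frequency, L = 1 / ((2*pi*f)**2 * C0). All values are hypothetical.
import math

def tuning_inductance(center_frequency_hz, clamped_capacitance_f):
    return 1.0 / ((2 * math.pi * center_frequency_hz) ** 2 * clamped_capacitance_f)

L1 = tuning_inductance(2.5e6, 250e-12)   # low frequency (ELF) section, channels 1-64
L2 = tuning_inductance(5.0e6, 120e-12)   # high frequency (EHF) section, channels 65-128

print(f"L1 ~ {L1 * 1e6:.1f} uH, L2 ~ {L2 * 1e6:.1f} uH")   # different values, as in the text
```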
In some embodiments, the multi-functional array transducer provides unique flexibility and an acoustic method to create ultra-broadband operation frequencies and bandwidth, narrow beam widths and high image resolution in the near and far fields, and deeper penetration depths to meet various clinical needs in a single transducer. More specifically, when operated together at the same time, the bandwidths of the sub-arrays of transducer elements in the multi-functional array transducer are combined, in essence extending the overall bandwidth and thereby creating an ultra-broadband response.
By incorporating a unique lens design, such as, for example, a multi-radius (e.g., multi-ROC) acoustic lens in front of the high and low frequency sections, together with the multi-function transducer's elevation section widths, the multi-functional transducer's elevation beam pattern can be further improved in both the near and far fields compared to a single-ROC lens. For example, for the elevation of the triple row configuration depicted
In some embodiments, the multi-radius lens for different acoustic sections can also reduce the attenuation of the ultrasound energy passing through the lens. In clinical applications, in Figure
The multi-functional transducer can provide great flexibility for the ultrasound imaging system to control ultrasound resolution and penetration to meet various clinical needs. In some embodiments, the multi-functional transducer can be configured to use one or more of at least six unique imaging modes with the imaging system. Some of these imaging modes result in an expanded bandwidth when using both the high and low frequency sections of the multi-functional transducer at the same time.
In clinical applications, the multi-functional transducer can switch between the imaging modes without changing transducers, or in some cases, without even changing the operator's grip on the transducer/probe. The control of the switch between the imaging modes can use, but is not limited to, a push button at the transducer, a pressure sensor at the transducer (e.g., a grip-based pressure sensor to detect mode selection by a user, etc.), a push button at the system, a switch at the system or the transducer, a voice control command, a touch screen, an IMU with haptic feedback, etc. In one example, the ultrasound system can automatically and without user intervention switch between imaging modes based on one or more of an anatomy being imaged (e.g., as determined by a machine-learned model), a current and/or subsequent step of an ultrasound protocol, such as the extended focused assessment with sonography in trauma (EFAST) protocol, an amount of battery life left (or charge remaining, or scan time) for the ultrasound probe, and a pressure applied from the probe to the patient.
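A hedged sketch of such mode-switching logic is shown below; the function and field names are illustrative, and the priority given to manual controls over automatic determinations is an assumption rather than a requirement of the disclosure.

```python
# Illustrative mode-switching sketch: a switch request can come from a manual control
# (button, pressure sensor, voice, touch screen) or be raised automatically from the
# detected anatomy, the protocol step, or remaining battery. Names and thresholds are
# hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SystemState:
    user_selection: Optional[str] = None     # e.g. "mode_2" from a button or voice command
    detected_anatomy: Optional[str] = None   # e.g. "heart", from a machine-learned model
    protocol_step: Optional[str] = None      # e.g. an EFAST protocol step
    battery_fraction: float = 1.0

def select_mode(state: SystemState, current_mode: str) -> str:
    if state.user_selection:                      # manual control takes priority (assumed)
        return state.user_selection
    if state.detected_anatomy == "heart":         # deep, fast-moving target
        return "phased_low_frequency"
    if state.detected_anatomy in ("nerve", "vessel"):
        return "linear_high_frequency"
    if state.battery_fraction < 0.1:              # conserve power near end of charge
        return "single_section_low_power"
    return current_mode                           # otherwise keep the current mode

print(select_mode(SystemState(detected_anatomy="heart"), current_mode="linear_high_frequency"))
```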
Referring to
Mode 2 is where the high frequency section of the multi-functional transducer is used as a transmitter of ultrasound and a receiver of ultrasound reflections, while the low frequency section of the multi-functional transducer is not used. For example, in the case of the multi-functional transducer having a triple configuration (e.g.,
In some embodiments, full-body scans can be performed by using different functional sections of the transducer (e.g., Mode 1 and Mode 2). The low frequency section can perform as a phased array with deep penetration in clinical applications, while the high frequency array can perform as a linear array with high resolution in clinical applications.
Mode 3 can be used to have the multi-functional transducer operate as a conventional array transducer with a full aperture, i.e. both sections operated together to produce the overlapped section of the frequency bandwidth (e.g., overlap 703 where graphs 701 and 702 overlap in
In Mode 4, the low frequency section of the multi-functional transducer is used as a transmitter, and the high frequency section of the multi-functional transducer is used as a receiver. For example, in the case of the multi-functional transducer having a triple configuration (e.g.,
Therefore, no pulse inversion is required to cancel out the transmitting signal, and a fast frame rate, single pulse broad band THI can be realized.
In some embodiments, Mode 4 can be used in a cardiac application when a high frame rate is required to image the fast heart beats. However, the standard pulse inversion THI technique can also be used in Mode 4 when a high frame rate is not required.
Mode 5 is a further extension of Mode 4 and can be used to increase the receiving aperture for deeper tissue sbTHI. In Mode 5, both low and high frequency sections of the multi-functional transducer are used in receiving. The low frequency array uses the lower half of its frequency spectrum bandwidth in transmission. Both the low frequency and the high frequency array are used as receivers. Note that either the pulse inversion sbTHI or a fast frame rate sbTHI can be used in Mode 5 depending on the clinical application.
Mode 6 is similar to Modes 4 and 5. In Mode 6, both low and high frequency sections (sub-arrays) are used in transmission and receiving. For example, in the case of the multi-functional transducer having a triple configuration (e.g.,
In any of Modes 1 to 6, the pulse inversion THI can be used when a high frame rate THI is not required.
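The six modes can be summarized by which section transmits and which receives, as in the sketch below; the table is a compact reading of the descriptions above (with Mode 1 taken as the low frequency phased mode per the full-body scan discussion) and is illustrative rather than normative.

```python
# Compact, hedged summary of the six imaging modes described above, expressed as which
# section transmits and which receives. Mode names/uses are paraphrases of the text.
MODES = {
    1: {"tx": ["low"],         "rx": ["low"],          "use": "phased array, deep penetration"},
    2: {"tx": ["high"],        "rx": ["high"],         "use": "linear array, high resolution"},
    3: {"tx": ["low", "high"], "rx": ["low", "high"],  "use": "full aperture on the overlapped band"},
    4: {"tx": ["low"],         "rx": ["high"],         "use": "single-pulse broadband THI"},
    5: {"tx": ["low"],         "rx": ["low", "high"],  "use": "sbTHI with a larger receive aperture"},
    6: {"tx": ["low", "high"], "rx": ["low", "high"],  "use": "super broadband full aperture THI"},
}

def configure(mode: int):
    cfg = MODES[mode]
    print(f"Mode {mode}: transmit on {cfg['tx']}, receive on {cfg['rx']} ({cfg['use']})")

for m in MODES:
    configure(m)
```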
Referring to
In some embodiments, the controller controls the array in a plurality of modes in which either the center row of transducer elements or the two or more rows of transducer elements is operating or both the center row of transducer elements and the two or more rows of transducer elements are operating at the same time based on which mode of the plurality of modes is being used. In some embodiments, the controller is configured to control the center row of transducer elements and two or more outer rows of transducer elements independently in one of the modes to operate at the same time to obtain signals for performing super broadband harmonic imaging by controlling the center row of transducer elements to perform a receive operation while controlling the two or more rows of transducer elements to perform transmit operations.
In some embodiments, the modes include one or more of a first mode in which the controller controls the high frequency array to operate as a linear array while the low frequency array is not operating, a second mode in which the controller controls the low frequency array to operate as a phased array while the high frequency array is not operating, a third mode in which the controller controls the high and low frequency arrays to operate at the same time to obtain signals for performing super broadband harmonic imaging, a fourth mode in which the controller causes both the high and low frequency arrays to perform both transmit and receive operations to provide signals for full aperture imaging on the overlapping area of the bandwidths produced by both arrays, a fifth mode in which the controller causes the high frequency array to perform a receive operation and the low frequency array to perform both transmit and receive operations to provide signals for THI, and a sixth mode in which the controller causes both the high and low frequency arrays to perform both transmit and receive operations to provide signals for super broadband full aperture THI. In some embodiments, during the third mode, the high frequency array filters out reflections of signals transmitted from the low frequency array.
Using the high and low frequency arrays of transducer elements, ultrasound is transmitted at a patient anatomy, and reflections of the ultrasound are received by the high and low frequency arrays and represented as reflected signals, based on the mode (block 1012). In some embodiments, one or both arrays (e.g., center and outer rows of the transducer elements) transmit ultrasound and one or both arrays receive reflected signals.
In some embodiments, each of the transducer elements is part of an acoustic stack that includes a backing block through which at least a signal coupled to said each transducer element traverses. In some embodiments, each of the transducer elements is part of an acoustic stack, and stacks associated with transducer elements of the center row have different focal depths and beam characteristics than stacks associated with transducer elements of the two or more rows. In some embodiments, one or more system connectors interface signals from each of the transducer elements of the center row of transducer elements and the two or more rows of transducer elements to an ultrasound system, the one or more system connectors comprising first and second sets of tuning inductors, the first set of tuning inductors for interfacing signals from the transducer elements of the center row of transducer elements and the second set of tuning inductors for interfacing signals from the transducer elements of the two or more rows of transducer elements using tuning inductors with different inductor values than tuning inductors of the first set of tuning inductors.
Using the reflected signals, an ultrasound image is generated and displayed using an ultrasound machine (block 1013). In some embodiments, a computing device (e.g., an imaging subsystem) of the ultrasound system generates the image based on the received signals.
In some embodiments, the method also includes switching between modes using a user interface (e.g., one or more buttons, sensors, or switches coupled to a probe enclosure) (1014). Note that such a switch can be performed to change the imaging mode and/or when the user is going to image another type of anatomy.
Referring to
In some embodiments, the modes include one or more of a first mode in which the controller controls the high frequency array to operate as a linear array while the low frequency array is not operating, a second mode in which the controller controls the low frequency array to operate as a phased array while the high frequency array is not operating, a third mode in which the controller controls the high and low frequency arrays to operate at the same time to obtain signals for performing super broadband harmonic imaging, a fourth mode in which the controller causes both the high and low frequency arrays to perform both transmit and receive operations to provide signals for full aperture imaging on the overlapping area of the bandwidths produced by both arrays, a fifth mode in which the controller causes the high frequency array to perform a receive operation and the low frequency array to perform both transmit and receive operations to provide signals for THI, and a sixth mode in which the controller causes both the high and low frequency arrays to perform both transmit and receive operations to provide signals for super broadband full aperture THI. In some embodiments, during the third mode, the high frequency array filters out reflections of signals transmitted from the low frequency array.
Using the high and low frequency arrays of transducer elements, ultrasound is transmitted at a patient anatomy, and reflections of the ultrasound are received by the high and low frequency arrays and represented as reflected signals, based on the mode (block 1022). In some embodiments, depending on the mode, one or both arrays (e.g., center and outer rows of the transducer elements) transmit ultrasound and one or both arrays receive reflected signals.
Using the reflected signals, an ultrasound image is generated and displayed using an ultrasound machine (block 1023). In some embodiments, a computing device (e.g., an imaging subsystem) of the ultrasound system generates the image based on the received signals.
Referring to
Using the multi-frequency and multi-dimensional array of transducer elements, ultrasound is transmitted at a patient anatomy, and reflections of the ultrasound are received by both the high and low frequency sub-arrays and represented as reflected signals (block 1032). In some embodiments, both sub-arrays (e.g., center and outer rows of the transducer elements) receive reflected signals. In some embodiments, the bandwidth of the received signals spans the bandwidths of both the high and low frequency arrays.
Based on the received signals, super broadband harmonic imaging is performed (block 1033). In some embodiments, a computing device (e.g., an imaging subsystem) of the ultrasound system performs the super broadband harmonic imaging based on the received signals. The result of performing the super broadband harmonic imaging is the display of an image generated using an ultrasound machine.
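As a rough illustration of the harmonic imaging step, the sketch below filters the received RF data around the second harmonic of an assumed transmit frequency, then envelope-detects and log-compresses it for display; the sampling rate, transmit frequency, filter design, and the omission of beamforming are all simplifications and assumptions, not the disclosed implementation.

```python
# Hedged sketch of receive-side harmonic processing: isolate the band around the second
# harmonic of the (assumed) low-section transmit frequency, then envelope-detect and
# log-compress one RF line. All values are placeholders.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 40e6          # Hz, assumed RF sampling rate
f_tx = 2.0e6       # Hz, assumed transmit frequency from the low frequency section

def harmonic_image_line(rf_line: np.ndarray) -> np.ndarray:
    """Return a log-compressed envelope of the second-harmonic band of one RF line."""
    lo, hi = 1.6 * f_tx, 2.4 * f_tx                       # band around 2 * f_tx
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    harmonic = filtfilt(b, a, rf_line)                    # isolate harmonic energy
    envelope = np.abs(hilbert(harmonic))                  # envelope detection
    return 20 * np.log10(envelope / (envelope.max() + 1e-12) + 1e-6)

# Example with synthetic data standing in for a beamformed RF line.
rf = np.random.randn(4096)
print(harmonic_image_line(rf).shape)
```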
Referring to
In some embodiments, the ultrasound system can determine the occurrence of the system event, disable the imaging mode, and enable the additional imaging mode without changing the ultrasound scanner. For instance, the same ultrasound scanner can be used for both the imaging mode and the additional imaging mode, without swapping the ultrasound scanner, rotating or moving the ultrasound scanner, or exchanging the ultrasound scanner for a different ultrasound scanner.
In some embodiments, the controller is implemented to, for at least one of the imaging mode and the additional imaging mode, operate at least two arrays of the two or more arrays of transducer elements. These arrays can operate with at least one of different bandwidths and different frequencies. In some embodiments, these arrays include a linear array and a phased array. Additionally or alternatively, these arrays can include a first linear array and a second linear array. Additionally or alternatively, these arrays can include a first curvilinear array and a second curvilinear array.
In some embodiments, the occurrence of the system event includes a determination that the ultrasound image includes a patient anatomy. In some embodiments, the ultrasound system includes a machine-learned model (e.g., a neural network) implemented to make the determination. Additionally or alternatively, the ultrasound system can select the additional imaging mode based on the patient anatomy.
In some embodiments, the occurrence of the system event includes a determination that the ultrasound image is part of a protocol step, such as a step of a FAST or EFAST ultrasound protocol. Additionally or alternatively, the occurrence of the system event can include a determination that the ultrasound image includes an interventional instrument. Additionally or alternatively, the occurrence of the system event can include a determination that the ultrasound image has an image quality score that is above or below a threshold score. The ultrasound system can include one or more machine-learned models implemented to make one or more of the determinations of the occurrences of the system events.
In some embodiments, at least one of the imaging mode and the additional imaging mode includes at least one of a full aperture imaging mode, a tissue harmonic imaging mode, and a mode that combines full aperture imaging and tissue harmonic imaging. In an example, the imaging mode and the additional imaging mode can have different frame rates.
In some embodiments, the ultrasound system includes a display device implemented to display a user interface configured to display a visual representation of at least one mode of the imaging mode and the additional imaging mode. The visual representation can indicate at least one of an ultrasound frequency, a bandwidth, a transmission path, a reception path, and the transducer elements enabled for at least one mode. For instance, the visual representation can include one or more of text, an icon, an image, a graphic, an animation, a number, an arrow, and the like to indicate an ultrasound frequency, a bandwidth, a transmission path, a reception path, and the transducer elements enabled for the at least one mode.
Referring to
In some embodiments, the status of the ultrasound system includes an amount of battery charge for a battery of the ultrasound scanner. Additionally or alternatively, the ultrasound system can include an ultrasound machine configured to generate an ultrasound image, and the status of the ultrasound system can include an amount of battery charge for a battery of the ultrasound machine. Additionally or alternatively, the status of the ultrasound system can include an amount of scan time remaining.
In some embodiments, the status of the ultrasound system includes a determination of whether the ultrasound scanner is touching a patient. The ultrasound scanner can include one or more pressure sensors, and the processor system can make the determination based on pressure data generated by the one or more pressure sensors. Additionally or alternatively, the status of the ultrasound system can indicate the ultrasound system is operating according to a protocol step when the imaging mode is enabled.
In some embodiments, the status of the ultrasound system can indicate a receipt of a user selection via a user interface, the user selection indicating the additional imaging mode. The ultrasound scanner can include the user interface. Additionally or alternatively, the ultrasound system can include an ultrasound machine configured to generate an ultrasound image, and the ultrasound machine can include the user interface that is implemented to display the ultrasound image. In some embodiments, the status of the ultrasound system indicates an activation of an examination preset. Additionally or alternatively, the status of the ultrasound system can indicate a selection of a gain or depth.
In some embodiments, the ultrasound scanner includes the two or more arrays of transducer elements on a same end of the ultrasound scanner. The two or more arrays of transducer elements can be included in an enclosure that can be removably attached to the ultrasound scanner.
Referring to
In some embodiments, the two or more arrays are contained in a common enclosure that is attached to the one end of the ultrasound scanner. In some embodiments, the enclosure can be removed from the ultrasound scanner and reattached to the ultrasound scanner.
In some embodiments, the ultrasound system includes a display device implemented to display a user interface. The user interface can receive the request as a user selection.
In some embodiments, the ultrasound system includes a machine-learned model implemented to generate an inference based on an ultrasound image generated by the ultrasound system. The ultrasound system can generate the request based on the inference. The inference can include at least one of a label, a classification, a segmentation, an object identification, an additional image, and a probability.
In some embodiments, the ultrasound scanner includes one or more pressure sensors configured to generate pressure data. The pressure sensors can be located on or near a lens of the ultrasound scanner, and indicate when the ultrasound scanner is pressed against a patient. The ultrasound system can generate the request based on the pressure data. For instance, the ultrasound system can compare the pressure data to a threshold pressure, and generate the request based on the comparison, e.g., when the pressure data indicates a larger pressure than the threshold pressure according to the comparison.
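A minimal sketch of this comparison is shown below; the threshold value and function name are hypothetical.

```python
# Illustrative pressure-threshold check: raise the request only when the latest reading
# indicates the scanner is pressed against the patient. Values are placeholders.
PRESSURE_THRESHOLD_KPA = 5.0   # hypothetical contact threshold

def should_generate_request(pressure_readings_kpa) -> bool:
    """Return True when the latest reading exceeds the contact threshold."""
    if not pressure_readings_kpa:
        return False
    return pressure_readings_kpa[-1] > PRESSURE_THRESHOLD_KPA

print(should_generate_request([0.2, 0.4, 7.9]))   # True: scanner pressed against patient
print(should_generate_request([0.2, 0.3]))        # False: no contact yet
```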
Additionally or alternatively, the ultrasound scanner can include in, under, or on a surface, any suitable type of sensors for determining a grip orientation. In one example, the ultrasound scanner includes capacitive sensors that can measure a capacitance, or change in capacitance, caused by a user's touch or proximity of touch, as is common in touchscreen technologies. Additionally or alternatively, the ultrasound scanner can include pressure sensors configured to determine an amount of pressure caused by the user's grip on the scanner. Hence, the pressure data can indicate a grip orientation on the ultrasound scanner, and the ultrasound system can generate the request based on the grip orientation, e.g., the grip orientation can indicate an intention to use the ultrasound scanner by a clinician gripping the ultrasound scanner.
Additionally or alternatively, the ultrasound system includes one or more location sensors (e.g., cameras, light detection and ranging (LIDAR) sensors, etc.) implemented to determine a proximity of the ultrasound scanner to a patient. The ultrasound system can generate the request based on the proximity, such as when the proximity is less than a threshold proximity.
In some embodiments, the ultrasound scanner includes one or more buttons or switches implemented to receive the request as a user input. Additionally or alternatively, in some embodiments, the ultrasound system includes a voice recognition circuit implemented to receive the request as a spoken command.
In some embodiments, the two or more arrays of transducer elements make up a symmetric array architecture having a center row of the transducer elements and a same number of outer rows of the transducer elements on opposing sides of the center row. In some embodiments, the transducer elements of the center row and the outer rows have a common width. In some other embodiments, the transducer elements of the center row and the outer rows have different widths. In yet some other embodiments, the transducer elements of the center row and the outer rows have different operation frequencies and bandwidths.
In some embodiments, the two or more arrays of transducer elements make up an asymmetric array architecture having a center row of the transducer elements and different numbers of outer rows of the transducer elements on opposing sides of the center row.
In some embodiments, the two or more arrays of transducer elements are arranged in concentric shapes. For example, the concentric shapes can include concentric annuli or ellipses. In an example, the concentric shapes include concentric polygons. In another example, the concentric shapes include nested open shapes, such as in the shape of a “V”, “L”, or “C”.
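The concentric arrangement can be illustrated by placing element centers on concentric circles, as in the sketch below; the radii and element counts are assumed values.

```python
# Illustrative sketch of a concentric (annular-style) layout: element centers placed on
# concentric circles. Radii and element counts are hypothetical.
import math

def concentric_ring_positions(radii_mm, elements_per_ring):
    """Return (x, y) element centers, in mm, for each concentric ring."""
    rings = []
    for radius, count in zip(radii_mm, elements_per_ring):
        ring = [(radius * math.cos(2 * math.pi * k / count),
                 radius * math.sin(2 * math.pi * k / count)) for k in range(count)]
        rings.append(ring)
    return rings

rings = concentric_ring_positions(radii_mm=[3.0, 6.0], elements_per_ring=[16, 32])
print(len(rings[0]), len(rings[1]))   # 16 inner elements, 32 outer elements
```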
The example computing device 1100 can include a processing device 1102 (e.g., a general-purpose processor, a programmable logic device (PLD), etc.), a main memory 1104 (e.g., synchronous dynamic random-access memory (DRAM), read-only memory (ROM), etc.), and a static memory 1106 (e.g., flash memory, a data storage device 1108, etc.), which can communicate with each other via a bus 1110. The processing device 1102 can be provided by one or more general-purpose processing devices such as a microprocessor, a central processing unit, or the like. In an illustrative example, the processing device 1102 comprises a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. The processing device 1102 can also comprise one or more special-purpose processing devices such as an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), a network processor, or the like. The processing device 1102 can be configured to execute the operations described herein, in accordance with one or more aspects of the present disclosure, for performing the operations and steps discussed herein.
The computing device 1100 can further include a network interface device 1112, which can communicate with a network 1114. The computing device 1100 also can include a video display unit 1116 (e.g., a liquid crystal display (LCD), an organic light-emitting diode (OLED), a cathode ray tube (CRT), etc.), an alphanumeric input device 1118 (e.g., a keyboard), a cursor control device 1120 (e.g., a mouse), and an acoustic signal generation device 1122 (e.g., a speaker, a microphone, etc.). In some embodiments, the video display unit 1116, the alphanumeric input device 1118, and the cursor control device 1120 can be combined into a single component or device (e.g., an LCD touch screen).
The data storage device 1108 can include a computer-readable storage medium 1124 on which can be stored one or more sets of instructions 1126 (e.g., instructions for carrying out the operations described herein, in accordance with one or more aspects of the present disclosure). The instructions 1126 can also reside, completely or at least partially, within the main memory 1104 and/or within the processing device 1102 during execution thereof by the computing device 1100, where the main memory 1104 and the processing device 1102 also constitute computer-readable media. The instructions can further be transmitted or received over the network 1114 via the network interface device 1112.
Various techniques are described in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types. The terms “module,” “functionality,” and “component” as used herein generally represent software, firmware, hardware, or a combination thereof. In some aspects, the modules described herein are embodied in the data storage device 1108 of the computing device 1100 as executable instructions or code. Although represented as software implementations, the described modules can be implemented as any form of a control application, software application, signal-processing and control module, hardware, or firmware installed on the computing device 1100.
While the computer-readable storage medium 1124 is shown in an illustrative example to be a single medium, the term “computer-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database and/or associated caches and servers) that store the one or more sets of instructions. The term “computer-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the machine and that causes the machine to perform the methods described herein. The term “computer-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
The ultrasound systems 1202 and 1204 can be in communication via the network 1206 as part of the environment 1200. The network 1206 can include any suitable network, such as a local area network, a wide area network, a near field communication network, the Internet, an intranet, an extranet, a system bus that couples devices or device components (e.g., in an ASIC, FPGA, or SOC), and combinations thereof. Accordingly, in embodiments, information can be communicated to the ultrasound systems 1202 and 1204 through the network 1206. For instance, the database 1208 can store instructions executable by a processor system of the ultrasound systems 1202 and 1204, and communicate the instructions via the network 1206. The database 1208 can store ultrasound resources and user interface components and share them with the ultrasound systems 1202 and 1204.
The environment 1200 also includes a server system 1210 that can implement any of the functions described herein. The server system 1210 can be a separate device from the ultrasound systems 1202 and 1204. Alternatively, the server system 1210 can be included in at least one of the ultrasound systems 1202 and 1204. In one example, the server system 1210 and the database 1208 are included in at least one of the ultrasound systems 1202 and 1204. In an example, the server system 1210 is implemented as a remote server system that is remote from (e.g., not collocated with) the ultrasound systems 1202 and 1204.
In some embodiments, the ultrasound control panel 1302 includes any suitable controls and settings for controlling an ultrasound system, such as depth and gain adjustments, and a button to store/save images and/or video clips. The ultrasound control panel 1302 can also include icons to select examination presets, such as a heart icon for a cardiac preset, a lung icon for a respiratory preset, an eye icon for an ocular preset, and a leg icon for a muscular-skeletal preset. The ultrasound control panel 1302 can also include options (not illustrated in
The ultrasound image panel 1304 can display any suitable ultrasound image, such as a B-mode image, M-mode image, Doppler image, etc. The ultrasound image panel 1304 can also display a measurement, annotation, classification, and the like. In embodiments, the ultrasound image panel 1304 can display an inference generated by a neural network, such as a segmentation of a patient anatomy.
In some embodiments, the multi-functional transducer configuration panel 1306 includes controls for enabling and disabling arrays of a multi-functional transducer and/or setting imaging modes for imaging with a multi-functional transducer. The multi-functional transducer configuration panel 1306 in
In some embodiments, the multi-functional transducer configuration panel 1306 also includes pull-down tabs to select an array and configure the array. For example, the multi-functional transducer configuration panel 1306 in
In some embodiments, the multi-functional transducer configuration panel 1306 also includes pull-down tabs to enable the multi-functional transducer in an automatic mode that automatically configures the arrays based on a determination made by the ultrasound system. Example determinations displayed in
The transducer representation panel 1308 can display any suitable type and number of visual representations for a multi-functional transducer that indicates an imaging mode and/or array configuration for elements of arrays of the multi-functional transducer. In
In some embodiments, the transducer representation panel 1308 in
The visual representations 1310 and 1312 are meant to be exemplary and non-limiting. In some embodiments, a user can select what visual representations are displayed in transducer representation panel 1308, such as with pull-down tabs (not shown for clarity). In some embodiments, the transducer representation panel 1308 updates the visual representations it displays based on the imaging mode and/or array configuration selected in the multi-functional transducer configuration panel 1306.
The discussions of arrays of a multi-functional transducer above largely focus on arrays comprised of rows of transducer elements, as illustrated in
In some embodiments, the circular array 1402 includes an outer array 1408 of transducer elements and an inner array 1410 of transducer elements arranged in concentric circles. Although circles are illustrated in the circular array 1402, the outer array 1408 and the inner array 1410 can include elements arranged in concentric ellipses in embodiments.
In some embodiments, the polygonal array 1404 includes three nested arrays of triangular shape, including an outer array 1412 of transducer elements, a center array 1414 of transducer elements, and an inner array 1416 of transducer elements. The triangular shapes of the three arrays of the polygonal array 1404 are examples of polygons and are meant to be exemplary. Other polygonal shapes that can be included in the polygonal array 1404 include nested arrays arranged in rectangular, rhombic, pentagonal, and similar shapes.
In some embodiments, the open-shaped array 1406 includes four nested arrays of L-shapes, including a first outer array 1418 of transducer elements, a second outer array 1420 of transducer elements, a first inner array 1422 of transducer elements, and a second inner array 1424 of transducer elements. The L-shapes of the four arrays of the open-shaped array 1406 are examples of open shapes and are meant to be exemplary. Other open shapes that can be included in the open-shaped array 1406 include nested arrays arranged in C-shapes, V-shapes, S-shapes, and the like.
As discussed above with respect to the rows of array elements, e.g., with regards to
A number of example embodiments are described herein.
Example 1 is an ultrasound device comprising: a lens; an array coupled to the lens and having a center row of transducer elements of a first width that operate at a first frequency and two or more outer rows of transducer elements of a second or other widths that operate at a second or other frequencies different than the first frequency, wherein the first width is different than the second width, the center row being between the two or more outer rows; and a controller coupled to the array and configured to control the center row of transducer elements and two or more outer rows of transducer elements to operate at a same time or at different times.
Example 2 is the ultrasound device of example 1 that may optionally include that, during operation, the center row operates as a linear array and a pair of rows of the two or more rows operate as a phased array.
Example 3 is the ultrasound device of example 2 that may optionally include that the first frequency is twice the second frequency.
Example 4 is the ultrasound device of example 3 that may optionally include that element pitches for both the center row and the pair of rows are equal.
Example 5 is the ultrasound device of example 1 that may optionally include that the controller controls the array in a plurality of modes in which either the center row of transducer elements or the two or more rows of transducer elements is operating or both the center row of transducer elements and the two or more rows of transducer elements are operating at the same time based on which mode of the plurality of modes is being used.
Example 6 is the ultrasound device of example 5 that may optionally include that the controller is configured to control the center row of transducer elements and two or more outer rows of transducer elements independently in one of the modes to operate at the same time to obtain signals for performing super broadband harmonic imaging by controlling the center row of transducer elements to perform a receive operation while controlling the two or more rows of transducer elements to perform transmit operations.
Example 7 is the ultrasound device of example 5 that may optionally include that the plurality of modes includes: a first mode in which the controller causes both the center row and the two or more rows of transducer elements to perform both transmit and receive operations to provide signals for full aperture imaging on an area of bandwidth produced by both that overlaps; a second mode in which the controller causes the center row of transducer elements to perform a receive operation and the two or more rows of transducer elements to perform transmit and receive operations to provide signals for Tissue Harmonic Imaging (THI); and a third mode in which the controller causes both the center row and the two or more rows of transducer elements to perform both transmit and receive operations to provide signals for super broadband full aperture THI.
Example 8 is the ultrasound device of example 5 that may optionally include a user interface to enable a user to cause the controller to switch between modes of the plurality of modes.
Example 9 is the ultrasound device of example 8 that may optionally include a probe enclosure that contains the array, and wherein the user interface comprises one or more buttons, one or more sensors, one or more switches coupled to the probe enclosure, or voice control to switch between the modes.
Example 10 is the ultrasound device of example 8 that may optionally include that the user interface comprises one or more buttons, one or more sensors, or one or more switches coupled to an ultrasound system or voice control communicably coupled to the array to switch between the modes.
Example 11 is the ultrasound device of example 1 that may optionally include that each of the transducer elements is part of an acoustic stack that includes a backing block through which at least a signal coupled to said each transducer element traverses.
Example 12 is the ultrasound device of example 1 that may optionally include that each of the transducer elements is part of an acoustic stack, and stacks associated with transducer elements of the center row have different focal depths and beam characteristics than stacks associated with transducer elements of the two or more rows.
Example 13 is the ultrasound device of example 1 that may optionally include one or more system connectors to interface signals from each of the transducer elements of the center row and the two or more rows to an ultrasound system, the one or more system connectors comprising first and second sets of tuning inductors, the first set of tuning inductors for interfacing signals from the transducer elements of the center row and the second set of tuning inductors for interfacing signals from the transducer elements of the two or more rows, the tuning inductors of the second set having different inductor values than the tuning inductors of the first set.
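A common reason the two sets of tuning inductors in example 13 would use different values is that a tuning inductor is typically chosen to resonate with the element and cable capacitance near the row's operating frequency, so the required inductance scales roughly as 1/f². The capacitance and frequencies below are assumptions used only to illustrate that scaling, not values taken from this disclosure:

```python
import math

C_ELEMENT = 150e-12   # assumed element-plus-cable capacitance, farads

def tuning_inductor(f_hz: float, c_f: float = C_ELEMENT) -> float:
    """Inductance that resonates with c_f at f_hz (L = 1 / ((2*pi*f)^2 * C))."""
    return 1.0 / ((2.0 * math.pi * f_hz) ** 2 * c_f)

for f in (2.5e6, 5.0e6):   # assumed low and high operating frequencies
    print(f"{f/1e6:.1f} MHz -> {tuning_inductor(f)*1e6:.1f} uH")
# The higher-frequency (center-row) channels need roughly one quarter the inductance.
```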
Example 14 is the ultrasound device of example 1 that may optionally include that each of the transducer elements comprises a piezoelectric element.
Example 15 is an ultrasound device comprising: a lens; a high frequency array comprising a center row of transducer elements that operate at a first frequency and a low frequency array comprising two outer rows of transducer elements that operate at a second or other frequencies different than the first frequency, the center row of transducer elements being between the two outer rows of transducer elements; and a controller coupled to the high frequency array and the low frequency array and configured to control each of the high frequency array and the low frequency array in a plurality of modes to operate at a same time or at different times. The modes include a first mode in which the controller controls the high frequency array to operate as a linear array while the low frequency array is not operating; a second mode in which the controller controls the low frequency array to operate as a phased array while the high frequency array is not operating; and a third mode in which the controller controls the high and low frequency arrays to operate at the same time to obtain signals for performing super broadband harmonic imaging.
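As an illustrative sketch of the difference between the first and second modes of example 15 (the element count, pitch, and focal geometry below are assumed for illustration only), a linear-array mode fires a focused sub-aperture straight ahead, while a phased-array mode steers the full aperture by applying a linear delay gradient:

```python
import numpy as np

C = 1540.0                           # assumed speed of sound, m/s
PITCH = 0.3e-3                       # assumed element pitch, m
x = (np.arange(32) - 15.5) * PITCH   # element x-positions of an assumed 32-element aperture

def focus_delays(x_elem, x_focus, z_focus):
    """Transmit delays so all elements' wavefronts arrive at the focus together."""
    tof = np.hypot(x_elem - x_focus, z_focus) / C
    return tof.max() - tof

# Linear-array style: focus straight ahead of the (sub)aperture center.
linear_tx = focus_delays(x, x_focus=0.0, z_focus=30e-3)

# Phased-array style: steer the whole aperture by 20 degrees (plane-wave delays).
theta = np.deg2rad(20.0)
tof_steer = x * np.sin(theta) / C
phased_tx = tof_steer.max() - tof_steer

print(linear_tx[:4] * 1e9, "ns")   # symmetric, parabola-like focusing delays
print(phased_tx[:4] * 1e9, "ns")   # linear ramp across the aperture
```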
Example 16 is the ultrasound device of example 15 that may optionally include that, during the third mode, the high frequency array filters out reflections of signals transmitted from the low frequency array.
Example 17 is the ultrasound device of example 15 that may optionally include that the plurality of modes includes one or more of: a fourth mode in which the controller causes both the high and low frequency arrays to perform both transmit and receive operations to provide signals for full aperture imaging over the area of overlapping bandwidth produced by both; a fifth mode in which the controller causes the high frequency array to perform a receive operation and the low frequency array to perform both transmit and receive operations to provide signals for Tissue Harmonic Imaging (THI); and a sixth mode in which the controller causes both the high and low frequency arrays to perform both transmit and receive operations to provide signals for super broadband full aperture THI.
Example 18 is the ultrasound device of example 15 that may optionally include that the first frequency is twice the second frequency, and wherein element pitches for both the high frequency array and the low frequency array are equal.
Example 19 is a method of controlling an ultrasound system, the method comprising: controlling a multi-frequency and multi-dimensional array of transducer elements having a high frequency array comprising a center row of transducer elements that operate at a first frequency and a low frequency array comprising two outer rows of transducer elements that operate at a second or other frequencies different than the first frequency, the center row of transducer elements being between the two outer rows of transducer elements, wherein controlling the multi-frequency and multi-dimensional array comprises controlling both the high frequency array and the low frequency array to operate at the same time; receiving, by both the high and low frequency arrays, reflected signals; and performing super broadband harmonic imaging based on the received signals.
Example 20 is the method of example 19 that may optionally include that the bandwidth of the received signals covers the bandwidths of both the high and low frequency arrays.
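To make examples 16, 19, and 20 concrete, the following sketch (with assumed frequencies, sampling rate, and filter choices; it is not the claimed processing chain) separates a received line into the low-frequency fundamental band and the second-harmonic band that falls within the high frequency array's passband, so the two can be combined into one broadband line:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 40e6      # assumed sampling rate, Hz
F_TX = 2.5e6   # assumed transmit center frequency of the low frequency (outer) rows

def bandpass(signal, f_lo, f_hi, fs=FS, order=4):
    """Zero-phase Butterworth bandpass used here purely for illustration."""
    b, a = butter(order, [f_lo, f_hi], btype="bandpass", fs=fs)
    return filtfilt(b, a, signal)

rf_line = np.random.randn(4096)   # placeholder for one received RF line

fundamental = bandpass(rf_line, 1.5e6, 3.5e6)      # low frequency array band
second_harmonic = bandpass(rf_line, 4.0e6, 6.0e6)  # band around 2 * F_TX, seen by the center row

broadband_line = fundamental + second_harmonic     # naive combination for illustration
```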
All of the methods and tasks described herein may be performed and fully automated by a computer system. The computer system may, in some cases, include multiple distinct computers or computing devices (e.g., physical servers, workstations, storage arrays, cloud computing resources, etc.) that communicate and interoperate over a network to perform the described functions. Each such computing device typically includes a processor (or multiple processors) that executes program instructions or modules stored in a memory or other non-transitory computer-readable storage medium or device (e.g., solid state storage devices, disk drives, etc.). The various functions disclosed herein may be embodied in such program instructions or may be implemented in application-specific circuitry (e.g., ASICs or FPGAs) of the computer system. Where the computer system includes multiple computing devices, these devices may, but need not, be co-located. The results of the disclosed methods and tasks may be persistently stored by transforming physical storage devices, such as solid-state memory chips or magnetic disks, into a different state. In some embodiments, the computer system may be a cloud-based computing system whose processing resources are shared by multiple distinct business entities or other users.
Depending on the embodiment, certain acts, events, or functions of any of the processes or algorithms described herein can be performed in a different sequence, can be added, merged, or left out altogether (e.g., not all described operations or events are necessary for the practice of the algorithm). Moreover, in some embodiments, operations or events can be performed concurrently, e.g., through multi-threaded processing, interrupt processing, or multiple processors or processor cores or on other parallel architectures, rather than sequentially.
The various illustrative logical blocks, modules, routines, and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware (e.g., ASICs or FPGA devices), computer software that runs on computer hardware, or combinations of both. Moreover, the various illustrative logical blocks and modules described in connection with the embodiments disclosed herein can be implemented or performed by a machine, such as a processor device, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A processor device can be a microprocessor, but in the alternative, the processor device can be a controller, microcontroller, or state machine, combinations of the same, or the like. A processor device can include electrical circuitry configured to process computer-executable instructions. In another embodiment, a processor device includes an FPGA or other programmable device that performs logic operations without processing computer-executable instructions. A processor device can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Although described herein primarily with respect to digital technology, a processor device may also include primarily analog components. For example, some or all of the rendering techniques described herein may be implemented in analog circuitry or mixed analog and digital circuitry. A computing environment can include any type of computer system, including, but not limited to, a computer system based on a microprocessor, a mainframe computer, a digital signal processor, a portable computing device, a device controller, or a computational engine within an appliance, to name a few.
The elements of a method, process, routine, or algorithm described in connection with the embodiments disclosed herein can be embodied directly in hardware, in a software module executed by a processor device, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of a non-transitory computer-readable storage medium. An exemplary storage medium can be coupled to the processor device such that the processor device can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor device. The processor device and the storage medium can reside in an ASIC. The ASIC can reside in a user terminal. In the alternative, the processor device and the storage medium can reside as discrete components in a user terminal.
Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements, or steps. Thus, such conditional language is not generally intended to imply that features, elements, or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without other input or prompting, whether these features, elements or steps are included or are to be performed in any particular embodiment. The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list.
Disjunctive language such as the phrase “at least one of X, Y, or Z,” unless specifically stated otherwise, is otherwise understood within the context as used in general to present that an item, term, etc., may be either X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, and at least one of Z to each be present.
While the above detailed description has shown, described, and pointed out novel features as applied to various embodiments, it can be understood that various omissions, substitutions, and changes in the form and details of the devices or algorithms illustrated can be made without departing from the spirit of the disclosure. As can be recognized, certain embodiments described herein can be embodied within a form that does not provide all of the features and benefits set forth herein, as some features can be used or practiced separately from others. The scope of certain embodiments disclosed herein is indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.
The present application is a continuation-in-part of, and claims the benefit of, U.S. patent application Ser. No. 17/561,313, filed Dec. 23, 2021, and entitled “ARRAY ARCHITECTURE AND INTERCONNECTION FOR TRANSDUCERS”, which is incorporated by reference in its entirety.
| Relation | Number | Date | Country |
|---|---|---|---|
| Parent | 17/561,313 | Dec. 2021 | US |
| Child | 18/613,694 | | US |