The present technology is generally related to sensing physiological information, and more particularly to multimodal sensing technologies for use in monitoring physiological information.
Systems exist for monitoring physiological information, including electrophysiological measurements, anatomical features, and the like. In one example, an arrangement of electrodes is placed on a patient's thorax to measure cardiac electrophysiological signals. The measured electrophysiological information is combined with patient geometry to reconstruct electrophysiological signals on a cardiac surface by solving an inverse problem. Typically, the patient geometry is derived from three-dimensional images acquired using computed tomography or another high-resolution imaging modality. The cost and availability of such high-resolution imaging modalities can limit the use of these and other similar technologies.
The techniques of this disclosure generally relate to multimodal sensing technologies for use in monitoring physiological information.
In one aspect, the present disclosure provides a system that includes a sheet of flexible material having a contact surface adapted to be placed on an outer surface of a patient's body. A plurality of sensing apparatuses have respective sensing surfaces distributed across the contact surface of the sheet. One or more of the sensing apparatuses include a multimodal sensing apparatus. Each multimodal sensing apparatus includes a monolithic substrate carrying a transducer, circuitry and an electrophysiological sensor. The transducer is coupled to the circuitry and configured to at least sense acoustic energy from a transducer location of the sheet. The electrophysiological sensor is also coupled to the circuitry and is configured to at least sense electrophysiological signals from a sensor location of the sheet, in which the sensor location has a known spatial position relative to the transducer location.
In another aspect, the disclosure provides a method that includes placing a sensing system on an outer surface of a patient's body, in which the sensing system includes an arrangement of electrophysiological sensors and ultrasound transducer modules. The method also includes providing ultrasound image data based on ultrasound images acquired by the ultrasound transducer modules according to the placement of the sensing system. The method also includes generating a three-dimensional image volume based on the ultrasound image data, in which the three-dimensional image volume includes patient anatomy and at least some of the electrophysiological sensors. The method also includes determining locations of the electrophysiological sensors and at least one anatomical surface within the patient's body based on image processing applied to the three-dimensional image volume. The method also includes generating geometry data representative of a spatial relationship between the electrophysiological sensors and the anatomical surface in a three-dimensional coordinate system.
In another aspect, the disclosure provides a system that includes a sensing system and a remote system. The remote system can be coupled to the sensing system through a communication link. The sensing system includes an arrangement of electrophysiological sensors and ultrasound transducer modules on a flexible sheet adapted to be placed on an outer surface of a patient's body. The electrophysiological sensors are configured to measure electrophysiological signals from the body surface, and the ultrasound transducer modules are configured to measure acoustic waves from the body surface and provide respective ultrasound images. The remote system includes one or more non-transitory machine readable media to store data and instructions. The data includes ultrasound image data representative of the respective ultrasound images provided by the ultrasound transducer modules, electrophysiological data representative of the electrophysiological signals measured from the body surface, and geometry data representing a spatial relationship between the electrophysiological sensors and patient anatomy in a three-dimensional coordinate system, the geometry data being determined based on the ultrasound image data. The remote system also includes a processor to access the media and execute the instructions, such as to analyze at least one of the ultrasound image data and the electrophysiological data, and provide output data to visualize physiological information for the patient based on the analysis.
The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.
This description relates to systems and methods to implement multimodal sensing for use in measuring and/or monitoring physiological information. The sensed physiological information can be combined and analyzed to provide a measure of one or more physiological conditions. The systems and methods described herein can be used to measure the one or more physiological conditions for diagnostic purposes. Additionally, or alternatively, the systems and methods described herein can be used during or in connection with an intervention, such as to measure one or more physiological conditions as part of (e.g., before, during and/or after) delivery of a therapy to the patient and/or performance of a surgical intervention.
For example, a sensing system includes a sheet of flexible material adapted to be applied to and conform to an outer surface of a patient's body. The sheet can be in the form of a garment, such as a vest, shirt or hat, which can be worn by the patient so that a contact surface of the sheet engages the outer surface of the body. A distributed arrangement of sensors is carried by the sheet. Thus, when the sheet is applied to (or worn on) the outer surface of the body, the sensors are adapted to sense more than one type of physiological information from the body. In an example, the sensors include an arrangement of electrodes and audio transducers (e.g., ultrasound and/or auscultation transducers), in which at least some of the sensors are implemented in multimodal sensing modules.
For example, a multimodal sensing apparatus includes an electrode and a transducer integrated in a monolithic structure (e.g., an integrated circuit (IC) chip, system on chip (SoC) or mounted to a circuit board substrate). The transducer can be a solid state ultrasound or auscultation transducer implemented on or within a packaging material. The transducer can include a number of transducer elements coupled to electrical circuitry also implemented within the packaging material (e.g., on chip). The electrical circuitry is configured to control the transducer elements to transmit and receive signals and to process the received signals (e.g., amplification and filtering) to provide physiological signals. The electrophysiological sensor (e.g., an electrode) can also be implemented on or partially within the packaging material, and can have a spatial position that is known relative to the transducer. The physiological information acquired by the sensors can be communicated from the respective sensing apparatuses to a remote device through a communication link (e.g., a wireless or physical link) for storage and/or additional processing, such as described herein.
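By way of illustration, the following is a minimal sketch of a record that such a multimodal sensing apparatus might emit over the communication link; the field names and types are hypothetical and are not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class ModuleSample:
    """Hypothetical record emitted by one multimodal sensing apparatus."""
    module_id: int               # identifies the sensing apparatus on the sheet
    timestamp_us: int            # acquisition time stamp (microseconds)
    ep_sample_uv: float          # amplified/filtered electrode sample (microvolts)
    echo_frame: bytes            # processed transducer (ultrasound) payload
    electrode_offset_mm: tuple   # known electrode position relative to the transducer
```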
Respective image volumes generated by ultrasound transducers can be stitched together to provide a compounded (three-dimensional) image volume for the body, which can be used to generate geometry data representative of internal anatomy (e.g., one or more cardiac surfaces, lungs and bones) as well as the body surface over which the sensing system is placed. The 3D compounded image volume for the body further can be stitched together over time to produce a four-dimensional anatomical image (e.g., a motion picture of anatomy) in real time for the patient. The geometry data can also be combined with electrophysiological data representing electrophysiological signals sensed by electrodes distributed across the patient's body to generate one or more electro-anatomical maps. The electro-anatomical maps can display a visualization of real-time electrophysiological information (e.g., body surface and/or reconstructed electrophysiological signals) superimposed on the four-dimensional anatomical image of the heart, which can also be a live, real-time image of the heart. Advantageously, the patient workflow can be performed in the absence of ionizing radiation (e.g., without any CT, fluoroscopy or other x-ray imaging). Because the approach can be "fluoro-free," the patient as well as the caregivers need not be exposed to such radiation. Additionally, because both anatomical data and electrophysiological data can be obtained using a single sensing apparatus (e.g., in the form of a sheet or garment, such as a vest), the time and cost associated with collecting such information as well as generating anatomic and/or electrophysiological graphical maps can be reduced compared to many existing approaches.
The sensing apparatus 102 includes a transducer module (e.g., a solid state transducer module) 106 and an electrophysiological sensor 108. As shown in the example of
As an example, the sensor 108 includes an electrode 114 having one or more layers of an electrically conductive material (e.g., aluminum, silver, gold, copper, etc.) that extend from a surface 116 of the substrate 110. For example, the electrode 114 can be formed by a deposition process through a patterned mask structure (e.g., by evaporation or sputtering). Metal interconnects, shown schematically at 118, also can be formed in the substrate 110 to couple the electrode 114 to associated circuitry formed in the substrate 110. The circuitry, shown schematically at 120, can be configured to amplify and/or filter electrophysiological signals sensed by the electrode 114.
In some examples, the circuitry 120 includes a wireless interface configured to send and receive signals relative to the sensing apparatus 102, such as through a wireless communication link between the sensing apparatus and a remote unit. The signals sent from the sensing apparatus 102 can include electrophysiological signals sensed by the electrode 114 and/or signals measured by the transducer module 106. The signals can be raw signals or processed signals, such as signals that have been processed by control and signal processing electronics implemented as part of the circuitry 120.
In other examples, interconnects within the substrate 110 can also, or alternatively, couple the electrode 114 and the circuitry 120 (e.g., through respective interconnects) to an arrangement of output terminals 122 (e.g., contacts or pins) formed at a mounting surface of the substrate 110. The terminals 122 thus are adapted for sending and/or receiving signals (and/or electrical power) relative to the sensing apparatus 102. The configuration and arrangement of terminals 122 can vary according to the type of IC packaging used to form the sensing apparatus 102. The terminals 122 can couple to respective pads or contacts (not shown) implemented in the sheet 104. For example, the sheet 104 can include multiple layers 124 and 126. The pads or contacts can be formed on a surface of layer 126, which includes traces and/or wires configured to carry the electrical signals and/or power relative to the sensing apparatus 102. The other layer 124 can provide a flexible cover over the entire layer 126 or over a portion that includes the traces and/or wires. The traces and/or wires can route to respective connectors provided at one or more locations of the sheet 104. The connectors can be coupled to a remote unit for further processing and/or analysis of the signals. The remote unit can also provide control instructions to the sensing apparatus 102 through the communication link, which includes the wires and/or traces.
In an example, the transducer module 106 can be implemented as an ultrasound transducer and/or an auscultation transducer. For example, the transducer module 106 is implemented as a transducer array having a number of transducer elements 112 distributed across the surface of the apparatus 102. Each of the transducer elements 112 can be formed as a microelectromechanical systems (MEMS) acoustic transducer element (e.g., a capacitive micromachined ultrasound transducer) configured to receive and/or transmit acoustic energy, such as acoustic waves that propagate through the body. As used herein, the acoustic waves can include audible sound waves and/or ultrasound waves propagating through the body.
For example, each of the MEMS elements is configured to transmit ultrasonic waves as well as to receive ultrasonic vibrations (e.g., about 10 kHz to about 100 MHz), which are converted to electronic signals, amplified, and processed by associated circuitry 120. The circuitry 120 can also convert the signals from the transducer elements 112 into electrical signals representative of a corresponding ultrasound image volume, which can vary over time (e.g., a 3D or 4D image volume).
In another example, the transducer is implemented as or includes an auscultation device configured to receive audible acoustic vibrations (e.g., about 10 Hz to about 20 kHz), which are converted to electronic signals, amplified, and processed by associated circuitry 120. The electronic signals can be communicated to a remote unit through a communication link (e.g., wireless or through a physical medium), such as described herein.
Similar to the example of
In one example, the SoC apparatus 302 includes a wireless interface configured to send and receive signals relative to the sensing apparatus 302, such as through a wireless communication link between the sensing apparatus and a remote unit. The wireless communication link can be a bidirectional link or a unidirectional link. The wireless interface can be implemented as part of the circuitry 320 on the transducer IC die or another IC die within the SoC apparatus 302. The signals communicated through the communication link can include electrophysiological signals sensed by the electrode 314 and/or signals obtained by the respective elements of the transducer module 306. The signals can include raw signals and/or processed signals, such as signals that have been processed by control and signal processing electronics implemented as part of the circuitry 320. Control instructions can also be received by the wireless interface through the wireless communication link.
In another example, the SoC apparatus 302 can further include an arrangement of terminals 332 (e.g., contacts, pins or pads) formed at a mounting surface of the substrate 310. The transducer IC bond pads 322 are coupled to respective terminals 332 through bond wiring or other connections. The terminals 332 can be input/output terminals adapted for sending and/or receiving signals (and/or electrical power) relative to the sensing apparatus 302. The configuration and arrangement of terminals 332 can vary according to the type of IC packaging used to form the sensing apparatus 302. The terminals 332 can also couple to respective pads or contacts (not shown) implemented in the sheet 304. For example, the sheet 304 can include multiple layers 324 and 326. The pads or contacts can be formed on a surface of layer 326, which includes traces and/or wires configured to carry the electrical signals and/or power relative to the sensing apparatus 302. The other layer 324 can provide a flexible cover over the entire layer 326 or over a respective portion that includes the traces and/or wires. The traces and/or wires can route to respective connectors provided at one or more locations of the sheet 304. The connectors can be coupled to a mating connector adapted to be coupled to a remote unit (not shown) for further processing and/or analysis of the signals. The remote unit can also provide control instructions to the sensing apparatus 302 through the communication link, which includes the wires and/or traces.
The sensing apparatus 502 can be configured similarly to the example of
For example, the transducer module 506 is implemented as a transducer IC chip that includes an array of transducer elements 512 coupled to circuitry also implemented within the transducer IC chip. The transducer elements 512 can be MEMS ultrasound transducer elements, piezoelectric ultrasound elements or auscultation transducers. The transducer IC chip can include electrical circuitry configured to control the transducer elements to transmit and receive ultrasound signals and to process the received signals (e.g., amplification and filtering) to provide respective physiological signals.
The electrophysiological sensor 508 can be implemented as an electrode 514 that is also mounted to the surface 516 of the PCB substrate 510. In an example, the electrode 514 includes a coupling (e.g., a pad or terminal) coupled to circuitry 536 by electrical traces or wires 518. The circuitry 536 can be configured to process (e.g., amplify and filter) the electrophysiological signals acquired by the electrode 514. In another example, the electrode 514 is coupled to a terminal (or terminals) of an IC carrying the transducer module 506, which includes circuitry to process the electrophysiological signals acquired by the electrode 514. The circuitry can also include a communication interface to communicate signals relative to the sensing apparatus 502.
In an example where the sheet to which the sensing apparatus is mounted includes connectors and/or circuitry for further processing or communication of the acquired signals, the communication interface can be coupled to respective terminals of a connector 532, such as by traces and/or wires 540 routed through one or more layers of the PCB substrate 510. The connector 532 can include input/output terminals adapted for sending and/or receiving signals (and/or electrical power) relative to the sensing apparatus 502 (e.g., through the communication interface). The connector 532 can be adapted to couple to respective pads or contacts (e.g., of a mating connector) implemented at respective locations of the sheet (not shown), which mating connectors are adapted to be coupled to a remote unit for further processing and/or analysis of the signals. The remote unit can also provide control instructions to the sensing apparatus 502 through a respective communication link, which includes the wires and/or traces. One or more sheet couplings can also be configured to hold the sensing apparatus at a fixed location with respect to the sheet.
In another example, the sensing apparatus 502 includes a wireless communication interface coupled to the PCB substrate 510, such as implemented in the circuitry 536 or the transducer IC. The wireless interface can be configured to send and receive signals relative to the sensing apparatus 502, such as through a wireless communication link between the sensing apparatus and a remote unit. The signals sent from the sensing apparatus 502 can include electrophysiological signals sensed by the sensor 508 and/or signals (e.g., acoustic energy) measured by respective elements of the transducer module 506. The signals can be raw signals or processed signals, such as signals that have been processed by control and signal processing electronics implemented as part of the circuitry 536. In an example that uses a wireless communication interface, the sensing apparatus can use an internal power supply, such as a battery, and/or electrical power can be supplied through respective power terminals (e.g., implemented through the connector 532).
An example of MEMS ultrasound transducers that can be used to implement the transducer module 106 and associated circuitry is disclosed in "Ultrasound-on-chip platform for medical imaging, analysis, and collective intelligence," Proc Natl Acad Sci USA, 2021 Jul. 6; 118(27), which is incorporated herein by reference. Other types and configurations of ultrasound transducers can be used in other examples.
A layer 904 of an electrically conductive gel (or other pliant and electrically conductive material) can be deposited over the electrode layer 902. For example, the layer 904 can be an adhesive gel (e.g., a wet gel or a solid gel construction), which can be applied by the manufacturer (e.g., before shipping) or prior to use (e.g., by the user). In another example, the layer 904 could be a dry electrode structure. The layer 904 and the electrode 902, individually or collectively, form the electrode structure 314 that provides an electrically conductive interface configured to contact a body surface of the patient.
An insulating layer 906 can be provided on a contact surface of the substrate layer 324 to cover the electrically conductive traces applied with the layer 902. The insulating layer can be a dielectric material having a dielectric strength sufficient to prevent the flow of electrical current. The insulating layer 906 can be a coating applied as a liquid (e.g., via spraying, deposition, or the like) onto the contact surface of the flexible substrate layer 324 and over the exposed electrically conductive traces. The insulating layer 906 can be applied to the entire contact surface except where the electrode layers 902 and 904 have been applied to the substrate layer 324. A mask or other means can be utilized to prevent application of the insulating material onto the exposed electrode structures 902.
A corresponding adhesive layer 908 can be applied in a circumscribing relationship around each of the electrode layers 902 and 904 to facilitate secure attachment of the electrode structure 314 to a patient's body surface. For example, the adhesive layer 908 can be in the form of an annular ring of a foam or fabric material that surrounds each electrode structure 314. For example, the layer 906 can be secured to the elastic conformable layer 326 via an appropriate adhesive layer. The adhesive layer can be formed as an integral part of the layer 906 itself or be applied separately. Alternatively, the annular ring can be formed from a sheet of material having one side surface 910 containing a medical grade adhesive, while the other side can be initially free of adhesive but can be affixed to the contact surface side of the elastic polymer layer by applying an adhesive layer. The adhesive can be the same adhesive that is used to affix the polyester layer to the stretchable fabric layer 326, or it can be different.
Other example electrode configurations could be used to provide the electrophysiological sensor. For example, the electrophysiological sensors can be implemented as dry foam electrodes or dry polymer-based electrodes. An example of a dry polymer-based electrode structure that can be used is disclosed in I. Wang et al., "A Wearable Mobile Electrocardiogram Measurement Device with Novel Dry Polymer-Based Electrodes," TENCON 2010 - 2010 IEEE Region 10 Conference, 2010, pp. 379-384, which is incorporated herein by reference.
For example,
The sensing system 1200 includes one or more sheets of flexible, conformable material 1202 that provides a sensor-carrying substrate. For example, a single sheet can be formed in the desired shape such as shown in
In the example of
In some examples, each of the sensing apparatuses 1204 and 1206 can be coupled to a layer of the sheet configured to carry wires, traces and/or electrical circuitry (not shown in
The sensing system 1302 includes an arrangement of sensing apparatuses distributed across a substrate sheet 1310, such as the garment 1200 shown in
In one example, the remote system 1306 is configured to process physiological information received from the sensing apparatus through the wireless link, such as to generate one or more output visualizations based on the physiological information acquired by sensing apparatuses 1312, 1314 implemented in the sensing system 1302. One or more of the multimodal integrated sensing apparatuses 1312 can also include a wireless communication interface configured to communicate wirelessly with a wireless interface 1320 of the remote system 1306. For example, the wireless communication link can include one or more wireless links implemented according to one or more wireless communication technologies, such as an 802.11x standard, Bluetooth, cellular (e.g., GSM, 4G, 5G) and/or another wireless communication technology. In some examples, the communication link between the sensing system 1302 and the remote system 1306 can include a network of wireless communication links between each of the respective integrated sensing apparatuses 1312 and the wireless interface 1320. In other examples, the integrated sensing apparatuses 1312 can be configured in a daisy-chain configuration or master-slave configuration. For instance, a selected one of the integrated sensing apparatuses 1312 is configured to communicate directly with the wireless interface 1320, and the other integrated sensing apparatuses 1312 communicate directly or indirectly with the selected module (e.g., through a wireless or physical communication medium).
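A minimal sketch of the master-slave (aggregator) option follows, using hypothetical Module objects; it illustrates only the topology, in which a selected module serves as the sole uplink to the remote system's wireless interface 1320.

```python
from collections import deque

class Module:
    """Hypothetical stand-in for an integrated sensing apparatus 1312."""
    def __init__(self, module_id):
        self.module_id = module_id

    def read_packet(self):
        return {"id": self.module_id, "payload": b"..."}  # placeholder payload

def uplink(selected, others, wireless_send):
    """Only the selected module talks to the remote wireless interface;
    the remaining modules forward their packets to it (wirelessly or
    through a physical medium)."""
    queue = deque(m.read_packet() for m in others)
    queue.append(selected.read_packet())
    while queue:
        wireless_send(queue.popleft())  # single uplink to the remote unit

# Example: uplink(Module(0), [Module(i) for i in range(1, 5)], print)
```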
In the example of
As described herein, the sensing system 1302 can stream (e.g., via one or more wireless communication links) real-time ultrasound image data and electrophysiological data to the remote system 1306, and the computing apparatus 1322 can process such information to provide corresponding graphical output on the display 1326 to visualize such information. The visualization can include a graphical representation of electrophysiological information mapped spatially onto an anatomical surface of interest (e.g., a three-dimensional or four-dimensional image of the patient's heart). Because the physiological information provided by the sensing system 1302 can include both image data and electrophysiological data, the computing apparatus 1322 can generate the visualization to include both mechanical (e.g., rendered as a 3D or 4D image) and electrophysiological information over a number of cardiac cycles, which can be synchronized in time. In some examples, given sufficient processing capabilities for the computing apparatus, the visualization can provide a near real-time or even real-time visualization of such multi-modal physiological information.
In an example, the system 1300 can be used to perform a physiological study based on a combination of ultrasound images, auscultation recordings and/or electrophysiological signals obtained concurrently over one or more time intervals. In another example, the system 1300 can be used to monitor a patient's physiological condition during an intervention (e.g., using manual and/or robotic techniques), such as ablation, cardiac resynchronization therapy, valve repair or replacement, and the like. The computing apparatus 1322 can guide (e.g., through analysis and results provided on the display) and/or control the intervention based on a combination of ultrasound images, auscultation recordings and/or electrophysiological signals acquired concurrently over one or more time intervals. As mentioned above, the wireless communication link enables the user of the remote system to be either co-located in a common space with the patient and sensing system 1302, or the user can be at a remote (different) location from the patient. The remote location can be in a different room, a different building on a given campus, a different city or even a different country.
In other examples, the remote system can be implemented as a portable monitoring device (e.g., similar to a Holter monitor), such as for storing, in non-transitory memory, physiological signals measured by the sensing apparatuses 1312, 1314 over an extended period of time (e.g., a number of hours, days, weeks or more), which can be uploaded for further processing and/or analysis. As described herein, the physiological signals measured by the sensing apparatuses 1312, 1314 can include a combination of ultrasound images, auscultation recordings and/or electrophysiological signals obtained concurrently over time. The aggregate data that is acquired thus provides mechanical and electrical information to help determine or diagnose more complex conditions and comorbidities.
The display 1326 can be coupled to the computing apparatus 1322, and the user interface (e.g., a graphical user interface) 1324 can be associated with the computing apparatus 1322, such as for enabling a user to control the data acquisition process and to ensure that appropriate sensor connections have been made. The user interface can also be used to control operating parameters of ultrasound transducers in the multimodal sensing apparatuses 1312 for acquiring ultrasound images. The display 1326 may present the GUI to facilitate such controls. The computing apparatus 1322 can also be programmed to provide various features associated with the sensors and the data acquisition process. As an example, a user can employ a pointing device (e.g., a mouse or touch screen) or other input device (e.g., a keyboard or gesture control) to interact with the computing apparatus 1322. Such interactions can change the graphical and textual information on the display 1326. For instance, the user interface 1324 can be utilized to change between different sensor views or to enable interaction between multiple views that may be displayed concurrently for different parts of the system 1300.
As another example, a user can select one or more sensing apparatuses 1312, 1314 via the user interface 1324, such as can be presented on the display 1326 as part of an interactive graphical representation of a torso generated from the ultrasound images acquired by sensing apparatuses 1312. Several different interactive views of the sensing system 1302 can be provided, which can be utilized to configure and select the sensing apparatuses 1312, 1314.
In the example of
The communication interface 1430 can include amplifiers and other circuitry configured to receive and aggregate signals from respective sets of cables 1432 from different sensor circuits. The communication interface 1430 thus can be configured to amplify and filter (e.g., baseline noise removal filtering) the signals from each of the sensing apparatuses 1412, 1414 and provide a corresponding set of signals to the computing apparatus 1422. Additional filtering, signal processing and analysis can be implemented by the computing apparatus 1422.
As described herein, the sensing system 1402 can stream (e.g., via a communication link) real-time multi-modal physiological information (e.g., ultrasound image data, electrophysiological data, etc.) to the remote system 1406, and the computing apparatus 1422 can process such information to provide corresponding graphical output on display 1426 to visualize such information, which can provide a near real-time or even real-time visualization of such multi-modal physiological information, as described herein (e.g., with respect to
In some examples, such as where the systems and methods are used for electrocardiographic imaging, spatial geometry among the electrophysiological sensors and anatomy is needed. Typically, such geometry is determined from three-dimensional image data, such as acquired using computed tomography or magnetic resonance imaging modalities while a sensor apparatus is on the patient's body. However, such imaging modalities are expensive to use and may not be available in some settings. The sensing apparatuses, systems and methods disclosed herein can determine geometry information without using computed tomography or magnetic resonance imaging modalities and without using a spatial tracking system. This is enabled by use of ultrasound transducers integrated into the sensing system, as shown and described herein.
By way of example,
As an example, the multimodal sensing apparatuses 1504, 1506, 1508, 1510 and 1512 can enter a calibration function 1602, during which operating parameters of each of the ultrasound transducers are tuned to enable formation of an image volume within the patient's body where the sensing apparatus is placed. In some examples, such as shown in
In a further example, which can be implemented as part of the calibration function 1602 or a transducer localization function 1604, distances between respective transducer modules can be determined. For example, each of the multimodal sensing apparatuses 1504, 1506, 1508, 1510 and 1512 can be configured to sequentially (or otherwise separately) transmit an ultrasonic signal and receive reflected ultrasonic signals. A distance to reflecting structures, including other transducer modules, can be calculated from the reflected ultrasonic signals (e.g., the distance is proportional to the travel time and speed of the ultrasonic waves propagating in the body 1502). The distances can be determined between a given module and all other modules, and the process can be repeated for each of the respective modules relative to the other modules.
For example, the calibration function 1602 is configured to use an image segmentation function to identify the transducers in the ultrasound image volume, which can include an automated identification function and/or selection by a user input through a graphical user interface. For each identified transducer, the calibration function can determine the inter-transducer distance based on the time (e.g., the time difference between transmit and receive times) and the known speed of the ultrasonic signals through the body. However, in some sensing apparatus configurations, some of the modules (e.g., multimodal sensing apparatus 1506) may not be located in the path of the transmitted ultrasonic signal from a transducer of a given multimodal sensing apparatus (e.g., sensing apparatus 1504), and therefore may be unable to provide reflected signals to produce distance calculations between such modules (e.g., between transducers of apparatuses 1504 and 1506). The calibration function 1602 can be implemented for each of the multimodal sensing apparatuses 1504, 1506, 1508, 1510 and 1512, and the resulting computed distances for each module can be stored (e.g., in an array or other data structure) in memory.
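As a simple numerical sketch of this time-of-flight relationship (assuming a nominal speed of sound in soft tissue of about 1540 m/s, a typical literature value rather than one stated in the disclosure):

```python
def tof_distance_mm(round_trip_time_s, speed_m_s=1540.0):
    """Distance to a reflecting structure from the round-trip time of a
    transmitted ultrasonic pulse: the pulse travels out and back, so the
    one-way distance is half of speed * time."""
    return 0.5 * speed_m_s * round_trip_time_s * 1000.0

# Example: an echo received 65 microseconds after transmit corresponds
# to a reflector about 50 mm away.
print(tof_distance_mm(65e-6))  # ~50.05 mm
```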
The transducer localization function 1604 is configured to spatially localize each of the multimodal sensing apparatuses 1504, 1506, 1508, 1510 and 1512 in a three-dimensional coordinate system. In one example, the coordinate system can be determined relative to a selected one of the multimodal sensing apparatuses (e.g., apparatus 1504) based on the set of distance calculations. The localization function 1604 can use the computed distances between the multimodal sensing apparatuses 1504, 1506, 1508, 1510 and 1512 to construct three-dimensional locations of each of the modules relative to the selected module.
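One way to construct module locations from the stored pairwise distances is classical multidimensional scaling, which recovers coordinates up to a rigid transform; the sketch below (using NumPy) is illustrative and is not a specific method recited in the disclosure.

```python
import numpy as np

def localize_from_distances(D):
    """Classical multidimensional scaling: given a symmetric matrix D of
    pairwise inter-transducer distances, recover 3D coordinates for each
    module (up to rotation/translation), expressed relative to module 0."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    top = np.argsort(w)[::-1][:3]            # three largest eigenvalues
    X = V[:, top] * np.sqrt(np.maximum(w[top], 0.0))
    return X - X[0]                          # selected module as the origin
```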
In another example, the transducer localization function 1604 is configured to spatially localize each of the multimodal sensing apparatuses 1504, 1506, 1508, 1510 and 1512 in a three-dimensional coordinate system based on processing applied to ultrasound image volumes generated by each of the multimodal sensing apparatuses 1504, 1506, 1508, 1510 and 1512. For example, the localization function 1604 can be configured to stitch together the ultrasound image volumes produced by ultrasonic transducers of the multimodal sensing apparatuses 1504, 1506, 1508, 1510 and 1512 through an image registration process to provide a compounded three-dimensional image volume. The image registration can be automated and/or be guided in response to user input selection of common points in the respective image volumes generated by the transducer modules.
The localization function 1604 can be configured to perform a 3D reconstruction of points (e.g., pixels or voxels) in the compounded image volume. For example, the localization function can reconstruct points by triangulation from multiple projection matrices determined for the image volume of each respective transducer module. The localization function can implement a reconstruction method to reconstruct three-dimensional points (in a common 3D coordinate system) based on a rectification transform that models each ultrasound transducer module as a stereo camera. An example of a triangulation method that can be implemented by function 1604 for stereo reconstruction of a 3D image from 2D sources (respective ultrasound transducers) is disclosed in Y. Furukawa and C. Hernandez, "Multi-View Stereo: A Tutorial," Foundations and Trends® in Computer Graphics and Vision, vol. 9, no. 1-2, pp. 1-148, 2013, which is incorporated herein by reference.
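For reference, a standard linear (direct linear transformation) triangulation step consistent with the stereo-camera model described above can be sketched as follows; the projection matrices P1 and P2 are assumed to have been determined for the respective transducer modules.

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """DLT triangulation: recover one 3D point from its 2D projections
    x1 and x2 under the 3x4 projection matrices P1 and P2 of two
    transducer modules treated as a stereo pair."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)     # least-squares solution of A @ X = 0
    X = Vt[-1]
    return X[:3] / X[3]             # dehomogenize to 3D coordinates
```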
A geometry calculator 1606 is configured to determine geometry data 1608 representative of spatial geometry of the electrophysiological sensors (e.g., electrodes), ultrasound transducers and patient anatomy. For example, the geometry for each of the electrodes and ultrasound transducers can be represented as 3D spatial coordinates for a centroid of the respective electrodes and transducers. The geometry of the patient anatomy can be modeled as a set of 3D spatial coordinates defining the anatomical surface or as a model thereof (e.g., a mesh model), such as for the heart, lungs, brain or other anatomical structures of interest. In some examples, one or more of the anatomical models can be a 4D model that varies over time.
The geometry calculator 1606 can determine the geometry data based on the transducer locations (e.g., determined by the transducer localization function 1604) and data describing known geometry of the electrodes relative to the transducers, shown at 1610. For example, the electrode relative geometry data 1610 can represent a 2D or 3D position of each electrode relative to a nearest one or more transducer modules (in a coordinate system of the sensing apparatus). For electrodes that are integrated with respective transducer modules, the spatial coordinates of the associated (integrated) electrode can be derived from the transducer coordinates with a high level of accuracy. For electrodes or other sensors (e.g., 1002, 1204, 1314, 1414, or 1516) that are carried by the flexible substrate separately from being in an integrated transducer module, the relative location can similarly be determined for one or more transducer modules and stored in memory as the relative location data 1610. The relative location data 1610 thus can be used to co-register the position of such electrodes or other sensors from the coordinate system of the sensing apparatus into the common spatial coordinate system with the patient anatomy and transducer modules. Appropriate anatomical landmarks or other fiducials, including locations for some or all the electrodes, can also be identified in the ultrasound image volume to enable registration of the electrode locations in the coordinate system. The identification of such landmarks can be done manually (e.g., by a person via image editing software) or automatically (e.g., via image processing techniques). As a result, the geometry data 1608 can be provided to represent geometry for each of the sensors and relevant patient anatomy in a common spatial domain.
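The co-registration of a separately carried electrode into the common coordinate system reduces to applying the module's localized pose to the stored relative location; a minimal sketch (with an assumed rotation matrix R for the module's orientation) is:

```python
import numpy as np

def register_electrode(transducer_xyz, R, electrode_offset_xyz):
    """Map an electrode's known offset (relative location data 1610, in
    the sensing-apparatus frame) into the common 3D coordinate system,
    given the nearest transducer module's position and orientation R."""
    return np.asarray(transducer_xyz) + np.asarray(R) @ np.asarray(electrode_offset_xyz)

# For an electrode integrated with the module, the offset is small and
# fixed by construction, so its common-frame coordinates follow directly.
```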
As an example, a catheter or other probe having one or more interventional devices 1757 affixed thereto can be inserted into a patient's body 1754 so as to contact the patient's heart 1752, endocardially or epicardially. In other examples, the device 1757 can be a non-contact probe configured to deliver therapy to the patient's heart or other tissue. The placement of the device 1757 can be guided based on geometry data 1756, information provided from one or more electroanatomic maps and/or from a 3D or 4D ultrasound image volume, such as can be generated by a remote analysis/mapping system 1762, as described herein. The guidance can be automated, semi-automated or manually implemented based on physiological information acquired from the patient's body 1754. The functions of the remote system 1762 may be implemented as machine-readable instructions executable by one or more processors.
As a further example, an interventional system 1758 can be located external to the patient's body 1754 and be configured to control therapy or other intervention that is being provided through the device 1757. The interventional system 1758 can include controls (e.g., hardware and/or software) 1760 that can communicate (e.g., supply) electrical signals via a conductive link electrically connected between the intervention device (e.g., one or more electrodes) 1757 and the interventional system 1758. One or more sensors (not shown) can also communicate sensor information from the intervention device 1757 back to the interventional system 1758. The position of the device 1757 relative to the heart 1752 can be determined and tracked intraoperatively using real-time ultrasound imaging (e.g., by an arrangement of transducer modules that form part of a sensing system 1764). In some examples, a tracking modality can also be used to track and display the location of the intervention device 1757. The location of the device 1757 and the therapy parameters thus can be combined to determine and control corresponding application of therapy.
Those skilled in the art will understand and appreciate various types and configurations of intervention devices 1757 that can be utilized, which can vary depending on the type of intervention. For instance, the device 1757 can be configured to deliver electrical therapy (e.g., radiofrequency ablation or electrical stimulation), chemical therapy, sound wave therapy, thermal therapy or any combination thereof. Other types of devices can also be delivered via the interventional system 1758 and the invasive intervention device 1757 when positioned within the body 1754, such as to implant an object (e.g., a heart valve, stent, defibrillator, pacemaker or the like) and/or perform a repair procedure. The interventional system 1758 and intervention device 1757 may be omitted in some examples.
In the example of
One or more electrophysiological sensors (e.g., electrodes) may also be located on the device 1757 that is inserted into the patient's body 1754. Such sensors can be utilized separately or in conjunction with the sensors in the non-invasive sensing system 1764 for mapping electrical activity for an endocardial surface, such as the wall of a heart chamber, as well as for an epicardial surface.
In each of such example approaches for acquiring patient physiological information, including acquiring ultrasound images, non-invasively sensing electrophysiological signals or a combination of invasive and non-invasive sensing electrophysiological signals, the sensing system 1764 provides electrophysiological data and ultrasound image data to the remote system 1762. In an example, the remote system 1762 includes measurement control 1766, which can be coupled to the sensing system 1764 through a communication link, shown as dotted line 1768. As described herein, the communication link can be a wireless link, a physical link or a combination of wireless and physical links. The sensing system 1764 thus can provide electrophysiological data 1770 and ultrasound image data 1759 based on respective physiological measurements that are performed, which can be stored in memory and communicated to the remote system 1762 through the communication link 1768. The electrophysiological data 1770 can include analog and/or digital information (e.g., representative of electrophysiological signals measured via electrodes). The ultrasound image data 1759 can be separate image volumes produced by respective ultrasound transducers or be a compounded image volume, such as described herein.
The control 1766 can be configured to control the data acquisition process (e.g., sample rate, line filtering) for measuring electrophysiological signals to provide the data 1770. The control 1766 can also be configured to control the image acquisition process (e.g., ultrasound mode, frequency, gain or other parameters) for transmitting and receiving ultrasound signals to provide the data 1759. In some examples, the control 1766 can control acquisition of electrophysiological data 1770 separately from the ultrasound image data 1759, such as in response to a user input. In other examples, the electrophysiological data 1770 can be acquired concurrently with and in synchronization with the ultrasound image data 1759. For example, appropriate time stamps can be utilized for indexing the temporal relationship between the respective data 1759 and 1770 so as to facilitate the evaluation and analysis thereof by the remote system 1762.
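A minimal sketch of such time-stamp indexing between the two data streams (the arrays of time stamps are hypothetical inputs, not a format defined by the disclosure) might pair each ultrasound frame with the nearest electrophysiological sample:

```python
import numpy as np

def index_ep_to_frames(ep_times, frame_times):
    """For each ultrasound frame time stamp, return the index of the
    electrophysiological sample whose time stamp is nearest, so the two
    streams can be evaluated in temporal correspondence."""
    idx = np.searchsorted(ep_times, frame_times)
    idx = np.clip(idx, 1, len(ep_times) - 1)
    prev = idx - 1
    nearer_prev = (frame_times - ep_times[prev]) < (ep_times[idx] - frame_times)
    return np.where(nearer_prev, prev, idx)
```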
The remote analysis/mapping system 1762 can also be programmed to perform other signal processing techniques on the physiological information (e.g., ultrasound image data 1759 and electrophysiological data 1770) received from the sensing system 1764. In an example, the remote analysis/mapping system 1762 can be programmed to apply a blind source separation (BSS) method to the set of electrophysiological signals represented by the electrophysiological data 1770. The BSS method is particularly useful for extracting pertinent electrophysiological signals of interest measured by dry electrodes implemented in the sensing system 1764.
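The disclosure does not name a particular BSS algorithm; independent component analysis is one common choice, sketched below with scikit-learn's FastICA (the component indices to retain are hypothetical and would be chosen by inspection or an automated criterion).

```python
import numpy as np
from sklearn.decomposition import FastICA

def separate_sources(X, n_components=8, keep=(0, 1)):
    """X: (n_samples, n_electrodes) body-surface recordings. Unmix into
    independent sources, zero out the non-cardiac components, and
    reproject the retained sources back to the electrode domain."""
    ica = FastICA(n_components=n_components, random_state=0)
    S = ica.fit_transform(X)          # estimated independent source signals
    S_clean = np.zeros_like(S)
    S_clean[:, list(keep)] = S[:, list(keep)]
    return ica.inverse_transform(S_clean)
```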
The remote system 1762 can further be programmed to combine the electrophysiological data 1770 with geometry data 1756 by applying appropriate processing and computations to provide corresponding output data 1774. Additionally, in some examples, the remote system 1762 can also use the ultrasound image data 1759 to generate the output data 1774. The geometry data 1756 may correspond to ultrasound-based geometry data, such as can be determined according to the example approach of
The remote system 1762 can provide the output data 1774 to represent multimodal physiological information for one or more regions of interest or the entire heart in a temporally and spatially consistent manner. For example, the sensing system 1764 can measure electrophysiological signals and provide electrophysiological data 1770 for a predetermined region or the entire heart concurrently (e.g., where the sensing system 1764 covers the entire thorax of the patient's body 1754). The sensing system 1764 can also obtain spatially and temporally consistent ultrasound images for the same predetermined region or the entire heart. The electrical and mechanical information can be correlated over time to determine one or more physiological metrics, which can be provided as part of the output data 1774 (e.g., visualizing relationships between electrocardiographic maps derived from measured electrophysiological signals along with mechanical properties of the heart derived from ultrasound images). The time interval for which the output data/maps are computed can be selected based on user input (e.g., selecting a time interval from one or more waveforms). Additionally or alternatively, the selected intervals can be synchronized with the application of therapy by the interventional system 1758.
For example, the remote system 1762 includes an electrogram (EGM) reconstruction function 1772 programmed to compute an inverse solution and provide corresponding reconstructed electrograms based on the electrophysiological data 1770 and the geometry data 1756. The reconstructed electrograms thus can correspond to electrocardiographic activity across a cardiac envelope, and can be static (e.g., three-dimensional at a given instant in time) and/or dynamic (e.g., a four-dimensional map that varies over time). Examples of inverse algorithms that can be implemented by the electrogram reconstruction function 1772 include those disclosed in U.S. Pat. Nos. 7,983,743 and 6,772,004. The EGM reconstruction function 1772 thus can reconstruct the body surface electrophysiological signals measured via electrodes of the sensing system 1764 onto a multitude of locations on a cardiac envelope (e.g., greater than 1,000 locations, such as about 2,000 locations or more).
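The cited patents describe specific inverse algorithms; purely for illustration, a zero-order Tikhonov-regularized solution of the body-surface-to-cardiac-envelope problem (with A a transfer matrix computed from the geometry data 1756, and the regularization weight an assumed value) can be sketched as:

```python
import numpy as np

def reconstruct_egms(A, phi_body, lam=1e-3):
    """Solve phi_body ~= A @ phi_heart for the potentials phi_heart on
    the cardiac envelope, with Tikhonov regularization to stabilize the
    ill-posed inverse problem; lam is an assumed regularization weight."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ phi_body)
```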
As disclosed herein, the cardiac envelope can correspond to a 3D surface geometry corresponding to the heart, which surface can be an epicardial and/or endocardial surface model derived at least in part from ultrasound image data. For example, the locations may be nodes distributed across a mesh model (e.g., corresponding to the points defined by cardiac surface data 230, 1730) derived from ultrasound image data 1759, as disclosed herein. The locations of the nodes in the mesh model can be static (e.g., 3D points) or dynamic (e.g., 4D locations that vary over time), such as derived from a set of the ultrasound image data 1759.
As mentioned above, the geometry data 1756 can correspond to a mathematical model that has been constructed based on ultrasound image data for the patient. Thus, the geometry data 1756 that is utilized by the electrogram reconstruction function 1772 can correspond to actual patient anatomical geometry. In another example, the geometry data can include a preprogrammed generic model or a combination of patient anatomy and a generic model (e.g., a model/template that is modified based on patient anatomy). By way of further example, the ultrasound imaging and generation of the geometry data 1756 may be performed concurrently with recording the electrophysiological signals that are utilized to generate the electrophysiological data 1770. In another example, the ultrasound imaging can be performed separately (e.g., before or after the measurement data has been acquired) from the electrical measurements.
Following (or concurrently with) determining electrical potential data (e.g., electrogram data computed from non-invasively or from both non-invasively and invasively acquired measurements) across the geometric surface of the heart 1752, the electrogram data can undergo further processing by remote system 1762 to generate the output data 1774. The output data 1774 may include one or more graphical maps of electrophysiological signals or information derived from such signals.
An output generator 1784 can be programmed to generate the output data 1774 based on one or more of the electrophysiological data 1770, the ultrasound image data 1759, processed ultrasound data (derived by an ultrasound image processor 1780) and/or reconstructed electrophysiological signals (computed by the EGM reconstruction function 1772). The remote system 1762 can provide the output data 1774 to one or more displays 1792 to provide a visualization including one or more graphical outputs 1794 (e.g., waveforms, electroanatomic maps, related guidance, or the like). The remote system 1762 can also include a metric calculator 1776 having one or more computation methods programmed to characterize the physiological information for the patient based on one or more of the electrophysiological data 1770, the ultrasound image data 1759, the processed ultrasound data and/or the reconstructed electrophysiological signals. The metric calculator can also characterize physiological information derived from acoustic waves received by the transducer (e.g., an auscultation transducer) as data representative of physiological sounds of the heart and lungs (e.g., in the audible frequency range of about 10 Hz to about 20 kHz). Additionally, or alternatively, the acoustic waves received by the auscultation transducer can be amplified and supplied to an audio speaker for listening by one or more users.
The remote system 1762 can also include a user interface (e.g., a graphical user interface) 1796 configured to control functions applied by the remote system 1762 and the resulting output 1794 that is provided to the display 1792 in response to a user input. For example, parameters associated with the displayed graphical output (corresponding to an output visualization of a computed map or waveform), such as a selected time interval, temporal and spatial thresholds, and the type of information and/or viewing angle to be presented on the display 1792, can be selected in response to a user input via the user interface 1796. For example, a user can employ the GUI 1796 to selectively program one or more parameters (e.g., temporal and spatial thresholds, filter parameters, metric computations, and the like) used by the one or more functions 1772, 1776, 1780 to process the ultrasound image data 1759, electrophysiological data 1770 and geometry data 1756. The remote system 1762 thus can generate corresponding output data 1774 that can in turn be rendered as a corresponding graphical output on the display 1792, such as including one or more graphical output visualizations 1794. For example, the output generator 1784 can generate graphical maps and other output visualizations, which can be superimposed on an anatomical model or on a 3D or 4D ultrasound image (e.g., a real-time or prerecorded image) based on the ultrasound image data 1759.
In the example of
The feature extraction function 1804 can be programmed to extract one or more features from the compounded image volume. The feature extraction function 1804 can be applied automatically, such as in response to function calls by one or more functions of the metric calculator 1776. In other examples, the feature extraction function 1804 can operate in response to a user input instruction specifying one or more features through the user interface 1796. For example, the feature extraction function 1804 can identify one or more anatomical surfaces (e.g., epicardial, endocardial, pulmonary surfaces or the like) or other objects (e.g., sensors, transducer modules, and the like) visible within the compounded image volume. The pixels or voxels for the extracted surfaces can be tagged, and corresponding spatial coordinates of the surfaces can be stored in memory, such as by specifying points on the surface or constructing a model. In some examples, the ultrasound image processor 1780 is programmed to generate the geometry data 1756, such as described herein, based on the compounded image volume. As a result, the spatial coordinates for the extracted anatomical surfaces or other objects can be provided in the same spatial coordinate system as the geometry data 1756.
As mentioned, the metric calculator 1776 is programmed to compute one or more metrics (e.g., quantitative assessments) for a number of physiological conditions. In the example of
In one example, the cardiac function calculator 1810 is programmed to determine one or more anatomical mechanical properties based on analysis of the ultrasound image data 1759 and/or electrical properties based on reconstructed electrophysiological signals. For example, the cardiac function calculator 1810 can invoke the ultrasound image processor 1780 to segment the image and analyze dimensions of the heart and/or one or more of its chambers in one or more image frames acquired over time. Based on such analysis over a plurality of frames (including at least one full cardiac cycle), the cardiac function calculator 1810 can quantify one or more functional parameters, such as heart rate, stroke volume, cardiac output, and ejection fraction.
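For instance, given chamber volumes segmented over at least one full cardiac cycle, the standard definitions of these functional parameters reduce to the following arithmetic (the variable names are illustrative, not taken from the disclosure):

```python
def cardiac_metrics(volumes_ml, heart_rate_bpm):
    """Functional parameters from a series of segmented left-ventricular
    volumes (mL) spanning at least one full cardiac cycle."""
    edv = max(volumes_ml)                # end-diastolic volume (mL)
    esv = min(volumes_ml)                # end-systolic volume (mL)
    sv = edv - esv                       # stroke volume (mL)
    ef = 100.0 * sv / edv                # ejection fraction (%)
    co = sv * heart_rate_bpm / 1000.0    # cardiac output (L/min)
    return {"stroke_volume_ml": sv, "ejection_fraction_pct": ef,
            "cardiac_output_l_min": co}

# Example: volumes cycling between 120 mL and 50 mL at 70 bpm gives
# SV = 70 mL, EF ~ 58%, CO ~ 4.9 L/min.
```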
As a further example, the control 1766 can provide instructions to a selected one or more of the ultrasound transducer modules to operate the ultrasound in B-mode and acquire respective images of cardiac anatomy, including long- and short-axis views. The ultrasound image processor 1780 can analyze the acquired B-mode ultrasound images to determine spatial coordinates for epicardial and endocardial surfaces, including an identification of long and short axes of the heart. The cardiac function calculator 1810 (or other function) can determine a measure of wall thickness across the heart (e.g., a distance between coordinates along the epicardial and endocardial surfaces). The cardiac function calculator 1810 can also determine stroke volume, ejection fraction, cardiac output, and endocardial and epicardial area based on such measurements.
In another example, the control 1766 can provide instructions to a selected one or more of the ultrasound transducer modules to operate the ultrasound in M-mode and acquire respective images of cardiac anatomy. The ultrasound image processor 1780 can analyze the acquired M-mode ultrasound images using feature extraction (e.g., automated and/or manually responsive to a user input selection of features) to identify anatomical features of interest, such as one or more heart valves or other tissue. The cardiac function calculator 1810 can monitor and track motion of the identified features over time, such as to provide an assessment of valve plane motion and/or leaflet dynamics.
The pulmonary function calculator 1812 is programmed to determine one or more anatomical mechanical properties of the pulmonary system (e.g., lungs) based on analysis of the ultrasound image data 1759. The pulmonary function calculator 1812 can use functions of the ultrasound image processor 1780 in a similar manner as described above with respect to the cardiac function calculator 1810. For example, the pulmonary function calculator 1812 can invoke the ultrasound image processor 1780 and its functions to segment and extract pulmonary structural features (e.g., representative of anatomical surfaces and/or fluid within spaces between surfaces) from one or a series of ultrasound images. The pulmonary function calculator 1812 can compute one or more pulmonary properties based on the extracted features, such as indications of pneumothorax, pleural effusion, pneumonia/consolidation, and volume assessments, such as volume changes (e.g., free fluid within the lungs).
The tissue property calculator 1814 is programmed to determine one or more properties of tissue (e.g., tissue properties of the heart, lung and other tissue) based on analysis of the ultrasound image data 1759 and/or electrophysiological data 1770. Examples of mechanical tissue properties that the tissue property calculator 1814 can determine based on ultrasound image data 1759 include strain, deformation, stiffness, and elasticity, to name a few. Examples of electrical tissue properties that the tissue property calculator 1814 can determine based on ultrasound image data 1759 and/or electrophysiological data 1770 include impedance or conductivity.
For example, the ultrasound transducer modules and ultrasound image processor 1780 can be configured (e.g., by control 1766) to implement speckle tracking of cardiac tissue. The tissue property calculator 1814 can be programmed to compute strain and/or strain rates (e.g., global longitudinal strain, global circumferential strain, radial strain, etc.) of respective tissue regions. The strain-related information can be used (e.g., by the cardiac function calculator 1810) to provide a quantitative assessment of cardiac function for respective regions based on the determined strain properties. In some examples, the tissue property calculator 1814 can be programmed to provide a quantitative assessment of tissue stiffness, such as can be measured or inferred from ultrasound elastography measures (e.g., by computing a value representative of Young's modulus for tissue) based on tissue displacement (e.g., longitudinal deformation) determined responsive to ultrasonic signals or other acoustic energy transmitted by one or more of the transducer modules. The tissue property calculator 1814 can also be programmed to calculate an elastic modulus of tissue (e.g., stiffness of cardiac or lung tissue) using ultrasound shear wave velocity measurements and an estimated tissue density.
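As a minimal sketch of the elastography relations mentioned above: for nearly incompressible soft tissue, the shear modulus is mu = rho * c^2 and Young's modulus is approximately E = 3 * mu, where c is the measured shear-wave speed and rho the assumed tissue density; the default density value below is an illustrative assumption.

```python
# Minimal sketch: Young's modulus from shear-wave speed and assumed density,
# using mu = rho * c^2 and E ~ 3 * mu for nearly incompressible tissue.
def youngs_modulus_kpa(shear_wave_speed_m_s: float, density_kg_m3: float = 1060.0) -> float:
    mu = density_kg_m3 * shear_wave_speed_m_s ** 2  # shear modulus, Pa
    return 3.0 * mu / 1000.0                        # Young's modulus, kPa

# Example: a 2 m/s shear wave in soft tissue gives roughly 12.7 kPa.
print(youngs_modulus_kpa(2.0))
```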
The hemodynamic function calculator 1816 is programmed to determine one or more fluid dynamic properties (e.g., of blood or other fluids present within the patient's body) based on analysis of the ultrasound image data 1759. For example, the hemodynamic function calculator 1816 is programmed to compute a velocity gradient of blood based on speckle tracking methods (e.g., speckle decorrelation- and correlation-based lateral speckle-tracking methods performed by the ultrasound image processor 1780). As a further example, the ultrasound image processor 1780 can process ultrasound images acquired over time, analyze the pixels (or voxels) acquired through intermittent sampling, and determine properties representative of fluid flow velocity and direction. For example, the ultrasound image processor 1780 can be programmed to implement color-flow Doppler, in which flow (e.g., blood flow) having a positive or negative Doppler shift is mapped to a respective different color code depending on the direction of flow. The color-coded pixels can be rendered on a grey-scale or other (e.g., M-mode) image of the anatomy. The intensity or contrast of the respective colors can also be adjusted within a given color palette according to the velocity of the blood that is computed based on the change in pixel (or voxel) positions over time. The hemodynamic function calculator 1816 can compute properties that provide a measure of blood velocity based on the velocity values of pixels within hollow portions of the tissue (e.g., heart chambers, blood vessels, lungs and other spaces).
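For illustration, a minimal sketch of a color-flow velocity estimate follows, using the classic Kasai (lag-one autocorrelation) approach and assuming baseband IQ samples along slow time for one pixel; the parameter values are illustrative assumptions, not disclosed operating parameters.

```python
# Minimal sketch: axial velocity at one pixel via the Kasai autocorrelator.
import numpy as np

def kasai_velocity(iq: np.ndarray, prf_hz: float, f0_hz: float, c_m_s: float = 1540.0) -> float:
    r1 = np.mean(iq[1:] * np.conj(iq[:-1]))  # lag-one autocorrelation across slow time
    return c_m_s * prf_hz * np.angle(r1) / (4.0 * np.pi * f0_hz)  # velocity, m/s

# Synthetic example: a constant 500 Hz Doppler shift at PRF 4 kHz, f0 3 MHz.
n = np.arange(32)
iq = np.exp(2j * np.pi * 500.0 / 4000.0 * n)
print(kasai_velocity(iq, prf_hz=4000.0, f0_hz=3e6))  # ~0.128 m/s
```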
In some examples, the metric calculator 1776 can be configured to instruct the ultrasound transducer modules and ultrasound image processor 1780 to implement other forms of Doppler ultrasound or speckle tracking for computing one or more other metrics. For example, the ultrasound transducer modules and ultrasound image processor 1780 can implement power Doppler ultrasound, in which the amplitude of the Doppler signal is mapped to a continuous color range. Such power Doppler can be used to spatially identify small anatomical structures, such as blood vessels or calcified regions. In some examples, ultrasound contrast agents (e.g., injectable microspheres) can be injected into the patient's body (e.g., into the blood stream) to facilitate detection of blood flow dynamics in particular regions.
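As a minimal sketch of a per-pixel power-Doppler value, assuming a slow-time IQ ensemble for the pixel: the ensemble is high-pass (wall) filtered to suppress tissue clutter and the remaining signal power is mapped to the color range; the first-difference wall filter used here is an illustrative assumption.

```python
# Minimal sketch: Doppler power at one pixel after a crude wall filter.
import numpy as np

def doppler_power(iq: np.ndarray) -> float:
    filtered = np.diff(iq)                         # first difference as a simple wall filter
    return float(np.mean(np.abs(filtered) ** 2))   # power value mapped to a color range
```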
Any of the metrics computed by the metric calculator 1776 (e.g., by the calculators 1810, 1812, 1814 and 1816) and associated graphical outputs thereof can be synchronized with respect to the measured electrophysiological signals, such as provided in one or more maps generated by the EGM reconstruction function 1772. For example, a graphical representation of the cardiac function information can be superimposed on a graphical representation of the electrocardiographic maps that is provided in a given window of the display 1792. In another example, the graphical representation of the cardiac function can be superimposed on an ultrasound image in a respective window of the display 1792 and the graphical representation of the electrocardiographic map can be displayed concurrently in a separate window of the display 1792. The electrophysiological information derived from the electrophysiological data and the mechanical information derived from the ultrasound data thus can be combined in various ways to provide complementary data for assessing the patient's condition.
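As a minimal sketch of such synchronization, assuming both the ultrasound-derived metric and the electrophysiological signals carry timestamps from a common clock, the metric can be resampled onto the electrophysiological time base for concurrent display or mapping; names here are illustrative only.

```python
# Minimal sketch: resample a metric time series onto an EP-signal time base.
import numpy as np

def synchronize(metric_t: np.ndarray, metric: np.ndarray, ep_t: np.ndarray) -> np.ndarray:
    return np.interp(ep_t, metric_t, metric)  # metric value at each EP sample time
```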
As a further example, the remote system 1762 can generate the output data 1774 to provide guidance or controls based on the ultrasound image data 1759, electrophysiological data 1770 and associated maps, and/or one or more of the metrics computed by the metric calculator 1776. The guidance can be provided before, during (e.g., in real-time) or after an intervention. In some examples, the guidance and/or controls can be provided automatically based on applying rules (e.g., programmed responsive to a user input) to the ultrasound image data 1759, electrophysiological data 1770 and associated maps, and/or one or more of the metrics computed by the metric calculator 1776. The guidance can be presented in an output graphical visualization, and the controls can be in the form of control signals applied to the delivery of one or more therapies or other interventions. In yet another example, the guidance can be provided to a robotically controlled surgical system, based on which the robotically controlled (or computer-assisted) surgical system can control one or more parameters for moving one or more instruments or other control functions as part of performing the intervention.
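For illustration, a minimal sketch of such rule-based guidance follows, assuming user-programmed rules that map computed metrics to guidance messages; the thresholds, metric names, and messages are illustrative assumptions only.

```python
# Minimal sketch: apply programmed rules to computed metrics to produce guidance.
def apply_guidance_rules(metrics: dict) -> list[str]:
    rules = [
        (lambda m: m.get("EF_pct", 100.0) < 40.0, "Reduced ejection fraction"),
        (lambda m: m.get("effusion_ml", 0.0) > 300.0, "Significant pleural effusion"),
    ]
    return [message for predicate, message in rules if predicate(metrics)]

print(apply_guidance_rules({"EF_pct": 35.0, "effusion_ml": 120.0}))
```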
Referring back to
As described herein, the systems and methods disclosed herein are configured to obtain and analyze multimodal physiological information, such as electrophysiological data and ultrasound image data obtained concurrently from a given patient. As a result, the output data can provide a broader, more comprehensive and complementary assessment of electrophysiological conditions and biomechanical conditions (e.g., cardiac function, pulmonary function, hemodynamics, etc.) compared to existing systems.
It should be understood that various aspects disclosed herein may be combined in different combinations than the combinations specifically presented in the description and accompanying drawings. It should also be understood that, depending on the example, certain acts or events of any of the processes or methods described herein may be performed in a different sequence, may be added, merged, or left out altogether (e.g., all described acts or events may not be necessary to carry out the techniques). In addition, while certain aspects of this disclosure are described as being performed by a single module or unit for purposes of clarity, it should be understood that the techniques of this disclosure may be performed by a combination of units or modules associated with, for example, a medical device.
In one or more examples, the described techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor” as used herein may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
This application claims the benefit of priority to U.S. Provisional Patent Application No. 63/347864, filed Jun. 1, 2022, which is incorporated herein by reference in its entirety.