Tumor ablation therapy is an approach for removing tumor tissue using minimally invasive surgical procedures. In such procedures, an interventional ablation tool is typically directed to a target location within a body of a patient that is either close to, or within, the tumor tissue. Energy is then delivered to the tumor tissue in a sufficiently rapid manner to destroy the tumor tissue. The interventional ablation tool may be a radio frequency ablation tool, a laser ablation tool, a microwave ablation tool, a cryoablation tool, and/or the like. However, such interventional ablation tools can be inaccurate and unsafe because the tools do not monitor temperature and/or a thermal dose.
According to some implementations, a system may include an ultrasound transmitter to transmit ultrasound signals through a region of tissue during an ablation procedure; an ultrasound receiver to receive the ultrasound signals transmitted by the ultrasound transmitter after the ultrasound signals pass through the region of tissue; and a signal processor communicatively coupled to the ultrasound transmitter and the ultrasound receiver. The signal processor may communicate with the ultrasound transmitter and the ultrasound receiver to obtain a set of measurements related to the ultrasound signals transmitted through the region of tissue during the ablation procedure, determine one or more acoustic characteristics of the ultrasound signals transmitted through the region of tissue based on the set of measurements, and generate an image representing a thermal map of the region of tissue during the ablation procedure based on a mapping between the one or more acoustic characteristics of the ultrasound signals and changes in temperature.
According to some implementations, a method may include obtaining patient-specific simulation data including expected temperature-dependent measurements for ultrasound signals to be transmitted through a region of tissue during an ablation procedure; determining a relative geometry between an ultrasound transmitter arranged to transmit the ultrasound signals through the region of tissue during the ablation procedure and an ultrasound receiver arranged to receive the ultrasound signals transmitted by the ultrasound transmitter after the ultrasound signals pass through the region of tissue; calculating actual temperature-dependent measurements for the ultrasound signals transmitted through the region of tissue during the ablation procedure based on the relative geometry between the ultrasound transmitter and the ultrasound receiver; and performing an action to guide the ablation procedure based on a comparison of the actual temperature-dependent measurements for the ultrasound signals and the expected temperature-dependent measurements for the ultrasound signals.
According to some implementations, a non-transitory computer-readable medium may store one or more instructions. The one or more instructions, when executed by one or more processors, may cause the one or more processors to determine relative locations associated with one or more ultrasound transmitters arranged to transmit ultrasound signals through a region of tissue during an ablation procedure and one or more ultrasound receivers arranged to receive the ultrasound signals transmitted by the one or more ultrasound transmitters after the ultrasound signals pass through the region of tissue. The one or more instructions, when executed by the one or more processors, may cause the one or more processors to calculate a set of temperature-dependent measurements for the ultrasound signals transmitted through the region of tissue during the ablation procedure and determine, based on the set of temperature-dependent measurements and the relative locations associated with the one or more ultrasound transmitters and the one or more ultrasound receivers, one or more acoustic characteristics of the ultrasound signals transmitted through the region of tissue, wherein the one or more acoustic characteristics include one or more of a speed, an intensity, an attenuation, a phase, or a nonlinearity for the ultrasound signals. The one or more instructions, when executed by the one or more processors, may cause the one or more processors to generate an image representing a thermal map of the region of tissue during the ablation procedure based on temperature-dependent variations in the one or more acoustic characteristics of the ultrasound signals.
The following detailed description of example implementations refers to the accompanying drawings. The same reference numbers in different drawings may identify the same or similar elements.
During a heat-induced tumor ablation treatment, a cryoablation treatment in which extreme cold is used to destroy targeted tissue, and/or the like, guiding and monitoring the ablation process is crucial, especially when the operation requires high accuracy. However, due to a low contrast between ablated and untreated tissue in ultrasound images (e.g., B-mode images), conventional ultrasound imaging is usually not effective for such monitoring. Other imaging modalities, including computed tomography (CT) and magnetic resonance imaging (MRI), can be incorporated into the ablation therapy and provide effective image guidance and monitoring. However, these high-end imaging devices have requirements that may make this approach unaffordable or inaccessible for many patients. The radiation dose and magnetic field compatibility requirements also prevent these methods from being widely used.
Some implementations described herein provide real-time ultrasound temperature monitoring systems for ablation therapy using a high-intensity focused ultrasound (HIFU) system. For example, some implementations described herein may utilize one or more of several different thermal dose monitoring systems and methods, which may be based on ultrasound imaging modalities, to assist an operator with control of an ablation treatment process. One or more of these methods can be implemented in an interventional ablation system to provide a thermal dose at a low cost and with zero radiation. Furthermore, these methods enable real-time, high-accuracy guidance and monitoring during ablation therapy, thus reducing the risk and difficulty of the ablation treatment. Furthermore, these methods can be used to monitor and/or guide any suitable ablation treatment modality, including radiofrequency ablation, laser ablation, microwave ablation, cryoablation, and/or the like.
In this way, some implementations described herein provide a new configuration of an ultrasound imaging system that includes an ablation applicator with ultrasound transceiver sources (e.g., transducer elements formed from a piezoelectric material such as lead zirconate titanate (PZT), a polyvinylidene difluoride (PVDF) material, and/or the like) that can generate imaging pulses using a photoacoustic effect, and an external transducer array arranged to receive a signal. Some implementations described herein may provide acoustic radiation force imaging (ARFI) in an intra-ablation region, where acoustic radiation force (ARF) pulses may be generated by a transducer attached to the ablation applicator, and an image representing a thermal map of the intra-ablation region may be generated in real-time using a synchronized ultrasound imaging array. Some implementations described herein may also provide imaging reconstruction models that can be used to generate the image representing the thermal map of the intra-ablation region.
The terms “light” and “optical” are intended to have a broad meaning. These terms can include visible regions of the electromagnetic spectrum. Additionally, or alternatively, these terms can include nonvisible regions of the electromagnetic spectrum such as infrared light, ultraviolet light, and/or the like.
The term “photoacoustic” is intended to have a broad meaning, which can include photons at any energy suitable for the particular application in which energy that generates an acoustic signal is deposited in a body of interest.
The term “body” is intended to refer generally to a mass, and not necessarily or specifically to a human or animal body. In some implementations, a body of interest can include a human or animal organ, or a portion thereof.
The term “interstitial” means to be inserted into tissue, such as a needle inserted into tissue with the inserted tip being surrounded by the tissue.
In some implementations, as shown in
In some implementations, the ultrasound transceivers 101 and/or HIFU elements 103 may include one or more piezoelectric transducers, one or more photoacoustic transmitters and/or receivers, and/or the like. Furthermore, in some implementations, the ultrasound transceivers 101 and/or HIFU elements 103 may include one or more ultrasound transmitters and/or receivers described in U.S. Patent Application Publication No. 2014/0024928, the content of which is incorporated herein by reference in its entirety.
In some implementations, ultrasound data used to monitor the thermal dose can be collected sequentially for a non-invasive ablation procedure. For example, the HIFU system 201 may transmit ultrasound pulses and a trigger signal, and the trigger signal may initiate collection of ultrasound data received by one or more external elements. By repeating this procedure in the example interventional system 200, ultrasound pulses transmitted from each element in the HIFU system 201 may be collected.
In some implementations, the ultrasound data used to monitor the thermal dose can be collected sequentially for a minimally-invasive ablation procedure. For example, each element in an ultrasound transducer may transmit ultrasound pulses and a trigger signal, and the trigger signal may initiate collection of ultrasound data received by one or more external elements. By repeating this procedure in the example interventional system 200, ultrasound pulses transmitted from each ultrasound element in the ultrasound transducer may be collected.
As indicated above,
Bus 710 includes a component that permits communication among multiple components of device 700. Processor 720 is implemented in hardware, firmware, and/or a combination of hardware and software. Processor 720 is a central processing unit (CPU), a graphics processing unit (GPU), an accelerated processing unit (APU), a microprocessor, a microcontroller, a digital signal processor (DSP), a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), or another type of processing component. In some implementations, processor 720 includes one or more processors capable of being programmed to perform a function. Memory 730 includes a random-access memory (RAM), a read only memory (ROM), and/or another type of dynamic or static storage device (e.g., a flash memory, a magnetic memory, and/or an optical memory) that stores information and/or instructions for use by processor 720.
Storage component 740 stores information and/or software related to the operation and use of device 700. For example, storage component 740 may include a hard disk (e.g., a magnetic disk, an optical disk, and/or a magneto-optic disk), a solid-state drive (SSD), a compact disc (CD), a digital versatile disc (DVD), a floppy disk, a cartridge, a magnetic tape, and/or another type of non-transitory computer-readable medium, along with a corresponding drive.
Input component 750 includes a component that permits device 700 to receive information, such as via user input (e.g., a touch screen display, a keyboard, a keypad, a mouse, a button, a switch, and/or a microphone). Additionally, or alternatively, input component 750 may include a component for determining location (e.g., a global positioning system (GPS) component) and/or a sensor (e.g., an accelerometer, a gyroscope, an actuator, another type of positional or environmental sensor, and/or the like). Output component 760 includes a component that provides output information from device 700 (via, e.g., a display, a speaker, a haptic feedback component, an audio or visual indicator, and/or the like).
Communication interface 770 includes a transceiver-like component (e.g., a transceiver, a separate receiver, a separate transmitter, and/or the like) that enables device 700 to communicate with other devices, such as via a wired connection, a wireless connection, or a combination of wired and wireless connections. Communication interface 770 may permit device 700 to receive information from another device and/or provide information to another device. For example, communication interface 770 may include an Ethernet interface, an optical interface, a coaxial interface, an infrared interface, a radio frequency (RF) interface, a universal serial bus (USB) interface, a wireless local area network interface, a cellular network interface, and/or the like.
Device 700 may perform one or more processes described herein. Device 700 may perform these processes based on processor 720 executing software instructions stored by a non-transitory computer-readable medium, such as memory 730 and/or storage component 740. As used herein, the term “computer-readable medium” refers to a non-transitory memory device. A memory device includes memory space within a single physical storage device or memory space spread across multiple physical storage devices.
Software instructions may be read into memory 730 and/or storage component 740 from another computer-readable medium or from another device via communication interface 770. When executed, software instructions stored in memory 730 and/or storage component 740 may cause processor 720 to perform one or more processes described herein. Additionally, or alternatively, hardware circuitry may be used in place of or in combination with software instructions to perform one or more processes described herein. Thus, implementations described herein are not limited to any specific combination of hardware circuitry and software.
The number and arrangement of components shown in
In some implementations,
In the first example method 800, with reference to the example interventional system 100 shown in
The method 800 for mapping changes in acoustic properties to changes in temperature may be based on various ultrasound characteristics that are temperature-dependent, which may include changes in the speed of sound (SoS) at different temperatures. Furthermore, in some implementations, the temperature-dependent ultrasound characteristics may include attenuation, phase, nonlinearity, and/or other characteristics that relate to an intensity of an ultrasound signal (e.g., a concentration of energy in a beam) at different temperatures. Accordingly, to map the changes in acoustic properties to the changes in temperature, the ultrasound transceivers 101 may communicate with the HIFU elements 103 to acquire time of flight measurements and other acoustic properties in order to calculate thermal maps.
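As one hedged illustration of this mapping, the following is a minimal sketch assuming an approximately linear, tissue-specific relationship between SoS and temperature below coagulation temperatures along a single transmitter-receiver path. The function name, parameter names, and the example slope value are illustrative assumptions, not values taken from the implementations described herein.

```python
# A minimal sketch (not the claimed method itself) of mapping time-of-flight
# changes to temperature changes along one transmitter-receiver path, assuming
# an approximately linear, tissue-specific SoS-temperature slope.
import numpy as np

def delta_temperature_from_tof(tof_baseline_s, tof_current_s, path_length_m,
                               dsos_dtemp_m_per_s_per_degc=1.8):
    """Estimate the average temperature change along one ray.

    dsos_dtemp_m_per_s_per_degc is an assumed, tissue-specific slope
    (on the order of 1-2 m/s per degree C for soft tissue below ~50 degrees C).
    """
    sos_baseline = path_length_m / tof_baseline_s   # m/s before heating
    sos_current = path_length_m / tof_current_s     # m/s during ablation
    return (sos_current - sos_baseline) / dsos_dtemp_m_per_s_per_degc

# Example: a 0.10 m path whose ToF shortens from 66.7 us to 66.4 us
# corresponds to roughly a 3-4 degree C average rise under these assumptions.
print(delta_temperature_from_tof(66.7e-6, 66.4e-6, 0.10))
```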
For image reconstruction (e.g., generating an image that represents the thermal map), a distance between the ultrasound transceivers 101 and the HIFU elements 103 may be determined in various ways. For example, the distance may be determined based on imaging, using an optical tracker to locate the ultrasound transceivers 101 and/or HIFU elements 103, using encoders to track change in the position of the ultrasound transceivers 101 and/or HIFU elements 103, using ultrasound triangulation localization, using a fixed system (e.g., a robotic arm as shown in
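As a hedged sketch of the ultrasound triangulation localization option mentioned above, the position of an applicator element may be recovered from ToF measurements to external receivers at known locations by a nonlinear least-squares fit. The function names and the nominal speed of sound are illustrative assumptions.

```python
# A minimal sketch of ultrasound triangulation localization: the element
# position is estimated from ToF to receivers at known positions.
# The nominal speed of sound and all names are illustrative assumptions.
import numpy as np
from scipy.optimize import least_squares

def locate_element(receiver_xyz_m, tof_s, sos_m_per_s=1540.0):
    """receiver_xyz_m: (N, 3) known receiver positions; tof_s: (N,) ToFs."""
    measured_ranges = sos_m_per_s * np.asarray(tof_s)

    def residuals(p):
        return np.linalg.norm(receiver_xyz_m - p, axis=1) - measured_ranges

    x0 = receiver_xyz_m.mean(axis=0)          # start at the receiver centroid
    return least_squares(residuals, x0).x     # estimated (x, y, z) in meters

# Example with synthetic data.
receivers = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0],
                      [0.0, 0.1, 0.0], [0.1, 0.1, 0.05]])
true_pos = np.array([0.04, 0.05, 0.08])
tofs = np.linalg.norm(receivers - true_pos, axis=1) / 1540.0
print(locate_element(receivers, tofs))        # approx. [0.04, 0.05, 0.08]
```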
In some implementations, one or more tomographic methods may be utilized to map a temperature in a volume and to derive unknown parameters. For example, in some implementations, the tomographic methods may perform imaging by sections or sectioning to produce a three-dimensional image of internal structures of a solid object through the use of penetrating waves. One example tomographic method may include utilizing multiple ultrasound transceivers 101 that receive ultrasound pulses from different known locations. As a result, a quantity of independent equations may be greater than a quantity of unknowns, and image reconstruction becomes a solvable problem.
Additionally, or alternatively, a quantity of focused ultrasound (FUS) elements generating pulses to an external receiver may be optimized to limit pulse generation to effective FUS elements that transmit ultrasound signals that travel through an ablation zone and carry valuable ToF information. In some implementations, several MRI-compatible receivers may be utilized, and real-time MRI thermometry may be utilized as a reference to assess and optimize system configurations and thermal image generation methods.
In some implementations, by sweeping a patient body surface with ultrasound transceivers (e.g., using a robot arm associated with a control system, as shown in
In some implementations, a region of interest may be reduced to an area where a temperature is actually changing. As described in further detail elsewhere herein, such implementations may use a thermal propagation model to segment a region, and voxels that include similar temperatures may be grouped together and may decrease a total number of unknowns.
With reference to
In some implementations, various reconstruction methods can be used to generate speed-of-sound (SoS) volumes that can in turn be used to generate the thermal maps. For example, an SoS volume may be generated using a batch optimization technique in which the SoS volume is generated using all available ToF information, an iterative optimization technique in which the SoS volume is generated for various segmented regions, and/or the like. For example,
In some implementations,
The relationship between temperature and speed-of-sound (SoS) enables ultrasound thermometry through tomographic reconstruction of SoS maps from direct ToF measurements. However, the tomographic problem is rank-deficient, which may lead to multiple solutions. More specifically, recorded ToF data may be sparse, as a maximum number of equations may equal a number of FUS elements multiplied by a number of receivers employed. In some implementations, the SoS may be reconstructed at a voxel level in order to address the tomographic problem. Furthermore, a relationship between a change in temperature and SoS may be linear until a certain point, and this relationship may be tissue-specific. In some implementations, combining advanced quantitative tomographic imaging and patient-specific simulation incorporating prior knowledge of biological and physical phenomena in thermal ablation may address the relationship problem.
For example, in some implementations, limited angle tomography may be utilized. Often used in x-ray mammography, and referred to as tomosynthesis, limited angle tomography has highly transformed routine screening, enabling quantitative 3D imaging with significantly enhanced value. In transmission ultrasound imaging (e.g., in contrast to reflection imaging in B-mode), the transmitter and receiver transducers may be located at different known positions with respect to a volume of interest. This ToF approach is affected less by directivity of a receiver element than in a pulse-echo scheme. A received signal can be used to reconstruct acoustic properties of a volume, such as SoS, attenuation, and/or the like.
In some implementations, optimal probe placement and image reconstruction are two important considerations for achieving effective in vivo ultrasound tomosynthesis. In such implementations, extensive phantom studies involving anthropomorphic digital phantoms and experimental phantoms may be performed. Phantom studies may narrow down data acquisition parameters and improve reconstruction algorithms, and may then be followed up by real subject studies.
In some implementations, to address the SoS reconstruction problem, a time-of-flight technique (e.g., a ray-based technique) may be utilized. For example, given a grid of pixels, a system matrix S may be used to describe how much a given ray travels through each pixel of the grid. Each row of the system matrix may correspond to one ray and may contain path lengths corresponding to each grid cell. The system matrix S may be constructed based on a Siddon method, and may be further refined using more advanced interpolations (e.g., splines). The SoS can be reconstructed from ToF measurements for all rays. The ToF measurements may be represented by a vector of length N_t (e.g., a number of possible rays to the receivers), and an image, X, may be represented by a vector of length N_g (e.g., a number of voxels). Instead of directly calculating the speed, an inverse of the speed (referred to as slowness) may be calculated. A time required for an ultrasound signal to travel along a ray may be equal to a sum of the times required to pass through all cells, and a time t required to travel through one cell may be t=s*x, where s is the path length along the cell, and x is the cell's slowness. This leads to solving the following equation for X: SX=ToF, where S is the system matrix and X is an image vector. In some implementations, analytical methods (e.g., pseudo-inverse), iterative methods (e.g., conjugate gradient), statistical methods (e.g., expectation maximization), and/or the like may be utilized. Iterative and statistical image generation may be considerably more quantitative, but more time-consuming.
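The following is a minimal sketch of a ray-based reconstruction of this kind. It uses a simplified straight-ray sampling in place of the Siddon method and a sparse least-squares solve of SX=ToF; the grid size, geometry, and all names are illustrative assumptions rather than a definitive implementation.

```python
# A minimal sketch of ray-based (time-of-flight) slowness reconstruction:
# build a per-cell path-length system matrix S, then solve S X = ToF.
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import lsqr

def build_system_matrix(tx_points, rx_points, grid_n, cell_m, n_samples=400):
    """Approximate per-cell path lengths by sampling points along each ray
    (a simplified stand-in for the Siddon method)."""
    n_rays = len(tx_points) * len(rx_points)
    S = lil_matrix((n_rays, grid_n * grid_n))
    row = 0
    for tx in tx_points:
        for rx in rx_points:
            seg = np.linalg.norm(rx - tx) / n_samples      # length per sample
            for t in (np.arange(n_samples) + 0.5) / n_samples:
                p = tx + t * (rx - tx)
                ix, iy = (p // cell_m).astype(int)
                if 0 <= ix < grid_n and 0 <= iy < grid_n:
                    S[row, iy * grid_n + ix] += seg
            row += 1
    return S.tocsr()

# Toy example: transmitters on the left edge, receivers on the right edge.
grid_n, cell_m = 16, 0.005
tx = np.array([[0.0, y] for y in np.linspace(0.005, 0.075, 8)])
rx = np.array([[grid_n * cell_m, y] for y in np.linspace(0.005, 0.075, 8)])
S = build_system_matrix(tx, rx, grid_n, cell_m)

slowness_true = np.full(grid_n * grid_n, 1.0 / 1540.0)     # s/m (background)
slowness_true[7 * grid_n + 7] = 1.0 / 1580.0               # one heated cell
tof = S @ slowness_true                                    # simulated ToF data
slowness_est = lsqr(S, tof)[0]                             # solve S X = ToF
sos_est = 1.0 / np.where(slowness_est > 0, slowness_est, np.nan)
```

Because the toy geometry spans only a limited angle, the recovered map is smeared, which mirrors the rank deficiency discussed above; regularization or the prior-knowledge approaches described below would sharpen it.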
In some implementations, to address the SoS reconstruction problem, a waveform tomography method may be utilized. The waveform tomography method can improve results (e.g., as demonstrated in breast imaging and in past efforts of geophysics researchers on seismic waveform inversion). To further address the SoS reconstruction problem, Bayesian maximum a posteriori (MAP) frameworks can also be applied, including sparse tomographic reconstruction utilizing compressed sensing with L1-norm minimization.
In some implementations, an attenuation coefficient, which may also convey information about temperature (e.g., especially when linked with SoS), may be reconstructed. The above reconstruction methods may be used to reconstruct the attenuation coefficient. In such implementations, an original transmitter signal may be known, and a detected signal intensity may define a measurement, which may be tomographically reconstructed to generate attenuation maps.
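A hedged sketch of this attenuation reconstruction is shown below, assuming a log-domain (Beer-Lambert-style) model in which the measured intensity loss along each ray is linear in the per-cell attenuation coefficients, so the same path-length system matrix and solver as in the ToF case can be reused. The formulation and names are illustrative assumptions.

```python
# A minimal sketch (assumed log-domain formulation): intensity loss along each
# ray is linear in per-cell attenuation, so the ray-based solver applies again.
import numpy as np
from scipy.sparse.linalg import lsqr

def reconstruct_attenuation(S, tx_intensity, rx_intensity):
    """S: (n_rays, n_cells) path-length matrix; intensities given per ray."""
    log_loss = np.log(np.asarray(tx_intensity) / np.asarray(rx_intensity))
    return lsqr(S, log_loss)[0]      # per-cell attenuation (nepers per meter)

# Toy usage with a tiny 2-ray, 2-cell system.
S_toy = np.array([[0.01, 0.0], [0.005, 0.005]])      # path lengths in meters
mu_true = np.array([50.0, 80.0])                     # attenuation per cell
rx = np.exp(-S_toy @ mu_true)                        # received intensity (tx = 1)
print(reconstruct_attenuation(S_toy, np.ones(2), rx))    # approx. [50, 80]
```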
In some implementations, prior knowledge of biological and physical phenomena involved in thermal ablation may be utilized. Via the usage of computational models, heat diffusion and cellular necrosis may be simulated. The prior knowledge may include developing computational models of radiofrequency ablation (RFA), which may be evaluated against pre-clinical and clinical data of subjects with tumors.
In some implementations,
In some implementations, these patient-specific simulations may be coupled with tomographic image reconstruction. Expected thermal maps can be used to reduce a number of SoS unknowns in a region of interest (ROI) around an ablation focal point by grouping together, in a same layer, voxels which are expected to have a same temperature according to the patient-specific simulation.
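One possible form of this grouping step is sketched below, assuming voxels are binned by their simulated temperature and the columns of the ray-based system matrix are summed within each bin so that each group shares a single slowness unknown. The bin width and names are illustrative assumptions.

```python
# A minimal sketch of reducing SoS unknowns by grouping voxels that the
# patient-specific simulation expects to share a temperature.
import numpy as np

def group_system_matrix(S, simulated_temp_per_voxel, bin_width_degc=1.0):
    """Return (S_reduced, labels): columns of S merged within temperature bins."""
    temps = np.asarray(simulated_temp_per_voxel, dtype=float)
    edges = np.arange(temps.min(), temps.max() + bin_width_degc, bin_width_degc)
    labels = np.digitize(temps, edges)                 # one group label per voxel
    S_dense = S.toarray() if hasattr(S, "toarray") else np.asarray(S)
    groups = np.unique(labels)
    S_reduced = np.column_stack(
        [S_dense[:, labels == g].sum(axis=1) for g in groups])
    return S_reduced, labels

# After solving S_reduced @ x_group = ToF, per-voxel slowness can be recovered
# as x_group[np.searchsorted(np.unique(labels), labels)].
```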
In some implementations, generating thermal maps from the SoS maps, the attenuation maps, and/or the like could lead to a range of solutions. However, using a patient-specific simulation, solutions may be distinguished because a temporal and spatial temperature evolution should be smooth (e.g., there should not be a significant jump in temperature from one time point to a next time point, from a particular spatial point to an adjacent spatial point, and/or the like). Furthermore, the patient-specific simulation may be used to distinguish solutions because temperature should decrease farther away from the ablation focal point. In some implementations, the SoS reconstruction problem may be addressed by initialization based on patient-specific thermal maps.
In some implementations, limited angle reconstruction of speed-of-sound may be developed based on direct ToF measurements. The resulting temperature estimates may be validated to within ±3° C. against MRI thermometry up to 55° C.
In some implementations, machine learning may be utilized to generate synthetic images from ultrasound data. A machine learning model may be utilized to synthesize thermal images using ultrasound acquisition. This may provide a simple and low-cost system with thermal images at a high frame rate and without the requirement of an MRI scanner.
In some implementations, thermal images may be reconstructed based on a machine learning model. Due to ultrasound physics, ultrasound signals may change as a target is heated (e.g., during a HIFU ablation treatment), cooled (e.g., during a cryoablation treatment), and/or the like. With MR thermal images and corresponding ultrasound signals, a machine learning model can be trained. The training information may include ultrasound channel data, B-mode images, time of flight data, ultrasound elements locations, and/or other ultrasound information. By detecting a change of these ultrasound signal properties during ablation, a thermal map can be recovered through the machine learning model.
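A minimal sketch of such a learning setup is shown below, assuming paired ultrasound-derived feature maps and MR thermometry frames are available for training. The network architecture, tensor shapes, and names are illustrative stand-ins, not the trained model described herein.

```python
# A minimal sketch of regressing a thermal map from ultrasound-derived input,
# trained against MR thermometry targets. Architecture and shapes are assumed.
import torch
import torch.nn as nn

model = nn.Sequential(                      # input: (batch, 1, H, W) ultrasound feature map
    nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 1, 3, padding=1),         # output: (batch, 1, H, W) temperature map
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(ultrasound_batch, mr_thermal_batch):
    optimizer.zero_grad()
    loss = loss_fn(model(ultrasound_batch), mr_thermal_batch)
    loss.backward()
    optimizer.step()
    return loss.item()

# Example with random stand-in data (real use: ultrasound channel/B-mode/ToF
# features paired with MR thermometry frames).
print(train_step(torch.randn(4, 1, 64, 64), torch.randn(4, 1, 64, 64)))
```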
In some implementations, intra-operative ultrasound measurements may be utilized to personalize biophysical parameters involved in the simulation. A patient-specific model may enable simulating a delay that is expected to affect different ToF data due to the ablation evolution. Therefore, biophysical model parameters can be optimized by minimizing an error between the measured ToF and the simulated ToF, and between the reconstructed thermal maps and the simulated thermal maps. Modeling can be performed in real-time or faster than real-time with a Lattice Boltzmann method. In some implementations, an iterative loop between SoS image reconstruction and patient-specific simulation may be utilized in order to generate more accurate 3D thermal images. Limited SoS maps at the regional level from direct ToF measurements may be utilized.
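The parameter-personalization loop described above might be sketched as follows, where simulate_tof is a placeholder for a patient-specific forward model (e.g., a heat-propagation simulation such as a Lattice Boltzmann solver followed by ray tracing); the optimizer choice and all names are assumptions for illustration.

```python
# A minimal sketch of fitting biophysical model parameters by minimizing the
# mismatch between measured and simulated ToF. simulate_tof is a placeholder
# for a patient-specific forward model supplied by the caller.
import numpy as np
from scipy.optimize import minimize

def fit_biophysical_params(measured_tof, simulate_tof, initial_params):
    """simulate_tof(params) -> predicted ToF vector for the current geometry."""
    def objective(params):
        return np.sum((simulate_tof(params) - measured_tof) ** 2)
    return minimize(objective, initial_params, method="Nelder-Mead").x

# Toy usage: fit two parameters of a stand-in linear forward model.
measured = np.array([1.0, 2.0, 3.0])
toy_sim = lambda p: p[0] * np.array([1.0, 2.0, 3.0]) + p[1]
print(fit_biophysical_params(measured, toy_sim, np.array([0.5, 0.5])))
```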
In some implementations,
In the third example method 1000, a virtual pattern may be injected in an ultrasound B-mode image. In some implementations, the pattern may be generated using an ultrasound machine or a HIFU system.
In some implementations, a certain ultrasound pattern can be injected in the ultrasound B-mode image before ablation. Due to an increase in temperature, acoustic properties such as the speed of sound, attenuation, and/or the like may vary, whereby the injected pattern may deform from an initial state. Thermal maps and thus thermal dosage can be reconstructed by tracking these changes in the injected patterns and thus tracking the acoustic property changes. Learning tools based on deep learning, machine learning, simulation of synthetic images, and/or the like may be utilized to determine an effect of temperature changes on the virtual injection pattern, and thus enable recovery of thermal maps and a delivered thermal dose. This learning process can be achieved with simulation and/or actual data acquisition.
In some implementations, the imaging method for temperature monitoring may be based on the injection of the virtual ultrasound pattern in the ultrasound brightness mode (B-mode) image coupled with biophysical simulation of heat propagation. This imaging method does not require any hardware extensions to an ultrasound B-mode system. The imaging method may establish a bi-directional ultrasound communication between an ultrasound imaging machine and an active element inserted within the tissue. A virtual pattern can then directly be created in the ultrasound B-mode display during the ablation by controlling a timing and an amplitude of the ultrasound field generated by the active element. Changes of the injected pattern are related to the change of the ablated tissue temperature through the additional knowledge of a biophysical model of heat propagation in the tissue. Such changes may be monitored during ablation and used to generate spatially and temporally accurate thermal maps.
In some implementations,
In the fourth example method 1100, a thermal dose may be measured with ultrasound standing wave elastography. An ultrasound elastography image may be acquired by applying stress to the tissue and measuring a local displacement and deformation, from which a tissue stiffness map can be derived. There are various methods to apply stress to tissue, such as free hand palpation, vibration, acoustic radiation force (ARF), shear wave (SW), and/or the like. However, these methods typically cannot generate a fully controllable, programmable, and stable stress in the tissue. A standing wave, also known as a stationary wave, is a wave that stays in a constant position. A standing wave can be generated when two waves traveling in opposite directions have identical frequencies, amplitudes, and beam paths. As a result, a node and antinode of the combined wave may remain at the same position. An ultrasound wave is a longitudinal wave, meaning that the antinodes have higher pressure than the nodes, and the pressure acts along the wave traveling direction. If an ultrasound standing wave is formed in the tissue, tissue will be pushed from the higher-pressure region toward the lower-pressure region, and physical displacements and deformations may be formed, which can be used for the elastography imaging.
Aside from elastography, the fourth example method 1100 may be utilized for speed of sound map measurement. Given a wave frequency, the wavelength is determined by the speed of sound. A standing wave pattern may freeze the wave and make a direct measurement of the wavelength possible.
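As a small worked illustration of this measurement, adjacent nodes of a standing wave are separated by half a wavelength, so a measured node spacing and a known drive frequency give the speed of sound directly; the numbers below are illustrative.

```python
# A minimal sketch of estimating the speed of sound from a standing-wave
# node spacing: node-to-node distance equals half a wavelength.
def speed_of_sound_from_nodes(node_spacing_m, frequency_hz):
    wavelength_m = 2.0 * node_spacing_m      # node spacing = lambda / 2
    return wavelength_m * frequency_hz       # c = f * lambda

# Example: 0.77 mm node spacing at 1 MHz -> about 1540 m/s.
print(speed_of_sound_from_nodes(0.77e-3, 1.0e6))
```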
Deep learning-based approaches to inverse problems (or image reconstruction) can be broadly categorized into two groups. A first group uses an end-to-end deep neural network to solve the problem. Instead of completely relying on neural networks, a second group uses a standard image reconstruction approach (e.g., filtered back projection) for an initial reconstruction, followed by exploiting a deep neural network to correct the reconstruction artifacts.
In some implementations, a deep neural network-based approach may be utilized to estimate spatial and temporal temperature distribution from a set of ultrasound signals. Such implementations may integrate a standard physics-based method, convolutional neural networks, and recurrent neural networks.
There may be two possible scenarios in a thermal monitoring setup. The first scenario includes acquiring RF signals using a piezoelectric element or an ultrasound transducer. In the second scenario, a B-mode image may be obtained based on pattern injection (e.g., as shown in
For the first scenario, a filtered back-projection model may be utilized, where the standard high-pass filtering applied before the back-projection may be replaced by a deep convolutional neural network (CNN).
After filtering by CNNHPF, a physics-based back-projection technique may be utilized for reconstruction. The back-projection may be based on a bent-ray model that provides a good approximation of ultrasound wave propagation in a heterogeneous medium. After the back-projection, a post CNN may be utilized to eliminate the artifacts due to partial reconstruction from limited angle tomography. This CNN is indicated as CNNIDN in part (b) of
A recurrent neural network (RNN) may be utilized at the end, as shown in
ConvGRU is an extension of a GRU to extract spatiotemporal features from a series of 2D feature maps. A ConvGRU may take a current input X_t and a previous hidden state H_{t-1}, and may generate a current hidden state H_t. The relation of H_t to X_t and H_{t-1} is given below:
$$Z_t = \sigma(W_{xz} * X_t + W_{hz} * H_{t-1})$$
$$R_t = \sigma(W_{xr} * X_t + W_{hr} * H_{t-1})$$
$$\tilde{H}_t = \tanh\!\left(W_{xh} * X_t + W_{hh} * (R_t \circ H_{t-1})\right)$$
$$H_t = (1 - Z_t) \circ H_{t-1} + Z_t \circ \tilde{H}_t$$
where * and ∘ represent convolution and element-wise multiplication, respectively; W_{xz}, W_{hz}, W_{xr}, W_{hr}, W_{xh}, and W_{hh} are the parameters to be learned during training; σ represents the sigmoid operation; and tanh is the hyperbolic tangent activation function.
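A minimal PyTorch sketch of a ConvGRU cell implementing the update equations above is shown below; the kernel size and channel counts are illustrative, and bias terms are omitted to mirror the equations as written.

```python
# A minimal sketch of a ConvGRU cell: convolutional update gate Z_t, reset
# gate R_t, candidate state, and new hidden state, as in the equations above.
import torch
import torch.nn as nn

class ConvGRUCell(nn.Module):
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        p = k // 2
        self.w_xz = nn.Conv2d(in_ch, hid_ch, k, padding=p, bias=False)
        self.w_hz = nn.Conv2d(hid_ch, hid_ch, k, padding=p, bias=False)
        self.w_xr = nn.Conv2d(in_ch, hid_ch, k, padding=p, bias=False)
        self.w_hr = nn.Conv2d(hid_ch, hid_ch, k, padding=p, bias=False)
        self.w_xh = nn.Conv2d(in_ch, hid_ch, k, padding=p, bias=False)
        self.w_hh = nn.Conv2d(hid_ch, hid_ch, k, padding=p, bias=False)

    def forward(self, x_t, h_prev):
        z_t = torch.sigmoid(self.w_xz(x_t) + self.w_hz(h_prev))       # update gate
        r_t = torch.sigmoid(self.w_xr(x_t) + self.w_hr(h_prev))       # reset gate
        h_tilde = torch.tanh(self.w_xh(x_t) + self.w_hh(r_t * h_prev))
        return (1 - z_t) * h_prev + z_t * h_tilde                     # new hidden state

# Example: process a short sequence of 2D feature maps.
cell, h = ConvGRUCell(1, 16), torch.zeros(1, 16, 64, 64)
for x in torch.randn(5, 1, 1, 64, 64):      # 5 time steps
    h = cell(x, h)
```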
As indicated above,
As shown in
As further shown in
As further shown in
Process 1300 may include additional implementations, such as any single implementation or any combination of implementations described below and/or in connection with one or more other processes described elsewhere herein.
In a first implementation, the one or more acoustic characteristics may include changes in one or more of speeds or intensities at which the ultrasound signals travel through the region of tissue. In some implementations, the mapping used to generate the image representing the thermal map of the region of tissue may be based on a relationship between changes in temperature and the changes in the speeds or intensities at which the ultrasound signals travel through the region of tissue.
In a second implementation, alone or in combination with the first implementation, the mapping used to generate the image representing the thermal map of the region of tissue may be further based on temperature-dependent variations in a time of flight, an attenuation, a phase, and/or a nonlinearity for at least one of the ultrasound signals transmitted through the region of tissue.
In a third implementation, alone or in combination with one or more of the first and second implementations, the ultrasound receiver may include a transducer array having one or more transducer elements with known locations, and the signal processor may determine a relative geometry between the ultrasound transmitter and the ultrasound receiver based on the known locations of the one or more transducer elements and time of flight data associated with ultrasound signals transmitted from the ultrasound transmitter to the ultrasound receiver before the ablation procedure.
In a fourth implementation, alone or in combination with one or more of the first through third implementations, the signal processor may generate the image representing the thermal map of the region of tissue using one or more tomographic techniques based on the relative geometry between the ultrasound transmitter and the ultrasound receiver.
In a fifth implementation, alone or in combination with one or more of the first through fourth implementations, the signal processor, when generating the thermal map of the region of tissue, may use a thermal propagation model to segment the region of tissue into groups of voxels that have similar temperatures and reduce a region of interest to be represented by the thermal map to an area where the ultrasound signals are causing a change in temperature during the ablation procedure based on the groups of voxels that have the similar temperatures.
In a sixth implementation, alone or in combination with one or more of the first through fifth implementations, the signal processor may obtain patient-specific simulation data including an expected temperature evolution for the region of tissue during the ablation procedure based on a three-dimensional anatomical image of the region of tissue and one or more biophysical parameters and use the patient-specific simulation data including the expected temperature evolution for the region of tissue in combination with one or more tomographic image reconstruction techniques to generate the image representing the thermal map of the region of tissue.
In a seventh implementation, alone or in combination with one or more of the first through sixth implementations, the signal processor may obtain simulation data including a simulated thermal map based on expected time of flight measurements for the ultrasound signals to be transmitted through the region of tissue during the ablation procedure and perform an action based on a comparison of actual time of flight measurements for the ultrasound signals transmitted through the region of tissue during the ablation procedure and the expected time of flight measurements for the ultrasound signals.
Although
As shown in
As further shown in
As further shown in
As further shown in
Process 1400 may include additional implementations, such as any single implementation or any combination of implementations described below and/or in connection with one or more other processes described elsewhere herein.
In a first implementation, the patient-specific simulation data may further include a simulated thermal map based on the expected temperature-dependent measurements for the ultrasound signals, and the action may include displaying the simulated thermal map to guide the ablation procedure based on the comparison indicating a threshold similarity between the actual temperature-dependent measurements and the expected temperature-dependent measurements for the ultrasound signals.
In a second implementation, alone or in combination with the first implementation, the action may include causing the ablation procedure to stop based on the comparison indicating one or more of insufficient ablation in a targeted area of the region of tissue or off-target ablation in the region of tissue.
In a third implementation, alone or in combination with one or more of the first and second implementations, the patient-specific simulation data may further include an expected temperature evolution for the region of tissue during the ablation procedure based on a three-dimensional anatomical image of the region of tissue and one or more biophysical parameters, and the action may include using the expected temperature evolution for the region of tissue in combination with one or more tomographic image reconstruction techniques to generate an image representing a thermal map of the region of tissue during the ablation procedure.
In a fourth implementation, alone or in combination with one or more of the first through third implementations, the expected temperature evolution for the region of tissue may be represented according to one or more of a temporal resolution or a spatial resolution.
In a fifth implementation, alone or in combination with one or more of the first through fourth implementations, the image may be a synthesized thermal image generated using one or more of a deep learning technique or a machine learning technique.
In a sixth implementation, alone or in combination with one or more of the first through fifth implementations, the image may be an ultrasound elastography image based on pressure changes along the ultrasound signals transmitted through the region of tissue.
Although
As shown in
As further shown in
As further shown in
As further shown in
Process 1500 may include additional implementations, such as any single implementation or any combination of implementations described below and/or in connection with one or more other processes described elsewhere herein.
In a first implementation, the relative locations associated with the one or more ultrasound transmitters and the one or more ultrasound receivers may be determined based on temperature-dependent measurements associated with ultrasound signals transmitted from the one or more ultrasound transmitters to the one or more ultrasound receivers before the ablation procedure.
In a second implementation, alone or in combination with the first implementation, the signal processor may obtain simulation data including an expected temperature evolution for the region of tissue during the ablation procedure based on a three-dimensional anatomical image of the region of tissue and one or more biophysical parameters. In some implementations, the image representing the thermal map of the region of tissue may be generated based on the expected temperature evolution for the region of tissue in combination with one or more tomographic image reconstruction techniques.
In a third implementation, alone or in combination with one or more of the first and second implementations, the signal processor may obtain simulation data including a simulated thermal map based on expected temperature-dependent measurements for the ultrasound signals to be transmitted through the region of tissue during the ablation procedure. In some implementations, the signal processor may perform an action based on a comparison of the set of temperature-dependent measurements for the ultrasound signals transmitted through the region of tissue during the ablation procedure and the expected temperature-dependent measurements for the ultrasound signals.
In a fourth implementation, alone or in combination with one or more of the first through third implementations, the image representing the thermal map of the region of tissue may be generated using one or more tomographic techniques based on the relative locations associated with the one or more ultrasound transmitters and the one or more ultrasound receivers.
Although
The foregoing disclosure provides illustration and description, but is not intended to be exhaustive or to limit the implementations to the precise forms disclosed. Modifications and variations may be made in light of the above disclosure or may be acquired from practice of the implementations.
As used herein, the term “component” is intended to be broadly construed as hardware, firmware, and/or a combination of hardware and software.
As used herein, satisfying a threshold may, depending on the context, refer to a value being greater than the threshold, more than the threshold, higher than the threshold, greater than or equal to the threshold, less than the threshold, fewer than the threshold, lower than the threshold, less than or equal to the threshold, equal to the threshold, or the like.
Certain user interfaces have been described herein and/or shown in the figures. A user interface may include a graphical user interface, a non-graphical user interface, a text-based user interface, and/or the like. A user interface may provide information for display. In some implementations, a user may interact with the information, such as by providing input via an input component of a device that provides the user interface for display. In some implementations, a user interface may be configurable by a device and/or a user (e.g., a user may change the size of the user interface, information provided via the user interface, a position of information provided via the user interface, and/or the like). Additionally, or alternatively, a user interface may be pre-configured to a standard configuration, a specific configuration based on a type of device on which the user interface is displayed, and/or a set of configurations based on capabilities and/or specifications associated with a device on which the user interface is displayed.
It will be apparent that systems and/or methods described herein may be implemented in different forms of hardware, firmware, or a combination of hardware and software. The actual specialized control hardware or software code used to implement these systems and/or methods is not limiting of the implementations. Thus, the operation and behavior of the systems and/or methods are described herein without reference to specific software code—it being understood that software and hardware can be designed to implement the systems and/or methods based on the description herein.
Even though particular combinations of features are recited in the claims and/or disclosed in the specification, these combinations are not intended to limit the disclosure of various implementations. In fact, many of these features may be combined in ways not specifically recited in the claims and/or disclosed in the specification. Although each dependent claim listed below may directly depend on only one claim, the disclosure of various implementations includes each dependent claim in combination with every other claim in the claim set.
No element, act, or instruction used herein should be construed as critical or essential unless explicitly described as such. Also, as used herein, the articles “a” and “an” are intended to include one or more items, and may be used interchangeably with “one or more.” Further, as used herein, the article “the” is intended to include one or more items referenced in connection with the article “the” and may be used interchangeably with “the one or more.” Furthermore, as used herein, the term “set” is intended to include one or more items (e.g., related items, unrelated items, a combination of related and unrelated items, and/or the like), and may be used interchangeably with “one or more.” Where only one item is intended, the phrase “only one” or similar language is used. Also, as used herein, the terms “has,” “have,” “having,” or the like are intended to be open-ended terms. Further, the phrase “based on” is intended to mean “based, at least in part, on” unless explicitly stated otherwise. Also, as used herein, the term “or” is intended to be inclusive when used in a series and may be used interchangeably with “and/or,” unless explicitly stated otherwise (e.g., if used in combination with “either” or “only one of”).
This application claims priority under 35 U.S.C. § 119 to U.S. Provisional Patent Application No. 62/677,872, filed on May 30, 2018, the content of which is incorporated by reference herein in its entirety.
This invention was made with government support under grant R01EB021396, awarded by the National Institutes of Health/NIH/DHHS; and grant IIS-0653322, awarded by the National Science Foundation. The government has certain rights in the invention.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2019/034330 | 5/29/2019 | WO | 00

Number | Date | Country
---|---|---
62/677,872 | May 30, 2018 | US