Optical imaging reveals fundamental biomedical mechanisms by recording fine structures and functions in biological tissues. Modern optical microscopy technologies, such as confocal microscopy, multi-photon microscopy, and optical coherence tomography, form an image via mechanical or optical scanning by detecting photons returned from tissue through scattering or fluorescence. The imaging speed of these existing systems limits their ability to capture fast dynamics over a large field of view (FOV). Conventional optical photography techniques have achieved high imaging speeds by employing two-dimensional optical sensor arrays to acquire snapshot images, but due to strong optical scattering, conventional optical photography can reveal only superficial structures in biological tissue, and it is therefore not well suited for imaging physiological parameters.
Photoacoustic (PA) imaging plays an important complementary role to optical imaging by sensing optical absorption, thereby enabling functional and molecular imaging capabilities. Unlike the signals received by optical imaging systems, PA signals arise from photons that are absorbed in, rather than returned from, the tissue during imaging. To date, most PA imaging systems acquire images over extended fields of view by either scanning with a single-element ultrasound transducer or by sensing PA signals using an ultrasound transducer array. Similar to the scanning devices and methods used by other scanning microscopic systems, the single-transducer-based approach to PA imaging has an imaging speed limited by mechanical scanning. Although the transducer-array-based method overcomes the mechanical scanning speed limitation by acquiring multi-channel PA ultrasonic signals in parallel, the use of ultrasound transducer arrays in PA imaging systems is typically accompanied by high system complexity and cost. Moreover, voluminous data acquired from multiple channels may eventually limit the continuous-mode imaging speed due to high demands on data streaming and processing. A need in the art exists for an imaging system suitable for imaging fine structures and functions in biological tissues with enhanced imaging speed without the added complexity and cost of transducer arrays.
In one aspect, a photoacoustic imaging system is provided that includes an ergodic relay coupled optically to a light source configured to produce a light pulse and further coupled acoustically to at least one transducer device. The ergodic relay is further configured to couple acoustically and optically to an object to be imaged.
In another aspect, a method of imaging a field of view within an object using a photoacoustic imaging system is provided that includes providing a photoacoustic imaging system with an ergodic relay coupled optically to a light source at a light input face and further coupled acoustically to at least one transducer device. The method also includes acoustically and optically coupling an object to be imaged to a light output face of the ergodic relay. The method further includes directing a diffuse light pulse produced by the light source into the object to be imaged via the light output face. The diffuse light pulse illuminates a field of view within the object. The method additionally includes receiving, via the light output face, a plurality of PA signals from a plurality of positions within the field of view. Each of the plurality of PA signals is produced at one of the plurality of positions within the field of view in response to illumination by the diffuse light pulse. The method further includes directing each of the plurality of PA signals to the at least one transducer device via the ergodic relay. Each of the plurality of PA signals is detected at the at least one transducer device after one of a plurality of corresponding delays after producing the diffuse light pulse. Each delay corresponds to one of the plurality of positions at which one of the pluralities of PA signals is produced. The method also includes forming a PA imaging data set that includes the plurality of PA signals and a corresponding plurality of positions within the field of view, each position corresponding to one of the PA signals, as well as reconstructing an image from the PA imaging data set.
The following drawings illustrate various aspects of the disclosure.
While multiple embodiments are disclosed, still other embodiments of the present disclosure will become apparent to those skilled in the art from the following detailed description, which shows and describes illustrative embodiments of the disclosure. As will be realized, the invention is capable of modifications in various aspects, all without departing from the spirit and scope of the present disclosure. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not restrictive.
A photoacoustic imaging technique based on a high-throughput ergodic relay coupled with at least one single-element transducer to capture a widefield image with only one laser shot is disclosed. This method is referred to as photoacoustic topography through an ergodic relay (PATER). In one aspect, the PATER imaging system enables detection of up to 10,000 points over an 8×6 mm2 FOV in parallel at a 2 KHz frame rate; the detected points are resolved by solving the inverse problem as described herein below, with a lateral resolution of ˜210 μm. The PATER imaging system overcomes the limitations of previous PA imaging systems, achieving a higher 2D frame rate, lower data volume, and simpler system configuration than an ultrasound transducer array-based PA tomography system, while maintaining a comparable lateral resolution and FOV. In various aspects, the PATER system makes use of broad light illumination and a single-element unfocused ultrasonic transducer in combination with an ergodic relay (ER) to reconstruct widefield snapshot images without scanning. Unlike either optical-resolution photoacoustic microscopy (OR-PAM) or photoacoustic computed tomography (PACT), the PA waves from the entire 2D imaging plane are encoded by the ER and then decoded and reconstructed through the calibrated impulse responses to yield a widefield image.
The PATER imaging system makes use of an acoustic ergodic relay, also known as an acoustic ergodic cavity, defined herein as a waveguide that allows a sound wave from any input point to reach any one or more output points with distinct delay characteristics. If the ergodic relay is assumed to be lossless and the boundaries are assumed to be perfect reflectors, the acoustic wave at the input point will visit a given output position multiple times with a unique path relative to acoustic waves received at other inputs. Because an acoustic ergodic relay is a linear, temporally shift-invariant system and the ergodic relay's response can be calibrated in advance, ultrasonic waves from multiple PA sources can be detected in parallel using at least one single-element transducer coupled to the ergodic relay. The ultrasonic waves from multiple PA sources detected by the at least one single-element transducer may be subsequently separated mathematically using analysis methods described herein below.
In various other aspects, the ergodic relay of the PATER system may be coupled to any of a variety of ultrasound transducer devices that include additional transducer elements in excess of the at least one single-element transducer disclosed herein above. The one or more ultrasound transducer devices coupled to the ergodic relay are selected from a variety of transducer devices including, but not limited to: one or more single-element transducers, one or more linear transducer arrays, one or more 2-D transducer arrays, and any combination thereof. Without being limited to any particular theory, the frame rate and/or imaging resolution of the PATER imaging system may be enhanced using transducer devices that include more than one transducer element by providing multiple parallel channels by which the ultrasonic waves from multiple PA sources may be detected and recorded.
The relatively high framing rate of the PATER imaging system, compared to both scanning-based single-transducer PA microscopy systems and array-based PA imaging systems, potentially enables neural activity imaging and other applications previously thought unachievable with existing PA imaging systems. In addition, the PATER imaging system may be used to enable biometric authentication of individuals for security applications based on unique internal biometric characteristics of in vivo physiological features such as blood flow, arterial oxygenation, and venous oxygenation that cannot be readily duplicated with existing technologies.
PATER Imaging System
In various aspects, the light source 102 may be configured to compensate for fluctuations in power supply to the light source/laser 102 that may cause variation in the energy delivered by each light pulse produced by the light source/laser 102. In one aspect, a portion of each light pulse emitted by the light source/laser 102 may be directed to a photodiode 106 (DET36A, Thorlabs, Inc.) using a sampling element including, but not limited to, a beam sampler (BS) 108 as shown in
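By way of a non-limiting, illustrative sketch, per-shot pulse-energy compensation of the kind described above could be applied as follows; the proportional scaling of each PA record by its photodiode reading, and the array layout, are assumptions made for this example rather than the specific correction used by the system.

```python
import numpy as np

def normalize_by_pulse_energy(pa_records, pd_readings):
    """Scale each PA record by its per-shot photodiode reading so that
    shot-to-shot fluctuations in pulse energy do not appear as signal changes.

    pa_records  : (num_shots, N) array of transducer records
    pd_readings : (num_shots,) array of photodiode measurements, one per shot
    """
    reference = np.mean(pd_readings)                      # nominal pulse energy
    return pa_records * (reference / pd_readings)[:, None]
```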
The series of laser pulses produced by the light source 102 of the PATER imaging system 100 may be directed to selectable focusing elements 122/124 to modify (i.e. narrow or widen) the lateral dimensions of the light pulses delivered to the object 132 to be visualized using the PATER imaging system 100. As disclosed in further detail herein below, in various aspects imaging using the PATER imaging system 100 as disclosed herein is performed in two stages, each stage making use of light pulses with different cross-sectional diameters: a calibration stage, in which a narrowly focused laser pulse is scanned across the object 132 to be viewed, and a widefield imaging stage, in which a diffuse (i.e. wide diameter) light pulse illuminates the entire field of view (FOV) 216 (shown in
In one aspect, each selectable focusing element is positioned separately within the PATER imaging system 100 to conduct each stage of the PATER imaging method. In this aspect, a plano-convex lens 124 may be positioned to focus the light pulses during the calibration stage, and the plano-convex lens 124 may be replaced by a diffuser 122 during the widefield imaging stage of the PATER imaging method. In another aspect, the selectable focusing elements 122/124 may be mounted on an adjustable stage 114 configured to position each selectable focusing element as needed by adjusting the position, orientation, and/or configuration of the adjustable stage 114. In one aspect, the adjustable stage 114 may be an optical element rotator (OER) (not illustrated). In one aspect, the diffuser 122 (D) and the plano-convex lens (PCL) 124 are mounted to the OER.
In an aspect, illustrated in
As illustrated in
In various aspects, the light source/laser 102 is coupled to the selectable focusing element using any one or more known optical coupling elements to deliver the light pulse from the light source/laser 102 to the selectable focusing element. Non-limiting examples of suitable optical coupling elements include optical fibers, lenses, mirrors, and any other known optical coupling elements without limitation. In one aspect, the optical coupling element may include a mirror 110 positioned to reflect the light beam from the light source/laser 102 upward toward the selectable optical element as illustrated in
Referring again to
In some aspects, the ergodic relay 126 may be any device configured to direct light pulses from the light source 102 into a field of FOV of an object 132, to receive a plurality of PA signals 204 induced by the light pulse within the object 132, to direct each of the plurality of the PA signals 204 along one of a plurality of different pathways through the ergodic relay 126, and to deliver each of the PA signals 204 to one or more transducer devices coupled to the ergodic relay 126, such that each of the plurality of PA signals 204 is delivered at a characteristic delay that is correlated with the position from which the PA signal originated within the object 132.
In some aspects, the ergodic relay 126 may be any device configured to internally reflect light pulses and PA signals 204. In one aspect, the ergodic relay 126 is a right angle prism, such as an ergodic relay prism 126 (ERP) as illustrated in
The light pulses directed into the object 132 via the light output face 214 carry sufficient energy to induce localized heating of objects 132 illuminated by the light beams, resulting in the production of photoacoustic (PA) signals through known methods. PA signals 204 in the form of ultrasound waves induced by the light pulse propagate in all directions away from the object 132 producing the pulse, including in the direction of the light output face 214 of the ERP 126. In an aspect, an acoustic coupling gel may be applied to the object 132 and/or the surface of the light output face 214 of the ERP 126 to enhance the efficiency of transfer of the PA signals 204 into the light output face 214 of the ERP 126 as illustrated in
PA signals 204 entering the ergodic relay 126 via the light output face 214 propagate across the material of the ergodic relay 126 and are internally reflected due to the discontinuity between the acoustic transmissivity of the material of the ergodic relay 126 and the surrounding air. Each PA signal may reflect internally within the ergodic relay 126 a plurality of times between the internal surfaces of the light input face 210, the angled face 212, and the light output face 214. An exemplary path taken by one PA signal is illustrated in
In various aspects, each PA signal produced at one position within the FOV 216 of the PATER imaging system 100 is directed from the light output face 214 of the ERP 126 to a transducer element of the one or more transducer devices along a unique path involving multiple internal reflections. Because each unique path is characterized by a unique total path length along the multiple reflections between the light output face 214 and the transducer element, each PA signal takes a slightly different time to travel through the ERP 126. When the entire FOV 216 within the object 132 is illuminated by a single diffuse pulse, the resulting plurality of PA signals 204 induced by the diffuse pulse arrive at the one or more transducer devices at different times due to the different pathways traveled by each PA signal. As a result, the characteristic delay between each PA signal arriving at the one or more transducer devices may be used to determine the position from which each PA signal originated.
Without being limited to any particular theory, the construction of the ERP 126 provides a plurality of pathways along which each PA signal may be transmitted from the object 132 to the one or more transducer devices. Because each pathway is of slightly different length, the transit time of a PA pulse along each pathway is slightly different, causing a delay in the arrival time of different PA signals 204 depending on the individual pathway taken through the ERP 126. Because the individual pathway taken by a PA signal through the ERP 126 depends upon the spatial position at which the PA signal enters the ERP 126, the resulting delay time may similarly be associated with a spatial position within the object 132. As will be described below, each signature delay time for each position within the object 132 may be identified during a calibration stage by illuminating a small region within the object 132 using a narrow-diameter light pulse and determining the delay time between the delivery of the light pulse and the detection of the resulting PA signal. The signature delays over the entire field of view 216 (FOV) within the object 132 may be mapped by scanning the position of the narrow-diameter light pulse and measuring the resulting delay times at each position over the entire FOV 216.
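By way of a non-limiting, illustrative sketch, the signature delay associated with one calibration position could be estimated from the recorded waveform as shown below; the sampling rate, the laser trigger time, and the use of the strongest envelope peak as the arrival marker are assumptions made only for this example.

```python
import numpy as np
from scipy.signal import hilbert

def signature_delay(pa_waveform, fs, t_laser=0.0):
    """Estimate the delay between the laser pulse and the strongest arrival
    of a calibration PA waveform recorded by the single-element transducer.

    pa_waveform : 1-D array of transducer samples for one calibration point
    fs          : digitizer sampling rate in Hz (assumed value)
    t_laser     : laser firing time in seconds relative to the record start
    """
    envelope = np.abs(hilbert(pa_waveform))   # amplitude envelope of the record
    i_peak = int(np.argmax(envelope))         # strongest arrival
    return i_peak / fs - t_laser              # signature delay in seconds

# Mapping these delays over the raster of calibration positions yields the
# delay-versus-position table described in the text.
```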
In various aspects, the ERP 126 may be selected from any known ERP device without limitation. In one aspect, the ERP 126 may be selected based on at least one functional parameter including, but not limited to, low acoustic attenuation inside the cavity, high acoustic reflectivity at its boundaries, and high asymmetry in the geometric relationship between the ERP 126 and the coupled transducer 128 to provide distinctive propagation paths for each acoustic input point. By way of non-limiting example, the ERP 126 may be a right-angle prism made of UV fused silica (PS615, Thorlabs, Inc.; 1.5 cm right-angle edge length, 2,203-kg/m3 density, 73.6-GPa Young's modulus). The ERP in this non-limiting example has a 99.99% normal-incidence acoustic reflectivity by amplitude at the boundary between the prism and air. The acoustic attenuation coefficient is approximately 2.4 nepers/m at 20 MHz, indicating negligible attenuation in the prism over the detected pathlength range.
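As a back-of-the-envelope check of the quoted normal-incidence reflectivity, the sketch below evaluates the standard pressure reflection coefficient at a silica-air boundary; the nominal acoustic impedance values are assumptions taken from typical literature figures, not from the disclosure.

```python
# Normal-incidence pressure reflection coefficient: R = (Z2 - Z1) / (Z2 + Z1).
# Impedance values below are nominal literature figures, assumed for illustration.
Z_SILICA = 13.1e6   # Rayl, fused silica (approx. density x longitudinal sound speed)
Z_AIR = 413.0       # Rayl, air at room temperature

R = abs((Z_AIR - Z_SILICA) / (Z_AIR + Z_SILICA))
print(f"Normal-incidence reflectivity by amplitude: {R:.5f}")  # ~0.99994, i.e. ~99.99%
```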
In various aspects, the one or more transducer devices coupled to the ergodic relay 126 may be any transducer device known in the art without limitation. In various other aspects, the one or more transducer devices may be coupled to the ergodic relay 126 at any one or more positions on the ergodic relay 126 so long as the coupled one or more transducer devices do not interfere with the delivery of the light pulses or the transmittal and internal reflection of the PA signals 204 from the object 132 into the light output face 214. In various aspects, an acoustic coupling medium may be inserted between the ERP 126 and the input of the one or more transducer devices including, but not limited to, a polyester resin material.
In one aspect, the one or more transducer devices may include at least one single-element needle transducer coupled to the right angle prism or other ergodic relay 126; each single-element needle transducer may be coupled to the ergodic relay 126 at a single point including, but not limited to, at a vertex of the prism as illustrated in
By way of non-limiting example, the PATER imaging system 100 illustrated schematically in
In this non-limiting example, the PATER imaging system 100 may further include an ergodic relay 126 in the form of a right-angle prism made of UV fused silica (PS611, Thorlabs, Inc.; 2,203 kg/m3 density, 73.6 GPa Young's modulus), which has a 99.99% normal-incidence acoustic reflectivity by amplitude at the boundary between the prism and air. The acoustic attenuation coefficient is 1.23 nepers/m at 10 MHz, indicating negligible attenuation in the prism, which has a right-angle edge length of 2.5 cm. With such high boundary reflectivity and low acoustic attenuation, the PA signals 204 reflecting within the prism may be assumed to be essentially lossless and sufficiently scrambled, so that the prism forms an acoustic ergodic relay 126 to guide PA signals 204.
Also in this non-limiting example, the PATER imaging system 100 may further include a pin-shaped ultrasound transducer (VP-0.5, CTS Electronics, Inc.; 10 MHz central frequency, 0.5 mm element size or VP-0.5-20 MHz, CTS Electronics, Inc.; 20 MHz central frequency, 56% one-way bandwidth, 0.5 mm element size) placed at a corner of the prism to maximize the distinctions among the received signals from the different input points. In this non-limiting example, an acoustic coupling material including, but not limited to, polyester resin may be positioned between the ergodic relay prism 126 and the ultrasound transducer to act as an ultrasound coupling medium.
PATER Imaging Method
The PATER imaging system 100 obtains images 218 of a field of view 216 of an object 132 according to a PATER imaging method that includes several stages. After positioning the object 132 on the light output face 214 of the ergodic relay prism 126 (ERP), a point-by-point calibration of the PATER system response is conducted during a calibration stage. During the calibration stage, a tightly focused light pulse is scanned over the field of view 216 of the object 132 while recording the resulting PA signals 204 at each position of the light pulse to map the response characteristics of the PATER system 100 for the object 132 including, but not limited to, a signature delay time between light pulse production and the detection of the resulting PA signal at the transducer as a function of position within the FOV 216. In addition to a map of the response characteristics at all positions within the field of view 216 resulting from the calibration, the PA signals 204 obtained during the calibration stage may be reconstructed into an image 218 of the object 132 using known data analysis methods.
Widefield images 218 may be obtained using the PATER imaging system 100 during a subsequent widefield imaging stage. Widefield images 218 are obtained by illuminating the entire FOV 216 of the object 132 using a single diffuse light pulse and recording a plurality of PA signals 204 produced within the FOV 216 of the object 132. The map of the response characteristics of the PATER imaging system 100 obtained during the calibration stage is used to map the plurality of PA signals 204 to corresponding positions within the FOV 216 of the object 132, and an image 218 of the FOV 216 may be reconstructed using known methods once the positions from which each of the plurality of PA signals 204 originated have been determined.
Calibration Stage
The calibration stage of imaging using the PATER imaging system 100 obtains a PA signal via the ergodic relay prism 126 in response to illumination of an isolated small region of the FOV 216 of the object 132 for a plurality of small regions, and then maps the characteristics of each PA signal recorded at each isolated position within the FOV 216 of the object 132 onto the positions of each isolated region. In one aspect, the light pulse may be tightly focused and scanned across a plurality of calibration points 1602 distributed across the FOV 216 of the object 132, as illustrated in
PA signals 204 (not illustrated) from a plurality of calibration points, corresponding to the plurality of isolated illumination regions, are obtained as illustrated schematically in
Without being limited to any particular theory, because the pulse width of the laser pulse is much shorter than the transducer-defined acoustic period and the focused beam spot is much smaller than the acoustic wavelength, the PA signals 204 from each calibration point received by the ergodic relay prism 126 (ERP) can be approximated as a spatiotemporal delta function. Consequently, each point-by-point calibration measurement provides the impulse response of the ERP 126 for one calibration position (i.e. from one spot on the input surface of the ERP 126).
In conventional PA microscopy (PAM), each PA signal detected by a focused ultrasound transducer has a well-defined propagation direction and can be directly mapped onto a line image. By contrast, each signal detected by the unfocused ultrasound transducer after passing through the ERP 126 contains many acoustic modes. To accommodate the multiple acoustic modes contained within each PA signal from each calibration point, the root-mean-squared (RMS) value of each PA signal was computed to form an RMS amplitude projection image at the calibration points on the light output surface of the ERP 126.
In one aspect, the PA signals 204 from each point i along the raster scan pattern can be measured sequentially as Ci(t), where i is the index number of the calibration point detected in response to light pulse input xi(t) at point i. The measured calibration signal Ci(t) may be stored as a discretized calibration signal C(n), where n indexes the samples obtained over time t.
The RMS value, Xi,rms, for each calibration point is the root-mean-squared value of the calibration signal C(n), calculated as:

X_i,rms = sqrt[(1/N)·Σ_{n=1}^{N} C_i(n)²], Eqn. (1)

where N is the number of samples in the calibration signal.
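By way of a non-limiting, illustrative sketch, the RMS values of Eqn. (1) can be computed directly from the stored calibration records; the assumption that the records are arranged as one row per calibration point is made only for this example.

```python
import numpy as np

def rms_projection(calib_signals):
    """Root-mean-squared value of each calibration record C_i(n), i.e.
    X_i,rms = sqrt((1/N) * sum_n C_i(n)^2), per Eqn. (1).

    calib_signals : array of shape (num_points, N), one row per calibration point
    """
    return np.sqrt(np.mean(np.square(calib_signals), axis=1))

# Reshaping the returned values onto the raster-scan grid yields the RMS
# amplitude projection image of the field of view described in the text.
```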
By way of non-limiting example, a point-by-point calibration may be performed on the ergodic relay prism 126 (ERP) by illuminating a small spot within the FOV 216 of the object 132 using a 5 ns pulse focused by a plano-convex lens 124 (LA1433, Thorlabs, Inc.; 150 mm focal length) to a small spot (˜30 μm) on the light input face 210 of the ergodic prism 126 (ERP) (see
The RMS values obtained during the calibration stage are used to process the data obtained during the widefield imaging stage as described herein below. Further, the RMS values may be used to form an RMS projection image of the FOV 216 of the object 132 using a TwIST algorithm as described herein below.
Widefield Imaging Stage
The widefield imaging stage of imaging using the PATER imaging system 100 obtains a plurality of PA signals 204 via the ergodic relay prism 126 in response to illumination of the entire FOV 216 of the object 132 using a diffuse widefield light pulse 206, and then separates and analyzes the signals originating from different positions within the FOV 216 using the calibration RMS signal values Si,rms. The separated PA signals 204 from each position within the FOV 216 are then used to form an image 218 of the FOV 216 within the object 132.
In one aspect, each widefield measurement can be expressed as a linear combination of the responses from all calibrated points:
s(t) = Σ_i k_i(t)·x_i, (i = 1, . . . , n), Eqn. (2)
where s is the widefield signal, and ki and xi are the impulse response and relative optical absorption coefficient from the ith calibrated point, respectively.
Once time t is discretized, Eqn. (2) can be recast in matrix form, as expressed in Eqn. (3):
s=Kx, Eqn. (3)
where K=[k1, . . . , kn] is the system matrix.
The widefield image 218 is reconstructed by solving the inverse problem of Eqn. (3). A two-step iterative shrinkage/thresholding (TwIST) algorithm was adopted to implement the reconstruction as described herein below.
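By way of a non-limiting, illustrative sketch, the inversion of Eqn. (3) can be prototyped with a generic iterative solver. The code below uses plain iterative shrinkage-thresholding (ISTA) with a soft-threshold (L1) penalty and an optional non-negativity projection as a simplified stand-in for the TwIST/total-variation reconstruction named above; the array shapes, the regularization weight, and the iteration count are assumptions made only for illustration.

```python
import numpy as np

def ista_reconstruct(K, s, lam=0.01, n_iter=200):
    """Minimal iterative shrinkage-thresholding solver for s = K x.

    K : (num_samples, num_points) system matrix whose columns are the
        calibrated impulse responses k_i
    s : (num_samples,) discretized widefield signal
    """
    L = np.linalg.norm(K, 2) ** 2                 # Lipschitz constant of the gradient
    x = np.zeros(K.shape[1])
    for _ in range(n_iter):
        grad = K.T @ (K @ x - s)                  # gradient of 0.5*||s - K x||^2
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
        x = np.maximum(x, 0.0)                    # non-negativity (optical absorption)
    return x
```

The returned vector corresponds to the relative optical absorption at the calibrated points and can be reshaped onto the calibration grid to form the widefield image 218.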
In another aspect, each PA signal 204 originating from each calibration point may be separated from the combined PA signals 204 recorded during the widefield imaging stage using the RMS values of the calibration signals determined above. In this other aspect, the signal recorded during the widefield imaging stage may be expressed as Eqn. (4):
y(t) = Σ_{i=1}^{n} a_i·s_i(t), Eqn. (4)
where ai is an amplitude coefficient defined according to Eqn. (5):
a_i ∝ μ_a·F, Eqn. (5)
where μ_a is the optical absorption coefficient and F is the fluence of the light pulse.
The inner product from Eqn. (4) may be formed according to Eqn. (6):
Y = [⟨y, s_1⟩, . . . , ⟨y, s_n⟩], Eqn. (6)
The amplitude coefficient ai may be corrected using the root mean squared value of the calibration magnitudes according to Eqn. (7):
where si,rms is the root mean squared value of the calibration signal magnitude for the ith calibration point.
The calculated coefficients may then be normalized according to Eqn. (8):
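Because Eqns. (7) and (8) are not reproduced above, the exact correction and normalization are not restated here; the following non-limiting sketch illustrates only the inner-product step of Eqn. (6), together with one plausible RMS-based scaling and a min-max normalization, both of which are assumptions made for this example and not the specific forms of Eqns. (7) and (8).

```python
import numpy as np

def inner_product_amplitudes(y, calib_signals):
    """Project a widefield record y(t) onto the calibration signals s_i(t),
    per Eqn. (6), then apply an assumed RMS-based scaling and normalization.

    y             : (N,) widefield record
    calib_signals : (num_points, N) calibration records s_i
    """
    Y = calib_signals @ y                                   # <y, s_i> for every point i
    s_rms = np.sqrt(np.mean(np.square(calib_signals), axis=1))
    a = Y / (s_rms ** 2 * calib_signals.shape[1])           # assumed RMS correction
    return (a - a.min()) / (a.max() - a.min() + 1e-12)      # assumed normalization
```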
By way of non-limiting example, the PA signals 204 for each image 218 may be acquired within 164 μs during the widefield imaging stage. In this example, the resulting image 218 is devoid of motion artifacts. Further, because the PATER imaging system 100 does not require scanning to form images 218 in the widefield stage, it is able to achieve a widefield recording rate of up to 6 KHz (typically limited by the laser repetition rate) over an 8×6 mm2 FOV. The system's recording rate is 2,000 times greater than that of fast functional PAM, over an eight times larger FOV by area; hence, the rate per FOV area is 8,000 times greater. However, each of the system's recorded frames contains up to about 25 times fewer pixels/frame; hence, the throughput (defined herein as the product of frame rate and pixels/frame) is about 80 times that of fast functional PAM.
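For concreteness, the throughput comparison quoted above follows directly from the two ratios stated in this paragraph (a worked restatement, not an additional claim):

```latex
\frac{\text{throughput}_{\mathrm{PATER}}}{\text{throughput}_{\mathrm{PAM}}}
  = \underbrace{2000}_{\text{frame-rate ratio}} \times \underbrace{\tfrac{1}{25}}_{\text{pixels-per-frame ratio}}
  \approx 80
```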
Image Reconstruction Stage
In various aspects, the RMS calibration values obtained during the calibration stage and the widefield image data may be reconstructed into images 218 of the FOV 216 within the object 132 during an image reconstruction stage of the PATER imaging method. Any known image reconstruction method suitable for photoacoustic imaging may be used without limitation. In one aspect, a two-step iterative shrinkage/thresholding (TwIST) algorithm may be implemented to solve for x as a minimizer of the objective function:
x̂ = argmin_x ||y − Kx||² + 2λΦ_TV(x), Eqn. (9)
Here, ΦTV(x) is the total variation regularization term, and λ is the regularization parameter. To avoid computational instability and ensure image quality, RMS values may be used to select valid calibration points to form the system matrix. Calibration points with RMS values lower than twice (6 dB) the noise level were considered as belonging to the background that was too dark to calibrate for, and, therefore, the impulse responses of these points are excluded from the system matrix K in some aspects. This initial check ensures that only the foreground points with sufficient signal-to-noise ratios are considered in the reconstruction. The approach in this aspect holds valid if the background does not merge into the foreground during the experiment.
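By way of a non-limiting, illustrative sketch, the foreground check described above (retaining only calibration points whose RMS value is at least twice, i.e. 6 dB above, the noise level) can be expressed as a simple mask; how the noise level is estimated is an assumption made only for this example.

```python
import numpy as np

def select_foreground(calib_signals, noise_rms):
    """Build the system matrix K from calibration points whose RMS values
    exceed twice (6 dB) the noise level; dimmer background points are excluded.

    calib_signals : (num_points, N) calibration records
    noise_rms     : scalar noise level, e.g. the RMS of a signal-free record
                    (estimation method assumed for illustration)
    """
    rms = np.sqrt(np.mean(np.square(calib_signals), axis=1))
    keep = rms >= 2.0 * noise_rms
    K = calib_signals[keep].T        # columns are the retained impulse responses
    return K, keep
```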
The following examples illustrate various embodiments of the disclosure.
To assess the lateral resolution of images obtained using the PATER imaging system 100 described herein above, the following experiments were conducted.
Two physical calibration objects were imaged to quantify the lateral resolution in both calibration and snapshot imaging modes. In calibration mode, the edge of a sharp metal blade was imaged. An RMS amplitude projection of the blade edge is provided in
To quantify the lateral resolution of the snapshot widefield image obtained by the PATER imaging system 100 during the snapshot photoacoustic tomography step as described herein above, a phantom calibration body consisting of a pair of laser spot beams projected on to a black acrylic sheet was placed on the imaging plane of the PATER imaging system 100 and snapshot widefield images were obtained at a variety of separation distances between the laser spot pair.
The black acrylic sheet was selected to provide uniform absorption of the laser spots throughout the FOV 216 of the PATER imaging system 100. Two laser spot beams with beam diameters of about 5 μm were shined on the black acrylic sheet. The first laser spot was held stationary in the center of the black acrylic sheet and the second laser spot was traversed linearly across the black acrylic sheet to intercept the first laser spot while obtaining snapshot widefield images of the black acrylic sheet and spots at various separation distances. The step size for the movement of the second laser spot was 15 μm, and the area of the FOV was 1.5×0.51 mm2.
For each snapshot widefield image of the two laser spots, the contrast-to-noise ratio (CNR) was calculated.
The results of this experiment demonstrated that the PATER imaging system 100 acquired snapshot widefield images at a lateral resolution comparable to the diameter of an arteriole, thereby enabling the imaging of fine structures and functions in biological tissues.
The following experiments were conducted to assess the ability of the PATER imaging system 100 to measure concentrations of Evans Blue dye (EB, Sigma-Aldrich, Inc.) in a tube.
A phantom calibration body consisting of two silicone tubes with a 0.65 mm inner diameter placed in parallel was situated within the FOV of the PATER imaging system 100 of Example 1. An EB solution with 0.6% concentration-by-mass was injected into each of the two tubes and a point-by-point calibration was performed as described herein above. Snapshot widefield images of the phantom calibration body were then obtained, in which the concentration of EB in Tube 2 was maintained at 0.6% concentration-by-mass as a control reference, while the concentration of EB in Tube 1 was varied from 0 to 0.9%.
The results demonstrated the high sensitivity of the PATER imaging system 100, enabling detection of subtle changes in the imaged object 132.
To assess the imaging depth of brain vasculature obtained through intact scalp and intact skull using the PATER imaging system 100, the following experiments were conducted.
In comparison to visible light (380-700 nm), near-infrared light (700-1400 nm) had previously demonstrated deeper and finer photoacoustic microscopy (PAM) imaging of vasculature in the mouse brain with the scalp removed (data not shown). To assess the effect of illumination wavelength on the imaging depth and resolution enabled by the PATER imaging system 100, a widefield illumination wavelength of 1064 nm was used to noninvasively image mouse brain vasculature through both intact scalp and intact skull.
Female ND4 Swiss Webster mice (Harlan Laboratory, Inc.; 18-20 g, 6-8 weeks old) were used for these experiments. Each mouse was anesthetized in a small chamber with gaseous isoflurane mixed with air, and then transferred to a customized animal mount where the anesthesia gas was continuously supplied. The animal mount consisted of a stereotaxic frame that fixed the mouse's head, and a heating pad that maintained the mouse's body temperature at ˜38° C. The hair on the mouse's head was razor trimmed and the scalp was either kept intact or surgically removed depending on the experiment; the skull was left intact for all mice. Bloodstains on the skull were carefully cleaned with phosphate buffered saline solution, and ultrasound gel was applied on the skull as an acoustic coupling medium. The animal mount was raised to contact the mouse's skull with the imaging surface of the ergodic relay 126 of the PATER imaging system 100. The amount of pressure maintained between the animal mount and the ergodic relay 126 was adequate to prevent the mouse's head from moving, but not sufficient to interrupt the blood supply in the brain.
A 6×6 mm2 region of the mouse brain was imaged at the illumination wavelength of 1064 nm to obtain the point-by-point calibration and widefield measurement images. A representative RMS amplitude projection image of the mouse brain vasculature reconstructed using the data obtained through an intact scalp during the point-by-point calibration at the illumination wavelength of 1064 nm is shown in
In order to verify that the vasculature appearing in the images of
The mouse's skull was removed after completion of OR-PAM imaging to visually inspect the brain vasculature.
The OR-PAM B-scan image of
The results demonstrated that the PATER imaging system 100 is capable of PA imaging of brain vasculature through an intact mouse scalp and skull, and can achieve lateral resolution of blood vessels within a mouse brain.
The following experiments were conducted to assess the ability to image rapidly occurring dynamic activity using the PATER imaging system 100 by imaging in vivo the hemodynamic response in a mouse brain to front paw stimulations.
The mice were prepared for imaging in a manner similar to the methods described in Example 3, with modifications as described below.
Light pulses used for illumination were delivered at a wavelength of 532 nm. This wavelength is approximately an isosbestic wavelength for both oxy- and deoxy-hemoglobin, i.e. the molar absorption coefficients are equal for these two forms of hemoglobin. Each mouse brain vasculature was imaged through the intact skull (with the scalp removed) in a point-by-point calibration similar to the point-by-point calibration described above. Snapshot widefield images were then obtained using the PATER imaging system 100 with an 8 mm×6 mm field-of-view and at a frame rate of 10 Hz as described herein above. During widefield image acquisition, the left and right paws of each mouse were alternately stimulated by nipping the paws with pointed forceps. Each paw stimulation lasted approximately 1 second. The resulting snapshot widefield images were processed with a 2D median filter and a Gaussian filter to construct an image of fractional changes in PA signal amplitude due to paw stimulation.
The mean of the measurements of the first 5 widefield images/frames was used as a baseline to calculate subsequent differences in widefield measurements. This baseline was subtracted from the running average of the frames x_i over a sliding window of 5 frames to obtain the difference in widefield measurement ΔX_n, as expressed in Eqn. (10):

ΔX_n = (1/5)·Σ_{i=n−4}^{n} x_i − (1/5)·Σ_{i=1}^{5} x_i, Eqn. (10)
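By way of a non-limiting, illustrative sketch, the frame differencing and smoothing described above could be implemented as follows; the median- and Gaussian-filter kernel sizes are assumptions made for this example, while the five-frame baseline and five-frame sliding window follow the text.

```python
import numpy as np
from scipy.ndimage import median_filter, gaussian_filter

def stimulation_difference(frames, window=5):
    """Running-average difference images relative to a resting baseline (Eqn. 10).

    frames : (num_frames, H, W) stack of reconstructed widefield images
    """
    baseline = frames[:window].mean(axis=0)          # mean of the first `window` frames
    diffs = []
    for n in range(window - 1, len(frames)):
        running_mean = frames[n - window + 1 : n + 1].mean(axis=0)
        d = running_mean - baseline
        d = median_filter(d, size=5)                 # 2D median filter (kernel size assumed)
        d = gaussian_filter(d, sigma=2)              # 2D Gaussian filter (sigma assumed)
        diffs.append(d)
    return np.stack(diffs)
```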
A representative RMS amplitude projection image of the mouse brain vasculature reconstructed using the data obtained during the point-by-point calibration of the PATER imaging system 100 is shown in
An increase in PA amplitude was observed in the contralateral somatosensory brain region during stimulations, as well as a weaker increase in PA amplitude in the ipsilateral somatosensory brain region. These increases in PA signal magnitudes in contralateral and ipsilateral somatosensory brain regions during stimulations were consistent with previous findings that suggested a vascular interconnection between the left and right hemispheres of the brain. An increase in PA amplitude was also observed between the two hemispheres (see
The results of these experiments confirmed that the PATER imaging system 100 possessed sufficient spatial and temporal resolution to image dynamic changes in brain vasculature in response to paw stimulation through intact scalp and intact skull.
The following experiments were conducted to image the in vivo blood oxygen saturation in a mouse brain responding to oxygen challenge using a single wavelength of light.
The initial equipment and experimental animal set-up of Example 3 was used in this experiment. The absorption of light in blood mainly occurs within oxy- and deoxy-hemoglobin of circulating red blood cells. Thus, the absorption coefficient, μa, of blood can be calculated as expressed in Eqn. (11):
μ_a = ln(10)·(ε_HbO2·C_HbO2 + ε_Hb·C_Hb), Eqn. (11)

where ε is the molar extinction coefficient [M^−1·cm^−1], C is the concentration [M], and the HbO2 and Hb subscripts denote oxy- and deoxy-hemoglobin, respectively.
The oxygen saturation in blood (sO2) can be calculated as expressed in Eqn. (12):

sO2 = C_HbO2/THb, Eqn. (12)
where THb is the total hemoglobin concentration in blood (combined oxygenated and deoxygenated hemoglobin), as expressed in Eqn. (13):
THb = C_HbO2 + C_Hb, Eqn. (13)
Therefore, the change in the blood oxygen saturation can be calculated as expressed in Eqn. (14):

ΔsO2 = ΔC_HbO2/THb, Eqn. (14)

Assuming that the change in THb is insignificant, a change in blood oxygen saturation signifies that ΔC_Hb = −ΔC_HbO2; under this assumption, a change in μ_a measured at a single wavelength at which ε_HbO2 and ε_Hb differ therefore reflects a change in sO2.
Using a single 620 nm wavelength of light, the in vivo blood oxygen saturation (sO2) in a mouse brain responding to oxygen challenge was imaged using a PATER imaging system 100 over a field of view of 3 mm×3 mm and at a frame rate of 50 Hz. A tunable dye laser (CBR-D, Sirah GmbH) with Rhodamine B dissolved in ethanol as gain medium was used to generate the laser beam at 620 nm wavelength for the widefield illumination 206 used for imaging.
The oxygen challenge was performed by manipulating the oxygen concentration of the mouse's inhaled gas. In this study, a mixture of 95% oxygen and 5% nitrogen was initially administered to the mouse along with an amount of gaseous isoflurane for anesthesia. The mouse brain vasculature was imaged through an intact scalp in the calibration step as described herein above. During the oxygen challenge, the mixture was changed to 5% oxygen and 95% nitrogen for 3 minutes, and then changed back to the initial concentration to end the challenge. Snapshot widefield image data were recorded during the oxygen challenge, and the widefield difference was calculated pixel by pixel. The resulting widefield image was processed with a 2D median filter and a Gaussian filter, and then overlaid on the RMS amplitude projection image as described in Example 4. Two oxygen challenge cycles were performed and analyzed.
Widefield measurements with a laser beam delivered at 532 nm wavelength were taken before and 3 minutes into the oxygen challenge to provide dual-wavelength measurements for an sO2 calculation, and a vessel segmentation algorithm was used to individually identify and label the arterial and venous vessels for sO2 calculations.
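By way of a non-limiting, illustrative sketch, the dual-wavelength sO2 calculation implied by Eqns. (11)-(13) can be prototyped as below; the molar extinction coefficients shown are arbitrary placeholders (tabulated values at the actual wavelengths should be substituted), and the absorption coefficients are assumed to be known only up to a common scale factor.

```python
import numpy as np

def so2_from_two_wavelengths(mu_a, eps_hbo2, eps_hb):
    """Solve Eqn. (11) at two wavelengths for C_HbO2 and C_Hb, then compute
    sO2 = C_HbO2 / (C_HbO2 + C_Hb) per Eqns. (12) and (13).

    mu_a     : length-2 array of (relative) absorption coefficients
    eps_hbo2 : length-2 array of HbO2 molar extinction coefficients [M^-1 cm^-1]
    eps_hb   : length-2 array of Hb molar extinction coefficients [M^-1 cm^-1]
    """
    E = np.log(10) * np.stack([eps_hbo2, eps_hb], axis=1)   # rows: wavelengths
    c_hbo2, c_hb = np.linalg.solve(E, np.asarray(mu_a))
    return c_hbo2 / (c_hbo2 + c_hb)

# Synthetic round-trip check with placeholder extinction coefficients:
eps_hbo2 = np.array([1000.0, 40000.0])
eps_hb = np.array([4000.0, 35000.0])
c_true = np.array([1.0e-4, 0.5e-4])                         # [C_HbO2, C_Hb], arbitrary
mu_a = np.log(10) * np.stack([eps_hbo2, eps_hb], axis=1) @ c_true
print(so2_from_two_wavelengths(mu_a, eps_hbo2, eps_hb))     # ~0.667
```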
The PATER imaging system 100 was able to image the changes in oxygen saturation in a mouse brain caused by exposure to different mixtures of oxygen and nitrogen. The absolute rate of signal change was greater when the mouse was challenged than during recovery from hypoxia, which is consistent with the results reported previously. The sO2 in both arteries and veins of the mouse brain dropped significantly during the oxygen challenge (see
The results of this experiment demonstrated the ability to monitor changes in blood oxygenation levels in brain vasculature resulting from stimulations or treatments such as an oxygen challenge.
The following experiments were conducted to assess the ability to differentiate blood vessel patterns between individual mice with two single laser shots using the PATER imaging system 100.
Two mice were used in this experiment, hereinafter referred to as Mouse 1 and Mouse 2. The mice were prepared for PATER imaging as described previously in Example 3. Mouse 1 was fixed in a stereotaxic frame and a region of the cortical vasculature was recorded in a point-by-point calibration of the PATER imaging system 100 as described above. Widefield signals of the same vasculature region of Mouse 1 were subsequently recorded as described herein above over a FOV of 3 mm×3 mm and at a frame rate of 1 Hz. During the widefield recording of Mouse 1, the mouse was detached from the ergodic relay 126 and then reattached at the same position using a linear translational stage 114 (PT1, Thorlabs, Inc.). Random noise was recorded during widefield recording while the mouse was detached from the ergodic relay 126, and PA signals 204 were observed again after the mouse was reattached to the ergodic relay 126. Point-by-point calibration and widefield recording were similarly conducted for Mouse 2. Reconstruction of widefield images of the brain vasculature of Mouse 1 was performed using the recorded calibration data sets for Mouse 1 as well as for Mouse 2.
The widefield image 218 reconstructed from matched calibration data revealed the original vasculature, and the widefield image 218 reconstructed from unmatched calibration data failed to reconstruct an image 218 in which the original vasculature could be discerned. Consequently, the brain vasculature patterns of individual mice could be differentiated between the two mice. The correlation coefficients between the widefield reconstruction images 218 and the calibration images indicated that the widefield images 218 reconstructed from matched calibration data have much higher correlation than those reconstructed from unmatched calibration data. This differentiation of brain vasculature patterns was not impacted by detaching and reattaching the mice to the ergodic relay 126, indicating that the recognition of individual brain vascularization patterns by the PATER imaging system 100 is relatively insensitive to slight variations in the degree of contact of the mouse to the input surface of the ergodic relay 126.
The results of this experiment demonstrated the ability of the PATER imaging system 100 to discriminate between vascularization patterns of individual mice.
To demonstrate the ability to perform super-resolution imaging and flow direction mapping, as well as detection of circulating tumor cells (CTCs) in vivo, the following experiments were conducted. Most melanoma tumor cells contain high concentrations of melanin, which has much stronger optical absorption than hemoglobin at around 660 nm, thus providing a large optical contrast between the moving CTCs and the background blood. In this study, the ability of the PATER system described above to monitor the migrations of numerous melanoma CTCs simultaneously in a mouse brain vasculature without exogenous labelling was demonstrated. Super-resolution imaging and flow direction mapping of the mouse brain vasculature were accomplished by localizing and recording the positions of the melanoma CTCs.
The PATER system used in these experiments included a 5-ns pulsed laser beam at 532 nm (INNOSAB IS8II-E, Edgewave GmbH; 2-KHz pulse repetition rate). The laser beam was partially reflected by a beam sampler to a photodiode (DET36A, Thorlabs, Inc.) to correct PA signals for fluctuations in light energy. The laser beam that passed through the beam sampler was reflected by a mirror (PF10-03-P01, Thorlabs, Inc.) and passed through an optical-element wheel that enabled switching the active optical element in the light path according to the acquisition mode as described in detail above. The system further included a two-channel digitizer (ATS9350, AlazarTech, Inc.; 100-MS/s sampling rate, 16,384 data points/acquisition sampling length) to record the PA signals and the photodiode measurements. The dye laser pumped by the 532-nm pulsed laser was tuned to generate a 660-nm laser beam by using DCM dissolved in ethanol as the gain medium for monitoring melanoma CTC migrations in the mouse brain.
The mouse was prepared for imaging in a manner similar to the methods described in Example 3, with modifications as described below. A carotid artery cannulation procedure was performed on the mouse to access the left common carotid artery. An arterial catheter (MAC-02, SAI Infusion Technologies Inc.; 1-F tip outer diameter) was inserted into the left common carotid artery to facilitate the melanoma cancer cell injection. The skin of the mouse was sutured after the procedure while the arterial catheter was exposed as the CTC injection port.
A cortical region of the mouse's brain was scanned for calibration after performing the carotid artery cannulation procedure. Approximately 200 μL of cell suspension containing 3×10^6 B16 cells was slowly injected through the arterial catheter into the left common carotid artery. Then, snapshot images of the cortical region were acquired using 660-nm light to monitor the migration of CTCs for approximately two minutes. Differences in snapshot measurements were calculated by temporal running averaging over five consecutive frames using Eqn. (10) provided above. The snapshot differential images were processed in MATLAB with a 2D median filter (medfilt2) using median values from the 5-by-5 neighborhood pixels and a 2D Gaussian filter (imgaussfilt) with a sigma value of 10.
CTC localization accuracy was estimated based on a least-squares fitting to a Gaussian function. A significant source of noise within the snapshot images obtained using the PATER system included random, additive, and statistically stable (across all pixels in the FOV) detector or environmental noise. This detector or environmental noise contrasted with the photon shot noise associated with other super-resolution imaging methods, which follows a Poisson distribution whose parameter is related to the true photon count at each pixel. Therefore, the localization accuracy, as quantified by the RMS error of the fitted center position x̂_0, in the detector or environmental noise-limited case was expressed as:
where σ is the standard deviation of the original Gaussian point-spread function (PSF), and SNR_L is the signal-to-noise ratio with a conceptual pixel of width L = √(2π)·σ. The localization accuracy as expressed in Eqn. (15) was proportional to the original resolution of the PATER system and inversely proportional to the value of SNR_L.
In order to track and localize the flowing CTCs over the FOV, the mouse's brain was imaged at 1,000 frames per second (fps) for 100 seconds. In the entire 3D (x-y-t) volume, candidate CTCs were found using a sequential procedure that included 1) applying the temporal running average to the consecutive image frames to suppress background noise; 2) filtering the volume with a Difference-of-Gaussian filter, whose scales were empirically determined as 1.4 and 1.8 pixels; 3) detecting all local maxima that were greater than 20% of the largest maximum value; and 4) fitting a 2D Gaussian function to the neighborhood of each local maximum to localize the center of the CTCs at the sub-pixel level. After localization, the CTC particles were further filtered with an additional trackability criterion.
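By way of a non-limiting, illustrative sketch, the four-step candidate-detection procedure above could be prototyped per frame as shown below, using the stated Difference-of-Gaussian scales (1.4 and 1.8 pixels) and the 20% local-maximum threshold; the fitting-window size, the 3×3 local-maximum neighborhood, and the use of generic library routines are assumptions made only for this example.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter
from scipy.optimize import curve_fit

def localize_ctcs(frame, sigmas=(1.4, 1.8), threshold_frac=0.2, win=5):
    """Detect and sub-pixel-localize CTC candidates in one differential frame."""
    # Step 2: Difference-of-Gaussian filtering at the stated scales.
    dog = gaussian_filter(frame, sigmas[0]) - gaussian_filter(frame, sigmas[1])
    # Step 3: local maxima above 20% of the largest maximum.
    maxima = (dog == maximum_filter(dog, size=3)) & (dog > threshold_frac * dog.max())
    centers = []
    for y0, x0 in zip(*np.nonzero(maxima)):
        # Step 4: fit a 2D Gaussian to the neighborhood of each local maximum.
        y1, y2 = max(y0 - win, 0), min(y0 + win + 1, frame.shape[0])
        x1, x2 = max(x0 - win, 0), min(x0 + win + 1, frame.shape[1])
        patch = dog[y1:y2, x1:x2]
        yy, xx = np.mgrid[y1:y2, x1:x2]

        def gauss2d(coords, a, cx, cy, s, b):
            x, y = coords
            return (a * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * s ** 2)) + b).ravel()

        try:
            popt, _ = curve_fit(gauss2d, (xx, yy), patch.ravel(),
                                p0=(patch.max(), x0, y0, 2.0, patch.min()), maxfev=2000)
            centers.append((popt[1], popt[2]))        # sub-pixel (x, y) center
        except RuntimeError:
            continue
    return centers
```

The temporal running average of step 1 is applied to the frame stack before calling this per-frame routine.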
All candidate particles were tracked over the course of the video, and only those particles that could be tracked across at least 3 frames within a 5-frame window (10-ms) at the maximum flow speed of 10 mm/s were retained. Tracked CTC particles were then connected into paths and plotted to form a localization image. The finest resolution for the in vivo CTC localization study was estimated to be 300 nm (10-μm reconstruction pixel width, ˜100-μm standard deviation of the Gaussian PSF, and an SNR of ˜110, data not presented).
Major vessels were manually identified on the optical-resolution image to analyze the flow speed pattern. PA signals were extracted along vessel centerlines from the video to form a length-time 2D image for each vessel. A moving-window Fourier Transform-based flow speed extraction algorithm was used to calculate the flow speed and direction along the vessels.
Snapshot measurements on a cortical region of the mouse brain were taken after the injection of melanoma CTCs through the mouse's carotid artery. The reconstructed snapshot images (with a threshold of 20% from the peak value of the largest local maxima) were overlaid on the RMS amplitude projection of the cortical region to highlight the current positions of melanoma CTCs, as illustrated in
The localized positions of the melanoma CTCs from 100,000 snapshot frames were computed to reconstruct the high-resolution localization map of the cortical vasculature, as illustrated in
The structural density of the localization-enhanced vessels depends on the number of traceable melanoma CTCs that were recognized from the images. Therefore, vessels traced with more melanoma CTC events resembled the vasculature more closely than did vessels traced with fewer events. The resolution of the localization image may be further improved with longer integration times.
The flow rate and flow direction of the tumor cells in each vessel was estimated by tracing the melanoma CTCs in real time and analyzing the movements of flowing melanoma CTCs. The computed flow rate of melanoma CTCs had a maximum of 0.54 mm/s, which was lower than the cerebral blood speed. A velocity-contrast map was computed for the mouse cortical vasculature by analyzing the flow rate of melanoma CTCs in each vessel in the spatiotemporal frequency domain. Vessels in the mouse brain were individually identified based on the difference in flow speed and flow direction of CTCs. The tumor cell flow directions in the vessels are shown in
The results of this experiment demonstrated that if the imaged CTCs are sufficiently separated in a snapshot image (i.e., at most a single CTC or CTC cluster is within a resolution voxel at a time), individual CTCs or CTC clusters may be localized using the PATER imaging system with super-resolution. The localization accuracy (quantified by the RMS error of the fitted center position of a single CTC) was inversely proportional to the signal-to-noise ratio (SNR), enabling super-resolution imaging when the SNR is sufficiently high.
This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.
This application is a National Stage Entry of International Patent Application No. PCT/US2018/032007, filed on May 10, 2018, which claims the benefit of U.S. Provisional Patent Application Ser. No. 62/503,997, filed on May 10, 2017, entitled SNAPSHOT PHOTOACOUSTIC PHOTOGRAPHY USING AN ERGODIC RELAY, the entire contents of which are hereby incorporated by reference in their entireties.
This invention was made with government support under CA186567 awarded by the National Institutes of Health. The government has certain rights in the invention.
20090227997 | Wang et al. | Sep 2009 | A1 |
20100079768 | Wang et al. | Apr 2010 | A1 |
20100134793 | Krishnamachari et al. | Jun 2010 | A1 |
20100245766 | Zhang et al. | Sep 2010 | A1 |
20100245769 | Zhang et al. | Sep 2010 | A1 |
20100245770 | Zhang et al. | Sep 2010 | A1 |
20100249562 | Zhang et al. | Sep 2010 | A1 |
20100268042 | Wang et al. | Oct 2010 | A1 |
20100285518 | Viator et al. | Nov 2010 | A1 |
20100309466 | Lucassen et al. | Dec 2010 | A1 |
20100322497 | Dempsey et al. | Dec 2010 | A1 |
20110071402 | Masumura | Mar 2011 | A1 |
20110122416 | Yang et al. | May 2011 | A1 |
20110201914 | Wang et al. | Aug 2011 | A1 |
20110251515 | Leuthardt et al. | Oct 2011 | A1 |
20110275890 | Wang et al. | Nov 2011 | A1 |
20110282181 | Wang et al. | Nov 2011 | A1 |
20110282192 | Axelrod et al. | Nov 2011 | A1 |
20120065490 | Zharov et al. | Mar 2012 | A1 |
20120070817 | Wang et al. | Mar 2012 | A1 |
20120074294 | Streuber et al. | Mar 2012 | A1 |
20120118052 | O'Donnell et al. | May 2012 | A1 |
20120204648 | Wang et al. | Aug 2012 | A1 |
20120275262 | Song et al. | Nov 2012 | A1 |
20120307250 | Wang | Dec 2012 | A1 |
20130151188 | Rokni et al. | Jun 2013 | A1 |
20130199299 | Wang et al. | Aug 2013 | A1 |
20130218002 | Kiraly | Aug 2013 | A1 |
20130245406 | Wang et al. | Sep 2013 | A1 |
20140009808 | Wang et al. | Jan 2014 | A1 |
20140029829 | Jiang et al. | Jan 2014 | A1 |
20140142404 | Wang et al. | May 2014 | A1 |
20140356897 | Wang et al. | Dec 2014 | A1 |
20150005613 | Kim et al. | Jan 2015 | A1 |
20150178959 | Huang et al. | Jun 2015 | A1 |
20150185187 | Wang et al. | Jul 2015 | A1 |
20150245771 | Wang et al. | Sep 2015 | A1 |
20150272444 | Maslov et al. | Oct 2015 | A1 |
20150272446 | Wang et al. | Oct 2015 | A1 |
20150316510 | Fukushima et al. | Nov 2015 | A1 |
20160081558 | Wang et al. | Mar 2016 | A1 |
20160235305 | Wang et al. | Aug 2016 | A1 |
20160242651 | Wang et al. | Aug 2016 | A1 |
20160249812 | Wang et al. | Sep 2016 | A1 |
20160262628 | Wang et al. | Sep 2016 | A1 |
20160305914 | Wang et al. | Oct 2016 | A1 |
20160310083 | Wang et al. | Oct 2016 | A1 |
20160345886 | Wang et al. | Dec 2016 | A1 |
20170065182 | Wang et al. | Mar 2017 | A1 |
20170105636 | Wang et al. | Apr 2017 | A1 |
20170367586 | Wang et al. | Dec 2017 | A9 |
20180020920 | Ermilov et al. | Jan 2018 | A1 |
20180088041 | Zhang et al. | Mar 2018 | A1 |
20180132728 | Wang et al. | May 2018 | A1 |
20180177407 | Hashimoto et al. | Jun 2018 | A1 |
20190008444 | Wang et al. | Jan 2019 | A1 |
20190125583 | Wang et al. | May 2019 | A1 |
20190227038 | Wang et al. | Jul 2019 | A1 |
20190307334 | Wang et al. | Oct 2019 | A1 |
20200056986 | Wang et al. | Feb 2020 | A1 |
20200073103 | Wang et al. | Mar 2020 | A1 |
20200268253 | Wang et al. | Aug 2020 | A1 |
20200275846 | Wang et al. | Sep 2020 | A1 |
20200397523 | Gao et al. | Dec 2020 | A1 |
20210010976 | Wang et al. | Jan 2021 | A1 |
20210132005 | Wang et al. | May 2021 | A1 |
20210321874 | Wang et al. | Oct 2021 | A1 |
20210333241 | Wang et al. | Oct 2021 | A1 |
Number | Date | Country |
---|---|---|
1883379 | Dec 2006 | CN |
106338473 | Jan 2017 | CN |
0012262 | Jun 1980 | EP |
0919180 | Jun 1999 | EP |
1493380 | Jan 2005 | EP |
05-126725 | May 1993 | JP |
2000292416 | Oct 2000 | JP |
2009068977 | Apr 2009 | JP |
2010017426 | Jan 2010 | JP |
2010040161 | Feb 2010 | JP |
2012143384 | Aug 2012 | JP |
2013244122 | Dec 2013 | JP |
2014124242 | Jul 2014 | JP |
2014224806 | Dec 2014 | JP |
2016-101260 | Jun 2016 | JP |
6086718 | Mar 2017 | JP |
100946550 | Mar 2010 | KR |
20160091059 | Aug 2016 | KR |
2017-0006470 | Jan 2017 | KR |
WO2006111929 | Oct 2006 | WO |
WO2007088709 | Aug 2007 | WO |
WO2007148239 | Dec 2007 | WO |
WO2008062354 | May 2008 | WO |
WO2008100386 | Aug 2008 | WO |
WO2009055705 | Apr 2009 | WO |
WO2010048258 | Apr 2010 | WO |
WO2010080991 | Jul 2010 | WO |
WO2011060101 | May 2011 | WO |
WO2011091360 | Jul 2011 | WO |
WO2011127428 | Oct 2011 | WO |
WO2012035472 | Mar 2012 | WO |
WO2013086293 | Jun 2013 | WO |
WO2015118881 | Aug 2015 | WO
WO2018102446 | Jun 2018 | WO |
WO2018102467 | Jun 2018 | WO
WO2018209046 | Nov 2018 | WO |
Entry |
---|
Office Action from related U.S. Appl. No. 11/625,099, dated Nov. 1, 2010. |
Final Office Action from related U.S. Appl. No. 11/625,099, dated Apr. 20, 2010. |
Office Action from related U.S. Appl. No. 12/254,643, dated Aug. 6, 2010. |
Notice of Allowance from related U.S. Appl. No. 12/254,643, dated Nov. 22, 2010. |
Office Action from related U.S. Appl. No. 12/568,069, dated Dec. 21, 2012. |
Office Action from related U.S. Appl. No. 12/568,069, dated Mar. 29, 2012. |
Final Office Action from related U.S. Appl. No. 12/568,069, dated Sep. 18, 2012. |
Notice of Allowance from related U.S. Appl. No. 12/568,069, dated Feb. 22, 2013. |
Office Action from related U.S. Appl. No. 12/739,589, dated Jul. 19, 2012. |
Notice of Allowance from related U.S. Appl. No. 12/739,589, dated Feb. 5, 2013. |
Office Action from related U.S. Appl. No. 13/125,522, dated Jan. 22, 2013. |
Final Office Action from related U.S. Appl. No. 13/125,522, dated May 23, 2013. |
Office Action from related U.S. Appl. No. 13/125,522, dated Jul. 17, 2014. |
Final Office Action from related U.S. Appl. No. 13/125,522, dated Oct. 29, 2014. |
Office Action dated Aug. 26, 2015 issued in U.S. Appl. No. 13/125,522. |
Final Office Action dated Mar. 3, 2016 issued in U.S. Appl. No. 13/125,522. |
Notice of Allowance dated Sep. 19, 2016 issued in U.S. Appl. No. 13/125,522. |
Office Action from related U.S. Appl. No. 13/143,832, dated Apr. 18, 2014. |
Office Action from related U.S. Appl. No. 13/450,793, dated Jun. 5, 2013. |
Final Office Action from related U.S. Appl. No. 13/450,793, dated Nov. 22, 2013. |
Office Action from related U.S. Appl. No. 13/450,793, dated Mar. 24, 2014. |
Office Action from related U.S. Appl. No. 13/450,793, dated Aug. 1, 2014. |
Office Action from related U.S. Appl. No. 13/574,994, dated Mar. 17, 2014. |
Final Office Action from related U.S. Appl. No. 13/574,994, dated Aug. 26, 2014. |
Notice of Allowance dated Nov. 17, 2015 from U.S. Appl. No. 13/574,994. |
Office Action dated Jan. 20, 2015, from U.S. Appl. No. 14/026,577. |
Final Office Action dated Sep. 30, 2015, from U.S. Appl. No. 14/026,577. |
Notice of Allowance dated Jan. 5, 2016, from U.S. Appl. No. 14/026,577. |
Office Action dated Nov. 13, 2017, from U.S. Appl. No. 15/148,685. |
Final Office Action dated Sep. 24, 2018, from U.S. Appl. No. 15/148,685. |
Notice of Allowance dated May 16, 2019, from U.S. Appl. No. 15/148,685. |
Office Action from related U.S. Appl. No. 13/637,897, dated Aug. 1, 2014. |
Office Action from related U.S. Appl. No. 14/164,117, dated Dec. 11, 2015. |
Office Action dated Dec. 13, 2019 issued in U.S. Appl. No. 15/037,468. |
Notice of Allowance dated Mar. 23, 2020 issued in U.S. Appl. No. 15/037,468. |
Notice of Allowance dated Oct. 28, 2020 issued in U.S. Appl. No. 15/037,468. |
Office Action dated Oct. 3, 2018 issued in U.S. Appl. No. 14/436,581. |
Amendment and Request for Continued Examination dated Nov. 25, 2019 in U.S. Appl. No. 14/436,581. |
Final Office Action dated May 24, 2019 issued in U.S. Appl. No. 14/436,581. |
Office Action dated Apr. 3, 2020 issued in U.S. Appl. No. 14/436,581. |
Office Action dated Jun. 20, 2014 issued in U.S. Appl. No. 13/369,558. |
Notice of Allowance dated Jul. 29, 2014 issued in U.S. Appl. No. 13/369,558. |
Notice of Allowance dated Dec. 5, 2014 issued in U.S. Appl. No. 13/369,558. |
Office Action dated Apr. 21, 2017 issued in U.S. Appl. No. 14/639,676. |
Final Office Action dated Nov. 15, 2017 issued in U.S. Appl. No. 14/639,676. |
Office Action dated May 31, 2018 issued in U.S. Appl. No. 14/639,676. |
Notice of Allowance dated Dec. 12, 2018 issued in U.S. Appl. No. 14/639,676. |
Office Action dated Feb. 28, 2020 issued in U.S. Appl. No. 16/372,597. |
Office Action dated Aug. 19, 2019 issued in U.S. Appl. No. 16/372,597. |
Office Action dated Oct. 8, 2020 issued in U.S. Appl. No. 16/372,597. |
The International Search Report and Written Opinion dated Mar. 27, 2014 issued in Application No. PCT/US2013/065594. |
The International Search Report and the Written Opinion of the International Searching Authority, or the Declaration, for PCT/US2009/061435, dated Mar. 29, 2010, 6 pages. |
The International Search Report and the Written Opinion of the International Searching Authority, dated Sep. 22, 2011, from related application No. PCT/US2011/022253, 6 pgs. |
International Search Report of International Application No. PCT/US2014/066437, dated Feb. 26, 2015, 3 pages. |
Partial European Search Report issued for European Application No. 17159220.7, dated Aug. 23, 2017 (9 pages). |
International Search Report and Written Opinion dated Apr. 22, 2009, from Application No. PCT/US2008/081167 (7 pages). |
International Search Report and Written Opinion from Application Serial No. PCT/US2010/020488, dated Aug. 31, 2010 (10 pages). |
International Search Report and Written Opinion from Application Serial No. PCT/US2011/031823, dated Dec. 26, 2011 (8 pages). |
International Search Report and Written Opinion from Application Serial No. PCT/US2012/068403, dated Mar. 19, 2013 (10 pages). |
Extended European Search Report from European Application Serial No. 08842292.8, dated Dec. 17, 2013 (8 pages). |
Final Office Action from related Japanese Patent Application No. JP 2010-531281, dated Mar. 11, 2014, (5 pages). |
International Search Report and Written Opinion dated Dec. 2, 2019, issued in Application No. PCT/US2019/046574. |
International Search Report and Written Opinion dated Dec. 23, 2019, issued in Application No. PCT/US2019/049594. |
International Search Report and Written Opinion dated Aug. 31, 2020, issued in Application No. PCT/US2020/019368. |
International Search Report and Written Opinion dated Oct. 14, 2020, issued in Application No. PCT/US2020/07174. |
International Preliminary Report on Patentability dated Nov. 12, 2019 issued in PCT/US2018/032007. |
Abdelmohsen, et al., “Micro- and nano-motors for biomedical applications,” J. Mater. Chem. B 2, (2014) pp. 2395-2408. |
Ai, et al., “Spectral-domain optical coherence tomography: Removal of autocorrelation using an optical switch,” Applied Physics Letters, (Mar. 15, 2006), 88(11): pp. 111115-1-111115-3. <doi:10.1063/1.2186520>. |
Allen, et al. “Pulsed Near-Infrared Laser Diode Excitation System for Biomedical Photoacoustic Imaging,” Optics Letters, Optical Society of America, USA., vol. 31, No. 23, Dec. 1, 2006, pp. 3462-3464. |
Alomair, et al., “In vivo high angular resolution diffusion-weighted imaging of mouse brain at 16.4 Tesla,” PloS One 10, Jun. 25, 2015, e0130133, pp. 1-17. |
Aubry J.-F., et al., “Experimental demonstration of noninvasive transskull adaptive focusing based on prior computed tomography scans,” J. Acoust. Soc. Am. 113(1), 84-93 (2003). (Year: 2003). |
Baheiraei, et al., “Investigation of magnesium incorporation within gelatin/calcium phosphate nanocomposite scaffold for bone tissue engineering,” Int. J. Appl. Ceram. Technol. 12, (2015) pp. 245-253. |
Baker, M. J. et al., “Using Fourier transform IR spectroscopy to analyze biological materials,” Nat. Protoc. 9, 1771-1791 (2014). |
Bansil, et al., “The biology of mucus: Composition, synthesis and organization” Adv. Drug Deliv. Rev. 124, (2018) pp. 3-15. |
Beaven, G. H. & Holiday, E. R., “Ultraviolet absorption spectra of proteins and amino acids,” Adv. Protein Chem 7, 319-386 (1952). |
Bell, A.G., “On the Production and Reproduction of Sound by Light,” American Journal of Sciences, Oct. 1880, pp. 305-324, Third Series, vol. XX, USA. |
Bellinger, et al., “Oral, ultra-long-lasting drug delivery: Application toward malaria elimination goals” Sci Transl. Med. 8(365), Nov. 16, 2016, 365ra157, pp. 1-25. <doi:10.1126/scitranslmed.aag2374>. |
Bioucas-Dias, J.M. And Figueiredo, M.A.T. “A new TwIST: two-step iterative shrinkage/thresholding algorithms for image restoration,” IEEE Trans. Image Process. 16, 2992-3004 (Dec. 2007). |
Brenner, et al., “Computed Tomography—An Increasing Source of Radiation Exposure” N. Engl. J. Med 357;22, Nov. 29, 2007, pp. 2277-2284. |
Calasso et al., “Photoacoustic Point Source,” Physical Review Letters, vol. 86, No. 16, Apr. 16, 2001, pp. 3550-3553. |
Cannata et al., “Development of a 35-MHz Piezo-Composite Ultrasound Array for Medical Imaging,” IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, 53(1): pp. 224-236 (2006). |
Celli, J. P., et al., “Helicobacter pylori moves through mucus by reducing mucin viscoelasticity,” Proc. Natl. Acad. Sci. U. S. A. 106, (2009) pp. 14321-14326. |
Chan, et al., “New opportunities in micro- and macro-attenuated total reflection infrared spectroscopic imaging: spatial resolution and sampling versatility,” Appl. Spectrosc. 57, 381-389 (2003). |
Cheng, J.-X. et al., “Vibrational spectroscopic imaging of living systems: an emerging platform for biology and medicine,” Science, vol. 350 aaa8870, No. 6264, Nov. 27, 2015, pp. 1054-1063. |
Cheong, et al., “A review of the optical properties of biological tissues,” IEEE J. Quantum Electronics, 26(12): pp. 2166-2185 (1990). |
Chourasia, et al., “Design and Development of Multiparticulate System for Targeted Drug Delivery to Colon,” Drug Delivery, 11:3, (2004) pp. 201-207. |
Cox, B., Beard, P., “Photoacoustic tomography with a single detector in a reverberant cavity” J. Acoust. Soc. Am. 125, 1426 (Mar. 2009). |
Cui, Y., et al. “Transferring-conjugated magnetic silica PLGA nanoparticles loaded with doxorubicin and paclitaxel for brain glioma treatment,” Biomaterials 34, (2013) pp. 8511-8520. |
De Boer, et al., “Improved signal-to-noise ratio in spectral-domain compared with time-domain optical coherence tomography” Optics Letters, vol. 28, No. 21, Nov. 1, 2003, pp. 2067-2069. |
D'Andrea, et al., “Time-resolved optical imaging through turbid media using a fast data acquisition system based on a gated CCD camera” Journal of Physics D: Applied Physics, vol. 36, No. 14, Jul. 1, 2003, pp. 1675-1681. |
Danielli, et al., “Label-free photoacoustic nanoscopy,” Journal of Biomedical Optics, vol. 19, No. 8, Aug. 2014, pp. 086006-1-086006-10. |
Dazzi, A. et al., “AFM-IR: technology and applications in nanoscale infrared spectroscopy and chemical imaging,” Chem. Rev. 117, 5146-5173 (2017). |
Dazzi, A., et al., “Local infrared microspectroscopy with subwavelength spatial resolution with an atomic force microscope tip used as a photothermal sensor,” Optics Letters, vol. 30, No. 18, Sep. 15, 2005, pp. 2388-2390. |
De Avila, et al., “Micromotor-enabled active drug delivery for in vivo treatment of stomach infection” Nat. Commun. 8: 272, (2017) pp. 1-9. |
De Zerda, et al., “Family of enhanced photoacoustic imaging agents for high-sensitivity and multiplexing studies in living mice,” ACS Nano 6(6), Jun. 26, 2012, pp. 4694-4701. |
Diebold, et al., “Photoacoustic Monopole Radiation in One, Two and Three Dimensions,” Physical Review Letters, Figs. 1 and 2, vol. 67, No. 24, Dec. 9, 1991, pp. 3384-3387. |
Diebold, et al., “Photoacoustic Signature of Particulate Matter: Optical Production of Acoustic Monopole Radiation,” Science, New Series, vol. 250, No. 4977, Oct. 5, 1990, pp. 101-104. |
Diem, M. et al., “Molecular pathology via IR and Raman spectral imaging.” Journal of Biophotonics, vol. 6, No. 11-12 (2013) pp. 855-886. <doi:10.1002/jbio.201300131>. |
Diem, M., et al., “A decade of vibrational micro-spectroscopy of human cells and tissue (1994-2004),” Analyst, Oct. 2004, vol. 129, No. 10, pp. 880-885. <doi:10.1039/b408952a>. |
Draeger, C., Fink, M., “One-channel time reversal of elastic waves in a chaotic 2D-silicon cavity,” Phys. Rev. Lett. 79, 407-410 (Jul. 21, 1997). |
Dunn, et al., “Transport-based image reconstruction in turbid media with small source-detector separations,” Optics Letters, vol. 25, No. 24, Dec. 15, 2000, pp. 1777-1779. |
Eghtedari, et al., “High Sensitivity of in Vivo Detection of Gold Nanorods Using a Laser Optoacoustic Imaging System,” Nano Letters, vol. 7, No. 7, 2007, pp. 1914-1918. |
Ermilov et al., “Laser optoacoustic imaging system for detection of breast cancer,” Journal of Biomedical Optics, vol. 14, No. 2, pp. 024007-1-024007-14 (2009). |
Erpelding et al., “Sentinel Lymph Nodes in the Rat: Noninvasive Photoacoustic and US Imaging with a Clinical US System,” Radiology, 256(1): 102-110 (2010). |
Evans, et al., “Coherent Anti-Stokes Raman Scattering Microscopy: Chemical Imaging for Biology and Medicine,” Annual Review of Analytical Chemistry 1, (2008), pp. 883-909. |
Fan, et al., “Development of a Laser Photothermoacoustic Frequency-Swept System for Subsurface Imaging: Theory and Experiment,” J. Acoust. Soc. Am., vol. 116 (6), Dec. 2004, pp. 3523-3533. |
Fan, et al., “Sub-Cellular Resolution Delivery of a Cytokine via Precisely Manipulated Nanowires” Nat. Nanotechnol. 5(7), Jul. 2010, 545-551. <doi:10.1038/nnano.2010.104>. |
Fang, et al., “Photoacoustic Doppler effect from flowing small light-absorbing particles,” Physical Review Letters 99(18) 184501-(1-4) (Nov. 2, 2007). |
Fercher, et al., “Measurement of Intraocular Distances by Backscattering Spectral Interferometry,” Optics Communications, 1995, vol. 117, pp. 43-48. |
Fernandez, D. C., Bhargava, R., Hewitt, S. M. & Levin, I. W., “Infrared spectroscopic imaging for histopathologic recognition,” Nat. Biotechnol. 23, 469-474 (2005). |
Foster, et al., “Advances in ultrasound biomicroscopy” Ultrasound in Medicine & Biology, vol. 26, No. 1, Jan. 2000, pp. 1-27. |
Fujita, K., et al., “Confocal multipoint multiphoton excitation microscope with microlens and pinhole arrays,” Opt. Comm. 174, 7-12 (Jan. 15, 2000). |
Furstenberg, et al., “Chemical Imaging using Infrared Photo-thermal Microspectroscopy,” In Proceedings of SPIE Defense, Security, and Sensing (eds Druy, M.A. & Crocombe, R. A.) 837411 (SPIE, 2012). |
Gaihre, et al., “Gelatin-coated magnetic iron oxide nanoparticles as carrier system: Drug loading and in vitro drug release study,” Int. J. Pharm. 365, (2009) pp. 180-189. |
Gao, et al., “Single-shot compressed ultrafast photography at one hundred billion frames per second,” Nature 516(7529) 74-77 (Dec. 4, 2014). |
Gao, et al., “A review of snapshot multidimensional optical imaging: measuring photon tags in parallel” Phys Rep. 616, Feb. 29, 2016, pp. 1-37. <doi:10.1016/j.physrep.2015.12.004>. |
Gao, et al., “Artificial micromotors in the mouse's stomach: A step toward in vivo use of synthetic motors,” ACS Nano 9, (2015) pp. 117-123. |
Gibson, et al., “Recent advances in diffuse optical imaging” Physics in Medicine and Biology 50, 2005, pp. R1-R43, Institute of Physics Publishing, UK. |
Gong, L. et al., “Breaking the diffraction limit by saturation in stimulated-Raman-scattering microscopy: a theoretical study,” Phys. Rev. A 90, 13818 (2014). |
Griffiths, P., “Fourier transform infrared spectrometry,” Science 21, 297-302 (1983). |
Guggenheim, et al., “Ultrasensitive planoconcave optical microresonators for ultrasound sensing”, Nat. Photon. 11, 714-721 (2017). |
Guittet C, et al., “In vivo high-frequency ultrasonic characterization of human dermis” IEEE Transactions on Bio-medical Engineering. Jun. 1999;46(6):740-746. <doi:10.1109/10.764950>. |
Guo, et al., “Calibration-free absolute quantification of optical absorption coefficients using acoustic spectra in three-dimensional photoacoustic microscopy of biological tissue” Opt Lett. 2010; 35(12): 2067-2069. <doi:10.1364/OL.35.002067>. |
Guo, et al., “CsxWO3 nanorods coated with polyelectrolyte multilayers as a multifunctional nanomaterial for bimodal imaging-guided photothermal/photodynamic cancer treatment,” Adv. Mater. 29, 1604157 (2017). |
Haas, J. et al., “Advances in Mid-Infrared Spectroscopy for Chemical Analysis,” Annu. Rev. Anal. Chem. 9 (2016) pp. 45-68. |
Hai, et al., “Near-infrared optical-resolution photoacoustic microscopy”, Opt. Lett. 39, 5192-5195 (Sep. 1, 2014). |
Hai, et al., “High-throughput, label-free, single-cell photoacoustic microscopy of intratumoral metabolic heterogeneity,” Nature Biomedical Engineering 3(5) 381-391 (May 2019). |
Hebden et al., “Enhanced time-resolved imaging with a diffusion model of photon transport” Optics Letters, vol. 19, No. 5, 1994, pp. 311-313. |
Hee, et al., “Femtosecond transillumination tomography in thick tissues” Optics Letters, vol. 18, No. 13, 1993, pp. 1107-1109. |
Hillman, et al., “Laminar optical tomography: demonstration of millimeter-scale depth-resolved imaging in turbid media,” Optics Letters, vol. 29, No. 14, Jul. 15, 2004, pp. 1650-1652. |
Hoelen, et al., “Three-Dimensional Photoacoustic Imaging of Blood Vessels in Tissue” Optics Letters, 1998, pp. 648-650, vol. 23, No. 8, Optical Society of America, USA. |
Hong, et al., “Simple Method to Produce Janus Colloidal Particles in Large Quantity” Langmuir 22, (2006) pp. 9495-9499. |
Hu, C., et al., “Soft Micro- and Nanorobotics,” Annu. Rev. Control. Robot. Auton. Syst. 1, (2018) pp. 53-75. |
Hu, W., et al., “Small-scale soft-bodied robot with multimodal locomotion,” Nature 554, 81-85, (2018). |
Hu, S. et al., “Three-dimensional optical-resolution photoacoustic microscopy,” Journal of Visualized Experiments 51 (2011). |
Hu, S., et al., “Label-free Photoacoustic Ophthalmic Angiography” Optics Letters, 35(1), Jan. 1, 2010, pp. 1-3. |
Huang, et al., “Aberration correction for transcranial photoacoustic tomography of primates employing adjunct image data,” Journal of Biomedical Optics, vol. 17, No. 6, Jun. 2012, pp. 066016-1 to 066016-8. |
Huang, et al., “Optical Coherence Tomography,” Science, New Series, vol. 254, No. 5035, Nov. 22, 1991, pp. 1178-1181. |
Huber, et al., “Three-Dimensional and C-Mode OCT Imaging with a Compact, Frequency Swept Laser Source at 1300 nm,” Optics Express, vol. 13, No. 26, Dec. 26, 2005, pp. 10523-10526. |
Imai, T. et al., “High-throughput ultraviolet photoacoustic microscopy with multifocal excitation,” Journal of Biomedical Optics 23(3), 036007 (Mar. 15, 2018). |
Ing, R. K., Quieffin, N., Catheline, S., Fink, M., “In solid localization of finger impacts using acoustic time-reversal process,” Appl. Phys. Lett. 87, 204104 (Nov. 14, 2005). |
Ji, M. et al., “Detection of human brain tumor infiltration with quantitative stimulated Raman scattering microscopy,” Sci. Transl. Med 7, 309ra163 (2015). |
Ji, T. et al. “Preparation, Characterization, and Application of Au-Shell/Polystyrene Beads and Au-Shell/Magnetic Beads” Adv. Mater. 13(16), Aug. 2001, pp. 1253-1256. |
Karamata, et al., “Multiple Scattering in Optical Coherence Tomography I Investigation and Modeling” Journal of Optical Society of America, vol. 22, No. 7 (2005) pp. 1369-1379. |
Karamata, et al., “Multiple scattering in optical coherence tomography. II. Experimental and theoretical investigation of cross talk in wide-field optical coherence tomography” J. Opt. Soc. Am. A/vol. 22, No. 7/Jul. 2005, pp. 1380-1388. |
Karshalev, E. et al., “Micromotor Pills as a Dynamic Oral Delivery Platform” American Chemical Society Nano, 2018, vol. 12, No. 8, pp. 8397-8405 <DOI: 10.1021/acsnano.8b03760>. |
Kim, C. et al., “In vivo molecular photoacoustic tomography of melanomas targeted by bio-conjugated gold nanocages” ACS Nano, 2010; 4(8), pp. 4559-4564. <doi:10.1021/nn100736c>. |
Kirch, J., et al., “Optical tweezers reveal relationship between microstructure and nanoparticle penetration of pulmonary mucus,” Proc. Natl. Acad. Sci. 109, (2012) pp. 18355-18360. |
Knoll, B. & Keilmann, F., “Near-field probing of vibrational absorption for chemical microscopy,” Nature 399, 134-137 (1999). |
Kole, M. R., et al., “Discrete frequency infrared microspectroscopy and imaging with a tunable quantum cascade laser,” Anal. Chem. 84, 10366-10372 (2012). |
Kolkman, et al., “In Vivo Photoacoustic Imaging of Blood Vessels Using an Extreme-Narrow Aperture Sensor” IEEE Journal of Selected Topics in Quantum Electronics, vol. 9, No. 2, Mar./Apr. 2003, pp. 343-346. |
Koziolek, et al., “Navigating the human gastrointestinal tract for oral drug delivery: Uncharted waters and new frontiers,” Adv. Drug Delivery Rev. 101, (2016) pp. 75-88. |
Kruger et al., “Photoacoustic Ultrasound (PAUS)-Reconstruction Tomography” Med. Phys., Oct. 1995, vol. 22 (10) Am. Assoc. Phys. Med., USA, pp. 1605-1609. |
Kruger, et al., “Thermoacoustic computed tomography—technical considerations” Medical Physics, 26(9): 1832-1837 (1999). |
Kruger et al., “Thermoacoustic computed tomography using a conventional linear transducer array,” Medical Physics, 30(5): 856-860 (2003). |
Kruger, et al., “Thermoacoustic Molecular Imaging of Small Animals,” Molecular Imaging, 2(2): 113-123 (2003). |
Kruger, et al., “Thermoacoustic CT: imaging principles,” Proc. SPIE 3916, (2000) pp. 150-160. |
Kruger, et al., “Breast Cancer in Vivo: Contrast Enhancement with Thermoacoustic CT at 434 MHz-Feasibility Study,” Radiology, 216(1): 279-283 (2000). |
Ku and Wang, “Scanning thermoacoustic tomography in biological tissue.” Medical physics 27.5 (2000): 1195-1202. |
Ku and Wang, “Scanning microwave-induced thermoacoustic tomography: Signal, resolution, and contrast,” Medical Physics, 28(1): 4-10 (2001). |
Ku, G. et al., “Multiple-bandwidth photoacoustic tomography,” Physics in Medicine & Biology, 49(7): 1329-1338 (2004). |
Ku and Wang, “Deeply penetrating photoacoustic tomography in biological tissues enhanced with an optical contrast agent,” Optics Letters, 30(5): 507-509 (2005). |
Ku, et al., “Imaging of tumor angiogenesis in rat brains in vivo by photoacoustic tomography,” Applied Optics, 44(5): 770-775 (2005). |
Ku, et al., “Thermoacoustic and Photoacoustic Tomography of Thick Biological Tissues Toward Breast Imaging,” Technology in Cancer Research & Treatment, 4(5): 559-566 (2005). |
Kunitz, M., “Crystalline desoxyribonuclease; isolation and general properties; spectrophotometric method for the measurement of desoxyribonuclease activity,” The Journal of General Physiology, vol. 33, Mar. 20, 1950, pp. 349-362. <URL:http://doi.org/10.1085/jgp.33.4.349>. |
Kuppusami, S. et al., “Parylene Coatings in Medical Devices and Implants: A Review” Universal Journal of Biomedical Engineering, 2015, vol. 3, No. 2, pp. 9-14 <DOI: 10.13189/ujbe.2015.030201>. |
Lai, S. et al., “Mucus-penetrating nanoparticles for drug and gene delivery to mucosal tissues,” Adv. Drug Deliv. Rev. 61(2), Feb. 27, 2009, pp. 158-171. <doi:10.1016/j.addr.2008.11.002>. |
Lai, P. et al., “Photoacoustically guided wavefront shaping for enhanced optical focusing in scattering media,” Nature Photonics 9 126-132 (Jan. 19, 2015). |
Lai, P. et al., “Dependence of optical scattering from Intralipid in gelatin-gel based tissue-mimicking phantoms on mixing temperature and time” Journal of Biomedical Optics, vol. 19, No. 3, Mar. 2014, pp. 035002-1-035002-6. |
Larina, et al., Real-time optoacoustic monitoring of temperature in tissues: Journal of Physics D: Applied Physics, vol. 38, (2005) pp. 2633-2639. |
Lasch, et al., “FT-IR spectroscopic investigations of single cells on the subcellular level,” Vibr. Spectrosc. 28, 147-157 (2002). |
Laser Institute of America, “American National Standard for the safe use of lasers,” American National Standards Institute (ANSI Z136.1-2007 Revision of ANSI Z136.1-2000). |
Leal, et al., “Physicochemical properties of mucus and their impact on transmucosal drug delivery,” Int. J. Pharm. 532, (2017) pp. 555-572. |
Lewis, E. N. et al., “Fourier transform spectroscopic imaging using an infrared focal-Plane array detector,” Anal. Chem. 67, 3377-3381 (1995). |
Leitgeb, et al., “Performance of fourier domain vs. time domain optical coherence tomography,” Optical Express, vol. 11, No. 8, Apr. 21, 2003, pp. 889-894. |
Li, et al., “An Enteric Micromotor Can Selectively Position and Spontaneously Propel in the Gastrointestinal Tract,” ACS Nano. 10(10), Oct. 25, 2016, pp. 9536-9542. <doi:10.1021/acsnano.6b04795>. |
Li, et al., “Autonomous Collision-Free Navigation of Microvehicles in Complex and Dynamically Changing Environments” ACS Nano, 11, (2017) pp. 9268-9275. |
Li, G., et al., “Reflection-mode multifocal optical-resolution photoacoustic microscopy,” J. Biomed. Opt. 18, 030501 (Feb. 12, 2013). |
Li, J. et al., “Micromotors Spontaneously Neutralize Gastric Acid for pH-Responsive Payload Release” ANGEWANDTE CHEMIE International Edition, vol. 56, No. 8, 2017, pp. 2156-2161. <DOI: 10.1002/anie.201611774>. |
Li, L., et al., “Small near-infrared photochromic protein for photoacoustic multi-contrast imaging and detection of protein interactions in vivo,” Nature Communications 9(1) 2734 (Jul. 16, 2018). |
Li, et al., “Single-impulse panoramic photoacoustic computed tomography of small-animal whole-body dynamics at high spatiotemporal resolution,” Nat Biomed Eng. 1(5) May 2017, pp. 1-11. <doi:10.1038/s41551-017-0071>. |
Li, L., et al., “Simultaneous Molecular and Hypoxia Imaging of Brain Tumors in Vivo Using Spectroscopic Photoacoustic Tomography,” Proceedings of the IEEE, 96(3): 481-489 (2008). |
Li, J. et al., “Micro/Nanorobots for Biomedicine: Delivery, Surgery, Sensing, and Detoxification” Sci Robot, 2(4), Mar. 15, 2017, pp. 1-20. <doi:10.1126/scirobotics.aam6431>. |
Li, Y. et al., “Multifocal photoacoustic microscopy through an ergodic relay (Conference Presentation)”, Proc. SPIE 10878, Photons Plus Ultrasound: Imaging and Sensing 2019, 108781C, presented Feb. 4, 2019, published Mar. 4, 2019, https://doi.org/10.1117/12.2513502. |
Li, et al., “Optical Coherence Computed Tomography,” Applied Physics Letters, vol. 91, American Institute of Physics, 2007, pp. 141107-1-141107-3. |
Li, et al., “Snapshot photoacoustic topography through an ergodic relay for high-throughput imaging of optical absorption,” Nature Photonics 14(3) (2020) pp. 164-170. <URL:https://doi.org/10.1038/s41566-019-0576-2>. |
Li, Z., et al., “Super-resolution far-field infrared imaging by photothermal heterodyne imaging,” The Journal of Physical Chemistry B, vol. 121 (2017) pp. 8838-8846. |
Li, Z., et al., “Super-resolution imaging with mid-IR photothermal microscopy on the single particle level,” in Proceedings of SPIE Physical Chemistry of Interfaces and Nano-materials XIV, vol. 9549, Aug. 20, 2015, pp. 954912-1-954912-8. |
Liang, et al., “Single-shot real-time femtosecond imaging of temporal focusing,” Light-Science & Applications 7(1) 42 (Aug. 8, 2018). |
Liang, et al., “Single-shot real-time video recording of a photonic Mach cone induced by a scattered light pulse,” Science Advances 3(1) e1601814 (Jan. 20, 2017). |
Liang, et al., “Single-shot ultrafast optical imaging,” Optica 5(9) 1113-1127 (Sep. 2018). |
Lin, et al., “Single-breath-hold photoacoustic computed tomography of the breast,” Nature Communications 9(1) 2352 (Jun. 15, 2018). |
Liu, et al., “Optical focusing deep inside dynamic scattering media with near-infrared time-reversed ultrasonically encoded (TRUE) light,” Nature Communications 6 5409 (Jan. 5, 2015). |
Liu, et al., “Label-free cell nuclear imaging by Grüneisen relaxation photoacoustic microscopy” Opt Lett. Feb. 15, 2018; 43(4), (2018) pp. 947-950. |
Lovell, et al., “Porphysome nanovesicles generated by porphyrin bilayers for use as multimodal biophotonic contrast agents,” Nature Materials 10(4) 324-32 (Mar. 20, 2011). |
Lu, F., et al., “Tip-enhanced infrared nanospectroscopy via molecular expansion force detection,” Nat. Photon. 8, 307-312 (2014). |
Lu, F.-K. et al., “Label-free DNA imaging in vivo with stimulated Raman scattering microscopy,” Proc. Natl Acad Sci. USA 112, 11624-11629 (2015). |
Ma, et al., “Time-reversed adapted-perturbation (TRAP) optical focusing onto dynamic objects inside scattering media,” Nature Photonics 8(12) 931-936 (Nov. 2, 2014). |
Manohar, et al., “Initial results of in vivo non-invasive cancer imaging in the human breast using near-infrared photoacoustics,” Optics Express, 15(19): 12277-12285 (2007). |
Maslov, et al., “In vivo dark-field reflection-mode photoacoustic microscopy,” Optics Letters 30(6), Mar. 15, 2005, pp. 625-627. |
Maslov, et al., “Optical-resolution photoacoustic microscopy for in vivo imaging of single capillaries,” Optics Letters, 33(9): 929-931 (2008). |
Maslov, et al., “Photoacoustic Imaging of biological tissue with Intensity-Modulated Continuous-Wave Laser,” Journal of Biomedical Optics, 2008, pp. 024006-1-024006-5, vol. 13(2), SPIE, USA. |
Medina-Sanchez, et al., “Medical microbots need better imaging and control,” Nature 545, (2017) pp. 406-408. |
Michaelian, Kirk H. Photoacoustic IR spectroscopy: instrumentation, applications and data analysis. Pub: John Wiley & Sons; Dec 1, 2010. <PREFACE ONLY>. |
Miller, et al., “Synchrotron-based biological microspectroscopy: From the mid-infrared through the far-infrared regimes,” Journal of Biological Physics 29, 219-230 (2003). |
Mishra et al., “Development and comparison of the DTM, the DOM and the FVM formulations for the short-pulse laser transport through a participating medium” International Journal of Heat and Mass Transfer, vol. 49 (2006) pp. 1820-1832. |
Montaldo, et al., “Building three-dimensional images using time-reversal chaotic cavity”, IEEE Trans. Ultrason. Ferroelectr. Freq. Control 52, pp. 1489-1497 (2005). |
Morgner et al., “Spectroscopic optical coherence tomography,” Optics Letters, vol. 25, No. 2, Jan. 15, 2000, pp. 111-113. |
Murray et al., “High-Sensitivity Laser-Based Acoustic Microscopy Using a Modulated Excitation Source,” Applied Physics Letters, vol. 85, No. 14, American Institute of Physics, USA., Oct. 4, 2004, pp. 2974-2976. |
Nakajima, et al., “Three-dimensional analysis and classification of arteries in the skin and subcutaneous adipofascial tissue by computer graphics imaging,” Plastic and Reconstructive Surgery, 102(3): 748-760 (1998). |
Nasiriavanaki, et al., “High-resolution photoacoustic tomography of resting-state functional connectivity in the mouse brain,” Proceedings of the National Academy of Sciences 111(1) 21-26 (Jan. 7, 2014). |
Nasse, M. J. et al., “High-resolution Fourier-transform infrared chemical imaging with multiple synchrotron beams,” Nat. Methods 8, 413-416 (2011). |
Nelson et al., “Imaging Glioblastoma Multiforme,” The Cancer Journal vol. 9, No. 2, Mar./Apr. 2003, pp. 134-145. |
Niederhauser et al., “Combined Ultrasound and Optoacoustic System for Real-Time High-Contrast Vascular imaging in Vivo,” IEEE Transactions on MedicalImaging, 24(4): 436-440 (2005). |
Nowak, D. et al., “Nanoscale chemical imaging by photoinduced force microscopy,” Sci. Adv. 2, Mar. 25, 2016, e1501571, pp. 1-9. |
Ntziachristos, V., “Going deeper than microscopy: the optical imaging frontier in biology” Nature Methods vol. 7, No. 8, Aug. 2010, pp. 603-614. |
Oraevsky et al., “Optoacoustic Tomography,” Biomedical Photonics Handbook, 2003, chapter 34: pp. 931-964, CRC Press LLC, USA. |
Oraevsky et al., “Ultimate Sensitivity of Time-Resolved Opto-Acoustic Detection,” Biomedical Optoacoustics, 2000, pp. 228-239, vol. 3916, SPIE, USA. |
Oraevsky et al., “Laser Optoacoustic Tomography of Layered Tissues: Signal Processing” Proceedings of SPIE, 2979: 59-70 (1997). |
Oraevsky et al., “Laser opto-acoustic imaging of the breast: Detection of cancer angiogenesis” Proceedings of SPIE, 3597: 352-363 (1999). |
Patel, et al., “Pulsed optoacoustic spectroscopy of condensed matter,” Rev. Mod. Phys., vol. 53 (1981) pp. 517-550. |
Paxton, et al., “Catalytic nanomotors: Autonomous movement of striped nanorods,” J. Am. Chem. Soc. 126, 13424-13431 (2004). |
Petrov, et al., “Optoacoustic, Noninvasive, Real-Time, Continuous Monitoring of Cerebral Blood Oxygenation: An in Vivo Study in Sheep” Anesthesiology, vol. 102, No. 1, Jan. 2005, pp. 69-75. |
Potter, et al., “Capillary diameter and geometry in cardiac and skeletal muscle studied by means of corrosion casts” Microvascular Research, 25(1): 68-84 (1983). |
Prati, et al., “New advances in the application of FTIR microscopy and spectroscopy for the characterization of artistic materials,” Accounts of Chemical Research, vol. 43, (2010) pp. 792-801. |
Prevedel, et al., “Simultaneous whole-animal 3D imaging of neuronal activity using light-field microscopy,” Nat. Methods 11, 727-730 (Jul. 2014). |
Quickenden, et al., “The ultraviolet absorption spectrum of liquid water,” J Chem. Phys. 72, 4416-4428 (1980). |
Razansky, et al., “Multispectral opto-acoustic tomography of deep-seated fluorescent proteins in vivo,” Nature Photonics 3, (2009) pp. 412-417. |
Robert et al., “Fabrication of Focused Poly(Vinylidene Fluoride-Trifluoroethylene) P(VDF-TrFE) Copolymer 40-50 MHz Ultrasound Transducers on Curved Surfaces,” Journal of Applied Physics, vol. 96, No. 1, Jul. 1, 2004, pp. 252-256. |
Rockley, M.G., “Fourier-transformed infrared photoacoustic spectroscopy of polystyrene film,” Chem. Phys. Lett. 68, 455-456 (1979). |
Rosenblum, et al., “Progress and challenges towards targeted delivery of cancer therapeutics” Nat. Commun. 9, (2018) 1410, pp. 1-12. |
Saager et al., “Direct characterization and removal of interfering absorption trends in two-layer turbid media” J. Opt. Soc. Am. A, vol. 22, No. 9, Sep. 2005, pp. 1874-1882. |
Sanchez, et al., “Chemically powered micro- and nanomotors,” Angew. Chem. Int. Ed. 54, (2015) pp. 1414-1444. |
Sakadzic, et al., “Correlation transfer and diffusion of ultrasound-modulated multiply scattered light,” Physical Review Letters 96(16) 163902—(1-4) (Apr. 28, 2006). |
Savateeva, et al., “Noninvasive detection and staging of oral cancer in vivo with confocal opto-acoustic tomography,” Biomedical Optoacoustics, vol. 3916, International Society for Optics and Photonics 2000, pp. 55-66. |
Schambach, et al., “Application of micro-CT in small animal imaging” Methods, vol. 50, No. 1, Jan. 2010, pp. 2-13. |
Schmidt, et al., “A 32-Channel Time Resolved Instrument for Medical Optical Tomography” Review of Scientific Instruments, vol. 71, No. 1, Jan. 2000, pp. 256-265. |
Schroeter, et al., “Spontaneous slow hemodynamic oscillations are impaired in cerebral microangiopathy,” Journal of Cerebral Blood Flow & Metabolism (2005) 25, pp. 1675-1684. |
Servant, et al., “Controlled in Vivo Swimming of a Swarm of Bacteria-Like Microrobotic Flagella” Advanced Materials 27, (2015) pp. 2981-2988. |
Sezer, et al., “Review of magnesium-based biomaterials and their applications,” J. Magnesium Alloys 6, (2018) pp. 23-43. |
Sethuraman et al., “Development of a combined intravascular ultrasound and photoacoustic imaging system” Proceedings of SPIE, 6086: 60860F.1-60860F.10 (2006). |
Sethuraman et al., “Intravascular photoacoustic imaging of atherosclerotic plaques: Ex vivo study using a rabbit model of atherosclerosis” Proceedings of SPIE, 6437: 643729.1-643729.9 (2007). |
Shah, J. et al, “Photoacoustic imaging and temperature measurement for photothermal cancer therapy,” Journal of Biomedical Optics, vol. 13, No. 3, (May/Jun. 2008) pp. 034024-1-034024-9. |
Sheth, et al., “Columnar Specificity of Microvascular Oxygenation and Volume Responses: Implications for Functional Brain Mapping,” The Journal of Neuroscience, vol. 24, No. 3, Jan. 21, 2004, pp. 634-641. |
Shi, J., et al., “High-resolution, high-contrast mid-infrared imaging of fresh biological samples with ultraviolet-localized photoacoustic microscopy,” Nature Photonics 13 609-615 (May 2019). |
Shmueli, et al., “Low Frequency Fluctuations in the Cardiac Rate as a Source of Variance in the Resting-State fMRI BOLD Signal,” Neuroimage, vol. 38, No. 2, Nov. 1, 2007, pp. 306-320. |
Silva, et al., “Toward Label-Free Super-Resolution Microscopy,” ACS Photon. 3, 79-86 (2016). |
Sim, et al., “In vivo Microscopic Photoacoustic Spectroscopy for Non-Invasive Glucose Monitoring Invulnerable to Skin Secretion Products,” Sci. Rep. 8, 1059 (2018). |
Siphanto et al., “Imaging of Small Vessels Using Photoacoustics: an in Vivo Study,” Lasers in Surgery and Medicine, vol. 35, Wiley-Liss, Inc., Netherlands, Dec. 20, 2004, pp. 354-362. |
Sitti, M., “Miniature soft robots—road to the clinic,” Nat. Rev. Mater, 3, (2018) pp. 74-75. |
Smith, et al., “Beyond C, H, O, and N! Analysis of the elemental composition of U.S. FDA approved drug architectures,” J. Med. Chem. 57, pp. 9764-9773 (2014). |
Sommer, A. J., et al., “Attenuated total internal reflection infrared mapping microspectroscopy using an imaging microscope,” Appl. Spectrosc. 55, 252-256 (2001). |
Song, et al., “Fast 3-D dark-field reflection-mode photoacoustic microscopy in vivo with a 30-MHz ultrasound linear array” Journal of Biomedical Optics, 13(5): 054028.1-054028.5 (2008). |
Song et al., “Multi-focal optical-resolution photoacoustic microscopy in vivo.” NIH Public Access Author Manuscript, May 13, 2011. pp. 1-7. |
Song, et al., “Section-illumination photoacoustic microscopy for dynamic 3D imaging of microcirculation in vivo” Optics Letters, 35(9): 1482-1484 (2010). |
Soppimath, et al., “Microspheres as floating drug-delivery systems to increase gastric retention of drugs” Drug Metab. Rev. 33, (2001) pp. 149-160. |
Steinbrink, et al., “Illuminating the BOLD signal: combined fMRI-fNIRS studies” Magnetic Resonance Imaging, vol. 24, No. 4, May 2006, pp. 495-505. |
Stern, MD., “In vivo evaluation of microcirculation by coherent light scattering,” Nature, 254(5495): 56-58 (1975). |
Tay, et al., “Magnetic Particle Imaging Guided Heating in Vivo using Gradient Fields for Arbitrary Localization of Magnetic Hyperthermia Therapy” ACS Nano. 12(4), Apr. 24, 2018, pp. 3699-3713. <doi:10.1021/acsnano.8b00893>. |
Tam, A. C., “Applications of photoacoustic sensing techniques,” Reviews of Modern Physics, vol. 58, No. 2, Apr. 1986, pp. 381-431. |
Tearney, et al., “Scanning single-mode fiber optic catheter-endos cope for optical coherence tomography” Optics Letters, 21(7): 543-545 (1996). |
Tran, et al., “In vivo endoscopic optical coherence tomography by use of a rotational microelectromechanical system probe” Optics Letters, 29(11): 1236-1238 (2004). |
Treeby B. E., et al., “Photoacoustic tomography in absorbing acoustic media using time reversal,” Inverse Probl. (2010) 26(11), pp. 1-20. |
Tu, et al., “Self-propelled supramolecular nanomotors with temperature-responsive speed regulation,” Nat. Chem. 9, 480 (2016). |
Van Essen, et al., “An Integrated Software Suite for Surface-based Analyses of Cerebral Cortex” Journal of the American Medical Informatics Association, vol. 8, No. 5, Sep./Oct. 2001, pp. 443-459. |
Velasco, E., “Ultrafast Camera Takes 1 Trillion Frames Per Second of Transparent Objects and Phenomena” [Webpage] Caltech, California Institute of Technology, Jan. 17, 2020, pp. 1-2. <URL:https://www.eurekalert.org/pub_releases/2020-01/ciot-uct012120.php>. |
Viator et al., “Design testing of an endoscopic photoacoustic probe for determination of treatment depth after photodynamic therapy” Proceedings of SPIE in Biomedical Optoacoustics II, 4256: 16-27 (2001). |
Vilela, et al., “Medical imaging for the tracking of micromotors,” ACS Nano 12, (2018) pp. 1220-1227. |
Wang, et al., “Ballistic 2-D Imaging Through Scattering Walls Using an Ultrafast Optical Kerr Gate,” Science, vol. 253, Aug. 16, 1991, pp. 769-771. |
Wang, et al., “Biomedical Optics, Principles and Imaging,” Wiley-Interscience, A John Wiley & Sons, Inc., (2007) p. 7. |
Wang, et al., “Fabrication of micro/nanoscale motors” Chem. Rev. 115, (2015) pp. 8704-8735. |
Wang, B. et al., “Recent progress on micro- and nano-robots: towards in vivo tracking and localization” Quantitative Imaging in Medicine and Surgery, 2018, vol. 8, No. 5, pp. 461-479. <DOI: 10.21037/qims.2018.06.07>. |
Wang, L. et al., “Grueneisen relaxation photoacoustic microscopy,” Physical Review Letters 113 174301 (Oct. 24, 2014). |
Wang, L. V. & Yao, J., “A practical guide to photoacoustic tomography in the life sciences,” Nat. Methods 13, 627-638 (Jul. 28, 2016). |
Wang, L. V., “Multiscale photoacoustic microscopy and computed tomography,” Nat. Photon. 3, 503-509 (Aug. 29, 2009). |
Wang, L. V.; “Mechanisms of ultrasonic modulation of multiply scattered coherent light: an analytic model,” Physical Review Letters 87(4) 043903-(1-4) (Jul. 23, 2001). |
Wang, L. V.; “Prospects of photoacoustic tomography,” Medical Physics 35(12), Nov. 19, 2008, pp. 5758-5767. |
Wang, L., et al., “Single-cell label-free photoacoustic flowoxigraphy in vivo,” Proceedings of the National Academy of Sciences 110(15) 5759-5764 (Apr. 9, 2013). |
Wang, L., et al., “Ultrasonically encoded photoacoustic flowgraphy in biological tissue,” Physical Review Letters 111(20), 204301 (Nov. 15, 2013). |
Wang, L.V., Hu, S. “Photoacoustic Tomography: in vivo imaging from organelles to organs,” Science 335, 1458-1462 (Mar. 23, 2012). |
Wang, X. D., et al., “Noninvasive laser-induced photoacoustic tomography for structural and functional in vivo imaging of the brain,” Nature Biotechnology 21(7) 803-806 (Jul. 2003). |
Wang, et al., “MCML—Monte Carlo modeling of light transport in multi-layered tissues” Computer Methods and Programs in Biomedicine, vol. 47, No. 2, Jul. 1995, pp. 131-146. |
Wang et al., “Three-dimensional laser-induced photoacoustic tomography of mouse brain with the skin and skull intact,” Optics Letters, 28(19): 1739-1741 (2003). |
Wang et al., “Noninvasive photoacoustic angiography of animal brains in vivo with near-infrared light and an optical contrast agent” Optics Letters, 29(7): 730-732 (2004). |
Wang, et al., “Intravascular Photoacoustic Imaging” IEEE J Quantum Electronics, 16(3): 588-599 (2010). |
Wang, et al., “Nano/microscale motors: biomedical opportunities and challenges,” ACS Nano 6, (2012) pp. 5745-5751. |
Wetzel, et al., “Imaging molecular chemistry with infrared microscopy,” Science, New Series, vol. 285, No. 5431, Aug. 20, 1999, pp. 1224-1225. |
Wong, T. et al., “Fast label-free multilayered histology-like imaging of human breast cancer by photoacoustic microscopy,” Sci. Adv. 3, 1602168 (May 17, 2017). |
Wong, T. et al., “Label-free automated three-dimensional imaging of whole organ by microtomy-assisted photoacoustic microscopy,” Nat. Comm. 8, (Nov. 9, 2017). |
Wu, Z., et al., “A microrobotic system guided by photoacoustic computed tomography for targeted navigation in intestines in vivo,” Science Robotics 4(32) eaax0613 (Jul. 24, 2019). |
Wu, D., et al., “In vivo Mapping of Macroscopic Neuronal Projections in the Mouse Hippocampus using High-resolution Diffusion MRI,” Neuroimage 125, Jan. 15, 2016, pp. 84-93. |
Xia, J., et al., “Photoacoustic tomography: principles and advances,” Electromagn. Waves 147, 1 (2014; available in PMC Jan. 30, 2015). |
Xia, J., et al., “Wide-field two-dimensional multifocal optical-resolution photoacoustic-computed microscopy,” Opt. Lett. 38(24), Dec. 15, 2013, pp. 5236-5239. |
Xu, et al., “Photoacoustic Imaging in Biomedicine,” Review of Scientific Instruments, American Institute of Physics, vol. 77 (2006) pp. 041101-1-041101-22. |
Xu, et al., “Rhesus monkey brain imaging through intact skull with thermoacoustic tomography,” IEEE Trans. Ultrason. Ferroelectr. Freq. Control, vol. 53, No. 3, Mar. 2006, pp. 542-548. |
Xu, M. H.; Wang, L. V.; “Time-domain reconstruction for thermoacoustic tomography in a spherical geometry,” IEEE Transactions on Medical Imaging 21(7) 814-822 (Jul. 2002). |
Xu, M. H.; Wang, L. V.; “Universal back-projection algorithm for photoacoustic computed tomography,” Physical Review E 71(1) 016706-(1-7) (Jan. 19, 2005). |
Xu, S., et al., “Thermal expansion of confined water,” Langmuir 25, 5076-5083 (2009). |
Xu, X. et al., “Time-reversed ultrasonically encoded optical focusing into scattering media,” Nature Photonics 5(3) 154-157 (Jan. 16, 2011). |
Xu, Y.; Wang, L. V.; “Time reversal and its application to tomography with diffracting sources,” Physical Review Letters 92(3) 033902-(1-4) (Jan. 23, 2004). |
Xu et al. “Time Reversal Ultrasound Modulated Optical Tomography Using a BSO Phase Conjugate Mirror,” poster presented at SPIE Conference 7177 on Jan. 26, 2009, 1 page. |
Yadlowsky, et al., “Multiple scattering in optical coherence microscopy” Applied Optics, vol. 34, No. 25 (1995) pp. 5699-5707. <doi.org/10.1364/AO.34.005699>. |
Yan, et al., “Multifunctional biohybrid magnetite microrobots for imaging-guided therapy,” Sci. Robot. 2, eaaq1155, Nov. 22, 2017, pp. 1-14. |
Yang, “Optical coherence and Doppler tomography for monitoring tissue changes induced by laser thermal therapy—An in vivo feasibility study” Review of Scientific Instruments, vol. 74, No. 1, Jan. 2003, p. 437-440. |
Yang, J. M. et al., “Simultaneous functional photoacoustic and ultrasonic endoscopy of internal organs in vivo,” Nature Medicine 18(8) 1297-1303 (Aug. 2012). |
Yang, J., et al., “Motionless volumetric photoacoustic microscopy with spatially invariant resolution,” Nature Communications 8(1) 780 (Oct. 3, 2017). |
Yang, et al., “Novel biomedical imaging that combines intravascular ultrasound (IVUS) and optical coherence tomography (OCT)” IEEE International Ultrasonics Symposium, Beijing, China, Nov. 2-5, 2008, pp. 1769-1772. |
Yang, et al., “Time-reversed ultrasonically encoded optical focusing using two ultrasonic transducers for improved ultrasonic axial resolution” Journal of Biomedical Optics 18(11), 110502 (Nov. 2013) pp. 110502-1-110502-4. |
Yang, et al., “The grand challenges of science robotics,” Science Robotics 3, Jan. 31, 2018, eaar7650, pp. 1-14. |
Yang, J.M., et al., “Focusing light inside live tissue using reversibly switchable bacterial phytochrome as a genetically encoded photochromic guide star” Science Advances 5(12) (2019) pp. 1-9. |
Yao, et al., “Monte Carlo simulation of an optical coherence tomography signal in homogeneous turbid media” Phys. Med. Biol. 44(9), Jul. 8, 1999, pp. 2307-2320. |
Yao, et al., “Absolute photoacoustic thermometry in deep tissue,” Opt. Lett. 38, 5228-5231 (2013). |
Yao, et al., “In vivo label-free photoacoustic microscopy of cell nuclei by excitation of DNA and RNA,” Opt. Lett. 35, 4139-4141 (2010). |
Yao, et al., “Optimal ultraviolet wavelength for in vivo photoacoustic imaging of cell nuclei,” J Biomed. Opt. 17, 056004 (2012). |
Yao, et al., “Photoimprint photoacoustic microscopy for three-dimensional label-free sub-diffraction imaging,” Physical Review Letters 112(1) 014302 (Jan. 10, 2014). |
Yao, L. et al., “Multiscale photoacoustic tomography using reversibly switchable bacterial phytochrome as near-infrared photochromic probe,” Nature Methods 13(1) 67-73 (Jan. 2016). |
Yao, L. et al., “High-speed label-free functional photoacoustic microscopy of mouse brain in action,” Nat. Methods 12(5), 407-410 (May 12, 2015). |
Yao, L. et al., “Photoacoustic microscopy: superdepth, superresolution, and superb contrast”, IEEE Pulse 6, 34-7 (May 13, 2015). |
Yaqoob, et al., “Methods and application areas of endoscopic optical coherence tomography” Journal of Biomedical Optics, 11(6): 063001.1-063001.19 (2006). |
Yavuz, M. S., et al., “Gold nanocages covered by smart polymers for controlled release with near-infrared light,” Nature Materials 8(12) 935-939 (Nov. 1, 2009). |
Yin, et al., “Agarose particle-templated porous bacterial cellulose and its application in cartilage growth in vitro” Acta Biomater. Jan. 12, 2015, pp. 129-138. <doi:10.1016/j.actbio.2014.10.019>. |
Yodh et al., “Functional Imaging with Diffusing Light” Biomedical Photonics Handbook, 2003, Ch. 21, pp. 45, CRC Press, Boca Raton. |
Yodh, et al. “Spectroscopy and Imaging with Diffusing Light” Physics Today 48(3), Mar. 1995, pp. 34-40. |
Zeff, et al., “Retinotopic mapping of adult human visual cortex with high-density diffuse optical tomography” PNAS, vol. 104, No. 29, Jul. 17, 2007, pp. 12169-12174. |
Zemp, et al., “Realtime photoacoustic microscopy in vivo with a 30MHZ ultrasonic array transducer” Optics Express, 16(11): 7915-7928 (2008). |
Zhang, C., et al., “Coherent Raman scattering microscopy in biology and medicine,” Annu. Rev. Biomed. Eng. 17, 415-445 (2015). |
Zhang, D. et al., “Depth-resolved mid-infrared photothermal imaging of living cells and organisms with submicrometer spatial resolution,” Sci. Adv. 2, e1600521 (2016). |
Zhang, H. F. et al., “Functional photoacoustic microscopy for high-resolution and noninvasive in vivo imaging,” Nature Biotechnology 24(7) 848-851 (Jul. 2006). |
Zhang, H. F. et al., “In vivo imaging of subcutaneous structures using functional photoacoustic microscopy,” Nature Protocols 2(4) 797-804 (Apr. 5, 2007). |
Zhang, et al., “Intrinsic Functional Relations Between Human Cerebral Cortex and Thalamus” Journal of Neurophysiology, vol. 100, No. 4, Oct. 2008, pp. 1740-1748. |
Zharov, et al., “In vivo photoacoustic flow cytometry for monitoring of circulating single cancer cells and contrast agents,” Optics Letters, 31(24): 3623-3625 (2006). |
Zou, et al., “BOLD responses to visual stimulation in survivors of childhood cancer” NeuroImage, vol. 24, No. 1, Jan. 1, 2005, pp. 61-69. |
U.S. Appl. No. 16/946,496, filed Jun. 24, 2020, Gao et al. |
U.S. Appl. No. 17/090,752, filed Nov. 5, 2020, Wang et al. |
International Search Report and Written Opinion of the International Searching Authority regarding PCT/US2018/032007 dated Aug. 9, 2018; pp. 1-7. |
Notice of Allowance dated Jan. 26, 2021 issued in U.S. Appl. No. 14/436,581. |
Notice of Allowance dated Feb. 2, 2021 issued in U.S. Appl. No. 16/372,597. |
International Preliminary Report on Patentability dated Feb. 25, 2021, issued in Application No. PCT/US2019/046574. |
International Preliminary Report on Patentability dated Mar. 18, 2021, issued in Application No. PCT/US2019/049594. |
International Search Report and Written Opinion dated Mar. 2, 2021 issued in PCT/US2020/059214. |
Arridge, et al., “Accelerated high-resolution photoacoustic tomography via compressed sensing,” arXiv preprint arXiv:1605.00133, 2016, pp. 8908-8940. |
Cox, et al., “Artifact trapping during time reversal photoacoustic imaging for acoustically heterogeneous media,” IEEE Trans. Med. Imaging, vol. 29, No. 2, (2010) pp. 387-396. |
Deán-Ben, et al., “Functional optoacoustic neuro-tomography for scalable whole-brain monitoring of calcium indicators,” Light Sci. Appl., vol. 5, No. 12, p. e16201, 2016, pp. 1-7. |
Deán-Ben, et al., “Portable spherical array probe for volumetric real-time optoacoustic imaging at centimeter-scale depths,” Opt. Express, vol. 21, No. 23, 2013, pp. 28062-28071. |
Deserno, M., “How to generate equidistributed points on the surface of a sphere,” Polym. Ed, p. 99, 2004, p. 1. |
Han, Y. et al., “Three-dimensional optoacoustic reconstruction using fast sparse representation,” Opt. Lett., vol. 42, No. 5, (2017) pp. 979-982. |
Han, et al., “Optoacoustic image reconstruction and system analysis for finite-aperture detectors under the wavelet-packet framework,” J. Biomed. Opt., vol. 21, No. 1, Jan. 2016, pp. 016002-1-016002-9. |
Huang, et al., “Full-wave iterative image reconstruction in photoacoustic tomography with acoustically inhomogeneous media,” IEEE Trans. Med. Imaging, vol. 32, No. 6, Jun. 2013, pp. 1097-1110. |
Kruger, R. A., et al., “Dedicated 3D photoacoustic breast imaging,” Med. Phys., vol. 40, No. 11, 2013, pp. 113301-1-113301-8. |
Matthews, et al., “Parameterized Joint Reconstruction of the Initial Pressure and Sound Speed Distributions for Photoacoustic Computed Tomography,” SIAM J. Imaging Sci., vol. 11, No. 2, (2018) pp. 1560-1588. |
Matsumoto, et al., “Label-free photoacoustic imaging of human palmar vessels: a structural morphological analysis,” Sci. Rep., vol. 8, No. 1, (2018) p. 786. |
Mitsuhashi, et al., “A forward-adjoint operator pair based on the elastic wave equation for use in transcranial photoacoustic computed tomography,” SIAM J. Imaging Sci., vol. 10, No. 4, 2017, pp. 2022-2048. |
Mitsuhashi, et al., “Investigation of the far-field approximation for modeling a transducer's spatial impulse response in photoacoustic computed tomography,” Photoacoustics, vol. 2, No. 1, 2014, pp. 21-32. |
Ogunlade, et al., “In vivo three-dimensional photoacoustic imaging of the renal vasculature in preclinical rodent models,” Am. J. Physiol.-Ren. Physiol., vol. 314, No. 6, (2018) pp. F1145-F1153. |
Pramanik, M., “Improving tangential resolution with a modified delay-and-sum reconstruction algorithm in photoacoustic and thermoacoustic tomography,” JOSA A, vol. 31, No. 3, (2014) pp. 621-627. |
Scholte, et al., “On spatial sampling and aliasing in acoustic imaging” 12th Intern. congress on sound and vibration, Lisbon, Portugal (2005) pp. 1-8. |
Schoeder, et al., “Optoacoustic image reconstruction: the full inverse problem with variable bases,” Proc. R. Soc. A, vol. 474, No. 2219, (2018) pp. 1-20. |
Treeby, et al., “k-Wave: MATLAB toolbox for the simulation and reconstruction of photoacoustic wave fields,” J. Biomed. Opt., vol. 15, No. 2, Mar./Apr. 2010, p. 021314. |
Treeby, et al., “Advanced photoacoustic image reconstruction using the k-Wave toolbox,” in Photons Plus Ultrasound: Imaging and Sensing 2016, 2016, vol. 9708, p. 97082P. |
Tzoumas, et al., “Eigenspectra optoacoustic tomography achieves quantitative blood oxygenation imaging deep in tissues,” Nat. Commun., vol. 7, 2016, pp. 1-10. |
Wang et al., “Biomedical optics: principles and imaging,” Section 12.5; Photoacoustic Tomography, John Wiley & Sons (2012) pp. 288-290. |
Wang, K. et al., “Investigation of iterative image reconstruction in three-dimensional optoacoustic tomography,” Phys. Med. Biol., vol. 57, No. 17, 2012, pp. 5399-5423. |
Xu, et al., “Exact frequency-domain reconstruction for thermoacoustic tomography—II: Cylindrical geometry,” IEEE Trans. Med. Imaging, vol. 21, No. 7, (2002) pp. 829-833. |
Zhou, et al., “Tutorial on photoacoustic tomography,” J. Biomed. Opt., vol. 21, No. 6, Jun. 2016, pp. 061007-1-061007-14. |
U.S. Appl. No. 17/302,313, filed Apr. 29, 2021, Wang et al. |
U.S. Appl. No. 17/302,041, filed Apr. 22, 2021, Wang et al. |
Notice of Allowance dated Jun. 23, 2021 issued in U.S. Appl. No. 15/037,468. |
Duan, T. et al., “Hybrid Multi-wavelength Photoacoustic Imaging”, 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Jul. 18, 2018, pp. 4804-4807. |
EP Office Action dated May 11, 2022, in Application No. EP19849860.2. |
Extended European Search Report dated Apr. 22, 2022, in Application No. 19849860.2. |
Extended European Search Report dated May 23, 2022, in Application No. EP19857631.6. |
International Preliminary Report on Patentability dated Jan. 6, 2022 in PCT Application No. PCT/US2020/070174. |
International Preliminary Report on Patentability dated May 19, 2022, in PCT Application No. PCT/US2020/059214. |
International Preliminary Report on Patentability dated Sep. 2, 2021, issued in Application No. PCT/US2020/019368. |
Li, Y. et al., “Multifocal Photoacoustic Microscopy Using a Single-element Ultrasonic Transducer Through an Ergodic Relay”, Light: Science & Applications, Jul. 31, 2020, vol. 9, No. 135, pp. 1-7. |
Notice of Allowance dated Jan. 5, 2022 issued in U.S. Appl. No. 16/540,936. |
U.S. Notice of Allowance dated Oct. 19, 2022 in U.S. Appl. No. 16/560,680. |
U.S. Corrected Notice of Allowance dated Nov. 14, 2022 in U.S. Appl. No. 16/540,936. |
U.S. Corrected Notice of Allowance dated Oct. 26, 2022 in U.S. Appl. No. 16/560,680. |
U.S. Corrected Notice of Allowance dated Apr. 27, 2022 in U.S. Appl. No. 16/540,936. |
U.S. Corrected Notice of Allowance dated Jun. 2, 2022 in U.S. Appl. No. 16/806,796. |
U.S. Non-Final Office Action dated Aug. 26, 2022 in U.S. Appl. No. 17/302,313. |
U.S. Non-Final Office Action dated May 2, 2022 in U.S. Appl. No. 16/798,204. |
U.S. Notice of Allowance dated Apr. 19, 2022 in U.S. Appl. No. 16/540,936. |
U.S. Notice of Allowance dated Aug. 5, 2022 in U.S. Appl. No. 16/540,936. |
U.S. Notice of Allowance dated Feb. 23, 2022 in U.S. Appl. No. 16/806,796. |
U.S. Office Action dated Apr. 7, 2022, in U.S. Appl. No. 16/560,680. |
U.S. Requirement for Restriction dated Oct. 29, 2021 in U.S. Appl. No. 16/560,680. |
Yao, J. et al., “Double-illumination Photoacoustic Microscopy”, Optics Letters, Feb. 15, 2012, vol. 37, No. 4, pp. 659-661. |
U.S. Final Office Action dated Jan. 27, 2023 in U.S. Appl. No. 16/798,204. |
U.S. Non-Final Office Action dated Jan. 23, 2023 in U.S. Appl. No. 17/302,313. |
U.S. Non-Final Office Action dated Mar. 20, 2023 in U.S. Appl. No. 17/302,041. |
U.S. Notice of Allowance dated Jan. 26, 2023 in U.S. Appl. No. 16/560,680. |
Number | Date | Country
---|---|---
20210010976 A1 | Jan 2021 | US

Number | Date | Country
---|---|---
62503997 | May 2017 | US