A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner, East Carolina University of Greenville, N.C., has no objection to the reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
The inventive concept relates generally to visualization of organs and/or tissue and, more particularly, to visualization of anatomical structures, blood flow and perfusion.
Visible light imaging lends itself well to detailed anatomic visualization of a surface of organs and/or tissue for medical purposes. However, visible light imaging is not as useful for real-time imaging of physiology, particularly the physiology and pathophysiology of blood flow and perfusion. Near Infra-Red (NIR) imaging, on the other hand, can be used to visualize the surface of anatomic structures of target organs and/or tissue, but is substantially inferior to visible light anatomic imaging. Improved techniques for visualization of the organs and/or tissues are desired.
Some embodiments of the inventive concept provide methods for combining anatomical data and physiological data on a single image, the methods including obtaining an image of a sample, the image of the sample including anatomical structure of the sample; obtaining a physiologic map of blood flow and perfusion of the sample; and combining the anatomical structure of the image and the physiologic map of the sample into a single image of the sample. The single image of the sample displays anatomy and physiology of the sample in the single image in real time. At least one of the obtaining an image, obtaining a physiologic map and combining is performed by at least one processor.
In further embodiments, obtaining may include obtaining at least one of a raw near-infrared (NIR) image having a wavelength of from about 780 nm to about 2500 nm and a visible light image having a wavelength of from about 400 nm to about 700 nm.
In still further embodiments, combining the anatomical structure of the image and the physiologic map of the sample into a single image may include adjusting one or more properties of the image and/or the physiologic map. The one or more properties may include at least one of colorization, transparency and a weight function. The physiologic map may illustrate blood flow and perfusion, flow distribution, velocity, and/or volume rate of blood flow (cc/min) quantification in primary vessels based on fluid dynamic modeling.
In some embodiments, combining may further include creating an 8 bit RGB color image represented by the following equation:
Img(i,j)=ImgA(i,j)×RGB(i,j)
wherein ImgA(i,j) is an 8 bit gray scale visible image of the target tissue/organ, wherein i and j are pixel indexes along horizontal and vertical directions, respectively, and ImgA(i,j) for each color channel is adjusted separately to achieve a desired visualization effect.
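The combination in the equation above can be sketched as a per-pixel, per-channel multiplication of the grayscale anatomical image with a colorized physiologic map. The function name, the [0, 1] scaling of the physiologic map, and the per-channel gain parameters below are illustrative assumptions, not details from the disclosure:

```python
import numpy as np

def combine_vfa(img_a, rgb_map, gains=(1.0, 1.0, 1.0)):
    """Img(i,j) = ImgA(i,j) x RGB(i,j): multiply the 8-bit grayscale
    anatomical image into each channel of the colorized physiologic map.

    img_a:   8-bit grayscale anatomical image, shape (H, W).
    rgb_map: colorized physiologic map, shape (H, W, 3), values in [0, 1]
             (an assumed scaling for this sketch).
    gains:   hypothetical per-channel adjustment factors; the disclosure
             only says each channel may be adjusted separately.
    Returns an 8-bit RGB VFA image.
    """
    a = img_a.astype(np.float64)[..., None]   # (H, W, 1), broadcast over RGB
    g = np.asarray(gains, dtype=np.float64)   # (3,)
    out = a * rgb_map * g
    return np.clip(out, 0, 255).astype(np.uint8)
```

In use, brighter anatomical pixels pass the physiologic color through at full strength, while dark anatomical regions suppress it, so vascular structure and flow color appear in one image.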
In further embodiments, the sample may be one of tissue and an organ. Obtaining the image may include obtaining the image including anatomical structure of the vasculature of at least one of the tissue and the organ.
In still further embodiments, obtaining the image may be preceded by illuminating the sample with at least one light source. A portion of light may be reflected from the at least one light source to obtain the image and the physiologic map during a single data acquisition.
In some embodiments, obtaining a physiologic map of the sample may include obtaining one of a blood flow and perfusion physiologic map from one or more images using laser speckle imaging (LSI); a blood flow and perfusion physiologic map from one or more images using laser Doppler imaging (LDI); and a blood flow and perfusion angiography resemblance from a fluorescence image.
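As a rough illustration of how a blood flow and perfusion map might be derived from raw speckle frames, the standard spatial laser speckle contrast statistic K = sigma/mu can be computed over a sliding window, with 1/K^2 serving as a common relative flow index (lower contrast corresponds to faster flow). This is a generic LSI sketch, not the specific algorithm of the incorporated publications; the window size and function names are assumptions:

```python
import numpy as np

def speckle_contrast_map(raw, window=7):
    """Spatial laser speckle contrast K = sigma/mu over a sliding window.

    raw: a single raw NIR speckle frame as a 2-D array.
    Returns a (H-window+1, W-window+1) contrast map.
    """
    # View every window x window patch without copying the data.
    patches = np.lib.stride_tricks.sliding_window_view(
        raw.astype(np.float64), (window, window))
    mu = patches.mean(axis=(-2, -1))
    sigma = patches.std(axis=(-2, -1))
    return sigma / np.maximum(mu, 1e-12)

def relative_flow_map(raw, window=7):
    """Relative flow index 1/K**2: higher values mean faster flow."""
    k = speckle_contrast_map(raw, window)
    return 1.0 / np.maximum(k, 1e-6) ** 2
```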
In further embodiments, the method may further include combining a plurality of images with a corresponding plurality of physiologic maps to provide a video displaying anatomy and physiology of the sample in real time.
Still further embodiments of the present inventive concept provide computer systems for combining anatomical data and physiological data on a single image, the systems including a processor and a memory coupled to the processor, the memory comprising computer readable program code that, when executed by the processor, causes the processor to perform operations including obtaining an image of a sample, the image of the sample including anatomical structure of the sample; obtaining a physiologic map of blood flow and perfusion of the sample; and combining the anatomical structure of the image and the physiologic map of the sample into a single image of the sample. The single image of the sample displays anatomy and physiology of the sample in the single image in real time.
Some embodiments of the present inventive concept provide computer program products for combining anatomical data and physiological data on a single image, the computer program product including a non-transitory computer readable storage medium having computer readable program code embodied in the medium, the computer readable program code comprising computer readable program code to obtain an image of a sample, the image of the sample including anatomical structure of the sample; computer readable program code to obtain a physiologic map of blood flow and perfusion of the sample; and computer readable program code to combine the anatomical structure of the image and the physiologic map of the sample into a single image of the sample. The single image of the sample displays anatomy and physiology of the sample in the single image in real time.
Specific example embodiments of the inventive concept now will be described with reference to the accompanying drawings. This inventive concept may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the inventive concept to those skilled in the art. In the drawings, like numbers refer to like elements. It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. As used herein the term “and/or” includes any and all combinations of one or more of the associated listed items.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the inventive concept. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms “includes,” “comprises,” “including” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this inventive concept belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As discussed above, both visible light imaging and near-infrared (NIR) imaging fall short in one or more areas of visualization, either anatomical or blood flow/perfusion. Accordingly, some embodiments of the present inventive concept combine visualization of anatomic structures with physiologic functionality derived from image data, for example, raw image data from the NIR spectrum of any open tissue/organ. In particular, some embodiments of the inventive concept combine an anatomical image obtained using NIR imaging, visible light imaging and the like with structural details related to blood flow/perfusion to provide a new image/video for presentation in real-time. The blood flow/perfusion data may be provided by, for example, Laser Speckle Imaging (LSI) or Laser Doppler Imaging (LDI) technology or, in some embodiments, fluorescence imaging. Details of provision of the blood flow/perfusion data using LSI are discussed in, for example, commonly assigned U.S. Patent Publication Nos. 2013/0223705 and 2013/0245456, the contents of which are hereby incorporated herein by reference as if set forth in their entirety. It will be understood that embodiments of the present inventive concept are not limited to LSI, LDI and/or fluorescence imaging; any image form that represents blood flow and perfusion physiology may be used. In particular, the blood flow/perfusion data may be provided by any effective method that lends itself to embodiments discussed herein without departing from the scope of the present inventive concept.
Some embodiments of the present inventive concept provide a new image/video visualization for presentation and real-time evaluation and assessment of an anatomical-physiological result. In other words, the new image provides both a usable anatomic image, provided by, for example, NIR or visible light imaging, and blood flow and perfusion information on the same image, which can be manipulated in real-time. Thus, the new visualization, referred to hereinafter as a Velocity-Flow-Anatomy (VFA) image or video, contains information of both anatomic structure and blood flow and perfusion physiology simultaneously in real time.
The VFA image/video in accordance with some embodiments of the present inventive concept combines (1) highly specific anatomic detail with (2) underlying physiologic processes sufficient to make real-time medical decisions. An NIR/visible light image is used as one layer of the final visualization (VFA image), which reveals anatomical structure of the targeting tissue/organ surface and vasculature. The physiologic map of blood flow and perfusion quantified by, for example, LSI, LDI and fluorescence technology, is used as another layer of the final VFA visualization. The physiologic map provides functionality and physiology of the targeted tissue/organ vasculature. As used herein, the term “physiologic map” refers to maps generated by different types of imaging; for example, LSI and LDI may generate a “velocity map,” but the term “physiologic map” may generally refer to a map resulting from the use of any imaging technology. For example, a physiologic map may illustrate blood flow and perfusion, flow distribution, velocity, and/or volume rate of blood flow (cc/min) quantification in primary vessels based on fluid dynamic modeling, and/or any combination thereof, without departing from the scope of the present inventive concept.
Both aspects of normal physiology of blood flow and perfusion and pathophysiological manifestations of abnormalities of blood flow and perfusion in tissues/organs may be provided. Some embodiments of the inventive concept provide software algorithms configured to adjust multiple aspects of each of the layers, for example, the colorization and transparency of the layers. In some embodiments, each of the layers may be derived from a same single video acquisition of raw NIR data/visible light image.
Some embodiments of the present inventive concept may provide distinct advantages over conventional visualization methods. For example, embodiments of the present inventive concept may provide substantially improved anatomic fidelity of the NIR image/visible light image. Furthermore, the anatomy layer may provide an important context for velocity imaging. The improved anatomic fidelity in turn improves the velocity fidelity, the velocity data interpretation, the timing of the interpretation, and the understanding of the interpretation; all of which make the interpretation of the VFA image/video result more intuitive. The anatomic fidelity allows for simultaneous, real-time assessment of multiple levels of analysis, such as target epicardial coronary arteries (flow) and surrounding myocardial tissue (perfusion). Finally, the combination of anatomy and physiology provided by embodiments of the inventive concept may be useful in convincing users, i.e., surgeons in surgical procedures, that the functional data is real and accurately represents the underlying physiology and/or pathophysiology. The VFA image/video combines and displays the unknown, i.e., quantification of perfusion, with the known, i.e., anatomy, where the anatomy component provides a useful frame of reference as will be discussed further herein with respect to
Referring first to
As illustrated in
In particular, in some embodiments, the light source unit 120 may be provided by, for example, one or more lasers or light emitting diode (LED) lights. In some embodiments, the light source 120 is an NIR light source having a wavelength of from about 780 nm to about 2500 nm. In some embodiments, the light source 120 may be a visible light source having a wavelength of from about 400 nm to about 780 nm. In some embodiments, both a visible light source and an NIR light source may be used, each having its respective wavelength. Thus, some embodiments of the present inventive concept provide a system that uses two wavelengths of differential transmittance through a sample to apply LSI and/or LDI. For example, a first of the two wavelengths may be within the visible range that has zero or very shallow penetration, such as blue light (450-495 nm). This wavelength captures the anatomical structure of the tissue/organ surface and serves as a position marker of the sample, but not the subsurface movement of blood flow and perfusion. A second wavelength may be in the near infra-red (NIR) range, which has much deeper penetration. This wavelength reveals the underlying blood flow physiology and correlates both to the motion of the sample and to the movement of blood flow and perfusion. Using the imaging measurement of the visible light as a baseline, the true motion of blood flow and perfusion can be derived from the NIR imaging measurement without being affected by the motion artifact of the target. Furthermore, the anatomical structure information captured by visible light and the physiological characteristics measured by NIR light are combined. Details with respect to systems using two wavelengths are discussed in detail in U.S. Provisional Application No. 62/136,010, filed Mar. 20, 2015, the disclosure of which was incorporated herein by reference above.
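The motion-baseline idea described above can be sketched as a simple per-pixel subtraction: the shallow visible-wavelength measurement tracks only surface motion, and removing it from the deeper NIR measurement leaves an estimate of the subsurface blood flow signal. The function and variable names below are a hypothetical illustration, not the patented correction algorithm:

```python
import numpy as np

def corrected_flow(nir_velocity, visible_velocity):
    """Estimate true blood flow/perfusion by removing the motion baseline.

    nir_velocity:     per-pixel velocity map from the NIR wavelength,
                      mixing surface motion with subsurface blood flow.
    visible_velocity: per-pixel velocity map from the visible wavelength,
                      assumed to capture surface (position-marker) motion only.
    The subtraction and the clamp at zero are simplifying assumptions
    for illustration.
    """
    return np.clip(np.asarray(nir_velocity, dtype=np.float64)
                   - np.asarray(visible_velocity, dtype=np.float64),
                   0.0, None)
```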
Although embodiments are discussed herein with respect to NIR raw images and visible light images, embodiments of the present inventive concept are not limited to this configuration. Any other image form that can adequately represent anatomy can be used without departing from the scope of the present inventive concept.
The light source 120 may be used to illuminate a region of interest 140 (hereinafter “tissue/organ”). As used herein, the “region of interest” refers to the region of the subject that is being imaged, for example, the principal vessels and tissue, organs, etc. When incident light 127, for example, NIR light or visible light, from the source 120 is directed to a living target (region of interest 140), such as a tissue/organ, part of the light will go through multiple scattering inside the target and eventually reflect back (Reflecting light) to the camera 130 as shown in
The camera 130 is configured to collect the reflected light and provide a visible light or NIR image (NIR/Visible Layer 115), each with different characteristics depending, for example, upon the depth of penetration of the illumination light determined by the wavelength energy. In some embodiments, the camera 130 is provided by a Lumenera Lt225 NIR CMOS camera for single wavelength (visible/NIR) image acquisition. For simultaneous multiple wavelength image acquisition applications, a custom-designed beam splitting system may be located in front of the regular camera body.
The reflected NIR/Visible image 115 reveals an anatomical structure. In some embodiments, this anatomical structure may be multiple millimeters beneath the surface of the tissue/organ, depending on the penetration into the tissue, which is a function of wavelength and energy. The resulting unmodified image presentation (the raw NIR image of
Moreover, since the NIR image of
Accordingly, as illustrated in
The VFA image presentation of the LSI analysis creates a methodology for presenting the velocity data within a framework already known and understood by surgeons and medical imagers, thus making the interpretation of the novel flow and perfusion data more readily understandable and usable for decision-making. The anatomic detail provides the frame of reference to combine the known (anatomy) with the unknown or measured (flow and perfusion). The anatomic detail improves the accuracy of interpretation and understanding of the flow and perfusion data, in both physiologic and pathophysiologic circumstances. This is particularly true when the imaging technology is challenged to provide two different ‘levels’ of velocity (and flow) data, for example, when the epicardial surface of the heart is imaged to measure flow in the epicardial coronaries (level 1) and perfusion in the surrounding myocardium (level 2).
In some embodiments of the inventive concept, both the anatomic detail and the physiologic map analysis can be derived from a same single raw NIR image data/visible light image. Thus by combining these anatomic and analysis data, and using algorithms in accordance with embodiments of the present inventive concept to adjust, for example, the colorization, transparency, superposition and integration of the data, a new VFA analysis image 135 can be generated to contain both anatomical (vascular structure) and functional (blood flow and perfusion) information of the tissue/organ as illustrated in, for example,
In particular,
Referring now to
It will be understood that although generation of a single VFA image has been discussed herein, embodiments of the present inventive concept are not limited to this configuration. For example, a series of VFA images may be generated and may be assembled into a VFA video image sequence without departing from the scope of the present inventive concept.
Embodiments of the present inventive concept may be applied to the determined acquisition of blood flow and perfusion data from any tissue and/or organ system where blood flow and perfusion are an important determinant for evaluation, measurement, clinical decision-making, therapeutic decision-making, product development using physiologic imaging data derived from this technique, or experimental investigation into the physiology and/or pathophysiology of blood flow and perfusion.
Referring now to
Referring now to
As shown in
As illustrated in
Although the data 356 illustrated in
As further illustrated in
Furthermore, while the adjustment module 351, the image capture module 352, the NIR/Visible module 353 and the VFA processing module 354 are illustrated in a single data processing system, as will be appreciated by those of skill in the art, such functionality may be distributed across one or more data processing systems. Thus, the present inventive concept should not be construed as limited to the configuration illustrated in
As discussed above with respect to
In some embodiments, a solid color, for example, black, is used as the base at the bottom; a physiological image or its adjusted form is used as a layer on top of the base; an anatomical image or its adjusted form is used to modify the transparency of the physiological layer, so the anatomically less significant part (lower intensity in the anatomical image) will make the physiological image more transparent and, thus, less visible.
In further embodiments, a solid color, for example, black, is used as the base at the bottom; an anatomical image or its adjusted form is used as a layer on top of the base; and the physiological image or its adjusted form is used to modify the transparency of the anatomical layer, so the physiologically less significant part (lower value in the physiological image) will make the anatomical image more transparent and, thus, less visible.
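The two layering schemes above amount to per-pixel alpha compositing over a solid base, with the other layer's normalized intensity driving the opacity (opacity being one minus transparency, so low-intensity regions fade toward the base and become less visible). The sketch below shows the first scheme under those assumptions; all function and variable names are illustrative:

```python
import numpy as np

def composite_over_base(layer_rgb, alpha, base_rgb=(0.0, 0.0, 0.0)):
    """Per-pixel alpha compositing of one layer over a solid-color base.

    layer_rgb: float array (H, W, 3) with values in [0, 1].
    alpha:     float array (H, W) in [0, 1]; here alpha is opacity,
               i.e. 0 = fully transparent (base shows through).
    base_rgb:  solid base color; black by default, as in the text.
    """
    base = np.broadcast_to(np.asarray(base_rgb, dtype=np.float64),
                           layer_rgb.shape)
    a = alpha[..., None]
    return a * layer_rgb + (1.0 - a) * base

def scheme1(phys_rgb, anat_gray):
    """First scheme (sketch): physiologic layer over a black base, with
    the max-normalized anatomical image as the opacity, so anatomically
    less significant (dimmer) regions make the physiologic layer more
    transparent and thus less visible."""
    alpha = anat_gray / max(anat_gray.max(), 1e-12)
    return composite_over_base(phys_rgb, alpha)
```

The second scheme is the same operation with the roles of the two layers swapped: the anatomical image becomes the composited layer and the normalized physiologic map drives its opacity.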
Using the NIR wavelength, Laser Speckle Image and Laser Doppler technologies can quantify the speed of blood flow and perfusion and, thus, reveal the functionality of the vascular system. In many clinical situations, the novel opportunity to use real-time visualization of the combination of anatomic detail and the underlying physiologic processes within that anatomic detail will be transformational in current and future therapeutic strategies.
Operations in accordance with various embodiments of the inventive concept will now be discussed with respect to the flowcharts of
The image of the sample includes anatomical structure of the sample. The sample may be, for example, tissue and/or organs. A physiologic map of blood flow and perfusion of the sample is obtained (block 825). The physiologic map of the sample may be obtained using, for example, LSI, LDI or fluorescence. The anatomical structure of the image and the physiologic map of the sample are combined into a single image of the sample (block 835). The single image of the sample displays anatomy and physiology of the sample in the single image in real time. As further illustrated in
In some embodiments, a plurality of images may be combined with a corresponding plurality of physiologic maps to provide a video displaying anatomy and physiology of the sample in real time.
In some embodiments, combining the anatomical structure of the image and the physiologic map of the sample into a single image includes adjusting one or more properties of the image and/or the physiologic map. The one or more properties may include at least one of colorization, transparency and a weight function.
Referring now to
Further operations in accordance with embodiments discussed herein will now be discussed with respect to the images illustrated in
Referring first to
Finally,
T(i,j)=(Img(i,j)/max(Img))^x Eqn. (1)
where Img is a raw (original) image frame of visible or near infra-red light (10A or 10B) and x is an adjustable parameter greater than zero (>0) and less than or equal to two (<=2). In other words, each pixel value in T(i,j) is between 0 and 1, with 0 representing no transparency and 1 representing 100% transparency. The parameter x controls the contrast of the transparency map: if x>1, the transparency has a larger dynamic range, and if x<1, the transparency has a smaller dynamic range.
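A transparency map of this kind can be sketched as a max-normalized power law, which matches the described [0, 1] range and the contrast behavior of the parameter x; the exact normalization is an assumption of this sketch:

```python
import numpy as np

def transparency_map(img, x=1.0):
    """Transparency map T(i,j) = (Img(i,j)/max(Img))**x (assumed form).

    Values lie in [0, 1]: 0 = no transparency, 1 = 100% transparent.
    x > 1 gives the transparency a larger dynamic range (mid-range
    values are pushed down); x < 1 gives a smaller dynamic range.
    """
    assert 0 < x <= 2, "x must be greater than 0 and at most 2"
    img = np.asarray(img, dtype=np.float64)
    return (img / max(img.max(), 1e-12)) ** x
```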
Referring now to
Referring now to
Img(i,j)=ImgA(i,j)×RGB(i,j) Eqn. (2)
where ImgA(i,j) for each color channel may be adjusted separately and differently to achieve an optimal visualization effect. Thus, the image of
As discussed above, a near infra-red image/visible light image can visualize the surface and sub-surface anatomical structure of the vasculature of a tissue/organ. Blood flow measuring technologies, such as LSI, LDI or fluorescence, can quantify the speed of blood flow and perfusion, thus revealing the functionality of the vasculature of a tissue/organ. In certain clinical situations, visualization of both the anatomical structure and the functionality of tissue/organ vasculature is important. Thus, in some embodiments of the present inventive concept, NIR images are used as one layer of the VFA image, which reveals anatomical structure of the targeting tissue/organ vasculature. The physiologic map of blood flow and perfusion quantified by, for example, LSI, LDI or fluorescence technology, is used as another layer of the VFA, which reveals functionality and physiology of the targeted tissue/organ vasculature. Embodiments of the present inventive concept are configured to adjust the colorization and transparency of the two layers, and a final visualization (VFA image) is achieved, which represents both anatomy and functionality of the vascular system of a certain tissue/organ.
It will be understood that embodiments of the present inventive concept may be used in any format of clinical imaging, which includes both surgical imaging (usually an in-patient application) and other out-patient imaging procedures (non-surgical applications), without departing from the scope of the present inventive concept.
Example embodiments are described above with reference to block diagrams and/or flowchart illustrations of methods, devices, systems and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, and/or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks.
The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
Accordingly, example embodiments may be implemented in hardware and/or in software (including firmware, resident software, micro-code, etc.). Furthermore, example embodiments may take the form of a computer program product on a computer-usable or computer-readable storage medium having computer-usable or computer-readable program code embodied in the medium for use by or in connection with an instruction execution system. In the context of this document, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, and a portable compact disc read-only memory (CD-ROM). Note that the computer-usable or computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.
Computer program code for carrying out operations of data processing systems discussed herein may be written in a high-level programming language, such as Java, AJAX (Asynchronous JavaScript), C, and/or C++, for development convenience. In addition, computer program code for carrying out operations of example embodiments may also be written in other programming languages, such as, but not limited to, interpreted languages. Some modules or routines may be written in assembly language or even micro-code to enhance performance and/or memory usage. However, embodiments are not limited to a particular programming language. It will be further appreciated that the functionality of any or all of the program modules may also be implemented using discrete hardware components, one or more application specific integrated circuits (ASICs), a field programmable gate array (FPGA), a programmed digital signal processor, a programmable logic controller (PLC), or a microcontroller.
It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated.
In the drawings and specification, there have been disclosed example embodiments of the inventive concept. Although specific terms are employed, they are used in a generic and descriptive sense only and not for purposes of limitation, the scope of the inventive concept being defined by the following claims.
The present application claims priority from U.S. Provisional Application Nos. 62/063,673, filed Oct. 14, 2014 and 62/136,010, filed Mar. 20, 2015, the disclosures of which are hereby incorporated herein by reference as if set forth in their entirety.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/US2015/055251 | 10/13/2015 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2016/061052 | 4/21/2016 | WO | A |
20100305454 | Dvorsky et al. | Dec 2010 | A1 |
20110013002 | Thompson et al. | Jan 2011 | A1 |
20110068007 | Pang et al. | Mar 2011 | A1 |
20110071382 | Miyazaki et al. | Mar 2011 | A1 |
20110137169 | Akaki et al. | Jun 2011 | A1 |
20110164035 | Liao et al. | Jul 2011 | A1 |
20110169978 | Lasser et al. | Jul 2011 | A1 |
20110176048 | Rockley | Jul 2011 | A1 |
20110319775 | Fujii et al. | Dec 2011 | A1 |
20120071769 | Dunn et al. | Mar 2012 | A1 |
20120078113 | Hitestone et al. | Mar 2012 | A1 |
20120095354 | Dunn et al. | Apr 2012 | A1 |
20120108956 | Warger, II et al. | May 2012 | A1 |
20120165627 | Yamamoto | Jun 2012 | A1 |
20120191005 | Sobol et al. | Jul 2012 | A1 |
20120277559 | Kohl-Bareis et al. | Nov 2012 | A1 |
20130204112 | White | Aug 2013 | A1 |
20130223705 | Ferguson, Jr. | Aug 2013 | A1 |
20130245456 | Ferguson, Jr. | Sep 2013 | A1 |
20130324866 | Gladshtein | Dec 2013 | A1 |
20130345560 | Ferguson, Jr. et al. | Dec 2013 | A1 |
20140003740 | Bone | Jan 2014 | A1 |
20140081133 | Nie | Mar 2014 | A1 |
20140161421 | Shoemaker et al. | Jun 2014 | A1 |
20140187966 | Theirman | Jul 2014 | A1 |
20140213861 | Van Leest | Jul 2014 | A1 |
20140276097 | Sharonov | Sep 2014 | A1 |
20140285702 | Higashiyama et al. | Sep 2014 | A1 |
20140293091 | Rhoads et al. | Oct 2014 | A1 |
20140340482 | Kanarowski | Nov 2014 | A1 |
20150077716 | Peng | Mar 2015 | A1 |
20150148623 | Benaron | May 2015 | A1 |
20150196257 | Yousefi | Jul 2015 | A1 |
20150342479 | Liu et al. | Dec 2015 | A1 |
20160198961 | Homyk et al. | Jul 2016 | A1 |
20160270672 | Chen et al. | Sep 2016 | A1 |
20160278718 | Fujii | Sep 2016 | A1 |
20160317041 | Porges et al. | Nov 2016 | A1 |
20160358332 | Watanabe | Dec 2016 | A1 |
20170017858 | Roh | Jan 2017 | A1 |
20170049377 | Littell | Feb 2017 | A1 |
20170059408 | Korner | Mar 2017 | A1 |
20170091962 | Hagiwara | Mar 2017 | A1 |
20170135555 | Yoshizaki | May 2017 | A1 |
20170270379 | Kasai et al. | Sep 2017 | A1 |
20170274205 | Chen et al. | Sep 2017 | A1 |
20180153422 | Watanabe | Jun 2018 | A1 |
Number | Date | Country |
---|---|---|
101784227 | Jul 2010 | CN |
102083362 | Jun 2011 | CN |
102770071 | Nov 2012 | CN |
103340601 | Oct 2013 | CN |
103417196 | Dec 2013 | CN |
2 524 650 | Nov 2012 | EP |
10-290791 | Nov 1998 | JP |
2005-118325 | May 2005 | JP |
2005-185834 | Jul 2005 | JP |
2007-125144 | May 2007 | JP |
2008-139543 | Jun 2008 | JP |
2011-249267 | Dec 2011 | JP |
2012-130629 | Jul 2012 | JP |
2013-118978 | Jun 2013 | JP |
2014-000246 | Jan 2014 | JP |
2015-223463 | Dec 2015 | JP |
WO 9612435 | May 1996 | WO |
WO 9743950 | Nov 1997 | WO |
WO 9844839 | Oct 1998 | WO |
WO 2006021096 | Mar 2006 | WO |
WO 2006116672 | Nov 2006 | WO |
WO 2009127972 | Oct 2009 | WO |
WO 2010131550 | Nov 2010 | WO |
WO 2012096878 | Jul 2012 | WO |
WO 2013190391 | Dec 2013 | WO |
WO 2014006465 | Jan 2014 | WO |
WO 2014009859 | Jan 2014 | WO |
WO 2016153741 | Sep 2016 | WO |
Entry |
---|
International Search Report and the Written Opinion of the International Searching Authority corresponding to International Patent Application No. PCT/US2015/055251 (13 pages) (dated Feb. 3, 2016). |
Extended European Search Report corresponding to related European Patent Application No. 15849925.1 (7 pages) (dated Jun. 6, 2018). |
Aizu, Y et al. (1991) “Bio-Speckle Phenomena and Their Application to the Evaluation of Blood Flow” Optics and Laser Technology, vol. 23, No. 4, Aug. 1, 1991. |
Brezinski, M. E., G. J. Tearney, et al. (1996). “Imaging of coronary artery microstructure (in vitro) with optical coherence tomography.” American Journal of Cardiology 77 (1): 92-93. |
Briers et al., (1995) “Quasi real-time digital version of single-exposure speckle photography for full-field monitoring of velocity or flow fields,” Optics Communications 116: 36-42. |
Briers, J. David, (2001) “Laser Doppler, speckle and related techniques for blood perfusion mapping and imaging,” Physiol. Meas. 22: R35-R66. |
Chen, Z. P., T. E. Milner, et al. (1997). “Noninvasive imaging of in vivo blood flow velocity using optical Doppler tomography.” Optics Letters 22(14): 1119-1121. |
Cheng et al., (2004) “Laser speckle imaging of blood flow in microcirculation,” Phys. Med. Biol., 49: 1347-1357. |
Choi et al., “Linear response range characterization and in vivo application of laser speckle imaging of blood flow dynamics,” Journal of Biomedical Optics, Jul./Aug. 2006, 11(4): 041129. |
Cioffi, G. A. (2001). “Three common assumptions about ocular blood flow and glaucoma.” Survey of Ophthalmology 45: S325-S331. |
Draijer, Matthijs J., “High Speed Perfusion Imaging Based on Laser Speckle Fluctuations,” Printed by Ridderprint, Ridderkerk, The Netherlands 2010, 145 pages. |
Draijer et al., “Twente Optical Perfusion Camera: system overview and performance for video rate laser Doppler perfusion imaging,” Optics Express, Mar. 2, 2009, 17(5): 3211-3225. |
Duncan et al., “Can laser speckle flowmetry be made a quantitative tool?,” J. Opt. Soc. Am. A, Aug. 2008, 25(8): 2088-2094. |
Dunn et al. “Dynamic imaging of cerebral blood flow using laser speckle”, J. of Cerebral Blood Flow and Metabolism 21, 195-201 (2001). |
Dunn et al., (2011) A Transmissive Laser Speckle Imaging Technique for Measuring Deep Tissue Blood Flow: An Example Application in Finger Joints, Lasers in Surgery and Medicine, 43: 21-28. |
Eun, H. C. (1995). “Evaluation of skin blood flow by laser Doppler flowmetry. [Review] [151 refs].” Clinics in Dermatology 13(4): 337-47. |
Fercher et al., “Flow Visualization by Means of Single-Exposure Speckle Photography,” Optics Communications, Jun. 1, 1981, 37(5): 326-330. |
Gandjbakhche, A. H., P. Mills, et al. (1994). “Light-Scattering Technique for the Study of Orientation and Deformation of Red-Blood-Cells in a Concentrated Suspension.” Applied Optics 33(6): 1070-1078. |
Izatt, J. A., M. D. Kulkarni, et al. (1996). “Optical coherence tomography and microscopy in gastrointestinal tissues.” IEEE Journal of Selected Topics in Quantum Electronics 2(4): 1017. |
Jang, I. K., G. J. Tearney, et al. (2001). “Visualization of Tissue Prolapse Between Coronary Stent Struts by Optical Coherence Tomography: Comparison With Intravascular Ultrasound.” Images in Cardiovascular Medicine, American Heart Association, http://circ.ahajournals.org/content, p. 2754. |
Konishi and Fujii “Real-time visualization of retinal microcirculation by laser flowgraphy”, Opt. Eng. 34, 753-757 (1995). |
Kruijt et al., (2006), “Laser speckle imaging of dynamic changes in flow during photodynamic therapy,” Lasers Med Sci, 21: 208-212. |
Leitgeb, R. A., L. Schmetterer, et al. (2003). “Real-time assessment of retinal blood flow with ultrafast acquisition by color Doppler Fourier domain optical coherence tomography.” Optics Express 11(23): 3116-3121. |
Li et al., “Imaging cerebral blood flow through the intact rat skull with temporal laser speckle imaging,” Optics Letters, Jun. 15, 2006, 31(12): 1824-1826. |
Matsievskii, D.D., (2004) “Blood Flow Measurements in Studies of Macro- and Microcirculation,” Bulletin of Experimental Biology and Medicine, 6: 541-544. |
Nadkarni, Seemantini K. et al. (2005) “Characterization of Atherosclerotic Plaques by Laser Speckle Imaging” Circulation, vol. 112, pp. 885-892. |
Nelson, J. S., K. M. Kelly, et al. (2001). “Imaging blood flow in human port-wine stain in situ and in real time using optical Doppler tomography.” Archives of Dermatology 137(6): 741-744. |
Ohtsubo et al., (1976) “Velocity measurement of a diffuse object by using time-varying speckles,” Optical and Quantum Electronics, 8: 523-529. |
Oshima, M., R. Torii, et al. (2001). “Finite element simulation of blood flow in the cerebral artery.” Computer Methods in Applied Mechanics and Engineering 191 (6-7): 661-671. |
Parthasarathy et al., “Laser speckle contrast imaging of cerebral blood flow in humans during neurosurgery: a pilot clinical study,” Journal of Biomedical Optics, 15(6) Nov./Dec. 2010, pp. 066030-1 to 066030-8. |
Rege et al., “Multiexposure laser speckle contrast imaging of the angiogenic microenvironment,” Journal of Biomedical Optics, 16(5), May 2011, pp. 056006-1 to 056006-10. |
Ren, Hongwu et al., “Phase-Resolved Functional Optical Coherence Tomography: Simultaneous Imaging of in Situ Tissue Structure, Blood Flow Velocity, Standard Deviation, Birefringence, and Stokes Vectors in Human Skin,” Optics Letters, vol. 27, No. 19, Oct. 1, 2002, pp. 1702-1704. |
Richards G.J. et al. (1997) “Laser Speckle Contrast Analysis (LASCA): A Technique for Measuring Capillary Blood Flow Using the First Order Statistics of Laser Speckle Patterns” Apr. 2, 1997. |
Ruth, B. “Blood flow determination by the laser speckle method”, Int J Microcirc: Clin Exp, 1990, 9: 21-45. |
Ruth, et al., (1993) “Noncontact Determination of Skin Blood Flow Using the Laser Speckle Method: Application to Patients with Peripheral Arterial Occlusive Disease (PAOD) and to Type-I Diabetes,” Lasers in Surgery and Medicine 13: 179-188. |
Subhash, Hrebesh M., “Biophotonics Modalities for High-Resolution Imaging of Microcirculatory Tissue Beds Using Endogenous Contrast: A Review of Present Scenario and Prospects,” International Journal of Optics, vol. 2011, Article ID 293684, 20 pages. |
Tearney et al., “Atherosclerotic plaque characterization by spatial and temporal speckle pattern analysis”, CLEO 2001, vol. 56, pp. 307-307. |
Wang, X. J., T. E. Milner, et al. (1997). “Measurement of fluid-flow-velocity profile in turbid media by the use of optical Doppler tomography.” Applied Optics 36(1): 144-149. |
Wardell et al., “ECG-Triggering of the Laser Doppler Perfusion Imaging Signal,” Proceedings of the 20th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 20, No. 4, 1998, pp. 1879-1880. |
Weber et al., (2004) “Optical imaging of the spatiotemporal dynamics of cerebral blood flow and oxidative metabolism in the rat barrel cortex,” European Journal of Neuroscience, 20: 2664-2670. |
White, Brian R. et al., “In Vivo Dynamic Human Retinal Blood Flow Imaging Using Ultra-High-Speed Spectral Domain Optical Doppler Tomography,” Optics Express, Dec. 15, 2003, 11(25): 3490-3497. |
Wussling et al., “Laser diffraction and speckling studies in skeletal and heart muscle”, Biomed. Biochem. Acta, 1986, 45(1/2): S23-S27. |
Yazdanfar et al., “In vivo imaging of blood flow in human retinal vessels using color Doppler optical coherence tomography”, SPIE, 1999, vol. 3598, pp. 177-184. |
Yazdanfar, S., A. M. Rollins, et al. (2000). “Imaging and velocimetry of human retinal circulation with color Doppler optical coherence tomography.” Optics Letters, vol. 25, No. 19, Oct. 1, 2000, pp. 1448-1450. |
Yazdanfar, S., A. M. Rollins, et al. (2003). “In vivo imaging of human retinal flow dynamics by color Doppler optical coherence tomography.” Archives of Ophthalmology 121(2): 235-239. |
Zakharov et al., “Dynamic laser speckle imaging of cerebral blood flow,” Optics Express, vol. 17, No. 16, Aug. 3, 2009, pp. 13904-13917. |
Zakharov et al., “Quantitative modeling of laser speckle imaging,” Optics Letters, Dec. 1, 2006; 31(23): 3465-3467. |
Zhao, Y. H., Z. P. Chen, et al. (2000). “Doppler standard deviation imaging for clinical monitoring of in vivo human skin blood flow.” Optics Letters 25(18): 1358-1360. |
U.S. Appl. No. 15/054,830, Chen et al., filed Feb. 26, 2016. |
U.S. Appl. No. 15/559,605, Peng et al., filed Sep. 19, 2017. |
U.S. Appl. No. 15/559,646, Peng et al., filed Sep. 19, 2017. |
U.S. Appl. No. 15/688,472, Chen et al., filed Aug. 28, 2017. |
Furstenberg et al. “Laser speckle reduction techniques for mid-infrared microscopy and stand-off spectroscopy” Proceedings of SPIE 10210:1021004-1-8 (2017). |
Redding et al. “Speckle-free laser imaging using random laser illumination” Nature Photonics 6:355-359 (2012). |
Ren et al. “A simultaneous multimodal imaging system for tissue functional parameters” Proceedings of SPIE 8937:893706-1-12 (2014). |
Zhang et al. “Multimodal imaging of ischemic wounds” Proceedings of SPIE 8553:85531G-1-8 (2012). |
LeSniok et al., “New Generation Optical Wound Monitoring Device,” CW Optics, Inc., 2008 Mid-Atlantic Bio Conference, Chantilly, Virginia, USA, Oct. 24, 2008, 1 page. |
Gioux et al., “Motion-gated acquisition for in vivo optical imaging,” Journal of Biomedical Optics, Nov./Dec. 2009, vol. 14(6), pp. 064038-1 through 064038-8. |
First Office Action, Chinese Patent Application No. 201580066744.6, dated Oct. 9, 2019, 23 pages. |
Decision of Refusal, Japanese Patent Application No. 2017-519928, dated Jan. 7, 2020, 5 pages. |
Nakamura et al., “Applying Hyper Eye Medical System (HEMS) to abdominal surgery,” Progress in Medicine, Mar. 10, 2011, vol. 31, No. 3, pp. 806-809. |
Notification of Reason(s) for Refusal, JP 2017-519923, dated Mar. 10, 2020, 7 pages. |
Notification of Reason(s) for Refusal, JP 2017-568002, dated Mar. 31, 2020, 6 pages. |
Number | Date | Country | |
---|---|---|---|
20170224274 A1 | Aug 2017 | US |
Number | Date | Country | |
---|---|---|---|
62136010 | Mar 2015 | US | |
62063673 | Oct 2014 | US |