The present inventive concept relates to projecting light of multiple wavelengths or multiple wavelength bands onto a target, such as tissues or organs with embedded blood vessels, and capturing multiple images simultaneously or sequentially for image processing, modeling, visualization, and quantification by various parameters as biomarkers.
Soft tissue imaging by optical means has been gaining increasing interest in the medical field for its safety and cost-effectiveness. It includes visible and near-infrared (NIR) light imaging; narrow band imaging; fluorescence imaging; laser speckle imaging; laser Doppler imaging; and other soft tissue imaging such as oxygen saturation and composition imaging.
Multispectral technologies allow combining visible and NIR wavelengths during the imaging process and provide the benefits of visualizing anatomical structure and quantitatively visualizing the distribution of functional/physiologic/compositional characteristics of organs and tissues.
Some embodiments of the present inventive concept provide several light engine designs for multispectral illumination. The method includes a modular design for each light source in a light engine, which can use free-space optics or fiber optics to couple light-emitting devices such as lasers, LEDs, non-coherent lamps, etc. Each light source can be coherent or non-coherent depending on the imaging application and processing requirements. Other optical characteristics of each light source, such as power, irradiance, and flux, can be adjusted depending on the imaging application.
Some embodiments of the present inventive concept provide several camera designs for multispectral sensing. The method includes a modular design for separately detecting light from a target in different wavelengths or wavelength bands, which can occur simultaneously and/or sequentially over these wavelengths or wavelength bands. The designs can include multiple sensors, a single sensor with multispectral pixels or pixel regions, or a single sensor that detects each selected wavelength or wavelength band at a chosen time. The spectral regions of illumination and detection can range, for example, from 350 nm to 1050 nm, as determined by the spectral sensitivity of the chosen sensor.
Some embodiments of the present inventive concept involve software architectural optimization based on the selected multispectral illumination and camera sensing designs. The software flow includes image acquisition, processing, modeling, visualization, and quantification.
Some embodiments of the present inventive concept provide optimization of a list of imaging modules based on the selected multispectral illumination and camera sensing designs. Imaging modules in a medical device include visible and NIR light imaging; narrow bandwidth light imaging; fluorescence imaging; laser speckle imaging; laser Doppler imaging; and other soft tissue imaging such as oxygen saturation and tissue composition imaging.
Some embodiments of the present inventive concept provide optimization of device form factors based on multispectral illumination and camera sensing designs. Form factors of medical devices include endoscopic/laparoscopic/arthroscopic devices for medical towers or robots, a cart device with an extension arm and camera head, and a handheld scanning or tablet device.
Embodiments of the present inventive concept will now be described more fully hereinafter with reference to the accompanying figures, in which some embodiments of the inventive concept are shown. This inventive concept may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Like numbers refer to like elements throughout. In the figures, layers, regions, elements or components may be exaggerated for clarity. Broken lines illustrate optional features or operations unless specified otherwise.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the inventive concept. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. As used herein, phrases such as “between X and Y” and “between about X and Y” should be interpreted to include X and Y. As used herein, phrases such as “between about X and Y” mean “between about X and about Y.” As used herein, phrases such as “from about X to Y” mean “from about X to about Y.”
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this inventive concept belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the specification and relevant art and should not be interpreted in an idealized or overly formal sense unless expressly so defined herein. Well-known functions or constructions may not be described in detail for brevity and/or clarity.
It will be understood that when an element is referred to as being “on”, “attached” to, “connected” to, “coupled” with, “contacting”, etc., another element, it can be directly on, attached to, connected to, coupled with or contacting the other element or intervening elements may also be present. In contrast, when an element is referred to as being, for example, “directly on”, “directly attached” to, “directly connected” to, “directly coupled” with or “directly contacting” another element, there are no intervening elements present. It will also be appreciated by those of skill in the art that references to a structure or feature that is disposed “adjacent” another feature may have portions that overlap or underlie the adjacent feature.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the inventive concept. The sequence of operations (or steps) is not limited to the order presented in the claims or figures unless specifically indicated otherwise.
Spatially relative terms, such as “under”, “below”, “lower”, “over”, “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is inverted, elements described as “under” or “beneath” other elements or features would then be oriented “over” the other elements or features. Thus, the exemplary term “under” can encompass both an orientation of over and under. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly. Similarly, the terms “upwardly”, “downwardly”, “vertical”, “horizontal” and the like are used herein for the purpose of explanation only unless specifically indicated otherwise.
As will be appreciated by one of skill in the art, embodiments of the present inventive concept may be embodied as a method, system, data processing system, or computer program product. Accordingly, the present inventive concept may take the form of an embodiment combining software and hardware aspects, all generally referred to herein as a “circuit” or “module.” Furthermore, the present inventive concept may take the form of a computer program product on a non-transitory computer usable storage medium having computer usable program code embodied in the medium. Any suitable computer readable medium may be utilized including hard disks, CD ROMs, optical storage devices, or other electronic storage devices.
Computer program code for carrying out operations of the present inventive concept may be written in an object oriented programming language such as Matlab, Mathematica, Java, Smalltalk, or C++. However, the computer program code for carrying out operations of the present inventive concept may also be written in conventional procedural programming languages, such as the “C” programming language, or in a visually oriented programming environment, such as Visual Basic.
Certain of the program code may execute entirely on one or more of a user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer. In the latter scenario, the remote computer may be connected to the user's computer through a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
The inventive concept is described in part below with reference to flowchart illustrations and/or block diagrams of methods, devices, systems, computer program products and data and/or system architecture structures according to embodiments of the inventive concept. It will be understood that each block of the illustrations, and/or combinations of blocks, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the block or blocks.
These computer program instructions may also be stored in a computer readable memory or storage that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory or storage produce an article of manufacture including instruction means which implement the function/act specified in the block or blocks.
The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the block or blocks.
Animal and human organs are composed of different types of soft and hard tissues. The soft tissues have complex structures and compositions. As the largest organ of the human body, for example, skin possesses a layered structure of multiple tissues that include epidermis, dermis and hypodermis. The skin dermis consists of connective tissues, blood, endothelium and subendothelial connective tissues of blood vessels, fat, etc. To accurately model soft tissue imaging by optical means, one often applies the radiative transfer (RT) theory to quantify the light-tissue interaction. With the RT theory, one may quantify the optical response of imaged tissues by the following RT equation
s·∇L(r, s) = −(μ_a + μ_s)L(r, s) + μ_s ∫_4π p(s, s′)L(r, s′) dω′ + S(r, s)   Eqn. (1)
where s is a unit vector in the direction of light propagation; L(r, s) is the radiance (W·sr⁻¹·m⁻²) describing the power flux propagating at position r along direction s per unit solid angle; and S(r, s) represents a directional light source density (W·sr⁻¹·m⁻³) contributing to L(r, s). In fluorescence imaging, S may be used to model the fluorophores in blood that are excited by the incident light. In other cases, such as tissue imaging by light of a narrow wavelength band, one may ignore the S term if the medium is source-free. The optical parameters defined in Eqn. (1) for a particular type of tissue such as blood consist of the absorption coefficient μ_a, the scattering coefficient μ_s and the single-scattering phase function p(s, s′). For modeling light-tissue interaction in all embodiments of the inventive concept in this application, we assume the phase function p(s, s′) can be replaced by a single-parameter function first proposed by Henyey and Greenstein in their classic paper published in 1941

p(cos θ) = (1 − g²)/[4π(1 + g² − 2g cos θ)^(3/2)]   Eqn. (2)
where cos θ = s·s′ and g is the mean value of cos θ. With the above assumption, the optical parameters for characterization of light-tissue interaction by the RT theory consist of μ_a, μ_s and g. We note that these parameters are functions of light wavelength and tissue type.
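The Henyey-Greenstein phase function above can be checked numerically. The following Python sketch is illustrative only and not part of the claimed subject matter; it verifies that p integrates to unity over the full 4π solid angle and that the mean of cos θ equals g (the anisotropy value g = 0.9 used below is a commonly quoted figure for soft tissue, assumed here for illustration):

```python
import numpy as np

def henyey_greenstein(cos_theta, g):
    """Henyey-Greenstein phase function p(cos theta) of Eqn. (2),
    normalized so its integral over the full 4*pi solid angle is 1."""
    return (1.0 - g**2) / (4.0 * np.pi * (1.0 + g**2 - 2.0 * g * cos_theta) ** 1.5)

def phase_moments(g, n=200_000):
    """Midpoint-rule integration over mu = cos theta in [-1, 1]:
    total = 2*pi * int p(mu) dmu (should be 1),
    mean_cos = 2*pi * int mu*p(mu) dmu (should be g)."""
    edges = np.linspace(-1.0, 1.0, n + 1)
    mu = 0.5 * (edges[:-1] + edges[1:])
    dmu = 2.0 / n
    p = henyey_greenstein(mu, g)
    total = 2.0 * np.pi * np.sum(p) * dmu
    mean_cos = 2.0 * np.pi * np.sum(mu * p) * dmu
    return total, mean_cos

total, mean_cos = phase_moments(g=0.9)
print(round(total, 3), round(mean_cos, 3))
```

The single parameter g thus fully determines the angular scattering behavior in this model, which is why μ_a, μ_s and g suffice to characterize the light-tissue interaction.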
Referring first to
Referring to
P_Total = α × [α_1×P_1(λ_1, T_1) + . . . + α_i×P_i(λ_i, T_i) + . . . + α_N×P_N(λ_N, T_N)]   Eqn. (3)
where P_Total is the total light intensity emitted from multispectral light projector 13; P_i(λ_i, T_i) is the power emitted from the source of the ith wavelength or band λ_i, for example, 112; T_i is a pulsing parameter controlling how long the illumination of wavelength λ_i is turned on; α_i is an attenuation parameter accounting for loss of light intensity in optics components such as 122, 132, which are different for each light source of wavelength λ_i; and α is an attenuation parameter accounting for loss of light intensity in optics components such as 141, 151, 13, which are the same for all wavelengths.
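Eqn. (3) is a weighted sum over the active sources. The short Python sketch below is for illustration only; the three channels and all numeric attenuation and power values are hypothetical, not characteristics of any particular light engine:

```python
def total_projected_power(alpha, channels):
    """Eqn. (3): P_Total = alpha * sum_i alpha_i * P_i(lambda_i, T_i).

    alpha    -- attenuation common to all wavelengths (shared optics)
    channels -- list of (alpha_i, p_i) pairs: the per-channel attenuation and
                the power emitted by the ith source while its pulse T_i is on
    """
    return alpha * sum(a_i * p_i for a_i, p_i in channels)

# Hypothetical three-channel light engine; all numbers assumed for illustration
channels = [
    (0.92, 100.0),  # alpha_1, P_1 (mW)
    (0.88, 150.0),  # alpha_2, P_2
    (0.95, 200.0),  # alpha_3, P_3
]
p_total = total_projected_power(alpha=0.80, channels=channels)
print(round(p_total, 1))  # 0.80 x (92 + 132 + 190) mW
```

Because each T_i gates its source independently, setting a channel's emitted power to zero while its pulse is off removes that term from the sum without changing the model.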
Referring to
Referring to
Referring to
Referring now to
Img_sensor_i = Img(λ_i, P_i, t_i, g_i, x, y)   Eqn. (4)
where λ_i is the ith wavelength, for example, 112; P_i is the light power emitted by the ith wavelength source; t_i is the ith sensor exposure time; g_i is the ith sensor gain; x is the horizontal pixel coordinate; and y is the vertical pixel coordinate.
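A multi-sensor acquisition per Eqn. (4) can be sketched as a set of per-wavelength frames captured in the same exposure window. The toy model below is illustrative only; the linear sensor response and the randomly generated reflectance maps are assumptions for the sketch, not the device's actual calibration or real tissue data:

```python
import numpy as np

def capture(reflectance, power, exposure, gain, full_scale=4095.0):
    """Toy linear model of Eqn. (4): pixel value proportional to
    P_i x t_i x g_i x target reflectance at (x, y), clipped to the
    sensor's digital range (12-bit full scale assumed)."""
    return np.clip(gain * exposure * power * reflectance, 0.0, full_scale)

rng = np.random.default_rng(0)
h, w = 4, 6  # tiny frame for illustration
# Assumed per-wavelength reflectance maps of the target
scene = {
    "lambda_1": rng.uniform(0.0, 1.0, (h, w)),
    "lambda_2": rng.uniform(0.0, 1.0, (h, w)),
}

# Each sensor records Img(lambda_i, P_i, t_i, g_i, x, y) simultaneously,
# each with its own exposure time t_i and gain g_i
frames = {
    "lambda_1": capture(scene["lambda_1"], power=100.0, exposure=0.010, gain=8.0),
    "lambda_2": capture(scene["lambda_2"], power=150.0, exposure=0.020, gain=4.0),
}
print(sorted(frames), frames["lambda_1"].shape)
```

The per-sensor exposure and gain settings allow each wavelength channel to be optimized independently, which is the benefit of the multi-sensor design over a shared sensor.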
Referring to
Img_i = Img(λ_i, P_i, t, g, x/N, y/M)   Eqn. (5)
where λ_i is the ith wavelength, for example, 112; P_i is the light power emitted by the ith wavelength source; t is the sensor exposure time; g is the sensor gain; x is the horizontal pixel coordinate; y is the vertical pixel coordinate; N is the horizontal image resolution resampling factor based on the layout of visible and near-infrared pixels; and M is the vertical image resolution resampling factor based on the layout of visible and near-infrared pixels. The image for each wavelength has lower resolution than the original resolution of the sensor. The total number of effective pixels for each wavelength is defined by the following equation:

X_i × Y_i = (X/N) × (Y/M)   Eqn. (6)

where X is the horizontal resolution of the original sensor; Y is the vertical resolution of the original sensor; X_i is the horizontal resolution of the image for the ith wavelength; and Y_i is the vertical resolution of the image for the ith wavelength.
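The resampling in Eqn. (5) can be illustrated with array slicing. The sketch below assumes a hypothetical 2×2 multispectral pixel layout (so N = M = 2) repeated across the sensor; the layout and the sensor resolution are chosen only for illustration:

```python
import numpy as np

# Hypothetical 2x2 pixel layout tiled across the sensor (N = M = 2):
#   row 0: lambda_1 lambda_2
#   row 1: lambda_3 lambda_4
X, Y = 8, 6                           # horizontal / vertical sensor resolution
raw = np.arange(X * Y).reshape(Y, X)  # stand-in for the raw mosaic frame

N = M = 2  # horizontal / vertical resampling factors from the layout
subimages = {
    "lambda_1": raw[0::M, 0::N],
    "lambda_2": raw[0::M, 1::N],
    "lambda_3": raw[1::M, 0::N],
    "lambda_4": raw[1::M, 1::N],
}

# Each per-wavelength image has X/N x Y/M effective pixels
for name, img in subimages.items():
    assert img.shape == (Y // M, X // N)
print(subimages["lambda_1"].shape)  # (Y/M, X/N)
```

In practice the extracted subimages may additionally be interpolated back to the full sensor resolution, but the number of effective pixels per wavelength remains X/N × Y/M.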
Referring to
T1: wavelength 1 is detected, and wavelengths 2 to N are not detected
T2: wavelength 2 is detected, and wavelengths 1 and 3 to N are not detected
T3: wavelength 3 is detected, and wavelengths 1-2 and 4 to N are not detected
. . .
TN: wavelength N is detected, and wavelengths 1 to N−1 are not detected
Img = Img_1(λ_1, P_1, T_1, t_1, g_1, x, y) + . . . + Img_i(λ_i, P_i, T_i, t_i, g_i, x, y) + . . . + Img_N(λ_N, P_N, T_N, t_N, g_N, x, y)   Eqn. (7)
where λ_i is the ith wavelength, for example, 112; P_i is the light power emitted by the ith wavelength source; t_i is the ith sensor exposure time; g_i is the ith sensor gain; x is the horizontal pixel coordinate; y is the vertical pixel coordinate; and T_i is a pulsing parameter controlling how long the wavelength is turned on. When a single sensor is used to detect one wavelength or multiple wavelengths at a chosen time, additional optics such as a notch filter or band-pass filter may be needed for a specific imaging module such as fluorescence imaging.
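The time-multiplexed scheme of Eqn. (7) can be sketched as a loop over illumination slots: in each slot only one wavelength is pulsed on, so the single sensor records only that term of the sum. All scene data and schedule values below are hypothetical, and the linear sensor response is an assumption standing in for real calibration:

```python
import numpy as np

def acquire_sequence(scene, schedule):
    """Sequential single-sensor acquisition per Eqn. (7): in each time slot T_i
    only wavelength lambda_i is turned on, so the sensor records only the
    Img_i(lambda_i, P_i, T_i, t_i, g_i, x, y) term for that slot."""
    frames = {}
    for wavelength, power, exposure, gain in schedule:
        # Toy linear response; a real device would apply its calibration here
        frames[wavelength] = gain * exposure * power * scene[wavelength]
    return frames

rng = np.random.default_rng(1)
shape = (4, 6)
# Assumed per-wavelength target responses, for illustration only
scene = {w: rng.uniform(0.0, 1.0, shape) for w in ("lambda_1", "lambda_2", "lambda_3")}

# One slot per wavelength: (lambda_i, P_i, t_i, g_i); all values hypothetical
schedule = [
    ("lambda_1", 100.0, 0.010, 8.0),
    ("lambda_2", 150.0, 0.015, 4.0),
    ("lambda_3", 120.0, 0.020, 2.0),
]
frames = acquire_sequence(scene, schedule)
print(len(frames), frames["lambda_3"].shape)
```

Unlike the mosaic design of Eqn. (5), each frame here retains the sensor's full resolution, at the cost of temporal separation between wavelengths.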
Combination of multispectral sensing design can be made through multi-sensor (
Referring first to
The modular light engine, sensing and software designs addressed above allow a variety of form factors while applying multispectral soft tissue imaging to a medical device. For example, multispectral light projector (13 in
Referring first to
The form factors of a multispectral imaging device may include, but are not limited to,