This invention pertains broadly to monitoring changes of pathologic conditions in the retina and optic nerve head and, in particular, to use of simultaneous acquisition of reflection characteristics of the retinal scene at pre-determined discrete wavelengths to characterize the relative spatial changes in retinal oxygen saturation.
The visual system, as part of the central nervous system, enables an organism to process visual information. It interprets information from visible light to build a representation of the surrounding world. The visual system accomplishes a number of complex tasks, including the reception of light and the formation of monocular representations; the construction of a binocular percept from a pair of two-dimensional projections; the identification and categorization of visual objects; the assessment of distances to and between objects; and the guidance of body movements in relation to visual objects. Quite understandably, the health of the visual system is of critical importance for keeping the organism efficiently operational.
Pathologic conditions in the retina and optic nerve head (ONH) can cause vision loss and blindness. Both structures have a high demand for oxygen, and loss of the normal oxygen supply through vascular insufficiency is believed to play an important role in diseases affecting the retina and ONH. Hypoxia of the retina and ONH is believed to be a factor in the development of ocular vascular disorders, such as diabetic retinopathy, arteriovenous occlusion, and glaucoma. The ability to obtain relative measurements of oxygen saturation in the human ocular fundus could aid diagnosis and monitoring of these and other disorders. For example, measurement of changes in retinal and ONH oxygen saturation under controlled conditions could establish relationships among oxygen consumption, blood sugar levels, and vascular autoregulatory function in diabetic retinopathy. Moreover, the assessment of oxygenation in the ONH may facilitate early detection of the onset of glaucoma, a disease in which timely diagnosis is crucial for effective treatment.
Several attempts to develop a methodology for accurate assessment of oxygen content, such as a level of oxygen saturation (OS), in the human visual system have been discussed in the related art and include, among others, physically invasive techniques, the use of phosphorescent dyes in an eye of a human subject (which has not yet been approved), and techniques based on evaluation of the reflectance of a component of the visual system. One of the biggest obstacles to using imaging to acquire OS-related information remains the saccadic movements of the eye: the intervals between saccades are generally too short for sequential collection of spectral information about the eye.
Embodiments of the present invention provide a method for determining an oxygen saturation signature of an ocular tissue or a component of an eye such as, for example, retinal blood vessels, retinal tissue, the optic nerve head and its vessels, the choroid, and the iris. The method includes receiving optical data representing a spectral distribution of light that has been reflected by multiple points of the ocular tissue and that is defined by a predetermined number of discrete wavelengths, at least two of which are isosbestic wavelengths. In a specific embodiment, the received optical data may represent a distribution of reflected light acquired during a time period that is shorter than the duration of a saccade. The method further includes processing the received optical data for each point of the ocular tissue to determine first and second spectral distribution lines on a spectral graph, to determine a first area of the spectral graph regions that are bound by these first and second spectral distribution lines, and to normalize the determined value of this area by the value of a second area under the second spectral distribution line. In a specific embodiment, the second area may be independent of the level of oxygen saturation of the ocular tissue at a given point. In one embodiment, determination of the first area may include determination of the area of three spectral graph regions that are bound by the first and second spectral distribution lines and that adjoin each other at isosbestic points. The method may further include assigning the normalized value determined in this manner to a corresponding point of the ocular tissue and storing the assigned values in an array representing a two-dimensional distribution of points across a region of interest of the ocular tissue.
In addition, the method may contain at least one of the steps of acquiring the optical data and presenting the assigned normalized values as a map of the 2D distribution of the oxygen saturation signature of the ocular tissue across the region of interest. In a specific embodiment, such a map may include a color-coded image of the region of the ocular tissue, where the color-coding represents levels of oxygen saturation values.
Embodiments of the invention also provide for a computer program product encoded in a computer-readable medium and usable with a programmable computer processor disposed in a computer system. According to the idea of the invention, the computer program product includes a computer-readable program code which causes the computer processor to receive data, from an optical detector, that represent a discrete spectral distribution of intensity of light reflected by a fundus of a subject and detected at predetermined wavelengths, at least two of which are isosbestic wavelengths. In a specific embodiment, the received data represents a discrete spectral distribution of light reflected from a fundus and detected within a period of time that is no longer than the duration of a saccadic movement of the eye of the subject. In addition, the computer program product includes a computer-readable program code that causes the processor to transform the received data into data representing oxygen saturation of blood in the retina. The transformation of the received data may include normalization of the oxygen saturation level with respect to at least one of the amount of blood in a portion of fundus that has been imaged and the intensity of light that has been detected. The processor may also be caused to display a color-coded map of the spatial distribution of the oxygen saturation values across the imaged portion of fundus.
Embodiments of the invention additionally provide for a computer program product for displaying a color-coded map of oxygen saturation levels corresponding to a component of an eye. The component of an eye may include a retina, an optic nerve head, a choroid, an iris, or any other ocular tissue. Such a computer program product includes a computer-readable tangible and non-transitory medium having a computer-readable program code thereon, which includes program code for acquiring digital data representing the intensity of light that has interacted with the component of an eye and that has been detected with at least one optical detector at a plurality of discrete wavelengths, at least two of which are isosbestic wavelengths. In a specific embodiment, the data are acquired in a time period that is shorter than the duration of a saccadic movement of the eye. The computer program product may also include program code for determining a first curve that represents, in a chosen system of coordinates, a distribution of intensity values associated with the acquired digital data as a function of the plurality of discrete wavelengths, and program code for determining a second curve representing, in the same system of coordinates, a distribution of isosbestic intensity values as a function of the isosbestic wavelengths. The transformation of the received data may include normalization of the oxygen saturation level with respect to at least one of the amount of blood in a portion of the component of an eye that has been imaged and the intensity of light that has been detected. In addition, the computer program product may include program code for deriving, in the system of coordinates, values that represent the area bound by the first and second curves and that correspond to the oxygen saturation levels.
Additional program code may be used for storing the received data, the isosbestic data, and the data defining the second and first curves, and, additionally or alternatively, for assigning a color parameter for each of data points from the acquired digital data based on intensity values respectively corresponding to these data points.
The foregoing features of the invention will be more readily understood by reference to the following detailed description, taken with reference to the accompanying figures (which are drawn not to scale), in which like features and elements are denoted by like numbers and labels, and in which:
For the purpose of this application and the appended claims, the following terms are defined as described unless the context requires otherwise. The term “image” refers to an ordered representation of detector signals corresponding to spatial positions. For example, an image may be an array of values within an electronic memory, or, alternatively, a visual image may be formed on a display device such as a video screen or printer.
The following specification provides a description of the embodiments of the invention with reference to the accompanying drawings. In the drawings, wherever possible, the same reference numerals and labels refer to the same or like components or elements. It will be understood, however, that similar components or elements may also be referred to with different numerals and labels.
Throughout this specification, a reference to “one embodiment,” “an embodiment,” or similar language implies that a particular feature, structure, or characteristic described in connection with the embodiment referred to is included in at least one embodiment of the present invention. Thus, phrases “in one embodiment,” “in an embodiment,” and similar terms used throughout this specification may, but do not necessarily, all refer to the same embodiment. Moreover, it will be understood that features, elements, components, structures, details, or characteristics of various embodiments of the invention described in the specification may be combined in any suitable manner in one or more embodiments. A skilled artisan will recognize that the invention may possibly be practiced without one or more of the specific features, elements, components, structures, details, or characteristics, or with the use of other methods, components, materials, and so forth. Therefore, although a particular detail of an embodiment of the invention may not be necessarily shown in each and every drawing describing such embodiment, the presence of this detail in the drawing may be implied unless the context of the description requires otherwise. In other instances, well known structures, details, materials, or operations may be not shown in a given drawing or described in detail to avoid obscuring aspects of an embodiment of the invention.
The schematic flow chart diagram that is included is generally set forth as a logical flow-chart diagram. As such, the depicted order and labeled steps of the logical flow are indicative of one embodiment of the presented method. Other steps and methods may be conceived that are equivalent in function, logic, or effect to one or more steps, or portions thereof, of the illustrated method. Additionally, the format and symbols employed are provided to explain the logical steps of the method and are understood not to limit the scope of the method. Although various arrow types and line types may be employed in the flow-chart diagrams, they are understood not to limit the scope of the corresponding method. Indeed, some arrows or other connectors may be used to indicate only the logical flow of the method. For instance, an arrow may indicate a waiting or monitoring period of unspecified duration between enumerated steps of the depicted method. Additionally, the order in which a particular method occurs may or may not strictly adhere to the order of the corresponding steps shown.
Collection of information about oxygenation levels in the human visual system has previously been carried out with various methods including, but not limited to: physically invasive measurement of oxygen tension (Po2) in the ONH using O2-sensitive microelectrodes inserted into the eye; injection of a phosphorescent dye to study Po2 in the retinal and choroidal vessels, as well as the microvasculature of the ONH rim; and spectral imaging. While the first methodology allows Po2 distribution to be determined relatively accurately in three dimensions, its invasive nature limits its use to animal models and precludes clinical application. The second technique has not yet been approved for use in humans. Spectral imaging, on the other hand, is a non-invasive technique that can be a powerful tool for identifying retinal hypoxia that is associated with established stages of diabetic retinopathy (DR), for example. Results of several oximetry studies conducted with the use of spectral imaging indicate that interest in developing oximetry methodology and its applications to studies of retinal disorders is increasing.
The conventional hyperspectral imaging approach, when used for detecting optical spectra of a human eye in reflection, for example, employs sequential one-dimensional imaging across a chosen spectral region (such as the entire visible and infrared, IR, region) with a chosen spectral resolution. The technique uses a push-broom style scanner. Each acquired imaging frame holds the spatial (x) and spectral (λ) axes for each line of the acquired hyperspectral image, with successive lines of the frame forming the z-axis in the stack of frames. A “band-sequential” hyperspectral image is obtained by rotation of the stack of images, interchanging the z and λ axes. After rotation, each frame contains a two-dimensional spatial image at a distinct wavelength in which intact structures are recognizable. A push-broom scanner spectrometer is a scanning spectral imaging device conventionally used for image acquisition. The scanner employs a lens system to focus input light (such as light reflected from the fundus) onto a field-limiting entrance slit and collimate the light. The collimated light is then spectrally dispersed (by, e.g., a diffraction grating) and focused onto a CCD camera to acquire spectral information. The device allows a two-dimensional (2D) detector to sample the spectral dimension and one spatial dimension (e.g., the x-axis) simultaneously. As a result, a one-dimensional (1D) line spectral image is obtained. Information in the other spatial dimension (e.g., the z-axis) is generated by spatial scanning. The resulting full image is obtained by appropriately compiling the individual line images.
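The axis interchange that produces a band-sequential cube can be illustrated with a short sketch; the dimensions below are hypothetical and not tied to any particular scanner:

```python
import numpy as np

# Hypothetical dimensions: 64 scan lines (z), 128 spatial samples (x),
# 32 spectral samples (lambda).
n_lines, n_x, n_lambda = 64, 128, 32

# Push-broom acquisition: each frame holds one spatial line (x) and the
# full spectral axis (lambda); successive frames form the z-axis.
stack = np.random.rand(n_lines, n_x, n_lambda)   # axes: (z, x, lambda)

# "Rotating" the stack interchanges the z and lambda axes, yielding a
# band-sequential cube: each frame is now a 2D spatial image (z, x)
# at one distinct wavelength.
band_sequential = np.moveaxis(stack, 2, 0)        # axes: (lambda, z, x)
```

Each `band_sequential[k]` is then the full spatial image at the k-th wavelength, which is the form used for per-wavelength processing.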
It is appreciated, therefore, that in conventional sequential 1D hyperspectral imaging the overall image acquisition process can take a long time, for example at least several seconds. At the same time, the typical latency and duration of saccades, which serve as a mechanism for fixation and rapid eye movement and cannot be controlled, are about 100 msec or even shorter and depend on the frequency of the eye movement. On approximately the same time scale, lighting conditions may also change, unpredictably complicating the processing of acquired image data. As a result of the sequential nature of the hyperspectral image collection, therefore, the eye must be immobilized for the duration of the imaging scan, which at least impedes the imaging procedure and often requires the implementation of special means to prevent the cornea from drying. If the eye of the subject remains free to move, the resulting full image, taken line-by-line, is fractured in a fashion similar to that of a photograph that has been shredded into “stripes” and then reassembled or reconstructed without a frame of reference common to each stripe. It is clear, therefore, that conventionally-implemented multi-spectral imaging, which requires reconstruction of the final image from the individually obtained spectral images into what is sometimes referred to as a “spectral cube” or a “composite image” (every portion of which contains the spectral information about the object), complicates the data acquisition procedure.
Embodiments of the present invention implement a seven-wavelength oximetry methodology and facilitate the acquisition of OS-related information via imaging of the eye that is not impeded by saccadic movements.
Multi-spectral imaging (MSI) equips the analysis of specimens with computerized imaging systems by providing access to the spectral distribution of an image at the pixel level. While there exists a variety of multispectral imaging systems, an operational aspect common to all of these systems is the capability to form a multispectral image. A multispectral image is one that captures image data at specific wavelengths or within specific spectral bandwidths across the electromagnetic spectrum. These wavelengths may be singled out by optical filters or by the use of other instruments capable of selecting a pre-determined spectral component, including electromagnetic radiation at wavelengths beyond the visible range such as, for example, infrared (IR).
An embodiment of an imaging camera that may be used with the present invention stems from the realization that the use of a two-dimensional array of secondary objective lenses, positioned so as to spatially split or segment the incoming beam substantially at the plane where an image of the entrance pupil of the primary objective of the imaging system is located, significantly simplifies the multispectral multi-channel imaging system. In such a configuration, the array of secondary objectives performs the role of a beam-splitting means, and there is no need for a separate beam-splitting component. As a result, folding of the optical path can be avoided. Additional advantages of this configuration include simplicity of assembly, modularity, and reconfigurability of the imaging system.
It is also realized that, due to the very nature of conventional multi-channel systems, which are configured to maximize the spatial resolution of the resulting images, axially-asymmetric spatial truncation of the incoming light distribution in such conventional systems should be avoided at all costs. The related art recognizes this limitation and admonishes against specific configurations that spatially segment or split the incoming beam asymmetrically with respect to the optical axis of the system. In particular, the related art refers to difficulties of proper correlation and registration of the images produced by imaging sub-portions of the so-segmented incoming beam. In conventional multi-channel systems, the precision and symmetry of positioning of beam-splitting components in a transverse (with respect to the optical axis of the system) plane substantially define the resulting spatial resolution in the image plane. However, in applications that do not require imaging systems with maximized spatial resolution, or that can employ imaging systems having spatial resolution below a pre-defined threshold, axially-asymmetric positioning of the secondary objectives forming multiple images in the imaging plane can be sufficient. Embodiments of the present invention and applications of these embodiments take advantage of such a configuration.
The detector element 340 may be a single detector such as a CCD disposed in a back focal plane of the array 330 that contains substantially optically identical objectives 334. In reference to
In further reference to
Generally, the optical components of embodiments of the present invention may be made of optical-quality glass, crystals, or polymeric materials, or of any other materials possessing optical quality in transmission of light. The focal-plane 2D-array 340 can be a CCD, CMOS, InGaAs, InSb, or any other type of focal-plane array used for purposes of detecting light.
An alternative embodiment of the present invention is further described in reference to
In general, embodiments of the invention can be adapted to operate in conjunction with commercially-available imaging devices such as, e.g., a fundus imaging device 502 (Zeiss FF450 IR) schematically shown in
In a particular application contemplated by the present invention, and in reference to
In reference to
As shown in
As discussed herein, the fact that spatial segmentation of the incoming beam is achieved by using the secondary objectives of the re-imaging sub-system as a spatial beam-splitting means at the location of the exit pupil of the relay sub-system of the present invention makes it possible to avoid the auxiliary beam-splitting components used in the prior art and, therefore, increases the system's tolerance to mechanical misalignments.
A specific embodiment 1000 of the camera is shown in
The advantages of embodiments of an imaging system described above include, without limitation, the recordation of a complete spectral image with the use of a two-dimensional focal-plane detector array in a single exposure, without the need to spatially deviate the image-forming beams from one another. This greatly simplifies the recombination of the individual images into a single spectral image. The advantages of using embodiments of the present invention are particularly pronounced in applications that involve imaging of dynamic objects, for example moving objects.
While the discussion below mainly refers to processing of data acquired through imaging of retina-related biological tissue, this is done only for simplicity of the discussion, and the processing of data related to images of an ocular tissue or a component of an eye in general is within the scope of the invention. According to an embodiment of the invention, optical data characterizing a retinal scene in different spectral bands are acquired simultaneously (for example, with an embodiment of the imaging system discussed above or a similarly performing imaging system). Generally, the optical data are acquired with the use of an optical system including a beam-splitting arrangement that divides an incoming light wavefront into a plurality of optical beams, each of which defines a specific spectral bandwidth. In one embodiment, such acquisition may occur within a time period that is shorter than the duration of a typical saccade and, optionally, at several discrete wavelengths. In a specific embodiment, the time-length of a snap-shot imaging is no more than about 10 msec, and preferably less than 5 msec. The acquired data are then processed, as presented below, to determine the OS level of the retinal scene. Such a method of data acquisition facilitates performing a retinal diagnostic on the subject without immobilizing the subject's eye. For example, a single snapshot of the retina taken by an imaging camera such as that described above provides, at an output, seven discrete 2D images at seven pre-determined wavelengths (the isosbestic wavelengths of 522 nm, 548 nm, 569 nm, and 586 nm, and the oxygen-sensitive wavelengths of 542 nm, 560 nm, and 577 nm) that define a discrete approximation of the reflectance spectrum of the retina.
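As a hypothetical illustration of such a single-snapshot output, the sketch below splits one detector frame into seven side-by-side spectral sub-images. The tiling geometry and dimensions are assumptions chosen for illustration; a real camera would additionally require per-channel registration and calibration:

```python
import numpy as np

# Seven pre-determined wavelengths (nm), as listed above.
WAVELENGTHS = (522, 542, 548, 560, 569, 577, 586)

def split_snapshot(frame, n_channels=7):
    """Split a single detector frame holding n_channels side-by-side
    sub-images into an (n_channels, H, W) spectral cube.

    Assumes a simple horizontal tiling of equal-width sub-images.
    """
    h, w = frame.shape
    sub_w = w // n_channels
    return np.stack([frame[:, i * sub_w:(i + 1) * sub_w]
                     for i in range(n_channels)])

frame = np.random.rand(100, 7 * 80)   # synthetic single-exposure snapshot
cube = split_snapshot(frame)          # cube[k] is the image at WAVELENGTHS[k]
```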
The image data contained in such a single snapshot also make it possible to determine the relative values of the intensity of light reflected by the imaged tissue (such as an ONH) at the pre-determined wavelengths, which are then plotted in arbitrary units of intensity as a function of wavelength, as shown by curve A in
As used for the purposes of the description and the appended claims, an isosbestic wavelength is a wavelength at which reflectance spectra of oxygen saturated (HbO2 signature) and oxygen-unsaturated (Hb signature) blood in biological tissue, measured under otherwise equal conditions, have equal values. The HbO2-signature curve typically contains two minima respectively corresponding to wavelengths of peaks of light absorption by the saturated retinal blood, while the Hb-signature curve typically has a single broad minimum. A point on a reflectance spectrum of a given blood sample that corresponds to an isosbestic wavelength is referred to as an isosbestic point.
Sequentially connecting the isosbestic points of curve A further defines curve B (shown as a dotted line in
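The construction of curve B from the isosbestic points, and the normalization of the area bound by curves A and B by the area under curve B, can be sketched as follows. This is a minimal illustration assuming a piecewise-linear curve B and trapezoidal integration; the actual processing details may differ:

```python
import numpy as np

# Wavelengths (nm) from the specification.
ISOSBESTIC = np.array([522.0, 548.0, 569.0, 586.0])
ALL_WL = np.array([522.0, 542.0, 548.0, 560.0, 569.0, 577.0, 586.0])

def _trapz(y, x):
    """Trapezoid-rule integration over sample points."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def os_signature(intensities):
    """Relative oxygen-saturation signature for one image point.

    `intensities` holds the seven detected values ordered as in ALL_WL.
    Curve A is the measured discrete reflectance spectrum; curve B is the
    piecewise-linear curve through the four isosbestic points.  The regions
    bound by the two curves adjoin at the isosbestic points (where A = B);
    their combined area is normalized by the area under curve B, which is
    independent of the oxygen-saturation level.
    """
    iso_vals = intensities[np.isin(ALL_WL, ISOSBESTIC)]
    # Curve B evaluated at all seven wavelengths by linear interpolation.
    curve_b = np.interp(ALL_WL, ISOSBESTIC, iso_vals)
    # First area: |A - B| integrated over wavelength; the integrand is zero
    # at the isosbestic points, so this sums the adjoining regions.
    first_area = _trapz(np.abs(intensities - curve_b), ALL_WL)
    # Second area: area under curve B, used as the normalizer.
    second_area = _trapz(curve_b, ALL_WL)
    return first_area / second_area
```

Applying this per pixel over the spectral cube yields the two-dimensional array of normalized values described earlier.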
To appreciate the significance of defining the discrete reflectance spectrum via measurements of light intensity reflected off the retina at seven pre-determined wavelengths, in accordance with an embodiment of the present invention,
A further data normalization step may be required in order to be able to compare the relative OS values determined with the use of different blood volumes. For this purpose, the relative perfusion index (RPI), defined as
RPI = (7/2) * (I_522 + I_586) / (I_522 + I_542 + I_548 + I_560 + I_569 + I_577 + I_586),

where I_ijk is the detected intensity at a wavelength of ijk nm, is used as a normalizing coefficient. The relative OS value of the imaged scene (such as an ONH) that is normalized with respect to the volume of blood in the imaged scene is obtained by dividing the previously determined normalized areas a, b, and c by the RPI.
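A direct transcription of the RPI formula, using a hypothetical mapping from wavelength to detected intensity:

```python
def relative_perfusion_index(I):
    """Relative perfusion index from the seven detected intensities.

    `I` maps wavelength (nm) to detected intensity, following
    RPI = 7/2 * (I522 + I586) / (I522 + I542 + I548 + I560 + I569 + I577 + I586).
    """
    total = sum(I[w] for w in (522, 542, 548, 560, 569, 577, 586))
    return 7.0 / 2.0 * (I[522] + I[586]) / total
```

A blood-volume-normalized relative OS value would then be obtained as, e.g., `area / relative_perfusion_index(I)` for each previously determined area.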
The percent OS value is calculated from groups of pixels associated with separately imaged tissue components by fitting (with the use of, e.g., a linear least-squares curve-fit model) the recorded hemoglobin spectrum to reference spectral curves acquired with the use of fully oxygenated (substantially 100% oxygenation level) and deoxygenated (substantially 0% oxygenation level) blood. An assumption that arterial blood has an OS of 98% was used.
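The least-squares fitting step can be sketched as below. The reference spectra are assumed inputs (e.g., sampled at the seven wavelengths), and the expression for percent OS from the two fitted coefficients is one common convention rather than a formula dictated by the specification:

```python
import numpy as np

def percent_os(measured, ref_hbo2, ref_hb):
    """Estimate percent oxygen saturation by a linear least-squares fit
    of the recorded spectrum to fully oxygenated (HbO2, ~100%) and
    deoxygenated (Hb, ~0%) reference spectra.

    Solves measured ≈ a * ref_hbo2 + b * ref_hb in the least-squares
    sense and returns 100 * a / (a + b).
    """
    A = np.column_stack([ref_hbo2, ref_hb])
    coeffs, *_ = np.linalg.lstsq(A, measured, rcond=None)
    a, b = coeffs
    return 100.0 * a / (a + b)
```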
An embodiment of the invention has been described as including a processor controlled by instructions stored in a memory. The memory may be random access memory (RAM), read-only memory (ROM), flash memory or any other memory, or combination thereof, suitable for storing control software or other instructions and data. Some of the functions performed by embodiments have been described with reference to flowcharts and/or block diagrams. Those skilled in the art should readily appreciate that functions, operations, decisions, etc. of all or a portion of each block, or a combination of blocks, of the flowcharts or block diagrams may be implemented as computer program instructions, software, hardware, firmware or combinations thereof. Those skilled in the art should also readily appreciate that instructions or programs defining the functions of the present invention may be delivered to a processor in many forms, including, but not limited to, information permanently stored on non-writable storage media (e.g. read-only memory devices within a computer, such as ROM, or devices readable by a computer I/O attachment, such as CD-ROM or DVD disks), information alterably stored on writable storage media (e.g. floppy disks, removable flash memory and hard drives) or information conveyed to a computer through communication media, including wired or wireless computer networks. In addition, while the invention may be embodied in software, the functions necessary to implement the invention may optionally or alternatively be embodied in part or in whole using firmware and/or hardware components, such as combinatorial logic, Application Specific Integrated Circuits (ASICs), Field-Programmable Gate Arrays (FPGAs) or other hardware or some combination of hardware, software and/or firmware components.
Distribution of the relative OS values across the imaged scene (e.g., an ONH) can be further used to produce 2D maps of oxygen saturation values across the region of interest of the retina. The maps may include arrays of data, or, alternatively, may include color-coded 2D images of the retinal ROI where the color-coding represents the levels of oxygen saturation.
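A minimal sketch of the color-coding step; the blue(low)-to-red(high) ramp and the value range are illustrative choices, not a color scale dictated by the specification:

```python
import numpy as np

def os_color_map(os_values, os_min=0.0, os_max=1.0):
    """Map a 2D array of relative OS values to a color-coded RGB image.

    Values are clipped to [os_min, os_max] and mapped on a simple
    blue (low OS) to red (high OS) ramp.
    """
    t = np.clip((os_values - os_min) / (os_max - os_min), 0.0, 1.0)
    rgb = np.empty(os_values.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = (255 * t).astype(np.uint8)          # red channel: high OS
    rgb[..., 1] = 0                                   # green unused
    rgb[..., 2] = (255 * (1.0 - t)).astype(np.uint8)  # blue channel: low OS
    return rgb
```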
Embodiments of an imaging system described above are intended to be merely exemplary; numerous variations and modifications will be apparent to those skilled in the art. All such variations and modifications are intended to be within the scope of the present invention. For example, although a relay portion of the discussed embodiments of the invention operates in a telecentric configuration, it is recognized that generally a primary objective may operate to produce an intermediate image with finite magnification. In this case, an additional relay lens may be positioned in proximity to the intermediate image. Alternatively or in addition, it is understood that spectral filters may be positioned after the secondary objectives, with respect to the object.
Discussed embodiments and related modified embodiments can be advantageously used not only in medical applications, but also in military applications, agricultural applications such as harvesting, and geology, for example. Implementations of the idea of the invention allow the user to image and characterize dermatological diseases; ocular diseases caused by hypoxia or ischemia, such as glaucoma, optic neuropathies, retinal vascular occlusions (in veins or arteries), retinopathies such as infectious, inflammatory, or ischemic retinopathies (e.g., sickle cell disease, retinopathy of prematurity), diabetic retinopathy, and macular degeneration; degenerative diseases in which ischemia plays a role, such as Alzheimer's disease; and retinal dystrophies or degenerations (e.g., retinitis pigmentosa). Other imaging applications of embodiments, such as, for example, imaging of cardiovascular disease or kidney disease with retinal vascular implications, are also contemplated within the scope of the invention. Furthermore, embodiments of the imaging system could be appropriately modified for use in imaging of other systemic diseases for diagnostic or therapeutic follow-up. One non-limiting example would be the use of an embodiment in endoscopic imaging (e.g., in colonoscopy), brain imaging, or dermatological imaging where tissues are analyzed for ischemia or response to treatment.
The present application claims priority from U.S. Provisional Patent Application No. 61/329,205 titled “Single Exposure Multispectral Camera” and filed on Apr. 29, 2010, and U.S. Provisional Patent Application No. 61/478,847 titled “Determination of Oxygen Saturation in a Tissue of Visual System” and filed on Apr. 25, 2011. The disclosure of each of the above-mentioned applications is incorporated herein in its entirety by reference.
Filing Document | Filing Date | Country | Kind | 371(c) Date
---|---|---|---|---
PCT/US11/33939 | 4/26/2011 | WO | 00 | 12/18/2012
Number | Date | Country
---|---|---
61329205 | Apr 2010 | US
61478847 | Apr 2011 | US