VIRTUAL HYPERSPECTRAL IMAGING OF BIOLOGICAL TISSUE FOR BLOOD HEMOGLOBIN ANALYSIS

Information

  • Patent Application Publication Number
    20230000357
  • Date Filed
    November 24, 2020
  • Date Published
    January 05, 2023
Abstract
A system for generating hyperspectral imaging data for measuring biochemical compositions is disclosed which includes a spectral imaging device adapted to acquire one or more hyperspectral linescan images, an optical imaging device with a red-green-blue (RGB) sensor adapted to acquire an RGB dataset, a processor adapted to co-locate a plurality of pixels in the RGB dataset vs. a corresponding plurality of pixels of the one or more hyperspectral linescan datasets, establish a transformation matrix utilizing the plurality of co-located pixels, apply the transformation matrix to the RGB dataset to thereby generate the hyperspectral dataset, and analyze the generated hyperspectral image dataset to determine the biochemical compositions.
Description
TECHNICAL FIELD

The present disclosure generally relates to generating a hyperspectral imaging dataset, recovering hyperspectral information from RGB values, analyzing blood, and in particular, to a system and method of analyzing biological tissue for blood hemoglobin analysis.


BACKGROUND

This section introduces aspects that may help facilitate a better understanding of the disclosure. Accordingly, these statements are to be read in this light and are not to be understood as admissions about what is or is not prior art.


Blood hemoglobin (Hgb) tests are routinely ordered as an initial screening of the amount of red blood cells (or hemoglobin) in the blood as part of a general health test for a subject. Blood Hgb tests are extensively performed for a variety of patient care needs, such as detection of anemia caused by other underlying diseases, hemorrhage detection after traumatic injury, assessment of hematologic disorders, and transfusion initiation. There are several biological assays for measuring blood Hgb content in grams per deciliter (i.e. g dL−1) from blood drawn via traditional needle-based methods. Portable point-of-care hematology analyzers using blood draws (e.g. Abbott i-STAT and HemoCue) are also commercially available. However, all these tests require expensive and environment-sensitive analytical cartridges with short shelf lives, and are unaffordable in both resource-limited and homecare settings. In addition, repeated blood Hgb measurements using these invasive tests can cause iatrogenic complications such as blood loss.


Unlike measuring oxygen saturation with pulse oximetry, noninvasive measurement of total Hgb concentration in the blood is not straightforward. A few noninvasive Hgb testing devices (e.g. MASIMO and ORSENSE) have recently become available and are currently undergoing clinical studies for immediate reading and continuous monitoring of blood Hgb levels in different clinical settings. Aside from the relatively high cost associated with operating and maintaining the equipment, the medical community agrees that the broad limits of agreement between these devices and central laboratory tests pose a significant challenge in making clinical decisions, thus generating skepticism about clinical adoption. Several smartphone-based anemia detection technologies (e.g., HEMOGLOBE, EYENAEMIA, and HEMAAPP) have also made progress; however, most of these mobile applications are intended for initial screening or risk stratification of severe anemia and are not developed for measuring exact Hgb content in units of g dL−1.


Therefore, there is an unmet need for a novel technology that can provide noninvasive Hgb measurements that can be relied upon for accuracy without the complications associated with expensive laboratory equipment.


SUMMARY

A system for generating hyperspectral imaging data for measuring biochemical compositions is disclosed. The system includes a spectral imaging device adapted to acquire one or more hyperspectral linescan images from one or more regions of interest of a subject, thereby generating one or more hyperspectral linescan datasets. The system further includes an optical imaging device with a red-green-blue (RGB) sensor adapted to acquire an RGB image from the region of interest of the subject, thereby generating an RGB dataset. The system further includes a processor which is adapted to co-locate a plurality of pixels in the RGB dataset vs. a corresponding plurality of pixels of the one or more hyperspectral linescan datasets. The processor is further adapted to establish a transformation matrix utilizing the plurality of co-located pixels, the transformation matrix adapted to convert the RGB dataset into a hyperspectral dataset of the region of interest. Additionally, the processor is adapted to apply the transformation matrix to the RGB dataset to thereby generate the hyperspectral dataset for the region of interest. Furthermore, the processor is adapted to analyze the generated hyperspectral image dataset to determine the biochemical compositions.


According to one embodiment, in the system of the present disclosure each of the plurality of co-located pixels from the RGB dataset is associated with a 3×1 RGB value matrix.


According to one embodiment, in the system of the present disclosure each of the co-located plurality of pixels from the hyperspectral linescan dataset is associated with an N×1 spectrum matrix, where N represents the number of discretized spectral values between a lower bound and an upper bound.


According to one embodiment, in the system of the present disclosure the lower and upper bounds are determined by the spectral range of RGB sensors.


According to one embodiment, in the system of the present disclosure the spectral range of the sensors is between 400 nm and 800 nm.


According to one embodiment, in the system of the present disclosure the transformation matrix is an inverse of the RGB response function matrix of the RGB sensor.


According to one embodiment, in the system of the present disclosure the inverse of the transformation matrix is determined numerically by using RGB and spectral data from a subset of the co-located plurality of pixels.


According to one embodiment, in the system of the present disclosure the region of interest includes the inner eyelid.


According to one embodiment, in the system of the present disclosure the biochemical compositions include blood hemoglobin.


According to one embodiment, in the system of the present disclosure the biochemical compositions are determined using spectral analysis.


According to one embodiment, in the system of the present disclosure the spectral analysis includes a partial least squares regression statistical modeling technique to first build a model from a training set of a first hyperspectral dataset vs. the biochemical compositions and then apply the model to a second dataset from the generated hyperspectral image dataset.


A method for generating hyperspectral imaging data for measuring biochemical compositions is also disclosed. The method includes obtaining one or more hyperspectral linescan images using a spectral imaging device from one or more region of interest of a subject, thereby generating one or more hyperspectral linescan datasets. The method also includes obtaining an RGB image from the region of interest using an optical imaging device with a red-green-blue (RGB) sensor, thereby generating an RGB dataset. The method further includes co-locating a plurality of pixels in the RGB dataset vs. a corresponding plurality of pixels of the one or more hyperspectral linescan datasets. Additionally, the method includes establishing a transformation matrix utilizing the plurality of co-located pixels, the transformation matrix adapted to convert the RGB dataset into a hyperspectral dataset of the region of interest. Furthermore, the method includes applying the transformation matrix to the RGB dataset to thereby generate the hyperspectral dataset for the region of interest. The method also includes analyzing the generated hyperspectral image dataset to determine the biochemical compositions.


According to one embodiment, in the method of the present disclosure each of the plurality of co-located pixels from the RGB dataset is associated with a 3×1 RGB value matrix.


According to one embodiment, in the method of the present disclosure each of the co-located plurality of pixels from the hyperspectral linescan dataset is associated with an N×1 spectrum matrix, where N represents the number of discretized spectral values between a lower bound and an upper bound.


According to one embodiment, in the method of the present disclosure the lower and upper bounds are determined by the spectral range of RGB sensors.


According to one embodiment, in the method of the present disclosure the spectral range of the sensors is between 400 nm and 800 nm.


According to one embodiment, in the method of the present disclosure the transformation matrix is an inverse of the RGB response function matrix of the RGB sensor.


According to one embodiment, in the method of the present disclosure the inverse of the transformation matrix is determined numerically by using RGB and spectral data from a subset of the co-located plurality of pixels.


According to one embodiment, in the method of the present disclosure the region of interest includes the inner eyelid.


According to one embodiment, in the method of the present disclosure the biochemical compositions are determined using spectral analysis.


According to one embodiment, in the method of the present disclosure the spectral analysis includes a partial least squares regression statistical modeling technique to first build a model from a training set of a first hyperspectral dataset vs. the biochemical compositions and then apply the model to a second dataset from the generated hyperspectral image dataset.





BRIEF DESCRIPTION OF DRAWINGS


FIG. 1a is a simplified block diagram depicting the major blocks of the system of the present disclosure.



FIG. 1b is a flowchart of steps of an algorithm of the present disclosure.



FIG. 1c is a flowchart of steps of an algorithm of the present disclosure.



FIG. 1d is a flowchart of steps of an algorithm of the present disclosure.



FIGS. 2a and 2b provide a photograph of a hyper-spectrographic setup capable of providing hyperspectral image data for a subarea (or a line), as shown in FIG. 2b.



FIG. 2c is a photograph of the microvessel-mimicking phantom.



FIGS. 2d and 2e show hyperspectral linescans outlined in the RGB image of the microvessel phantom of FIG. 2c.



FIG. 2f shows spectra corresponding to the average intensity along the distance outlined in FIG. 2e for two different Hgb concentrations.



FIG. 2g is a photograph of the inner eyelid, with a frame of pixels shown thereon.



FIG. 2h is a graph of relative sensitivity (%) vs wavelength in (nm) for red, green, and blue spectral responses of a camera.



FIG. 3 is a graph of normalized intensity vs. wavelength in nm showing the full width at half maximum (FWHM) of a HeNe laser.



FIG. 4 is a collection of graphs of blood hemoglobin (Hgb) in g dL−1 vs. wavelength in nm showing a comparison between a hyperspectral dataset (acquired by the image-guided hyperspectral system) and the algorithm-reconstructed hyperspectral datasets (based on an RGB image) according to the algorithm of the present disclosure for various blood Hgb levels.



FIG. 5a is a schematic of an inner eyelid showing a first linescan (line 1) used for training a model and a second line scan (line 2) for testing the model.



FIGS. 5b and 5c are graphs of blood hemoglobin (Hgb) in g dL−1 vs. wavelength in nm (FIG. 5b) and intensity vs. wavelength in nm (FIG. 5c) for an obtained hyperspectral dataset vs. a calculated hyperspectral dataset (i.e., line 1 vs. line 2 of FIG. 5a) according to the algorithm of the present disclosure.



FIG. 6 is a histogram summarizing the blood Hgb values of a total of 153 individuals used for spectroscopic and blood Hgb reconstruction measurements using the algorithm of the present disclosure.



FIG. 7 is a collection of graphs of a linear correlation between the computed blood Hgb content and the laboratory blood Hgb levels, and of differences in blood hemoglobin in g dL−1, for one subset of the population of individuals (138) used as training data as well as a second subset of individuals (15) used as testing data.





DETAILED DESCRIPTION

The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.


For the purposes of promoting an understanding of the principles of the present disclosure, reference will now be made to the embodiments illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of this disclosure is thereby intended.


In the present disclosure, the term “about” can allow for a degree of variability in a value or range, for example, within 10%, within 5%, or within 1% of a stated value or of a stated limit of a range.


In the present disclosure, the term “substantially” can allow for a degree of variability in a value or range, for example, within 90%, within 95%, or within 99% of a stated value or of a stated limit of a range.


For noninvasive blood Hgb measurements, it is important to rely on an appropriate anatomical sensing site where the underlying microvasculature is exposed on the skin surface without being affected by confounding factors of skin pigmentation and light absorption of molecules (e.g. melanin) in tissue. Commonly used clinical examination sites of pallor or microcirculation, such as the conjunctiva, the nailbed, the palm, and the sublingual region, provide a clue for examination site selection. Specifically, the palpebral conjunctiva (i.e. inner eyelid) can serve as an ideal site for peripheral access, because the microvasculature is easily visible and melanocytes are absent. The easy accessibility of the inner eyelid allows for reflectance spectroscopy and digital photography to be tested for anemia assessments.


To this end, the present disclosure advantageously applies techniques typically used in astrophysics to the inside of the eyelid to ascertain hyperspectral imaging data of tissue which can then be used to measure effective blood hemoglobin content. Referring to FIGS. 1a, 1b, 1c, and 1d, a block diagram and flowcharts depicting the system and method of the present disclosure are provided. Referring to FIG. 1a, a simplified block diagram is shown depicting the major blocks of the system 10 of the present disclosure. The system 10 includes a hyperspectral imaging system that can associate wavelength information with a plurality of pixels. It should be appreciated that hyperspectral imaging of a large area requires complex imaging equipment and complex processes; however, producing a linescan, as discussed below, is significantly more straightforward and simplified. A linescan, however, cannot generate the information on the entire area of interest needed to calculate the target output, which is a measure of hemoglobin. Referring to FIG. 1a, the system 10 includes a linescan imaging apparatus 12 capable of generating linescans of a subarea of an area of interest, and a color (RGB) imaging apparatus 14 capable of generating a red-green-blue (RGB) image of the entire area of interest. The outputs of these two imaging apparatuses are combined by a processing system (not shown), partially represented as a summer 16, which produces a matrix of intensity as a function of the position (x, y) and the wavelength of light λ (also known as a hypercube or a hyperspectral image) without using a conventional hyperspectral imaging system. Using a hyperspectral dataset and an RGB dataset from the subarea (or a line), a transformation (or extrapolation) algorithm (referred to herein as a virtual hyperspectral imaging (VHI) algorithm, depicted across FIGS. 1b, 1c, and 1d) is used to construct a hyperspectral image for the larger area of interest, all of which is represented by the summer 16. 
The aforementioned transformation algorithm is then applied to the RGB dataset at pixels outside the subarea (or line) to generate a hyperspectral image dataset for the entire area by the processing system (not shown), as represented by the block 18.


Referring to FIG. 1b, the specifics of the VHI algorithm are provided. First, a method 100 performed by the processing system (not shown) is described. The method 100 begins at the initial block 102 (identified as “Begin”). The method 100 then proceeds to a block 104 wherein the method 100 obtains hyperspectral linescan data from the linescan imaging apparatus 12 (see FIG. 1a) for a subarea (or line) of interest. The output of the linescan is provided as I(xsub, ysub, λ) for each pixel in the subarea (or line), where λ is the wavelength output of the pixel having location x and y. The method 100 next moves to a block 106 where an RGB image of the area of interest is obtained from the RGB imaging apparatus 14. The data obtained includes three image intensity sets: one for Red (R(xfull, yfull)), one for Green (G(xfull, yfull)), and one for Blue (B(xfull, yfull)). Next, the method 100 proceeds to a block 108 where the image intensity sets of the block 106 are divided into the subarea (or line) and everywhere else. The subarea image intensity dataset includes R(xsub, ysub), G(xsub, ysub), and B(xsub, ysub); the remainder includes R(xout, yout), G(xout, yout), and B(xout, yout), where the subscripts sub and out denote pixels inside and outside the subarea (or line), respectively. Next, the method 100 proceeds to a block 110 where the hyperspectral datasets from the block 104 are matched to the RGB image intensity data of the subarea (or line) from the block 108. In particular, the method 100 makes the following correspondences: R(xsub, ysub)←→I(xsub, ysub, λ), G(xsub, ysub)←→I(xsub, ysub, λ), and B(xsub, ysub)←→I(xsub, ysub, λ). The method 100 then obtains the RGB spectral response functions (also known as RGB spectral sensitivity functions) of the RGB imaging apparatus 14 (see FIG. 1a), as shown in a block 112. The RGB spectral response functions of the RGB imaging apparatus are data provided by the manufacturer of the RGB imaging apparatus (sensors and cameras) 14 (see FIG. 1a). 
The datasets in the blocks 112 and 110 are combined, as provided by the summer 114, in order to generate a transformation matrix (M) in a block 116, according to a procedure provided in FIG. 1d, discussed below. Once the transformation matrix is generated in the block 116, the method 100 applies the transformation matrix to the image intensity data of the area of interest as provided in a block 118. This operation is defined as: M·R(xfull, yfull), M·G(xfull, yfull), and M·B(xfull, yfull). The operation in the block 118 results in the hyperspectral data for the area of interest I(xfull, yfull, λ), as provided in a block 120.
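By way of a non-limiting illustration, the operation of the block 118 can be sketched in a few lines of numpy, where a previously derived transformation matrix M (shape N×3, mapping a 3×1 RGB value to an N×1 spectrum) is applied to every pixel of the full RGB image to produce the hypercube I(x, y, λ). The function and array names here are assumptions for illustration only, not part of the disclosure:

```python
import numpy as np

def apply_transformation(M, rgb_image):
    """Sketch of block 118: M has shape (N, 3); rgb_image has shape (H, W, 3).
    Returns a hypercube of shape (H, W, N), i.e. one N-point spectrum per pixel."""
    # einsum multiplies each pixel's 3-vector of RGB intensities by M
    return np.einsum('nc,hwc->hwn', M, rgb_image)

# Toy check with an identity "transformation" over a 4x5 image
M = np.eye(3)                 # pretend N = 3 for illustration
rgb = np.ones((4, 5, 3))
cube = apply_transformation(M, rgb)
print(cube.shape)  # (4, 5, 3)
```

Per-pixel matrix multiplication like this is embarrassingly parallel, which is why the extrapolation step scales easily to the full field of view.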


Referring to FIG. 1c, a method 130 is shown that uses the hyperspectral data of the area of interest (i.e., the output of the method 100 as provided in the block 120 (see FIG. 1b)) and applies that output to obtain an estimate of hemoglobin. The method 130 starts in a block 132 (identified as “Begin”). The method 130 then proceeds to a block 134 where the output of the block 120 (see FIG. 1b) is obtained. Next, an area of the eyelid is isolated (xeyelid, yeyelid) as provided in a block 136, and the hyperspectral data from this area is separated from the larger area of interest. This output is shown in a block 138 as I(xeyelid, yeyelid, λ). Next, the method 130 computes the hemoglobin concentration based on the isolated hyperspectral data output of the block 138, as provided in a block 140. The method 130 then proceeds to a block 142 where the estimate of the hemoglobin concentration (g dL−1) is provided as the output of the method 130.


Referring to FIG. 1d, a method 150 according to the present disclosure is provided to depict how the transformation matrix (M), see block 116 of FIG. 1b, is generated. The method 150 begins at a block 152 (identified as “Begin”). The steps of the method 150 are described below.


Referring to FIGS. 2a and 2b, a photograph of a hyper-spectrographic setup is shown capable of providing hyperspectral image data for a subarea (or a line). The system shown in FIG. 2a includes a dual-channel system in which one detection arm is coupled with a hyperspectral linescanning system (see linescan imaging apparatus 12 of FIG. 1a) and the other detection arm is coupled with an imaging camera with an RGB imaging sensor (see RGB imaging apparatus 14 of FIG. 1a). The setup shown in FIG. 2a is inspired by an astronomical hyperspectral imaging system, which is used for imaging the inner eyelid. A subject sits in front of the system, facing a telecentric lens (or a lens), places the chin on the chinrest, and pulls down the eyelid for imaging when instructed. The system is adapted to instantaneously acquire a hyperspectral line in the center of the inner eyelid. The RGB image also shows the exact location where the hyperspectral linescanning is performed (shown as a translucent white rectangle with a physical height of about 6.4 mm). The hyperspectral linescan dataset contains spatial (y) information, which serves as a subarea, and wavelength (λ) information. When the averaged spectrum corresponds to the average intensity along the spatial y axis for each λ value, the characteristic absorption of hemoglobin (Hgb) is clearly visible. As a result, a two-dimensional hyperspectral graph of y vs. λ is generated, shown adjacent the photograph of an example inner eyelid, provided in FIG. 2b, for the hyperspectral linescan dataset. This two-dimensional graph can then be represented as a graph of wavelength in nm vs. intensity, also shown in FIG. 2b and aligned with the two-dimensional hyperspectral graph of y vs. λ.


Referring back to FIG. 2a, this custom-built dual-channel system uses a dual-channel spectrograph (Shelyak Instruments) that has two detection arms to allow for simultaneous acquisition of hyperspectral and RGB image data along the line. To provide broadband white-light illumination to the inner eyelid, a white-light LED ring (Neopixel RGBW 24 LED ring, Adafruit Industries) is attached to a telecentric lens (0.5×, Edmund Optics) via a custom-built 3-D printed ring holder to fit the lens circumference. Other lenses can be used; however, a telecentric lens offers the ability to eliminate parallax error and to provide constant magnification. Telecentric imaging is also beneficial for biological tissue, including resolution enhancement by diffuse light suppression, a large constant transverse field of view, consistent magnification over the axial direction, and a long working distance. The intensity of the LED ring is controlled with a microcontroller (Arduino UNO). The image-guided hyperspectral linescanning system has two data acquisition ports: a hyperspectral line-scanning port mounted with a mono CCD camera (e.g., PointGrey Grasshopper3 5.0 MP Mono, FLIR Integrated Imaging Solutions Inc.) and an image port mounted with a 3-color CCD (e.g., PointGrey Grasshopper3 5.0 MP Color). For hyperspectral linescanning (length=6.4 mm), the telecentric lens collects light scattered from the inner eyelid, which passes through the slit of the spectrograph and is dispersed by a diffraction grating. The diffraction grating in the dual-channel spectrograph was selected to cover the visible wavelength range of 400-700 nm, and the slit width is 23 μm inside the system, resulting in a spectral resolution of Δλ=1 nm as shown in FIG. 2b. For RGB imaging, the light from the inner eyelid is reflected via another mirror toward the imaging port to generate a field of view (14 mm×12 mm) with a spatial resolution of ˜150 μm. 
It should be noted that by changing the imaging lens, the field of view can easily be increased for other applications. The custom-built dual-channel system rests on a base with two interlocked x-y-z positioning bases that serve to move the imaging system to locate the eyelid image centered within the rectangular region of interest (ROI) guide, the ROI also being referred to herein as the area of interest. A LabVIEW Virtual Instrument (National Instruments Corporation) was generated to synchronize data acquisition, LED light control, and background room light subtraction. A series of tests were conducted using tissue-mimicking phantoms, see FIGS. 2c, 2d, 2e, and 2f. In FIG. 2c, a photograph of the microvessel-mimicking phantom is presented. Hgb-filled tubings are fixed at the bottom of the glass petri dish and are submerged in the optical scattering suspension. In FIGS. 2d and 2e, a hyperspectral linescan is outlined in the RGB image of the microvessel phantom. The microvessels are positioned perpendicular to the linescan. In FIG. 2f, each spectrum shown corresponds to the average intensity along the distance outlined in FIG. 2e. The microvessel with a higher Hgb concentration (5.0 g dL−1) has a lower reflection intensity for the wavelengths between 450 and 550 nm than the microvessel with the lower Hgb concentration (3.0 g dL−1). During eyelid imaging, the subject is asked to sit down facing the imaging system and to place their head in a chinrest. Once the eyelid is correctly focused and positioned within the ROI rectangle, we proceed with data acquisition, while reminding the individual not to move or close their eyes until the completion of the imaging session. Measurements of a reference reflectance standard (SRT-99-050, Labsphere, Inc.) are also conducted with the hyperspectral line-scanning system to correct for the system response (both illumination and detection).


As discussed above, a single hyperspectral linescan dataset does not have sufficient information to reliably extract hemoglobin data for the entire inner eyelid. Therefore, additional hyperspectral data for the entire inner eyelid is needed that can be used for averaging and other statistical operations. Such additional data can be formed from additional linescans or by extrapolation of one or more linescans, see FIGS. 1a, 1b, and 1c. According to one embodiment of the present disclosure, a sufficient number of hyperspectral linescan datasets can be progressively scanned and generated and then stitched together to form an ensemble of a portion of the inner eyelid. This is the typical approach of conventional hyperspectral imaging systems. Such a process is cumbersome and very slow for hyperspectral data acquisition, since it requires accounting for slight movement of the subject during each linescan capture. Alternatively, the linescan dataset can be used as a baseline and extrapolated using an RGB image, whereby the RGB image is converted into a hyperspectral dataset based on the single or multiple hyperspectral linescan datasets, as discussed above and with reference to the present disclosure. In the latter approach, hyperspectral data for the entire area can be generated from one or more hyperspectral linescan datasets. In other words, the hyperspectral data and the RGB image from the line are used to construct a transformation matrix (M, see block 116 in FIG. 1b) that mathematically predicts a hyperspectrum from the RGB data at a pixel location outside the line. One hyperspectral linescan dataset and its corresponding RGB dataset are sufficient to construct the transformation matrix. By applying this transformation matrix to all of the pixel locations outside the line, a hyperspectral imaging dataset is generated for the entire area (see block 120 in FIG. 1b). 
This extrapolation approach is referred to herein as the virtual hyperspectral imaging (VHI) algorithm/system. The VHI approach advantageously requires, at a minimum, only one raw hyperspectral linescan dataset and an RGB image, preferably produced at the same time to avoid subject movement, significantly simplifying the imaging requirements. The custom-built dual-channel system shown in FIG. 2a allows simultaneous acquisitions of hyperspectral line-scanning and RGB imaging. It should be noted that the hyperspectral image(s) and the RGB image can be produced at different times in close proximity to one another, as long as any variations due to movement of the eyelid are considered. With the VHI approach, a hyperspectral dataset for the entire eyelid can be generated in order to form a more accurate correlation to hemoglobin with only one or more hyperspectral linescan datasets and an RGB image.


The spectroscopic and VHI blood Hgb measurement systems and methods of the present disclosure are not affected by variations in the illumination and detection of the imaging systems or by the background ambient room light, as follows: The measured spectral intensity Im(λ) reflected from the inner eyelid at a given location (x, y) is expressed as a function of the wavelength λ:






Im(λ)=L(λ)C(λ)D(λ)r(λ)  (1)


where L(λ) is the spectral shape of the illumination light source,


C(λ) is the spectral response of all optical components in the imaging system (e.g. lenses and diffraction grating),


D(λ) is the spectral response of the detector (e.g. mono imaging sensor or RGB imaging sensor in the image-guided hyperspectral linescanning system), and


r(λ) is the true spectral intensity reflected from the inner eyelid. First, to compensate for the system response (i.e. L(λ)C(λ)D(λ)), we use reference reflectance standards that have a reflectivity of 99% in the visible range. Im(λ) is normalized by the reflectance measurement Ireference(λ) of the diffuse reflectance standard, in which rreference(λ)=0.99 in the visible range:










r(λ) = Im(λ) / Ireference(λ)  (2)

Second, to remove the ambient stray and background light Ibackground(λ), two measurements are acquired with the external light source (i.e., white-light LED ring illuminator of the custom-built dual imaging system) on and off. The measurements are repeated without the sample while the illumination is kept on. Finally, r(λ) is calculated by subtracting Ibackground(λ) from each measurement such that:










r(λ) = (Im(λ) − Ibackground(λ)) / (Ireference(λ) − Ibackground(λ))  (3)







This systematic and rigorous data acquisition procedure serves as the foundation for developing a reliable VHI transformation matrix and a universal blood Hgb computation algorithm. It should be noted that the built-in data acquisition step factoring out the contributions of room light conditions provides a unique advantage in generating this reliable blood Hgb calculation.
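The corrections of Eqs. (2) and (3) can be sketched numerically as follows. This is a toy illustration with synthetic spectra: the simulated system response, the 0.99 reflectivity scaling, and all variable names are assumptions for demonstration, not measured data:

```python
import numpy as np

def true_reflectance(I_m, I_reference, I_background):
    """Eq. (3): background-subtracted normalization by the reflectance standard."""
    return (I_m - I_background) / (I_reference - I_background)

# Synthetic wavelength grid and background/system terms
wavelengths = np.linspace(450, 679, 230)
I_bg = np.full_like(wavelengths, 0.05)            # ambient/stray room light
r_true = 0.4 + 0.1 * np.sin(wavelengths / 40.0)   # toy eyelid reflectance r(lambda)
system = 1.0 + 0.2 * np.cos(wavelengths / 60.0)   # L(l)C(l)D(l) system response

# Simulated measurements per Eq. (1), plus additive background light
I_ref = 0.99 * system + I_bg                      # 99% reflectance standard
I_m = r_true * system + I_bg                      # eyelid measurement

# Recover r(lambda); the 0.99 factor rescales by the standard's reflectivity
r = 0.99 * true_reflectance(I_m, I_ref, I_bg)
print(np.allclose(r, r_true))  # True
```

Because both the system response and the background cancel in the ratio, the recovered r(λ) is independent of illumination shape, detector response, and room light, which is exactly the robustness claimed above.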


To better understand this approach, reference is made to FIG. 2g, which is a photograph of the inner eyelid with a frame of pixels shown thereon. The frame is a two-dimensional RGB frame and includes pixels in the X-direction and the Y-direction. The corner pixel is shown as P11. The first row of pixels includes pixels P11, P12, P13, P14, . . . P1l, . . . and P1q. The second row includes pixels P21, P22, P23, P24, . . . P2l, . . . and P2q, and so on until the last row, which includes pixels Pm1, Pm2, Pm3, Pm4, . . . Pml, . . . and Pmq. The first column of pixels includes pixels P11, P21, P31, P41, . . . Pm1. The second column includes pixels P12, P22, P32, P42, . . . Pm2. The last column includes pixels P1q, P2q, P3q, P4q, . . . Pmq. The lth column, which happens to be the column coincident with the hyperspectral data (linescan), includes pixels P1l, P2l, P3l, P4l, . . . Pml (note that the second index in each of these pixels is a lowercase L (i.e., “l”), not the numeral one (“1”)). This column provides the aforementioned limited hyperspectral data. Concentrating on the lth column, each pixel (i.e., P1l, P2l, P3l, P4l, . . . Pml) having an RGB intensity can be paired to the corresponding hyperspectral pixel with corresponding wavelength data (i.e., λ1, λ2, λ3, λ4, . . . λm), where λ represents a discretized wavelength between a first wavelength (e.g., 450 nm) and a second wavelength (e.g., 679 nm). That is, each hyperspectral pixel is represented by a spectrum bounded between the lower and upper bounds. For a discretized spectrum, the number of wavelengths is identified as N. Therefore, for pixel P1l, one can correlate the RGB intensity of the pixel P1l to the spectrum obtained from the hyperspectral imaging of the same pixel (obtained preferably at the same time with different cameras). 
A transformation matrix can be derived from this correlation that can then be applied to other pixels and their associated RGB intensities in order to derive corresponding spectra of those other pixels.
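The co-location step described above can be sketched in code. The following illustrative Python fragment (the disclosure's pipeline is implemented in MATLAB; all array names and sizes here are hypothetical) pairs the RGB intensities of the lth column with the corresponding hyperspectral linescan spectra:

```python
import numpy as np

# Illustrative co-location of the l-th RGB column with the hyperspectral
# linescan (array names and sizes are hypothetical, not from the disclosure).
m, q, N = 64, 48, 230                 # frame rows, frame columns, spectral bands
rgb_frame = np.random.rand(m, q, 3)   # RGB frame holding pixels P11 .. Pmq
linescan = np.random.rand(m, N)       # linescan: one spectrum (450-679 nm) per row
l = 20                                # column coincident with the spectrograph slit

# Pair each RGB pixel P_kl in the l-th column with its measured spectrum
X = rgb_frame[:, l, :].T              # 3 x m RGB intensities
R = linescan.T                        # N x m spectra
```

Each column of X then carries the RGB triplet of one pixel, and the same column of R carries that pixel's measured spectrum, giving the paired data from which the transformation matrix is derived.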


In the case of VHI, a mathematical reconstruction of the full spectral information from an RGB image taken by a conventional camera (i.e., three-color information from the Red, Green, and Blue channels) is generated, according to the present disclosure. Referring to FIG. 3, the spectral resolution of the system (Δλ=1 nm), measured as the full width at half maximum (FWHM) of a HeNe laser line, is provided. The mathematical relationship between the full spectrum and the RGB intensity is described as






x = Sr + e  (4)

where x is a vector corresponding to the reflection intensity in each R, G, and B channel, S is a matrix of the RGB spectral response functions of the three-color sensor, r is a vector of the spectral intensity reflected from the inner eyelid, and e is a vector of the system noise. In our case, the hyperspectral reconstruction from the RGB signal is an inverse problem, because the number of actual measurements (i.e., three-color information) is less than the dimensionality of the full spectrum with λ=λ1, λ2, . . . , λN. Given the relatively limited sample size, we took advantage of fixed-design linear regression with polynomial features to reliably reconstruct the full spectral information r(λ1, λ2, . . . , λN) from the RGB signals x(R, G, B) of the three-color RGB sensor, as shown in FIG. 4, wherein a comparison between the original hyperspectral dataset (acquired by the image-guided hyperspectral system) and the VHI-reconstructed hyperspectral datasets (based on an RGB image) is provided for various blood Hgb levels. The differences in the wavelength range between 450 and 575 nm are generally higher, because the distinct Hgb absorption is present in this range. To better demonstrate the construction of the transformation matrix, reference is made to FIG. 1d. First, the method 150 describes the measured RGB intensity as provided in a block 154:






x3×1 = S3×N rN×1 + e3×1  (5-1)


where x is a 3×1 vector corresponding to the reflection intensity in each R, G, and B channel (e.g., pixel P1l is identified by x3×1, a 3×1 matrix with each row associated with an RGB channel output, i.e., the first row represents the R value, the second row represents the G value, and the third row represents the B value),


S is a 3×N matrix of the RGB spectral response functions of the three-color sensor, i.e., the built-in camera (S represents the discretized spectra of the three RGB channels, as shown in FIG. 2h, in matrix form: the first row of S is the relative-intensity spectrum of the R channel output over the discretized range bounded between a lower and upper bound, the second row is that of the G channel, and the third row is that of the B channel),


r is an N×1 vector of the spectral reflection intensity (i.e., r is the spectrum, over the discretized range bounded between the lower and upper bounds, of the pixel from the hyperspectral image); in our case, r(λ=λ1, λ2, . . . , λN) is discretized from 450 nm to 679 nm with a spectral interval of 1 nm, and


e is a 3×1 vector of the system noise with zero mean. The hyperspectral reconstruction from the RGB signal amounts to obtaining [S3×N]−1. However, this inverse calculation is an underdetermined problem because N>3.
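The underdetermined nature of this inversion can be demonstrated numerically: because S has only three rows, any spectral component lying in the null space of S is invisible to the RGB sensor. A minimal Python sketch (with a synthetic S; not the disclosure's measured response functions):

```python
import numpy as np

# Why the inversion is underdetermined: S has only three rows, so any
# spectral component in the null space of S is invisible to the RGB
# sensor. Synthetic S for illustration (not a measured response).
rng = np.random.default_rng(4)
N = 230
S = rng.random((3, N))
r1 = rng.random(N)

# Add a null-space component of S to r1: the RGB reading x is unchanged
null_basis = np.linalg.svd(S)[2][3:]     # rows spanning the null space of S
r2 = r1 + 0.05 * null_basis[0]
assert np.allclose(S @ r1, S @ r2)       # same x = Sr for two different spectra
```

Two different spectra r1 and r2 produce identical RGB triplets, which is why additional structure (the regression described next) is needed to pick out the physically measured spectrum.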


To solve this underdetermined problem, we formulate a fixed-design linear regression with polynomial features of the three-color information to infer the spectral information r from the RGB signals x. We take advantage of multiple collections of the hyperspectral reflection dataset (acquired by the image-guided hyperspectral line-scanning system) and the RGB dataset (acquired by the RGB camera), respectively. X3×m and RN×m are formed by stacking x3×1 and rN×1 from m different measurements. Referring to FIG. 2g, while the aforementioned underdetermined problem was initially described with respect to pixel P1l, there are m pixels in the lth column of the frame. Therefore, the other pixels in Equation (5-1) can be used to solve numerically for [S3×N]−1 in an alternative manner, as provided in a block 156 of FIG. 1d. The relationship in Equation (5-1) is then described as:






X3×m = S3×N RN×m  (5-2)





which can be expressed as:






RN×m = TN×3 X3×m  (5-3)


where the transformation (or extrapolation) matrix is TN×3=[S3×N]−1, as provided in a block 158 in FIG. 1d. If Equation (5-3) is solved for the unknown TN×3, then TN×3 can be used to convert the RGB dataset into the hyperspectral reflection dataset, as provided in a block 160 in FIG. 1d.
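The steps of blocks 156-160 can be sketched as follows. This illustrative Python fragment (synthetic data; a plain normal-equation solve stands in for the disclosure's solver) estimates TN×3 from m co-located pixel pairs per Equation (5-3) and applies it to a new RGB pixel:

```python
import numpy as np

# Sketch of blocks 156-160: estimate the N x 3 transformation matrix T
# from m co-located pixel pairs (Eq. (5-3)) and apply it to a new RGB
# pixel. Synthetic data; a plain normal-equation solve stands in for
# the disclosure's solver.
rng = np.random.default_rng(0)
N, m = 230, 500
S = rng.random((3, N))                   # RGB spectral response functions
R = rng.random((N, m))                   # hyperspectral dataset (N x m)
X = S @ R                                # co-located RGB dataset (3 x m), Eq. (5-2)

# Least-squares solution of R = T X:  T = R X^T (X X^T)^-1
T = R @ X.T @ np.linalg.inv(X @ X.T)     # N x 3 transformation matrix

# Convert a new RGB pixel into a virtual hyperspectral spectrum
x_new = S @ rng.random(N)                # RGB reading of an unseen pixel
r_hat = T @ x_new                        # reconstructed spectrum (N values)
```

Because X carries only three channels, T is rank-limited; this is the motivation for the polynomial feature expansion of the RGB data that the disclosure introduces next.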


Next, each three-color sensor model in different cameras has a unique RGB spectral response with spectral overlaps among the R, G, and B channels (also known as the sensitivity function of the camera), as discussed above with reference to FIG. 2h. To effectively incorporate the RGB spectral response of the camera, we expand X3×m to X̂p×m to maximize the accuracy of the hyperspectral reconstruction such that:






RN×m = T̂N×p X̂p×m  (5-4)


where X̂p×m can be expressed explicitly such that:











X̂p×m = [R1  G1  B1  ⋯  R1^i  G1^i  B1^i    R1G1  G1B1  B1R1  ⋯  (R1G1)^j  (G1B1)^j  (B1R1)^j    R1G1B1  ⋯  (R1G1B1)^j
          ⋮
        Rm  Gm  Bm  ⋯  Rm^i  Gm^i  Bm^i    RmGm  GmBm  BmRm  ⋯  (RmGm)^j  (GmBm)^j  (BmRm)^j    RmGmBm  ⋯  (RmGmBm)^j]^T  (5-5)







where the exact powers i and j of the single and cross terms are uniquely determined for a specific three-color sensor model by checking the error between the reconstructed hyperspectral data and the original data.
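The feature expansion of Equation (5-5) can be sketched as follows, with the illustrative choice i = 2 and j = 2 (an assumption for this sketch; the disclosure determines these exponents per sensor model):

```python
import numpy as np

def expand_rgb(x, i=2, j=2):
    """Expand one RGB triplet into the feature vector of Eq. (5-5).
    The exponents i and j here are illustrative; the disclosure selects
    them per sensor model by checking the reconstruction error."""
    R, G, B = x
    return np.array([
        R, G, B,                          # linear terms
        R**i, G**i, B**i,                 # single-channel powers
        R*G, G*B, B*R,                    # cross terms
        (R*G)**j, (G*B)**j, (B*R)**j,     # cross-term powers
        R*G*B,                            # triple product
        (R*G*B)**j,                       # triple-product power
    ])

# Stack m expanded pixels column-wise into X_hat (p x m)
rgb_pixels = np.random.rand(3, 5)         # five RGB pixels
X_hat = np.column_stack([expand_rgb(rgb_pixels[:, k]) for k in range(5)])
```

With this choice of exponents, p = 14 features per pixel; adding further powers simply lengthens each column of X̂.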


Next, the expanded transformation matrix T̂ in Equation (5-4) can be considered the minimum-norm-residual solution to R = T̂X̂. Typically, this inverse problem reduces to a least-squares problem. We make use of QR decomposition, in particular a QR solver: after QR factorization is applied to X̂, T̂ is estimated by minimizing the sum of the squares of the elements of R − T̂X̂ and is selected such that the number of nonzero entries in T̂ is minimized. Overall, the computation of the transformation (extrapolation) matrix establishes VHI, eliminating the need for bulky dispersion hardware components (e.g., a spectrometer, spectrograph, mechanical filter wheel, or liquid crystal tunable filter).
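A minimal Python sketch of the QR-based least-squares estimate of T̂ follows (synthetic, noiseless data for illustration; the selection minimizing nonzero entries of T̂ is omitted):

```python
import numpy as np

# Sketch of the QR-based least-squares estimate of T_hat in R = T_hat X_hat.
# Synthetic, noiseless data for illustration; the selection that minimizes
# the number of nonzero entries of T_hat is omitted here.
rng = np.random.default_rng(1)
N, p, m = 230, 14, 500
X_hat = rng.random((p, m))               # expanded RGB features, Eq. (5-5)
T_true = rng.random((N, p))
R = T_true @ X_hat                       # hyperspectral dataset, Eq. (5-4)

# Minimize ||R - T_hat X_hat||_F: factor X_hat^T = Q Rfac, back-substitute
Q, Rfac = np.linalg.qr(X_hat.T)          # Q: m x p, Rfac: p x p
T_hat = np.linalg.solve(Rfac, Q.T @ R.T).T   # N x p estimate
```

In this noiseless case the QR solve recovers T exactly; with measurement noise it returns the least-squares minimizer instead.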


We validated the performance of the RGB-assisted VHI as shown in FIGS. 5a, 5b, and 5c. Line #1 (shown in FIG. 5a) was used to construct the transformation matrix that generates hyperspectral data from RGB data. The transformation matrix was then applied to the RGB data of Line #2 to generate a hyperspectral linescan of Line #2 (see FIGS. 5a, 5b, and 5c). This extrapolated data was compared with a hyperspectral linescan of Line #2 measured by translating the system. The reconstructed data are in excellent agreement with the original data. When the hyperspectral data were averaged over the line-scanning direction into a spectrum, the reconstructed spectrum was also in excellent agreement with the original one.


To enable comparison with clinical data, reference is made to FIG. 6, which summarizes the blood Hgb values of a total of 153 individuals used for spectroscopic and VHI blood Hgb measurements. The study covers a wide range of Hgb values from 3.3 to 19.2 g dL−1. We conducted a clinical study within facilities overseen by the accepted authorities and enrolled patients who were referred for complete blood count (CBC) tests. For all individuals enrolled in the study, we collected hyperspectral data and RGB images from the palpebral conjunctiva (i.e., inner eyelid) using an image-guided hyperspectral line-scanning system. As the 'gold standard' clinical laboratory measurement, blood Hgb levels were measured in an Accredited Clinical Laboratory using a commercial hematology analyzer (BECKMAN COULTER AcT 5diff auto, BECKMAN COULTER, INC.). For developing the blood Hgb quantification algorithm, we randomly selected 138 individuals (78 females and 60 males) as a preliminary (training) dataset; the average Hgb level is 12.65 g dL−1 with a standard deviation (SD) of 3.11 g dL−1, and the average age is 37.78 years with an SD of 16.38 years. As a new masked (testing) dataset, we used the remaining 15 individuals (12 females and 3 males) not included in the preliminary dataset; the average Hgb level is 11.06 g dL−1 with an SD of 3.62 g dL−1, and the average age is 39.13 years with an SD of 17.30 years.


We now describe the partial least squares regression (PLSR) used to estimate a blood Hgb level from the hyperspectral information averaged over the entire inner eyelid. We built a model for computing blood Hgb content from the hyperspectral reflection data averaged over the inner eyelid. Analytical model-based Hgb prediction methods are often used because Hgb has distinct spectral signatures (e.g., the Soret and Q bands) in the visible range. However, such model-based approaches often require a priori information on all possible light absorbers in tissue for reliable Hgb quantification. Thus, we made use of PLSR, which can model relationships among measured variables (i.e., predictors) and response variables (i.e., outcomes). Because PLSR transforms high-dimensional measured variables onto a reduced space of latent variables, it is well suited to examining the significance of individual measured variables and eliminating insignificant ones. While PLSR is based on the extraction of principal components, it incorporates variations of both predictor and outcome variables simultaneously, enhancing the prediction performance. As in principal component analysis, it is critical to determine an optimal number of components in PLSR: a greater number of components better captures variations in the predictor and outcome variables, lowering the prediction error, but risks overfitting. The optimal number of principal components was therefore determined by ten-fold cross-validation of the PLSR. In particular, as the number of partial least squares (PLS) components increases, the percentage variance explained in the true Hgb values (outcome variable) increases, while the mean squared prediction error reaches a minimum at 18 components. This number of PLS components appropriately represents variations in the spectroscopic and laboratory blood Hgb values simultaneously, thus lowering the prediction error.
As a result, 18 PLS components are selected and used in the Hgb prediction model. We select an optimal number of components using cross-validation in a conservative manner as follows: the original dataset was randomly grouped into sub-datasets of the same sample size; one sub-dataset was retained as a validation dataset for testing the model and was not used for training; after this process was repeated, the different validations were averaged. The main advantage of such cross-validation is that all of the data were incorporated to determine the optimal number of principal components, given the limited number of individuals. Although the use of PLSR often avoids overfitting when the number of predictors is larger than the sample size, it is also important to evaluate the ability to predict Hgb levels from a completely new dataset after the model is properly established. Thus, we defined the two datasets for training and testing the blood Hgb model without reusing data from the same individuals.
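The component-selection procedure can be sketched with a minimal PLS1 (NIPALS) implementation and k-fold cross-validation. This Python fragment is an illustrative stand-in for the disclosure's MATLAB-based PLSR, run on synthetic spectra and Hgb values:

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """Minimal PLS1 (NIPALS) regression: returns coefficients b and
    intercept b0 so that y is approximated by X @ b + b0. An illustrative
    stand-in for the disclosure's MATLAB-based PLSR."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)            # weight vector
        t = Xc @ w                        # score vector
        tt = t @ t
        p = Xc.T @ t / tt                 # X loading
        q = (yc @ t) / tt                 # y loading
        Xc = Xc - np.outer(t, p)          # deflate predictors
        yc = yc - q * t                   # deflate outcome
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    b = W @ np.linalg.solve(P.T @ W, Q)
    return b, y_mean - x_mean @ b

def cv_mse(X, y, n_comp, k=10, seed=0):
    """Mean squared prediction error of PLS1 under k-fold cross-validation."""
    idx = np.random.default_rng(seed).permutation(len(y))
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        b, b0 = pls1_fit(X[train], y[train], n_comp)
        errs.append(np.mean((X[fold] @ b + b0 - y[fold]) ** 2))
    return float(np.mean(errs))

# Synthetic stand-in data: n individuals, N wavelengths (450-679 nm)
rng = np.random.default_rng(3)
n, N = 138, 230
spectra = rng.random((n, N))
hgb = spectra[:, :50].sum(axis=1) / 5 + rng.normal(0, 0.3, n)

# Scan candidate component counts; the disclosure settles on 18 for its data
best = min(range(1, 13), key=lambda a: cv_mse(spectra, hgb, a))
```

Because every fold serves once as the held-out validation set, all of the data contribute to the choice of component count, mirroring the conservative cross-validation described above.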


Based on the aforementioned information, the hyperspectral-imaging data processing and statistical analysis are now described. For data processing and algorithm development, we computed the hyperspectral and RGB data and developed the blood Hgb prediction model and the VHI algorithm using MATLAB (MATLAB R2018b, The MathWorks, Inc.). For statistical analyses, we evaluated multiple linear regression, linear correlations, and intra-class correlations using STATA (STATA 14.2, STATACORP LLC). We conducted Bland-Altman analyses as non-parametric methods to compare the blood Hgb measurements. The bias is defined as the mean of the differences between the hyperspectral (or VHI) and central laboratory blood Hgb measurements (dk = yVHI − ycentral):









Bias = d̄ = (1/n) Σ_{k=1}^{n} d_k.  (6)







The 95% limits of agreement (LOA) are defined by a 95% prediction interval of the standard deviation:










LOA = d̄ ± 1.96 √[ (1/(n−1)) Σ_{k=1}^{n} (d_k − d̄)² ].  (7)
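The bias and LOA statistics of Equations (6) and (7) can be computed directly; the following Python sketch uses hypothetical paired Hgb values for illustration:

```python
import numpy as np

# Sketch of Equations (6) and (7) with hypothetical paired Hgb values
# (g/dL); d is the per-subject difference between VHI and laboratory.
y_vhi = np.array([12.1, 9.8, 14.3, 11.0, 13.2])
y_central = np.array([12.0, 10.1, 14.0, 11.4, 13.1])

d = y_vhi - y_central
bias = d.mean()                                    # Eq. (6)
sd = np.sqrt(np.sum((d - bias) ** 2) / (len(d) - 1))
loa = (bias - 1.96 * sd, bias + 1.96 * sd)         # Eq. (7), 95% LOA
```

A measurement pair is flagged as an outlier when its difference d_k falls outside the loa interval, which is how the "outside LOA" counts reported below are obtained.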







TABLE 1
Patient characteristics

Disorder                                   Number of patients
Cancer                                     45
HIV                                        46
Tuberculosis                               8
Sickle cell disease                        19
Acute kidney failure                       1
Heart failure                              4
Malaria                                    1
Anemia                                     3
Immune thrombocytopenic purpura            1
No major disease                           25

Dataset                                    Average Hgb (g dL−1)
Preliminary (training) dataset (n = 138)   12.65 (SD = 3.11)
Masked (testing) dataset (n = 15)          11.06 (SD = 3.62)

Several patients have multiple disorders. Types of cancer include Kaposi sarcoma, breast cancer, skin cancer, and Hodgkin's lymphoma. SD means standard deviation.










Using the system shown in FIG. 2a, which includes a hyperspectral imaging system with an integrated and cooperative RGB camera, we conducted imaging sessions, the results of which are provided in FIG. 6 and referred to above. In particular, the guiding camera allows us to pinpoint the exact location of one or more hyperspectral line scans in the inner eyelid (see FIG. 2b). The subject sits, places his or her chin on the chinrest, and pulls down the eyelid for imaging when instructed. The white-light LED illuminates the inner eyelid with minimal light exposure to the eye. The guiding image panel shows a guide line corresponding to the location of the spectrograph slit, which is positioned vertically to acquire a hyperspectral line-scan dataset over the entire inner eyelid from top to bottom (see FIGS. 2c-2f). The image-guided hyperspectral line-scanning system acquires a snapshot of spatial-spectral information in only three seconds. To factor out the ambient room light, two measurements are conducted, with the white-light LED on and off. To compensate for the system spectral response, a reflectance standard is used as a reference measurement.


A spectrum reflected from the inner eyelid, directly acquired by the image-guided hyperspectral line-scanning system, allows us to build a blood Hgb extraction model for predicting actual blood Hgb content. First, we constructed a prediction model of blood Hgb levels from the preliminary (training) dataset of 138 individuals using PLSR. In our case, a reflection spectrum r(λ) has multicollinearity due to the large number of wavelengths (λ=λ1, λ2, . . . , λN), and only a handful of underlying latent variables are responsible for capturing most of the variation in the predictor variables. Using ten-fold cross-validation, we determined 18 principal components to be the optimal number of PLSR components for the blood Hgb prediction model. The results are shown in FIG. 7: a linear correlation between the computed blood Hgb content and the laboratory blood Hgb levels (i.e., the gold standard) shows an excellent R2 value of 0.95 for the preliminary dataset. When Bland-Altman analysis is used to compare the two blood Hgb measurements, the 95% limits of agreement (LOA) exclude three out of 138 values (2.17% outside the LOA) with a bias of 0 g dL−1. Second, we applied the same Hgb prediction model to the separate testing dataset of 15 individuals. FIG. 7 shows that the LOA includes all 15 computed blood Hgb values with a bias of 0.01 g dL−1. In addition, an excellent R2 value of 0.95 for the testing dataset supports the prediction model. Both the preliminary and testing results clearly support the underlying idea that the full hyperspectral information of the inner eyelid can be used to accurately and precisely extract the actual Hgb content of the blood, noninvasively.


In further description of FIG. 7, to strengthen the validation of the blood Hgb prediction, two separate preliminary and masked testing datasets are used without reuse of individuals; a subset of 138 individuals is randomly selected as the preliminary dataset, and the remaining 15 individuals are used to test the blood Hgb model. The Bland-Altman analyses compare the computed blood Hgb measurements with the laboratory blood Hgb test results, showing the 95% limits of agreement (LOA) and bias for each system. With reference to FIG. 7, spectroscopic blood Hgb measurements using the image-guided hyperspectral line-scanning system show excellent performance, with a narrow LOA of [−1.31, 1.31 g dL−1] and a bias of 0 g dL−1 for the preliminary dataset. Only three out of 138 data points fall outside the LOA for the preliminary dataset and none of the 15 for the testing dataset, indicating a consistent yet low error in the blood Hgb measurements.


Those having ordinary skill in the art will recognize that numerous modifications can be made to the specific implementations described above. The implementations should not be limited to the particular limitations described. Other implementations may be possible.

Claims
  • 1. A system for generating hyperspectral imaging data for measuring biochemical compositions, comprising: a spectral imaging device adapted to acquire one or more hyperspectral linescan images from one or more regions of interest of a subject, thereby generating one or more hyperspectral linescan datasets;an optical imaging device with a red-green-blue (RGB) sensor adapted to acquire an RGB image from the region of interest of the subject, thereby generating an RGB dataset;a processor adapted to: co-locate a plurality of pixels in the RGB dataset vs. a corresponding plurality of pixels of the one or more hyperspectral linescan datasets,establish a transformation matrix utilizing the plurality of co-located pixels, the transformation matrix adapted to convert the RGB dataset into a hyperspectral dataset of the region of interest,apply the transformation matrix to the RGB dataset to thereby generate the hyperspectral dataset for the region of interest, andanalyze the generated hyperspectral image dataset to determine the biochemical compositions.
  • 2. The system of claim 1, wherein each of the plurality of co-located pixels from the RGB dataset is associated with a 3×1 RGB value matrix.
  • 3. The system of claim 2, wherein each of the co-located plurality of pixels from the hyperspectral linescan dataset is associated with an N×1 spectrum matrix, where N represents discretized spectra between a lower bound and an upper bound.
  • 4. The system of claim 3, wherein the lower and upper bounds are determined by the spectral range of RGB sensors.
  • 5. The system of claim 1, wherein the transformation matrix is an inverse of the RGB response function matrix of the RGB sensor.
  • 6. The system of claim 5, wherein the inverse of the transformation matrix is determined numerically by using RGB and spectral data from a subset of the collocated plurality of pixels.
  • 7. The system of claim 1, wherein the region of interest includes inner eyelid.
  • 8. The system of claim 1, wherein the biochemical compositions includes blood hemoglobin.
  • 9. The system of claim 1, wherein the biochemical compositions are determined using spectral analysis.
  • 10. The system of claim 9, wherein the spectral analysis includes a partial least square regression statistical modeling technique to first build a model from a training set of a first hyperspectral dataset vs. the biochemical compositions and then apply the model to a second dataset from the generated hyperspectral image dataset.
  • 11. A method for generating hyperspectral imaging data for measuring biochemical compositions, comprising: obtaining one or more hyperspectral linescan images using a spectral imaging device from one or more region of interest of a subject, thereby generating one or more hyperspectral linescan datasets;obtaining an RGB image from the region of interest using an optical imaging device with a red-green-blue (RGB) sensor, thereby generating an RGB dataset;co-locating a plurality of pixels in the RGB dataset vs. a corresponding plurality of pixels of the one or more hyperspectral linescan datasets;establishing a transformation matrix utilizing the plurality of co-located pixels, the transformation matrix adapted to convert the RGB dataset into a hyperspectral dataset of the region of interest;applying the transformation matrix to the RGB dataset to thereby generate the hyperspectral dataset for the region of interest; andanalyzing the generated hyperspectral image dataset to determine the biochemical compositions.
  • 12. The method of claim 11, wherein each of the plurality of co-located pixels from the RGB dataset is associated with a 3×1 RGB value matrix.
  • 13. The method of claim 12, wherein each of the co-located plurality of pixels from the hyperspectral linescan dataset is associated with an N×1 spectrum matrix, where N represents discretized spectra between a lower bound and an upper bound.
  • 14. The method of claim 13, wherein the lower and upper bounds are determined by the spectral range of sensors.
  • 15. The method of claim 11, wherein the transformation matrix is an inverse of the RGB response matrix of the RGB sensor.
  • 16. The method of claim 15, wherein the inverse of the transformation matrix is determined numerically by using RGB and spectral data from a subset of the co-located plurality of pixels.
  • 17. The method of claim 11, wherein the region of interest includes inner eyelid.
  • 18. The method of claim 11, wherein the biochemical compositions includes blood hemoglobin.
  • 19. The method of claim 11, wherein the biochemical compositions are determined using spectral analysis.
  • 20. The method of claim 19, wherein the spectral modeling includes a partial least square regression statistical modeling technique to first build a model from a training set of a first hyperspectral dataset vs. the biochemical compositions and then apply the model to a second dataset from the generated hyperspectral image dataset.
  • 21. A method for generating hyperspectral imaging data for measuring biochemical compositions, comprising: obtaining one or more hyperspectral linescan images using a spectral imaging device from one or more region of interest of a subject, thereby generating one or more hyperspectral linescan datasets;obtaining an RGB image from the region of interest using an optical imaging device with a red-green-blue (RGB) sensor, thereby generating an RGB dataset;co-locating a plurality of pixels in the RGB dataset vs. a corresponding plurality of pixels of the one or more hyperspectral linescan datasets;converting the RGB dataset into a hyperspectral dataset of the region of interest; andanalyzing the generated hyperspectral image dataset to determine the biochemical compositions.
CROSS-REFERENCE TO RELATED APPLICATIONS

The present patent application is related to and claims the priority benefit of U.S. Provisional Patent Application Ser. No. 62/945,808 filed Dec. 9, 2019, titled “VIRTUAL HYPERSPECTRAL IMAGING OF BIOLOGICAL TISSUE FOR BLOOD HEMOGLOBIN ANALYSIS”; and U.S. Provisional Patent Application Ser. No. 62/945,816, filed Dec. 9, 2019, titled “HYPERSPECTRAL IMAGE CONSTRUCTION OF BIOLOGICAL TISSUE FOR BLOOD HEMOGLOBIN ANALYSIS USING A SMARTPHONE” the contents of each of which are hereby incorporated by reference in its entirety into the present disclosure.

STATEMENT REGARDING GOVERNMENT FUNDING

This invention was made with government support under R21TW010620 awarded by the National Institutes of Health and 7200AA18CA00019 awarded by the US Agency for International Development. The government has certain rights in the invention.

PCT Information
Filing Document Filing Date Country Kind
PCT/US2020/062016 11/24/2020 WO
Provisional Applications (2)
Number Date Country
62945808 Dec 2019 US
62945816 Dec 2019 US