Method and apparatus for non-invasive hemoglobin level prediction

Information

  • Patent Grant
  • Patent Number
    12,089,930
  • Date Filed
    Tuesday, March 5, 2019
  • Date Issued
    Tuesday, September 17, 2024
  • Inventors
    • Hasan; Md Kamrul (Milwaukee, WI, US)
    • Ahamed; Sheikh Iqbal (Fox Point, WI, US)
    • Love; Richard R. (Madison, WI, US)
  • Examiners
    • Winakur; Eric F
  • Agents
    • Quarles & Brady LLP
Abstract
An image-based hemoglobin estimation tool for measuring hemoglobin can be embedded in hand-held devices such as smartphones and similar known and to-be-developed technology. The hand-held device acquires video data of a finger illuminated from the dorsal surface by a first near infrared light responsive to hemoglobin and a second near infrared light responsive to plasma. The acquired video is segmented into frames and processed to produce a Photoplethysmography (PPG) waveform. The features of the PPG waveform can then be identified, and the waveform and corresponding features evaluated by a predictive hemoglobin model. The predictive hemoglobin model can be provided at a remote computer, enabling non-invasive hemoglobin analysis from point-of-care locations. Near infrared lights of 850 nm and 1070 nm are particularly effective in the process.
Description
BACKGROUND

Hematologic diseases are a major global public health problem. The principal constituent of blood is hemoglobin in red blood cells. Broadly, hematologic diseases are of two major types: the anemias and hematologic disorders, primarily hemoglobinopathies. Hemoglobin functions to carry oxygen to body tissues, and this activity is compromised with disease. Iron-deficiency anemia occurs in 2 billion people worldwide, and during the past decade, the number of people affected has increased. The World Health Organization (WHO) has estimated that anemia affects about 25% of the global population, and an average of 5.6% of the US population. Anemia is particularly problematic in children, because it enhances the risk of cognitive development impairment, and in pregnant women, who suffer higher maternal mortality rates. Many of those suffering from anemia are in developing countries where medical resources are limited.


The most common hematological disorder is sickle cell hemoglobinopathy, called sickle cell disease (SCD). SCD patients are anemic and have abnormal, sickle-shaped red blood cells, the percentages of which increase under stress (such as with infections), causing small vessel obstruction. The most common clinical problems with SCD patients are crises with acute and severe musculoskeletal pain. In the United States, according to the Centers for Disease Control and Prevention (CDC), about 100,000 Americans have SCD and the number of cases is increasing. Approximately one in 365 African Americans and one in 16,300 Hispanic Americans have SCD.


Currently, the most common measure to assess for hematologic disease is a laboratory plasma hemoglobin (Hgb) test, which determines the concentration of hemoglobin in the blood. These laboratory tests are done on venous or capillary blood specimens obtained invasively, most commonly by drawing blood from a vein, which involves insertion of a needle. Patients therefore can feel discomfort, pain, numbness, or a shocking sensation. Itching or burning at the collection site is also common. These procedures can be particularly traumatic for children and mentally disabled persons. Additionally, these tests require travel to a medical facility and can be expensive. While there are some point-of-care systems for hemoglobin assessment, these are also expensive. In sum, the current technology is inconvenient, costly, slow, uncomfortable, and, for many, not readily accessible.


Some non-invasive point-of-care tools for assessment of hemoglobin levels are available. However, these tools are expensive, have poor performance measures, and require specific training for proper operation and appropriate use. As a result, only large research centers and hospitals can purchase, operate, and maintain these systems.


Recently, smartphone-based hemoglobin measurement technologies have been developed for hemoglobin level assessment. Some of these technologies rely on analysis of the lower eyelid conjunctiva, which has been shown to be useful because the conjunctival mucosa is thin and the underlying micro-vessels are easily seen. One such smartphone-based system compares conjunctival pallor with an eye color chart. Estimation of precise hemoglobin levels with these systems is presently poor.


In these circumstances, a non-invasive, easy-to-use, inexpensive measure of hemoglobin levels is desirable to improve access to diagnostics and to provide safe management of patients with hematologic disease.


SUMMARY

In one aspect, the present disclosure provides a method for non-invasively measuring blood hemoglobin levels. The method comprises acquiring a first time-based series of images of the finger ventral pad-tip illuminated from the dorsal side of the finger with a near infrared light responsive to blood hemoglobin, and white light, and acquiring a second time-based series of images of the finger ventral pad-tip illuminated from the dorsal side of the finger with a near infrared light responsive to blood plasma, and white light. Each image in each of the first and second time-based series is divided into groups of blocks. A time series signal is generated from each block, and at least one Photoplethysmography (PPG) cycle, including a systolic peak and a diastolic peak, is identified from each of the time series signals. The PPG cycles are processed to determine blood hemoglobin levels.


The step of acquiring a time-based series of images can include acquiring a first and a second video. The video can be separated into frames, each frame comprising an image.


The near infrared light responsive to blood hemoglobin can have a wavelength of between 800 and 950 nm, and the near infrared light responsive to plasma can have a wavelength of 1070 nm. The near infrared light responsive to blood hemoglobin can have a wavelength of 850 nm.


The method can include calculating a ratio of the PPG signal of the first time-based series of images of a blood flow illuminated with a near infrared light responsive to blood hemoglobin, to the second time-based series of the images of a blood flow illuminated with a near infrared light responsive to blood plasma.


The method can also comprise identifying at least one feature in each of the PPG cycles, and the feature can be used to determine the hemoglobin level. The feature can comprise at least one of a relative augmentation of a PPG, an area under the systolic peak, an area under a diastolic peak, a slope of the systolic peak, a slope of the diastolic peak, a relative timestamp value of the peak, a normalized PPG rise time, a pulse transit time (PTT), a pulse shape, or an amplitude.


The step of processing the PPG can comprise analyzing the PPG signals using a prediction model constructed using a support vector machine regression.


The step of generating a time series signal for each of the first and second time-based series of images comprises acquiring red green blue (RGB) digital images of a blood flow. Here, the step of subdividing each image into a plurality of blocks further comprises subdividing each image into a plurality of blocks each comprising a defined number of pixels, calculating a mean intensity value for the red pixels in each block, generating the time series signal identifying each image in the series versus an average value of a block, and subsequently identifying at least one PPG signal in each time series.


In another aspect, a system for non-invasive analysis of a hemoglobin level is disclosed. The system comprises a camera, a first lighting device comprising a near infrared light of a wavelength responsive to blood hemoglobin and adapted to illuminate a finger of a subject, a second lighting device comprising a near infrared light of a wavelength responsive to blood plasma and adapted to illuminate a finger of a subject, and at least one processor. The processor is programmed to receive a first time series of images of a finger of a subject while illuminated by the first lighting device, the first time series of images acquired under conditions selected to capture at least one complete detailed Photoplethysmography (PPG) cycle representative of blood hemoglobin, and to receive a second time series of images of the finger while illuminated by the second lighting device, the second time series of images acquired under conditions to capture at least one complete detailed PPG cycle representative of plasma. The processor is further programmed to identify at least one feature in the PPG cycle representative of blood hemoglobin, identify at least one feature in the PPG cycle representative of blood plasma, and provide the identified feature representative of blood hemoglobin and the feature representative of blood plasma to a predictive model adapted to identify a hemoglobin level as a function of the features.


The processor can be further programmed to calculate a ratio of the at least one feature in the PPG cycle representative of blood hemoglobin to the at least one feature in the PPG cycle representative of blood plasma, and provide the ratio to a predictive model adapted to identify a hemoglobin level as a function of the ratio.


The camera can be a red green blue (RGB) digital camera, and, for each of the first and second time series of images, the processor can further be programmed to subdivide each image into a plurality of blocks comprising a defined number of pixels, calculate a mean intensity value for the red pixels in each block, generate a time series signal identifying each image in the series versus an average value of a block for each of the first and second time series, and subsequently identify the at least one PPG signal in each of the first and second time series.


The predictive model can be stored in a remote computer having a second processor, and the operator transmits the videos to the remote computer. The predictive model can comprise a plurality of predictive models, each corresponding to a near infrared light selected to have a wavelength responsive to blood hemoglobin.


The lighting device can comprise a plurality of light emitting diodes mounted in an enclosure, wherein the enclosure includes a slot sized and dimensioned to receive a finger for illumination. The light emitting diodes can include at least one white light LED. The enclosure can comprise a material selected to minimize interference from ambient light. The lighting device can comprise one or more coupling devices for coupling the lighting device to a camera.


These and other aspects of the invention will become apparent from the following description. In the description, reference is made to the accompanying drawings which form a part hereof, and in which there is shown a preferred embodiment of the invention. Such embodiment does not necessarily represent the full scope of the invention and reference is made therefore, to the claims herein for interpreting the scope of the invention.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates a finger where its approximately 15 mm thickness is penetrated by near infrared (NIR) light that is minimally absorbed by the intervening tissues, from the dorsal to the ventral surfaces. Through ballistic, snake, and diffuse photon scattering paths, NIR light applied to the dorsal side of the finger, despite the intervening fingernail and osseous tissues, can be detected on the ventral surface.



FIG. 2 illustrates the optical densities of the responses of oxygenated hemoglobin, deoxygenated hemoglobin, and plasma illuminated by light of various wavelengths;



FIG. 3A illustrates the process of capturing a fingertip video using an 850 nm NIR LED light board, and a plot of light intensity versus time (frame) where the graph defines a photoplethysmogram (PPG) signal caused by the modulation of light intensity by the change in arterial blood volume with each heartbeat;



FIG. 3B illustrates the process of capturing a fingertip video using a 1070 nm NIR LED light board, and a plot of light intensity versus time (frame) where the graph defines a PPG signal caused by the modulation of light intensity by the change in arterial blood volume with each heartbeat;



FIG. 4 illustrates a PPG signal generated from a fingertip video.



FIG. 5 illustrates the ratio of two PPG signals captured under two different wavelengths of NIR.



FIG. 6 illustrates the subdivision of an image frame to generate multiple time series signals;



FIG. 7 illustrates PPG signal generation from a time series signal;



FIG. 8 illustrates multiple features of a PPG signal;



FIG. 9 illustrates feature generation from a PPG signal generated in all blocks;



FIG. 10 illustrates the averaged features captured from 100 blocks.



FIG. 11 illustrates the Support Vector Machine (SVM) algorithm for linear and non-linear regression.



FIG. 12A illustrates the regression line developed based on gold-standard laboratory-measured hemoglobin levels and the hemoglobin values estimated using the model;



FIG. 12B illustrates the Bland-Altman plot for hemoglobin levels estimated using the model;



FIG. 13 is a block diagram of a system for performing a non-invasive hemoglobin level test in accordance with at least some embodiments of the current disclosure.



FIG. 14 is a schematic of a light source device constructed in accordance with the disclosure.





DETAILED DESCRIPTION

The present disclosure relates to the measurement of blood hemoglobin concentration using two sets of optical absorption video signals captured under near infrared light exposure during pulsatile blood volume changes. The blood volume changes are captured in the generated photoplethysmogram (PPG) signals. As described below, the measurement can be performed using a hand-held computing device such as a cell phone or smartphone. Images of dorsal fingertip tissue exposed to near infrared light wavelengths, selected based on responsiveness to plasma and hemoglobin, are acquired with simultaneous dorsal fingertip exposure to white light. The images can, for example, be obtained as a 10-second video of the ventral finger pad. The images allow creation of two sets of photoplethysmogram (PPG) signal features that can be analyzed together to determine blood hemoglobin levels.


Photoplethysmogram


PPG is an optical technique for observing blood volume changes noninvasively in the microvascular bed of tissue. Referring now to FIG. 1, a PPG system includes a light source and a photodetector where the light source illuminates the tissue area (e.g., a finger), and the photodetector captures the variation of light intensity. In IR or near-IR wavelengths, the changes in blood flow in tissues such as finger and muscle due to arteries and arterioles can be detected using PPG sensors. The PPG signal can be captured by detecting light intensity which is reflected or transmitted from the tissue. The intensity variations are observed due to vascular blood pressure changes. The PPG signal represents the differences in light intensities with the pulse.


A PPG waveform has two main components: a direct current (DC) component and an alternating current (AC) component. The direct current (DC) component is generated by the transmitted or reflected signal from the tissue and the average blood volume of both arterial and venous blood (see FIG. 4). The AC component fluctuates with the blood volume in the systolic and diastolic phases. When a finger is illuminated under two different wavelengths of NIR lights, and a ratio between the AC and DC components is determined for each, the effects from tissue and venous blood can be removed, providing a measure of the hemoglobin level.


To measure the hemoglobin level with respect to the blood plasma level, two responses are needed: one from the blood hemoglobin and another from the blood plasma. In living tissue, water absorbs photons strongly above the 1000 nm wavelength of light; melanin absorbs in the 400 nm-650 nm spectrum. The hemoglobin response occurs across a spectrum from 650 to 950 nm. The spectrum range from 650 nm to 1100 nm is known as the tissue optical window or NIR region. To get a response from hemoglobin, an 850 nm wavelength NIR LED light, which is hemoglobin responsive, can be used. Similarly, to get a response from blood plasma, a 1070 nm wavelength NIR LED that is blood plasma responsive can be used. By analyzing the ratio of these two responses as presented as PPG signals, the tissue absorbance effects are removed and a more detailed characteristic of a PPG signal can be obtained for hemoglobin and plasma.


Referring now to FIG. 2, in the finger, the blood, tissue, and bone absorb much of the non-IR (or visible range) light. A video camera can be used to capture the transmitted light, which changes based on the pulsation of arterial blood. The pulsation response can be extracted in time series data calculated from the fingertip video and converted into a PPG signal, which can be analyzed to build a hemoglobin prediction model. A small lighting surface can penetrate only a small part of the living tissue whereas a large planar lighting surface enables penetration of light to a deeper level (such as around bone tissue).


Acquire Image Data for a PPG Signal



FIGS. 3A and 3B illustrate the approach for acquiring data. A finger, such as the index finger, is illuminated by two near-infrared (NIR) light sources with unequal wavelengths λH and λP. The wavelength λH is substantially sensitive to hemoglobin and insensitive to any other blood component. The light of wavelength λP provides a significant response to blood plasma, where other blood constituents have no response or negligible response under this NIR (λP) light. Here, 850 nm as λH and 1070 nm as λP NIR LED lights are used. To increase the amount of surface area that is illuminated, a number of LEDs of the same wavelength can be used. In our system, six 850 nm NIR and two white LED lights were used for the hemoglobin response (light source L850, having a wavelength λH), and six 1070 nm NIR and two white LED lights were used for the plasma response (light source L1070, having wavelength λP). The NIR and white lights are always turned on while collecting the data. The white light enables acquiring a photo of the finger that can be visualized.


Referring still to FIGS. 3A and 3B, the light beams of both the L850 and L1070 light sources are applied to cross from the dorsal side of the finger to the pulp area, resulting in scattering and absorption in the tissue and bone. The light beams exit the ventral pad side of the finger by transmission and transflection and are captured by a video camera. By placing the two different light sources L850 and L1070 under the dorsal side of a finger at different times, the responses of hemoglobin and plasma can be captured in the fingertip videos, and these videos can then be converted to PPG signals. Here, one PPG signal is extracted from a video captured using light source L850 and another PPG signal is generated using the fingertip video recorded under L1070. Both PPG signals are presented in FIGS. 3A and 3B. A plot of the PPG intensity received for one light source over time, or across the frame number, is illustrated in FIG. 4. The relative magnitude of the AC signal is due to the increasing amount of blood (in the systolic phase) and the decreasing amount of blood (in the diastolic phase). In addition to the AC component, there is a DC component that is steady in magnitude, since this light intensity is captured in the tissue and non-pulsating venous blood. The value of each PPG signal captured for both light sources L850 and L1070 is normalized by dividing the AC component by the DC component. Here, the value AC850/DC850 is defined as R850 and AC1070/DC1070 as R1070. The normalized value of a PPG signal cancels out the effect of tissue, so that R850 represents the hemoglobin response and R1070 the plasma response. By calculating the ratio of R850 and R1070, a relationship is generated which provides information on the light absorbed by both hemoglobin and plasma. The ratio of R850 and R1070 for each subject's PPG signal in a mathematical model is then highly correlated with laboratory-measured ("gold standard") hemoglobin values, as shown in FIG. 5. In addition to the ratio of the AC and DC components of a PPG, other features from the PPG signal, such as relative augmentation of a PPG, the areas under the systolic and diastolic peaks, a slope of each peak, and a relative timestamp value of the peak, can be calculated or otherwise determined, as discussed below.
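
By way of illustration only (the patent does not provide source code), the normalization and ratio described above can be sketched in Python as follows, assuming each PPG trace is available as a one-dimensional NumPy array; the helper names are hypothetical:

```python
import numpy as np

def ac_dc_ratio(ppg: np.ndarray) -> float:
    """Normalize a PPG trace as AC/DC: DC is the steady baseline from tissue
    and non-pulsating venous blood (taken here as the mean), and AC is the
    pulsatile amplitude."""
    dc = ppg.mean()
    ac = ppg.max() - ppg.min()
    return ac / dc

def hemoglobin_plasma_ratio(ppg_850: np.ndarray, ppg_1070: np.ndarray) -> float:
    """Ratio of R850 (hemoglobin response) to R1070 (plasma response)."""
    return ac_dc_ratio(ppg_850) / ac_dc_ratio(ppg_1070)
```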


Pre-Process Data and Identify Region of Interest in Images


To identify a region of interest in the acquired video data, the following steps are taken:

    • 1. Extract all frames from the video.
    • 2. Subdivide each frame into blocks and assign an index number to each block. In one example, the frames were divided into 10×10 blocks, and the index numbers ranged from 1 to 100, where block number 1 starts at the top left of the frame and the numbering increases towards the right (see FIG. 6).
    • 3. Generate a time series signal for each block from the starting frame to the last frame of the video.
    • 4. For each time series signal, perform the following steps:
      • a. Apply a bandpass filter to remove noise from the acquired video (a filtering sketch in Python follows this list). In one example, a bandpass filter of 0.66 Hz-8.33 Hz was used, where the minimum cutoff value was selected to discard the signal fluctuations due to breathing (0.2-0.3 Hz). The other sources of noise can include finger movements, finger shaking resulting in motion artifacts, coughing, and gasping.
      • b. Sample using the Nyquist frequency as frames per second (FPS)/2. In one example, the frames per second is 60, and FPS/2 is 60/2=30.
      • c. Filter the data to remove areas of fluctuation at the beginning and end due to finger movement to start and stop the video camera.
      • d. Define this filtered and cropped signal as the PPG signal and look for three good PPG cycles where each cycle includes a systolic peak and a diastolic peak.
      • e. If three continuous PPG cycles are not found, then select at least one cycle which has a systolic peak and a diastolic peak, replicate the selected cycle three times, and combine them to make a three-cycle PPG signal as shown in FIG. 7.
      • f. Transfer this PPG signal with three cycles to extract the features.
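
A minimal Python sketch of steps 4a-4c above, assuming the per-block time series is a NumPy array sampled at the camera frame rate (the text specifies the 0.66-8.33 Hz band but not a particular filter design; a SciPy Butterworth filter is used here as one reasonable choice):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_ppg(signal: np.ndarray, fs: float = 60.0,
                 low: float = 0.66, high: float = 8.33,
                 order: int = 3) -> np.ndarray:
    """Band-pass a per-block time series: the low cutoff discards breathing
    fluctuations (0.2-0.3 Hz); the high cutoff removes motion artifacts."""
    nyq = fs / 2.0  # Nyquist frequency, e.g. 60 FPS / 2 = 30 Hz
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, signal)

def crop_edges(signal: np.ndarray, fs: float = 60.0,
               seconds: float = 1.0) -> np.ndarray:
    """Drop the fluctuations at the start and end of the recording caused by
    finger movement when starting and stopping the video camera."""
    n = int(fs * seconds)
    return signal[n:-n]
```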


Referring now to FIG. 6, in one example, six 140 mW NIR LEDs were used, along with two white LED lights. These eight LED lights were put in one LED board which was used for video recording. Three LED boards were created with three light wavelengths: 850 nm, 940 nm, and 1070 nm NIR LED lights. Videos were acquired at a rate of 60 frames per second (FPS), by a camera that had a 1080×1920-pixel resolution. Here, a 10-second video contains 600 frames, and a single block of the 10×10 block matrix contains 108×192 pixels of information.


Referring still to FIG. 6, each frame of the video has three two-dimensional pixel intensity arrays, one for each color: red, green, and blue (RGB). Since each frame has 10×10 blocks, a mean value is computed from each color pixel for each block of a frame, which gives 100 mean values (dots in FIG. 6) for one frame. In FIG. 6, 600 frames extracted from a fingertip video are illustrated as subdivided into the 10×10 block matrix. Then, a time series signal is generated, with the frame number on the X-axis and the calculated averaged value of a block on the Y-axis. FIG. 6 illustrates three different time series signals for red pixel intensity between the first and last frame, where the top signal was generated by block number 50, the middle signal by block number 97, and the third signal by block number 91. The dot in each block represents the averaged value of all red pixel intensities of the block area. Since each dot has a different intensity, the plot of these averaged values across all frames produces a time series signal. Only red pixel intensities were used because only weak intensity signals were found with green and blue pixels.
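
The block subdivision and per-block red-channel averaging described above can be sketched as follows (illustrative only; the frame dimensions follow the 1080×1920, 10×10-block example in the text):

```python
import numpy as np

def block_means_red(frames: np.ndarray, grid: int = 10) -> np.ndarray:
    """Mean red intensity of each block, for every frame.

    frames: (n_frames, height, width, 3) RGB array, e.g. (600, 1080, 1920, 3).
    Returns (n_frames, grid*grid): one time series column per block."""
    n, h, w, _ = frames.shape
    bh, bw = h // grid, w // grid             # e.g. 108 x 192 pixels per block
    red = frames[:, :grid * bh, :grid * bw, 0].astype(np.float64)
    red = red.reshape(n, grid, bh, grid, bw)  # split rows/cols into blocks
    return red.mean(axis=(2, 4)).reshape(n, grid * grid)
```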


After generating the PPG signal from the fingertip video, features were extracted from each PPG signal. Referring now to FIG. 7, three PPG cycles for each block of a video were captured. From these, the AC (systolic peak) and DC (trough) can be measured and used for hemoglobin level analysis.


Referring now to FIG. 8, to characterize the PPG signal generated under each infrared (IR) LED light more fully, features including its diastolic peak, dicrotic notch height, the ratio and augmented ratio among the systolic, diastolic, and dicrotic notch values, the systolic and diastolic rising slopes, and the inflection point area ratio were extracted. About 80% of the blocks that have a PPG include these features. The remaining blocks are assigned no feature values, as shown in FIG. 9, and are filtered out. To determine whether a specific PPG signal should be used, systolic and diastolic peaks are noted, and the height of the systolic peak is checked to verify that it is higher than the diastolic peak. If a block has no single PPG cycle that satisfies the selection criteria, the signal does not provide an adequate PPG, and the features are not determined. Finally, the PPG features calculated from a fingertip video are averaged (see FIG. 10).
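
As one possible realization of this peak-based screening (the patent does not prescribe a detection algorithm; SciPy's find_peaks is used here for illustration, and the prominence threshold is an assumption):

```python
import numpy as np
from scipy.signal import find_peaks

def systolic_diastolic_peaks(ppg: np.ndarray):
    """Locate candidate systolic/diastolic peaks in a PPG segment and apply
    the selection rule that the systolic peak must exceed the diastolic peak."""
    prominence = 0.1 * (ppg.max() - ppg.min())
    peaks, _ = find_peaks(ppg, prominence=prominence)
    if len(peaks) < 2:
        return None                      # block yields no usable PPG cycle
    systolic = peaks[np.argmax(ppg[peaks])]
    later = peaks[peaks > systolic]      # diastolic peak follows the systolic
    if len(later) == 0:
        return None
    diastolic = later[np.argmax(ppg[later])]
    if ppg[systolic] <= ppg[diastolic]:
        return None                      # fails the height check; no features
    return systolic, diastolic
```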


Constructing the Model


To develop a hemoglobin prediction model, fingertip videos and corresponding known gold standard hemoglobin levels of 167 adult individuals were used; these data were selected from an initial set of 212 individuals. Forty-five cases exhibited poor quality video images or missing laboratory values and were filtered out. Of the remaining 167 subjects, 82 were men and 85 were women. Laboratory hemoglobin levels ranged from 9.0-13.5 gm/dL across the set of subjects. Video data were acquired with the finger illuminated with three LED boards at 850 nm, 940 nm, and 1070 nm wavelengths. The data were analyzed using Support Vector Machine Regression (SVR), where the SVR uses a "Gaussian" kernel to build the prediction model using support vectors.


The Support Vector Machine (SVM) maximizes the margin (sometimes called a "wide street") to generate a line that separates two classes, as illustrated in FIG. 11. In the regression setting, the model predicts a real number and optimizes the generalization bounds given for regression. Here, the loss function is known as the ε-insensitive loss function, as shown in FIG. 11. In SVR, the input matrix is mapped onto a multi-dimensional feature space by applying a nonlinear mapping to build a linear model as shown in Equation 1, where φj(x), j=1, 2, 3, . . . , m is a set of non-linear transformations and 'b' is the 'bias' term.

f(x,ω)=Σj=1mωjφj(x)+b  (1)


The SVR uses the ε-insensitive loss function and solves:

min ½∥ω∥2+CΣi=1n(ζi++ζi−)  (2)


subject to









yi−f(xi,ω)≤ε+ζi+
f(xi,ω)−yi≤ε+ζi−
ζi+, ζi−≥0, i=1, 2, 3, . . . , n  (3)







In the data analysis, the MATLAB command "fitrsvm" was used with Xtrain, Ytrain, and a "Gaussian" kernel as parameters. The "Standardize" option was set to standardize the data using the same mean and standard deviation in each data column. The prediction model was generated as a "Gaussian SVR Model," and the test data were applied to this model using the MATLAB command "predict," providing the model and test data as parameters. The results are illustrated using the MAPE, the correlation coefficient (R), and a Bland-Altman plot.
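
An equivalent workflow outside MATLAB can be sketched with scikit-learn, whose RBF-kernel SVR plays the role of fitrsvm's "Gaussian" kernel and whose StandardScaler plays the role of the "Standardize" option (illustrative only; not the code used in the study):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

def train_hgb_model(X_train: np.ndarray, y_train: np.ndarray):
    """Standardize each feature column, then fit a Gaussian (RBF) kernel SVR.

    X_train: (n_subjects, n_features) matrix of PPG feature ratios.
    y_train: laboratory-measured hemoglobin levels (gm/dL)."""
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
    model.fit(X_train, y_train)
    return model

# Usage: hgb_estimates = train_hgb_model(X_train, y_train).predict(X_test)
```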


The Mean Absolute Percent Error (MAPE) is a commonly used metric to present the error level in the data. The MAPE is calculated as shown in equation 4.









M=(100%/n)Σt=1n(|At−Et|/|At|)  (4)







where At=actual value or gold standard measurement, Et=estimated value, and n=number of measurements or observations. MAPE has been used because MAPE does not depend on scale.
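
Equation 4 translates directly into a few lines of Python. As a worked check, actual values [10, 12] with estimates [9.8, 12.6] give MAPE = (100%/2)(0.2/10 + 0.6/12) = 3.5%:

```python
import numpy as np

def mape(actual, estimated) -> float:
    """Mean Absolute Percent Error per equation 4, returned in percent."""
    actual = np.asarray(actual, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    return 100.0 * np.mean(np.abs(actual - estimated) / np.abs(actual))

print(mape([10, 12], [9.8, 12.6]))  # ≈ 3.5
```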


The correlation coefficient R can also be used to determine how strongly two measurement methods are related. R is computed as the ratio of the covariance between the variables to the product of their standard deviations. The value of R is between −1.0 and +1.0. If the value of R is +1.0 or −1.0, there is a strong linear relationship between the two estimation methods, and the linear regression can be calculated. The R value, however, does not identify whether there is good agreement between the measurement methods. The Bland-Altman plot was used to evaluate a bias between the mean differences and to assess the agreement between the two measurement processes. The formula for Pearson's correlation is:









R=Σi=1n(xi−x̄)(yi−ȳ)/√([Σi=1n(xi−x̄)2][Σi=1n(yi−ȳ)2])  (5)









    • where n is the sample size, xi and yi are the individual sample points indexed with i, x̄=(1/n)Σi=1nxi is the sample mean of x, and ȳ=(1/n)Σi=1nyi is the sample mean of y.


The Bland-Altman plot represents the difference between the two measurement methods plotted against the mean of the two measurements. Plotting the difference against the mean helps identify the relationship between the measurement error and the clinically measured value.
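
A compact sketch of this evaluation step, computing the equation-5 correlation and the Bland-Altman quantities (the plotting library and variable names are illustrative choices, not part of the disclosure):

```python
import numpy as np
import matplotlib.pyplot as plt

def bland_altman(gold: np.ndarray, estimated: np.ndarray) -> None:
    """Plot method differences against method means, with bias and 95% limits
    of agreement; the title reports Pearson's R from equation 5."""
    gold, estimated = np.asarray(gold, float), np.asarray(estimated, float)
    mean = (gold + estimated) / 2.0
    diff = gold - estimated
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)           # 95% limits of agreement
    r = np.corrcoef(gold, estimated)[0, 1]  # Pearson correlation (equation 5)
    plt.scatter(mean, diff)
    plt.axhline(bias, linestyle="--")
    plt.axhline(bias + loa, linestyle=":")
    plt.axhline(bias - loa, linestyle=":")
    plt.title(f"Bland-Altman: bias={bias:.2f} gm/dL, R={r:.2f}")
    plt.xlabel("Mean of the two methods (gm/dL)")
    plt.ylabel("Difference between methods (gm/dL)")
    plt.show()
```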


As described above, the model was developed using data from 167 subjects, filtered from an initial set of 212 fingertip videos. NIR LED lights were applied with wavelengths of 850 nm, 940 nm, and 1070 nm. A Google Pixel 2 smartphone was used to capture video at 60 frames per second (FPS). The Google Pixel 2 has a 950 nm LED on board, and video was also acquired using this LED.


Sixteen PPG features were computed from a block of a video (600 frames), including the systolic peak, diastolic peak, dicrotic notch, augmentation among those peaks, peak arrival times, inflection point area ratio, and peak rising slopes. To normalize the data, a ratio of two PPG features generated from different wavelengths of light was used. The ratio of two PPG signals' feature values was calculated as follows:











R1070(850)=PPG850/PPG1070  (6)







The ratio of two PPG feature values here is the individual ratio between each feature value: for example, the ratio of the systolic peak value under the 850 nm NIR light to the systolic peak value under the 1070 nm NIR light. Similarly, the ratios of all other features applied to the SVR machine learning algorithm were measured, along with ratios for the other wavelengths, referred to herein as R1070(940) and R1070(Pixel2), where:











R1070(940)=PPG940/PPG1070  (7)

R1070(Pixel2)=PPGPixel2/PPG1070  (8)







Here, PPG1070 was considered as a plasma responsive PPG signal, as discussed above. The other PPG signals were chosen as hemoglobin responsive PPG signals.
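
The per-feature normalization of equations 6-8 amounts to an element-wise division of the two feature sets; a minimal sketch follows (the dict-based representation and feature names are hypothetical):

```python
def feature_ratios(features_h: dict, features_plasma: dict) -> dict:
    """Element-wise feature ratios per equations 6-8: each feature measured
    under a hemoglobin-responsive light (850 nm, 940 nm, or the Pixel 2 LED)
    is divided by the same feature under the 1070 nm plasma-responsive light."""
    return {name: features_h[name] / features_plasma[name]
            for name in features_h if name in features_plasma}

# e.g. feature_ratios({"systolic_peak": 0.82}, {"systolic_peak": 0.61})
# returns {"systolic_peak": 1.344...}
```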


As described above, SVR was applied to the features generated from each of these ratios. For the ratio R1070(850) (Equation 6), an optimal prediction model was developed and defined. A regression line based on the clinically measured hemoglobin levels and the estimated hemoglobin values is illustrated in FIG. 12A, based on the combination of features that gave this optimal result. In FIG. 12A, the Mean Absolute Percentage Error (MAPE) is 2.08%, and the linear correlation coefficient (R) between the gold standard and estimated hemoglobin values was 0.97.


Comparative Predictive Model Results


Other models were developed and evaluated using data obtained with the LED light board at 940 nm, and using a cell phone camera with only the phone's white light on the ventral finger pad. The model described above was found to be the most accurate and predictive.


Hemoglobin Estimation Procedure Using the Predictive Model


With further confirmatory data, the predictive model described above can therefore be used to provide a noninvasive point of care tool for hemoglobin assessment. In this framework, a fingertip video is recorded while the finger is illuminated by two near-infrared (NIR) light sources with unequal wavelengths, one that is sensitive to hemoglobin (λH) and another that is sensitive to plasma (λP). The videos are then processed as described above and analyzed as in the defined optimal prediction model.


Referring now to FIG. 13, a block diagram of a device or system of devices for analyzing an object of interest, such as a finger, in accordance with the present disclosure is shown. The system includes a processor 30 which is in communication with a camera 32 and a light source 34. In operation the light source is activated, either by the processor or individually by, for example, input from a user or caregiver, and is positioned to shine light on the object of interest 36, which in the system described here is the finger 36. The camera 32 takes a series of pictures of the finger 36, which are preferably video but could, in some cases, be still photographs acquired in sufficiently quick succession to enable reproduction of a PPG signal, as described above. The image data is provided to the processor 30 which can either process the image data, as described below, or optionally transmit the data to a remote computer system 38 for analysis. The processor 30, camera 32, and light 34 can be part of a single device, which can be produced specifically for the application, but can also be a smart phone, laptop, tablet, or other devices having the described equipment and capable of providing light on an object to be evaluated and to acquire images of the object. The processor, camera, and light can all also be provided as separate components. The remote computer system 38 can, for example, be a cloud computer system or other types of wired or wireless networks. As described more fully below, the system can be used to evaluate hemoglobin by processing the frames of the image data and applying a trained machine learning model. Although not shown here, the processor can be further connected to various user interfaces, including a display, keyboard, mouse, touch screen, voice recognition system, or other similar devices.


In one example, image data can be captured using a personal electronic device containing the processor 30 and camera 32, and the data transferred through a communications network to the remote computer or server 38 using secure communications for processing. For example, video images can be acquired with a smart phone and a mobile application (app), such as an Android or iOS-based application, and sent to a cloud server 38 through the internet. A software application can be stored on the hand-held device and used to capture, for example, a 10-second fingertip video with the support of the built-in camera and a near infrared LED device adapted to provide illumination on a finger. The remote computer 38 can provide user authentication, video processing, and feature extraction from each video, as described above. Other methods of communicating to a remote computer can include, for example, communications via wired or wireless networks, cellular phone communications, Bluetooth communications, or storing data on a memory device and transferring the information to the remote computer through a memory drive or port.
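
As a sketch of the secure-upload step (the endpoint URL, token scheme, and field names below are placeholders; the patent only states that secure communications and user authentication are used):

```python
import requests

def upload_fingertip_video(path: str, url: str, token: str) -> dict:
    """POST a recorded fingertip video to the remote server for PPG
    processing, feature extraction, and hemoglobin estimation."""
    with open(path, "rb") as f:
        resp = requests.post(
            url,
            files={"video": f},
            headers={"Authorization": f"Bearer {token}"},
            timeout=60,
        )
    resp.raise_for_status()
    return resp.json()
```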


A mobile application can store data useable by the camera 32 to monitor the position of the user's finger for appropriate placement, and activate an indicator, such as a light or a speaker, to provide a signal to the user when the finger is appropriately positioned. The application can also compare camera data to stored data to evaluate whether the finger is sufficiently motionless to acquire data with the camera, and whether the finger is applying normal pressure. A video recording process can be automatically started by the mobile application when the user's finger is appropriately positioned, so that the user doesn't have to activate the video recording button, and stopped after a pre-determined period of time, such as a 10-second duration. The application can communicate with and automatically transfer video to the remote computer 38, or ask the user to affirm whether they are ready to transmit the data. Based on available bandwidth, the entire video can be transferred at one time. Alternatively, portions of the video can be iteratively sent to the remote computer 38. Communications through a USB port connection, Bluetooth, or other wired or wireless system can also be used with corresponding communications devices associated with the light device 34 to activate lighting.


The light source 34 can be an LED associated with the device, and video can be acquired using the built-in camera in the equipment. In alternative embodiments, a specific NIR device, such as a printed circuit board, can be provided (see, for example, FIGS. 3A and 3B). Referring now to FIG. 14, as discussed above, the light source 34 can have, for example, a plurality of LEDs 40 emitting light at a wavelength of 850 nm, and another plurality of LEDs 42 emitting light at a wavelength of 1070 nm. Other wavelength variations within the spectrum range of 650 nm to 1100 nm are also possible. For example, wavelengths responsive to hemoglobin can be used in a range between 800 and 950 nm. Wavelengths responsive to plasma can be in the range of 950 nm-1100 nm, with a peak response at around 1000 nm. In one embodiment, to provide a sufficient amount of light, six 140 mW LEDs were used for each wavelength. One or more white lights 44 can also be provided on the board. A battery, such as a rechargeable battery, can be provided to power the LEDs. A charging point 46 for charging the battery can be included, along with a three-way or on/off switch 48. Although a single board is illustrated here, in some applications, the LEDs of specific wavelengths can be provided on two separate devices or boards, one adapted to provide NIR light responsive to plasma, and a second adapted to provide NIR light responsive to hemoglobin. In one embodiment, six 850 nm LEDs were used to provide light responsive to hemoglobin and six 1070 nm LEDs were used to provide light responsive to plasma. Two white LEDs were used to illuminate the finger during acquisition of images. This configuration was shown to be particularly successful in providing an accurate reading of hemoglobin. A similar configuration using 950 nm light also provided reasonably accurate results.


Referring still to FIG. 14, the LEDs 40, 42, and 44 are preferably mounted to a printed circuit board that can be provided in a housing 50. The charging point 46, switch 48, and battery can also be mounted in the housing 50. A light restrictive enclosure 52, which encloses the LEDs, is mounted to the housing 50, and comprises a slot 54 sized and dimensioned to receive a finger illuminated by the LEDs 40, 42, and 44. The shape of the upper layer of the enclosure 52 enables positioning a finger adjacent the board for illumination. In particular, the enclosure 52 is dimensioned to cause the dorsal area of the finger to touch the LEDs 40 and 42, and video can be captured from the opposing ventral side of the finger. Although the sides of the enclosure 52 are illustrated as open to enable viewing the LEDs, the sides of the enclosure are typically closed to prevent ambient illumination from interfering with the LEDs. The enclosure is preferably black in color, and can further be constructed of a material selected to minimize light interference from outside of the enclosure. Although a box shape is illustrated here, the shape of the enclosure is not limited to box-like enclosures, but can include, for example, a round or oblong profile sized to receive a finger, or other types of enclosures. Further, although three LEDs of each wavelength are illustrated here, the number of LEDs is not intended to be limiting. Various numbers of LEDs can be used. As described above, it has been shown experimentally that six or more LEDs of each wavelength provide improved results. Further, although LEDs of two different wavelengths are illustrated in the LED device here, LEDs 40 and 42 can be provided in separate LED devices. Where LEDs of both wavelengths are provided, the switch 48 can be a three-way switch, switching between LEDs 42, LEDs 44, and an off position. When LEDs of one wavelength are provided in the enclosure, the switch 48 can be a two-way on/off switch. In some applications, the LED device can be coupled to a camera, video camera, or a handheld device including a camera such as a smartphone, tablet, laptop, or similar device using brackets, straps, fasteners, adhesives, or other such devices.


Alternatively, the light 34 can be coupled directly to the user's finger, such as the index finger, using coupling devices including hook and loop fasteners, adhesives, tie straps, elastic bands, or similar elements. In some applications, the light 34 device may be curved or otherwise formed specifically to engage a finger. The light 34 device may also include coupling elements enabling coupling of the device to a cellular phone or other device containing the processor 30, or to a camera 32.


The system can perform the hemoglobin level prediction at a local processor, such as the processor 30, or at a remote computer 38, which can be, for example, a cloud-based device. The cloud computing system can be HIPAA (Health Insurance Portability and Accountability Act) compliant or otherwise secured to address security and privacy issues, such as protected health information (PHI), to protect the stored database from unauthorized access, and data breach.


It should be understood that the methods and apparatuses described above are only exemplary and do not limit the scope of the invention, and that various modifications could be made by those skilled in the art that would fall under the scope of the invention. For example, although specific hardware configurations are described above, it will be apparent that a number of variations are available. Images of an illuminated finger could, for example, be acquired by a camera and transferred directly to a computer through hard-wired or wireless links, or through transportable memory storage such as an SD card, USB flash drive, or other device. As described above, processing to analyze the hemoglobin content of a PPG signal acquired from a series of images or video can be performed by a local processor or computer, or at a remote location, such as a cloud device. Various off-the-shelf hand-held devices, including smartphones and cellular phones that include an on-board camera and a processor, can be used in the process described above. However, a device constructed specifically for this purpose can also be used.

Claims
  • 1. A system for non-invasive analysis of a hemoglobin level, the system comprising the following: a camera; a first lighting device comprising a near infrared light of a wavelength responsive to blood hemoglobin and adapted to illuminate a finger of a subject; a second lighting device comprising a near infrared light of a wavelength responsive to blood plasma and adapted to illuminate a finger of a subject; at least one processor, wherein the at least one processor is programmed to: receive a first time series of images of a ventral pad of the finger of a subject while illuminated by the first lighting device from a dorsal side of the finger, the first time series of images acquired by the camera under conditions selected to capture at least one complete detailed Photoplethysmography (PPG) cycle representative of blood hemoglobin; and receive a second time series of images of the ventral pad of the finger of the subject while illuminated by the second lighting device from a dorsal side of the finger, the second time series of images acquired by the camera under conditions to capture at least one complete detailed PPG cycle representative of plasma; identify at least one feature in the PPG cycle representative of blood hemoglobin; identify at least one feature in the PPG cycle representative of blood plasma; provide the identified feature representative of blood hemoglobin and the feature representative of blood plasma to a predictive model adapted to identify a hemoglobin level as a function of the features.
  • 2. The system of claim 1, wherein the processor is further programmed to: calculate a ratio of the at least one feature in the PPG cycle representative of blood hemoglobin to the at least one feature in the PPG cycle representative of blood plasma; and provide the ratio to a predictive model adapted to identify a hemoglobin level as a function of the ratio.
  • 3. The system of claim 1, wherein the predictive model is stored in a remote computer having a second processor, and the processor is further programmed to transmit data for analysis to the remote computer.
  • 4. The system of claim 3, wherein the remote computer comprises a database storing model data for hand held computerized devices including data related to an on-board camera of the hand held computerized device.
  • 5. The system of claim 1, wherein the predictive model comprises a plurality of predictive models, each corresponding to a near infrared light selected to have a wavelength responsive to blood hemoglobin.
  • 6. The system of claim 1, wherein the lighting device comprises a plurality of light emitting diodes mounted in an enclosure, wherein the enclosure includes a slot sized and dimensioned to receive a finger for illumination.
  • 7. The system of claim 6, wherein the light emitting diodes include at least one white light LED.
  • 8. The system of claim 6, wherein the enclosure comprises a material selected to minimize interference from ambient light.
  • 9. The system of claim 6, wherein the lighting device comprises one or more coupling device for coupling the lighting device to a camera.
  • 10. The system of claim 1, wherein the camera and the processor are embedded in a hand held computerized device.
  • 11. The system of claim 10, wherein the hand held computerized device comprises a cellular phone.
  • 12. The system of claim 1, wherein the near infrared light responsive to blood hemoglobin has a wavelength of between 800 and 950 nm and the near infrared light responsive to plasma has a wavelength of 1070 nm.
  • 13. The system of claim 12, wherein the near infrared light responsive to blood hemoglobin has a wavelength of 850 nm.
  • 14. The system of claim 1, wherein the camera is a video camera.
  • 15. The system of claim 1, wherein the camera is a red green blue (RGB) digital camera, and, for each of the first and second time series of images, the processor is further programmed to: subdivide each image in each of the first and second time-based series into a plurality of blocks comprising a defined number of pixels; calculate a mean intensity value for the red pixels in each block; generate a time series signal identifying each image in the series versus an average value of a block for each of the first and second time series; and subsequently identify the at least one PPG signal in each of the first and second time series.
  • 16. The system of claim 1, wherein the processor is further programmed to calculate a ratio of the PPG signal of the first time-based series of images of a blood flow illuminated with a near infrared light responsive to blood hemoglobin, to the second time-based series of the images of a blood flow illuminated with a near infrared light responsive to blood plasma.
  • 17. The system of claim 1, wherein the processor is further programmed to identify at least one feature in each of the PPG cycles to determine the hemoglobin level, wherein the feature comprises at least one of a relative augmentation of a PPG, an area under the systolic peak, an area under a diastolic peak, a slope of the systolic peak, a slope of the diastolic peak, a relative timestamp value of the peak, a normalized PPG rise time, a pulse transit time (PTT), a pulse shape, or an amplitude.
  • 18. The system of claim 1, wherein the processor is further programmed to: subdivide each image into a plurality of blocks comprising a defined number of pixels; calculate a mean intensity value for the red pixels in each block; generate a time series signal identifying each image in the series versus an average value of a block; and subsequently identify at least one PPG signal in each time series.
  • 19. The system of claim 18, wherein the processor is further programmed to filter the data in each of the frames to identify the at least one PPG signal.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application represents the U.S. national stage entry of International Application No. PCT/US2019/020675 filed on Mar. 5, 2019, which claims the benefit of U.S. Provisional patent application Ser. No. 62/638,630, filed on Mar. 5, 2018, which disclosures are incorporated herein by reference in their entirety.

PCT Information
Filing Document: PCT/US2019/020675; Filing Date: 3/5/2019; Country: WO
Publishing Document: WO2019/173283; Publishing Date: 9/12/2019; Country: WO; Kind: A
20110035231 Firminger et al. Feb 2011 A1
20110097330 Horner et al. Apr 2011 A1
20110111973 Mecklenburg et al. May 2011 A1
20110160681 Dacey, Jr. et al. Jun 2011 A1
20110163163 Rowe Jul 2011 A1
20110189680 Keown et al. Aug 2011 A1
20110190613 Zhang et al. Aug 2011 A1
20110236903 McClelland et al. Aug 2011 A1
20110251099 Visvanathan et al. Oct 2011 A1
20110257892 Selifonov et al. Oct 2011 A1
20110306518 Wohlgemuth et al. Dec 2011 A1
20120030776 Combs et al. Feb 2012 A1
20120034157 Hyde et al. Feb 2012 A1
20120059778 Soykan et al. Mar 2012 A1
20120072124 Radich et al. Mar 2012 A1
20120115248 Anslyn et al. May 2012 A1
20120119089 Sanchez del Rio Saez May 2012 A1
20120130196 Jain et al. May 2012 A1
20120130201 Jain May 2012 A1
20120130202 Jain May 2012 A1
20120150003 Zhang Jun 2012 A1
20120178100 Wagner et al. Jul 2012 A1
20120190947 Chon et al. Jul 2012 A1
20120190964 Hyde et al. Jul 2012 A1
20120197621 Jain Aug 2012 A1
20120197622 Jain Aug 2012 A1
20120264686 Guyon et al. Oct 2012 A9
20120265546 Hwang et al. Oct 2012 A1
20120265547 Hwang et al. Oct 2012 A1
20120265548 Hwang et al. Oct 2012 A1
20120265591 Hwang et al. Oct 2012 A1
20120272341 Combs et al. Oct 2012 A1
20120277999 Somogyi et al. Nov 2012 A1
20120282353 Roth et al. Nov 2012 A1
20120321759 Marinkovich et al. Dec 2012 A1
20120323127 Boyden et al. Dec 2012 A1
20130035569 Heanue et al. Feb 2013 A1
20130071837 Winters-Hilt et al. Mar 2013 A1
20130123592 Rule May 2013 A1
20130160150 Leibel et al. Jun 2013 A1
20140011879 Baribaud et al. Jan 2014 A1
20140016116 Maier et al. Jan 2014 A1
20140107080 Koga et al. Apr 2014 A1
20140163409 Arndt Jun 2014 A1
20140199273 Cesano et al. Jul 2014 A1
20140200511 Boyden et al. Jul 2014 A1
20140247155 Proud Sep 2014 A1
20140276090 Breed Sep 2014 A1
20140296693 Binder et al. Oct 2014 A1
20140308930 Tran Oct 2014 A1
20140378795 McKenna Dec 2014 A1
20150068069 Tran et al. Mar 2015 A1
20150118689 Egan et al. Apr 2015 A1
20150150460 Krishnaswamy et al. Jun 2015 A1
20150205992 Rowe Jul 2015 A1
20150269825 Tran Sep 2015 A1
20150287191 Koruga et al. Oct 2015 A1
20150313532 Marinkovich et al. Nov 2015 A1
20150327799 Vosch et al. Nov 2015 A1
20150351655 Coleman Dec 2015 A1
20150359467 Tran Dec 2015 A1
20160103123 Holmes et al. Apr 2016 A1
20160157725 Munoz Jun 2016 A1
20160175408 Chang et al. Jun 2016 A1
20160194718 Lane et al. Jul 2016 A1
20160228008 Lee Aug 2016 A1
20160269411 Malachi Sep 2016 A1
20160317744 Rule Nov 2016 A1
20160360980 Sinha Dec 2016 A1
20160371451 Rule et al. Dec 2016 A1
20170020391 Flitsch et al. Jan 2017 A1
20170020431 Flitsch et al. Jan 2017 A1
20170020440 Flitsch et al. Jan 2017 A1
20170020441 Flitsch et al. Jan 2017 A1
20170020442 Flitsch et al. Jan 2017 A1
20170024530 Flitsch et al. Jan 2017 A1
20170024555 Flitsch et al. Jan 2017 A1
20170024771 Flitsch et al. Jan 2017 A1
20170026790 Flitsch et al. Jan 2017 A1
20170035348 Bandic et al. Feb 2017 A1
20170049377 Littell Feb 2017 A1
20170071516 Bhagat Mar 2017 A1
20170086672 Tran Mar 2017 A1
20170105681 Singh et al. Apr 2017 A1
20170119235 Hyde et al. May 2017 A1
20170119236 Hyde et al. May 2017 A1
20170119278 Hyde et al. May 2017 A1
20170127959 Paulussen et al. May 2017 A1
20170150903 Barnes et al. Jun 2017 A1
20170173262 Veltz Jun 2017 A1
20170177812 Sjölund Jun 2017 A1
20170188943 Braig et al. Jul 2017 A1
20170189629 Newberry Jul 2017 A1
20170209049 Wang et al. Jul 2017 A1
20170216518 Davis et al. Aug 2017 A1
20170231560 Hyde et al. Aug 2017 A1
20170246473 Marinkovich et al. Aug 2017 A1
20170262614 Vishnubhatla et al. Sep 2017 A1
20170308813 Boyden et al. Oct 2017 A1
20170333454 Simard Nov 2017 A1
20170343634 Lencz et al. Nov 2017 A1
20170349894 Dahlman et al. Dec 2017 A1
20170349948 Lo et al. Dec 2017 A1
20170357760 Han et al. Dec 2017 A1
20170363633 Cesano et al. Dec 2017 A1
20180045654 Park et al. Feb 2018 A1
20180110462 Asvadi et al. Apr 2018 A1
20180125430 Al-Ali et al. May 2018 A1
20180161658 Felker Jun 2018 A1
20180168459 Tran Jun 2018 A1
20180168488 Jones et al. Jun 2018 A1
20180182475 Cossler et al. Jun 2018 A1
20180184972 Carmi et al. Jul 2018 A1
20180197636 Firminger et al. Jul 2018 A1
20180207198 Salome et al. Jul 2018 A1
20180214088 Newberry Aug 2018 A1
20180242844 Liu et al. Aug 2018 A1
20180253840 Tran Sep 2018 A1
20180259420 Rule et al. Sep 2018 A1
20180263555 Rule et al. Sep 2018 A1
20180310828 DiMaio et al. Nov 2018 A1
20180310862 Khoja et al. Nov 2018 A1
20180311510 Sjolund et al. Nov 2018 A1
20180318529 Davidson et al. Nov 2018 A1
20180344228 Yelin Dec 2018 A1
20180344231 Butler et al. Dec 2018 A1
20180360320 Werahera et al. Dec 2018 A1
20180372715 Kluckner et al. Dec 2018 A1
20190000349 Narayan et al. Jan 2019 A1
20190022152 Elinav et al. Jan 2019 A1
20190062813 Amin Feb 2019 A1
20190065961 Szu Feb 2019 A1
20190081497 Pugh et al. Mar 2019 A1
20190082990 Poltorak Mar 2019 A1
20190083805 Etkin Mar 2019 A1
20190085324 Regev et al. Mar 2019 A1
20190114469 Sartor et al. Apr 2019 A1
20190117146 Barbour et al. Apr 2019 A1
20190125272 Szu May 2019 A1
20190125361 Shelton, IV et al. May 2019 A1
20190125454 Stokes et al. May 2019 A1
20190125455 Shelton, IV et al. May 2019 A1
20190125456 Shelton, IV et al. May 2019 A1
20190125457 Parihar et al. May 2019 A1
20190125458 Shelton, IV et al. May 2019 A1
20190125459 Shelton, IV et al. May 2019 A1
20190138907 Szu May 2019 A1
20190148013 Pulitzer et al. May 2019 A1
20190159735 Rundo et al. May 2019 A1
20190192855 Bharmi et al. Jun 2019 A1
20190200977 Shelton, IV et al. Jul 2019 A1
20190201124 Shelton, IV et al. Jul 2019 A1
20190201136 Shelton, IV et al. Jul 2019 A1
20190201848 Rao Jul 2019 A1
20190206562 Shelton, IV et al. Jul 2019 A1
20190206563 Shelton, IV et al. Jul 2019 A1
20190206565 Shelton, IV Jul 2019 A1
20190206576 Shelton, IV et al. Jul 2019 A1
20190212345 Lam et al. Jul 2019 A1
20190214147 Ariely Jul 2019 A1
20190216326 Cross et al. Jul 2019 A1
20190223791 Sayani et al. Jul 2019 A1
20190224441 Poltorak Jul 2019 A1
20190231249 Dascalu Aug 2019 A1
20190231903 Bradbury et al. Aug 2019 A1
20190237186 El-Baz et al. Aug 2019 A1
20190247650 Tran Aug 2019 A1
20190247662 Poltorak Aug 2019 A1
20190256924 Vogelstein et al. Aug 2019 A1
20190277870 Kluckner et al. Sep 2019 A1
20190282141 Causey, III et al. Sep 2019 A1
20190290172 Hadad et al. Sep 2019 A1
20190311191 Aarabi et al. Oct 2019 A1
20190313966 Lanzkowsky Oct 2019 A1
20190321394 Wager et al. Oct 2019 A1
20190321583 Poltorak Oct 2019 A1
20190325991 Ishii Oct 2019 A1
20190330350 Freeman et al. Oct 2019 A1
20190332757 Chen et al. Oct 2019 A1
20190336678 Rule Nov 2019 A1
20190351031 Wang et al. Nov 2019 A1
20190392931 Abousy et al. Dec 2019 A1
20200004336 Newberry Jan 2020 A1
20200020247 Simpson et al. Jan 2020 A1
20200022560 Oosake Jan 2020 A1
20200033258 Benni Jan 2020 A1
20200057661 Bendfeldt Feb 2020 A1
20200066405 Peyman Feb 2020 A1
20200077892 Tran Mar 2020 A1
20200082925 Iwata Mar 2020 A1
20200085312 Tzvieli et al. Mar 2020 A1
20200086078 Poltorak Mar 2020 A1
20200093387 Gutierrez et al. Mar 2020 A1
20200093427 Akhbardeh et al. Mar 2020 A1
20200097814 Devesa Mar 2020 A1
20200098461 Macoviak et al. Mar 2020 A1
20200114164 Bourke, Jr. et al. Apr 2020 A1
20200118458 Shriberg et al. Apr 2020 A1
20200121262 De Haan Apr 2020 A1
20200126226 Adiri et al. Apr 2020 A1
20200126227 Adiri et al. Apr 2020 A1
20200126664 Sato Apr 2020 A1
20200134672 el Kaliouby et al. Apr 2020 A1
20200135042 An et al. Apr 2020 A1
20200138360 Fan et al. May 2020 A1
20200151878 Kluckner et al. May 2020 A1
20200155001 Perez et al. May 2020 A1
20200158745 Tian et al. May 2020 A1
20200160998 Ward et al. May 2020 A1
20200163602 Pareddy et al. May 2020 A1
20200164132 Loderer et al. May 2020 A1
20200164209 Hogg et al. May 2020 A1
20200166760 Samec et al. May 2020 A1
20200176099 Welss et al. Jun 2020 A1
20200179717 Lee et al. Jun 2020 A1
20200182778 Srivastava Jun 2020 A1
20200185100 Francois Jun 2020 A1
20200187860 Myslinski Jun 2020 A1
20200188164 Myslinski Jun 2020 A1
20200188708 Myslinski Jun 2020 A1
20200193587 Mairhofer Jun 2020 A1
20200193597 Fan et al. Jun 2020 A1
20200196968 Toyoda et al. Jun 2020 A1
20200209214 Zohar et al. Jul 2020 A1
20200211692 Kalafut et al. Jul 2020 A1
20200211709 Devesa Jul 2020 A1
20200211713 Shadforth et al. Jul 2020 A1
20200211716 Lefkofsky et al. Jul 2020 A1
20200214571 Bradbury et al. Jul 2020 A1
20200222711 Walder et al. Jul 2020 A1
20200227144 Callicoat et al. Jul 2020 A1
20200237274 Hatch Jul 2020 A1
Foreign Referenced Citations (97)
Number Date Country
2013237667 Oct 2013 AU
2013201634 May 2015 AU
2018201076 Mar 2018 AU
2311487 Nov 1999 CA
1712226 Oct 2006 EP
2416270 Feb 2012 EP
2397969 Apr 2013 EP
2389573 Jan 2016 EP
1773943 Mar 2016 EP
2355690 Jan 2018 EP
3150239 Jan 2019 EP
3663785 Jun 2020 EP
3419519 Jul 2020 EP
3160554 Oct 2021 EP
3419502 May 2022 EP
2678070 Oct 2022 EP
3426131 Nov 2022 EP
276599 Oct 2016 IN
292623 Sep 2018 IN
432622 May 2023 IN
488928 Dec 2023 IN
9639928 Dec 1996 WO
9939633 Aug 1999 WO
0042560 Jul 2000 WO
0067635 Nov 2000 WO
0130231 May 2001 WO
0215818 Feb 2002 WO
02032406 Apr 2002 WO
0239873 May 2002 WO
02085195 Oct 2002 WO
02086478 Oct 2002 WO
02086500 Oct 2002 WO
2006110172 Oct 2006 WO
2007030124 Mar 2007 WO
2007050902 May 2007 WO
2007138598 Dec 2007 WO
2007144148 Dec 2007 WO
2008086311 Jul 2008 WO
2008106644 Sep 2008 WO
2008111994 Sep 2008 WO
2008144613 Nov 2008 WO
2009089292 Jul 2009 WO
2010002278 Jan 2010 WO
2011031351 Mar 2011 WO
2011127467 Oct 2011 WO
2012159012 Nov 2012 WO
2013052318 Apr 2013 WO
2014137913 Sep 2014 WO
2014164717 Oct 2014 WO
2014165607 Oct 2014 WO
2015119520 Aug 2015 WO
2015130333 Sep 2015 WO
2015176043 Nov 2015 WO
2015197385 Dec 2015 WO
2016001922 Jan 2016 WO
2017075009 May 2017 WO
2017205047 Nov 2017 WO
2017206888 Dec 2017 WO
2017216724 Dec 2017 WO
2017217597 Dec 2017 WO
2018035387 Feb 2018 WO
2018057058 Mar 2018 WO
2018060996 Apr 2018 WO
2018064569 Apr 2018 WO
2018069789 Apr 2018 WO
2018081423 May 2018 WO
2018102740 Jun 2018 WO
2018112459 Jun 2018 WO
2019043446 Mar 2019 WO
2019060298 Mar 2019 WO
2019084517 May 2019 WO
2019086955 May 2019 WO
2019102277 May 2019 WO
2019113512 Jun 2019 WO
2019118941 Jun 2019 WO
2019126774 Jun 2019 WO
2019130313 Jul 2019 WO
2019132915 Jul 2019 WO
2019136513 Jul 2019 WO
2019141869 Jul 2019 WO
2019157277 Aug 2019 WO
2019161411 Aug 2019 WO
2019173237 Sep 2019 WO
2019173283 Sep 2019 WO
2019183399 Sep 2019 WO
2019210272 Oct 2019 WO
2019212833 Nov 2019 WO
2019213133 Nov 2019 WO
2019213783 Nov 2019 WO
2019237191 Dec 2019 WO
2019246239 Dec 2019 WO
2020006145 Jan 2020 WO
2020025684 Feb 2020 WO
2020025696 Feb 2020 WO
2020035852 Feb 2020 WO
2020036620 Feb 2020 WO
2020041204 Feb 2020 WO
Non-Patent Literature Citations (66)
Entry
Raikhel, Accuracy of noninvasive and invasive point-of-care total blood hemoglobin measurement in an outpatient setting, Postgraduate Medicine, 2012, 124(4):250-255.
Rice et al., Noninvasive hemoglobin monitoring: how accurate is enough?, Anesthesia & Analgesia, 2013, 117(4):902-907.
Scully et al., Physiological parameter monitoring from optical recordings with a mobile phone, IEEE Transactions on Biomedical Engineering, 2011, 59(2):303-306.
Siddiqui et al., A pulse rate estimation algorithm using PPG and smartphone camera, Journal of Medical Systems, 2016, 40(126):1-6.
Smith et al., Second window for in vivo imaging, Nature Nanotechnology, 2009, 4(11):710-711.
STAT Innovations, Making Screening for Anaemia as Simple as Taking a Selfie, Retrieved from https://web.archive.org/web/20221004232209/https://www.statinnovations.com/eyenaemia, Copyright 2018 STAT Innovations Pty. Ltd., 4 pages.
Suner et al., Non-invasive determination of hemoglobin by digital photography of palpebral conjunctiva, The Journal of Emergency Medicine, 2007, 33(2):105-111.
Tamura et al., Wearable photoplethysmographic sensors—past and present, Electronics, 2014, 3(2):282-302.
Uguz et al., Multifunctional photoplethysmography sensor design for respiratory and cardiovascular diagnosis, in “World Congress on Medical Physics and Biomedical Engineering 2018: Jun. 3-8, 2018, Prague, Czech Republic,” Springer, vol. 2, 2019, pp. 905-909.
Wang et al., HemaApp: noninvasive blood screening of hemoglobin using smartphone cameras, Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 2016, 12 pages.
Wang et al., HemaApp IR: noninvasive hemoglobin measurement using unmodified smartphone cameras and built-in LEDs, Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers, 2017, pp. 305-308.
Wu et al., Screening for iron deficiency, Pediatrics in Review, 2002, 23(5):171-178.
Yamakoshi et al., Pulse glucometry: a new approach for noninvasive blood glucose measurement using instantaneous differential near-infrared spectrophotometry, Journal of Biomedical Optics, 2006, 11(5):054028, pp. 1-9.
Zhang, Photoplethysmography-based heart rate monitoring in physical activities via joint sparse spectrum reconstruction, IEEE Transactions on Biomedical Engineering, arXiv preprint, arXiv:1503.00688, 2015, 9 pages.
Zhang et al., Evaluating photoplethysmogram as a real-time cognitive load assessment during game playing, International Journal of Human-Computer Interaction, 2018, 34(8):695-706.
Zheng et al., The preliminary investigation of imaging photoplethysmographic system, Journal of Physics: Conference Series, IOP Publishing, 2007, 85(012031):1-5.
PCT International Search Report and Written Opinion, PCT/US2019/020675, May 23, 2019, 28 pages.
Ahsan et al., A novel real-time non-invasive hemoglobin level detection using video images from smartphone camera, 2017 IEEE 41st Annual Computer Software and Applications Conference (COMPSAC), IEEE, 2017, vol. 1, 15 pages.
Allen, Photoplethysmography and its application in clinical physiological measurement, Physiological Measurement, 2007, 28(3):R1-R39.
Anderson, The accuracy of pulse oximetry in neonates: effects of fetal hemoglobin and bilirubin, Journal of Perinatology, 1987, 7(4):323.
Anggraeni et al., Non-invasive self-care anemia detection during pregnancy using a smartphone camera, IOP Conference Series: Materials Science and Engineering, 2017, 172(012030):1-6.
Benenson et al., Sickle cell disease: bone, joint, muscle, and motor complications, Orthopaedic Nursing, 2018, 37(4):221-227.
Bui et al., PhO2: smartphone based blood oxygen level measurement systems using near-IR and red wave-guided light, Proceedings of the 15th ACM Conference on Embedded Network Sensor Systems, 2017, 14 pages.
Carroll et al., Laser-tissue interactions, Clinics in Dermatology, 2006, 24(1):2-7.
Causey et al., Validation of noninvasive hemoglobin measurements using the Masimo Radical-7 SpHb Station, The American Journal of Surgery, 2011, 201(5):592-598.
Centers for Disease Control and Prevention, Sickle Cell Disease (SCD), Retrieved from https://www.cdc.gov/ncbddd/sicklecell/index.html, Accessed in 2017, 4 pages.
Challoner, Photoelectric plethysmography for estimating cutaneous blood flow, in “Non-Invasive Physiological Measurements,” Academic Press, 1979, Chapter 6, pp. 125-151.
Chang et al., Visible light optical spectroscopy is sensitive to neovascularization in the dysplastic cervix, Journal of Biomedical Optics, 2010, 15(5):057006, pp. 1-9.
Chanklan et al., Runoff prediction with a combined artificial neural network and support vector regression, International Journal of Machine Learning and Computing, 2018, 8(1):39-43.
Collings et al., Non-invasive detection of anaemia using digital photographs of the conjunctiva, PloS One, 2016, 11(4):e0153286, pp. 1-10.
Dantu et al., Non-invasive blood glucose monitor based on spectroscopy using a smartphone, 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, IEEE, 2014, pp. 3695-3698.
Edwards et al., Smartphone based optical spectrometer for diffusive reflectance spectroscopic measurement of hemoglobin, Scientific Reports, 2017, 7(1):12224, pp. 1-7.
Giavarina, Understanding Bland Altman analysis, Biochemia Medica, 2015, 25(2):141-151.
Gordy et al., Spectrophotometric studies: XVI. Determination of the oxygen saturation of blood by a simplified technique, applicable to standard equipment, Journal of Biological Chemistry, 1957, 227(1):285-299.
Hadar et al., Precision and accuracy of noninvasive hemoglobin measurements during pregnancy, The Journal of Maternal-Fetal and Neonatal Medicine, 2012, 25(12):2503-2506.
Hasan et al., Road structure analysis using GPS information, 2013 International Conference on Electrical Information and Communication Technology (EICT), IEEE, 2014, 6 pages.
Hasan et al., Pain level detection from facial image captured by smartphone, Journal of Information Processing, 2016, 24(4):598-608.
Hasan et al., A novel process to extract important information from invisible video captured by smartphone, 2017 IEEE Great Lakes Biomedical Conference (GLBC), IEEE, 2017, 2 pages.
Hasan et al., Analyzing the existing noninvasive hemoglobin measurement techniques, 2017 IEEE 8th Annual Ubiquitous Computing, Electronics and Mobile Communication Conference (UEMCON), IEEE, 2017, pp. 442-448.
Hasan et al., BILD (Big Image in Less Dimension): a novel technique for image feature selection to apply partial least square algorithm, 2017 IEEE Great Lakes Biomedical Conference (GLBC), IEEE, 2017, 2 pages.
Hasan et al., RGB pixel analysis of fingertip video image captured from sickle cell patient with low and high level of hemoglobin, 2017 IEEE 8th Annual Ubiquitous Computing, Electronics and Mobile Communication Conference (UEMCON), IEEE, 2017, pp. 499-505.
Hasan et al., A novel technique of noninvasive hemoglobin level measurement using HSV value of fingertip image, arXiv preprint, arXiv:1910.02579, 2019, 23 pages.
Hemoglobe, Hemoglobe: See Who is Nearby with Hemoglobe, Retrieved from https://hemoglobe.com/archive/, Accessed on Mar. 1, 2018, 3 pages.
Johnson et al., Smartphone-based light and sound intensity calculation application for accessibility measurement, RESNA Annual Conference, 2015, 5 pages.
Jonathan et al., Investigating a smartphone imaging unit for photoplethysmography, Physiological Measurement, 2010, 31(11):N79.
Jonathan et al., Cellular phone-based photoplethysmographic imaging, Journal of Biophotonics, 2010, 1-4, 5 pages.
Jones, Medical electro-optics: measurements in the human microcirculation, Physics in Technology, 1987, 18(2):79.
Kalra, Developing FE Human Models from Medical Images, in “Basic Finite Element Method as Applied to Injury Biomechanics,” Academic Press, 2018, Chapter 9, pp. 389-415.
Kawsar et al., A novel activity detection system using plantar pressure sensors and smartphone, 2015 IEEE 39th Annual International Computers, Software & Applications Conference, IEEE, 2015, vol. 1, pp. 44-49.
Kawsar et al., Activity detection using time-delay embedding in multi-modal sensor system, Inclusive Smart Cities and Digital Health: 14th International Conference on Smart Homes and Health Telematics, ICOST 2016, Proceedings 14, 2016, pp. 489-499.
Keijzer et al., Light distributions in artery tissue: Monte Carlo simulations for finite-diameter laser beams, Lasers in Surgery and Medicine, 1989, 9(2):148-154.
Le, The prevalence of anemia and moderate-severe anemia in the US population (NHANES 2003-2012), PloS One, 2016, 11(11):e0166635, pp. 1-14.
Li et al., Noninvasive hemoglobin measurement based on optimizing Dynamic Spectrum method, Spectroscopy Letters, 2017, 50(3):164-170.
Loonsk, BioSense—A national initiative for early detection and quantification of public health emergencies, Morbidity and Mortality Weekly Report, 2004, 53, Supplement: Syndromic Surveillance, pp. 53-55.
Lisboa, A review of evidence of health benefit from artificial neural networks in medical intervention, Neural Networks, 2002, 15(1):11-39.
Lisboa et al., The use of artificial neural networks in decision support in cancer: a systematic review, Neural Networks, 2006, 19(4):408-415.
Mahmud et al., Designing Access Control Model and Enforcing Security Policies Using PERMIS for a Smart Item E-Health Scenario, International Journal of Engineering Science and Technology, 2010, 2(8):3777-3787.
Masimo, Masimo rainbow® Pulse CO-Oximetry, Retrieved from https://professional.masimo.com/technology/co-oximetry/rainbow/, Copyright 2024 Masimo, 5 pages.
McMillan, These Medical Apps Have Doctors and the FDA Worried, Retrieved from https://www.wired.com/2014/07/medical-apps/, Jul. 29, 2014, 14 pages.
Mendelson, Pulse oximetry: theory and applications for noninvasive monitoring, Clinical Chemistry, 1992, 38(9):1601-1607.
Millasseau et al., Contour analysis of the photoplethysmographic pulse measured at the finger, Journal of Hypertension, 2006, 24(8):1449-1456.
Mukaka, Statistics Corner: A guide to appropriate use of correlation coefficient in medical research, Malawi Medical Journal, 2012, 24(3):69-71.
Nam et al., Photoplethysmography signal analysis for optimal region-of-interest determination in video imaging on a built-in smartphone under different conditions, Sensors, 2017, 17(10):2385, pp. 1-18.
Pelegris et al., A novel method to detect heart beat rate using a mobile phone, 32nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, IEEE, 2010, 5 pages.
Punter-Villagrasa et al., An instantaneous low-cost point-of-care anemia detection device, Sensors, 2015, 15(2):4564-4577.
Qiu et al., Recent progress in upconversion photodynamic therapy, Nanomaterials, 2018, 8(5):344, pp. 1-18.
Related Publications (1)
Number Date Country
20210007648 A1 Jan 2021 US
Provisional Applications (1)
Number Date Country
62638630 Mar 2018 US