Blood oxygenation is an important biomarker, and its accurate measurement is desirable for many health-related reasons. For example, regularly monitoring blood oxygenation can help detect cardiac and pulmonary conditions such as hypoxemia and sleep apnea. Athletes often use blood oxygenation measures to monitor their performance and improve their endurance training.
Pulse oximetry is one technique that has been used to measure blood oxygenation. Pulse oximeters leverage the differing light absorption rates of hemoglobin (oxygenated red blood cells) and deoxyhemoglobin (non-oxygenated red blood cells) at different wavelengths of light, typically red and near-infrared. An oximeter typically includes a small measurement device clipped to a finger or ear lobe to measure peripheral arterial oxygen saturation. The device typically includes red and near-infrared light emission sources on one side of the finger, and light sensors on the other side. The light sensors measure the red and near-infrared light that has passed through the finger, and the relative red and near-infrared light intensities are used to estimate oxygenation. While devices designed specifically for pulse oximetry are inexpensive and accurate (e.g., ±2-3%), they are single-purpose devices. Because of the inconvenience of keeping a specialized device at hand, pulse oximeters are not often used by people without compelling reasons.
Unlike pulse oximeters, smartphones are devices that people often keep on their person or nearby. The potential to use smartphones as pulse oximeters without special hardware has been considered. The main solution to date has been to use a smartphone's photography flash as an illuminant in combination with the smartphone's rear-facing camera. A finger is placed over both the flash and the camera; white light from the flash passes through the finger, and some of it is reflected to the camera. The camera signal is processed to estimate oxygenation. Although this technique provides an accurate measure of heart rate, oxygenation measures are unreliable for several reasons. Most smartphone cameras have integrated block filters that minimize optical sensitivity in the near-infrared region. Much of this filter-blocked region of light happens to include wavebands where deoxyhemoglobin reflects more light than oxyhemoglobin. Consequently, due to the near elimination of sensing in these high-contrast bands, and due to the roughly uniform spectrum of flash light, flash light reflections from oxyhemoglobin and deoxyhemoglobin have low contrast and therefore result in less precise measures. Another approach has been to equip smartphones with additional hardware illuminants (e.g., light emitting diodes) and/or sensors, but such hardware sees low utilization, and its added cost and hardware footprint might not be justified.
Techniques for using a computing device to measure pulse oximetry are discussed below.
The following summary is included only to introduce some concepts discussed in the Detailed Description below. This summary is not comprehensive and is not intended to delineate the scope of the claimed subject matter, which is set forth by the claims presented at the end.
A computing device has a display and a camera. The display emits light comprising a first waveband component and a second waveband component. The light from the display transmits through matter and is reflected to the camera. The reflected display light has a first waveband component and a second waveband component. Image data from the camera provides a first intensity corresponding to the first waveband component and a second intensity corresponding to the second waveband component. In one embodiment, a ratio of the first intensity to the second intensity is used to determine a property of the matter. Other embodiments may use other functions that involve the intensities of two or more bands of illumination. The technique may be used to measure relative ratios (or other functions) of any light-transmitting constituents of the matter. If the matter includes pulsing blood, the ratio corresponds to blood oxygenation.
Many of the attendant features will be explained below with reference to the following detailed description considered in connection with the accompanying drawings.
The present description will be better understood from the following detailed description read in light of the accompanying drawings, wherein like reference numerals are used to designate like parts in the accompanying description.
Embodiments discussed below relate to using a display and camera of a computing device to measure pulse oximetry. The display of the device is used as an illuminant. In one embodiment, a finger is placed over a portion of the display and over a camera facing in the same direction as the display (e.g., a front-facing camera of a smartphone). One or more colors are selected to enhance hemoglobin-deoxyhemoglobin contrast in view of display and camera sensitivities. The one or more colors are displayed while a finger or other body part covers the displayed color and the camera. The camera captures images of light that has passed through the finger and been partially internally reflected to the camera. The light reaching the camera has been diminished by absorption by arterial hemoglobin and deoxyhemoglobin at different rates in respective different wavebands. Differences in attenuation of display light at the different wavebands provide sufficient contrast (ratio R) to compute an accurate blood oxygenation estimate (e.g., commonly by using a lookup table).
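The mapping from the contrast ratio R to an oxygenation percentage can be sketched with the widely cited empirical linear approximation SpO2 ≈ 110 − 25·R. The coefficients below are illustrative only; a production device would use a calibration curve or lookup table specific to its display and camera.

```python
def spo2_from_ratio(r: float) -> float:
    """Estimate blood oxygen saturation (percent) from the
    ratio-of-ratios R using a common empirical linear fit.
    The coefficients (110, 25) are illustrative; real devices
    use a per-device calibration curve or lookup table."""
    spo2 = 110.0 - 25.0 * r
    # Clamp to a physically meaningful percentage range.
    return max(0.0, min(100.0, spo2))
```

For example, a measured ratio of 0.5 would map to an estimate of 97.5% under this illustrative fit.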
The computing device 100 also includes processing hardware 108 to execute a process 110 for determining a property 112 of the matter 106. The process 110 may be an application executed by the computing device's operating system or other such software. Initially, an illuminant color is selected 114. Consider that there are high-contrast wavebands where the constituent components of the matter 106 have different respective absorption rates (typically in bands between isosbestic points). The illuminant color is selected to maximize illumination at these high-contrast bands. The illuminant color 116 is displayed 118 by the display 104. The light from the displayed color 116 transits through the matter 106 and is partially reflected to the camera 102. While the color 116 is displayed, the camera captures one or more images 120 of the light from the display 104 that has both transited through the matter 106 and reflected to the camera. The images 120 include two or more color channels. As described later, the images 120 are processed 122 to extract whichever color channels are appropriate for contrast-sensitive wavebands of the elements in the optical pathway (i.e., the display, matter, and camera).
At step 132 target color channels are extracted from the captured video/image sequence. The result is a raw time-domain intensity signal for each color. At step 134 each target color signal is passed through one or more filters for noise reduction, etc. At step 136 a statistical measure of intensity is obtained for each target color signal. The statistical measure may be any type of statistical aggregation, such as an arithmetic mean, harmonic mean, geometric mean, root mean square, etc. Different statistical measures might be taken for the respective target color signals, for different time periods, for different signal components, etc. For discussion, it will be assumed that each target color signal yields a respective statistical intensity. At step 138 a ratio of the intensities is computed, and at step 140 the ratio is applied to a table or function that maps the ratio to relative proportions of the constituent components. It is also possible to use other functions of the intensities to identify the composition of the measured matter. Any function that meaningfully varies with varying intensities of the color signals may be considered.
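The steps above can be sketched as follows, assuming the per-frame channel intensities have already been extracted. The moving-average filter, arithmetic-mean aggregation, and interpolated lookup table are illustrative stand-ins for whichever filters, statistical measures, and calibration data a given implementation uses.

```python
from statistics import mean

def moving_average(signal, window=5):
    """Simple noise-reduction filter (step 134)."""
    if len(signal) < window:
        return list(signal)
    return [mean(signal[i:i + window])
            for i in range(len(signal) - window + 1)]

def ratio_from_signals(first, second, window=5):
    """Steps 134-138: filter each target color signal, take a
    statistical intensity measure (here the arithmetic mean),
    and form the ratio of the two intensities."""
    i1 = mean(moving_average(first, window))
    i2 = mean(moving_average(second, window))
    return i1 / i2

def lookup_composition(ratio, table):
    """Step 140: map the ratio to relative proportions via
    linear interpolation over a (ratio, proportion) table,
    assumed sorted by ratio. Table values are calibration data."""
    for (r0, p0), (r1, p1) in zip(table, table[1:]):
        if r0 <= ratio <= r1:
            t = (ratio - r0) / (r1 - r0)
            return p0 + t * (p1 - p0)
    raise ValueError("ratio outside calibration table")
```

Any of the statistical aggregations mentioned above could be substituted for the arithmetic mean without changing the overall flow.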
An embodiment for implementing pulse oximetry on a smartphone with stock hardware is now described.
While embodiments are described for emitting two color channels, depending on the material being measured and the profiles of the camera and display, accuracy might be higher if three color channels are displayed (either uniformly or non-uniformly, as circumstances suggest). Similarly, more than two color channels of the images may be used for higher accuracy. Furthermore, although this description mentions selecting one or more colors for illumination, an automated decision-making process to identify ideal colors is not required. For applications intended for a known material (e.g., blood and tissue), the particular colors to be displayed and/or analyzed for intensity may be hard-coded to be specific to the material. In another embodiment, there may be an incremental walk through the camera/display spectrum with sampling and analysis performed across many wavebands of the spectrum, which can reveal wavebands where there is maximal contrast. In yet another embodiment, a user interface may allow a user to specify the target material and target colors are set accordingly during runtime.
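The incremental walk through the camera/display spectrum can be sketched as a search for the waveband where the two constituents' absorption rates differ the most. The band labels and absorption values in the usage below are hypothetical placeholders for measured or published spectra.

```python
def find_max_contrast_band(bands, absorb_a, absorb_b):
    """Walk the candidate wavebands and return the one where
    the two constituents' absorption rates differ the most.
    `bands` is a list of waveband labels; `absorb_a` and
    `absorb_b` map each band to an absorption rate for the
    respective constituents."""
    return max(bands, key=lambda b: abs(absorb_a[b] - absorb_b[b]))
```

In practice the sampling and analysis would be performed across many narrow wavebands rather than the three coarse bands shown here.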
As can be seen in
To extract the amplitudes of the camera/image color signals, the intensity levels of the red and green channels are obtained from a sensor/image region that is closest to the light source, i.e., the displayed color patch. In one embodiment, this region spans ⅓ of the image width. In short, a sub-portion of each captured frame may be used as the initial sample. It is also possible to determine a sampling area based on the location where the finger is contacting the display (if the geometry of the smartphone is known in advance). Twenty seconds of camera sampling data may suffice. For each frame or image, a raw value is derived from the sampled region's average intensity, for each color channel. The image area used for processing may also be determined automatically. For example, an image of the finger with and without screen illumination may be compared, and only the part of the frame where there is a sufficient difference in signal between the two states/images is used.
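The region sampling and automatic region selection described above might be sketched as follows. The function names, the assumption that the color patch lies to one side of the camera, and the difference threshold are all illustrative.

```python
import numpy as np

def sample_region_means(frame, side="left", frac=1.0 / 3.0):
    """Return per-channel mean intensity over the sub-region of
    an (H, W, C) frame closest to the displayed color patch.
    Here the patch is assumed to lie to the left of the camera,
    so the leftmost `frac` of the width is sampled."""
    h, w, c = frame.shape
    cols = max(1, int(w * frac))
    region = frame[:, :cols] if side == "left" else frame[:, -cols:]
    return region.reshape(-1, c).mean(axis=0)

def auto_roi_mask(lit_frame, dark_frame, threshold=10.0):
    """Automatic region selection: keep only pixels whose
    intensity differs sufficiently between a screen-illuminated
    frame and an unilluminated frame."""
    diff = np.abs(lit_frame.astype(float) - dark_frame.astype(float))
    return diff.mean(axis=2) > threshold
```

A raw per-frame value for each color channel then comes directly from `sample_region_means`, accumulated over, e.g., twenty seconds of frames.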
Although a liquid crystal display was tested, organic light emitting diode displays have similar emission profiles and may provide better contrast. Another approach to illumination is to alternate between displaying red and displaying green. That is, as opposed to displaying red and green together (i.e., yellow), the red channel is obtained only from images captured when the display emits red light, and the green channel is obtained only from images captured when the display emits green light. Measurements have demonstrated that using the display as the illuminant provides twice the contrast of using a smartphone flash as the illuminant (assuming similar illumination intensities). Although a smartphone is well-suited to the techniques described herein, any device with suitable processing circuitry and with a display near a camera, both facing the same direction, may be used. As noted above, by varying the choice of illuminants, it is possible to determine information about the composition of display-illuminated matter by choosing the illumination colors according to isosbestic points of the illuminated matter; relative changes in the contrast signal can be used to determine relative ratios of constituent components (compounds, elements, etc.) of the target material.
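The alternating-illumination approach can be demultiplexed by assigning each captured frame to the color the display was emitting at capture time. This sketch assumes per-frame (red, green) channel means and a parallel list of display states; a real implementation would derive the states from display timing.

```python
def demux_alternating(frames, states):
    """Split captured frame intensities into per-color signals
    when the display alternates between red and green. `frames`
    is a list of (red_mean, green_mean) tuples; `states` labels
    which color the display was emitting for each frame."""
    red_signal = [f[0] for f, s in zip(frames, states) if s == "red"]
    green_signal = [f[1] for f, s in zip(frames, states) if s == "green"]
    return red_signal, green_signal
```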
In one embodiment, the color displayed by the display is sized and positioned according to finger position, and low-intensity guides (e.g., lines) are displayed to show where the finger should be placed and kept. Contrast—and hence accuracy and precision—can be improved by minimizing non-display illumination. At the least, covering the device during measurement may be helpful. Performing a measurement in a dark room, or with the device placed flush against a body area such as the forehead or wrist, may also increase accuracy. Measurement periods can be communicated to a user using sounds, haptic feedback, or graphics displayed sufficiently distant from the camera.
In yet another embodiment, the captured image/video data is transmitted via a network to another computing device or compute cloud that processes the image/video data to derive a ratio or other measure of constituent components. An application protocol may include elements such as an initial exchange in which the device with the camera and display transmits information identifying the device. A backend service and the measuring device both implement the protocol. The backend service maintains a database of devices and their properties: the model and manufacturer, which camera and display each device has, properties of the cameras and displays (e.g., brightness and sensitivity profiles), user instructions for each device, display instructions for displaying color(s), etc. On the measuring device, when a measurement application is registered, installed, or executed, the application sends its identity to the service. The service stores this information in session data, for instance, and returns device-specific information such as display information indicating which color(s) should be displayed, for how long, and in what patterns or location on the display. When a measurement is taken, the captured image data is sent to the service. The service processes the image data according to the profile of the device and returns the final analysis to the measuring device or smartphone. A final measurement, for instance a percentage of blood oxygenation, is displayed on the display of the measuring device.
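The registration and measurement exchange described above might be sketched as below. The device identifier, JSON field names, and calibration values are all hypothetical, and a real deployment would add transport, authentication, and error handling.

```python
import json

# Hypothetical backend database keyed by device model.
DEVICE_DB = {
    "phone-x1": {
        "display_color": "#FFFF00",   # red + green displayed together
        "duration_s": 20,
        "patch_location": "above_camera",
        # Two-point (ratio, percent) calibration, illustrative values.
        "calibration": [[0.4, 100.0], [1.4, 75.0]],
    }
}

def handle_register(request_json):
    """Service side: return device-specific display instructions
    for the registered device identity."""
    device_id = json.loads(request_json)["device_id"]
    profile = DEVICE_DB[device_id]
    return json.dumps({
        "display_color": profile["display_color"],
        "duration_s": profile["duration_s"],
        "patch_location": profile["patch_location"],
    })

def handle_measurement(request_json, ratio_fn):
    """Service side: reduce the submitted intensity data to a
    ratio, apply the device's calibration by linear interpolation,
    and return the final percentage."""
    req = json.loads(request_json)
    ratio = ratio_fn(req["intensities"])
    (r0, p0), (r1, p1) = DEVICE_DB[req["device_id"]]["calibration"]
    t = (ratio - r0) / (r1 - r0)
    return json.dumps({"spo2_percent": round(p0 + t * (p1 - p0), 1)})
```

The measuring device's side of the protocol reduces to two requests: one to fetch display instructions, and one to submit the captured data and receive the final measurement for display.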
In other embodiments, a transparent sticker (e.g., a light guide) placed on the display may accumulate the display light and channel it to a point next to the camera, under the finger. Such an arrangement might increase the amount of light entering the finger. In another embodiment, a mirror may be used to reflect the display light to the camera. This arrangement could be used to measure the transmittance of a liquid between the device and the mirror.
In addition to the display 104, the computing device 100 may have a network interface 354 (or several), as well as storage hardware 356 and processing hardware 358, which may be a combination of any one or more of: central processing units, graphics processing units, analog-to-digital converters, bus chips, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), or complex programmable logic devices (CPLDs), etc. The storage hardware 356 may be any combination of magnetic storage, static memory, volatile memory, non-volatile memory, optically or magnetically readable matter, etc. The meaning of the terms "storage" and "storage hardware", as used herein, does not refer to signals or energy per se, but rather refers to physical apparatuses and states of matter. The hardware elements of the computing device 100 may cooperate in ways well understood in the art of machine computing. In addition, input devices may be integrated with or in communication with the computing device 100. The computing device 100 may have any form-factor or may be used in any type of encompassing device. The computing device 100 may be in the form of a handheld device such as a smartphone, a tablet computer, a gaming device, a server, a rack-mounted or backplaned computer-on-a-board, a system-on-a-chip, or others.
Embodiments and features discussed above can be realized in the form of information stored in volatile or non-volatile computer or device readable storage hardware. This is deemed to include at least hardware such as optical storage (e.g., compact-disk read-only memory (CD-ROM)), magnetic media, flash read-only memory (ROM), or any means of storing digital information in a form readily available to the processing hardware 358. The stored information can be in the form of machine executable instructions (e.g., compiled executable binary code), source code, bytecode, or any other information that can be used to enable or configure computing devices to perform the various embodiments discussed above. This is also considered to include at least volatile memory such as random-access memory (RAM) and/or virtual memory storing information such as central processing unit (CPU) instructions during execution of a program carrying out an embodiment, as well as non-volatile media storing information that allows a program or executable to be loaded and executed. The embodiments and features can be performed on any type of computing device, including portable devices, workstations, servers, mobile wireless devices, and so on.