The present invention relates to a wearable device, and more particularly, to a wearable device capable of reducing the influence of ambient light in order to detect physiological information.
A wearable device applying photoplethysmography (PPG) techniques to detect physiological information of a user (for example, heart rate) must be tightly attached to the user (for example, at the wrist); otherwise, the detected physiological information will be inaccurate due to the influence of ambient light. Therefore, a novel design to reduce the influence of the ambient light is desired.
One of the objectives of the present invention is to provide a wearable device and an associated method to reduce the influence of ambient light.
According to an embodiment of the present invention, a wearable device is disclosed, comprising: a light source, a sensor and a processor. The light source selectively operates in an illuminating mode or a non-illuminating mode. In the illuminating mode, the light source generates an auxiliary light passing through a physical body. The sensor is arranged to capture detecting images from the physical body, wherein the detecting images comprise at least one illuminating image captured while the light source is in the illuminating mode, at least one pre-illuminating image captured before the illuminating image is captured while the light source is in the non-illuminating mode, and at least one post-illuminating image captured after the illuminating image is captured while the light source is in the non-illuminating mode. The processor is coupled to the sensor, and is arranged to generate physiological information of the physical body according to the illuminating image, the pre-illuminating image and the post-illuminating image.
According to an embodiment of the present invention, a detecting method employed by a wearable device is disclosed, comprising: controlling a light source of the wearable device to selectively operate in an illuminating mode or a non-illuminating mode; in the illuminating mode, generating, by the light source, an auxiliary light passing through a physical body; capturing detecting images from the physical body, wherein the detecting images comprise at least one illuminating image captured in the illuminating mode, at least one pre-illuminating image captured before the illuminating image is captured while in the non-illuminating mode, and at least one post-illuminating image captured after the illuminating image is captured while in the non-illuminating mode; and generating physiological information of the physical body according to the illuminating image, the pre-illuminating image and the post-illuminating image.
These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
Certain terms are used throughout the description and following claims to refer to particular components. As one skilled in the art will appreciate, manufacturers may refer to a component by different names. This document does not intend to distinguish between components that differ in name but not function. In the following description and in the claims, the terms “include” and “comprise” are used in an open-ended fashion, and thus should not be interpreted as a close-ended term such as “consist of”. Also, the term “couple” is intended to mean either an indirect or direct electrical connection. Accordingly, if one device is coupled to another device, that connection may be through a direct electrical connection, or through an indirect electrical connection via other devices and connections.
In this embodiment, the sensor 110 may be a camera for applying the photoplethysmography technique to detect physiological information, e.g. heart rate, of the user 20 by capturing detecting images of the user 20. The detecting images comprise illuminating images IMA1-IMAi captured in the illuminating mode (i.e. when the auxiliary light AUX is provided), pre-illuminating images PreIMA1-PreIMAj captured before the illuminating images IMA1-IMAi are captured while in the non-illuminating mode, and post-illuminating images PostIMA1-PostIMAk captured after the illuminating images IMA1-IMAi are captured while in the non-illuminating mode, wherein i, j and k can be any positive integers. When i is 1, only one illuminating image (i.e. the illuminating image IMA1) is captured. When j is 1, only one pre-illuminating image (i.e. the pre-illuminating image PreIMA1) is captured. When k is 1, only one post-illuminating image (i.e. the post-illuminating image PostIMA1) is captured. The number of captured detecting images is not a limitation of the present invention. Each of the detecting images may be provided as 2D information (including X*Y pixel data) or as statistical information (such as an intensity distribution or color distribution in the 1D or 2D direction) derived from the 2D information.
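As a minimal sketch of the capture sequence described above (the function and class names, the callback interfaces, and the fake sensor are illustrative assumptions, not part of the disclosure), the j pre-illuminating, i illuminating, and k post-illuminating frames could be gathered as follows:

```python
from dataclasses import dataclass

@dataclass
class DetectingImage:
    mode: str    # "illuminating" or "non-illuminating"
    phase: str   # "pre", "illuminating", or "post"
    pixels: list # raw pixel intensities of the detecting image

def capture_cycle(sensor_read, set_light, i=1, j=1, k=1):
    """Capture j pre-illuminating, i illuminating, and k post-illuminating
    images, switching the light source between its two modes.

    sensor_read() returns one frame's pixel data; set_light(True/False)
    switches the light source to the illuminating/non-illuminating mode.
    """
    images = []
    set_light(False)  # non-illuminating mode: pre-illuminating images
    for _ in range(j):
        images.append(DetectingImage("non-illuminating", "pre", sensor_read()))
    set_light(True)   # illuminating mode: auxiliary light AUX is provided
    for _ in range(i):
        images.append(DetectingImage("illuminating", "illuminating", sensor_read()))
    set_light(False)  # non-illuminating mode: post-illuminating images
    for _ in range(k):
        images.append(DetectingImage("non-illuminating", "post", sensor_read()))
    return images
```

A caller would supply the real sensor and light-source drivers; the sketch only fixes the ordering of the three capture phases.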
For higher sensing accuracy, the sensor 110 is preferably installed on a bottom surface of the wearable device 10 that attaches to the user's skin, as shown in
The processor 120 is arranged to process the detecting images captured by the sensor 110 to generate physiological information PHY which may be shown on a display (not shown in
For example, the pre-illuminating image PreIMA1 corresponds to pre-illuminating detected data PreData1, which may include the influence of the ambient light. The illuminating image IMA1 corresponds to illuminating detected data Data1, which includes the influence of the ambient light and of the auxiliary light AUX passing through the body of the user 20. The post-illuminating image PostIMA1 corresponds to post-illuminating detected data PostData1, which includes the influence of the ambient light. The processor 120 generates the physiological information PHY according to the pre-illuminating detected data, the illuminating detected data, and the post-illuminating detected data. It should be noted that the transformation from a detecting image into detected data may be performed by an analog-to-digital converter (ADC) of the processor 120. This is only for illustrative purposes, however; the process of transforming a detecting image into raw data should be well-known to those skilled in the art.
In a brief example, the sensor 110 includes four pixels (e.g. a 2×2 sensor array). The detected data for each detecting image is the intensity summation of the four pixels (Data=Pixel1+Pixel2+Pixel3+Pixel4), wherein Pixel1, Pixel2, Pixel3 and Pixel4 are the intensity values of the pixels in the detecting image.
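The intensity summation in this brief example can be sketched as follows (the function name is illustrative, not from the disclosure; the sketch assumes the four-pixel, 2×2 sensor array of the example):

```python
def detected_data(pixels):
    """Return the intensity summation Data = Pixel1 + Pixel2 + Pixel3 + Pixel4
    for one detecting image, modeled here as a flat list of four intensities."""
    assert len(pixels) == 4, "this sketch assumes a 2x2 (four-pixel) sensor array"
    return sum(pixels)

# A detecting image whose four pixels read 10, 12, 11 and 13 yields
# a detected data value of 46.
print(detected_data([10, 12, 11, 13]))  # 46
```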
When j is not 1, i.e. more than one pre-illuminating image is captured, the processor 120 may further generate an average pre-illuminating detected data PreDataAvg from the pre-illuminating detected data PreData1-PreDataj. When j is 1, the average pre-illuminating detected data PreDataAvg can be easily derived from the pre-illuminating detected data PreData1. In addition, when k is not 1, i.e. more than one post-illuminating image is captured, the processor 120 may further generate an average post-illuminating detected data PostDataAvg from the post-illuminating detected data PostData1-PostDatak. When k is 1, the average post-illuminating detected data PostDataAvg can be easily derived from the post-illuminating detected data PostData1. Likewise, when i is not 1, i.e. more than one illuminating image is captured, the processor 120 may further generate an average illuminating detected data DataAvg from the illuminating detected data Data1-Datai. When i is 1, the average illuminating detected data DataAvg can be easily derived from the illuminating detected data Data1. To reduce the influence of the ambient light, the processor 120 generates an output detected data OutData by subtracting an average of the average pre-illuminating detected data PreDataAvg and the average post-illuminating detected data PostDataAvg from the average illuminating detected data DataAvg, which can be represented by the following equation:
OutData=DataAvg−(PreDataAvg+PostDataAvg)/2.
Considering that the influence of the ambient light can be regarded as linear in a very short period, applying the above equation can effectively reduce the influence of the ambient light from the average illuminating detected data DataAvg, so that the output detected data OutData will only contain the influence of the auxiliary light AUX passing through the body of the user 20. In this way, the physiological information PHY generated by the processor 120 according to the output detected data OutData can be more accurate. It should be noted that the output detected data OutData may be directly or indirectly regarded as the physiological information PHY (e.g. heart rate); for example, the output detected data OutData may further be transformed into the heart rate of the user via some specific operations which will not be discussed in the present invention.
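The averaging and subtraction steps above can be sketched as follows (the function name and list-based interface are illustrative assumptions, not part of the disclosure):

```python
def output_detected_data(pre_data, illum_data, post_data):
    """Compute OutData = DataAvg - (PreDataAvg + PostDataAvg) / 2.

    Each argument is a non-empty list of detected data values, one per
    detecting image; averaging a single-element list simply returns that
    element, which matches the i = j = k = 1 case described above.
    """
    pre_avg = sum(pre_data) / len(pre_data)        # PreDataAvg
    data_avg = sum(illum_data) / len(illum_data)   # DataAvg
    post_avg = sum(post_data) / len(post_data)     # PostDataAvg
    return data_avg - (pre_avg + post_avg) / 2

# If the ambient light ramps linearly from 100 (pre) to 120 (post)
# while the auxiliary light AUX contributes 50 during the illuminating
# frame, the linear ambient term cancels exactly:
print(output_detected_data([100], [160], [120]))  # 50.0
```

This illustrates why the linearity assumption matters: the midpoint of the pre- and post-illuminating averages estimates the ambient contribution at the moment the illuminating image was captured.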
Briefly summarized, the present invention proposes a wearable device and an associated method to reduce the influence of ambient light by capturing illuminating images in the illuminating mode, pre-illuminating images and post-illuminating images in the non-illuminating mode, and subtracting the influence of the ambient light of the pre-illuminating images and the post-illuminating images from the illuminating images to assure high accuracy of the physiological information.
Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.